Podcast Episode
X Open-Sources Algorithm, Inadvertently Creates Blueprint for Unmasking Anonymous Users
February 2, 2026
X has published its full recommendation algorithm as open source, but security researchers have discovered the code contains detailed behavioural tracking that could be weaponised to identify anonymous account holders through their unique interaction patterns.
The Transparency Paradox
X has released its complete recommendation algorithm as open-source code on GitHub, fulfilling Elon Musk's January pledge to provide unprecedented visibility into how the platform decides what users see. The move comes amid mounting regulatory pressure from the European Union, which fined the company €120 million in December for violating transparency requirements under the Digital Services Act.

Hidden Privacy Implications
An OSINT researcher posting under the handle Harrris0n has identified troubling privacy implications buried within the code. At the centre of the discovery is a feature called the User Action Sequence, a transformer context that records granular details of user behaviour, including millisecond-level scroll patterns, which accounts a user blocks, specific content preferences, and the exact timing of interactions.

This data creates what researchers describe as a high-fidelity behavioural fingerprint. By combining the User Action Sequence data with a method called Candidate Isolation, which scores content based solely on individual user history, the researcher demonstrated how anonymous accounts could be matched to known identities with what he described as abnormally high accuracy.
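The fingerprinting idea can be sketched in a few lines. Everything below is illustrative: the feature set, the `ACTION_TYPES` list, and the simple frequency-and-timing statistics are stand-ins for the repository's actual transformer encoder, which produces far richer representations. The point is only that a sequence of timed interactions reduces to a stable vector that two accounts run by the same person will tend to share.

```python
import math
from collections import Counter

# Illustrative action vocabulary; the real User Action Sequence tracks
# many more event types at millisecond resolution.
ACTION_TYPES = ["scroll", "like", "block", "reply", "repost"]

def fingerprint(events):
    """Reduce [(action_type, dwell_ms), ...] to a unit-length feature vector.

    A toy stand-in for the action sequence encoder: action-type
    frequencies plus dwell-time mean and spread.
    """
    counts = Counter(action for action, _ in events)
    total = max(len(events), 1)
    freq = [counts[a] / total for a in ACTION_TYPES]
    dwells = [d for _, d in events] or [0]
    mean_dwell = sum(dwells) / len(dwells)
    var_dwell = sum((d - mean_dwell) ** 2 for d in dwells) / len(dwells)
    vec = freq + [mean_dwell / 1000.0, math.sqrt(var_dwell) / 1000.0]
    norm = math.sqrt(sum(x * x for x in vec)) or 1.0
    return [x / norm for x in vec]

def similarity(v1, v2):
    """Cosine similarity between two unit-normalised fingerprints."""
    return sum(a * b for a, b in zip(v1, v2))
```

Even this crude version shows the effect: two accounts with similar scroll-and-like habits score close to 1.0 against each other, while a block-heavy account scores much lower, which is the signal a matching attack exploits.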
Low Barrier to Entry
The tools required to potentially de-anonymise users are relatively accessible. According to the analysis, an individual would need only the action sequence encoder now publicly available in the repository, an embedding similarity search capability, and training data of confirmed alternative accounts. As the researcher noted, "you can easily change your username, but it is much harder to change your habits."
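Given those three ingredients, the matching step could look like the sketch below. The function names, the threshold value, and the dictionary layout are all hypothetical; a real attack would use the repository's actual encoder output and a proper vector index rather than a linear scan over every known identity.

```python
import math

def cosine(u, v):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u)) or 1.0
    nv = math.sqrt(sum(b * b for b in v)) or 1.0
    return dot / (nu * nv)

def link_account(anon_vec, known_fingerprints, threshold=0.9):
    """Return the known identity whose behavioural fingerprint best
    matches the anonymous account, if the score clears a threshold.

    known_fingerprints maps identity name -> feature vector, built from
    confirmed alternative-account pairs (the "training data" the
    researcher describes). All names and values here are illustrative.
    """
    scored = [(cosine(anon_vec, vec), name)
              for name, vec in known_fingerprints.items()]
    score, name = max(scored)
    return (name, score) if score >= threshold else None
```

The threshold is the attacker's precision dial: raise it and only near-identical behavioural fingerprints are linked, lower it and more anonymous accounts are (less reliably) matched.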
Regulatory Context

The open-source release follows the European Commission's December fine for X's misleading verification system, insufficient advertising transparency, and failure to provide researchers access to public data. Musk has pledged to update the algorithm code every four weeks with developer notes, though privacy advocates warn this transparency creates opportunities for exploitation of behavioural data that users never intended to expose.

Published February 2, 2026 at 3:10pm