Meta Installs Keylogging Software on Employee Computers to Train Its AI Models


Meta is installing tracking software on US-based employees' computers that captures mouse movements, keystrokes, and clicks in work-related applications — and using this behavioral data to train its artificial intelligence models. The revelation has sparked immediate backlash from privacy advocates and labor law experts who say the practice crosses ethical and potentially legal lines.

What Meta's Tracking Software Actually Does

According to sources familiar with the program, the software runs in the background on employee machines and monitors activity across a defined set of work applications, including internal communication tools, document editors, and development environments. The data captured — what employees type, where they click, and how they navigate application interfaces — is fed into Meta's AI training pipelines.

Meta has framed the program internally as a way to improve the accuracy and usefulness of its AI models by learning from real-world human-computer interaction patterns. The company has not publicly disclosed the program, and employees have reportedly been informed only through fine-print updates to internal acceptable-use policies.

Privacy and Legal Concerns

Employment lawyers contacted by media outlets say the program raises significant concerns under US labor law, particularly in states such as California and Illinois that have strict electronic-monitoring and biometric-privacy statutes. California law, for example, requires explicit employee consent before employers can monitor electronic communications — and keylogging arguably qualifies.

Privacy advocates note that capturing keystrokes is among the most intrusive forms of employee monitoring possible, potentially sweeping up personal communications, passwords typed into company apps, and confidential client information. The fact that this data is then used to train commercial AI models adds another dimension of concern: employees effectively become unpaid contributors to a product over which they have no control.

Industry Context: AI Companies and Workforce Data

Meta is not alone in seeking novel data sources for AI training. With publicly available web data increasingly subject to copyright litigation and scraping restrictions, major AI companies have turned to proprietary datasets — including internal enterprise data — as a competitive advantage. Meta's Superintelligence Labs, launched earlier this year, has been under pressure to produce models that can compete with OpenAI and Anthropic, creating internal pressure to accelerate training data collection.

The practice is also drawing attention because it mirrors tactics that have already been challenged by regulators in Europe, where the GDPR imposes strict limits on employee monitoring and on the use of personal data for commercial AI training. The EU's AI Act, whose obligations for high-risk systems phase in beginning in 2026, could create additional exposure for Meta's European operations.

Frequently Asked Questions

What is Meta using employee keylogging data for?

Meta is using captured employee keystroke, mouse movement, and click data from work computers to train its artificial intelligence models, according to reports. The data is collected from work-related applications only, per Meta's internal policy descriptions.

Is Meta's employee monitoring software legal?

Legal experts are divided. In many US states, broad employee monitoring is permitted with notice. However, California and Illinois have stricter laws requiring explicit consent for electronic monitoring, and the use of keystroke data for commercial AI training may exceed what employees reasonably consented to.

Have Meta employees been told about the tracking software?

Reports indicate employees were notified only via updates to internal acceptable-use policies, rather than through a direct disclosure. Most employees were reportedly unaware their data was being used for AI training purposes.

The Bottom Line

Meta's keylogging program reveals the lengths to which AI companies will go to source behavioral training data as traditional web scraping becomes legally contested. Whether the practice survives regulatory scrutiny remains to be seen — but it sets a troubling precedent for what employers can extract from their own workforce in the name of AI development. Expect this story to generate significant regulatory attention in both the US and Europe over the coming months.