Your Old Employer May Be Selling Your Emails to AI (2026)
You left that startup three years ago. The company ran out of money six months later, and the founders moved on. But your Slack messages did not move on with them. Neither did your emails, your Jira tickets, your Google Docs comments, or the time you vented about a coworker in a direct message you assumed nobody would ever read again. A growing number of shuttered companies are now selling that internal data — your data — to AI companies willing to pay $10,000 to $100,000 per deal. And you will almost certainly never be told.
The Market for Dead Company Data
When a startup dies, its remaining assets go through a wind-down process. Office furniture gets auctioned. Domain names get sold. Intellectual property gets transferred to creditors. Increasingly, internal communications are being treated as just another asset in that liquidation.
Platforms like SimpleClosure, which help founders shut down companies, have begun brokering deals to sell internal data archives to AI companies. The appeal is straightforward: AI training requires massive volumes of natural human language, and internal corporate communications are a rich, largely untapped source. Slack threads contain unscripted conversation. Email chains show how people actually make decisions. Jira tickets document technical problem-solving in real time. This is exactly the kind of data AI labs struggle to find on the open web, where content is increasingly polluted by SEO spam and AI-generated text.
Nearly 100 of these deals have already closed as of early 2026. The transactions are quiet. There is no public database. Former employees are not consulted and not notified. The data changes hands during the wind-down process, packaged alongside patents and customer lists as part of a final asset sale designed to return whatever value possible to investors and creditors.
What Exactly Gets Sold
The scope of data in these deals is broad. Internal communication platforms like Slack generate enormous archives — a 50-person company operating for three years can easily produce hundreds of thousands of messages. When that archive is sold, it typically includes:
- Public and private channel messages
- Direct messages between individuals
- Internal email threads (Google Workspace or Microsoft 365 exports)
- Jira, Asana, and Linear tickets with full comment history
- Google Docs, Notion pages, and Confluence wikis
- Code review comments on GitHub or GitLab
- Meeting notes and calendar event descriptions
- Hiring pipeline discussions and candidate evaluations
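The "hundreds of thousands of messages" figure is easy to sanity-check with a back-of-envelope estimate. The per-person message rate below is an illustrative assumption, not a measured number:

```python
# Rough estimate of a small company's Slack archive size.
# The messages-per-person-per-day rate is an assumption for illustration.
employees = 50
messages_per_person_per_day = 20  # assumed average across channels and DMs
workdays_per_year = 250
years = 3

total_messages = employees * messages_per_person_per_day * workdays_per_year * years
print(f"{total_messages:,} messages")  # prints 750,000 messages
```

Even at half that rate, a modest startup leaves behind a six-figure message archive.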
This data contains names. It contains opinions. It contains things people said in confidence — complaints about management, salary negotiations, medical leave discussions, personal struggles shared with a trusted colleague over DM. The employees who wrote these messages did so in an environment they understood to be private, or at least internal. None of them imagined their words would end up in a training dataset for a language model.
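Surfacing the most sensitive fraction of such an archive takes almost no effort. Here is a minimal sketch, assuming the standard Slack export layout (one folder per channel, one JSON file per day, each file a list of message objects with `user`, `text`, and `ts` fields); the keyword list is an invented example:

```python
import json
from pathlib import Path

# Illustrative keyword list, not a real classifier.
SENSITIVE = ("salary", "medical", "resign", "lawsuit")

def scan_export(export_dir: str):
    """Return (channel, user, text) for messages matching a sensitive keyword."""
    hits = []
    for day_file in Path(export_dir).glob("*/*.json"):
        for msg in json.loads(day_file.read_text()):
            text = msg.get("text", "").lower()
            if any(word in text for word in SENSITIVE):
                hits.append((day_file.parent.name, msg.get("user"), msg["text"]))
    return hits
```

A buyer needs nothing more sophisticated than a loop like this to pull salary talk and health disclosures out of a purchased archive.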
Why AI Companies Want Your Workplace Data
The AI training data market has a supply problem. The most accessible data sources — the public web, Wikipedia, published books, Reddit — have already been scraped extensively. AI labs are running into diminishing returns and legal challenges. Major platforms have started letting users opt out of training, and several high-profile lawsuits over web scraping have made companies cautious about data sourcing.
Internal corporate data fills a gap that public data cannot. It contains natural human dialogue — not the performative writing people do on social media, but the way people actually communicate with colleagues. It includes domain expertise in fields like engineering, finance, legal, and healthcare. And it comes pre-organized with context: a Slack thread about debugging a production outage is inherently more structured and informative than a random Stack Overflow post.
For AI companies building tools aimed at enterprise customers — writing assistants, code completion, meeting summarizers — workplace data is not just useful, it is the ideal training material. The irony is thick: the data your former employer's team generated is being used to train the tools that will replace teams like the one you worked on.
The Legal Gray Zone
Most employment agreements contain clauses granting the company ownership of work product and communications made on company systems. These clauses were written to protect intellectual property — trade secrets, source code, client information — not to authorize the bulk sale of employee conversations to third parties for AI training. But the language is typically broad enough to cover it.
When a company is solvent, employees have some theoretical leverage: they can raise concerns internally, file complaints, or leave. When a company is dead, those channels disappear. The wind-down is managed by a skeleton crew or an outside service. The former CEO is focused on closing out obligations, not on whether Sarah from engineering is comfortable with her Slack messages being sold to an AI lab.
Federal privacy law in the United States does not specifically address this scenario. There is no statute requiring employers to notify former employees when internal communications are sold. State laws offer partial protection: California's CCPA gives residents the right to request deletion of personal information held by businesses, but exercising that right against a company that no longer exists is a practical impossibility, and enforcement against defunct entities remains an open question.
In the EU, GDPR provides stronger protections. Employee communications containing personal data require a legal basis for processing, and selling that data to a third party for an unrelated purpose likely requires fresh consent. But cross-border enforcement is slow, and many of the companies in question were US-based startups with global employees who have limited recourse.
A New Category of Data Exposure
Most people think about data exposure in terms of breaches — a hacker gets into a database, credit card numbers leak, your SSN ends up on the dark web. This is different. This is not a breach. It is a transaction. Your data is not being stolen; it is being sold by the entity that collected it, following a process that is arguably legal.
What makes it particularly insidious is the permanence. When your credit card number leaks, you can cancel the card. When your email address appears in a breach database, you can change it. But once your workplace conversations enter an AI training set, there is no mechanism to remove them. The data becomes part of the model's weights. Your words, your name, your communication patterns — they are baked into the system. No opt-out form addresses this.
Consider what a determined bad actor could do with this information. If an AI model has ingested your Slack messages, it may be possible to extract information about you through careful prompting — your communication style, your opinions, your relationships with colleagues, details about projects you worked on. Combine that with a data broker profile that links your name to your current employer, home address, and phone number, and you have a deeply personal dossier assembled from sources you never authorized.
The Data Broker Multiplier Effect
This is where the data broker ecosystem makes the problem worse. On its own, a Slack message with your name attached is a single data point. But data brokers have been independently assembling comprehensive profiles on you for years. People-search sites like Spokeo, BeenVerified, and Whitepages aggregate and sell your full name, email addresses, phone numbers, home address, employment history, family members, and social media accounts.
When your name appears in a sold Slack archive, that name can be cross-referenced against people-search sites to build a complete picture. Your former employer sold your messages. Data brokers sell your identity. Together, they connect what you said to who you are, where you live, and how to reach you.
This cross-referencing is not hypothetical. It is the standard operating procedure for social engineering attacks, targeted phishing, and identity theft. Attackers routinely combine data from multiple sources — breach databases, people-search sites, social media — to construct detailed profiles of targets. Workplace data from dead companies is simply a new input to that pipeline.
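The cross-referencing step itself is trivial. A minimal sketch, using invented sample records (the field names are illustrative assumptions, not any broker's real schema):

```python
# Join names found in a sold message archive against broker-style profiles.
# All records below are invented for illustration.
archive_authors = {"Sarah Chen", "Raj Patel"}

broker_profiles = [
    {"name": "Sarah Chen", "city": "Oakland, CA", "phone": "555-0142",
     "employer": "Acme Analytics"},
    {"name": "Dana Smith", "city": "Austin, TX", "phone": "555-0199",
     "employer": "Widget Co"},
]

# One membership check per profile links "what you said" to "who you are".
dossiers = [p for p in broker_profiles if p["name"] in archive_authors]
for d in dossiers:
    print(d["name"], "->", d["city"], d["phone"])
```

Real attacks add fuzzy name matching and email correlation, but the core operation is this simple join, which is exactly why thinning out broker profiles weakens the pipeline.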
What You Can Do About It
The uncomfortable truth is that you cannot retroactively prevent your former employer from selling data you generated on their systems. The messages are already written. The archives already exist. And the legal frameworks that might have protected you were not designed for this scenario. But you can reduce the damage.
- Remove your data from broker databases. You cannot control whether your Slack messages were sold, but you can control whether your name, email, and employment history are freely available to anyone who wants to connect those messages to your real identity. Data removal services automate this process across hundreds of sites. GhostVault covers 500+ data brokers for $3.99/month.
- Audit your past employers. Make a list of every company you have worked for. Check whether any have shut down. If they used a wind-down service, your data may have been included in an asset sale. Knowing is the first step.
- Exercise state privacy rights where applicable. If you are a California resident, the CCPA gives you the right to request deletion of personal information. If the acquiring company is still operational, you can file a request directly. Other states have similar laws with varying scope.
- Be deliberate about workplace communications going forward. This is not about paranoia — it is about awareness. Anything you write on a company system is company property. Assume that Slack messages, emails, and documents may outlive the company itself. Use personal devices for personal conversations.
- Monitor your digital footprint. Set up alerts for your name and email addresses. If your information surfaces in unexpected contexts — AI model outputs, data broker listings, breach databases — early detection gives you the best chance of mitigating the exposure.
- Support privacy legislation. The reason these transactions happen in a gray zone is that the law has not caught up. Federal privacy legislation that addresses employee data rights, notification requirements for bulk data sales, and AI training data transparency would change the equation. Contact your representatives.
Break the Cross-Reference Pipeline
You cannot pull your Slack messages out of an AI training set. But you can remove the data broker profiles that link your name to your current address, phone number, and personal details. GhostVault continuously monitors and removes your information from 500+ data broker sites for $3.99/month — cutting off the connection between what you said at a dead company and who you are today.
The Bigger Picture
The sale of defunct company data to AI labs is a symptom of a broader structural problem: the people who generate data have almost no rights over what happens to it after the fact. You wrote those Slack messages. You composed those emails. You documented your work in Jira tickets. But legally, those artifacts belong to the company. And when the company dies, they belong to whoever buys them.
This matters beyond AI training. The same dynamic applies to social media platforms using your content for AI, fitness apps selling health data, and car manufacturers sharing driving behavior with insurers. The pattern is consistent: you create the data, someone else owns it, and they monetize it without your input.
The practical response is layered defense. You opt out where you can. You remove your information from broker databases to reduce the connective tissue between data sources. You make informed choices about what you share and where. And you push for laws that give individuals meaningful control over their own data — not as a theoretical right buried in a terms of service, but as an enforceable standard.
Nearly 100 deals have already closed. The market is growing. The next company to shut down might be one you worked for last year. The messages you wrote this morning could be in a training dataset five years from now. That is the reality. The question is whether you reduce the blast radius before it happens.
Frequently Asked Questions
Can a defunct company legally sell employee Slack messages and emails to AI companies?
In most cases, yes. When a company shuts down, its digital assets — including internal communications, emails, Slack logs, and project management data — are typically treated as corporate property. During wind-down or bankruptcy proceedings, these assets can be sold to generate returns for creditors and investors. Most employment agreements grant the company broad rights over communications made on company systems. Employees are rarely notified and have no standard legal mechanism to object, unless the data falls under specific state privacy laws like the CCPA.
How much do AI companies pay for internal corporate data from dead startups?
Deals typically range from $10,000 to $100,000 depending on the volume, quality, and domain specificity of the data. Platforms like SimpleClosure facilitate these transactions as part of the company wind-down process. AI companies value internal corporate data because it contains natural human conversation, technical discussions, and domain expertise that is difficult to obtain from public web scrapes. Nearly 100 such deals have closed as of early 2026.
What types of employee data are being sold to AI training companies?
The data sold typically includes Slack messages and direct messages, internal email threads, Jira and project management tickets, Google Docs and shared documents, code review comments, hiring discussions, and meeting notes. This data often contains full names, personal opinions, salary discussions, health disclosures, and other sensitive information that employees shared in what they believed was a private workplace context. The data is generally not scrubbed of personally identifiable information before transfer.
How can I find out if my old employer sold my workplace data to an AI company?
There is currently no centralized registry or notification requirement for these transactions. You can check whether your former employer used a wind-down service like SimpleClosure, search for bankruptcy filings that list digital asset sales, and monitor AI company disclosures about training data sources. However, transparency is severely limited. The more practical approach is to reduce your overall data footprint by removing your information from data broker sites, which limits how easily your workplace data can be connected to your real identity.
Does removing my data from brokers help if my Slack messages were sold to AI?
It helps significantly. When AI companies acquire workplace data, the messages contain your name, email address, and employment history. Data brokers independently sell this same identifying information — your full name, email addresses, phone numbers, home address, and personal details. If your broker profiles are active, anyone can cross-reference your name from a Slack message with your current contact information and home address. Removing yourself from data broker databases breaks this cross-referencing pipeline. GhostVault automates removal across 500+ broker sites for $3.99/month, continuously monitoring for re-listing.

This is just one of 500+ brokers selling your data.
GhostVault removes you from all of them automatically — and keeps you removed.