GitHub Copilot Will Use Your Code for AI Training Starting April 24 — How to Opt Out

Microsoft has announced that starting April 24, 2026, GitHub will use Copilot interaction data — including your code inputs, outputs, snippets, and context — to train its AI models. The change affects all Copilot Free, Pro, and Pro+ users by default, though you can opt out.

What Data Is Being Collected

The scope of data collection is extensive. GitHub will use:

  • Code snippets — both what you write and what Copilot suggests
  • Accepted or modified outputs — tracking which suggestions you use
  • Code context — surrounding code near your cursor position
  • Comments and documentation — text around your code
  • File names and repository structure — how your projects are organized
  • Navigation patterns — how you move through your codebase
  • Chat interactions — conversations with Copilot chat and inline suggestions
  • Feedback signals — thumbs up/down ratings on suggestions

Who Is Affected

The policy change impacts:

  • Copilot Free users — affected by default
  • Copilot Pro users — affected by default
  • Copilot Pro+ users — affected by default

Copilot Business and Enterprise users are NOT affected. Their data usage policies remain unchanged — a clear recognition that enterprise customers have stricter data governance requirements.

How to Opt Out

If you don't want your data used for AI training, here's what to do:

  1. Go to github.com/settings/copilot/features
  2. Under the Privacy heading, find "Allow GitHub to use my data for AI model training"
  3. Disable the toggle
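
The toggle itself can only be flipped in the web UI — there is no documented public API for individual Copilot privacy settings that we're aware of — but the steps above can be jump-started by opening the settings page directly. A minimal sketch using Python's standard webbrowser module:

```python
import webbrowser

# The settings page named in the steps above (URL from this article).
SETTINGS_URL = "https://github.com/settings/copilot/features"

if __name__ == "__main__":
    # Opens the page in your default browser; you still need to locate
    # the "Allow GitHub to use my data for AI model training" toggle
    # under Privacy and disable it manually.
    webbrowser.open(SETTINGS_URL)
```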

If you previously opted out of data collection for product improvements, your preference has been preserved — you don't need to take action again.

Where Does Your Data Go?

GitHub states the data may be shared with "affiliates" — which includes Microsoft and companies in its corporate family. However, GitHub explicitly states this data will not be shared with third-party AI model providers or independent service providers.

Microsoft's stated justification is that adding interaction data from its own employees led to "meaningful improvements" in suggestion acceptance rates, and it wants the same signal from the broader developer community.

The Developer Community Reaction

The announcement has sparked significant pushback on Hacker News, Reddit, and developer forums. Key concerns include:

  • Opt-out vs. opt-in: Critics argue this should have been opt-in, not opt-out by default
  • Proprietary code exposure: Developers worry their private code patterns could influence Copilot's suggestions to other users
  • Trust erosion: Many describe the change as a bait-and-switch — developers adopted Copilot without expecting their code to become training data
  • Legal implications: Questions about code ownership and intellectual property when AI models are trained on developer inputs

Bottom Line

GitHub's decision to use Copilot interaction data for AI training is a calculated trade-off: better AI suggestions in exchange for your code being part of the training pipeline. The opt-out mechanism exists, but the default-on approach will ensure the vast majority of developers contribute data whether they realize it or not. If you value code privacy, go to your settings before April 24 and disable the toggle. If you're on Copilot Business or Enterprise, you're unaffected — for now.

Frequently Asked Questions

When does this take effect?

April 24, 2026. Data collection will begin automatically on that date unless you opt out before then.

Is my private repository code safe?

GitHub collects interaction data (inputs, outputs, context) regardless of repository visibility. However, they state the raw code won't be directly exposed to other users — it's used to improve model quality.

Can I still use Copilot without sharing data?

Yes. You can disable the data sharing toggle in settings and continue using Copilot normally. The opt-out only affects whether your data is used for training.

Are enterprise users affected?

No. Copilot Business and Copilot Enterprise users are explicitly excluded from this policy change.