Updates to GitHub Copilot interaction data usage policy

https://github.blog/news-insights/company-news/updates-to-github-copilot-interaction-data-usage-policy/

Comments

stefankuehnelMar 25, 2026, 7:44 PM
If you scroll down to "Allow GitHub to use my data for AI model training" in GitHub settings, you can enable or disable it. However, what really gets me is how they pitch it like it’s some kind of user-facing feature:

Enabled = You will have access to the feature

Disabled = You won't have access to the feature

As if handing over your data for free is a perk. Kinda hilarious.

data-ottawaMar 25, 2026, 10:40 PM
It’s not so bad: there’s no double negative, and it’s not a confusing “switch” that is always ambiguous as to whether it’s enabled or not.

In contrast, when you create a GCS bucket it uses a checkmark for enabling “public access prevention”. Who designed that modal? It takes me a solid minute to figure out whether I’m publishing private data or not.

stavrosMar 26, 2026, 12:50 AM
Disabled - You won't have access to this feature of disallowing training.
lbeckman314Mar 26, 2026, 1:22 AM
"<sighs> They could've made this clearer..."

https://old.reddit.com/r/TheSimpsons/comments/26vdkf/dont_do...

a1oMar 25, 2026, 8:02 PM
I went to check on this, and although I have everything Copilot-related disabled, the two bars that measure usage somehow showed my Copilot Chat usage at 2%. How is this possible?

Before anyone comes at me to sell me on AI: this is on my personal account. I have and use it in my business account (but that is a completely different user account); I just make it a point not to use it in my personal time so I can keep my skills sharp.

hakuninMar 25, 2026, 8:20 PM
Does Github count it as copilot chat usage when you use AI search form on their website, I wonder?
a1oMar 25, 2026, 10:36 PM
I wonder if that’s it! I occasionally do some code search on GitHub, then remember it doesn’t work well and go back to searching in the IDE. I usually need to look at branches other than main, because a lot of my projects have a develop branch where things actually happen. That would explain it, so I guess this is it.
saratogacxMar 25, 2026, 8:59 PM
If you're talking about the quota bar: that only measures your premium request usage (models with a #.#x multiplier next to the name). If you only use the free models and code completion, you won't actually consume any "usage". If you use AI code review, that consumes a single request (now). Same with the GitHub Copilot web chat: if you use a free model, it doesn't count; if you use a premium model, you get charged the usage cost.
martin-tMar 25, 2026, 9:27 PM
A few days ago, I unchecked it, only to see it checked again when I reloaded the page.

It could be incompetence, but that shouldn't matter. This level of incompetence should be punished the same as malice.

TJ_FLEETMar 26, 2026, 7:13 AM
the framing is so manipulative. "you will have access to the feature" — what feature? the feature of giving away my data? at least be honest and call it what it is. i turned it off immediately but i wonder how many people just leave it because the wording makes it sound like you lose something.
petcatMar 25, 2026, 8:00 PM
I guess the "perk" is that maybe their models get retrained on your data making them slightly more useful to you (and everyone else) in the future? idk
mirekrusinMar 25, 2026, 8:49 PM
The feature is that your coding style will be in next models!
rzmmmMar 25, 2026, 9:04 PM
I wish my GPL license would travel along with my code.
mirekrusinMar 26, 2026, 6:30 AM
I said a few years back that code licenses don't exist anymore; some people just haven't realized it yet.
ImustaskforhelpMar 26, 2026, 7:49 AM
Previously, big tech still had to find loopholes around the GPL, and licenses still had some value.

Nowadays it genuinely feels like a lot less, because there are now services that will rewrite the code to get around the license.

I also used to think that somewhat non-proprietary licenses like the SSPL might be interesting approaches, but I feel like they aren't much less prone to this now either.

So now I am not exactly sure.

UqWBcuFx6NV4rMar 25, 2026, 11:43 PM
If you are wholly confident that model training is a violation of the GPL then go sue.
tglmanMar 26, 2026, 3:06 AM
I guess freedom to study and use may also include training AI, but it would be cool if all the derivative work, such as AI models and code generated from those models, had to be licensed as GPL. Lawyers needed here.
7bitMar 25, 2026, 8:33 PM
It's worded that way to create FOMO in the hopes people keep it enabled.

Dark pattern and dick move.

RapzidMar 25, 2026, 9:16 PM
Is that not some stock feature-flag verbiage?
bigiainMar 25, 2026, 10:09 PM
Stock dark pattern verbiage...

I'm a little surprised the options aren't "Enable" and "Ask me later".

NewJazzMar 25, 2026, 10:34 PM
But it isn't a feature, so using a feature flag is a bit weird.
UqWBcuFx6NV4rMar 25, 2026, 11:42 PM
No, it’s not. Please think like a developer and not like someone playing amateur gotcha journalist on social media. Feature flags are (ab)used in this way all the time. What is a feature? What is a feature flag? It’s like asking what authorisation is vs all your other business rules. There’s grey area.
NewJazzMar 25, 2026, 11:50 PM
"Please think like a developer" lmao if I said this to someone at my dayjob I'd be gone.
RapzidMar 26, 2026, 9:53 PM
How is it not a feature from a development standpoint? Colloquially any bit of intended functionality qualifies as a "feature" and certainly any functionality you conditionally enable/disable would be controlled by a "feature flag" regardless.
NewJazzMar 27, 2026, 6:38 AM
Because the user sees no difference in experience.
ImustaskforhelpMar 26, 2026, 7:59 AM
Thanks to your comment, I have disabled it now :-)

I agree that it feels like a dark pattern for the most part; makes me want to use Codeberg / self-hosted git

mentalgearMar 25, 2026, 7:39 PM
> On April 24 we'll start using GitHub Copilot interaction data for AI model training unless you opt out. Review this update and manage your preferences in your GitHub account settings.

Now "Allow GitHub to use my data for AI model training" is enabled by default.

Turn it off here: https://github.com/settings/copilot/features

Do they have this set on business accounts also by default? If so, this is really shady.

lenovaMar 25, 2026, 8:01 PM
Ugh, can't believe they made this opt-in by default, and didn't even post the direct URLs to disable in their blog post.

To add on to your (already helpful!) instructions:

- Go to https://github.com/settings/copilot/features
- Go to the "Privacy" section
- Find: "Allow GitHub to use my data for AI model training"
- Set to disabled

thrdbndndnMar 26, 2026, 2:59 AM
I always thought "opt-in" (not "opt in") meant something you have to actively choose to enable; otherwise, it stays off. So calling something "opt-in by default" sounds like a misnomer to me.

But English is not my first language so please correct me if I'm wrong.

squeegmeisterMar 26, 2026, 4:31 AM
You are correct
inetknghtMar 25, 2026, 8:40 PM
> can't believe they made this opt-in by default

You can't believe Microslop is force-feeding people Copilot in yet another way?

> and didn't even post the direct URLs to disable in their blog post

You can't believe Microshaft didn't tell you how to not get shafted?

miohtamaMar 26, 2026, 7:21 AM
He must be new here
g947oMar 25, 2026, 7:48 PM
https://github.com/orgs/community/discussions/188488

> Why are you only using data from individuals while excluding businesses and enterprises?

> Our agreements with Business and Enterprise customers prohibit using their Copilot interaction data for model training, and we honor those commitments. Individual users on Free, Pro, and Pro+ plans have control over their data and can opt out at any time.

dormentoMar 25, 2026, 8:23 PM
Aka "they have lawyers and you usually don't, so we think we can get away with it."
gentleman11Mar 25, 2026, 9:57 PM
only big companies have access to the legal system. nobody else can afford it
apublicfrogMar 27, 2026, 11:32 PM
Most of the world aren't American and we can afford our legal systems ;).
kyleeeMar 28, 2026, 3:30 PM
Thankfully shariah court is free. Inshallah
themafiaMar 25, 2026, 8:24 PM
> and we honor those commitments.

Ah, so when the inevitable "bug" appears, and we all learn that you've completely failed to honor anything, what will be your "commitment" then? An apology and a few free months?

Time to start pushing for a self hosted git service again.

parkerswebMar 25, 2026, 8:55 PM
Yes - not impressed at all that this is opt-in default for business users. We have a policy in place with clients that code we write for them won’t be used in AI training - so expecting us to opt out isn’t an acceptable approach for a business relationship where the expectation is security and privacy.
aksssMar 25, 2026, 10:21 PM
It is not opt-in by default for business users. The feature flag doesn't show in org policies, and GitHub states that it's not scoped to business users.
parkerswebMar 25, 2026, 11:20 PM
Gah - you’re right - but given that I don’t use personal Copilot, yet I do manage an organisation that gives Copilot to some of our developers, AND I was sent an email this evening making no mention at all of business Copilot being excluded, it could definitely have been communicated better…
PalmikMar 26, 2026, 7:50 AM
My email does mention it clearly:

> Again, your organization's Copilot interaction data is not included in model training under this new policy, but we are excited for you to enjoy the product improvements it will unlock.

gentleman11Mar 25, 2026, 9:56 PM
What did everyone expect? I can't understand this community's trust in Microsoft or startups. It's the typical land grab: start off decent, win people over, build a moat, then start shaking everybody down in the most egregious way possible.

It's just unusual how quickly they're going for the shakedown this time.

martinwoodwardMar 25, 2026, 7:55 PM
Just confirming, we do not use Copilot interaction data for model training of Copilot Business or Enterprise customers.
verdvermMar 26, 2026, 6:15 AM
You shouldn't do it for the public on an opt-out basis; it should be opt-in. But that is the Microslop effect on GitHub: users are an afterthought.
whynotmaybeMar 26, 2026, 12:48 AM
Per their blog post

> Business and Copilot Enterprise users are not affected by this update.

archbMar 25, 2026, 7:41 PM
Interestingly, it is disabled by default for me.
crashingintoyouMar 25, 2026, 7:51 PM
Reading the github blog post "If you previously opted out of the setting allowing GitHub to collect this data for product improvements, your preference has been retained—your choice is preserved, and your data will not be used for training unless you opt in."
verdvermMar 26, 2026, 6:26 AM
Is this the new name for the setting? I cannot find one that sounds like the previous one you mention

Notable that they have no "privacy" section in account settings

gpmMar 25, 2026, 7:44 PM
Me too, which is making me wonder if they're planning on silently flipping this setting on April 24th (making it impossible to opt out in advance).
martinwoodwardMar 26, 2026, 3:27 AM
We are not. The reason we wanted to announce early was so that folks had plenty of time to opt out now. We've also added the opt-out setting even if you don't use Copilot, so that you can opt out now before you forget; then, if you decide to use Copilot in the future, it will remember your preference.
pred_Mar 26, 2026, 7:03 AM
Would you be able to comment on https://news.ycombinator.com/item?id=47522876, i.e. explain the legal basis for this change for EU based users? If there is none, you may have to expect that people will exercise their right to lodge a complaint with a supervisory authority.
ThrowawayB7Mar 26, 2026, 3:24 PM
Why would you expect an engineer to be able to comment on legal affairs? Presumably it was cleared with Microsoft's legal department or whatever GitHub's divisional equivalent is.
1718627440Mar 27, 2026, 10:26 AM
That's precisely what the term 'engineer' signifies. (I know it gets used incorrectly for software developers.) Workers in general need to decide whether something is legal independently of their company, because the company lawyers have the interests of the company in mind, which might conflict with the workers' interest in not doing illegal things.

Big Tech is known for clearing illegal things by their legal departments all the time.

spiderfarmerMar 25, 2026, 7:45 PM
Is it because I'm in the EU?
paularmstrongMar 25, 2026, 7:51 PM
I'm in the US and it's off for me. I believe I've previously opted out of everything copilot related in the past if there was anything.
gpmMar 25, 2026, 9:05 PM
I'm in Canada, so not only the EU at least.
xgdgscMar 26, 2026, 1:14 AM
I guess we have to check again on April 24?
pjmlpMar 26, 2026, 9:20 AM
We have a business account, and because of issues like this, access to anything CoPilot is blocked.
DavidSJMar 25, 2026, 7:53 PM
> Do they have this set on business accounts also by default? If so, this is really shady.

Looks like not, but would it actually have been shadier, or are we just used to individual users being fucked over?

hrmtst93837Mar 25, 2026, 8:13 PM
If they turned it on for business orgs, that would blow up fast. The line between "helpful telemetry" and "silent corporate data mining" gets blurry once your team's repo is feeding the next Copilot.

People are weirdly willing to shrug when it's some solo coder getting fleeced instead of a company with lawyers and procurement people in the room. If an account tier is doing all the moral cleanup, the policy is bad.

AbanoubRodolfMar 26, 2026, 6:55 AM
The individual/corporate asymmetry you're describing is standard across B2B SaaS. Slack, Notion, and Figma all include ML training carve-outs in enterprise DPAs that free users don't get. GitHub isn't doing anything unusual here — they're just doing it with code, which feels more sensitive than documents or messages because it might literally be your employer's IP you're working on from a personal account.

The interesting nuance is the enforcement mechanism. martinwoodward clarified below that exclusion happens at the user level, not the repo level: if you're a member of a paid org, your interaction data is excluded even on a free personal Copilot account. That's actually more protective than I expected — it handles the contractor case where someone works across multiple repos of varying org types.

The remaining ambiguity is temporal: if someone leaves an org, do their historical interactions get retroactively included? Policy answers to that question are hard to verify and even harder to audit.

QuadrupleAMar 25, 2026, 9:51 PM
Fun fact: Copilot gives you no way to ignore sensitive files with API keys, passwords, DB credentials, etc.: https://github.com/orgs/community/discussions/11254#discussi...

So by default you send all of this to Microsoft just by opening your IDE.

0xbadcafebeeMar 25, 2026, 11:32 PM
Separate fun fact: Gemini CLI blocks env vars with strings like 'AUTH' in the name. They have two separate configuration options that both supposedly let you allow specific env vars. Neither works (bad vibe coding). I tried opening an issue and a PR, and two separate vibe-coding bots picked up my issue and wrote PRs, but nobody has looked at them. The bug's still there, so I can't do git code signing via the ssh-agent socket. The only choice is to do the less secure, unsigned git commits.

On top of that, Gemini 3 refuses to refactor open source code, even if you fork it, if Gemini thinks your changes would violate the spirit of the intent of the original developers in a safety/security context. Even if you think you're actually making it more secure, but Gemini doesn't, it won't write your code.

WatchDogMar 26, 2026, 3:07 AM
Gemini also won't help you with C++ if you are under 18, since it would be unsafe.

https://news.ycombinator.com/item?id=39632959

verdvermMar 26, 2026, 6:22 AM
Is it still true? That's two years old
WatchDogMar 26, 2026, 7:48 AM
It's improved significantly in that time, but relative to the other frontier models, it is still the one that is the most condescending and coddling.
verdvermMar 26, 2026, 6:23 AM
I use Gemini 3 to edit multiple forks. Your statement is false based on stuff I actually do.
0xbadcafebeeMar 26, 2026, 7:07 AM
Well it's true based on my running into the issue 8 hours ago
verdvermMar 26, 2026, 6:48 PM
Maybe it's your prompts? I've never had Gemini refuse to write any code in any context. I use it with Claude prompts, edited down, in particular to remove guardrails.

You shouldn't use Google AI products; they are inferior. Their models, though, are quite good. It's confusing when people use the model name when referring to a product. What's your setup?

nulld3vMar 26, 2026, 12:01 AM
Sadly, this issue is systemic: https://github.com/openai/codex/issues/2847
stavrosMar 26, 2026, 2:11 AM
OpenCode has a plugin that lets you add an .ignore file (though I think .agentignore would be better). The problem is that, even though the plugin makes it so the agent can't directly read the file, there's no guarantee the agent won't try to be helpful and do something like "well, I can't read .envrc using my read tool, so let me cat .envrc and read it that way".
solaire_oaMar 28, 2026, 1:46 AM
This points out that agentic security flaws are worse than "systemic", they're the feature. Agents are literal backdoors.

It's so bizarre to be discussing minor security concerns of backdoors, like trying to block env vars. Of course the maintainers don't care about blocking env vars. It's security theater.

sceptic123Mar 26, 2026, 8:52 AM
Fun fact: you shouldn't have sensitive files with API keys, passwords, DB credentials, etc. in your repo
wzddMar 26, 2026, 9:05 AM
“In your repo” and “in the directory you are running copilot” are two separate things.
nunezMar 28, 2026, 6:06 AM
It’s fine to have them in your repo if they’re encrypted and the private key isn’t in there as well!
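One way to do that, sketched here assuming git-crypt (a tool that transparently encrypts matched files through git's clean/smudge filters; you'd run `git-crypt init` once and share the key out of band; the path patterns below are placeholders):

```
# .gitattributes (git-crypt sketch): matched files are stored encrypted in the repo
secrets/**  filter=git-crypt diff=git-crypt
**/.env     filter=git-crypt diff=git-crypt
```

With this in place, the files are ciphertext in the repository and on the remote, and are only decrypted in working copies that hold the key.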
malnourishMar 25, 2026, 11:18 PM
I swear I just set up enterprise and org level ignore paths.
veverkapMar 25, 2026, 11:27 PM
Yeah, it's a Copilot Business/Enterprise feature
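For reference, that Business/Enterprise content-exclusion feature is configured as YAML path patterns in the Copilot settings. A rough sketch of the org-level form (the repository URL and paths below are placeholders, and the exact schema may differ from the current docs):

```yaml
# Org settings > Copilot > Content exclusion (placeholder repo and paths)
"https://github.com/example-org/example-repo":
  - "**/.env"
  - "secrets/**"
"*":                    # applies to all repositories in the organization
  - "**/*.pem"
```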
kevcampbMar 29, 2026, 6:03 AM
This is terrifying. Github was the one provider I did not expect to take such an action. We're now playing whack-a-mole with vendors to try to ensure that our company IP doesn't end up being used to train a model.
AlexeyBelovMar 29, 2026, 7:33 AM
> Github was the one provider I did not expect to make such an action

There is no Github anymore in any meaningful way. It's Microsoft. Github doesn't have a CEO anymore.

section_meMar 25, 2026, 7:30 PM
If I'm paying, which I am, I want to have to opt in, not opt out. Mario Rodriguez / @mariorod needs to give his head a wobble.

What on earth are they thinking...

sphMar 25, 2026, 7:36 PM
> What on earth are they thinking...

@mariorod's public README says one of his focuses is "shaping narratives and changing \"How we Work\"", so there you go.

fmjreyMar 25, 2026, 8:08 PM
Translation: more alignment with Microsoft practices
section_meMar 25, 2026, 7:42 PM
"shaping narratives", sounds like they follow the methodologies of a current president
okanatMar 25, 2026, 7:50 PM
It looks like the literal translation of "manipulation" to Linkedin-speak.
efilifeMar 26, 2026, 5:17 PM
which one?
pred_Mar 25, 2026, 8:37 PM
What is the legal basis for this in the EU? Ignoring the fact that they could end up stealing IP, it seems like the collected information could easily contain PII, and consent would have to be

> freely given, specific, informed and unambiguous. In order to obtain freely given consent, it must be given on a voluntary basis.

rennokkiMar 26, 2026, 8:50 AM
It easily breaks the GDPR: consent has to be opt-in, off by default, and pre-filling the box before the user hits submit is not a workaround.

Some will point out that this applies only to personal data, and yes, it does. But it takes only one line of code for me to use my phone number for testing while I locally test a registration form in the application I'm developing.

Once it gets sent to Copilot, I can threaten legal action if they won't take it down.

pred_Mar 26, 2026, 9:54 AM
Based on https://github.blog/changelog/2026-03-25-updates-to-our-priv..., it looks like they are going to go for “legitimate interest” which seems clearly overridden by data subject interests in this case, hence not lawful.

If you don't want to wait until your PII inevitably gets sent through, you can already now file a complaint to your local supervisory authority: https://www.edpb.europa.eu/about-edpb/about-edpb/members_en

edelbitterMar 27, 2026, 10:13 PM
Has there ever been a GDPR fine that actually exhausted all applicable legal challenges within a sufficiently short delay from initial violation to actually matter?
port11Mar 28, 2026, 7:45 PM
https://www.enforcementtracker.com/

Short delay: depends on your DPA; I doubt any country is fast enough. On the other hand, GitHub is claiming legitimate interest, so it would require investigation, maybe even litigation.

LadyCailinMar 25, 2026, 10:46 PM
I actually don’t seem to have this option on my GitHub settings page, which leads me to wonder if this only applies to Americans.
LauraMediaMar 25, 2026, 11:10 PM
I actually did have to manually disable this from Germany, so it might be a different reason you don't have it?
LadyCailinMar 28, 2026, 11:11 AM
Dunno! I would have expected Germany and Norway to be the same.
spartanatreyuMar 25, 2026, 11:50 PM
I have the setting in Australia.

I'd be curious to see which countries are affected

sphMar 25, 2026, 7:34 PM
Thanks to Github and the AI apocalypse, all my software is now stored on a private git repository on my server.

Why would I even spend time choosing a copyleft license if any bot will use my code as training data for commercial applications? I'm not planning on creating any more open source code, and what projects of mine still have users will be left on GH for posterity.

If you're still serious about open source, it's time to move to Codeberg.

heavyset_goMar 26, 2026, 2:18 AM
Made the same choice, my open source projects with users are in maintenance mode or archived. New projects are released via SaaS, compiled artifacts or not at all.

I scratch my open source itch by contributing to existing language and OS projects where incremental change means eventually having to retrain models to get accurate inference :)

thesmartMar 25, 2026, 8:34 PM
Yeah, I'm guessing that's probably because in their TOS you grant them some license workaround for running the service, which can mean anything.
midaszMar 25, 2026, 9:03 PM
I'm in my happy space, self-hosting Forgejo with a runner on my own hardware.
diathMar 25, 2026, 7:44 PM
> This approach aligns with established industry practices

"others are doing it too so it's ok"

theshrike79Mar 25, 2026, 8:13 PM
Ackshually Anthropic is opt-in AND they give you discounts if you enable it
stingraycharlesMar 26, 2026, 4:29 AM
It’s opt-out, not opt-in, at least for Claude Desktop and Claude Code, unless you use the API.
nodar86Mar 25, 2026, 9:18 PM
What kind of discounts? I have never heard of this
cmaMar 25, 2026, 8:53 PM
Anthropic puts up random prompts defaulting to enabled to trick you into accidentally enabling.
DeukhoofdMar 25, 2026, 7:37 PM
So basically they want to retain everyone's full codebases?

> The data used in this program may be shared with GitHub affiliates, which are companies in our corporate family including Microsoft

So every Microsoft owned company will have access to all data Copilot wants to store?

hotenMar 25, 2026, 7:47 PM
Why is there no option to cancel a Copilot subscription here? Docs say there should be...

Mobile

https://github.com/settings/billing/licensing

EDIT:

https://docs.github.com/en/copilot/how-tos/manage-your-accou...

> If you have been granted a free access to Copilot as a verified student, teacher, or maintainer of a popular open source project, you won’t be able to cancel your plan.

Oh. jeez.

hmate9Mar 25, 2026, 7:46 PM
For what it's worth, they're not trying to hide this change at all; they are very upfront about it and made it quite simple to opt out.
matltcMar 25, 2026, 8:10 PM
They didn't even link the setting in their email. They didn't even name it specifically, just vaguely gestured toward it. Dark patterns, but that's Microslop for ya
hmate9Mar 25, 2026, 8:39 PM
Going to GitHub, I was greeted with a banner and a link directly to the settings for changing it.
w4yaiMar 26, 2026, 2:46 AM
I've seen worse dark patterns, to be honest... I don't think they're being malicious here.
ncr100Mar 26, 2026, 6:21 AM
They do not make it very simple to opt out. That is false.

On Android, for instance, I invite you to use the GitHub app and try to modify your opt-in or opt-out settings... You will find that nothing works on the settings page, once you actually find it after digging through a couple of layers and scrolling about 2 ft.

badthingfactoryMar 26, 2026, 12:22 AM
I appreciated the notification at the top of the screen because it prompted me to disable every single copilot feature I possibly could from my account. I also appreciated Microsoft for making Windows 11 horrible so I could fall back in love with Linux again.
_pdp_Mar 25, 2026, 8:34 PM
Microsoft doing dumb things once again.

Who in their right mind will opt into sharing their code for training? Absolutely nobody. This is just a dark pattern.

Btw, even if disabled, I have zero confidence they are not already training on our data.

I would also recommend sprinkling copyright notices all over the place and changing the license of every file, just in case they have some sanity checks before your data gets consumed - just to be sure.

TZubiriMar 25, 2026, 7:42 PM
Two issues with this:

1. Vulnerabilities and secrets can be leaked to other users.
2. Intellectual property can also be leaked to other users.

Most smart clients won't opt out; they will just cut usage entirely.

matltcMar 25, 2026, 8:13 PM
That's me. Frankly, I'm looking at just uninstalling VSCode, because Copilot straight-up gets in the way of so much, and they've stopped even bothering with features that aren't related to it (with one exception, the native browser in v112, which, admittedly, is great)
nine_kMar 25, 2026, 9:27 PM
VSCode can be cleaned: https://github.com/VSCodium/vscodium

(I prefer Emacs anyway, but VSCode is a worthy tool.)

stefanos82Mar 26, 2026, 12:14 AM
Serious question: let's say I host my code on this platform, and that code is proprietary and for my various clients. Who can guarantee me that the AI won't replicate it to competitors who decide to create something similar to my product?
halfcatMar 26, 2026, 1:28 AM
If the code is ever visible to anyone else, you have no guarantee. If it’s actually valuable, you have to protect it the same way you’d protect a pile of gold bars.

What does “my code...for my clients” mean (is it yours or theirs)? If it’s theirs, let them house it and delegate access to you. If they want to risk it being, ahem...borrowed, that’s their business decision to make.

If it’s yours, you can host it yourself and maintain privacy, but the long tail risk of maintaining it is not as trivial as it seems on the surface. You need to have backups, encrypted, at different locations, geographically distant, so either you need physical security, or you’re using the cloud and need monitoring and alerting, and then need something to monitor the monitor.

It’s like life. Freedom means freedom from tyranny, not freedom from obligation. Choosing a community or living solo in the wilderness both come with different obligations. You can pay taxes (and hope you’re not getting screwed, too much), or you can fight off bears yourself, etc.

OtherShrezzingMar 25, 2026, 8:21 PM
It’s not clear to me how GitHub would enforce the “we don’t use enterprise repos” stuff alongside “we will use free tier copilot for training”.

A user can be a contributor to a private repository but not have the owning organisation's license to use Copilot. They can still use their personal free-tier Copilot on that repository.

How can enterprises be confident that their IP isn’t being absorbed into the GH models in that scenario?

martinwoodwardMar 25, 2026, 9:33 PM
We do not train on the contents from any paid organization’s repos, regardless of whether a user is working in that repo with a Copilot Free, Pro, or Pro+ subscription. If a user’s GitHub account is a member of or outside collaborator with a paid organization, we exclude their interaction data from model training.
8cvor6j844qw_d6Mar 26, 2026, 3:48 AM
For private repositories under a personal account, if the repo owner has opted out of model training but a collaborator has not, would the collaborator's Copilot interactions with that repo still be used for training?
lmcMar 26, 2026, 5:54 AM
Thank you for clarifying this.
danelskiMar 25, 2026, 11:53 PM
Quite simply, that's just a matter of corporate internal policy and its (lack of) enforcement. This problem is just a subset of the wider IP-breach issue, with some people happily feeding their work documents into the free tier of ChatGPT.
robeymMar 28, 2026, 12:21 PM
There are several settings in my account relating to Copilot that are locked/enabled with a shield and key icon next to it. Any idea how to disable these settings? It's on the same settings/copilot/features page.
TZubiriMar 25, 2026, 7:45 PM
If this doesn't sound bad enough, it's possible that Copilot is already enabled. As we know, these kinds of features are pushed to users instead of being asked for.

Maybe it's already active in our accounts and we don't realize it, so our code will be used to train the AI.

Now, we can't be sure whether this will happen or not, but a company like GitHub should be staying miles away from this kind of policy. I personally wouldn't use GitHub for private corporate repositories; only as a public web interface for public repos.

pizzafeelsrightMar 25, 2026, 8:12 PM
I am not certain this is that big of a deal outside of "making AI better".

At this point, is there any magic in software development?

If you have super-secret content, is a third party the best location for it?

danelskiMar 25, 2026, 11:56 PM
They've had ample access to the final output (our code), but they still hope that with enough data on HOW we work they can close the agentic gap and finally get those stinky, lazy humans who demand salaries out of the loop.
thesmartMar 25, 2026, 8:35 PM
How about "no." You may be okay giving away your individual rights, including to copyright, but I am not.
rectangMar 25, 2026, 9:25 PM
I just checked my Github settings, and found that sharing my data was "enabled".

This setting does not represent my wishes and I definitely would not have set it that way on purpose. It was either defaulted that way, or when the option was presented to me I configured it the opposite of how I intended.

Fortunately, none of the work I do these days with Copilot enabled is sensitive (if it was I would have been much more paranoid).

I'm in the USA and pay for Copilot as an individual.

Shit like this is why I pay for duck.ai where the main selling point is that the product is private by default.

liquid_thymeMar 25, 2026, 8:47 PM
They use data from the poor student tier, but arguably, large corporates and businesses hiring talented devs are going to create higher quality training data. Just looking at it logically, not that I like any of this...
dartfMar 28, 2026, 2:47 PM
I don't see an option to opt out? Is it a US-only thing?
etothetMar 26, 2026, 12:06 PM
The fact that this is on by default, especially for paid accounts, and even more so for organizations, where certain types of privacy are sometimes mandated by the industry your business is in, is ridiculous.

There should also be a much easier one-click way to opt out, without having to scroll way down the settings page.

cebertMar 25, 2026, 9:59 PM
I wish GitHub would focus on making their service reliable instead of Copilot and opting folks into their data being stolen for training.
david_allisonMar 25, 2026, 9:49 PM
I have GitHub Copilot Pro. I don't believe I signed up for it. I neither use it nor want it.

1. A lot of settings are 'Enabled' with no option to opt out. What can I do?

2. How do I opt out of data collection? I see the message informing me to opt out, but 'Allow GitHub to use my data for AI model training' is already disabled for my account.

martinwoodwardMar 25, 2026, 10:13 PM
Hey David - if you want to send me (martinwoodward at github.com) details of your GitHub account I can take a look. At a guess I suspect you are one of the many folks who qualified for GitHub Copilot Pro for free as a maintainer of a popular open source project.

Sounds like you are already opted out because you'd previously opted out of the setting allowing GitHub to collect this data for product improvements. But I can check that.

Note, it's only _usage_ data from when you're using Copilot that is trained on. Therefore if you are not using Copilot, there is no usage data. We do not train on private data at rest in your repos, etc.

david_allisonMar 25, 2026, 10:31 PM
Cheers!
ncr100Mar 26, 2026, 6:19 AM
On my Android phone I was able to change the setting using Firefox by logging into GitHub and not allowing it to launch the GitHub app.

I was unable to change the setting when I used the GitHub app to open the web page in a container... button clicks weren't working. Quite frustrating.

jmhammondMar 26, 2026, 12:23 AM
Mine was defaulted to disabled. I’m on the Education pro plan (academic), so maybe that’s different than personal?
OtherShrezzingMar 25, 2026, 8:23 PM
So, how does this work with source-available code that's still licensed as proprietary, or released under a license which requires attribution?

If someone takes that code and pokes around in it with a free-tier Copilot account, GitHub will just absorb it into their model, even if it's explicitly against that code's license to do so?

danelskiMar 25, 2026, 11:59 PM
Most new cultural and website content is under full copyright. How much of an obstacle was that for these companies?
thesmartMar 25, 2026, 8:31 PM
I'm ready to abandon GitHub. Enshittification of the world's source infrastructure is just a matter of time.
phendrenad2Mar 26, 2026, 2:20 AM
So I do all the work of thinking about how to do something, and as soon as I tell Copilot about it, now it's in the training data, and anyone can ask the LLM and it'll tell them the solution I came up with? Great. I'm going to cancel.
sbinneeMar 26, 2026, 12:59 AM
Bold move. Who uses Copilot these days? Unless they have free credit I mean.
rvzMar 25, 2026, 7:43 PM
> From April 24 onward, interaction data—specifically inputs, outputs, code snippets, and associated context—from Copilot Free, Pro, and Pro+ users will be used to train and improve our AI models unless they opt out.

Now is the time to move off GitHub and consider Codeberg or self-hosting, like I said before. [0]

[0] https://news.ycombinator.com/item?id=22867803
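For anyone weighing the Codeberg or self-hosting route, moving a repo's full history is essentially one `git push --mirror`. A minimal sketch, using a local bare repo as a stand-in for the destination (the `codeberg.git` path and the `alice`/`project` names are illustrative; on a real migration you'd first create the empty repo on Codeberg and use its clone URL):

```shell
set -e
tmp=$(mktemp -d)
cd "$tmp"

# Stand-in for the destination remote (on Codeberg you'd create
# the empty repository through the web UI first).
git init -q --bare codeberg.git

# An existing working repository with some history.
git init -q project
cd project
git -c user.email=alice@example.com -c user.name=alice \
    commit --allow-empty -m "initial" -q
branch=$(git branch --show-current)

# Point a second remote at the new host; --mirror pushes every
# ref (branches, tags, notes) so nothing is left behind.
git remote add codeberg "$tmp/codeberg.git"
git push -q --mirror codeberg
```

After that, `git remote set-url origin <new-url>` in each working clone completes the switch; the old host can be kept as a read-only mirror or deleted.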

0x3fMar 25, 2026, 8:17 PM
Codeberg doesn't support non-OSS, and I'd rather just have one 'git' thing I have to know for both OSS and private work. So it's not a great option, IMO. Self-hosting is also out, for other reasons.

I'm not sure there are any good GitHub alternatives. I don't trust Gitlab either. Their landing page title currently starts with "Finally, AI". Eek.

eipi10_hnMar 25, 2026, 11:39 PM
Maybe sourcehut? https://sourcehut.org
0x3fMar 26, 2026, 8:43 AM
It's an option but I can't really take the platform seriously when the owner removes content based on his personal whims. He currently removes crypto projects because of their 'social ills'. I don't work on crypto, but he might start deleting AI projects for the same reason, say.
HeliodexMar 26, 2026, 12:17 AM
Finally. The option for me to enable Copilot data sharing has been locked as disabled for some time, so until now I couldn't even enable it if I wanted to.
indigodaddyMar 25, 2026, 7:41 PM
Checked and mine was already on disabled. Don't remember if I previously toggled it or not..
martinwoodwardMar 25, 2026, 7:58 PM
If you previously opted out of the setting allowing GitHub to collect data for product improvements, your preference has been retained here. We figured if you didn't want that, then you definitely wouldn't want this.
djmashko2Mar 25, 2026, 7:22 PM
> Content from your issues, discussions, or private repositories at rest. We use the phrase “at rest” deliberately because Copilot does process code from private repositories when you are actively using Copilot. This interaction data is required to run the service and could be used for model training unless you opt out.

Sounds like it's even likely to train on content from private repositories. This feels like a bit of an overstep to me.

mt42orMar 25, 2026, 7:27 PM
Is it legal? Surely not in any EU country.
okanatMar 25, 2026, 7:56 PM
Does it even matter? They trained AI on obviously copyrighted and even pirated content. If this change is legally significant and a breach, then the existence of all models and all AI businesses is also illegal.
0x3fMar 25, 2026, 8:13 PM
It might or might not be legal, but it seems materially worse to screw over your direct customers than to violate the social-contracty nature of copyright law. But hey ho if you're not paying then you're the product, as ever was.
mentalgearMar 25, 2026, 7:45 PM
Here's at least one instance where it was enabled in EU countries as well.
tuananhMar 26, 2026, 1:40 PM
Making this option enabled by default is a very shady choice, GitHub.
explodesMar 26, 2026, 6:04 AM
We all knew Microsoft was going to destroy GitHub eventually when it was first bought.

How much longer do you want to tolerate the enshittification? How much longer CAN you tolerate it?

marak830Mar 26, 2026, 12:53 AM
As it's enabled by default, does that mean everything has already been siphoned off, and now I'm just closing the gate after the animals have escaped?

Shit like this shouldn't be allowed.

greatgibMar 27, 2026, 12:16 PM
And something important that leaks through the phrasing of their blog post: it is not really "GitHub" that wants to suck up all your data ("prompts, code, context, documents", ...) but "Microsoft"!
semiinfinitelyMar 25, 2026, 8:27 PM
I'll be moving off GitHub now.
baobabKoodaaMar 25, 2026, 7:42 PM
(oops)
tech234aMar 25, 2026, 7:43 PM
It’s currently March
baobabKoodaaMar 25, 2026, 7:43 PM
Oops. Thank you for correcting me!
latand6Mar 25, 2026, 10:01 PM
Why wouldn't people want to make the models better? Aren't we all getting the benefit, after all?
danelskiMar 26, 2026, 12:01 AM
That's akin to being grateful to your local shop owner for letting you sweep the floor for the other customers.
latand6Mar 26, 2026, 7:45 AM
Please don't strawman me; I asked a completely different question.

It's not about being grateful or anything; it's that many people (devs) are too concerned about their code being stolen, as if they've come up with something unique and the LLMs were some kind of database (which they aren't).

At the end of the day we're going to be using AI to write all the code; many of us are already doing that. And if some GitHub Copilot model gets better, we get higher-quality code that is generally available for the next pretraining runs (for your models and others'). Some would even switch to Copilot if it's good.

What do you think about it?

sayamqaziMar 26, 2026, 6:37 PM
If something is mine by right, no matter how little or how much it's worth, no one should be allowed to force or trick me into donating it. It should just be my choice.
adi_kurianMar 26, 2026, 7:42 PM
People would have a different response if they did not perceive, in my view accurately, that the wool is being pulled over their eyes.