YouTube caught making AI-edits to videos and adding misleading AI summaries

https://www.ynetnews.com/tech-and-digital/article/bj1qbwcklg

Comments

randycupertinoDec 6, 2025, 2:22 AM
A makeup influencer I follow noticed YouTube and Instagram are automatically adding filters to his face in his videos without permission. If his content is about lip makeup they make his lips enormous, and if it's about eye makeup the filters make his eyes gigantic. They seem to have AI detect the type of content and automatically apply filters.

https://www.instagram.com/reel/DO9MwTHCoR_/?igsh=MTZybml2NDB...

The screenshots/videos of them doing it are pretty wild, and it's insane that they're editing creators' uploads without consent!

AurornisDec 6, 2025, 3:15 AM
The video shown as evidence is full of compression artifacts. The influencer is non-technical and assumes it's an AI filter, but the output is obviously not good quality anywhere.

To me, this clearly looks like a case of a very high compression ratio with the motion blocks swimming around on screen. They might have some detail enhancement in the loop to try to overcome the blockiness which, in this case, results in the swimming effect.

It's strange to see these claims being taken at face value on a technical forum. It should be a dead giveaway that this is a compression issue because the entire video is obviously highly compressed and lacking detail.

reactordevDec 6, 2025, 2:30 AM
I can hear the ballpoint pens now…

This is going to be a huge legal fight, as the terms of service you agree to on their platform amount to “they get to do whatever they want” (IANAL). Watch them try to spin this as a “user preference” that everyone just happened to be opted into.

apiDec 6, 2025, 2:35 AM
That’s the rude awakening creators get on these platforms. If you’re a writer or an artist or a musician, you own your work by default. But if you upload it to these platforms, they own it more or less. It’s there in the terms of service.
sodapopcanDec 6, 2025, 2:43 AM
What if someone else uploads your work?
benoauDec 6, 2025, 3:13 AM
Section 230 immunity for doing whatever they want, as long as they remove it if you complain.
echelonDec 6, 2025, 2:39 AM
This is an experiment in data compression.
glitchcDec 6, 2025, 3:09 AM
Probably compression followed by regeneration during decompression. There's a brilliant technique called "Seam Carving" [1] invented almost two decades ago that enables content-aware resizing of photos and can be applied sequentially to frames in a video stream. It's used everywhere nowadays. It wouldn't surprise me if arbitrary enlargements are artifacts produced by such techniques.

[1] https://github.com/vivianhylee/seam-carving
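
The core of the algorithm is small enough to sketch. A rough numpy version (illustrative only, not the linked repo's code): compute an energy map, then repeatedly remove the lowest-energy vertical seam.

    # Minimal seam carving sketch for a grayscale image as a 2D numpy array.
    # Real implementations add color handling, forward energy, seam insertion, etc.
    import numpy as np

    def energy_map(img):
        # Simple gradient-magnitude energy: pixels on strong edges are "important".
        dy = np.abs(np.diff(img, axis=0, prepend=img[:1, :]))
        dx = np.abs(np.diff(img, axis=1, prepend=img[:, :1]))
        return dx + dy

    def find_vertical_seam(energy):
        # Dynamic programming: cumulative minimum energy from top to bottom.
        h, w = energy.shape
        cost = energy.copy()
        for i in range(1, h):
            left = np.roll(cost[i - 1], 1)
            right = np.roll(cost[i - 1], -1)
            left[0] = np.inf
            right[-1] = np.inf
            cost[i] += np.minimum(np.minimum(left, cost[i - 1]), right)
        # Backtrack the lowest-cost seam from the bottom row upward.
        seam = np.zeros(h, dtype=int)
        seam[-1] = int(np.argmin(cost[-1]))
        for i in range(h - 2, -1, -1):
            j = seam[i + 1]
            lo, hi = max(0, j - 1), min(w, j + 2)
            seam[i] = lo + int(np.argmin(cost[i, lo:hi]))
        return seam

    def remove_vertical_seam(img, seam):
        # Delete one pixel per row, shrinking the image width by 1.
        h, w = img.shape
        mask = np.ones((h, w), dtype=bool)
        mask[np.arange(h), seam] = False
        return img[mask].reshape(h, w - 1)

    img = np.random.rand(64, 64)  # stand-in for a real frame
    img = remove_vertical_seam(img, find_vertical_seam(energy_map(img)))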

jazzyjacksonDec 6, 2025, 2:43 AM
Totally. Unfortunately it's not lossless and instead of just getting pixelated it's changing the size of body parts lol
JumpCrisscrossDec 6, 2025, 3:37 AM
> This is an experiment

A legal experiment for sure. Hope everyone involved can clear their schedules for hearings in multiple jurisdictions for a few years.

echelonDec 6, 2025, 3:45 AM
As soon as people start paying Google for the 30,000 hours of video uploaded every hour (2022 figure), they can dictate what forms of compression and lossiness Google uses to save money.

That doesn't include all of the transcoding and alternate formats stored, either.
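
A rough back-of-envelope, using my own assumed numbers for average bitrate and stored copies (not Google's):

    # Why storage is a real cost at YouTube scale, using the 30,000 hours/hour
    # figure above. Bitrate and copy count are assumptions for illustration only.
    HOURS_UPLOADED_PER_HOUR = 30_000   # hours of video uploaded per wall-clock hour (2022 figure)
    AVG_BITRATE_MBPS = 5               # assumed average stored bitrate
    STORED_COPIES = 5                  # assumed number of transcodes/resolutions kept

    video_hours_per_year = HOURS_UPLOADED_PER_HOUR * 24 * 365
    video_seconds_per_year = video_hours_per_year * 3600

    originals_bytes = video_seconds_per_year * AVG_BITRATE_MBPS * 1e6 / 8
    total_bytes = originals_bytes * STORED_COPIES

    print(f"originals:   {originals_bytes / 1e18:.1f} exabytes per year")
    print(f"with copies: {total_bytes / 1e18:.1f} exabytes per year")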

People signing up to YouTube agree to Google's ToS.

Google doesn't even say they'll keep your videos. They reserve the right to delete them, transcode them, degrade them, use them in AI training, etc.

It's a free service.

jsheardDec 6, 2025, 2:53 AM
What type of compression would change the relative scale of elements within an image? None that I'm aware of, and these platforms can't really make up new video codecs on the spot since hardware accelerated decoding is so essential for performance.

Excessive smoothing can be explained by compression, sure, but that's not the issue being raised there.

AurornisDec 6, 2025, 3:19 AM
> What type of compression would change the relative scale of elements within an image?

Video compression operates on macroblocks and calculates motion vectors of those macroblocks between frames.

When you push it to the limit, the macroblocks can appear like they're swimming around on screen.

Some decoders attempt to smooth out the boundaries between macroblocks and restore sharpness.

The giveaway is that the entire video is extremely low quality. The compression ratio is extreme.
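
For anyone curious about the mechanism, here's a toy block-matching sketch in numpy. Real codecs add sub-pixel precision, variable block sizes and rate-distortion optimization; this only shows the basic idea of macroblocks plus motion vectors.

    # Toy block-matching motion estimation: for each 16x16 macroblock in the
    # current frame, find the best-matching block in the previous frame within a
    # small search window. At very low bitrates the residual after this step is
    # quantized away, which is what makes blocks appear to "swim".
    import numpy as np

    BLOCK, SEARCH = 16, 8  # macroblock size and search radius in pixels

    def motion_vectors(prev, curr):
        h, w = curr.shape
        vectors = {}
        for by in range(0, h - BLOCK + 1, BLOCK):
            for bx in range(0, w - BLOCK + 1, BLOCK):
                block = curr[by:by + BLOCK, bx:bx + BLOCK]
                best, best_sad = (0, 0), np.inf
                # Exhaustive search over candidate displacements (dy, dx).
                for dy in range(-SEARCH, SEARCH + 1):
                    for dx in range(-SEARCH, SEARCH + 1):
                        y, x = by + dy, bx + dx
                        if y < 0 or x < 0 or y + BLOCK > h or x + BLOCK > w:
                            continue
                        candidate = prev[y:y + BLOCK, x:x + BLOCK]
                        sad = np.abs(block - candidate).sum()  # sum of absolute differences
                        if sad < best_sad:
                            best, best_sad = (dy, dx), sad
                vectors[(by, bx)] = best
        return vectors

    prev = np.random.rand(64, 64)
    curr = np.roll(prev, shift=3, axis=1)  # fake a 3-pixel horizontal pan
    print(motion_vectors(prev, curr)[(16, 16)])
    # expect (0, -3): the block's content was 3 pixels to the left in the previous frame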

echelonDec 6, 2025, 3:04 AM
AI models are a form of compression.

Neural compression wouldn't be like HEVC, operating on frames and pixels. Rather, these techniques can encode entire features and optical flow, which can explain the larger discrepancies: larger fingers, slightly misplaced items, etc.

Neural compression techniques reshape the image itself.

If you've ever input an image into `gpt-image-1` and asked it to output it again, you'll notice that it's 95% similar, but entire features might move around or average out with the concept of what those items are.
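
If the concept is unfamiliar, here's a toy autoencoder round trip in PyTorch. It's purely illustrative (real neural codecs add hierarchical latents, entropy coding, and rate-distortion training), but it shows why reconstruction errors land on features rather than pixels:

    # A toy "neural compression" round trip: encode an image into a small latent
    # vector, reconstruct it from that latent. Errors are distributed over learned
    # features, not macroblocks, which is why artifacts can look like shifted or
    # resized details rather than block noise. Illustrative only.
    import torch
    import torch.nn as nn

    class TinyAutoencoder(nn.Module):
        def __init__(self, latent_dim=64):
            super().__init__()
            self.encoder = nn.Sequential(
                nn.Conv2d(3, 16, kernel_size=4, stride=2, padding=1),   # 64 -> 32
                nn.ReLU(),
                nn.Conv2d(16, 32, kernel_size=4, stride=2, padding=1),  # 32 -> 16
                nn.ReLU(),
                nn.Flatten(),
                nn.Linear(32 * 16 * 16, latent_dim),  # the "compressed" representation
            )
            self.decoder = nn.Sequential(
                nn.Linear(latent_dim, 32 * 16 * 16),
                nn.ReLU(),
                nn.Unflatten(1, (32, 16, 16)),
                nn.ConvTranspose2d(32, 16, kernel_size=4, stride=2, padding=1),  # 16 -> 32
                nn.ReLU(),
                nn.ConvTranspose2d(16, 3, kernel_size=4, stride=2, padding=1),   # 32 -> 64
                nn.Sigmoid(),
            )

        def forward(self, x):
            return self.decoder(self.encoder(x))

    model = TinyAutoencoder()
    frame = torch.rand(1, 3, 64, 64)   # stand-in for a 64x64 RGB frame
    reconstruction = model(frame)      # lossy round trip through the bottleneck
    print(reconstruction.shape)        # torch.Size([1, 3, 64, 64])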

jsheardDec 6, 2025, 3:09 AM
Maybe such a thing could exist in the future, but I don't think the idea that YouTube is already serving a secret neural video codec to clients is very plausible. There would be much clearer signs - dramatically higher CPU usage, and tools like yt-dlp running into bizarre undocumented streams that nothing is able to play.
planckscnstDec 6, 2025, 3:36 AM
If they were using this compression for storage on the cache layer, it could allow them to keep more videos closer to where they serve them, then decode back to WebM or whatever before sending them to the client.

I don't think that's actually what's up, but I don't think it's completely ruled out either.

jsheardDec 6, 2025, 3:40 AM
That doesn't sound worth it. Storage is cheap and encoding videos is expensive; caching videos in a more compact form but having to re-encode them into a different codec every single time they're requested would be ungodly expensive.
echelonDec 6, 2025, 3:30 AM
A new client-facing encoding scheme would break utilization of hardware decoders, which in turn slows down everyone's experience, chews through battery life, etc. They won't serve it that way; there's no support in the field for it.

It looks like they're compressing the data before it gets further processed with the traditional suite of video codecs. They're relying on the traditional codecs to serve, but running some internal first pass to further compress the data they have to store.

plorgDec 6, 2025, 2:48 AM
If any engineers think that's what they're doing they should be fired. More likely it's product managers who barely know what's going on in their departments except that there's a word "AI" pinging around that's good for their KPIs and keeps them from getting fired.
echelonDec 6, 2025, 2:52 AM
> If any engineers think that's what they're doing they should be fired.

Seriously?

Then why is nobody in this thread suggesting what they're actually doing?

Everyone is accusing YouTube of "AI"ing the content with "AI".

What does that even mean?

Look at these people making these (at face value - hilarious, almost "Kool-Aid" levels of conspiratorial thinking) accusations. All because "AI" is "evil" and "big corp" is "evil".

Use Occam's razor. Videos are expensive to store. Google gets 20 million videos a day.

I'm frankly shocked Google hasn't started deleting old garbage. They probably should start culling YouTube of cruft nobody watches.

asveikauDec 6, 2025, 2:57 AM
Videos are expensive to store, but generative AI is expensive to run. That will cost them more than the storage allegedly saved.

To solve the problem of adding compute-heavy processing to video serving, they would need to cache the output of the AI, which uses up the storage you say they're saving.

echelonDec 6, 2025, 3:08 AM
https://c3-neural-compression.github.io/

Google has already matched H.266. And this was over a year ago.

They've probably developed some really good models for this and are silently testing how people perceive them.

hatmanstackDec 6, 2025, 3:00 AM
If you want insight into why they haven't deleted "old garbage" you might try, The Age of Surveillance Capitalism by Zuboff. Pretty enlightening.
echelonDec 6, 2025, 3:06 AM
I'm pretty sure those 12-year-olds uploading 24-hour-long Sonic YouTube poops aren't creating value.
GroxxDec 6, 2025, 3:23 AM
I largely agree, I think that probably is all that it is. And it looks like shit.

Though there is a LOT of room to subtly train many kinds of lossy compression systems, which COULD still imply they're doing this intentionally. And it looks like shit.

adzmDec 6, 2025, 2:27 AM
This is ridiculous
TazeTSchnitzelDec 6, 2025, 2:34 AM
The AI filter applied server-side to YouTube Shorts (and only Shorts, not regular videos) is horrible, and it feels like a case of deliberately boiling the frog. If everyone gets used to overly smooth skin, weirdly pronounced wrinkles, waxy hair, and strange ringing around moving objects, then AI-generated content will stand out less when they start injecting it into the feed. At first I thought this must be some client-side upscaling filter, but tragically it is not. There's no data savings at all, and there's no way for uploaders or viewers to turn it off. I guess I wasn't cynical enough.
apiDec 6, 2025, 2:39 AM
I’ve been saying for a while that the end game for addictive short form chum feeds like TikTok and YouTube Shorts is to drop human creators entirely. They’ll be AI generated slop feeds that people will scroll, and scroll, and scroll. Basically just a never ending feed of brain rot and ads.
coliveiraDec 6, 2025, 3:28 AM
There's already a huge number of AI-generated channels on YouTube. The only difference is that they're uploaded by channel owners. What's going to happen very quickly (if it isn't happening already) is that YouTube itself will start "testing" AI content that it creates on what will look like new channels. In a matter of a few years they'll promote this "content" until it occupies most of the time and views on the platform.
eagleinparadiseDec 6, 2025, 3:17 AM
I buy into this conspiracy theory; it's genius. It's literally a boiling-the-frog kind of strategy against users. Eventually, everyone will get too lazy to go through the mental work of judging each increasingly convincing piece of content with "is this AI?", spending energy trying to find clues.

And over time the AI content will improve enough where it becomes impossible and then the Great AI Swappening will occur.

add-sub-mul-divDec 6, 2025, 3:43 AM
Perhaps the shorter/dumber the medium and format, the less discerning an audience it attracts. We're seeing a split between people who reject the idea of content without the subtext of the human creation behind it, and people who just take content for what it is on the surface without knowing why it should matter how it was created.
bitwizeDec 6, 2025, 2:47 AM
Yes, but what happens when the AIs themselves begin to brainrot (as happens when they are not fed their usual sustenance of information from humans and the real world)?
apiDec 6, 2025, 3:03 AM
Have you seen what people watch on these things? It won’t matter. In fact, the surreal incoherent schizo stuff can work well for engagement.
AnimatsDec 6, 2025, 3:40 AM
I'm seeing YouTube summary pictures which seem to be AI-generated. I was looking at [1], which is someone in China rebuilding old machines, and some of the newer summary pictures are not frames from the video. They show machines that are the sort of thing you might get by asking a Stable Diffusion-type generator to produce a picture from the description.

[1] https://www.youtube.com/@linguoermechanic

chao-Dec 6, 2025, 2:25 AM
I learned to ignore the AI summaries after the first time I saw one that described the exact OPPOSITE conclusion/stance of the video it purported to summarize.
TrasmattaDec 6, 2025, 3:31 AM
Also I just absolutely hate the tone of them. So obviously AI, and they all have the same structure, ending in "Prepare for a journey through blah blah blah".
windexDec 6, 2025, 2:46 AM
There are entire fake persona videos these days. Leading scientists, economists, politicians, tech guys, are being impersonated wholesale on youtube.
acomjeanDec 6, 2025, 3:18 AM
I saw this today, where "influencers" were taking footage of real doctors from videos and using AI to have them pitch products.

https://www.theguardian.com/society/2025/dec/05/ai-deepfakes...

AmbroseBierceDec 6, 2025, 2:48 AM
Speaking of AI, Google, and shady tactics, I wouldn't be surprised if we soon discover they are purposefully adding video glitches (deformed characters and so on) in the first handful of iterations when using Veo video generation, just so people get used to trying 3 or 4 times before they receive a good one.
VTimofeenkoDec 6, 2025, 2:55 AM
Well the current models that cost per output sure love wasting those tokens on telling me how I am the greatest human being ever that only asks questions which get to the very heart of $SUBJECT.
AmbroseBierceDec 6, 2025, 3:27 AM
You are right! Would you like me to pretend I'm able to generate better responses if you just give me more input, while actually just wasting your time and money? With some luck, when you inevitably end up frustrated, you'll conclude that it was your fault for not giving me good enough input, and not mine for being unable to generate good output; in other words, that you just need to get better at GPTing.
delichonDec 6, 2025, 3:08 AM
YouTube should keep their grubby hands off. And give that capability to us instead. I want the power to do personal AI edits built in. Give me a prompt line under each video. Like "replace English with Gaelic", "replace dad jokes with lorem ipsum", "make the narrator's face 25% more symmetrical", "replace the puppy with a xenomorph", "change the setting to Monument Valley", etc.
someothherguyyDec 6, 2025, 3:14 AM
I wonder how many years (decades?) out this still is. It would be wild to be able to run something like that locally in a browser. Although it will probably be punishable by death by then.
data-ottawaDec 6, 2025, 2:25 AM
Are these AI filters, or just heavy compression/recompression with new algorithms (which can look like smoothing out details)?

edit: here's the effect I'm talking about with lossy compression and adaptive quantization: https://cloudinary.com/blog/what_to_focus_on_in_image_compre...

The result is smoothing of skin, and applied heavily to video (as YouTube does; just look at any old video that was HD years ago) it would look this way.
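
Here's a tiny sketch of the mechanism (illustrative only, not any particular codec): transform an 8x8 block with a DCT, quantize the coefficients coarsely, and the fine texture disappears.

    # Why heavy lossy compression "smooths" detail: JPEG/video codecs transform
    # 8x8 blocks with a DCT and quantize the coefficients; large quantization
    # steps zero out the high-frequency terms that carry skin texture.
    # Illustrative only; real codecs use perceptual quantization matrices.
    import numpy as np
    from scipy.fft import dctn, idctn

    def compress_block(block, q_step):
        coeffs = dctn(block, norm="ortho")               # 8x8 frequency coefficients
        quantized = np.round(coeffs / q_step) * q_step   # coarse quantization
        return idctn(quantized, norm="ortho")            # reconstruct the block

    rng = np.random.default_rng(0)
    block = 0.5 + 0.05 * rng.standard_normal((8, 8))     # flat patch with fine "texture"

    light = compress_block(block, q_step=0.01)
    heavy = compress_block(block, q_step=0.5)

    print(np.std(block), np.std(light), np.std(heavy))
    # The heavily quantized block loses almost all of its texture (std near 0),
    # i.e. it comes back smoothed.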

AurornisDec 6, 2025, 3:27 AM
It's compression artifacts. They might be heavily compressing video and trying to recover detail on the client side.
randycupertinoDec 6, 2025, 2:26 AM
It's filters, I posted an example of it below. Here is a link: https://www.instagram.com/reel/DO9MwTHCoR_/?igsh=MTZybml2NDB...
data-ottawaDec 6, 2025, 2:47 AM
It's very hard to tell in that instagram video, it would be a lot clearer if someone overlaid the original unaltered video and the one viewers on YouTube are seeing.

That would presumably be an easy smoking gun for some content creator to produce.

There are heavy alterations in that link, but not having seen the original, and in this format, it's not clear to me how they compare.

randycupertinoDec 6, 2025, 2:59 AM
You can literally see the filters turn on and off, making his eyes and lips bigger as he moves his face. It's clearly a face filter.
diputsmonroDec 6, 2025, 3:16 AM
To be extra clear for others, keep watching until about the middle of the video where he shows clips from the YouTube videos
jeffbeeDec 6, 2025, 3:26 AM
What would "unaltered video" even mean.
ares623Dec 6, 2025, 2:30 AM
The time of giving these corps the benefit of the doubt is over.
echelonDec 6, 2025, 2:44 AM
The examples shown in the links are not filters for aesthetics. These are clearly experiments in data compression.

These people are on a moral crusade against an unannounced Google data compression test, thinking Google is using AI to "enhance their videos". (Did they ever stop to ask themselves why, or to what end?)

This level of AI paranoia is getting annoying. This is clearly just Google trying to save money, not undermine reality or whatever vague Orwellian thing they're being accused of.

skygazerDec 6, 2025, 3:05 AM
"My, what big eyes you have, Grandmother." "All the better to compress you with, my dear."
randycupertinoDec 6, 2025, 2:55 AM
Why would data compression make his eyes bigger?
echelonDec 6, 2025, 3:14 AM
Because it's a neural technique, not one based on pixels or frames.

https://blog.metaphysic.ai/what-is-neural-compression/

Instead of artifacts in pixels, you'll see artifacts in larger features.

https://arxiv.org/abs/2412.11379

Look at figure 5 and beyond.

brailsafeDec 6, 2025, 3:14 AM
Whatever the purpose, it's clearly surreptitious.

> This level of AI paranoia is getting annoying.

Let's be straight here: AI paranoia is near the top of the most propagated subjects across all media right now, probably for the worse. If it's not "Will you ever have a job again!?" it's "Will your grandparents be robbed of their net worth!?" or even just "When will the bubble pop!? Should you be afraid!? YES!!!" And in places like Canada, where the economy is predictably crashing because of decades of failures, it's presented as both the cause of and the answer to macroeconomic decline. Ironically/suspiciously, it's all the same rehashed, redundant takes from everyone from Hank Green to CNBC to every podcast ever, late-night shows, radio, everything.

So to me the target of one's annoyance should be the propaganda machine, not the targets of the machine. What are people supposed to feel, totally chill because they have tons of control?

kaveh_hDec 6, 2025, 3:51 AM
Most of the SV billionaires have become crazy high on the pursuit of profit. Their ilk have hacked democracy and deliberately made many of us prisoners of a famous hotel few can check out from.

JD Vance's demand that European leaders bow down to daddy Thiel and friends at the Munich Security Conference shows just how much the rot has spread. That signal and these missteps should hopefully wake us up before we reach societal collapse.

JumpCrisscrossDec 6, 2025, 3:33 AM
FYI, I used to pay for YouTube Premium and have since stopped doing that. Deleting the app and letting ad blockers filter out this nonsense is a superior experience.

Strongly recommend. We’ll get local AIs that can skip the cruft soon enough anyway.

alex1138Dec 6, 2025, 3:52 AM
Someone in this comment section mentioned Insta is doing this too.

I just completely despair. What the fuck happened to the internet? Absolutely none of these CEOs give a shit. People need to face real punishments

muppetmanDec 6, 2025, 2:52 AM
They're heating the garbage slightly before serving it? Oh no.
superkuhDec 6, 2025, 2:25 AM
The citation chain for these Mastodon reposts resolves to the Gamers Nexus piece on YouTube: https://www.youtube.com/watch?v=MrwJgDHJJoE
TheTaytayDec 6, 2025, 3:23 AM
Yes! Thank you! He is talking about AI generated summaries being inaccurate, which is plenty to get up in arms about.

A lot of folks here hate AI and YouTube and Google and stuff, but it would be more productive to hate them for what they are actually doing.

But most people here are just taking this headline at face value and getting the pitchforks out. If you try to watch the makeup guy's proof, it's talking about Instagram (not YouTube), doesn't have clean comparisons, and is showing a video someone sent back to him, which probably means it's a compression artifact, not a face filter that the corporate overlords are hiding from the creator. It is not exactly a smoking gun, especially for a technical crowd.

jeffbeeDec 6, 2025, 3:29 AM
I, for one, find it extremely odd that any of these video posters believe they get to control whether or not I use, directly or indirectly, an AI to summarize the video for me.
stevenaloweDec 6, 2025, 2:31 AM
Every YT short looks AI-ified and creepy now
koolbaDec 6, 2025, 2:46 AM
What’s the point of doing this?

I don't understand the justification for the expense or complexity.

coliveiraDec 6, 2025, 3:31 AM
Every engineer at Google is now measured on how much AI they use in their products. This is the predictable result.
choiliveDec 6, 2025, 2:49 AM
What PM thought this was a good idea? This has to be the result of some braindead "we need more AI in the product" mandate.
jeeebDec 6, 2025, 2:35 AM
I really hate all the AI filters in videos. They make everyone look like fake humans. I find it hard to believe that anyone would actually prefer this.
halaproDec 6, 2025, 3:32 AM
I don't find it hard to believe at all. Have you seen all the "240fps TVs" being sold for the past 15 years? They all apply some weird fake smoothing and people prefer them.
ChrisArchitectDec 6, 2025, 3:27 AM
Being driven mad by conspiracy paranoia about 'face filters' (possible compression artifacts) is a great example of being AI-pilled.

And then the discourse is so riddled with misnomers and baited outrage that it goes nowhere.

The other example in the submitted post isn't 'edits to videos' but rather the automated text descriptions/summaries. The Gemini/AI engine not being very good at summarizing is a different issue.

AurornisDec 6, 2025, 3:24 AM
This link is to a Mastodon thread which links to another blog post which links to an actual source on ynetnews.com which quotes another article that has a quote from a YouTube rep. Save yourself the trouble and go straight to that article (although it's not great either): https://www.ynetnews.com/tech-and-digital/article/bj1qbwcklg

The key section:

> Rene Ritchie, YouTube’s creator liaison, acknowledged in a post on X that the company was running “a small experiment on select Shorts, using traditional machine learning to clarify, reduce noise and improve overall video clarity—similar to what modern smartphones do when shooting video.”

So the "AI edits" are just a compression algorithm that is not that great.

kridsdale1Dec 6, 2025, 3:51 AM
THAT Rene Ritchie? Cool, I wondered what happened to him. I listened to his podcast all the time in the 2000s, when podcasts were synced to an iPod over USB before your commute.
filleduchaosDec 6, 2025, 3:29 AM
"Clarify, reduce noise, and improve overall video clarity" is not "just a compression algorithm", what? Words have meanings.
BorealidDec 6, 2025, 3:44 AM
Noise is, because of its random nature, inherently less compressible than a predictable signal.

So counterintuitively, noise reduction improves compression ratios. In fact many video codecs are about determining which portion of the video IS noise that can be discarded, and which bits are visually important...
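
It's easy to demonstrate with a general-purpose compressor standing in for a codec:

    # The same smooth signal with and without added noise, compressed with zlib.
    # zlib isn't a video codec, but the principle is the same: random noise
    # carries no exploitable structure, so it eats bits.
    import zlib
    import numpy as np

    rng = np.random.default_rng(0)
    t = np.linspace(0, 10, 100_000)

    clean = np.sin(t)                                  # predictable signal
    noisy = clean + 0.1 * rng.standard_normal(t.size)  # same signal plus noise

    def compressed_size(x):
        # Quantize to 8-bit samples, as a stand-in for pixel values, then deflate.
        samples = np.clip((x + 1.5) / 3.0 * 255, 0, 255).astype(np.uint8)
        return len(zlib.compress(samples.tobytes(), level=9))

    print("clean:", compressed_size(clean), "bytes")
    print("noisy:", compressed_size(noisy), "bytes")
    # The noisy version compresses far worse, which is why denoising before
    # encoding can save bits.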

seanmcdirmidDec 6, 2025, 3:40 AM
“a small experiment on select Shorts, using traditional machine learning to clarify, reduce noise and improve overall video clarity—similar to what modern smartphones do when shooting video.”

It looks like quality cleanup, but I can’t imagine many creators aren’t using decent camera tech and editing software for shorts.

Forgeties79Dec 6, 2025, 3:37 AM
Compression literally makes all those things worse
somnicDec 6, 2025, 3:47 AM
YouTube is not hosting and serving uncompressed video, so the apt comparison is not "compression" vs. "no compression" but rather "fancy experimental compression" vs. "tried and true compression."
SilverElfinDec 6, 2025, 2:46 AM
I’ve also noticed YouTube has unbanned many channels that were previously banned for overt supremacist and racist content. They get amplified a lot more now. Between that and AI slop, I feel like Google is speed running the changes X made over the last few years.
MaxL93Dec 6, 2025, 2:31 AM
"Making AI edits to videos" strikes me as as bit of an exaggeration; it might lead you to think they're actually editing videos rather than simply... post-processing them[1].

That being said, I don't believe they should be doing anything like this without the creator's explicit consent. I do personally think there's probably a good use case for machine learning / neural network tech applied to the clean up of low-quality sources (for better transcoding that doesn't accumulate errors & therefore wastes bitrate), in the same way that RTX Video Super Resolution can do some impressive deblocking & upscaling magic[2] on Windows. But clearly they are completely missing the mark with whatever experiment they were running there.

[1] https://www.ynetnews.com/tech-and-digital/article/bj1qbwcklg

[2] compare https://i.imgur.com/U6vzssS.png & https://i.imgur.com/x63o8WQ.jpeg (upscaled 360p)

ssl-3Dec 6, 2025, 2:36 AM
Please allow me "post-process" your comment a bit. Let me know if I'm doing this right.

> "Making AI edits to videos" strikes me as something particularly egregious; it leads a viewer to see a reality that never existed, and that the creator never intended.

randycupertinoDec 6, 2025, 2:37 AM
It's not post-processing; they are applying actual filters. Here is an example where they make his eyes and lips bigger: https://www.instagram.com/reel/DO9MwTHCoR_/?igsh=MTZybml2NDB...
MaxL93Dec 6, 2025, 2:45 AM
Sure, but that's not YouTube. That's Instagram. He says so at 1:30.

YouTube is not applying any "face filters" or anything of the sort. They did, however, experiment with AI upscaling of the entire image, which gives the classic "bad upscale" smeary look.

Like I said, I think that's still bad and they should have never done it without the clear explicit consent of the creator. But that is, IMO, very different and considerably less bad than changing someone's face specifically.

randycupertinoDec 6, 2025, 2:48 AM
His followers also added screenshots of YouTube Shorts doing it. He says he reached out to both platforms, will be reporting back with an update from their customer service, and is doing some compare-and-contrast testing for his audience.

Here are some other creators also talking about it happening in YouTube Shorts: https://www.reddit.com/r/BeautyGuruChatter/comments/1notyzo/...

another example: https://www.youtube.com/watch?v=tjnQ-s7LW-g

https://www.reddit.com/r/youtube/comments/1mw0tuz/youtube_is...

https://www.bbc.com/future/article/20250822-youtube-is-using...

MaxL93Dec 6, 2025, 2:53 AM
> Here's some other creators also talking about it happening in youtube shorts (...)

If you open the context of the comment, they are specifically talking about the bad whole-image upscaling that gives the picture the oily, smeary look. NOT face filters.

EDIT: same thing with the two other links you edited into your comment while I was typing my reply.

Again, I'm not defending YouTube for this. But I also don't think they should be accused of doing something they're not doing. Face filters without consent are a far, far worse offense than bad upscaling.

I would like to urge you to be more cautious, and to actually read what you brandish as proof.