YouTube is tightening its control over deceptive AI-generated content by demonetizing some of its most popular fake movie trailer channels.
Among those targeted are Screen Culture and KH Studio, along with their offshoots Screen Trailers and Royal Trailer, all of which have built massive audiences by publishing AI-crafted trailers that blur the line between fiction and reality.
These channels cleverly splice together clips of real actors, familiar IP, and generative AI elements to produce trailers for movies that don’t actually exist. Combined, they have racked up billions of views and built subscriber bases in the millions.
In March, YouTube removed the primary channels from its Partner Program, effectively cutting off their ad revenue. More recently, their alternate accounts also got the axe.
YouTube cited violations of its monetization guidelines, which require content to be both original and significantly transformative, not just rehashed versions of copyrighted material made to chase views.
“If you borrow content from someone else, you need to change it significantly to make it your own,” YouTube emphasized in a statement.
Studios Claim Ad Revenue From Fake Trailers
But this crackdown revealed something even more surprising: some of the very studios whose intellectual property is being copied are quietly monetizing these fake trailers themselves.
A report by Deadline uncovered that Warner Bros. Discovery had claimed monetization rights on fake AI trailers for Superman and House of the Dragon, both produced by the creators behind Screen Culture.
Instead of issuing takedown requests, Warner Bros. Discovery and other studios, including Sony and Paramount, opted to pocket the ad revenue, raising ethical questions about fair use, copyright enforcement, and the industry’s stance on AI-generated content.
While there’s no evidence that studios are actively encouraging these videos, their willingness to profit from them suggests tacit approval, especially when takedown tools go unused.
The actors’ union SAG-AFTRA has expressed concern over the use of generative AI to replicate actors’ likenesses without their consent.
This practice, likened to the old Hollywood “fake Shemp” approach (where stand-ins replaced unavailable or deceased actors), sets a potentially troubling precedent by digitally mimicking performances without permission.
SAG-AFTRA has called for clearer protections for actors against digital impersonation, especially when the intent is to generate profit without involving the original performers.