
YouTube is working with music labels to figure out its strategy for Artificial Intelligence

The Verge: https://www.theverge.com/2023/8/21/23840026/youtube-ai-music-copyright-monetization-universal

The YouTube Story: Digital Rights and Artificial Intelligence in Music and Streaming, from Drake to Content ID

It is clear that this is a fine solution for YouTube, which does not want to lose its music licenses in a decade-long legal fight over fair use and artificial intelligence. But it is a pretty shitty solution for the rest of us, who do not have the bargaining power of huge music labels to create bespoke platform-specific AI royalty schemes and who will probably get caught up in Content ID’s well-known false-positive error rates without any legal recourse at all.

YouTube hasn’t announced any details yet on what these new investments will be other than promising to balance rights protection with creator-led innovation, including figuring out ways to monetize AI-generated content.

What’s going to happen next is all very obvious: YouTube will attempt to expand Content ID to flag content featuring voices that sound like UMG artists, and UMG will be able to take those videos down or collect royalties on them. Along the way we will get glossy promo videos of Ryan Tedder asking an AI for a sad beat for a rainy day and someone else saying that artificial intelligence is amazing.

An AI-generated song, “Heart On My Sleeve,” went viral on TikTok in April, featuring vocals that sounded like Drake and The Weeknd. It soon made its way to streaming services, including YouTube. UMG, Drake’s label, issued a strongly worded statement saying AI-generated songs violate copyright law, and the song was eventually taken down.

There’s nothing more important than making sure Frank Sinatra’s estate — and his label, Universal Music Group — gets paid when people upload AI versions of Ol’ Blue Eyes singing “Get Low” to YouTube, right? Even if that means creating an entirely new class of extralegal contractual royalties for large music labels to protect their online dominance, while simultaneously insisting that scraping books and news websites to train AI without paying anyone is fair use? Right? Right?

This all led to a huge policy dilemma for Google, which, like every other AI company, is busily training its systems on data scraped from the internet. None of these companies are paying anyone for making copies of all that data, and as various copyright lawsuits proliferate, they have mostly fallen back on the idea that these copies are permissible fair use under Section 107 of the Copyright Act.

Streaming services such as Apple Music and Spotify, which control their entire catalogs, quickly complied. The problem was open platforms like YouTube, which don’t take user content down without a policy violation. And here, there wasn’t a clear policy violation: legally, voices are not copyrightable (although the individual songs used to train their AI doppelgangers are), and there is no federal law protecting likenesses — it’s all a mishmash of state laws. So UMG fell back on something simple: the track contained a sample of a Metro Boomin producer tag, which is copyrighted, allowing UMG to issue takedown requests to YouTube.

The thing is that “fair use” is 1) an affirmative defense to copyright infringement, which means you have to admit you made the copy in the first place, and 2) evaluated on a messy case-by-case basis in the courts, a slow and totally inconsistent process that often leads to really bad outcomes that screw up entire creative fields for decades.

This makes the music industry happy because the blanket licenses keep a constant stream of money flowing from YouTube, and it makes YouTube happy because it cannot compete with TikTok without expansive music rights; taking them off the table is a bad idea.

And the problems here aren’t hard to predict: right now, Content ID generally operates within the framework of intellectual property law. If you make a piece of music criticism that Content ID flags as infringing and you disagree, you can dispute the claim, engage in a lengthy back-and-forth with the claimant, and, if that doesn’t work out, YouTube politely suggests you head to court. (YouTubers generally do not do this, instead coming up with an ever-escalating series of workarounds to defeat overzealous Content ID flags, but that’s the idea.)

YouTube CEO Neal Mohan announced a new “YouTube Music AI Incubator” that convenes a group of UMG artists and producers, and said YouTube will be expanding its content moderation policies to cover the challenges of artificial intelligence. Sandwiched in between, the message is that the solution to the technology problem is more technology.

“AI can also be used to identify this sort of content, and we’ll continue to invest in the AI-powered technology that helps us protect our community of viewers, creators, artists and songwriters – from Content ID, to policies and detection and enforcement systems that keep our platform safe behind the scenes,” says Mohan. Yes.

First, lumping “copyright and trademark abuse” in with the “and more” of malicious deepfakes and AI-accelerated technical manipulation is actually pretty gross. The first, at worst, costs someone some revenue; the others have the potential to ruin lives and destabilize democracies.

How to Have It Both Ways With AI and Copyright: UMG’s Metro Boomin Takedown, and Where YouTube Draws the Line Between Drake and DeSantis

During UMG’s quarterly earnings call, Michael Nash, UMG’s executive VP of digital strategy, said the company was able to take the song down because it contained a sample from Metro Boomin.

“Generative AI that’s enabled by large language models, which trains on our intellectual property, violates copyright law in several ways,” he said. “Companies have to obtain permission and execute a license to use copyrighted content for AI training or other purposes, and we’re committed to maintaining these legal principles.” Emphasis mine.

In that case, it would be in everyone’s best interest to have a federal right to likenesses and voices. But what about Donald Trump impressions in an election year? Joe Biden impressions? Where will YouTube draw the line between AI Drake and AI Ron DeSantis? Regular ol’ DeSantis has never met a speech regulation he didn’t like; how will YouTube withstand the pressure to remove any DeSantis impression he requests a takedown for, after opening the door by removing AI Frank Sinatra? Are we ready for that, or are we just worried about losing the music licenses?

Source: Google and YouTube are trying to have it both ways with AI and copyright

The Future of Google: SGE, a Replacement for Robots.txt, and the Walled Gardens It Once Set Out to Disrupt

The last remaining source of traffic on the internet is Google, which is why so many websites are turning into artificial intelligence-written honeypots. The situation is bad and getting worse.

In the meantime, Google is also rolling out the Search Generative Experience (SGE), which answers search queries directly using AI — particularly lucrative queries about buying things. In fact, almost every SGE demo Google has ever given has ended in a transaction of some kind.

It’s a terrible deal for publishers, who aren’t really able to say no to search traffic. On the last earnings call, Pichai bluntly said of SGE, “Over time, this will just be how search works.”

A website could block Google’s crawlers in its robots.txt file — OpenAI, fresh from scraping every website in the world to build ChatGPT, just allowed its crawler to be blocked in this way — but blocking Google’s crawlers means deindexing your site from search, which is, bluntly, suicidal.
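The robots.txt mechanism described above is just a plain text file of per-crawler rules, and Python’s standard `urllib.robotparser` can evaluate it. A minimal sketch of the trade-off: the rules file below is hypothetical, though `GPTBot` is OpenAI’s documented crawler user agent and `Googlebot` is Google’s search crawler.

```python
from urllib import robotparser

# Hypothetical robots.txt: block OpenAI's GPTBot from the whole site,
# while leaving every other crawler (including Googlebot) free to index it.
rules = """\
User-agent: GPTBot
Disallow: /

User-agent: *
Allow: /
"""

rp = robotparser.RobotFileParser()
rp.modified()  # mark rules as freshly fetched so can_fetch() will answer
rp.parse(rules.splitlines())

print(rp.can_fetch("GPTBot", "https://example.com/article"))     # False
print(rp.can_fetch("Googlebot", "https://example.com/article"))  # True
```

The asymmetry the article describes is visible here: a site can carve out a rule for one AI crawler, but the only way to say no to Google’s training use (at the time of writing) was a blanket `Disallow` that also removed the site from search.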

This will all take a lot of time! And it’s important to Google to slow-roll it all while it can. For example, the company is thinking about creating a replacement for robots.txt that allows for more granular content controls, but… you know, Google also promised to remove third-party cookies from Chrome in January 2020 and has since pushed that date back yet again, to 2024. A leisurely web standards process that plays out while the fair use legal battle winds through the courts is just fine for Google, as long as no one can actually turn its crawlers off in the meantime.

And you know what? That future version of Google looks an awful lot like the present version of YouTube: a new kind of cable network where a flood of user content sits next to an array of lucrative licensing deals with TV networks, music labels, and sports leagues. If you squint, it’s exactly the kind of walled garden that upstarts like Google once set out to disrupt.
