A new era is upon us

Apple Music's new AI policy could help shape the landscape.

Hi,

A few weeks ago, I wrote about what I called the zombie music industry: a version of the industry where everything is run by AI. It generates the music, distributes it, recommends it, and bot accounts stream it, all without a single person being meaningfully involved.

Last week Apple Music introduced a new feature that could set the tone for how AI music is acknowledged by the entire music industry.

Let’s explore below.

This newsletter highlights:

  • Apple Music’s new Transparency Tags feature

  • The Vault

  • B-Sides

  • Industry spotlight

  • 10 music industry job opportunities

Let’s dive in ⬇️

Last week, Apple Music announced it's adding something called “Transparency Tags” to music on its platform. These tags are essentially metadata labels that flag when AI was used to create a track, its artwork, its lyrics, or its music video.

Labels and distributors will be able to tag releases in four categories:

  • AI-generated track

  • AI-generated composition

  • AI-generated artwork

  • AI-generated music video

It's the first time Apple has implemented something to tell listeners when specific content was created by AI.
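Conceptually, a tag like this is just another metadata field attached to a release, alongside genre and credits. Here's a minimal sketch of what that could look like; the field names are hypothetical, since Apple hasn't published a public schema:

```python
# Hypothetical release metadata with AI transparency tags.
# Field names are illustrative - not Apple's actual schema.
release = {
    "title": "Example Track",
    "artist": "Example Artist",
    "genre": "Pop",
    "ai_transparency": {
        "ai_generated_track": False,
        "ai_generated_composition": False,
        "ai_generated_artwork": True,  # e.g. cover art made with an image model
        "ai_generated_music_video": False,
    },
}

# A platform could surface a badge whenever any element is flagged.
uses_ai = any(release["ai_transparency"].values())
print(uses_ai)  # True
```

The point of structuring it per-element, rather than as one yes/no flag, is that a release can mix human and AI work (human-made track, AI-generated artwork) and the tags can say so.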

For the last few years, the conversation around AI music has been mostly reactive: lawsuits, takedowns, and social media posts about fake artists trying to game playlists. The industry has largely been playing defense.

This feels different because it signals that some of the most important platforms (and Apple is arguably one of the biggest companies in the world) are starting to treat AI disclosure the same way they treat genre tags and songwriter credits.

Apple's own statement from their announcement says it plainly:

"Proper tagging of content is the first step in giving the music industry the data and tools needed to develop thoughtful policies around AI."

The industry as a whole

To fully recognize why this announcement matters, it’s important to understand what's happening on the backend of streaming platforms right now.

Take Deezer, for example. The platform built its own AI detection system, and at the start of 2025 it was receiving around 10,000 fully AI-generated tracks every single day. By November 2025, that number had grown to 50,000. By January 2026, it hit 60,000 per day - roughly 39% of everything delivered to the platform.
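As a quick sanity check on those numbers: if 60,000 AI tracks a day is roughly 39% of deliveries, Deezer is ingesting on the order of 150,000 tracks per day in total.

```python
ai_tracks_per_day = 60_000
ai_share = 0.39  # roughly 39% of all deliveries

# Implied total daily deliveries to the platform
total_deliveries = ai_tracks_per_day / ai_share
print(round(total_deliveries))  # ~153,846 tracks per day
```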

In total, Deezer detected and tagged over 13 million AI tracks in 2025 alone.

And we’re still in the early stages of AI. Insane, right?

Of the streams those AI tracks generated, 85% were deemed to be fraudulent - bot accounts running up fake plays to siphon royalties out of the pool that real artists get paid from.

A separate Deezer study found that 97% of listeners couldn't tell the difference between AI-generated tracks and human-made ones, and more than half said the results made them uncomfortable.

Different strategies

What's interesting about this moment is that every major platform is responding to the same problem in a completely different way. There's no industry consensus yet.

What's emerging instead is a set of competing strategies for what might actually work.

Strategy #1 - Deezer: Detect it yourself.
Deezer built their own detection tool and started actively tagging AI music in June 2025. They were the first platform to do so and began demonetizing fraudulent streams immediately. They've now started to license their detection technology to other companies, including Billboard, which is using it to identify AI-generated tracks when compiling charts. Deezer's approach is: don't wait for anyone to tell you, find it yourself.

Strategy #2 - Spotify: Remove the worst of it and label the rest.
Spotify removed over 75 million tracks from its platform in the past 12 months, most of them low-effort AI spam flooding the catalog and distorting recommendation data. They also rolled out a new anti-impersonation policy for AI voice clones, a spam filter targeting mass uploaders, and announced they're adopting the DDEX industry standard for AI disclosure in music credits. Fifteen labels and distributors have already committed to adopting it. Spotify's approach is: cut the obvious fraud, then build a disclosure layer for everything else.

Strategy #3 - Apple Music: Create a disclosure system.
Apple's move last week is specifically about metadata and transparency. Their tags don't remove anything; they just label it. The idea is to give the industry a standardized way to disclose AI involvement across every creative element of a release, so that over time, listeners, labels, and platforms all have cleaner data to work with. Apple's approach is: build the tagging system first, enforcement follows later.

Strategy #4 - YouTube: Require disclosure.
YouTube requires creators to label content that's "altered or synthetic" and gives artists the ability to request removal of AI voice mimics of their likeness. Their approach sits somewhere between Spotify and Apple: disclosure is mandatory, but enforcement is artist-initiated rather than platform-driven.

It’s interesting…none of these approaches is right or wrong. They're just different bets on how best to solve the same problem.

SUNO

One of the most interesting players in this story right now isn’t a streaming platform at all - it's Suno.

A year ago, Suno was being sued by Sony, Universal, and Warner simultaneously for training its model on copyrighted recordings without consent. Last November, Warner settled, struck a licensing deal, and in many ways switched sides. One week later, Suno closed a $250 million Series C at a $2.45 billion valuation - nearly five times what it was worth six months prior. As part of the deal, they also acquired Songkick, the concert discovery platform, from Warner.

The company now has 100 million users generating 7 million songs every day. For context: that's the equivalent of Spotify's entire library, every two weeks.
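That comparison holds up as back-of-envelope math, assuming the commonly cited figure of roughly 100 million tracks for Spotify's catalog:

```python
songs_per_day = 7_000_000          # songs generated on Suno daily
spotify_catalog = 100_000_000      # commonly cited size of Spotify's library

# How many days of Suno output equal one Spotify catalog?
days_to_match = spotify_catalog / songs_per_day
print(round(days_to_match, 1))  # ~14.3 days, roughly two weeks
```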

So does every song made on Suno automatically get flagged under Apple's new system? Not exactly…Apple's tags rely on the person uploading the track to self-disclose during distribution - through DistroKid, TuneCore, or wherever they're uploading it.

Deezer is different: they built specific detection infrastructure for Suno and Udio models and catch them regardless of what anyone declares. But Apple and Spotify's disclosure-based systems only work if the person on the other end is being honest.

Which has the potential to create a real gap. The fraudsters uploading 60,000 AI tracks per day to siphon royalties are not checking the "AI-generated" box. The songwriter who used Suno to sketch an idea and developed it into something real probably would. The tagging infrastructure ends up being most visible on the least problematic content.

Suno seems to understand this, and their moves over the last six months suggest they're actively trying to separate themselves from the fraud problem. The download caps they agreed to with Warner are specifically designed to slow the pipeline of Suno tracks flooding streaming platforms. The licensed models launching in 2026 will include artist opt-in controls, so creators can choose whether their voice or style is used as a reference. The Songkick acquisition connects them to live music, arguably the most human part of the whole ecosystem.

What they're building toward, slowly, is legitimacy. Whether the tagging systems catch up to them in time (or whether the fraud problem overwhelms the infrastructure before any of it can work) is the real open question.

The reality of self-reporting

Here's the part worth diving into deeper.

Apple's tags are currently optional. They'll become required for new releases in the future, but right now there's no enforcement mechanism and no independent verification. If a label or distributor chooses not to disclose, Apple assumes the music is human-made.

Apple is leaving it to content providers to determine what qualifies as AI-generated - similar to how genres and credits work today.

To be honest…it’s a reasonable starting point because the line isn't always clean.

A producer who uses an AI tool to sketch a chord progression and then rebuilds it entirely is doing something fundamentally different from someone who types a prompt into Suno and uploads the output unedited. Where exactly does the tag apply? It's genuinely hard to call.

What this could mean for artists

If the tagging system works, meaning if labels and distributors adopt it broadly and honestly, something meaningful could start to take shape.

In my mind, a two-tier ecosystem emerges.

  • Tagged music vs. untagged music.

  • AI-generated vs. human-made.

For the first time, listeners could filter, discover, and choose with an understanding of what they’re looking at.

In a world where most listeners can't tell the difference, "made by a human" starts to function as a differentiator the same way "organic" does in food. People seek it out, trust it differently, and are willing to pay more for it.

The artists who stand to benefit most from what's happening right now aren't the ones at the top of the charts. They're the independent artists who've been drowned out by the sheer volume of AI content flooding every platform, and who suddenly have a structural argument for why their music deserves a different kind of attention.

The zombie loop I wrote about a few weeks ago won’t completely go away with features like this, but for the first time, the biggest platforms in music are acknowledging it exists and building systems designed to make it visible.

It’ll be interesting to see where it all goes.

Hopefully this was helpful on your journey.

Thanks for reading, until next time.

The Vault

 1) Emergent - my cousin actually introduced me to this one! It’s similar to Lovable, a platform for building web applications with AI, but Emergent has more integrations. For example, it recently integrated with Claude Sonnet 4.5. More info HERE

B-Sides

⚡ SUNO inks massive licensing deal HERE

⚡ YouTube adds AI prompting to year end recap HERE 

What I’m listening to…

Industry spotlight

These industry professionals are looking for open roles:

Derek Spence - Los Angeles, CA: “I’m an audio engineer with extensive experience recording, mixing, and managing sessions at top studios like Record Plant, Harbor Studios, and Craft Studios. I bring a mix of technical expertise, creativity, and client-focused workflow, making sure the artist’s visions come to life. I’m looking for recording and mixing engineer roles.” - LinkedIn

If you’ve been impacted by layoffs and are looking for an open role in the music or entertainment industry, submit for a chance to be featured in the Industry Spotlight section HERE

Music industry job opportunities

1) Digital Rights & Content Operations Coordinator - Rebel Creator Services
Salary: $30,000 - $40,000

Location: Remote

Apply HERE

2) Social Media Strategy and Digital Advertising Coordinator/Manager - Mascot Records 

Salary: $40,000 - $60,000
Location: New York, NY
Apply HERE

3) Administrative Assistant/Junior Agent - Dynamic Talent International

Salary: $42,000

Location: Nashville, TN

Apply HERE

4) Music Project Manager / Label Liaison - HYBE America

Salary: $175,000 - $225,000

Location: Santa Monica, CA
Apply HERE

5) Social Media Editor - Music - Future

Salary: £29,000 - £35,000

Location: London, UK / Bath, UK

Apply HERE

6) Marketing Coordinator - MiEntertainment Group

Salary: $40,000 - $50,000

Location: Michigan

Apply HERE

7) Director of Music - Aspect

Salary: $150,000 - $200,000

Location: Los Angeles, CA

Apply HERE

8) Brand Manager - Audiio

Salary: Unlisted

Location: Nashville, TN 

Apply HERE

9) A&R Coordinator - Warner Music Group

Salary: Unlisted

Location: Nashville, TN

Apply HERE

10) Senior Financial Analyst, Film Production & Music - NBCUniversal

Salary: $80,000 - $92,000

Location: Universal City, CA

Apply HERE

What'd you think of this week's newsletter?

Your feedback goes a long way.
