Apple Music adds Transparency Tags to flag AI in songs and visuals
Apple Music introduces optional "Transparency Tags" for distributors to flag AI use in artwork, tracks, lyrics and videos, raising questions about enforcement and user visibility.

Apple Music has introduced a new metadata feature called Transparency Tags that lets record labels and distributors flag where artificial intelligence was used in a release. The change, announced to industry partners via a newsletter on March 4, 2026, creates distinct fields for artwork, track (music), composition (lyrics) and music video so companies can disclose AI involvement when delivering content to the service.
Apple framed the tags as “a concrete first step toward the transparency necessary for the industry to establish best practices and policies that work for everyone.” The company’s guidance also advises applying tags when a “material portion” of the content was created with AI. The new fields are incorporated into the Apple Music delivery specification under section 5.3.25, which indicates the tags are available to distributors and that, when omitted, “none is assumed.”
That combination of language has produced an immediate ambiguity: the tags are now part of the upload schema used to deliver song metadata to Apple Music, but the specification appears to allow distributors to leave the new fields blank. In practice, that means the system can exist as a formal delivery change without requiring distributors to affirmatively mark AI-produced material. The responsibility for flagging AI usage rests with labels and aggregators, not with automated detection inside Apple’s platform.
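The “optional field, none assumed” semantics described above can be sketched in a few lines of code. The field names and values here are illustrative assumptions for clarity, not Apple’s actual delivery schema, which has not been published in detail.

```python
# Hypothetical sketch of "optional field, none assumed" tag resolution.
# Component names and tag values are assumptions, not Apple's real schema.

TAGGABLE_COMPONENTS = ("artwork", "track", "composition", "music_video")

def resolve_transparency_tags(delivered_metadata: dict) -> dict:
    """Return a tag for every component; omitted fields default to 'none'."""
    tags = delivered_metadata.get("transparency_tags", {})
    return {c: tags.get(c, "none") for c in TAGGABLE_COMPONENTS}

# A distributor that flags only AI-generated artwork and leaves
# every other field blank still passes validation:
release = {"title": "Example Release",
           "transparency_tags": {"artwork": "ai"}}
print(resolve_transparency_tags(release))
```

Under this model, a release with no tags at all is indistinguishable from one affirmatively declared AI-free, which is the ambiguity the article describes.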
Industry context makes the stakes clear. Streaming rival Spotify introduced its own metadata disclosures in September 2025 and helped develop industry standards with the Digital Data Exchange to indicate specific AI roles such as vocals, instrumentation or post-production. Streaming services are also dealing with a surge in AI-originated uploads: the French service Deezer has said about 60,000 fully AI-generated tracks arrive on its platform each day. A 2025 Deezer and Ipsos study found 97 percent of respondents could not reliably tell AI music from human-made music, while 80 percent wanted clear labels and 72 percent said they wanted to know if AI music was being recommended to them.

Public reaction to Apple’s approach has been mixed. Some artists and listeners have demanded stronger, platform-level rules after high-profile incidents involving AI impersonations and unlabeled releases. Others see tags as a modest, voluntary transparency tool. As Rachel Thompson has observed, user tolerance for AI music “begins to shift once they feel deceived,” a complaint that surfaces when AI material goes unlabeled in libraries and recommendations.
Several operational questions remain unresolved. The specification and newsletter do not specify whether Apple will surface Transparency Tags directly in user interfaces, audit distributor claims, or take remedial action against false or missing labels. They also do not state an enforcement timeline or whether Apple will coordinate the fields with existing standards groups such as DDEX.
For labels, distributors and listeners, the new tags are a partial answer: a formal place to say where AI was used, but not yet a guarantee that listeners will see accurate or consistent disclosures. The long-term impact will depend on whether Apple, rights holders and metadata standards bodies convert the optional fields into enforceable practices and visible labels across services.