Kirk McElhearn, the iTunes Guy, shows how Apple Music has regressed from the days of iTunes Match:
If you’ve used iTunes Match in the past, you may know that it matches music using acoustic fingerprinting, which means that iTunes scans the music, and matches it to the same music. It doesn’t matter what tags files have: you could have, say, a Grateful Dead song labeled as a song by 50 Cent, and iTunes Match will match the Grateful Dead song correctly.
Apple Music, however, works differently. It does not use the more onerous (in time and processing power) acoustic fingerprinting technique, but simply uses the tags your files contain. And it can lead to errors.
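To make the difference Kirk describes concrete, here's a minimal Python sketch of the two strategies. Everything here is hypothetical: the toy catalog, the function names, and the hash-based "fingerprint" (a stand-in for real acoustic fingerprinting, which analyzes the audio signal itself). Tag-based matching trusts the labels; fingerprint-based matching ignores them.

```python
import hashlib

# Toy catalog: each entry has tags plus (stand-in) raw audio bytes.
CATALOG = [
    {"artist": "Grateful Dead", "title": "Ripple",
     "audio": b"grateful-dead-ripple-pcm"},
    {"artist": "50 Cent", "title": "In Da Club",
     "audio": b"50-cent-in-da-club-pcm"},
]

def fingerprint(audio):
    # Stand-in for acoustic fingerprinting: hash the audio bytes.
    # Real systems derive a fingerprint from the sound itself.
    return hashlib.sha256(audio).hexdigest()

def match_by_tags(track):
    # Apple Music-style matching: trust whatever the tags say.
    for entry in CATALOG:
        if (entry["artist"], entry["title"]) == (track["artist"], track["title"]):
            return entry
    return None

def match_by_fingerprint(track):
    # iTunes Match-style matching: compare the audio, ignore the tags.
    fp = fingerprint(track["audio"])
    for entry in CATALOG:
        if fingerprint(entry["audio"]) == fp:
            return entry
    return None

# Kirk's example: a Grateful Dead recording mislabeled as a 50 Cent song.
mislabeled = {"artist": "50 Cent", "title": "In Da Club",
              "audio": b"grateful-dead-ripple-pcm"}

print(match_by_tags(mislabeled)["artist"])         # 50 Cent (wrong song)
print(match_by_fingerprint(mislabeled)["artist"])  # Grateful Dead (correct)
```

The sketch shows why the regression matters: with tag-based matching, a single mislabeled file is enough to pull down the wrong track from the cloud.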
Matching Apple’s long legacy as a provider of à la carte music to a new streaming service was always going to be a bumpy ride. But this is the sort of regression that makes me think the company should’ve kept these services separate and taken more time to get it right before bringing them together.
[Update: Serenity Caldwell couldn’t reproduce what Kirk saw, but still encourages caution.]
[Update 2: I’m hearing that Apple Music does indeed use metadata to do matching, but that iTunes Match is still supposed to use acoustic fingerprinting. So this may be a case of a story that’s half feature, half bug.]
[Update III - The Updatening: Kirk has updated his post. Marco Arment has an interesting take on the whole thing.]