Subjectivity in music tags
Natalie Jacobs
Music business advocate | Helping music companies optimize their data and workflows to ensure top-tier service for artists and songwriters | Passionate about educating on the business of music
When applying tags to a piece of music, whether a sound recording or a musical work, there are certain elements that are indisputable - such as musical key, BPM, and whether it's instrumental or acoustic. However, many other tags encompassing genre, sub-genre, mood, lyrical theme, etc., can be much more complex to navigate. Different people may perceive and describe music differently based on their personal preferences, cultural background, and musical knowledge. This can lead to inconsistent tagging, where the same piece of music is labeled differently by various individuals or platforms.
Taxonomies
Beyond personal interpretation, another driver of inconsistency is that, across the industry, we're working with different taxonomies. Several years ago, my team and I spent months trying to create a merged taxonomy, starting from 700+ genre/sub-genre tags drawn from multiple different sources. It was no small feat to get to something that everybody could agree upon. Furthermore, each music company has its own taxonomy, so while there is crossover in the most common tags, the edges are blurred.
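As a rough sketch of what a merge like this involves, the core operation is mapping each raw source tag onto a single canonical tag, and surfacing anything the merged taxonomy doesn't yet cover. All of the mappings below are hypothetical examples (a real merged taxonomy would run to 700+ entries, and which variants get merged is a policy decision, not a technical one):

```python
# Hypothetical synonym map from raw source tags to canonical merged tags.
CANONICAL = {
    "synth-pop": "synth-pop",
    "synthpop": "synth-pop",
    "electropop": "synth-pop",   # debatable merge; a policy decision
    "alt. rock": "alternative rock",
    "alt rock": "alternative rock",
    "alternative rock": "alternative rock",
    "rnb": "r&b",
    "r&b": "r&b",
}

def normalize(tag: str) -> str:
    """Map a raw tag onto the merged taxonomy, or flag it for human review."""
    key = tag.strip().lower()
    return CANONICAL.get(key, f"UNMAPPED:{key}")

raw_tags = ["Electropop", "Synthpop", "Alt. Rock", "Hyperpop"]
print([normalize(t) for t in raw_tags])
# ['synth-pop', 'synth-pop', 'alternative rock', 'UNMAPPED:hyperpop']
```

The interesting work is entirely in agreeing on the contents of that map; the code itself is trivial, which is rather the point.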
This theme of compare and contrast goes further than just rights metadata: across descriptive data too, we could make faster progress if we shared.
Do a search for any song and look at all the genres that may be returned, and the variances between them. As sounds are increasingly fused and influences drawn from many places, so are the ways to describe them - electropop vs synth-pop vs dance-pop. Very few would be able to define the differences.
One of these things is not like the other
An element of tagging that I find truly fascinating is the dichotomy between tags that could be applied to the same song.
For example, Whitney Houston's version of "I Will Always Love You" may be considered Pop, Soul, or R&B (depending on who you ask). But the underlying composition, written by the national treasure Dolly Parton, is indisputably country.
Can genre be truly applied to a composition when the genre applied to each recording of it may be different?
Similarly, "Pumped Up Kicks" by Foster The People is tagged as indie rock, electropop, alternative rock, etc., with a distinctively upbeat and catchy tune. The lyrical theme, however, is heavy, broaching gun violence and mental health.
The latter might carry mood tags like "upbeat", "uplifting", or "harmonious", while also being tagged with "suicide", "violence", or "revenge". It would be all too easy for an inexperienced person to tag the song based on sound alone, without listening to the message behind the lyrics. Without both sets of tags, the story is ultimately incomplete.
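One way to keep both stories intact is to hold sound-based moods and lyric-based themes as separate fields in the data model, so one axis can't silently stand in for the other. A minimal sketch, using a hypothetical schema rather than any company's actual one:

```python
# Hypothetical schema: sound moods and lyrical themes are separate tag
# sets, and a record isn't considered complete until both are populated.
from dataclasses import dataclass, field

@dataclass
class TrackTags:
    title: str
    sound_moods: set = field(default_factory=set)
    lyric_themes: set = field(default_factory=set)

    def is_complete(self) -> bool:
        # Both axes must be populated before the record is usable.
        return bool(self.sound_moods) and bool(self.lyric_themes)

track = TrackTags("Pumped Up Kicks", sound_moods={"upbeat", "catchy"})
print(track.is_complete())   # False: lyrical themes still missing
track.lyric_themes.update({"gun violence", "mental health"})
print(track.is_complete())   # True
```

The design choice here is that completeness is enforced structurally, instead of hoping each tagger remembers to listen past the melody.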
You can read more about Beyonce transcending genres here.
Less is more
It might seem like a good strategy to simply add more tags, on the theory that more tags make a piece of music easier to find. Unfortunately, quantity of tags doesn't equal quality of tags. So instead of accurately finding what you're looking for, you may end up with results cluttered by irrelevant matches.
In cases where commercial interests and marketing strategies can impact how music is tagged and categorized, marketability may be prioritized over musical integrity or accuracy. This can lead to misrepresentation of genres and other tags for promotional purposes, leading to confusion when a consumer (or even an internal department at a music company) is trying to find something specific.
Many music streaming platforms use algorithms to recommend music based on user preferences and tagging. Subjective tagging practices can introduce biases into these algorithms, potentially limiting the diversity of music recommendations and reinforcing existing music consumption patterns.
Geographic/Cultural Bias
While globalization and emerging markets may be a hot topic, have you considered how well tags are being applied to music of non-Anglo origin?
For example, the Grammys introduced a new category in 2024 for African music performance, but almost everything within it is currently being placed under the "Afrobeats" or "Amapiano" genres. Those two designations barely skim the surface.
This challenge isn't limited to African music alone; it extends to Latin, European, Asian, and other musical traditions as well. While we've moved away from the broad categorization of everything as "world music," there's still considerable progress needed. It's crucial to honor the origins of the music rather than simplifying it with Western-centric tags simply because they're more familiar to us.
AI for good?
Tagging is actually an area where I believe AI can provide assistance. It's a simple task for those unambiguous tags, such as BPM or key. For subjective tags, not only does it reduce individual interpretation but it also allows tagging at scale.
During a catalog acquisition, if all assets could be run through an AI model and have tags applied consistently according to the taxonomy used by that company, it would be a huge time saver for everyone involved.
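That acquisition workflow can be sketched in a few lines: run each asset through a model, keep only predictions above a confidence threshold, map them into the acquiring company's taxonomy, and route anything uncertain to a human. Everything here is hypothetical (the taxonomy, the asset IDs, and `model_predict`, which is a stub standing in for a real audio-tagging model):

```python
# Hypothetical taxonomy used by the acquiring company.
ACQUIRER_TAXONOMY = {"pop", "soul", "r&b", "country", "alternative rock"}

def model_predict(asset_id: str) -> dict:
    # Stub returning {tag: confidence}; a real model would analyze audio.
    return {"pop": 0.91, "soul": 0.62, "shoegaze": 0.40}

def tag_asset(asset_id: str, threshold: float = 0.5) -> list:
    """Keep confident predictions that exist in our taxonomy; else escalate."""
    preds = model_predict(asset_id)
    kept = [
        tag for tag, conf in sorted(preds.items(), key=lambda kv: -kv[1])
        if conf >= threshold and tag in ACQUIRER_TAXONOMY
    ]
    return kept or ["NEEDS_HUMAN_REVIEW"]

print(tag_asset("asset-001"))  # ['pop', 'soul']
```

Note that "shoegaze" is dropped not because the model is wrong, but because it isn't in this company's taxonomy, which is exactly the kind of quiet loss worth monitoring during an acquisition.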
That being said, there are still issues. Returning to the geographic example: unless the model has been properly trained on regional music and given a comprehensive taxonomy to draw from, the tags it produces will inherently lack accuracy and proper representation.
In summary...
While new descriptive tags for genre, subgenre, mood, etc. will continue to develop, I highly recommend you define your own parameters and apply them consistently.
Natalie Jacobs is the founder of Equalizer Consulting, specializing in helping music companies, artists, and songwriters with:
For a free 30-minute consultation, please use this link to schedule.
Comments

Product Manager - Music Content Understanding - SiriusXM/Pandora · 7 months ago
This is something we have thought a lot about as well! Check out this blog post by my colleague Scott Rosenberg on our Analysis Genre Taxonomy: https://community.pandora.com/t5/Community-Blog/Music-Analysis-and-Genre-The-AGT/ba-p/121943
Software Engineer (5+ years @ Big Tech) | Music Tech/Production · 7 months ago
GPT4 appears to have a good understanding of "Objective" vs "Subjective" tagging, and its nuances:
Skilled leader in Technology and Music · 7 months ago
This is great! To double down on your point, even presumably objective metadata can be subjective, as much as it pains me to say so as someone who really likes to strictly categorize things. For example, the same song could be interpreted as 180BPM or as 90BPM depending on a variety of nuanced, inconclusive factors. And while the vast majority of popular music is a single tempo, some songs do have tempo changes; an even smaller number have meter changes, but they do exist. And key isn't always straightforward either. Your suggestion to avoid the trap of piling on excessive tags is spot on too. I do wonder though if some of the problems with that approach will be fully mitigated by the application of AI. Instead of using it solely for tagging, if AI is being used in the searching/retrieval stage, it can have some understanding of the connection between multiple seemingly conflicting tags, as well as map the logical distance between different tags/concepts, similar to the way genres are shown by Every Noise. It would definitely be easier to build a working solution for this via a method such as a commercial LLM + RAG on your lyrics database than it would be to build a super accurate tagging tool via custom training a model.