In this post, we will examine the harmonic properties of songs in my music collection. We will focus on two primary aspects of the music: the mode (i.e. whether the songs are played in major or minor keys) and the musical key itself. Finally, we'll explore differences across genres in the modes and keys the music is played in, and use this information to simultaneously cluster the musical keys and genres.

The data for this blog post come from the digital music (.mp3) files on my computer. I have most of the music I've listened to over the past 10 years in a digital format, and I extracted the artist, album, and musical genre information from the ID3 tags included in the files (using code adapted from a previous blog post). I then used the artist and album information to get the mode and key of each album track from the Spotify API, which has catalogued this information for a huge number of albums. I queried the Spotify API using Python and the excellent Spotipy package. In total, I was able to retrieve the mode and key information for about 80% of the albums in my digital collection (obscure or niche recordings are not always available on Spotify). The data and code for this analysis are available on Github here.

The head of the raw data looks like this: for each album, we have the album name and genre, the artist, and the names of each song. For each song, we have the mode and the key as determined by Spotify. I've concatenated the mode and the key to create a variable called master_key, which contains the complete song key information.
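The ID3 extraction step above used code adapted from a previous post, which isn't shown here. As a minimal sketch of what it involves, the snippet below finds .mp3 files and reads the relevant tags; it assumes the third-party `mutagen` package, and the function names are my own, not the post's.

```python
from pathlib import Path

def find_mp3s(root):
    """Return a sorted list of .mp3 paths under `root` (case-insensitive suffix)."""
    return sorted(p for p in Path(root).rglob("*") if p.suffix.lower() == ".mp3")

def read_id3_info(path):
    """Read artist, album, and genre from a file's ID3 tags.

    Requires the `mutagen` package; imported lazily so that file discovery
    works even when mutagen isn't installed.
    """
    from mutagen.easyid3 import EasyID3
    tags = EasyID3(str(path))
    # EasyID3 returns a list of strings per field; take the first entry if present.
    return {field: tags.get(field, [None])[0] for field in ("artist", "album", "genre")}
```

Looping `read_id3_info` over `find_mp3s(music_dir)` yields one artist/album/genre record per file, which is the input needed for the Spotify lookups.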
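The Spotify lookup can be sketched as follows. The integer encodings are from Spotify's audio-features documentation (key is a pitch class, 0 = C through 11 = B, with -1 meaning unknown; mode is 1 for major, 0 for minor). The helper function, search query format, and client-credentials setup (via the standard `SPOTIPY_*` environment variables) are my assumptions, not the post's actual code.

```python
# Spotify encodes key as a pitch class (0 = C, ..., 11 = B; -1 = unknown)
# and mode as 1 (major) or 0 (minor).
PITCH_CLASSES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

def decode_key(key):
    """Map Spotify's integer pitch class to a note name (None if unknown)."""
    return PITCH_CLASSES[key] if 0 <= key <= 11 else None

def decode_mode(mode):
    """Map Spotify's integer mode flag to a label."""
    return "major" if mode == 1 else "minor"

def album_track_features(artist, album):
    """Hypothetical helper: look up an album and return per-track key/mode info.

    Requires the `spotipy` package and Spotify API credentials.
    """
    import spotipy
    from spotipy.oauth2 import SpotifyClientCredentials
    sp = spotipy.Spotify(auth_manager=SpotifyClientCredentials())
    hit = sp.search(q=f"artist:{artist} album:{album}", type="album", limit=1)
    items = hit["albums"]["items"]
    if not items:
        return []  # album not on Spotify (the ~20% miss case mentioned above)
    tracks = sp.album_tracks(items[0]["id"])["items"]
    feats = sp.audio_features([t["id"] for t in tracks])
    return [
        {"song": t["name"], "key": decode_key(f["key"]), "mode": decode_mode(f["mode"])}
        for t, f in zip(tracks, feats) if f
    ]
```

Running this for each artist/album pair from the ID3 tags produces the song-level mode and key data described above.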
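The master_key construction is a simple concatenation. A minimal pandas sketch, with illustrative column names and data that are assumptions rather than the post's actual table:

```python
import pandas as pd

# Toy stand-in for the raw data: one row per song, with the decoded
# key and mode as returned from the Spotify lookup.
df = pd.DataFrame({
    "song": ["Song A", "Song B"],
    "key":  ["C", "A"],
    "mode": ["major", "minor"],
})

# master_key combines key and mode into the complete song key information.
df["master_key"] = df["key"] + " " + df["mode"]
```

After this step, `master_key` holds values like "C major" or "A minor", which is the variable used for the genre comparisons and clustering later in the post.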