musicube blog image

The importance of rhythm

March 11th, 2022

Last week we talked about the valence of a song: its moods, pleasantness, and engagement. This week we will walk through another essential group of categories that define the nature of a track: those related to rhythm.

Rhythm is one of the backbones of a musical composition, and it radically affects the way we experience music. Rhythm is a basic element of nature, and we are highly sensitive to it: there is a rhythmic pattern in our heartbeat and our breathing, in our footsteps and our speech. Depending on the activities and emotions we are experiencing at a particular moment, that rhythm changes. Rhythmic patterns are therefore related to human emotions, and, because music is all about conveying emotions through sound, rhythm is tightly bound to the way we experience music.

We want our artificial intelligence to provide real help and insight about the tracks it is tagging, so we have invested a long time training it to differentiate several aspects of rhythm.

Do you want to know more about them? What about listening to some cool music along the way?

First things first: beats per minute

Of course, when it comes to defining the rhythm of a song, counting the beats per minute (bpm) is the most basic trait. We can think of the bpm as the heart rate of the song. If the song's heartbeat is slow, it normally conveys a feeling of calmness or peace; when it is high, it gives us a feeling of speed and rush. The bpm is fundamental to understanding the rest of the rhythm-related categories we have incorporated into our AI. Most of those categories simply translate this objective value into a more human experience. Nonetheless, in many cases, accurately identifying the bpm of a song is key. For example, if you are looking for a song for a particular scene, you want its bpm to match the pace of the frames or the activities performed within them. This builds cohesion between audio and video and dramatically improves the audiovisual experience.
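The "heart rate" idea above boils down to simple arithmetic once the beat onsets are known. Here is a minimal sketch, assuming beat timestamps have already been extracted by some upstream detector (the timestamps below are illustrative, not from a real track):

```python
def bpm_from_beats(beat_times):
    """Estimate bpm as 60 divided by the mean inter-beat interval (seconds)."""
    intervals = [b - a for a, b in zip(beat_times, beat_times[1:])]
    if not intervals:
        raise ValueError("need at least two beats")
    return 60.0 / (sum(intervals) / len(intervals))

# Four beats spaced 0.5 s apart -> a 120 bpm pulse
print(bpm_from_beats([0.0, 0.5, 1.0, 1.5]))  # → 120.0
```

A real tagging system works from the raw audio, of course; this only shows why bpm is the objective anchor for everything that follows.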

1. Tempo class

The bpm is an objective, numerical value. But in most cases, when we are looking for a particular song, we aren't after a specific bpm value; we want songs within a particular range. Thus, our AI also uses the bpm of a song to classify tracks as "Low", "Mid", or "Fast" tempo.

  1. Low tempo: Masterswarm by Andrew Bird
  2. Mid tempo: Ordinary World by Green Day
  3. Fast tempo: The Lovecats by The Cure
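The mapping from bpm to tempo class can be sketched as a simple bucketing step. The cutoffs below (90 and 120 bpm) are illustrative assumptions, not musicube's actual thresholds:

```python
def tempo_class(bpm: float) -> str:
    """Bucket an objective bpm value into a human-friendly tempo class.
    Thresholds are hypothetical, for illustration only."""
    if bpm < 90:
        return "Low"
    elif bpm < 120:
        return "Mid"
    return "Fast"

print(tempo_class(72))   # → Low
print(tempo_class(105))  # → Mid
print(tempo_class(160))  # → Fast
```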
2. Arousal

The arousal of a song is related to its tempo and the energy it conveys. This category considers not only how fast the tempo is but also how much strength the tempo, combined with the other elements of the song, can convey. You might have a slower-tempo song with very high energy, or a fast-tempo song that is relatively calm. This categorization mixes objective and subjective traits of the track; the subjective ones are determined by human listeners! Let's check some examples:

  1. Very calm: Blackbird by The Beatles
  2. Calm: Can’t Help Falling In Love by Elvis Presley
  3. Moderate arousal: Island in the Sun by Weezer
  4. Energetic: Sweet Home Alabama by Lynyrd Skynyrd
3. Rhythm Affinity

This is a very technical category that is challenging to automate in an AI system. It describes the type of time signature the song has. Most music fits what is called a 4/4 or 2/4 time signature; this is what we consider "common time". It is also relatively common to find songs in a waltz-like time signature, which we call "triple meter". Finally, there are some songs, especially in classical music, jazz, and progressive styles, that use complex time signatures. Most people can distinguish common time from triple meter. When it comes to complex time signatures, there are so many variants (sometimes within a single song!) that it is highly challenging to train an AI to identify them in detail. We are working on it, but, so far, we can only tell that the signature is not a "standard" one. Check some examples below:

  1. Common time: Crazy by Aerosmith
  2. Triple meter: On the Street Where you Live in My Fair Lady
  3. Complex time signature: Brahms: Sechs Klavierstücke, Op. 118 / II. Intermezzo in A Major performed by Arcadi Volodos
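The core intuition behind telling common time from triple meter is that accents recur every two (or four) beats in the former and every three in the latter. Here is a toy sketch of that idea: given per-beat accent strengths, it checks which grouping period lines up best with the strong beats. The accent values and the scoring are illustrative assumptions; a real system works on learned audio features, not hand-written lists:

```python
def meter_guess(accents, candidates=(2, 3)):
    """Guess the beat grouping (2 = duple/common time, 3 = triple meter)
    by finding the period whose hypothesised downbeats carry the most accent."""
    def score(period):
        downs = accents[::period]  # mean accent on every `period`-th beat
        return sum(downs) / len(downs)
    return max(candidates, key=score)

waltz = [1.0, 0.2, 0.3] * 4  # strong-weak-weak pattern
march = [1.0, 0.3] * 6       # strong-weak pattern
print(meter_guess(waltz))  # → 3
print(meter_guess(march))  # → 2
```

Complex signatures break exactly this kind of regularity, which is why they are so much harder to pin down automatically.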
4. Grooviness

Grooviness is one of the most interesting categories in the rhythmic group. In music production, grooviness refers to how strongly the feeling of rhythm is conveyed in a composition. A common way of enhancing the grooviness of a track is to make its rhythmic elements (normally bass and drums) lock in with other elements of the track (the melody, for example) at a particular point in the bar. This creates an enhanced "flow", or feeling of rhythm, that, when well done, adds an extra dimension to the song.

  1. Steady: Bring You Back by Beacon
  2. Moderate Rhythm Feel: Memories by Maroon 5
  3. Groovy: Hey Brother by Aloe Blacc

There's so much to take into account when analyzing and tagging a song. Human experts can accurately identify and tag all the categories our AI does, but it takes a long time. Can you imagine how much time you would save using an automatic system to perform all these tasks?

There are still so many categories to discover on musicube! More soon!