
Musical classification system: Computers get with the beat

Automatic classification of music by genre

Date:
June 29, 2015
Source:
Inderscience Publishers
Summary:
As yet another music streaming service comes online to rival the countless outlets available for so many different genres, a new approach to classifying music promises to make archiving, sorting and music discovery easier.


As yet another music streaming service comes online to rival the countless outlets already available for so many different genres, a new approach to classifying music that could make archiving, sorting and music discovery easier is published in the International Journal of Computational Intelligence Studies.

Rare is the musical artist who can be described as genre-defying. Most singers and musicians tend to stick to a particular genre, whether electronic dance music, reggae, classical, folk, jazz, rock or Indian genres such as Bhangra and Ghazal, or any of hundreds of other categories. Listeners might categorize any given song into one of a few dozen genres with which they are familiar, while dedicated fans of a specific genre may well distinguish between dozens of sub-genres within each classification. In the age of digital distribution, music archiving and music recommendation systems, it makes sense to have a way to automate the process of genre categorization.

Now, researchers in India have devised a simple system that, rather than attempting to quantify many different parameters -- tempo, pulse, loudness, melody, rhythm, timbre etc -- focuses on just pitch, tempo, amplitude variation pattern and periodicity in order to tag a given song as belonging to a specific genre. Their approach uses random sample consensus (RANSAC) as a classifier.
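RANSAC itself is a generic robust model-fitting procedure: repeatedly fit a model to a minimal random sample of the data and keep the fit agreed on by the largest "consensus" set of points. The following is a minimal sketch of that consensus idea on a toy line-fitting problem, not the paper's genre classifier, whose details are in the journal article:

```python
import numpy as np

def ransac_line(x, y, n_iter=200, tol=0.5, seed=0):
    """Minimal RANSAC: repeatedly fit a line through two random points
    and keep the fit with the largest consensus (inlier) set."""
    rng = np.random.default_rng(seed)
    best_inliers, best_fit = None, None
    for _ in range(n_iter):
        i, j = rng.choice(len(x), size=2, replace=False)
        if x[i] == x[j]:
            continue  # vertical line; skip this sample
        slope = (y[j] - y[i]) / (x[j] - x[i])
        intercept = y[i] - slope * x[i]
        # points within `tol` of the candidate line form its consensus set
        inliers = np.abs(y - (slope * x + intercept)) < tol
        if best_inliers is None or inliers.sum() > best_inliers.sum():
            best_inliers, best_fit = inliers, (slope, intercept)
    return best_fit, best_inliers
```

Because each candidate model is fit only to a tiny random sample and then scored by how many points agree with it, outliers (here, corrupted measurements; in the paper's setting, atypical songs) have little influence on the final fit.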

In the team's approach, the system decomposes, or breaks down, the sound signal into 88 frequency bands, divides each sub-band into short-duration frames and, for each frame, computes the short-time mean-square power (STMSP); averaging these STMSP values gives a metric for pitch. The team demonstrates that this metric is very distinct across seven major musical genres. To be more precise, however, they also measure the rhythm, or tempo, of a song, an important perceptual descriptor essentially independent of melody. Tempo can be extracted from a sound file using a mathematical process known as the Fourier transform, which yields the metric in beats per minute (BPM).
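The article does not give the paper's band edges, frame length or hop size, so the sketch below only illustrates the pipeline it describes; the uniform 88-band split, the 1024-sample frames and the tempo search range are assumptions:

```python
import numpy as np

def short_time_mean_square_power(signal, frame_len=1024, hop=512):
    """Mean-square power of each short frame of a 1-D signal (STMSP)."""
    frames = [signal[i:i + frame_len]
              for i in range(0, len(signal) - frame_len + 1, hop)]
    return np.array([np.mean(f ** 2) for f in frames])

def pitch_metric(signal, n_bands=88, sr=22050):
    """Average STMSP per sub-band: a rough stand-in for the described
    88-band pitch feature (uniform band edges are an assumption)."""
    spectrum = np.fft.rfft(signal)
    edges = np.linspace(0, len(spectrum), n_bands + 1, dtype=int)
    feats = []
    for lo, hi in zip(edges[:-1], edges[1:]):
        band = np.zeros_like(spectrum)
        band[lo:hi] = spectrum[lo:hi]            # isolate one sub-band
        sub = np.fft.irfft(band, n=len(signal))  # back to the time domain
        feats.append(short_time_mean_square_power(sub).mean())
    return np.array(feats)

def tempo_bpm(signal, sr=22050, frame_len=1024, hop=512):
    """Estimate tempo as the strongest Fourier component of the
    power envelope, restricted to a plausible 40-240 BPM range."""
    env = short_time_mean_square_power(signal, frame_len, hop)
    env = env - env.mean()                       # drop the DC component
    spec = np.abs(np.fft.rfft(env))
    freqs = np.fft.rfftfreq(len(env), d=hop / sr)  # envelope freqs in Hz
    mask = (freqs >= 40 / 60) & (freqs <= 240 / 60)
    return freqs[mask][np.argmax(spec[mask])] * 60
```

For example, a 440 Hz tone whose loudness pulses twice per second has a power envelope that repeats at 2 Hz, so `tempo_bpm` reports roughly 120 BPM.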

Pitch and tempo can both help decide on genre, but there is often overlap. For instance, these characteristics are often similar in North Indian Bhangra and Western rock music. So another metric -- amplitude variation -- is added to the mix. The team additionally uses correlation-based periodicity, another perceptual feature, which captures the repetitions within a given signal.
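Correlation-based periodicity is typically derived from a signal's autocorrelation: the height and position of the strongest non-zero-lag peak say how strongly, and at what period, the signal repeats. A short sketch of that idea follows; the `min_lag`/`max_lag` search window is an assumption, not the paper's parameters:

```python
import numpy as np

def periodicity_feature(signal, min_lag=50, max_lag=2000):
    """Strength and lag of the strongest repetition in a signal,
    measured via normalized autocorrelation."""
    x = signal - signal.mean()
    ac = np.correlate(x, x, mode="full")[len(x) - 1:]  # lags 0..N-1
    ac = ac / ac[0]                    # normalize so lag 0 equals 1.0
    # min_lag skips the trivially high correlations at very small lags
    lag = min_lag + np.argmax(ac[min_lag:max_lag])
    return ac[lag], lag
```

A strongly periodic signal, such as a steady drum pattern, yields a peak strength close to 1 at the lag matching its repetition period; noisier, less repetitive material scores lower.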

The team has now tested their genre identification system against earlier models by other researchers on a database of songs, comparing the results with manual categorization. Their results show their system to be "substantially better," and it might readily be incorporated into a music database or online music recommendation service.


Story Source:

Materials provided by Inderscience Publishers. Note: Content may be edited for style and length.


Journal Reference:

  1. Arijit Ghosal, Rudrasis Chakraborty, Bibhas Chandra Dhara, Sanjoy Kumar Saha. Perceptual feature-based song genre classification using RANSAC. International Journal of Computational Intelligence Studies, 2015; 4 (1): 31. DOI: 10.1504/IJCISTUDIES.2015.069831

Cite This Page:

Inderscience Publishers. "Musical classification system: Computers get with the beat." ScienceDaily, 29 June 2015. /releases/2015/06/150629124203.htm
