Artificial Intelligence Rhythms
Art appeals to emotion rather than logic. Composing, writing poetry, and drawing are mostly about our emotions, and works of art are generally judged from an emotional point of view. After all, when we hear a rhythm, a timbre, or a lyric, don't we involuntarily smile, get excited, and feel something?
Isn't one of the basic elements that makes music effective the way the artist's feelings meet the right sounds and awaken feelings in the listener? Don't even small tonal touches in a composition heighten our emotions?
Now imagine that the emotions of millions of people were collected in a single pool and their relationship to notes was clustered. Finding the rhythms and timbres that trigger those emotions would probably be a more effective and accurate way of turning your own emotions into music.
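To make that thought experiment a little more concrete, here is a minimal sketch of what such clustering could look like, assuming a toy dataset in which each listener response pairs a couple of musical features with a self-reported emotion score; the features and numbers are invented for illustration and do not come from any real system.

```python
# Minimal sketch: clustering listener responses against musical features.
# The features and data below are invented for illustration only.
import numpy as np
from sklearn.cluster import KMeans

# Each row: [tempo in BPM, average pitch as a MIDI note, reported "excitement" from 0 to 1]
responses = np.array([
    [ 70, 52, 0.2],
    [ 75, 55, 0.3],
    [128, 70, 0.9],
    [132, 72, 0.8],
    [ 90, 60, 0.5],
    [ 95, 62, 0.4],
])

# Group the responses into three clusters of "emotion profiles".
kmeans = KMeans(n_clusters=3, n_init=10, random_state=0).fit(responses)

# The cluster centres summarise which tempo/pitch ranges go with which emotion levels.
for centre in kmeans.cluster_centers_:
    tempo, pitch, excitement = centre
    print(f"tempo ≈ {tempo:.0f} BPM, pitch ≈ MIDI {pitch:.0f}, excitement ≈ {excitement:.2f}")
```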
In fact, all of this is possible with artificial intelligence, and many composers already use such AI applications while producing their compositions.
Now there is a song I want you to listen to, "Break Free":
https://www.youtube.com/watch?v=XUs6CznN8pw
Released in 2017, this is one of the first songs created using artificial intelligence: the lyrics and vocal melodies were written by Taryn Southern, while the composition was generated by AI.
Incorporating artificial intelligence algorithms into generative music has not been easy, because tones had to be represented mathematically and the data had to be clustered. For example, the Western notation system divides the octave into 12 tones, and these tones repeat in lower and higher octaves to produce new pitches. Within this system, intervals such as the major third and the fifth sound pleasant to the ear, while the major seventh produces sounds that are more disturbing. If we can teach these relationships and more to artificial intelligence, it will be able to do its part and bring harmonious sounds together.
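As a small illustration of what "mathematical tones" means in practice, the sketch below computes equal-temperament frequencies from a reference A4 = 440 Hz and compares a few intervals with the simple just-intonation ratios that roughly explain their consonance; it is only a demonstration of the arithmetic, not part of any system mentioned in this article.

```python
# Equal temperament: each of the 12 semitones multiplies frequency by 2**(1/12).
A4 = 440.0  # reference pitch in Hz

def freq(semitones_from_a4: int) -> float:
    """Frequency of a note a given number of semitones away from A4."""
    return A4 * 2 ** (semitones_from_a4 / 12)

# A few intervals measured upward from A4, with the simple ratios
# that roughly explain why some sound smooth and others tense.
intervals = {
    "major third (4 semitones)":    (4, 5 / 4),    # close to a simple ratio -> consonant
    "perfect fifth (7 semitones)":  (7, 3 / 2),    # very simple ratio -> consonant
    "major seventh (11 semitones)": (11, 15 / 8),  # complex ratio -> dissonant
}

for name, (steps, just_ratio) in intervals.items():
    tempered_ratio = freq(steps) / A4
    print(f"{name}: tempered ratio {tempered_ratio:.4f}, just ratio {just_ratio:.4f}")
```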
Today, the effects of music and sound on human psychology are taught to artificial intelligence systems, enabling them to compose.
No matter how personal and creative music is, the chords and structures of many songs can be similar. What the public calls a "stolen composition" is rarely a note-for-note copy; it is usually a similarity of tone in certain passages. For many bands in the music world, this kind of overlap has never caused much trouble.
So, if similarity in chords and rhythms is commonplace in the music world, why not use the potential of artificial intelligence algorithms to create new and creative compositions?
Using artificial intelligence to identify how chords interact and to write new ones will not only speed up the composition process in the future but also enable the creation of creative melodies. For this reason, many companies today collect notes from different instruments on a single platform and build interfaces on top of an AI infrastructure. The companies behind these interfaces capture different instruments with suitable recording equipment, log the chord sequences, and introduce these sounds to the AI. As more data is introduced, learning accelerates and the outputs become more varied and productive.
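These companies do not publish their pipelines here, so the following is only a minimal sketch of the general idea, assuming chord progressions are available as plain chord symbols: a first-order Markov model counts which chord tends to follow which, then samples a new progression from those counts.

```python
# Minimal sketch: learning chord-to-chord transitions from example progressions
# and sampling a new one. Toy data, not any company's actual system.
import random
from collections import defaultdict

# Toy training data: chord progressions written as chord symbols.
progressions = [
    ["C", "G", "Am", "F"],
    ["C", "Am", "F", "G"],
    ["Am", "F", "C", "G"],
    ["F", "G", "C", "Am"],
]

# Count how often each chord follows each other chord.
transitions = defaultdict(list)
for prog in progressions:
    for current, following in zip(prog, prog[1:]):
        transitions[current].append(following)

def sample_progression(start: str, length: int = 8) -> list[str]:
    """Sample a chord sequence by repeatedly picking a likely next chord."""
    sequence = [start]
    for _ in range(length - 1):
        options = transitions.get(sequence[-1])
        if not options:  # dead end: fall back to the starting chord
            options = [start]
        sequence.append(random.choice(options))
    return sequence

print(sample_progression("C"))
```

A real system would of course work from far more data and richer representations (voicings, rhythm, dynamics), but the principle of learning transition statistics from examples is the same.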
The Flow Machines project is one of the successful efforts so far at this intersection of artificial intelligence and music.
The algorithm used in this project can play a composition on different musical instruments as well as discover new compositions. The team also works with artists on producing or developing compositions, in projects carried out together or separately.
AIVA is the artificial intelligence that succeeded in completing an unfinished composition by the famous composer Antonín Dvořák, who passed away 115 years ago. AIVA set out to complete an E (Mi) minor piano piece by Dvořák.
AIVA has been composing "emotional soundtracks" for films, television commercials, and video games since 2016. For standard works it can usually compose in under a minute, since the company's library already holds relevant and similar data, but the situation with Dvořák was very different: the composition took 72 hours to complete.
You can listen to the first part of the composition, performed by the Czech pianist Ivo Kahánek, at the link below.
And finally, I would like to tell you about the development of an artificial intelligence algorithm that I think will affect everyone interested in this subject. The video below will definitely broaden your horizons and fuel your excitement (if you haven't watched it yet).
AIVA is an artificial intelligence program that learned the art of composing by reading 30,000 of the best scores in history, from film and theatre music onward. AIVA searches for patterns in compositions using deep neural networks: given a few measures of existing music, it tries to figure out which notes should come next in the piece. When it succeeds at these deductions, it can derive some mathematical rules for a given musical genre and create an original composition of its own. In a way, this is how we humans make music too. It is a trial-and-error process in which we do not hit the right notes every time, but we can correct ourselves with our musical ear or our knowledge. What takes a human artist, musician, or composer years of learning and decades of training takes AIVA only a few hours.
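AIVA's actual architecture is not described in this article, so the snippet below is only a minimal sketch of the underlying idea of next-note prediction with a neural network, written in PyTorch with a toy vocabulary of pitches; none of the names or sizes come from AIVA.

```python
# Minimal sketch of next-note prediction: a small recurrent network reads a
# sequence of note indices and predicts the next note. Toy setup, not AIVA.
import torch
import torch.nn as nn

NUM_PITCHES = 128  # e.g. the MIDI pitch range

class NextNoteModel(nn.Module):
    def __init__(self, embed_dim=32, hidden_dim=64):
        super().__init__()
        self.embed = nn.Embedding(NUM_PITCHES, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.out = nn.Linear(hidden_dim, NUM_PITCHES)

    def forward(self, notes):            # notes: (batch, seq_len) of pitch indices
        x = self.embed(notes)             # (batch, seq_len, embed_dim)
        h, _ = self.lstm(x)               # (batch, seq_len, hidden_dim)
        return self.out(h[:, -1, :])      # logits for the next pitch

model = NextNoteModel()
loss_fn = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

# One toy training step: a few measures as input, the note that follows as target.
context = torch.randint(0, NUM_PITCHES, (8, 16))   # 8 snippets of 16 notes
target = torch.randint(0, NUM_PITCHES, (8,))       # the note that actually comes next

logits = model(context)
loss = loss_fn(logits, target)
loss.backward()
optimizer.step()
print(f"toy training loss: {loss.item():.3f}")
```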
However, music is a highly subjective art. Because everyone has different preferences, AIVA also had to be taught to compose the right music for the right person. To do this, the algorithm is shown more than thirty category tags for each composition in the database, such as mood, note density, the style of the composer, and the era in which the piece was written. By looking at all of this data, AIVA can respond to very specific requirements.
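How exactly those category tags enter the model is not described either, but a common way to condition generation on metadata is to embed each tag and feed it alongside the notes; the sketch below extends the toy model above in that spirit, with invented tag names, purely as an assumption about the technique rather than a description of AIVA.

```python
# Minimal sketch: conditioning next-note prediction on category tags.
# Tag names and sizes are invented for illustration; this is not AIVA's design.
import torch
import torch.nn as nn

NUM_PITCHES = 128
TAGS = ["calm", "epic", "sad", "dense", "sparse", "baroque", "romantic", "modern"]

class TaggedNextNoteModel(nn.Module):
    def __init__(self, embed_dim=32, tag_dim=16, hidden_dim=64):
        super().__init__()
        self.note_embed = nn.Embedding(NUM_PITCHES, embed_dim)
        self.tag_embed = nn.Embedding(len(TAGS), tag_dim)
        self.lstm = nn.LSTM(embed_dim + tag_dim, hidden_dim, batch_first=True)
        self.out = nn.Linear(hidden_dim, NUM_PITCHES)

    def forward(self, notes, tag_ids):
        # notes: (batch, seq_len); tag_ids: (batch,) index of the requested tag
        x = self.note_embed(notes)
        # Repeat the tag embedding at every time step and concatenate it to the notes.
        tags = self.tag_embed(tag_ids).unsqueeze(1).expand(-1, notes.size(1), -1)
        h, _ = self.lstm(torch.cat([x, tags], dim=-1))
        return self.out(h[:, -1, :])

model = TaggedNextNoteModel()
notes = torch.randint(0, NUM_PITCHES, (4, 16))
tag_ids = torch.tensor([TAGS.index("epic")] * 4)
print(model(notes, tag_ids).shape)  # torch.Size([4, 128]) -> logits for the next pitch
```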
AIVA's basic vision is to produce compositions and songs customized to each person's own story and character.
Henry Wadsworth Longfellow said, "Music is the universal language of mankind." It seems that music is now also our common language with artificial intelligence.
REFERENCES:
- http://www.yapayzekatr.com/2018/07/11/muzik-dunyasinda-yapay-zeka-ritimleri/
- https://musiconline.com.tr/muzik-ve-yapay-zeka/
- https://www.youtube.com/watch?v=OeKfSosXsuU
- https://www.ted.com/talks/pierre_barreau_how_ai_could_compose_a_personalized_soundtrack_to_your_life/transcript?language=tr
- http://www.bangprix.org/dunyanin-ilk-yapay-zeka-tarafindan-bestelenen-ve-uretilen-albumu-cikti/
- https://ikidunya.wordpress.com/2011/03/20/muzik-insanlarin-evrensel-dilidir%E2%80%A6%E2%80%A6%E2%80%A6%E2%80%A6longfellow/