FORTU
Discover music through what machines ‘see’
Machine Learning
May 2020 | 4 days | Copenhagen Institute of Interaction Design (CIID)
Brief: ‘Explore how “machines” could help us cope better with social distancing, remote communication and spending more time at home.’
Fortu is a web-based contextual music player that lets people discover music through objects at home. It recommends songs based on what the ‘eye of the machines’ detects through computer vision. The project was developed during the 4-day machine learning course at CIID.
ROLE
Concept
Coding
TOOLS
HTML
CSS
ml5.js & p5.js
Spotify API
TEAM
Jose Chavarria (Costa Rica)
Yoshio Mikamo (Japan)
CHALLENGE
Explore how ‘machines’ could help us cope better with spending more time at home
Design Concept
for·tu·i·ty/fôrˈto͞oədē (noun)
a chance occurrence, the state of being controlled by chance rather than design.
Fortu is a web-based application that plays music based on objects detected by the camera. It uses COCO-SSD, a pre-trained TensorFlow object detection model accessed via ml5.js. Detection results from the model are passed to p5.js, where calls are made to the Spotify API to surface songs related to the detected objects.
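The detection-to-recommendation handoff can be sketched as a small helper. The object shape follows ml5.js's COCO-SSD output (`label`, `confidence`, plus bounding-box fields); the p5.js wiring in the comments is illustrative, not the production code:

```javascript
// Hypothetical helper: ml5.js's COCO-SSD detector returns an array of
// objects shaped like { label, confidence, x, y, width, height }.
// Keep only confident detections and use the strongest one as the search term.
function pickSearchTerm(detections, minConfidence = 0.6) {
  const confident = detections.filter(d => d.confidence >= minConfidence);
  if (confident.length === 0) return null;
  return confident.reduce((best, d) => (d.confidence > best.confidence ? d : best)).label;
}

// Rough wiring inside the p5.js sketch (illustrative only):
//   const detector = ml5.objectDetector('cocossd', modelReady);
//   function modelReady() { detector.detect(video, gotResults); }
//   function gotResults(err, results) {
//     const term = pickSearchTerm(results); // e.g. 'potted plant'
//     if (term) searchSpotify(term);        // query the Spotify API with the label
//     detector.detect(video, gotResults);   // keep detecting
//   }
```

Thresholding before picking the strongest label keeps low-confidence background guesses from hijacking the music.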
👉Try it out here:👈
How did we get there?
When the team discussed what we missed the most about life before the pandemic, the social and spontaneous sides of music came up. We missed dancing to music and discovering new music with loved ones. We explored a few concepts before landing on our final one.
01. Music to the Dance
What if your dance moves could control what music plays?
Teachable Machine Image | ML5 PoseNet pose detection | p5.js
You know that great feeling when your favourite song comes on and you just can’t help but dance? What if we could train our machines to recognise our dance moves and play the songs we want to dance to?
We trained our machines to recognise a few distinctive dances, including the Macarena, Stayin’ Alive, and The Ketchup Song. It was challenging to develop the concept further, however, as we ran out of distinctive dance moves for the machine to recognise.
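The dance-to-song step can be sketched as a simple lookup plus a debounce on the classifier output. The class names, confidence threshold, and wiring comments are assumptions for illustration, not the original sketch:

```javascript
// Hypothetical mapping from a recognised dance class to the track it triggers.
const danceToTrack = {
  Macarena: 'Macarena – Los del Río',
  StayinAlive: "Stayin' Alive – Bee Gees",
  KetchupSong: 'The Ketchup Song – Las Ketchup',
};

// Only switch tracks when the classifier is confident and the dance has changed,
// so the song is not restarted on every video frame.
function nextTrack(result, currentTrack, minConfidence = 0.8) {
  if (result.confidence < minConfidence) return currentTrack;
  const track = danceToTrack[result.label];
  return track && track !== currentTrack ? track : currentTrack;
}

// Illustrative wiring with an ml5.js classifier trained in Teachable Machine:
//   classifier.classify(video, (err, results) => {
//     playing = nextTrack(results[0], playing);
//   });
```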
02. Hum to the Beat
What if we could get our machines to play songs based on just ‘da daaaa da na’?
Teachable Machine Audio | p5.js
We all know how frustrating it is when you can’t remember the name of a song or any of the words but the tune is stuck in your head. This led to an idea of humming to play songs.
We trained the model with our humming, representing three genres — electro, hiphop, and rock. A few songs were selected for each genre, and when humming was detected a random song would be played from the matching list. We scrapped this concept because it could not scale and a similar product already exists.
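The genre-to-playlist step above can be sketched as follows. The song lists are stand-ins, and the random pick takes an injectable `rng` so the behaviour is deterministic in tests:

```javascript
// Hypothetical per-genre song lists (the real lists were hand-picked by the team).
const genrePlaylists = {
  electro: ['Song A', 'Song B'],
  hiphop: ['Song C', 'Song D'],
  rock: ['Song E', 'Song F'],
};

// When the audio classifier reports a genre, pick a random song from its list.
// `rng` defaults to Math.random but can be injected for testing.
function pickSong(genre, rng = Math.random) {
  const list = genrePlaylists[genre];
  if (!list || list.length === 0) return null;
  return list[Math.floor(rng() * list.length)];
}

// Illustrative wiring with a Teachable Machine audio model loaded via ml5.js:
//   soundClassifier.classify((err, results) => {
//     if (results[0].confidence > 0.8) play(pickSong(results[0].label));
//   });
```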
03. Contextual Music Player
What if we could use machines to generate new music recommendations?
RunwayML | Processing | Spotify API | ml5.js | p5.js
Have you found yourself stuck in a ‘musical echo chamber’, listening to the same music over and over again? What if we could generate new music recommendations at home through machines? This led us to develop the first iteration of ‘Fortu’: a physical speaker, built with RunwayML and Processing, that detects nearby objects and plays music based on what it identifies.
We later moved the sketch from Processing to the web browser and connected it to Spotify to improve scalability. The web version was built with ml5.js and p5.js, making calls to the Spotify API so that songs matching the object detection results play directly in the browser. We also updated the visual style to a bright colour palette to evoke a fun, playful feeling.
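The browser-side Spotify call can be sketched like this. The endpoint is Spotify's real Web API search endpoint, but the token handling in the comments is an assumption (the Web API requires an OAuth access token obtained through one of Spotify's authorisation flows, which is omitted here):

```javascript
// Build a Spotify Web API search URL for tracks matching a detected object label.
function buildSearchUrl(term, limit = 5) {
  const params = new URLSearchParams({ q: term, type: 'track', limit: String(limit) });
  return `https://api.spotify.com/v1/search?${params.toString()}`;
}

// Illustrative usage in the browser; `accessToken` is assumed to come from
// one of Spotify's OAuth flows:
//   const res = await fetch(buildSearchUrl('potted plant'), {
//     headers: { Authorization: `Bearer ${accessToken}` },
//   });
//   const tracks = (await res.json()).tracks.items; // candidate songs to play
```

Keeping the URL construction in a pure function makes it easy to swap the search term in as detections change, without touching the fetch logic.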