SMU Journal of Undergraduate Research
Abstract
There are many reasons people listen to music, and the type of music is largely determined by what the listener is doing while they listen. For example, one may listen to one type of music while commuting, another while exercising, and yet another while relaxing. Without access to the physiological state of the user, current music recommendation methods rely on collaborative filtering (recommending music based on what other, similar users listen to) and content-based filtering (recommending songs based on their similarity to songs the user already prefers). With the rise in popularity of smart devices and activity trackers, physiological context can serve as a new channel to inform music recommendations. We propose deep learning solutions for context-aware recommendation and playlist generation. Specifically, we use variational autoencoders (VAEs) to create a song embedding. We then explore multi-task multi-layer perceptrons (MLPs) and Gaussian mixture models to recommend songs based on context. We generate artificial user data to train and test our models in online learning and supervised learning settings.
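As a rough illustration of the embedding step described in the abstract, the following is a minimal sketch of a VAE that maps song feature vectors to a low-dimensional latent space. The feature dimension, latent size, layer widths, and loss weighting are assumptions for illustration only, not the authors' actual architecture.

```python
# Minimal VAE song-embedding sketch (PyTorch).
# Assumed, not from the paper: 32 audio features per song, 8-dim latent space.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SongVAE(nn.Module):
    def __init__(self, n_features=32, latent_dim=8, hidden=64):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(n_features, hidden), nn.ReLU())
        self.mu = nn.Linear(hidden, latent_dim)      # mean of q(z|x)
        self.logvar = nn.Linear(hidden, latent_dim)  # log-variance of q(z|x)
        self.decoder = nn.Sequential(
            nn.Linear(latent_dim, hidden), nn.ReLU(), nn.Linear(hidden, n_features)
        )

    def forward(self, x):
        h = self.encoder(x)
        mu, logvar = self.mu(h), self.logvar(h)
        # Reparameterization trick: z = mu + sigma * eps
        z = mu + torch.exp(0.5 * logvar) * torch.randn_like(mu)
        return self.decoder(z), mu, logvar

def vae_loss(x, x_hat, mu, logvar):
    # Reconstruction error plus KL divergence to a standard normal prior.
    recon = F.mse_loss(x_hat, x, reduction="sum")
    kl = -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp())
    return recon + kl

# Usage: after training, the mean vector mu serves as the song embedding
# that downstream context-aware recommenders (MLPs, GMMs) can consume.
model = SongVAE()
songs = torch.randn(16, 32)  # placeholder batch of song feature vectors
x_hat, mu, logvar = model(songs)
loss = vae_loss(songs, x_hat, mu, logvar)
```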
Recommended Citation
Mann, Elias (2024) "Context Aware Music Recommendation and Playlist Generation," SMU Journal of Undergraduate Research: Vol. 8: Iss. 2, Article 2. DOI: https://doi.org/10.25172/jour.8.2.1
Available at: https://scholar.smu.edu/jour/vol8/iss2/2
Creative Commons License
This work is licensed under a Creative Commons Attribution-Share Alike 4.0 License.