iTunes and Genius Integration
On 11 October 2018, Apple and Genius announced a partnership that promises to improve the Apple Music experience. The deal should be music to the ears of everyone using the streaming service, as it puts lyrics directly within their grasp.
As it turns out, though, the deal is only a beginning: accessing lyrics while working on another task is still not easy.
This case study grew out of my own experience with the iTunes lyrics feature. It is therefore structured around validating my assumption by finding others who face the same problem I do.
Applications such as Musixmatch already provide this functionality. However, Musixmatch's memory footprint and its intrusive pop-up announcing the currently playing song drove me to look for alternatives.
My first stop in validating this assumption was Reddit, to see how receptive members were when the news spread. The discussion was interesting: some members mentioned that Spotify, Apple's main competitor, had already implemented this feature, but that they did not like Spotify's version. Interestingly, multiple members wanted the lyrics-reading experience to be real time, like Shazam's.
Another insight from the same thread is that members complained about the accuracy of the lyrics iTunes had provided in the past, and that they only want lyrics to be shown while the music is playing.
I used secondary sources to understand who actually uses the Apple Music/iTunes music player. Luckily, Jan Dawson of Jackdaw Research shared my interest in Apple Music's user demographics, and he was kind enough to publish his survey findings.
The most interesting part of his findings is the breakdown, by age group, of why users choose Apple Music.
The Jackdaw Research findings showed that ease of use and finding favorites are the two most important Apple Music features. They also showed that these users are not tech-illiterate people who would be easily confused by Apple Music's interface.
In addition, since finding favorites requires users to dig deeper and use multiple functions at once (search, buttons, filters, and navigation), I conclude that most Apple Music users are already familiar with current tech and design trends in applications. This is consistent with Jakob's Law, which postulates that users spend most of their time on other sites and therefore expect your interface to work like the ones they already know.
From the pain-point exploration, the problems raised by current Apple Music users in the Reddit discussion can be classified into:
- Lyrics accuracy
- Real-time compatibility
- Bouncing between apps
User flow when singing along to lyrics from a music player
I asked five of my friends how they look for lyrics when they want to sing along to a playing song. They were not sure of the exact flow, since it varies with their mood, but their main activities can be classified as in the flow above. Basically, they first identify the song they want to sing; then they either play the song first or look up the lyrics so they can sing along with it.
The news of the Genius and Apple Music integration is welcome because it can eliminate the lyric-browsing activities, letting users go straight to the goal (singing along with the song) once they know which song they are going to play.
The flow presented below is the proposed flow users would follow once the real-time lyric feature is implemented in the interface.
After finding this supposedly optimal flow, I formulated the design requirements based on the preceding design-thinking process. The design focuses on these features:
- Manual editing for accuracy
- An overlay lyric player for real-time functionality
- Direct access to the lyrics screen
I like using a notebook and pen to generate first ideas about a product's features
I use a pen and notebook (sometimes loose paper, but I prefer a notebook because it doubles as an archive) to sketch initial ideas for tackling the pain points. The picture above is the final sketch after multiple iterations, at which point I decided to move on to the prototyping stage.
I used Adobe XD for the hi-fi prototype. Translating the pen sketches into a hi-fi prototype was fairly fast, since most of the work had already been done during research and ideation before I moved into prototyping.
To test the prototype, I contacted the same five friends I had asked about how they look for lyrics once a song has been chosen. This stage validates the design and identifies potential pain points; its results determine whether the prototype needs further adjustment or iteration.
Testing found no major usability issues with the flow or element placement, but participants had differing opinions about where the real-time lyric panel should appear after clicking "Play lyric in the background". Although the default position caused them no problems, almost all five respondents had a preferred placement for the lyrics. Since this is a subjective preference, I decided to add a new capability that lets users drag and resize the real-time lyric panel.
By providing access to real-time lyrics while working on another task, users can unwind by singing along to their favorite music without the hassle of searching the internet for lyrics and constantly switching between apps.
I hope this prototype reduces the time and cognitive effort users need when picking the next song they want to sing.
Click here to view the full-screen prototype, or play it below.