PianoBoard | Mobile AR Game
PianoBoard is a mobile AR experience that teaches players to play piano. Using an Aryzon or another mobile VR headset with a passthrough camera view, the player sees a sequence of AR keys ‘falling’ onto the keyboard in a set order and rhythm, letting them practice piano.
In this project, I collaborated with engineer Serge Shack and was tasked with conducting usability testing for the app and improving its usability and UI. The results of the testing showed that the concept was of great interest to the target audience; however, a number of problems were highlighted in both the on-boarding and in-game experiences.
The main goals of the testing were to find out users’ main pain points, as well as their expectations for how to use the app and where to find the content. A few technical constraints were known to inconvenience users but had to remain in the app – such as printing out the marker images.
On-boarding / Pre-Game Experience
Unclear order of actions.
Unclear which parts of Home screen UI are interactive.
Poor UI organisation.
While on-boarding, users can take two different routes depending on the keyboard device they use – these flows are not reflected in the on-boarding UI.
The flow for the returning user is not reflected in the original UI.
To solve the problem, three flows were suggested: one for a new user playing an analogue piano, one for a new user playing a digital piano with a MIDI output, and one for a returning user. Each scenario provides a clear path and clear instructions on what the user needs to do to start practicing. The interactive prototype for the on-boarding experience can be viewed here.
After on-boarding, the in-game experience asks the user to place their phone in the VR headset with the passthrough camera view. The user is then asked to calibrate the AR piano and pick a melody to practice, after which the sequence of AR keys ‘falls’ onto the keyboard in a set order and rhythm to let them practice piano.
After play testing the experience with a number of users, a few common problems were highlighted:
Information architecture seemed rather complex to navigate.
Labelling wasn’t clear enough.
In some instances, labels were too small to read.
Users had to move their head closer and further, up and down, to see buttons and their labels.
While playing, users would accidentally trigger buttons by gaze.
The experience would switch between the VR view and the 2D view depending on the buttons clicked.
The Menu icon was too small, and it was surrounded by four blocks that were meaningless and not interactive.
To prevent Menu items from being triggered by gaze while playing piano.
To keep Menu items visible, readable, and of an accessible size, shape and tilt.
To simplify interactions for controls such as adjusting speed, timeline, and volume.
Based on the new Information Architecture map and these challenges, a 2D interactive prototype was created and then implemented by the developer in the app.
Voice Activated Controls
The pre-game and in-game experiences were validated as intuitive enough; however, gaze controls slowed down the experience significantly.
To simplify interaction within the app, we suggested implementing voice-activated controls, using a template matching recognition method for the trigger word and the commands listed in the Menu.
The trigger word would simply be “PianoBoard”, while the list of commands would mirror the items in the Menu.
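The trigger-word-plus-command idea could be sketched roughly as below. This is a minimal Python illustration, not the app’s actual implementation: the `parse_utterance` helper and the command list are hypothetical placeholders, and a fuzzy string match (via `difflib`) stands in for the audio-level template matching.

```python
import difflib
import re

# Hypothetical trigger word and command set, for illustration only;
# the real Menu commands in PianoBoard may differ.
TRIGGER = "pianoboard"
COMMANDS = ["play", "pause", "slower", "faster", "volume up", "volume down"]

def parse_utterance(transcript: str, cutoff: float = 0.6):
    """Match a recognised transcript against command templates.

    Returns the best-matching command, or None when the trigger word
    is missing or no template is close enough.
    """
    words = re.findall(r"[a-z]+", transcript.lower())
    if TRIGGER not in words:
        return None
    # Treat everything after the trigger word as the command phrase.
    phrase = " ".join(words[words.index(TRIGGER) + 1:])
    matches = difflib.get_close_matches(phrase, COMMANDS, n=1, cutoff=cutoff)
    return matches[0] if matches else None
```

The fuzzy cutoff lets the matcher tolerate small recognition errors, so a slightly garbled phrase such as “slowr” still resolves to the “slower” command, while utterances without the trigger word are ignored entirely.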
Team: Serge Shack (technical)
Role: User Research, UX, UI
Tools: Adobe XD