
SHOEBOX Online: an accurate and accessible online hearing screener

This project was completed as an employee of SHOEBOX Ltd.

Image of different desktop screens of SHOEBOX Online: instructions, choosing headphone type, test, and results

Summary

SHOEBOX Online is an online hearing screening test that can be completed by anyone, anywhere, at any time. All the person needs is a pair of headphones and a device that can connect to the internet.


The project was completed by a product team whose composition changed several times, as we added people for specific features and requirements and then released them once that work was done. At the start, it consisted of several software developers, a product manager, an audiologist, and myself as co-team lead/lead product designer/UX researcher/UX writer. Toward the end of the project, in addition to the team listed above, we also had a full-time team lead (taking over the co-team lead roles a software developer and I had shared), an academic researcher, our VP of engineering, our chief technical officer, our device and headphone calibrator, and a junior designer.


The goals of the project were to design an online hearing screening test that works accurately with any device and any pair of headphones, with usability geared particularly toward older adults. This, it turns out, is a tall order: none of our competitors had succeeded at this in a screening format, so we were in uncharted territory.


The project ran around 11 months, five of which were spent mostly on perfecting the hearing test method and algorithm, both of which were tightly coupled with the test modality and the instructions.


The product screens much more accurately than competitors' products. Ours works on mobile devices as well as desktop (a surprisingly uncommon capability), and accuracy was only slightly affected by the style and quality of headphones (not enough to mislead). Results were consistently accurate in comparison to a person's diagnostic results, and participants (potential patients) rated them as highly trustworthy.


We also had the unique experience of launching the product one month post-lockdown. This turned out to be a boon for the product because suddenly, hearing clinics couldn’t easily screen or test patients in person. The prospect of an accurate online hearing test became a popular option, much more so than we were initially expecting.


 

My Responsibilities

My responsibilities for this project included Team Lead responsibilities, Product Research, Product Design, UX Research, UX Writing, and Information Architecture. How much of each of these was my responsibility is detailed in the graphic below.

Graphic depicting my responsibilities throughout the project and how much of them were my responsibilities: 40% team lead, 50% product research, 90% product design, 80% UX research, 70% UX writing, 70% information architecture
 

Process

The process for this project was somewhat complicated, shaped by what we were constantly learning about the accuracy of the testing method and algorithm. We had to step back to research multiple times after design and testing before we could proceed through the beta launch, which itself included further iterations, and then finally product launch. As with any product, we then gathered feedback and went back to design and testing.

Graphic depicting the overall process: Research to Design and Testing (iterative) to beta launch, launch, and feedback. Beta launch and feedback both have iteration pointing back to design. Testing also has a line pointing back to research.
 

Preliminary Research

Preliminary research was completed by the product manager, the developer coding the algorithm, the audiologist, and myself. This project definitely didn’t follow a linear process—we had to do further research much later on in the project to get the accuracy to where we wanted it. That’s when we added an academic researcher, the CTO, and the VP of engineering to the research team.


A unique factor in this product design was our parent company's desire for us to use a patented test method that belonged to them. This test method, however, had never been used with so many variables involved, nor in an unsupervised application. Its full protocol also took much longer than we wanted the online test to run. On its own, it also didn't give us enough information to meet the accuracy threshold we had set for ourselves. This meant we had to figure out how to use the test method with a high number of variables, in an unsupervised format, in a much shorter time frame, and in combination with a method that would improve the accuracy of the results.


Given the sheer amount of research we had to do, we were lucky that we had already launched QuickTest by this point, so we were able to use some of the things we’d learned about creating a good hearing screening test from that project to inform this one.


The research for this project included:

  • Developing customer use cases covering goals, technical requirements, and the knowledge they wanted participants to come away with about their hearing

  • Reviewing competitor applications to understand test methods used, usability issues and successes, common workflows, and feature and information gaps where we could add value

  • Developing profiles of participants who would be using the screening application, as well as profiles of administrators who would be setting up the application

  • Reviewing academic research and interviewing our resident audiologist on how to position hearing test results to participants without alarming or discouraging them, and to build trust and encourage them to seek help if required

  • Conducting accessibility research by reviewing the WCAG guidelines and a Pew Research Center study on how older adults use technology

  • Reviewing the research on the patented test methodology we were required to use

  • Reviewing the research on additional test methodologies that could provide a further layer of accuracy

  • Developing the code to produce a test algorithm that would follow the methodology

  • Testing the output of a variety of headphone types and brands coupled with a variety of devices, so that we knew the average adjustments that would need to be made to calculate accurate results (a simplified sketch of this idea follows the profiles below)

  • Reviewing the research on questions that can be used to build hearing profiles, so we could add yet another layer to our result calculating algorithm

  • Reviewing the latest WCAG guidelines and methods for dynamic sizing, so that if people had larger text sizes turned on in their browsers, we could honour that in our test

Images of participant profiles. Julie is a 26-year-old bartender concerned she may have hearing loss due to the loud music in the bar where she works. She has mild loss and is moderately likely to adopt, given her fear of losing her ability to hear music. Marian is a retired older adult who fears dementia; she has moderate-to-severe loss and is highly likely to adopt. Ying is a 58-year-old office worker who has trouble hearing his coworkers. He is moderately likely to adopt, since he doesn't want to seem old or checked out around them because he can't hear.
Image of administrator profiles. Assad is a VP of marketing at a hearing aid company who is interested in proven success. He wants to use SHOEBOX Online to gather leads for clinics that sell his company's products. Alyssa is an audiologist and clinic owner who gets SHOEBOX Online with QuickTest and wants to use it to get more leads.
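To give a concrete (and purely illustrative) sense of how the averaged headphone and device adjustments mentioned above can be applied, here is a small sketch in TypeScript; the categories and dB offsets are invented for illustration, not our measured values:

```typescript
// Illustrative only: categories and offsets are invented, not SHOEBOX's measured values.
type HeadphoneCategory = "earbuds" | "on-ear" | "over-ear";

// Average correction (in dB) applied to raw responses for each headphone category.
const CALIBRATION_OFFSETS_DB: Record<HeadphoneCategory, number> = {
  earbuds: 4,
  "on-ear": 2,
  "over-ear": 0,
};

// Apply the category's average adjustment to a raw threshold estimate.
function adjustThreshold(rawDb: number, category: HeadphoneCategory): number {
  return rawDb + CALIBRATION_OFFSETS_DB[category];
}
```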

 

Design

Because SHOEBOX Online is not a medical product, we had to tread carefully with our language and design. It was important to make it very clear that we weren't diagnosing anyone's hearing through the online screener; in the medical product world, anything that diagnoses is subject to strict regulations, certifications, and audits.


However, we also wanted people who used SHOEBOX Online to trust the results, so we had to ensure it still seemed professional and trustworthy, even if not strictly medical.


Given our audience of older adults, we also had to make sure the design and copy were accessible and easy to understand.


On top of this, SHOEBOX Online was to be translated into multiple languages, so we needed to keep the copy as short and simple as possible.


To accomplish our goals, the design and copy decisions we made (very similar to the ones for QuickTest, our other screening product) include:

  • Separating each step into one per screen, for easier digestion

  • Adding polished yet friendly graphics, for visual interest, approachability, and to make the product feel non-medical

  • Using simple, jargon-less language to explain the screening test and results

  • Depicting the results in a way that's easy for a layperson to understand and that couldn't be confused with a traditional audiogram

  • Ensuring all the text and images scale according to dynamic text size settings

  • Giving the participant control of the sound during the test: they have to increase the sound manually in order for any to play (sketched below)
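To illustrate that last point, here is a minimal sketch, using the browser's Web Audio API, of a tone that starts silent and only becomes audible through the participant's own plus and minus presses. The element IDs, step size, and test frequency are hypothetical, not the production values:

```typescript
// Minimal sketch: the tone starts at zero gain and only becomes audible once the
// participant presses the plus button. (Browsers typically require the AudioContext
// to be created or resumed inside a user gesture, e.g. a "start test" click.)
const context = new AudioContext();
const oscillator = context.createOscillator();
const gainNode = context.createGain();

oscillator.frequency.value = 1000; // example test frequency in Hz
gainNode.gain.value = 0;           // silent until the participant acts
oscillator.connect(gainNode).connect(context.destination);
oscillator.start();

const STEP = 0.05; // hypothetical increment per button press

document.getElementById("plus")?.addEventListener("click", () => {
  gainNode.gain.value = Math.min(1, gainNode.gain.value + STEP);
});

document.getElementById("minus")?.addEventListener("click", () => {
  gainNode.gain.value = Math.max(0, gainNode.gain.value - STEP);
});
```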

All SHOEBOX screening products are white-label, so that companies can brand them with their own logo, images, and colour scheme to fit seamlessly within their websites and offices.


In order to allow our customers to customize, but not to mislead participants or create an inaccessible experience without knowing it, we made the following decisions (the same ones we made for QuickTest):

  • Not allowing customers to adjust any of the copy surrounding the test instructions, results, or disclaimer, but allowing them to add their own copy to what existed

  • Using code on the backend to dynamically choose white or black text for any text that appeared on a coloured background, based on which option met the WCAG AA contrast ratio of 4.5:1 (a simplified sketch of this check follows this list)

  • Providing a warning if the chosen colour scheme didn't meet a 3:1 contrast ratio against the white background of the screen. Customers could still move forward with their choice (product management's decision), but would knowingly be noncompliant with WCAG
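The production code isn't shown here, but a minimal sketch of this kind of contrast check, written in TypeScript and following the WCAG 2.x relative luminance and contrast-ratio formulas (the function names and thresholds used below are assumptions based on the bullets above), might look like this:

```typescript
// Minimal sketch of the contrast logic described in the bullets above (not the
// production code). Relative luminance and contrast ratio follow WCAG 2.x.

function relativeLuminance(hex: string): number {
  // Expects a colour like "#3366cc".
  const [r, g, b] = [1, 3, 5].map((i) => {
    const channel = parseInt(hex.slice(i, i + 2), 16) / 255;
    return channel <= 0.03928
      ? channel / 12.92
      : Math.pow((channel + 0.055) / 1.055, 2.4);
  });
  return 0.2126 * r + 0.7152 * g + 0.0722 * b;
}

function contrastRatio(hexA: string, hexB: string): number {
  const [lighter, darker] = [relativeLuminance(hexA), relativeLuminance(hexB)]
    .sort((a, b) => b - a);
  return (lighter + 0.05) / (darker + 0.05);
}

// Pick whichever of white or black text contrasts better with the brand colour,
// flagging the case where neither reaches the WCAG AA ratio of 4.5:1.
function textColourFor(background: string): { colour: string; meetsAA: boolean } {
  const white = contrastRatio(background, "#ffffff");
  const black = contrastRatio(background, "#000000");
  return {
    colour: white >= black ? "#ffffff" : "#000000",
    meetsAA: Math.max(white, black) >= 4.5,
  };
}

// The customizer warning described above: the brand colour is checked against the
// white page background at the 3:1 threshold before the customer saves it.
function brandColourWarning(brand: string): string | null {
  return contrastRatio(brand, "#ffffff") < 3
    ? "This colour does not meet a 3:1 contrast ratio against white."
    : null;
}
```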

Graphic depicting the design process: Workflow, Sketching, Wireframes, Copy, Prototype, Mockups
Image depicting different participant screens designed for desktop
Image depicting different participant screens designed for mobile
Image depicting different result screens on desktop and mobile
Image depicting the administrator portal setup pages and participant list pages on desktop
Graphic depicting the participant workflow through SHOEBOX online
Graphic depicting the administrator workflow for setting up SHOEBOX Online on their website and reviewing participant results
Image of sketches of test ideas
Images depicting black and white simple screens of the test that were used for prototype testing
Images of charts and data from usability testing

 

Testing

Throughout the process, we completed many usability and accuracy tests, both internal and external, to ensure our product was working with all the variables in place. With so many variables, and a duty to ensure people weren’t provided with erroneous results, this part of the project was absolutely critical.


Usability (which included form, function, and instructions) and accuracy were tightly coupled: without accuracy, usability was useless, and without usability, accuracy was useless. We couldn't really test one without the other when it came to the actual test portion of the product, so every test had to account for both. This made the data particularly difficult to sort through, and many ideas had to be thrown out along the way.


Testing Timeline

Graphic depicting the testing timeline of when certain kinds of tests occurred over the period of 11 months from prototype to launch

Learning

  • Including multiple steps on one screen for the test setup was too confusing for many participants. Breaking it up into multiple screens, even though it made the flow seem longer to us, was actually quicker because people weren't confused by future steps being visible.

  • Many people do not know how to adjust the volume on their devices, so the setup needed to include instructions for doing so.

  • People were worried about turning their volume up to 100% (which was required for accuracy). Including reassurance that we would not play sounds loud enough to be uncomfortable was paramount.

  • Many people don’t know which side of their headphones are right and left, so we needed a method in our setup to allow them to choose which side they heard the sound through. Then we could flip the results for the channels if they chose the right side, but we read it behind the scenes as the left channel. This ensured their results were not flipped.

  • Vague instructions could be interpreted in many ways. We had to test many small iterations of the instructions to find something specific and simple enough that the majority of people (including those with cognitive difficulties) could understand.

  • The visual of sound increasing was incredibly important, so that people knew they were doing something and the test wasn’t broken even if they couldn’t hear anything.

  • We played around with more modern ways of adjusting the sound, including clicking on the graphic itself, or using a touch slider on mobile, but in the end, the simplest solution of plus and minus buttons was the best one, particularly for our target audience of older adults.

  • Regardless of how accurate we knew our test to be, people are often skeptical of new technology, particularly if they aren't early adopters (which we know the majority of people aren't). We did everything we could to provide assurance and build trust, but in the end we realized we can't control this 100%, and we had to accept that a large number of people would remain skeptical of an online hearing test.
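As a small illustration of the left/right handling described earlier in this list, here is a hypothetical sketch (the names and data structure are mine, not the production code): the setup tone is played through the left channel, and if the participant reports hearing it on the right, the per-ear results are swapped before being reported.

```typescript
// Hypothetical sketch of the channel-swap idea: the setup tone is played through
// the left audio channel; if the participant heard it on the right, the headphones
// are reversed, so the per-ear results are swapped before being shown.
interface EarResults {
  left: number[];  // thresholds measured through the left audio channel
  right: number[]; // thresholds measured through the right audio channel
}

function correctForReversedHeadphones(
  results: EarResults,
  heardSetupToneOn: "left" | "right"
): EarResults {
  return heardSetupToneOn === "left"
    ? results
    : { left: results.right, right: results.left };
}
```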

 

Launch

This product was originally developed with our parent company in mind, and we honestly didn't expect it to be a huge draw for customers. Hearing aid companies often provide online hearing tests to hearing providers to include on their websites for free, in exchange for pushing their products, so getting people to pay for a test was a hard sell.


However, due to COVID, people suddenly needed a much more reliable and robust online hearing test to screen potential patients. To mitigate risk, nobody wanted more people coming into their clinics than necessary, so our product took off. It helped that we had run many tests and that our academic researcher had written a study on the accuracy of our test; that made it easy to convince skeptical customers that we were in fact much, much better than the competition.


SHOEBOX Online is currently being used by:

  • Small hearing clinics

  • Hospitals

  • Large hearing aid providers where online sales are possible (Europe and the USA)

  • Insurance companies in the USA

On average, 8700 screening tests are done per month across all our customers.


 

Reflection

Overall, we determined that we met our goals with the launch of SHOEBOX Online. Due to all our painstaking research and robust testing, we knew we had a much better product than our competitors—much more usable, as well as much more accurate.


What would I do differently?

Looking back on this project, I think we could have completed it faster than we did, even with the amount of testing required. I would have stood up a bit more for the project scope: at one point in our research and testing, our product manager got a bit carried away with a suggestion from a sister company that was very complex and not really needed to fulfill the requirements of the MVP. I should have raised the scope issue much earlier and pushed to keep us on track. That would have cut about two months from the project and allowed further work on features we had to cut for the MVP.


I also should have gotten our audiologist and academic researcher involved in the instruction writing much earlier. In general, I haven’t had much trouble writing instructions for our products, but given this was a completely different way of conducting a hearing test, I should have realized I was out of my depth earlier on in the project.



