Editor's Note: We are delighted to share this post in our series by UsableNet's Spring Marketing Intern, Brielle Cayer. Read more about Brielle in her first post, "Assistive Technology: What It Is and How it Helps Me."
In this post, Brielle explains the importance of mobile accessibility and how she uses her smartphone as a piece of assistive technology.
Social media scrolling, selfies, and Starbucks orders – these are the activities people assume I use my phone for as a young person. In all honesty, they aren’t always wrong; I certainly enjoy partaking in some of these stereotypes once in a while!
However, as a legally blind individual, I value my phone as a tool for something even more remarkable: digital accessibility.
Because my achromatopsia only reduces my visual acuity to 20/200, I have enough vision to use a regular smartphone with a few accommodations.
After reading the article, “How a Blind Person Uses a Mobile Device,” I'd like to offer my perspective as a visually impaired mobile phone user who has some usable, albeit low, vision. I've also included some additional tools that people with other disabilities can use for enhanced digital usability.
A Tool for the Visually Impaired
As a member of Gen Z, I’m often on my phone, whether I’m scrolling through Instagram, looking up nearby lunch options, or Googling random topics. But when I moved to the city for college, my phone became even more important: an essential visual aid and navigation tool.
- I use my iPhone’s Maps app to figure out where I am, which way I’m facing, and how to reach my destination. One beneficial aspect of the app is its compass-like ability to show the direction I’m facing (this is so important for getting around NYC!). The recently expanded detailed voice guidance feature in Google Maps and Microsoft’s Soundscape tool are also fantastic options for those who like detailed narration on their journey.
- I use my camera to zoom in on faraway objects such as street signs, building names, or cute dogs my friends point out to me. Even better is the iPhone’s Magnifier app, which is designed for exactly this purpose. It reaches intense magnification levels; I love it!
- There’s a fantastic app called “Be My Eyes” that connects blind and low-vision users with sighted volunteers over live video for help with visual tasks, such as identifying a color or reading a sign. I love that these types of community-driven efforts exist on most people’s phones!
How I Use My Phone
A few simple settings or customizations I use to make my mobile experience as accessible as possible include:
- Increased Contrast: This boosts the contrast between an app’s foreground and background colors, which can help with readability, among other things.
- Decreased Brightness: The first thing people tend to notice about my phone is how consistently dim the screen is. Since my visual impairment is characterized by “day blindness,” I try to keep the brightness down on all my devices whenever possible.
In fact, I go further than just dimming the brightness setting. I turn on “Night Light Mode,” which uses warmer colors that are easier on the eyes. I also decrease my screen’s white point, which reduces the intensity of bright colors.
- Dark Mode: As with lowering my brightness, I find relief by enabling “Dark Mode” or any high-contrast option.
Making most of the screen’s background black instead of white cuts down on glare and, with it, my eyestrain.
- Increased Display Size: This scales up my phone’s entire interface, making icons, buttons, and text all larger.
- Zoom: By double-tapping with three fingers, I can have my iPhone zoom in on a section of the screen at my preferred magnification level. This is a great feature that I use every day.
- Large, Bold Text: This setting affects all of the main text on my screen, including the phone’s interface and many apps that support it. (If you build apps, the short sketch just after this list shows how they can pick up settings like these.)
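For the developers reading along, here is a minimal SwiftUI sketch, purely illustrative and not taken from Apple's documentation or from this post, of how an app can read a few of the system settings I just described (Dark Mode, Increased Contrast, Bold Text, and the user's preferred text size) and adapt to them instead of overriding them. The view name and wording are made up for the example.

```swift
import SwiftUI

// Illustrative only: SwiftUI exposes several of the system settings
// described above as environment values, so an app can adapt to them
// rather than override the user's choices.
struct GreetingView: View {                                        // hypothetical view name
    @Environment(\.colorScheme) private var colorScheme            // Dark Mode on or off
    @Environment(\.colorSchemeContrast) private var contrast       // Increase Contrast
    @Environment(\.legibilityWeight) private var legibilityWeight  // Bold Text
    @Environment(\.dynamicTypeSize) private var typeSize           // Larger Text sizes (iOS 15+)

    var body: some View {
        Text("Welcome back!")
            .font(.body)  // a text style, not a fixed point size, so Dynamic Type can scale it
            .fontWeight(legibilityWeight == .bold ? .bold : .regular)
            .foregroundColor(contrast == .increased ? .primary : .secondary)
            .padding()
            // Respect Dark Mode rather than hard-coding a light background.
            .background(colorScheme == .dark ? Color.black : Color.white)
            // Let long text wrap when the user picks an accessibility text size.
            .lineLimit(typeSize.isAccessibilitySize ? nil : 1)
    }
}
```

The point isn’t these specific modifiers; it’s that the settings I rely on are choices the platform already exposes, so apps don’t have to guess at them.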
Exciting Developments in Mobile Usability
I'm certainly not the only one who benefits from digital accessibility features!
One in four Americans lives with a disability that impacts “major life activities.” That's a noteworthy percentage of people who may need additional assistance when it comes to using technology.
Thankfully, leading tech companies are demonstrating considerable efforts in this area. A few exciting examples that come to mind include:
- Sound Recognition for iOS and Android devices automatically listens and alerts users when it detects specific sounds such as a siren, smoke alarm, crying infant, running water, or doorbell. Apple’s Made for iPhone hearing aid program also lets compatible hearing aids pair directly with Apple devices and customize sound to fit your preferences, and the Live Listen feature that works with these hearing devices is now included in AirPods.
- Google's Project Euphonia is improving voice recognition by collecting audio recordings from people with atypical speech patterns. Google is also exploring ways to recognize gestures and facial expressions for individuals who don't speak.
- Apple's improved Voice Control lets users navigate their entire device using only speech.
- Group FaceTime detects when a participant is using sign language and makes that person the prominent speaker in the call.
- Android's Live Transcribe app types out a transcript of surrounding dialogue in real time. The iPhone apps Ava and MyEar offer similar capabilities.
- Google's Live Caption tool automatically captions audio and video content. Google also offers the Lookout app, which reads text from pictures: visually impaired people can simply snap a photo of something like a sign or a menu and quickly know what it says.
How to Prioritize Digital Accessibility
The above settings are simply my personal preferences; every person uses (or doesn’t use) different features to fit their own needs. For the most part, I can control my smartphone experience (separate from individual apps), which is very important to me.
When your business is designing a mobile feature, app, or website, you should consider every user’s unique needs. When you take steps like prioritizing inclusive design, employing a diverse array of user testers, and committing to your platform’s accessibility, all of your users will thank you!
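To make that concrete, here is one tiny, hypothetical example of inclusive design in practice, sketched in SwiftUI (the view, icon, and wording are my own illustration, not something from this post): an icon-only button means nothing to a screen reader unless the app describes it.

```swift
import SwiftUI

// Illustrative only: an icon-only control is meaningless to a screen reader
// unless the app says what it does.
struct ShareButton: View {                              // hypothetical view name
    var body: some View {
        Button {
            // Hypothetical action: present sharing options here.
        } label: {
            Image(systemName: "square.and.arrow.up")    // icon with no visible text
        }
        .accessibilityLabel("Share")                    // what VoiceOver announces
        .accessibilityHint("Shows options for sharing this page.")
    }
}
```

Small touches like this, applied consistently and then verified with real users of assistive technology, are what make an app genuinely usable for everyone.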
If you'd like to learn more about how to prioritize digital accessibility and inclusive design for mobile and native apps, UsableNet is offering a free webinar.
On May 12th at 11 a.m. ET, you can join UsableNet's Senior UI/UX Designer, Luca Boskin; Head User Tester, Joseph DiNero, who is blind; and Chief Innovation Strategist, Jason Taylor, for the webinar "Inclusive ADA Principles for Mobile App Accessibility."
Space will be limited, so save your seat now.
Contact us to learn more about UsableNet mobile app accessibility services.