BLUF – Designing for disability inclusion challenges the signals we use to relay data. Recognising the need for signals beyond those currently on offer opens the door to new, complementary sensory platforms we hadn't considered, redefining how we engage with our digital landscape.
*Note: this piece is written as a companion to a previous post – On Sensing Digital Danger. The intent is to introduce unmet needs in digital spaces, namely the sensory dimensions currently lacking in our media. As we increasingly immerse ourselves in digital lives, continued survival (literal or figurative, your choice) requires engaging all our senses.*
Ever compared the experience of reading a story in digital versus physical media? Digital gives you the streamlined scroll, letting you get through much faster. The main points stand out, though your attention often gives way – notifications may have crept in.
Reading a physical book is a bit different: first you see the cover, its relative age and condition. You grasp its weight, note the scent and texture of the paper. You note the print type, along with dog-eared corners where someone stopped.
Looking back on the story, you’ll find most people recall more from the physical book. Why? Because the experience engaged more senses. Flipping through, you likely remember where some pithy quote you favoured sat – searching a general section, you spot a familiar-looking paragraph in the bottom-left third of the left-hand page. These cues anchor your memory and, by proxy, your recall.
From an inclusion perspective, the challenge lies in relaying the message. The data doesn’t change, but the means of communication must adjust to fit. We develop alternative displays as best we can, but doing better still requires complementary sensory inputs to represent the data adequately. The trouble is that the senses we rely upon, in some cases heavily, in physical space are absent in digital.
We often think of the immersive turn in AR/VR as a next step, and in many ways it can be. The dynamics of digitally overlaying or replicating physical space allow for incredible experiences. Adding haptic devices brings these platforms to roughly three and a half senses (the vestibular sense, i.e. movement, is somewhat there, but not enough to keep people from occasional bouts of nausea).
However, AR/VR hasn’t grown to full immersion, and may not meet inclusion needs. Someone with significant visual impairment, for example, may not find AR/VR useful as an option. Moreover, whilst some of the digital creations are truly incredible, we’ve yet to incorporate them into more standard daily work. How will this drastically improve my PowerPoint or Excel files? If an organisation develops its own enterprise tools or solutions, how much more dev work will be needed to add AR/VR features?
Client sales presentations with AR/VR sensory immersion are built to impress, but incorporating full-spectrum sensory elements into the tools of the main workforce should be the target. Doing so enhances not only people’s efficacy but their experience along the way. It may also open avenues for inclusion by offering a variety of options for utility, interpretation, presentation, and engagement, giving users more leverage to determine which sensory data will help with the tasks before them. Need a bit more sound? Or would touch be better? Giving people choice in how much or how little sensory data is rendered would serve physically diverse and neurodiverse populations alike.
The challenge is not only in how to create the different sensations; we have made great strides in creation. Multiple facets require further consideration:
- Bringing individual parts together with legacy systems already in use: As stated earlier, we ask how this can be of use to the systems we rely on in business – Excel, PowerPoint, Quicken, and so on. Whilst recognising the need to use more senses to enhance inclusion for a variety of people, how will each added sense improve the utility of the tools an organisation depends upon? There is further to go over time, but for now, work towards building inclusion into the legacy rather than reimagining the whole around foundational inclusion. Some people may already be there, or can build from scratch – but many aren’t, and need to see the benefit before they’ll move those mountains.
- Finding complementary signalling: Outside of video games, there isn’t much focus on creating an experience from multiple complementary sense signals. Where a game might subtly shift its visuals, audio, and haptics as events unfold, the same is not true of ordinary business programs. When Outlook gets a new email, I hear a tone and see a notification flash. Adding complementary sensory signals so the visually or aurally impaired can sense a new email may be easy, but it forces us to think about how to use every available sense in our messaging.
- Uncovering what needs to be said through different sensory means: Nuanced messages could reach more people via different senses, but doing so requires understanding how various complementary sensory inputs might tell a different story (or a different sense within the story) with minor adjustments. This work is long overdue: part training digital spaces how to signal, part training people in better signal interpretation.
- Letting inclusion drive the build: As this is an underdeveloped area, we have an opportunity. We can base the build on making it possible for anyone to gain more from using all the senses they have available. Building to enhance the experience of those with disabilities – whatever their form – will benefit others too, opening possibilities they never considered.
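The complementary-signalling and user-choice ideas above can be sketched in code. This is a minimal, hypothetical sketch – the profile, event, and channel names are all invented for illustration, not drawn from any real notification API. It shows one event (a new email) mapped onto whichever sensory channels the user has left enabled, scaled by their stated preferences:

```python
from dataclasses import dataclass

# Hypothetical per-user sensory preferences; 0.0 disables a channel.
@dataclass
class SensoryProfile:
    visual: float = 1.0
    audio: float = 1.0
    haptic: float = 0.0

@dataclass
class Notification:
    event: str
    urgency: float  # 0.0 (ambient) to 1.0 (demands attention)

def render_signals(note: Notification, profile: SensoryProfile) -> dict:
    """Map one event onto every enabled channel, scaling intensity
    by both the event's urgency and the user's preference weight."""
    signals = {}
    for channel in ("visual", "audio", "haptic"):
        weight = getattr(profile, channel)
        if weight > 0.0:
            signals[channel] = round(note.urgency * weight, 2)
    return signals

# A visually impaired user might drop the visual channel entirely
# and lean on audio and haptics instead.
profile = SensoryProfile(visual=0.0, audio=1.0, haptic=0.8)
print(render_signals(Notification("new_email", urgency=0.5), profile))
# {'audio': 0.5, 'haptic': 0.4}
```

The design point is the separation: the event itself carries no channel decisions, so the same data can be re-rendered for any sensory profile without touching the business logic that raised it.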
Other focal areas will emerge as we uncover more. We start somewhere. We’ve a long way to go.