Inclusivity
We have long lived in a world with ableism.
Ableism, as you might guess from the name, is the mindset that treats being “able-bodied” as the “natural”, the only “normal” state of existence. People are assumed to be without disabilities until they show otherwise. The consequences of such a mentality are vast, and they have affected differently abled people for ages. One of them is that, for a long time, everything – from public spaces to textbooks – was designed by default for people without special needs. Not only were such spaces and materials inaccessible to differently abled people, they also led them to believe that experiencing what others experience was off the table for them.
Gradually, efforts were made to understand what differently abled people need – some of them by people with disabilities themselves. For instance, Louis Braille, who was blinded in an accident at age three, grew up to create braille, a tactile reading and writing system for blind people, in 1829. Organised protests and growing awareness also made public spaces more inclusive. Many governments passed legislation mandating ramps, wheelchairs, special handles, and the like in hospitals, schools, restrooms, and other public facilities. Similarly, makers of phones, laptops, televisions, and other devices adopted technological measures to make technology more accessible.
In this blog, I will talk about some of the measures Apple has taken to make the iPhone more accessible to people across ability levels, including the much talked about LiDAR sensor on the latest iPhone 12 Pro.
Apple’s History With Accessibility
In a world where nearly everybody relies on smartphones, Apple made a much needed move in 2009, when it took its first step towards making the iPhone more accessible. The VoiceOver feature it created that year – which I will talk more about in the following paragraphs – made Apple the first smartphone company to launch such a feature, and it was received with open arms by the differently abled community. Apple went on to launch numerous other features for the community. Here’s a list of options it offers for different categories of needs:
Hearing Needs
-Apple supports hearing aids based on Bluetooth technology. Through these aids, people with hearing disabilities can stream whatever they’d like with good sound quality. They can also manage audio levels through their phone, adjusting the volume for each ear separately depending on its hearing range. This option is available for regular earphones/AirPods as well.
-Subtitles or captions are easy to find and can be turned on at any time.
-There are vibration alerts and alarms.
-Further, when this setting is turned on, the iPhone can recognise common sounds like a fire alarm, a tsunami/flood warning siren, a doorbell, and so on, and display alerts or notifications on the screen.
Motor Needs
-There’s an option of assistive touch for people who find the home button less manageable.
-In most iPhones, one can shake to undo.
-The screen sensitivity and home button sensitivity are adjustable.
-A recent feature, Back Tap, allows users to tap the back of their iPhone and choose from over 24 different actions. This list can be edited, and users can create their own shortcuts.
Visual Needs
In 2009, the iPhone 3GS launched, making an iPhone usable by blind people for the very first time. The addition was the VoiceOver screen reader, as mentioned above.
-VoiceOver developed into a built-in feature, preinstalled on every iPhone. It provides screen reading – an option to read aloud everything that’s written on the screen. Its functions progressed to eventually allow sending texts, typing, and even opening apps by double-tapping them (a short sketch of how apps expose their content to VoiceOver follows this list).
-There are gestures for changing settings like VoiceOver’s speaking rate.
-iPhones were made compatible with refreshable braille displays.
-For low vision users, text enlargement was introduced. The iPhone’s screen can now be magnified up to 1,500%.
-Dark Mode was introduced for users with light sensitivity.
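To make the screen-reading point concrete, here is a minimal, hypothetical sketch of how an app can label its content so VoiceOver has something to read aloud. The image name and label text are made up for illustration; this is just one small corner of the accessibility APIs, not a full example.

```swift
import UIKit

// Hypothetical example: labelling an image so VoiceOver can describe it.
let coverImage = UIImageView(image: UIImage(named: "bookCover"))

// Mark the view as an accessibility element and give it a spoken description;
// VoiceOver reads the label when the user touches the image and activates it on double tap.
coverImage.isAccessibilityElement = true
coverImage.accessibilityLabel = "Cover of the book, hardcover edition"
coverImage.accessibilityTraits = .image
```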
Here’s an extract from a statement by Sandy Murillo – a journalist and blogger with visual impairment:
‘Today, the iPhone not only helps me stay in touch with the world, it also gives me more independence. Apps like LookTell Money Reader and TapTapSee allow me to identify things without needing someone’s assistance. With the Bard Mobile and NFB NewsLine apps I can download books, newspapers and magazines in a matter of seconds to listen on my iPhone. The kNFBReader app quickly scans printed documents and reads them out loud to me. Thanks to Voiceover and the built-in accessibility of the camera, I can even take pictures! Finding last minute transportation has become easier thanks to apps like Lyft and Uber, and I can easily find my way to unfamiliar locations with the phone’s GPS.’
Here’s Sandy’s complete blog – she writes content straight from the heart, has the lived experience of disability, and deserves to be heard!
The new iPhone 12 Pro takes these visual aids to the next level with the introduction of LiDAR technology and people detection.
An Overview of the iPhone 12 Pro and its LiDAR Feature
The unexpected new feature the iPhone 12 Pro and 12 Pro Max have is the ability for blind users and users with low vision to – for all practical purposes – see other people coming. The feature went live on Thursday with the launch of iOS 14.2.
Apple has christened this new feature ‘People Detection’. It relies on the LiDAR sensor on the back of the iPhone – the same type of depth sensor used in augmented reality apps and games, and as the eyes of self-driving cars. Apple has applied it to the iPhone to help people with visual impairments navigate the world around them.
Examples include helping people know when to move up in the checkout line at grocery stores, alerting them when people are walking close by on sidewalks and in public spaces, and letting them know when a seat is available on public transport. The feature comes in especially handy in the current times: with the coronavirus pandemic requiring social distancing, users can keep a safe distance of up to 15 feet/5 meters from people, just like everybody else.
About the LiDAR Sensor
LiDAR stands for Light Detection and Ranging.
The LiDAR scanner is built into the camera array of the new iPhone. It is a minuscule dot on the back and uses Apple’s ARKit People Occlusion feature to detect whether someone is in the camera’s field of view. The sensor can also estimate how far away they are.
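To make the depth-sensing side of this concrete, here is a minimal sketch – assuming a LiDAR-equipped device and iOS 14’s ARKit – of how an app can read per-frame depth from the scanner. This is not Apple’s People Detection code; the class name and the centre-pixel sampling are illustrative only.

```swift
import ARKit

// Illustrative sketch: read LiDAR depth values via ARKit's sceneDepth frame semantic.
final class DepthReader: NSObject, ARSessionDelegate {
    private let session = ARSession()

    func start() {
        // Scene depth is only available on LiDAR-equipped devices.
        guard ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) else {
            print("LiDAR scene depth is not supported on this device")
            return
        }
        let configuration = ARWorldTrackingConfiguration()
        configuration.frameSemantics.insert(.sceneDepth)
        session.delegate = self
        session.run(configuration)
    }

    // Called for every camera frame; here we sample the depth at the centre of the image.
    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        guard let depthMap = frame.sceneDepth?.depthMap else { return }

        CVPixelBufferLockBaseAddress(depthMap, .readOnly)
        defer { CVPixelBufferUnlockBaseAddress(depthMap, .readOnly) }

        let width = CVPixelBufferGetWidth(depthMap)
        let height = CVPixelBufferGetHeight(depthMap)
        let rowBytes = CVPixelBufferGetBytesPerRow(depthMap)
        guard let base = CVPixelBufferGetBaseAddress(depthMap) else { return }

        // The depth map stores 32-bit floats, in metres, one per pixel.
        let centreRow = base.advanced(by: (height / 2) * rowBytes)
        let centreDepth = centreRow.assumingMemoryBound(to: Float32.self)[width / 2]

        print(String(format: "Nearest surface at screen centre: %.2f m", centreDepth))
    }
}
```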
The science behind this is simple: the sensor sends out a short burst of infrared (IR) or ultraviolet (UV) light that momentarily illuminates its target. The light bounces off the target and returns to the sensor after a certain amount of time. That round-trip time, together with the speed of light (a known constant in air), gives the distance.
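As a toy illustration of that arithmetic (not Apple’s firmware), the round-trip time of a pulse converts to a distance like this:

```swift
// Time-of-flight distance: distance = (speed of light × round-trip time) / 2.
let speedOfLight = 299_792_458.0  // metres per second (in vacuum; effectively the same in air)

func distance(forRoundTripTime t: Double) -> Double {
    // The pulse travels to the target and back, so halve the total path length.
    return speedOfLight * t / 2.0
}

// A pulse that returns after 20 nanoseconds implies a target roughly 3 metres away.
print(distance(forRoundTripTime: 20e-9))  // ≈ 2.998 m
```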
In more advanced setups, radiation of different wavelengths is used, and the resulting differences in return times are combined to build digital 3-D representations of targets. This has terrestrial and airborne applications and is the technology typically used in aircraft and even satellites.
Settings and Prompting People Detection
This real-time sensing offers feedback in four ways, and users can enable them in any combination they find convenient (a rough sketch of the logic follows this list).
- Users can learn how close somebody is through an audible readout. Numbers – 1, 2, 3 and so on – are called out in feet, indicating the literal distance; in meters, the smallest increment is half a meter.
- Users can set a threshold distance – for example, 2 meters/6 feet (the default setting) – beyond which one tone plays and within which a distinctly different one plays.
- A haptic alert can be set up. It varies the intensity and speed of physical pulses with the distance between the user and the person detected: the farther away a person is, the softer and less frequent the pulses; as the person comes closer, the vibrations get stronger and more frequent. The buzz can be heard or, for users with hearing impairments, felt through touch.
- A visual readout on the screen indicates in text how far away the closest person is. For people with low vision, a dotted line also indicates position and direction.
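Purely as an illustration of how those four modes map distance to feedback – the names, thresholds, and scaling here are hypothetical, not Apple’s implementation – the logic might look like this:

```swift
import Foundation

// Hypothetical sketch of distance-to-feedback logic for the four modes above.
struct PeopleDetectionFeedback {
    /// Threshold distance in metres (People Detection defaults to about 2 m / 6 ft).
    var threshold: Double = 2.0

    enum Tone { case nearby, farAway }

    /// Mode 2: pick one of two distinct tones depending on which side of the threshold the person is.
    func tone(forDistance distance: Double) -> Tone {
        distance <= threshold ? .nearby : .farAway
    }

    /// Mode 3: haptic intensity (0...1) grows as the detected person gets closer.
    func hapticIntensity(forDistance distance: Double, maxRange: Double = 5.0) -> Double {
        let clamped = min(max(distance, 0), maxRange)
        return 1.0 - clamped / maxRange
    }

    /// Modes 1 and 4: a spoken or on-screen readout of the literal distance.
    func readout(forDistance distance: Double) -> String {
        String(format: "%.1f meters", distance)
    }
}

// Example: someone detected 1.4 metres away.
let feedback = PeopleDetectionFeedback()
print(feedback.tone(forDistance: 1.4))             // nearby
print(feedback.hapticIntensity(forDistance: 1.4))  // 0.72
print(feedback.readout(forDistance: 1.4))          // "1.4 meters"
```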
These settings live in Apple’s Magnifier app. The app can be launched through the Back Tap option mentioned earlier, by triple-clicking the side button, or by asking Siri.
The Future
Even though the technology has so far come only to the iPhone 12 Pro and 12 Pro Max, it is set to launch for iPads and Apple Watches as well. Attempts are also being made to extend People Detection to object detection.
We hope Apple continues on its journey to make technology more inclusive for people across ability levels and that others also take inspiration from them!