You can probably read this – but for me, the written word is heard, not seen
One billion people worldwide have a disability. In Australia, the latest ABS figures show that nearly one in five people (17.7%) live with some form of disability. That proportion increases significantly when you consider the physical and cognitive impacts of ageing.
Today is Global Accessibility Awareness Day, a great opportunity to promote how inclusive design provides benefits for the entire community.
Let me share some personal examples.
Today I enrolled in the Federal Election telephone voting system for blind and low-vision voters. I have used this system for recent elections and by-elections. To my surprise, the platform has now officially been expanded to Aussies living in Antarctica. Whilst I have absolutely no desire to move to a freezing cold end of the Earth, it's great to know those voters in Antarctica don't get How to Vote flyers shoved in their face on election day – something that is completely useless if you can't read it – and they can vote from the warmth and comfort of their home or office. For me, it means I can vote without assistance.
Recently I read a tech tip, probably sponsored by Netflix but still important, letting people know that TV shows and movies with audio description (an additional audio narration of the visual aspects of the program) are a great way to binge more content whilst exercising or walking, without looking at the screen. In short, a visually described audio book.
Whilst these accessibility examples might seem like small benefits to some, they are incredibly crucial to my inclusion in society as someone who, suddenly and dramatically, completely lost their sight eight years ago. I too want my political voice heard, and I desperately need to chat intelligently with my friends about Ted Lasso.
Digital accessibility is a human right, especially as so much information and access to services is now only found on the internet. According to the GAAD Foundation, "every user deserves a first-rate digital experience on the web. Someone with a disability must be able to experience web-based services, content, and other digital products with the same successful outcome as those without disabilities."
What is incredibly exciting right now is that someone who is blind or low vision can choose technology off the shelf, rather than being forced into specialised technology that gets fewer updates and ages quickly.
Just recently, KPMG approved my use of an iPad Mini to facilitate my Microsoft Teams meetings. The use case was supported by a feature Apple calls Centre Stage, which automatically keeps the presenter or attendee centred in the camera frame as they move. It was designed so people can move around naturally during a call, but it greatly assists me in a totally different way: I have no idea where the camera is, and I cannot fix on any one point. Now the camera stays centred on me, and I look and feel more engaged.
Right now, I am composing this article on my KPMG Windows laptop, which has additional software called JAWS (Job Access With Speech) installed that converts digital text into audio speech. I don't look at the screen at all; I just listen to the audio feedback that JAWS provides. I navigate around the laptop using a combination of Windows keyboard shortcuts and JAWS keyboard shortcuts. Yes, there are many more than just Ctrl+C and Ctrl+V – try Windows Key + 1 to open the first program pinned to your Taskbar, which for me is Outlook. These keyboard shortcuts are much quicker than finding an icon and clicking with a mouse. Rather than reading, I get to listen to the content. For those of you who have tried Immersive Reader in Microsoft Edge, well, this is what I do all day long. But please, describe your images using Alt Text, as AI really can't distinguish a cat from a dog!