The impact of AI
When I lost the rest of my sight in 2018, I had to scramble to adapt. Mine was no slow decline of vision, giving me ample time to react. Rather, it was a fast-moving roller coaster that started me off with low vision in March and threw me, with steep ups and downs, into the world of full blindness in three months. I was basically fine at the beginning of March, with the vision I'd had for years - bad but manageable - and by July, I couldn't tell whether the sun was up when I woke in the morning to go to work. I had to adapt in many ways, though fortunately I already had a lot of blindness skills. Today, I want to focus on one skill I didn't have and had to learn in a hurry.
When you lose your vision, two very important peripherals attached to your computer become entirely useless: the mouse and the screen.
Let's start with the screen. It tells you what's possible in any given app or website. You look at it for guidance; you glance up and down it to find what you need. You get an overview of all the things you can do. It shows you pretty graphics. It is, for better or worse, fun and useful to interact with.
The mouse is your gateway to that screen. It allows you to select things on that screen and interact with them in all sorts of ways: dragging and dropping, clicking, scrolling.
So, if those input and output methods are gone, what do you do? Do you throw your computer in the trash? No. As it turns out, you can use something called a screen reader. It uses other input and output devices to give you the information you need to operate your computer. Can't see the screen? The screen reader will talk to you and tell you what's on it. Can't use the mouse? You'll use the keyboard.
This, of course, comes with side effects. A screen is two-dimensional, but a stream of words is not. Imagine shrinking your screen down to the size of a teleprompter that can only display words and very rudimentary colors. Now try to imagine using a keyboard to operate your computer through that tiny screen. Sounds slow and cumbersome? Without significant changes to your workflow, it will be.
A screen reader - together with your OS's built-in keyboard support - lets you get to parts of the screen very quickly and take actions that would normally require a mouse. If you've ever pressed Control + B in Microsoft Word to make text bold, you've used Word's keyboard support to get something done quickly without the mouse.
Now imagine having to learn hundreds of these keystrokes. Imagine that, without knowing them, you're unable to use your computer with even a modicum of efficiency. Without the exact shortcut, you'd have to navigate to a button with the keyboard - which takes ages - and press Enter once you've finally reached it, rather than quickly moving the mouse there and clicking. An action that would normally take you a second might take 30. It's not enough to just master keyboard navigation. It's not enough to be able to sit at your computer and press Tab 17 times to get to where you need to go. If that's the only way you can use a computer, you'll never keep up with your sighted peers.
There are two other groups - that I know of - that use the keyboard extensively: gamers and software engineers. Both groups realize that if you have to take your hands off the keyboard all the time, you lose efficiency and won't be able to keep up. And so they teach themselves. They make a concerted effort to learn keyboard shortcuts to get things done fast.
As luck would have it, I was a software engineer and so was used to learning a whole slew of keyboard shortcuts. I was working at a web agency at the time, and both they and our client allowed me to take two full days a week, for as long as I needed, to adapt to my new situation. And so I set to it. Learning keystrokes came naturally to me, but even knowing them didn't always yield the results I wanted. No matter: I was a programmer, so I could script my screen reader to do exactly what I wanted when I took a certain action. I could basically define my own keystrokes.
It took me about two months, and I was back at my job full-time. In some cases, I was even more efficient and faster than when I was sighted, just five months before. I got home without tired eyes, because I couldn't - and therefore didn't - use them anymore. I wound up thinking that going blind wasn't so bad, at least not for my career.
Then, last summer, I went to a couple of blindness conventions in the US for my current employer. There are some bubbles and echo chambers that, when you're in them, you don't realize you're in. There are other, transparent ones as well: you know there is an outside, you can perceive it, but you don't experience it, and it's far enough away that it's easy to ignore. You try not to ignore it, of course, but you don't realize how much you were failing at that until the bubble pops. You realize that it's only the processes in place - rigorous testing, constant discussions with other blind people on your team, interactions with users - that kept you from releasing sub-par, unusable software.
A while back, I wrote about showing Picture Smart AI to people at the NFB and ACB conventions. I wrote about how excited everyone was to see the tech. I showed them all the ways they could access it: how you can press Insert + Space, followed by P, followed by F to describe a file. How you can press Insert + Space, followed by P, followed by C to describe a control. How you can press Insert + Space, followed by P, followed by Shift + C to describe a control with extra information. How you can press Insert + Space, followed by P, followed by S to take a screenshot of the entire screen and have it described. How you can add Alt to your keystrokes to ask a preliminary question before sending the image. And, as you're reading along: how would you ask a question before having a control described? You probably don't remember, because I threw all sorts of keystrokes at you. For the record, it's Insert + Space, followed by P, followed by Alt + C.
This is just one feature in one product, and I haven't even mentioned all the available keystrokes. It gets worse when you consider different apps, where the same action has different keystrokes. To mute in Zoom, you press Alt + A. In Teams, you press Control + Shift + M. To mute Teams while not in the Teams window, you press Alt + Windows + K. In Slack, you press Control + Shift + Space. Good luck remembering them all.
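To make the point concrete: those per-app shortcuts are really just a lookup table, and no human wants to be that table. Here's a toy sketch in Python, using only the shortcuts mentioned above (the function name and the fallback message are illustrative, not any real product's API):

```python
# Toy lookup table of mute shortcuts per app, built from the examples above.
# A person has to memorize this; a computer (or an assistant) just looks it up.
MUTE_SHORTCUTS = {
    "Zoom": "Alt+A",
    "Teams": "Ctrl+Shift+M",
    "Teams (global)": "Alt+Win+K",
    "Slack": "Ctrl+Shift+Space",
}

def mute_keystroke(app: str) -> str:
    """Return the mute shortcut for an app, or a fallback message."""
    return MUTE_SHORTCUTS.get(app, f"No known mute shortcut for {app}")

print(mute_keystroke("Zoom"))   # Alt+A
print(mute_keystroke("Slack"))  # Ctrl+Shift+Space
```

Four apps, four dictionary entries - now multiply that by every action in every app you use, and the memorization problem becomes obvious.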
The point of this story is that most people cannot remember all the keystrokes required to get something done efficiently. Whether you remember 15, 20, 30, or 150 keystrokes, you'll always run into a situation where you don't know the keystroke you need. That's why I'm particularly proud of the work we're doing right now with FSCompanion. FSCompanion gives you keyboard-centric instructions on how to get things done with JAWS in Microsoft Office, in Windows, and on the web. Ask a question, and it will answer with the keystrokes needed to get done what you need to get done. For now, the only way to match the speed of the mouse is with keyboard shortcuts; for better or worse, we'll need them. Now you have a 24/7 assistant, ready and willing to answer all the questions you have. This feature is in beta; please try it out at https://fscompanion.ai and give us feedback. We'll be working tirelessly to improve the assistant and add more knowledge domains.
I'm grateful for the time I got to adapt when I went blind. I'd like to thank INFO - Digital Decoded, my employer at the time, as well as Remmert Stipdonk, who was director at INFO, Anja Molenaar from HR, Alexander Pluim, who was our client at the time, Paul Sanders, our scrum master, and the entire team.
I think FSCompanion will make the journey I had to undertake a little easier for the people who come after me. I think FSCompanion will ease the burden of remembering so many keystrokes, giving people space to focus on other, more important things. I think FSCompanion will help our productivity as blind people, and that's a good thing. I'm very excited to see what the future holds.
#FSCompanion #Accessibility #JAWS #ZoomText #Fusion #UX #AI