React Native is a framework used to create native iOS and Android apps in a way web developers may already be familiar with. But how do you ensure your React Native apps are inclusive and usable by everyone? Scott will share tips on how to test and build React Native apps with accessibility baked in!
Creating Accessible React Native Apps
AI Generated Video Summary
This Workshop on creating accessible React Native apps covers topics such as digital accessibility, assistive technologies, testing methods for iOS and Android, the React Native Accessibility API, and issues with the API. It emphasizes the importance of understanding WCAG guidelines and meeting user expectations. The Workshop also includes a demo of the COVID Alert app, highlighting the improvements made for accessibility. Continuous education, involving people with disabilities in testing, and using tools for accessibility testing are also discussed.
1. Introduction to Workshop
Welcome to the React Summit workshop Creating Accessible React Native Apps, hosted by Scott Vinkle, a platform accessibility specialist at Shopify. Over the next 90 minutes, Scott will be drawing on his experiences working on Shopify's COVID Shield app and looking at techniques for testing and implementing accessibility best practices for React Native apps. COVID Shield was eventually adopted by the Canadian federal government to provide Canadian citizens a notification method for potential exposure to COVID-19. It was rebranded as COVID Alert and the development was taken over by the Canadian Digital Service team. What we'll be covering today includes details on how Scott tested the COVID Shield app for accessibility, some issues he found and the related solutions. Some of the topics we'll discuss include: what is digital accessibility? Who are we including when we talk about accessibility? What assistive technologies are available? How to test using mobile devices and simulators? And finally, a review of some React Native specifics, that is, issues found when testing COVID Shield, with some time for questions at the end. Digital accessibility means building digital content and applications that can be used by people with disabilities.
So let's get started. Thank you for joining us, everyone. Cool. So I'm just going to jump into our meeting. And welcome, everyone. Thanks for joining us today. Welcome to the React Summit workshop Creating Accessible React Native Apps, hosted by Scott Vinkle, who is a platform accessibility specialist here at Shopify. Scott has years of experience working on accessibility both within Shopify and on independent accessibility projects. He's very active in the web accessibility community and will highlight a few of the insights he has to share with the developer community. So over the next 90 minutes, Scott's going to be drawing on all his experiences working on Shopify's COVID Shield app and looking at techniques for testing and implementing accessibility best practices for React Native apps.
Just a few housekeeping notes before we get started, if you have questions during the workshop, please use the Q&A function within Zoom, and we'll be able to try and answer your questions. We will have some time towards the end for a Q&A section. And also I'll be monitoring these questions or anything that you have to say in the chat. I will also be sharing some links in the chat too. And I'll also be looking at the discord channel. And as I mentioned, we'll have some time at the end for a Q&A session. Also this workshop will be recorded and the recording is going to be sent out to attendees or to registrants later today, I believe. So yeah, that's it for me. I'm gonna pass you over to Scott and he's gonna show us how to create accessible React Native apps.
Awesome, thanks Liam. Yeah, so welcome everyone to Creating Accessible React Native Apps here at React Summit. We'll be covering quite a few topics today, so I'll try to move quickly in order to make room for some of the questions at the end. So as Liam said, my name is Scott Vinkle. I am one of two accessibility specialists at Shopify. This means I work with a lot of teams, I do a lot of testing with assistive technology, and I write a lot of accessibility issue tickets on GitHub. You can find me at svinkle on Twitter, and my blog and online store are at scottvinkle.me. In my spare time, I like to ride my bike and drive my car when the weather's nice, read books, watch movies and spend time with my family. And my two kids definitely keep me busy, especially during these COVID days.
So back in January of 2020, seems like forever ago, Shopify publicly announced our adoption of React Native. This meant that from this point forward, all of our new mobile apps would be built with React Native. By extension, this also meant a new learning opportunity for our accessibility team by way of testing apps and guiding design and development teams to create accessible user experiences.

Fast forward a few months, and a few volunteers from Shopify started working on a new React Native app called COVID Shield. COVID Shield is a COVID-19 exposure notification solution which uses the exposure technologies provided by Apple and Google. While the team at Shopify was working on the app, I was contacted by one of the leads and brought onto the team to conduct a bunch of accessibility testing as development was taking place. It's worth noting that COVID Shield was eventually adopted by the Canadian federal government to provide Canadian citizens a notification method for potential exposure to COVID-19. It was rebranded as COVID Alert and the development was taken over by the Canadian Digital Service team. So I definitely recommend installing this app if you are a Canadian citizen.

What we'll be covering today includes details on how I tested the COVID Shield app for accessibility, some of the more interesting issues that I found and the related solutions to those issues. Before we get into those details, we'll cover some other important topics around accessibility at a general level. Some of the topics we'll discuss include: what is digital accessibility? Who are we including when we talk about accessibility? What assistive technologies are available? How to test using mobile devices and simulators? And then finally, I'll review some React Native specifics, that is, issues I found when testing COVID Shield. And at the end we'll have some time for questions.

So to start us off, what is digital accessibility? It's a good question. I'm glad you asked. Simply put, digital accessibility means building digital content and applications that can be used by people with disabilities. These techniques can and should be applied to websites, mobile apps, desktop apps, operating systems, video games, electronic documents like PDFs, kiosks, and a whole lot more.
2. Understanding the Importance of Accessibility
According to the 2020 Global Economics of Disability annual report, there's an estimated 1.85 billion people who identify as having a disability. This actually makes up an emerging market larger than China. So who exactly are we talking about when we say we need to make our digital products accessible? People with disabilities. There are many types of disabilities which need to be taken into consideration when creating digital content. So how do people with disabilities use technology to consume our digital content? Let's explore. Disabled people consume digital content by way of assistive technology, and there are a number of assistive technologies and devices available to people with disabilities. People with disabilities have the right to access goods and services both in the physical and digital world. They want to feel included, connected, and to find their place in the general community. So we know people with disabilities are able to use technology to browse the web on desktop and mobile devices, etc. The question now is how do we, as designers and developers of the web and native mobile apps, make sure our products are usable by everyone? In other words, how do we make things accessible? By testing, of course. Testing all the things and knowing what to watch for, right? That's the key. Understanding how to test with assistive technology, and the expectations people with disabilities have when interacting with content, is a big part of creating usable and accessible experiences.
According to the 2020 Global Economics of Disability annual report, there's an estimated 1.85 billion people who identify as having a disability. This number does not include those who do not identify with having a disability, so the total could actually be higher. This actually makes up an emerging market larger than China. The disability market influences over 13% of the global population and controls over $13 trillion in annual disposable income. So there's some serious spending power there which should not be ignored. And of course, as the world population grows, so does the number of people with disabilities, as well as the total amount of disposable income. So keep those numbers in mind.
So who exactly are we talking about when we say we need to make our digital products accessible? Well, people with disabilities. There are many types of disabilities which need to be taken into consideration when creating digital content. At a high level, some disabilities include: visual impairments, people with low vision, color blindness, or who are blind. Hearing impairments: people who are deaf or hard of hearing. Cognitive impairments: people who may have difficulties processing or understanding information. And motor impairments: people with a physical disability. These represent only a few disabilities at a very generalized level. There are many more than this, including other permanent disabilities, but we shouldn't forget about temporary or situational disabilities, such as someone with a broken arm. It's also important to note that someone could have one or a combination of disabilities. For example, someone with a motor impairment who uses a wheelchair could also have a cognitive disability. And also note that disability is not binary, meaning it isn't always all or nothing. Someone who is blind may still have some vision; they could be able to see just enough to play a video game with accessibility options enabled, for example. Or a wheelchair user could still be able to walk for very short distances, but it could be very difficult or even painful, thus they rely on their wheelchair to live more independently.
So how do people with disabilities use technology to consume our digital content? Let's explore. Disabled people consume digital content by way of assistive technology. There are a number of assistive technologies and devices available to people with disabilities, some of which include but are not limited to: screen readers, which audibly read out content to the user and allow the user to navigate via keyboard and keyboard shortcuts. When the user is navigating and consuming content via screen reader, this is what's referred to as the aural or audible user experience, as opposed to the visual user experience. Keyboard-only users are people who can't use a mouse due to visual or motor impairments. As a result, they need to be able to navigate entirely by keyboard. Likewise, alternative input devices such as sip-and-puff or single-switch buttons emulate keyboard-style input. Screen magnification software or hardware tools help by magnifying the screen to extreme levels for people with low vision. Another example here: voice dictation software allows people who are unable to use a mouse and keyboard to navigate by voice command. There are many more forms of assistive technology input methods; these are just a few examples.

So far we've covered that people with disabilities are able to use modern technology via assistive technology, and the fact that there is a large market share available by creating more inclusive experiences. What this comes down to is people with disabilities have the right to access goods and services both in the physical and digital world. They want to feel included, connected, and to find their place in the general community. I really like this quote from Annie Jean-Baptiste, Head of Product Inclusion at Google and the author of the book Building For Everyone: "We need to build for everyone, with everyone. Not only because it is the right thing to do, but also because it drives innovation and growth while making the world a better, richer place." I'm still reading this book, but it's a really great book. I highly recommend it.

So we know people with disabilities are able to use technology to browse the web on desktop and mobile devices, et cetera. The question now is how do we, as designers and developers of the web and native mobile apps, make sure our products are usable by everyone? In other words, how do we make things accessible? By testing, of course. Testing all the things and knowing what to watch for, right? That's the key. Understanding how to test with assistive technology and the expectations of people with disabilities when interacting with content is a big part of creating usable and accessible experiences.
3. Understanding Accessibility and WCAG
Knowing if something is accessible can generally be summarized in these three questions. First, what is this thing? Second, what happens when I click this thing? And finally, did clicking the thing meet my expectations? If all of these conditions are met, it's a good indicator that the thing is accessible. These expectations are derived from the Web Content Accessibility Guidelines (WCAG), a worldwide recognized standard for testing and implementing accessible user experiences. To test your site or app, you can use the checklist from the A11y project at a11yproject.com/checklist. Now, let's move on to enabling the Screen Reader on your mobile platform.
Knowing if something is accessible can generally be summarized in these three questions. And shout out to Derek Featherstone, Feather on Twitter, for sharing these questions on a Twitter thread a little while back.
So, first, when someone is using your app, they might ask the question: what is this thing? When someone encounters something on screen, it could be a form field, a link, a button, an image, there needs to be information included to share context and understanding of what the thing is they're currently interacting with. This information comes in the form of the element's semantic meaning and programmatic details added by the developer. Semantics are made up of the element's role, name and state.
4. Announcing State and Meeting User Expectations
Announcing the state helps convey the current condition of the element. Clicking a control with a certain role and name provides hints to the user about what might happen. Meeting user expectations is crucial for a successful user experience.
There could also be a state announcement. Announcing the state helps to convey the current condition of the element, whether interaction is needed, and what might happen when interaction takes place. For example, checkboxes and radio buttons have a checked or unchecked, and selected or unselected, state respectively.

So the second question: what happens when I click this thing? For the aural user experience, this is typically discerned from the previous question, the role, name and state. Affordances, or hints as to what might happen on click, are based on the visual and aural user experiences. For example, clicking an interactive control with the link role and the name "read more about knitting" will likely open a new page of content about knitting, right? Clicking an interactive control with a checkbox role and the name "subscribe to our newsletter" will probably check the checkbox and enable this option when submitting a form. These details, the semantics of the control in question, which supply the aural experience, as well as the visual affordances, provide hints to the user as to what may happen when they click the control. So it's not only important to design the elements on the screen to look like the thing they are, but also to provide programmatic context via semantic meaning.

This leads to the final question: did clicking the thing meet my expectations? This is a big one. In other words, did activating the control result in what the user had in mind when they clicked? Was the user successful or not? For example, did clicking the search button result in a search results page being loaded, indicating how many items were returned? Did clicking play on a video result in playback of the video? Being able to meet the user's expectations is very important in helping the user feel successful. Without this feeling of being successful, self-doubt inevitably sinks in. The user may start to question the quality of the app they're using or blame themselves for not understanding how to use the app. Neither case is desirable. What we want to do is help guide our users to success and feel comfortable and confident when using our apps.

So the point is, if all of these three conditions are met, one, understanding what the thing is, two, knowing what's expected when the thing is clicked, and three, having user expectations met as a result, if all of these were answered in the affirmative, it's a pretty good indicator that the thing is accessible. At a high level, there's a big list of other things and expected results here, and I think due to time, I'm going to skip this one, but you have the notes there, and we can come back to that.
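As a rough illustration of how a checked or unchecked state might be expressed in React Native, here's a minimal sketch of my own (not from the workshop; the component and label names are hypothetical):

```tsx
// Hedged sketch: a checkbox-style control that announces its role,
// name, and checked state to screen readers.
import React, { useState } from 'react';
import { Pressable, Text } from 'react-native';

function NewsletterCheckbox() {
  const [checked, setChecked] = useState(false);

  return (
    <Pressable
      accessibilityRole="checkbox"                      // what is this thing? A checkbox
      accessibilityLabel="Subscribe to our newsletter"  // its accessible name
      accessibilityState={{ checked }}                  // announced as checked/unchecked
      onPress={() => setChecked(!checked)}
    >
      <Text>{checked ? '[x]' : '[ ]'} Subscribe to our newsletter</Text>
    </Pressable>
  );
}

export default NewsletterCheckbox;
```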
5. Understanding Accessibility and Screen Readers
WCAG is a worldwide recognized standard for testing and implementing accessible and inclusive user experiences. It's primarily used for desktop and mobile web, but it can also be used for native apps and pretty much any other type of digital content. The principles of accessibility include perceivable, operable, understandable, and robust. The checklist from the A11y project is a recommended resource for testing your site or app. In this part of the workshop, we will guide you through enabling the Screen Reader on your mobile platform and explain the basics of using it. Both iOS and Android have similar navigation methods, such as explore and swipe. VoiceOver is the Screen Reader available on iOS devices, and you can enable it in the accessibility settings. To save time during testing, you can set up a shortcut for turning VoiceOver on and off.
So here's the thing, a lot of these expectations are derived from the Web Content Accessibility Guidelines or WCAG for short. WCAG is a worldwide recognized standard for testing and implementing accessible and inclusive user experiences. It's primarily used for desktop and mobile web, but it can also be used for native apps and pretty much any other type of digital content.
So the basis of technical accessibility recommendations is derived from the four principles of accessibility. Each principle represents a set of guidelines which are to be followed in order to create a usable, accessible experience. At a high level, the principles include: perceivable, can the user identify information and relationships of content? Operable, can the user interact with content? Understandable, can the user understand the content available to them? And robust, does the site or app work consistently with the user's technologies? Is it built in a way that will continue to work with browsers and assistive technologies moving forward?
And if you're looking for a checklist resource to work through as you test your site or app, I'd recommend the checklist from the A11y Project. I contributed a lot of the content to this list and it's basically what I use internally at Shopify. The URL for this is a11yproject.com/checklist.
Okay, that's a lot of information. So let's get to it. This is the part of the workshop where you get to do something. I want to take everyone through enabling the Screen Reader available to them on their mobile platform. I'm making the assumption here that you are either on iOS or Android, but I feel like that's a pretty safe assumption these days. In terms of React Native, these few tips will be useful for when you've got your app up and running on your real device, but we'll cover how to test in a simulator shortly. We'll start small by going through the steps of turning the Screen Reader on and then off. You'll see in a second, once the Screen Reader is enabled, how different using your device can be. It's a very different experience than what you might be used to.
So here's the thing. Before we start, it's important to understand the basics of using a mobile Screen Reader. Otherwise, you may get stuck and not know how to return to the previous state. Both iOS and Android feature a similar base set of gestures when it comes to navigation, that is, finding and activating a control on the screen. There are two basic methods here. One, explore: place a single finger anywhere on the screen and drag it around in order to discover content. This will cause the Screen Reader to announce the item which is currently under your finger, and you are free to explore in any direction using this method. The other method, swipe: use a single finger and swipe right anywhere on the screen. This allows the Screen Reader to locate and announce content items on the screen in a top to bottom, left to right sequence. Swiping left will find items in the reverse order. Once a piece of content is in Screen Reader focus, you can double tap anywhere on the screen to activate it. This is important to remember, especially when attempting to disable the Screen Reader afterwards.
So if folks here are on iOS, as well as iPadOS, these devices come with a Screen Reader called VoiceOver. Side tip: if you're testing a site in a mobile browser, the typical pairing would be with Safari. To enable VoiceOver on your iOS device, you go to Settings, Accessibility, and then click the VoiceOver switch control. Now, this is awkward because normally I would be able to see folks who are attending my workshop, but since I can't, I don't know what is happening. I'll give you a few minutes to try that out. Also, if you have the slides here, you can flip back between the slides. If you're successful in enabling VoiceOver and you'd like to turn it off, remember to use a single finger and drag to the VoiceOver switch, focus on this control and then double tap. In order to save time while you're testing, you can set up your device with a shortcut for turning VoiceOver on and off. To activate this feature, you can go to Settings, General, Accessibility, and Accessibility Shortcut. With this enabled, you can now turn VoiceOver on and off by triple-pressing the iPhone or iPad home button. Scott? Yeah. Can you just quickly go through the previous slide with the navigation methods? I think maybe one or two people might not have caught those two navigation methods. Yeah, so I can actually demonstrate the basic methods here. You can watch my phone. TalkBack on.
6. Using TalkBack and VoiceOver Gestures
For Android, most Android devices come with a screen reader called TalkBack. If your device doesn't have it, you can install it from the Android accessibility suite. To activate TalkBack, go to settings, accessibility, and enable TalkBack switch control. Set up a shortcut for turning TalkBack on and off by pressing and holding both volume keys. For iOS users, common VoiceOver gestures include touch or single tap to select and read, double tap to activate, swipe right to move to the next item, swipe left to move to the previous item, two finger tap to pause or resume reading, three finger swipe up to scroll up, and three finger swipe down to scroll down. For Android users, common gestures include touch or single tap to select or read, double tap to activate, swipe right to move to the next item, swipe left to move to the previous item, and a two finger gesture to scroll up or down. Try testing these gestures on your own app or website to discover any accessibility issues.
So I'm using Android. You probably can't see my phone very well because the slides are up anyway. I can just slide my finger around and it announces what it finds, things like "Connection is secure" and "eight open tabs", or I can click and drag my finger. This is the React Summit site. I don't know how well it works. Okay, maybe there's problems. Okay. Click. That's the general idea.

So I'm going to skip ahead for folks who are on Android and go through the steps for Android. Most Android devices, not necessarily all, but most of them, come with a screen reader called TalkBack. If your device does not have this available, you can install it from the Android Accessibility Suite, which is available on the Google Play Store. Starting TalkBack may differ slightly depending on the Android phone manufacturer. For Google Pixel devices, you can go to Settings, Accessibility, and then click the Use TalkBack switch control. And in order to save time while testing, you can set up your device with a shortcut for turning TalkBack on and off. To activate this feature, you can go to Settings, Accessibility, TalkBack, and then TalkBack shortcut. With this enabled, you can now turn TalkBack on and off by pressing and holding both volume keys at the same time.

I'll go through some more common gestures between the two environments. And again, you can flip back and forth between the slides if you have them available to review. For iOS users, while you're testing with VoiceOver, here are some common gestures. Select and read the current item: touch or single tap. Activate the currently selected item: double tap. Move to the next item: swipe right. Move to the previous item: swipe left. Pause or resume reading: two finger tap. Scrolling up: three finger swipe up. Scrolling down: three finger swipe down. Again, there are a lot more gestures available than these, but these would be the common ones for conducting simple testing.

And then for Android users, here are some common gestures. Selecting or reading the item: touch or single tap. Activating the currently selected item: double tap. Moving to the next item: swipe right. Moving to the previous item: swipe left. Scrolling up in this case is a two finger gesture sliding up, and scrolling down is a two finger gesture sliding down.

So while you're giving that a try, I really recommend trying it on something that you're working on. Maybe you're at a company and you're creating an app; load up the app on your device if you can and give it a try. It's really interesting what you might find. This is actually basically how I got into accessibility. That was actually 10 years ago, I realized. I was working on a site at an agency one time and I'd always heard of a screen reader.
7. Discovering the Power of Screen Readers
I never knew really what it was, but for some reason that day, I decided to do a Google search and figure it out. Of course, one of the first screen readers that people often find is ChromeVox, which is an extension for Chrome. So I fired it up on one of the sites that I was working on, and it just started reading back to me the site that I was actively developing, and it blew my mind. This realization that, one, this technology existed and, two, people rely on this technology every day to help them read content on the web. It was a really big eye-opener. Accessibility in development and design has been my main focus for the last 10 years.
8. Testing Methods for iOS and Android
Testing your app on real-world devices is ideal, but it's also important to test as you develop. The macOS Accessibility Inspector and macOS VoiceOver can be used to test your app in the simulator. The Accessibility Inspector allows you to inspect specific UI components and test for labels, roles, and states. VoiceOver can be run from macOS to test your app, using the virtual cursor to navigate and interact with items. In the Android emulator, TalkBack can be installed and used for testing. Pinching, zooming, swiping, and content discovery can be done using the mouse. It's recommended to test on a real-world device if possible. Remember the three questions: What is this thing? What happens when I click? And were my expectations met? We've covered testing methods for iOS and Android, and now let's see how this applies to React Native. Facebook has developed a React Native Accessibility API to help remedy any accessibility issues you may find.
Okay, so let's continue. I actually have a lot of content here, so let's keep going. Testing on real-world devices is great when your app is in a good state to do so, but ideally, you'd also be testing as you're developing the app. So let's go over what testing methods are available when you're working on your app in the iOS Simulator or an Android emulator on the desktop.
So one method of testing your app in the simulator is the macOS Accessibility Inspector. The Accessibility Inspector is a tool much like a web inspector found in any modern browser. You can use it to inspect specific pieces of the UI to test for things like a component's label, role, or state. It basically shows you how a particular part of your app may sound when a screen reader is in use by outputting this data to the window. There are two methods to open the Accessibility Inspector. One is going through Xcode, then Open Developer Tool, then Accessibility Inspector. The other method is to simply use Spotlight Search on the macOS desktop and type "Accessibility Inspector". To use the inspector, you'd click the crosshair icon and then use the mouse to hover over the UI to be tested. There's a lot of useful information in the basic section of the Accessibility Inspector window, such as the component role and name. For more advanced information, like the component states, you'll need to open up the Advanced section of the window. For the example here in the screenshot, I was inspecting a button control on the COVID Shield app, which would show and hide the main navigation menu. The label in the Accessibility Inspector revealed that the button's accessible name included everything in the navigation window. So that was an issue to be addressed.
And the other way to test your app while in active development is by using macOS VoiceOver on your desktop. While VoiceOver is not available directly in the Xcode Simulator, it is actually possible to run VoiceOver from macOS to test your app. To do this, you set your keyboard focus onto the simulator window and then enable and disable VoiceOver with Command-F5. From here you'll be able to use the virtual cursor to move between items on the screen. To move the virtual cursor, you press Control-Option and then the left or right arrow keys to move between content items. In order to interact with clickable items in the app, you press Control-Option and then the Space key. You can also test using gestures in the simulator, which is pretty neat: you can pinch, zoom, and swipe in the iOS simulator by holding the Option key and then clicking and dragging your mouse cursor. Testing with VoiceOver on the desktop, as well as using the Accessibility Inspector, allows you to hear what the UI sounds like, that is, the aural experience. While you're testing and going through various aspects of your UI, remember the three questions: What is this thing? What happens when I click? And were my expectations met?

So let's switch back over to Android and testing in the Android emulator. If your native app is running in the Android emulator, you can run the TalkBack screen reader directly in the emulator for testing. However, TalkBack is not installed by default. You can install TalkBack by logging into your Google account in the emulator and installing the Android Accessibility Suite from Google Play. Refer back a few slides for more details, but as a reminder, starting TalkBack may differ slightly depending on the Android simulator in use. For Google Pixel-based simulators, you go to Settings, Accessibility, and then click the Use TalkBack switch control. There are a few ways you can use TalkBack in the emulator. One is you can pinch, zoom, and swipe in the Android emulator by clicking the Command key and dragging your mouse cursor. Content discovery can be done in one of two ways. One is to use the mouse pointer to click, hold, and drag around on the screen; this is the same as using the explore method on an actual device, as we've already discussed. The other way is to use the mouse to swipe through the content. I found this only works some of the time; it's more cumbersome than on an actual device and may lead to unexpected results. So if you can, it's actually better to test on a real-world device. Oh, and this is something I found out recently: in the AVD, make sure to test with a device image that comes with the Play Store app installed, because not all of them do.

Okay, so how are we doing on time? We're doing okay. So far we've learned what to watch for in terms of identifying accessibility issues, that is, the three questions: what is this thing? What happens when I click? And were my expectations met? We also learned how to test with mobile screen readers on iOS and Android devices, as well as discussed basic testing within the iOS simulator and the Android emulator while working with our apps. So how does all of this come into play for React Native? If I've identified an accessibility issue, how do I go about fixing it? Well, to remedy almost any accessibility issue you may find, Facebook has actually put together this really great React Native Accessibility API.
9. Understanding the React Native Accessibility API
The React Native accessibility API includes methods and props to provide roles, state, and name to interactive elements. It also offers options to enhance general usability for assistive technology users. If you're familiar with HTML and the DOM, you'll find the API easy to use. You can add roles, labels, and control attributes to elements to provide semantic meaning and improve accessibility. The API documentation is available at react-native.dev/docs/accessibility.
So the API includes a series of React methods and props to provide things like role, state, and name to interactive elements. It also includes other items to increase the general usability of an app while using assistive technology. We'll look at some of these shortly, but I want to point out that if you're at all familiar with things like HTML, the DOM, and the ARIA spec, then you've got a good head start on using the Accessibility API. Concepts such as adding a role to an element to provide semantic meaning, setting a label on a control via aria-label, or hiding something completely with aria-hidden, these are all possible with this API, it's just slightly different. The API documentation is available at react-native.dev/docs/accessibility.
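To make those HTML and ARIA parallels concrete, here's a minimal sketch of my own (not from the workshop; the component name and strings are hypothetical) showing the core props on a single control:

```tsx
// Hedged sketch: the core accessibility props, with their rough
// HTML/ARIA counterparts noted in comments.
import React from 'react';
import { Text, TouchableOpacity } from 'react-native';

function SubscribeButton({ onPress }: { onPress: () => void }) {
  return (
    <TouchableOpacity
      accessible={true}                          // group as one focusable element
      accessibilityRole="button"                 // like role="button"
      accessibilityLabel="Subscribe"             // like aria-label
      accessibilityState={{ disabled: false }}   // like aria-disabled
      accessibilityHint="Opens the signup form"  // extra, visually hidden context
      onPress={onPress}
    >
      <Text>Subscribe</Text>
    </TouchableOpacity>
  );
}

export default SubscribeButton;
```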
10. Issues with React Native API
During my time working on the COVID Shield, I discovered issues with the React Native API. The generic button component lacks a link concept, creating an accessibility barrier. The custom button component used in COVID Shield also lacks a role, causing confusion for screen reader users. Adding a role and name to interactive controls in React Native is achieved through the accessibility role and accessibility label props. These props provide context and purpose for screen reader users. Additionally, controls that only use icons should include an accessible name to indicate their purpose. Testing in the COVID Shield app revealed instances where controls were missing specific states, such as checked or unchecked items and disabled form controls.
So during my time working on COVID Shield, I didn't have a chance to explore all of the API, but I'll go through what I did discover and share the issues with you. Let's start with a theme that we've discussed a few times already: adding a role, name, and state to clickable elements, such as a button. In developing for the web, authors have a few native clickable elements to work with to request data or to show a new view: links and buttons. These elements come with their respective semantics shared via their role, name, and state, if applicable.
So I noticed React Native provides a generic Button component. It outputs markup which is similar to the HTML button element. This is good when loading a new view into the app or for other interactions, such as submitting a form. However, there doesn't seem to be, at least I couldn't find, anything for a concept to convey a link. In the case of a native app, a link might exit the app and load the mobile browser with a new resource. Not having a link type is unfortunate, as folks may end up adding a custom link style with an onPress prop to a generic Text component. The issue here is that this is basically creating the equivalent of a span with an onclick event in HTML. This creates an accessibility barrier, as there's no role applied to indicate what the thing is, and the element isn't focusable.
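One hedged way to patch that gap, sketched below with a hypothetical URL and label (this is my own example, not a pattern from the COVID Shield codebase), is to give the pressable Text the link role so it's announced as a link rather than plain text:

```tsx
// Hedged sketch: a pressable Text with link semantics instead of a
// bare onPress "span".
import React from 'react';
import { Linking, Text } from 'react-native';

function KnittingLink() {
  return (
    <Text
      accessible={true}          // make it focusable by screen readers
      accessibilityRole="link"   // announced as "link", not plain text
      onPress={() => Linking.openURL('https://example.com/knitting')}
    >
      Read more about knitting
    </Text>
  );
}

export default KnittingLink;
```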
So, in my experience with COVID Shield, the team decided on a different approach. In order to make the app appear more native for each platform, there was a custom button component created for use within the app. This component included some logic to generate platform-specific touch controls. For example, iOS devices would receive the TouchableOpacity component, and Android devices would use the Ripple component, which was imported from the react-native-material-ripple package. The issue I found with these components was that they did not include a role to help convey what the thing was. What was interesting is, I noticed while testing that TalkBack on Android still announced the control as clickable, and it also said "double tap to activate". I'm assuming this could be due to the fact that there's an onPress prop added, but it still does not actually communicate what the control is via its role. So, let's review how to add a role in React Native.
So, while I was testing with a screen reader on either platform, Android or iOS, I noticed throughout the app that each clickable interactive control was missing its role description. The screen reader would stop on the control and only announce its name, if one existed. As a sighted user, I have the visual affordance, that is, the design of the button, to indicate that this is a clickable control. But a screen reader user would likely not be able to tell what they're currently focused on.

In React Native, adding a role to provide context on the current thing the user is interacting with is a matter of adding the accessibilityRole prop. You'd add this to the component which receives the click event. This prop takes a string value which is defined in the API, one of which is the value "button", which denotes an action will result upon activation. In HTML, this is similar to adding the role attribute to an element and assigning it the value of button. For the example here, the screenshot shows COVID Shield with a visually styled button with the name "Enter Code" without the explicit role declaration. I believe there was only some sort of blip sound when the control came into focus by the screen reader, but nothing more to indicate what this thing actually was. After adding the accessibilityRole prop with the value of button, the control was then described as "Enter Code, button". Again, the purpose of this is to alert the user that they're currently focused on an interactive control, and the button value helps to give a clue as to what might happen upon interaction, in this case, loading a new view onto the screen.

Another issue I noticed while testing: there were a few clickable elements in the view that only used icons to provide a visual affordance. Not only were these controls missing their role, they were also missing an accessible name to provide details on what they were meant for. The screen reader would stop on the control and not announce anything. Again, as a sighted user, I had the visual affordance of the icon indicating the control's purpose. But without a role and name, a screen reader user would experience a seemingly unnecessary tab stop. Probably quite a confusing experience, to say the least.

In React Native, adding a name or label to provide a sense of purpose for the current thing the user is interacting with is a matter of including the accessibilityLabel prop. This prop takes a string value which is defined by the author, so be sure to include something that makes sense and is appropriate for the context. In HTML, this is similar to adding the aria-label attribute to an element and assigning it an accessible name for screen readers to announce. In the screenshot here, I've highlighted an icon control with a downward pointing arrow. This is meant as an indicator that this portion of the screen is collapsible. However, since there was no label or role, the screen reader would stop on the control and not provide any information. After adding the explicit name and role, the control would then be announced as "Close, button". With this, the user would have more of an understanding of what the control's purpose was and be able to act on it accordingly. A sketch of both fixes follows below.
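Here's a minimal, hedged sketch of those two fixes (my own reconstruction; component names and the icon are placeholders, not the actual COVID Shield source):

```tsx
// Hedged sketch: adding a role to a styled button, and a role plus an
// accessible name to an icon-only control.
import React from 'react';
import { Text, TouchableOpacity, View } from 'react-native';

// Announced as "Enter Code, button" instead of just "Enter Code".
function EnterCodeButton({ onPress }: { onPress: () => void }) {
  return (
    <TouchableOpacity accessibilityRole="button" onPress={onPress}>
      <Text>Enter Code</Text>
    </TouchableOpacity>
  );
}

// Icon-only control: without a label there is nothing to announce, so
// give it an explicit name. Announced as "Close, button".
function CollapseButton({ onPress }: { onPress: () => void }) {
  return (
    <TouchableOpacity
      accessibilityRole="button"
      accessibilityLabel="Close"
      onPress={onPress}
    >
      <View style={{ width: 24, height: 24 }} /> {/* imagine a down-arrow icon here */}
    </TouchableOpacity>
  );
}

export { EnterCodeButton, CollapseButton };
```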
Testing around the COVID Shield app, there were some instances where controls were in a specific state. In one view there was a list of checked or unchecked items, and in another there was a form with a submit control in a disabled state by default.
11. Adding States and Headings in React Native
In React Native, we can add a state to controls using the accessibility state prop. Headings are important for screen reader users to quickly understand the content on a page. React Native uses the accessibility role prop to indicate a heading, with the value 'header'. It's a better user experience to have headings in React Native, even without heading levels. Adding a heading improves the accessibility and user experience of the app.
I knew these things because, as a sighted user, they were communicated to me by the visual affordance of the design. A screen reader user, however, would not be able to acquire such information via the aural user experience. Nothing was added programmatically to provide information about the control's current state, so we definitely want to include this information. In React Native, we can add a state by adding the accessibilityState prop. This prop takes an object, whose definition and values are defined by the API. In HTML, this would be similar to adding one of the many ARIA state attributes, for example adding aria-disabled to convey the disabled state of a form control, or aria-selected to convey the state of a tab control. In the screenshot here, I have highlighted a form submit control with the label "Submit code". It's visually grayed out, indicating its disabled state; however, since there was no explicit state provided, the screen reader would stop on the control and announce its name only. After adding the explicit state object and role, the control would be announced as "Submit code, dimmed, button". In this particular example, the word "dimmed" is actually unique to iOS; Android describes this state as "disabled".

Okay, moving on, let's talk about headings. That is, headings referring to the typically large, bold text which denotes the title of a page or a new section of content. When testing COVID Shield, I noticed there were instances of larger bold text that would appear throughout the app, usually at the top of a new view. In essence, the design visually conveyed the presence and structure of a heading, but the aural experience did not. You may be asking why this is important. Here's why: when someone who depends on assistive technology, such as a screen reader, visits a new site, page or app that they've never been to before, they'll often first navigate by headings. Screen readers have functionality to allow the user to navigate by only a specific type of content, for example, only links, or only buttons, images, or maybe tables or lists, et cetera. People navigate by headings first in order to quickly get a sense of the content being offered on the page. It's the same idea as someone scanning through and reading the headings of a newspaper or a blog post: gather the general idea of the content available, then revisit the sections of interest.

So how do we actually add a heading in React Native? We can indicate a heading by adding the accessibilityRole prop to the component which contains the heading text. According to the React Native Accessibility API, the string value of "header", that's h-e-a-d-e-r, not "heading", should be supplied to the accessibilityRole prop. In HTML, this is similar to adding the role="heading" attribute to a text element, though really, it'd be best to use one of the native heading elements, h1 through h6. It's also interesting that I noticed React Native doesn't have the ability to assign a heading level. In HTML, we have the h1 through h6 heading elements, which indicate the heading level and logical structure of the content, but with React Native, it's strictly a heading only. Still, I'd say this is a better user experience than no heading structure at all. For the example here, the screenshot shows COVID Shield with a visually styled text heading with the content "Share your random IDs".
Without the explicit heading declaration, the screen reader would read the content as plain text. Not the worst experience in the world, but it's also not conveying the same experience a sighted user would receive, that is, the large bold typography indicating a new section of content. After adding the accessibilityRole prop with the appropriate "header" value, the text was then described as "Share your random IDs, heading". So not only is the aural user experience now describing the text as a heading denoting a new section of content, screen reader users can also navigate via headings alone in order to gain an understanding of the content on the page. This is a really great example of a quick accessibility win: low effort resulting in high impact.
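Here's a hedged sketch of both ideas together (my own reconstruction with assumed component names, not the actual app code): a disabled submit control with an explicit state, and a text heading with the header role:

```tsx
// Hedged sketch: conveying a disabled state, and marking large bold
// text as a heading.
import React from 'react';
import { Text, TouchableOpacity } from 'react-native';

// Announced as "Submit code, dimmed, button" on iOS; Android says
// "disabled" instead of "dimmed".
function SubmitCodeButton({ disabled, onPress }: { disabled: boolean; onPress: () => void }) {
  return (
    <TouchableOpacity
      accessibilityRole="button"
      accessibilityState={{ disabled }} // the programmatic version of the grayed-out style
      disabled={disabled}
      onPress={onPress}
    >
      <Text>Submit code</Text>
    </TouchableOpacity>
  );
}

// Announced as "Share your random IDs, heading". Note the value is
// "header", not "heading", and React Native has no heading levels.
function SectionHeading() {
  return (
    <Text accessibilityRole="header" style={{ fontSize: 24, fontWeight: 'bold' }}>
      Share your random IDs
    </Text>
  );
}

export { SubmitCodeButton, SectionHeading };
```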
12. Understanding Hint Text and Focus Management
Hint text provides additional information to sighted users. It is important to inform users if a link or control will open a new browser tab or window. This helps users make informed decisions and avoid extra work. By using the accessibility hint prop, we can provide context and alert the user. Including hint text enhances both the visual and oral user experience. Users can decide when to activate the link.
So let's discuss the concept of hint text. Hint text, in the context where I'll be using it, is a method to provide additional information that is visually hidden from sighted users. For example, if there was a visual indicator such as an icon which conveys meaning to sighted users, we also need to pass these details along for folks who may not be able to see the visual hint. This situation actually did come up while testing COVID Shield, when a couple of items on the main menu would open the device web browser instead of loading a new view inside the app. This scenario is actually quite common on the web. The idea is, if a link opens a new browser tab or window, or if an app control takes the user out of the current app, it's best practice to inform the user of this end result.

So why is this important? Without this context, people might believe they're following an internal site or app link, which loads in the same browser window or app. Opening a new tab, or the device browser, on behalf of the user would cause extra work for sighted keyboard-only users, screen reader users, or voice dictation users. If they're unprepared to move away from the current site or app, they'd need to put in the extra effort to switch back to the previous tab. The idea is to give power to the user, right? Inform the user of what might happen upon interaction, in order to allow a decision to be made on how and when they'd like to proceed.

We can include hint text by adding the accessibilityHint prop to the component whose activation results in the new context being opened. This prop takes a string value which is defined by the author, so be sure to include something that's appropriate for the context of the control. In this case, typically something along the lines of "opens in a new window" provides the context required to alert the user. In HTML, this is similar to adding the upcoming, but not yet available, aria-description attribute; there's a link in the notes where you can read more about that.

For this example, the screenshot shows the COVID Shield menu highlighting a clickable control with the content "Check symptoms". Beside the text is an arrow icon pointing up and to the right. The intention here is to provide a visual indication of the activation result, that is, leaving the app and entering a different context. Without the accessibilityHint prop, the control simply read "check symptoms". After adding the accessibilityHint and the accessibilityRole with the hint text value, the control was then described as "check symptoms, opens in a new window, link". So not only is the visual user experience enhanced by the icon, now the aural user experience is also enhanced by sharing the meaning behind the icon. As a result, all users will be able to make an informed decision on if and when to activate the link, either now or later when they're ready.
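A minimal sketch of that pattern (my own example; the URL and handler are hypothetical, not from the app):

```tsx
// Hedged sketch: warning the user that a control exits the app and
// opens the device browser.
import React from 'react';
import { Linking, Text, TouchableOpacity } from 'react-native';

function CheckSymptomsLink() {
  return (
    <TouchableOpacity
      accessibilityRole="link"
      accessibilityLabel="Check symptoms"
      accessibilityHint="Opens in a new window" // the aural version of the arrow icon
      onPress={() => Linking.openURL('https://example.ca/symptom-checker')}
    >
      <Text>Check symptoms ↗</Text>
    </TouchableOpacity>
  );
}

export default CheckSymptomsLink;
```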
13. Focus Management and Shifting to Headings
Focus management is critical for the success of your app in terms of accessibility. It involves shifting the keyboard focus cursor from one element to another, guiding the user with the intended flow of the app. In React Native and single page app environments, managing focus between views can be challenging. Shifting focus to a heading is a recommended solution, providing a clear indication of the new view and orienting users to a new starting point. However, implementing focus management in React Native can be complex, and there is no definitive answer. The COVID Alert app uses an accessibility autofocus prop on heading text components to shift focus. Other solutions may exist.
So next, let's talk about a topic that's actually quite critical to the success of your app in terms of accessibility: focus management. Focus management is a method to willfully and purposefully shift the keyboard focus cursor from one element to another on behalf of the user. This technique is sometimes required in order to guide the user through the intended flow of the app. For example, when opening a modal window, focus must be placed on or inside of the modal window in order to bring this new context to the user's awareness. Otherwise, focus remains on the activator control and the user may not be aware of, or be able to reach, the modal content. Focus management should only be used when absolutely necessary, so as to not create more work for the user when an unexpected shift in focus occurs.
So in terms of COVID Shield, React Native and basically any single page app, a big accessibility issue lies in managing focus between views. So in traditional browser environments, the user could click a link or a button to submit a form and a full page refresh would occur. So at that point, the user's focus would be placed at the top of the document or view and they'd navigate forward. So with React Native and other single page app style environments, this is not the case. So what typically happens when a new view is loaded onto the screen is the focus remains on the previous activated control. So now the problem here is one, there's no notification provided to screen readers. The user will click a control to move elsewhere on the app resulting in no notification that anything has taken place. Focus would remain on the control they previously clicked. So this would be a very confusing user experience where the user may question the quality of the app or question themselves if they've done something wrong. And two, when the user eventually tries to move their cursor, there's no telling where it may end up. So it could be at the top of the new view, it could be someplace else. You know, who knows?
So how do we handle managing focus from one view to the next? There are actually quite a few different approaches you could take when managing focus between app views. One of which includes shifting focus to the view container, allowing the user to navigate forward as if a full refresh had taken place. Another could be shifting focus to the top-level heading element, allowing the user to move forward from this point. These are a couple of potential solutions, but instead of speculating, let's review some data from a study done in 2019.

This study was conducted by Marcy Sutton, an independent web developer and accessibility subject matter expert who was working with the Gatsby team at the time. While the study focuses on JavaScript-based single page apps, the concept can still be applied to React Native. The purpose of the study was to find out which approach rendered the best, most positive user experience for a number of different disability groups using various assistive technologies. Specifically, this study included sighted keyboard-only users, screen reader users, low vision zoom users, and voice dictation users, all of whom have a unique set of requirements and expectations of what might be deemed a successful user experience. You should definitely read through this post when you have a few minutes, but I'll jump to the conclusion as to what was considered a good solution for some, though not necessarily the best for all.

Basically, the TL;DR version of this is to shift focus to a heading. This was one of the more successful solutions; it worked pretty well for mostly all users. Shifting focus to a heading is ideal as it provides screen reader users with a clear indication of what happened after the view loads: the heading text is announced and focus shifts from one place to another. For other users, voice dictation, keyboard-only and zoom users, shifting focus to the heading orients the user to a new starting point within the app. Now ideally, when the heading is first in focus, it would include some sort of visual indicator such as a focus ring. Some sighted assistive technology users may have a more difficult time understanding where their cursor is without the focus ring. It's up to the team to decide if they want to include that, but I recommend that it is there.

So how do we shift focus to a heading in React Native? That's actually a really good question, and one, I'm afraid, I don't necessarily have a good answer for. This is one of the issues I reported, where I suggested moving focus to the view heading, but the COVID Shield team at the time didn't have the time to address it. So instead, what we can do is take a look at what the COVID Alert app does. Reviewing the app source code on GitHub, the Canadian Digital Service team created their own accessibilityAutoFocus prop. This prop is placed on the heading text components, and when the view loads, focus shifts to the heading. For the example here, the screenshot shows the COVID Alert app running on an Android phone, and the heading text, "Enter your one-time key", is highlighted with a green border indicating the text has focus and is announced when the view loads. There could be other solutions out there.
Perhaps there's a third-party component you could install and incorporate into your own project. Or perhaps, depending on how your project is structured, you can simply use React's createRef style with the componentDidMount lifecycle method. That's an option. But yeah, definitely check out the accessibilityAutoFocus solution on GitHub, which is linked in the notes here. It actually works really well. This is the last section that I have to share.
14. Hiding Controls from Assistive Technology
In this section, we explore the case of hiding a control from assistive technology. The design of a form to enter a COVID Shield ID code created several accessibility barriers. The text input and clickable components were missing roles and names, resulting in a lack of context and information for users. The zero-pixel width and height of the input caused a lack of visible focus indicators and created invisible tab stops. To address these issues, recommendations were made to add accessibility labels, adjust the styling of the text input component, and utilize the accessibilityElementsHidden prop for iOS devices and the importantForAccessibility prop for Android devices. These changes allowed the input to be discovered by screen readers and provided the necessary feedback for users.
So, so far, we've discussed how to make things more accessible, how to bring context and awareness to the user about the thing they're currently interacting with. But what happens when we need to hide things from assistive technology? This is rare, but the case did come up when testing COVID Shield, where I needed to hide a control from assistive technology, so let's explore this. The screenshot here is of a form to enter your COVID Shield ID code. There's a single input to capture your ID and then a button with the label Submit Code. On the left is the initial state with the submit control disabled, and on the right is an example of a code entered into the form with the submit control available for activation.
Now, notice how the input is designed with individual underlines for each digit entered. If you think this design is unique, you'd be right. In order to meet this design choice, the dev team had to get creative: this input is actually a box component with a stylized text component showcasing the inputted value. Sitting on top of this stylized text component is a clickable control, a TouchableWithoutFeedback component, represented by the orange dashed outline in the screenshots here.
So the idea was that on press, this control would send focus to a hidden text input control. And this control was hidden in that it was styled with a width and height of zero; the input is represented by the little green squares here on the screen. As a sighted user, clicking the input would bring up the keyboard, and as the user typed, they would see their data being entered on the screen as they'd expect. So no problem, right? Well, not so fast. There were actually quite a few accessibility barriers created as a result of this particular design.
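To picture the setup, here's a rough reconstruction of the structure just described. This is for illustration only; the component and style names are my assumptions, not the actual COVID Shield source:

```tsx
// A rough reconstruction of the original (problematic) structure.
import React, { useRef, useState } from 'react';
import { StyleSheet, Text, TextInput, TouchableWithoutFeedback, View } from 'react-native';

function CodeInput() {
  const inputRef = useRef<TextInput>(null);
  const [code, setCode] = useState('');

  return (
    <View>
      {/* Stylized text showing each typed digit over its own underline */}
      <Text style={styles.digits}>{code}</Text>

      {/* Invisible clickable overlay that forwards presses to the input */}
      <TouchableWithoutFeedback onPress={() => inputRef.current?.focus()}>
        <View style={StyleSheet.absoluteFill} />
      </TouchableWithoutFeedback>

      {/* The real input, collapsed to zero size -- the source of the
          invisible tab stops and the missing role and name */}
      <TextInput
        ref={inputRef}
        value={code}
        onChangeText={setCode}
        style={{ width: 0, height: 0 }}
      />
    </View>
  );
}

const styles = StyleSheet.create({
  digits: { fontSize: 24, letterSpacing: 16 },
});
```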
So, number one, the text input component was missing its role and name. As we've discussed numerous times, without a role or name, the user is missing the context and information to understand what it is they're currently interacting with. Two, the clickable component was also missing its role and name, so when it received focus, again, no information was shared. Are we noticing a trend here? I think so. Three, due to the zero-pixel width and height of the input, when it did receive focus, there'd be no visible focus indicator. This made it seem as though there was an invisible tab stop of sorts, which, in a way, without a role or name, it was. And four, the clickable TouchableWithoutFeedback component created another tab stop, and if someone were to activate it, it would lead to the text input, which itself conveyed no information.
So all this to say, this was a large issue, especially considering entering your COVID ID was a big part of the user experience and a core component of the app's purpose. So here's what I recommended to the team to address the issues found. One, add the accessibilityLabel prop to the text input component in order to give it an accessible name. Two, adjust the styling of the text input component to 100% width, 40-pixel height, and a transparent background in order for the dashed visual to still come through. Making this adjustment allowed the input to be organically discovered by the screen reader as users explored or used swipe gestures to discover items on the screen. Interestingly, this also allowed the text input's natural role to be exposed; in the previous state, the zero-pixel width and height somehow prevented the role from being announced, and switching up the design allowed it to come through. And three, lastly, I recommended adding the accessibilityElementsHidden prop, which is specific to iOS devices, as well as the importantForAccessibility prop, which is specific to Android. With their supplied values of true and no-hide-descendants respectively, these hide the control from being discovered by assistive technology. Since the control really only serves sighted users with the dashed visual and didn't serve screen reader users, hiding it in this way is similar to applying the aria-hidden attribute on a button to remove it from the accessibility tree in a browser, along with a tabindex of -1 to remove it from the natural tab order in the DOM. For the example here, the screenshot shows the form with the text input component highlighted. Without the recommended changes, the screen reader would basically stop on both the input and the clickable control and not announce anything. After the changes were made, the screen reader provided the feedback as expected: focus would land on the input, its role and name would be announced, and that gave the user the feedback required to understand what they were interacting with, how to interact with it, and what to expect upon submitting the form.
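Putting those recommendations together, a sketch of the fixed structure might look like the following; the accessibilityLabel text here is assumed:

```tsx
// A sketch of the recommended fixes applied to the same structure.
import React, { useRef, useState } from 'react';
import { StyleSheet, Text, TextInput, TouchableWithoutFeedback, View } from 'react-native';

function AccessibleCodeInput() {
  const inputRef = useRef<TextInput>(null);
  const [code, setCode] = useState('');

  return (
    <View>
      <Text style={styles.digits}>{code}</Text>

      {/* Hide the sighted-only overlay from assistive technology:
          accessibilityElementsHidden targets iOS (VoiceOver),
          importantForAccessibility targets Android (TalkBack). */}
      <TouchableWithoutFeedback
        accessibilityElementsHidden={true}
        importantForAccessibility="no-hide-descendants"
        onPress={() => inputRef.current?.focus()}
      >
        <View style={StyleSheet.absoluteFill} />
      </TouchableWithoutFeedback>

      {/* Give the input an accessible name and enough size for a visible
          focus indicator; the transparent background lets the dashed
          visual show through. */}
      <TextInput
        ref={inputRef}
        accessibilityLabel="COVID Shield ID code"
        value={code}
        onChangeText={setCode}
        style={styles.input}
      />
    </View>
  );
}

const styles = StyleSheet.create({
  digits: { fontSize: 24, letterSpacing: 16 },
  input: { width: '100%', height: 40, backgroundColor: 'transparent' },
});
```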
15. Demo of COVID Alert App
The demo of the COVID Alert app will showcase the changes made and the importance of headings, menus, and button roles. The COVID Shield app was initially missing these accessibility features. Let's explore the improvements.
So time permitting, I don't know, Liam, what do you feel? Should we go to questions? I was gonna demo the COVID Alert app, but maybe I can skip it. I think there's probably time to do... Yeah. Yeah. So, okay, cool. So I'll do that, I'll bring it up here. What I wanted to showcase was basically the idea, I'll quickly go through the app and it'll demonstrate a lot of the things that I shared here today, the changes that the team made. Okay. So let's try this. TalkBack on. No exposure detected. Heading, COVID Alert. So right away you hear that heading announcement, right? The heading was added to this particular text, and it definitely looks like a heading. So let's go down. Sorry, I think my baby's crying, let me go turn off my baby monitor. Just a second. Don't worry, my wife is with my baby, so I'm not abandoning him. So I wanted to go through and showcase a few things. If we keep going down. COVID Alert is active. Button, menu. Double-tap to activate. So it announces the menu here, and announces it with the button role, right? That's one of the big things that was missing when I initially tested the COVID Shield app. So let's double-tap on this. COVID Alert is active. You'll notice that the focus is now on this new text, which unfortunately does not convey it as a heading. But let's keep moving. Close button. Double-tap to activate. So this was one of the controls that was missing its name and role, but now it has the information available. Button, enter your one-time key. Button, turn off COVID Alert. Two buttons here. I'll go to a couple more controls. You'll notice some of them have an arrow pointing up and to the right. Let's listen to how that is described. Button, get a one-time key. What to do if you're exposed. Opens in a new window. Link. There you go. Double-tap to activate. Links available. You swipe up then right to view. So this notifies the user that it is a link and that it opens in a new context. I'm going to try and do that now. So, double-tap.
16. Demo of COVID Alert App
The demo continues into a Chrome web view, then returns to the one-time key form. The form's headings are announced, the key input is announced as an edit box, which is unique to Android, and attempting to submit without a key produces a clear error message.
Zero percent. Progress bar. Okay. Pro. 70 percent. Web view. Double-tap to activate. Double-tap and hold to long press. So this opened up Chrome, the browser on my device here. So I'll go back. Overview. Pixel launcher. COVID Alert. COVID Alert. What to do if you're exposed. Opens in a new window. I'm going to go back out to the form that I was talking about.
Enter your one-time key. Notify people they've been exposed. Heading. There's a heading. That's good. One. Information. Exposed people will not get any. Enter your one-time key. Heading. So there we are on the heading for entering the one-time key. And I'll go to the input. Enter the key you got when you were diagnosed. COVID Alert key. Edit box. Button. Submit key. COVID Alert key. Edit box. Right. So when you land on a form control, it is announced as an edit box. That is unique to the Android device. COVID Alert key. Edit box. Button. Submit key. I'm gonna try to submit this without information. I'm curious to see what happens. I actually don't know. You need to enter your key before you can submit. Okay.
17. Demo of COVID Alert App
When changing the language, the app provides clear indications of the selected option. This highlights the benefits of testing and showcasing the app to screen reader users.
Okay. That's actually useful. So there's a little alert dialogue that opens up; I think that might be native to the platform. COVID Alert. Button. Submit key. Button. Close. So I wanted to show one other thing. Help. Settings. Change province or territory. Yeah. Button. Change language. So this is something else that required a state change: the language selection. Sending and receiving. Button. Sorry, I tapped into the wrong thing. COVID Alert is active. Button. Language. Heading. So you notice that it goes to the language heading, which is great. Now, originally there was just the visual checkbox there, right? There was no state or role announcing what these things were. So if you go to them now. Selected. Radio button. English, Canada. There's a ton of information being shared: that it's a radio button, that it's in a selected state, and also what the thing is, by its name. Radio button. Français, Canada. And this one is not checked. Selected. Radio button. English, Canada. Radio button. Français, Canada. It doesn't say unselected, but it also doesn't say selected. So there you go. Anyway, those are just a few highlights of the benefits of doing this work, of testing and showcasing what is actually on the screen to screen reader users. It's very helpful.
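As a side note, the selected-radio-button announcements heard in the demo map onto React Native's accessibilityRole and accessibilityState props. Here's a minimal sketch with assumed names, not the actual COVID Alert code:

```tsx
// A minimal sketch of a language option announced the way the demo
// describes ("Selected. Radio button. English, Canada.").
import React from 'react';
import { Pressable, Text } from 'react-native';

type Props = {
  label: string;
  selected: boolean;
  onSelect: () => void;
};

export function LanguageOption({ label, selected, onSelect }: Props) {
  return (
    <Pressable
      accessibilityRole="radio"
      // accessibilityState also supports `checked`, which some screen
      // readers prefer for radio semantics.
      accessibilityState={{ selected }}
      onPress={onSelect}
    >
      <Text>{label}</Text>
    </Pressable>
  );
}
```

Rendering <LanguageOption label="English (Canada)" selected onSelect={...} /> should produce an announcement along the lines of what the demo picked up.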
QnA
Q&A on Accessibility and Continuous Education
Question: does the app announce disabled buttons? Other areas to test for accessibility include font size, color inversion, and reduce motion. Encourage interest in accessibility through demos and usability testing with people with disabilities. Continuous education is important, as accessibility is an ongoing process.
A question there, Scott, just came in from Orlando, specific to the app here: does it announce disabled buttons, or does it just carry on through the page? Good question. I don't know if there is a disabled button in this version of the app. Originally there was with COVID Shield, and when the Canadian Digital Services team took over, I think they changed that flow to have the button always be enabled. But I imagine it would say something like disabled. I think I was testing that out somewhere else at one point. But yeah, it would say disabled. Okay, good to know. Yeah. Cool.
There's actually one more thing I wanted to point out: everything we've discussed today has mostly centered on the screen reader user experience. And while addressing issues for screen readers does help remove some barriers for other assistive technologies, such as keyboard-only and voice dictation users, it shouldn't be the only focus. iOS and Android have many other accessibility features built in, so here are a few other areas to try out the next time you're testing for accessibility. Font size: make sure your app's text can be dynamically adjusted according to the operating system setting. This helps folks with low vision read content with ease. Color inversion: this also helps people with various visual impairments, so test and make sure your content is still legible when colors are inverted. And this one's neat: reduce motion. This helps if you've got a bunch of animations on the screen; some people prefer little to no animation, as they may be prone to motion sickness or other vestibular issues. On the web you can use the prefers-reduced-motion CSS media query to take advantage of this operating system setting, and this is also true on desktop. So these are just a few settings available. I definitely encourage you to explore each setting and learn how you can adjust your designs to be more dynamic in order to be more inclusive of your users' needs. And I have just a bunch of resource links that I like to share with folks. A lot of these refer to the desktop accessibility experience, but again, a lot of this can be applied to a native environment as well, including React Native development. So if you have the slides there, you can definitely click through those, but yeah, that pretty much wraps it up.
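For React Native specifically, the same reduce-motion setting can be read through AccessibilityInfo. Here's a small hook sketch; the hook name is my own, and note that older React Native versions used removeEventListener instead of the returned subscription:

```tsx
// A small hook sketch for reading the OS reduce-motion setting.
import { useEffect, useState } from 'react';
import { AccessibilityInfo } from 'react-native';

export function useReduceMotion(): boolean {
  const [reduceMotion, setReduceMotion] = useState(false);

  useEffect(() => {
    // Read the current setting once on mount...
    AccessibilityInfo.isReduceMotionEnabled().then(setReduceMotion);
    // ...and stay in sync while the app is running.
    const sub = AccessibilityInfo.addEventListener(
      'reduceMotionChanged',
      setReduceMotion,
    );
    return () => sub.remove();
  }, []);

  return reduceMotion;
}
```

You could then branch on useReduceMotion() to swap an animated transition for an instant one.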
That was absolutely brilliant, Scott. Thanks so much for sharing all of those insights. It was really great to see that demo going through the UI and learn about all of those cases, the different aspects of accessibility across different browsers and different devices. It's a hugely complex topic, but it's great to see you break it down into sizeable pieces there. So yeah, an amazing presentation. Thanks again, Scott. We have a couple of questions here that we'll dive into. Thankfully, we've got about 15 minutes. If you do have more questions, please do throw them into the Q&A function there, and we'll try to get through as many of those as possible. So I'm gonna start off with a question from Juan here, a very general question, and that is: how do we get developers to be interested in accessibility? Yeah, that's a really good question. From my own personal experience, like I sort of shared that story earlier, I discovered it on my own. I have my own personal interest, and from there it just sort of took off, right? So I think one of the ways you could help people get interested is just by running demos within your company, running demonstrations such as I did; just showing someone how to use a piece of assistive technology can be really eye-opening and really interesting for some folks. The other thing that's also really important in general is to test with people with disabilities, right? Run those usability test sessions and have those sessions recorded for you to share internally at your organization. It's very powerful, very inspiring, to take those pain points that people experience and rally to get those issues addressed. Yeah, no, that makes a lot of sense. There's a lot of different ways of looking at it and thinking about it there. It's definitely a space that requires continuous education as well. Yeah, for sure, that's a big thing: people sometimes think this is a one-and-done checkbox item on their project, and it's really not. It's an ongoing process, like keeping up with the latest design trends or keeping up with security issues. It's forever ongoing. Yeah, that's right. We have a question here from Jordi.
Q&A on Accessibility and React Native
If you use the native attribute for a disabled control, you do not need to include the ARIA state. An ARIA attribute only affects the screen reader experience, while the native disabled attribute conveys the actual state of being disabled. The aria-description attribute is still in active development and not yet supported. It is similar to the aria-describedby attribute but takes a string value instead of an ID value. To get the currently focused item, you can use document.activeElement in JavaScript, though further research may be required for a more detailed answer in React Native. When changing the app language to French in the COVID app, whether the voice speaks French depends on the device's language settings.
Would using a disabled property remove the need for an ARIA state, or is it good practice to provide both? Yeah, yeah, that's a great question. So if you were to incorporate the native attribute for a disabled control, you do not need to include the ARIA state specifically. The reason being that when you include ARIA, pretty much any ARIA attribute, it only affects the screen reader experience, meaning the control will still be focusable and it will still be actionable. All we're really doing there is adding the announcement to that control. So in order to get the actual state of it being disabled, meaning you cannot focus on it or use it, definitely include the native Boolean disabled attribute.
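To illustrate the difference on the web side, here's a quick sketch (JSX, not React Native):

```tsx
// The native attribute truly disables the control, while aria-disabled
// only changes what is announced.
import React from 'react';

function SubmitButtons() {
  return (
    <>
      {/* Not focusable, not clickable; announced as disabled/dimmed. */}
      <button disabled>Submit</button>

      {/* Still focusable and clickable; only the announcement changes,
          so the handler has to guard itself. */}
      <button aria-disabled="true" onClick={() => { /* no-op while "disabled" */ }}>
        Submit
      </button>
    </>
  );
}
```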
Oh, yeah. That's a good question. I think it's still in active development. The link I shared in the notes there has a really great breakdown of the purpose of the attribute. I don't think it's supported anywhere yet. It's basically the same idea as applying the aria-describedby attribute onto a thing. The difference is that aria-describedby requires an ID value pointing to something that already exists in the DOM, whereas aria-description takes a string value, right? Same idea as aria-label: aria-label takes a string, versus aria-labelledby, which takes space-separated IDs. So, yeah, definitely not available yet, sorry. Hopefully soon.
Oh yeah. So, I think the question is how do we get the currently focused item? Is that the question? Yeah, I believe so. Oh, yeah. That's a good question. I know that in JavaScript there is a specific, what do you call it, it's been a long time since I've done any JavaScript work, I think it's activeElement. I don't remember exactly. Sorry, I don't really have a good answer for this without doing a whole lot of research myself. I'm sure it's possible. Yeah, someone says, yeah, that's a React element. I think you can reuse a lot of the things that are available to you in the JavaScript APIs. Yeah, we'll leave it at that. Google it. There you go. Yeah, Ariel has just dropped a link there for activeElement. So yeah, that should be how you can get the current element. Another question here from Jordi again, going back to, I think it's going back to, the COVID app that you were looking at. When you're changing language to French, does the voice also start speaking French? Oh yeah, that's a great point. I should demo that. I believe it does. Let's try it. Go back to that screen. Language. Okay. TalkBack on. COVID Alert. Okay.
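Coming back to the activeElement question for a second: on the web, that lookup really is a one-liner. React Native has no DOM, so this only applies in browser contexts:

```ts
// Browser only -- React Native has no DOM. The element that currently
// has focus is exposed as document.activeElement:
const focused = document.activeElement as HTMLElement | null;
if (focused) {
  console.log(focused.tagName, focused.textContent);
}
```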
Language Settings and Screen Reader Accent
Language settings on the personal device determine the voice and accent used by the screen reader. Because the demo device is set to English, the screen reader announces the French content with an English accent.
Language. Okay. TalkBack on. COVID Alert. Okay. Selected. Radio button. English, Canada. Actually, no. Radio button. English, Canada. No, that's a great point. The answer is no. The reason is because my personal device is set to English, right? So if your device was set up with the French language, or a French region of the world, it would announce things with a French accent. Oh, okay, that's good to know. Let me just confirm. Modifier votre province ou territoire. So it's actually trying to announce French content with an English accent. It sounds off. So, yeah. It's good to know.
Q&A on Accessibility and Testing
To assess an app's accessibility, testing with people with disabilities is crucial. Organizations like Fable offer dedicated testing services. Following WCAG guidelines, such as WCAG 2.1 AA, is important but should be considered the baseline. It's ideal to involve people with disabilities from the beginning to ensure a good state at production. Testing during development saves time and effort. The future of accessible technology will always require human involvement and testing. While advances in AI are promising, testing will remain necessary. There are tools available, such as linters and testing suites like Deque's axe and webhint, that can make accessibility testing less tedious.
Olena has a question here, and I'm not sure if there is one answer to this: how do you assess if an app is fully accessible? So I guess that's around being able to sign off, or just to be able to evaluate that completely. And I guess a follow-up question is: how many people are involved in verifying that and in the testing process, using different devices?
Yeah, that's a great question. The short answer is that having something that's fully accessible or compliant is not really realistic per se. You can try real hard to get there, and some of the things you can do, like I said, are to test with people, to have user sessions with people with disabilities, right? There's a great organization out there called Fable, and they do accessibility testing full-time. You can reach out to them and say, hey, I'm trying to test this thing, can you help us test it for accessibility? Another thing is to follow the WCAG guidelines. I can't remember, did I share? Yeah, I did share WCAG. You shared it in the chat as well. Yeah, WCAG 2.1 AA is currently the gold standard as far as meeting accessibility requirements. Now, it's worth noting that this should be taken as the baseline experience, right? Getting your app up to speed with WCAG can be quite a bit of work; there could be a lot of testers involved, with developers and designers involved too. The ideal state is to include people with disabilities from the very beginning, right? Get their input while things are being designed and developed. At that point, when things roll out and you push to production, things should be in a good state. And you'll save yourself a lot of time, headache, and heartache compared to trying to force accessibility onto a product after the fact.
Yeah, that makes a lot of sense. Just being mindful of this as early on as possible, testing as early as possible, and getting feedback from users with disabilities. Yeah, fair point. As we went through COVID Shield, I was testing as it was being developed, which is pretty good, right? Most of the time, something will come up, there'll be a user that complains about something not working, and then you'll have to do a bunch of testing and fixing after the fact. It's not ideal, so test during development and even earlier. Yeah, really great to keep that in mind. Eldon has a question here: what do you think is the future of accessible technology? Yeah, I don't know, it's tough to say. There's been a lot of work happening with AI over the years, but you know what? Even with those advances, even though these things are really amazing, you're always gonna have that human component, right? Accessibility is by far a human issue, and like I explained earlier, it's about setting expectations for users. So I don't know. I think there are always gonna be disabled people in the world, and there's always gonna be the requirement of testing your products. Will we get to a point where we don't need to test? Probably not. Even with the most robust design systems out there, with components that have been tested for accessibility, there comes a point where you need to put those components together. And does that user flow equal a great experience? Who knows? So you're always going to need to be testing. You know, that's a very interesting point, and it does come back to those fundamentals for sure. Maybe there's a question that's slightly connected that Paul has: is there a tool that would make testing accessibility less tedious, or are you pretty much stuck with manual testing? That's a good question. So there are lots of tools; I'm trying to think of React Native specifically. There are tools that you can run while you're developing in your development environment, so there are linters that you can use. There's a couple. I believe there's one being actively developed right now by Deque; they're bringing their axe testing suite as a plugin for VS Code, as a linter. I think there's another one called webhint, something like that, that you can add to your development environment as well. Now, if you're developing for the web, there are lots of automated tools that you can use to help speed up the process.
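On the linter front, one community option for React Native specifically is eslint-plugin-react-native-a11y, which can flag missing accessibility props as you code. A minimal config sketch, with the caveat that preset names may change, so check the plugin's README:

```js
// .eslintrc.js -- a possible setup using eslint-plugin-react-native-a11y
module.exports = {
  extends: ['plugin:react-native-a11y/all'],
};
```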
Using Tools for Testing and Q&A
Those tools can only find a limited number of possible issues. Native apps may exist for testing on mobile devices. Incorporate tools into your development and design workflow. Deque offers useful extensions for testing. There is time for a couple more questions.
But the thing to remember is, those tools can only find a limited number of possible issues, as the split between objective and subjective testing is very wide. As far as native goes, I meant to do some research into this, but I think there might be some native apps that you can install on an iPhone or an Android device, for example, that will run testing on your app on your phone. Don't quote me on that, but I think something like that exists. I think there might be something from Deque as well; I'll have to look into it. But yeah, there are definitely lots of tools there, so definitely incorporate those into your development and design workflow. Yeah, I've used the Deque axe Chrome extension, their inspect-element extension, which is really great. So it's good to hear that they've got plug-ins for text editors as well. Yeah. That's really cool. Yeah, well, we still have a few minutes, so we might be able to take maybe two more questions.