Deliver an Exceptional Accessibility Experience 

Session 230 WWDC 2018

Go beyond the basics and create apps with an exceptional accessibility experience. Learn how to design elements that appear in your app for maximum clarity and accessibility. Discover how to enhance the way users interact with your app even when presenting custom views, or drawing complex UI.

[ Music ]

[ Applause ]

Good morning everyone.

Happy Friday.

My name is Skyler Peterson.

I am an engineer on the iOS Accessibility team at Apple, and with me today I have Bhavya, an engineer on the Mac accessibility team at Apple.

And together we’re going to delve into what it means to create an exceptional experience in your app when it comes to accessibility.

As a quick refresher, we tend to think of accessibility as meaning making technology usable by everyone.

At Apple, we categorize the way in which we make that technology accessible into four areas of focus: cognitive, motor, vision, and hearing.

Now we have a bunch of great features that we work on to address all of these areas, but at the end of the day, it’s the work that you do in tandem with our assistive technologies that makes our platforms stellar places for people with disabilities.

Making sure that all the great content you work on is accessible to everyone creates a better, more inclusive community for all of us and fundamentally improves the lives of millions of people around the world.

For today’s talk, we’re going to skip past a lot of the basics of making apps accessible.

However, if this is your first exposure to accessibility or you’ve never checked out any of our accessibility APIs before, then I highly recommend that you check out some of the sessions from last year’s WWDC.

There’s still a lot that you’re going to be able to gain from our session.

It may just feel like it’s going a little fast for you, because today we’re going to focus on going beyond the basics. And I want to start by focusing in on this one word: usable.

Usable is great: making sure that all of the content of your app is even visible to an assistive technology is a critical first step.

But when it comes down to it, we don’t want our apps to just be usable, right?

No one’s going to be out there bragging that their app is usable.

They want it to be exceptional.

We want people who use them to be delighted when they do.

It may not be clear to you exactly what that means when it comes to accessibility, and that’s what we’re here to talk about today and hopefully provide you with some helpful guidance on how to do that.

I’d like to focus on two main areas and the different considerations that you’ll want to make for each when evaluating the accessibility of your own app.

First, visual design, and second, the way that an assistive technology user experiences your app.

So let’s start with visual design.

I’d like to begin by addressing transparency and blurring.

We use blur and transparency across many parts of iOS.

It creates a vibrant, unique look and feel for our operating system.

But for some people, particularly those with low vision conditions, blur and transparency can have negative impacts on legibility and even cause a degree of eye strain.

So we provide an accessibility setting to reduce blur and transparency, and it can have dramatic effects like this.

Instead of blurring the backdrop, we fade it so that you still have the context of where you are in the system, but the visual noise is reduced, and the controls which before had color bleed-in from behind are more legible on their solid backdrop.

Same with folders, and finally Spotlight.

Now this is a particularly good example, because we’ve taken a sample of the user’s wallpaper to color the backdrop, so it still feels personalized and contextual to the device, while we still get the increased contrast and legibility of a solid background.

As a developer, you can cater your own use of blurring and transparency by checking whether or not the reduced transparency setting is enabled and adapting your UI accordingly.

On iOS, you would check isReduceTransparencyEnabled on UIAccessibility, and on macOS, accessibilityDisplayShouldReduceTransparency on NSWorkspace.
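As a minimal sketch of that adaptation on iOS, a check like this could swap a blurred backdrop for a solid one; the blur and solid views here are hypothetical stand-ins for views in your own hierarchy:

```swift
import UIKit

// Swap a blurred backdrop for a solid one when Reduce Transparency is on.
// blurView and solidView are hypothetical views in your hierarchy.
func updateBackdrop(blurView: UIVisualEffectView, solidView: UIView) {
    let reduce = UIAccessibility.isReduceTransparencyEnabled
    blurView.isHidden = reduce
    solidView.isHidden = !reduce
}
// Re-run this when the setting changes by observing
// UIAccessibility.reduceTransparencyStatusDidChangeNotification.
// macOS equivalent: NSWorkspace.shared.accessibilityDisplayShouldReduceTransparency
```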

Next, I’d like to talk about contrast.

Contrast between content and its background is hugely important when it comes to perceivability.

As colors get closer to one another, they become more difficult to distinguish.

As well, certain colors that are legible at bigger sizes don’t work when content is scaled down.

For example, with text, the letters bleed together more easily at smaller sizes.

Now you can compute the contrast ratio between a particular color combination to see whether or not it’s going to be legible, and we follow the recommendations laid out in the Web Content Accessibility Guidelines, which state that the minimum contrast ratio you should aim for is 4.5 to 1.

Obviously, the highest contrast that you’re going to be able to get is between black and white, which comes out to be 21 to 1 and obviously works great at all text sizes.
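If you want to check a combination yourself, the WCAG math is straightforward to compute. This sketch linearizes sRGB components, computes the relative luminance of each color, and forms the ratio; black against white comes out to 21 to 1, the maximum:

```swift
import Foundation

// WCAG 2.x relative luminance of an sRGB color (components in 0...1).
func relativeLuminance(r: Double, g: Double, b: Double) -> Double {
    func linearize(_ c: Double) -> Double {
        return c <= 0.03928 ? c / 12.92 : pow((c + 0.055) / 1.055, 2.4)
    }
    return 0.2126 * linearize(r) + 0.7152 * linearize(g) + 0.0722 * linearize(b)
}

// Contrast ratio between two luminances; always at least 1 to 1.
func contrastRatio(_ l1: Double, _ l2: Double) -> Double {
    let lighter = max(l1, l2), darker = min(l1, l2)
    return (lighter + 0.05) / (darker + 0.05)
}

let white = relativeLuminance(r: 1, g: 1, b: 1)   // 1.0
let black = relativeLuminance(r: 0, g: 0, b: 0)   // 0.0
let ratio = contrastRatio(white, black)           // 21 to 1
```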

Now let’s take a look at a shade of gray.

This particular gray will work great for larger text, but it isn’t so great for smaller text because your eyes can’t distinguish the shapes of the letter forms.

Its contrast ratio of 4.5 to 1 is right on the line of passable.

Let’s take a look at one last gray.

It may not be apparent on this really large display, but this text is really hard to see on a small device, even at a large font size.

Its contrast ratio of 2.9 to 1 is just too low.

And you can use the built-in tool in our accessibility inspector in Xcode to show you what the contrast ratio between a particular color combination is and what text sizes that color combo passes for.

And we follow the same guidelines that I mentioned before in this tool.

However, even with a ratio that exceeds the recommended contrast ratio of 4.5 to 1, for people with low vision conditions, colors that meet the bar can still be problematic when it comes to legibility.

So we provide a setting for increasing the contrast across the system.

In previous versions of iOS, this setting was referred to as darken colors.

So in your code you can check isDarkerSystemColorsEnabled on UIAccessibility.

Though if you’re using a standard UIKit control that you set a tint color on, you will actually get this working for free.

On macOS, you can use accessibilityDisplayShouldIncreaseContrast on NSWorkspace.
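A minimal sketch of adapting a custom color; the two colors here are arbitrary placeholders, not values from the session:

```swift
import UIKit

// Choose a darker tint when Increase Contrast (formerly Darken Colors)
// is enabled. The colors are illustrative placeholders.
func preferredTintColor() -> UIColor {
    if UIAccessibility.isDarkerSystemColorsEnabled {
        return UIColor(red: 0.00, green: 0.25, blue: 0.60, alpha: 1.0)
    }
    return UIColor(red: 0.20, green: 0.50, blue: 1.00, alpha: 1.0)
}
// macOS equivalent: NSWorkspace.shared.accessibilityDisplayShouldIncreaseContrast
```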

Next we have sizing.

Changing the content size of your device can have dramatic effects in the way that content is displayed and perceived.

And this is hugely beneficial to low vision users.

So let’s take for example a calendar.

On the left, we have this view at the default text size, and on the right I’ve blown it up several sizes to one of the larger accessibility sizes.

If we simulate what this looks like to someone with a low vision condition, then the benefits become immediately apparent.

That text on the right is actually still legible.

On iOS, you can check what content size a user has set their device to, and there are seven standard sizes with large being the default, but you can also enable even larger accessibility sizes, which increase that by five more.

Now there’s a lot more that I could say about dynamic type, but I’m going to leave it at that and instead encourage you to check out some of the fantastic resources that we have to figure out the best way to get dynamic type working in your own apps.
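For reference, a minimal Dynamic Type setup looks something like this; the label tracks the user’s content size category automatically, including the larger accessibility sizes:

```swift
import UIKit

// Opt a label into Dynamic Type so it resizes with the user's setting.
let label = UILabel()
label.font = UIFont.preferredFont(forTextStyle: .body)
label.adjustsFontForContentSizeCategory = true
label.numberOfLines = 0   // let long text wrap at large sizes

// You can also branch on the current category, e.g. to switch layouts:
let category = UIScreen.main.traitCollection.preferredContentSizeCategory
let useSingleColumn = category.isAccessibilityCategory
```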

Now for some, changing the size of text is overkill, but the default weight of fonts and glyphs can still make it difficult to read.

So on iOS there’s a setting for enabling bold text, which will also increase the stroke weight of glyphs.

If you’re using standard built-in UI controls and a system font, then it’s likely that you don’t need to do anything to get bold text working in your app. But if you’re using your own text solution or a custom font, or you simply want to do something like making the dividing lines in your app thicker when bold text is on, then you can check whether or not the setting is enabled and adapt as necessary.
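A sketch of both adaptations; the divider heights and the Avenir Next face names are placeholder choices, not recommendations from the session:

```swift
import UIKit

// Thicken a custom divider when Bold Text is on (heights are placeholders).
func separatorHeight() -> CGFloat {
    return UIAccessibility.isBoldTextEnabled ? 2.0 : 1.0
}

// With a custom font, pick a heavier face when Bold Text is enabled.
func customFont(size: CGFloat) -> UIFont {
    let name = UIAccessibility.isBoldTextEnabled ? "AvenirNext-DemiBold"
                                                 : "AvenirNext-Regular"
    return UIFont(name: name, size: size) ?? .systemFont(ofSize: size)
}
```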

Next, we have motion.

Animation is fun, and it often makes content feel more alive.

It can provide a direct correlation between user interaction and what effects their actions are having.

However, certain conditions, particularly inner ear conditions that affect the balance center of the brain, can make motion and animation problematic, often resulting in dizziness, imbalance, or even nausea.

Now I’m going to go through some examples of animation types that can be problematic, so if any of you in the audience have a problem with motion, you may want to look away momentarily as I play the default animations, which I’ll call out before each one.

The first is scaling and zooming.

On the left, we have the default animation for opening an app, specifically the clock app, which I’m going to show now.

We zoom into the location of the app’s icon while its UI scales into view.

And on the right, we have the same interaction but with reduce motion enabled, which will replace that with a simple cross fade.

Next, spinning and vortex effects.

If we take a look at the full screen echo effect in messages, which I’m going to show now, we can see that content is spinning along the Z axis while also changing in scale, and these types of motions paired together can cause issues.

In the case of message effects, if reduced motion is enabled, then we provide the user with a specific prompt to play the message effect instead of having it autoplay.

So they can choose whether or not they want to see it.

Next, we have plane-shifting animations.

What I want to show here is the Safari tabs animation.

So I’m going to tap on the show tabs button, and you see that the plane of the web cards is shifted to simulate a 3D space.

And on the right, we have the reduced animation, which again is a simple cross fade, but we also blink the card so that you still have the context of which one you’re coming from, as you did in the original animation.

Multidirectional or multispeed motion is another trigger.

I’m going to show the movement of the messages bubbles in a conversation.

It’s a little hard to see in this video, but it’s a lot more obvious on device.

As I scroll the messages, they have a springiness to them where they contract and expand away from one another.

If we have reduce motion enabled, then we disable this effect and have them just scroll normally.

Finally, we have peripheral movement.

The weather app on iOS has subtle animations that play in the background to indicate the current weather condition.

You can see on the left that the clouds are drifting through the air and the sun is shimmering and shining.

But if you’re scanning the forecast below, then this animation is in your peripheral view, and this can be problematic in the same way that reading while riding in a car can make you feel sick.

The horizontal motion above your area of focus triggers a reaction in your brain.

With reduced motion on, we disable the background animation.

Now typically we don’t want to just remove all animation, so of course we have a setting that you can check to see if motion should be reduced and adapt accordingly.

On iOS, you check isReduceMotionEnabled on UIAccessibility, and on macOS, check accessibilityDisplayShouldReduceMotion on NSWorkspace.

It’s important to note that simply removing an animation is often not an adequate experience.

You don’t want to lessen the experience.

You just want to add something else equally fun or engaging but different that works for the user.
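A sketch of that idea: keep the presentation’s purpose but swap the zoom for a cross fade when Reduce Motion is on. The view names and durations are placeholders:

```swift
import UIKit

// Present a view with a zoom normally, or a simple fade under Reduce Motion.
func show(_ detailView: UIView, in container: UIView) {
    container.addSubview(detailView)
    if UIAccessibility.isReduceMotionEnabled {
        detailView.alpha = 0
        UIView.animate(withDuration: 0.25) { detailView.alpha = 1 }        // fade in
    } else {
        detailView.transform = CGAffineTransform(scaleX: 0.1, y: 0.1)
        UIView.animate(withDuration: 0.35) { detailView.transform = .identity }  // zoom in
    }
}
// macOS equivalent: NSWorkspace.shared.accessibilityDisplayShouldReduceMotion
```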

Finally, I’d like to finish talking about design by addressing UI complexity.

Apps play key roles in our lives now, and it’s critically important for all of us that our technology is simple and easy to use, that it enhances our lives instead of adding extra unnecessary burden.

In the United States alone, one in six children has a developmental disability, ranging from speech and language impairments to more serious conditions like autism.

For people with cognitive disabilities or even a chronic illness, using an app can expend more energy than for someone who is neurotypical.

So how do we make sure that our apps are simple and the least burdensome that they can be?

Well, they should be easy to navigate, by having them similarly structured and through clear, logical cause and effect.

We should be able to start using our apps and complete the most common tasks without encountering any barriers.

And finally, our apps should behave in a consistent manner so that when I learn something one place, it applies to another.

Using the standard UIKit views is great because people get familiar with the way they work, but UIKit also functions as a design language for iOS.

So if you’re doing something custom, consistency with a parallel control in UIKit will help people intuit how to use your app.

This really all boils down to the way in which people experience your apps, which is just as important for people who use assistive technologies like VoiceOver or Switch Control.

So I’d like now to spend some time speaking to how you can improve the experience for those accessibility users.

The experience for assistive tech like VoiceOver and Switch Control can be quite different from your typical user experience.

VoiceOver uses many gestures like swipes and multifinger taps, while Switch Control scans groups and offers logical quick actions.

For both, there are built-in equivalents for any standard gesture or interaction, but what makes experiencing an app with these technologies exceptional?

We want to go from the bare minimum of it works to it works well.

And while assistive tech users experience your app in ways that are different, the same design principles that drive a good experience for a nonaccessibility user apply here.

You want easy navigation, predictable behavior, prioritized actions, and consistency.

It’s also important to note that if something is communicated contextually by separate elements, that same context should be conveyed to our accessibility users.

I’d like now to bring Bhavya up on stage to run through a quick audit of the accessibility experience of an app.


[ Applause ]

Good morning, everyone.

My name is Bhavya, and I’m a software engineer on the Accessibility team at Apple.

Today, I’d like to talk to you about enhancing the accessibility experience of your app by sharing with you an app that Skyler and I have been working on.

The name of this app is Exceptional Dogs, and the goal is to help make the dog adoption process easier by allowing users to browse a collection of dogs who are in need of a loving home.

Let’s take a look at this app.

This is Exceptional Dogs.

At the top, I have a carousel like UI, which is essentially a collection view of all the different dogs that I can browse.

In the bottom left corner, I have a favorite button so that I can favorite particular dogs I like, and in the bottom right corner, I have a gallery button, which brings up a modal view with additional photos of the dog.

However, not all dogs have a gallery button.

Notice how this button fades away as I swipe to Layla, who has no additional pictures.

Beneath the carousel I have a stats view with information about the dog like its name and breed.

Notice how this information updates as I swipe to the next dog in the carousel.

Finally, at the bottom I have the name of the shelter with two buttons which can open the location of the shelter in maps or call the location.

So there’s quite a bit going on here in terms of visual complexity, but Skyler and I have done our due diligence, and we’ve sprinkled in a little bit of accessibility, just enough to make sure that VoiceOver can reach every element on screen.

So let’s turn on VoiceOver now and see how a nonsighted or low vision user would digest all of this complexity.

VoiceOver on.

Exceptional Dogs.


So VoiceOver lands on the image and correctly announces the name of the dog.


I’m going to swipe from left to right now to scroll through the carousel.

Layla, Bro, Pudding.

Favorite off button, Pudding, Bro.

So VoiceOver was able to correctly scroll through the carousel, but we weren’t able to reach the favorite button until we hit the very end of the carousel.

I could, of course, still get to the favorite button by placing my finger on it like this.

Favorite off button.

But it’s important to remember that this is not how a typical VoiceOver user navigates.

Rather, VoiceOver users swipe to reach the next element, and currently, by swiping we aren’t able to reach the favorite or gallery buttons until we hit the very end of the carousel.

This isn’t a great or exceptional experience because it really limits my ability to favorite or view the galleries for virtually all of the dogs in the carousel.

I’ll place my finger over the modal view now to activate the gallery button.

Show gallery.

Picture one of Bro.


Let’s scroll through our gallery.

Picture two of Bro.

Picture three of Bro.



So, we were successfully able to scroll through the gallery with VoiceOver but focus went behind the modal view to the content in the background.

This experience might seem workable because it gets the user to the content that they need to, but it’s not great or exceptional because allowing focus to go behind the modal view is confusing and distracting for the user.

I’ll close the modal view now by double tapping on the close button.

Close, Layla.

Let’s swipe through our stats view.

Name, breed, Layla, Labrador.

When I hear a name, I expect to hear the actual name of the dog right after, but instead I hear breed.

Visually, the layout of all this information makes sense, but to VoiceOver, its structure causes it to announce information out of order.

This creates a lack of context for the user and places unnecessary cognitive load on them.

I’ll place my finger over the name of the shelter now and swipe to the buttons next to it.

Skyler’s animal shelter.

Open address in maps.




Once again, these buttons are perfectly accessible, but like the labels from the stats view, they lack context.

When I hear call, it isn’t clear to me what this call action is referring to even though visually this is pretty evident since the call button is right next to the name of the shelter.

VoiceOver off.

So we found a couple of issues here.

These issues don’t completely render our app unusable, but they do place unnecessary cognitive load on the user.

Remember, our goal here is to create an exceptional accessibility experience.

I’d like to invite Skyler back to the stage now to talk about how we can elevate our experience for Exceptional Dogs to be exceptional.

[ Applause ]

So we found some issues with our app.

So let’s take a look at some of the accessibility APIs that we can leverage to resolve them.

From Bhavya’s audit, we found some issues with our carousel.

We could get to the favorite and gallery buttons but only after reaching the end of the list instead of for each individual dog.

As well, we couldn’t get to the displayed data below for each dog until reaching the end of the list.

We need the actions and data for each dog to be quickly reachable.

So what we can do is use UIAccessibilityElement to create a new element in place of the collection view.

UIAccessibilityElement is a class used to create custom elements, oftentimes for representing parts of your UI that are otherwise inaccessible, like something you’ve drawn in Core Graphics, for instance.

But we can also leverage it to create a custom experience, which is what we’re going to do in this case.

We’ll need to return any custom elements from the accessibilityElements property of the view that contains them.

This tells our assistive technology what the accessible subelements of a given view are and the order in which they should be navigable.

So if we get our custom element in place, we can swipe to the content for each dog easily.
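A container sketch along those lines; the view and button names mirror the demo but are assumptions here:

```swift
import UIKit

// Return custom elements in navigation order from a container view.
class DogCarouselContainerView: UIView {
    let favoriteButton = UIButton(type: .system)
    let galleryButton = UIButton(type: .system)
    private var carouselElement: UIAccessibilityElement?

    override var accessibilityElements: [Any]? {
        get {
            // Create the custom element lazily, sized over the collection view.
            let element = carouselElement
                ?? UIAccessibilityElement(accessibilityContainer: self)
            element.accessibilityFrameInContainerSpace = bounds
            carouselElement = element
            // This array's order is the order assistive technologies navigate.
            return [element, favoriteButton, galleryButton]
        }
        set { }
    }
}
```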


But then how do we change dogs?

Well, we can take advantage of accessibilityIncrement and accessibilityDecrement.

By adding the adjustable trait to our custom element, we’ll let the assistive technology know that our element responds to these callbacks, so that in the case of VoiceOver, when a user swipes up or down with one finger, the collection view will scroll to the next or previous item.

Now, next let’s take a look at these information labels.

When Bhavya was swiping through them, they were out of order, and the value labels lack the context of what information they were providing.

Context is hugely important.

If I just touch on the value label for whether or not the dog is fostered, all I’m going to hear from VoiceOver is yes.

Well, yes what?

What does that mean?

What does that correspond to?

As well, we have an issue with navigation.

We’re looking at having to swipe through 14 different labels just to find what we’re looking for.

But combining them would cut that in half and make navigation much, much faster.

It makes more sense to group these labels together as a single element so that I understand the relationship between label and value.

We again can use UIAccessibilityElement to accomplish this, creating an element for each pair of labels and returning them from the containing view’s accessibilityElements property.

We also saw that when Bhavya was swiping through elements in the app that the shelter info view was split up into three distinct elements, the name of the shelter, the location button, and a call button.

The buttons have the same issue of lacking contexts as labels.

If you reach them first without reaching the name of the shelter, then it’s unclear what they correspond to.

Having these three elements exist separately increases the amount of elements that a user is going to have to navigate through to find the content they’re looking for.

So it makes more sense to combine them into one and give us quicker navigation.

As well, we want to think about what the priority is here.

The most relevant piece of information is what shelter the dog is housed in.

So we’ll provide that first and add supplementary actions for locating and calling the shelter.

To accomplish this, we can use our accessibility custom actions API.

This allows us to add actions to an element that the user can cycle through until they find the one they’re looking for and activate it.

You can return an array of custom actions for a given view, and each action has a name, which conveys to the user what’s going to happen when they activate that action, and a target-selector pair, which will execute the action in your code.

So we’ll make that whole shelter view accessible and add custom actions to it for getting the location and calling.
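A sketch of that combined shelter element; the handler bodies are hypothetical stand-ins:

```swift
import UIKit

// One combined element for the shelter row, with the buttons' behavior
// exposed as custom actions instead of separate elements.
class ShelterInfoView: UIView {
    override var isAccessibilityElement: Bool {
        get { return true }   // one element instead of three
        set { }
    }

    override var accessibilityCustomActions: [UIAccessibilityCustomAction]? {
        get {
            // The name is what's spoken to the user; localize real strings.
            return [
                UIAccessibilityCustomAction(name: "Open address in Maps",
                                            target: self,
                                            selector: #selector(activateMapButton)),
                UIAccessibilityCustomAction(name: "Call",
                                            target: self,
                                            selector: #selector(activateCallButton))
            ]
        }
        set { }
    }

    @objc private func activateMapButton() -> Bool {
        // open the shelter's address in Maps here
        return true   // report success to the assistive technology
    }

    @objc private func activateCallButton() -> Bool {
        // start the phone call here
        return true
    }
}
```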

Finally, we have the gallery view.

When Bhavya brought this up, we could swipe through all of the elements in the background, and though this view appears modal, it actually isn’t, because instead of housing this view in a view controller and presenting it modally, we instead opted to just add it on top of the view hierarchy.

And so since the content in the background is still visible, VoiceOver doesn’t know that it should exclude it.

Although everything in the gallery is accessible to VoiceOver, it gets confusing for the user, who’s lost the context of where they are and what effect their actions are having.

To fix this, we want to override accessibilityViewIsModal on the presented view, which will tell our assistive technology that only the elements of that view should be accessible right now.

We’ll also want to post a notification to let the assistive technology know that our on-screen elements have changed and that it should attempt to refocus.

We can use the screenChanged notification to do so.
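Putting the modal override and the notification together, a minimal sketch might look like this; the view and function names are assumptions:

```swift
import UIKit

// Mark the gallery overlay as modal for assistive technologies.
class GalleryOverlayView: UIView {
    override var accessibilityViewIsModal: Bool {
        get { return true }   // only this view's elements are navigable now
        set { }
    }
}

// After adding (or animating in) the overlay, tell VoiceOver the screen
// changed so focus moves into it.
func presentGallery(_ gallery: GalleryOverlayView, over container: UIView) {
    container.addSubview(gallery)
    UIAccessibility.post(notification: .screenChanged, argument: gallery)
}
```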

I’m going to turn things back over to Bhavya now to show you how to use all the API we just covered to implement our solutions.


[ Applause ]

Thanks Skyler.

Let’s elevate the accessibility experience of our app from great to exceptional.

I’ll jump into Xcode.

Let’s start with enhancing the experience of the carousel at the top.

Here, I have a class called DogCarouselContainerView, which houses our collection view of the dogs, our favorite button, and our gallery button.

As Skyler mentioned, we’d like to implement the increment and decrement functionality so that we can swipe up and down to scroll through our carousel.

To do this, we can create a property on this class called carouselAccessibilityElement, which will be a custom UIAccessibilityElement that supports those gestures.

Let’s implement CarouselAccessibilityElement.

This will be a subclass of UIAccessibilityElement.

It also needs to know about the current dog object it’s representing, so let’s create a property for that.

We’ll give it an initializer also, and let’s give it some basic accessibility like the label and the value.

Our label will be dog picker, and the value will be a combination of the name of the dog and its favorited status.

Okay. Let’s get into the increment and decrement functionality.

First, we’d like to override accessibilityTraits to return the adjustable trait.

Let’s first implement accessibilityIncrement.

We want our increment function to scroll the collection view forward.

So let’s create a function that does exactly that.

Here, I have a function called accessibilityScrollCollectionView.

It takes in a Boolean, which determines whether the collection view will scroll forwards or backwards.

Let’s dive into this function.

I get a reference to my container view, my current dog, and my list of all the dogs.

If I’m moving forwards, I get the index of my current dog and validate that it won’t go out of bounds, and if it doesn’t, I’ll ask my collection view to scroll forward by one index.

I’ll do something similar if I’m going backwards except I’ll make sure that my index won’t become negative, and then I’ll ask my collection view to scroll backwards by one index.

Let’s go ahead and make use of this function in our increment function.

I’ll pass in true to indicate that I want to scroll forwards.

And similarly, I can implement accessibilityDecrement, the only difference being that I’ll pass in false since I want to scroll backwards.

And that’s it for the carousel accessibility element.
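The element just described might be sketched like this, under the assumption of a simplified Dog model and a small scrolling protocol standing in for the demo’s container:

```swift
import UIKit

// Stand-ins for the demo's model and container (assumptions, not the real code).
struct Dog {
    let name: String
    var isFavorited: Bool
}

protocol CarouselScrolling: AnyObject {
    func accessibilityScrollCollectionView(forward: Bool)
}

class CarouselAccessibilityElement: UIAccessibilityElement {
    var currentDog: Dog?

    override var accessibilityLabel: String? {
        get { return "Dog picker" }
        set { }
    }

    override var accessibilityValue: String? {
        get {
            guard let dog = currentDog else { return nil }
            return dog.isFavorited ? "\(dog.name), favorited" : dog.name
        }
        set { }
    }

    // The adjustable trait tells VoiceOver this element responds to
    // one-finger swipe up / swipe down via increment and decrement.
    override var accessibilityTraits: UIAccessibilityTraits {
        get { return .adjustable }
        set { }
    }

    override func accessibilityIncrement() {
        (accessibilityContainer as? CarouselScrolling)?
            .accessibilityScrollCollectionView(forward: true)
    }

    override func accessibilityDecrement() {
        (accessibilityContainer as? CarouselScrolling)?
            .accessibilityScrollCollectionView(forward: false)
    }
}
```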

Let’s scroll back up to our container view.

Here, we need to expose the carouselAccessibilityElement, and to do that, we can override accessibilityElements.

Within this function, I’ll get a reference to my current dog and make sure it’s valid before I go ahead and create my elements.

Let’s start by instantiating the carousel accessibility element.

I’ll first get a reference to the carousel accessibility element if one already exists, and if it doesn’t, I’ll instantiate one here.

I’ll set its accessibilityFrame to be the frame of the dog collection view, and then I’ll store a reference to it here.

And at the end, I’ll return this array of elements.

Before I do that, I want to add some elements to my array.

I’ll check to see if my current dog has more than one image, and if so, I want my array to comprise the carousel accessibility element, the favorite button, and the gallery button; but if the dog doesn’t have more than one image, we should probably omit the gallery button.

One thing I need to remember to do is to update the current dog property on the carousel accessibility element if that corresponding property gets updated on my container view.

We’re almost done.

Recall that when I swipe through the carousel, the content of the stats view changes.

However, when I’m swiping with VoiceOver, I need to let VoiceOver know to update its understanding of the accessibility elements on screen.

To do that, I can post a layout change notification whenever a scroll occurs.

I’ll head over to my view controller file, and inside my scrollViewDidScroll method, at the very end, I’ll post that layout change notification like this.

And that’s it for the carousel container view.

A core part of this solution was using and creating accessibility elements, and I can use this same concept to help me solve a different problem in the stats view.

Recall that in the stats view all the information was separated, and it lacked context.

I can use this same concept to help me group the information for the stats view together and to add context.

I’ll head over to the dog stats view, where all of these stats are laid out.

I’ll start by overriding the accessibility elements to create my elements.

I want to cache the elements I create from this function so that I don’t have to recompute them every single time I make them.

To do that, I can create a private property called _accessibilityElements, which will act as that cache and store these elements.

Let’s get into my function.

First, I’ll see if my cache is empty, and if so, I’ll go ahead and populate my elements.

I’ll iterate over all my title labels, and I’ll get a reference to the title label, which would be something like breed, and its corresponding value, which would be something like Labrador.

With references to both the title and its corresponding value, I can now create an accessibility element which encompasses the information for both of these together.

I’ll start by creating my accessibility element here, and I’ll set its label and its accessibility frame to be a combination of the corresponding values of the title and the value.

I’ll add this element to my array of elements, and once I’m done looping over all the title and value pairs, at the end I’ll set my cache equal to this array, and I’ll return these elements.

One thing we need to remember to do is to reset our cache every time the dog object on this gets updated so that we can compute the new elements for the new dog.

To do that, I can simply set it to nil over here.

And that’s it for the dog stats view.
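The stats view walkthrough above might be sketched like this; the Dog model and the label arrays are simplified stand-ins for the demo’s real properties:

```swift
import UIKit

// Stand-in for the demo's model (an assumption, not the real code).
struct Dog { let name: String }

class DogStatsView: UIView {
    var titleLabels: [UILabel] = []   // e.g. "Breed", "Age"
    var valueLabels: [UILabel] = []   // e.g. "Labrador", "7.0 years"

    private var cachedElements: [UIAccessibilityElement]?

    var dog: Dog? {
        didSet { cachedElements = nil }  // invalidate the cache for the new dog
    }

    override var accessibilityElements: [Any]? {
        get {
            if let cached = cachedElements { return cached }
            var elements = [UIAccessibilityElement]()
            for (title, value) in zip(titleLabels, valueLabels) {
                let element = UIAccessibilityElement(accessibilityContainer: self)
                // Combine title and value so VoiceOver reads "Breed, Labrador".
                element.accessibilityLabel = [title.text, value.text]
                    .compactMap { $0 }.joined(separator: ", ")
                element.accessibilityFrameInContainerSpace =
                    title.frame.union(value.frame)
                elements.append(element)
            }
            cachedElements = elements
            return elements
        }
        set { }
    }
}
```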

Next, let’s work on adding some context to the call and open location buttons.

I’ll head over to the shelter info view where I have a reference to the name of my shelter, the map button, and the call button.

We’d like the actions of both of these buttons to be custom actions on their containing view, and to achieve this, we can start by exposing that containing view as an accessibility element.

Let’s go ahead and give this a descriptive label, which would be the name of the shelter.

Finally, let’s override accessibilityCustomActions.

Within this function, we’ll create two custom actions, the map action and the call action.

VoiceOver will describe these as open address in maps and call, but be sure to correctly localize these strings.

Each of these actions will call a corresponding function, which activates that button.

So, for example, the map action will call activateMapButton, which activates the map button.

I’ve conveniently defined this function right here.

Be sure to return true to indicate that your action was successful.

And that’s it for the shelter view.

Last but not least, let’s work on improving the experience of the modal view.

I’ve defined my modal view in DogModalView, and as Skyler suggested, to help VoiceOver recognize this as a modal view, I need to override accessibilityViewIsModal.

We’re almost done.

One thing to remember is that when I press the gallery button, we need to let VoiceOver know that the layout of the content on screen has changed so it can focus on the right content.

To do that, I can post a screen change notification whenever the gallery button gets pressed.

I’ll head over to my view controller file, and within my gallery button press function, once I animate in the modal view, I’ll post a screen change notification like this.

And that’s it.

I already have the app prebuilt with all this new accessibility work, so let’s take a look at what the new and improved experience is like.

I’m going to turn on VoiceOver.

VoiceOver on.

Exceptional Dogs.

Dog picker.



Swipe up or down with one finger to adjust the value.

VoiceOver tells me I can swipe up or down on this.

Let’s go ahead and do that to scroll through our carousel.


And I’ll swipe down.


I can also swipe from left to right to get to the favorite and gallery buttons like this.

Favorite, off, button, show gallery, button.

Now, not only is every dog in the carousel accessible to me but so are the favorite and gallery buttons from every dog in the carousel.

I can swipe once more from left to right to skip the carousel and get to the stats view like this.

Name, Pinky.

So I don’t have to navigate through the whole carousel.

This feels like a much better experience now.

Let’s go ahead and keep navigating through our stats view to see what the experience is like.

I’m going to swipe from left to right to navigate through it.

Breed, mixed.

Age, 7.0 years.

Since this information is grouped together now, it provides me with way more context and navigation is faster since I don’t have to navigate through each and every individual label.

I’ll place my finger over the name of the shelter now.

Skylar’s Animal Shelter.

Actions available.

VoiceOver informs me that there’s actions available on this, so let’s swipe up to hear them.

Open address in maps.


Swiping up makes it clear to me that these actions refer to the name of the shelter and will be performed on it.

It gives me way more context.

Finally, I’ll activate the modal view by double tapping on the gallery button.

Show gallery, button.

Picture one of Pinky.


I’ll swipe from left-to-right to navigate through it.

Picture two of, picture three, close, close, close, button.

Focus no longer goes behind the modal view.

This is a much better experience now.

VoiceOver off.

[ Applause ]

Well, we just elevated the accessibility experience of our app from simply working to exceptional.

We’ve achieved this by shaping the experience around the user and modeling the interaction around this experience, and in doing so, we’ve lifted a huge cognitive burden off the user.

Your users will appreciate the extra effort you put into the accessibility of your app so they can love it as much as anyone else.

I’d like to hand it back to Skylar now.

[ Applause ]

Thanks, Bhavya.

Hopefully Bhavya’s demo showed you how small tweaks can transform the accessibility experience for the better.

It’s important to note that creating custom interfaces for the accessibility of your app is not always the right answer.

Sometimes the simple fix is the best fix. But when making your apps accessible, think beyond the surface level about what it will really mean for someone who’s using an assistive technology to use your app.

To wrap this up, I’d like to return to where we started today.

Usable is a great first step for making apps accessible, but we can do better.

We can and we should provide our users with exceptional, delightful experiences that cater to their unique needs.

There are a lot of ways to take an accessible app from being functional to being exceptional.

Designing for accessibility up front can get you a lot of wins.

Doing things like making sure your colors meet minimum recommended contrast ratios or that your layout can adequately adapt to changing size have benefits for all users.

But you should also extend your default design to accommodate the settings that users have enabled.

Finally, when crafting the accessibility experience for assistive technologies, do so purposefully.

Prioritize your content.

Make it easy and intuitive to navigate and provide context when necessary.

For more information as well as access to a simplified finished version of the sample app you saw there, you can visit our session link on the developer portal.

Also, right after this session we’re going to be in Technology Lab 9 for an accessibility lab, where you can bring any questions you have about making your apps accessible or about elevating the accessibility experience of your own app.

So please come on by if you have any questions.

Thank you and have a fantastic rest of your WWDC.

[ Applause ]
