Accessibility for iOS

Session 210 WWDC 2012

iOS devices are incredibly popular for users with special needs. Learn how to take advantage of the accessibility APIs so that everyone is able to use your apps. Gain insight into the best practices for making your apps work with VoiceOver and how to integrate accessibility features into your apps. Come learn how to make even an interactive game (The NSZombie Apocalypse!) work well with VoiceOver.

Chris: Welcome to Accessibility for iOS. My name is Chris Fleizach. I'm on the iOS Accessibility Team. Today, I want to talk about how you take your app to the next level in terms of accessibility.

I'm going to be saying the word accessibility a lot in this presentation. What do I mean when I say this word? It can have a lot of different meanings. Mainly, I'm talking about using technology to overcome challenges: things like computer-controlled wheelchairs, assistive communication devices like Proloquo2Go for the iPad, screen readers so people with low vision and blindness can use computers, and, of course, many, many other examples.

On iOS, we have a number of accessibility features that are designed to help people use the device in the same way that everyone does. VoiceOver is one of our flagship products. It allows people who are blind or have low vision to use a touch screen device. It works by changing the gestures that are used, so that instead of a touch-to-activate model, you have a touch-to-speak model. This provides a safe environment to explore the phone without worrying that something will happen.

We also have Zoom. Zoom allows you to use three-finger gestures to zoom in and pan around, so you can get in close on text and read things if you have low vision.

In iOS 5.0, we added AssistiveTouch. AssistiveTouch allows access to all the hardware buttons and multi-finger gestures, even creating custom gestures, through either a hardware device or, if you only have access to a single finger or a stylus, you can also drive these gestures that way.

In iOS 6, we've added a bunch of new things that I'm really excited to talk about. You've already heard a little bit about Guided Access. Guided Access mainly locks you into a single app and controls access to the hardware features. Now, why is this important, and why is it important for accessibility? We designed Guided Access to really address the needs of users with autism who are using the iPad. What is the problem that we're trying to solve? The iPad has a lot of distractions on it. There are buttons everywhere: volume buttons, lock buttons; you can touch around and go into YouTube. Essentially, there are a lot of distractions, and children who have trouble focusing on things will get distracted by them. Guided Access is really out there so you can help control the experience of a child or someone with autism, so that they can focus on the content and learn or achieve what you want them to achieve.

We're also adding Made for iPhone hearing aids. This is a really exciting development. For a long time, hearing aids have been haphazardly coupled to phones. They've had to use magnetic induction and other technology to overcome barriers, but now we're going to be introducing high quality, wireless, direct audio links from hearing aids to the iPhone. It can be a very low power consumption device, and you're going to be able to control the hearing aid's settings directly from your iPhone, so it's deep integration. We're working with top hearing aid manufacturers right now to produce this. They will be out sometime soon. It's going to be pretty exciting.

You've seen a lot about Maps already. VoiceOver also works with Maps. The nice thing about having vector data is that the data can be used for accessibility as well, so we're going to allow you to discover roads and points of interest with VoiceOver on.
You're going to be able to determine where intersections are, and there's going to be integration with turn-by-turn directions, so we're thinking this will be a pretty powerful solution for many users.

We've also added some enhancements built upon previous work. Custom vibrations were added in 5.0, but they only applied to phone calls and FaceTime. In 6.0, they will apply to everything. If you want a custom vibration for a text message from your wife, for example, you can set that and have it play specifically just for a text message, and that applies to calendar alerts, Facebook items, reminders, and the whole list of notifications.

VoiceOver and Zoom finally work together. Before, there were conflicts between the gestures being used. We've overcome some of those so that you can use them at the same time.

We've also made a number of enhancements to Speak Selection, which I want to show you now. Speak Selection was introduced in 5.0. It allows you to select some text and have it spoken. We made two enhancements that I want to show. One is much better support for multiple languages. I have the same sentence in three languages: English, Arabic, and Greek. Now, if I want to speak them, Speak Selection tells me that they're in English with two other languages available, and when I select those …

Speaker 2: The revolutionary iPhone also includes an equally revolutionary screen reader.

Speaker 3: [foreign language 00:05:26]

Speaker 4: [foreign language 00:05:33]

Chris: It automatically detects languages based on script and other identifying information. If there are ambiguous languages, we look around and see what other languages you might want to speak at any time, so much better [inaudible 00:05:55] for international support.

We've also added word-by-word highlighting. If I go turn on Highlight Words in the Speak Selection menu and come back, when we speak these items now …

Speaker 2: The revolutionary iPhone also includes an equally revolutionary screen reader.

Speaker 3: [foreign language 00:06:20]

Speaker 4: [foreign language 00:06:26]

Chris: I'm sure you got all of that, but those are just some of the enhancements we've added for Speak Selection.

Let's talk about what the meat of this presentation is about, and that's really using UIAccessibility to make your app more accessible for VoiceOver. Now, why is this important? VoiceOver is a special application. VoiceOver is used to overcome a very challenging difficulty, blindness, made more so by the touch screen in and of itself. Making your app accessible means that you implement some API, and a VoiceOver user can get more information out of your app and know how to use it effectively. We're going to be talking about the basic API and how you do the most important things to make your app work well. We're going to be talking about the new API for the things we've added in 6.0, and then I want to have a deeper dive into some things that you may not have known. This is the fourth year that we've done this. Some of you might be very familiar with the API, but I think there are still some things you'll be able to learn.

How does this work? VoiceOver is running on iOS when you turn it on, and it intercepts all gestures. When I touch somewhere, for example, somewhere in Notes, VoiceOver intercepts that and says, "What's the element at this point?" It does that by going through UIAccessibility and then asking the app. The app gathers some information about what's at that specific point, bundles it up, and sends it back over to VoiceOver.
VoiceOver is then able to take that and transform it into synthesized speech, or maybe braille output, or some other alternative output that's appropriate for the user.

How do we add accessibility to your app? The good news is a lot of it comes for free. If you're using UIKit controls, most of it is just going to work, so your job in that case is mostly to add labels to things that are images, or buttons that have images inside of them. All you need to do is set a label most of the time, and things will start to work.

Most of this information is conveyed through attributes. Attributes are the core of an accessibility API. Basically, they allow you to encode certain information about an object so that VoiceOver knows what to do. For example, we have a UIImageView here. By default, this has some attributes built in; specifically, we know the file name. Here, VoiceOver might have seen the image view and know that the file name is apple_logo512x512, but obviously that's not a great user experience if that's what they're hearing. Instead, it's up to you to set the appropriate attribute, in this case the label, and the label might be something like "Apple logo."

The two most important attributes in the API address the fundamental questions about whether your app is accessible. One, can a VoiceOver user reach an element? Can they touch it and have it spoken? And two, what will be spoken? The first question is answered with isAccessibilityElement. If you return YES, that means VoiceOver will be able to see it. The second question is answered with accessibilityLabel. accessibilityLabel is a string you can return that will be the textual representation of the object. By default, these are filled in for UIKit controls, so a UILabel automatically returns its text for the accessibilityLabel.

Some other common attributes: accessibilityHint is a good way to provide a little bit more information about what to do if the context is not completely clear. It's optional. And accessibilityTraits provides you with a way to control the behaviors, the roles, and other important information about the object. Is it selected? Is it a link or a button?

A little bit more about the traits. Say we have a screen that looks like this. At the top, we have an element that might be labeled with the static text trait. In the middle, we have an element that has the image trait. Toward the bottom, there is a button, and at the very bottom is a slider, which would have the adjustable trait. There are fifteen or sixteen traits that let you decide how VoiceOver will interact with something.

How do we go about adding these attributes? Obviously, there's some [inaudible 00:11:24], but you can also add them through Interface Builder. In the inspector pane, you might see a screen that looks a little bit like this, and all the information we talked about is encoded here as well: isAccessibilityElement corresponds to this checkbox, accessibilityLabel is the Label field, Hint is the Hint field, and then there's an array of traits where you can select what's appropriate.

Interface Builder is great if you have a static design where a button is just there all the time, but if you have a more dynamic interface, or you're doing things in code, you may need to set values in code. One way is to use the setters. Say, for example, we have a MyControl object. By default, things that are just UIControls don't have any accessibility, so you need to go in there and set isAccessibilityElement to YES and set the appropriate accessibilityLabel. In this case, maybe it's a play button. If you have objects that change their value or change their label depending on your data model, you may need to override instead. You don't just have to use the setters; you can also override and say, yes, this is an accessibility element, and my accessibilityLabel is something that's appropriate.
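A minimal sketch of both approaches, assuming a hypothetical MyControl subclass of UIControl with an isPlaying property; only the UIAccessibility calls themselves come from the talk:

    #import <UIKit/UIKit.h>

    // Approach 1: configure an instance with the setters.
    MyControl *playButton = [[MyControl alloc] initWithFrame:CGRectMake(0, 0, 44, 44)];
    playButton.isAccessibilityElement = YES;
    playButton.accessibilityLabel = NSLocalizedString(@"Play", nil);
    playButton.accessibilityTraits = UIAccessibilityTraitButton;

    // Approach 2: override in the subclass when the values depend on state.
    @implementation MyControl

    - (BOOL)isAccessibilityElement
    {
        return YES;
    }

    - (NSString *)accessibilityLabel
    {
        // isPlaying is a hypothetical piece of model state.
        return self.isPlaying ? NSLocalizedString(@"Pause", nil)
                              : NSLocalizedString(@"Play", nil);
    }

    @end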
Accessibility attributes will take you 85% of the way there. Most of the work you'll do will just be setting attributes here and there, and your app is absolutely going to become a lot more accessible.

Now, sometimes you also need to tell VoiceOver when things happen, and that's done through accessibility notifications. When a few things change on the screen, you need to tell VoiceOver to update. If we look at this example in Clock, when you press the Edit button, a few new elements appear on the screen: a delete switch appears, something else disappears, and so on. In that case, you need to tell VoiceOver to update itself, because just a few things changed. To do that, we call UIAccessibilityPostNotification with the layout changed argument.

The other type of notification that's important is the screen change notification. This is for when the whole context switches out and VoiceOver needs to reset itself. It will play a sound, it will move to a new element, and so on, and you do that similarly by sending the screen change notification. For example, if you select a new tab in a tab controller, that will generate a screen change. These are sent for all the basic UIKit controls already. It's only if you're doing custom things that you need to worry about these notifications.
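In code, both notifications are one-line calls; this sketch passes nil as the argument, which lets VoiceOver pick the element on its own:

    // A small, in-place update: an edit button revealed a few new controls.
    UIAccessibilityPostNotification(UIAccessibilityLayoutChangedNotification, nil);

    // The whole context swapped out, e.g. a new tab or modal view appeared;
    // VoiceOver plays a sound and moves to a new element.
    UIAccessibilityPostNotification(UIAccessibilityScreenChangedNotification, nil);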
That's going to take you 90% of the way there. Notifications and attributes are really the most important things you have to worry about when making your app accessible.

Let's start in with the demo and some of the code and see what we can do. The demo app that we're using this year is one that I've created. It's called the NSZombieApocalypse. Let's see how the app works before we try to make it accessible and look at it with VoiceOver. Basically, the zombies, the NSZombies, are running amok in your app, and as things happen, more zombies come, and eventually you run out of memory and the app will crash. Your job is to keep deploying these memory-release techniques, like ARC or autorelease, to make the zombies go away. I believe I have some sounds, too. If you use garbage collection, that will actually add more memory to your program, so don't do that one. We also have this little help screen with a little story that I wrote. That's how you play the app.

The question is, how do we make it accessible? The first step is to turn on VoiceOver and audit the app for accessibility. How do we do that? The easiest way is to go to Settings, go to Accessibility, then go to Triple-click Home and set it to VoiceOver. Once we do that, we can [crosstalk 00:15:55] and VoiceOver will turn on …

Speaker 2: … VoiceOver off.

Chris: … and off when you toggle that. Let's go back to our app and turn on VoiceOver.

Speaker 2: Landscape. Home button to the left [crosstalk 00:16:06] NSZombieApocalypse.

Chris: Let's start to examine what's wrong with the app. The first thing: if I try to touch the zombie meter …

Speaker 2: VoiceOver on. Zombie meter.

Chris: Nothing really happens.

Speaker 2: An object was retained too many times. Six-

Chris: I can touch that, so that looks fairly standard. What about the buttons?

Speaker 2: Self release. Self release.

Chris: It didn't say it was a button, and the frames are rather small.

Speaker 2: Button.

Chris: That one just says "button." There's a screen change here, but nothing really happened.

Speaker 2: Button.

Chris: Those also just say "button." And I can start to examine by swiping around, moving from element to element with a swipe gesture.

Speaker 2: Self dealloc, self release, self auto- garbage colle- ARC. Garbage. Self. Self. Self. Free. Button. An object was- zombie meter.

Chris: I can swipe through and go through all the elements on the screen and find out if something is missing. I also wasn't able to touch the zombies at all. That's sort of an empty area.

There are a few things we can do to make this a better experience. One, we can make sure the zombie meter is described. We can make sure the zombies are elements.

Speaker 2: Game over.

Chris: We can make sure the buttons say that they're buttons. We can probably give some hints about what to do, because it's not immediately clear what you do with those buttons, and then obviously the help buttons need labels and things like that.

Let's go to the code and see what we can do. I wrote down a bug list, and I also have some screenshots, so it will be easy to look at what we want to do. Here, the buttons at the bottom are not really buttons, so we need to make sure they say they're buttons and have the right frames. How do we do that? Let's look here and go to our button view. Why didn't it speak "button"? It looks like the button view is just a UIControl, and inside of it there's a label, which looks like the thing VoiceOver was focusing on, and then it does some other stuff. How do we fix this? isAccessibilityElement: return YES. Then let's return the label. What is the label for this button? The label is likely just the text of the labelView. Then, finally, we need to set the traits to say button. That will make sure the buttons behave like buttons.

Now, the buttons also need some hints. How do we do that? Let's override accessibilityHint and return something like "Drag memory technique over zombies and release to deploy." Hopefully, that should give enough context about what to do.

Now, the zombie meter needs a description, so it needs a label. All I could touch was that label of the zombie meter. Luckily, there's a class called ZombieMeter. It's just a UIView with a label inside of it. What we're going to do is make this one accessibility element, so you just find one giant zombie meter. We'll make sure it has the right label, which will also be the text of the label. Then we want to return the value of the zombie meter. The thing kept going up and up; it has some sort of percentage associated with it. In that case, we can use something I haven't mentioned yet, which is called accessibilityValue. Value is great for things that change, that have some dynamically changing value. In this case, we can return an NSString with the format "%.0f%%" and zombieLevel times 100. That will give me the accessibilityValue.
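Put together, those fixes look roughly like this; ButtonView, labelView, label, and zombieLevel are paraphrased from the talk rather than verbatim from the sample project, and real code should localize the strings:

    #import <UIKit/UIKit.h>

    @implementation ButtonView

    - (BOOL)isAccessibilityElement
    {
        return YES;
    }

    - (NSString *)accessibilityLabel
    {
        // Speak the text of the label nested inside this UIControl.
        return self.labelView.text;
    }

    - (UIAccessibilityTraits)accessibilityTraits
    {
        return UIAccessibilityTraitButton;
    }

    - (NSString *)accessibilityHint
    {
        return @"Drag memory technique over zombies and release to deploy.";
    }

    @end

    @implementation ZombieMeter

    - (BOOL)isAccessibilityElement
    {
        return YES;
    }

    - (NSString *)accessibilityLabel
    {
        return self.label.text;
    }

    - (NSString *)accessibilityValue
    {
        // zombieLevel runs from 0.0 to 1.0; report it as a percentage.
        return [NSString stringWithFormat:@"%.0f%%", self.zombieLevel * 100];
    }

    @end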
Now, the question mark button needs a label. Let's look at our question mark button here. It looks like it takes a symbol, which might be the question mark or an X or something, then it does a drawRect right in the middle with that string, drawing it inside the circle, and then it has a label beneath it, so we added a label subview to that button. We can see why VoiceOver doesn't speak anything: there's a button, and then there's a label inside of it. What we have to do is tell VoiceOver to speak the right thing. Now, we could tell it to speak whatever that label has, "question mark" or "X," but that's not the semantic information about the button. The button is really about help, or close, or something like that. So let's find where we set these things and set the right accessibilityLabel with the setter, and call this one "Help," and we'll find the next one. This one, it looks like, should say "Next" or something, and the last one is hiding here. It looks like this is the help one. [inaudible 00:22:10] change over here. This one should be "Close." Those buttons now have labels, and they're already buttons, so we don't need to say that they're elements again.

When you press the question mark, VoiceOver needs to reset itself. When we're on this screen and we press that X, or we press the question mark, we need to make sure VoiceOver resets itself so it knows the context has changed. How do we do that? Let's find where that happens. I'm looking at questionPressed; this looks like the right place, and I'm going to post an accessibility notification, a screen change notification. That will tell VoiceOver to reset when the question mark is pressed. Let's also do the same thing when we close the help. That will be good.
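A sketch of that change, with questionPressed: and showHelpView paraphrased from the demo:

    // Tell VoiceOver the whole context changed when help opens (and,
    // symmetrically, when it closes).
    - (void)questionPressed:(id)sender
    {
        [self showHelpView];  // hypothetical: present the help screen
        UIAccessibilityPostNotification(UIAccessibilityScreenChangedNotification, nil);
    }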
Then, our final bug: the zombies should be elements and have labels. Where are those zombies hiding? In ZBEWalkingDead, it looks like, and they have a bunch of body parts. Let's just find the body, and then we can set isAccessibilityElement, and the accessibilityLabel will be something appropriate. Make sure you also localize all your strings. I'm not doing that now for the sake of brevity, but VoiceOver speaks something like 30 languages, so make sure you use localization if your app is going to be localized. Let's try that. [crosstalk 00:23:53]

Speaker 2: … zombie apocalypse. Zombie meter, 0%.

Chris: We check the zombie meter.

Speaker 2: Zombie meter, 15%.

Chris: It's saying the right percentage.

Speaker 2: Walking dead in the house.

Chris: Great. I can touch the zombie if I want to.

Speaker 2: Self release button.

Chris: It says "self release button." [crosstalk 00:24:07]

Speaker 2: Drag memory technique over zombies and release to deploy.

Chris: That was the hint. Let's check the help button.

Speaker 2: Help button.

Chris: When we press the help button, we should reset.

Speaker 2: NSZombieApocalypse.

Chris: We reset to the top of the screen. Let's check.

Speaker 2: Close button.

Chris: That says the right thing.

Speaker 2: Next button.

Chris: That says the right thing.

Speaker 2: Close button.

Chris: Let's close.

Speaker 2: Zombie meter, 35%.

Chris: We also went to the first element on the screen, which was the zombie meter in that case. It looks like-

Speaker 2: Zombie meter, 42-

Chris: We have addressed all those bugs. The app is becoming more and more usable all the time. Let's go back. Those were the basic accessibility attributes. With just a little bit of work, we can start to make our apps more accessible, even something as dynamic and graphically intense as that game.

Let's talk about what's new in 6. We've already learned that with just a little bit of work, we can get a lot of gain. Where do we go from here? We've added new ways to interact with VoiceOver. We've added some new attributes and traits to help make your app more complete. And, something I won't demo: if you have a custom text view based on UITextInput, VoiceOver will work with it natively. No extra work required, as long as you implement UITextInput.

Let's look at the new API. First one: accessibilityPerformMagicTap. What is this? If you've used VoiceOver, you may know that there is a gesture, a two-finger double-tap. When you do a two-finger double-tap, VoiceOver does the magic thing. That means if you're playing music, it will pause the music. If you're in a phone call, it will answer or hang up the call. If you're in the Clock app, it will start or stop the stopwatch. If you're in Camera, it will take a picture. The two-finger double-tap magically does the right thing in whatever context you're in. You can add this to your app, too. You implement this method, do whatever you need to do, and return YES, and your app seems like magic as well. Just like that.

We also now allow you to move VoiceOver focus if you need to. Sometimes you want the VoiceOver user to be at a specific place in your app to learn and understand what's going on. Before 6.0, that was quite challenging to do. Now what you can do is use the argument of the accessibility notification, for either layout change or screen change, and pass in the element that you want VoiceOver to move to. For example, if I post a screen change and I want VoiceOver to move to a specific button, I can pass that button as the argument.

You can also now find out when VoiceOver finishes speaking something. Finishes speaking what? There's another notification that I haven't mentioned yet, called the announcement notification. That came out in 4.0, and the announcement notification allows you to tell VoiceOver to speak any string you want. This is sort of nice if you need to make an announcement, or the screen changes, or something is happening that's not immediately clear; you can use it to inform the VoiceOver user about what's going on. But if you needed to string multiple of these together, for example, if you're reading a book in a row or something, you wouldn't know when VoiceOver had finished speaking, so you would interrupt what you had told VoiceOver to speak. Now, you can listen for the announcement did finish notification (that's an NSNotificationCenter notification), and when that comes in, you know that VoiceOver has finished speaking your string.

We've added a new API to help you order things when VoiceOver is navigating through your interface. What do I mean? Let's look at the Clock app, for example. The Clock app has all these little alarms arranged in a grid, and you would like the VoiceOver user to navigate through them in a specific order. How do you navigate? With a single-finger swipe, left to right. As you swipe through left to right, by default VoiceOver goes from top left to bottom right, sort of row by row through things. If you want to order them differently, you can use shouldGroupAccessibilityChildren, and that tells VoiceOver that the things in this view should be kept together.

We've also added a new trait, UIAccessibilityTraitHeader. In this screenshot of Game Center, I would identify two things as possibly being headers. The trait lets you tell VoiceOver that an element is sort of like a heading over some content, and that can be useful for finding where you want to be or understanding the layout of the app.
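Before the demo, a sketch of the first three of those additions in code; togglePlayback, doneButton, and announcementDidFinish: are hypothetical names standing in for your own:

    // Magic Tap: implement on an element, a view up the hierarchy, the
    // UIApplication, or the app delegate; VoiceOver walks up that chain
    // until someone returns YES.
    - (BOOL)accessibilityPerformMagicTap
    {
        [self togglePlayback];  // do the "most important" action here
        return YES;
    }

    // Move VoiceOver focus by passing an element as the argument.
    UIAccessibilityPostNotification(UIAccessibilityScreenChangedNotification,
                                    self.doneButton);

    // Speak an arbitrary string, then find out when it finishes.
    UIAccessibilityPostNotification(UIAccessibilityAnnouncementNotification,
                                    @"Download complete.");
    [[NSNotificationCenter defaultCenter]
        addObserver:self
           selector:@selector(announcementDidFinish:)
               name:UIAccessibilityAnnouncementDidFinishNotification
             object:nil];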
Chris: Let's go back to our app and see how we can use some of this new API to make it even better. Here are the second demo's bugs. What do we want to do? Let's implement MagicTap and see what we can do. Let's try to move VoiceOver focus when the help screen disappears: when you press this X button, let's move VoiceOver focus to something more appropriate, like the status view, since that's the most relevant information. Let's also use shouldGroupAccessibilityChildren on the button collection view. Down here, we want these buttons to be grouped together; we don't want this inconveniently placed button landing in the middle of all the other buttons in swipe order. Then, finally, let's use the heading trait on the NSZombieApocalypse title to identify it appropriately.

First up, performMagicTap. I'm going to my app delegate and overriding it, returning YES. You can implement performMagicTap on an individual object, on a parent in the view hierarchy, on the UIApplication, or on the UIApplication delegate. VoiceOver will go up that chain, trying out each one. I think what we should do is togglePause. I know this is a fast-moving game; it's very challenging. If you get tired and need a break, you should be able to pause it.

Next, move VoiceOver focus to the status view when the help screen disappears. Let's go to our view controller, helpDidClose. This looks appropriate. Now let's just pass in statusView. Done.

shouldGroupAccessibilityChildren on the button collection view: how do we make these things order together? Here's our ButtonCollectionView, and here, let's return YES for that.

Then, the heading trait on the help title. In HelpView, there's a giant label here somewhere. This looks like it, and we can set the accessibilityTraits to the right one.
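Collected in one place, the four changes amount to something like this; togglePause, statusView, and titleLabel are paraphrased names, not necessarily those in the sample project:

    // 1. In the app delegate: pause and resume on a two-finger double-tap.
    - (BOOL)accessibilityPerformMagicTap
    {
        [self.viewController togglePause];
        return YES;
    }

    // 2. When help closes, move VoiceOver focus to the status view.
    UIAccessibilityPostNotification(UIAccessibilityScreenChangedNotification,
                                    self.statusView);

    // 3. In ButtonCollectionView: keep the buttons together in swipe order.
    - (BOOL)shouldGroupAccessibilityChildren
    {
        return YES;
    }

    // 4. Mark the help title label as a heading.
    self.titleLabel.accessibilityTraits |= UIAccessibilityTraitHeader;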
It's running.

Speaker 2: NSZombieApocalypse.

Chris: What do we do? The first thing: the swipe order of these should be together.

Speaker 2: Self dealloc. Self rele- Self aut- Garbage. ARC. Help button.

Chris: If you noticed, swiping through, I went from-

Speaker 2: Free but- Self … self … self … garba- ARC. Help button.

Chris: I went through all the buttons, and only after that did I get to the help button. The buttons are ordered together. Good. Things are working. We did the MagicTap. Paused. The zombies are finally paused. You can go take a break and [inaudible 00:32:40] when your … can double-tap and start the game again if you need to. What else do we want to do? When …

Speaker 2: Help. NSZombieApocalypse heading.

Chris: That thing says "heading," just like we asked it to. And, finally, when we-

Speaker 2: Close.

Chris: … when we close this one, we should move back to that status view.

Speaker 2: Memory leak, 3 megabytes.

Chris: We closed that thing, we told VoiceOver to move to a specific element, and it worked just fine.

Let's go back and talk about some other cool things, some of the deeper topics. If you've made a standard app accessible already, it may have gone very easily, but chances are that as your apps become more complicated, you'll need some other accessibility techniques to ensure your app remains accessible.

The first thing I want to talk about: what happens if there is no view? For example, you're using drawAtPoint: with a string, you're overriding drawRect:, or maybe you're using OpenGL. A good example is Maps. Obviously, there's not a new UIView for every road. What do we do in this case? What you want to do is make an array of UIAccessibilityElements. A UIAccessibilityElement is essentially just a container for some idea in your program. For every distinct user interface object that you want to expose, make a new element.

How does this work? For example, say we have an array of roads. We want to gather all of our roads together and return those as elements so VoiceOver can interact with them. The first thing we need to do is make the element. We do that by initializing it. Notice that the accessibility container should be the view that does the drawing. VoiceOver starts by drilling down through the view hierarchy, starting at the window and then through all the views, all the way down. If your view is the last thing before it starts drawing things everywhere, then that view is the accessibility container. You need to set the accessibilityLabel for each element so VoiceOver can speak something, and then you probably want to store the elements in an array. It's good to cache these things; VoiceOver will ask for them often, so caching them in an array or something will make life easier.

Next up, how do we tell VoiceOver what they are? We implement what we call the UIAccessibilityContainer protocol. Three methods, pretty much just like NSArray. First one, the count: how many accessibility elements are there? Easy enough; if you have an array, you can just return its count. Next up, what's the index of a given accessibility element? Here, you can mirror indexOfObject: from NSArray. Finally, what's the element at an index? Again, this mirrors NSArray. Six or so lines of code, pretty simple if you have this modeled on an array to start with.

The last part is that you have to tell VoiceOver where each element is on the screen. We filled in the label. It already knows it should be an element, because you made a UIAccessibilityElement; you don't have to set the isAccessibilityElement property again. But you need to tell VoiceOver where it is, and you need to tell it in screen coordinates, and this is a challenging thing to remember. I know I never remember this code; I have to write it down in some helper method. Basically, we have an object, and we need to get its frame in the view's coordinates. The map view is [inaudible 00:36:17] big, and we start from the origin of the map view; maybe the object is at position 100, and it's 50 by 50. Once we have that view frame, we need to convert it to window coordinates. We take that frame and convert it to self.window, and from that we can convert to screen coordinates by saying [self.window convertRect:frame toWindow:nil]. That gives VoiceOver what it needs in terms of the entire screen, because VoiceOver doesn't know about the internals of your app. It doesn't know that this view is over here, so it needs everything in screen coordinates. Then, finally, you can set that frame on the accessibility element.
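A sketch of the whole pattern for a hypothetical map-style view; Road, roads, and _roadElements are stand-in names, while the container methods and coordinate conversions follow the talk:

    #import <UIKit/UIKit.h>

    // Build and cache one UIAccessibilityElement per drawn road.
    // _roadElements is assumed to be an NSArray instance variable.
    - (NSArray *)roadElements
    {
        if (_roadElements == nil) {
            NSMutableArray *elements = [NSMutableArray array];
            for (Road *road in self.roads) {
                UIAccessibilityElement *element = [[UIAccessibilityElement alloc]
                    initWithAccessibilityContainer:self];
                element.accessibilityLabel = road.name;

                // Convert the road's frame from view coordinates to window
                // coordinates, then to screen coordinates.
                CGRect frame = [self convertRect:road.bounds toView:self.window];
                element.accessibilityFrame = [self.window convertRect:frame
                                                             toWindow:nil];
                [elements addObject:element];
            }
            _roadElements = elements;
        }
        return _roadElements;
    }

    // The UIAccessibilityContainer methods mirror NSArray.
    - (NSInteger)accessibilityElementCount
    {
        return [[self roadElements] count];
    }

    - (id)accessibilityElementAtIndex:(NSInteger)index
    {
        return [[self roadElements] objectAtIndex:index];
    }

    - (NSInteger)indexOfAccessibilityElement:(id)element
    {
        return [[self roadElements] indexOfObject:element];
    }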
Next up is using announcements to provide better feedback. There are a lot of things that are quite dynamic in iOS, and if you're blind and trying to use the device, certain things can be a little challenging. One example is rearranging things in the Music app. If I select Composers, I can drag it around on the screen and eventually drag it to the tab bar, and it will replace one of the items on my tab bar. How do we make that an accessible experience so it can be usable? This is what UIAccessibility announcements are good for. For example, if you have a continueTracking method where you're tracking where this object is, you can detect when things happen and tell VoiceOver about it. If we are near an edge, we can tell VoiceOver: "You're nearing the left border," "You're nearing the top border," "Nearing the bottom border." If we are on empty space, we can say, "On empty space. Lift finger to cancel." If we're over something that's important, we can say, "On top of Artists. Lift finger to replace." Notice here that I've [inaudible 00:38:02] post to stand in for the much longer UIAccessibilityPostNotification call, just so it fits on the slide.

The last thing I want to talk about is direct touch, or direct interaction, and this is designed as a way for you to allow VoiceOver to interact with something that is very dynamic. A good example is a keyboard. A keyboard really requires you to press things in real time. You don't want to double-tap on every key; that would be a very tedious way of making a song. VoiceOver allows you to do this by setting the direct interaction trait on an object, and when you do that, it means that within a certain area, VoiceOver will pass touches through and interact with the content directly. You can also allow VoiceOver to explore elements within the direct touch area. This keyboard is composed of many keys that have information. For example, you might know that this is a C3 key, and you might want to inform the VoiceOver user of that. How does that work? VoiceOver has a number of gestures to control things, and one of the things it controls is whether direct interaction is on or off. If the user turns it off, they can keep exploring the area. When they turn it on, they can interact with it directly.

Some code to show you how to do that: maybe you have something like a PianoView, and in the accessibilityTraits of PianoView you return the appropriate trait, which is the direct interaction trait. Notice that it is an accessibility element, because this is what you want VoiceOver to interact with, but you can also set the objects within this view as accessibility elements; a VoiceOver user can explore things within a direct touch area. A nice trick to remember if you have an app that uses this kind of paradigm.
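A rough sketch of both ideas; the drag announcement assumes a hypothetical isOverEmptySpace: hit-test helper, while the trait names are the real UIKit constants:

    #import <UIKit/UIKit.h>

    // Announce drag state from a UIControl's tracking method.
    - (BOOL)continueTrackingWithTouch:(UITouch *)touch withEvent:(UIEvent *)event
    {
        CGPoint point = [touch locationInView:self.superview];
        if ([self isOverEmptySpace:point]) {  // hypothetical helper
            UIAccessibilityPostNotification(UIAccessibilityAnnouncementNotification,
                                            @"On empty space. Lift finger to cancel.");
        }
        return [super continueTrackingWithTouch:touch withEvent:event];
    }

    // A piano-style view whose touches VoiceOver passes straight through.
    @implementation PianoView

    - (BOOL)isAccessibilityElement
    {
        return YES;
    }

    - (UIAccessibilityTraits)accessibilityTraits
    {
        return UIAccessibilityTraitAllowsDirectInteraction;
    }

    @end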
Chris: Let's take a look at how we can make our app even more accessible using some of these techniques. I want to use the announcements-while-dragging technique to provide a little bit more information. The one thing that might have been hard about this app is: where do you drag those buttons, especially if you can't see? I believe we have some helpful methods here: buttonSelected, buttonDragged, buttonFinished. I think we can make this a better experience by posting some notifications that tell you what to do. When the button is first selected, what should we say? My guess is something like "Drag button over zombies." That should be good enough. Now, when you're dragging, there are a few different states. For example, if the button is inside the iPad view for the first time, we should probably say something appropriate like "Lift to deploy memory." If you've left the iPad view for the first time, maybe we can say, "No longer over zombies. Lift to cancel." Let's give that a try.

Speaker 2: NSZombieApocalypse. Zombie-

Chris: Now, I want to start dragging.

Speaker 2: Drag memory technique … drag button over zombies. Lift to deploy memory. No longer over zombies. Lift to deploy memory. No longer … lift to deploy memory. Garbage collection. Button. Drag button over zombies. Lift to deploy memory.

Chris: Just by providing a few extra announcements, a few lines of code, you can make your app a lot more usable, even if it requires a lot of interaction.

Speaker 2: Home.

Chris: The final topic I want to mention covers things you may not have known about UIAccessibility, some cool things you can do if you're really getting into the advanced techniques.

One is controlling the language of the entire app. How do we do that? You can set the accessibilityLanguage on UIApplication. Accessibility language is another attribute; it tells VoiceOver what language to use when it speaks. Whether you want it for the entire app or just for a specific element, you set it on the right object. You can also set it for a range within a string. For example, let's say we have an attributed string that has some Japanese and some English in it, and you want VoiceOver to speak a certain range using a certain language: you can add the accessibilityLanguage attribute with the right value for that range. Now, the values for these things are standard language codes: ja-JP is Japanese, fr-FR is French as spoken in France. It's easy enough to find a list of them and the ones that VoiceOver supports. Once you have that, you can use the attributed string as the accessibilityLabel. accessibilityLabel says it takes a string; you don't have to believe that all the time. You can also pass in an attributed string if you want, now that attributed strings are more prevalent in iOS 6.

How do you deal with controls without views? There are a few controls in iOS that are challenging to get at. One of them is UISegmentedControl, specifically something that looks like this, where you want to override it to say something different than "degree sign F" or "degree sign C"; in this case, you'd like it to say "Fahrenheit" or "Celsius." How do you get at those things? Normally, you create your UISegmentedControl with an NSString or an image. In that case, just set the accessibilityLabel on the NSString itself, or, if you have an image, set the accessibilityLabel directly on the image, and when you insert those things into the segmented control, VoiceOver will pick that up and speak the right thing.

Another example is UITableView index titles. Here, you return an array of strings to the table to get the titles you want to appear. You can change what VoiceOver speaks for each of these as well, by setting the accessibilityLabel on each NSString.
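A sketch of those last two tricks; the degree-sign strings and the "units" control are illustrative, and the approach of labeling the string or image itself follows the talk:

    // Tell VoiceOver to speak the whole app in a specific language.
    [UIApplication sharedApplication].accessibilityLanguage = @"ja-JP";

    // Speak full words instead of symbols in a segmented control by
    // labeling the strings (or images) before inserting them.
    NSString *fahrenheit = @"°F";
    NSString *celsius = @"°C";
    fahrenheit.accessibilityLabel = @"Fahrenheit";
    celsius.accessibilityLabel = @"Celsius";
    UISegmentedControl *units = [[UISegmentedControl alloc]
        initWithItems:@[fahrenheit, celsius]];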
In summary, obviously, we want you to add accessibility to your app. I think you're all committed already; that's why you're here. It's an important task, and the reasons are numerous. Here are just a few. One, it's going to increase your user base. Obviously, you want more users using your apps, so that's a good reason. I think another great reason is that you get a lot of really good feedback from users. Users are very excited when they find out that an app is accessible. It's really nice and rewarding to get that kind of feedback. Finally, Apple takes accessibility seriously. It is a partnership with developers. We are working hand in hand to try to make the most accessible platform that's ever existed, and it really requires Apple's commitment and your commitment to ensure that everything works as well as it can. I think this is a really worthy goal, something that you should be telling your friends about. It makes for a really compelling use case for iOS to see everyone being able to use this device and getting the same satisfaction that all of us do every day.

If you're interested in related sessions, there's an accessibility in iBooks session tomorrow, and if you want to make a custom text view like I mentioned, the keyboard input in iOS session is a good one to attend. There's lots more information out there: the accessibility programming guide, which will repeat a lot of what I said here about common attributes; the protocol reference; and the VoiceOver user manual, which will tell you all the gestures that VoiceOver can do. Quite useful.

Thanks for coming out and attending WWDC 2012, and keep making great accessible apps. Thank you.
