Design For Everyone 

Session 806 WWDC 2017

By understanding the range of abilities and capabilities of the people who use our apps, you can design robust apps that work for everyone. Learn how designing for accessibility and inclusiveness can do social good, widen participation, and enable everyone to benefit.

[ Background Sounds ]

[ Applause ]

Hi, everyone.

My name is Carolyn Cranfill and I’m a designer on the Human Interface Team, here at Apple.

And I also lead the inclusive design efforts.

Thank you for joining us today.

We released a series of seven videos last month highlighting people who feel empowered to do what they love.

And I’d like to share with you one right now.

At first, my motivation was proving people wrong.


Vision. VoiceOver.


Scheduled arriving.

Second location.

We don’t want our first album to sound like a local band.

ETA 10 minutes.

We wanted our first album to sound like a professional band that’s been doing it for years.

Requesting a ride and connecting to nearby drivers.

Finding a ride.

[Drumming] We’re called Distartica and we play heavy metal.

[ Music ]

Turn left onto Empanada Drive.


Empanada Drive.

That makes me hungry.

I ended up becoming the PR manager in the band.

Like I didn’t get what it was before.

Yeah man, like oh, dude.

Like hashtag why?

Like What?

Hashtag metal.

Hashtag new music.

Or, you know on this case like hashtag like debut album.



Double tap to open.

Text field.


Album will be dropping worldwide on April 14, 2017!

Follow our ReverbNation page.

Done. Successfully shared.

[ Music ]

[ Applause ]

So, this film, along with the others, beautifully depicts the impact accessible tech has on people’s lives when you design for everyone, including people with a range of disabilities.

I’m here today to ask you to join the campaign with us in designing your apps and games to work for everyone too.

Because we inherently design for ourselves, we design to what we know: our own worldview.

We have wants and needs, and passions.

We think of features, make an app, join a company to put those ideas out there for others too.

But regardless, the next step of reaching more people is putting yourself in their shoes.

We need to figure out who we might be excluding.

Have you thought about how there might be other people out in the world with the same passions, needs and wants as you, but can’t use your app?

If we think about how there are more than one billion active Apple devices out there, which is a lot of potential customers, and knowing that one in seven people in the world have a disability, we can be sure that people who may want to use our apps and platforms will lie everywhere across a spectrum of abilities and capabilities.

Human abilities vary greatly across cognition and social abilities, dexterity, mobility, vision, and hearing.

And in order to understand and comprehend information, our perception through sight, sound, and touch is essential.

What happens if you inhibit one of those senses, or take one away?

That could happen to you or me in our lifetime, gradually or instantaneously.

People with physical disabilities will include extreme cases, like cerebral palsy and paralysis, but also include conditions like carpal tunnel syndrome, or temporary conditions such as a broken wrist.

People with vision loss will include blindness, low vision, and colorblindness.

But also, situational disabilities like screen glare and temporary day blindness that we all experience when emerging from a dark movie theater.

Hearing loss is one of the leading types of disability.

Whether it’s present from birth or develops with age, Made for iPhone hearing aids are changing people’s quality of life.

And as we extend our notification cues to have visual, aural, and tactile feedback, we’re enabling everyone to get their notifications.

And you may have heard the statistic that 1 in 68 children in the United States have been identified with autism spectrum disorder, according to the CDC.

Autistic individuals can be sensitive to unexpected sounds and distracting graphics.

And there are also many adults and children with dyslexia.

Having multiple visual and auditory options helps people with dyslexia, and is also widely used by the blind and people with low vision.

So, if we think about how everyone in the world will have a range of abilities and capabilities, the next step of reaching more people is making your apps accessible.

But the word accessible wasn’t invented to describe accommodating those with disabilities.

It’s about access, whoever we are.

Easily reached or obtained.

Easily used.

Easily understood.

Everyone, whether they have a permanent or situational impairment, are young or old, new to technology or a pro, whatever language they sign, speak, or read, would love access to great apps.

Together, we can design better apps for everyone.

To help you do this, we’re going to talk about some design considerations and workflow techniques to keep in mind when making your apps work for more people.

By being inclusive, not only can we help everyone function in their daily life.

But we can also create new possibilities throughout society.

So, today, we’re going to talk about three areas of designing for everyone.

Striving for simplicity, making our apps effortless and easy to learn.

Accessing perceivability.

Making sure our apps are understandable through sight, sound, and haptics.

And designing with integrity.

Holding ourselves accountable for ethically making sure we include everyone.

These are principles applied to making great experiences, period.

So, first if we think about something that is simple to do, we think of it being effortless without difficulties, hurdles, or cognitive strain.

But for something to feel simple that does mean it can’t do complex things.

It just means trying to achieve desired outcomes in the most effortless and elegant way possible.

Things that aren’t simple create barriers of entry to everyone.

As we go through our day, the requirements of work and life slowly drain our energy.

And when we are low on energy, we have less capacity to troubleshoot tech and less patience for navigating something that should be quick and easy.

At night, we rest and recharge.

However, were you aware that one in six children in the United States have a developmental disability?

These range from mild disabilities, such as speech and language impairments, to serious developmental disabilities, such as intellectual disabilities and autism.

People who have cognitive disabilities or chronic illnesses might start out the day at three-quarters of a tank.

And tasks take extra effort throughout the day.

Plus, if you exert yourself all the way to empty on one day, you might start out the next day, extremely low.

So, the apps that we make do a great job at helping us through our day.

Whether it’s a convenience or helping us to recharge.

They provide quick and easy access to services, which is extremely important for those who may have difficulty traveling outside of the home, speaking or hearing on the phone, or juggling work and home life.

For example, with convenience apps like Instacart, you can have groceries delivered to your door, which alleviates the stress of the supermarket, or allows you to rest at home.

Or maybe it just gives you time back with your family.

With education apps like Duolingo, it’s incredible that we can learn from the comfort of our home, on our own time, at our own speed, and health apps like Calm help us nurture our mental and physical well-being.

And with content creation apps like iMovie, we can create and share our artistry with friends, family and the world.

So, apps play a key role in our daily lives now.

And we can make even more impact in people’s lives by reducing the amount of effort that we require to use our apps, which reduces barriers and provides more access.

But this is easier said than done.

So here are some ideas.

We should strive for our apps to be easy to navigate by having them similarly structured.

We need to be able to get started and complete common tasks without encountering any barriers or issues.

And we should use UIKit as much as possible to achieve the consistent behaviors within each app and across apps.

Because UIKit wasn’t made for just developers.

It’s a design language for the platform that everyone is familiar with and knows how to use.

And there are design resources in the Human Interface Guidelines to help you use the standard UIKit components in your apps during the design phase.

It’s important to remember that these toolkits support many accessibility accommodations.

And if we can consistently use standard UIKit interaction patterns and components for more tasks in our apps, like basic structure and navigation, sign-up flows, account settings, and sharing, then you can spend the majority of your time on the parts of your apps that make them special.

And making sure those parts are accessible too.
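To make that concrete, here’s a minimal sketch (in Swift, with illustrative names, not code from the session) of why standard components help: a system UIButton is already a VoiceOver element with the right trait and speaks its title, while a custom tappable view needs that metadata supplied by hand.

```swift
import UIKit

// A standard control: accessible by default.
// VoiceOver speaks the title, announces the "button" trait,
// and the text-style font participates in Dynamic Type.
let shareButton = UIButton(type: .system)
shareButton.setTitle("Share", for: .normal)
shareButton.titleLabel?.font = UIFont.preferredFont(forTextStyle: .body)

// A custom tappable view: we must supply the same metadata ourselves.
let customShare = UIView()
customShare.isAccessibilityElement = true
customShare.accessibilityLabel = "Share"   // what VoiceOver speaks
customShare.accessibilityTraits = .button  // so it behaves like a button
```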

Consistent use of our design language is beneficial: if you learn one app, you have a serious head start on learning another, because repetition creates cognitive ease and a comforting feeling of familiarity.

But we know it’s not always possible to solely use UIKit.

There are times when you need to do something unique and original, but don’t forget about accessibility.

It makes you think more holistically about the products that you’re putting out into the world.

So, simplicity.

Building on what I know.

Making use of standard components and familiar interaction patterns to help make apps easy to navigate.

Be laser focused in your objectives.

Reduce any hurdles.

Help me get started quickly.

And it just works like I expected it to consistently.

So, our products need to be understandable and perceivable to everyone.

This means that information needs to be available through at least one sense (sight, sound, or touch) that is available to each person.

And we can look at our apps and ask ourselves is it legible?

Is it audible?

And is it tactile?

But before we look at legibility.

Let’s escape for a moment to this beautiful beach and experience it in a few different ways.

Relax, watch the waves roll in and out.

Soak up that sunshine for a second.

And if you have vision loss, you might not get to see this in all of its glory, but your hearing would be more sensitive to these beautiful sounds.

[ Ocean Sounds ]

Or if you’re colorblind, you might see the world more like this.

Or if you’re blind, you’re likely even more in-tune to the sound.

And these situations are very real.

In the world, 285 million people are estimated to have moderate or severe vision loss.

And 39 million of those are completely blind.

And one in 12 men are colorblind.

And best estimates are around 360 million people in the world have disabling hearing loss.

Not only can hearing and vision loss limit people from distinguishing information, but they will also be seeing and hearing your brand in different ways than you expected.

For example, someone with low vision may see their phones like this.

And if you don’t have low vision yourself, you can simulate it by using a small blur in your design software.

Now, people with low vision will often learn the location of icons as long as they don’t move.

But content areas are different.

So, they need to maintain high visual contrast to increase legibility.

So, to get started, you should ask yourself: is your app legible?

Because maximizing legibility has the highest benefit for comprehending information, since it’s less work for us to interpret.

For example, when key information is low in contrast because of small fonts, thin fonts, or low color contrast ratios, it may be painful to read, and you experience cognitive strain.

So, a good place to start to correct low contrast is to use the default Body text style for your main content and a high color contrast with your background.

And then you can use bolder font weights to emphasize text or titles, rather than thin fonts.

And you can use filled in button shapes so that the text contrast is high, and the way forward in your app is really clear.

And if you don’t use text styles or system fonts in your app, it would be ideal to support the accessibility Bold Text setting, which requires you to increase your font weights by two weights when the setting is on.

Now, this is built in with system fonts and text styles.
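For a custom font, one hedged approach is to check the Bold Text setting yourself and swap in a heavier weight. This is a sketch only: the font names below are placeholders for your app’s actual typeface.

```swift
import UIKit

// System fonts obtained via text styles respond to Bold Text automatically.
// A custom font can honor the setting manually, as sketched here.
func brandFont(ofSize size: CGFloat) -> UIFont {
    // Placeholder font names; substitute your app's typeface.
    let name = UIAccessibility.isBoldTextEnabled ? "Avenir-Heavy" : "Avenir-Medium"
    return UIFont(name: name, size: size) ?? .systemFont(ofSize: size)
}
```

To pick up changes while your app is running, you can also observe `UIAccessibility.boldTextStatusDidChangeNotification` and re-apply your fonts.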

The new App Store design does an excellent job at maximizing legibility through its bold typography, prominent buy buttons, large fonts, and high-contrast colors.

Great job guys.

On macOS, there are increased contrast settings as well. When they’re on, all active controls become higher contrast: backgrounds become opaque, button outlines become thicker and darker, and system colors darken to create high color contrast ratios.

So now that you know what to look for, let’s look at the before and after again.



And even the ruled lines and the color values of gray text become darker.

You will need to supply these same alternate designs, if you do not use standard AppKit because they will not come for free in custom UI, but they’re extremely important for maximizing legibility.

Now, the next suggestion to increase legibility may be the most important thing to take away today.

And that’s to support Dynamic Type, or the text size setting, because this feature increases legibility by allowing each person to personalize their experience by increasing the font size.

Or maybe you need a farther viewing distance, with the device mounted on a wheelchair or propped on a table.

Because when our fonts don’t scale, people with low vision need to use screen zoom windows to read the content.

That’s like having to turn on a light, or use a magnifier, in a dimly lit restaurant to read the menu.

Except this is all day, every day.

And some people don’t bother with zoom windows at all.

They painfully read the small type or use their phones for short periods of time.

And that’s not a great experience.

So, if you adopt Dynamic Type by using text styles and use the new standard spacing constraints, we can offer a larger reading experience without needing zoom windows.

And new this year in iOS 11 is dynamic type support for custom fonts.
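A short sketch of that custom-font support, using the iOS 11 UIFontMetrics API to scale a stand-in brand font relative to the Body text style (the font name is a placeholder, not from the session):

```swift
import UIKit

let label = UILabel()
label.numberOfLines = 0  // wrap lines rather than truncate

// "Avenir-Book" stands in for your app's custom font.
if let base = UIFont(name: "Avenir-Book", size: 17) {
    // Scale the custom font the same way the Body text style scales.
    label.font = UIFontMetrics(forTextStyle: .body).scaledFont(for: base)
}

// Re-scale automatically whenever the user changes their text size setting.
label.adjustsFontForContentSizeCategory = true
```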

Additionally, if you access the Larger Text setting via General settings, there’s a switch that enables even larger accessibility text sizes.

And this year, in iOS 11, all text styles will support these larger settings.

So, this means that we need to design responsive layouts to be able to handle these large font sizes.

To do this for our own platform apps this year, a team of us took a step back and developed a few simple principles that helped guide us through a system-wide audit.

The first principle was to make as much text dynamic as possible and to support these accessibility text sizes.

Next is make good use of the screen width by wrapping lines of type, rather than truncating.

And then, displaying the same amount of text at the larger sizes as with the default UI.

And lastly, scaling glyphs that appear next to text in content areas so that the glyphs scale with the text in harmony.

And when we look at all of these together in practice it really gets magical.

The mail screen now grows the whole mail message to a zoom level of 315% from the default size.

And in custom layouts, like Calendar, this year we made even more content scale, like the month grid increasing in size, where it didn’t support Dynamic Type at all before.

And at the larger accessibility sizes, we re-laid out the event row similar to the watch layout.

Because in the end, the character count per line was about the same.

And it allowed maximum screen width for text.
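A hedged sketch of that kind of re-layout: switching a row’s stack from horizontal to vertical once the user’s setting reaches the accessibility sizes. The class and property names are illustrative, not Calendar’s actual code.

```swift
import UIKit

class EventRowView: UIView {
    // Illustrative: a stack holding a time label and a title label.
    let stack = UIStackView()

    override func traitCollectionDidChange(_ previous: UITraitCollection?) {
        super.traitCollectionDidChange(previous)
        // At the accessibility text sizes, stack vertically so each
        // line of text gets the maximum screen width.
        let isLarge = traitCollection.preferredContentSizeCategory.isAccessibilityCategory
        stack.axis = isLarge ? .vertical : .horizontal
        stack.alignment = isLarge ? .leading : .center
    }
}
```

The `isAccessibilityCategory` check is the iOS 11 way to detect the five extra-large sizes behind the Larger Accessibility Sizes switch.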

So, now if we turn on the bold text setting and simulate low vision with the blur, and compare it next to the largest currently shipping screen on the left, now people with vision loss will be able to use this app more.


[ Applause ]

We’ve designed these responsive layouts in most native UIKit controls.

For example, action sheets, edit menus, and the keyboard autocorrection bar.

For elements like navigation bars, toolbars, and tab bars that would take up too much space if we scaled them, we added a larger representation of those items in a central HUD that you can get via long press.

And we also added larger popovers for segmented controls.

And we’ve also extended larger text support to system spaces like home screen app labels, and notifications.

Now, there are a few we missed this year, but we have plans for more, and if you support UIKit, you will get these updates for free as we make them.

So, maximizing legibility and providing scalable dynamic type improves visual experience for everyone.

And alongside text that is clear and understandable, the same applies to media and games.

All dialogue should support captions because captions display what is being spoken, which gives more access to people, whether you have hearing loss, in a public setting, or maybe even learning a new language.

It’s like trying to watch a video or a game intro with the volume turned all the way down, or trying to read the lips of someone who isn’t looking at you.

It feels like you only get about half the story.

So, apps like Kitchen Stories have added captions to all of their cooking videos, which is great.
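If you play video with AVFoundation, AVPlayer honors the user’s system caption preference by default; for an in-app captions toggle, you can also select the legible (caption/subtitle) track explicitly, roughly like this (the URL is a placeholder):

```swift
import AVFoundation

// Placeholder URL for your video asset.
let videoURL = URL(string: "https://example.com/video.m3u8")!
let item = AVPlayerItem(url: videoURL)

// Select the first legible (caption/subtitle) option, if one exists.
if let group = item.asset.mediaSelectionGroup(forMediaCharacteristic: .legible),
   let option = group.options.first {
    item.select(option, in: group)
}
let player = AVPlayer(playerItem: item)
```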

OK so along with sight is sound.

You can ask yourself, are your apps audible?

So, let’s listen to a clip of the “Designed for Carlos V.” video again, and listen to the significance of the audible cues of the app, along with the VoiceOver that he’s using to navigate.

Now, VoiceOver is a built-in screen reader that speaks the content on screen for people who may be blind or have low vision.



Double tap to open.

Text field.


Album will be dropping worldwide on April 14, 2017!

Follow our ReverbNation page.

Done. Successfully shared.

So, we can hear VoiceOver navigating the app.

The custom sounds of the apps messaging feature, and dictation to input the message.

Altogether, this creates an experience that is accessible to Carlos.

So, adding thoughtful sounds to app interactions helps you know where you are, what you’ve done, what you can do.

Plus, it’s just delightful.

So, checking out VoiceOver in a system is as easy as asking Siri to turn VoiceOver on and off.

But if you don’t know what VoiceOver’s like, I have a quick demo here.

I’ll be swiping across the screen to advance the selection through some apps.

But then since I know that the phone app is in my dock, I’m going to tap the lower left hand corner of the screen and then double tap to open.

VoiceOver on.

Mail. No unread emails.

Calendar. Monday, June 5.

Photos. Camera.

Dock. Phone.

Phone. Add.


Edit button.

Amy Frost iPhone button.

So, there are tutorials and documentation on Apple’s developer site to help you become even more familiar with VoiceOver and to be able to support it in your apps.

Additionally, if you have image-based apps, we need to allow image descriptions to be added at share time by the poster so that VoiceOver can have something personalized to describe the photo with.

Otherwise, new in VoiceOver this year is image analysis to describe the image contents or composition, or if type is overlaid on images.

But a handcrafted description will always be preferred.

Twitter has done a great job at adding image description options to their Tweet sheets.
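In code, a poster-supplied description can simply become the image’s accessibility label. A minimal sketch, with an assumed fallback string:

```swift
import UIKit

let photoView = UIImageView()
photoView.isAccessibilityElement = true
photoView.accessibilityTraits = .image

// Prefer the description the poster wrote at share time;
// fall back to a generic label rather than leaving the image silent.
func applyPhotoDescription(_ description: String?) {
    photoView.accessibilityLabel = description ?? "Photo"
}
```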

So, another area that reduces hurdles and barriers for apps is integrating with Siri.

Siri is the ultimate quick access to your app: it doesn’t even require looking at, holding, or touching the screen, with its voice activation and speech input.

It’s hands-free and eyes free.

I can control my home, send messages, and even start workouts quickly with apps like Zova.

And also, new this year is being able to type to Siri, which is available in Accessibility settings.

This gives quick access to someone who may not be able to speak.

Or maybe you’re just in a public setting.

So, lastly, along with sight and sound, is touch.

You can ask yourself, are your apps tactile?

On supported devices adding haptics provides a way to physically engage people through feel.

It gets your attention if sight and sound are not an option.

But also, provides an enhancement for everyone.

For example, some system-provided interface elements, such as switches, sliders, pickers, and index scrubbers, provide haptic feedback when we interact with them to reinforce the action that we’re doing.

And a haptic on pull-to-refresh lets you know that you’ve reached the critical threshold between scrolling and refreshing, which share a continuous gesture.

And likewise with swipe-to-delete.

The haptic marks the critical threshold of the accelerated delete swipe.

And then another example is Siri.

It uses a success haptic as a confirmation that she heard what you said.

Or Apple Pay uses visuals, haptics, and sound to confirm a successful purchase.

It feels the way it looks and sounds, leaving me with a confident purchase or payment [ding].

Now, that’s clear.
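These system haptics are available to your own apps through the UIFeedbackGenerator classes (iOS 10 and later); a small sketch of the patterns just described, with the trigger points as assumptions:

```swift
import UIKit

// A notification generator mirrors the success/failure haptics
// used by confirmations like Apple Pay.
let notifier = UINotificationFeedbackGenerator()
notifier.prepare()                       // warm up the Taptic Engine early
notifier.notificationOccurred(.success)  // e.g. after a confirmed purchase

// An impact generator suits threshold moments,
// like crossing the pull-to-refresh point.
let impact = UIImpactFeedbackGenerator(style: .medium)
impact.impactOccurred()
```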

So, to see if your apps are perceivable, ask yourself is it legible?

Can I read it?

Is it audible?

Is it usable without sight?

Is it tactile?

Can I feel it?

Take one, or even two, of these away: can I still understand it?

And so, now my colleague AJ is going to talk about what they did in the Maps Team this year, applying some of these methods.

[ Applause ]

Thank you, Carolyn.

Good morning, everyone.

In Maps, we collaborated closely with Carolyn and the platform team.

And this year we put into practice a lot of what she’s covered today.

Now, before we get into a few of those examples, and think about how to design with integrity, we’re going to take a short diversion.

It’s going to add a couple of minutes to our ETA.

We’ve used maps for thousands of years to plan, direct, to travel and to help us explore the world.

And maps of the past, like this one, beautiful as they remain, were not accessible at all.

You couldn’t zoom on this map without a magnifying glass, and you couldn’t orient yourself upon it without a compass.

In fact, the only manipulation you could do was fold it.

So, as our understanding of the world has improved, so have our maps.

Today, your map understands where you are.

It knows the direction you’re facing.

It knows the device you’re using, and it’s equipped with your individual preferences.

It’s updated every day with new information.

And you can search it.

You can mark it up and make it your own.

It can take you places in real time, avoiding traffic, hopefully.

And in iOS 11, you can even see inside venues.

The point is, today our mediums are so versatile, they provide us with an incredible opportunity.

So, when I joined Maps, I remember thinking a lot about how crazy it felt to see inside, like looking through a keyhole.

And stepping through this icon, I could not have imagined the operation I saw behind it.

I took it almost completely for granted.

And everyone who uses our apps is the same way.

They don’t care how hard it was to build.

It’s not important that it works for Johnny Appleseed or Jane Doe.

What’s important is that it works for me.

Can I use it?

Imagine going to the movies.

You bought your ticket and you’re excited to watch the film.

The theater’s packed.

You get to your seat and there’s a pillar in front of it, completely blocking your view.

I’m pretty sure I’m not going to come back to that cinema.

And I’m no longer just a regular Maps user, not that there’s any such thing.

We’re trying to build a map of the entire world and it’s a big and diverse place.

Like you, I’m on a team responsible for everyone’s experience.

So, coming back to simplicity and perceivability.

Maps are not exact replicas of environments.

Sometimes photographic detail is useful.

But most people don’t find the satellite view practical in everyday life.

We use simplified abstracted representations, designed to be easy to understand and faster for us to process to help us orient ourselves more quickly.

And it’s why we provide a transit map too.

Where we suppress some elements, like places or minor roads, which aren’t as relevant to you once you’re using transit.

And we do all this for simplicity.

And we make sure our maps are understandable.

Visually, audibly, and through haptic feedback.

We put a lot of care into keeping the navigation screen as simple as possible.

The most important information available at a glance.

So, you can understand quickly and stay focused on the road.

And while you’re walking or taking transit, audible directions spoken clearly.

Starting route to San Jose McEnery Convention Center.

Imagine if Siri mumbled these instructions incomprehensibly.

Because isn’t that kind of the equivalent of small, blurry information?

And on your wrist, while your phone is in your pocket, different tactile sensations inform which way to turn and when.

Altogether for perceivability.

It’s legible, audible, and tactile.

And you can use any one of these distinctly.

Integrity is something different conceptually.

We do things for simplicity and to improve perceivability.

But we do things with integrity.

The state of being whole and undivided.

When we say someone has integrity, we think of them as being honest with an admirable moral compass.

So, when we consider that the people who need our services the most might be the ones who find them the hardest to use, how do we design with integrity?

Well, as designers and developers, we take responsibility.

When architects build spaces today, they make sure they’re accessible for everyone because it’s a human right.

And when designers draw beautiful cars, they still have to adhere to regulations created for human safety.

So, when we design digital products, we have to hold ourselves accountable for their accessibility too.

Because when we don’t, we’re deciding who’s important.

Last year for iOS 10, we redesigned Maps from the ground up.

Making it clearer, simpler, easy to use, and faster.

And it was a big undertaking.

We opened the door on accessibility, but we didn’t go that far along the corridor.

It was a busy year.

But in iOS 11 we did take responsibility, and we made it a priority, leveraging the new UIKit frameworks and following the principles that Carolyn outlined today.

Making as much text dynamic as possible, displaying the same amount of text as the default UI, and using as much of the screen width for text as possible.

Search is a core component of Maps and now it’s easier for everyone to use.

But it’s not always just as simple as increasing the font size.

And you might be thinking, hang on, I can see some truncation in these screenshots.

This breaks the principles we’re talking about, and you’re right.

But sometimes we have to make choices based on context.

Like, on the right screen here the top of the card has a fixed header.

So, if we didn’t truncate, the route options beneath would not be scrollable.

Now, this is the card for a place.

They’re used in different contexts across iOS, including Spotlight, Safari, and Siri.

And from here you start navigation, make a call to check on your reservation.

Or even have a look at what’s inside.

Things everyone might like to do.

So, let’s take a closer look at some of the components of this design, like the title block.

We use this with search results too and last year we truncated the title for long place names instead of wrapping.

But with a big of UIKit magic, this year, now it does this.

Which is actually pretty cool.

And the component is more effective and robust because of it.

And we made some changes to the place card this year to improve hierarchy including this secondary row of actions.

And we need to accommodate a maximum of five options here.

Like these.

So, on an iPhone 7, in English, hang on a sec.

That was such a cheap joke.

On iPhone 7, in English this works and we can fit five in a row.

But how do we handle it with dynamic type?

Because we can’t just increase the font size.

But we could stack the UI elements smartly.

And all the way up the scale.

Internationalization is a form of accessibility too.

And it brings its own challenges.

So, how about these labels in Italian.

OK, they’re a bit longer.

So, if we go back down to default size, oh, but the label is truncating.

And the label is important.

We should try not to truncate it at any size.

But we could apply a similar technique like this.

And now, they remain clear and uncompromised.

Adaptive design based on a user’s language and UI settings.

And now it works better across different device sizes too.

And for these actions at the bottom of the card, as well as all of the content, we reflowed the labels and scaled the icons.

Using vector assets, so there’s no pixilation.

And of course, we made all the labels VoiceOver enabled.

And in Maps, we hung out with Siri as well.

So, there are more complex challenges than the ones I just illustrated.

And it’s easier not to take responsibility.

But challenges are fun and rewarding.

And that’s why I ended up standing here today.

Designing for everyone is a process that can bring a deeper understanding of your product and its purpose.

One of the few effective keys to the design problem is the ability of the designer to recognize as many of the constraints as possible, and their willingness and enthusiasm for working within these constraints.

Because out of constraints come great solutions.

And if we’re designing just for default settings, we’re not doing it right.

By trying to do better than that, we make ourselves better.

And we really learned an awful lot this year.

So, secondly after we decide to take responsibility, we need to think about empathy.

The ability to understand people.

To recognize their wants, desires, and feelings.

People who are not you, because nobody else is you.

So, let’s take a look at a few made-up examples that might be closer to home.


Newspapers have never been accessible.

There’s something nice about sitting down with a newspaper, but not if you can’t read it.

If your vision isn’t perfect, reading one of these has always been a challenge.

And a lot of you create platforms for content that’s consumed the world over.

Platforms with the capabilities to improve the readability of our content, relatively and automatically.

And platforms, which can be navigated and understood through sound for those who can’t read visually.

And we read a lot on our iPhones or iPads.

Sometimes it’s good news, other times it’s bad news.

Or, depending on where you look, it could be fake news, like these revelations about Lorem ipsum.

So, here’s the same article with some visual considerations for dynamic type.

The headline is moved outside the hero image for better contrast and for practicality.

And the content is readable at my preferred size.

And it still looks great.

But what if you couldn’t read a price tag?

What would the experience of buying things be like?

You can’t tap on a price tag to hear an audio cue, but on an iPhone, iPad, or Mac you can do exactly that and more.

So, here we are shopping for fresh, delicious, emoji-fied fruit.

And there’s a two-column layout at default settings.

So, if we turn up dynamic types a bit, we could reflow the content into one linear list.

Similar to how you would on a responsive website between desktop and mobile breakpoints.

Giving enough space to elements to make them simpler and easier to parse.

And here’s our product page for this pineapple.

Adapting your layout appropriately makes sure the most important actions are available immediately.

So, again if we bump up the dynamic type that’s working pretty well.

But, let’s turn it up to the maximum.

Nice, but is there more we can do here?

If we scroll the card, we can see the most desired actions are a little bit far away.

So, maybe we’d consider swapping them in this context.

Yes. And that’s definitely better.

Well, what about music, podcasts, stories, sound?

Audio content is great because it plays over the air, or your headphones.

You don’t need a UI to be entertained by it.

But you do need a UI to be able to discover it.

And in the new Podcasts and Music apps on iOS 11, the content adapts to your preferences and can be used with VoiceOver, so everyone can access it.

Now, finally potential energy.

Unlocking potential.

If we make more things possible for more types of people, more things can happen.

Awesome things like iconic albums or pioneering science.

Or miraculous things like remembering your friends’ birthdays.

Simple things like sending a gift or a message to a loved one.

Can anyone read that?

We’re reliant on language and our remarkable use of it is perhaps the primary reason for our success as a species.

Be it through colors, text, sound, or signs.

And what about now?

German speakers in the room are one step ahead of us.

If we don’t communicate in everyone’s language, how can we help them accomplish things?

Without understanding, we can’t create any value at all.

I read an amazing article earlier this week called, “The Thoughts of a Spiderweb,” by Joshua Sokol.

And in it there was an example about how bees use ultraviolet vision to find flowers that have also evolved ultraviolet markings.

And beside it a quote which said, “if you don’t have those receptors that part of the world simply doesn’t exist.”

Design that includes everyone is not good design, it’s great design.

So, design with integrity.

Take responsibility, caring about everyone’s experience, not just your own.

And be empathetic considering the perspectives of others who interact with the world in different ways than you do.

And understand that by doing so we can help unlock potential, in tiny and huge ways.

[ Applause ]

So, together we can strive for simplicity and perceivability in our apps.

And do it all with integrity for making good apps great.

I encourage you to start now.

Go home and audit your apps.

Start conversations with your team and people who use your apps.

And most importantly schedule time to see it all the way into customers’ hands.

And lastly do it proudly.

We can give everyone the chance to be surprised, delighted, feel good, and empowered.

We, especially as designers, would always like things to line up just right, but you can’t control every line break.

You can’t control how someone will need to personalize their experience because everyone is different.

And let’s celebrate that.

We’re really proud of what we’ve achieved in Maps this year, but there’s more to do and we’re going to keep going, because a map is never finished.

And everyone wants to find their way.

So, we’re designing for everyone and we really hope you’ll join us on this mission.

[ Applause ]

Now, there’s a lot of great features that we couldn’t cover today, so please check out Apple’s accessibility website for many other great ways to make your apps accessible for everyone.

And you can watch this talk again at this URL at the Apple developer site, or on the WWDC app.

And later today, in this same room, check out the “Design Across Platform” session for details on how your app is best presented on each platform. And for implementation details on all the new iOS 11 Dynamic Type APIs we spoke about today, please check out “Building Apps with Dynamic Type” tomorrow afternoon.

Otherwise, please check out these other great talks on the WWDC app.

Thank you.

[ Applause ]
