Game Design and Development for iPhone OS, Part 1 

Session 401 WWDC 2010

iOS 4 delivers an incredible collection of graphics, media, and connected technologies for developing cutting-edge mobile games. Learn how to create the most compelling and innovative games possible. Understand the tools for creating game models, audio assets, and gorgeous artwork, and learn practical techniques behind game design and production. This two-part session is essential for developers wishing to create rich gameplay experiences for iPhone, iPad, and iPod touch.

Graeme Devine: Welcome to game design and development for the iPhone OS.

My name is Graeme Devine.

I'll be talking to you a lot today about game design on the iPhone.

And hopefully a lot of you in the audience are game designers.

So iPhone OS, it’s an incredible platform for game development.

One of the things that’s changed in the last year is that now you’re not only designing games for the iPhone but you’re designing games really for three platforms.

The iPod Touch, the iPhone, and the iPad.

And each of these is incredibly different. Just a year ago, you had a single platform out there.

We’re going to talk a lot today about the technologies that you use to build games and how we actually apply those technologies to make a demo game that we’re going to show you in just a minute.

So you guys have been pretty busy making games.

50,000 games or more on the App Store.

I think that is absolutely incredible, and thank you so much for making games so popular on such a great platform.

I think give yourselves a round of applause.

[ Applause ]

Graeme Devine: So yesterday we introduced this brand new name, iOS 4, and I may still say iPhone OS, because I only heard the new name yesterday.

But we introduced some fantastic new frameworks in iOS 4 that can be used for you to make games.

Not the least of which is Game Kit, which allows you to add achievements, leaderboards, and, best of all, the ability to connect two devices together anywhere in the world and play with your friends via the network.

We'll be talking a lot today about our thoughts on how to put good achievements into games, achievements that drive gameplay, and a little bit about leaderboard setup and how to get good networking.

So why are we really here talking to you about game design? Why does Apple's point of view on games really matter?

Well, our platform is absolutely unique.

The fact that we have taken away the controller.

You used to sit for 30 years with a controller in your hand that connected to a console that then connected into a television set.

And for many, many years, that’s the way we designed games.

And slowly the controller got more and more complicated.

We went from up, down, left, right, to D-pads plus an analog stick, to D-pads plus two analog sticks, then to D-pads plus analog sticks plus a myriad of buttons.

Last game I worked on I think had twenty inputs on the controller alone.

Completely changed with the iPhone.

That controller is gone and the abstraction that a controller is in between you and the player’s world is completely gone as well.

We want to talk a lot today about how to take that controller out of games so that you are touching the world directly and using the touch screen to be able to interact with the world in front of you.

So this is actually two sessions.

We have a Part One and Part Two.

Part One is really more about design, making games on the platform.

Why 30 frames a second is so important for every single game, even if your game is tic-tac-toe, your game needs to be running at 30 hertz, and we’re going to be talking about that.

We're going to talk about user interface and how we made the user interface in Quest, and how that UI works on top of OpenGL.

We're going to talk a lot about managing the asset pipeline.

As our games get more complicated and full of more graphics and more sounds and more models, our dependency on a single programmer to make builds is becoming a critical point in game development.

And we need to find ways to get around that.

We’re going to talk about some of the things that we did that you can also apply to your games as well.

Finally, this first session, we’re going talk a lot about controls and how to get rid of that joystick.

Part Two, we’re actually going to take a technical deep dive into Quest.

We're going to take a look at the shaders, as to how we did the per-pixel lighting, and we're going to walk down our graphics pipeline.

We're also going to check in with our art department, Pete, who's a single artist, and we're going to talk about the asset generation that he went through in order to be able to make Quest.

And how that itself also made a tremendous difference in getting towards 30 frames a second.

Finally, we're going to talk about the one dozen things that we learned that we think will make a tremendous difference to all of your games.

And those are the one dozen lessons that I want you to take home as homework tonight before coming to our labs tomorrow and talking about how you've applied them in your source code.

So you’ve got some homework.

Okay, Quest.

I don’t know how many you saw it last night, but we’re going to show it again in just a minute.

To reiterate, it was two months' worth of work by three programmers and one artist who are incredibly passionate about making a high-performance application for the iPod touch, the iPad, and the iPhone.

Making a game on all three platforms run at absolutely the fastest frame rate we can make it go, but at the same time making something delicious-looking.

Quest also uses the best of Apple technologies.

It uses OpenGL ES to provide the per-pixel-lit background, and AVFoundation to provide a rich surround-sound environment, because a lot of people out there wear headphones.

Game Kit on the phone to be able to connect to people across the world.

UIKit in order to be able to have that beautiful interface.

Simple things like HTTP.

That wasn’t simple ten years ago, let me tell you.

The simple things like HTTP that enable us to get assets easily and quickly onto the phone.

Then finally, interface elements like the touch interface and the accelerometer, and playing around with easy ways to change our interfaces in games so that they are actually very compelling.

So let me bring Jeff Rutner up to the stage.

We’re going to give a demo of Quest running on iPad.

Yesterday we showed it to you on the iPhone 4, today we’re showing it to you on the iPad.

So one of the first things you’ll notice about Quest is the very first screen tells a story.

From the title, I kind of know what I’m going to be doing.

It says Quest.

Kind of a big clue there.

From the environment below me, it’s kind of a dungeon, I can already see it’s got grates in front of me.

I can kind of tell, wow, I’m going to be in a dungeon.

And there’s a guy with a big sword, so you can immediately guess the weapon you’re going to be using.

So that’s all UIKit, used to generate that Quest logo, all the buttons, everything on the screen except for the OpenGL ES 2.0 background behind you.

And it’s absolutely running at an incredible frame rate.

So let’s take a look at the achievement list.

And one thing to notice about the achievement list is we brought it up using Core Animation.

Core Animation made that flip to just bring it up.

Incredibly easy in order to be able to do transitions using UIKit over implementing your own UI in OpenGL.

So achievements are something we'll be talking about later today, but I haven't quite run through and got the Demon Slayer achievement.

I seem to remember I won it last night.

Gone again.

So let’s jump into the game.

You can see that the game is full lighting per pixel in the environment and in the world.

This is pushing around 40,000 to 50,000 triangles per frame, and millions of triangles per second, on all of our platforms.

As the character who is skinned on the GPU runs around, the GPU is doing all the work to animate him.

The world around the character is lighting him, the lights around the character are affecting the character's lighting, and as he walks into shadows, he gets shadowed.

If I press on top of the character the context menu comes up.

And I can interact with the character.

I can call in a little pet to help me.

One of the things we did was in order to be able to easily interact with the world, I want to be able to touch the world anywhere I want to touch.

So the control for Quest is I just touch where I want to go, and that’s where I go.

If I want to interact with something, I touch on it.

So in order to be able to interact with my character I just touch on him.

And then the context menu has changed.

The context menu is now relevant because my pet is there, and I can go tell my pet to go take my loot back to town or dismiss it.

So let’s walk through and take a look at the final encounter.

You can see as we run around the world itself tells the story.

Little bit of a demon, kind of hungry.

The world itself tells a story; there are little bits to go and explore, all sorts of exploring, all sorts of deep chasms with red lights down below.

I love that in games.

Whenever I see that I’m like how do I get down to that red light.

It just invites you into the game environment.

I want to be able to see more.

So to interact with this demon, it's the same as interacting with my character. He isn't very patient, is he?

I go and attack him, use my large sword, pretty obvious, win my demon slayer achievement, and slay him.

Now that interaction was all done by interacting by pressing on top of the character.

Now before we finish them, I want to show you one more piece of UIKit that I think is just fabulous.

By pressing on top of the character’s head we bring up the user interface for the actual character himself.

Now this is all still UIKit.

And what we did with UIKit was we skinned it and we’re going to be talking about how we did this in this talk this morning.

We skinned it so that it looks just like the game environment itself.

One of the things we find absolutely compelling about writing Quest was making the environment, making all the UI look like the game.

But one of the real benefits about using UIKit was it gives the game the same user interface as every other application on that player’s phone.

They're already used to, you know, pressing buttons and how they press; they have expectations about how things dismiss and how things come up.

So by using UIKit we give them that same experience automatically.

And it works perfectly on top of OpenGL ES 2.0 backgrounds.

I should also point out that all of this, all of what you’re seeing is just using the same SDK that you have already.

We didn’t do anything special.

All we did was start at 4 frames a second and go all the way up to 40.

Thank you, Jeff.

[ Applause ]

Now one of the things that’s really hard for you guys now, well harder, more of a challenge, is that you have to make games for more than one platform.

You have to make games for the iPod touch; the iPod touch is a challenge because people have their headphones on all the time, and the audio in that game is kind of important.

You have to make games for the iPad.

Well you hold your iPad completely differently than you hold either of the other two devices, and the interface of that is completely different and needs to be completely different.

The iPhone, people use this to make phone calls.

And people can interrupt the game play by calling you.

It’s really annoying.

Quest runs just great on the iPod Touch.

These are all screen shots from the actual game.

Runs great on the iPad.

Runs fantastic on the new iPhone 4.

Those are all shots from in-game.

Actually about a month ago.

So how did we start?

Obviously, two months ago we didn’t have a game running.

So this is an early piece of concept art for Quest.

Two months ago we started with a simple concept.

We are huge nerds and fans of Diablo.

We can't deny that it influenced us a little bit.

So we wanted to make a role-playing game that was set in a dungeon environment.

Now one of the things that we wanted to do was make a rich environment in a real-time 3D world that we could play with.

And we wanted every screen in the world to tell a story.

So even from the concept art, there’s that little balcony back there behind me.

What's that for? What's on there? Maybe there's special loot there.

Even if all I did on this first screen was run from you know, left to right across that bridge, the fact that there’s other stuff there really drags the user in.

One of the things that’s really different about our platform now is that I am holding the world in my hands, it is right in front of my face.

So the more you can do to invest people in that world, to make them wonder about it so that when they aren’t in your game they’re like ah, I bet I can get down to that red shadow area.

The more that they’ll come back and play your game.

And this is one of the key things that you want when you're thinking about game design on the phone: how to get the player to play your game twice.

So touching the world is something that we also thought a lot about.

We wanted our world to be extremely interactive.

To be able to touch the screen.

The major input device onto all of our devices is that touch screen.

We had to abstract away that joystick and throw it away so that I can just press the world in front of me.

Supremely important.

You might notice from the UI overlay that we actually stuck fairly closely to the initial design of the UI.

The mini map's up there, and this is of course the iPad in terms of its aspect ratio.

But if you look at that initial touch design on the iPad we actually thought we were going to do something a little bit different early on.

We thought because you hold the iPad in two hands, I’m going to be holding the iPad in two hands, I can use my two thumbs for abilities, and I can use the accelerometer to be able to move my guy around.

You know, try to make gravity roll him over so that, you know, he has to go downhill.

In the end, we stuck with touch [Inaudible] touching the world.

And that worked fantastically, but one of the things I want to encourage you to do is always think about different ways of input.

That accelerometer is fantastic.

And what works on one device may not work on another.

What works on the iPad there may not work on the iPod or the accelerometer may work better in some cases on the iPod because I don’t want to touch my main guy.

One of our other early decisions that really made a difference to the game development was data-driving our content.

You saw there briefly the console that we can bring up, and we're going to be talking a lot about how that data-driven content and those variables made game design iteration very fast.

We realized with two months that, A, they weren't going to move the date of WWDC, although I guess [Inaudible] perhaps, and, B, that we needed a way to iterate very, very quickly on game design and game ideas; we needed to be able to change things fast.

So we’re going to be talking a lot today about how to data drive.

Also early on, we thought an awful lot about sound. AVFoundation.

AVFoundation is fantastic.

You can start music with one line of code.

OpenAL. We realized a lot of people play games with their head phones on.

I don’t know how many of you have, like, teenage daughters that just have these white things in their ears all the time, but I certainly do.

And they play games constantly with headphones on.

So although the device itself has a single speaker off to the side, a lot of people play games with headphones on and actually hear the 3D world.

And you have to remember they’re holding the world in their hands.

So making the audio work in their ears just adds to the environment.

Making the world seem real, extremely important, you have to do a good job.

In order to get a real 3D sound from it, you have to use OpenAL.
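As a rough sketch of what positioning a sound in OpenAL looks like (assuming an OpenAL device and context are already set up, and that `sourceID` and `bufferID` are hypothetical handles generated with `alGenSources`/`alGenBuffers`):

```objc
// Position a mono sound in 3D space around the listener.
ALfloat listenerPos[] = { 0.0f, 0.0f,  0.0f };  // player at the origin
ALfloat sourcePos[]   = { 2.0f, 0.0f, -1.0f };  // to the right, slightly ahead

alListenerfv(AL_POSITION, listenerPos);
alSourcefv(sourceID, AL_POSITION, sourcePos);   // where the sound lives in the world
alSourcei(sourceID, AL_BUFFER, bufferID);       // attach the loaded sound data
alSourcePlay(sourceID);                          // OpenAL spatializes it for the headphones
```

Stereo buffers are not spatialized by OpenAL, so positional effects like this use mono source data.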

Game center integration on the iPhone is also something that you need to think about from day one.

I love leaderboards.

I love leaderboards because I can see how I’m doing against my friends.

What typically happens with me, at any rate, is I would be working on a game and I’m in the top ten of the world, which is awesome.

And then a game will come out, and I’m in the top 100 on day one.

And then certainly with Halo Wars, I was in the top 100,000 on day two.

And then it fell off quickly.

So to be able to compare myself to my friends, though, hey, I’m still number three.

Comparison against friends is just fantastic.

Achievements are also something that really drive game play.

And thinking of different achievements is a real game design challenge.

It’s okay to have an achievement that says hey, I slayed ten demons.

That’s a pretty obvious one.

The harder achievements to think of are ones like, for instance in Quest: I'm going to go around this high level and complete it without killing a single monster.

The challenge is to make up achievements that drive game play in different ways.

Your users will go through the achievement list and think ah, that’s kind of impressive.

I’ll do that right now.

And they'll go play your game because you've offered them some different way to play it, driven purely by an achievement.

And of course networking.

Love networking.

I love collaborative play, I love playing with four other people, going in killing bad guys.

Networking head-to-head, I hear that works great too.

So early on in Quest we also thought a lot about the initial character design and bringing the character up on the screen.

Our character is very cartoony.

There’s a very good reason for that.

Cartoony characters can have huge animations that are overreaching, I can go do this, which I can never do in real life.

And you can exaggerate the animation.

So when he’s running, when he’s grabbing, when he’s jumping, when he’s being heroic, when he’s dying even he can be over animated and very obvious.

When he’s in that three-quarter view, making things very obvious.

Very important.

Because he's a cartoony character, too, you can also give him a big sword.

Big swords don't really work when you go for photorealism, but the big sword works for three reasons.

I have a big sword, it’s very obvious I have it.

I don't have to go to an inventory screen to see what kind of sword I have; I have the big sword in my hand.

It’s very obvious when I use it.

I’m going to go strike with my big sword.

If you had a little sharp knife, not so obvious.

And three, it’s fun.

So for two months we really did not iterate much on our character.

We kind of stuck to that initial thought, that initial design.

So Sergeant Shock here ended up very close to our initial design.

Our initial level design was also very straight through.

We wanted initial level design that kind of showed how things work, so that I could drive through a single level more or less in one single pass and actually get from A to B going from side to side.

The side to side is kind of important because I want to be able to load part of a level in, then as I move to another part I don’t want to be able to see this first part, because I’m going to junk it out in memory.

I want the most inviting 3D world with the most polygons I can throw on the screen at any particular frame.

And if I have a large square level, it’s very hard to do that sometimes, because you have to go and show pieces of large square levels.

But if I guide you around a large environment that I can stream in pieces of, boy, I can load a lot of assets up.

The grid-like system that you saw on a previous screen also applied to how we actually made the graphics.

Pete's going to talk a lot in the second half about the grid system that we used to set up and make all of our environments.

You can see that the pieces go together.

The doors, the grates, the stairs, they all fit together, and we were able to make up levels very easily using this grid-like system.

Also, our team approach: we only had two months to go, so how did we actually set this up?

We started early on with a game design document that worked in two ways.

Initially, we had a game design document that outlined the world so that the rest of the team could kind of see what the world was like.

A story. It was all about Sergeant Shock and his adventures and all about the world and why he was adventuring and so forth, so that everyone’s on the same page content-wise.

We also wrote up the encounter that you just saw in the Quest demo.

And more or less from start to finish assigned priority to it.

We only have two months to make this thing.

So we assigned priority to everything, and we would meet twice a week and reassign priority.

And that worked exceptionally well.

Because something that affects every single product is this wonderful feature called feature creep.

I’m sure a lot of you had that happen in your games.

And what happens is I start making the world's best tic-tac-toe game, and tic-tac-toe is a wonderful game.

And then I add a fireball to it.

And fireballs scorch the other player's Xs and Os. Then I think I'll make 3D tic-tac-toe with fireballs; no, I'll add a fireball gun; no, I'm just going to take away the tic-tac-toe, and I've ended up with a first-person shooter.

That happens, and you know that happens.

But what happened is I still have tic-tac-toe source code, really.

I haven’t made first-person shooter source code.

And tic-tac-toe source code is not going to work very well; it's not very robust, it's going to be pretty buggy. [Inaudible] No feature creep.

[Inaudible] 30 frames per second.

Why is 30 frames a second so important?

You might be thinking, my tic-tac-toe game, my [Inaudible] with friends game, why do I need 30 frames a second, so fast on the screen, when that's happening?

Tic-tac-toe can go 4 frames a sec, I don’t care.

Well I say two things about that.

A, you’ve written a really bad tic-tac-toe game, and B, you’re wrong.

The wonderful thing about our devices, the very first thing that you all did was go take a photograph and then you played around with the photograph on the other side of the glass.

And it felt real.

You felt like you were playing around with a real photograph on the other side of the glass.

And that's the fantastic thing.

That’s what sold me on the iPhone.

I thought that’s incredible.

And that’s what sold every one of your users on the iPhone.

And if your game doesn't feel real on the other side of the glass when you touch it, what happens is you start to get lag.

You start to get this laggy thing going on.

I’m going to pick up a tile and I’m going to move it around.

And because my game is not running at 30 frames per second my tile kind of lags around.

And what happens is the player starts to counter that, starts to go back to try to get the tile, because there’s something wrong with this tile, it’s not keeping up with me.

Then you go back, [Inaudible] then you’re in the worst possible situation.

The person is trying to counter your game design because you had not hit a frame rate of 30.

30 hertz is incredibly important for absolutely every single game you make.

It’s the new minimum.

And I challenge you all to meet and exceed that.

Of course, when your game's not doing anything, when it's just sitting there and you're wondering what's my next move in tic-tac-toe, don't run at 30 hertz.

Slow all the way down, save that battery.
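One way to do that, as a sketch and not necessarily how Quest itself does it, is to drive the render loop with a CADisplayLink and raise its `frameInterval` when nothing is animating:

```objc
// frameInterval is relative to the 60 Hz display refresh:
// 2 => 30 fps, 6 => 10 fps. drawFrame: is an assumed render method.
CADisplayLink *link = [CADisplayLink displayLinkWithTarget:self
                                                  selector:@selector(drawFrame:)];
link.frameInterval = 2;  // 30 fps while the game is active
[link addToRunLoop:[NSRunLoop currentRunLoop] forMode:NSDefaultRunLoopMode];

// Later, when the player is just thinking about a move:
link.frameInterval = 6;  // drop to 10 fps and save battery
```

The point is that the throttle lives in one place, so jumping between "full speed" and "idle" is a single assignment.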

Using Apple technologies was extremely important in Quest.

And we applied a lot of them to Quest.

Let's talk about that.

UIKit. Everything on the screen, UI, both in the menus and in game is UIKit.

We’ll talk about that in just a minute and how it worked.

Core Animation comes for free with UIKit.

You just do six lines of code and things twirl around.

Your artist will love you.

He can sit and design things, and he actually understands the rotation and the matrix and the math.

And he can sit with you and make cool animations.

It’s much harder to do that if you roll your own UI.

OpenGL ES 2.0 and its ability to run a little program on every single pixel on the screen.

You have no idea how magical that is, to be able to have pixel shaders.

20 years ago, you had to run one program that did an entire screen.

Now you’ve got a program for a pixel.

It’s incredible.

Being able to use HTTP to drive changes to things like my user interface or my models or my textures: with an HTTP server and a little folder on my desktop that I can drop things into, the artist can change the entire look of the game if he wants.

So if he doesn't like the Quest logo he can change it to sci-fi Quest, change it to serenity Quest; he can sit and drop things in, and because we've written a little code that connects things inside the game, it updates automatically for him in the user interface.

The only way he can tell if his stuff looks good is when he sees it on the device, make that easy for your artist.

He’ll love you for it.
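A minimal sketch of that kind of asset hookup; the host name (`artmac.local`), asset name, and `logoView` outlet are made up, and the fetch is synchronous for brevity where a real game would load asynchronously:

```objc
// Re-fetch a texture from a folder served over HTTP on the artist's desktop.
NSURL *url = [NSURL URLWithString:@"http://artmac.local/assets/quest_logo.png"];
NSData *data = [NSData dataWithContentsOfURL:url];  // synchronous: fine for a dev tool
if (data) {
    UIImage *image = [UIImage imageWithData:data];
    logoView.image = image;  // the UI updates with the artist's latest drop
}
```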

AVFoundation, as I said, this is the one line of code to start background music in a game.

If you're not using that, I think there's something wrong.

You’re making life hard for yourself.

Use AVFoundation for a fantastic audio environment around you.
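For reference, here is roughly what that looks like; the play call itself is one line, with a couple of lines of setup around it (`theme.m4a` is a made-up bundle resource):

```objc
// Requires AVFoundation.framework; #import <AVFoundation/AVFoundation.h>
NSURL *musicURL = [[NSBundle mainBundle] URLForResource:@"theme"
                                          withExtension:@"m4a"];
AVAudioPlayer *player = [[AVAudioPlayer alloc] initWithContentsOfURL:musicURL
                                                               error:NULL];
player.numberOfLoops = -1;  // loop forever as background music
[player play];              // the one line that starts the music
```

Keep a strong reference to the player somewhere (an ivar, say), or the music stops when it's deallocated.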

The accelerometer.

As I said earlier, the accelerometer is an input device very different, because I’m holding the world in my hands.

So even if your game is a touch game I’d encourage you to still look at that accelerometer feedback because I’m holding a world in my hands.

So if I tip my world just a little bit, and I take just a little bit of that tip, and I change things and I parallax things just a little bit, that world suddenly becomes even more three-dimensional, even more magical.

Just because I paid attention to something I’m getting for free from the accelerometer input from the user’s hands.

And you can really add a lot to your game by just doing these little simple things.
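A sketch of that kind of tilt parallax using the iOS 4-era UIAccelerometer delegate; `bgView`, `filteredX`, and `filteredY` are assumed ivars, and the constants are purely illustrative:

```objc
// Assumes self was set as [UIAccelerometer sharedAccelerometer].delegate.
- (void)accelerometer:(UIAccelerometer *)accelerometer
        didAccelerate:(UIAcceleration *)acceleration
{
    // Low-pass filter so the background drifts smoothly instead of jittering.
    filteredX = filteredX * 0.9 + acceleration.x * 0.1;
    filteredY = filteredY * 0.9 + acceleration.y * 0.1;

    // Shift a background layer a few points against the tilt for depth.
    bgView.transform = CGAffineTransformMakeTranslation(filteredX * -8.0,
                                                        filteredY *  8.0);
}
```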

Game Center. I'm number one on this one. Adding Game Center from the very beginning, adding leaderboards, will drive people to come back to your games.

If you add the social integration that challenges people, hey, I've beaten you now on my leaderboard, you are now number 2.

Well guess what, I’m going to go in and try to become number 1 again.

I’m going to go and hit play on your game.

Okay, enough of me for a minute.

I’d like to invite Helen to the stage to talk about user interface and how we approach that design in Quest.

Thank you.

[ Applause ]

Helen Ma: Thanks, Graeme.

My name is Helen Ma.

I’m one of the three engineers working on Quest.

I mostly worked on the user interface, which is what I’m going to talk to you about today.

Now, user interface determines the perceived quality of your game in the crucial first 30 seconds.

The 30 seconds in which a typical user decides whether your game is worth playing or not.

Unlike the console or the PC, where the player may have tens or hundreds of titles to choose from, on iOS they have over 50,000 titles to choose from, in their hand, any time they want.

So the incentive for the player to invest a lot of time in figuring out your controls is much, much less.

Nobody needs to play a game to get a job done.

The only reason anybody plays any games is because it’s an enjoyable experience for them.

And user interface can make or break that experience.

As Graeme mentioned, we choose UIKit to implement our user interface.

UIKit, as you know, is the native user interface toolkit for iPhone.

For most users it's what makes an iPhone an iPhone.

And as you saw in the demo, it works beautifully with OpenGL and everything else we ship on the platform.

But before we get into that, I'd like to have a quick talk about the touch interface and games.

This is a screen shot of what the Quest UI might have looked like.

You have your multidirectional pad and you have a button box.

Very straight forward to implement.

But is it the right thing to do?

This [Inaudible] has been with us for the last 30 years, and directional pad even longer than that.

And [Inaudible] served us well there; however, they do not translate particularly well to the mobile platform.

Why? Because nobody wants to carry around your desktop, and your finger is not a mouse button.

New metaphors are now possible with a touch interface.

And we’re still very much in the early days of exploring that.

One particularly useful metaphor that we found is the looking glass metaphor which simply put is the idea that you are interacting with a visual world, your game, with nothing but a piece of glass between you and the game.

This is great for games, because it is fun.

It is unprecedented on any kind of gaming device to be able to directly touch and interact with the game objects.

So in Quest, instead of using a directional pad you simply tap on a screen where you want your character to go and he starts walking there.

And when you lift your finger off the glass he stops.

Really easy to learn.

Games that require you to go through tutorials and figure out the controls are not fun.

So now that you know how to move around in the world, how do you interact with it?

Well, the contextual menu is a very natural extension for that.

So to interact with any object in Quest, you simply hold your finger over the object of your interest and a contextual menu pops up that shows you what's possible.

You slide your finger over to the action that you choose, and it’s done.

Extremely intuitive to use and also very fun.

Now that you've created this wonderful illusion, it's important to maintain it at all times.

There are three things you must remember to do in order to maintain the illusion.

Number one, allow direct manipulation whenever possible.

Try, as much as possible, to make sure the objects in your game are interactable.

Even if it's just something like flicking them over.

Every object that's interactable adds to the realism of your game.

Number two, as Graeme mentioned, 30 frames a second.

Real world does not have lag.

As soon as you introduce lag, the illusion will be broken.

And number three, avoid abstractions.

Abstractions such as button bars and virtual [Inaudible] pads.

There are things that players must learn in order to play your game.

Now you may think that directional pad is pretty straight forward.

But actually it’s not to a lot of people, who are seeing it for the first time.

What's that funny arrow thing doing there? Am I supposed to touch it?

And when you do touch it, does it move the character or does it move the screen?

All these things are not necessary on the touch interface now.

Now I’d like to bring out Jeffrey [Inaudible] for a quick demo of user interface.

[ Applause ]

Helen Ma: Now before we get into the game I’d like to just quickly show you the leaderboard achievement UI.

Now you see the animation there, how many lines of code do you think it would take to do that in OpenGL?

50, 100, I have no idea.

Because UIKit did it for me.

All I had to do was ask for it.

Now let’s get into the game.

Now as you can see everything on the screen is UIKit.

Now let's see the contextual menu in action again.

Some of you may be surprised to learn that that is not a custom control.

That is simply a few UI buttons dressed up.

Extremely easy to do.

It’s all public API.
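A sketch of what "a few UI buttons dressed up" might look like; the image name, action selector, and `contextMenuView` container are made up for illustration:

```objc
// A plain UIButton with custom art and no system chrome.
UIButton *attackButton = [UIButton buttonWithType:UIButtonTypeCustom];
attackButton.frame = CGRectMake(0, 0, 64, 64);
[attackButton setImage:[UIImage imageNamed:@"attack_icon"]
              forState:UIControlStateNormal];
[attackButton addTarget:self
                 action:@selector(attackTapped:)
       forControlEvents:UIControlEventTouchUpInside];
[contextMenuView addSubview:attackButton];  // laid out in a ring around the touch point
```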

All right, let’s take a walk down.

Love how this looks.

Absolutely gorgeous to look at.

[ Background noise ]

Helen Ma: Wow, okay.

He looks pretty hungry.

I think he wants to give me his own experience.

All right, more than happy to oblige.

Let’s go. Oh, before we start never mind.

Can you bring up the character sheet? I just want to buff myself.

Grab the fireball and that green thing I forget the name of.

Yeah. Okay, now let’s do it.

[ Background noise ]

Okay, before we wrap up the demo I just want to quickly show you something.

We’ve said over and over again that this is UIKit.

Now we’re going to show you what it looks like without the skinning, without the extra artwork.

This is what the Quest UI looks like without the skinning.

UIButtons everywhere.

As you can see, with a bit of great artwork you can make UIKit deliver any style you wish for your gaming experience.

In the corner there is a UIScrollView, and I’ll show you exactly how we did the mini map later on in the show.

Thanks, Jeff.

[ Applause ]

Helen Ma: One of the most frequently heard comments when we showed this demo internally was that people couldn’t believe it was UIKit.

Of course you can do all of this in OpenGL, programmers have been doing it for years, and it’s fine.

But on iOS, now you have a choice.

You don’t have to do that.

You can take advantage of UIKit.

And as you saw, it works great.

It will save you countless hours of engineering work, and it makes it easy to take full advantage of Core Animation.

Very few lines of code to do that.

Bottom line for you as programmers is it’s less code for you to write and debug.

Here are the six lines of code it took to do the character information sheet.

As you may recall, when the character info sheet came up it had a spin animation.

So to do that, you first tell UIView that you’re about to start an animation.

You tell it how long you want the animation to be.

Then you set a rotate and scale transformation.

And you set it on the view, and then you’re done.

UIKit does the rest.

Couldn’t be simpler.
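The actual slide isn’t reproduced in the transcript, so here is a sketch of the spin-in animation following the steps described above, using the iOS 4-era UIView animation API. The view name (sheetView) and the duration are assumptions, not the values from the demo.

```objc
// Start the sheet rotated and scaled down, then animate to identity
// so it spins and grows into place. sheetView is a placeholder name.
sheetView.transform = CGAffineTransformScale(CGAffineTransformMakeRotation(M_PI), 0.01, 0.01);
[UIView beginAnimations:@"characterSheet" context:NULL];  // tell UIView an animation is starting
[UIView setAnimationDuration:0.4];                        // how long the animation should be
sheetView.transform = CGAffineTransformIdentity;          // set the rotate-and-scale end state
[UIView commitAnimations];                                // UIKit does the rest
```

Everything between beginAnimations and commitAnimations is animated automatically by Core Animation; no per-frame code is needed.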

Next I’m going to show you the mini map, how we did the mini map.

The mini map is an essential part of any dungeon crawler.

And we’ve got a pretty nice one.

To do that, you start with the mini map, which is a top-down, two-dimensional view of your dungeon.

You put it inside a UIScrollView.

Pretty simple and straightforward to do.

And then you do your translation from your 3D world coordinates into your 2D map coordinates.

Then you have a functional mini map right there, as you saw in the last part of my demo.

However, to make it look the way it does in Quest you just need to do a couple of extra steps.

Number one, put a frame on top.

Number two, drop the whole thing inside a container view with a mask.

Then you’re done.

Very straightforward.

And it works really well.
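The steps above can be sketched roughly like this. The map image, the player object, and the scale and origin values are all hypothetical stand-ins for whatever your game world uses.

```objc
// A 2D top-down map view inside a UIScrollView.
UIImageView *mapView = [[UIImageView alloc] initWithImage:mapImage];
UIScrollView *scroller = [[UIScrollView alloc] initWithFrame:CGRectMake(0, 0, 100, 100)];
scroller.contentSize = mapView.bounds.size;
[scroller addSubview:mapView];

// Translate the player's 3D world position into 2D map coordinates.
// mapScale and dungeonOrigin are assumed values from the game world.
CGPoint mapPoint = CGPointMake((player.x - dungeonOrigin.x) * mapScale,
                               (player.z - dungeonOrigin.z) * mapScale);

// Keep the player centered by scrolling the map under the viewport.
[scroller setContentOffset:mapPoint animated:NO];
```

To finish the look, the whole scroller goes inside a container view with a decorative frame on top and a mask clipping the map to the visible shape.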

Now I’m going to talk to you about something called a programmer’s UI.

It simply means the minimum amount of UI required for the program to keep working.

Say you have a great idea for a game, and you want to prototype it and get it going as fast as you can.

So to move around, you throw some buttons up for that.

Say you want to have some abilities.

More buttons.

Why? Because buttons are easy to do.

And it tends to get really complicated, because there’s not a lot of thought about how everything fits together.

Your focus is on the game itself.

And then what happens is you’re late, you’re under pressure to ship, and you don’t have time later on in the project cycle to rearrange and rethink the UI flow, to reimplement the UI.

So you tidy up the programmer’s interface that you have, put a bit of window dressing on it, and ship it.

That’s not the way to make a great user interface.

You need to think about it from the beginning of your project.

It is such a vital and important part of your project, of your game, that you need to make it an integral part of your development process.

A phrase that you’re going to hear a lot in this talk is rapid iteration.

It simply means the ability to iterate on your user interface, your game design, your gameplay, every aspect of your game.

Because the amount that you do that is directly proportional to the quality of the final product.

In Quest we discovered a couple ways to do that.

Number one, data-drive the UI.

And number two, load your assets over Wi-Fi.

And we’re going to go into detail right now.

So what is data-driven UI?

Typically, when you write your application, you write controller code for each screen that you have in your game.

So for example, you may have a menu controller, [Inaudible], perhaps a settings controller.

Typically, controller code is not reusable.

And when you fix a bug in one of these controllers, you’ve got to propagate the fix to all of them.

What if you were to extract your layout information into a data file?

It does a couple of things for you.

Number one, it encourages easy experimentation with your user interface.

It should not take any code change to move user interface elements around on your screen.

And because it takes no work, you can do it easily, and you will do it more often.

But more importantly, it allows non-programmers to work on your UI.

If you put the screen layout information in a Plist like this, and have your game derive its user interface at runtime from that Plist, you can simply hand the Plist over to non-programmers and have them tweak it.
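A minimal sketch of deriving UI from a Plist at runtime might look like this. The file name and the key names (Buttons, Title, Frame) are assumptions for illustration; the Plist from the talk isn’t reproduced in the transcript.

```objc
// Load the screen layout from a bundled Plist rather than hard-coding it.
NSString *path = [[NSBundle mainBundle] pathForResource:@"MainMenu" ofType:@"plist"];
NSDictionary *layout = [NSDictionary dictionaryWithContentsOfFile:path];

// Build a button for each entry described in the data file.
for (NSDictionary *entry in [layout objectForKey:@"Buttons"]) {
    UIButton *button = [UIButton buttonWithType:UIButtonTypeCustom];
    [button setTitle:[entry objectForKey:@"Title"] forState:UIControlStateNormal];
    button.frame = CGRectFromString([entry objectForKey:@"Frame"]);
    [self.view addSubview:button];
}
```

Because the layout lives in data, a designer can move elements around by editing the Plist, with no code change and no rebuild of the controller logic.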

There’s one more advantage in doing things this way.

Now that we have the iPad and the iPhone 4, we have extra resolutions to think about.

It’s very tempting to just scale up your existing iPhone-resolution game to the new resolution and ship it.

But is that the right thing to do?

Well, users, having paid for the extra screen real estate, expect you to take full advantage of the extra screen size available to you.

The games that do take advantage of the screen resolution tend to do better than the ones that don’t.

The second thing I’m going to talk about is Wi-Fi loading.

Now that your layout information is in a separate data file, what about assets? What about things like the models, the textures, the artwork for the user interface?

What if you were to put those things on a web server and have your game load them at runtime?

Doing things this way will free your artists to experiment.

It’s very easy to set up a web server, and all of our shipping devices come with Wi-Fi.

Your artist and your designers will love you for that.
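A sketch of what Wi-Fi asset loading could look like during development. The server address and asset name are placeholders, and as the talk warns later, this development path should be turned off before you ship.

```objc
// Try to fetch the latest artwork from a development web server on the
// local network. Address and file name are hypothetical.
NSURL *url = [NSURL URLWithString:@"http://192.168.1.10/assets/menu_background.png"];
NSData *data = [NSData dataWithContentsOfURL:url];
if (data) {
    // Artist's newest revision, loaded live over Wi-Fi.
    backgroundView.image = [UIImage imageWithData:data];
} else {
    // Fall back to the asset bundled with the app.
    backgroundView.image = [UIImage imageNamed:@"menu_background.png"];
}
```

The synchronous dataWithContentsOfURL: call is fine for a development-only loader like this; a shipping game would load assets asynchronously or from the bundle.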

Now in closing I’d just like to finish up with three things.

Number one, UIKit is great for user interface, and particularly as you’ve seen in the demo, for games.

It’s extremely flexible, versatile, and it’s great.

So use it.

Number two, rapid iteration is essential for great user interfaces in games, and for other aspects too.

And we’ve shown you a couple ways to do that.

And three, don’t forget to turn those things off before you ship.

Thanks for your time.

Now back to Graeme.

[ Applause ]

Graeme Devine: Okay, let’s talk a little bit more about making Quest.

And in fact, I have a little section called game design nitty-gritty.

There are lessons that we learned making Quest as game designers, beyond the technical lessons, that I’d like to go over to close out the session.

Because game design on our platform is so different from the game design experience on any other platform out there.

Heard a lot about controls, haven’t you?

Abstracting that control away.

We’ve gone over the history, right from the arcade, to sitting in the home, to now holding the world in your hands.

It is time to move on from abstracting a controller interface onto the touch screen.

The touch screen is the controller.

When users touch the world they feel so much more in control of the world.

When they touch the things they’re interacting with, it really makes a difference as to how real that world feels in front of them.

I really encourage you to iterate through your design and work as much as possible on making a design that interacts with the world in front of you and not add a layer of abstraction.

Now layers of abstraction like key pads, we see those in a lot of games.

And sometimes that works great, and sometimes it doesn’t.

I’m not saying, you know, go and take them out... well, I guess I am.

But please think about iterating more.

And using the touch screen itself instead of putting a dual key pad on the screen.

Now I have a little rant here about play testing.

Now when we play test, we play test Quest every single day.

Something you all need to be doing is play testing your game from about Day Number 2.

You should be play testing your game from the second day of production all the way through your submission to the App Store. If you’re writing your game engine telling yourself it’s going to be running any day now, and it finally comes online the last week before you ship, that is exactly the wrong way to make a game.

You need to be play testing early, you need to be play testing often, and you need to be play testing every single chance you can get.

More importantly, you need to let other people play test your game.

And what you need to do there is really, really hard, it’s kind of mind over body: not to go and grab the controls away from them, not to grab the, you know, iPhone back from them when they do something wrong.

Because what’s going to happen is you’re going to hand your fabulous game, the one you slaved over for hours, the one with the best user interface ever, the one you just know is going to be perfect, to your daughter, and she runs into the wall.

And into the wall.

And there’s a big red button there that says press here to not run into the wall.

And what you have to do, what you have to do, is not run, jump over the couch, dive, take the thing out of her hands and say no, this is how you do it.

You have to observe someone play testing your game and you’re going to have to take that pain.

And you’re going to have to take that pain because when someone buys your app on the App Store you are not there to dive over the couch and tell them to press the big red button.

You need to make your game intuitive, and the only way you will learn to make your game intuitive is by observing other people play your game and then asking them nicely afterwards why they ran into the wall so much.

Didn’t you see that big red button?

And if it’s your daughter, that might involve an iTunes gift card.

But do this all the time.

Play test.

Play testing makes a difference in games.

Okay, rant.

Next rant.

They’re in a row.

One second is a very long time to start up a game.

When I press my game’s icon on my home screen, I want to be playing the game or at the menu in one second.

One second is a very long time.

If I am waiting 8 seconds or 20 seconds on a loading screen after I press something on the home screen in order to get into my game, all I am doing is frustrating my player.

He or she is getting more and more frustrated because it’s taking so long to load the game.

Not only do you need to optimize for 30 frames a second when you’re actually running the game, you need to optimize the game start up time.

Two ways to do this.

Only load the assets you need to load in order to be able to get to that first menu.

We only loaded the first part of the dungeon up.

Jeff’s going to go into that in Part Two.

We want that play button there right away, as quick as possible.

One second or less.

It’s very important to have that thing up and running.

And optimize load times, compress textures, do anything you can to make that load time less than a second.

Players are going to play your game over and over and over.

And you’ve got to think about their start-up time as a kind of animosity meter.

You know, you kind of build that up over time.

If your game takes 8 seconds to start up, 10 seconds to start up and they’re at the bus stop, eventually they’re going to remember that and they’re going to stop playing your game.

One second.

That’s how much time you should expect between the player pressing go and your game starting.

The phone, unfortunately, can ring.

As a matter of fact, it’s a phone.

All sorts of things can happen now on our devices and we need to take care of that.

Our phones can now background our apps; users can press the home button, switch out of an app, and go do what they want.

A phone call can come in and they might choose to answer it instead of completing and getting the achievement.

They might just press the home button.

Their bus has come; they want to press the home button and wrap it up.

A message might come in that says oops, this happened, can you go back to the office and get milk.

Why milk’s at the office, I don’t know.

In the olden days when we made games, the first thing we would do when a game started up was say, hey, here are save slots one through ten, choose one.

Game slots are wonderful.

And you’d be sitting at your console, and it would be fantastic, and your mom would yell, hey, it’s dinner time, and you’d bring up the menu to save a game and choose your save slot, 1 through 10.

Or I might just leave it paused.

You know, one of the two.

You do not have that luxury any more.

There is no room for save slots on our platform.

You need to be thinking about saving state constantly, and about getting the user back to the same place they were as quickly as you can, and doing that for your user.

Because you don’t have a choice.

That home button is going to make your game go away whether you like it or not.

Your game gets told, hey, the home button got pressed, do something.

And more often than not, we’re not doing anything.

And as game designers we need to start countering that.

We need to be thinking of a state save, doing that right there, so when I start my game up later I’m right back to where I was.
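With iOS 4’s app delegate callbacks, the state save described above could be sketched like this. The player fields, the keys, and the file name are all hypothetical; a real game would save whatever its sim needs to restore exactly.

```objc
// Save game state the moment the app is backgrounded, so the player
// comes back to exactly where they were. Names are placeholders.
- (void)applicationDidEnterBackground:(UIApplication *)application {
    NSMutableDictionary *state = [NSMutableDictionary dictionary];
    [state setObject:[NSNumber numberWithInt:player.level] forKey:@"level"];
    [state setObject:NSStringFromCGPoint(player.position) forKey:@"position"];

    // Write the state to the app's Documents directory atomically.
    NSString *path = [NSHomeDirectory() stringByAppendingPathComponent:
                      @"Documents/savedState.plist"];
    [state writeToFile:path atomically:YES];
}
```

On the next launch the game reads the same file back and drops the player straight into the dungeon, with no save-slot menu in sight.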

The game loop.

All sorts of ways to do a game loop.

We get asked about this all the time.

A typical game loop: I render the frame; the invaders are coming down from the sky and I have a glorious 3D-rendered frame of invaders.

I then run my game sim.

And my game sim updates my invader positions and updates my base that’s going to fire up into the sky.

And it’s a quickly run game sim and it’s wonderful.

And then, because it’s the phone, I get those delegate callbacks with the touch events and so forth.

I go and deal with that, store my user input, and get it ready for the next time.

That’s the wrong way to do things in games.

I’ll tell you why.

You’re introducing a whole frame’s worth of latency there.

If your game is running at 30 frames a second you’re now effectively running at 15 because you’ve not taken into account the user’s input when you run that game simulation.

You need to be thinking about this order.

Render my frame of glorious invaders coming down from the sky, handle user input, store it and update it, and then run my game sim.

Because now my game sim is going to reflect the latest touches I have on my screen, and my base is going to move and my fire is going to go up into the sky.

And it’s going to be rendered in the very next frame.

With the ordering on the previous slide, it would not be rendered in the very next frame, and you would introduce latency needlessly into your game.
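The recommended ordering can be sketched as a simple loop body; the method names are placeholders for whatever your engine calls these stages.

```objc
// Render first, then fold in input, then simulate, so this frame's
// touches are reflected in the very next rendered frame.
- (void)gameLoop {
    [self renderFrame];        // draw the state computed last time around
    [self processStoredInput]; // handle and store the latest touch events
    [self runGameSim];         // simulate using the freshest input
}
```

The key point is simply that the sim runs after input handling and immediately before the next render, not before input arrives.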

We also get asked a lot: what do I do to anchor my game loop, what’s the best anchor out there?

And the best anchor is CADisplayLink.

You should all be using CADisplayLink to anchor the beginning of your game loop.

Because the first thing you’re going to do is render your frame.

CADisplayLink comes off the vertical blank on the device and gives you as much time as possible to load up your GPU with that wonderful per-pixel environment.

If you want to hit 60 hertz, you need to be using CADisplayLink.
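Setting this up takes only a few lines; here is a sketch, assuming a gameLoop method like the one described above exists on the same object.

```objc
// Anchor the game loop to the display's vertical blank.
CADisplayLink *link = [CADisplayLink displayLinkWithTarget:self
                                                  selector:@selector(gameLoop)];
link.frameInterval = 1; // fire on every vertical blank, i.e. 60 Hz
[link addToRunLoop:[NSRunLoop currentRunLoop] forMode:NSDefaultRunLoopMode];
```

Because the callback fires right at the vertical blank, rendering first in gameLoop gives the GPU the whole frame interval to do its work.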

Okay, we’re going to be back after the break, so we’ll start to get very technical with you.

We’re going to talk to you about the OpenGL ES 2.0 rendering pipeline.

And go into the shaders and how we really started off at a very slow frame rate.

We’re going to talk about that per-pixel lighting and go over, line by line, how we actually did it.

We’re going to get with our artist and have him show us the assets in Quest, and how those broke down and how they became optimized.

And finally we’re going to talk to you about the dozen things that we really learned from doing Quest that we really think will make a tremendous difference in your own games and your own game writing.
