What’s New in Photos APIs 

Session 505 WWDC 2017

Learn all about the newest APIs in Photos on iOS and macOS, providing better integration and new possibilities for your app. We’ll discuss simplifications to accessing the Photos library through UIImagePickerController, explore additions to PhotoKit to support new media types, and share all the details of the new Photos Project Extensions which enable you to bring photo services to Photos for Mac.

Good afternoon.

It’s so great to see so many of you here today.

Welcome to session 505, “What’s New in Photos APIs.”

My name is Eric Hanson, I’m the Photos Platform evangelist and I’m joined here this afternoon by two of my colleagues from the Photos engineering team to talk to you about a whole lot of great things new in Photos for iOS and for macOS High Sierra.

So, obviously, we’re all here because we care about photography.

And our users as I’m sure most of you know, care deeply about photography.

Every day around the world millions and millions of photos are captured on iOS devices.

Leading to ever-growing libraries filled with a tremendous number of memorable and meaningful moments that delight users when they reexperience them.

And so, we want to talk to you about creating great experiences in your apps and integrating with the Photo library.

And we’ve got some great new API to make this easy for you.

We also want to talk to you about doing this in a way that is high-performing within your app.

Ensuring that the responsiveness of your application is the best it can possibly be.

So, we’re going to talk to you about that as well.

And finally, we want to do this in a way that we’re not violating the trust with the user.

We’re not forcing the user to give up privacy just because they want to integrate with your application and with their photos.

We’re going to talk to you about that too.

And on this last point, I’d like to welcome to the stage, my first colleague in the Photos Engineering Team, Hasan Adil.


[ Applause ]

Thank you, Eric.

Hello, everybody.

It’s great to see you.

UI Image Picker Controller is a simple and easy to use API for selecting photos and videos in your app.

In iOS 11, we have some great improvements and new features that I’m really excited to share with you today.

At Apple, we care deeply about our users’ privacy and security.

Until now, whenever you presented UIImagePickerController, we required you to get the user’s authorization for their entire library in order to select a single photo or video.

I’m sure you’re all familiar with this alert.

This causes friction in your app.

As well as the user giving up a level of privacy that perhaps they didn’t really mean to.

And apps can fail to perform, even though they have great features, because the user didn’t want to give them access.

Well, we thought we could do better.

So, in iOS 11, we have revamped the privacy and security fundamentals of UIImagePickerController.

We started off with three goals.

Put the user in charge of their content.

Create a great user experience in your apps.

And third, give a higher level of privacy to our users.

So, I’m happy to announce that in iOS 11, this alert is no longer needed.

It just works.

[ Applause ]

Let’s look at a demo and we’ll get into it.

So, here’s an app I’ve just downloaded from the app store.

I’m really excited to use it.

I’ve filled out, I’m having a great day at WWDC, a little bit nervous, but that’s OK.

But I’d like to attach a photo.

Now, let’s see what presenting UIImagePickerController in iOS 11 looks like.

It just works.

UIImagePickerController in iOS 11 is now an out-of-process API.

What that means is that we present the contents of our API from a separate, sandboxed, and secure process.

And apps no longer get access to users’ photo library.

Only the user is able to interact with UIImagePickerController UI.

When the user makes a selection, we take that photo or video and send it over to your app.

We feel that this removes friction from your app.

As well as giving users a really high level of privacy.

Because no authorization is granted, none is asked for.

We hope you’ll like it.

Using it is really easy.

You don’t have to make any changes.

It will just work.
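As a sketch, presenting the picker in iOS 11 looks the same as it always has; there is simply no authorization step anymore (the class and method names here are illustrative, not from the session):

```swift
import UIKit

class ComposeViewController: UIViewController,
    UIImagePickerControllerDelegate, UINavigationControllerDelegate {

    // In iOS 11 the picker runs out of process, so no photo library
    // authorization is requested before presenting it.
    func attachPhoto() {
        let picker = UIImagePickerController()
        picker.sourceType = .photoLibrary
        picker.delegate = self
        present(picker, animated: true)
    }
}
```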

Photos contain rich metadata within them.

This includes the created date.

The format of the photo.

As well as other various metadata.

And we’ve made it really easy to get to this in iOS 11.

We now provide you a brand-new key with every result of UIImagePickerController.

The key is called UIImagePickerControllerImageURL.

This URL is a file URL, which points to a file in the temporary directory of your app.

So, if you would like to work with it later, we suggest you move the file to a more permanent location so it stays there.

You can use the URL as is, or you can read the NSData into your app and process it as needed.
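A sketch of handling that key in the delegate callback; moving the file into the Documents directory is just one possible permanent location:

```swift
import UIKit

class PickerDelegate: NSObject,
    UIImagePickerControllerDelegate, UINavigationControllerDelegate {

    func imagePickerController(_ picker: UIImagePickerController,
                               didFinishPickingMediaWithInfo info: [String: Any]) {
        if let imageURL = info[UIImagePickerControllerImageURL] as? URL {
            // The file lives in the app's temporary directory, so move it
            // somewhere permanent before relying on it later.
            let documents = FileManager.default.urls(for: .documentDirectory,
                                                     in: .userDomainMask)[0]
            let destination = documents.appendingPathComponent(imageURL.lastPathComponent)
            try? FileManager.default.moveItem(at: imageURL, to: destination)
        }
        picker.dismiss(animated: true)
    }
}
```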

In iOS 11 we have introduced a new format called HEIF for photos.

Now we realize that it will take some time for the ecosystem to adopt HEIF fully.

And we made it really easy to do so with UIImagePickerController.

We have a new property called imageExportPreset.

It can be set to two values.

It can be compatible mode, or current mode.

Let’s look at them.

In compatible mode, if the source photo that the user selected exists as a HEIF-formatted photo, we will transcode it and give you a JPEG version of it.

This is also the default value for the property.

And if you don’t need anything different, you don’t need to do anything.

However, if you would like the photo as we have it, set the value to current, and we’ll give you the photo as it exists in the photo library; you might get HEIF, and you can deal with that.
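In code, the choice between the two presets might look like this sketch:

```swift
import UIKit

let picker = UIImagePickerController()

// Default: a HEIF photo is transcoded, and you receive JPEG.
picker.imageExportPreset = .compatible

// Opt in to receiving the photo as it exists in the library,
// which may be HEIF.
picker.imageExportPreset = .current
```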

We have some great changes when it comes to selecting videos.

But let me step aside for a second: AVFoundation is Apple’s framework for rich editing as well as playback of audiovisual media.

And AVFoundation has a rich set of export presets, or formats you can say.

And I’m happy to announce UIImagePickerController supports all of them.

We have a new property called videoExportPreset.

You can use this to tell us what format you’d like the selected video to be delivered to you in.

Let’s look at an example.

After importing AVFoundation, you create an instance of UIImagePickerController and tell us what format you’d like.

Here, I’m asking for the highest quality video.

And then you present the picker.
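The example just described might be sketched like this (the presenting view controller and helper function are assumptions for illustration):

```swift
import UIKit
import AVFoundation
import MobileCoreServices

func presentVideoPicker(
    from viewController: UIViewController & UIImagePickerControllerDelegate
                                         & UINavigationControllerDelegate) {
    let picker = UIImagePickerController()
    picker.sourceType = .photoLibrary
    picker.mediaTypes = [kUTTypeMovie as String]
    // Ask for the highest quality version of the selected video.
    picker.videoExportPreset = AVAssetExportPresetHighestQuality
    picker.delegate = viewController
    viewController.present(picker, animated: true)
}
```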

When the user makes a selection, regardless of what format it was in, we transcode it to the matching format and give you back the video.

This is especially great for getting the videos back in the original format that they existed in.

For a full list of available presets, please see the AVAssetExportSession interface, or come see us in the lab and we’d be happy to help you.

To my earlier point about increasing the privacy, we also took a deep look at what it takes to add content into a photo library.

Now, since we’ve made picking seamless, we thought we would make adding awesome also.

So, in iOS 11, if you use either of these two functions for saving a photo or a video into the user’s library, we have a new security model and a new permission level.

The function for photos is called UIImageWriteToSavedPhotosAlbum.

And the one for videos is UISaveVideoAtPathToSavedPhotosAlbum.

These require add-only authorization.

This is a much easier permission to ask of the user.

Because it allows you to add to the library, but not read from it.

So, we feel users will be very likely to give you this authorization.
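A sketch of both save paths; note that iOS 11 introduces the NSPhotoLibraryAddUsageDescription Info.plist key for this add-only level:

```swift
import UIKit

// Requires NSPhotoLibraryAddUsageDescription in Info.plist on iOS 11.
func save(image: UIImage) {
    UIImageWriteToSavedPhotosAlbum(image, nil, nil, nil)
}

func save(videoAtPath path: String) {
    if UIVideoAtPathIsCompatibleWithSavedPhotosAlbum(path) {
        UISaveVideoAtPathToSavedPhotosAlbum(path, nil, nil, nil)
    }
}
```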

And like before, come see us in the lab if you have any questions.

For our users of PhotoKit, we have something great.

Now, with every result of UIImagePickerController, if you’re a PhotoKit user, we also give you back a PHAsset object matching the user’s selection.

This makes it really seamless to integrate PhotoKit across APIs.

Let’s look at an example.

So, here I’ve implemented the delegate method of UIImagePickerControllerDelegate.

And when I get the result dictionary, I have a brand-new key, called UIImagePickerControllerPHAsset.

You get the value for that key; it will be the PHAsset object, and you’re free to use it.
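That delegate method might look like this sketch:

```swift
import UIKit
import Photos

class PickerDelegate: NSObject,
    UIImagePickerControllerDelegate, UINavigationControllerDelegate {

    func imagePickerController(_ picker: UIImagePickerController,
                               didFinishPickingMediaWithInfo info: [String: Any]) {
        // New in iOS 11: the selection is also handed over as a PHAsset.
        if let asset = info[UIImagePickerControllerPHAsset] as? PHAsset {
            print(asset.localIdentifier)
        }
        picker.dismiss(animated: true)
    }
}
```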

Now, going along with this, we had previously deprecated the ALAssetsLibrary framework, and now the ALAssetsLibrary reference URLs, which we used to pass to you using a key called UIImagePickerControllerReferenceURL, have been deprecated as well.

Yeah, PhotoKit is great and you will love it.

We feel that with these changes to enhance user privacy, and features that make UIImagePickerController a powerful and full-featured API, we will satisfy the needs of the majority of apps out there.

However, there are times when you do need deep integration with the Photos frameworks.

And for that we recommend PhotoKit.

To tell you what’s new in PhotoKit in iOS 11, I’d like welcome my colleague, Andreas Karlsson.

[ Applause ]

Thank you Hasan.

So, apps related to photography are consistently some of the most popular apps on the App Store.

It takes, both great features and a great user experience to succeed in that market.

I’m very excited to be here today to talk about improvements we have made in PhotoKit that we hope enable you to write both exciting new features and makes it easier to improve the user experience of your apps.

So, let’s jump right in.

Let’s look at live photo effects that were announced in the keynote on Monday.

So, live photo effects include these loops, which are created by carefully analyzing the video and seamlessly stitching them together into these endless loops.

They also include Bounce, which works in a similar way, but as you can see, it works better with dogs.

And finally, there is long exposure, which analyzes the video frames in a live photo to create these amazing looking stills.

So, we believe users are going to have a lot of fun with these, and we’re also really excited for what you come up with in your apps when integrating with these new kinds of media.

So, let’s look at how you would do that.

Let’s look at some code.

So, these are the PhotoKit Media types that currently exist.

They’re great, they describe what an asset is, but what’s even more important is how the asset gets presented.

And let me explain what I mean by that.

So, if a user shot a video, they would really like to see and use that asset as a video in your app.

If they shot a live photo, they care equally about the still part and the video part.

So, they would like to see that asset as a live photo in your app.

To do this currently, you would look at these three media types, image, video, and live photo, to do the right thing.

With live photo effects this becomes a bit more muddy, and to make this really easy to get right, since it’s so important, this year in iOS 11 we’re introducing a new property on PHAsset called playbackStyle.

Now, playbackStyle is the single property you can look at to determine which kind of image manager API to use, what kind of view to present it in, and what kind of UI elements should accompany the view.

So, PhotoKit has been updated to cover all these new playback styles, but I’d like to go into detail on three of them that relate to Live Photo effects here in this session.
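A sketch of switching on the new property to choose the presentation path; the helper function and its return strings are illustrative only:

```swift
import Photos

func presentationHint(for asset: PHAsset) -> String {
    // One property determines the image manager API, the view,
    // and the UI elements to use.
    switch asset.playbackStyle {
    case .image:         return "still image in an image view"
    case .imageAnimated: return "animated image via ImageIO"
    case .livePhoto:     return "PHLivePhoto in a PHLivePhotoView"
    case .video:         return "AVPlayerItem in an AVPlayer"
    case .videoLooping:  return "AVPlayerItem with AVPlayerLooper"
    case .unsupported:   return "not playable"
    }
}
```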

So, let’s start with imageAnimated.

So, new this year, the built-in Photos app has support for animated GIF playback.

[ Applause ]

To play back GIFs in your own apps, you would request the image data from the image manager and then use ImageIO and Core Graphics for playback.

Now, the sample app does include an example of how to do this.

So, please have a look there in case you would like to do GIF playback.
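A minimal sketch of the data request and frame extraction step (driving the actual animation, and honoring per-frame delays, is left out):

```swift
import Photos
import ImageIO

func loadGIFFrames(for asset: PHAsset) {
    PHImageManager.default().requestImageData(for: asset, options: nil) { data, _, _, _ in
        guard let data = data,
              let source = CGImageSourceCreateWithData(data as CFData, nil) else { return }
        // Each frame can be drawn with Core Graphics; honor each
        // frame's delay time when driving the animation.
        let frameCount = CGImageSourceGetCount(source)
        for index in 0..<frameCount {
            _ = CGImageSourceCreateImageAtIndex(source, index, nil)
        }
    }
}
```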

Next, is live photo.

So, live photos, users care a lot about live photos.

So, getting them presented in your app in the right way is really important, but also really easy.

So, in this case you would request a live photo from the image manager, and then you use PHLivePhotoView for playback.

You set the live photo you get back on the view, and users can then use 3D Touch to play back the live photo.

Users get the same playback experience as in the built-in Photos app, right inside your app.

PHLivePhotoView can also be used with live photos outside the user’s photo library.

For example, if you transfer them over web service or something similar.
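A sketch of the live photo request and view setup:

```swift
import Photos
import PhotosUI

func show(_ asset: PHAsset, in livePhotoView: PHLivePhotoView) {
    PHImageManager.default().requestLivePhoto(for: asset,
                                              targetSize: livePhotoView.bounds.size,
                                              contentMode: .aspectFit,
                                              options: nil) { livePhoto, _ in
        // Setting the live photo on the view enables 3D Touch playback.
        livePhotoView.livePhoto = livePhoto
    }
}
```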

Finally, there is looping video.

And these include both the Bounce and Loop Live Photo effects that were introduced this year.

Now, to play these back in your apps, it’s very similar to regular video.

So, you would request the player item, and use AVFoundation for playback.

But you would use an AVPlayerLooper to get the looping effect.
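A sketch of looping playback; note the looper must stay alive for as long as playback runs, hence the property outside the function:

```swift
import Photos
import AVFoundation

var activeLooper: AVPlayerLooper?

func playLoopingVideo(for asset: PHAsset, in playerLayer: AVPlayerLayer) {
    PHImageManager.default().requestPlayerItem(forVideo: asset, options: nil) { item, _ in
        guard let item = item else { return }
        let player = AVQueuePlayer()
        // AVPlayerLooper re-enqueues the item for seamless looping.
        activeLooper = AVPlayerLooper(player: player, templateItem: item)
        playerLayer.player = player
        player.play()
    }
}
```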

So, I hope you have seen how easy it is to present the user’s media how they really wanted to be presented.

And we cannot wait to see what you guys come up with in your apps related to these new media kinds.

Next, I’d like to talk about iCloud Photo Library.

So, iCloud Photo Library is really great because users can now put their entire photo library in their pocket.

And this is really great for some users who have really large libraries.

Some users have more than 100,000 photos in their photo library, and they’re not that uncommon actually.

So, what kind of user has that many photos?

Well, to get 100,000 photos, you have to take 28 photos per day, every day for 10 years straight.

So, these users obviously love to use the camera and camera apps, but they also tend to download and use a lot of photography-related apps.

These users are really important to all of us in this room.

Now imagine one of these users, after taking their daily 28 photos, they go to the App Store to find an interesting app to use.

They find your app.

They decide to download it and give it a try.

And the first thing they see when they start using your app, is this.

Now, spinners for multiple seconds are not a great user experience.

And this user might uninstall your app right away, or even worse, leave a bad review.

So, you might be thinking, well my app wouldn’t do that, or if it does, I’m using PhotoKit and this user has a really large library.

So, maybe this is to be expected.

PhotoKit was designed from the beginning to be really fast and efficient with large photo libraries.

But there are some common mistakes clients make that can mean the difference between an operation taking 10 milliseconds and 10 seconds.

So, how do I fix this?

Well, since it’s so easy to end up with something taking a long time, step number 1 is to test your app with a large photo library.

Now we already covered, it’s kind of difficult to get that many photos using the camera app.

So, this year we want to make that really easy.

So, we’re providing a sample app that you can download called Photo Library Filler.

Download this app and install it on your test device.

It will quickly generate a large library for you to use for testing.

So, great.

[ Applause ]

So, great you have a large library, you test your app.

Chances are you might start seeing some delays.

So, I’d like to talk about two common problems clients have with PhotoKit and their solutions.

So, let’s look at some code.

So, this is how you fetch assets from the user’s photo library.

There are two parts to this.

There is the database fetching part, and then dealing with a result.

Let’s focus on the database fetching part first.

So, here we’re fetching all assets in the library, but we’re filtering with a predicate on favorite assets.

And we’re sorting by creation date.

These kinds of custom fetches are definitely supported, but if you start seeing delays in them, it’s definitely worth trying to simplify the predicate or simplify the sort descriptor.

And that can mean the difference between this query taking 10 milliseconds and 10 seconds.

The reason for this drastic difference is that you might end up outside the optimized path in the database and trying to get back into the optimized path can really pay off.

So, even better than doing these kind of custom fetches, is to avoid them if it’s possible.

So, in this case we can actually fetch the user’s favorite smart album and then fetch the assets inside of the smart album.

That gives both the predicate and the sort descriptor for free.

And you’re guaranteed to end up in the optimized path in PhotoKit.
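In code, the smart album route might be sketched like this:

```swift
import Photos

// Fetch the Favorites smart album, then the assets inside it.
// The filtering and sorting come for free, on the optimized path.
let favorites = PHAssetCollection.fetchAssetCollections(with: .smartAlbum,
                                                        subtype: .smartAlbumFavorites,
                                                        options: nil)
if let album = favorites.firstObject {
    let assets = PHAsset.fetchAssets(in: album, options: nil)
    print("favorite assets:", assets.count)
}
```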

So, next, let’s look at the result you get back from one of these kinds of fetches.

So, this object is a PHFetchResult.

Now, a PHFetchResult looks a lot like an array, and you can use it a lot like an array.

You can ask for the object at an index.

You can ask for its count, and you can enumerate it.

But internally, a fetch result works very differently from an array, and that is part of the reason PhotoKit can be so fast and efficient with large libraries.

So, let’s look at how it works internally.

So, initially it just contains a list of identifiers.

And this means we can get one back to you fairly quickly.

But as you start using it, additional work has to be performed.

So, let’s look at enumeration as an example.

And here we’re starting enumeration at index 0.

We only have an identifier so far, so we need to fetch metadata from the database in order to create a PHAsset object to return to you in the block.

Now, the fetch result is pretty smart.

It’s actually going to create a batch at the same time.

So, as we continue enumeration, index 1 and 2 are already in memory.

And then enumeration continues and you get the idea.

It’s going to access the disk to get the metadata for subsequent assets.

So, for this kind of small fetch result this doesn’t matter.

But imagine this fetch result has 100,000 assets in it.

Each batch takes a couple of kilobytes of memory.

But when you have 100,000 of them you’re looking at hundreds of megabytes of memory usage.

And even worse, each batch takes a couple of milliseconds to fetch.

But when you have 100,000 of them, you’re looking at 10 seconds to enumerate a large fetch result.

So, enumeration is clearly something we should try to avoid if at all possible.

We have found that it’s almost always possible to avoid enumeration.

And I would like to look at one example of how it can be avoided.

So, in this case, we’re trying to find the index of an asset inside a fetch result.

Now, we could enumerate the fetch result and compare the objects we get back using isEqual.

But as we saw enumeration can be quite slow.

So, even better is to use the higher-level API.

In this case, we can use index(of:).

Internally, the fetch result can be really smart and compare the identifiers with the object you’re looking for.

And it can find the index without any additional disk access or database fetching at all.

The same goes for contains(_:).
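A sketch of the fast path (the wrapper function is illustrative):

```swift
import Photos

// index(of:) compares identifiers internally, so no PHAsset objects
// have to be materialized from the database.
func position(of asset: PHAsset,
              in fetchResult: PHFetchResult<PHAsset>) -> Int? {
    let index = fetchResult.index(of: asset)
    return index == NSNotFound ? nil : index
}
```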

So, I hope you have seen how easy it is to get a large library to test your app with.

And how easy it can be to really make a huge difference in the performance of your app.

Small fixes can mean multiple seconds of time savings.

If you do have questions on PhotoKit performance, please come see us in the lab tomorrow morning.

If you do encounter issues that you’re not able to resolve, please file a bug report.

And if there’s one thing you remember from this part of the session, please test your app with a large photo library and take really good care of these really important users.

Next, I’d like to welcome Eric Hanson back to the stage to talk about Photo Project Extensions.

Eric [applause].

Thank you Andreas.

Thank you very much.

I am really, really excited to finally be able to talk about something we’ve been working really hard on this year in High Sierra.

So much so that I need a sip of water, sorry.

All right.

So, we’re going to talk about Photos project extension.

And I think to kick this off we have to talk about what are projects.

You know, Apple has been allowing users to create these really rich, creative, photo-centric projects in the Photos apps for a very long time.

Actually, going all the way back to iPhoto 1.0 when we first released photobooks.

But this year, we wanted to do something different.

Because during that time, 14 years, you as developers have been building up some really great experiences yourself with your own apps.

And we thought, you know, let’s bring this together.

Let’s give you an opportunity to actually create something that’s a fantastic experience directly within the Photos app.

And that’s exactly what we’ve done with the extensions.

We started working on the API, engage just a few developers to kind of see what they would do with it.

Companies like Shutterfly, who are going to be launching their own photo books in Photos in the fall.

A company called WhiteWall that allows you to do these gallery quality framed prints.

Wix.com that created a brand-new feature allowing users to easily create web photo albums.

Animoto, that allows users to create these really beautiful movies that are server hosted and integrated with social media.

And a company called ifolor that did something really novel.

They kind of started from the ground floor and looked at this as an opportunity for you know what could they build that really tied into the photos ecosystem and really created some magic for the user.

And I think, rather than explain it, I want to show you a quick demo.

So here we are in the photos app on High Sierra.

And I’m in memories.

And I’ve got this favorite memory here, together that I’m just going to select.

And I can hover over my project section in the side bar and you see this little button.

And you’ll notice that in addition to the Apple offerings we currently have, photos has automatically discovered that I have another application installed, called ifolor wall décor.

So, this extension is now exposed to me.

I simply select that and I’m presented with some choices.

Now these choices are coming from the extension itself.

So, before we’ve even loaded the extension, the extension is notifying us that you know, before you push a project to me I want the user to make a choice.

I want to be able to fork my own code.

So, they give us the title and the description and the image.

Photos actually lays it out beautifully in the sheet.

And I’m just going to pick that first option.

I hit the create button, and now, their extension, their view controller is going to be pushed directly into photos.

Full view, but this is their UI, this is their experience inside photos.

Now, they’ve leveraged the memory in this case and taken the hints we provided to lay out this grid.

But they’ve also taken hints we provided to let me get at all of the images that were considered when the memory was created.

So, you’ll notice over here on the bottom right, if you can see it, you can’t because it’s behind me, 283 images.

And what’s was really cool about that is I can actually customize.

So, I’ll switch this to unused images, because there’s a photo that I like a little bit better.

I’ll take this one, that I think would be a really great hero shot.

So, I can just drag that up and customize this layout.

And they went a step further than that.

They let me customize the grid itself in a really novel way.

So, I want to give a little more emphasis to this row.

And maybe move this down.

And I’ll move this down to line it up and you know let’s do one more.

And I really like that.

So, I could just hit the buy button now, still within photos.

Choose my size, this would show up in the mail.

And I have this memory hanging on the wall in my den.

It’s pretty cool.

That’s just a really quick tour of ifolor.

[ Applause ]

So, what you want to know is how does this all work.

If you’re familiar with the editing extensions in Photos, you already have a good sense of how this works.

We’ve added a new extension point to Photos: com.apple.photo-project.

And we’ve added a new template within Xcode that lets you easily create one of these extensions in your own app.

As you saw, in the demo, we automatically discovered your extension, raising the discoverability of your app on the system.

But we took discoverability a step further.

By offering users a direct link to the App Store that brings up the App Store window, but doesn’t require them to search for an app with that extension.

It automatically shows only the apps that support that extension point.

And we think this is really going to heighten the ability for users to discover your apps and quickly download them, and start working with them.

So much so that we put a similar App Store link in edit.

And so, if you’re already doing image editing extensions, users will have a direct link into the App Store to immediately discover your editing extensions.

It’s pretty cool.

So, your extension lives within the app, within your app.

And that’s really a great opportunity for you.

Because you may already have an app in the App Store that’s got a lot of app specific code, but you probably have some resources and some other sharable code that you could easily move kind of into this extension space and leverage those.

You obviously also have some extension specific code that you would need to write.

A very small amount.

And finally, you just need to add a view controller that implements this protocol.

The PHProjectExtensionController protocol.

With all that in place, Photos can immediately discover your extension.

Let the user choose to create a project with it, and we simply call this protocol method, beginProject.

And when we do we send over a couple bits of data.

So, in addition to the photos, we’re sending you the project extension context, and we’re sending you an object called PHProjectInfo.

You can consume that in whatever way you want.

And then you just quickly call back the completion closure that we hand you.

And that let’s Photos know it can install your view controller, full view inside the Photos app.

And from that point, as you saw, it’s totally your UI.

It’s your experience.

You have all of that API available to you on macOS to leverage and build a really killer experience for your users.

So, the protocol itself is really simple.

There’s an optional property for supported project types and you saw this in the demo I just did.

And this lets you quickly describe the choices you want a user to make.

And you can opt out of this if you want, if you just want to go straight into the extension.

That’s totally possible as well.

And then there are a few required functions you need to implement in your view controller.

The first is begin project.

That’s where we hand you the context.

And we hand you the project info.

And that’s what we call the first time a project is created with your extension.

Now, if a user is coming back to your extension for a project they already created, we instead call resume project.

And again, hand you this extension context.

And finally, if a user is in your extension and they decide to switch away to do something else within photos, we’re simply going to call finishProject.

Allowing you to clean up any kind of data you’re holding onto.

Maybe quiet down any processor-intensive tasks or animations, things like that, that you might be doing.
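Taken together, a minimal view controller conforming to the protocol might be sketched like this (the class name and stored property are assumptions):

```swift
import Cocoa
import PhotosUI

class ProjectViewController: NSViewController, PHProjectExtensionController {

    private var projectContext: PHProjectExtensionContext?

    // Called the first time a project is created with the extension.
    func beginProject(with extensionContext: PHProjectExtensionContext,
                      projectInfo: PHProjectInfo,
                      completion: @escaping (Error?) -> Void) {
        projectContext = extensionContext
        // Consume projectInfo (sections, curated content, layout hints) here.
        completion(nil)
    }

    // Called when the user returns to an existing project.
    func resumeProject(with extensionContext: PHProjectExtensionContext,
                       completion: @escaping (Error?) -> Void) {
        projectContext = extensionContext
        // Restore state from extensionContext.project.projectExtensionData.
        completion(nil)
    }

    // Called when the user switches away within Photos.
    func finishProject(completionHandler completion: @escaping () -> Void) {
        // Release held data and quiet any expensive work.
        completion()
    }
}
```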

So, PHProjectExtensionContext is a container for two really important objects.

One is PHProject, and the other is PHPhotoLibrary, which is the library from which you’re going to fetch all the assets.

PHProject itself is just a subclass of PHAssetCollection.

You may already be familiar with PHAssetCollection.

It’s a container.

It’s got an array of assets.

It’s got a title.

It’s got a key asset, key photo.

And what we’ve done in subclassing that, creating PHProject, is we’ve added just one property.

But it’s a really, really important property for you.

And that is the projectExtensionData.

So, this is where you, the extension can save any data you need to that represents the project the user created.

So, it’s really intended for that critical project data.

That is your list of the asset identifiers that you’re using, and maybe some basic layout and configuration information.

It’s not for images and, you know, cached thumbnails and rasters, things like that.

Because you can create those on the fly or cache those away somewhere else.

So, we want that to be small, but useful.

And we settled on this limit of 1 megabyte.

Because we think, you know, an array of asset identifiers, compressed, is really, really small.

They’re just strings.

And we think you can live within 1 megabyte.

And by doing that, we’re not bloating user’s libraries as they create project after project and explore all these amazing extensions that you’re going to create.

So, setting the data, really easy.

We have an object called PHProjectChangeRequest that you instantiate inside a call to the performChangesAndWait function on the photo library.

Inside that, set the data to whatever you want it to be.

In this case, I’m just using NSKeyedArchiver to archive an array of identifiers.

And that’s it, it’s simple.
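That flow might be sketched as follows (the wrapper function is illustrative):

```swift
import Photos

func saveExtensionData(identifiers: [String],
                       project: PHProject,
                       library: PHPhotoLibrary) throws {
    let data = NSKeyedArchiver.archivedData(withRootObject: identifiers)
    try library.performChangesAndWait {
        let request = PHProjectChangeRequest(project: project)
        request.projectExtensionData = data   // kept small: 1 MB limit
    }
}
```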

So, let’s get into some fun stuff.

I want to talk about where the magic begins.

So, you saw in the ifolor demo that they really leveraged the Memories feature.

The Memories feature is really, really popular.

Users love this.

Right, while you’re sleeping, Photos is working hard all night, going through kind of your photo library, learning sort of about you, right on your device.

It’s all happening right there locally.

And it’s building this graph of kind of the events that are meaningful to you and the people in your life that are relevant to you.

And through that it’s able to serve up these great memories that users just consume on a daily basis.

And so we wanted to make sure you had access to that level of data, right?

These extensions aren’t interesting if we just give you an array of photos.

We’ve got to give you context from which you can create something great.

And that’s what we’ve done with PHProjectInfo.

So, PHProjectInfo at the top level is broken into sections.

We give you this array of PHProjectSection objects.

And for a given section, we tell you the type.

So, you think about a memory, we’ve got like a key photo at the top.

And we call that type cover.

And then we’ve got this editorial grid that we call type content.

And then we give you this array of content.

It will look something like this.

And inside the contents we actually give you access to the elements, the asset identifiers, and some layout information.

And you might be asking yourself when you look at this construct, why the array and then the array, right.

Why this kind of nested structure.

But if you think about the memory feature it makes sense, right.

We allow users in Memories to toggle between showing a summary of photos and showing all photos.

The memory itself is built up from a large pool of assets.

We call these curation levels.

And we wanted to make sure you had access to these different levels of information, right?

We’ve already done all this hard work.

We’ve run all this intelligence.

Let’s just share it with you and let you build something on top of it.

So, that section content array, is an ordered array, where you always know the object at index 0 is going to be the best of the best.

The most premium.

The most condensed summary of that collection of assets, whereas the object at the very end of the array is going to be the most inclusive.

It’s going to be everything we considered.

And you get to decide what you want to use.

The fewest, the most, everything in between.

Maybe you want to offer your users a choice.

You want to give them the ability to switch between these.

And you can do that with the context we’re giving here.

Next up, cloud asset identifiers.

This is kind of a new concept.

If you’re familiar with PhotoKit on iOS, you’ve not seen this before because it’s new.

But, we’ve never had a construct where we wanted you to save data into a user’s photo library.

Data that represents assets but data that might be synced to other devices, that might appear somewhere else at some point.

We want to make sure that the data you’re saving is meaningful wherever it shows up.

And so, we’ve introduced a new object called the PHCloudIdentifier.

And you can think of this as a global identifier for the asset.

But it’s not as simple as just a global string.

You know, dealing with the cloud and sync states and all that is really, really complicated.

And we’ve solved all that for you.

The only thing you have to do with this is just make sure you’re always converting from a global identifier to a local identifier before you fetch.

And for that we’ve just given you a convenience method inside PHPhotoLibrary to do that conversion both ways; from global, from pod identifier to local identifier, and vice versa.

Next, I want to talk about layout.

So, we didn’t just want to give you assets and some constructs around the curation.

We wanted to give you some hints about how we present those assets to a user, right?

Help you with some of the difficult problems of things like grid layout.

All right, you look at a memory and we’ve done a lot of hard work already in laying out this curated grid in a really interesting way that’s pleasing to the user.

Wouldn’t it be great if you had access to that layout.

And you do.

But to do that, we first had to establish a set of rules, a coordinate system for you.

So, if you look at the memory, you notice everything aligns to a very specific grid.

It’s a grid of 4×3 aspect ratio cells.

But that’s an odd, non-square kind of dimension to share back and forth.

That’s a weird multiplier for you.

So, we simply divided that yet again.

And in this case, this layout is really comprised of 20 uniform columns.

What we call our grid spaces.

And once we have this coordinate system established, you can scale this however you want.

It’s just a multiplier for you, for whatever your output is.

Right? If you’re rescaling in a resizable window, or working with web browsers of different sizes, you can use this data to know how to lay out the content.

And it also lets us communicate to you placement information.

So, in this example, I have this photo that wants to live in the upper left-hand corner.

So, I can now easily tell you it’s at position 0,0.

It’s 8 grid spaces wide and it’s 9 grid spaces tall.
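Because everything is measured in those 20 uniform grid spaces, scaling a placement to any output is just one multiplication. A small self-contained sketch (the GridPlacement type is hypothetical; in the real API placement comes from the project element):

```swift
import Foundation

// The layout coordinate system from the talk: 20 uniform columns
// ("grid spaces"). A placement is an origin plus a size, all measured
// in grid spaces, so scaling to any output is a single multiplier.
let gridColumns = 20.0

struct GridPlacement {
    let x, y, width, height: Double  // all in grid spaces
}

// Convert a grid-space placement to concrete points for a given
// output width (a resizable window, a web page, etc.).
func frame(for placement: GridPlacement,
           outputWidth: Double) -> (x: Double, y: Double, w: Double, h: Double) {
    let unit = outputWidth / gridColumns  // points per grid space
    return (placement.x * unit, placement.y * unit,
            placement.width * unit, placement.height * unit)
}

// The photo from the example: origin (0, 0), 8 wide, 9 tall.
let photo = GridPlacement(x: 0, y: 0, width: 8, height: 9)
let f = frame(for: photo, outputWidth: 1000)
print(f)  // (x: 0.0, y: 0.0, w: 400.0, h: 450.0)
```

Resizing the window only changes outputWidth; the grid-space placements themselves never change.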

Next thing I want to talk about are elements.

So, inside the section content we give you an array of elements.

And all the elements are subclasses of PHProjectElement.

And there’s two really important properties there.

One, is placement, which we just talked about and the other is something called weight.

So, again going back to the memories feature.

We talked to you about, like all the intelligence that we’re doing, right?

And ascertaining the most relevant photos in a large pool of assets.

And so, our scoring system.

And we wanted to share that with you.

And so, we do that by giving elements a weight, right?

It’s a normalized weight from 0 to 1.0; the default value is 0.5, meaning it’s not horribly significant, but it’s not horribly insignificant either, if that makes any sense.

But you know it’s this normalized score across the entire section.

And you can do amazing things with that, right?

Just really easily, right?

Sort your assets by that.

Decide where you want to chop off, you know, a large array of assets and only work with the significant ones.

Right? Present those as choices to a user where you’re serving up the higher weighted assets first.

And predicting that that’s probably what they’re looking for, what they want.
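Sorting and cutting off by weight is a one-liner in practice. A minimal sketch, with a hypothetical Element struct standing in for PHProjectElement and its normalized weight property:

```swift
import Foundation

// Hypothetical stand-in for a PHProjectElement: a normalized weight
// from 0 to 1.0, where 0.5 is the default.
struct Element {
    let assetIdentifier: String
    let weight: Double
}

// Sort by weight so the most significant assets come first, and
// keep only those at or above a chosen cutoff.
func significantElements(_ elements: [Element], cutoff: Double) -> [Element] {
    return elements
        .filter { $0.weight >= cutoff }
        .sorted { $0.weight > $1.weight }
}

let pool = [
    Element(assetIdentifier: "a", weight: 0.9),
    Element(assetIdentifier: "b", weight: 0.3),
    Element(assetIdentifier: "c", weight: 0.5),  // the default weight
]
let top = significantElements(pool, cutoff: 0.5)
print(top.map { $0.assetIdentifier })  // ["a", "c"]
```

Presenting `top` to the user first serves up the assets Photos scored as most relevant.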

And there’s one other bit here.

And this is specific to one of the elements, which is the PHProjectAssetElement.

We have this concept of regions of interest.

You know, there are several ways already in the macOS APIs to run face detection to find faces in an image.

But from that you don’t know the relevancy of those faces.

You don’t know if those faces represent someone that is meaningful in someone’s life, or it’s a face in the crowd.

It’s the person walking behind you at a theme park, when you’re trying to photograph somebody in your family.

You really don’t have an ability to tell the difference.

We wanted to give you something that lets you know, and lets you trust that if we’ve marked it, if we said it’s interesting, it’s interesting, right?

And if you leverage this, you create magic for your users.

Without necessarily even knowing why.

And we went one step further.

If you notice we have identifiers on these.

And we actually repurposed those identifiers across images.

So, if we know that this thing that’s interesting in this photo is also representing this same thing, or same person in this other photo, we’ll give them the same identifier.

So, if you’re doing like animation, slideshows, things like that, this is tremendously useful to you.

Because you now can actually correlate positions of images across a large set.
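That correlation is just a grouping by the shared identifier. A self-contained sketch, with a hypothetical RegionOfInterest struct standing in for the region-of-interest data on a PHProjectAssetElement:

```swift
import Foundation

// Hypothetical stand-in for a region of interest: the same identifier
// is reused when the same subject appears in different photos.
struct RegionOfInterest {
    let identifier: String       // shared across photos of the same subject
    let assetIdentifier: String  // which photo this region belongs to
    let rect: (x: Double, y: Double, w: Double, h: Double)
}

// Collect one subject's position in every photo it appears in, so a
// slideshow can animate between the same face across a whole set.
func positions(ofSubject id: String,
               in regions: [RegionOfInterest]) -> [String: (x: Double, y: Double, w: Double, h: Double)] {
    var result: [String: (x: Double, y: Double, w: Double, h: Double)] = [:]
    for region in regions where region.identifier == id {
        result[region.assetIdentifier] = region.rect
    }
    return result
}

let regions = [
    RegionOfInterest(identifier: "person-1", assetIdentifier: "IMG_1", rect: (0.1, 0.1, 0.3, 0.3)),
    RegionOfInterest(identifier: "person-1", assetIdentifier: "IMG_2", rect: (0.5, 0.2, 0.3, 0.3)),
    RegionOfInterest(identifier: "person-2", assetIdentifier: "IMG_1", rect: (0.6, 0.4, 0.2, 0.2)),
]
let track = positions(ofSubject: "person-1", in: regions)
print(track.keys.sorted())  // ["IMG_1", "IMG_2"]
```

With that map in hand, a cross-fade or Ken Burns effect can keep the subject anchored from one photo to the next.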

So, I could keep going in a code walkthrough.

But, I’d rather give you a demo.

Because I think it’s really important to show you just how easy it is to create one of these extensions.

So, I’m here in Xcode.

And I promise I will zoom in where necessary, so you will be able to read this.

So, I’m here in Xcode.

And I have a project I started, but it actually does nothing yet.

The only thing it has are some assets that a designer gave me.

And what we’re going to build right now, is a slideshow.

And the slideshow is going to be the marriage of the content that Photos is providing and SceneKit, to do a really compelling kind of 3D animation.

So, let’s begin.

So, the first thing I can do is, I’m on the project and I have this little plus button here at the bottom to add a target.

And I’ll just add a target here.

And you’ll notice we’ve got this new template, right, for a Photos Project extension.

So, I’m going to select that and I’m going to give this a name.

Picture Show.

Hit the Finish button.

And yes, I want to activate that as a scheme; it makes it really easy to run later.

And then, I mentioned that my extension is going to need those assets from my designer.

So, they’re already in the project, I just want them, basically referenced from within the extension as well.

So, I’m just going to add those via the copy phase.

And that’s the whole set of assets.

I just hit add and with that we’re ready to write some code.

Now, if you’re looking closely, you’ll notice that Xcode has already added this group for me for Picture Show for my target.

And inside of it, it’s already laid down a view controller for me that conforms to the protocol that we have to implement.

So, I’m going to go into that and first going to add some properties.

So, just a very small set of properties here.

I’m holding on to the context that’s handed to me.

I have some arrays to differentiate landscape and portrait assets for my slideshow.

I have an outlet for my scene view.

And I have an animator.

And I have two expected compiler errors.

I have not imported SceneKit.

So, that’s an easy one to fix.

And animator is my own class.

I just haven’t created it yet.

So, let’s go ahead and do that right now.

So, I hit new, I want a Swift file.

And we will call this Animator.

And there’s my animator.

I’m not going to walk you through all of this.

But it’s really, really simple.

So, I’ll just tell you a few things it’s doing.

When I’m initializing the animation I want to kind of reach into the scene and get out some specific elements of the scene.

I want to find these tagged photo frames in this 3D scene.

And then reach kind of inside them, behind the glass and get at the material that is going to be the photo I’m showing behind the glass of these frames.

So, that’s really all I’m doing in the initialize.

And then from that point, the scene, I know it’s just going to be spinning around.

It’s just this kind of rotation on a table, and I have a little trickery where, like a good magician, I’m switching photos when you’re not looking.

And so, I’m just doing that through a couple of timers.

And that’s basically the animation.

So, we’ll go back to the controller and start filling in the interesting bits.

When a user creates one of these projects, I want to save the data.

Because I want them to be able to come back to this slide show at any point in time.

So, I’m just going to add some data methods.

And what I’ve got here is a saveProjectData function that does exactly what you saw on the slide, right?

So, this should look very familiar with what we just talked about.

And then load data, I’m simply reading that back.

I’m getting the Cloud asset identifiers by unarchiving my data.

And I’m just converting to local identifiers.

And from that point, I’m able to do a PHAsset fetch assets request with those local identifiers.
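The save/load round trip is straightforward. This minimal sketch uses plain strings standing in for PHCloudIdentifier values and JSON encoding as the archive format; in the real extension the data would be stored on the project itself through the PhotoKit change-request API, and the restored identifiers would then be converted to local identifiers before fetching with PHAsset.fetchAssets(withLocalIdentifiers:options:):

```swift
import Foundation

// Hypothetical serialization of saved project state: the cloud asset
// identifiers, modeled as plain strings, archived to Data via JSON.
struct ProjectData: Codable {
    let cloudAssetIdentifiers: [String]
}

func saveProjectData(_ identifiers: [String]) throws -> Data {
    return try JSONEncoder().encode(ProjectData(cloudAssetIdentifiers: identifiers))
}

func loadProjectData(_ data: Data) throws -> [String] {
    return try JSONDecoder().decode(ProjectData.self, from: data).cloudAssetIdentifiers
}

// Round trip: save on creation, read back when the user reopens the project.
let saved = try! saveProjectData(["cloud-1", "cloud-2"])
let restored = try! loadProjectData(saved)
print(restored)  // ["cloud-1", "cloud-2"]
// Next step would be converting these cloud identifiers to local
// identifiers, then fetching the assets with those local identifiers.
```

Archiving cloud identifiers rather than local ones is what keeps the saved project meaningful when it syncs to another device.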

All right.

Let’s go back up to the top and start filling in the rest.

So, viewDidLoad.

I just need to initialize my animator.

We talked about that.

Supported project types.

In this case, I don’t need to ask the user for anything.

So, I won’t fill that in.

It’s just a slide show.

But begin project, obviously, I’ve got some work I want to do here.

So, we’ll fill that in.

And what I’m doing in this case is I’m getting at the actual sections array.

And I’m looking for the first section of type content.

Right? I’m really designing this around a memory, like we just talked about.

And from that, I want the most curated assets.

I want the top-level summary.

And so, I asked for the first object in that array.

From that, get the cloudAssetIdentifiers and I’m good to go.

Now, in the background, I am just saving the project data so that it’s there later if the user comes back to the slide show.

Get into resume project, and that’s really easy.

I simply hold on to my context load my project data, initialize the animation.

And finally, in finish project, I’m just suspending animation.

I don’t want these timers firing, I don’t want the fans running while the user’s doing something else.

I just suspend it.

And believe it or not, that is the entirety of the code for this extension.

That is my slide show.

Now, what I haven’t done is wire up the UI, which I can do really quickly.

So, the template gives me the view for my view controller.

I’ll just delete the default string.

And I’m going to add a scene to this.

And so, we add a SceneKit view.

I’ll just resize that and just pop in here, set my autoresizing, and I just want this to be full view.

And the only other thing you need to do to that is choose the scene itself, which I already have in the assets that we loaded in.

And I just need to wire it to my view controller.

So, I’ll control-drag from File’s Owner and I see the outlet that we’ve already added in properties.

And that’s it.

We’re ready to run.

So, I’ll hit play.

It’s going to ask me to choose the app, obviously, it’s Photos in this case.

And that will build, and succeed.

Photos launches, and I’m just going to pick a memory.

In this case, I’ve got this really cool trip to Hong Kong I’m going to choose.

Hit the plus button and look at that, there’s the extension showing up in Photos.

I choose that.

Now, when I do, this is the very first time I run this extension, so I don’t have permission as an extension to the entire photo library.

You remember, our API is handing the entire photo library to the extension.

So, the user does have to grant that access once.

And just hit okay.

And then the slideshow loads.

[ Applause ]

And I’m just going to show you one additional detail, if you look in the sidebar, there’s the project we just created, Hong Kong Island.

This is saved inside that library right next to all the other projects.

So, I choose that and I can see this was created by an extension called Picture Show.

And I can open that.

And it will pop right back into the same slideshow.

So, that is Photos Project Extensions [applause].

Thank you.

So, we’ve talked about a lot in the past 55 minutes or so.

Talking about the frictionless photo picker in iOS.

Really, really nice advancements there, allowing you to get at original URLs.

Having access to the video quality.

Some great, great stuff there for you.

Andreas took you through the new media presentation types, I can’t wait to see how you’re going to consume those and the kind of things you do within your apps.

Allowing users to see their content played back the way they’d expect.

He also talked about testing with large image libraries.

And just a reminder there is a sample app now to let you fill out a library with tens of thousands of assets so that you can test that way.

And then, finally we talked about the project extensions.

And I’m really, really excited to see what you’re all going to create and have in the App Store when macOS High Sierra ships this fall.

So, for more information please check out this link, the developer.apple.com website.

And I want to talk about some related sessions.

Right after this in this hall is going to be a killer, killer session introducing you to the Vision framework.

I know you want to stick around for that.

Later tonight over in the executive ballroom is the first of two sessions covering depth in iPhone photography with some amazing demos and you really, really want to be there for that as well.

And finally, tonight we’re doing something we’ve never done at WWDC.

We’re hosting a photography get together.

It’s going to be downstairs in the technology lab area, specifically Lab J.

And I’ll be there, and the other two speakers in this session will be there.

All of the speakers today talking about Camera and Photos will be there, as well as a ton of engineers and management, and we hope to see all of you there.

Enjoy the rest of your show.

Thank you so much.

[ Applause ]
