By Jason Snell for Macworld
The 2010s were kind of a rough era for the Mac. Apple was busily improving the iPhone and iPad, while Mac models spent years between updates. There was a real question about whether the Mac was being readied for retirement, a legacy platform that would fade away as Apple shifted to its shiny new devices.
Last week’s announcements suggest that Apple has something else in mind for the Mac in the 2020s. First there was the word that the entire platform is moving to the same Apple-designed processor architecture that powers the iPhone and iPad. Then came the news that those Macs will run iOS and iPadOS apps as well as Mac apps. That means the Mac is no longer going to be an outlier. Instead, it will become the center of Apple’s computing universe, where all of its platforms come together.
By Jason Snell
June 30, 2020 12:41 PM PT
Earlier this month, the entire Six Colors web infrastructure cut over from Movable Type to WordPress. You might not have noticed. That’s because I have spent the past few months learning the ins and outs of WordPress and rebuilding the site’s design, created by the talented Christa Mrgan, in an entirely new system. It was an experience.
Now that we’re on this new platform, which is slightly more actively developed than Movable Type, I can begin to add features that were impossible before. For example, we can now post items to Six Colors that were previously available only in our monthly membership newsletter. This means that members who prefer to read Six Colors on the web or via RSS can now see all those items there rather than in email. (Members, log in to your member page to get a link to a members-only content RSS feed.)
This week we’re putting out our June issue of the newsletter, so you’ll be seeing these posts on the site every day. (On the web, they’re marked in purple.) But this is the last time that the volume will be like this. Going forward, there will be an average of one members-only post on the site per week.
In addition, the Six Colors Podcast is also being posted to the website now. If you’re not a member, you may not know that another member benefit is an exclusive weekly podcast featuring me and Dan Moren chatting about the news of the week in a casual format for about 30 minutes. That podcast will now play right in the browser for logged-in members, and there’s also a member-specific podcast feed available from the Member Center.
Six Colors memberships help us navigate the complicated waters of being independent writers and podcasters. As you may have noticed, we very rarely have advertising on the site anymore—that source of revenue has dried up over the past couple of years. Today Six Colors is almost entirely supported by members.
If you’re not currently supporting the site, I hope you will consider doing so in the future. But I have no plans to turn Six Colors into a membership-only site. We’ll still be posting all the same stories and links here regularly, as we’ve done for more than five years. Only those weekly members-only pieces and weekly podcast posts will be limited.
Thanks for reading Six Colors! If you have any questions, you can contact me at email@example.com.
By Dan Moren
June 30, 2020 9:00 AM PT
After years of spending every fall hemming and hawing over whether or not I was going to buy a new iPhone, I decided in 2015 to sign up for Apple’s newly unveiled iPhone Upgrade Program. First offered alongside the iPhone 6s, the Upgrade Program allows you to pay for your phone on a month-by-month basis over the course of two years. While that doesn’t save you anything on the cost of the phone outright, it avoids having to plop down a thousand dollars or more in a single go.…
This is a post limited to Six Colors members.
This week we welcome Apple’s Bob Borchers and Ronak Shah to the show to discuss macOS Big Sur, including all the new features in Safari. There’s also an awful lot of follow-up from the busy WWDC week that was, and we discuss the possible features of new Macs running Apple silicon.
By Stephen Hackett
June 29, 2020 9:00 AM PT
WWDC 2020 has come and gone, and for the first time in the event’s 31-year history, the conference was entirely online.
In Ye Olden Days, developers could get copies of WWDC sessions on VHS or DVD, and eventually watch them online. Over the last few years, Apple has worked hard to get session videos online faster and faster.
Of course this year, none of those edit-and-upload-as-quickly-as-possible skills were needed, as the entire conference was done in advance, ready to stream online like content from Netflix or Hulu.…
This is a post limited to Six Colors members.
By Dan Moren for Macworld
The keynote at Apple’s Worldwide Developers Conference may be one of the biggest events of the company’s year, but it only ever scratches the surface. Not every change or update makes it into the presentation—even when it’s pushing two hours, as it was this year.
As people begin to dig into the betas and watch all of the attendant technical sessions, there’s a lot more that’s coming to light. And while much of that information is about things happening in the here and now, or perhaps about the products that will be released in the near future, this is Apple we’re talking about. The company plays the long game.
That, in turn, has encouraged those in the Apple community to start digging into the details and, of course, read the tea leaves about not only what’s in the pipeline for later this year, but also what some of these changes mean for the future of Apple’s device roadmap.
This week, on the 30-minute podcast that crams an hour of content into just half the time, Dan and Mikah are joined by special guests Jason Snell and Ish ShaBazz to discuss how iOS 14’s App Library will change our app organization habits, our thoughts on Mac Catalyst in the age of ARM Macs, how we feel about the macOS redesign, and WWDC’s new format.
By Six Colors Staff
June 26, 2020 2:10 PM PT
And so WWDC comes to a close for another year. We have to admit: We’ve probably watched more session videos this year than all of the prior years put together. The virtual format has been a real change, but a lot of the way the event has adapted is to the positive, including the ability to make all of this information easily accessible to anybody who’s interested. So, with that said, let’s wrap up a few last videos.
I love AutoFill—it’s one of my favorite features. On the Mac, I don’t mind typing things, but on the iPhone, having forms automatically filled out for you can be a huge time saver.
But, as it turns out, there are other benefits too. For example, privacy. As keyboard engineer Zeheng Chen points out, when you have an app where you want to send something to one of your contacts, you might choose to use a contact picker UI instead of granting access to your contacts, in order to minimize the amount of information that the app can see. But a contact picker UI might be slower than autofilling a contact address as you start to type it, and the autofill option still prevents the app from getting any data but what you type into the field. Plus, developers don’t have to create a user interface, since it’s already built-in.
Starting in macOS Big Sur, AppKit—the native APIs for building Mac apps—will now have access to AutoFill. And third-party password management apps will be available as sources for AutoFill, as they have been on iOS and iPadOS, making those programs even more useful.
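For developers, adopting AutoFill in AppKit mostly comes down to labeling fields so the system knows what belongs in them. A minimal sketch, assuming the `contentType` property and `NSTextContentType` values described for Big Sur:

```swift
import AppKit

// Tagging AppKit text fields with a content type lets the system (and,
// with Big Sur, third-party password managers) offer AutoFill when the
// user focuses the field.
let usernameField = NSTextField()
usernameField.contentType = .username

let passwordField = NSSecureTextField()
passwordField.contentType = .password
```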
Me, I’m all for less typing! —Dan Moren
Apple’s Shortcuts utility started its life as a third-party app called Workflow that offered an Apple Watch app. It’s taken a while, but Apple Watch support has now arrived in Shortcuts. Not only is there a Shortcuts app on the watch, but you can launch specific shortcuts directly from watch-face complication slots. (To designate a specific shortcut for sync with the Apple Watch, you mark it as such within the Shortcuts app on your iPhone.)
Depending on how developers implement their Shortcuts support, shortcuts may run specifically on the Apple Watch without ever going back to the iPhone. But to do this, the parent app must have its associated Apple Watch app installed, too. While this is the ideal experience, there’s also the capability for a shortcut on Apple Watch to phone home to the iPhone and run the necessary automation there. It’ll just be a lot slower. —Jason Snell
Also starting in Big Sur, Mac apps will for the first time get access to ReplayKit, Apple’s API for recording, capturing, and broadcasting content from within your app. So if you’ve got, say, a Mac game, you can automatically build in features to not only do screen recordings, but also to broadcast your gameplay—and even provide an in-app editor or add overlays and other effects.
ReplayKit has been available on iOS and iPadOS for a couple years, but it’s a great addition to the Mac. Game streaming has become more and more common, and in our current world environment, there are lots of other instances in which being able to record and broadcast your app may be useful, such as during meetings.
In terms of gaming, Apple has also added support for triggering a screen recording via a button on an external PlayStation or Xbox controller, which is handy for when you want to record something that just happened without having to switch to a different input device. (On their respective consoles, this functionality is generally triggered by buttons on the controllers themselves.)
I’m curious to see what the limits of ReplayKit are on the Mac. I’ve started spending more time streaming video for various types of entertainment, such as the shows we do at Total Party Kill and our occasional Jackbox games at The Incomparable. I’m not sure how applicable these features will be, but it certainly seems like it could simplify matters. —DM
Widgets are the big story in iOS 14, and this session details the tools developers can use to design great widgets for their apps. It focuses primarily on the decisions Apple made in creating its own widgets for iOS 14.
Widgets can adjust their display based on context. For example, the Weather widget might normally show an extended forecast, but if there’s precipitation in the area it can shift to a precipitation forecast, showing you when the rain’s expected to start or stop. The Maps widget has spatial awareness, noticing when you’re not home and offering up the travel time to get back there.
Editing widgets is adorable, and takes a page out of the old Mac Dashboard manual: You tap and hold on a widget while in jiggle mode, choose Edit, and then the whole thing flips over to present a settings interface that’s stored on the reverse side of the widget.
Users can also add multiple copies of a single widget, too—for example, you can create Weather widgets for different locations, and display them side by side—or create a Widget Stack and then flip between them. —JS
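The design advice in the session sits on top of the new WidgetKit framework, where a widget supplies a timeline of dated entries and the system swaps them in on schedule. A minimal sketch, with hypothetical names like `ForecastEntry` and `ForecastProvider`:

```swift
import WidgetKit
import SwiftUI

// One dated snapshot of the widget's content.
struct ForecastEntry: TimelineEntry {
    let date: Date
    let summary: String
}

struct ForecastProvider: TimelineProvider {
    func placeholder(in context: Context) -> ForecastEntry {
        ForecastEntry(date: Date(), summary: "Sunny, 72°")
    }

    func getSnapshot(in context: Context, completion: @escaping (ForecastEntry) -> Void) {
        completion(placeholder(in: context))
    }

    func getTimeline(in context: Context, completion: @escaping (Timeline<ForecastEntry>) -> Void) {
        // Hand the system a batch of future entries; it displays each one
        // when its date arrives, without waking the app.
        let entries = [ForecastEntry(date: Date(), summary: "Rain starting at 3 PM")]
        completion(Timeline(entries: entries, policy: .atEnd))
    }
}

@main
struct ForecastWidget: Widget {
    var body: some WidgetConfiguration {
        StaticConfiguration(kind: "ForecastWidget", provider: ForecastProvider()) { entry in
            Text(entry.summary)
        }
        .configurationDisplayName("Forecast")
        .supportedFamilies([.systemSmall, .systemMedium])
    }
}
```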
Last year Apple introduced SF Symbols, a library of more than 1500 icons meant to be used to unify the iconography of apps running on iOS and iPadOS. Developers can use SF Symbols to ensure that their toolbars and menus feel very much like they’re part of a unified system design.
This year, Apple has added more than 750 new symbols to the library, and the entire SF Symbols collection is also available on macOS Big Sur, bringing the visual design of all of Apple’s platforms closer together. There are also more localized symbols, so the iconography an app uses can shift based on what country or language preferences a user has.
SF Symbols also has support for multicolor variants of its symbols, for cases where a monochrome appearance isn’t ideal. For example, weather-related symbols showing a shining sun could be displayed in a weather app with the sun colored yellow. —JS
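Adopting a symbol in code is essentially a one-liner, and with Big Sur the same names work on the Mac. A quick sketch using the real symbol name `cloud.sun.fill`:

```swift
import UIKit

// Look up a symbol by name; the system renders it at the weight and
// size appropriate to its context.
let toolbarIcon = UIImage(systemName: "cloud.sun.fill")

// The AppKit equivalent on Big Sur:
// let macIcon = NSImage(systemSymbolName: "cloud.sun.fill",
//                       accessibilityDescription: "Partly cloudy")
```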
You’ll be able to use the Health app to assign shortcuts to the new Wind Down feature that gets you ready to go sleep. Apps can register certain types of behavior to allow suggestions of their shortcuts—so, for example, a meditation app could suggest a shortcut, or a journaling app. There will be a smart Sleep Mode collection in the Shortcuts app.
While you can now share watchfaces via your Apple Watch, iPhone, or website, Nike and Hermes faces won’t work except on those specific types of hardware. —DM
By Dan Moren
June 26, 2020 8:46 AM PT
WWDC sessions have a way of getting more technical throughout the week, so for the laypeople among us, it gets a little tougher to pick out those aspects that the general populace might be interested in. I mean, I know you’re all waiting with bated breath to hear about function calls and implementing error handlers, but let’s not get too excited. Anyway, here are a few quick tidbits I picked up from Thursday’s sessions.
It’s been a long time since I’ve played a game on a PC (or even my Mac), but I was once a decent FPS player back in the era of Quake 3, and I’m totally dating myself here, aren’t I?
Keyboard and mouse input have often been considered the gold standard for gaming1, and with the advent of support for external pointing devices as well as the new Magic Keyboard, Apple has moved quickly to provide APIs for games that want to take advantage of playing via these devices.
The system for using these devices is built on top of the same framework provided for external game controllers, but provides certain features tuned toward keyboards and mice. For example, you can check whether keys on the keyboard are in an up or down state, you can lock the pointer to avoid triggering system events like showing the Dock (and hide the system cursor), and you can even implement support for scroll wheels on mice.
It’s even possible to switch between the usual cursor and keyboard support and game-related input within the game, for situations where, for example, you’re using a keyboard and mouse to play a multiplayer game, but want to allow users to access menus between matches.
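A rough sketch of what this looks like in the GameController framework, using `GCKeyboard`, `GCMouse`, and the new pointer-lock property on iPadOS (the class and property names follow the iOS 14 APIs; the game logic itself is hypothetical):

```swift
import GameController
import UIKit

class GameViewController: UIViewController {
    // Opting into pointer lock hides the system cursor and keeps mouse
    // movement from triggering system gestures mid-game.
    override var prefersPointerLocked: Bool { true }

    override func viewDidLoad() {
        super.viewDidLoad()
        // Raw mouse deltas arrive through a handler on the current mouse.
        GCMouse.current?.mouseInput?.mouseMovedHandler = { _, deltaX, deltaY in
            // aim the camera with deltaX / deltaY
        }
    }

    // Called once per frame by the (hypothetical) game loop.
    func pollInput() {
        if let keyboard = GCKeyboard.coalesced?.keyboardInput {
            let movingForward = keyboard.button(forKeyCode: .keyW)?.isPressed ?? false
            let firing = keyboard.button(forKeyCode: .spacebar)?.isPressed ?? false
            _ = (movingForward, firing)
        }
    }
}
```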
Gamers will no doubt welcome these advancements, just as they did external controller support. They also provide additional accessibility options for those who can’t use, or prefer not to use, the touch screen or an external game controller. Moreover, they reinforce the idea that pointer support isn’t just something Apple threw together, and represent an investment in the longevity of these devices as input methods on both iPadOS and macOS.
Maybe I’m in the minority, but I do actually use Siri fairly often to play a specific song, album, or artist. While Apple had previously added the ability for third-party music apps to hook into the virtual assistant, it’s now made some further enhancements, allowing for both support on more platforms and additional features.
The HomePod will gain a new cloud playback API and integrate the same media intents used on iOS, and Apple TV will also now support third-party media intents, meaning you can use Siri on the remote to play music (and presumably other media) in non-Apple applications.
There’s also a new UI for an Alternatives feature that allows apps to provide other options, in case what starts playing wasn’t exactly what you were looking for. So, for example, if you asked Siri to start playing an album but it plays the wrong song, you could use this to pick a different song from the album.
Sign in with Apple was one of last year’s most exciting announcements, and adoption for the feature has been on the rise. But there were some limitations with the approach, and this year Apple is targeting one specific way to bolster adoption: converting existing user accounts to Sign in with Apple.
Some services do this already, but Apple’s providing a standardized API for doing so, via a few different entry points. One is when you log into a service for which you have a weak credential, like an easily guessable password. You can be prompted to update your account to Sign in with Apple, if the service provides it. Similarly, if the password manager in Settings alerts you that you have a weak password, it can offer the option to convert the account for you. Finally, developers can offer an option within their app for users to choose to convert their account.
Once the conversion process, which is largely transparent, has concluded, the old weak credential is removed, avoiding the risk of duplicating accounts. (I will say it was unclear to me whether that involves changing the account on the service’s back end, as opposed to just changing the credential within iOS/iPadOS. Is Apple trusting third-party services to merely do the right thing, and not hold on to existing account information, or does it enforce this policy somehow?)
These days, when I have the option to use Sign in with Apple when creating an account, I often take it, but there are so many services on which I already have accounts that I’m looking forward to this option being more widely available.
- I’ve been playing a lot of Sea of Thieves on my Xbox recently, and since the game has cross-play with the PC, one of the multiplayer options is to prefer sessions with only players using Xbox controllers, so you don’t get consistently and mercilessly slaughtered by keyboard and mouse users. I hear. 😭 ↩
[Dan Moren is the official Dan of Six Colors. You can find him on Twitter at @dmoren or reach him by email at firstname.lastname@example.org. His latest novel, The Aleph Extraction, is out now and available in fine book stores everywhere, so be sure to pick up a copy.]
By Jason Snell
June 25, 2020 4:25 PM PT
If you’ve ever wanted a longform explanation of how Apple built pointer support into iPadOS, and the challenges involved in re-inventing the pointer interface for a device that’s primarily touch oriented, Design for the iPadOS pointer is the session for you.
The Mac’s arrow pointer was designed for pixel-level precision, but of course in most contexts iPad software was designed for fingertip-level precision—in other words, a lot less of it. This is why the iPad’s default pointer is a fingertip-sized circle, because that’s the level of precision that most apps expect.
However, in some contexts, pixel-perfect precision can be just what is required. So the iPadOS designers focused on a pointer with “adaptive precision,” that could switch contexts (and shapes) to become more precise when necessary. The obvious example is in text editing, where the iPad’s beam cursor is extremely precise horizontally (to allow you to select exactly the characters you want, or place that insertion point right in the middle of a word), while being quite imprecise vertically (it snaps to each line of text). In a calendar app, the pointer can adapt again, snapping in 15-minute increments to indicate that by default, the app assumes calendar events don’t begin at odd times.
Precision is also the reason that iPad pointers morph into shapes when they’re selecting individual buttons1. This way, it’s crystal clear which button you’ve currently got selected. If you were using a more precise pointer, you might find yourself right between buttons and not know what would happen if you clicked. The iPad’s approach eliminates this as a possibility.
As you move the pointer, the system is making some guesses about what target you’re trying to reach. The Magnetism feature analyzes the direction in which you pushed the cursor and finds a nearby interface element that you were most likely targeting, and snaps the cursor to it. It’s a subtle thing that makes the system intuit what the user’s intent was, even if their finger swipe across the trackpad wouldn’t normally be quite enough to get there.
The session also picked up on a theme I’ve seen repeated several times this week regarding the iPad. We talk a lot about how the iPad can be used with touch, or an Apple Pencil, or a keyboard, or a pointing device. But this week we’ve been reminded, again and again, that you can also mix and match these input methods, and developers should remember that. Hold down a modifier key and tap with the Apple Pencil or your finger, and the right thing should happen.
After watching the session, I have to be honest: I fully expect Apple to bring an adapted version of the iPad’s approach to pointers to macOS in a future release. Now is not the time, because there is a level of precision assumed by most macOS apps that is way beyond what’s assumed by most iPad apps. But the Mac would absolutely benefit from a more adaptive pointing system than the one it’s currently got, which (let’s be honest) is largely unchanged since 1984. Maybe next year?
- The translucent shape is located behind the button icon, preventing the icon’s color from being distorted by a pointer overlay. A subtle touch. ↩
By Dan Moren
June 25, 2020 6:57 AM PT
The barrage of informative presentations continued on Wednesday, and among the most pressing questions I had while watching them: what’s up with all the tchotchkes? Are they an attempt at giving each session a little personality? Could they be some sort of puzzle to decode? Some people have speculated they aren’t even real, but rather AR objects. The conspiracy theories abound!
Ahem. Anyway, here are a few interesting tidbits from a few of the talks I watched on Wednesday.
We’ve all gotten used to logging in to apps on our iPhones, iPads, and—in some cases—Macs using Touch ID and Face ID. So much so that when we visit a website and it asks us for our password, it feels extremely 20th century.
Well, good news: Apple is rolling out a way for websites to offer authentication via Face ID or Touch ID. What shouldn’t be a surprise is that on the user end the experience is almost identical to logging in to an app. Once a website offers Face ID/Touch ID as an option, you simply agree to use it, authenticate with the biometrics to prove your identity, and the next time you log in, you won’t have to enter your password.
An interesting tidbit I gleaned from this presentation: as part of the process, Apple has created its own attestation service. This optional feature provides an extra step of security in which the device maker can be queried to confirm details about the device requesting authentication. Such a process might be used by higher-security institutions like banks, which could be required to enforce multi-factor verification. So, for example, a site can check with Apple that the iPhone being used for Face ID authentication is really an iPhone and it actually supports Face ID authentication.
In traditional Apple fashion, the company has layered more privacy on top of this service. As WebKit Engineer Jiewen Tan points out, an attestation service could just provide the same certificate every time it’s queried, thus providing an opportunity for cross-site tracking by comparing those certificates. So, instead, Apple issues unique certificates every time, thus anonymizing the device.
Also, one final good tip I was glad to see from this presentation: Apple’s not advising sites to use Face ID and Touch ID as the only method for authentication, given that if you lose your device, you may be out of luck. So the password is probably here to stay for a bit longer.
Apple Pay has obviously become a big part of our lives, and especially in this day and age, contactless payments have become even more popular.
While Apple Pay enhancements weren’t something really touched upon in the keynote, there is at least one very significant improvement this year: the ability to use Apple Pay in both Catalyst apps and native Mac apps.
It actually kind of surprised me to see this brought up, because I hadn’t really thought about Mac apps not having Apple Pay. Heck, I’ve done Apple Pay transactions on my Mac—but they were all via Safari. Going forward, however, any app that’s on the Mac will be able to integrate Apple Pay payments. (That makes sense especially given that upcoming Apple Silicon Macs will run iOS and iPadOS apps, which themselves will be able to take Apple Pay.)
There are a few other improvements coming as well. For anyone who has ever been frustrated by an Apple Pay transaction failing because of an error in an address or phone number: Apple will now impose standardized formatting of contact information, and will validate the data prior to the transaction.
Finally, in order to provide a better onboarding experience for people adding cards, the Wallet app will now support an extension for issuer apps. So, if you have your bank’s app installed, the option to add your card can appear directly in Wallet.
Oh, and perhaps most excitingly, developers can now alter the corner radius of the Apple Pay button, all the way from rectangles to a pill-like shape. Let’s hear it for button configuration!
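That configuration lives on `PKPaymentButton` in the PassKit framework. A tiny sketch, assuming the `cornerRadius` property behaves as described in the session:

```swift
import PassKit

// The corner radius runs from 0 (a plain rectangle) up to half the
// button's height (a pill shape).
let payButton = PKPaymentButton(paymentButtonType: .buy, paymentButtonStyle: .black)
payButton.cornerRadius = 12
```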
I would not have guessed that Game Center would see a substantial overhaul in the upcoming platform updates, but I’m delighted to see that it’s getting a lot of attention.
Chief among the updates, Apple’s rolling out a new Dashboard feature that feels much more like what you might see on a game console. It collects a variety of information into one location, including your profile, your friends, your achievements, and your leaderboards.
To access the dashboard there’s a new, appropriately named Access Point that games can integrate, which shows up as a little picture of your Game Center avatar on the menu screen of a game. (It can also optionally show other highlights, like your achievements or current leaderboard status.)
A big part of this Game Center update is pervasiveness. You can now access your profile in a variety of places, including in-game and even in the App Store, instead of having to dig into a dedicated app or the system-level Settings.
There are also new UIs for real-time and turn-based multiplayer that developers can integrate (or they can roll their own), which let you invite not only friends to games but nearby players as well.
Apple’s also beefed up offerings like achievements, which can now show your progress towards attaining the goal, and leaderboards, which can be configured to be recurring, resetting after a specific time period or when a particular score is reached.
Players also have more control over the visibility of their profiles, choosing whether they’re visible to everyone, only to friends, or to no one at all. And the App Store now surfaces games your friends are playing as suggestions, and lets you access their player profiles directly from the store.
Mostly, though, I just want to play “The Coast”, the demo game the presenters were using. I don’t know if it’s real, but a game about being a lighthouse keeper trying to safely see cargo ships through a treacherous body of water? Sign me up.
All in all, these updates to Game Center seem like a great comeback for a feature that seemed to have been left for dead a few years back. I guess you could call it quite the…achievement.1
- I’ll see myself out. ↩
[Dan Moren is the official Dan of Six Colors. You can find him on Twitter at @dmoren or reach him by email at email@example.com. His latest novel, The Aleph Extraction, is out now and available in fine book stores everywhere, so be sure to pick up a copy.]
By Jason Snell
June 24, 2020 9:44 PM PT
How is it only Wednesday? It turns out that the WWDC time warp even happens when all of us are at our homes. Anyway, I watched a bunch more WWDC sessions today, and here are some observations from today’s binge. (Side note: Did Apple provide hairstylists for all of their presenters? Lucky ducks.)
In iOS and iPadOS 14, Apple is rethinking some of its previous design decisions—and in doing so, it’s introducing interface elements that might seem a bit more familiar to Mac users. This is, at least in part, because a lot of those decisions were made when iPhones were really small—and they’re not anymore.
So Apple’s rolling out drop-down menus that appear next to where you tapped to bring them up (often a round button with three ellipsis dots inside), rather than sliding up a modal list at the bottom of the screen. That way, your finger doesn’t have to move as far to finish the thought and complete an action. These menus behave very much like Mac menus do: if you tap and hold, and slide your finger, then lifting your finger will select an item. If you tap and lift your finger, the menu remains open until you tap on an item within the menu. Tapping outside of the menu dismisses it. The menu items themselves are also a lot more compact than the old slide-up options were, and feature not just text, but icons.
These menus can also be used to ask the user for more specific information. For example, the plus icon in Photos means to add something, obviously—but if you tap on it, you’ll be asked specifically what you want to add, in the form of a menu. When you tap to add an image in Notes, a menu appears so you can choose exactly what kind of image you want to add. They can also be used for navigation: In Safari, tapping and holding the back button will reveal a list of previously browsed pages, and in iOS 14 this uses the new menu design.
One of Apple’s big goals is to reduce the density of elements on the visible part of the app interface by hiding those items—generally actions that must be offered but aren’t important enough to be displayed prominently—in a menu hidden behind one of those white three-dot more buttons.
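On the developer side, these are built with `UIMenu` and `UIAction`, which in iOS 14 can attach directly to a plain button. A small sketch (the specific actions here are hypothetical):

```swift
import UIKit

// Attach a menu to a button; showsMenuAsPrimaryAction makes a single
// tap open it, rather than requiring a long press.
let moreButton = UIButton(type: .system)
moreButton.setImage(UIImage(systemName: "ellipsis.circle"), for: .normal)
moreButton.menu = UIMenu(children: [
    UIAction(title: "Copy", image: UIImage(systemName: "doc.on.doc")) { _ in
        // perform the copy
    },
    UIAction(title: "Delete", image: UIImage(systemName: "trash"),
             attributes: .destructive) { _ in
        // perform the delete
    }
])
moreButton.showsMenuAsPrimaryAction = true
```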
Speaking of design decisions that haven’t worn that well, you won’t have the iOS spinning date and time picker to kick around anymore—or at least not nearly as much as you have up until now. The wheels have been replaced by new pickers that display a calendar with a month’s worth of dates. Tap to select a different month or year, and yes, the wheels will reappear—until you choose the new month and year, at which point the month view will return. Entering a time doesn’t require you to spin your wheels at all—you just type it in.
Finally, rejoice at the sight of the first unified color picker for iOS. You can choose a few different color-picking methods, sample images right from elsewhere on your device’s screen, and save colors to a palette that is consistent across all apps on your device.
Of course, iOS apps will need to be updated to take advantage of these new features. Users should expect to see them begin to appear when iOS 14 ships this fall.
I covered a lot of this interesting session about what Macs will look like when they’re running Apple-designed processors in an earlier piece, but the session was so jam-packed that I focused on the new boot system and left the rest of it on the cutting room floor.
Among the additional benefits of switching to an Apple-designed System on a Chip (SoC) is a unified memory architecture, shared across the CPU and the GPU. This means that graphics resources can be shared without any overhead—there’s no need to copy anything across the PCIe bus, because the CPU and GPU are pulling from the same memory. The SoC also picks up a bunch of other features that have been around on iPad and iPhone for a while, but will be new to the Mac: dedicated video encoding and decoding1, and support for fast machine-learning via the Neural Engine.
One of the biggest changes in the new Mac architecture, though, is asymmetric multiprocessing, or AMP. Mac software developers will need to set a “quality of service” property for the work that they’re dispatching to the processors, suggesting how that work should be prioritized. Does it need to be done as fast as possible, or is it okay to slow it down and keep things power efficient? Modern Apple-designed processors have separate performance-focused and efficiency-focused cores, so different cores will be better for different jobs.
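Developers already express this through Grand Central Dispatch’s quality-of-service classes; what changes on Apple silicon is that these hints can map onto physically different cores. A minimal sketch:

```swift
import Dispatch

// On an asymmetric chip, the QoS class is the hint the scheduler can use
// to steer work toward performance cores or efficiency cores.
DispatchQueue.global(qos: .userInteractive).async {
    // latency-sensitive work: keep the UI responsive
}
DispatchQueue.global(qos: .utility).async {
    // long-running, power-friendly work: syncing, indexing
}
```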
The session provided a bit more detail about how Rosetta, the technology that translates code meant for Intel processors into instructions that Apple’s processors will understand, works. When you download an Intel-only app from the Mac App Store or install it using Apple’s Installer utility, Rosetta will automatically be triggered and will do the work up front of translating the app’s code. If the app gets on your system by a different means, the translation happens when you first launch the app—which means it’ll launch slowly the first time. (Also, operating-system updates can affect Rosetta, so Rosetta’s translations will be refreshed when the operating system is updated.)
And just as Macs with T2 processors have all had always-on encryption of their disks, so too will Macs with Apple-designed processors. But there is one added security bonus: secure hibernation. When one of these Macs goes into a deep sleep, all the contents of memory—not just disk—are protected.
Some apps are nosy. We all know it. No matter what App Store rules exist, no matter how many scandals emerge from apps abusing user data, there are still places where your personal information can leak out, and an unscrupulous app can do something with it without your knowledge.
Apple knows it, too, and it keeps tightening the screws where it can. This year it's making a big move when it comes to access to your photo library. The new Photos picker interface is meant to be used by most apps that need you to pick a photo or three from your library for use within the app. It runs in an entirely separate process, and the requesting app can't see anything about your photo library. While you're in the photo picker, you can select multiple photos and even search your library. When you've selected what you want, those items are passed to the app—and nothing more. A sneaky app can't even take a screenshot of the contents of that picker and use it later. Sneaky.
If an app really does need access to the photo library, there's a new set of permissions for that. Apple's introducing a new "limited mode." When an app asks for permission to read the photo library, you can choose full access or limited access—where you pick the photos you want to share, and that's it. Those photos are all the app can see, until you go to the privacy settings and make a change.
Or as the presenter in one of these sessions put it to developers, “Consider if your app even needs library access.” The fact is, most apps don’t. The new photos picker is functional enough to do the job—and keep an app from being able to snoop on every single item in your library.
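For developers, the new picker is `PHPickerViewController` in the PhotosUI framework, which runs out of process and needs no photo-library permission at all. A hedged sketch of typical usage, assuming iOS 14 (the `PickerHost` class and the print statement are mine, for illustration):

```swift
import PhotosUI
import UIKit

// Sketch: presenting the out-of-process photo picker. No permission
// prompt appears -- the app receives only the items the user selects.
final class PickerHost: UIViewController, PHPickerViewControllerDelegate {
    func presentPicker() {
        var config = PHPickerConfiguration()
        config.selectionLimit = 3   // let the user pick up to three items
        config.filter = .images     // images only, no videos
        let picker = PHPickerViewController(configuration: config)
        picker.delegate = self
        present(picker, animated: true)
    }

    func picker(_ picker: PHPickerViewController,
                didFinishPicking results: [PHPickerResult]) {
        picker.dismiss(animated: true)
        for result in results {
            result.itemProvider.loadObject(ofClass: UIImage.self) { image, _ in
                guard let image = image as? UIImage else { return }
                // Use the selected image; the rest of the library
                // remains invisible to the app.
                print("received image of size \(image.size)")
            }
        }
    }
}
```

Because the picker lives in its own process, even a compromised app can't enumerate the library—it only ever sees what comes back in `didFinishPicking`.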
It’s only a matter of time before our phones replace car keys. Given the pace of automotive innovation and the rate at which I replace my cars, it will probably be a while for me—but I’ll be happy when it finally happens.
With its announcement of Car Keys, Apple is now on the path that leads to a parking lot where you can unlock your car with an iPhone. This WWDC session was directed at auto manufacturers, but I found it pretty interesting for the details it revealed about how Car Key works.
First off, it’s meant to be agnostic about the underlying radio technology. The first cars to use it, as announced, will use the same NFC technology you find in Apple Pay. That requires you to get very close to the reader—essentially, you’ll need to tap your phone on the door to open it. The NFC implementation actually requires two separate NFC readers—one to unlock the door, and one to start the car. The car will only start when a phone containing the Car Key is lying on the NFC reader in the dash.
Things get more interesting with the second wave of Car Key tech, which uses Ultra Wideband and the U1 chip introduced in the iPhone 11. Ultra Wideband will be the best possible solution for using your phone as a car key, because its precise positioning capability and solid range will allow you to use it with your iPhone in a bag or pocket. The current Car Connectivity Consortium standard, version 2.0, covers NFC—which is why it’ll be first. But version 3.0 is on the horizon, and it’s the one that brings in the extra flexibility of Ultra Wideband.
Behind the scenes, Car Key works through a complex series of cryptographic transactions that authenticate your ownership of the car, allow the iPhone to securely pair with it, and then allow you, as the owner, to distribute keys to other people. The initial setup requires Internet connectivity to the back-end systems of the car dealer and to Apple, but once a key is set up on your phone, no connectivity is required and Apple has no awareness of how you use your key—it’s all stored in the iPhone’s Secure Element, locked up tight. (It even works if your phone runs out of battery, thanks to the trace amount of power the battery holds back for Power Reserve mode.)
To share a key with a friend, you use Messages to send them a pass. From the perspective of the people making the key exchange, it’s a simple transaction. Behind the scenes, however, the two phones are doing a careful cryptographic dance that ends up with the friend’s phone having both a key and an “attestation”—basically a signed document that indicates the owner of the car vouches for them as a valid user of the car. If you lose your phone and put it in “lost mode”, the keys are suspended temporarily, and you can revoke a key from your iPhone or from the car’s own interface.
Depending on the implementation by the car maker, keys can potentially be limited. For example, you could let your kids drive the car, but not exceed a certain speed. But that’s all on a per-car basis, and many cars probably won’t provide that level of granularity.
Car Keys are stored in the Wallet app, and since they’re part of Apple’s enhanced contactless protocol (the same one used on some transit systems), you don’t need to authenticate to use them. Tap, and it works. (This also means that while your automaker may want you to download their app, it won’t actually be necessary for Car Key—all the important data is in Wallet.)
This WWDC is turning into a cycle of sweeping out old design notions that Apple now regrets. For Apple Watch, it’s the entire concept of Force Touch—pressing down hard on the screen to bring up a contextual menu. Apple has decided the gesture is too hidden to be of much use—and presumably it has also engineered the next Apple Watch to eliminate the hardware in order to save space, just as it did with the iPhone 11 series.
So it’s out with Force Touch and in with more hierarchical navigation elements at the top of the screen, buttons at the top and bottom of menus, and swipe actions (where you swipe on an element to reveal a delete button, for example)—common on the iPhone and iPad, less so up until now on the Apple Watch. You’ll also see more floating buttons, indicating that you can tap to see more options.
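In SwiftUI terms, the replacement pattern looks something like this hypothetical sketch—options live in visible rows and swipe actions rather than behind a hidden press (the reminder strings are invented):

```swift
import SwiftUI

// Sketch: post-Force Touch watchOS navigation. Deleting an item is a
// visible swipe action on the row, not a hidden hard-press menu.
struct RemindersList: View {
    @State private var items = ["Feed cat", "Water plants", "Charge watch"]

    var body: some View {
        List {
            ForEach(items, id: \.self) { item in
                Text(item)
            }
            .onDelete { offsets in      // swipe reveals a Delete button
                items.remove(atOffsets: offsets)
            }
        }
    }
}
```

Every affordance here is discoverable on screen, which is exactly the principle Apple says is replacing Force Touch.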
It seems like after five years, Apple is ready to throw a bunch of Apple Watch interface assumptions in the bin and double down on the ones that actually work.
1. I believe the iMac Pro and Mac Pro already use this feature via their Apple-designed T2 chips, because Intel’s Xeon chips lack some built-in video encode/decode features.