By Six Colors Staff
June 26, 2020 2:10 PM PT
WWDC 2020 Friday: Session Impressions
And so WWDC comes to a close for another year. We have to admit: We’ve probably watched more session videos this year than all of the prior years put together. The virtual format has been a real change, but a lot of the way the event has adapted is to the positive, including the ability to make all of this information easily accessible to anybody who’s interested. So, with that said, let’s wrap up a few last videos.
I love AutoFill—it’s one of my favorite features. On the Mac, I don’t mind typing things, but on the iPhone, having forms automatically filled out for you can be a huge time saver.
But, as it turns out, there are other benefits too. For example, privacy. As keyboard engineer Zeheng Chen points out, when you have an app where you want to send something to one of your contacts, you might choose to use a contact picker UI instead of granting access to your contacts, in order to minimize the amount of information that the app can see. But a contact picker UI might be slower than autofilling a contact address as you start to type it, and the autofill option still prevents the app from getting any data but what you type into the field. Plus, developers don’t have to create a user interface, since it’s already built-in.
Starting in macOS Big Sur, AppKit—the native APIs for building Mac apps—will now have access to AutoFill. And third-party password management apps will be available as sources for AutoFill, as they have been on iOS and iPadOS, making those programs even more useful.
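For developers, opting into AutoFill is mostly a matter of tagging fields with a content type so the system knows what belongs in them. Here's a rough Swift sketch; the macOS-side API names are worth double-checking against Apple's Big Sur documentation:

```swift
import AppKit

// AppKit sketch: NSTextField gains a `contentType` property in Big Sur,
// mirroring UIKit's longstanding `textContentType`.
let usernameField = NSTextField()
usernameField.contentType = .username  // lets the system offer AutoFill here

// The iOS equivalent, for comparison:
// let emailField = UITextField()
// emailField.textContentType = .emailAddress
```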
Me, I’m all for less typing! —Dan Moren
Apple’s Shortcuts utility started its life as a third-party app called Workflow that offered an Apple Watch app. It’s taken a while, but Apple Watch support has now arrived in Shortcuts. Not only is there a Shortcuts app on the watch, but you can launch specific shortcuts directly from watch-face complication slots. (To designate a specific shortcut for sync with the Apple Watch, you mark it as such within the Shortcuts app on your iPhone.)
Depending on how developers implement their Shortcuts support, shortcuts may run specifically on the Apple Watch without ever going back to the iPhone. But to do this, the parent app must have its associated Apple Watch app installed, too. While this is the ideal experience, there’s also the capability for a shortcut on Apple Watch to phone home to the iPhone and run the necessary automation there. It’ll just be a lot slower. —Jason Snell
Also starting in Big Sur, Mac apps will for the first time get access to ReplayKit, Apple’s API for recording, capturing, and broadcasting content from within your app. So if you’ve got, say, a Mac game, you can automatically build in features to not only do screen recordings, but also to broadcast your gameplay—and even provide an in-app editor or add overlays and other effects.
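The core of the recording side of ReplayKit is small. A minimal sketch, written against the iOS API that the session says is coming to native Mac apps:

```swift
import ReplayKit
import UIKit

let recorder = RPScreenRecorder.shared()

func beginRecording() {
    // Recording can be unavailable (e.g. mirrored displays), so check first.
    guard recorder.isAvailable else { return }
    recorder.startRecording { error in
        if let error = error {
            print("Couldn't start recording: \(error.localizedDescription)")
        }
    }
}

func endRecording(presentingFrom viewController: UIViewController) {
    recorder.stopRecording { previewController, _ in
        // RPPreviewViewController is the system's built-in trim/share editor.
        if let previewController = previewController {
            viewController.present(previewController, animated: true)
        }
    }
}
```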
ReplayKit has been available on iOS and iPadOS for a couple years, but it’s a great addition to the Mac. Game streaming has become more and more common, and in our current world environment, there are lots of other instances in which being able to record and broadcast your app may be useful, such as during meetings.
In terms of gaming, Apple has also added support for triggering a screen recording via a button on an external PlayStation or Xbox controller, which is handy for when you want to record something that just happened without having to switch to a different input device. (On their respective game consoles, this functionality is generally accessed via controllers.)
I’m curious to see what the limits of ReplayKit are on the Mac. I’ve started spending more time streaming video for various types of entertainment, such as the shows we do at Total Party Kill and our occasional Jackbox games at The Incomparable. I’m not sure how applicable these features will be, but it certainly seems like it could simplify matters. —DM
Widgets are the big story in iOS 14, and this session details the tools developers can use to design great widgets for their apps. It focuses primarily on the decisions Apple made in creating its own widgets for iOS 14.
Widgets can adjust their display based on context. For example, the Weather widget might normally show an extended forecast, but if there’s precipitation in the area it can shift to a precipitation forecast, showing you when the rain’s expected to start or stop. The Maps widget has spatial awareness, noticing when you’re not home and offering up the travel time to get back there.
Editing widgets is adorable, and takes a page out of the old Mac Dashboard manual: You tap and hold on a widget while in jiggle mode, choose Edit, and then the whole thing flips over to present a settings interface that’s stored on the reverse side of the widget.
Users can add multiple copies of a single widget, too—for example, you can create Weather widgets for different locations and display them side by side—or create a Widget Stack and then flip between them. —JS
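On the developer side, a widget is a SwiftUI view fed by a timeline of entries. A minimal sketch of the iOS 14 WidgetKit shape, with all of the names invented for the example:

```swift
import WidgetKit
import SwiftUI

struct SimpleEntry: TimelineEntry {
    let date: Date
    let message: String
}

struct SimpleProvider: TimelineProvider {
    func placeholder(in context: Context) -> SimpleEntry {
        SimpleEntry(date: Date(), message: "Hello")
    }
    func getSnapshot(in context: Context, completion: @escaping (SimpleEntry) -> Void) {
        completion(placeholder(in: context))
    }
    func getTimeline(in context: Context, completion: @escaping (Timeline<SimpleEntry>) -> Void) {
        // Widgets don't run continuously; they hand the system a batch of
        // timestamped entries and a policy for when to ask for more.
        let entry = SimpleEntry(date: Date(), message: "Hello")
        completion(Timeline(entries: [entry], policy: .atEnd))
    }
}

struct SimpleWidget: Widget {
    var body: some WidgetConfiguration {
        StaticConfiguration(kind: "SimpleWidget", provider: SimpleProvider()) { entry in
            Text(entry.message)
        }
        .configurationDisplayName("Simple")
        .supportedFamilies([.systemSmall, .systemMedium])
    }
}
```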
Last year Apple introduced SF Symbols, a library of more than 1500 icons meant to be used to unify the iconography of apps running on iOS and iPadOS. Developers can use SF Symbols to ensure that their toolbars and menus feel very much like they’re part of a unified system design.
This year, Apple has added more than 750 new symbols to the library, and the entire SF Symbols collection is also available on macOS Big Sur, bringing the visual design of all of Apple’s platforms closer together. There are also more localized symbols, so the iconography an app uses can shift based on what country or language preferences a user has.
SF Symbols also has support for multicolor variants of its symbols, for cases where a monochrome appearance isn’t ideal. For example, weather-related symbols showing a shining sun could be displayed in a weather app with the sun colored yellow. —JS
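Loading a symbol takes one line, and in iOS 14 the multicolor variant is an opt-in rendering mode. A brief sketch, with the SwiftUI spelling shown in a comment:

```swift
import UIKit

// Monochrome symbol, tinted like any template image:
let mono = UIImage(systemName: "cloud.sun.fill")

// A symbol configuration adjusts size and weight without swapping assets:
let config = UIImage.SymbolConfiguration(pointSize: 24, weight: .semibold)
let big = UIImage(systemName: "cloud.sun.fill", withConfiguration: config)

// In SwiftUI, the multicolor variant (yellow sun, gray cloud) is opted
// into with the original rendering mode:
// Image(systemName: "cloud.sun.fill").renderingMode(.original)
```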
You’ll be able to use the Health app to assign shortcuts to the new Wind Down feature that gets you ready to go to sleep. Apps can register certain types of behavior to allow suggestions of their shortcuts—so, for example, a meditation app or a journaling app could suggest one of its shortcuts. There will be a smart Sleep Mode collection in the Shortcuts app.
While you can now share watch faces via your Apple Watch, iPhone, or a website, Nike and Hermès faces won’t work except on those specific types of hardware. —DM
By Jason Snell
June 25, 2020 4:25 PM PT
WWDC 2020 Thursday: Giving a few pointers
If you’ve ever wanted a longform explanation of how Apple built pointer support into iPadOS, and the challenges involved in re-inventing the pointer interface for a device that’s primarily touch oriented, Design for the iPadOS pointer is the session for you.
The Mac’s arrow pointer was designed for pixel-level precision, but of course in most contexts iPad software was designed for fingertip-level precision—in other words, a lot less of it. This is why the iPad’s default pointer is a fingertip-sized circle, because that’s the level of precision that most apps expect.
However, in some contexts, pixel-perfect precision can be just what is required. So the iPadOS designers focused on a pointer with “adaptive precision,” that could switch contexts (and shapes) to become more precise when necessary. The obvious example is in text editing, where the iPad’s beam cursor is extremely precise horizontally (to allow you to select exactly the characters you want, or place that insertion point right in the middle of a word), while being quite imprecise vertically (it snaps to each line of text). In a calendar app, the pointer can adapt again, snapping in 15-minute increments to indicate that by default, the app assumes calendar events don’t begin at odd times.
Precision is also the reason that iPad pointers morph into shapes when they’re selecting individual buttons1. This way, it’s crystal clear which button you’ve currently got selected. If you were using a more precise pointer, you might find yourself right between buttons and not know what would happen if you clicked. The iPad’s approach eliminates this as a possibility.
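For developers, opting a control into this morph-into-the-button behavior is mostly a one-liner, with an optional closure for customizing the effect. A sketch against the iPadOS pointer APIs (introduced in iOS 13.4); the provider closure shown here follows Apple's sample-code pattern and is worth verifying against the docs:

```swift
import UIKit

let button = UIButton(type: .system)
button.isPointerInteractionEnabled = true  // default hover/morph behavior

// Optionally, choose the effect and shape yourself. `.highlight` is the
// pointer-becomes-the-button effect described above.
button.pointerStyleProvider = { button, proposedEffect, proposedShape in
    UIPointerStyle(effect: .highlight(proposedEffect.preview), shape: proposedShape)
}
```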
As you move the pointer, the system is making some guesses about what target you’re trying to reach. The Magnetism feature analyzes the direction in which you pushed the cursor and finds a nearby interface element that you were most likely targeting, and snaps the cursor to it. It’s a subtle thing that makes the system intuit what the user’s intent was, even if their finger swipe across the trackpad wouldn’t normally be quite enough to get there.
The session also picked up on a theme I’ve seen repeated several times this week regarding the iPad. We talk a lot about how the iPad can be used with touch, or an Apple Pencil, or a keyboard, or a pointing device. But this week we’ve been reminded, again and again, that you can also mix and match these input methods, and developers should remember that. Hold down a modifier key and tap with the Apple Pencil or your finger, and the right thing should happen.
After watching the session, I have to be honest: I fully expect Apple to bring an adapted version of the iPad’s approach to pointers to macOS in a future release. Now is not the time, because there is a level of precision assumed by most macOS apps that is way beyond what’s assumed by most iPad apps. But the Mac would absolutely benefit by a more adaptive pointing system than the one it’s currently got, which (let’s be honest) is largely unchanged since 1984. Maybe next year?
- The translucent shape is located behind the button icon, preventing the icon’s color from being distorted by a pointer overlay. A subtle touch. ↩
By Dan Moren
June 25, 2020 6:57 AM PT
WWDC 2020 Wednesday: Session Impressions, Part 2
The barrage of informative presentations continued on Wednesday, and among the most pressing questions I had while watching them…what’s up with all the tchotchkes? Are they an attempt at giving each session a little personality? Could they be some sort of puzzle to decode? Some people have speculated they aren’t even real, but rather AR objects. The conspiracy theories abound!
Ahem. Anyway, here are a few interesting tidbits from a few of the talks I watched on Wednesday.
We’ve all gotten used to logging in to apps on our iPhones, iPads, and—in some cases—Macs using Touch ID and Face ID. So much so that when we visit a website and it asks us for our password, it feels extremely 20th century.
Well, good news: Apple is rolling out a way for websites to offer authentication via Face ID or Touch ID. What shouldn’t be a surprise is that on the user end the experience is almost identical to logging in to an app. Once a website offers Face ID/Touch ID as an option, you simply agree to use it, authenticate with the biometrics to prove your identity, and the next time you log in, you won’t have to enter your password.
An interesting tidbit I gleaned from this presentation: as part of the process, Apple has created its own attestation service. This optional feature provides an extra step of security in which the device maker can be queried to confirm details about the device requesting authentication. Such a process might be used by higher-security institutions like banks, which could be required to enforce multi-factor verification. So, for example, a site can check with Apple that the iPhone being used for Face ID authentication is really an iPhone and it actually supports Face ID authentication.
In traditional Apple fashion, the company has layered more privacy on top of this service. As WebKit Engineer Jiewen Tan points out, an attestation service could just provide the same certificate every time it’s queried, thus providing an opportunity for cross-site tracking by comparing those certificates. So, instead, Apple issues unique certificates every time, thus anonymizing the device.
Also, one final good tip I was glad to see from this presentation: Apple’s not advising sites to use Face ID and Touch ID as the only method for authentication, given that if you lose your device, you may be out of luck. So the password is probably here to stay for a bit longer.
Apple Pay has obviously become a big part of our lives, and especially in this day and age, contactless payments have become even more popular.
While Apple Pay enhancements weren’t really touched upon in the keynote, there is at least one very significant improvement this year: the ability to use Apple Pay in both Catalyst apps and native Mac apps.
It actually kind of surprised me to see this brought up, because I hadn’t really thought about Mac apps not having Apple Pay. Heck, I’ve done Apple Pay transactions on my Mac—but they were all via Safari. Going forward, however, any app that’s on the Mac will be able to integrate Apple Pay payments. (That makes sense especially given that upcoming Apple Silicon Macs will run iOS and iPadOS apps, which themselves will be able to take Apple Pay.)
There are a few other improvements coming as well. If you’ve ever been frustrated by an Apple Pay transaction failing because of an error in your address or phone number, good news: Apple will now impose standardized formatting of contact information, and will validate the data prior to the transaction.
Finally, in order to provide a better onboarding experience for people adding cards, the Wallet app will now support an extension for issuer apps. So, if you have your bank’s app installed, the option to add your card can appear directly in Wallet.
Oh, and perhaps most excitingly, developers can now alter the corner radius of the Apple Pay button, all the way from rectangles to a pill-like shape. Let’s hear it for button configuration!
I would not have guessed that Game Center would see a substantial overhaul in the upcoming platform updates, but I’m delighted to see that it’s getting a lot of attention.
Chief among the updates, Apple’s rolling out a new Dashboard feature that feels much more like what you might see on a game console. It collects a variety of information into one location, including your profile, your friends, your achievements, and your leaderboards.
To access the dashboard, there’s a new, appropriately named Access Point that games can integrate, which shows up as a little picture of your Game Center avatar on the menu screen of a game. (It can also optionally show other highlights, like your achievements or current leaderboard status.)
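Turning the Access Point on appears to take only a few lines of GameKit. A sketch, typically run from a game's menu screen:

```swift
import GameKit

// Show the Game Center avatar button in a corner of the screen.
GKAccessPoint.shared.location = .topLeading
GKAccessPoint.shared.showHighlights = true  // rotate in achievements, etc.
GKAccessPoint.shared.isActive = true        // hide it again during gameplay
```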
A big part of this Game Center update is pervasiveness. You can now access your profile in a variety of places, including in-game and even in the App Store, instead of having to dig into a dedicated app or the system-level Settings.
There are also new UIs for real-time and turn-based multiplayer that developers can integrate (or they can roll their own), which let you add not only friends to games but nearby players as well.
Apple’s also beefed up offerings like achievements, which can now show your progress towards attaining the goal, and leaderboards, which can be configured to be recurring, resetting after a specific time period or when a particular score is reached.
Players also have more control over the visibility of their profiles, whether they’re available to everyone, just friends, or just private. And the App Store now shows you what games your friends are playing as suggestions, as well as letting you directly access their player profiles from the store.
Mostly, though, I just want to play “The Coast”, the demo game the presenters were using. I don’t know if it’s real, but a game about being a lighthouse keeper trying to safely see cargo ships through a treacherous body of water? Sign me up.
All in all, these updates to Game Center seem like a great comeback for a feature that seemed to have been left for dead a few years back. I guess you could call it quite the…achievement.1
- I’ll see myself out. ↩
[Dan Moren is the official Dan of Six Colors. You can find him on Twitter at @dmoren or reach him by email at firstname.lastname@example.org. His latest novel, The Aleph Extraction, is out now and available in fine book stores everywhere, so be sure to pick up a copy.]
By Jason Snell
June 24, 2020 9:44 PM PT
WWDC 2020 Wednesday: Session Impressions
How is it only Wednesday? It turns out that the WWDC time warp even happens when all of us are at our homes. Anyway, I watched a bunch more WWDC sessions today, and here are some observations from today’s binge. (Side note: Did Apple provide hairstylists for all of their presenters? Lucky ducks.)
In iOS and iPadOS 14, Apple is rethinking some of its previous design decisions—and in doing so, it’s introducing interface elements that might seem a bit more familiar to Mac users. This is, at least in part, because a lot of those decisions were made when iPhones were really small—and they’re not anymore.
So Apple’s rolling out drop-down menus that appear next to where you tapped to bring them up (often a round button with three ellipsis dots inside), rather than sliding up a modal list at the bottom of the screen. That way, your finger doesn’t have to move as far to finish the thought and complete an action. These menus behave very much like Mac menus do: if you tap and hold, and slide your finger, then lifting your finger will select an item. If you tap and lift your finger, the menu remains open until you tap on an item within the menu. Tapping outside of the menu dismisses it. The menu items themselves are also a lot more compact than the old slide-up options were, and feature not just text, but icons.
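In iOS 14's UIKit, these pull-down menus attach directly to a button. A small sketch, with the menu items invented for the example:

```swift
import UIKit

let addButton = UIButton(type: .system)
addButton.setImage(UIImage(systemName: "plus"), for: .normal)

addButton.menu = UIMenu(title: "", children: [
    UIAction(title: "New Photo", image: UIImage(systemName: "camera")) { _ in
        // take a photo
    },
    UIAction(title: "Choose from Library", image: UIImage(systemName: "photo")) { _ in
        // pick from the library
    },
])

// With this set, a plain tap (not just a long press) opens the menu in place.
addButton.showsMenuAsPrimaryAction = true
```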
These menus can also be used to ask the user for more specific information. For example, the plus icon in Photos means to add something, obviously—but if you tap on it, you’ll be asked specifically what you want to add, in the form of a menu. When you tap to add an image in Notes, a menu appears so you can choose exactly what kind of image you want to add. They can also be used for navigation: In Safari, tapping and holding the back button will reveal a list of previously browsed pages, and in iOS 14 this uses the new menu design.
One of Apple’s big goals is to reduce the density of elements on the visible part of the app interface by hiding those items—generally actions that must be offered but aren’t important enough to be displayed prominently—in a menu hidden behind one of those white three-dot more buttons.
Speaking of design decisions that haven’t worn that well, you won’t have the iOS spinning date and time picker to kick around anymore—or at least not nearly as much as you have up until now. The wheels have been replaced by new pickers that display a calendar with a month worth of dates. Tap to select a different month or year, and yes, the wheels will reappear—until you choose the new month and year, at which point the month view will return. Entering a time doesn’t require you to spin your wheels at all—you just type it in.
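Developers opt into the new calendar-style picker with a single property. A quick sketch:

```swift
import UIKit

let picker = UIDatePicker()
picker.datePickerMode = .dateAndTime
picker.preferredDatePickerStyle = .inline   // full month grid, shown in place
// .compact shows a small tappable label that expands into the calendar;
// .wheels brings back the old spinning wheels if you really want them.
```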
Finally, rejoice at the sight of the first unified color picker for iOS. You can choose a few different color-picking methods, sample images right from elsewhere on your device’s screen, and save colors to a palette that is consistent across all apps on your device.
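Presenting the system color picker is a standard view-controller-plus-delegate affair. A sketch, with the class name invented for the example:

```swift
import UIKit

final class PaletteViewController: UIViewController, UIColorPickerViewControllerDelegate {
    func pickColor() {
        let picker = UIColorPickerViewController()
        picker.supportsAlpha = true
        picker.selectedColor = .systemBlue
        picker.delegate = self
        present(picker, animated: true)
    }

    // Called each time the user settles on a color.
    func colorPickerViewControllerDidSelectColor(_ viewController: UIColorPickerViewController) {
        let chosen = viewController.selectedColor
        _ = chosen  // apply the chosen color
    }
}
```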
Of course, iOS apps will need to be updated to take advantage of these new features. Users should expect to see them begin to appear when iOS 14 ships this fall.
I covered a lot of this interesting session about what Macs will look like when they’re running Apple-designed processors in an earlier piece, but the session was so jam-packed that I focused on the new boot system and left the rest of it on the cutting room floor.
Among the additional benefits of switching to an Apple-designed System on a Chip (SoC) is a unified memory architecture, shared across the CPU and the GPU. This means that graphics resources can be shared without any overhead—there’s no need to copy anything across the PCIe bus, because the CPU and GPU are pulling from the same memory. The SoC also picks up a bunch of other features that have been around on iPad and iPhone for a while, but will be new to the Mac: dedicated video encoding and decoding1, and support for fast machine-learning via the Neural Engine.
One of the biggest changes in the new Mac architecture, though, is asymmetric multiprocessing, or AMP. Mac software developers will need to set a “quality of service” property for the work that they’re dispatching to the processors, suggesting how that work should be prioritized. Does it need to be done as fast as possible, or is it okay to slow it down and keep things power efficient? Modern Apple-designed processors have separate performance-focused and efficiency-focused cores, so different cores will be better for different jobs.
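In code, that hint is the quality-of-service class attached to the work being dispatched. A minimal Swift sketch:

```swift
import Foundation

// Work the user is actively waiting on: schedule it for speed,
// likely on performance cores.
DispatchQueue.global(qos: .userInitiated).async {
    // e.g. render the document the user just opened
}

// Long-running housekeeping: fine to run slowly and efficiently,
// likely on efficiency cores.
DispatchQueue.global(qos: .utility).async {
    // e.g. index files in the background
}
```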
The session provided a bit more detail about how Rosetta, the technology that translates code meant for Intel processors into instructions that Apple’s processors will understand, works. When you download an Intel-only app from the Mac App Store or install it using Apple’s Installer utility, Rosetta will automatically be triggered and will do the work up front of translating the app’s code. If the app gets on your system by a different means, the translation happens when you first launch the app—which means it’ll launch slowly the first time. (Also, operating-system updates can affect Rosetta, so Rosetta’s translations will be refreshed when the operating system is updated.)
And just as Macs with T2 processors have all had always-on encryption of their disks, so too will Macs with Apple-designed processors. But there is one added security bonus: secure hibernation. When one of these Macs goes into a deep sleep, all the contents of memory—not just disk—are protected.
Some apps are nosy. We all know it. No matter what App Store rules exist, no matter how many scandals emerge from apps abusing user data, there are still places where your personal information can leak out and an unscrupulous app can do something with it without us knowing.
Apple knows it, too, and it keeps tightening the screws where it can. This year it’s making a big move when it comes to access to your photo library. The new Photos Picker interface is meant to be used by most apps that need you to pick a photo or three from your library for use within the app. It runs in an entirely separate process, and the requesting app can’t see anything about your photo library. While you’re in the photo picker, you can select multiple photos and even search your library. When you’ve selected what you want, those items are passed to the app—and nothing more. A sneaky app can’t even take a screenshot of the contents of that picker and use it later. Sneaky.
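The picker itself is the new PHPickerViewController, which requires no photo permission at all. A sketch, with the class name invented for the example:

```swift
import PhotosUI

final class ImagePickingViewController: UIViewController, PHPickerViewControllerDelegate {
    func pickPhotos() {
        var config = PHPickerConfiguration()
        config.selectionLimit = 3   // 0 means "no limit"
        config.filter = .images
        let picker = PHPickerViewController(configuration: config)
        picker.delegate = self
        present(picker, animated: true)  // runs out of process
    }

    func picker(_ picker: PHPickerViewController, didFinishPicking results: [PHPickerResult]) {
        picker.dismiss(animated: true)
        for result in results {
            // Only the user's selections ever reach the app.
            result.itemProvider.loadObject(ofClass: UIImage.self) { image, _ in
                _ = image
            }
        }
    }
}
```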
If an app really does need access to the photo library, there’s a new set of permissions for that. Apple’s introducing a new “limited mode.” When an app asks a user for permission to read the photo library, you can choose full access or a limited mode—where you pick the photos you want to share, and that’s it. Those photos are all the app can see, until you go to the security settings and make a change.
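Apps that do request library access now get the possibility of a limited answer back. A sketch of the iOS 14 authorization call:

```swift
import Photos

PHPhotoLibrary.requestAuthorization(for: .readWrite) { status in
    switch status {
    case .authorized:
        break  // full library access
    case .limited:
        break  // only the photos the user hand-picked are visible
    default:
        break  // denied, restricted, or not yet determined
    }
}
```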
Or as the presenter in one of these sessions put it to developers, “Consider if your app even needs library access.” The fact is, most apps don’t. The new photos picker is functional enough to do the job—and keep an app from being able to snoop on every single item in your library.
It’s only a matter of time before our phones replace car keys. Given the pace in automotive innovation and the rate at which I replace my cars, it will probably be a while for me—but I’ll be happy when it finally happens.
With its announcement of Car Keys, Apple is now on the path that leads to a parking lot where you can unlock your car with an iPhone. This WWDC session was directed at auto manufacturers, but I found it pretty interesting in terms of some details of how Car Key works.
First off, it’s meant to be radio-technology agnostic. The first cars to use this tech, as announced, will use the same NFC technology you find in Apple Pay. This requires you to get very close to the reader—essentially, you’ll need to tap your phone on the door to open it. The NFC implementation actually requires two separate NFC readers—one to unlock the door, and one to start the car. The car will only start when a phone containing a Car Key is lying on the NFC reader in the dash.
Things get more interesting with the second wave of Car Key tech, which uses Ultra Wideband and the U1 chip introduced in the iPhone 11. Ultra Wideband will be the best possible solution for using your phone as a car key, because its precise positioning capability and solid range will allow you to use it with your iPhone in a bag or pocket. The current Car Connectivity Consortium standard, version 2.0, covers NFC—which is why it’ll be first. But version 3.0 is on the horizon, and it’s the one that brings in the extra flexibility of Ultra Wideband.
Behind the scenes, Car Key works through a complex series of cryptographic transactions that authenticate your ownership of the car, allow the iPhone to securely pair with your car, and then allow you as the owner to distribute keys to other people. The initial setup requires Internet connectivity to the back-end systems of the car dealer and to Apple, but once a key is set up on your phone, no connectivity is required and Apple has no awareness of how you use your key—it’s all stored in the iPhone’s Secure Element, locked up tight. (And it works even if your phone runs out of battery, thanks to the trace amount of power the battery keeps in Power Reserve mode.)
To share a key with a friend, you use Messages to send them a pass. From the perspective of the people making the key exchange, it’s a simple transaction. Behind the scenes, however, the two phones are doing a careful cryptographic dance that ends up with the friend’s phone having both a key and an “attestation”—basically a signed document that indicates the owner of the car vouches for them as a valid user of the car. If you lose your phone and put it in “lost mode”, the keys are suspended temporarily, and you can revoke a key from your iPhone or from the car’s own interface.
Depending on the implementation by the car maker, keys can potentially be limited. For example, you could let your kids drive the car, but not exceed a certain speed. But that’s all on a per-car basis, and many cars probably won’t provide that level of granularity.
Car Keys are stored in the Wallet app, and since they’re part of Apple’s enhanced contactless protocol (the same one used on some transit systems), you don’t need to authenticate to use them. Tap, and it works. (This also means that while your automaker may want you to download their app, it won’t actually be necessary for Car Key—all the important data is in Wallet.)
This is turning into the year of sweeping out old design notions that Apple now regrets. For Apple Watch, it’s the entire concept of Force Touch—pressing down hard on the watch in order to generate a contextual menu. Apple has decided that it’s too hidden a gesture to be of much use—and presumably it has also engineered the next Apple Watch to eliminate the feature in order to save space, just as it did with the iPhone 11 series.
So it’s out with Force Touch and in with more hierarchical navigation elements at the top of the screen, buttons at the top and bottom of menus, and swipe actions (where you swipe on an element to reveal a delete button, for example)—common on the iPhone and iPad, less so up until now on the Apple Watch. You’ll also see more floating buttons, indicating that you can tap to see more options.
It seems like after five years, Apple is ready to throw a bunch of Apple Watch interface assumptions in the bin and double down on the ones that actually work.
- I believe the iMac Pro and Mac Pro are already using this feature via their Apple-designed T2 chips, because Intel’s Xeon chips lack some built-in video encode/decode features. ↩
By Jason Snell
June 24, 2020 12:59 PM PT
Macs with Apple silicon will get new, refined boot and recovery mode
Doing unusual things at Mac startup has long required remembering keyboard shortcuts. Is it Command-Control-P-R or Command-Option-P-R that zaps the PRAM? Is that still even a thing? Is it Command-S for Recovery Mode—or wait, that’s Single User Mode, it’s Command-R for Recovery mode, Command-T for Target Disk Mode, Option to choose a startup disk.
With the advent of Macs running Apple-designed processors, things will get a whole lot simpler. As described Wednesday in the WWDC session Explore the New System Architecture of Apple Silicon Macs, these new Macs will only require you to remember a single button: Power. (On laptops, that’ll be the Touch ID button. On desktops, presumably it’s the physical power button.)
Holding down that button at startup will bring up an entirely new macOS Recovery options screen. From here you’ll be able to fix a broken Mac boot drive, alter security settings, share your Mac’s disk with another computer, choose a startup disk, and pretty much everything else you used to have to remember keyboard shortcuts to do.
Now that Apple is holding all the cards, the company has built a new boot process, based on iOS’s existing secure boot process, but modified to support those features that Mac users expect, such as different macOS boot drives, multiple versions of the operating system, and macOS Recovery itself.
On these new Macs, Target Disk Mode will be retired in favor of Mac Sharing Mode. Rather than turning your Mac into a disk, the new Mac Sharing Mode will turn your Mac into an SMB file server. As with most of the features of Mac Recovery, you’ll need to authenticate yourself before turning on Mac Sharing Mode.
These Macs will also have a little more granularity when it comes to boot security. Each startup volume can be set to a different security mode, either full security (the default) or reduced security. This means you’ll be able to boot from external disks without downgrading your Mac’s overall security.
In reduced security mode, you can boot any supported version of macOS, even if Apple’s no longer signing it. And if an app or accessory you rely on uses a third-party kernel extension to enable functionality, you’ll need to use this mode.
For a while now, Macs have been able to recover from disasters by booting to a hidden recovery partition. When even that partition is gone, Intel Macs fall back to Internet Recovery. Macs with Apple-built processors will instead have access to a different hidden area, System Recovery, which offers a very minimal version of macOS that will allow you to reinstall both macOS Recovery and macOS itself. (If System Recovery is also unavailable, it’ll be time to attach the Mac to another device running the Apple Configurator app to bring it back to life.)
Once the booting is complete, there’s also a new login window that’s more capable, because the system can fully boot before the user even presents their login credentials. It includes built-in support for smartcard authentication and supports VoiceOver as well.
By Dan Moren
June 24, 2020 7:24 AM PT
WWDC 2020 Tuesday: Session Impressions, Part 2
One of the best things about virtual WWDC is that presenters no longer need to pad their talks to fill a certain time slot—nor does the number of sessions need to reflect available physical space. So this year, Apple seems to have provided even more sessions than usual, but many are in bite-sized chunks, and more than a few are ten minutes or under. Here are a few talks that I found interesting on Tuesday of this big week.
What’s New in SiriKit and Shortcuts
The keynote spent some time talking about the refinements to Siri’s UI, and the revamp of the voice assistant ties nicely into some refinements to Shortcuts, which didn’t get much stage time this year.
Namely, Shortcuts gets the same compact UI that Siri now has in iOS 14 and iPadOS 14. Perhaps the best part of the experience is that shortcuts no longer need to open the Shortcuts app to continue their tasks. Any Shortcuts user can tell you that the previous experience of launching the app and stepping through the various actions within it was fine, but not particularly elegant.
Apple’s also increased the capabilities of this new compact UI. So, for example, a disambiguation interface—one asking the user to pick an item from a list—can now provide images and subtitles in addition to the main option. That way you can be sure you’re picking the right item, such as the soup you want.
While we know that Shortcuts gets folders in iOS 14, the app also includes a couple of prebuilt Smart Folders showing which shortcuts are available in the Share Sheet or on the Apple Watch. It doesn’t look like users will necessarily be able to create their own Smart Folders, though—or at least, not yet.
There have also been improvements to the Automation abilities of Shortcuts, including more trigger types: receiving an email or text message, the battery hitting a certain level, connecting to a charger, and closing an app. Plus, more of those trigger types can run automatically, requiring no prompt from the user. And the Gallery section of Shortcuts will offer suggestions for automations in addition to standard shortcuts.
iOS 14’s new Wind Down feature will also provide a lot of opportunities for shortcuts as we learn more about it.
All of these features point towards Shortcuts becoming an increasingly integral part of Apple’s iOS and iPadOS platforms, and these improvements will go a long way toward making it even more approachable for many users.
Meet Safari Web Extensions
Safari is adopting the same WebExtensions API already supported by browsers like Chrome and Firefox. But Apple’s approaching this in an unsurprisingly Apple-like fashion. If you want to distribute a web extension, it’s got to be wrapped in a native Mac application built in Xcode. Installing the app from the App Store will also install the web extension.
The good news: for developers with an existing web extension written for another browser like Chrome or Firefox, Apple provides a command-line tool to convert the extension into an Xcode project. That handles a lot of the work for the developer, creating an essentially barebones wrapper app—little more than a window with a button to open Safari preferences. (For testing, it uses ad-hoc signing, meaning you have to enable unsigned extensions in Safari’s Develop menu.) This is a hugely exciting move: Chrome and other browsers have huge libraries of powerful extensions, and in theory not only will those be open to porting to Safari, but the change could help bolster future development of extensions that work across all browsers.
One thing I started to wonder about during this talk was platform availability. All the demos revolved around Safari web extensions on the Mac, though the talk makes a point of recommending against targeting user agent strings (information a browser reports about its own version and what system it’s running on) in favor of checking for features that might be present. Given that other types of Safari extensions are available on iOS/iPadOS (namely content blockers), might web extensions eventually make their way to Apple’s mobile platforms as well? One can hope.
iPad and iPhone Apps on Apple Silicon Macs
There was perhaps no more surprising announcement during Monday’s keynote than the ability to run iPhone and iPad apps, unmodified, on the upcoming Apple Silicon Macs. In retrospect, given that these new Macs will run on the same processor architecture as Apple’s mobile devices, perhaps that shouldn’t have been a surprise.
But the ease of it is still kind of mind-blowing. iPad and iPhone apps will run on these Macs without even the need for recompiling. To do so, they’ll leverage the same technology that powers Mac Catalyst: the native Mac version of iOS’s UIKit.
The big question has been: if apps from iOS will run on macOS so easily, why even bother converting an app using Mac Catalyst? Well, the latter still offers some benefits. For one thing, since iOS and iPadOS apps will run with no changes, you won’t really be able to customize their behavior on a Mac, meaning they still ultimately feel like iOS apps. For another, these apps only run on Apple Silicon Macs, so if you want to support Intel Macs—and the vast majority of Macs in existence will continue to be Intel Macs for some time yet—then Mac Catalyst is your only option.
But, if your iOS app is compatible with Apple Silicon Macs—which means it doesn’t require frameworks, functionality, or hardware that is missing on the Mac—it will automatically be available on the Mac. (However, developers can still control which of their apps are actually on the store.)
To ensure compatibility with the Mac platform, iOS app developers need to account for certain differences in hardware, UI, and system software. That includes everything from adding hardware keyboard support (which will make the app a better iPadOS citizen as well) to supporting Split View multitasking on iPadOS, which will automatically give your app a resizable window on the Mac. (iPhone apps, however, are not resizable when run on the Mac.) It also means using the right APIs for accessing things like the file system, relying on features like Core Location (rather than specifically GPS, which Macs obviously don’t have), and not hardcoding the types of devices you expect the app to run on—say, by checking whether the device is an “iPad” or an “iPod touch.”
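That last point is the key habit: branch on capabilities rather than device names. Here’s a minimal pure-Swift sketch of the idea—the types and functions below are illustrative stand-ins, not Apple API:

```swift
// Illustrative sketch only: `Capabilities`, `isSupportedDevice`, and
// `locationStrategy` are made-up names, not Apple's APIs.
struct Capabilities {
    let hasGPS: Bool
}

// The anti-pattern the session warns against: hardcoding device identity
// breaks as soon as the binary runs somewhere new, such as a Mac.
func isSupportedDevice(model: String) -> Bool {
    model == "iPad" || model == "iPod touch"   // fails for "Mac", future devices
}

// Better: ask for the capability you actually need. Core Location can
// answer "where am I?" from GPS, Wi-Fi, or other sources, so code that
// needs location shouldn't insist on a GPS chip being present.
func locationStrategy(for caps: Capabilities) -> String {
    caps.hasGPS ? "gps" : "best-available"
}
```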
When it comes to distribution, making an iOS app available on the Mac is basically exactly the same as doing so on iOS, right down to the app thinning methods that deliver only the code needed for that platform. “In fact,” says Patrick Heynen of the Frameworks team, “for app thinning, a new Mac looks just like any other very capable iPad.”
The only real difference is that the Mac doesn’t support TestFlight for testing distribution…though I have to imagine that it might be on its way at some point.
Meet WidgetKit
There is something very important that Nahir Khan, a manager on the iOS System Experience team, needs you to know about widgets: they are not mini-apps. That means no controls or buttons or sliders. (Sorry, PCalc.)
Widgets have a very specific use case, defined by three characteristics: glanceable, relevant, and personalized. If you think that sounds familiar, you’re right—they’re the same qualities emphasized by complications on the Apple Watch, and WidgetKit even takes some of its cues from that system. And, as on the Watch, users want their information to be up-to-date and available: Khan notes that people visit their iPhone home screens more than 90 times a day, and seeing a bunch of loading spinners every time you go to your home screen is definitely not a great experience.
Hence, the widget team has leveraged a similar technology to Watch complications, building widgets using timelines. Apps can preload the widget with the information it should display at certain times. For example, if your calendar knows you have events at 9am, 9:30am, and 10am, it can give all that information to the widget, which automatically knows what information to display at the current time. If the information in the app changes—whether because it gets a background notification updating it, or a user makes a change—it can tell the widget to reload that information, so it remains up to date.
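The calendar example can be modeled in a few lines. This is a conceptual sketch of the timeline mechanism only—the real WidgetKit API uses types like `TimelineProvider` and `TimelineEntry`; the names below are illustrative:

```swift
import Foundation

// Conceptual sketch of WidgetKit-style timelines; `Entry` and
// `currentEntry` are illustrative names, not the framework's API.
struct Entry {
    let date: Date
    let text: String
}

// The app hands over a batch of dated entries up front; when the widget
// is redrawn, the system shows the latest entry whose date has passed --
// no app launch, no loading spinner.
func currentEntry(in timeline: [Entry], at now: Date) -> Entry? {
    timeline
        .filter { $0.date <= now }       // only entries that have "arrived"
        .max { $0.date < $1.date }       // most recent of those
}
```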
What’s cool about widgets is that they work across all different devices, are often available in a variety of sizes, and, because they’re built on the same Intents framework used for Siri and Shortcuts, they can handle configuration automatically. For example, when you go to configure the Stocks widget, it can provide a list of the symbols already in the Stocks app—and if you need a stock that isn’t there, the Intents framework lets the user search for the right symbol. Basically, this makes it easier for app developers to add widgets without having to redo all the work they did in their app.
Even though widgets may not be mini-apps, they do support deep-linking into a certain part of an app. For example, tapping on an album in the Music widget can take you to a view of that album in the app. And is it possible that interactivity might be available in the future? I wouldn’t rule it out, though it seems as though Apple is happy to keep these as mainly readable experiences for the time being.
Widgets have, of course, been around for a long time on the Mac, where they used to live in Dashboard; more recently, they’ve provided some more minor functionality in the Today view on Apple’s platforms. But with WidgetKit and the changes in iOS 14, it seems clear that they’re about to be catapulted into prime time with their appearance on the iPhone home screen.
[Dan Moren is the official Dan of Six Colors. You can find him on Twitter at @dmoren or reach him by email at email@example.com. His latest novel, The Aleph Extraction, is out now and available in fine book stores everywhere, so be sure to pick up a copy.]
By Jason Snell
June 23, 2020 5:46 PM PT
WWDC 2020 Tuesday: Session Impressions
We used to have to run from room to room. These days we can sit on the couch and watch whatever WWDC sessions we want. Here are some interesting tidbits gleaned from a collection of sessions that were posted on Tuesday.
While many developers can get their apps built for Apple processors quickly, among the limitations will be precompiled software contained within the app that isn’t built for Apple’s processors. Apps can’t mix and match Rosetta and native code; if developers are using precompiled Intel software, they’ll need to contact the author and ask them to provide a new version compiled for Apple’s processors. Developers running on Macs with Apple-built processors can actually build and run their apps using Rosetta, so they can get some idea of how the apps will run on Intel processors.
More broadly, Apple’s new Macs will be using the same sort of multi-core layout that we currently see on the iPhone and iPad, where there are two different types of processor cores. One set of cores is focused on performance, while the other set is focused on energy efficiency. The system will dynamically choose which cores are used, just as on iPad and iPhone. Apple cautions developers to not be surprised to see their software running mostly on one set of cores or the other—it all depends on the current needs of the system. That will be a major change from current Mac multiprocessing, in which each core is more or less the same, and work is generally spread across all the cores.
I got a chuckle at the fact that, to target code specifically for the Mac, you used to be able to target Intel x86_64, but now you need to be broader and just say “os_osx.” Of course, Apple changed the name to macOS back in 2016, and it’s not even version 10 anymore, but the behind-the-scenes stuff rarely matches the names used in marketing. (Developers will target Macs with Apple silicon by targeting the arm64 architecture—the exact same name as the target would be on iOS.)
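In Swift source, the analogous distinction shows up as conditional-compilation blocks—checking the OS and the architecture separately, rather than assuming “Intel means Mac.” A small sketch (the string values are just for illustration):

```swift
// Sketch of how Swift code targets platform and architecture
// independently, per the session's advice.
#if os(macOS)
let platform = "macOS"
#elseif os(iOS)
let platform = "iOS"
#else
let platform = "other"
#endif

#if arch(arm64)
let architecture = "arm64"    // Apple silicon Macs and iOS devices alike
#elseif arch(x86_64)
let architecture = "x86_64"   // Intel Macs
#else
let architecture = "unknown"
#endif

print("\(platform)/\(architecture)")
```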
If you want to see what kind of apps you’re running on a Mac with Apple silicon, just open the Activity Monitor utility, which will display every running process and list whether it’s running on Apple or Intel architectures. And just as during the transition to Intel processors, you can choose to force an app to run in Rosetta rather than natively by selecting the app, choosing Get Info, and checking the Use Rosetta checkbox.
Last year developers were a little concerned about just how much Apple was going to put into continuing to develop Mac Catalyst, since the company talked so little about it during WWDC. This year, though, Apple has rolled out numerous Mac Catalyst updates—probably aided by the fact that two stock Mac apps, Messages and Maps, are now implemented using Mac Catalyst.
Mac Catalyst now supports dozens of additional iOS frameworks, including some that you wouldn’t expect and that aren’t actually functional on macOS, like ARKit. No, the Mac is probably not gaining augmented reality support anytime soon, but by including ARKit, Apple allows app developers to simply use the normal checks they’d make to see if an iOS device supports ARKit—and then move on when it turns out the current device doesn’t—rather than having to block out that entire segment of code because the Mac doesn’t know what ARKit is.
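The payoff is that the usual runtime check simply answers “no” on the Mac, and the code links and runs everywhere. Here’s a pure-Swift sketch of that pattern with stand-in types—on iOS, the real check would be something like ARKit’s `isSupported` class property:

```swift
// Stand-in for a configuration type's `isSupported` check -- illustrative
// only, not the real ARKit API.
protocol FeatureConfiguration {
    static var isSupported: Bool { get }
}

struct UnsupportedTracking: FeatureConfiguration {
    static var isSupported: Bool { false }   // what a Mac would report
}

// The same compiled code ships everywhere; unsupported hardware just
// takes the fallback branch instead of failing to link against the framework.
func startExperience<C: FeatureConfiguration>(_ config: C.Type) -> String {
    config.isSupported ? "AR session" : "non-AR fallback"
}
```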
This spring, iOS got an update that provided much richer support for handling changes in keyboard input, and that support has now been rolled into Mac Catalyst as well. Catalyst apps now support the Mac’s color picker and proper checkboxes, and (rejoice!) the date picker now uses a proper Mac date picker rather than the iPhone’s spinning wheel. (I believe the iPad also gets this feature, once again showing how much the iPad and Mac have in common these days.)
Mac Catalyst under macOS Big Sur also supports regular Mac pull-down menus, and popovers that are geometrically “real”—in that they can expand outside the bounds of the main app window. (Constraining everything to inside the app window was always a tell that an app came from the iPad.) Three-column views are also supported, as is the use of SF Symbols for iconography (since that icon set is moving to the Mac anyway in macOS Big Sur).
Catalyst apps will also look clearer, with sharper text, because developers can now opt to run them at 100% scale, rather than the scaled-down version that was previously mandatory in order to avoid some interface layout issues.
Photo editing extensions on iPad can now be brought over to the Mac via Catalyst, and will work with Mac apps that support those extensions, most notably Photos. Widgets should also “just work”, appearing in the new Notification Center widget view just as they would on iOS.
macOS Big Sur sure looks different, and Apple’s session on the new design goes through the details. Sidebars are now “full height,” meaning they extend into what used to be the title bar of windows. SF Symbols iconography is everywhere, with icons in sidebars boldly colored. Apps can set the color of those icons themselves, setting the mood—but they can opt to allow users to override that color with their own preferred interface accent color. (They can also opt to ignore the user’s preference, as Apple Mail is doing with its VIP items, which also display as yellow.)
The title bar has basically merged with the toolbar in the new “unified” toolbar style, in which the window title is left-justified and sits inline among the bold SF Symbols toolbar icons. (There’s also room for a subtitle just below the title, which can provide secondary information about the window you’re looking at. Apple Mail uses this subtitle to display how many unread messages you have, for instance.)
There’s also a new unified compact window layout, which is shorter, and is meant to be used when the toolbar is sparse and the focus is on window content. There’s also still support for “expanded” toolbars, which is basically how all Mac windows used to work—big title bar, lots of big buttons, labels under the buttons, the works.
The humble slider control, which used to be a pointed polygonal block sliding along a track with tick marks below it, has been redrawn in Big Sur. It’s now an oval sliding over a track with tick marks on it.
App Clips are new mini-apps that quickly launch when you scan a code, tap an NFC sticker, or tap on a link. The method by which App Clips are launched is actually a special URL that Apple is calling an “App Clip experience.” These URLs are parsed by the system and then handled by an app clip rather than being routed to the web browser, in a fashion similar to how Universal Links work on iOS. Apple is launching its own “App Clip Code” format later this year, which seems to be a product that is both an NFC sticker and a visual code, so you can tap them or scan them to activate an app clip.
As for App Clips themselves, there’s “no magic” here—developers will need to create a second mini application within Xcode, a standalone app within their existing app project that’s dedicated to just handling App Clip experiences. An App Clip can’t exist without a corresponding regular app, and the two have to be submitted to App Store review together.
However, the two apps are actually mutually exclusive. The App Clip only gets downloaded and launched if you don’t have the corresponding app installed—otherwise, the URL just links to that functionality in the existing app. If you use an App Clip and decide you want to download the full app, the App Clip goes away—though it can pass along some of its data to be imported into the full app.
App Clips are meant to be ephemeral, especially if you don’t use them. After a period of inactivity, App Clips and their data will be automatically deleted by the operating system, and their data doesn’t get backed up, either. However, if you use an App Clip regularly, its life will be extended—and if you use it regularly enough, the system may never delete it at all.
Last fall I wrote about Ultra Wideband, the technology inside Apple’s U1 Chip, found in the iPhone 11 series. It provides the ability to precisely locate other devices with Ultra Wideband chips, and up to now Apple has only rolled out a single feature that uses that chip: an arrow pointing at someone else’s phone while using AirDrop.
With iOS 14, Apple’s opening up access to the U1 chip to app developers, enabling potentially interesting spatially aware apps. Apps that want to use this feature have to ask permission, and all participating devices have to agree. Then the apps have access to a stream of distance and direction data for whatever devices they’re tracking—keeping in mind that only devices with the U1 chip are currently supported.
The tracking comes courtesy of a temporary discovery token that’s randomly generated to ensure privacy, and it’s discarded at the end of the session, which is why you can only grant an app one-time permission to track another device.
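That privacy model is easy to sketch in miniature. Everything below is illustrative, not the real framework (which is Nearby Interaction, with types like `NISession` and `NIDiscoveryToken`); the point is just that the token is random per session and dies with it:

```swift
import Foundation

// Illustrative model of a per-session discovery token -- not the real
// Nearby Interaction API.
struct DiscoveryToken: Equatable {
    let id = UUID()   // freshly random each time, so sessions can't be linked
}

final class RangingSession {
    private(set) var token: DiscoveryToken? = DiscoveryToken()

    // Invalidating the session discards the token, which is why an app
    // only ever gets one-time permission to range against a peer device.
    func invalidate() {
        token = nil
    }
}
```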
Some other tidbits from the session: iPhones should be held in portrait orientation for optimal tracking use. (It makes me wonder if the reason the iPad doesn’t have a U1 chip is because of the complexity of that device being used in both portrait and landscape mode far more than the iPhone.) There’s also a “field of view” for device tracking that’s roughly analogous to the field of view of the iPhone 11’s ultrawide camera. If a tracked device is outside that field of view, you’ll still get a distance measurement but won’t be able to get a direction.
Apple’s even built a simulator for developers to test these features. When you simulate two iPhones on screen, they’ll react to each other as you drag them closer to one another or further apart. Presumably developers will want the biggest screen possible in order to test this feature, so get those Pro Display XDR orders in now.
This was a nice session about Apple’s current thinking regarding iPad app design. “Just because the iPad is in the middle, that doesn’t mean it’s just halfway between the Mac and the iPhone,” says presenter Grant Paul of the Apple Design Team. “A great iPad app, you need to be just as dedicated to what’s great and special about the iPad” as the iPhone and the Mac.
For iPadOS 14, there’s been a lot of focus on sidebars—which are very helpful, as Mac users can attest. iPadOS 14 sidebars can contain items that users can reorder, can accept items dragged in as favorites, and the like. Handy. Sidebars can be toggled on or off by tapping the sidebar icon in the top left of the app, and can be brought back temporarily via a swipe from the left edge of the screen.
Another goal for iPadOS 14 design seems to be an increase in density. The Files app, for example, uses smaller icons than in iPadOS 13 in order to dramatically increase the number of items that can be viewed.
Apple’s designers also want to use the iPad’s extra screen space to improve informational context in apps. For example, you can now rename files inline in the Files app, rather than being taken to a modal rename screen. The Calendars menu in the Calendars app, which was previously a popover, now displays in the sidebar, allowing you to view changes in the calendar as you adjust which calendars are visible.
The iPad is a versatile device that can handle all different kinds of input methods, but Apple cautions developers to always start with multitouch. The iPad is, first and foremost, a touch tablet. But its many other input methods can be mixed and matched in interesting ways—for example, holding down the command key while tapping on a link in Safari, holding down shift while tapping on items to select them, or even drawing with an Apple Pencil while using a finger to adjust user interface elements in an animation app.
By Dan Moren
June 23, 2020 4:41 AM PT
WWDC 2020: The little stuff you might have missed
As always, Apple’s WWDC keynote was jam-packed with more jam than a packing plant full of jam. We all saw the big top-line items: Macs switching to ARM, iOS apps running on the Mac, cats and dogs living together. But the sheer loudness of that news drowned out a host of other little tidbits that are interesting and significant in their own right. Here are just a few of the ones that caught my eye.
Tap tap revolution
As a new Accessibility option, iOS 14 allows you to trigger actions by double- or triple-tapping the back of your phone. While this is definitely useful for people for whom the touch screen is unwieldy, it also has surprisingly big implications for all users, in that Apple hasn’t really allowed you to define systemwide actions before.
Federico Viticci reports that you can even use this action to trigger a custom shortcut, which is kind of wild. Given how many Apple features have shown up as Accessibility options before becoming more robust features—hello, cursor support—it makes me wonder if this might be a harbinger of some more in-depth systemwide capabilities in a future version of iOS.
App Store review process, reviewed
Even after last week’s high-profile clash between the developers of email app Hey and Apple’s App Store review process, there was skepticism from many quarters—including yours truly—that Apple would do anything to address the weaknesses in its relationship with developers. And while the company remained mum during its keynote, it did announce that it would be making a couple of high-profile changes.
First, developers can now not only challenge whether or not their app actually violates an App Store guideline, but they can also challenge the guideline itself. Of course, this means absolutely nothing until we see how the process actually works, but the possibility of overturning an App Store guideline is wild. This is like Apple creating a Supreme Court of App Review. (Though let’s not assume pure altruism: I’m sure Apple hopes this will short-circuit the current process of going public, potentially saving it the bad PR of people yelling about these issues on Twitter.)
Second, Apple says it will no longer hold up bug fix updates because of App Store review guidelines. Frankly, this should have always been the case—not only was this a bad policy for developers, but it was actively user-hostile, forcing customers into two bad options: keep using apps with annoying or potentially serious bugs or stop using the app altogether, all because of bureaucratic guidelines.
While these are moves in the right direction, I doubt they will be enough to stave off the government inquiries that are already underway, nor earn back all the goodwill that the company has burned with developers.
The Force will not always be with you
Apple effectively killed 3D Touch last year by replacing it with the haptic touch feature in the iPhone 11 line, but it looks like its old-friend/the-same-exact-feature-with-a-different-name Force Touch is not long for this world either. MacRumors reports that watchOS 7 is shifting developers away from the Force Touch interaction, in favor of exposing those features in other ways.
Gotta say, I’m not broken up about that. Force Touch was clever, but too often it concealed features that were not easily discoverable. Moreover, I’m sure the technology to detect those harder presses took up space in the Apple Watch that could have been used for other things—battery, for example, or just making the Watch slimmer.
My question is, now that Force Touch/3D Touch are gone on the Watch and on future iPhones, will it also quietly be phased out on the Mac? (That is, if anybody even remembers that it’s actually there in the first place…)
Never taken a Shortcut before?
Apple’s Shortcuts automation system got very little time on stage (what little there was mostly tied to the new Wind Down feature), but the app itself did get a number of useful enhancements, including folders, copying and pasting actions within shortcuts, new automation triggers, and running shortcuts on the Apple Watch, including triggering them from complications on the watch face.
Shortcuts got some very cool updates in iOS/iPadOS 14:
– Disable confirmation for automations
– New compact UI for lists, input dialogs, running shortcuts in share sheet
– Automatic categories for share sheet/Watch
– Copy & paste actions (!)
– New automation triggers pic.twitter.com/8j4iZ0pyks
— Federico Viticci (@viticci) June 22, 2020
These changes make Shortcuts even more useful for its most devoted users. It warms the cockles of my heart to see Apple continue to invest in the power and utility of Shortcuts, even if it doesn’t give them much time in front of the keynote audience.
Other feature tidbits
iOS 14 has a very cool new Accessibility feature that can recognize certain sounds, like fire alarms and animals, and then provide notifications to users. This is potentially amazing—and heck, life-saving—for hard-of-hearing folks who might otherwise not catch these sounds.
There’s now an NFC reader button in Control Center.
Game Center got a refresh! Who had that one on the bingo card?!
I’m sure there’s plenty more out there that hasn’t been uncovered, not to mention all the myriad sessions coming this week. But don’t worry: even though things sometimes go slipping through the cracks, these two gumshoes are picking up the slack.
[Dan Moren is the official Dan of Six Colors. You can find him on Twitter at @dmoren or reach him by email at firstname.lastname@example.org. His latest novel, The Aleph Extraction, is out now and available in fine book stores everywhere, so be sure to pick up a copy.]
By Jason Snell
June 22, 2020 8:25 PM PT
Thoughts on WWDC 2020 Day One
Most people attend WWDC remotely every year, watching video sessions in the comfort of their homes and offices. But I’ve been going to WWDC since the late ’90s, so today was… strange. And the carefully orchestrated keynote video was of an even higher density than Apple’s presentations usually are. I’m still getting my bearings, and it’s important to pace oneself for the week to come, but here are a few of my quick observations about the opening day of Apple’s first-ever virtual WWDC.
Bye Intel, hello Apple silicon
Well, here we go. The Mac begins the third processor transition of its life with a move from Intel processors to Apple-designed ARM processors that Apple’s referring to, generically, as “Apple silicon.” (My guess is that this is a necessary placeholder until Apple unveils the specific name for this series of processors when the first ARM Macs ship later this year.)
This transition is likely to be the smoothest of the three. Mac development almost entirely occurs with Apple’s development tools and compilers, so the path will be fairly straightforward for developers. That means that many apps will run natively on Apple’s chips from the start. For those that won’t, there’s Rosetta 2, a code-translation system that translates your apps in advance for the new chips, using native system features whenever possible—so even your old, not-yet-native apps will run, and probably run at decent speeds. Apple has also indicated that these new Macs will follow the same general software policies as current Macs do, so they won’t be restricted to only Mac App Store apps or anything like that.
People who rely on running Windows apps on their Macs, however, will not find a comforting story. Apple made a point of highlighting virtualization features that are built into macOS Big Sur running on Apple Silicon, but these seem to be for virtualizing operating systems built for Apple’s processors, not for emulating an operating system built for a different processor. I would imagine that, eventually, there will be a way to run Windows on ARM Macs—but it may take a while and it may be a slow, frustrating experience when it does arrive.
Now let’s talk about the upside. In a lengthy presentation, Apple’s chip czar Johny Srouji offered a few tantalizing suggestions about why Apple was making this move. Of course, Srouji doesn’t want to steal the thunder of future Mac product announcements, but he does want to impress upon all of us that there are good reasons. I appreciated his line about how the 2018 iPad Pro’s status as a device that was more powerful than most PC laptops “foreshadows how well this architecture scales to the Mac.”
Srouji also made it clear that Apple is designing a series of processors that are purpose built for the Mac, just as it has built custom processors for the iPhone, the iPad, and the Apple Watch. While many developers will test their apps this summer on a Mac mini running the same A12Z Bionic processor that’s in the 2020 iPad Pro, Apple’s shipping ARM Macs will contain processors that are not left over from an iPad, but made specifically for them.
Of course, Apple has been headed in this direction for a while now. “Performance is enough of a reason to change, but it’s just part of the story,” Srouji said. That’s simultaneously a boast that Apple expects its processors to be superior to Intel’s across the board, and a declaration that Apple wants to do more than Intel processors can provide.
Most modern Macs already contain an Apple-designed T2 processor that handles security, video encoding, control of onboard cameras, running the Touch Bar, and disk-controller duties, among other functions. All of it is built around the edges of the Intel processor at the center—but soon, Apple will be able to build the whole package at once.
All of that is to say that while the transition should be smooth, the reasons for it will probably be readily apparent. These new Macs with Apple-built processors will almost certainly offer speed and power efficiency far beyond what the equivalent Intel models currently offer. Apple has judged that the leap is worth the inconvenience.
Rethinking the home screen
The grid of app icons on the iPhone home screen has served Apple well, but it was past time to provide users with more functionality. So with iOS 14, Apple has created a new version of its widget architecture and allowed users to place widgets generated by various apps on the home screen, nestling among the app icons or even living on home screen pages all by themselves.
The design of these widgets looks great, though Apple has pulled support for interactivity, killing some of the more daring widgets like James Thomson’s PCalc calculator. It wasn’t entirely clear from the keynote, but if space is at a premium you can actually stack like-sized widgets together, so multiple widgets can share space; you swipe to move among them. (This is in addition to the Smart Stack feature, which uses Siri to guess which widget you need depending on context.) Fans of high-density, information-packed screens will take advantage of widget stacks.
Then there’s the App Library, which—when combined with the ability to hide as many pages of your home screen as you like—solves the issue of providing access to all of your apps while not forcing you to wade through all of them on many pages of home screen. It’s a carefully balanced feature that rolls in a bunch of existing iOS technology—Siri app recommendations and App Store categorizations to name two—to offer a little bit more than just a bare search field or alphabetical list. (And if you regret making one of your home-screen pages disappear, don’t worry—hiding them is nondestructive, and you can always bring them back.)
This is all good stuff. I look forward to redesigning my iPhone’s home screen to display fewer apps and more widgets. There’s just one thing missing: the iPad. In iPadOS, there’s no App Library, and while the new widgets are supported, they’re limited to a single column on the first page of the home screen.
The iPad’s home screen app icon grid is no less monotonous than the iPhone’s, so it could certainly use App Library. And the vast expanse of iPad screen real estate could happily gobble up a whole bunch of widgets, full of useful data. Instead, there’s just a single column on a single page. It’s a curious and unfortunate omission.
The Mac and iPad influence each other
Are the Mac and the iPad merging together? I don't think so, but they're definitely exerting gravitational pulls on one another. It's hard to look at the new macOS design and not see how it's picking up a huge chunk of design language from the iPad, from the rounded rectangles in icons and windows to the use of SF Symbols glyphs to the mouseover effect on buttons. (That new Mac window design, with the title bar demoted into the toolbar, is going to take some getting used to.)
But the iPad, too, picked up a bunch of Mac interface elements, including a new search interface that’s basically the Mac version of Spotlight, the addition of drop-down menus, and app sidebars everywhere. I had to chuckle when Apple showed off the new sidebar in Photos on the iPad—that’s a feature that’s been in Photos (and before that, iPhoto) on the Mac forever. But on the iPad, it’s new—and those sidebars are showing up everywhere in iPadOS 14.
The iPad and the Mac are different, but they’re also joined. Mac Catalyst may have started the flow, but this goes beyond that. Apple’s using the iPad to make the Mac feel more modern, and using the Mac to make the iPad feel more functional. We’ll see how this relationship evolves.
The Mac at the end of the universe
At last year’s keynote, Apple focused its attention on SwiftUI and barely mentioned Mac Catalyst. This year, though, Catalyst got its own slide and a whole bunch of improvements, which is only fitting since two stock Mac apps—Messages and Maps—have been re-implemented entirely using Mac Catalyst. This is good news for developers who have been waiting to see if Catalyst was going to continue to be developed and improved.
But then Apple dropped a bit of a bomb: For Macs running on Apple’s own processors, iOS app developers will be able to place their apps in the Mac App Store, unaltered. These Macs will just… run iOS apps.
On one level, this is great news, because there are iOS developers who will never bother to build Mac apps—and now they can make their apps available on the Mac, albeit in weird app wrappers that don't feel like real Mac apps.
But on another level, it calls into question the very existence of Mac Catalyst. While some developers will put in the extra work to make their iPad apps into proper Mac apps, some will decide that it's not worth the trouble since their iPad apps will run just fine on macOS.
So far as I can tell, Apple has decided that it's not going to strong-arm developers into doing the work to support the Mac. Maybe it's confident in the powers of Mac Catalyst. Or maybe it realizes that some developers are just not going to care about the Mac, and it's better to lower the standards of what a proper Mac app should be in order to get those apps on the platform.
The idealist in me says this is a terrible idea, and that it will lead to developers abandoning the Mac and simply shoveling their iOS apps onto the platform. If you think Mac Catalyst apps are weird, wait until you're running pure iOS apps that have made no attempt to appear even remotely Mac-like.
The optimist in me says that there will always be good Mac apps, but there are also a lot of great iOS apps, and being able to run them makes my Mac more useful and relevant.
The truth is probably that the future of the Mac is as a “pro” version of iOS and iPadOS. It’ll run more or less every app that’s available on the iPhone and iPad, but it’ll also run traditional Mac software. Over time, the distinction between iPad apps and Mac apps will begin to fade away entirely, and the Mac will just become a keyboard-and-trackpad mode of the iPad.
Like reading about our sun becoming a red giant and swallowing the Earth, or pondering the heat death of the universe, it sounds like a depressing story until you realize the time scales involved. If Apple handles it right, the Mac will fade away so slowly that by the time it's gone, it won't matter anymore. But it's hard to look at the appearance of unmodified iOS software on the platform and not see the endgame.
Wow, that got dark. This is going to be a really interesting week. I’m looking forward to what Tuesday brings.
By Jason Snell
June 22, 2020 9:14 AM PT
It’s WWDC 2020 keynote day!
(The keynote is over!)
Hello from California, where I am… sitting in my office on the day of an Apple event, rather than standing outside a venue waiting to be seated. Oh, 2020!
Beginning at noon Pacific, you can hear me discuss it live with Myke Hurley on Upgrade, which will be streaming on Relay FM.