WWDC 2020 Tuesday: Session Impressions
By Jason Snell
June 23, 2020 5:46 PM PT
Note: This story has not been updated since 2020.
We used to have to run from room to room. These days we can sit on the couch and watch whatever WWDC sessions we want. Here are some interesting tidbits gleaned from a collection of sessions that were posted on Tuesday.
While many developers will be able to get their apps built for Apple processors quickly, one limitation will be precompiled software contained within the app that isn’t built for Apple’s processors. Apps can’t mix and match Rosetta and native code; if developers are using precompiled Intel software, they’ll need to contact the author and ask them to provide a new version compiled for Apple’s processors. Developers running on Macs with Apple-built processors can actually build and run their apps using Rosetta so they can get some idea how they will run on Intel processors.
More broadly, Apple’s new Macs will be using the same sort of multi-core layout that we currently see on the iPhone and iPad, where there are two different types of processor cores. One set of cores is focused on performance, while the other set is focused on energy efficiency. The system will dynamically choose which cores are used, just as on iPad and iPhone. Apple cautions developers not to be surprised to see their software running mostly on one set of cores or the other; it all depends on the current needs of the system. That will be a major change from current Mac multiprocessing, in which each core is more or less the same, and work is generally spread across all the cores.
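Developers don’t pick cores directly. The usual way to give the scheduler the hint it needs is a quality-of-service class on a dispatch queue; a minimal sketch (the queue labels are illustrative):

```swift
import Dispatch

// Work is described by importance, not by core type; the system's
// scheduler decides whether it lands on performance or efficiency cores.
let interactive = DispatchQueue(label: "render", qos: .userInteractive)
let background = DispatchQueue(label: "indexing", qos: .utility)

interactive.async {
    // Latency-sensitive work: likely to be scheduled on performance cores.
}
background.async {
    // Deferrable work: a good candidate for the efficiency cores.
}
```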
I got a chuckle at the fact that, to target code specifically for the Mac, you used to be able to target Intel x86_64, but now you need to be more broad and just say “os_osx.” Of course, Apple changed the name to macOS back in 2016 and it’s not even version 10 anymore, but the behind-the-scenes stuff rarely matches the names used in marketing. (Developers will target Macs with Apple silicon by targeting the arm64 architecture, the exact same name as the target would be on iOS.)
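In Swift this shows up as conditional compilation: architecture checks no longer imply a platform, so platform and architecture get checked separately. A minimal sketch:

```swift
// Compile-time checks in a universal binary: each #if arch branch is
// compiled only into the matching slice of the app.
#if arch(arm64)
print("Built for Apple silicon")
#elseif arch(x86_64)
print("Built for Intel (or the Intel slice running under Rosetta)")
#endif

// os(macOS) replaces the old habit of inferring "Mac" from the Intel
// architecture, now that arm64 can mean either the Mac or iOS.
#if os(macOS)
print("Targeting macOS")
#endif
```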
If you want to see what kind of apps you’re running on a Mac with Apple silicon, just open the Activity Monitor utility, which will display every running process and list whether it’s running on Apple or Intel architectures. And just as during the transition to Intel processors, you can choose to force an app to run in Rosetta rather than natively by selecting the app, choosing Get Info, and checking the Use Rosetta checkbox.
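An app can also check for itself whether it’s running translated, using the `sysctl.proc_translated` value Apple documents for the Rosetta environment. A small sketch:

```swift
import Foundation

// Returns true when the current process is running translated under
// Rosetta, via the documented "sysctl.proc_translated" sysctl. The call
// fails (returns -1) on systems that don't know the name, in which case
// we report false.
func isRunningUnderRosetta() -> Bool {
    var translated: Int32 = 0
    var size = MemoryLayout<Int32>.size
    let result = sysctlbyname("sysctl.proc_translated", &translated, &size, nil, 0)
    return result == 0 && translated == 1
}
```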
Last year developers were a little concerned about just how much Apple was going to put into continuing to develop Mac Catalyst, since the company talked so little about it during WWDC. This year, though, Apple has rolled out numerous Mac Catalyst updates—probably aided by the fact that two stock Mac apps, Messages and Maps, are now implemented using Mac Catalyst.
Mac Catalyst now supports dozens of additional iOS frameworks, including some that you wouldn’t expect and that aren’t actually on macOS, like ARKit. No, the Mac is probably not gaining augmented reality support anytime soon, but by including ARKit, Apple allows app developers to simply use the normal checks they’d make to see if an iOS device supports ARKit—and then move on when it turns out the current device doesn’t—rather than having to block out that entire segment of code because the Mac doesn’t know what ARKit is.
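In practice that means the standard ARKit capability check compiles and runs unchanged in a Catalyst build; on the Mac it simply reports that AR isn’t available. A sketch:

```swift
import ARKit

// The same availability check an iOS app already makes works in a Mac
// Catalyst build: on the Mac it reports false, so the AR code path is
// skipped rather than failing to compile.
if ARWorldTrackingConfiguration.isSupported {
    // Offer the AR feature.
} else {
    // Hide or disable it; on the Mac, this is the branch that runs.
}
```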
This spring, iOS got an update that provided much richer support for handling changes in keyboard input, and that support has now been rolled into Mac Catalyst, as well. Catalyst apps now support the Mac’s color picker and proper checkboxes, and (rejoice!) the date picker now uses a proper Mac date picker rather than the iPhone’s spinning wheel. (I believe the iPad also gets this feature, once again showing how much the iPad and Mac have in common these days.)
Mac Catalyst under macOS Big Sur also supports regular Mac pull-down menus, and popovers that are geometrically “real”—in that they can expand outside the bounds of the main app window. (Constraining everything to inside the app window was always a tell that an app came from the iPad.) Three-column views are also supported, as is the use of SF Symbols for iconography (since that icon set is coming to the Mac in macOS Big Sur anyway).
Catalyst apps will also look clearer, with sharper text, because developers can now opt to run them at 100% scale, rather than the scaled-down version that was previously mandatory in order to avoid some interface layout issues.
Photo editing extensions on iPad can now be brought over to the Mac via Catalyst, and will work with Mac apps that support those extensions, most notably Photos. Widgets should also “just work”, appearing in the new Notification Center widget view just as they would on iOS.
macOS Big Sur sure looks different. This session goes through the details. Sidebars are now “full height”, meaning they extend into what used to be the title bar of windows. SF Symbols iconography is everywhere, with icons in sidebars boldly colored. Apps can set the color of those icons to set a mood, but can opt to allow users to override that color with their own preferred interface accent color. (They can also ignore the user’s preference, as Apple Mail is doing with its VIP items, which always display as yellow.)
The title bar has basically merged with the toolbar, in the new “unified” toolbar style, in which the window title is left justified and inline among the bold SF Symbols toolbar icons. (There’s also room for a subtitle just below the title, which can provide secondary information about the window you’re looking at. Apple Mail uses this subtitle to display how many unread messages you have, for instance.)
There’s also a new unified compact window layout, which is shorter, and is meant to be used when the toolbar is sparse and the focus is on window content. There’s also still support for “expanded” toolbars, which is basically how all Mac windows used to work—big title bar, lots of big buttons, labels under the buttons, the works.
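On the AppKit side, these are per-window choices. A minimal sketch, with illustrative title and subtitle strings:

```swift
import AppKit

// macOS 11's toolbar styles, set per window; the subtitle string appears
// below the title in the unified styles.
func configure(window: NSWindow) {
    window.toolbarStyle = .unified           // title inline with toolbar icons
    // window.toolbarStyle = .unifiedCompact // shorter bar, content-focused
    // window.toolbarStyle = .expanded       // classic big title bar, labeled buttons
    window.title = "Inbox"
    window.subtitle = "12 unread"            // e.g. how Mail shows its unread count
}
```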
The humble slider control, which used to be a pointed polygonal block sliding along a track with tick marks below it, has been redrawn in Big Sur. It’s now an oval sliding over a track with tick marks on it.
App Clips are new mini-apps that quickly launch when you scan a code, tap an NFC sticker, or tap on a link. The method by which App Clips are launched is actually a special URL that Apple is calling an “App Clip experience.” These URLs are parsed by the system and then handled by an app clip rather than being routed to the web browser, in a fashion similar to how Universal Links work on iOS. Apple is launching its own “App Clip Code” format later this year, which seems to be a product that is both an NFC sticker and a visual code, so you can tap them or scan them to activate an app clip.
As for App Clips themselves, there’s “no magic” here: developers will need to create a second mini application in Xcode, a standalone app within their existing app project that’s dedicated to just handling App Clip experiences. An App Clip can’t exist without a corresponding regular app, and the two have to be submitted to App Store review together.
However, the two apps are actually mutually exclusive. The App Clip only gets downloaded and launched if you don’t have the corresponding app installed—otherwise, the URL just links to that functionality in the existing app. If you use an App Clip and decide you want to download the full app, the App Clip goes away—though it can pass along some of its data to be imported into the full app.
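Inside the App Clip target, the invocation URL arrives as a browsing-web user activity, which the clip parses to decide what to show. A sketch, with the routing helper being a hypothetical name:

```swift
import UIKit

// In the App Clip target, the invocation URL is delivered as an
// NSUserActivity; the clip inspects it and routes to the one task it
// exists to handle.
class SceneDelegate: UIResponder, UIWindowSceneDelegate {
    var window: UIWindow?

    func scene(_ scene: UIScene, continue userActivity: NSUserActivity) {
        guard userActivity.activityType == NSUserActivityTypeBrowsingWeb,
              let url = userActivity.webpageURL else { return }
        let components = URLComponents(url: url, resolvingAgainstBaseURL: true)
        // handleExperience is a hypothetical routing function.
        handleExperience(path: components?.path ?? "/")
    }

    func handleExperience(path: String) {
        // Present the single experience this clip is dedicated to.
    }
}
```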
App Clips are meant to be ephemeral, especially if you don’t use them. After a period of inactivity, App Clips and their data will be automatically deleted by the operating system, and their data doesn’t get backed up, either. However, if you use an App Clip regularly, its life will be extended—and if you use it regularly enough, the system may never delete it at all.
Last fall I wrote about Ultra Wideband, the technology inside Apple’s U1 chip, found in the iPhone 11 series. It provides the ability to precisely locate other devices with Ultra Wideband chips, and up to now Apple has only rolled out a single feature that uses that chip: an arrow pointing at someone else’s phone while using AirDrop.
With iOS 14, Apple’s opening up access to the U1 chip to app developers, opening the door to apps that are spatially aware. Apps that want to use this feature have to ask permission, and all devices participating have to agree. Then the apps have access to a stream of distance and direction data for whatever devices they’re tracking—keeping in mind that only devices with a U1 chip are currently supported.
The tracking comes courtesy of a temporary discovery token that’s randomly generated to ensure privacy, and it’s discarded at the end of the session, which is why you can only grant an app one-time permission to track another device.
Some other tidbits from the session: iPhones should be held in portrait orientation for optimal tracking use. (It makes me wonder if the reason the iPad doesn’t have a U1 chip is because of the complexity of that device being used in both portrait and landscape mode far more than the iPhone.) There’s also a “field of view” for device tracking that’s roughly analogous to the field of view of the iPhone 11’s ultrawide camera. If a tracked device is outside that field of view, you’ll still get a distance measurement but won’t be able to get a direction.
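The framework behind all this is NearbyInteraction. A sketch of the flow: exchange the randomly generated discovery tokens out of band (over networking the app provides), then receive distance and direction updates for the session’s lifetime; direction is optional, and goes nil when the peer leaves the field of view:

```swift
import NearbyInteraction

// Minimal NearbyInteraction sketch. The app must send its own
// session.discoveryToken to the peer through some channel of its own;
// tokens are discarded when the session ends, which is why tracking
// permission is one-time.
class PeerFinder: NSObject, NISessionDelegate {
    let session = NISession()

    func start(with peerToken: NIDiscoveryToken) {
        session.delegate = self
        session.run(NINearbyPeerConfiguration(peerToken: peerToken))
    }

    func session(_ session: NISession, didUpdate nearbyObjects: [NINearbyObject]) {
        guard let peer = nearbyObjects.first else { return }
        let meters = peer.distance     // Float?, in meters
        let direction = peer.direction // simd_float3?; nil when the peer
                                       // is outside the field of view
        _ = (meters, direction)
    }
}
```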
Apple’s even built a simulator for developers to test these features. When you simulate two iPhones on screen, they’ll react to each other as you drag them closer to one another or further apart. Presumably developers will want the biggest screen possible in order to test this feature, so get those Pro Display XDR orders in now.
This was a nice session about Apple’s current thinking regarding iPad app design. “Just because the iPad is in the middle, that doesn’t mean it’s just halfway between the Mac and the iPhone,” says presenter Grant Paul of the Apple Design Team. “A great iPad app, you need to be just as dedicated to what’s great and special about the iPad” as the iPhone and the Mac.
For iPadOS 14, there’s been a lot of focus on sidebars—which are very helpful, as Mac users can attest. iPadOS 14 sidebars can hold items that users can reorder, accept dragged-in items as favorites, and the like. Handy. Sidebars can be toggled on or off by tapping the sidebar icon in the top left of the app, and can be brought back temporarily via a swipe from the left edge of the screen.
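For developers, this sidebar pattern is built from iOS 14’s new split-view and list APIs. A sketch of the pieces, assuming the primary column’s content is a collection view (the `SidebarViewController` name is hypothetical):

```swift
import UIKit

// iPadOS 14 sidebar pattern: a column-style split view controller whose
// primary column is a list using the new sidebar appearance.
let split = UISplitViewController(style: .doubleColumn)
split.preferredDisplayMode = .oneBesideSecondary

// A compositional list layout in the sidebar style, used by the
// collection view that fills the primary column.
let sidebarLayout = UICollectionViewCompositionalLayout { _, environment in
    let config = UICollectionLayoutListConfiguration(appearance: .sidebar)
    return NSCollectionLayoutSection.list(using: config, layoutEnvironment: environment)
}

// split.setViewController(SidebarViewController(), for: .primary)
```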
Another goal for iPadOS 14 design seems to be an increase in density. The Files app, for example, uses smaller icons than in iPadOS 13 in order to dramatically increase the number of items that can be viewed.
Apple’s designers also want to use the iPad’s extra screen space to improve informational context in apps. For example, you can now rename files inline in the Files app, rather than being taken to a modal rename screen. The Calendars menu in the Calendar app, which was previously a popover, now displays in the sidebar, allowing you to view changes in the calendar as you adjust which calendars are visible.
The iPad is a versatile device that can handle all different kinds of input methods, but Apple cautions developers to always start with multitouch. The iPad is, first and foremost, a touch tablet. But its many other input methods can be mixed and matched in interesting ways—for example, holding down the command key while tapping on a link in Safari, holding down shift while tapping on items to select them, or even drawing with an Apple Pencil while using a finger to adjust user interface elements in an animation app.
If you appreciate articles like this one, support us by becoming a Six Colors subscriber. Subscribers get access to an exclusive podcast, members-only stories, and a special community.