By Jason Snell
October 1, 2019 10:52 AM PT
The Finder places files and folders at the center of the Mac, but on iOS, apps are at the center. Still, managing documents is a fact of life in many cases, and over the past few years Apple has been evolving the Files app to become a more full-featured file browser utility. In iOS 13, Files takes a huge step forward in numerous areas… though there’s still more to be done.
Perhaps most important is the simple fact that Files can now see destinations that aren’t cloud services or other apps. You can add local SMB file servers to Files by tapping the ellipsis icon in the Browse pane and choosing Connect to Server, then entering the address of your SMB server. While you’re connected, that server will appear in the Shared segment of the Browse pane. (Strangely, Files doesn’t use Bonjour to detect nearby servers and display them, as Finder does.) I have a Mac mini on my home network that I use as a file server, and it’s been a delight to access files on it, directly, from within Files and apps that use Apple’s file interface.
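The Connect to Server sheet simply asks for a server address. Apple doesn't document exactly which forms it accepts, but these are typical SMB addresses (the hostname, IP, and share name here are placeholders, not real servers):

```
smb://macmini.local          # a Bonjour-style hostname
smb://192.168.1.20/Media     # an IP address pointing at a specific share
```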
USB drives are also supported. It’s kind of hard to believe that I’m celebrating USB disk access in late 2019, but here we are. You can attach USB drives to any device running iOS 13, but this feature certainly feels best when you plug a USB-C cable or thumbdrive directly into an iPad Pro. As an iPad Pro user, that’s a moment that really makes the iPad Pro feel like it’s been welcomed into the community of personal computers. And if you’re someone who has ever been handed a thumbdrive by a colleague who expects you to access it on your iPad, well, now you can do that instead of sheepishly admitting that it’s completely useless to you. I’ve used this feature to attach my portable audio recorder directly to my iPad to import recordings, something I previously had to use a breakout box to accomplish.
You can even create new folders now. Yes. It’s true. And there’s a new Column View, which is an approach to file browsing that I’ve never liked on macOS, but actually makes more sense to me on iOS for some reason.
iOS 13 also lets users perform many more actions on files than ever before by tapping and holding on an icon to reveal a contextual menu. Among the actions found here are options to compress files into an archive, decompress zip files, edit tags, preview a file in Quick Look, and display an Info pane with detailed information about a file’s attributes—basically, the stuff you’d expect from a file browser is mostly there. (It’s a bit strange that you can’t set items from Shortcuts to display directly in this contextual menu, as you can in the share sheets elsewhere on iOS 13. Instead, you have to tap and hold on a file, choose Share, and then pick a Shortcuts item.)
Files separates iOS storage into two buckets, On My iPad/iPhone and iCloud Drive. On My iPad is basically what you’d consider “the hard drive” on a Mac—it’s local storage that is not synced over the cloud. If you want to save a huge file on your iPad and not have it swamp your current connection in an attempt to sync all that data to the cloud, put it in On My iPad/iPhone. If you want it available everywhere, put it in iCloud Drive.
Alas, not all is sunshine and roses in the land of iOS file access. Files is still a remarkably immature app. It sometimes fails to update file listings, frequently stalls out and provides me with a blank or incomplete listing, and, most frustratingly, the Save to Files extension for third-party apps fails to provide any feedback about the progress of a file transfer. That unreliability, combined with a slow file transfer to a remote server, leads to some pretty uncomfortable moments when you have no idea if your file is going to arrive or if the whole thing has failed silently.
Apple’s taken a few cues from the Mac in building up Files, so it’s time to take a few more. Progress indicators are vital. Letting the user see a detailed view of what’s transferring and how long it will take is a must-have feature, but when I try to save items within third-party apps, all I get is an endless spinner with no feedback. I realize this isn’t an issue with the Files app itself—it does show a little circular upload/download progress bar—but it is an issue with Apple’s greater approach to file transfers.
Still, Files has come a long way. It has gone from being an iCloud Drive client app to a neither-fish-nor-fowl representation of Apple’s ambivalence to file management on iOS to what it is today—a pretty capable file browser that’s still got plenty of room for improvement. Files in iOS 13 is a major upgrade—I just hope Apple doesn’t consider the job done.
By Dan Moren
September 26, 2019 6:36 AM PT
Like it or not, email is still a big part of our lives. I feel like I get more email than ever, even if a smaller percentage than ever before is actually meaningful.
But even after all these years, I still rely on Apple’s Mail apps on both the Mac and iOS. That comes with its fair share of frustration, as the company has often been slow to adopt the kind of improvements driven by third-party mail clients. For me, though, what Mail lacks in innovative features it more than makes up for with its integration into Apple’s platforms.
Still, when Apple does decide to update Mail, it’s often cause for celebration, and in iOS 13, there are a number of welcome additions that people—including myself—have spent a long time awaiting.
First and foremost among them—for me anyway—is the addition of multicolored flags on iOS. 1 While this has long been supported on macOS, it’s one of those areas where the two platforms have been out of sync. As someone who uses flags to annotate and categorize messages, I’ve been frustrated at the lack of respect for them on iOS.
Finding the multicolored flag options, however, proves a little tricky. You might think it would be an option if you’ve configured the Flag button to appear when you swipe left or right on a message from your mailbox, but no. Nor do they appear in the More… sub-menu you can find by swiping left. Nor do they show up if you long press on a mail message; yes, you can tap Mark and then choose Flag, but it doesn’t give you the option to pick the color.
No, the only place you’ll find the multicolored flag options is, somewhat nonsensically, under the Reply button from a message itself. Tap the Flag button there and you’ll get a sub-menu that lets you choose one of seven colors, as well as an option to remove a flag.
Whatever color you’ve picked there most recently will be treated as the default on that device until you pick another color. So, if you go to flag a message using one of those other methods after choosing a color, you’ll see the Flag button color changed to reflect that. But jump from your iPhone to an iPad and you may have a totally different color.
While this is far from ideal interface-wise, I do at least appreciate that iOS respects the colors of flags set on the Mac, though I wish the flags themselves were a little more prominent in the message list. (They’re very small on the right side and easy to miss.)
Mail in iOS 13 also adds a few new options for those who get overwhelmed by emails—especially unwanted ones. For the first time, you can mute a thread, if you want to stop receiving notifications when the umpteenth email in that scheduling conversation arrives. (Or, as many folks have pointed out, for those ever-popular “replied to a list instead of the sender” moments that tend to take on a life of their own.)
You can also block mail from any contact, which not only prevents you from receiving messages from them in the future, but also moves all of their messages to the trash—and, moreover, syncs across devices.
Additionally, Apple has improved composing messages, including a new photo picker that’s now a “card” on the iPhone, allowing it to be swiped up and down so that you can still see your message while you’re picking a photo. (On the iPad, it’s a popover that’s independently scrollable.)
Text formatting is much improved too, earning its own interface in the toolbar, instead of forcing you to navigate through several levels of popover menu. Plus it includes new features like lists, alignment, and font face, size, and color.
But my other favorite improvement to Mail in iOS 13 is an iPad-only capability that came along with the multitasking features: message composition windows can be dragged into Split View mode, so you can write an email while referring to another email. That’s a feature I’ve been hoping for since I was first able to put two Safari tabs side-by-side on the iPad.
That’s not the only thing you can do with a message composition window, either. For a while you’ve been able to drag a draft you’re working on down to the bottom of the screen, so you could keep working in your inbox. But now, on both the iPhone and iPad, you can drag multiple drafts down to the bottom; tapping on them then provides you with an overview, much like the multi-tab view in Safari on the iPhone. It’s a much more powerful—and, dare I say, Mac-like—way of working with your email.
Finally, another welcome improvement is that iOS’s new screenshot interface, which allows you to capture the full text of a web page, also makes it possible to export an entire email message as a PDF. Previously that required a somewhat arcane workaround with the Print interface. The newer methodology isn’t that much faster, but hopefully it’s less obscure.
Not everything in iOS 13’s Mail is an unmitigated improvement, however. For one thing, I’ve found the Junk mail filtering to be a bit too aggressive, catching several legitimate messages by mistake. For another, I still question the user experience choice of hiding much of Mail’s functionality behind the Reply menu, including many features that are very much not related to replying to a message: printing, moving messages, muting threads, and so on. I realize the space is limited, but at least putting many of those features under a Share menu would seem easier for many users to comprehend.
I know, I’m weird. ↩
By Dan Moren
September 23, 2019 5:25 AM PT
If you want to know how sold I was on CarPlay, you needn’t look any further than my experiences installing a new head unit in my car in order to get it. A few months later, this has proved to be an altogether excellent choice, with no real downsides.
But within a few days of using and loving CarPlay, I’d already run into a handful of things that could be made even better with a little adjustment. And the good news is that, as of iOS 13, Apple mostly delivers on a couple of the most significant ones.
There are, of course, the little tweaks in iOS 13: I prefer the new interface for sending and receiving text messages, which uses big round icons instead of text-based buttons. I love the integration with the Share ETA features in Maps. And the Now Playing screen for Music and Podcasts has added album art, which looks great.
But there are two features in iOS 13 that are huge improvements to CarPlay, although one of them is slightly more of a mixed bag.
First up is the new dashboard screen, accessible by swiping right from the home screen or tapping the button in the bottom left. It addresses the frustration of only being able to see a single app on the car’s display at a time. Previously, if you were navigating with a mapping app and wanted to pause your music, you had to switch apps, potentially risking missing your next turn.
That the new dashboard view offers separate tiles for apps is great, but there are a couple of limitations, the first and foremost being which apps can actually appear there.
For example, the biggest pane displays a map—makes perfect sense, since navigation is often the most important thing you’re doing in the car. But that pane appears to be locked to Apple Maps, even if you start navigation with another CarPlay-compatible app, like Google Maps or Waze. That’s frustrating, especially since—as far as I can tell—the Apple Maps pane doesn’t even show traffic when you’re in the dashboard, making it a waste of space if you’re not using it for navigation.
My second frustration with the dashboard view is the lack of customization. Want to resize the panes, making one smaller and another larger? Tough luck. Want to put another app in that favorite places/Siri Suggestions pane or remove it altogether? Sorry, no. The dashboard is a good first step, but it hasn’t proved to be quite as useful as I’d initially hoped.
The other major improvement, however, is unequivocally a positive. In earlier versions of CarPlay, the car display was essentially serving as a mirrored version of whatever was on the phone. (Not unlike when you AirPlay your phone’s screen to an Apple TV.) That meant that whatever app was running on the phone was running on the car display, and vice versa. 1 So if you were checking your map and your co-pilot decided they wanted to look something up in Safari, it was bye-bye map. This was no small source of frustration on my road trips.
The good news is that in iOS 13, these two views are totally independent. What your co-pilot does with your phone will not in the slightest affect what you see on the car display. 2
There are a few other small improvements in iOS 13: a Settings app that lets you adjust a few minor details about CarPlay 3; a Calendar app that will show you today’s—and only today’s—events; and an improved Siri display that won’t take over the whole screen when you summon the virtual assistant.
In short: CarPlay was already good, but now it’s definitely better, and here’s hoping that Apple stays on track and continues down this road in iOS 14.
Even worse, if there was no CarPlay version of that app, you just get kicked back to the home screen on the car display. ↩
We can all breathe a sigh of relief for my marriage now. ↩
One of them is a toggle to show Siri Suggestions in the dashboard view, but they rarely seem to show up for me anyway, so in general usage this apparently has no effect? ↩
By Jason Snell
September 19, 2019 4:55 PM PT
With iOS 13, the Photos app gets a major interface update, including a whole bunch of editing tools. I’ve been spending a lot of time with Photos for iOS this summer as part of the work for doing a new edition of Take Control of Photos (coming soon!). Dan has covered the video features, but let me tackle the changes to the app as a whole.
The main focus of the Photos interface is the Photos section, which is now populated by default with a curated selection of your photos. You’ll find a set of tabs that let you choose whether to view Years, Months, Days, or All Photos.
In the Years view, square buttons depict single years in your library, each with a video, Live Photo, or series of still images selected from the same time of year as the present day. In the summer, you’ll see lots of beaches. In the fall, it’s kids going back to school. Right now it’s all the sample images I’ve taken for my iPhone reviews over the last few years.
In the Months view there are square buttons in a couple of sizes, which Photos segments in an attempt to detect discrete events based on time and location. An event might just be a single day at home, or it might be a weeklong trip to Europe. Apple uses a massive database of locations and events to make better labels when it can—images I took at a concert were labeled with the artist’s name, detected from the location and time (see the leftmost image at the top of this story)! Not every event is visible in Month view—instead, the Photos algorithms make guesses about the four or five most relevant events from each month.
The Days view provides a curated selection of photos and videos, generally separated by day—though sometimes multiple days are pushed together when you didn’t take many photos. In this view you aren’t seeing all your photos (that’s limited to the All Photos view), nor are you seeing all of the photos that are displayed, since they’re cropped to fit in a squares-and-rectangles grid. Photos is removing similar photos and screenshots and pushing to the front the images that its algorithm suggests are the very best photos. You can select and edit photos from this view. If you tap on a photo, it will open full screen and you’ll see all the usual editing tools. You can even swipe between photos in that full-screen view, but you won’t be swiping to the next photo in your library—just the next photo that made the cut to be displayed in the collection.
Then there’s the classic Photos view, now called All Photos, which displays every single item in your library. On the iPad, you can choose whether you want to see them in their proper aspect ratio or cropped into squares by tapping the Aspect/Square button. As with the Days view, from here you can tap or double-click on an item to open it full screen and get access to editing mode.
With the introduction of the iPhone 11 series, there’s the addition of a new Portrait Mode effect: High-Key Light Mono. You can use this on older devices in Portrait Mode, too: the effect is similar to Stage Light Mono, but with a white background.
Like video editing, Photo editing has come a long way in iOS 13. The overall editing interface is quite different than it was before, though the controls themselves vary depending on your device and its orientation. In Adjust mode, you’ll find 16 controls displayed as icons inside circles either on the right side of the screen (iPads and iPhones held horizontally) or at the bottom (iPhones held vertically).
You can swipe to move between the different controls, and then use the slider next to it to make adjustments. Adjustments are made on a scale of 0 to 100 or -100 to 100, and that range is reflected by filling in a portion of the circle that surrounds the control icon. Every adjustment is non-destructive—the circle containing the icon for each control functions as a toggle. Tap it and that adjustment will be turned off. Tap again, and it’s back on.
Here are the new adjustments added in iOS 13:
- Vibrance: Adjusts the contrast between similar colors.
- Warmth: Increases or decreases the oranges and blues in your image, making it feel warmer (100) or cooler (-100).
- Tint: Increases or decreases the red and green in your image.
- Sharpness: Makes edges crisper and better defined.
- Vignette: Adds an old-school vignetting effect, darkening the edges of the image. Vignetting is an effect of the physics of camera optics, something that can happen in all photography—and it turns out that this effect can be aesthetically pleasing. So if your photos didn’t get vignetted, you can fake it with this adjustment.
- Definition: Adds contour and shape to images by bringing out definition in the midtones and adding contrast.
- Noise Reduction: Reduces visible noise in images by applying a smoothing algorithm.
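Apple doesn’t publish the math behind these sliders, but conceptually an adjustment like Warmth shifts the balance between a photo’s orange and blue channels. A minimal illustrative sketch in Python—not Apple’s actual algorithm, and the maximum shift of 30 levels is an arbitrary assumption:

```python
def apply_warmth(pixel, amount):
    """Shift a pixel toward orange (amount > 0) or blue (amount < 0).

    pixel: an (r, g, b) tuple with channels from 0 to 255.
    amount: -100 (coolest) to 100 (warmest), like the Photos slider.
    """
    r, g, b = pixel
    shift = amount / 100 * 30  # max shift of 30 levels, chosen arbitrarily
    r = max(0, min(255, round(r + shift)))  # clamp to valid channel range
    b = max(0, min(255, round(b - shift)))
    return (r, g, b)

# Warming a neutral gray nudges red up and blue down; cooling does the reverse.
print(apply_warmth((128, 128, 128), 100))   # → (158, 128, 98)
print(apply_warmth((128, 128, 128), -100))  # → (98, 128, 158)
```

Because the function never touches the stored original, it also mirrors the non-destructive model: re-running it with `amount=0` (or simply discarding the result) gets you back to the source pixel.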
All the black-and-white editing adjustments in prior versions of iOS have been removed. However, don’t despair: You can still make your photos black and white by adding one of the three monochrome filters, Noir, Silvertone, and Mono, and then modifying the effect using the existing adjustment tools.
Finally, the search interface in Photos is improved. Previously, searching for more than one term was complicated: you had to search, then tap outside the search term, then enter another search term. Now you can just type multiple search terms and Photos figures out what you want, so I can type “dog snow” and instantly be shown the one picture of a dog in the snow that’s in my photo library.
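A multi-term search like this is typically implemented by intersecting the result sets for each term. A toy sketch in Python—the index below is hypothetical, and Apple’s real search surely adds synonyms, scene detection, and ranking on top:

```python
# Hypothetical index mapping detected labels to photo IDs.
photo_index = {
    "dog":   {101, 205, 318},
    "snow":  {205, 411},
    "beach": {318, 502},
}

def search(query):
    """Return the IDs of photos matching every term in the query."""
    terms = query.lower().split()
    results = None
    for term in terms:
        matches = photo_index.get(term, set())
        # Intersect with results so far; first term seeds the set.
        results = matches if results is None else results & matches
    return results or set()

print(search("dog snow"))  # → {205}: the one photo labeled with both terms
```

Only photo 205 carries both the “dog” and “snow” labels, so intersecting the two sets narrows three dog photos down to the single match.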
By Dan Moren
September 19, 2019 7:26 AM PT
I won’t say that I take a lot of video with my iPhone: a quick number crunch shows that still photos outnumber videos in my Photos library by over 80-to-1. But the iPhone has become a powerful video camera in its own right in recent years, and its features get even more impressive in iOS 13.
As Apple said during its WWDC keynote this year, iOS 13 not only brings new photo editing capabilities, but also, for the first time, enables all those same features for video. Which is pretty impressive, when you remember that previous versions of iOS offered nothing more than the ability to trim videos. In fact, iOS 13’s built-in video editing capabilities are so impressive that they massively outstrip what’s available on macOS by default.
Tapping Edit on a video in the Photos app does still let you trim a video to a specific section, and, as John Gruber noted in his review of the new iPhones, no longer requires you to save it as a separate clip—in fact, trimming is now non-destructive, so if you shorten a clip and decide later that you want to go back to the full-length video, you can just revert to it. You can also now mute a video’s sound by tapping the speaker icon in the top left, or export it to another app by tapping the button in the top right.
But the bulk of the abilities are accessed by tapping the controls at the bottom of the editing screen. Tap on the dial icon and you can adjust image qualities, including exposure, highlights, shadows, contrast, brightness, black point, saturation, vibrance, warmth, tint, sharpness, definition, noise reduction, and vignetting. Each of them also lets you choose how intense to make the adjustment.
That’s a lot of stuff, and if you’re not someone who knows much about video or image quality, it’s probably a bit overwhelming. The good news is that Apple also provides an Auto button which does its best to figure out what changes will make the video look the best. I particularly like that when you do tap Auto, iOS shows you what adjustments it made to all those other settings in order to get this image, allowing you to tweak any of them further. You can even adjust the intensity of the Auto effect itself.
Moreover, all of that image adjustment is non-destructive, so if you ever decide you don’t like any of the changes, you can revert back without losing anything.
iOS 13 also adds, for the first time, filters for video, which you can access via the third button in the toolbar. So if you decide that video would look better in black & white, well, you can not only make that happen at a tap, but you have your choice of three different black & white filters, plus half a dozen color filters. 1 Personally, I don’t have the best eye for this sort of thing—I’m a #nofilter kind of guy—but it’s very cool to be able to simply switch over to a classy looking black & white image with the tap of a finger. And again, these changes are all non-destructive and can be reverted at any time.
Finally, and this is actually my favorite addition to iOS video editing, the last section of the video editing screen gives you tools for manipulating the shape of the image. That means, for example, cropping to just a specific section of the video; you can also tap the ratio button in the top right-hand corner to enforce a specific aspect ratio, including square, 16:9, 10:8, 7:5, 4:3, 5:3, or 3:2. For most aspect ratios, you can even choose whether you want it formatted for portrait or landscape. That means, yes, wait for it: you can re-crop a portrait video into landscape orientation.
And in case that isn’t enough, you can finally 2 rotate video orientation. Accidentally shoot in portrait when you meant to be in landscape? Tap the rotate button and you’re done—or, if you prefer, use the manual Straighten tool to adjust it to any angle, not just 90° increments. Those cropping tools can then help you reframe and reformat your image to make it look like it was shot as intended all along. That’s been a long time coming.
Apple takes these tools a step further, though, by adding the ability to shift the perspective of an image on both vertical and horizontal planes, meaning if you shot something at a weird angle and ended up with a distorted image, you can manually adjust it until it looks right. You can also, at a tap, mirror flip the image over the vertical axis. This is all the kind of incredible stuff that used to require a ton of horsepower and very expensive software, and now your smartphone can just…do it. Better yet, on the iPhone XR or later, all the video-editing features work on any and all formats, including 4K video and slow-motion.
Look, I don’t envision myself taking advantage of these capabilities very frequently—like I said, I don’t even shoot that much video right now. And while professionals using the iPhone to shoot video are probably going to mostly rely on more powerful third-party apps, it’s great that these features are in reach for everybody. Because now, when you do need to flip, rotate, adjust, or re-crop an image, you can do it with all the ease of editing a photo.
By Dan Moren
September 18, 2019 8:14 AM PT
If there’s been anything in the history of iOS that one could perhaps legitimately slap with a term as strong as “debacle”, it would probably be Apple’s choice in 2012 to bid goodbye to Google as its location data provider of choice and launch a wholly new version of Maps. It’s a decision that was controversial at the time and still affects Apple today, despite Maps’s steady pace of improvement over the last seven years—yes, at this point, Maps has been without Google longer than it was with it.
But improve Maps has, year after year. Enough so that it’s been a solid competitor to Google Maps for some time now, even if there is still plenty of distrust emanating from some quarters 1. People felt burned by what they saw as Apple pulling the rug out from under them, and that kind of trust can be hard to earn back.
Personally, I’ve always bounced back and forth between Apple Maps and Google Maps, but I’ve been impressed by the strides Apple has made, especially in iOS 13. This is the year in which it feels like Apple has finally gotten all its fundamentals squared away and started looking at new features.
Note: After the publication of this article, friend (and Rogue Amoeba CEO) Paul Kafasis pointed out that one big thing Google Maps still has over Apple Maps is biking directions. Which is disappointing for a company like Apple, which seems to put so much of an emphasis on being environmentally friendly. Here’s hoping it’s something the company is working on and, ahem, rolls out before long.
There’s a lot to like in iOS 13’s Maps—it’s possibly the biggest update since the app was first launched—but chief among them are the improvements to mapping data. Apple took the unusual step of announcing this initiative more than a year ago, and, in the words of William Gibson, the future is here—it’s just not evenly distributed. Apple says it will roll out to more cities throughout the end of 2019 and internationally in 2020, but right now it’s still hard to tell whether or not the good map fairy has descended upon your area.
I have, however, noticed Apple’s internal mapping game improving. For example, I now have detailed layouts of Boston’s Logan Airport, including where gates and restaurants are, which is super handy. Apple claims it will also provide flight information, though I haven’t yet seen that in action.
But of all the features added in iOS 13, I have two personal favorites. The first is the addition of real-time transit data. Previous versions of Maps did have transportation information, but it was pulled from schedules, and anybody who uses public transit regularly knows that, like battle plans, no schedule survives contact with the enemy.
As of iOS 13, however, if you tap on a bus stop or subway station in Apple Maps, you’ll get a list of when you can expect the next bus or train to arrive based on actual live GPS data. 2 (You will, however, need to make sure you’re on iOS’s Transit layer, accessible by tapping the Info button in Maps’s top right corner.) This also includes information about delays, all the stops along the route and estimated times of arrival at them, and data on future arrivals. 3
The real-time data makes transit directions much more useful, since it gives you concrete information rather than a rough estimate. Of course, that’s something Google Maps has offered for a little while, so it’s another place Apple is playing catch-up, but it’s such a worthwhile addition that it’s hard to make too much fuss now that it’s arrived.
My other favorite feature is the new Share ETA option, which officially arrives in iOS 13.1 at the end of the month. It’s one of those features that Apple does best, taking a common task—letting someone know when you’ll arrive somewhere—and making it simpler. You can tap the Share ETA button when you’re in driving directions 4 and it will send an update via iMessage to the contact(s) of your choice, along with the ability for them to tap a link and follow you on your route. It’ll even send them a text when you’re getting close, which is particularly handy if you’re driving. (This feature works great on CarPlay, more on which in a future post.)
Once you reach your destination, Maps stops sharing that information with your contacts, which kind of makes this like a very specific use of Find My Friends-style location sharing. I found it handy when picking up my wife from the subway station, or when one of my podcast compatriots was running late and wanted to let us know how far he was from home.
Not only is Share ETA a useful feature, but Apple does a good job of surfacing it in the Maps app, which means I’m hopeful people will actually find it and use it. In fact, when you set a location as a Favorite (more on which below), you can set it up to automatically share your ETA with someone every time you navigate to that location, if you always want your partner to know when, say, you’re coming home from work. I think it’s a great feature overall: I just hope it adds support for transit and walking in a future update.
Like the more detailed mapping information, the new Look Around feature 5 is still limited to certain locations. It looks good and works well, and definitely seems to match, if not exceed, Google’s own offering, but it will need to roll out to more places before it’s more than a showy demo. Again, this is a matter of Apple finally bringing itself into line with what Google has had for a long time.
Speaking of Google, search has long been one of Apple’s weak spots compared to Google Maps—no surprise there, given that search is Google’s bread-and-butter. Search seems mildly improved in iOS 13, but you don’t have to go long before you run into some decisions that still seem puzzling. For example, in trying to add my wife’s work to my Favorites—another new iOS 13 feature—it kept insisting that the location was an EV charging station; it took me a while to figure out that was because I’d spelled out the number in her address, using “ten” instead of “10.” Who knew?
In addition to marking favorites, you can now create collections of places that you can share with others, but disappointingly those collections are not synced—it just makes a copy. So you can’t, say, share a list with your partner of places that you want to visit on a vacation. I’m also (perhaps unreasonably?) annoyed at the dearth of custom icons in the Favorites feature, and the fact that I can only use the “Work” icon for my own office. I work at home, Apple. I don’t need a separate work option, but would love to be able to assign that to my wife’s office.
All in all, there’s no question that Maps is greatly improved in iOS 13. As the default mapping option in iOS, Maps has picked up a lot of users over the years, but there are still holdouts who insist on using Google Maps. Will these new features be enough to entice them away? I’m not convinced—as I said up top, trust is hard to rebuild. But there need be no fear about using Apple Maps—it’ll get you there in the end. And I’m hopeful that iOS 14 will offer an opportunity to veer more into new features like Share ETA, rather than just retreading the same ground that Google Maps has already…mapped.
My wife, for example, still checks everything in Google Maps, and I know she’s not alone. ↩
Assuming, of course, that where you are has this information and has allowed Apple to access it. ↩
You can also now create shortcuts that open certain locations, so for example, I can have one titled “When is my next bus” and it will open Maps to my bus stop, but alas, won’t simply tell me that information, which would be nicer. ↩
Sorry, Transit and Walking directions aren’t compatible yet. ↩
Don’t call it Street View. ↩
By Jason Snell
September 17, 2019 12:00 PM PT
Very early in the life of the iPhone, I found myself wishing for the digital equivalent of the Marauder’s Map, so I could see the locations of my friends at a glance. Though a bunch of early App Store apps tried to make it happen, it didn’t really catch on for me until Apple added the feature itself as Find My Friends. Separately, Apple created Find My iPhone, a name that kept getting worse as more Apple devices gained location-sensing technology, but an app that was essential for finding lost hardware.
They’re apps that do basically the same thing—one for your own devices, and one for the locations of people you know. In iOS 13, Apple has done away with both of them, replacing them with the new Find My app. We can quibble about the name—I got used to it in a hurry—but I think it’s a great step forward.
By default 1 Find My opens to much the same view as Find My Friends—the app’s People tab is selected, and the map will zoom out to show people who are near you. As before, “people” is defined as people in your Family Sharing group as well as anyone who has agreed to share their location with you. At the bottom of the screen there’s a People list showing everyone who is sharing their location with you, along with their current location—and if you tap on any of them, the map view will switch to show their location. The list also includes anyone you’re sharing your location with, even if they’re not reciprocating, so you can decide if you want to continue having that kind of a one-way relationship.
The interface has been refreshed to the style of a modern iOS app, but the features are more or less the same as before. You can get directions to the location of your friend, and add notifications to alert you when they’re arriving or departing particular locations.
But just hit that new Devices tab and… you’re essentially in Find My iPhone, but a much more modern version that’s using the same interface as the People tab. Here you’ll see all the devices associated with your Apple ID (and any Apple IDs associated with your family). This section will be a good reminder to de-associate all your old devices—as a product reviewer I have more of these than most people, but my list was still shockingly long.
As with Find My iPhone, the Devices list is a bit scattershot, because different devices can sense location and phone home via quite different means. AirPods will show up on the list, but it’s really only going to show the last time and location another device connected to them. Laptops that are asleep or off may appear, but at best the app will only show where they were the last time they were connected.
This is all about to change with iOS 13 and macOS Catalina, as Apple introduces new technology that will enable other Apple devices to find yours. It’s all encrypted to ensure privacy, but we are about to enter a world where Apple devices emit low-power Bluetooth pings in order to better let you find where they are. The Find My app will be a major beneficiary of this tech.
Then there’s Ultra Wideband tech, which Apple is bringing to the iPhone 11. That technology will allow even more precise discovery of Apple devices. And reports suggest Apple’s also working on a low-power tracking device that you’ll be able to place on other objects, which will then appear in a new Items tab within Find My. In other words, Find My isn’t just a much-needed merging of two longstanding location-tracking apps. It’s a refresh that’s happening because Apple is introducing a raft of new tracking technologies that go beyond anything we’ve seen up to this point.
I know this is a series about iOS features, but I also want to mention how great it is that Find My is coming to macOS Catalina via the Catalyst app-translation technology. Previously, you could view the locations of your friends on the Mac via a very limited Notification Center widget. Now Mac users get the complete Find My app experience. As someone who uses Find My Friends on my Mac all the time to figure out where my family is, I’m happy to finally be able to use an app rather than sliding out a weird little drawer.
I think. I don’t know what happens if you have no designated friends or family members. ↩
By Dan Moren
September 17, 2019 7:51 AM PT
When the iPhone’s multitouch interface first arrived on the scene, it was a testament to the simplicity and straightforwardness that touch could offer. But as the years have gone by and manipulating a slab of glass has become second nature to much of humanity, phone makers have struggled with how to add more complex features to the interface without marring that ease of use.
Apple is no exception to this challenge, but in iOS 13 it finally seems to have made a decision to focus on a single answer: the long press. The long press expands upon the Haptic Touch features introduced in last year’s iPhone XR, as well as absorbing the now-deprecated 3D Touch hardware-based features first introduced in the iPhone 6s.
The long press—or tap and hold—essentially acts like a right-click/control-click/two-finger-click does on the Mac, bringing up a contextual menu of additional options.
For example, long press on a track in the Music app and you’ll get a scrollable menu of general actions like copy or share, as well as those specific to music, like adding it to a playlist or your Up Next queue. In Photos, a long press will give you a preview of that image or video (playing it back if it’s a Live Photo or video), as well as options for copying, sharing, favoriting, or deleting. Long pressing on a link in Safari also gives you an optional preview of the link (much like the peek-and-pop of 3D Touch), along with options for opening the link in the background, downloading it, adding it to your Reading List, and so on. You can even long press on an icon in the home screen to bring up shortcuts to actions within the app. Basically, any time you want to do anything more with an item that you can tap on, you can use the long press.
As someone who has never been a huge fan of 3D Touch, I think the choice to use the long press is great, for several reasons. First, and most importantly, it can be implemented as standard across all iOS devices. No iPad 1 or iPod touch ever had 3D Touch, and it wasn’t even included in every iPhone model. As of iOS 13, you can be sure that whenever you pick up an iOS device, a long press on an item will have the same result. 2
It’s also simpler, in my opinion, to explain a long press to someone than it ever was to explain 3D Touch. 3 Pretty much anybody who’s comfortable using a multitouch interface understands the “tap and hold” concept.
Finally, it provides a place to put all those additional options that were previously hidden in a variety of places, whether accessible via 3D Touch or under the Share menu. No more trying to figure out where exactly those extra features might be. And even though there’s nothing that says “obvious” about tapping and holding, it’s a feature that’s not that hard to discover—and, once you discover it, you can easily generalize it to the rest of the OS.
My only complaint about the long press is that it’s not quite as standardized as I’d like. There are some holdouts: Tapping and holding on a message in Messages, for example, gives you Tapback options, as well as an old-style popover menu of options like Copy. And that long press on a song in the Music app brings up a long list of actions that are elsewhere accessible only via the Share button, which requires an additional tap.
Still, I’m bullish on the future of the long press as a basic input mechanic in iOS. Just like the right-click on the Mac, you can get by without it, but once you learn it’s there, it opens up so many possibilities. Once third-party developers start to adopt this setup in their own apps, it may finally be well on its way to becoming a fact of life for iOS users.
The hardware was reputedly prohibitively expensive on the bigger display of the iPad, and it would have driven up the cost of the cheaper iPod touch and lower-cost phones. ↩
Frankly, I think Apple spent too much time trying to standardize this technology across both the Macs and iOS devices, to the detriment of both—it couldn’t even settle on one terminology. I don’t think 3D Touch was ever great on iOS devices, and “Force Touch” is even less useful on Macs, in my opinion. These are different platforms with different interfaces; you can’t just drop the same interface conceit on both of them. ↩
“No, press it but then press harder.” ↩
By Dan Moren
September 16, 2019 6:21 AM PT
The iPhone’s software keyboard literally 1 reinvented the way we enter text on smartphones: not just because it eschewed the hardware keyboard for a software-based model, but also because of the variety of smart technologies it incorporated, such as autocorrect and tapping-and-holding for different characters. But despite the revolutionary nature of the keyboard, it’s remained largely unchanged since the iPhone’s introduction in 2007.
iOS 13 introduces, for the first time, an alternate way to enter text: swipe typing. 2 Instead of tapping on keys, you put your finger on the first letter and then slide it to the next letter and so on. Based on your finger movements and iOS’s dictionary, the system figures out what word you’re most likely trying to type.
Now, those with Android phones or who have used third-party apps like Google’s Gboard, SwiftKey, or Swype will rightfully point out that Apple didn’t invent this idea. But on iOS, swipe-typing has always been the province of third-party apps, and that’s limited its adoption.
iOS 13’s QuickPath keyboard is surprisingly good for a first effort. In the weeks that I’ve been using the betas, my biggest problem is simply remembering that the feature is there, so accustomed am I to tapping out my messages like someone from the long distant past who still hasn’t seen the series finale of Lost. It’s not error-proof by any means, but what problems I have encountered are outweighed by its convenience in many situations: for one thing, swipe typing when you’re holding your phone one-handed certainly feels a lot easier than tapping.
For those times when QuickPath doesn’t quite figure out what you’re going for, the predictive text bar above the keyboard does offer suggestions for other words that you might have been trying to type, though I maintain that iOS’s autocorrect system is still in need of an overhaul; for example, if you go back to correct a word you’ve already typed, the alternate suggestion system doesn’t seem to work quite as well. It’s also worth noting that, by default, tapping delete when you get the wrong word will delete the entire word, not just the last character, though you can change that option in Settings > General > Keyboard.
Bouncing back and forth between swipe- and tap-typing is seamless, but the need to switch is also the source of my biggest frustration with QuickPath: You can’t swipe your way to any character that’s not on the main keyboard (i.e. numbers or punctuation). Right now, that gear-shifting slows my brain down a bit, but it’s something that I imagine I’ll adapt to in time.
Overall, I think the addition of QuickPath is a welcome one, though there remains the question of how many new users will be willing to change up their habits and how many users of existing third-party keyboard apps will give up the other advantages they offer. More to the point, though, I remain hopeful that Apple’s willingness to add a feature like swipe-typing means that it might actually improve other keyboard features that have remained stagnant for the last several years—but I suppose we’ll have to wait until iOS 14 for those.
By Jason Snell
September 15, 2019 12:26 PM PT
Formerly the third-party app Workflow, Shortcuts was bought by Apple and integrated with iOS last year—but it was a first step. Shortcuts has had a year to spread its roots throughout the operating system, and in iOS 13 it’s been improved and better integrated—with the promise of even more to come in the very near future.
Shortcuts is now included on every iOS 13 device—it’s not an add-on you have to download from the App Store. Apple has also begun to integrate disparate automation features of iOS and place them all inside Shortcuts. Siri Shortcuts, very simple app-based automations introduced in iOS 12, now live inside the Shortcuts app. And beginning in iOS 13.1, the simple automations that you create in the Home app will also appear in Shortcuts—and can be modified and enhanced with additional features of the Shortcuts app.
As someone who is not a software developer, I’ve had to imagine the pain Swift developers went through in the early days as the language evolved and they had to keep rewriting their code to conform to the latest version. With Shortcuts in iOS 13, I’ve gotten at least a taste of that feeling, because upgrading to iOS 13 required me to do some work to get my Shortcuts working again. That’s life on the cutting edge, and the changes are for the better, but be warned—you may have to do some work to get your Shortcuts functioning to your liking on iOS 13.
The biggest improvement in the Shortcuts format itself is the explicit passing of data from item to item. Shortcuts works in a linear flow, items executing one at a time from top to bottom. In previous versions of Shortcuts and Workflow, data generally passed from the previous step, so if you wanted to grab data from somewhere else in the shortcut and use it instead, you’d need to add a Get Variable item and then act on it.
In iOS 13, items in Shortcuts explicitly label what data they’re acting on. By default, it’s the preceding item, but you can see it and change it right within the item, rather than adding additional items. It means that Shortcuts are a lot shorter than before—all those blocks that set and get variables are gone—and it’s clearer what each item is doing. Each item in a Shortcut is styled more like a sentence—“Set name of file file.txt to result.txt” rather than a stack of parameters.
Shortcuts just got a lot more useful if you use Siri, too. You can now create interactive Shortcuts that can ask questions and accept text input, especially useful if you’re not able to look at a screen because you’re using AirPods or CarPlay. And the redesigned Share Sheet in iOS 13 means that you can prominently place specific individual Shortcuts in the Share sheet, making it easy to access them with a single tap.
Shortcuts will also become vastly more usable in iOS 13 because app developers can contribute much more detailed, useful actions into Shortcuts from their apps. In the past, data got passed between apps and Shortcuts either via the clipboard or by embedding lots of data in a passed URL. In iOS 13, apps can specify what actions and data can get passed back and forth, which should—once apps are revised to support this feature—make Shortcuts much more flexible and powerful.
As someone who does a lot of work on the iPad, I’ve found that Shortcuts benefits from the new iPadOS feature that allows you to pin widgets on the home screen. I keep the Shortcuts widget pinned to my home screen, letting me run Shortcuts right from the home screen with just one tap.
There’s more to come. In current beta versions of iOS 13, Apple has added an automation tab to Shortcuts, allowing Shortcuts to run on timers or when triggered by other actions, such as tapping an item containing a chip that’s readable by the same NFC reader that the iPhone uses for Apple Pay. It’s a shame that this isn’t available quite yet, but it’s another example of how integrating Shortcuts deeply into the operating system will pay dividends in all sorts of unexpected ways.
There’s still much more that Apple can do with Shortcuts. I’d like to see the ability to select items in a Shortcut and copy, paste, and duplicate them, for example. I’d also like Apple to continue minimizing the appearance of the Shortcuts app (and the visible scrolling through steps of a Shortcut) when Shortcuts are run. It’s visually distracting and, unless you’re actively building the Shortcut, unnecessary. But that work will have to wait until deferred improvements like automations are added in (hopefully) iOS 13.1.
Regardless of the existence of a few straggling features, Shortcuts in iOS 13 has progressed in exactly the way I had hoped it would. This is Apple’s vision of how user automation will work in iOS, and Shortcuts keeps gaining power, system integration, and app connectivity. The future is bright for Shortcuts users—but with iOS 13, so is the present.
Jason Snell for Macworld
August 28, 2019 9:49 AM PT
On Tuesday, Apple released a new iOS beta for developers, and it was a bit of a shocker, because it wasn’t an iOS 13 beta, but an iOS 13.1 beta. This raises a lot of issues about how Apple is approaching its impending releases of new hardware, its relationship with beta testers, and how it approaches overall software quality.
By Jason Snell
June 13, 2019 1:09 PM PT
My initial thought, when sitting in the audience at Apple’s WWDC keynote, was that iPadOS 13 was going to present me with a remarkable number of items from my iPad wish list. And that’s not wrong—it looks like this release is going to check a lot of boxes—but the keynote never tells the whole story. Some features are omitted from the keynote but end up being huge in my overall estimation of a new release. And of course, some wished-for features are never mentioned because, after scouring feature-list web pages and installing developer betas, you hit the inescapable realization that they just aren’t there.
In the bubble of the convention center, you only hear what Apple wants to communicate. Once you leave the bubble, you begin to process what’s real and what’s not. Reality begins to set in. It’s a good thing—reality is where we (well, most of us) live. And reality, not the stuff of wish lists, is where new software releases run.
Jason Snell for Macworld
May 22, 2019 11:26 AM PT
WWDC, Apple’s Worldwide Developer Conference, is less than two weeks away. In a dozen days we’ll know the broad outlines of where Apple is taking its software in the next year. It’s an exciting time, when you hope against hope that the features you dream about will come true and make it into a new release.
It doesn’t always happen, but when it does, it’s pretty great. Here’s what I’m hoping to see in iOS 13 when Apple unveils it on Monday, June 3.
Jason Snell for Tom's Guide
April 27, 2019 8:32 AM PT
We’re less than six weeks out from Apple’s annual Worldwide Developers Conference, and rumors about the future of Apple’s platforms abound, particularly for iOS 13. Most notable have been the leaks from 9to5Mac’s Guilherme Rambo, who reported numerous tidbits about the future of the operating system that runs iPhones and iPads.
This year, Apple’s giving iOS developers the ability to deploy their apps on macOS, which will change the Mac dramatically. But as you might expect, Apple’s been cooking up some improvements to iOS that are especially exciting. Here are some of the rumored features that have caught my eye.
Jason Snell for Macworld
November 21, 2018 10:31 AM PT
Reviewers can’t agree if the iPad Pro can be used for “real work” or not, but there’s one thing they all seem to agree on: the new iPad Pro hardware is great and Apple needs to invest in upgrades for iOS to take advantage of it.
But what form should those upgrades take? Leave it to reader Mike R from Twitter to boil this down to the right answer: It’s time for a David Letterman-style Top 10 List. So here it is, ready to be delivered to the home office in Cupertino, California, the Top 10 iPad Features We’d Like To See in iOS 13…