By Dan Moren
February 22, 2019 7:00 AM PT
I found myself curious recently as to whether I was getting enough sleep. There are, of course, a bunch of different ways to track sleep on your devices, including many third-party apps. Apple itself has even added a Bedtime feature that reminds you when it’s time for bed, sets an alarm for when to get up, and then logs the time in the Health app.
Having tried that for a while, I found a few things about the approach frustrating. The goal of the Bedtime feature is to have you go to bed and get up at the same time every day, and that’s not something I can always control. It’s very inflexible and prescriptive in a way that I found annoying, to the point that I eventually just stopped using it.
What I really wanted was for iOS to be a bit more intelligent. For example, it could realize that when I turn off my bedside light (which is a HomeKit-compatible Philips Hue bulb) I’m going to bed. And then, when I pick up my phone in the morning it could log that I’m awake, and store the resulting information in the Health app.
Alas, that functionality doesn’t exist. So I made it myself using a pair of Shortcuts.
The Bedtime shortcut, which can be triggered via the Shortcuts widget or Siri, sets the Good Night scene, turning off my bedroom lights, and then stores the current time in a text file in iCloud Drive.
The I’m Up shortcut, which I manually trigger when I wake up in the morning, reads the bedtime from that text file, gets the current time, logs both in the Health app, calculates the difference between the two, and sends a notification about how long I slept. 1 (Although the Get Time Between Dates action in Shortcuts sadly only lets you choose hours or minutes, and, in the former case, rounds the result off.)
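For anyone curious about the underlying logic, the two shortcuts boil down to writing a timestamp at bedtime and diffing it against the wake-up time. Here’s a minimal Python sketch of the same idea (the file name and message format are my own stand-ins, not what Shortcuts actually uses):

```python
from datetime import datetime
from pathlib import Path

# Stand-in for the text file the Bedtime shortcut keeps in iCloud Drive.
BEDTIME_FILE = Path("bedtime.txt")

def log_bedtime(now=None):
    """Mirrors the Bedtime shortcut: store the current time as text."""
    now = now or datetime.now()
    BEDTIME_FILE.write_text(now.isoformat())

def log_wakeup(now=None):
    """Mirrors the I'm Up shortcut: read the stored bedtime and
    compute how long you slept, without rounding off the hours."""
    now = now or datetime.now()
    bedtime = datetime.fromisoformat(BEDTIME_FILE.read_text())
    seconds = int((now - bedtime).total_seconds())
    hours, remainder = divmod(seconds, 3600)
    return bedtime, now, f"You slept {hours}h {remainder // 60}m"
```

Unlike the Get Time Between Dates action, a computed difference like this can report hours and minutes together instead of rounding to one unit.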
These Shortcuts are pretty simple, but they get the job done. On the off chance that other people are looking for something similar, I’ve included links above. The only alteration you’ll have to make is the HomeKit scene you want to set for Bedtime, if any. (You can also change the location or name of the bedtime text file; just remember to change it in both Shortcuts.)
When you log Sleep Analysis, the Health app lets you choose whether the data should be recorded as “Awake,” “In Bed,” or “Asleep.” Given that I can’t detect my sleep state, I went with the middle option. ↩
Dan Moren for Macworld
February 22, 2019 5:32 AM PT
Oftentimes, new technologies can seem like solutions in search of problems. And while Apple isn’t above those kinds of moves, it also often finds itself ahead of the curve, pushing technologies with a lot of potential before the world at large is ready for them.
Apple Pay has, since its introduction, tended toward the latter. It’s a system that offers real, tangible advantages over the status quo; the ability to pay with your iPhone or your Apple Watch not only offers more convenience than paying with a physical card but also bestows much-needed security on every transaction. It’s become more and more popular, but there are still lots of places where you can’t yet use it.
Of course, much of Apple Pay’s adoption isn’t entirely under Apple’s control. Some retailers still need to update the hardware or software on their point-of-sale terminals, and the makers of some of those payment systems may have to add Apple Pay compatibility as well. While the recent addition of major chains such as Target and 7-11 helps, Apple Pay still hasn’t trickled down to every local shop in my neck of the woods.
Adoption’s just one part of the equation. Even without Apple Pay being ubiquitous, there’s still room for Apple to improve what its contactless payment system offers.
February 21, 2019 4:43 AM PT
This week, on the 30-minute tech show where you can hear the gears grinding, Dan and Mikah are joined by special guests Allison Sheridan and Kevin Clark to talk about improving Apple’s emoji discoverability, getting bogged down in finding the perfect tool for a task, vintage tech (and vintage games) we’d like to pick up where we left off, and the default app we’d redesign in iOS 13. Plus, a special Red Planet-themed bonus question.
By Jason Snell
February 20, 2019 10:30 AM PT
When I first started using a Mac, there wasn’t built-in file sharing—you copied files onto a floppy disk and walked them to a different computer, a process delightfully known as “sneakernet.” 1 But the ’90s were an exciting time for more than just grunge music, and the Mac finally got built-in file sharing with System 7, so you could find a Mac, connect to it, see its shared folders, and drag things in and out within the Finder.
This is a file-sharing model that has largely remained intact through all the changes in the Mac platform. To this day I can open a browser window, find a local Mac, give it a password (or log in as a guest), and view a subset of its files as if it were an external disk on my Mac.
But something funny happened back in 2011: Apple introduced an entirely different approach to exchanging files between devices, one that it added to iOS a few years later: AirDrop. Unlike the old approach of mounting a shared folder or volume, this was modern Apple’s take on solving the problem of exchanging files.
AirDrop’s interface is simpler, because you don’t mount any volumes—you just exchange individual files. (AirDrop’s still a drag-and-drop process on the Mac, but on iOS it’s all done out of the sharing interface.) AirDrop is simpler than setting up file sharing and more elegant than emailing yourself or a family member a file (and waiting for it to make the round trip up to your mail server and back down). It bypasses the complexities of your local network and connects directly via Bluetooth and Wi-Fi.
Over the years, Apple has continued to improve AirDrop. In the last year I’ve completely abandoned my old method of transferring large files (mostly audio stuff for podcast projects) between my Mac and my iPad. I previously attached a cable and used the file-sharing features embedded clumsily within iTunes. Now I just use AirDrop. When I AirDrop giant audio files to my iPad, the transfers are fast and iOS does exactly the right thing, offering to open the files in any compatible app, including Ferrite, my podcast editing app of choice. It couldn’t be easier.
Well, that’s not right. It could be easier. When I complete a podcast project I want to transfer the archived Ferrite project file to my Mac, where I can file it away for backup and long-term storage. AirDrop’s an easy way to do it—but only if I’m within 10 or 15 feet of my Mac mini server, which lives in a corner of my garage. I end up walking into the garage and standing by the server until the AirDrop concludes.
This got me thinking: AirDrop is well established and easy to use and, especially on iOS, a far better alternative to traditional file sharing. (Let me also point out that Apple has refused to support traditional file sharing access in iOS, though you can get to it via a third-party app such as FileExplorer.) So why not expand it to include cases where devices are on the same local network but not within close proximity?
What I’m advocating is an extension of AirDrop that doesn’t just search for devices that are nearby, but also offers devices that are AirDrop-capable and reside on your local network. I can appreciate that the transfers might not be as fast and that there are security issues that would need to be worked out, though I’m not sure the security aspects are more complicated than using AirDrop at a crowded conference or café. Apple has already built in layers of permissions, including the ability to only transfer files to your own Apple ID or the Apple IDs of people you have added to your contacts list, and a requirement that you accept all file-transfer requests from other people.
The other day my daughter needed a few big media files that were stored on my iMac. I told her to walk her MacBook out into the garage and stand there while we transferred the files. It seemed utterly unnecessary. Why couldn’t she go back to her bedroom and get the files across our local network? Why should I set up file sharing on my iMac for a one-off file transfer?
With AirDrop, Apple has come up with a simpler way to pass files around. In doing so, it’s made traditional file sharing seem old and fussy. So my modest proposal to Apple is to take AirDrop and expand its powers. Let people in homes and offices use it to drop files to each other, even if they’re not fortunate enough to be sitting right next to each other. Apple, you did your job and you did it well—I’ve utterly embraced AirDrop. But now I want more.
We used an add-on product called TOPS to move files around our local network at my college newspaper office. ↩
Jason Snell for Macworld
February 20, 2019 9:53 AM PT
Apple’s current strategy in the home tech market is a bit murky. It launched the HomePod and Apple TV 4K in 2017 and HomeKit support seems to have become much more widespread lately, but it also killed the AirPort line of products and has stood by as competitors like Google and Amazon snap up companies like Nest and Eero.
This past week we learned that the company has hired a new head of home products, which makes me ask the question: What exactly does Apple expect Sam Jadallah to do? Is his job to make deals with HomeKit partners and make the HomePod more successful? Or is this the sort of thing that happens when a company shifts gears because it’s realized that its old strategy wasn’t working?
There is no end to the opportunities for Apple in building more devices for the home. It just has to decide if it wants to compete in that market, or write it all off. I’m increasingly coming to believe that Apple needs to do more, not less, in building home products.
February 20, 2019 7:32 AM PT
This week on the most irreverent tech show not yet ripped off by a pale imitation, we recap this past week’s all-Rebound-host Mario Kart tournament, with some surprising revelations. Plus, a rundown of one analyst’s reports on what Apple has in store for the coming year, a look at Huawei’s latest shady doings, and the problem with pockets, both small and large.
By Jason Snell
February 19, 2019 2:41 PM PT
Last week I took a trip during which I needed to record three podcasts (Liftoff, Download, Six Colors Subscriber Podcast) with guests who would be participating via Skype. I almost took my trusty old MacBook Air with me, but I decided to see if I could figure out a way to replicate the bulk of my home recording setup without requiring a Mac.
In the past, I’ve done something similar using the Audio-Technica ATR2100-USB, a microphone that can output a digital signal over USB and an analog signal via an XLR cable simultaneously. The problem is that the last time I tried to use the ATR2100-USB with my iPad Pro, it didn’t return my own voice to my ears, leaving me unable to judge the sound quality of my own microphone. After years of having my own voice return to me, I strongly prefer not to record without being able to hear myself. (I use in-ear headphones that largely shut out audio from the outside world, so the experience of speaking while not hearing yourself is even more profoundly weird than it would be with leaky earbuds.)
This time I wanted it all, or at least as close to all as I’m able to get with iOS in the mix: A pristine recording of my own voice, that same high-quality microphone audio also flowing across digitally to my podcast guests via Skype, and the ability to hear both my guests and myself at the same time.
I made it work with the addition of one box to my usual iPad workflow. Here’s what I did:
First, I plugged an analog XLR microphone into my Zoom H6 recorder. That solves the “get a pristine recording of my own voice” problem. But how to get that audio out of my Zoom recorder and into my iPad Pro? If I plug my headphones into the Zoom, I’ll be able to hear myself but not my guests. If I attach the Zoom to the iPad, I can relay my audio—but the Zoom is unable to record audio when it’s being used as a USB audio interface.
Second, I needed to route my microphone audio out of the Zoom to a device capable of transferring it to my iPad Pro (and also transferring the voices of my panelists from the iPad back to me). Any standard USB audio interface should be more or less capable of that, and so I used mine—the Sound Devices USBPre2. The trick was how to connect the Zoom to the USBPre2. Fortunately, the Zoom has a Line Out port on its front, the USBPre2 has a line-in port on its side, and my random drawer of audio cables happened to contain the right cable (minijack on one side, stereo RCA on the other) to connect the two.
Third, I attached my USB audio interface to my iPad Pro. (I used a USB-B to USB-C audio cable for this, but an old-school cable will also work with an adapter.) I haven’t yet met a USB device that my iPad Pro is incapable of powering by itself, so the USBPre2 worked just fine. I also attached my headphones to the USBPre2, so I could hear both myself and my guests.
That’s it! I could launch Skype, press record on the Zoom, and record a podcast. My guests heard my high-quality microphone audio, I could hear them, and I could hear myself (with no noticeable latency). The only thing I’m really missing is the ability to record my guests’ audio too, as a backup, but I chose to live dangerously and speak only to people who know what they’re doing when it comes to recording for a podcast.
The final step was one that I’ve described before, namely using an external Wi-Fi box to transfer my audio files back to my iPad for editing. This workaround will remain necessary until the day Apple decides to let iPads see external storage devices directly. Once the participants sent me their files, it was off to Ferrite to put the podcasts together. (As an added bonus, in a recent update Ferrite gained the ability to split multi-track QuickTime audio files into their component tracks. Ecamm’s Call Recorder for Skype produces files like these, and until Ferrite was updated, I had to use a Mac to split those audio files in two. No longer.)
And that’s it! It’s not pretty, it’s two more boxes than I’d otherwise bring, and I refuse to weigh the difference in boxes and compare it to the weight of my 11-inch Air. The important thing is that I was able to travel with my iPad and no Mac and have more or less the same podcast experience that I have when I’m sitting at home at my iMac.
February 18, 2019 12:36 PM PT
A 16-inch MacBook Pro? A 6K Apple external display? Analyst Ming-Chi Kuo has dropped the first detailed report of Apple’s 2019 hardware plans, and Myke and Jason take turns dissecting them and wildly speculating about possible features. Also we ponder what a services-themed Apple event might look like, which is a lovely discussion until someone mentions Drake.
Jason Snell for Tom's Guide
February 17, 2019 10:55 AM PT
It looks as though Apple will hold a special event next month unlike any it’s held in recent memory, according to multiple reports. At the center of the stage won’t be new Mac, iPhone, or iPad hardware, but a new collection of subscription services.
This rumored March 25 event has probably been inevitable for a few years now, ever since Apple called out the importance of services revenue to its corporate growth. The most reliable source of growth at Apple the last few years has been in services, powered largely by the App Store, along with Apple Music, Apple Pay, and iCloud.
With its new services, Apple is planning on using its stature in the tech world, the size of its customer base, and its staggering cash flow to insert itself in markets that are undergoing rapid transformations. And while Apple’s not going to beat Netflix or Amazon Prime overnight, Tim Cook could always unveil a bundle that ties together video, music, news and more that could further shake things up.
Dan Moren for Macworld
February 15, 2019 5:10 AM PT
Apple’s plans to launch a subscription service for news are, by this point, an open secret. Just under a year ago, the company announced its acquisition of existing magazine subscription service Texture, which Apple executive Eddy Cue quickly revealed would be folded into the existing Apple News app.
Since then, the news service has mostly been absent from the limelight, generally taking a backseat to the more prominent news leaking out around Apple’s upcoming video streaming service. But as recent reports have started to filter out that the news service and TV service may be announced by Apple at the same event in March, combined with rumors about the revenue split between Cupertino and its periodical partners, the news service is suddenly back in the front seat—as are the challenges that it will face when it eventually sees the light of day.
February 13, 2019 11:22 AM PT
This week on the 30 minute tech show that gives 60 Minutes a run for its money, Dan and Mikah are joined by special guests Aleen Simms and Casey Liss to discuss Apple wanting 50 percent of publisher revenue for its news subscription service, where exactly all the App Store money is going, what Amazon wants with Eero, and transformative technology that seems underwhelming by today’s standards. Plus, a Valentine’s Day-themed bonus topic.
Jason Snell for Macworld
February 13, 2019 7:48 AM PT
Federico Viticci said it best on the Connected podcast last week: The departure of Angela Ahrendts as Apple’s retail chief is a Rorschach test. One’s reaction to the news will reveal a lot about one’s feelings about the current state of Apple’s retail stores.
I’ve seen a lot of criticism of Ahrendts that focuses on aspects of the Apple Store experience that actually preceded her. No, she didn’t invent the where’s-the-line, where-do-I-stand setup that completely breaks everything we ever learned about how to behave in a retail store. (Under her tenure the approach was modified, not discarded—and in recent years I’ve noticed a more aggressive positioning of employees at the front of stores to intercept new shoppers and put them in the right place.)
February 13, 2019 7:37 AM PT
This week, on the irreverent tech show that will always be your Valentine, we discuss Amazon’s purchase of Eero, Kashmir Hill’s attempt to cut the major five tech companies out of her life, Apple’s naming of a new product marketing head for VR, Angela Ahrendts leaving Apple, and, of course, OUR PICKS.
Plus, tune in for our Rebound-host Mario Kart TOURNAMENT, live this coming Sunday, February 17th at 7pm Pacific/10pm Eastern. More information to come.
By Jason Snell
February 12, 2019 9:34 PM PT
The point wasn’t that these tasks were impossible on the iPad, but that they were inconvenient enough—requiring me to research a bunch of apps or figure out workarounds or write scripts—that I was better off just going back to my Mac and doing the work there, primarily in BBEdit and Numbers.
I complained about not being able to do grep searches in my iOS text editors of choice, and while that’s true, several people pointed out that there are iOS apps that are capable of them, most notably Coda by Panic and Textastic Code Editor 7. 1 I own both of these apps and while I don’t like writing articles using them—they’re development tools more than writing tools—they absolutely support grep and I will use them in the future when I need to do pattern-matching searches on iOS.
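To give a concrete sense of the kind of pattern-matching search I mean, here’s a grep-style search-and-replace expressed in Python’s re module (the pattern and sample text are my own illustration, not tied to either app):

```python
import re

# A typical writing chore: turn Markdown links into plain text
# with the URL in parentheses after the link text.
text = "Read [Six Colors](https://sixcolors.com) every day."
result = re.sub(r"\[([^\]]+)\]\(([^)]+)\)", r"\1 (\2)", text)
# result == "Read Six Colors (https://sixcolors.com) every day."
```

A plain text editor without regex support would force you to make edits like this one link at a time.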
The biggest impediment to finishing my work on the iPad, though, came from the fact that I needed to generate a bunch of charts in Numbers—and they use a non-default font, Proxima Nova, that wasn’t installed on my iPad. How do you install extra fonts on the iPad?
It turns out, there’s a way—just a spectacularly inelegant one. Several apps will do it, taking font files transferred from the Mac and wrapping them in custom configuration files, then emailing them to yourself, at which point you can install them via the Settings app. I tried the free iFont 2 and it worked perfectly. Installing via the same kind of custom configuration file you’d use to install VPN software or to opt in to one of Apple’s beta-testing programs is not intuitive in any way, but with the help of iFont, I was able to get my charts to display on my iPad identically to how they display on my Mac.
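For the curious, the file such apps generate is an ordinary property-list configuration profile carrying a font payload. Here’s a rough sketch of what one looks like, based on Apple’s documented com.apple.font payload type; the identifiers, UUIDs, and base64 data below are placeholders:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
    <key>PayloadContent</key>
    <array>
        <dict>
            <key>PayloadType</key>
            <string>com.apple.font</string>
            <key>PayloadIdentifier</key>
            <string>com.example.font.proximanova</string>
            <key>PayloadUUID</key>
            <string>00000000-0000-0000-0000-000000000001</string>
            <key>PayloadVersion</key>
            <integer>1</integer>
            <key>Name</key>
            <string>Proxima Nova</string>
            <key>Font</key>
            <data>…base64-encoded TTF/OTF font data…</data>
        </dict>
    </array>
    <key>PayloadDisplayName</key>
    <string>Proxima Nova Font</string>
    <key>PayloadIdentifier</key>
    <string>com.example.fontprofile</string>
    <key>PayloadType</key>
    <string>Configuration</string>
    <key>PayloadUUID</key>
    <string>00000000-0000-0000-0000-000000000002</string>
    <key>PayloadVersion</key>
    <integer>1</integer>
</dict>
</plist>
```

Once a profile like this is emailed or otherwise transferred to the device, iOS offers to install it via the Settings app, which is why the process feels like enrolling in a beta program.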
This is perhaps my final lesson from this process 3: that I can work around most, if not all, of the roadblocks that iOS places in front of me. It might take an app I’ve never heard of, a feature of an app I rarely use, or hours of hacking together scripts based on code samples found in Google searches, but I can probably make it work. That’s not necessarily an endorsement—in the end it was far easier for me to go back to the Mac, where I’ve assembled all the tools I need to do my job over more than two decades. It’s a reminder that as appealing as working on my iPad is, there are still rough areas that I’m much more comfortable handling on my Mac.