Every now and then I type something into Spotlight on the iPad and I see that Twitterrific is still installed. I can’t bear to uninstall it, but since Twitter killed third-party client apps, I don’t use Twitter very often. Maybe I check one of my lists once a day. That’s it. And I used Twitter a lot—with my third-party client of choice.
The geniuses who own Reddit have apparently decided to walk the same path as Twitter. Here’s the report from Christian Selig, the developer of Apollo, a leading (and beloved) Reddit app:
Had a call with Reddit to discuss pricing. Bad news for third-party apps, their announced pricing is close to Twitter’s pricing, and Apollo would have to pay Reddit $20 million per year to keep running as-is.
Apollo made 7 billion requests last month, which would put it at about 1.7 million dollars per month, or 20 million US dollars per year. Even if I only kept subscription users, the average Apollo user uses 344 requests per day, which would cost $2.50 per month, which is over double what the subscription currently costs, so I’d be in the red every month.
Not only is the price ridiculous, but (as Selig shows with some back-of-the-envelope math) it’s far beyond what Reddit itself makes on its users. As with Twitter, there is a path for Reddit to walk that allows Selig to build a sustainable app business and Reddit to be compensated for its service. But this isn’t it.
If Reddit continues on this path, it may discover that some of its most devoted users are devoted because they love Apollo. And if Apollo vanishes, many of those users will too.
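For what it’s worth, Selig’s arithmetic holds together. Here’s a quick sketch using only the figures quoted above; the per-request rate is derived from those numbers, not an official Reddit price sheet:

```swift
import Foundation

// Back-of-the-envelope check of Selig's figures (all inputs are from his post;
// the derived per-request rate is an inference, not Reddit's published price).
let requestsPerMonth = 7_000_000_000.0   // Apollo's API requests last month
let monthlyBill = 1_700_000.0            // Selig's quoted monthly cost at the new pricing

let ratePer1000 = monthlyBill / requestsPerMonth * 1_000   // ≈ $0.24 per 1,000 requests
let annualBill = monthlyBill * 12                          // ≈ $20 million per year

let requestsPerUserPerDay = 344.0
let costPerUserPerMonth = requestsPerUserPerDay * 30 * ratePer1000 / 1_000  // ≈ $2.50

print(String(format: "$%.2f per 1K requests, $%.0fM per year, $%.2f per user per month",
             ratePer1000, annualBill / 1_000_000, costPerUserPerMonth))
```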
Interestingly, my methodology hasn’t really changed much. I still use the Remove Silence command in Logic to separate sounds into visible blocks, and then edit left to right, looking for collisions and interruptions.
The big changes since 2015: I’ve abandoned Skype for Zoom, and Skype Call Recorder for Audio Hijack. And Ferrite Recording Studio on iPad is now in the mix. But my article also covers editing in GarageBand, since it’s free. It’s been literally a decade since I pointed out that GarageBand would be fantastic for podcast editing with a few very small feature additions that already exist in its big brother, Logic. Unfortunately, Apple has never bothered to add them.
As a result of the new arrangement, Padres fans can now obtain a new direct-to-consumer streaming subscription for $19.99 per month or $74.99 for the rest of the season by registering at MLB.TV. This offer is only for Padres fans in the Club’s Home Television Territory and is a separate service than the MLB.TV out of market package. By offering a direct-to-consumer streaming option on MLB.TV in the Club’s territory for the first time, MLB is able to lift the blackout for Padres games previously distributed on Bally Sports San Diego. Fans can also find more information about the availability of Padres games at Padres.com/tunein.
The oncoming failure of regional sports networks in the face of cord cutting is one of the more interesting media stories of our times. While some local cable channels have begun to sell games to cord cutters—Red Sox broadcaster NESN was the first—this is the first time that Major League Baseball itself has taken over all production for a team’s games, and is streaming them directly in the MLB app. (For continuity’s sake, the games will also be on local cable, satellite, and Internet TV providers in the Padres’ geographic territory.)
It sure feels like a milestone moment in the future of sports broadcasting—and the unwinding of the exclusivity of cable TV for sports broadcasting.
My thanks to Kolide for sponsoring Six Colors this week. Kolide offers a more nuanced approach to setting and enforcing sensitive data policies. At most companies, employees can download sensitive company data onto any device, keep it there forever, and never even know that they’re doing something wrong.
IT teams routinely struggle to enforce timely OS updates and patch management, meaning that end users are storing sensitive data on devices that are vulnerable to attack. Many MDM solutions are too blunt an instrument, and many DLP tools are too extreme and invasive.
Kolide’s premise is simple: if an employee’s device is out of compliance, it can’t access their company’s apps. Kolide lets admins run queries to detect sensitive data, flag devices that have violated policies, and enforce OS and browser updates so vulnerable devices aren’t accessing data. And instead of creating more work for IT, Kolide provides instructions so users can get unblocked on their own. Check out Kolide today.
By Jason Snell
May 25, 2023 9:18 AM PT
Apple TV’s multiview feature is so good, I want it everywhere
Twice as much baseball, if you want it.
Sometimes you want to watch more than one thing on your TV.
This impulse was initially satisfied by the introduction of TVs with picture-in-picture functionality, but as access to TV began to come from various decoder boxes, picture-in-picture became less practical.
In recent years, set-top-box software has become more sophisticated, home Internet bandwidth has gotten faster, and multi-view TV has returned as an option. Apple introduced picture-in-picture to tvOS, and some apps like ESPN and Fubo have built their own features to allow you to tile multiple live video streams at one time.
Three MLS games at once on Apple TV.
With tvOS 16.5, Apple has added multi-view functionality to the TV app for its live sports broadcasts. This past weekend I was able to try it out with both Friday Night Baseball and Saturday MLS games.
I use Fubo’s multi-view feature all the time; Apple’s is similar but with some uniquely Apple touches. (For example, every box has rounded corners.) To enter multi-view while watching a live event, bring up the player controls and choose the new multi-view option (it looks like a four-square grid; the classic picture-in-picture feature is also still available).
When you enter multi-view, a row of available live events will slide up from the bottom of the screen. The event you’re currently watching will be selected, and you can click on other events to add them to the stack. You can watch two, three, or four events at once and even select which layout you’d prefer. (For example, you can have four events displayed in four tiles taking up quadrants of your TV—at 4K resolution, it’s like you’ve got four 1080 HD TVs!—or you can opt to display one feed at a large size with three others as thumbnails stacked up to the right.)
You can switch audio between feeds by moving around using the Apple TV remote. Clicking on the selected feed will slide it forward into full-screen; tapping the back button will return the interface to multi-view. It’s all pretty straightforward and easy to figure out.
Unfortunately, this excellent implementation is currently limited to the TV app itself. That’s great for Apple’s sports ambitions, but Apple is also the owner of the entire tvOS platform—and this feature should really be a part of tvOS itself. I watch live sports on the Fubo, MLB, ESPN, Peacock, and Paramount+ apps, sometimes at the same time.
The TV app’s multi-view feature is good. So good that I want it everywhere, and I’ll be crossing my fingers that Apple might offer such support in the next version of tvOS.
Everyone wants to talk about AI. Most of them don’t know what it is, but they still want to talk about it. Who’s got it, and who doesn’t. What industries it’s ripe to disrupt. And the one company that’s not in the middle of that conversation is Apple.
Viral images generated by Stable Diffusion and pathological chatbots from OpenAI, Microsoft, and Google are the story of the day. Apple, meanwhile, has nothing. Or perhaps, considering the current state of Siri, less than nothing.
Apple’s nowhere when it comes to AI. That’s the narrative. The thing is, it’s not true. At least, not yet.
Special guest: Jason Snell. Topics: Headset, headset, headset. And no baseball talk other than how games might look in VR. Also: Final Cut Pro and Logic Pro for iPad, and GM’s dumb decision to drop CarPlay.
Hands on with Final Cut Pro and Logic Pro for iPad
Final Cut Pro and Logic Pro are finally available for the iPad. I’ve had a week to use beta versions of the apps, each of which arrives in the App Store on Tuesday for $5/month or $50/year. And while it’s far too soon to issue final judgments, I’ve definitely got some initial impressions about both of these apps.
Logic Pro: Logical for musicians?
It just wasn’t meant to be.
It’s not you, Logic Pro—it’s me.
I’m not a musician. While Logic Pro on the Mac is one of the apps I’ve used the most over the last decade, I am using it decidedly wrong. I use Logic to edit podcasts, and while it’s perfectly good as a podcast editor, I know I’m not Logic’s target market, nor are its features tuned for me. And that’s only right!
When Apple chose to build Logic for the iPad, it logically focused on music creation and production. The result is an app that I feel like I just can’t judge fairly. I attempted to edit a podcast on Logic on iPad, but the commands I use the most just aren’t there. Splitting clips requires toggling to the separate Split mode, selecting a clip, and swiping down—or alternately, tapping and holding on a clip to bring up a contextual menu, then selecting Split Clip from the Split submenu. Strip Silence, a tool to automatically break long clips into component parts, doesn’t appear at all.
Unlike Final Cut, Logic offers roundtrip support for Logic projects between iPad and Mac. That’s great, but be warned: your Mac project must have been saved as a package (if it’s not, you’ll need to use the Save As command to make a project version) and must use the musical grid, not the standard time format. (That’s a very strong hint to anyone who is not a musician—this is not the tool for you.)
Perhaps some of those features will come in time. Perhaps they won’t. That’s okay—Logic is not a podcast editing tool but a tool for musicians. If you are a podcast editor who wants to use the iPad, consider the $30 app Ferrite Recording Studio, which also works on the iPhone and does just about everything a podcast editor would want.
Final Cut Pro: A work in progress
I use Logic a lot on my Mac, but I also use Final Cut Pro all the time. Lately, I’ve been editing short video clips from the Upgrade podcast into videos to share on social media. So I was excited to take Final Cut for iPad for a spin and used it to build video clips from last week’s and this week’s episodes of Upgrade.
Final Cut Pro for iPad doesn’t contain the entire Final Cut feature set and isn’t round-trippable between platforms, though iPad projects are importable to the Mac. I quickly ran into a feature that I use—image stabilization—that just isn’t available on the iPad at this point. You may find some of your favorites missing, too.
While the iPad app’s interface isn’t quite the same as the Mac app, it’s close enough to feel familiar. Performance was never an issue on the M2 iPad Pro I used to test the app—even file exports were snappy. Once I got the hang of it, I was able to edit and export projects pretty quickly, and nobody would ever know that I used my fingers rather than a keyboard and mouse to do it.
However, there’s a lot of room for improvement, even when it comes to Final Cut’s basic touch interface. All the basic editing tools are there, and Apple has come up with some very clever ways to provide touch interfaces for many of them. There are quick-access buttons to turn on multiple selections or to quickly split or trim a clip to the play head. You can choose from multiple selection modes, such as range or clip or edge. Though I was working without any documentation (owing to the apps not being out yet!), I was able to figure out almost everything I wanted to do.
But just because something’s usable doesn’t mean it’s efficient, and that’s where this 1.0 version of Final Cut for iPad falls down. Using Apple’s preferred method of selecting a bunch of clips and then splitting them all at the play head—something I do all the time—required tapping on the selection mode, tapping on each clip in turn, then tapping Done, then tapping the Split at Playhead icon. For a four-clip stack, that meant seven taps every single time I wanted to split those clips.
Dragging items around in the timeline could be frustrating, even when waveforms weren’t turning upside-down and crashing.
The good news is, there’s a bit of a workaround—you have to start dragging with your finger when it’s in an empty area of the timeline and then keep dragging out the resulting selection box until it selects all the clips—at which point you can tap Split at Playhead and get what you want.
Unfortunately, I couldn’t find solutions for other common editing behaviors of mine. Every clip split requires a tap in the middle of the screen to select the clip and a tap at the bottom right of the screen to split it. Every time I need to toggle between play and pause, I have to tap the play button at the middle left of the screen, which means holding the iPad in a specific way so that my thumb can hover over the play/pause button most of the time.
These might seem like small things, but every extra gesture, every extra five inches covered, slows down the editing experience. My experience using Ferrite Recording Studio to edit podcasts on the iPad might be instructive: Ferrite lets me split clips by tapping on them and swiping down on the clip with my finger. Two simple gestures, a tap and a swipe, in a single location—much more efficient. Ferrite lets you toggle play and pause by tapping two fingers down on the screen. I don’t want to think about how much time I’ve saved—let alone the creative cost of breaking my concentration!—by tapping with two fingers rather than having to move my hand elsewhere on the iPad screen to hunt for a small play/pause icon.
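For the curious, wiring up that kind of shortcut is not exotic. Here’s a minimal sketch (a hypothetical editor view controller, not Ferrite’s or Apple’s code) of a two-finger tap that toggles playback while leaving single-finger taps free for selection:

```swift
import UIKit
import AVFoundation

// Hypothetical editor view controller: a two-finger tap anywhere on the canvas
// toggles playback, so single-finger taps remain available for selecting clips.
final class EditorViewController: UIViewController {
    let player = AVPlayer()

    override func viewDidLoad() {
        super.viewDidLoad()
        let tap = UITapGestureRecognizer(target: self, action: #selector(togglePlayback))
        tap.numberOfTouchesRequired = 2   // the two-finger part
        view.addGestureRecognizer(tap)
    }

    @objc private func togglePlayback() {
        if player.timeControlStatus == .playing {
            player.pause()
        } else {
            player.play()
        }
    }
}
```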
The app also doesn’t work right when moving clips around within the timeline. I found it exceedingly difficult to slide a clip forward or back without my touch being ignored or triggering an unwanted contextual menu. The only workaround that seemed somewhat reliable was to tap a clip and drag it upward as if I were going to relocate it to a different layer in the stack, then drag it left or right (as a transparent ghost clip) before dropping it back down. Editing by touch was easy—but sliding clips around was excruciating.
I did appreciate the introduction of the jog wheel, a floating interface element that lets you move through the timeline or trim or move clips with precision. I struggled with it conceptually for a few minutes, but once I saw how it worked, I was able to integrate it into my workflow. It certainly helped ease the pain when I was unable to slide clips around by dragging them with my fingers!
A few minutes of editing on an iPad, including use of the jog wheel.
I was similarly disappointed by the app’s seeming indifference to the Apple Pencil. I have come to rely on the Apple Pencil when I’m editing audio on the iPad with Ferrite. It allows a level of precision—both of selection and of gesture—that makes me much faster than I am with my meaty fingers alone. It feels like Final Cut could really fly if the Pencil could quickly split clips and move things around, but right now, it’s almost irrelevant.
I did pick up quite a bit of speed when I snapped the iPad Pro into a Magic Keyboard. Keyboard shortcuts sure can speed things up, and Apple has provided a bunch. Not only was play/pause toggled by the space bar, but I was able to perform familiar Final Cut tasks like splitting clips by typing Command-B while hovering over the proper edit point.
The prerelease version of the app I used could occasionally be quite buggy. At one point, my project crashed something like six times in a row after I attempted to drag an audio clip about five seconds back in the timeline. Sometimes the waveform on the audio clip would display inverted (a bad sign!) just before the inevitable death. Other times, the app would hum along with no trouble for long stretches of time.
Setting a resolution requires tapping a numpad on the screen, even if you have a keyboard.
I also found a number of sections of the app that felt half-baked. Entering a custom resolution for a timeline required tapping on a custom number-entry element on the screen, even if I had a keyboard attached. The act of duplicating or editing timelines in a project takes place in the app’s project viewer when I’d expect it to be done in the open project itself. The floating picture-in-picture video preview doesn’t resize to support vertical video.
I couldn’t export a selection, but once I made a new timeline I was able to export my videos quickly and easily.
I was also disappointed by the app’s inability to export the contents of a selection. The only way I was able to export small clips was to duplicate my timeline, delete everything but the short clip, and then export that.
Getting media onto the iPad was also a bit challenging. The video assets for my weekly Upgrade social videos total about 32GB, and while I could probably preprocess the videos to make them a bit smaller, that’s still a lot of data to transfer over Wi-Fi or AirDrop. I ended up connecting my iPad Pro to my Mac and dragging them over using Finder’s file-transfer integration. It was still slow, but less so than all the other methods I tried, short of plugging in an Ethernet adapter.
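To put the transfer problem in rough numbers, here’s a quick estimate. The throughput figures are assumptions plugged in for illustration, not measurements from my setup:

```swift
// Rough transfer-time estimates for a 32GB batch of video assets.
// The throughput numbers below are assumed ballpark figures, not measurements.
let payloadGB = 32.0
let assumedWiFiMBps = 30.0     // assumption: real-world Wi-Fi/AirDrop throughput
let assumedCableMBps = 300.0   // assumption: USB-C/Finder file-transfer throughput

func minutes(_ gigabytes: Double, at megabytesPerSecond: Double) -> Double {
    gigabytes * 1_000 / megabytesPerSecond / 60
}

print(minutes(payloadGB, at: assumedWiFiMBps))   // ≈ 18 minutes over Wi-Fi
print(minutes(payloadGB, at: assumedCableMBps))  // ≈ 2 minutes over a cable
```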
After quite a few hours in Final Cut Pro for iPad, my impressions are mixed. There were moments where I really did get into a groove and felt great about the app—generally when I was using the Magic Keyboard, since it gave me access to shortcuts that haven’t been properly translated into the touch interface.
But I also felt a lot of familiar frustration at an app that’s packed with features but hasn’t quite realized that multi-touch gestures and the Apple Pencil can make the process go smoother even without an attached keyboard. The pieces are all in place for Final Cut Pro to become a great iPad app, but it’s still got a lot of growing up to do.
I’m not sure if people are ready for VR (or “XR” as Apple appears to be calling it), but one of the strongly rumored apps is a virtual IMAX screen. Yes, it will be great for movies, but also for sports. I’ve talked about the NBA’s courtside videos on Meta’s Oculus and with a 4K screen instead of the not-HD that even the new Oculus’s (Oculi?) have, it will be even better. I expect the sound will be too, or we’ll just use the AirPods we all have now.
But Apple doesn’t have NBA content… What Apple does have is the MLS, including the rights to do “new and innovative” things, and they have a weekly MLB game….
It’s that one game that I’m focused on. It’s going to need to be more, which means more like MLB.tv than a Friday night game of the week. MLB had one of the first apps on the iPhone and I fully expect that not only will there be a game of the week on the new headset, but also an app that does what the phone app does, except in big, IMAX-sized doses….
The confluence of Apple’s societal dominance, the collapse of RSN’s, and the opening of the VR era – no, really this time – is a potential inflection point inside sports. Not just MLB, but at every level, including the fans.
Needless to say, I’m very interested to see how Apple’s interest in live sports connects to the launch of its new mixed-reality platform.
Last week Joe Posnanski wrote an excellent story about PitchCom (subscription required), the technology that’s now used by Major League Baseball to relay pitch calls between catcher and pitcher. After a century of catchers sticking down one finger for fastball and two fingers for curve, two magicians adapted technology used for mind-reading acts to work in baseball.
It’s a fun story, but there’s one technology-related line that made me laugh out loud:
How about using a Bluetooth device so the pitcher and catcher could just talk with each other? Well, in the words of Craig Filicetti, one of the world’s great builders of magic devices, “Bluetooth sucks. It’s completely unreliable and nobody can figure out how to connect and disconnect. It will never be Bluetooth.”
A couple of years ago, my favorite Mac email app—the Gmail web wrapper app Mailplane—was discontinued. After an appropriate period of mourning (which included using Apple Mail regularly for the first time in years), I was desperate for an email app that worked the way I wanted it to.
And the solution presented itself! Neil Jhaveri, who previously worked on the engineering team for Apple Mail itself, founded a company to build a new email app: Mimestream. After a few years in open beta development, on Monday Mimestream 1.0 was officially released.
If you don’t use Gmail as your mail service, or if you need to use the same app across Mac and iOS, Mimestream isn’t for you—yet. I asked Jhaveri what he meant when he said the company will be “turning its attention a bit broader” in the future, and he told me that while the company needed to focus in order to launch a compelling new app, “our mission is to just be the best general-purpose prosumer email client on the market.” That will take time, and the next step is probably an iOS version.
As for support for IMAP email services, it’s also on Mimestream’s to-do list, but right now the app shines because it is a Gmail client through and through, so adding support for the very different IMAP metaphor will need to be done with a lot of care. I do think the app should definitely expand its remit, because it’s very good. But as someone whose top priority was a better Gmail app on macOS, Mimestream was a perfect fit for me on day one—or, technically, two years before day one.
If you’re a Gmail user, Mimestream will be a revelation. Since it was built from the ground up to understand Google’s approach to email, it doesn’t suffer from the weird workarounds required to map an IMAP protocol metaphor onto Gmail’s particular quirks. Instead, it behaves… like Gmail. But in a pure, Swift-driven Mac app.
Most importantly, it uses Gmail’s API to efficiently search my entire Gmail repository. Searching Gmail in Apple Mail frustrates me with its inconsistent and slow behavior, but Mimestream just works. Labels, Inbox categories, server-side filters… it’s got them all.
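To give a sense of why that matters: a Gmail-native client can hand the server a query written in Gmail’s own search language and let Google do the work. Here’s a minimal sketch of that kind of request against the Gmail REST API; this is just an illustration, not Mimestream’s code, and the token is a placeholder:

```swift
import Foundation

// Minimal sketch: ask Gmail's own search to find matching messages server-side.
// The query uses Gmail operators (labels, categories, date ranges) that don't map
// cleanly onto IMAP. "ACCESS_TOKEN" is a placeholder OAuth token.
let query = "label:receipts category:updates newer_than:1y has:attachment"

var components = URLComponents(string: "https://gmail.googleapis.com/gmail/v1/users/me/messages")!
components.queryItems = [URLQueryItem(name: "q", value: query)]

var request = URLRequest(url: components.url!)
request.setValue("Bearer ACCESS_TOKEN", forHTTPHeaderField: "Authorization")

URLSession.shared.dataTask(with: request) { data, _, _ in
    guard let data = data, let body = String(data: data, encoding: .utf8) else { return }
    print(body)   // JSON list of matching message IDs; fetch details via users/me/messages/{id}
}.resume()
```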
The app will look completely familiar to anyone who has used Apple Mail. It’s got a multi-column design with mailboxes on the left, a message list in the center, and message content on the right. (And yes, you can close off the message preview if you prefer to open messages in their own windows.)
With version 1.0, Jhaveri and the rest of his team have added a few features that didn’t appear during the app’s lengthy beta, including better multi-account support via a “profiles” system that lets you place multiple accounts into different buckets. Profiles can be toggled on and off using Apple’s Focus Filters feature. Google’s Vacation Responder system is now available directly in the app’s interface.
Don’t let the version 1.0 label scare you. I’ve been relying on Mimestream as my Mac email app for two years, and it hasn’t ever let me down. This is probably the most mature version 1.0 release I’ve ever seen.
The biggest change in going to version 1.0 is that, after two years of using an in-progress email app for free, it’s time for Mimestream to become a real app—with real money changing hands. The app is available as a $5 monthly subscription or a $50 annual subscription. (There’s a 40% discount offer for year one available for the next few weeks.)
As with any productivity app, you’ll need to decide if the price matches your needs. (Mimestream has a page explaining their pricing decision.) Apple Mail is free. Gmail in a browser window is free. But after two years with Mimestream, I couldn’t put down my credit card fast enough.
My thanks to iMazing for sponsoring Six Colors this week.
iMazing is the answer to so many questions for anyone who wants more control over how they sync and back up iOS and iPadOS devices… and who wants to do it all from a Mac or Windows PC rather than relying on the cloud. The people behind iMazing, based in Switzerland, have spent 15 years delving deep within iOS to make useful software.
iMazing enables local device backup management with snapshot support, including to external drives or NAS; message extraction and archiving (including attachments and metadata) for SMS/MMS/iMessage and WhatsApp; quick drag-and-drop file copies from computer to iPhone or iPad; spyware checks; two-way music and photos transfer; and a whole lot more.
Whether you’re an individual or a business that wants more control over mobile devices, you should check out iMazing.
I am reminded by Reader Donni that I haven’t updated my “How I Podcast: Recording” article since the days of Skype. I don’t use Skype now. I use Zoom. So I made a quick update to bring it up to date.
In short: Zoom is the thing we use now, mostly because Zoom is pretty much universally cross-platform and lets you record every participant’s voice on a separate track. That makes editing a podcast vastly easier—but you should still record your own microphone file locally, because that file will sound better than whatever Zoom sends over the Internet.
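If you’re curious what recording your own mic locally can look like, dedicated tools like Audio Hijack handle it far more gracefully, but here’s a minimal sketch of the idea using AVAudioEngine. It’s an illustration of the technique, not what any of those apps actually do:

```swift
import AVFoundation

// Minimal sketch: tap the default input device and write it to a local file,
// independent of whatever audio Zoom is sending over the Internet.
let engine = AVAudioEngine()
let input = engine.inputNode
let format = input.outputFormat(forBus: 0)

let url = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask)[0]
    .appendingPathComponent("local-mic.caf")
let file = try AVAudioFile(forWriting: url, settings: format.settings)

input.installTap(onBus: 0, bufferSize: 4096, format: format) { buffer, _ in
    try? file.write(from: buffer)   // append each captured buffer to the local file
}
try engine.start()
// ...record for the length of the call, then:
// input.removeTap(onBus: 0); engine.stop()
```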
By Jason Snell
May 9, 2023 4:34 PM PT
Final Cut and Logic arrive on iPad: Questions and (some) answers
Back in November 2015, Apple released the first iPad Pro, and I was hooked. But in the intervening seven and a half years, it’s felt that the iPad’s hardware has constantly been let down by its software—and Apple’s failure to support its own pro iPad hardware with its pro-level apps was a perfect example of the problem.
“At least Adobe is investing in the future of the iPad Pro—something we’ve yet to see from Apple’s own pro software team, which still hasn’t offered versions of Logic Pro and Final Cut Pro for the iPad,” I wrote back in 2018, still lamenting the situation, as I did once again in 2021. When Apple released the M2 iPad Pro last fall, it was able to boast about video performance—but only by trumpeting the third-party app DaVinci Resolve, since Apple’s own video editing software still wasn’t available on the platform.
That all changes this month. Apple announced on Tuesday that Final Cut Pro and Logic Pro are coming to the iPad starting May 23. And beyond the obvious “what took them so long,” I had a lot of questions about both of these apps. Fortunately, I’ve got answers to some—but definitely not all—of them. (For the rest, May 23 is two short weeks away.)
What took them so long?
I said beyond the obvious one! I honestly don’t know, though it’s clear from what I’ve seen that Apple has put an enormous amount of effort into both of these apps. I really wonder what finally made Apple decide to build and ship iPad versions of these apps. (Surely it’s not a project seven years in the making!)
How different are these apps from their Mac counterparts?
Really different in a lot of ways—while also being strangely familiar. Apple clearly intends them both to be touch-first apps, just as the iPad itself is a touch-first device. You can swipe up and down in the center of the Final Cut Pro window to make the timeline larger (and the preview window smaller) or the reverse. A swipe from the left side in Logic makes the channel strip labels and controls wider, and there’s a loop navigator that can slide in from that side, too.
Apple seems to have done just what you might expect: these are apps that are familiarly Final Cut Pro and Logic Pro but modified to support touch gestures. I was especially impressed with the new jog wheel interface in Final Cut Pro, which lets you place a circular interface element on either the left or right edge of the screen and use it to move quickly (or slowly!) through the timeline.
But just as what makes the iPad special is that it’s not just a touch tablet but can take other forms, these apps also seem to embrace those other forms. There’s full support for Apple Pencil, and when you put the iPad Pro in a Magic Keyboard case or attach a keyboard, the apps use familiar keyboard shortcuts and respond to the trackpad-driven pointer as you might expect.
Due to the limited size of the iPad’s display, some items have been relocated—the Logic Pro mixer is its own window, for example—but everything seemed usable, even on a smaller iPad Pro. (I use Final Cut Pro and Logic Pro on a 13-inch MacBook Air without any trouble, so this shouldn’t be an issue—and it isn’t.) That said, the moment I saw Final Cut Pro running on an iPad, I immediately saw the potential of Apple making an iPad Pro with a larger display.
After seven years of editing podcasts on the iPad using Ferrite Recording Studio, I’ve come to appreciate the productivity enhancement that comes from using multi-touch features as the touch equivalent of keyboard shortcuts. The moment I configured Ferrite to toggle playback on and off by using a two-finger tap gesture, my productivity soared. At an initial glance at video demonstrating these apps, I didn’t see any hint of such gestures. But if users have to reach up to the top left corner of one of these apps every time they want to pause or play a video, it will get old really fast. I hope Apple has embraced multi-touch gestures—and if they haven’t, I hope they get with the program soon.
Are these apps compatible with their Mac equivalents?
Logic Pro appears to be more or less directly compatible. According to Apple’s press release, you can roundtrip projects back and forth between Logic on Mac and Logic on iPad without trouble.
Except… there’s just one thing. Many Logic users also use third-party audio plug-ins. You may not know it, but iPadOS supports Apple’s Audio Unit plug-in format and has for a while now. I’ve been using plug-ins inside Ferrite Recording Studio for years now. (And while the early days were pretty shaky, plug-ins are much more reliable today.)
The only catch is that the maker of the plug-ins you rely on must make iPad versions available, or your “roundtrip” Logic project really won’t be. Some pro plug-in makers, like FabFilter, support the iPad. Others, like iZotope, seem not to have even heard of the iPad. Your mileage may vary.
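If you want to see which Audio Unit plug-ins are actually installed on a given iPad (or Mac), the system will tell you. Here’s a minimal sketch using the public AVAudioUnitComponentManager API; nothing about it is specific to Logic:

```swift
import AVFoundation
import AudioToolbox

// Minimal sketch: list every Audio Unit effect registered on this device.
// A mostly zeroed AudioComponentDescription acts as a wildcard for subtype/manufacturer.
var description = AudioComponentDescription()
description.componentType = kAudioUnitType_Effect   // use kAudioUnitType_MusicDevice for instruments
description.componentSubType = 0                    // 0 = match any
description.componentManufacturer = 0               // 0 = match any

let effects = AVAudioUnitComponentManager.shared().components(matching: description)
for component in effects {
    print("\(component.manufacturerName): \(component.name) \(component.versionString)")
}
```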
The compatibility story with Final Cut Pro is less good. You can import Final Cut Pro for iPad projects into Final Cut Pro on the Mac in order to take advantage of object tracking and other pro features. That last sentence contained numerous red flags—I hope you caught them.
Final Cut Pro for iPad seems to be a subset of the Mac version. You can start on iPad and move to Mac, but the migration won’t work the other way, and a bunch of features from the Mac just aren’t there on the iPad.
This is disappointing. Yes, the lack of feature parity is unfortunate—but perhaps a bit understandable. Still, as someone who rarely uses those pro-level features, it’s also frustrating to realize that even my simple projects won’t be portable if I need to leave home and run off somewhere with an iPad.
Still, there are a lot of cool features that did make it to Final Cut Pro for iPad, including multi-cam support (up to four cameras) and a bunch of “fast cut” features, including a nifty scene-removal mask. There’s also a machine-learning-driven “auto crop” feature that analyzes your video and chooses the best crop to preserve the content across different aspect ratios, like when you’re pulling 16:9 video into a vertical project.
Do these apps mean iPadOS’s sound subsystem has been improved?
iPadOS’s sound subsystem is remarkably rudimentary, as anyone who has tried to play audio from more than one app or record video while also playing back audio has discovered. There are some rumors out there that iPadOS 17 might give the iPad a serious audio upgrade, and I hope they’re true.
I doubt any major sound improvements will surface in iPadOS this month, but it is worth noting that Apple’s press release specifically says that these apps require iPadOS 16.4. That suggests to me that at least something in one or both of these apps requires a little bit of a modification to the operating system in order for them to run smoothly. (Third-party app developers wait for years for Apple to address roadblocks in its operating systems. Apple’s apps release alongside an OS update. That’s the ultimate advantage of being a first-party app.)
What will they cost?
These apps mark what I assume is a long-term policy shift with Apple’s pro media apps, joining tech giants such as Adobe in offering them only via subscription. Apple says each of them will cost $5 a month or $49 a year, and as always, there’s a free one-month trial. There’s no bundle discount, nor are they available in a bundle with their Mac counterparts.
Insert your own debate about subscription software here. Some hate it; some love it. I think, in many ways, it makes sense for apps that are on a pretty constant update schedule (as the Mac versions of these apps are), and I like the idea that you can buy a few months of Final Cut Pro for a project and then stop paying when you’re not using it. Then again, it also commits you to $49 a year—or $98 for both—for as long as you use the apps.
Whether that’s worth it is up to you. But I have to believe that this is the future of the Mac version of these apps, too.
Do these releases validate the iPad Pro as a product?
I love the iPad, but it’s true that in recent months I’ve begun to wonder if Apple truly believes that the iPad is the future of computing. I do think that Apple believes in the iPad Pro as a versatile, productive computing device—and that these apps help fulfill the promise inherent in the sheer power of the top-of-the-line iPad. (Logic Pro will also run on iPads powered by the A12 Bionic or later, but Final Cut Pro requires an M1 or M2 model.)
So, a promise has potentially been fulfilled. I want to praise Apple for (presumably) shipping these apps while also pointing out that it’s taken seven-plus years from the original iPad Pro announcement to get them out the door. There are still some serious questions about what Apple sees as the future of the iPad Pro. But as of this announcement, one big question mark has—finally!—been resolved.