By Jason Snell
February 2, 2021 5:07 PM PT
Bad AppleScript: Fake RSS, real newsletter
However, this change prompted me to rethink what I had been doing with our members-only newsletter. And that decision sent me down a rabbit hole that ended, as will come as no surprise to some of you, in a very large AppleScript script.
Previously, we’d write four or five original pieces and mail them out to subscribers as a part of a monthly newsletter. Once the WordPress site was up and running, the newsletter no longer needed to be the (only) vehicle by which subscribers received their members-only content. If you like reading on the web, you can read them there. If you like reading them in RSS, you can read them there. And of course, there’s a newsletter, too.
I also wanted to spread those pieces out across the month and post one every week. And with that, I wanted to change the newsletter from a monthly schedule to a weekly one. On a weekly basis, members could get a members-only article—as well as the other articles we posted every week. It would make the newsletter a good fit for people who prefer to read stuff in email—and those who didn’t, didn’t need to read it anymore since all the members-only stuff was on the website.
So it’s decided, then. Time to make a weekly newsletter. But I didn’t want to spend time building a newsletter every week. I wanted it to happen automatically.
This was the rabbit hole. And what plunged me down that hole was my discovery that MailChimp, the email provider I’ve used for Six Colors since the beginning, supported RSS-generated newsletters.
Now by default, these newsletters were disappointing. Automated blog-to-newsletter systems just want to dump all your blog posts into a newsletter template and send it out. I wanted control over the order and design of the newsletter, with like posts grouped together.
This desire to have it just the way I wanted it is also why I spent months building my own WordPress theme for Six Colors, overriding dozens of default behaviors along the way. I want it the way I want it! Is that so wrong?
Most great user automation projects are the result of a “Bad Idea” moment. “I could do it this way—oh, that’s a bad idea. But…”
Here was my bad idea: Just because MailChimp’s RSS system didn’t work the way I wanted it to didn’t mean I couldn’t make it work the way I wanted it to. I could set MailChimp to automatically mail out an email on Friday evenings to all Six Colors subscribers, based on an RSS feed.
And then I could write a script that would generate an RSS feed with a single entry, containing exactly the newsletter I wanted to send.
Yep, that happened.
Scripting the newsletter
Building a custom RSS feed is easy. I did it for the Movable Type version of Six Colors, and since all podcast feeds are RSS feeds, I’ve done it numerous times for The Incomparable. My script builds an RSS feed with one item with a unique ID and the current time as the creation date, which is enough for MailChimp to recognize it as the one item it should include in this week’s newsletter.
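For reference, a single-item feed of the sort MailChimp can poll looks roughly like this. This is a minimal sketch: the titles, GUID, and dates are placeholders, not the script's actual output.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0">
  <channel>
    <title>Six Colors Newsletter</title>
    <link>https://sixcolors.com/</link>
    <description>Weekly members-only newsletter</description>
    <item>
      <!-- One item per week; a fresh guid and pubDate tell MailChimp it's new -->
      <title>Six Colors Weekly</title>
      <guid isPermaLink="false">newsletter-2021-02-05</guid>
      <pubDate>Fri, 05 Feb 2021 15:00:00 -0800</pubDate>
      <description><![CDATA[ ...the assembled newsletter HTML... ]]></description>
    </item>
  </channel>
</rss>
```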
The issue was getting the content to insert in that feed. Here’s how my script works.
First, I need the source content. Though there are some ways to parse RSS feeds using the System Events app’s built-in XML parsing, it’s unreliable. Instead, I turned to the JSON format, which is much easier to work with, thanks to Mousedown Software’s excellent app JSON Helper, which converts JSON feeds into AppleScript objects.
Six Colors has a JSON feed. So now I’ve got almost all the content for the newsletter… but there’s a big catch. The JSON feed doesn’t contain the members-only content. That’s only in a custom RSS feed for members.
I decided to brute-force the members-only RSS feed into JSON format, first using the feed2json web service, then ultimately the Node package rss-to-json, running locally on my computer. Now it’s this simple to get my full site feed in parseable AppleScript form:
set theJsonFeed to (do shell script "/usr/local/bin/node ~/feed.js")
tell application "JSON Helper"
	set theItems to (read JSON from theJsonFeed)
end tell
Next up, I need to limit the script to the posts from the last week, so subscribers aren’t getting old posts. I set up a repeat loop that moves through the list of items until it finds one that’s older than seven days—or, in Unix epoch time, 604800 seconds. (The feed presents the most recent items first.)
repeat with theWeek from 1 to 50
	set theDateOfficial to ((created of item theWeek of |items| of theItems) / 1000)
	set todaysDate to (do shell script "date +%s")
	if (todaysDate - theDateOfficial > 604800) then
		exit repeat
	end if
end repeat
Once this is done, the variable theWeek is one more than the number of feed items posted in the last week. Now I’m going to loop through the feed again, backward, moving from that seven-day-old item up through the most recent item, so that the contents of the newsletter run in chronological order:
repeat with i from (theWeek - 1) to 1 by -1
In essence, if there are 11 items this week, we’re going to step one at a time from 11 all the way to item number one, and process them in turn.
This is the core of the script. Each time through the loop, it uses a series of if statements to check whether an item is of a particular type, and then acts accordingly. I’m not going to list them all, but here’s how it handles link posts:
if category of item i of |items| of theItems contains "Link" then
	set theByline to author of item i of |items| of theItems
	set theURL to link of item i of |items| of theItems
	set theHeadline to title of item i of |items| of theItems
	set theContent to content_encoded of item i of |items| of theItems
	set theFilteredHeadline to (characters 1 thru ((count of characters of theHeadline) - 2) of theHeadline as string)
	set theLinks to theLinks & ("<hr><h1 class=\"link\">" & theFilteredHeadline & "</h1>" & "<p><strong>by " & theByline & "</strong></p>" & theContent)
end if
This grabs the various fields of the entry and assembles a little newsletter item, which it adds to the variable theLinks; by the time the loop is done, theLinks will contain every link item.
I should mention that over the last couple of months, I’ve had to keep adding new features to the script based on the contents of the week. Last week I posted four enormous articles, two featuring long transcripts, one full of charts, and one full of Report Card responses. I don’t want all that stuff in the newsletter! So for posts, I’ve added a conditional based on a post’s tags. Certain tags will be ignored by the newsletter; others will display only the introductory text, and then a link to the full story on the website.
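In outline, that tag check looks something like this. The tag names and some variable names here are made up for illustration; they aren’t the actual ones in the script.

```applescript
-- Sketch only: "skip-newsletter" and "excerpt-only" are hypothetical tag names
set theTags to category of item i of |items| of theItems
if theTags contains "skip-newsletter" then
	-- ignore this post entirely
else if theTags contains "excerpt-only" then
	-- keep just the introductory text, then link out to the full story
	set theContent to theIntro & "<p><a href=\"" & theURL & "\">Read the full story at Six Colors</a></p>"
end if
```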
All the sections of the newsletter will end up accumulating in a few simple variables, which just need to be placed in order. I build the basic RSS document out of those variables, in a mega-variable.
But it’s still not quite what I want—I need to apply a bunch of styling that doesn’t exist in the base content, add in headers, strip out some pesky items like video embeds, and otherwise get it shipshape. So the script creates a new document in BBEdit, runs 19 different search-and-replace commands on it, and saves the result as the finished RSS feed file.
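Each of those passes is one BBEdit replace command, scripted. A sketch of a single pass follows; the grep pattern is illustrative, not one of the real 19.

```applescript
tell application "BBEdit"
	set theDoc to make new text document with properties {text:theNewsletter}
	-- strip video embeds with a grep-style search and replace
	replace "<iframe[^>]*>.*</iframe>" using "" searching in text of theDoc options {search mode:grep, starting at top:true}
end tell
```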
Then it’s one final shell command to securely copy the RSS feed file up to my server, where MailChimp will come and collect it as a “new post” and mail it out.
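That last step is a single scp invocation. Here’s a sketch, with a hypothetical host and paths standing in for the real ones (the command is printed rather than run, since the server is made up):

```shell
FEED="$HOME/newsletter.rss"                         # hypothetical local path
DEST="user@example.com:public_html/newsletter.rss"  # hypothetical server destination
CMD="scp -q $FEED $DEST"                            # -q suppresses the progress meter
echo "$CMD"                                         # printed rather than executed in this sketch
```

Key-based SSH authentication is what makes this workable unattended; a password prompt would hang a scheduled script.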
That’s it! Except…
Automating the automation
I wanted this script to run automatically. I don’t want to have to remember to run it. Sure, I’m checking it most weeks to make sure there isn’t something awful gumming up the works, but if I forget, I want the newsletter to go out.
I needed to use my Mac mini, because it’s always running. But should I save the script as an app and try to open the app automatically? Should I try to remember how to write a launchd configuration file by hand?

No. I ended up using an old favorite, Peter Borg’s scheduling utility Lingon, to create a task that runs my script via the osascript command-line tool every Friday at 3pm Pacific.
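Under the hood, Lingon just writes a launchd job. The equivalent plist, written by hand, would look something like this (the label and script path are hypothetical):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN"
  "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
    <key>Label</key>
    <string>com.example.newsletter</string>
    <key>ProgramArguments</key>
    <array>
        <string>/usr/bin/osascript</string>
        <string>/Users/jason/Scripts/newsletter.scpt</string>
    </array>
    <!-- In launchd, Weekday 5 is Friday; Hour is local time -->
    <key>StartCalendarInterval</key>
    <dict>
        <key>Weekday</key>
        <integer>5</integer>
        <key>Hour</key>
        <integer>15</integer>
        <key>Minute</key>
        <integer>0</integer>
    </dict>
</dict>
</plist>
```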
Now that wasn’t that hard, was it?
If you appreciate articles like this one, support us by becoming a Six Colors subscriber. Subscribers get access to an exclusive podcast, members-only stories, and a special community.