Amethyst arranges application windows on screen (or on a specific space) into a tiled arrangement based on one of several models. (I am using the “tall” setting.) It takes a bit of getting used to, but I like it now that I’ve sort of got the hang of it. The key thing to grok is that it arranges all visible windows, so if you have a ton of visible windows, hide everything you don’t need at the moment. Un-hiding a window crams it into the existing arrangement — by shortening one window to make room for another, for example. So you can pop up a terminal window and have it fit into your view while you use it, and when you hide it, the window(s) that made room for it will expand back to their previous size. It’s all controllable by keyboard, so you can float, resize, and cycle through windows without touching the trackpad. Quite cool.

With a MacBook Pro upgrade, mentions hacking, and sync updates, it’s been a sort of Infrastructure Summer around here.1 To cap it off, after getting the new MBP up and running I decided to get serious about my home storage and backup.

I have frequently heard good things about Synology, so I started there. One of my goals was to centralize storage from a rolling series of external USB drives that I have accumulated over time. These drives have occasionally been connected to an AirPort Extreme, hosted via an old iMac, or just plugged in from time to time to offload photos, music, or video into “offline” storage, where it’s no longer taking up space on my laptop.

While I sort of drool over the fancy multi-bay devices like the DS1513+, my needs just aren’t that heavy. Shawn Blanc’s brief review of the Synology DS213j made that particular device look just about right. His writeup is the one I pinned for later reference and leaned heavily on for first principles when getting up and running with my very own DS213j. I also learned a lot from Gabe Weatherhead’s post about moving from a Drobo to a Synology. (I don’t know what I would do with Gabe’s 24-port switch save for gazing at the magnificent 18 unused ports, but I picked up an 8-port switch with my DS and dramatically improved the wiring in my cabinet, as well as getting a spare hard-line to my laptop for when I’m at the desk.)


Like Shawn, I plugged two 3TB WD Red drives into the box. On setup, the Synology gives the option of automatically configuring the second drive as a mirror of the first; great, one less thing for me to figure out. So that leaves me with 3TB of onboard space, plus a USB drive plugged in as an additional redundant backup.

Obligatory Old-Days Aside: My first hard drive was a 20 megabyte disk installed in a Compaq DeskPro 386s. I later convinced my dad that we needed to replace it with a 40 meg drive so that I would have room for Wing Commander and my WWIV BBS. Sometimes the very idea of gigabytes of storage, to say nothing of three terabytes of disk space, still sort of blows my mind.

So what am I doing with all this?

  • Time Machine backup: I have a 500GB SSD in the new MacBook, so I set up a 1TB share dedicated to Time Machine. For the first backup I used the cabled connection to my switch, and my experience so far is that routine Time Machine snapshots to the Synology work fine via wifi. When I know I need to move around a lot of bits, I hook up the cable. Synology has good instructions for Time Machine setup.
  • All of my Lightroom photos and previous iPhoto libraries are now online. More about this below.
  • iTunes media lives here, too. My iTunes library file itself is still on my laptop; some people have the need to share everything across multiple computers, but I don’t need to solve this issue at home.
    • This does not make it accessible to the DS Audio application, unlike the video application, which can pull from multiple file locations.
    • I have not yet explored remotely accessing any of the content on the Synology.
    • Update: Nope. iTunes does not like having its media location on a disk that’s not always available. As a result, it periodically resets itself to use a local, default media location, no matter how much I tried to ensure that the NAS share was mounted before running iTunes. This quickly led to situations where the library was out of sync with the media store, confused metadata, missing audio files, and piles of duplicates resulting from iTunes trying to re-re-reconsolidate its media. So unless and until a future iTunes has better support for a network media location, I’m using my local disk, once again, for this. (A friend suggests using a symbolic link to point the local location to a NAS share; this seems worth a try, but I haven’t done it yet.)
  • My nine-year-old Brother network printer is now AirPrint enabled, simply by plugging it in to one of the USB ports on the back of the device.
  • Consolidation of backups from very old machines, like the Windows laptop that preceded my 1995 iMac.
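The symlink suggestion in that iTunes update can be sketched in a few lines. The paths here are hypothetical stand-ins (on a real Mac they might be something like `~/Music/iTunes/iTunes Media` and `/Volumes/music`), demonstrated in a throwaway directory so the sketch is harmless to run:

```python
import os
import tempfile

# Point the "local" iTunes media folder at a mounted NAS share via a
# symlink, so iTunes thinks it is using a local path. All paths below
# are made-up stand-ins, created under a temp directory for the demo.
root = tempfile.mkdtemp()
nas_share = os.path.join(root, "Volumes", "music")   # stand-in for the mounted share
local_media = os.path.join(root, "iTunes Media")     # stand-in for the local media folder

os.makedirs(nas_share)
os.symlink(nas_share, local_media)

# iTunes would follow the link transparently -- until the share unmounts,
# which is exactly the failure mode described above.
print(os.path.realpath(local_media) == os.path.realpath(nas_share))
```

Whether iTunes behaves any better when the share disappears out from under a symlink is the open question; the sketch only shows the mechanism.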

Visibility & Accessibility

I greatly underestimated the satisfaction of having seamless access to my previously-archived photos. For years I periodically moved six months to a year of my Lightroom library to an external hard drive. This freed up precious storage space on two generations of laptops but meant that I couldn’t get to those images without finding and plugging in that drive. Moreover, with each offloading cycle I had to re-identify the folders I had previously moved over and check backup status before I could even start moving the current set. It was a lot of mental overhead and consequently was all too easy to put off.

Now, with the NAS, I simply scroll further back into the years within my Lightroom library, and there are my photos. I am rediscovering photos and memories that I haven’t thought about in years. In photographic terms it’s inspiring: There are some pictures there that I’m proud of, work that I really like. In emotional terms, it’s wonderful: The summer we hiked every day; the trip to Orcas; odds and ends of days.

Under the hood

Shawn and Gabe both discuss the DiskStation software on the Synology, so I won’t go much into it except to say that I really like it. As a UNIX-based NAS, it’s a sophisticated piece of hardware with a lot of capability. Synology overlays an accessible interface on top of the Linux guts of the device, making volume creation and management, user administration, and application installation really slick.

Since it’s Linux, you can do all kinds of stuff, like ssh into the box, copy over an rsync script, and use the box as a server to perform scheduled backups to and/or from remote servers. Zarino Zappia has a series of great posts that detail accessing the Synology via ssh and moving files around through the terminal.
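For flavor, here is a minimal sketch of the kind of scheduled rsync job I mean. The flags are common rsync options, and the host and paths are made up, not anything Synology-specific:

```python
import subprocess

def rsync_cmd(src, dst, delete=True):
    """Build an rsync invocation for a scheduled pull/push backup.

    -a preserves permissions and times, -z compresses over the wire,
    and --delete keeps the mirror exact. src/dst may be local paths or
    user@host:path remotes (rsync runs over ssh for remotes).
    """
    cmd = ["rsync", "-az"]
    if delete:
        cmd.append("--delete")
    cmd += [src, dst]
    return cmd

# Hypothetical nightly job the DiskStation's scheduler could run:
# pull a remote server's backups directory onto the NAS volume.
job = rsync_cmd("user@remote.example.com:/var/backups/", "/volume1/backups/remote/")
# subprocess.run(job, check=True)  # commented out: needs a real remote host
print(job)
```

Drop something like this into the box's task scheduler (or a crontab entry) and the backups take care of themselves.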

Good stuff.

Looking ahead

I would likely look for a device with more USB ports next time. I’ve already swapped the printer for another USB drive a couple of times. It would be much more convenient to have a spare cable to which I could temporarily attach another drive.

At some point I’ll explore the features I get by opening up the NAS to external access, like music sharing to my mobile devices, as well as general file sharing. I have some of those remote server jobs to schedule, and I’m interested in trying out the Photo Station app. Meanwhile, there’s not much downside to this solution for me, and at this point it’s pretty much a fully operational battlestation. I got storage, backup, new capabilities from the Tiny Server in the Synology, and even got my cabinet cleaned up as part of my installation. I’m happy.

  1. And even more so if you count the landscaping in the back yard! [return]

For some time now I have used Mover to:

  • Copy markdown posts from Dropbox to my web server where my static blog engine parses and publishes them
  • Sync the slogger-generated entries and photos from that same web server up to Dropbox, where they are integrated into my Day One journal.
  • Occasional other file transfers

Previous posts about this: using mover & mover and slogger.1

Mover is positioned to provide this service at scale, and while I’m no enterprise power user, the tool was very useful for me. And hacking with its API was itself a fun project. So I was surprised to learn, when my daily sync jobs appeared to be failing, that the Mover API was no longer supported and had been unsupported for a year!

Well, it was good while it lasted. Fortunately, the guys at Mover were nice about it and pointed me to a new tool, Kloudless. Because Kloudless returns similarly structured JSON in response to API requests, it didn’t take too long to modify my existing code to get up and running with Kloudless. So far, it’s working great: Kloudless seems to be faster than Mover, the API is quite nice, and it’s built for what I’m doing — API-based file transfers, rather than offering the API as a sort of side feature.

The biggest obstacle to getting up and running was figuring out how to handle the paging of returned results; unlike Mover, Kloudless returns only up to 1,000 items at a time, and my Day One folder has more entries than that. This is entirely sensible, of course, but it took me some time to lock in on a solution. After that it was a matter of cleaning up a little and updating my Editorial workflow to copy a file over and rebuild, and I’m back in action on both my laptop and mobile devices.
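The paging fix boils down to a loop like this. The fake fetcher below stands in for an authenticated Kloudless request, and the real API's parameter names may well differ from this sketch:

```python
def fetch_all(fetch_page, page_size=1000):
    """Collect every item from a paged API.

    fetch_page(page) returns a list of up to page_size items; an
    underfull (or empty) page signals the end.
    """
    items, page = [], 1
    while True:
        batch = fetch_page(page)
        items.extend(batch)
        if len(batch) < page_size:
            return items
        page += 1

# Fake backend with 2,345 "entries" to show the loop terminating correctly.
entries = [f"entry-{i}" for i in range(2345)]

def fake_page(page, size=1000):
    return entries[(page - 1) * size : page * size]

print(len(fetch_all(fake_page)))  # collects all 2,345 entries across three pages
```

The only subtlety is the termination condition: a short final page (or an empty one, when the total is an exact multiple of the page size) ends the loop.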

Everything should be shiny once again! Thanks, Kloudless.

  1. Here’s another great use of a webmentions implementation: Low-overhead lists of related posts, simply by linking them from another post and running the webmention hook. [return]

I’ve thought a couple of times about ways to connect this lonely personal site to the broader world. I have plenty of outgoing links, but little way to know if my posts actually connect with people on the other end, save for the occasional referrer link. Once or twice I started drafting a very small commenting system (one of which used the API and would have stored comments in that system), but my energy waned along with my enthusiasm for actually doing something like managing comments.

So I’m really intrigued by the notion of POSSE publishing, which I encountered via Jeremy Keith’s Adactio. The core idea, briefly, is about owning the stuff you put online — which resonates with one of my original motivations for putting up this site — and the community around it has developed and assembled a bunch of resources to implement this goal.

What does this mean for interaction? Among other things, what we think of as “comments” begin their life by being published at the writer’s own site, through the implementation of webmentions, where the target site builds its own mechanism for interpreting and syndicating those comments. The publishers going full-on with this method are using it to syndicate to platforms like Twitter or Flickr, too, all with the goal of initiating everything they put “out there” from within their own tools and sites. Needless to say, this is pretty cool, and it’s an opportunity for me to tinker with Pretty Good Hat, too.

So How Does This Work?

There are two core elements to building this: sending outbound webmentions, and handling incoming webmentions. I’m going to take on each of these in two phases: Phase One is the most minimal, and Phase Two will try to be a little more sophisticated and integrated. Each step will need to work within the constraints of my little existing blog engine, and it all leans heavily on the writeup and work done by Jeremy.

Constraint one is that I don’t want to get too deep into the guts of the existing build system for my static blog. This means that building full microformats support into my own posts will have to wait; instead, I’ll work any microformats support into my writing through specifiers to the kramdown processor and minimal modifications to the ruby code that actually builds the site.

Constraint two has to do with the fact that this is basically a static site, and I like it that way. But being a good indieweb citizen requires reciprocity: I don’t want to be just a publisher of my own stuff; I need a way to respond to the generosity of others who may link to me. The “live” appearance of links that may ping my content via a webmention poses some design difficulties.

What I’ve Got So Far

What I have going for me is that this joint is built in a pretty stepwise fashion. The pieces are ruby, but they’re invoked sequentially in a shell script. Without modifying the core write-and-post mechanism, I can add a post-processing step that uses the webmention ruby client to send webmentions automatically to each URL in my own posts.1
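Roughly, the client does two things: discover an endpoint at each linked site, then POST to it. A naive sketch of that protocol (not the client's actual code; the URL below is made up) looks like this:

```python
import re
import urllib.parse
import urllib.request

# Matches a <link> or <a> element whose rel list contains "webmention".
# A real implementation also checks the HTTP Link header and handles
# attribute ordering and relative URLs, which this regex skips.
WEBMENTION_LINK = re.compile(
    r'<(?:link|a)[^>]+rel="[^"]*\bwebmention\b[^"]*"[^>]+href="([^"]*)"')

def find_endpoint(html):
    """Find a webmention endpoint advertised in a page's markup."""
    match = WEBMENTION_LINK.search(html)
    return match.group(1) if match else None

def send_mention(source, target, endpoint):
    """POST the form-encoded source/target pair the webmention spec requires."""
    data = urllib.parse.urlencode({"source": source, "target": target}).encode()
    return urllib.request.urlopen(endpoint, data)  # real network call

page = '<html><head><link rel="webmention" href=""></head></html>'
print(find_endpoint(page))
```

The post-build step just runs this pair over every outbound URL found in the new entry.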

So that’s sending webmentions. What about receiving them? Again, I’ll lean on Jeremy’s work and use his minimum viable webmention code. I start by simply receiving and logging the mentions, putting them into a sort of attachment file linked to each of my own posts.
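The logging step might look something like this sketch. The file layout and field names here are assumptions for illustration, not the actual format used on the site:

```python
import json
import os
import tempfile
import time

def log_mention(store_dir, target_slug, source_url, name=None):
    """Append a received mention to a per-post "attachment" file.

    One JSON object per line keeps the file trivially appendable and
    easy for a page template to read back later.
    """
    record = {"source": source_url, "name": name, "received": int(time.time())}
    path = os.path.join(store_dir, target_slug + ".mentions")
    with open(path, "a") as f:
        f.write(json.dumps(record) + "\n")
    return path

# Demo in a temp directory, with a made-up post slug and source URL.
store = tempfile.mkdtemp()
path = log_mention(store, "2014-07-webmentions", "")
with open(path) as f:
    print(len(f.readlines()))
```

Appending rather than rewriting means the receiver endpoint can stay tiny, and the display side only ever reads.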

One great benefit of building and joining these two elements of supporting webmentions is that I can build them in parallel and use one to test the other.

On the question of displaying webmentions statically versus dynamically: The strongest way to respect my static-site goal would be to receive mentions, store them, periodically check for anything needing processing, and add any new mentions to the static entry pages. But I decided on dynamic display so I wouldn’t need to poll for updates or add more to the core build script. This means a small load when displaying pages, and I could change to a scheduled rebuild if this ever becomes a problem — but this is a very low-traffic site, so that’s unlikely. I also keep a single unified log file for webmentions that I could use to rebuild all mentions if desired, or as my treatment of them grows more sophisticated. For example, right now I reach out to target URLs and retrieve only the p-name of the entry, if available.2
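Retrieving the p-name can be as simple as this deliberately naive sketch; a real microformats parser (mf2py, for instance) handles nesting and implied properties properly:

```python
import re

def p_name(html):
    """Pull the p-name (the microformats2 entry title) out of fetched HTML.

    Grabs the text of the first element whose class list contains
    p-name -- good enough for simple, well-formed entries.
    """
    match = re.search(r'class="[^"]*\bp-name\b[^"]*"[^>]*>([^<]*)<', html)
    return match.group(1).strip() if match else None

# A minimal h-entry, as the build stubs described below produce.
sample = '<article class="h-entry"><h1 class="p-name">Adventures in Webmentions</h1></article>'
print(p_name(sample))
```

When the fetch fails or no p-name is present, falling back to the bare URL keeps the display from breaking.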

So where am I now? After several hours of tinkering over a couple of weekends I have a functional webmentions implementation. Mentions should be registerable against any post here at Pretty Good Hat, and will be displayed on the single-entry page for that post. My endpoint is discoverable, but like Jeremy I also include a link (again on the single-entry pages) to a submission form for manually sending a mention. I accomplished this in what I think is a reasonably effective way given the constraints I described above: I had to add a few formatting stubs in my build code to put minimal h-entry syntax into my entries, and it took a fair amount of trial and error to get that displaying the way I like it.

Presenting the mentions requires a jQuery call, so it cheats the static site a bit, but I’m already including jQuery for lightview, so I may as well re-use it.

I extend a huge thank you to Jeremy Keith, whose posts got me started on this. (This one in particular.) I’m also grateful to this microformats post by Barnaby Walters, from which I learned a bunch. This is obviously a thing in process, and I welcome and appreciate any comments and feedback.

  1. This client is great. It finds links within an h-entry, so it won’t pick up anything in your footer or header, and it knows how to discover a webmention endpoint at your target site. [return]
  2. The h-entry microformat allows me to syndicate a lot more than this; I’d be really interested in learning if there’s a community best practice for what to syndicate. My instinct isn’t to replicate the content of a mention, rather to pull just the name and link. [return]

It’s approximately the twentieth anniversary of the Microsoft “home page”, as we called these things at the time,1 and a team has recreated that original site after some digital spelunking. One of the fascinating parts of the story is that there’s no solid record of exactly when the site was launched or what circa-1993–1994 web technologies were used.

There’s a not-at-all secret easter egg tucked into the code:

  <TITLE>Welcome to Microsoft</TITLE>
  <!-- To read more about the re-creation of the 1994 homepage, see readme.html -->
    <H1>Welcome to Microsoft's World Wide Web Server!</H1>
    <H2>Where do you want to go today?</H2>
    <P>If your browser doesn't support images, we have a <A HREF="1994-links.html">text</A> menu as well.</P>

And the linked readme.html describes the effort, including details about the pre-client-side image map, and a cool personal note from one of the developers: “It’s a big story with many turns and it makes you thankful for all the people (and companies) who have sought to make the web great.”

  1. OMG, that means I’ve been making my own tiny web pages for twenty years. [return]

A good desk-clearing is in order, as is evident from the volume of drafts I have begun to accumulate. Here, in no particular order, is a partial inventory of unfinished thoughts, notes and unrealized blog fodder. Free ideas all around.

There was one about going on vacation with my entire digital world so I could stay up to date with RSS feeds, Twitter and the Facebook while sitting in the reeds next to the river. My road trip media entourage included an iPad loaded with media and games to help a four-year-old get through a ten-hour road trip; the trusty MacBook Pro (here primarily as a photo download location, since I take lots of photos while on any trip); and the adult iPad1 loaded with music, comics, books ambitiously scheduled to read, and games to play. As it turned out, with all this digitalia at hand, I spent a lot of time biking up a mountain and later sitting in the shade reading an honest-to-god book.2 I know.

A lot of talk in my media sphere lately is about women in tech, specifically gaming but as part of a broad current that has widened over the past few years as women respond to sexism at conferences, in hiring, and in everyday interactions. Being a fallen academic, I naturally want to point some of this conversation to classic work like Rosabeth Moss Kanter’s Men and Women of the Corporation and the large body of writing on gendering in the professions (medical sociology, which I taught for a while, has a lot of this). The people responding to well-written and heartfelt posts about fear, sexism and misogyny with “everybody knows women don’t want to work in tech/choose to have babies/aren’t good at it anyway/asked for it” won’t care, but these bodies of work might help situate the authors’ experiences in a trajectory. While the tech industry didn’t invent this experience, it does seem to have polished it to a pretty fine sheen.

David Carr wrote recently on the overwhelming feast of prestige television:

The growing intellectual currency of television has altered the cultural conversation in fundamental ways. Water cooler chatter is now a high-minded pursuit, not just a way to pass the time at work. The three-camera sitcom with a laugh track has been replaced by television shows that are much more like books — intricate narratives full of text, subtext and clues.

On the sidelines of the children’s soccer game, or at dinner with friends, you can set your watch on how long it takes before everyone finds a show in common. In the short span of five years, table talk has shifted, at least among the people I socialize with, from books and movies to television. The idiot box gained heft and intellectual credibility to the point where you seem dumb if you are not watching it.

About the same time, an episode of Pop Culture Happy Hour focused on nerds and nudity, with discussion of “nerds” becoming more acceptable as “enthusiasts” through sharing the things they (we) love.

There is a lot that is complementary about these two pieces, in the rising cultural currency of pursuits that have been marginalized and/or lowbrow. Pop Culture Happy Hour in particular is in position to bridge these realms (and continues to be one of the podcasts that I really look forward to). The cynical take on this is that it’s all marketing, that the kids who grew up on comics and video games can now buy them for themselves. But the one I prefer is that the kids whose interests were marginal — or marginalizing — are now in positions to make exactly the things that they love, and are able to share those things widely and sometimes (hey, because of that culture industry) even make money at it. Good for them [us]!

Should you find yourself migrating to a new MacBook, as I did recently (happily, not due to disaster or failure), don’t forget to copy over your private SSH keys. What am I doing tonight? Re-generating keypairs so that I can publish this very post through my bloggy-woggy machine. Speaking of which, Frankenstein’s looks like a very cool static blog publishing project: “It is a nameless, horrible and recursive assemblage of bash, sed, cat, echo, date, mkdir etc…, has no option, no for, and uses almost no variable.” Neeeat.
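A sketch of that key-copying chore, with the permission bits ssh insists on. The paths are throwaway stand-ins; in real life this is a copy between two Macs via Migration Assistant, target disk mode, or plain scp:

```python
import os
import shutil
import stat
import tempfile

def copy_ssh_dir(old_home, new_home):
    """Copy an ~/.ssh directory to a new machine, keeping the strict
    permissions ssh requires (0700 on the directory, 0600 on private keys)."""
    src = os.path.join(old_home, ".ssh")
    dst = os.path.join(new_home, ".ssh")
    shutil.copytree(src, dst)
    os.chmod(dst, 0o700)
    for name in os.listdir(dst):
        if not name.endswith(".pub"):  # public keys may stay world-readable
            os.chmod(os.path.join(dst, name), 0o600)
    return dst

# Demo with throwaway directories and a fake key file.
old_home, new_home = tempfile.mkdtemp(), tempfile.mkdtemp()
os.makedirs(os.path.join(old_home, ".ssh"))
with open(os.path.join(old_home, ".ssh", "id_rsa"), "w") as f:
    f.write("fake key material\n")
dst = copy_ssh_dir(old_home, new_home)
print(oct(stat.S_IMODE(os.stat(os.path.join(dst, "id_rsa")).st_mode)))
```

Loose permissions are the classic gotcha: ssh silently refuses keys it considers too readable, which looks a lot like a missing key.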

  1. The adult iPad is differentiated from the child iPad, most generally, by the relatively small amount of yogurt crusted into its chamfered edges. [return]
  2. James S. A. Corey’s Caliban’s War, sequel to Leviathan Wakes. Great, fun, galaxy-scale space opera. [return]

Alex Payne writes a letter:

Investors, shrinking in number but growing in wealth and political influence, own the means of digital production. Everyone else is doing shift work and hoping they still have jobs tomorrow.

Today — the obligatory finally — Overcast hit the App Store, and all the guys are writing positive reviews.1

I have nothing wild to add; it’s nice: The overall UI refresh is fantastic compared to my previous podcast app of choice, Instacast. For a while I’ve been dissatisfied with Instacast’s extremely modal separation between player controls, playlists, and editing subscriptions. On these scores I’m really happy with the feel of using Overcast. A few other things have made it a real pleasure to use, so far:

  • Effortless import from other podcast apps is brilliant and executed so well.
  • The miniplayer at the bottom is great, immediately improving on one of the things that was frustrating about listening in Instacast.
  • Overcast resumes playing perfectly: If it cycles out of memory while I’m doing other things on the phone, opening up the app makes it perform much like Rdio does, with the miniplayer appearing at the bottom after a short delay so it’s easy to resume without having to re-find what I was listening to. Unlike Rdio, it even starts playing automatically in the car when bluetooth is active. I love this.
  • Continuous play mode combined with easy arranging of podcasts in a playlist: Cool.
  • Integration with web account for desktop use: Marco has said recently that he doesn’t like doing web apps anymore, and I suppose this barely qualifies as an actual web app; but he really gets web app infrastructure, and the backend support for Overcast works seamlessly and does what it’s supposed to — provide a web point of access to the things I want to listen to for those times I’m at my desk and don’t want to work through the iPhone. All it’s missing is a way to add a subscription (perhaps this is in v1.1), and meanwhile Huffduffer perfectly fits the bill for ad-hoc episode additions.
  • Sharing: Built-in sharing to Twitter and Instapaper, yes, thank you. Once I get something to one of those services, it’s easy to pick up downstream through IFTTT. And if I want to get to a specific sharing bookmarklet in Safari, there’s no need to deal with a built-in web view for podcast sites — just an Open in Safari option that makes all those links very close at hand. If there were a Pinboard share option here my social links would be complete, I think.

I’ll have to explore the recommendations feature some more.2 So far, I’m just pleased and happy. I have a couple of long road trips over the next two weeks, and I’m looking forward to putting the app much more fully through its paces.

  1. See MacStories and Snell, for example. [return]
  2. I see my friend Joel is having some trouble sorting it out, so I’m probably already at a big disadvantage. [return]

@YouAreCarrying on Twitter generates a random inventory from a compilation of classic Infocom text games. Here’s mine:

@youarecarrying i

a brittle scroll, a milky orb, a bottle of Bourbon, a beautiful brass bauble, your ticket, a spade, a ticket stub, an envelope.

Sounds pretty good to me.

Nich Maragos, Procedural Life Labyrinth: 17 Views of NetHack:

I played NetHack yesterday, I played it today, I’ll play it tomorrow. There’s a better than even chance I’m playing it as you’re reading this. I do this despite having basically exhausted the challenge of the game. Sure, I still die a lot, but less out of inexperience and more out of impatience. I know what to expect from every monster and what to do to counter it.

So why not move on? If it’s not hard anymore, is there a point? I suspect NetHack has gone past something I do to challenge myself and become another one of my comfort zones. I look over at my unopened copy of Bravely Default, and think, “That seems like a lot of effort.” So I sit in the same chair, in the same room, playing the same game, day in and day out. It’s not a good look.

I’ve been enjoying Steve Lubitz & Co.’s Isometric podcast for several weeks now,1 and his vicarious company for a little longer than that. Player Two Start is a sweet and nuanced story of his plan to introduce his daughter to his favorite classic games that gradually turns into discovering a new game to enjoy with her, one that matches her abilities. Along the way Steve makes some nice observations about elements of console gaming that hinder kids whose hands are still small and whose gaming started with touch screens, of all things. These are things I’ve started to think about with my son, too, and I really love what Steve has put together here.

(Related and also recommended: Pop Culture Happy Hour’s making toddlers into nerds episode, which gave me a small truckload of great recommendations while being similarly sensitive to the way kids need to find their own things to geek out on.)

  1. A show recently asked to join the 5by5 network — congratulations! — and on which Steve claims to be merely the token male among a crew of sharp, all-star co-hosts. But such protestations should not be taken seriously, for the entire cast does a great show. [return]

Use Simple Words

Vigorous writing is concise. A sentence should contain no unnecessary words, a paragraph no unnecessary sentences, for the same reason that a drawing should have no unnecessary lines and a machine no unnecessary parts. This requires not that the writer make all his sentences short, or that he avoid all detail and treat his subjects only in outline, but that every word tell.

Use Beautiful Words

There’s an amazing thing that happens when you start using the right dictionary. Knowing that it’s there for you, you start looking up more words, including words you already know. And you develop an affection for even those, the plainest most everyday words, because you see them treated with the same respect awarded to the rare ones, the high-sounding ones.

Which is to say you get a feeling about English that Calvin once got with his pet tiger on a day of fresh-fallen snow: “It’s a magical world, Hobbes. Let’s go exploring!”

I want AppleTV to get better in the following way: multiple cross-provider playlists of favorites, things to watch, and other things I have bookmarked. This would eliminate a lot of the need for a big UI overhaul and increase TV watching quality of life a ton.

Pipe dream: an open channel service similar to Roku that would let me plug in stuff like Rdio and, dare I say, Amazon. This will never happen, of course.

Also a pipe dream: a more visually-complete iOS app that would let me truly navigate the AppleTV menus from my phone. Use case: queuing up the next episode of Rescue Bots from the kitchen without having to get into line of sight of the TV. Come on, this can’t be impossible, right?

All I really want1 for iOS is a way to select items from clipboard history. Use case: grabbing a username and a password from 1Password and then switching to a target app where I could paste them both via two operations without switching back and forth twice between the apps. Similarly for things like email and phone number from a contact, or multiple snippets from an article I want to quote. The ease and great convenience of working with the Alfred clipboard history has spoiled me.


It wouldn’t be responsible not to speculate at this point. Here’s my Beats acquisition theory, which more or less came to me on the bike at the gym while listening to the first twenty minutes of The Prompt’s discussion of the value of curation. I didn’t get to finish the show, so it’s likely that my so-called original idea was discussed thoroughly after I got off the bike, but here goes:

I agree on the increasing value of curation; there’s too much music and noise out there to find the music I might like, and I’ve never found algorithmic you-might-like lists very useful. In my case it’s because they very rarely can tell me why I might like something.2 For this reason I’m pretty attracted to at least the concept of the Beats music service, which is promoted as much more driven by real-person music editors.

That said, making a profitable streaming service probably relies substantially on learning what consumers want to listen to, and with 1) slow uptake of iTunes Radio and 2) streaming supplanting buying, Apple was losing out on a precious resource that Spotify, Rdio and (to a lesser extent due to its newness) Beats were eating up at ferocious rates: actual listening data.3 By buying Beats, and assuming that Beats Music does gain a foothold among users, Apple gets insight into consumer preferences that was probably steadily leaking further and further out of their vision every quarter. And as we’re learning that the intent is to run it as basically a subsidiary, it maintains the brand that the Beats team has been building, one that may be a little more vibrant to some consumers than Apple’s. They get a big headphone business, with an injection of new industrial design and software engineering expertise along with it; it seems like a really strong move to me.

  1. That’s probably not all I really want, but it’s a start, and all I can think of on this sunny afternoon. [return]
  2. Pandora is supposed to be better at this, but I just never clicked with that service. [return]
  3. Worth mentioning here is, which I have to imagine has, because of its age, by far the biggest listening history database on the planet. That they’re not at the center of all of these conversations has got to be some kind of huge misstep by CBS. CBS owns; everybody knows this, right? [return]

The release of Editorial 1.1 has re-started me on the path of tinkering and tuning, and so far my impression is something along the lines of man oh man. Now, you’re thinking “oh great, another navel-gazing tools post,” and, no, you’re not wrong. This is another navel-gazing tools post. I do recall my college creative writing professor noting that poems about writing poems often make the worst poems; fortunately that observation probably doesn’t apply to blog posts about blog tools. We will see.

The short version is that I’m excited about this tool and this is a really powerful update. To elaborate:

  • Sub-workflows replace the previous ability to save and re-use workflows as “presets,” and improve on this idea greatly. A workflow can call another workflow as part of its own process, which means that re-using the things I’ve already built is easy, and I can improve those component workflows without having to trace their changes into the other places where they’re used.
  • The UI workflow builder is cool and I’m sure I will only use a tiny bit of its power. However,
  • Together, sub-workflows and the UI builder are fantastic: I can take my existing workflows and piece them together into a parent workflow complete with buttons! In about ten minutes of work I have a popup sheet with buttons for my PGH publishing options and tools that I can invoke directly from the keyboard. This is really, really slick. I admit I did not quite grok this when I read Oli’s preview posts about it, but the simple explanation really is true: one could use this to make full-on python-powered applications, complete with GUI, that run inside Editorial. I’m really excited to see what people come up with.

Also great: Ole’s backup/restore workflow, which has made it bang-easy to keep my workflows in sync across devices. Now with the iPhone version, I can begin work on any device and seamlessly pick up on another; this had been an obstacle between my two iPads, where I only had really tuned-up workflows on one or the other. Now I’m equally capable on either iPad or my phone, no friction. It’s almost spooky, how cool it is to see the workflow panel get populated with all the workflows restored from a backup. Previously, the way to do this was to sync each workflow manually via the workflow repository. The brilliance of a seamless, easy sync just cannot be overstated. I think the ease with which one can keep iOS platforms in sync is going to be a very big deal with this release. Relatedly, Ole has also put together a snippet backup workflow that looks likely to ease some of the frustration with the new method for using TextExpander in iOS 7.

My small kit of tools that I use to publish this entire site consists primarily of: write, upload1, render static HTML with my home-built tiny engine (server-side), deploy a preview, and deploy to “production.” I have had these operations automated within Editorial for quite a while now, and every facet of doing that just got even better. Moreover, the app’s improvements give me more reason to incorporate it into even more of my non-blog writing. In short, a great app is made even better; if you’re already a user you’re going to like it, and if you haven’t tried it but like writing and tools, I think this will be right up your alley.

  1. Really, upload here refers to syncing from Dropbox over to my web host server, where the server side static HTML rendering gets done. [return]
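Concretely, those publishing steps can be sketched as a small shell script. Everything here is a stand-in for illustration, not the real setup: the host name, key file, `engine` command, and paths are hypothetical, and `RUN="echo"` turns every step into a dry run that just prints the command it would execute.

```shell
#!/bin/sh
# Sketch of the publish pipeline described above. Host, key, engine
# name, and paths are hypothetical placeholders; RUN="echo" makes
# every step print its command instead of executing it.
RUN="echo"
HOST="me@server"
REMOTE="path/to/site"

upload()  { $RUN rsync -az -e "ssh -i mysshkey" "$HOME/Dropbox/blog/" "$HOST:$REMOTE/src/"; }
render()  { $RUN ssh -i mysshkey "$HOST" "cd $REMOTE && ./engine render src"; }
deploy()  { $RUN ssh -i mysshkey "$HOST" "cd $REMOTE && ./engine deploy $1"; }

upload
render
deploy preview      # eyeball the preview, then: deploy production
```

In real use you would set `RUN=""` so the commands actually execute, but the shape is the same: sync sources up, render on the server, then promote preview to production.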

Adactio: Journal—Selfish publishing ▹

I continue to just really, really like what Jeremy Keith writes about putting stuff out on the web:

I have to admit, I really don’t care that much about the specific technologies being discussed at indie web camps: formats, protocols, bits of code …they are less important than the ideas. And the ideas are less important than the actions. As long as I’m publishing to my website, I’m pretty happy. That said, I’m very grateful that the other indie web folks are there to help me out.

What’s New in Editorial 1.1 ▹

I’ve been working on this for over nine months, and in a lot of ways, it feels more like a 2.0, or at least 1.5. There’s a new look for iOS 7, an iPhone version, tons of refinements everywhere, and several major new features for building even more powerful workflows.

No kidding! What a great update to an indispensable application. I do most of my posting here, as well as other writing, with Editorial, and having it available on iPhone is huge. The TaskPaper support also looks intriguing. Can’t wait to see the writeups by Viticci and Weatherhead.

Pre-post update: naturally, Federico Viticci already has some gajillion in-depth words written about the update. Bookmarked.


The days are getting longer, so long that it’s hard these days to convince my nearly-four-year-old that, yes, it actually is bedtime. A recent Saturday evening date night found us out and about downtown and I made a few pictures. The Hotel Monte Vista is, of course, a totemic downtown Flagstaff photo; and this old GMC truck reminded me a bit of my grandfather’s ’53 Chevy pickup back home.

Since last summer I have been using Koken for image hosting and presentation here at Pretty Good Hat. It’s nice: a pretty interface, a Lightroom publishing service, and nice galleries. Unfortunately, since my host change, uploading into Koken has frequently failed, and we haven’t been able to track down the cause.

I tooled around last weekend to come up with an alternative. Although I like the built-in gallery publishing feature in Koken, I have not used it much, instead using Koken primarily just to serve up images in multiple sizes for posting here1. So I got to thinking about what I could replace it with. I started with Jeffrey Friedl’s Run Any Command plugin for Lightroom. Unlike the stock Lightroom function for running a script or application after an export, Run Any Command can work both with individual files and with a list of all the files in an export, which makes it great for my purposes, where I don’t want to do much file management on my own: using Run Any Command, I can limit my processing to just the selected files without needing to keep track of previous exports that might have put output files into the same directory.
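That per-file versus whole-list distinction is the heart of why the plugin fits here, and a toy shell sketch makes it concrete. These function names and filenames are invented for illustration; they just mimic the plugin’s per-file command (run once per exported photo, roughly its `{FILE}` token) and completion command (run once with the full list, roughly `{FILES}`):

```shell
#!/bin/sh
# Toy illustration of the two hooks described above. The per-file
# command fires once for each exported photo; the completion command
# fires once, with the whole export's file list.
per_file()   { echo "resize and list entry for $1"; }
completion() { echo "montage of $# files: $*"; }

for f in one.jpg two.jpg three.jpg; do
  per_file "$f"
done
completion one.jpg two.jpg three.jpg
```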

With Koken, I usually publish images to a gallery and then later return to incorporate the published photos into a blog entry. This is often on my iPad, where I use an Editorial workflow to process Koken embed URLs into markdown and then kick off the blog preview/publish functions. So when I thought about a replacement, I knew I would want some kind of capability to ease that process — but without the nice Koken library viewer I’d need something to help me identify images and filenames.

Turns out something that does just the trick is built into ImageMagick: The montage command, as written up in great & helpful detail by Pat David, can build a nice contact sheet, or, for my usage, a tiny reference gallery that I can deposit in Dropbox and quickly reference. And with a bit more code, I can add that single gallery sheet — a set of thumbnails, basically — to a dirt-simple HTML page that includes the image filenames and titles for further use.

The code

In case you’re curious, here’s how it works2. This is part of a Hard Drive export preset using Run Any Command. First there’s the Command to execute. This takes the exported file (set to appropriate size for my “fullsize” lightbox view in the LR export settings) and makes a smaller image for the blog link:

# Copy the exported file, then shrink the copy for the blog-size link
cp '{FILE}' '{NAME}'-small.jpg
# The ">" suffix only shrinks images larger than 300px; it never enlarges
/usr/local/bin/mogrify -resize "300x300>" '{NAME}'-small.jpg
# Append this file's name and title to a running list for the contact sheet page
echo '<li> {name}\t{Title}' >> ~/Desktop/contact_sheet_list-$(date +%y-%m-%d).txt

And then the Command to execute upon completion takes care of the tiny gallery and file handling:

# Build a labeled thumbnail sheet from the full list of exported files
/usr/local/bin/montage -label '%f' -font '/System/Library/Fonts/Helvetica.dfont' -background '#000000' -fill '#ffffff' -pointsize 10 -define jpeg:size=200x200 -geometry 200x200+2+2 -auto-orient {FILES} ~/Desktop/contact_sheet-$(date +%y-%m-%d).jpg
# Start an HTML page that embeds the sheet, then append the filename/title list
echo '<img src="contact_sheet-'$(date +%y-%m-%d)'.jpg"><br>' > ~/Desktop/contact_sheet-$(date +%y-%m-%d).html
cat ~/Desktop/contact_sheet_list-$(date +%y-%m-%d).txt >> ~/Desktop/contact_sheet-$(date +%y-%m-%d).html
rm ~/Desktop/contact_sheet_list-$(date +%y-%m-%d).txt
# Push the blog-size images, the HTML page, and the sheet to the web host
rsync -e "ssh -i mysshkey" /Users/alan/Pictures/Exported\ Photos/tgal/*.jpg me@server:path/
rsync -e "ssh -i mysshkey" ~/Desktop/contact_sheet-*.html me@server:path/
rsync -e "ssh -i mysshkey" ~/Desktop/contact_sheet-*.jpg me@server:path/
# Keep a copy of the sheet and page in Dropbox, then tidy up the Desktop
cp ~/Desktop/contact_sheet-$(date +%y-%m-%d).{jpg,html} ~/Dropbox/tgal
rm ~/Desktop/contact_sheet-$(date +%y-%m-%d).{jpg,html}

Again, all of the montage work is cribbed from Pat David; do check out his great site. The rest is my remarkably inelegant handling of the resulting files and concatenation of filenames into a list from which I can copy filename and title/caption into a post. On the writing side, I have a TextMate snippet and an Editorial workflow to grab that title+caption string from the clipboard and paste their respective parts into a kramdown-flavored image link:

[![lightbox2](]({:id: .lightview data-lightview-group="GROUP" data-lightview-group-options="controls: 'thumbnails', viewport: false" data-lightview-caption="title" }

It’s ugly, I know. But send that through kramdown and it gets the right Lightview styling and everything. So it’s actually pretty hot. I’m grateful to Pat David and to Jeffrey Friedl for documenting and building, respectively, some of the tools that help me do this. Jeffrey is a long-time Lightroom plugin developer, and I’ve happily supported his plugins in the past — now that I’m enjoying photography more again, it’s time to re-up!

  1. I use Lightview for a lightbox-style presentation. [return]
  2. This explanation is really so I remember how I did this in the future. [return]

The beautiful blueprints for Fujifilm’s camera of the future ▹

“If I want to play my favorite song, I want to choose my favorite guitar,” says Fujifilm designer Masazumi Imai. “It’s the same with cameras. If I want to take a photograph of something important to me, I want to choose a special product.”

The prototype photos of the X-T1 are super cool.

I love XKCD as much as every other right-thinking guy who likes science and math and the internets, and this strip about free speech says something that I think is important: Namely, it’s inappropriate to frame all restrictions of expression as infringement on the right to free speech.

However, in a series of tweets, mcc makes a critically important point, that the narrow implication of the XKCD strip is that only government can act to restrict the right to free expression, which is clearly not true:

We live in a world with many systems of control. The government is one system of control. It isn't fundamentally different from the others. — mcc (@mcclure111) April 18, 2014

We — and XKCD — may sneer at CEOs and TV stars who defend themselves from criticism by invoking “free speech”: It’s hard to sympathize with the rich and powerful who stand on a national stage and complain that they have no ability to speak their minds. While the Bill of Rights expressly protects all from restrictions imposed by the government, access to free expression is not at all universal, and, like so many things, is a function of power and resources.

This write-up by Buster Benson at Medium is a great, thoughtful piece on how he’s getting real return from the quality-of-life tracking he is doing with Reporter. He’s categorizing responses to Reporter surveys by whether what he’s doing at the time is “quality time” and then adding detail that explains why or why not. It’s exactly the kind of thing I thought about recently, but much more fully realized than my current use of Reporter — and it motivates me to refine and improve my own use, to ask questions of the information I am collecting and then find ways to act on it.

Pixel And Dimed: On (Not) Getting By In The Gig Economy: Sarah Kessler really takes the wind out of the “monetize your passion in your free time” marketing of errand and odd-job marketplaces like TaskRabbit:

I spend the biggest chunk of my time, about two hours, labeling photo slideshows at a nickel each. Each of them has five photos, and each photo has 11 pages of labels to use on it. That means that it takes at least 55 clicks to earn $0.05. There are slideshows of cats on couches. Cats on beds. Dogs on beds. Cats in sinks. Dogs with cakes. Cats with cakes. Cats with pizza. Cats with windows. Dogs in car mirrors. Dogs with bananas.

No surprise, it takes an awful lot of small gigs to even begin to get by (she has one or two good days in four weeks of hustling), but Kessler’s story shows just how seriously atypical the social media success stories are.

Working on some stuff, keeping busy and watching green sprout up in the garden. Meanwhile, I made a couple of quick updates to my writeup of hooking Runkeeper to Slogger. Just in case you’re interested.

Man did I ever save a lot of stuff to Pinboard this week, though.

“That Bites!” is a documentary about food allergies and living with food allergies by a 12-year-old kid in Chicago. Good for him. He’s funded, but I would love to see him get a ton of money and go big with this thing. Our son has a bunch of food allergies and it’s scary that he could be killed by a bit of peanut cross-contamination. Food is so central to so many activities, and his inability to casually participate — or, say, to get on a plane without our worrying — is profoundly saddening to me. Broader understanding of food allergies, their seriousness, and how to minimize risk to others in environments like restaurants is critical.

Originally posted March 8, 2014 / Updated March 29, 2014

Five days since all the text was removed from the TextDrive site and this notice went up on the relatively out-of-the-way community forum:

It is disappointing to report that after a year and a half of uphill battles and unimagined setbacks, after several costly efforts to regroup and find another way, options to keep TextDrive growing have run out, and we will cease operations on the 14th of March, 2014.

Six days at the time I write this, March 8, until TXD turns off the lights, and customers have yet to receive notice via email or on the front page of the business.

Moving Tips

I’m moving to Kaizen Garden. There’s an active forum there where former TextDrive customers are helping each other out. So far my migration has gone perfectly smoothly. I’ve cribbed some SQL and rsync commands from Joel Dueck’s set of helpful migration pointers.


The servers at Ubiquity stayed up a couple of weeks longer than expected, but finally went dark yesterday (March 28, 2014). Judging by the traffic on Twitter, an awful lot of customers were caught unawares. There was never a notification beyond the above-mentioned forum posting, and Jacques Marneweck, formerly of TextDrive and now running Kaizen Garden, has been working at all hours to field requests from stunned users and perform recovery from backup, where available, but not everyone’s data is recoverable:

I hate to be the bearer of bad tidings to a number of users. Writing a reply where there are no backups for a users data really truly sucks. — Jacques Marneweck (@txdjm) March 29, 2014

To be very clear about a couple of things: There was no notification sent directly to customers, and this is on Dean Allen. I wish I could be stunned by this, but I’m mostly disappointed. Dean’s lack of communication is inexcusable, but unfortunately not surprising, given his absence from the operation of TextDrive after he took it over from Joyent. When the Joyent-TextDrive transition took place I noted some of my own concerns about Dean’s capability to pull it off: “… But I concluded that we’re all more or less adults, that the key folks are smarter at this stuff than I am, and that I’d trust Dean not to jump back in through a fit of (merely) fury or loyalty.” Perhaps I should have listened more to that internal warning.

And: My not entirely informed understanding is that Jacques carried the Ubiquity tab for an extra couple of weeks, on his own, in order to help with migrations, but could not perform a global notification because he never had access to the customer database itself. He has, without promise of compensation, taken on helping with recovery for users who aren’t really his customers. This after running TXD operations without pay for months.

I very much hope that Kaizen Garden succeeds profitably, both for my own self-interest of avoiding another migration, and to begin to repay Jacques for the tremendous work he has put in. To reiterate my note above about my migration, my experience there has been flawless: Hosting in an environment nearly identical to TextDrive, which means I had little to do on migration other than import a few databases, move files into place, and throw the DNS switches. It’s a highest-quality operation and has a smart, driven chief at the helm.

Joyent, Née TextDrive

For the sake of completeness, I also wrote about the original announcement of Joyent ending lifetime TXD hosting, back when.

Statistically, winters in Flagstaff average fifty inches of snow, and in practice we’re used to that meaning that some winters we really get dumped on, and other winters see rather little snow. The first winter we spent here brought us only a few storms, but since then we’ve had some pretty good seasons, including the whopper of a winter of 2010, when a four-day span dropped five feet of snow on us. That winter I was lucky enough to buy one of the last unsold snowblowers in town and just barely keep up with the storm.

This year we had the longest winter-time dry spell on record, ever: Over thirty days without so much as a whiff of precipitation. So the whole town was buzzing over the prospect of the storm that came through late this week and into the weekend. Reports are that it brought a couple of feet to the San Francisco Peaks, and we got wonderful rain here in town, where it stayed just too warm for it to fall as snow except for a brief period overnight.

I got out around town for an hour or so between storms, looking for good puddles, stormy light, and view of some of my favorite downtown scenes, as well as some new alleyway nooks and crannies.


The view toward the Monte Vista from the south side is one of the iconic pictures of this town, and it’s easy to see why. I can’t resist checking it out nearly every time I walk that direction. This night, the late sun poked through the clouds at just about the right time.


Downtown has a lot of streets at odd angles to one another, old gravel lots, and a mix of new and old construction — churches next door to motels next door to restaurants, and almost everywhere a view of the train tracks or buildings that once housed businesses related to the trains. The neighborhoods are home to small, old homes in different stages of repair, depending on how long they have served as rental housing for NAU students. I enjoy the mix of home construction, some of which shows a strong southwestern influence with adobe and tile, while other homes are in the mountain-town style of clapboard, shingles, and corrugated metal. There’s one fascinating house downtown built out of converted shipping containers. Last night, there were some fun puddles to be found, shining up the late evening light.



I made it back to the car just as the storm really opened up again, cold hands flexing in the warm car after holding the camera in the wind and incipient rain. The rain turned to slush and snow as the temperature dropped, and in the last block I pulled up my hood, stuffed the camera back in the bag, and hustled the rest of the way.

A VSCO note

My photo walk was in part inspired by the need to get out of the house, and in part by Michael Laroque’s “Addicted” — VSCO Film 05 writeup. I really love Michael’s photography, and his discussion of Film 05 (I have Film 04 and Film 02 already) prompted me to pick it up after hedging for a few days. Do I really need it? I wondered. Michael knows that the point of these preset collections isn’t to make every photo look the same, but to find some inspiration in the looks they offer:

But it does mean I get to play with a new toolbox and with each and every release, some of the new emulations have triggered ideas, or found their way into my workflow in some shape or form. In many ways it’s like buying a great photography book… It inspires and shakes the status quo even if you don’t end up copying everything you’ve seen.

So, no, I don’t need it, and as a hobbyist I’m not going to make any money or anything. But I like them, and as I’ve mentioned previously, the film packs don’t take me back to my golden days of shooting film (though they do make me think about how long I spent shooting 35mm film in an automatic camera without thinking a moment about it); they help me find moods or textures or ideas that I might not have otherwise. So as someone who just enjoys this, it’s easily worth the price.