Why I Don’t Have a Personal Site ▹
If you give a developer a domain, they’re going to want to host a site on it. But where should they host it? They’re probably going to need to do some research on hosting. But they can’t decide on hosting until they figure out how they’re building the site. They’re really familiar with one framework, but since they have full control, maybe they should experiment with another framework.
Abandoned in Place
Photo: Launch Ring Restored, Launch Complex 34, (Apollo Saturn) Cape Canaveral Air Force Station, Florida – Roland Miller
Roland Miller is a family friend and I grew up with many of his photos in the house. He has been working on a project titled Abandoned in Place for more than twenty-five years. He is documenting in photographs the facilities and structures that were the foundation of the U.S. space program, like launch towers, gantries, factories, control rooms and panels.
Roland’s Kickstarter for an Abandoned in Place photo book is almost home! If you are interested in space, science, history, or photography, this is right in your sweet spot and I encourage you to check it out. You can get the book — with essays and memories from space program scientists and astronauts and writing by Ray Bradbury! — along with signed prints, and help preserve the stories and images of these places that are gradually being lost to time and the elements. Go go!
Bridgy by Ryan Barrett is another fantastic-looking and new-to-me tool for spinning up your own piece of the indie web. I’m really excited for the possibilities presented by it and webmention.io, both of which I found via Jeremy Keith. His Indie web building blocks post is one more great primer on how all this works.
I have so much to do. #
Webmention.io is a super idea, basically webmentions as a service from Aaron Parecki, with code offered for running your own server. This looks a lot more sophisticated than my implementation, and, hey, idea: This kind of mechanism might be a great way for sites (like mine!) over at the Tilde Clubs to implement webmentions turnkey-style in an otherwise limited server environment.
Maybe that’s something to tinker with this weekend!
Paul Ford had this idea, opened up a pop-up unix server called Tilde Club and invited friends and strangers to come put stuff on it. It’s pretty cool. I’m ~schussat over there.
September 28, 2014
I’m a long-time Lightroom user. Since its beta release it has been the place where my photos go. I have a slew of export actions and have hacked together methods of varying sophistication (such as the export script that produces a tiny gallery) to accomplish what I want to do. I dug into the data that Lightroom gives me about my photos, too: See, the Lightroom database is sqlite, and one can pull all kinds of things from it, such as this network map of my keyword relationships, or my then-annual exploration into My Year In Metadata.
Way back when I was running an Android phone, I had scripts and smart galleries to help sync photos from that phone into my Lightroom library and later populate galleries to sync to my iPad. It was pretty high-tech, you guys.
This is all a way of noting that I’m very much at home working on my photos in Lightroom, so that even when I got myself my first iPhone a couple of years ago and could get photos from the device onto my MacBook Pro, I didn’t have much use for iPhoto. Photostream wasn’t much of a solution for me, because, until I recently got a shiny new Mac, I had a version of iPhoto that predated it; my cloud photos couldn’t magically sync to my laptop unless I upgraded iPhoto, and I didn’t really want to do that. So I put up with periodically plugging the phone into the Mac (horrors) or, later, using Dropbox’s photo upload capability, to get pictures out of my phone and into Lightroom.
Point is, there was still friction. But I think now I have a solution, thanks to the good photostream syncing that I now have on the new MacBook Pro with a current version of iPhoto, plus this great and now indispensable tool: PhotoStream2Folder. PhotoStream2Folder basically does what it says on the tin, and does it well: Identifies photos in your photostream (optionally within a specified range) and moves them into a usable location on your file system.
The Lightroom trick? Lightroom can monitor a directory and automatically import anything that lands in it. So with PhotoStream2Folder I just set up its output folder as the directory that I already have monitored from Lightroom. The next part is pretty close to magic: Within moments of landing in my photostream, new images are automatically in Lightroom. The mental overhead is just gone. I tried this in my kitchen: Took a photo of a cup of coffee (because I do), then wandered over to the MacBook Pro on the counter; the photo was already in my Lightroom library. I went to the Sunday farmers market, shot a bunch of photos with the very impressive iPhone 6 camera, and when I opened up the computer upon arriving home, all the photos were in my library, ready for me to edit and share via my extensive and baroque variety of Lightroom export/publish presets, with no iPhoto interaction, copying, or re-importing required.
One more thing: If iPhoto is importing my photostream images, too, won’t I end up with a bunch of duplicates? Very good question. Fire up iPhoto preferences > iCloud, and turn off “automatic import.” Then iPhoto will display photos from your photostream, but not import them into your catalog, and they’ll “age out” over time. Meanwhile, all those images will be seamlessly added to Lightroom, where you really want them.
The author of PhotoStream2Folder, Laurent Crivello, asks for a small donation via PayPal if you find the tool useful. I think it’s awesome, and want to stress that it can be used for anything, not just Lightroom; if you simply want to get photos out of your photostream, without relying on iPhoto as your photo management application, this is your go-to. With one quick solution (please work on Yosemite, please work on Yosemite) it has taken all of the mental overhead and friction out of managing the photos I shoot on my phone.
This week’s Humble Bundle is the first in quite a while where I haven’t already owned most of the included games. Of the set, I only had Papers, Please, so I went for this one. And all weekend long I’ve been playing SteamWorld Dig. It’s fun, has great two-dimensional art, is challenging but never obtuse, with a nice learning curve.
Gamasutra has a cool writeup by the developers that describes the design decisions they made in the dig and jump mechanics. It’s a good read, and a great game. #
September 7, 2014
There’s some research somewhere — or perhaps merely a critical contention — that focusing on taking pictures instead of appreciating the moment inhibits our formation of memory. That is, the argument goes, all our social media documenting makes those moments increasingly fleeting and uncaptured. Well, I spent a good chunk of yesterday slowly making my way through 2007 in photos, and was struck again and again by those memories. I took more than 4,000 photos in 2007; when I scroll through those images I am astounded by what I remember.
I shared many of those photos at the time. I was trying out my first Project 365 on flickr, and I posted at least one and sometimes several pictures per day. It was a pretty big year for me: I lived on my own in Seattle for a while during a predoctoral internship; finished my dissertation and graduated; interviewed for some jobs; made a significant career change; bought a house; and captured hundreds more daily snippets of life because I was basically taking pictures all the time. Walking through not only the pictures that I shared, but the photos that I didn’t share, brings back a flood of recollections.
Don’t Forget to Remember This (At John Carey’s blog, which I discovered via Shawn Blanc) is a wonderful essay about why we take pictures and about the pressures that shape what we shoot, and for whom:
The challenges present in photography today are not in the devices we use to capture, it’s not in our approach, skill level, or what we think we need to create good photos; the problem today is in social pressure. Photography has quickly evolved in its short lifespan from revolutionary, to useful, to ubiquitous and full of expectation.
John Carey works through the conflict in contemporary photography between one’s own perspective and the aesthetic driven by likes, shares and faves. So much of what he says resonates deeply, all the more so as I think about — and look at, again — the photos I never shared: My wife, my son, moments that might be snapshots or might be carefully composed but which were shut out of sharing because perhaps they weren’t fancy enough or evocative enough, or maybe just because I had already posted a couple that day.
Some of the pictures of my wife and son (born much later than the year of photos I have been poring over this weekend) are images I would love to share; they’re so beautiful, and pictures of people have so much vibrance and life to them, which I am always so happy to preserve. But they’re also private. The stars and comments of “great capture” might superficially validate me or make me feel like a portrait photographer with a fine eye. But a flickr friend’s heart would not feel what mine does when I find, again, that lucky photo of my wife suddenly laughing, so wonderfully bright and alive.
My current camera of choice is this wonderful Fuji X100S, but previously I shot with a Pentax K100D and a growing collection of prime lenses. It’s easy to be captivated by new, fun, fine gear: gear acquisition syndrome is driven just as heavily by the peer pressure to make photos like others’ photos and in turn the notion that doing so requires having their gear. But as I look back at these older years’ collections of photos I am also struck that some of those photos are really good. Not simply because I like the subject or I found a good moment, but technically good: They’re sharp, colorful, detailed. That’s a good little camera that’s now nearby on the shelf with its FA35mm lens mounted and ready to go; that combination perfectly fit my own photographic vision for years, and I loved going out and using it.
John Carey again:
My compositions and developing have similar fingerprints in that they tell me a lot about how I felt when I made the photographs. Every click of the shutter for me is a moment worth remembering and it’s the memories that make photography so gratifying for me. I find so much to be thankful for when I look back through the images I have captured through the years.
Back to my own memories: Exploring the photos I made that year, I can’t say that I recall every single moment. Sometimes I was detailed enough to put in a pretty good caption. But in the context of the surrounding images, I get so much back: That was a photowalk around Bellevue; this was a hike on the Arizona trail (and that summer we hiked almost every day!); here’s the celebratory drink the night I decided to go for it; the job interview trip and Half Moon Bay with my grandfather; sitting in the backyard with the dogs and making coffee. Normal days and extraordinary days all lined up next to one another.
And on and on, now with my iPhone and Fuji (regarding which I must confess that I am beginning to feel some desire for the flexibility of point of view offered by an interchangeable lens system; that’s another desire re-kindled by my back catalog and many favorite 50mm images from the old Pentax); and even more with a boy now in preschool and a city and neighborhood that I still frequently walk, camera on my shoulder.
Amethyst, a tiling window manager for OS X
Amethyst arranges application windows on screen (or on a specific space) into a tiled arrangement based on one of several models. (I am using the “tall” setting.) It takes a bit of getting used to, but I like it now that I’ve sort of got the hang of it. The key thing to grok is that it arranges all visible windows, so if you have a ton of visible windows, hide everything you don’t need at the moment. Un-hiding a window will cram it into the existing arrangement — by shortening one window to make room for another, for example. So you can pop up a terminal window and have it fit into your view while you use it, and when you hide it, the window(s) that made room for it will expand back to their previous size. It’s controllable by keyboard, so you can float, resize, and cycle through windows without touching the trackpad. Quite cool.
September 1, 2014
With a MacBook Pro upgrade, mentions hacking, and sync updates, it’s been a sort of Infrastructure Summer around here. To cap it off, after getting the new MBP up and running I decided to get serious about my home storage and backup.
I have frequently heard good things about Synology, so I started there. One of my goals was to centralize storage from a rolling series of external USB drives that I have accumulated over time. These drives have occasionally been connected to an Airport Extreme, hosted via an old iMac, or just plugged in from time to time to offload photos, music or video into “offline” storage, where it’s no longer taking up space on my laptop.
While I sort of drool over the fancy multi-bay devices like the DS1513+, my needs just aren’t that heavy. So Shawn Blanc’s Brief Review of the Synology DS213j made that particular device look just about right. His writeup is the one I pinned for later reference and leaned heavily on for first principles when getting up and running with my very own DS213j. I also learned a lot from Gabe Weatherhead’s post about moving from a Drobo to a Synology. (I don’t know what I would do with Gabe’s 24-port switch save for gaze at the magnificent 18 unused ports, but I picked up an 8-port switch with my DS and dramatically improved the wiring in my cabinet as well as getting a spare hard-line to my laptop for when I’m at the desk.)
Like Shawn, I plugged two 3TB WD Red drives into the box. On setup, the Synology gives the option of automatically configuring the second drive as a mirror of the first; great, one less thing for me to figure out. So that leaves me with 3TB of onboard space, plus a USB drive plugged in as an additional redundant backup.
Obligatory Old-Days Aside: My first hard drive was a 20 megabyte disk installed in a Compaq DeskPro 386s. I later convinced my dad that we needed to replace it with a 40 meg drive so that I would have room for Wing Commander and my WWIV BBS. Sometimes the very idea of gigabytes of storage, to say nothing of three terabytes of disk space, still sort of blows my mind.
So what am I doing with all this?
- Time Machine backup: I have a 500GB SSD on the new MacBook, so I set up a 1TB share dedicated to Time Machine. For the first backup I used the cabled connection to my switch, and my experience so far is that routine Time Machine snapshots to the Synology work fine via wifi. When I know I need to move around a lot of bits, I hook up the cable. Synology has good instructions for time machine setup.
- All of my Lightroom photos and previous iPhoto libraries are now online. More about this below.
- iTunes media lives here, too. My iTunes library file itself is still on my laptop; some people have the need to share everything across multiple computers, but I don’t need to solve this issue at home.
- Storing media this way does not make it accessible to the DS Audio application, unlike the video application, which can pull from multiple file locations.
- I have not yet explored remotely accessing any of the content on the Synology.
- My nine-year old Brother network printer is now AirPrint enabled, simply by plugging it in to one of the USB ports on the back of the device.
- Consolidation of backups from very old machines, like the Windows laptop that preceded my 1995 iMac.
Visibility & Accessibility
I greatly underestimated the satisfaction of having seamless access to my previously-archived photos. For years I periodically moved six months to a year of my Lightroom library onto an external hard drive. This freed up precious storage space on two generations of laptops but meant that I couldn’t get to those images without finding and plugging in that drive. Moreover, with each offloading cycle I had to re-identify the folders I had previously moved over and check backup status before I could even start moving the current set. It was a lot of mental overhead and consequently was all too easy to put off.
Now, with the NAS, I simply scroll further back into the years within my Lightroom library, and there are my photos. I am rediscovering photos and memories that I haven’t thought about in years. In photographic terms it’s inspiring: There are some pictures there that I’m proud of, work that I really like. In emotional terms, it’s wonderful: The summer we hiked every day; the trip to Orcas; odds and ends of days.
Under the hood
Shawn and Gabe both discuss the DiskStation software on the Synology, so I won’t go much into it except to say that I really like it. Basically a Unix-based NAS, it’s a sophisticated piece of hardware with a lot of capability. Synology overlays an accessible interface on top of the Linux guts of the device, making volume creation and management, user administration, and application installation really slick.
Since it’s Linux you can do all kinds of stuff, like ssh into the box, copy over an rsync script, and use the box as a server to perform scheduled backups to and/or from a remote server. Zarino Zappia has a series of great posts that detail accessing the Synology via ssh and moving files around through the terminal.
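Since those scheduled backup jobs are still on my to-do list, here’s a rough sketch of what one might look like, wrapped in a little ruby so the command is easy to test and tweak. The source path, remote host, and the `@eaDir` exclude are placeholders of my own, not a real configuration:

```ruby
# Hypothetical nightly backup job for the DiskStation. The source path,
# remote host, and excludes below are placeholders, not my real setup.
def rsync_command(src, dest)
  # -a: archive mode, -z: compress; skip Synology's @eaDir metadata folders.
  ["rsync", "-az", "--delete", "--exclude", "@eaDir", src, dest]
end

cmd = rsync_command("/volume1/photo/", "me@remote.example.com:backups/photo/")

# Only actually run the transfer when explicitly asked to.
if ENV["RUN_BACKUP"]
  system(*cmd) or abort("rsync failed")
else
  puts cmd.join(" ")
end
```

Dropped into the DiskStation’s task scheduler, something shaped like this would handle the “backups to and/or from a remote server” part.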
I would likely look for a device with more USB ports next time. I’ve already swapped the printer with another USB drive a couple of times. It would be much more convenient to have a spare cable that I could easily attach another drive to temporarily.
At some point I’ll explore the features I get by opening up the NAS to external access, like music sharing to my mobile devices, as well as general file sharing. I have some of those remote server jobs to schedule, and I’m interested in trying out the Photo Station app. Meanwhile, there’s not much downside to this solution for me, and at this point it’s pretty much a fully operational battlestation. I got storage, backup, new capabilities from the Tiny Server in the Synology, and even got my cabinet cleaned up as part of my installation. I’m happy.
August 31, 2014
For some time now I have used mover.io to:
- Copy markdown posts from Dropbox to my web server where my static blog engine parses and publishes them
- Sync the slogger-generated entries and photos from that same web server up to Dropbox, where they are integrated into my Day One journal.
- Occasional other file transfers
Previous posts about this: using mover & mover and slogger.
Mover is positioned to provide this service at scale, and while I’m no enterprise power user, the tool was very useful for me. And hacking with its API was itself a fun project. So I was surprised to learn, when my daily sync jobs appeared to be failing, that the Mover API was no longer supported and had been unsupported for a year!
Well, it was good while it lasted. Fortunately, the guys at Mover were nice about it and pointed me to a new tool, Kloudless. Because Kloudless returns similarly structured JSON in response to API requests, it didn’t take too long to modify my existing code to get up and running with Kloudless. So far, it’s working great: Kloudless seems to be faster than Mover, the API is quite nice, and it’s built for what I’m doing — API-based file transfers, rather than offering the API as a sort of side feature.
The biggest obstacle to getting up and running was needing to figure out how to handle the paging of returned results; unlike Mover, Kloudless returns only up to 1,000 items at a time, and my Day One folder has more entries than this. This is entirely sensible, of course, but it took me some time to lock in on a solution. After that it was a matter of cleaning up a little and updating my Editorial workflow to copy a file over and rebuild, and I’m back in action on both my laptop and mobile devices.
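For the curious, the paging logic boils down to looping until the API says there’s nothing left. This is a hypothetical sketch with a stubbed fetcher; the `"objects"`/`"has_more"` field names are my stand-ins, not Kloudless’s actual response format:

```ruby
# Page through an API that returns at most 1,000 items per request.
# fetch_page.call(page) returns a hash like:
#   { "objects" => [...], "has_more" => true }
def fetch_all(fetch_page)
  results = []
  page = 1
  loop do
    body = fetch_page.call(page)
    results.concat(body["objects"])
    break unless body["has_more"]
    page += 1
  end
  results
end

# Stubbed example standing in for real HTTP requests: three pages of results.
pages = [
  { "objects" => (1..1000).to_a,    "has_more" => true  },
  { "objects" => (1001..2000).to_a, "has_more" => true  },
  { "objects" => (2001..2300).to_a, "has_more" => false },
]
all = fetch_all(->(page) { pages[page - 1] })
puts all.size  # => 2300
```

Once I had this loop in place, my Day One folder’s couple thousand entries came through in a few requests instead of silently truncating at the first thousand.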
Everything should be shiny once again! Thanks, Kloudless.
August 24, 2014
I’ve thought a couple of times about ways to connect this lonely personal site to the broader world. I have plenty of outgoing links, but little way to know if my posts actually connect with people on the other end, save for the occasional referrer link. Once or twice I started drafting a very small commenting system (one of which used the App.net API and would have stored comments in that system), but my energy waned with my enthusiasm for actually doing something like managing comments.
So I’m really intrigued by the notion of POSSE publishing, which I encountered via Jeremy Keith’s Adactio. The core idea, briefly, is about owning the stuff you put online — which resonates with one of my original motivations for putting up this site — and the community around it has developed and assembled a bunch of resources to implement this goal.
What does this mean for interaction? Among other things, what we think of as “comments” begin their life by being published at the writer’s own site, through the implementation of webmentions, where the target site builds its own mechanism for interpreting and syndicating those comments. The publishers going full-on with this method are using it to syndicate to platforms like Twitter or Flickr, too, all with the goal of initiating everything they put “out there” from within their own tools and sites. Needless to say, this is pretty cool, and it’s an opportunity for me to tinker with Pretty Good Hat, too.
So How Does This Work?
There are two core elements to building this: Sending outbound webmentions, and handling incoming webmentions. I’m going to try taking on each of these in pairs of phases: Phase One is most minimal, and Phase Two will try to be a little more sophisticated and integrated. Each step will need to work within the constraints of my little existing blog engine, and it all leans heavily on the writeup and work done by Jeremy.
Constraint one is that I don’t want to get too deep into the guts of the existing build system for my static blog. This means that building deep microformats support into my posts will have to wait; for now I’ll work microformats in through specifiers to the kramdown processor and minimal modifications to the ruby code that actually builds the site.
Constraint two has to do with the fact that this is basically a static site, and I like it that way. But being a good indieweb citizen requires reciprocity: I don’t want to just be a publisher of my own stuff, and need a way to respond to the generosity of others who may link to me. The “live” appearance of links that may ping my content via a webmention poses some design difficulties.
What I’ve Got So Far
What I have going for me is that this joint is built in a pretty stepwise fashion. The pieces are ruby, but they’re invoked sequentially in a shell script. Without modifying the core write-and-post mechanism, I can add a post-processing step that uses the mention ruby client to send webmentions automatically to each URL in my own posts.
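A rough sketch of what that post-processing step does, using net/http directly rather than the mention client’s real API (which I won’t try to reproduce from memory):

```ruby
require "net/http"
require "uri"

# Pull every outbound http(s) link out of a rendered post.
# A regex is crude but adequate for my own predictable markup.
def extract_urls(html)
  html.scan(/href="(https?:\/\/[^"]+)"/).flatten.uniq
end

# Send one webmention: a form-encoded POST of source and target.
# This assumes the endpoint is already known; a fuller version would
# discover it from the target's Link header or HTML first.
def send_webmention(endpoint, source, target)
  Net::HTTP.post_form(URI(endpoint), "source" => source, "target" => target)
end

html = %(<p>See <a href="https://example.com/a">this</a> and <a href="https://example.com/b">that</a>.</p>)
puts extract_urls(html).inspect  # => ["https://example.com/a", "https://example.com/b"]
```

The shell script just runs something like this over the freshly built entry after the normal publish step finishes.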
So that’s sending webmentions. What about receiving them? Again, I’ll lean on Jeremy’s work and use his minimum viable webmention code. I start by simply receiving and logging the mentions, putting them into a sort of attachment file linked to each of my own posts.
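Here’s roughly the shape of that logging step, in the spirit of Jeremy’s minimum-viable approach; the one-file-per-post layout and field names are my own invention:

```ruby
require "json"
require "uri"

# Log an incoming mention by appending a JSON line to a per-post
# "attachment" file, named after the target URL's slug. Verification of
# the source (fetching it and confirming it links to the target) would
# happen before this step in a fuller implementation.
def log_mention(source, target, mentions_dir)
  slug = File.basename(URI(target).path)
  slug = "index" if slug.empty? || slug == "/"
  entry = { "source" => source, "target" => target, "received" => Time.now.to_i }
  File.open(File.join(mentions_dir, "#{slug}.mentions"), "a") do |f|
    f.puts(JSON.generate(entry))
  end
end

require "tmpdir"
Dir.mktmpdir do |dir|
  log_mention("https://example.com/reply", "https://example.com/some-post", dir)
  puts File.read(File.join(dir, "some-post.mentions"))
end
```

The entry pages can then pick up their own `.mentions` file when displaying comments.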
One great benefit of building these two sides of webmention support is that I can build them in parallel and use one to test the other.
On the question of statically versus dynamically displaying webmentions: The strongest way to respect my static site goal would be to receive mentions, store them, periodically check for anything needing processing, and add any new mentions to the static site entry pages. But I decided on dynamic display so I wouldn’t need to poll for updates or add more to the core build script. This means a small load when displaying pages, and I could change to a scheduled rebuild if this ever becomes a problem — but this is a very low-traffic site, so that’s unlikely. I also keep a single unified log file for webmentions that I could use to rebuild all mentions if desired, or as my treatment of them grows more sophisticated. For example, right now I reach out to target URLs and retrieve only the p-name of the entry, if available.
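For illustration, pulling the p-name out of a fetched page might look like this. A real implementation should use a proper microformats parser; this regex sketch only handles the simplest case:

```ruby
# Extract the p-name (the microformats2 "entry title" property) from a
# mentioning page's HTML. Only covers the simple case where p-name
# appears in a class attribute directly wrapping the text.
def extract_p_name(html)
  match = html.match(/class="[^"]*\bp-name\b[^"]*"[^>]*>([^<]+)</)
  match ? match[1].strip : nil
end

html = %(<article class="h-entry"><h1 class="p-name">A Reply to Your Post</h1></article>)
puts extract_p_name(html)  # prints "A Reply to Your Post"
```

When a mention arrives, I fetch the source URL, run something like this over it, and store the title alongside the mention for display.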
So where am I now? After several hours of tinkering over a couple of weekends I have a functional webmentions implementation. Mentions should be registerable against any post here at Pretty Good Hat, and will be displayed on the single-entry page for that post. My endpoint is discoverable, but like Jeremy I also include a link (again on the single-entry pages) to a submission form for manually sending a mention. I accomplished this in what I think is a reasonably effective way given the constraints I described above: I had to add a few formatting stubs in my build code to put minimal h-entry syntax into my entries, and it took a fair amount of trial and error to get that display the way I like it.
Presenting the mentions requires a jquery call, so it cheats the static site a bit, but I’m already including jquery for lightview, so I may as well re-use it.
I extend a huge thank you to Jeremy Keith, whose posts got me started on this. (This one in particular.) I’m also grateful to this microformats post by Barnaby Walters, from which I learned a bunch. This is obviously a thing in process, and I welcome and appreciate any comments and feedback.
Microsoft.com in 1994
It’s approximately the twentieth anniversary of the Microsoft “home page”, as we called these things at the time, and a team has recreated that original site after some digital spelunking. One of the fascinating parts of the story is that there’s no solid record of exactly when the site was launched or what circa-1993/1994 web technologies were used.
There’s a not-at-all secret easter egg tucked into the code:
<TITLE>Welcome to Microsoft</TITLE>
<!-- To read more about the re-creation of the 1994 Microsoft.com homepage, see readme.html -->
<H1>Welcome to Microsoft's World Wide Web Server!</H1>
<H2>Where do you want to go today?</H2>
<P>If your browser doesn't support images, we have a <A HREF="1994-links.html">text</A> menu as well.</P>
And the linked readme.html describes the effort, including details about the pre-client-side image map, and a cool personal note from one of the developers: “It’s a big story with many turns and it makes you thankful for all the people (and companies) who have sought to make the web great.”
August 8, 2014
A good desk-clearing is in order, as is evident from the volume of drafts I have begun to accumulate. Here, in no particular order, is a partial inventory of unfinished thoughts, notes and un-realized blog fodder. Free ideas all around.
There was one about going on vacation with my entire digital world so one can stay up to date with RSS feeds, Twitter and the Facebook while sitting in the reeds next to the river. My road trip media entourage includes an iPad loaded with media and games to help a four-year-old get through a ten-hour road trip, the trusty MacBook Pro (here primarily as a photo download location, since I take lots of photos while on any trip); and the adult iPad loaded with music, comics, books I ambitiously planned to read, and games to play. As it turned out, with all this digitalia at hand, I spent a lot of time biking up a mountain and later sitting in the shade reading an honest-to-god book. I know.
A lot of talk in my media sphere lately is about women in tech, specifically gaming but as part of a broad current that has widened over the past few years as women respond to sexism at conferences, in hiring, and in everyday interactions. Being a fallen academic, I naturally want to point some of this conversation to classic work like Rosabeth Moss Kanter’s Men and Women of the Corporation and the large body of writing on gendering in the professions (medical sociology, which I taught for a while, has a lot of this). The people responding to well-written and heartfelt posts about fear, sexism and misogyny with “everybody knows women don’t want to work in tech/choose to have babies/aren’t good at it anyway/asked for it” won’t care, but these bodies of work might help situate the authors’ experiences in a trajectory. While the tech industry didn’t invent this experience, it does seem to have polished it to a pretty fine sheen.
David Carr wrote recently on the overwhelming feast of prestige television:
The growing intellectual currency of television has altered the cultural conversation in fundamental ways. Water cooler chatter is now a high-minded pursuit, not just a way to pass the time at work. The three-camera sitcom with a laugh track has been replaced by television shows that are much more like books — intricate narratives full of text, subtext and clues.
On the sidelines of the children’s soccer game, or at dinner with friends, you can set your watch on how long it takes before everyone finds a show in common. In the short span of five years, table talk has shifted, at least among the people I socialize with, from books and movies to television. The idiot box gained heft and intellectual credibility to the point where you seem dumb if you are not watching it.
About the same time, an episode of Pop Culture Happy Hour focused on nerds and nudity, with discussion of “nerds” becoming more acceptable as “enthusiasts” through sharing the things they (we) love.
There is a lot that is complementary about these two pieces, in the rising cultural currency of pursuits that have been marginalized and/or lowbrow. Pop Culture Happy Hour in particular is in position to bridge these realms (and continues to be one of the podcasts that I really look forward to). The cynical take on this is that it’s all marketing, that the kids who grew up on comics and video games can now buy them for themselves. But the one I prefer is that the kids whose interests were marginal — or marginalizing — are now in positions to make exactly the things that they love, and are able to share those things widely and sometimes (hey, because of that culture industry) even make money at it. Good for them [us]!
Should you find yourself migrating to a new MacBook, as I did recently (happily, not due to disaster or failure), don’t forget to copy over your private SSH keys. What am I doing tonight? Re-generating keypairs so that I can publish this very post through my bloggy-woggy machine. Speaking of which, Frankenstein’s ___.sh looks like a very cool static blog publishing project: “It is a nameless, horrible and recursive assemblage of mkdir etc…, has no option, no for, and uses almost no variable.” Neeeat.