Journalism and leaks

Over at Lawfare, Jack Goldsmith had a great article called “Journalism in the Doxing Era”. Professor Goldsmith examined the differences between data published by Wikileaks and The New York Times. I’m no journalist, but I am a journalish, and what stood out to me is the question of what makes the act of publication journalism.

Two attributes, in my mind, make the publishing of leaked or stolen information journalism. The first is authentication. Responsible journalism means presenting facts, not rumors. It’s easy to fake correspondence that looks authentic, so if you publish a document, it had better be the real deal.

The second attribute is editorial filtering. Once you’re left with true (or at least authentic) documents, what’s newsworthy? There’s an argument that everything should be published so the public can decide for themselves what they think is important. I’m sympathetic to that, but it’s also a little lazy. Journalists should not just be gatherers of information, but they should be curators of it. That means chucking out what’s not important in favor of what is.

Of course, importance is very context-sensitive, but some things are pretty clear. John Podesta’s risotto recipe? Not important (unless there’s a food blog that wants to run with it). The Clinton campaign receiving debate questions in advance? Important. (As an aside, the whole “but her emails” thing overall may prove to be one of the great tragedies of the 21st century. That doesn’t make this particular example unnewsworthy.)

An editorial filter does lend itself to bias, and an even greater perception of bias by those biased in the opposite direction. Nonetheless, most news consumers don’t have time to examine everything and draw their own informed conclusions. Journalists serve the public interest when they collect facts, but also when they curate them.

Objects in the shell: why PowerShell’s design makes sense to me

A while back, a friend said “PowerShell is what happens when you ask a bunch of drunk lizards to make Bash shitty.” Another friend replied that his understanding was that PowerShell is driven by a desire to “modernize” the shell by piping objects instead of strings. Knowing that I got my start as a Unix and Linux sysadmin, you might expect me to take the “it’s Bash, except awful” side. But you’d be wrong.

Full disclosure: I have not used PowerShell in any meaningful sense. But from what I know about it, it represents a real improvement over the traditional Unix shell (of whatever flavor) for certain use cases. Some sysadmins lionize the shell script as the pinnacle of sysadminry. This is in part because it’s what we know and also because it’s hard. Oh sure, writing trivial scripts is easy, but writing good, robust scripts? That can be a challenge.

Shell scripts are a glue language, not a programming language (yes, you can write some really complicated stuff in shell scripts, but really what you’re doing is gluing together other commands). PowerShell, in my view, is closer to a programming language that you can script with. This fits well with the evolution in systems administration. Sysadmins in most environments are expected to be able to do some light programming at a minimum. We’re moving to a world where the API is king. Provisioning and configuring infrastructure is now a very code-heavy exercise.

The object focus of PowerShell is truly a compelling feature. I think about all the times I’ve had to use awk, sed, cut, and others to slice up the output of a command in order to feed selective parts into the next or to re-order the output. A machine-parseable medium like JSON or XML makes programmatic gluing much easier.
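The difference is easy to sketch. Here’s a hypothetical Python example (the data and field names are invented for illustration) contrasting awk/cut-style positional slicing of text output with filtering structured records by field name:

```python
import json

# Hypothetical process listing, as a tool might print it for humans.
text_output = """NAME      CPU
nginx      12
postgres   48"""

# String pipeline: split and slice by position, the awk/cut approach.
# Fragile: a reordered or renamed column silently breaks the parse.
busy = [line.split()[0] for line in text_output.splitlines()[1:]
        if int(line.split()[1]) > 20]
print(busy)  # ['postgres']

# Object pipeline: the same data as structured records (JSON here),
# filtered by field name, the way PowerShell pipes objects.
json_output = '[{"name": "nginx", "cpu": 12}, {"name": "postgres", "cpu": 48}]'
busy = [p["name"] for p in json.loads(json_output) if p["cpu"] > 20]
print(busy)  # ['postgres']
```

The second form keeps working no matter how the human-facing presentation changes, which is the core of the object-piping argument.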

When running interactively in the shell, strings are much easier for humans to deal with. In those cases, convert the objects to strings. But crippling machine output for the sake of humans doesn’t seem productive. At least not when both can get what they need.

If the Unix shell were being designed today, I think it would have some very PowerShell-like features. PowerShell has the advantage of being a latecomer. As such, it can learn from mistakes of the past without being constrained by legacy limitations. Microsoft is serious about making Windows a DevOps-ready platform, as Jeffrey Snover said at LISA 16. To do this requires a break from historical norms, and that’s not always bad.

Maybe your tech conference needs less tech

My friend Ed runs a project called “Open Sourcing Mental Illness”, which seeks to change how the tech industry talks about mental health (to the extent we talk about it at all). Part of the work involves the publication of handbooks developed by mental health professionals, but a big part of it is Ed giving talks at conferences. Last month he shared some feedback on Twitter:

So I got feedback from a conf a while back where I did a keynote. A few people said they felt like it wasn’t right for a tech conf. It was the only keynote. Some felt it wasn’t appropriate for a programming conf. Time could’ve been spent on stuff that’d help career. Tonight a guy from a company that sponsored the conf said one of team members is going to seek help for anxiety about work bc of my talk. That’s why I do it. Maybe it didn’t mean much to you, but there are lots of hurting, scared people who need help. Ones you don’t see.

Cate Huston had similar feedback from a talk she gave in 2016:

the speaker kept talking about useless things like feelings

The tech industry as a whole, and some areas more than others, likes to imagine that it is as cool and rational as the computers it works with. Conferences should be full of pure technology. And yet we bemoan the fact that so many of our community are real jerks to work with.

I have a solution: maybe your tech conference needs less technology. After all, the only reason anyone pays us to do this stuff is because it (theoretically) solves problems for human beings. I’m biased, but I think the USENIX LISA conference does a great job of this. LISA has three core areas: architecture, engineering, and culture. You could look at it this way: designing, implementing, and making it so people will help you the next time around.

Culture is more than just sitting around asking “how does this make you feeeeeeeel?” It includes things like how to avoid burnout and how to train the next generation of practitioners. It also, of course, includes how to not be an insensitive jerk who harms others with no regard for the impact.

I enjoy good technical content, but I find that over the course of a multi-day conference I don’t retain very much of it. For a few brief hours in 2011, I understood SELinux and I was all set to get it going at home and work. Then I attended a dozen other sessions, and by the time I got home, I had forgotten all of the details. My notes helped, but it wasn’t the same. On the other hand, the cultural talks tend to be the ones that stick with me. I might not remember the details, but the general principles are lasting and actionable.

Every conference is different, but I like having one-third of content be not-tech as a general starting point. We’re all humans participating in these communities, and it serves no one to pretend we aren’t.

Weather forecast accuracy is improving

ForecastWatch recently issued a report on the accuracy of weather forecasts from 2010 through June 2016 (PDF here). While many readers will focus on who was more accurate, what stood out to me was how forecast accuracy has improved. Meteorologists have long “enjoyed” a reputation for inaccuracy — often more due to perception than fact. But those in the know are aware that skill is increasing.

Forecast accuracy over time

ForecastWatch’s U.S. analysis shows a clear — if small — improvement in the average accuracy since 2010.

Average U.S. forecast accuracy from 2010 – June 2016.

The chart above shows the average for all of the forecast sources ForecastWatch analyzed. To be frank, World Weather Online is a stinker, and brings the averages down by a considerable margin. Examining the best and worst forecasts shows more interesting results.

Best and worst U.S. forecast accuracy from 2010 – June 2016.

Forecasts get less skillful with longer lead times, thanks to subtle inaccuracies in the initial conditions (see also: butterfly effect). That’s obvious in both graphs. What this second chart shows is that the best 6-9 day forecast is now roughly as skillful as the worst 3-5 day forecast was in 2010. And the best 3-5 day forecast is in the middle of the 1-3 day forecast skill from just a few years ago.

Forecasts are definitely improving. This is due in part to better modeling, both from more powerful computers and from the ability to ingest more data. Research and improved tooling help as well.

Forecasts still bust, of course, and forecasters hate bad forecasts as much as the public does. As I write this, forecasters in North Carolina are dealing with an inaccurate snow forecast (winter weather forecasting sucks due to reasons I explained in a previous post). Missed forecasts can cost money and lives, so it’s good to see a trend of improvement.

Forecast accuracy in your city

The ForecastWatch report breaks down by broad regions: United States, Europe, and Asia/Pacific. But weather is variable on much smaller scales. The ForecastAdvisor tool compares forecasts at the local level, giving you the ability to see who does the best for your city. As of early January 2017, AccuWeather had the most accurate forecasts for Lafayette, Indiana, but it places only fourth when the past year is considered.

My 2016 in review

Well 2016 is over. Looking back on the previous year seems to be the in thing to do around now, and it sure beats coming up with original content, so let’s take a look at the past year.

Between this blog,, and The Next Platform, I published 102 articles in 2016. That doesn’t count blog posts, conference papers, marketing materials, and other things I wrote for work. Writing has certainly become a central part of my life, and I like that.

In 2016, I got to see my articles in print (thanks to the Open Source Yearbook). I started getting paid to contribute (I was even recruited for the role, which is a great stroke to my ego). I presented talks at two conferences and chaired sessions at two others (including one where I was the co-chair of the Invited Talks). My writing has given me the opportunity to meet and befriend some really awesome people. And of course, it has helped raise my own profile.

Blog Fiasco

Blog Fiasco had what is probably its best year in 2016. I was able to keep to my Monday/Friday posting schedule for much of the year. Only in May, when I was traveling extensively, did I have an extended stale period. I only published 78 articles here compared to 99 in 2015, but I have also done more writing outside of this blog. With just over 8,000 views in 2016, traffic is up by about 5%. For contrast, my article on a bill working its way through the New York Senate had more views than all of Blog Fiasco.

Top 10 articles in 2016

These are the top Blog Fiasco articles in 2016:

  1. Solving the CUPS “hpcups failed” error
  2. Reading is a basic tool in the living of a good life
  3. When your HP PSC 1200 All-in-One won’t print
  4. Fedora 24 upgrade
  5. Accessing Taleo from Mac or Linux
  6. A wrinkle with writing your resume in Markdown
  7. elementary misses the point
  8. Hints for using HTCondor’s credd and condor_store_cred
  9. Book review: The Visible Ops Handbook
  10. What do you want in a manager?

Top articles published in 2016

Here are the top 10 Blog Fiasco articles that I published in 2016.

  1. Fedora 24 upgrade
  2. Hints for using HTCondor’s credd and condor_store_cred
  3. What do you want in a manager?
  4. Product review: Divoom AuraBox
  5. A culture of overwork
  6. Disappearing WiFi with rt2800pci
  7. mPING and charging for free labor
  8. What3Words as a password generator
  9. My new year’s resolution
  10. left-pad exposed the real problem

So 2017 then?

I’m pleased to see that a few of my troubleshooting articles have had such a long and healthy life. I’m not sure what it means that the article I published on December 30th was the ninth-most viewed article of the year, but it certainly says something. This blog has never really been for anyone’s benefit but my own, as evidenced by the near-zero effort I’ve put into publicizing articles. In part due to having other, audience-established outlets for writing, Blog Fiasco has become a bit of a dumping ground for opinions and articles that don’t really fit on “real” sites. I’m okay with that.

Will I put more effort into promoting content in 2017? We’ll see. I think I’d rather spend that time writing in places that already have visibility. The monthly “where have I been writing when I haven’t been writing here?” posts will make it easy to find my work that doesn’t end up here.

On a personal note

Outside of my writing, 2016 has been a year. Lots of famous people died. Closer to home, it was a year with a lot of ups and downs. My own perception is that it was more down than up, but I think 2017 is heading in the right direction again. I’ll let you know in early 2018.

Professionally, I’ve changed positions. I left an operations management (but really, operations doing-ment) role to do technical marketing and evangelism. It was an unexpected change, but a hard-to-pass-up opportunity. I don’t regret the decision, except that it has changed what I thought my career trajectory was, and I haven’t yet figured out if I want to curve back that way at some point or if I want to continue down this (or another) path. I know better than to make specific plans, but I take comfort in having a vague target in mind.

And then of course, there’s stuff going on in the world at large. I try to avoid politics on this blog, but I’ll take a moment to say that the next few years are shaping up to be “interesting”. I have a lot of concerns about social and environmental protections that may cease to exist. Nationalist movements in the U.S. and Europe are gaining steam. I know that even if things get as bad as some fear, society will eventually recover (depending on what happens with climate change, “eventually” could be pretty long), but I also know that for some people it will really suck.

Whatever 2017 brings, I wish you health, happiness, and success, dear readers.

Other writing in December 2016

Happy new year! Where have I been writing when I haven’t been writing here?


SysAdvent

Once again, SysAdvent was a great success. The large community that has built up around this project means I do less than in years past. I want to give others the opportunity to get involved, too. This year I edited one article:

The Next Platform

I’m freelancing for The Next Platform as a contributing author. Here are the articles I wrote last month:

Over on, we hit the million page view mark for the third consecutive month. I wrote the articles below.

Cycle Computing

Meanwhile, I wrote or edited a few things for work, too:

  • LISA 16 Cloud HPC BoF — I summarized a BoF session at the LISA Conference in Boston.
  • Various ghost-written pieces. I’ll never tell which ones!

My new year’s resolution

I’m not usually one for making resolutions for the coming year. I know myself well enough to know that my resolve will wane pretty quickly. (I may be lazy, but at least I admit it!) But for 2017, I have decided to make one resolution.

I resolve to read. 

Not to read more books, blogs, magazines, etc., though I would like to do that. My resolution involves what I share. 2016 had many lessons for us, one of which is that it’s far too easy to share something that reinforces our existing views, even if that something happens to be totally false. And even when an article is factually correct, the headline can be way off.

So in 2017, I will not share articles that I have not read. No more sharing based on the headline or the opening paragraph. I can’t independently fact check every article I read, but I’ll do my best to validate claims that seem too wild – or too good – to be true.

Does this mean I won’t share as much? Almost certainly. But it also means that what I share will be higher quality. I’d like to think people read my writing and follow me on Twitter for quality information, not just my stunning good looks and fiery hot takes.

As you consider your 2017 resolutions, I urge you to please join me in adopting this one for your own.

What the IRS taught me about user experience

Way back in 2014, I screwed up my taxes. I filed too early and had to amend them when new information came in. I actually screwed up the refile, too, but since it was a math error, it was caught and corrected automatically. But then some more paperwork came in and I apparently ignored or forgot about it. The paperwork happened to be related to a change in a retirement account that resulted in a tax obligation.

In the summer of 2016, the IRS figured it out and sent me a letter letting me know I owed a not-insignificant sum of money. I sent a reply letter letting them know they incorrectly removed a credit, but that I otherwise agreed to it. So we settled on a number and I mailed them a check.

You do want me to pay you, right?

I want to be clear: the story I tell in this post is of my own making. Had I not screwed my return up in the first place, there would be nothing to say. However, once I started down this path, the user experience added unnecessary delay and frustration.

It started when I mailed the “yes, I agree” form and a check for the full amount due. “This is important mail,” I said to myself, “I should send it Certified Mail.” Normally that makes sense: you get tracking, a signature on delivery, and so on. But in this case, it meant that someone from the IRS had to go sign for the envelope instead of just getting it with the rest of the mail. As a result, three weeks after I had mailed the check, the IRS still had no record of having received it.

(As an aside, this became even more frustrating when the post office failed to scan the envelope, so the tracking information never explicitly said it was picked up. Don’t send Certified Mail to a P.O. Box, kids.)

Since the due date was nearly upon me and the IRS couldn’t say that they had received the check, I put a stop payment order on it and submitted the payment online instead. When making the payment, I selected “Civil Penalty”. The “you owe us money” letter was form “CP2000”, so it would make sense that the “CP” stands for “Civil Penalty”, right? A week or so later, I received a check from the IRS in the amount of my payment.

So I called the IRS again and they said “no, you should have selected payment to your 1040.” I returned the check with instructions to apply it correctly. Another month or so went by and I got another check, this time for a much smaller amount: for some reason, they had adjusted my amount due again, downward this time. Last week I called yet again and verified that I was all sorted out and everything was in order.

Taxes are serious business. Failure to pay is generally frowned upon. Even when you owe due to an honest mistake, it’s a very stressful situation. Getting the runaround only makes it worse. I would expect that most people in a similar situation are there for the first time. The bureaucracy of the IRS is overwhelming, and a lack of clear instructions makes it worse.

Don’t be like the IRS

So why do I tell this story? When designing a product or process, or when documenting it, think about how a nervous first-time user would approach it. If the IRS form had said “if you mail us a check, don’t send it Certified Mail”, that would have shortened the time to resolve this by three months. Similarly, online payment instructions more detailed than “go to this page (which is the same page for a variety of payments)” would have helped.

The IRS has different departments and whatnot for reasons that (probably) make sense internally. I don’t care about them. Similarly, your users don’t care about what makes sense to your organization; they care about what makes sense to them. Presenting a variety of options that don’t make sense to the uninitiated doesn’t help. If you don’t have a monopoly on your market, it will probably mean your customers go elsewhere.

The tradeoffs of Slack for community projects

When my employer adopted Slack, we saw benefit immediately. Conversations are searchable, file sharing is easy, and oh how I ❤ /giphy. It’s a great tool, but I don’t like it for open communities.

Slack was designed to be a company’s internal communication system. For that purpose, it’s great. It was not designed to be an open platform. For example, it is basically impossible for users to manage harassment.

Most people have one employer at a time. That’s not the case for hobby and interest communities. I have five unrelated rooms on Freenode IRC that I’m regularly in. For the most part, I manage that in one place. But each Slack instance I’m in might as well be a separate universe.

That’s not to say Slack is all bad. It is much easier to learn and use than most IRC clients. This is a significant benefit to non-technical communities. Creative Commons, for example, saw a large uptick in community participation after moving to Slack. Slack allows for a richness of community culture to develop in ways that text-only formats don’t.

But for me, particularly with open source communities, the less-than-public nature of Slack teams is a negative. People can’t join the communities they don’t know about. And if they can’t lurk quietly (by reading transcripts or joining the server anonymously), will they feel safe jumping in? There are lock-in considerations as well (my free software readers have probably been waiting for me to get to this point) that I think I’ll address in a later post.

Each community has to decide what is best for them. Like any other technology, Slack has pros and cons. The important thing is to weigh them before making a decision.

Airlines race to the bottom

A race to the bottom is rarely an attractive concept, particularly in a submarine or an airplane. And yet the airline industry seems to be dead set on racing to the bottom. Case in point: United announced the addition of a new “Basic Economy” fare tier. This tier does not permit use of the overhead bins and does not assign seats until the day of departure.

The cynical (and perhaps correct) view is that this is an opportunity to raise prices on tickets people would actually want to buy while keeping the “as low as!” price the same. But it’s also an attempt to compete with budget airlines like Spirit and Frontier, according to an industry source. Being able to match the low fares is “absolutely non-negotiable.”

I don’t have the benefit of seeing the financial models for this, but from an outside perspective, this seems like a bad move. Not all customers are created equal, and it damages your brand to go after the wrong market. Some customers will buy based solely on price, and if that’s who you want to go after, do it. But someone buying solely on price probably won’t be that loyal, so the minute your competitor drops prices, you’ve lost them.

Itemizing everything enables the customer to pay for exactly what they want. It also gives the impression they’re being nickel-and-dimed. It’s much easier to see one price than to add up all the line items. I find it amusing that no-frills carrier Southwest is the holdout for free checked luggage. (As an aside, I’ll probably never fly Frontier again because I find the notion of paying $40 to check a single bag insulting.)

I’m also curious to see how this affects behavior. By adopting checked bag fees, airlines incentivize passengers to push the limits of carry-ons. This slows down the boarding and deplaning process. Will this Basic Economy tier get people to shove everything into their personal item that’s just barely wedged under the seat in front of them? Will it lead to upset customers who didn’t pay attention trying to use an overhead bin they’re not entitled to?

Most likely, we’ll grumble about it and then end up buying the cheapest ticket anyway. That seems to be the pattern, so I suppose it makes sense for airlines to follow the customer. But maybe there’s room for one or two airlines to buck that trend.