The irony of automation

Automation is touted – often rightly – as a savior. It saves time, it saves effort, it saves money, and it saves lives. Except when it doesn’t. A while back, I read a two-part post about how a mistake with an automated pharmacy system led to a 38x overdose. It’s not that the system itself made a mistake, but it enabled the medical professionals to make a mistake that they’d never have made in a pen-and-paper system.

This story has two key lessons. First, modes are dangerous in user interfaces, because they are easy to overlook and can lead to wildly different outcomes. In this story, had the dosage input always required one form (either the total dosage or the dosage per unit of patient weight), the error would never have happened. Allowing either makes it easy to make a lethal mistake. Perhaps a better option would be an optional popup that calculates the total dosage from the per-weight dosage and the patient’s weight. That retains the convenience of being able to prescribe the dosage either way, while making it explicitly obvious which way is being used.
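That mode-free design can be sketched in a few lines. This is purely illustrative (the function name, units, and values are my own invention, not anything from the pharmacy system in the story), but it shows the idea: the order only ever stores one canonical form, and the per-weight path is an explicit, visible conversion step rather than a hidden input mode.

```python
def total_dose_mg(per_kg_dose_mg: float, patient_weight_kg: float) -> float:
    """Convert a per-weight dose into a total dose, explicitly.

    The order form accepts only a total dose; this helper is the one
    sanctioned path from a per-weight prescription to that total, so the
    prescriber always sees which interpretation applies.
    """
    if per_kg_dose_mg <= 0 or patient_weight_kg <= 0:
        raise ValueError("dose and weight must be positive")
    return per_kg_dose_mg * patient_weight_kg

# A 5 mg/kg prescription for a 30 kg patient becomes a 150 mg total order.
order_mg = total_dose_mg(per_kg_dose_mg=5.0, patient_weight_kg=30.0)
```

Because the conversion is a separate step with named arguments, confusing "5 mg total" with "5 mg per kilogram" requires an explicit act rather than an overlooked toggle.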

The second lesson is that it’s important for experts with specialized knowledge to apply that to their use of automation. When something doesn’t seem right, it’s easy to find ways to explain it away, especially if the automation is reliable. But “that doesn’t seem right” must remain a feeling we pay attention to.

Giving up is the better part of valor

There’s a lot to be said for sticking with a problem until it is solved. The tricky part can be knowing when it is solved enough. I’ve been known to chase down an issue until I can explain every aspect of it. The very existence of the problem feels like a personal insult, one that cannot be satisfied as long as the problem insists on existing.

That approach has served me well over the years, even if it can be annoying sometimes. I’ve learned by chasing these problems to their ultimate resolution. Sometimes it even reveals conditions that can be solved before they become problems.

But as with anything, there’s a tradeoff to be made. The time it takes to run down a problem is time not spent elsewhere. It’s a matter of prioritization. Does having it 80% fixed do what you need? Can you just ignore the problem and move on?

A while back, I was trying to get a small script to build in our build system at work. For whatever reason, the build itself would work, but using the resulting executable to upload itself to Amazon S3 failed with an indecipherable (to me at least) Windows library error. It made no sense to me. This was a workflow I had used on a local virtual machine dozens of times. And it worked if I executed the build artifact by hand. Just not in the build script.

I spent probably a few hours working on solutions. But no matter what I tried, I made no progress. When I got to the point where I had exhausted all of the reasonable approaches I could think of, I implemented a workaround and moved on to something else.

It can be hard to know when to give up. Leaving a problem unsolved might come back to bite you later. But what else could you be doing if you’re not spinning your wheels?

Bulk removing invalid emails from Salesforce

Hey, who ever thought they’d see a Salesforce how-to on this blog? One of the things I do at work is put together our newsletter and send it to our leads and contacts. But our list of contacts has built up over the years and some of the email addresses are no longer valid. So here (largely for my own reference later) are the steps I use to clear out the invalid emails. There may be an easier way; this is just what I’ve figured out.

  1. Export bounced emails from Constant Contact
  2. Put bounces in a spreadsheet tab called “CC”
  3. Export your Leads from Salesforce (just get the ID and email fields)
  4. Put the Leads into another tab on the spreadsheet
  5. Use this formula to identify addresses that need to be removed (because they exist on the CC tab):
    =IF(ISERROR(VLOOKUP(B2,CC!$E$2:$E$2000,1,FALSE)),"","REMOVE")
  6. Sort by Column C, Z->A
  7. Copy the LeadID column for anything with REMOVE into a new sheet
  8. Add a second column header called “Email”
  9. Save as CSV
  10. Open the Salesforce DataLoader
  11. Select Update
  12. Login
  13. Select Lead or Contact as appropriate and add your CSV file
  14. Create the field mapping
  15. Run that puppy
  16. (repeat steps 3-15 for Contacts)
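The cross-reference in steps 5–7 can also be sketched as a small script. This is just an illustration of the matching logic (the pair format, example IDs, and case-insensitive comparison are my assumptions), not a replacement for the DataLoader workflow:

```python
def leads_to_clear(leads, bounced_emails):
    """Return the IDs of leads whose email appears in the bounce list.

    leads: iterable of (lead_id, email) pairs exported from Salesforce.
    bounced_emails: addresses exported from Constant Contact.
    Comparison is case-insensitive, mirroring the spreadsheet lookup.
    """
    bounced = {e.strip().lower() for e in bounced_emails}
    return [lead_id for lead_id, email in leads
            if email.strip().lower() in bounced]

# Pairing each returned ID with a blank "Email" column in the update CSV
# is what clears the field when the DataLoader runs.
ids = leads_to_clear(
    leads=[("00Q1", "old@example.com"), ("00Q2", "good@example.com")],
    bounced_emails=["Old@Example.com"],
)
# ids == ["00Q1"]
```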

The cloud is more than just someone else’s computer

“The cloud is just someone else’s computer” is a common phrase in tech circles. An otherwise excellent article last week opened with this line: “A personal web server is ‘the cloud,’ except you own and control it as opposed to a large corporation.” Let me be unambiguous here: that’s bullshit.

The context of the “someone else’s computer” saying is generally one of data ownership. Why let someone else own your data when you can own it yourself? I’m sympathetic to that point, but it glosses over a very questionable assumption. Namely, that people have the skills and desire to run the services themselves. That may be true in the tech sector, but it’s certainly not going to be true in the population at large.

What’s even more frustrating is the comparison of a Raspberry Pi to a multi-replica distributed environment. A Raspberry Pi has no redundancy, so if a component fails, you’re out of luck until you can replace it. If your house floods, sorry about your data. Granted, you can address these issues yourself by having redundant hardware and an offsite copy, but the effort goes up dramatically with each layer of protection you build in. Maybe it’s worth the effort to you. And maybe you have the skills necessary to do it. Good for you.

It’s absolutely a good thing to make sure people are aware of the costs and benefits of any technology solution. But one of the benefits of cloud offerings is that some portion of the stack is maintained by competent professionals who can aggregate the demands of individual customers to build a pretty robust and reliable offering. You know why it’s big news when Amazon Web Services has a major outage? 1. Because it’s rare. 2. Because their services are good enough that a lot of people have said “it doesn’t make sense for us to do this ourselves.”

I liken “the cloud is just someone else’s computer” to saying “the grocery store is just someone else’s farm”.

Book review: The Dance of the Possible

Few categories of book have the potential to be as obnoxious as the “how to be creative” genre. Scott Berkun’s The Dance of the Possible fails to live up to its potential in that regard. In fact, it’s not really obnoxious at all. Instead it’s filled with a humorous approach to treating creativity as a skill to be honed instead of a magical epiphany bestowed from some mysterious muse.

Berkun goes out of his way to avoid giving an easy solution to being creative, and he’s very dismissive of creativity as an end goal. The point of being creative is to create something, and creativity as a virtue is a relatively recent development. Explore the possibilities of choices in mundane situations, he suggests. Somehow, this ended up with me writing on my sock with a permanent marker at 12:30 AM.


My oppressive pseudo-creative project. It will make sense if you read the book, I promise.

Any sort of creative work, even this very book review, is a delicate dance between two opposing forces: expanding what is possible for the project and contracting the scope so that it actually gets done. I can deconstruct the message of the chapters and reassemble them in any way I want, mixing them into something new. And I can shuffle these ideas around forever, but at some point the review must be published, or else what good has it done you?

One aspect of the book that I particularly liked was Berkun’s focus on some of the mental issues involved in trying to develop a creative work. In chapter 12, he talks about “the tightrope of creative confidence”: being confident enough to act, but not too confident. I prefer to think of it as “the eternal struggle between the Dunning-Kruger Effect and Impostor Syndrome”, but regardless of the name it’s a balancing act I know well.

All in all, The Dance of the Possible was a quick read. Indeed, the very first note I wrote down was “he seems to insist we not read the book.” This is not a book designed for Scott Berkun to wax poetic for chapters on end. Instead it shares real, actionable advice for exercising the thinking muscle. I enjoyed this book and found the framing of the problem and solution to be very helpful in understanding my own thought process. I can’t say that I came away with any sudden, brilliant insight, but maybe that’s the point.


The Dance of the Possible is published by Berkun Media. It goes on sale March 15, 2017. The author provided a review copy for this post.

Purdue Boilermakers: Big Ten champions

The men’s basketball season ended for Purdue last night, with a close victory in Evanston against the Northwestern Wildcats. But in a sense, that game did not matter. No matter the outcome, Northwestern is likely to make the NCAA tournament for the first time in school history. More importantly (to me), Purdue had already secured the outright conference title. Purdue now has 23 Big Ten titles to its name, reclaiming sole possession of the lead after Indiana tied it up last year.

Speaking of Indiana, it was against the hated in-state rivals that the Boilermakers clinched a share of the title. To be able to secure a trophy at home, on senior night, against a bitter rival? That was a special treat for team and fans alike. When the final horn sounded, confetti burst from the ceiling and the trophy was presented to the team.

Confetti rains down after Purdue defeats Indiana and claims a share of the Big Ten title. February 28, 2017

The Purdue men’s basketball team celebrates with their trophy.

Earlier in the season, it seemed like Wisconsin had the title all but locked up. A few head-scratching losses by Purdue made the title seem out of reach. But Wisconsin was a paper tiger.

Despite holding the conference title record, Purdue had not won the title outright in 21 years (and had not won even a share in seven). Promising seasons in the early part of this decade were cut short by injury, or by underperformance, or by who knows what. A string of consecutive first-round wins in the NCAA tournament came to an end with heartbreaking losses in consecutive years. Purdue fans were hungry, so being able to celebrate a season that seemed destined for failure felt really good.

Up next, we hope, deep runs in the Big Ten tournament and the NCAA tournament.

Assorted site updates, plus a newsletter

Have you noticed a cool new logo on Funnel Fiasco lately? Thanks to Susan at Sumy Designs, I have a professional-looking logo instead of the hot garbage I did in MS Paint 10 years ago. I’m pretty pleased by it. To celebrate, I’m also announcing my newsletter. Newsletter Fiasco is my attempt to be like all of my cool friends who have a newsletter. I’ll be sending it out once a week or so with links to stuff I’ve written, stuff I’ve liked, and some thoughts about whatever it is I’m thinking about.

In addition to the logo, there have been a few other changes around the site as well.

Other writing in February 2017

Where have I been writing when I haven’t been writing here?

The Next Platform

I’m freelancing for The Next Platform as a contributing author. Here are the articles I wrote last month:

We managed our 5th consecutive million-page-view month, despite the short month. I wrote the articles below.

Also, the 2016 Open Source Yearbook is now available. You can get a free PDF download now or buy the print version at cost. Or you can do both!

Cycle Computing

Meanwhile, I wrote or edited a few things for work, too:

  • HyperXite case study – The HyperXite team used CycleCloud software to run simulations for their hyperloop pod.
  • ALS research case study – A professor at the University of Arizona quickly simulated a million compounds as part of a search for a pharmacological treatment for Lou Gehrig’s disease.
  • Transforming enterprise workloads – A brief look at how some of our customers transform their businesses by using cloud computing.
  • LAMMPS scaling on Microsoft Azure – My coworkers did some benchmarking of the InfiniBand interconnect on Microsoft Azure. I just wrote about it.
  • Various ghost-written pieces. I’ll never tell which ones!

Vulnerabilities in TSA PreCheck

Back in December, Bruce Schneier wrote about vulnerabilities in TSA PreCheck. His article leaned heavily on quotes from a former TSA administrator who felt that the PreCheck program should have stricter screening requirements. Schneier agreed with the vulnerability assessment, but drew the opposite conclusion. Since it has not been defeated, he argues, all screening should be reduced to that level.

That will never work.

I agree that the level of at-airport screening that PreCheck members get is sufficient. The question isn’t one of security, it’s of security theater. Having said that the current level of screening is needed, how can TSA officials say “nah, we don’t need that any more” and expect to keep their jobs? After all, terrorism is not a solved problem.

But this is also the longest stretch without a hijacking of a domestic-origin flight since the 1960s. As Schneier points out, if there were terrorists who wanted to attack an airliner, you’d think they’d have figured out a way through by now. Considering the abysmal detection rate in internal tests, that’s not a stretch.

As I wrote on that blog post, the TSA has made for good theater. When I travel with PreCheck, I get essentially the same screening process that I did pre-9/11. But now it feels special. I get shorter lines and less hassle. But the lines don’t have to be long, and the hassle doesn’t have to be there.

Yes, TSA PreCheck has vulnerabilities. All systems do. One could argue that since it cannot be made fully secure, it should be retired. But really, it should become the standard.

Putting the “F” in “FCC”

Ars Technica reported earlier this month that Comcast is bringing an app to Roku. Cool! Now people who want to use their Roku instead of a set-top box for cable can do that. Here’s the trick: once it exits “beta”, Comcast will charge users an outlet fee — essentially treating it the same as an additional set-top box.

What Comcast is doing, then, is charging its customers for the privilege of watching the content they already pay for. I can understand their reasoning: it could lead to additional simultaneous viewings, which means more bandwidth. But given the cable industry’s history of unfriendliness to the consumer, I’m not inclined to be sympathetic. Furthermore, given the trend toward cord-cutting, it seems to be in the cable providers’ best interests not to alienate an increasingly uninterested customer base.

Former Federal Communications Commission (FCC) Chairman Tom Wheeler favored a rule that would require cable providers to make such an app available for free. It did not pass, and the new chairman, Ajit Pai, has no interest in pursuing it. Many in the tech community worried when Wheeler came on board (he had been a cable industry lobbyist), but he turned out pretty well. Pai was a Verizon lawyer before joining the FCC in 2012, and I have less hope of him becoming a consumer advocate.

Pai opposes net neutrality, a philosophy that has been foundational to the Internet. Deregulation of an oligopoly, which the ISP market unquestionably is, will spur entrenchment, not innovation. The FCC will likely become much more favorable to industry than to consumers, and that is a real disappointment.