America has always been aspirational

A few nights after the election, I was at a basketball game. At the conclusion of “The Star-Spangled Banner,” my buddy said “still the best country in the world.” “For a few more months, at least,” I quipped. I’m not convinced that America is the “best” country, if for no other reason than I don’t know what “best” means.

What I am convinced of is that America has never been what we claim it to be. America is not great; it is an aspiration.

From the very beginning, we have failed to live up to the story we tell about ourselves. Virginia’s House of Burgesses sat for the first time in the same year that the first enslaved Africans arrived in the colony. Thomas Jefferson may have been one of the great philosophers of human rights, but he did not act on his philosophy.

As our ancestors forced Africans from their homes and sold them into slavery an ocean away, they also pushed out Indigenous peoples by force and by treaty after treaty, each broken and replaced by another that would be broken in turn. It’s hardly a stretch to say that the British and Americans pursued a policy of Lebensraum centuries before Adolf Hitler gave it that name.

As late as 1840, the “antislavery” North still had around 1,000 enslaved people. And while the Civil War may have ended legal slavery, that wasn’t its goal. Lincoln was more concerned with preserving the Union than with freeing an enslaved people. For almost a century more, segregation was legal, voter suppression was rampant, and racism ruled policy. The effects of those policies are still visible today.

The target of our racism has shifted over the years. For a time, southern Europeans were the lesser “other”. Then it was East Asians. The U.S. built concentration camps for Japanese Americans in World War II but had no similar facilities for Germans or Italians. The Supreme Court upheld the legality of those camps in Korematsu v. United States, one of the all-time worst decisions to come from that body.

We tell ourselves that America is a land where anyone can go from rags to riches. While some do achieve that level of class mobility, it’s not true for everyone. As far back as 1770, the top 1% of Bostonians owned 44% of the wealth. Wealth disparity has only continued to grow in my lifetime. The educational outcomes of school districts remain best correlated with the income of each district’s residents.

We have done much of what we accused the bad guys of. Sometimes to a lesser degree, sometimes not. So as the worst person to occupy the White House returns today, I will remind myself that the work never ends. The poem “Let America Be America Again” by Langston Hughes captures the sentiment far more eloquently than I could.

Why SemBr doesn’t work for me

There’s one problem with prose stored in version control systems: diffs are line-based. Longer lines mean bigger diffs, which can make changes hard to review. One approach, and the one I take, is to put each sentence on its own line. This works pretty well, but there’s a better approach: semantic line breaks (SemBr). In SemBr, line breaks separate sentences into logical pieces.
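
As an illustration, here’s one sentence stored both ways (the sample sentence is mine, not from the SemBr spec):

    One sentence per line:
    We read the documentation, filed an issue, and waited patiently for a reply.

    Semantic line breaks:
    We read the documentation,
    filed an issue,
    and waited patiently for a reply.

Change “patiently” to “impatiently” and the first style marks the entire sentence as modified, while SemBr marks only the one short line that actually changed.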

The problem with SemBr is that I struggle to make my brain do it. It took me a while to figure out why, despite understanding the benefits. The answer turned out to be simple: I overthink it. When I write, I think about the meaning of words. When I add thinking about line breaks on top of that, the cognitive load goes way up.

Sentences (and character counts) are unambiguous. Everyone who uses the same style will end up with the same line breaks. Semantic chunks can be more ambiguous, so if you’re working with others, there’s a (self-imposed, no doubt) pressure to get the breaks Right™.

My friend Matthew offered this succinct summary:

put things that are likely to change on their own lines.

Like
URLs,
dates,
and
list items.

So maybe I’ll give it a try again. And if not, at least I can explain why not.

What I told my kids this morning

It’s more important than ever that we work very hard to be the people we want to be. We must be kind and loving. We must care for ourselves, each other, and those around us who need our help.

Things are not okay. I cannot promise that they will be okay. What I can promise is that we will do our best to love and support you no matter what goes on in the world around us.

Happy birthday, BASIC!

Today is apparently the 60th birthday of the BASIC programming language. It’s been nearly a quarter of a century since I last wrote anything in BASIC, but it’s not unreasonable to say it’s part of why I am where I am today.

When I was in elementary school, my uncle gave us a laptop that he had used. I’d used computers in school — primarily the Apple II — but this was the first time we’d had a computer in the house. Weighing in at 12 pounds, the Epson Equity LT was better suited for the coffee table than the lap, but it was a computer, damn it! In a time when we didn’t have much money, we could still afford the occasional $5 game on a 3.5″ floppy from Target. (I still play Sub Battle Simulator sometimes!)

But what really set me down my winding path to the present was when my uncle taught me how to write programs in GW-BASIC. We started out with a few simple programs. One took your age in Earth years and converted it to your age on the other planets in the solar system. Another did the same with your weight. I learned a little bit about loops and conditionals, too.
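
The planet program amounted to a loop and some division. Here’s a minimal Python sketch of the same idea (a reconstruction, not the original BASIC, with rounded orbital periods):

    # A Python reconstruction of the idea; the original was GW-BASIC.
    # Orbital periods are in Earth years, rounded.
    PERIODS = {
        "Mercury": 0.24,
        "Venus": 0.62,
        "Mars": 1.88,
        "Jupiter": 11.86,
        "Saturn": 29.46,
        "Uranus": 84.01,
        "Neptune": 164.8,
    }

    age = float(input("How old are you in Earth years? "))
    for planet, period in PERIODS.items():
        print(f"On {planet}, you would be {age / period:.1f} years old.")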

Eventually, I started playing around in QBasic, learning to edit existing programs and write new ones. I remember writing a hearing test program that played tones of increasing pitch through the PC speaker. After using Azile at my friend’s house, I wrote my own chat program. From some manuals my uncle had left us, I learned how to make it play musical notes.
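
The hearing test was little more than a loop over rising frequencies. A rough Python equivalent (the original presumably used QBasic’s SOUND statement; winsound is in the standard library but only works on Windows):

    # A rough Python equivalent of the QBasic hearing test. Windows-only.
    import winsound

    # Step from 200 Hz toward 16 kHz; stop the program when the tones become inaudible.
    for freq in range(200, 16001, 200):
        print(f"Playing {freq} Hz")
        winsound.Beep(freq, 400)  # each tone lasts 400 milliseconds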

I didn’t really know what I was doing, but I learned through trial and error. That skill has carried me through my entire career. At 41, I have a mostly-successful career that’s paid me well primarily due to networking, privilege, and luck. But I also owe something to the skills I learned writing really shitty BASIC code as a tween and teen.

Barbenheimer

Over the weekend, I took part in the Barbenheimer Experience. We saw “Barbie” and — after a break to feed my sister’s dogs and also myself — “Oppenheimer”. I’ll be honest: I mostly did it because it felt like a silly Internet thing to do. But I’m glad I did it.

Barbie

Not since “Citizen Kane” has a movie about a beloved childhood possession made such good Art™. I wasn’t prepared for how much I enjoyed it. It was fun in a silly, self-aware way. Credit to the folks at Mattel who approved this, because it addresses some of Barbie’s problems.

It’s not just a fun movie, though. The movie addresses serious themes, sometimes satirically and sometimes earnestly. The message gets a little ham-handed in a few spots, but the movie quickly reels it back in. Overall, it provokes thought in a fun way.

One thought it provoked in me: how many times did they have to shoot the “beach off” scene before they got a usable take?

Oppenheimer

“Oppenheimer” is not a fun movie, but it was interesting. I didn’t know much about Robert Oppenheimer before the movie, and I’m not sure how much I can claim to know now. While not fawning, the movie’s portrayal of Oppenheimer is complimentary. It doesn’t ignore his personal failings, but it also doesn’t explore them. They are just facts in the story.

I spent the rest of the evening thinking about atomic weapons. Truman’s decision to drop atomic bombs on Japan may be the ultimate Trolley Problem. An American invasion of mainland Japan would have cost many military and civilian lives. But that didn’t happen. The death of a few hundred thousand civilians did happen. No matter what the outcome of the road not traveled, we can’t ignore what did happen.

Was Oppenheimer’s opposition to Teller’s hydrogen bomb principled or was it petty? In either case, was it hypocritical? Was it ethical? What lessons should we take for the things we invent today?

Barbenheimer

Both movies are about the end of the world as the characters know it. Both grapple with what that means for the future. They are very different movies, but they complement each other quite nicely. They’re good on their own, but I’m glad I saw them together.

Booth Tarkington on “The Golden Bachelor”

This weekend, it came to my attention that ABC is making a change in its long-running dating show The Bachelor. A 71-year-old man will be the first “golden bachelor” in the upcoming 28th season. I don’t have much of an opinion on the show generally or the new season particularly, but I couldn’t help but think of Booth Tarkington’s Pulitzer Prize-winning The Magnificent Ambersons.

Youth cannot imagine romance apart from youth. That is why the roles of the heroes and heroines of plays are given by the managers to the most youthful actors they can find among the competent. Both middle-aged people and young people enjoy a play about young lovers; but only middle-aged people will tolerate a play about middle-aged lovers; young people will not come to see such a play, because, for them, middle-aged lovers are a joke—not a very funny one. Therefore, to bring both the middle-aged people and the young people into his house, the manager makes his romance as young as he can. Youth will indeed be served, and its profound instinct is to be not only scornfully amused but vaguely angered by middle-age romance.

Booth Tarkington, The Magnificent Ambersons

In an interesting coincidence, both Tarkington and the “Golden Bachelor” are from Indiana. At any rate, I suppose ratings will tell if Tarkington was right or not.

Book review: If Nietzsche Were a Narwhal

I recently read Justin Gregg’s If Nietzsche Were a Narwhal: What Animal Intelligence Reveals About Human Stupidity. As humans, we tend to assume that our intelligence sets us apart and that our exceptional cognitive abilities are good. There’s no doubt that we’re exceptional, but it’s not clear that we’re good. As Gregg wrote:

Our many intellectual accomplishments are currently on track to produce our own extinction, which is exactly how evolution gets rid of adaptations that suck.

Unique among Earth’s animals, humans have bent our environment to our will. This, of course, has resulted in some undesirable side effects. Despite all of our supposed advancement, we are biologically predisposed to prioritize immediate needs over long-term ones. We benefit from burning fossil fuels now and assume that we’ll be able to deal with the long-term impacts later. But will we?

Gregg studies animal cognition, so this book is steeped in facts. Indeed, the reader will probably learn more about animals than people. And after reaching the end, the reader may find it hard to disagree with Gregg’s assertion that Nietzsche — and the rest of the species — would have been happier as a narwhal.

Evolution has many dead ends. It could be that what makes us special actually makes us less happy. Humans have existed for a relatively short time on Earth, so it’s folly to assume that our unique adaptations aren’t maladaptive. It reminds me of the joke where an angel is talking to God about creating humans and says “you’ve ruined a perfectly good monkey. Look, it has anxiety!”

I didn’t come away from this book convinced that human cognition is a bad thing on balance. But as a philosophical starting point, I see a case for Gregg’s argument that “human intelligence may just be the stupidest thing that ever happened.”

Write a tweet before you write a book chapter

When my publisher’s very smart and talented publicist suggested I post about my book weekly for four weeks, I decided to do one better. Or nine better, really. “I’ll do one post about each chapter,” I confidently said to my kanban board. It turned out to be great advice that I wish Past Ben had received.

I don’t know how much of an effect it had on sales. The feedback loop there is far too long. But even if I’ve tapped out the buying (and sharing) power of my network, the thought process is useful. I wish I had done it before I started writing. If you can’t explain a chapter’s value in 280 characters, is it worth including?

When you’re writing a non-fiction book, you’re in a bit of a race against time. Particularly in tech, the longer it takes you to write the book, the more likely it is that the earliest content is out of date. One of the ways to keep the writing time low is to not include material that doesn’t matter. If you can concisely express why a chapter (or section, even) matters, it’s probably good to include it. If not, you either need to cut it or think a little harder about why it’s important.

One suggestion my editor gave me early in the process was to state the problem that each section solves. This was mostly for the reader’s benefit: it told them why they should care about a particular section. But it also made me think about why the section should be included. More than once, I cut or reworked a planned section because I couldn’t clearly express a meaningful problem.

Plagiarism in music

Last week I read an LA Times article about allegations of plagiarism leveled at Olivia Rodrigo. Rodrigo is a very talented artist (“good 4 u” gets stuck in my head for days at a time), but is she a thief? I haven’t heard the songs mentioned in the article, so I can’t say in this specific case.

But in the general sense, my bar for “plagiarism” in music is pretty high. The entirety of popular music is based on artists incorporating, to varying degrees, things they’ve heard before. Rob Paravonian’s Pachelbel rant is a great demonstration. I’ll grant that “Canon in D” has long since entered the public domain. But imagine if musicians had to wait a century to reuse a few bars of music.

My personal view—which may or may not match copyright law—is that unless a song takes audience from the earlier song or artist, it’s fine. This is similar to one of the factors in fair use. As a concrete example, Vanilla Ice’s “Ice Ice Baby” definitely takes from Queen & David Bowie’s “Under Pressure”. And that’s fine. The existence of “Ice Ice Baby” hasn’t stopped anyone from listening to “Under Pressure”.

Cultural works, particularly in music and Internet discourse, rely inextricably on remixing. We should embrace a very permissive paradigm.

Seek first to understand

One of the lessons that I’ve had to repeatedly re-learn over my career is “understand the problem before you fix it.” My instinct is to fix a problem as quickly as I can. Speed is a laudable goal, but a fix made without understanding may not actually fix the problem. And it may not prevent future occurrences. If you’re particularly unlucky, it will make the problem worse.

I re-learned this lesson late last week. On Thursday, someone reported raw HTML appearing on translated pages of some Fedora documentation. “Oh! It was probably that PR I merged yesterday,” I thought. So I reverted it.

Then I started digging into it some more, and I realized that it probably wasn’t that change at all. In fact, the change worked locally and on the staging server; it was only broken on the production server. It’s not clear to me whether staging and production sync the translation data on the same schedule (without getting too sidetracked: the staging environment isn’t really a staging environment, and it needs a better name). But I became convinced that the problem wasn’t in the docs infrastructure but in the translations. So I reverted my reversion.
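
In git terms, the dance looked something like this (placeholders instead of the real commits, and assuming the PR landed as a merge commit):

    # Revert the suspect merge; -m 1 keeps the first (mainline) parent.
    git revert -m 1 <merge-commit>

    # After concluding the change wasn't at fault, revert the revert.
    git revert <revert-commit>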

This is not the first time I’ve jumped in to fix something before taking a look around to see what was going on. Unfortunately, it probably won’t be the last.

Here’s the thing: most of the time, a slight delay doesn’t matter. No one’s safety was at risk. We weren’t losing hundreds of thousands of dollars a minute. There would have been no real harm in spending 10 minutes to figure out what was going on. Perhaps I could have tried to reproduce it first. After all, if you can’t reproduce the error, how do you know you’ve fixed it?

Hopefully the next time I go to fix a problem, I’ll understand the problem first. As astronauts do, I need to work the problem.