A shift in my social media habits

Amazingly, it’s been slightly less than a month since Mark Zuckerberg decided that hate speech is good and facts are bad. As you may recall, that decision led me to create a Bluesky account. It also led me to dramatically reduce my Facebook and Instagram use, although the latter was pretty sporadic to begin with. In that time I’ve noticed a definite shift in how I use social media.

For one, I’m just on it a lot less. It doesn’t take long for me to get caught up on Bluesky and Mastodon. I follow almost no one on Pixelfed so far, so that’s quick, too. This leaves me with a lot of time to do other things. For example, I successfully completed the “read every day in January” challenge on The Story Graph for the first time this year. There were a few days where I’d just do a couple of pages or a few minutes of an audio book in order to check the box, but most days I read for an hour or so. I’m also writing more, as evidenced by the number of posts on this blog lately.

Twitter always felt like the most natural platform for me, since it favors short shitposts. My brain makes so many of those. For some reason (perhaps because my network was mostly people I knew through my professional work), I never felt as comfortable doing that on Mastodon. But on Bluesky, it’s like the good old days.

Surprisingly to me, I’ve cross-posted a lot more than I thought I would, thanks to Openvibe. I’m normally opposed to cross-posting, but since Mastodon and Bluesky are largely the same format, I guess my aversion is lessened. Some posts still go to just one platform or the other, but I really expected myself to always pick a single destination for each post. I’m still learning about myself at the age of 41.

I haven’t completely abandoned Facebook (and you are not invited to argue why I should), but I only check it briefly every few days. Years ago, I had gotten my usage to near zero, but then Twitter went to hell and Facebook had the largest share of my People™. Surprisingly, simply removing the app shortcut from my home screen has kept me from opening it out of boredom. Now when I go to Facebook, it’s because I’m actively choosing to check in.

Partly as a side effect of the smaller networks on the platforms I use and partly because of intentional choices, I find myself doomscrolling less. I’m following a lot fewer journalists and Online Political Opinion Havers than I did in the old days. I have enough ways of finding out what new terrors appear every day that I don’t need to immerse myself in them. That seems to have helped my mental state quite a bit (duh, right?).

By the way, did you know I have a weekly-ish newsletter? Subscribe if you’d like. If you have one, let me know and I’ll subscribe to yours.

Would Teenage Ben in 2024 be a Christian Nationalist?

My friend Renee recently wrote that if her 20-year-old self were here today, she’d be a Christian Nationalist. It got me thinking about the political and religious beliefs of Past Ben. In high school, I was very conservative. My U.S. history teacher gave us a political spectrum quiz at one point and I scored to the right of Reagan. The highlight of my school breaks was being able to catch Rush Limbaugh on the radio.

It’s an understatement to say I’m not like that now. I can’t pinpoint when I moved left. It was a gradual process, throughout my early 20s in particular, but continuing even now into my early 40s. Past Ben would certainly have blamed this on liberal indoctrination in college, but I couldn’t begin to tell you the political beliefs of most of my professors. The one professor whose political views I did know was, in fact, a socialist. He told us so at one point, but I wouldn’t say there was a particularly socialist bent to the class. I don’t really remember much of anything about it, other than his imitation of his Irish grandfather saying “It’ll be Tammany Hall or no hall at all.”

So I don’t think Professor Hogan had much to do with it. But after reading Renee’s post and thinking about Past Ben, I recalled what might have been the first step in becoming more self-reflective about my politics. I remember at one point in high school (I think) drawing a bunch of sketches of politicians in a grid. They weren’t particularly accurate renderings — I was going for clownishness, not realism. Each of them was labeled with some prominent Democratic politician of the time. They had speech bubbles saying some silly thing or another. But at the end was Rush Limbaugh, and his speech bubble said “I am the truth!”

That gave me pause. I’m not sure if Limbaugh ever said that specifically, but it was certainly plausible to me. I thought “wait. That’s a statement only Jesus can make, and Rush Limbaugh is not Jesus.” Nothing changed for me that day, I think, but it opened the door for more critical thought.

As most kids who have any interest in politics do, I followed my parents. Or at least my dad. Mom has always been quieter about her politics. It wasn’t until I was an adult on my own that I started to examine my views in terms of “based on what I value, here are the positions and candidates I support” instead of “well I’m a conservative, so of course I’m in favor of such and such.”

Like 20-year-old Renee, I’d like to think that Teenage Ben would find Trump repellent and unqualified to be president, no matter what he thought of the policies. But I’d probably have found Elon Musk hilarious in a douchey edgelord sort of way. Would I have been a Christian Nationalist if I were a teenager today? It’s hard to say. I can’t remember ever having a desire to explicitly make my religion the dominant one. I had no desire to talk about my beliefs to anyone who wasn’t interested in them. Whatever else I may have wanted to promote politically, I believed in the promises of equality and freedom that the US was notionally founded on. So maybe I would have avoided that path. I’m glad I don’t have the opportunity to find out.

Listening to vinyl

When my grandmother entered a memory care facility a few years ago, I drove down to clean out the condo she had lived in for the previous three decades. One room was basically a dumping ground for things she had brought with her from New York and then never touched again. In that room was an old stereo with AM/FM radio, an 8-track deck, and a turntable. I brought that, along with her records, back home with me.

To my dismay, I couldn’t get the system to make music happen. After a little bit of tinkering, I decided to junk it and just buy a working turntable. (I settled on the Audio-Technica AT-LP60X, in case you’re wondering.) Pretty quickly, I started listening to a lot of records. Which meant I also started hitting up flea markets to expand my catalog.

As my friend Lyz wrote,

I am not an audiophile, so I never really understood the recent rise of record player popularity. Day to day I’m perfectly happy to stream music through the tiny Bluetooth speakers that float around our house. It wasn’t until recently when I started seeing the value of slowing down and appreciating the warm, physical sound of a record. The discourse around this thread of thinking tends to be that we’re all running around living this fast-paced life, so we’re losing some of what is so beautiful about life. Mindfulness and other slowing down practices are bringing us back to enjoying the present, and this is right where the record player comes in. You slow down, pull out this giant piece of media from a beautiful sleeve, and hear the scratchy of the edge of a record before settling in. It turns hitting play on your phone into a ritual, one that I really like.

I also have come to appreciate the album itself as an art form, not merely a collection of songs. A well-crafted album can take the listener on a journey.

I also discovered another, more practical benefit: listening to a vinyl record is a great time box. When I was writing Program Management for Open Source Projects, I would put a record on and write until the side ended. Having to get up to flip or replace the record gave me a good mental break and also got me moving.

America has always been aspirational

A few nights after the election, I was at a basketball game. At the conclusion of “The Star-Spangled Banner,” my buddy said “still the best country in the world.” “For a few more months, at least,” I quipped. I’m not convinced that America is the “best” country, if for no other reason than I don’t know what “best” means.

What I am convinced of is that America has never been what we claim it to be. America is not great, it is an aspiration.

From the very beginning, we have failed to live up to the story we tell about ourselves. Virginia’s House of Burgesses sat for the first time in the same year that the first enslaved Africans arrived in the colony. Thomas Jefferson, while perhaps one of the greatest philosophers on human rights, did not act on his philosophy.

As our ancestors forced Africans from their homes and sold them into slavery an ocean away, they also pushed out the indigenous people by force and by treaty after treaty that would be broken, each replaced by another treaty that would also be broken. It’s hardly a stretch to say that, centuries before Adolf Hitler gave it that name, the British and Americans pursued a policy of lebensraum.

As late as 1840, the “antislavery” North still had 1,000 enslaved people. And while the Civil War may have ended legal slavery, that wasn’t the goal. Lincoln was more concerned with preserving the Union than freeing an enslaved people. For almost a century more, segregation was legal, voter suppression was rampant, and racism ruled policy. The effects of these policies are still visible today.

The target of our racism has shifted over the years. For a time, southern Europeans were the lesser “other”. Then East Asians. The U.S. built concentration camps for the Japanese in World War II, but had no similar facilities for Germans or Italians. The Supreme Court upheld the legality of these camps in one of the all-time worst decisions to come from that body.

We tell ourselves that America is a land where anyone can go from rags to riches. While some do achieve that level of class mobility, it’s not true for everyone. As far back as 1770, the wealthiest 1% of Bostonians owned 44% of the wealth. Wealth disparity has only continued to grow in my lifetime. The educational outcomes of school districts remain best correlated with the income of the districts’ residents.

We have done much of what we accused the bad guys of. Sometimes to a lesser degree, sometimes not. So as the worst person to occupy the White House returns today, I will remind myself that the work never ends. The poem “Let America Be America Again” by Langston Hughes captures the sentiment far more eloquently than I could.

Why SemBr doesn’t work for me

There’s one problem with prose stored in version control systems: line breaks. Longer lines mean bigger diffs, which can make it hard to review changes. One approach, and the one I take, is to put each sentence on a line. This works pretty well, but there’s a better approach: semantic line breaks (SemBr). In SemBr, line breaks separate sentences into logical pieces.
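To make the difference concrete, here’s the same sentence stored under each convention (the SemBr break points below are my own illustration, not a canonical rendering):

```text
One sentence per line:

Longer lines mean bigger diffs, which can make it hard to review changes.

Semantic line breaks:

Longer lines mean bigger diffs,
which can make it hard
to review changes.
```

With SemBr, rewording a single clause touches only one short line, so the diff stays small — and the rendered output is unchanged, since Markdown and most markup languages collapse single newlines into spaces.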

The problem with SemBr is that I struggle to make my brain do it. It took me a while to figure out why, since I understand the benefits, but the answer turned out to be simple: I overthink it. When I write, I think about the meaning of words. Adding decisions about line breaks on top of that drives the cognitive load way up.

Sentences (and character counts) are unambiguous. Everyone who uses the same style will end up with the same line breaks. Semantic chunks can be more ambiguous, so if you’re working with others, there’s a (self-imposed, no doubt) pressure to get the breaks Right™.

My friend Matthew offered this succinct summary:

put things that are likely to change on their own lines.

Like
URLs,
dates,
and
list items.

So maybe I’ll give it a try again. And if not, at least I can explain why not.

What I told my kids this morning

It’s more important than ever that we work very hard to be the people we want to be. We must be kind and loving. We must care for ourselves, each other, and those around us who need our help.

Things are not okay. I cannot promise that they will be okay. What I can promise is that we will do our best to love and support you no matter what goes on in the world around us.

Happy birthday, BASIC!

Today is apparently the 60th birthday of the BASIC programming language. It’s been nearly a quarter of a century since I last wrote anything in BASIC, but it’s not unreasonable to say it’s part of why I am where I am today.

When I was in elementary school, my uncle gave us a laptop that he had used. I’d used computers in school — primarily the Apple II — but this was the first time we’d had a computer in the house. Weighing in at 12 pounds, the Epson Equity LT was better suited for the coffee table than the lap, but it was a computer, damn it! In a time when we didn’t have much money, we could still afford the occasional $5 game on a 3.5″ floppy from Target. (I still play Sub Battle Simulator sometimes!)

But what really set me down my winding path to the present was when my uncle taught me how to write programs in GW-BASIC. We started out with a few simple programs. One took your age and converted it to your equivalent age on each of the planets in the solar system. Another did the same with your weight. I learned a little bit about loops and conditionals, too.

Eventually, I started playing around in QBasic, learning to edit existing programs and write new ones. I remember writing a hearing test program that played sounds of increasing pitch through the PC speaker. After using Azile at my friend’s house, I wrote my own chat program. I learned how to make it play musical notes from some manuals my uncle had left us.

I didn’t really know what I was doing, but I learned through trial and error. That skill has carried me through my entire career. At 41, I have a mostly-successful career that’s paid me well primarily due to networking, privilege, and luck. But I also owe something to the skills I learned writing really shitty BASIC code as a tween and teen.

Barbenheimer

Over the weekend, I took part in the Barbenheimer Experience. We saw “Barbie” and — after a break to feed my sister’s dogs and also myself — “Oppenheimer”. I’ll be honest: I mostly did it because it felt like a silly Internet thing to do. But I’m glad I did it.

Barbie

Not since “Citizen Kane” has a movie about a beloved childhood possession made such good Art™. I wasn’t prepared for how much I enjoyed it. It was fun in a silly, self-aware way. Credit to the folks at Mattel who approved this, because it addresses some of Barbie’s problems.

It’s not just a fun movie, though. The movie addresses serious themes, sometimes satirically and sometimes earnestly. The message gets a little ham-handed in a few spots, but it quickly reels back in. Overall, it provokes thought in a fun way.

One thought it provoked in me: how many times did they have to shoot the beach-off scene before they got a usable take?

Oppenheimer

“Oppenheimer” is not a fun movie, but it was interesting. I didn’t know much about Robert Oppenheimer before the movie, and I’m not sure how much I can claim to know now. While not fawning, the movie’s portrayal of Oppenheimer is complimentary. It doesn’t ignore his personal failings, but it also doesn’t explore them. They are just facts in the story.

I spent the rest of the evening thinking about atomic weapons. Truman’s decision to drop atomic bombs on Japan may be the ultimate Trolley Problem. An American invasion of mainland Japan would have cost many military and civilian lives. But that didn’t happen. The death of a few hundred thousand civilians did happen. No matter what the outcome of the road not traveled, we can’t ignore what did happen.

Was Oppenheimer’s opposition to Teller’s hydrogen bomb principled, or was it petty? In either case, was it hypocritical? Was it ethical? What lessons should we take for the things we invent today?

Barbenheimer

Both movies are about the end of the world as the characters know it. Both grapple with what that means for the future. They are very different movies, but they complement each other quite nicely. They’re good on their own, but I’m glad I saw them together.

Booth Tarkington on “The Golden Bachelor”

This weekend, it came to my attention that ABC is making a change in its long-running dating show The Bachelor. A 71-year-old man will be the first “golden bachelor” in the upcoming 28th season. I don’t have much of an opinion on the show generally or the new season particularly, but I couldn’t help but think of Booth Tarkington’s Pulitzer Prize-winning The Magnificent Ambersons.

Youth cannot imagine romance apart from youth. That is why the roles of the heroes and heroines of plays are given by the managers to the most youthful actors they can find among the competent. Both middle-aged people and young people enjoy a play about young lovers; but only middle-aged people will tolerate a play about middle-aged lovers; young people will not come to see such a play, because, for them, middle-aged lovers are a joke—not a very funny one. Therefore, to bring both the middle-aged people and the young people into his house, the manager makes his romance as young as he can. Youth will indeed be served, and its profound instinct is to be not only scornfully amused but vaguely angered by middle-age romance.

Booth Tarkington, The Magnificent Ambersons

In an interesting coincidence, both Tarkington and the “Golden Bachelor” are from Indiana. At any rate, I suppose ratings will tell if Tarkington was right or not.

Book review: If Nietzsche Were a Narwhal

I recently read Justin Gregg’s If Nietzsche Were a Narwhal: What Animal Intelligence Reveals About Human Stupidity. As humans, we tend to assume that our intelligence sets us apart and that our exceptional cognitive abilities are good. There’s no doubt that we’re exceptional, but it’s not clear that we’re good. As Gregg wrote:

Our many intellectual accomplishments are currently on track to produce our own extinction, which is exactly how evolution gets rid of adaptations that suck.

Unique among Earth’s animals, humans have bent our environment to our will. This, of course, has resulted in some undesirable side effects. Despite all of our supposed advancement, we are biologically predisposed to prioritize immediate needs over long-term needs. We get benefit from burning fossil fuels now and assume that we’ll be able to deal with the long-term impacts later. But will we?

Gregg studies animal cognition, so this book is steeped in facts. Indeed, the reader will probably learn more about animals than people. And after reaching the end, the reader may find it hard to disagree with Gregg’s assertion that Nietzsche — and the rest of the species — would have been happier as a narwhal.

Evolution has many dead ends. It could be that what makes us special actually makes us less happy. Humans have existed on Earth for a relatively short time, so it’s folly to assume that our unique adaptations aren’t maladaptive. It reminds me of the joke where an angel is talking to God about creating humans and says “you’ve ruined a perfectly good monkey. Look, it has anxiety!”

I didn’t come away from this book convinced that human cognition is a bad thing on balance. But as a philosophical starting point, I see a case for Gregg’s argument that “human intelligence may just be the stupidest thing that ever happened.”