Earlier this year, I attended a pitch night at my local coworking space. One of the teams that presented is working on a product to help prevent drowsy driving. It’s a noble goal — drowsy driving can be as dangerous as drunk driving. And as with drunk driving, people aren’t very good at self-evaluating their fitness to drive when they’re sleepy.
But as they talked, I sat there thinking, “No. What are you doing?” They talked about learning who the driver is and establishing a baseline of their attentiveness in order to measure their drowsiness. All kinds of cool whiz-bang stuff. And maybe someday they’ll add haptic feedback to the steering wheel.
I suggested that they target diabetics as a first market. Hypoglycemia is dangerous behind the wheel, too, and diabetics are an easily defined audience, which helps in initial go-to-market efforts. When I talked to them after their presentation, they started musing about adding blood sugar sensors.
No. Just no. I asked them if they’d considered what storing all that data might mean. It creates a liability for the driver in case of an accident. It could be used by insurance companies to change rates. It could be compromised and used for something else entirely.
Using the latest tech is neat, but only when it makes sense. And sometimes, the more you add, the worse things get.