I was a bad driver. The insurance dongle would frequently beep at me to let me know that I had braked too hard. I was mystified. "What should I have done differently?" I'd think, as I raged at the objective machine that judged me so.
The next time my brother came to visit, he called Mom. "Oh, and presidentender is a good driver now." I didn't put the pieces together right away, but it turned out that the dongle had actually trained me, like a dog's shock collar.
The reason for my too-frequent hard-braking events wasn't speed, although that was a contributing factor. It was a lack of appropriate following distance. Because I'd follow the drivers in front of me too closely, I'd have to brake hard if they did... or if they were driving normally and happened to have a turn coming up.
Over the period I had the insurance spy box in my truck, I learned, without thinking about it, to increase my following distance, which meant that riding with me as a passenger was more comfortable and the box beeped less often. Of course, since I'd been so naughty early in the evaluation they didn't decrease my rates, but I think the training probably did make me statistically less likely to crash.
- Road accidents: "A driver caused this; let's determine who, and find them at fault."
- Air accidents: "The system caused this; let's determine which elements came together that ultimately led to this event."
The first approach essentially simplifies a complex series of events into something black and white, easy to digest. Then the same crashes keep happening over and over again, because we never changed the circumstances.
The second approach is holistic: even if the pilot made a mistake, why did they make it, and what can we do to prevent that mistake (e.g., training, culture)? And maybe other elements also played a part: mechanical issues, software, airport lighting, communications, etc.
I bet everyone reading this knows of a road near them that is an accident hotspot, and I bet they can explain WHY it is. I certainly do/can, and I see cops with crashed cars there on a weekly basis. Zero changes have been made to the conditions.
On unfamiliar roads/highways, I can predict hard bumps/gaps by looking for dark oil spots in the middle of each lane.
PDF download: https://iase-pub.org/ojs/SERJ/article/download/215/119/726
In theory, the most dangerous turns would probably show higher variance in hard-braking data.
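If you wanted to test that, here's a minimal sketch in Python, assuming braking events already come tagged with a road-segment ID and a peak deceleration (hypothetical fields, not any real telematics schema):

```python
from collections import defaultdict
from statistics import variance

def rank_segments_by_braking_variance(events, min_events=3):
    """Return (segment_id, variance) pairs, most variable first."""
    by_segment = defaultdict(list)
    for segment_id, peak_decel in events:
        by_segment[segment_id].append(peak_decel)
    ranked = [
        (seg, variance(decels))
        for seg, decels in by_segment.items()
        if len(decels) >= min_events  # variance is meaningless on tiny samples
    ]
    return sorted(ranked, key=lambda pair: pair[1], reverse=True)

# Invented events: (segment_id, peak deceleration in m/s^2)
events = [("turn_7", 6.1), ("turn_7", 2.0), ("turn_7", 7.4),
          ("ramp_3", 3.1), ("ramp_3", 3.0), ("ramp_3", 3.2)]
print(rank_segments_by_braking_variance(events))
# -> turn_7 first: same turn, wildly different braking = danger signal
```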
I'll never use one of these dongles, though, because I don't want my every move second-guessed. There's nothing _inherently_ dangerous about isolated hard braking or cornering or acceleration events. It all depends on context. Am I braking hard to avoid an obstacle or a mistake by another driver? Is there someone behind me who's likely to rear-end me, or am I in the middle of a highway in the desert? Did I just replace my brake pads, and am I bedding in the new ones?
I don't want to have to worry about whether I've used up my invisible quota before the algorithm decides I should be moved into a more expensive insurance bracket.
That's not something Google is going to research, of course, since it has next to no alignment with their interests.
Similar to how Google Maps shows eco routes, it'd be fun for them to show "safest" routes that avoid areas with frequent crashes. (Not always possible, but valuable knowledge when it is.)
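As a toy illustration, "safest" route selection could be as simple as summing historical crash counts along each candidate route, assuming you have routes as lists of segment IDs and a crash table (all invented data):

```python
# Toy sketch: pick the candidate route with the least crash exposure.
# crash_counts maps a segment id to historical crashes; invented numbers.
crash_counts = {"hotspot_ramp": 12, "main_st": 0, "side_st": 1, "back_rd": 3}

def crash_exposure(route):
    """Sum historical crashes along a route; unknown segments count as 0."""
    return sum(crash_counts.get(seg, 0) for seg in route)

candidates = {
    "fastest": ["hotspot_ramp", "main_st"],       # runs through the hotspot
    "detour":  ["side_st", "back_rd", "main_st"], # longer, but avoids it
}
safest = min(candidates, key=lambda name: crash_exposure(candidates[name]))
print(safest)  # -> "detour" (exposure 4 vs. 12)
```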
"A 1974 study by Hall and Dickinson showed that speed differences contributed to crashes, primarily rear end and lane change collisions"
Hall, J. W., and L. V. Dickinson. An Operational Evaluation of Truck Speeds on Interstate Highways. Department of Civil Engineering, University of Maryland, February 1974.
This research team used Google's first-party location data to identify San Jose's Interstate 880/US 101 interchange as a site with statistically extreme amounts of hard braking by Android Auto users.
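The post doesn't say how "statistically extreme" was defined; a common first pass would be a z-score cut on per-segment hard-braking rates. A sketch under that assumption, with invented numbers:

```python
# Sketch: flag segments whose hard-braking rate is a statistical outlier.
# The z-score cut is my assumption; the blog post doesn't define
# "statistically extreme". Rates below are invented.
from statistics import mean, stdev

def extreme_segments(rates, z_cut=2.0):
    """rates: {segment_id: hard-braking events per 1,000 trips}."""
    mu, sigma = mean(rates.values()), stdev(rates.values())
    return {seg: round((rate - mu) / sigma, 2)
            for seg, rate in rates.items()
            if sigma > 0 and (rate - mu) / sigma >= z_cut}

rates = {"880/101": 42.0, "seg_a": 3.1, "seg_b": 2.7,
         "seg_c": 3.4, "seg_d": 2.9, "seg_e": 3.0}
print(extreme_segments(rates))  # -> {'880/101': 2.04}
```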
But you don't need machine learning to know that... San Jose Mercury News readers voted that exact location the worst interchange in the entire Bay Area in a 2018 reader poll. [1]
It's not a lack of knowledge by Caltrans or Santa Clara County's congestion management agency that is keeping that interchange as-is. Rather, it's physical constraints: a nearby airport (so no room for flyovers), a nearby river (so probably no tunneling), and surrounding private landowners and train tracks.
Leaving aside the specifics of the 880/101 interchange, the Google blog post suggests that they'll use this worst-case scenario on a limited access freeway to inform their future machine-learning analyses of other roads around the country, including ones where presumably there are also pedestrians and cyclists.
No doubt some state departments of transportation will line up to buy these new "insights" from Google (forgetting that they already buy similar products from TomTom, Inrix, StreetLight, et al.). [2]
While I genuinely see the value in data-informed decision making for transportation and urban planning, it's not a lack of data that's causing problems at this particular freeway interchange. This blog post is an underbaked advertisement.
[1] https://www.mercurynews.com/2018/04/13/101-880-ranks-as-bay-...
[2] https://www.tomtom.com/products/traffic-stats/ https://inrix.com/products/ai-traffic/ https://www.streetlightdata.com/traffic-planning/
While you're at it, give me an option to avoid unprotected left turns, and to avoid making a left turn across a busy road where cross traffic does not stop. (But only during heavy traffic; it's fine when nobody is on the road.) Not only are these more dangerous, they're also more stressful and introduce annoying variation into my travel time.
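A hedged sketch of how a router might honor that preference: scale an unprotected-left penalty by a (hypothetical) 0-1 congestion level for the cross street, so the penalty vanishes when the road is empty:

```python
# Sketch: traffic-dependent penalty for unprotected left turns.
# congestion is a hypothetical 0.0-1.0 level for the cross street;
# a real router would fold this into its graph edge weights.
UNPROTECTED_LEFT_PENALTY_S = 90  # made-up worst-case cost, in seconds

def turn_cost(base_seconds, is_unprotected_left, congestion):
    """Edge cost with a congestion-scaled unprotected-left penalty."""
    penalty = UNPROTECTED_LEFT_PENALTY_S * congestion if is_unprotected_left else 0
    return base_seconds + penalty

# Rush hour: 10 + 90 * 0.9 = 91s, so a cost-minimizing router would
# happily take a 30s right-turn detour instead.
print(turn_cost(10, True, congestion=0.9))  # 91.0
print(turn_cost(10, True, congestion=0.0))  # 10.0 (empty road, no penalty)
```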