Researchers trick Tesla Autopilot into steering into oncoming traffic

This should give people pause. But don’t count on it.

“Researchers have devised a simple attack that might cause a Tesla to automatically steer into oncoming traffic under certain conditions. The proof-of-concept exploit works not by hacking into the car’s onboard computing system. Instead, it works by using small, inconspicuous stickers that trick the Enhanced Autopilot of a Model S 75 into detecting and then following a change in the current lane.”

Slow down

It is great to be first. But it is better to be correct. Humans always seem to be in a hurry, and hurry often leads to mistakes. Countless times we have jumped into something without fully understanding the situation. This phenomenon has given rise to the phrase "unintended consequences."

There is a rush to be the first to implement IoT. These early solutions are almost certainly deficient in significant ways that will lead to unintended consequences. This paper argues that we are throwing caution to the wind. Read it and beware.

Researchers can now send secret audio instructions undetectable to the human ear to Apple’s Siri, Amazon’s Alexa and Google’s Assistant.

Holy crap. I thought it was a bad idea to self-bug your home with Alexa. But it is worse than I ever thought. Researchers at UC Berkeley and Georgetown have shown that commands can be embedded in music. That means simply playing music or a video while Alexa is in the room is a risk. Just don't do it.