Researchers can now send secret audio instructions undetectable to the human ear to Apple’s Siri, Amazon’s Alexa and Google’s Assistant.

Holy crap. I thought it was a bad idea to self-bug your home with Alexa. But it is worse than I ever thought. Researchers at UC Berkeley and Georgetown have shown that one can embed commands in music. That means just listening to music or a video while Alexa is in the room is a risk. Just don’t do it.

Alexa flaw allows eavesdropping

Alexa had flaw that allows eavesdropping. The flaw comes from an architecture that also Skills to be added. A Skill provides a service (such as weather report). The problem is that some one can write a Skill that does not stop listening. A research firm found the flaw, alerted Amazon who promptly made a fix. Good for them.

It is good and right that Amazon immediate fixed the problem. But this is not good enough. I don’t know if the primary cause of the flaw is incompetency, negligence, apathy, pressure from tight production deadlines, etc. But I believe two problems are (1) the problem is hard and (2) it is a low priority. Consequently, the architecture is not designed with privacy and security first because it is too expensive. We need to change this. Stop buying crappy solutions and vendors will have an incentive to build better ones.