
Everything You Say to Alexa, Amazon Can Hear


[Image: Amazon Echo speaker]

Picture this: your Echo speaker, once a discreet confidant handling your voice commands in-house, now beams every word you utter to Amazon’s servers. That shift arrives March 28, when the company axes local processing for certain Echo devices, pinning the change on the demands of its new AI assistant, Alexa+.

 

This isn’t a quiet tweak; it’s a loud recalibration of how Alexa interacts with you. From that date, every request you make will travel to the cloud, leaving behind the option to keep your data close. Amazon insists it’s about capability, not control, but the move has sparked a fresh debate about privacy in an age when AI promises dazzle and data risks loom.

 

The change targets a select trio: Echo Dot (4th Gen), Echo Show 10, and Echo Show 15. These models, once capable of processing voice requests locally, catered to U.S. users with English settings, a niche group now facing a mandatory cloud migration. Amazon broke the news via email, first flagged by Ars Technica, explaining that Alexa+’s generative AI tools need more muscle than these devices can muster.

 

“Starting on March 28th, your voice recordings will be sent to and processed in the cloud, and they will be deleted after Alexa processes your requests,” Amazon’s email declares. “Any previously saved voice recordings will also be deleted. If your voice recordings setting is updated to ‘Don’t save recordings,’ voice ID will not work and you will not be able to create a voice ID for individual users to access more personalised features.”

 

Translation? Opt for privacy, and you sacrifice tailored perks like music picks or calendar cues tied to your voice profile. It’s a stark choice, one that forces users to weigh convenience against control.

 

Amazon pledges vigilance. “Alexa voice requests are always encrypted in transit to Amazon’s secure cloud, which was designed with layers of security protections to keep customer information safe,” the email assures. Recordings vanish post-processing, the company says, unless you choose otherwise. Yet, scepticism lingers, fuelled by Amazon’s rocky privacy track record.

 

Take 2023: regulators slapped Amazon with a $25 million fine for failing to delete children’s recordings and location data despite user requests. That same year, Ring, Amazon’s home security arm, paid $5.8 million after third-party contractors accessed customer videos. These stumbles cast a long shadow over assurances from an Amazon spokesperson, who told Tom’s Guide, “The Alexa experience is designed to protect our customers’ privacy and keep their data secure, and that’s not changing. We’re focusing on the privacy tools and controls that our customers use most and work well with generative AI experiences that rely on the processing power of Amazon’s secure cloud. Customers can continue to choose from a robust set of tools and controls, including the option to not save their voice recordings at all. We’ll continue learning from customer feedback, and building privacy features on their behalf.”

 

But can trust be rebuilt when every “Alexa, play my playlist” now pings a distant server? It’s a question users must wrestle with as the deadline nears.

 

Contrast this with Apple and Google. Both lean on-device for some AI processing, sidestepping the cloud where possible. Apple Intelligence, though playing catch-up in features, doubles down on privacy, while Google’s Pixel devices flex powerful chips to keep responses swift and local. Their hardware, think iPhones and Pixels, dwarfs the modest Echo Dot, which retails for a fraction of the price.

 

This gap explains Amazon’s cloud pivot but doesn’t erase the unease. Why can’t Echo match its rivals’ on-device prowess? Is cost-saving trumping user autonomy? The answers matter as smart homes grow smarter and more intrusive.

 

Users concerned about privacy can opt for “Don’t save recordings,” but this disables Voice ID, which personalises Alexa’s responses for music and calendar events. To set this before the deadline, users should:


1. Open the Alexa app on their mobile phone.

2. Go to Settings > Device Settings and select the device.

3. Choose “Do Not Send Voice Recordings” from the menu.

4. Enable the setting so recordings are not sent.

 

Amazon frames this as progress, a necessary leap for Alexa+’s AI ambitions. Yet, it’s also a reminder of trade-offs in our tech-driven lives. Privacy once meant keeping your words at home; now, it’s a toggle in an app, a compromise baked into innovation.

 

Look at 2023’s fines again: $25 million for Amazon, $5.8 million for Ring. Then consider the cloud’s allure: faster AI, richer features. Are we okay surrendering a slice of sovereignty for that? And if Amazon’s past missteps linger, what’s the real cost of trusting them with every “Alexa” we speak?

 

The clock’s ticking. March 28 will redraw the line between you and your Echo.

 
 
 
