As voice assistants like Alexa and Google Home continue to play ever more prominent roles in the lives of consumers, privacy concerns around the capabilities of these voice assistants are starting to take hold. These devices are permeating our homes at an astronomical rate—research firm Canalys predicts global sales of 56 million smart speakers this year alone. Already more than 35 million voice-first devices have shipped in just under two years, one full year less than it took for the iPhone to reach that mark.
It’s clear that we’re on the precipice of a voice-dominated society. With this fresh approach to interacting with technology creating a new paradigm, companies need to prioritize corporate social responsibility in order to establish trust among their customers and protect those customers’ rights to privacy and freedom of speech. Applications of voice-first technology are ethically malleable by nature, meaning tech companies can easily drift into murky, even corrupt territory with these devices. That’s why the responsibility falls squarely on corporations to ensure these products are developed and designed in a human-centered way that prioritizes and protects the customer and their data first and foremost.
So, just how can companies continue innovating and profiting in the voice-first realm while staying true to customers and remaining mindful of basic human rights?
For starters, they could be more like the tortoise than the hare. It’s easy, especially with technology, to rush changes out quickly to claim the title of “first to market” and expedite greater levels of innovation and adoption. But moving too fast can prove detrimental down the line; look no further than Siri. Sure, Siri was the first voice assistant to truly rise to prominence, but its AI capabilities and general accuracy in fulfilling requests now pale in comparison to Alexa’s. Amazon took its time refining Alexa before pushing it to market, and that hasn’t gone unnoticed by consumers.
Not only does going slow and steady make for a better customer experience, it also allows companies to take stock of the limitations of voice-first technologies. That cognizance goes a long way toward preparing companies for potential hacks or nefarious uses of voice assistants. Only then can companies map out a strategy for protecting customers should the worst-case scenario occur. And make no mistake, the odds are ever in a hacker’s favor. Cybersecurity isn’t a hot topic in tech for no reason; with more connected devices infiltrating our lives, a breach is usually a matter of when, not if.
This is a particularly relevant concern for voice assistants and devices, especially those with always-on microphones that actively listen even when the device isn’t explicitly in use. Why are these devices always listening? What are they doing with the information they overhear? Are customers being recorded, and if so, for what purpose? These are some of the most pressing questions arising from voice assistants, and ones companies need to answer with full transparency. The common response is that active listening, bolstered by machine learning, lets these devices respond more accurately. That may be true, but consumers still have a right to know whether they’re being recorded and what’s being done with the information these devices log.
Full transparency should be foundational for any company looking to embody corporate social responsibility. Even if a company’s voice technologies are hacked or misused, a company that is completely forthcoming about the intricacies of its products and has established trust with its customers can come out ahead in consumer sentiment. Of course, companies should also keep a design-thinking approach in play while developing voice-first products so that the customer experience and customer privacy are given the utmost consideration. Cybersecurity breaches are bound to happen with almost any connected device, but corporations that make a point of being socially responsible can lessen the negative impact and remain a reliable, trusted voice in this sector.