One of the most annoying pitfalls of digital life might be the software upgrade that is secretly a downgrade. I want to share a cautionary tale about a Google demotion that could end up costing you money. After many back-and-forths with me, Google described it as a “bug,” but it points to a lasting risk to our digital future.
Google feature prevents families from sharing music subscription
Reader Matt Hirsch from outside Boston reached out to me about a strange phenomenon on his Google smart speakers. He and his wife both used them to stream music from YouTube Music, Google’s alternative to Spotify. But soon the music stopped working properly for his wife. She started hearing commercials before songs she requested, while he didn’t.
One thing had changed: Hirsch and his wife had recently activated a Google service called Voice Match. This optional feature trains the AI-powered Google Assistant to recognize different voices and give each person personalized responses. Voice Match can be useful if, for example, you want to access individual calendars or shopping lists.
But the Hirsch family certainly didn’t expect Voice Match to prevent their household from sharing a music account. Hirsch asked, “Is it something intentional to get us to buy the family plan or an accidental oversight?”
When I told Google about his experience, the company initially denied that it could happen. So I tried to replicate his situation on a Google Nest Hub, a smart speaker with a small screen, with the help of voices from a few family members and friends.
Sure enough, the smart speaker wouldn’t let another Voice Match user in my house play from my own premium YouTube Music subscription. The other user got the “free,” ad-supported version of YouTube Music instead. Our choices were to have everyone join a more expensive family plan or to turn Voice Match off.
The experience reminded me of the digital rights locks on music files you used to buy from the iTunes Store. Now the locks are on the modern world of streaming, and the only key is your own voice.
I shared the results of my experiment with Google, and the company denied a second time that it could happen. Only after I sent a video of the experience did Google change its tune. “This issue is caused by a bug affecting smart displays. We are working on a fix as soon as possible,” Google spokesperson Robert Ferrara said.
Explanations of how Voice Match and music services work within a household are about as complicated as logic puzzles. The root of the problem is that Google products are designed for individuals, whose data can be collected and targeted with ads, not for homes full of people who rightly expect to be able to share experiences like listening to music.
Google’s policy is that if the owner of the smart speakers has a music subscription, other members of the household can also access it. When the smart speakers do not recognize an individual’s voice, they default to assigning the music service to the owner.
But something clearly goes haywire when the speaker’s primary user subscribes to YouTube Music and a second user activates Voice Match. Things make more sense with Amazon Alexa and Apple Siri, which offer similar voice-recognition capabilities. I’ve set up voice profiles with both companies, and neither prevents other household members with voice profiles from using a shared streaming music account.
Google’s spokesperson didn’t answer when I asked about Hirsch’s question: whether using voice ID as a lock was intentional. Maybe it really is just an oversight by Google. But I also wouldn’t put it past a business development manager at the company to see nickel-and-diming us as a way to generate additional YouTube Music revenue.
We need to push back against the idea that companies can use software updates to degrade or change the functionality of devices we’ve already paid for. But we’ve seen it time and time again with products like printers that receive updates limiting which ink they accept. We now have more than a decade of reminders that when something connects to the internet, we don’t really control it.
My favorite example is even more ridiculous. In 2019, Nike released internet-connected sneakers that laced themselves up via an app. Then the company released a software update that inadvertently broke the shoes’ motorized mechanism, so they couldn’t even lace up. The update turned the shoes into bricks.