While analyzing comments about Google Home shortly after its release in Germany, I came across the following fascinating thought from one user:
“And will users become content with not knowing (when the assistant does not have the answer to a question)? The device does not recommend any sources for further research.” [1]
There are two themes in this comment I find worth exploring: criticism of the smart speaker’s narrow helpfulness and what I will call the Alexa effect.
Alexa effect
I will define the Alexa effect as
the tendency towards contentedness with nescience about a topic if information about said topic cannot be obtained immediately or easily [2]
In some ways, we can already observe this “not knowing” phenomenon in our online and offline behavior. For example, in how we google:
If it isn't on the first page of Google, it doesn't exist.
— Not Will Ferrell (@itsWillyFerrell) October 27, 2013
how we depend on our mothers:
or in the Google effect. This cognitive bias refers to our tendency to forget information that is easily obtainable through search engines.
Alexa makes us childish by asking it questions that we wouldn’t have googled [3]
Looking back on my Alexa usage, I can indeed remember being “ok with not knowing”. I am under the impression (really unsure how good my memory is here) that most of the time when Alexa was unable to answer a question, I did not look elsewhere for an answer. Of course, I could downplay these occurrences by claiming that these questions were not relevant to me in the first place. Although I did indeed have more important things to do at the time (as opposed to googling, I use Alexa mostly alongside other activities), I would not have asked these questions at all if they were not at least a little bit relevant to me. I believe this relevance stems from curiosity rather than necessity. In essence, it is about asking questions that we would not have asked otherwise. We can observe such situations in interpersonal communication as well, where we sometimes shrug a conversation off with a “Nah, doesn’t matter anyway”.
In a way this is childish behavior (in a neutral sense); children tend to ask adults (or other people they consider knowledgeable) tons of questions out of pure curiosity. Questions whose answers they could find themselves, but if they had to, they wouldn’t bother. I am no expert on children, but I am still pretty sure that every child benefits from as many answered questions as possible. And the same undoubtedly goes for everybody else as well.
I have no idea if I am reading too much into this, but if we “have” a Google effect, we should at least consider the possibility of having an “Alexa effect” in the future as well.
Smart speaker’s narrow helpfulness
The other theme from the comment was criticism of the smart speaker’s narrow helpfulness: Google Home does not recommend any sources for further research if it is unable to provide an answer.
I am using Amazon Alexa at home and it is really annoying to get an “I do not know that” or similar. Besides being annoying, unhelpful responses also decrease your trust in Alexa. After a series of useless interactions you will stop trusting Alexa’s capabilities and, as a result, stop using it altogether (I have explained the concept of trust as the most important product feature in more detail here). Google itself (the search engine) at least helps you with autocompletion, the related-searches feature and its “Did you mean…” function.
If smart speakers similarly replied with what they think might be true or provided us with alternative sources we could look into, they would not only be friendlier but also more useful.
Notes
[1] Translated from German, italics added by me. The German original: “Und werden die User sich bald damit zufrieden geben ‘etwas’ nicht zu wissen? Das Gerät macht ja nicht den Vorschlag anderweitig nachzuschauen.”
[2] Even though the Alexa effect emerged out of a discussion about Google Home, I am naming it after Amazon’s smart speaker because Alexa is more widespread and Google already has an effect named after it (just trying to be fair here).
[3] Alternative heading: Alexa makes us childish by asking her (instead of it) questions that we wouldn’t have googled
Featured Image’s Source: http://media.corporate-ir.net/media_files/IROL/17/176060/EchoDot/Echo%20and%20Echo%20Dot.jpg