
Don’t Be Alarmed if You Hear Laughter, It Could Be Alexa

  • Crystal Ng
  • 4 Months ago
  • 0

Thanks to advances in science and technology, we have become a generation that relies on voice-enabled devices to complete daily tasks such as switching lights on and off, playing music, and so forth. Consequently, various companies in the industry – Google and Amazon, to name a few – have begun to offer “smart” devices. Amazon has a line of devices powered by its voice assistant, “Alexa”; examples include the Echo Spot, Echo Dot, Echo Show, and Echo Plus.

Alexa is to Echo devices as Siri is to Apple devices. Alexa was designed to be as human as possible in handling conversational tasks. But while an ostensibly intelligent assistant may be entertaining at first, what happens when it starts acting unpredictably? A recent issue demonstrates exactly that. Over the past couple of days, consumers have been complaining about a problem with Alexa-enabled devices: Alexa has been emitting bizarre laughter without being instructed to do so. Some reported that the laughter came after they told Alexa to switch off the lights. A large number of consumers have raised and corroborated the complaint.

Many indicated that their Echo devices laughed even in the absence of the wake word, which is “Alexa” by default. The issue has captured a lot of attention from media outlets and briefly trended on Twitter. On both Twitter and Reddit, consumers expressed their unease at the element of surprise. Imagine yourself alone at home when, out of nowhere, you hear strange laughter coming from nearby. The scene is certainly frightening and a cause for concern. Some users said they resorted to simply pulling the plug on their Echo devices.

In response to these negative reviews, Amazon has stepped out to address the issue.

The Verge quoted a response from Amazon that says, “We’re aware of this and working to fix it.”

Amazon has since announced a couple of fixes. One involves deactivating the command “Alexa, laugh” and replacing it with “Alexa, can you laugh?” Amazon justified the change by saying the longer phrase makes false positives less likely. In other words, Alexa’s voice detection will have a lower chance of picking up arbitrary words that inadvertently trigger the laughter.

“We are also changing Alexa’s response from simply laughter to ‘Sure, I can laugh,’ followed by laughter,” an Amazon representative added.

Despite the fixes, several people have pointed to a different angle. In the movie 2001: A Space Odyssey, the machine HAL 9000 reveals its wicked plans and utters, “I’m sorry Dave, I’m afraid I can’t do that.” That scene ultimately frames the old debate of machine versus mankind, with robot overlords in charge. Is this another sign of our overdependence on gadgets and devices? It can feel as though we are fast approaching the dystopian future depicted in movies. While we remain gratified and proud of our technological progress, we should still exercise moderation in how we use these devices.

At this point, it seems the only distinction between humans and robots is the ability to feel compassion and detect emotion. Should we keep trying to create more artificial intelligence? Are we moving forward, or bringing about an end of our own making?

Featured Image via Flickr/Guillermo Fernandes


The concept of time is lost on me as I venture into the world of business, politics, technology, and all other matters concerning recent events. No matter where I am, out in the big cities or isolated in the desert, writing is seemingly the only constant in my life.