Amazon invented the smart speaker market with the Echo and its virtual assistant Alexa, bringing the concept of a virtual digital assistant into our homes and daily lives. The distinct platform empires of Amazon (Alexa), Google (Google Assistant), Apple (Siri) and Microsoft (Cortana) have made virtual digital assistants a fragmented and complex computing experience. It is not Lord of the Rings, with one device or assistant to “rule them all”. It’s a Game of Thrones of virtual assistants: a marketplace divided between competing customer experiences and data, and complicated by alliances and partnerships between the different players.
This space is so dynamic and complex that it’s worth asking what we really mean when we talk about virtual digital assistants and what future they hold for consumers. Looking ahead, one early-stage trend is Alexa’s move into home robots. The team behind the robot Temi announced that it is working to bring Amazon Alexa onboard, and Anki was also at CES, demoing the new Alexa voice integration for its Vector device.
It’s clear that home robots need Alexa more than Alexa needs home robots at this stage.
Virtual assistant robots are different in that they have a larger impact on our daily lives and are designed to help us accomplish manual tasks. They can learn when we typically wake up and go to sleep, what we watch on television, who visits our homes frequently, and what we purchase online. For now, though, the bulk of consumers have yet to graduate to more advanced activities like shopping by voice or controlling other smart devices in the home.
The key moment comes when one of these computers (or machines, robots, or drones) becomes self-aware. What happens afterward? It depends on what we’ve taught that A.I. and how it has been trained on our data.
What’s Amazon’s strategy for Alexa and voice search?
So far, Alexa’s iconic status has owed everything to the dominant market share of the Amazon Echo. Right now, Amazon’s Echo is like having the Star Trek computer available at the command of your voice, ready to answer your questions, play your music or audiobooks, and control other devices in your home through smart-home-connected technology. It might not be able to pilot the entire ship, but it will make the everyday running of the ship considerably easier.
Amazon and Microsoft have announced plans to provide Cortana access on Echo devices and Alexa access on Windows 10 PCs. Third-party skills have been a big part of Amazon’s strategy for Alexa from the start: opening the platform to developers and tech partners has built an ecosystem that greatly expands what Alexa can do and points toward a multi-assistant world. The result is an intelligent system with access to all the information on the internet, one that may one day be able to derive its own conclusions and possible scenarios.
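To make the skills ecosystem concrete, here is a minimal sketch of how a third-party skill plugs into the platform: Alexa sends the skill a JSON request describing the user’s spoken intent, and the skill returns JSON telling Alexa what to say. The intent name (“ShipStatusIntent”) and the replies are made-up examples, not a real published skill; real skills are typically built with the Alexa Skills Kit and hosted on AWS Lambda.

```python
# Hypothetical minimal Alexa skill handler. Alexa delivers a JSON
# "request" (launch, intent, or session-ended); the skill answers with
# JSON containing the speech Alexa should produce.

def handle_request(event):
    request_type = event["request"]["type"]
    if request_type == "LaunchRequest":
        # User opened the skill without asking anything specific yet.
        text = "Welcome aboard. What would you like to know?"
    elif request_type == "IntentRequest":
        intent = event["request"]["intent"]["name"]
        if intent == "ShipStatusIntent":  # made-up example intent
            text = "All systems nominal."
        else:
            text = "Sorry, I don't know that one yet."
    else:
        text = "Goodbye."
    return {
        "version": "1.0",
        "response": {
            "outputSpeech": {"type": "PlainText", "text": text},
            "shouldEndSession": True,
        },
    }

# Example: Alexa forwards the user's spoken intent as JSON.
reply = handle_request(
    {"request": {"type": "IntentRequest",
                 "intent": {"name": "ShipStatusIntent"}}}
)
print(reply["response"]["outputSpeech"]["text"])  # All systems nominal.
```

Each new skill a developer publishes is just another handler like this, which is why the catalog of things Alexa “knows” can grow far faster than Amazon could build alone.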
What’s Google’s strategy for Google Assistant and voice search?
Like its main smart speaker rival Amazon, Google has been growing the influence of its AI assistant by putting its smart speakers into as many homes as possible and by integrating the assistant into its Pixel smartphones. Google Assistant, described by the company as “your own personal Google,” is driven by deep learning algorithms and other Google products. Google recently released a paper on federated learning (FL), a distributed machine learning approach for training models on large, decentralized datasets. Google can now train deep learning models without moving the data off your phone (or device): the model is trained on the phone itself, and only the resulting model weights are sent to a global model that sits on a server.
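The core idea above can be sketched in a few lines. This is a toy simulation of federated averaging (the aggregation rule from Google’s FL work), not Google’s actual implementation: each simulated “device” runs gradient descent on a toy linear model using only its local data, and the server averages the returned weights, weighted by dataset size. All function names and the model are illustrative assumptions.

```python
# Toy sketch of federated averaging (FedAvg): raw data stays on each
# device; only model weights travel to the server for averaging.
import numpy as np

def local_update(weights, X, y, lr=0.1, epochs=5):
    """One client: a few gradient-descent steps on a linear model,
    using data that never leaves the device."""
    w = weights.copy()
    for _ in range(epochs):
        grad = 2 * X.T @ (X @ w - y) / len(y)
        w -= lr * grad
    return w

def federated_average(client_weights, client_sizes):
    """Server: average client models, weighted by local dataset size."""
    total = sum(client_sizes)
    return sum(w * (n / total) for w, n in zip(client_weights, client_sizes))

rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])   # the pattern hidden in the users' data
global_w = np.zeros(2)

for _round in range(20):          # communication rounds
    updates, sizes = [], []
    for _client in range(3):      # three simulated devices
        X = rng.normal(size=(32, 2))
        y = X @ true_w            # each device's private data
        updates.append(local_update(global_w, X, y))
        sizes.append(len(y))
    global_w = federated_average(updates, sizes)

print(global_w)  # converges toward [2.0, -1.0] without pooling raw data
```

The privacy-relevant point is in the loop: the server only ever sees `updates`, never `X` or `y`, which is exactly the property that lets Google train on phones without collecting the underlying data.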
“Google Assistant is one of the core implementations of an AI-first world,” says Google senior engineering director Behshad Behzadi. “If we solve the problem of conversational understanding, we can expand it to many use cases.”
In May this year, Google demonstrated Google Duplex, which may serve as an extension of the Assistant. Duplex-style technology has the potential to dramatically increase the range of tasks that virtual digital assistants can perform. Duplex is a preview of things to come, eventually leading to bot-to-human communication and even bot-to-bot interaction.
I can’t help comparing Amazon Alexa to Jarvis and Google Assistant to Ultron. Alexa is a “she”; Google Assistant is an “it”. Jarvis has personality, like Alexa, while Ultron is the Lore to Jarvis’s Data, much like the Google search engine.
Who doesn’t know Jarvis, the assistant developed by Tony Stark? Jarvis is a personal assistant who can optimize tasks for Tony and follow instructions very well, but isn’t great at independent decisions. Just like Alexa, Jarvis recollects things exactly as recorded and is unable to develop perspectives of his own. When Jarvis communicates his findings, he uses the holographic display of the Iron Man helmet and various graphical representations, similar to the Amazon Echo Show. Amazon’s Alexa aims to be a ubiquitous virtual assistant, one the user can talk to through nearby devices and gadgets for day-to-day information.
On the other hand, Ultron was a high-end AI program developed by Tony Stark and Bruce Banner using the decrypted code found inside the mind stone. Ultron was created to make decisions rather than follow orders. Like Google Now, Ultron is hyper-charged: he reads all of Tony Stark’s conversations and email and extrapolates his personality as a self-loathing, misanthropic inventor. Ultron had plans of his own, separate from his original programming, something we can glimpse in Google’s DeepMind project.
Elon Musk began warning about the dangers of A.I. around 2014, speculating that A.I. was probably humanity’s biggest existential threat. Musk, Stephen Hawking, and Bill Gates have all raised the same warning about A.I. Google Brain researchers have shown that neural networks, when tasked properly, can invent their own cryptographic schemes, and are better at encrypting messages than at cracking them. This means that one day machines may be able to talk to each other in ways we won’t be able to understand, or create systems we can’t crack.
In the movie Ex Machina, programmer Caleb Smith experiments with an AI, “Ava.” Ava, it seems, knows Caleb better than anyone. And while real-world voice-enabled digital assistants like Apple’s Siri, Microsoft’s Cortana and Amazon’s Alexa may not yet have had anyone confess their undying love, we do know that they are quickly becoming the go-to search mode for consumers everywhere. Today’s digital assistants are going beyond voice input, evolving to understand user intent and behavior through available data and information in order to help consumers take action.
Empathy is about understanding people’s individual character and how they feel. Digital assistants should comprehend that human interactions are based on emotional processes more than cognitive abilities. One of the common pitfalls of AI comes from people assuming that it is smarter than it actually is.
This disclaimer informs readers that the views, thoughts, and opinions expressed in the text belong solely to the author, and not necessarily to the author’s employer, organization, committee or other group or individual. This blog is meant for humor & entertainment purposes only.