Faceoff: Amazon Echo Show vs Google Home Part II

As part of our continuing effort to understand how quickly artificial intelligence is entering our everyday lives, we revisited the two home assistants from the study we performed in February. The two most popular assistants, Google Home and the Amazon Echo, were put to the test again, this time substituting the Echo Show, which includes a 7″ touchscreen, for the original Echo.

Methodology. For this experiment, we asked the same 800 queries of both the Echo Show and Google Home as in our first study. We graded the queries on two metrics: First, did the device understand what we asked correctly? Second, did the device answer the query correctly? In our study, Amazon’s Echo Show understood 95.88% of the queries we asked and answered 53.57% of all queries correctly. Google Home understood 94.63% of the queries we asked, but was able to answer 65.25% correctly. Below, you can see the improvements that each home assistant made since our last set of queries.

One advantage the Amazon Echo Show has when it comes to understanding queries is that we can confirm the data using Amazon’s companion app, which gives the user a live feed of what the Echo Show heard. Google Home does not offer a transcript of what its home assistant device picked up. Because of this, it was difficult to tell whether Google Home understood a query but couldn’t answer it, or truly had a harder time understanding queries. Since we were unable to see exactly how well Google Home understood our queries, we assumed that if Google Home responded that it was unable to perform a certain function, then it had understood the query correctly. For example, if we asked, “Hey Google, send a text to John” and received the response “Sorry, I can’t send texts yet,” the query was marked as understood correctly but answered incorrectly.
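For readers who want to see how this grading rule rolls up into the headline percentages above, here is a minimal sketch; the sample records and field names are hypothetical and are not our actual scoring sheet:

```python
# Hypothetical scoring sketch: each record captures our two metrics for one query.
# Per the rule above, a "Sorry, I can't do that yet" response counts as
# understood = True, answered = False.
results = [
    {"query": "Hey Google, send a text to John", "understood": True, "answered": False},
    {"query": "What is a gigawatt?", "understood": True, "answered": True},
    {"query": "Google gigawatt", "understood": False, "answered": False},
]

understood_rate = sum(r["understood"] for r in results) / len(results)
answered_rate = sum(r["answered"] for r in results) / len(results)
print(f"Understood: {understood_rate:.2%}, Answered correctly: {answered_rate:.2%}")
```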

Results. Both home assistants showed improved performance across the board. This time, Google Home outperformed the Echo Show in total correct answers by nearly 12 percentage points, up from a 5-point gap in our February results. While each digital assistant has its strengths and weaknesses, Google Home outperformed its rival in 3 of the 5 query categories by a surprising margin. This is significant because it shows not only rapid improvement, but outperformance of Amazon, which has both a two-year head start and nearly 70% of the home assistant market vs. Google’s 24%, according to eMarketer.

Both Home Assistants Notably Improved in Navigation. The most dramatic increase for both assistants was in navigation. In February, over 90% of navigation questions were answered with: “I can’t help you with that.” Today, navigation is the best category for both the Google Home and the Echo Show, with the Google Home answering 92% of queries correctly, and the Echo Show answering 63% of queries correctly.

Echo Show: Screen adds to experience, but software upgrades drive improvement. The Echo Show’s camera and touchscreen allow it to make video calls, monitor your security cameras, visually display some forms of information, and introduce new use cases with Alexa Skills that incorporate a screen. For instance, you can say, “Alexa, show me the trailer for the new Spiderman movie,” or scroll through recommendations for local pizzerias. While this adds to the user experience, the screen itself isn’t driving all of the improvement we are seeing with Alexa. Instead, numerous software updates have expanded the ways Alexa can contribute to our daily lives. The Echo Show showed a nearly 20% improvement in its ability both to answer local questions (“Where can I find good barbecue?”) and to respond to commands (“Cancel my 2:00 p.m. meeting tomorrow”). Both of these changes are driven by software improvements, not the addition of the screen.

Google Home: Quickly adding features to pass Alexa. Google Home improved its local and commerce results by 46 and 24 percentage points, respectively. This represents a broadening of its skills on top of already strong navigation, information, and local scores. Google Home also supports up to 6 different user accounts, meaning your whole family can get personalized responses: when you say, “Okay Google, what’s on my calendar today?” Google Home will recognize your voice and read your upcoming events. Separately, commerce is an area that was previously dominated by Amazon, but Google is now at parity, mainly due to its superior ability to understand more diverse natural language. While Alexa still has a larger database of add-on skills, Google Home outperformed it on our set of queries.

Future home assistant competition looks intense. While Amazon and Google are the current frontrunners in the home assistant race, they are facing competition from several notable future entrants:

  • Apple HomePod (expected December 2017)
  • Alibaba Tmall Genie (released August 8th, 2017)
  • Microsoft Invoke (expected Fall 2017)
  • Lenovo Smart Assistant (utilizing Alexa, expected Fall 2017)
  • HP Cortana Speaker
  • Samsung Vega

Persisting Problems of Home Assistants. While home assistants continue to make noticeable improvements, we still believe they are in the early innings of a platform that will become an important part of computing in the future. That being said, there are small, technologically reasonable improvements that we would like to see from these products. Our main complaint is the lack of integration with other devices to make use of information or take further action. In most cases, the fastest way to get information to a user is on a screen; it’s hardly convenient to have a list of 10 restaurant recommendations read to you one at a time. Instead, you should be able to call up information verbally and have it sent to your smartphone, computer screen, or television. The Echo can interact with your phone via the Alexa app, Google Home can control a Chromecast, and both can control certain smart home devices. There is clear progress on this front, but it remains a key obstacle to the devices’ effectiveness. Another persistent shortcoming is unsatisfactory natural language processing, an added barrier to widespread use. Both assistants were selective about how a question had to be phrased in order to be answered correctly. For example, Google Home will understand “What is a gigawatt?” but cannot process “Google gigawatt” or “Tell me what a gigawatt is.” For digital assistants to reach widespread adoption, users need to be able to interact with them seamlessly.

Overall, we were impressed by the improvement that took place in a few short months and remain optimistic that the technology will continue to advance at this pace going forward.  As new players enter the space and homes become more connected, the technology in these devices will be increasingly important in our everyday lives.  Later this year we will track the further progress made by the Echo and the Home, and compare them to some of the new entrants set to arrive by the end of 2017.

Disclaimer: We actively write about the themes in which we invest: artificial intelligence, robotics, virtual reality, and augmented reality. From time to time, we will write about companies that are in our portfolio. Content on this site including opinions on specific themes in technology, market estimates, and estimates and commentary regarding publicly traded or private companies is not intended for use in making investment decisions. We hold no obligation to update any of our projections. We express no warranties about any estimates or opinions we make.

Feedback Loup: iPhone User Survey

In celebration of the iPhone’s 10th anniversary, we went to the University of Minnesota and the surrounding area to ask 25 people a series of questions to get their insights on the iPhone.  Most insightful were the answers to questions about respondents’ favorite apps and what they would change about their iPhones.

  • Regardless of phone, what is your favorite app?

  • What would you change about or add to the iPhone?


Feedback Loup: Snapchat Announces Diminished Reality

On Tuesday, Snapchat rolled out new features to its platform, including an infinite snap timer, looped videos, emoji drawing, and a magic eraser. With the new snap timer and looped videos, recipients will now be able to see a picture or video until they choose to exit the snap. Once the recipient exits the snap, it is deleted. These are nominal improvements, but they show the direction and emphasis of Snap’s R&D as well as its technical chops in the field of augmented reality.

We are most excited about the Magic Eraser feature, an example of Diminished Reality. The Magic Eraser allows users to remove objects from a photo by scanning the surrounding colors and filling in over a selected area. Let’s play a quick game of Photo Hunt. How many changes do you see?
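Snap hasn’t said how Magic Eraser is implemented, but classic image inpainting captures the basic idea: mark the object you want gone, then fill that region from the surrounding pixels. A rough sketch using OpenCV’s built-in inpainting (the file names and mask rectangle are placeholders, not Snap’s actual approach):

```python
import cv2
import numpy as np

# Placeholder inputs: a photo and a mask marking the object to erase (white = remove).
photo = cv2.imread("photo.jpg")
mask = np.zeros(photo.shape[:2], dtype=np.uint8)
mask[100:200, 150:250] = 255  # hypothetical rectangle covering the unwanted object

# Fill the masked region from surrounding pixels using Telea's fast-marching method.
erased = cv2.inpaint(photo, mask, 3, cv2.INPAINT_TELEA)
cv2.imwrite("photo_erased.jpg", erased)
```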


Feedback Loup: Clips

Yesterday Apple released Clips, a new app for iPhone users. Apple describes Clips as “a new iOS app for making and sharing fun videos with text, effects, graphics, and more.” And Clips is fun, but it doesn’t show us the kind of augmented reality lenses and layers that we were hoping to see from Apple.

We’ve written a lot about how AR will change the way we interact with computers. Over the next several years, the smartphone will increasingly become a window through which users can see an augmented world. Players like Apple and Google are well-positioned to win the jump ball to own the dominant operating systems in that new paradigm. Google’s leadership in core disciplines like maps, data, and content makes it an important incumbent. Apple’s leadership among app developers and in payments will be important, but we think design is Apple’s trump card in AR. But Clips is more filters and effects than lenses and layers. There is an interesting real-time transcription capability, but unfortunately Clips is short on true AR.

In about five minutes, I was able to put together a short video with text, effects, filters, and music. Clips uses fairly rudimentary real-time computer imaging, but this could be the beginning of the underlying technology that will one day direct you to your seat in a stadium, overlay talking points during a presentation, or provide instructions as you assemble new furniture.


Feedback Loup: Google Daydream

Google’s smartphone-powered VR platform, Daydream, represents the company’s most significant push to date in its effort to accelerate the adoption of VR. We’ve spent the last few weeks testing the platform with a Pixel phone and a Daydream View headset. Bottom line: Daydream isn’t there yet, but the platform establishes a solid foundation for the future of “low-immersion” VR.

We continue to believe that smartphone-powered, low-immersion platforms like Daydream, Samsung’s Gear VR, and Google Cardboard will drive the global VR user base above 100 million by 2018, and we expect the vast majority of VR users to be on low-immersion VR over the next several years. Low-immersion platforms are the on-ramp to high-immersion platforms like the Oculus Rift and HTC Vive, so understanding today’s low-immersion platforms is important for anticipating broader high-immersion use and the future of VR.

Hardware, software, and content are all critical components for the future of VR, but our experience with Daydream left us feeling that content represents the biggest near-term opportunity to show the power of VR.

Hardware: Daydream is powered by Daydream-ready Android phones running the Nougat operating system. Currently, there are 4 Daydream-ready phones, including Pixel, with (many) more on the way. After a month-long wait, we used a Pixel ($649) for our testing of the Daydream platform. These phones pair with the Daydream View headset ($79), which is the best smartphone-powered VR headset we’ve ever used.


Unlike some headsets we’ve tried, Daydream View is wearable. The soft fabric and angled head strap are clear signs of thoughtful design, built for wearability. Plus, in what seems like an industry first, it’s even comfortable for users with glasses. Daydream View comes with a remote control that we found easy to set up and intuitive to use. The remote conveniently nests in the viewer when not in use. Sound can be heard directly from the phone’s speakers, but it’s more immersive to use the easily accessible headphone jack on the Pixel.
