Faceoff: Amazon Echo Show vs Google Home Part II

As part of our continuing effort to understand how, and how quickly, artificial intelligence is entering our everyday lives, we reexamined the two home assistants from a study we performed in February. The two most popular assistants, Google Home and the Amazon Echo, were put to the test again, this time substituting the Echo Show, which adds a 7″ touchscreen, for the Echo.

Methodology. For this experiment, we asked the same 800 queries of both the Echo Show and the Google Home, as in our first study. We graded each query on two metrics: first, did the device correctly understand what we asked? Second, did the device answer the query correctly? In our study, Amazon’s Echo Show understood 95.88% of the queries we asked and answered 53.57% of all queries correctly. Google Home understood 94.63% of the queries we asked, but was able to answer 65.25% correctly. Below, you can see the improvements that each home assistant made since our last set of queries.

One advantage the Amazon Echo Show has when it comes to understanding queries is that we can confirm the data using Amazon’s companion app, which gives the user a live feed of what the Echo Show heard. Google Home does not offer a transcript of what its device picked up. Because of this, it was difficult to tell whether Google Home understood a query but couldn’t answer it, or truly had a harder time understanding queries. Since we were unable to see exactly how well Google Home understood our queries, we assumed that if it responded that it was unable to perform a certain function, then it had understood the query correctly. For example, if we asked, “Hey Google, send a text to John” and received the response “Sorry, I can’t send texts yet,” then the query was marked as understood correctly but answered incorrectly.
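To make that grading rule concrete, here is a minimal sketch of how the two metrics could be tabulated under the assumption above. The class, phrases, and function names are illustrative only, not the tooling we actually used.

```python
# Illustrative sketch of the grading rule; names and data are hypothetical.
from dataclasses import dataclass

@dataclass
class QueryResult:
    transcript_matches: bool   # did the device hear the query correctly (where a transcript exists)?
    response: str              # what the assistant said back
    answer_correct: bool       # was the substance of the answer right?

CANT_DO_PHRASES = ("i can't", "i'm not able to", "sorry, i can’t")

def grade(result: QueryResult) -> tuple:
    """Return (understood, answered) for one query."""
    # A "can't do that" response counts as understood but not answered,
    # mirroring the assumption used for Google Home above.
    refused = any(p in result.response.lower() for p in CANT_DO_PHRASES)
    understood = result.transcript_matches or refused
    answered = understood and result.answer_correct
    return understood, answered

def summarize(results: list) -> tuple:
    """Return (% understood, % answered correctly) across all queries."""
    graded = [grade(r) for r in results]
    n = len(graded)
    understood_pct = 100 * sum(u for u, _ in graded) / n
    answered_pct = 100 * sum(a for _, a in graded) / n
    return understood_pct, answered_pct
```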

Results. Both home assistants showed improved performance across the board. This time Google Home outperformed the Echo Show in total correct answers by nearly 12 percentage points, up from a 5-percentage-point gap in our February results. While each digital assistant has its strengths and weaknesses, Google Home outperformed its rival in 3 of the 5 query categories by a surprising margin. This is significant because it shows not only rapid improvement but outperformance of Amazon, which has both a two-year head start and a near-70% share of the home assistant market vs. Google’s 24% share, according to eMarketer.

Both Home Assistants Notably Improved in Navigation. The most dramatic increase for both assistants was in navigation. In February, over 90% of navigation questions were answered with: “I can’t help you with that.” Today, navigation is the best category for both the Google Home and the Echo Show, with the Google Home answering 92% of queries correctly, and the Echo Show answering 63% of queries correctly.

Echo Show: Screen adds to experience, but software upgrades drive improvement. The Echo Show’s camera and touchscreen allow it to make video calls, monitor your security cameras, visually display some forms of information, and introduce new use cases through Alexa Skills that incorporate a screen. For instance, you can say, “Alexa, show me the trailer for the new Spiderman movie,” or scroll through recommendations for local pizzerias. While this adds to the user experience, the screen itself isn’t driving all of the improvement we are seeing with Alexa. Instead, numerous software updates have expanded the ways Alexa can contribute to our daily lives. The Echo Show showed a nearly 20% improvement in its ability to answer local questions (“Where can I find good barbecue?”) and respond to commands (“Cancel my 2:00 p.m. meeting tomorrow”). Both of these changes are driven by software improvements, not the addition of the screen.

Google Home: Quickly adding features to pass Alexa. Google Home improved its local and commerce results by 46 percentage points and 24 percentage points, respectively. This represents a broadening of its skills along with high navigation, information, and local scores. Google Home also supports up to 6 different user accounts, meaning your whole family can get personalized responses when you say, “Okay Google, what’s on my calendar today?” Google Home will recognize your voice and read your upcoming events. Separately, commerce is an area that was previously dominated by Amazon, but Google is now at parity, mainly due to its superior ability to understand more diverse natural language. While Alexa still has a larger database of add-on skills, Google Home outperformed in our set of queries.

Future home assistant competition looks intense. While Amazon and Google are the current frontrunners in the home assistant race, they are facing competition from several notable future entrants:

  • Apple HomePod (expected December 2017)
  • Alibaba Tmall Genie (released August 8th, 2017)
  • Microsoft Invoke (expected Fall 2017)
  • Lenovo Smart Assistant (utilizing Alexa, expected Fall 2017)
  • HP Cortana Speaker
  • Samsung Vega

Persistent Problems of Home Assistants. While home assistants continue to make noticeable improvements, we still believe they are in the early innings of a platform that will become an important part of computing. That said, there are small, technologically reasonable improvements we would like to see from these products. Our main complaint is the lack of integration with other devices to make use of information or take further action. In most cases, the fastest way to get information to a user is on a screen – it’s hardly convenient to have a list of 10 restaurant recommendations read to you one at a time. Instead, you should be able to call up information verbally and have it sent to your smartphone, computer screen, or television. The Echo is able to interact with your phone via the Alexa app. Google Home can control a Chromecast. Both are able to control certain smart home devices. There is clear progress being made on this front, but it remains a key obstacle to the devices’ effectiveness. Another shortcoming that persists is unsatisfactory natural language processing, an added barrier to widespread use. Both assistants were particular about how a question had to be phrased in order to answer it correctly. For example, Google Home will understand, “What is a gigawatt?” but cannot process, “Google gigawatt.” or, “Tell me what a gigawatt is.” For digital assistants to reach widespread adoption, users need to be able to interact with them seamlessly.

Overall, we were impressed by the improvement that took place in a few short months and remain optimistic that the technology will continue to advance at this pace going forward.  As new players enter the space and homes become more connected, the technology in these devices will be increasingly important in our everyday lives.  Later this year we will track the further progress made by the Echo and the Home, and compare them to some of the new entrants set to arrive by the end of 2017.

Disclaimer: We actively write about the themes in which we invest: artificial intelligence, robotics, virtual reality, and augmented reality. From time to time, we will write about companies that are in our portfolio. Content on this site including opinions on specific themes in technology, market estimates, and estimates and commentary regarding publicly traded or private companies is not intended for use in making investment decisions. We hold no obligation to update any of our projections. We express no warranties about any estimates or opinions we make.

AGV Deep Dive: How Amazon’s 2012 Acquisition Sparked a $10B Market

Special thanks to Austin Bohlig for his work on this note. 

Amazon’s 2012 acquisition of Kiva Systems was the spark that ignited the Autonomous Guided Vehicle (AGV) industry, which we believe will represent a $10B market by 2025. We’ve taken a deep dive into the AGV market, identifying the different use cases for AGVs, the market leaders and opportunity, and the specific areas where we see the best investment opportunities.

We believe the aggregate robotics hardware market will grow 17% y/y to $24.5B in 2017, and by 2025 we believe the market will eclipse $73B. When including supporting software and services, we believe the total robotics market will be more than $200B in the next ten years. Many categories within the 5 robotics domains (industrial, commercial, domestic, military and social entertainment) will flourish over this time frame.

We are particularly excited about the impact three categories will have on the world: collaborative robots (co-bots), domestic robots (aka robot vacuums, mops and lawnmowers), and Autonomous Guided Vehicles. While we have recently picked up positive data points in the Co-bot and domestic robot markets, the AGV market is a little bit harder to track due to the limited number of publicly traded companies in the space. However, based on the number of AGVs Amazon is deploying internally, as well as the amount of funding and M&A activity occurring in the space, we are convinced this sub-segment of the commercial robot market is inflecting.

What Is An Autonomous Guided Vehicle (AGV)?

AGVs are mobile robots used in manufacturing and other commercial industries to improve logistics efficiency by transporting goods and other materials autonomously. The major benefits of AGVs are twofold: 1) these robots do not require human interaction when deployed; and 2) AGVs do not require the supporting infrastructure (tracks, floor sensors, etc.) needed to operate legacy material handling equipment. Without the need for supporting infrastructure, these robots are more flexible and have a lower total cost of ownership. Advancements in Simultaneous Localization and Mapping (SLAM) software and computer vision allow these robots to understand their surrounding environment in real time, which lets them operate in variable surroundings and around people. Pricing on AGVs has come down significantly over the last 5 years, which has been a catalyst for the industry. Today, AGV pricing ranges from $35K to $50K (not including supporting software and services). Below we highlight a few examples of AGVs in the market today.

Amazon Sparked the AGV Industry

The AGV market flew under the radar throughout the early 2000s, but in 2012 the industry became one of the most talked-about sub-markets in robotics after Amazon acquired the AGV leader, Kiva Systems, for $775M. Amazon had no plans to sell these robots externally, using them only internally to improve logistics efficiency within its fulfillment centers, which created a significant supply/demand imbalance and a massive opportunity for other companies to enter the space. Since deploying Kiva robots, Amazon has publicly highlighted the positive impact the robots are having on productivity and efficiency. According to a 2017 Business Insider article, Amazon has deployed 15K mobile robots annually since 2014 and now has over 45K robots in operation across 20 fulfillment centers. These data points show the benefits of AGVs and validate that this market is real.

AGV Applications: Today Warehouses; Tomorrow Hospitals, Hotels, and Beyond

Today, most AGVs are deployed within warehouses and fulfillment centers to automate material handling and package logistics. Robots in these settings autonomously retrieve a shelf or bin and return it to a packing station, where a human employee picks specific SKUs out of the bin. Or, more commonly, a human will follow the AGV around a warehouse, and the AGV will stop in front of specific spots where the human then places the desired product in a bin. While most AGV products must be purchased outright, a few companies can retrofit legacy equipment with autonomous technologies and transform it into AGVs. A few companies are also taking automation to the next level by adding a robot arm to pick the desired object, taking humans completely out of the equation. While this is where the industry is heading, object recognition and grasping are two of the toughest challenges to solve in this space. Random pick-and-place is considered the “holy grail” of robotics, and it will take time before humans are fully removed from the warehouse.

While we believe AGV adoption within warehouses and fulfillment centers will be a key industry driver, the opportunity in other verticals will add meaningful tailwinds to this market. For example, AGVs are already being deployed in hospitals to autonomously transport food, prescriptions, and other medical supplies throughout a facility. In addition, manufacturers across industries are adopting these technologies because of their cost advantages and flexibility over legacy solutions. We also see a large opportunity for AGVs in commercial services settings, such as delivering products to rooms in a hotel, and, eventually, for companies such as Amazon to use AGVs to deliver packages autonomously.


How the Future of Voice Search Affects Marketers Today

Written by guest author Lindsay Boyajian at Conductor

Since Amazon announced its acquisition of Whole Foods, the running joke across social media has been, “Jeff Bezos said to Alexa, ‘Buy me something on Whole Foods,’ and Alexa bought Whole Foods.”

This quip highlights the shortcomings that plague voice search. Today, voice recognition technology is very much flawed and often falls short in delivering on the user’s intent.

Despite its weaknesses, voice search is promising to be the user input of tomorrow. The major tech companies are investing heavily in the technology— Apple has Siri, Amazon has Alexa, Google has Google Assistant, and Microsoft has Cortana. Even with the technology in its nascency, Google reports 20 percent of queries on its mobile app and Android devices are voice searches.

And thanks to artificial intelligence and machine learning, voice search is improving quickly. It improves with every user interaction, becoming more adept at understanding user intent. As the technology advances, more users will adopt voice search, fueling the growth cycle.

The work that is going into voice recognition technology today will power the next evolution in computing— augmented reality.

Augmented Reality & Voice Search

Augmented reality (AR) represents a new computing paradigm: it overlays digital assets on the real-world environment. The technology promises to change how users interact with the digital world.

Soon, everything from office activities to shopping will be experienced through augmented reality. For instance, a shopper will be able to put on a lightweight pair of AR glasses to visualize in 3D what different couches will look like in her home. Some AR experiences like this are already offered today through head-mounted devices like Hololens and Meta. However, these devices are only available to developers and still have their limitations. They are not ready for mass consumer adoption.

The principal user inputs for augmented reality devices (excluding hardware accessories like keypads and clickers) are gesture and voice. The issue with gesture controls is user discomfort and fatigue. Many experts agree that voice will be the primary input for these devices.

As the augmented reality space matures, so will the importance of voice search.

The tech company with the most advanced voice recognition technology will have an advantage in augmented reality computing.

Optimizing Organic Search for the Future of Voice Search

Although mass consumer adoption of AR hardware is still years away, brands that optimize for voice search early will lead in organic and search marketing when the technology becomes ubiquitous.

Voice search behavior differs from traditional search patterns. Consumers approach voice search using natural, more conversational language. The queries are often longer and delivered as questions.

The result for marketers is that content optimized only for keywords will falter, while content that delivers value and matches the intent of the user will see improved organic search performance. To do this, marketers need to develop a deeper understanding of their customers to deliver content that provides relevant and timely value. This approach to marketing is known as customer first marketing.

Customer first marketing is not new. More and more brands are quickly adopting a customer-centric marketing approach. Relevant and contextual content drives traffic, fosters customer engagement, and builds loyalty. The rise of voice search and its link to the future of augmented reality only makes adopting a customer first marketing strategy even more advantageous for brands and marketers.

This piece originally appeared on LinkedIn. For more, follow Lindsay Boyajian on Twitter and LinkedIn.

Disclaimer: We actively write about the themes in which we invest: artificial intelligence, robotics, virtual reality, and augmented reality. From time to time, we will write about companies that are in our portfolio.  Content on this site including opinions on specific themes in technology, market estimates, and estimates and commentary regarding publicly traded or private companies is not intended for use in making investment decisions. We hold no obligation to update any of our projections. We express no warranties about any estimates or opinions we make.

Robotics Software and Services is Where the True Value Lies

We recently published a six-part series on the future of robotics including a detailed outlook through 2025 on the five major robotics categories: Industrial, Commercial, Domestic, Military, and Social. Each part highlighted our thesis, outlook and market size for each category of robotics hardware, but we expect robotics software and services to be even larger.

In total, we believe the robotics hardware market will grow from $20.9B in 2016 to $73.0B in 2025, representing a roughly 14.9% compound annual growth rate. However, our estimate only includes hardware sales, not software or other supporting services for robotics. When factoring in these additional markets, we believe the total robotics market value could be roughly 3x larger, including $73B in hardware plus $140B+ in software and services. We believe the hardware and components used for robotics will largely be commoditized over the next 10 years. Differentiated hardware will be the exception to the rule, and unique value will be driven by software and services.
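For reference, the implied growth rate follows from the standard CAGR formula; the ~14.9% figure corresponds to the nine compounding years between the 2016 and 2025 estimates:

```latex
\mathrm{CAGR} \;=\; \left(\frac{V_{2025}}{V_{2016}}\right)^{1/9} - 1
\;=\; \left(\frac{73.0}{20.9}\right)^{1/9} - 1 \;\approx\; 14.9\%
```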

Advancements in software, including robot control software and data analytics platforms, will be key to improving robot functionality and, in turn, driving further adoption. For example, an unmanned traffic management platform to track robots that move autonomously through society will be crucial to large-scale deployment of robotics. We view traffic management as one of the more attractive investment opportunities in the robotics space. We also believe robotics as a service (RaaS) will continue to gain momentum, and we view this business approach as an attractive value proposition for businesses of all sizes. In the paragraphs below, we dive deeper into key robot software platforms, as well as the robotic services driving growth across all five robot markets.

Robot Control Software

Programming a robot is one of the most challenging, costly, and time-consuming tasks involved in creating an autonomous machine. Few companies that employ robotics will build proprietary control software from scratch. Most will lean on standardized, open-source robotics platforms to program robots. The Open Source Robotics Foundation has developed the Robot Operating System (ROS), a collection of tools that simplifies the task of programming robots across a wide variety of platforms. ROS has become the preferred platform for programming and has significantly accelerated the number of robotics companies coming to market. However, due to advancements in computer vision and artificial intelligence, companies are developing more efficient and cost-effective ways to train robots. Because of the lower costs and smaller form factors of 3D cameras, LIDAR, and RADAR sensors, robots can better understand their surrounding environment. Coupled with machine learning, these new technologies are allowing robots to learn on their own. A future in which robots are self-taught in real time is still many years away, but we believe further advancements in artificial intelligence will significantly narrow this gap and allow robots to become more human-like.
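To give a sense of what programming against ROS looks like in practice, here is a minimal sketch of a ROS 1 node written with rospy that publishes velocity commands to a mobile base. The node and topic names are illustrative; real robots differ in the topics and message types they expect.

```python
#!/usr/bin/env python
# Minimal ROS 1 (rospy) node that publishes velocity commands to a mobile base.
# "cmd_vel" is a common convention, but the actual topic varies by robot.
import rospy
from geometry_msgs.msg import Twist

def drive_forward():
    rospy.init_node("simple_driver")
    pub = rospy.Publisher("cmd_vel", Twist, queue_size=10)
    rate = rospy.Rate(10)  # publish at 10 Hz

    cmd = Twist()
    cmd.linear.x = 0.2  # drive forward at 0.2 m/s

    while not rospy.is_shutdown():
        pub.publish(cmd)
        rate.sleep()

if __name__ == "__main__":
    try:
        drive_forward()
    except rospy.ROSInterruptException:
        pass
```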

Data Analytics Software

Due to the advanced sensors that robots carry, machines are gathering incredible amounts of valuable data every day. We believe one of the most attractive investment opportunities in the years to come will be companies capable of taking this data and turning it into actionable insights for businesses to improve efficiencies and productivity. Providing enterprises with affordable, real-time intelligence will be a game-changer for many and a driver of robot adoption. In robotics industries, such as the commercial drone market, early adopters have quickly focused less on the drone hardware and more on the data gathered by the drone. We believe that over time this data-focused shift will continue, and robots will be seen as automated data collectors in many industries. Today, it can take days to remotely process data from robots. Cloud computing is helping to process these large data sets, but leaders in the industry will need to be able to process the data in real-time onboard the robots and provide businesses with answers immediately.
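As a toy illustration of the onboard, real-time approach described above, the sketch below reduces a raw sensor stream to a small actionable event before anything leaves the robot. The sensor, class name, and threshold are assumptions for illustration only.

```python
# Illustrative sketch: reduce a raw onboard sensor stream to actionable events
# in real time, so only summaries and alerts (not raw data) leave the robot.
from collections import deque
from typing import Optional

class VibrationMonitor:
    """Flags anomalous vibration readings against a rolling baseline."""
    def __init__(self, window: int = 100, threshold: float = 3.0):
        self.readings = deque(maxlen=window)
        self.threshold = threshold  # multiples of the rolling mean

    def update(self, value: float) -> Optional[dict]:
        self.readings.append(value)
        if len(self.readings) < self.readings.maxlen:
            return None  # still building a baseline
        baseline = sum(self.readings) / len(self.readings)
        if baseline > 0 and value > self.threshold * baseline:
            # Only this small event record would be uplinked, not the raw stream.
            return {"event": "vibration_anomaly", "value": value, "baseline": baseline}
        return None
```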

Unmanned Traffic Management Software

For robots to be deployed at scale and operate autonomously within society, we believe an unmanned traffic management platform for air, ground, and sea is required. In the near term, we believe the focus will be on implementing a drone traffic management system that provides situational awareness for other drone operators and manned aircraft pilots. Unmanned aerial vehicles are used in a growing variety of applications, and by 2020 we expect over 400K commercial drones to be sold annually. For manned aircraft and drones to operate together, both will need to communicate on a common platform, which will allow drones to fly routinely beyond visual line of sight. Amazon and Alphabet are both working on proprietary drone management systems, but we’ve seen a handful of smaller companies working on sophisticated software platforms, and we believe there will be many different services available to provide situational awareness to the aircraft community. Leaders in the space will be able to support management of drone operators, including flight planning, flight approval, tracking, and remote identification. We believe similar tracking systems should also be in place for ground and marine robots, but a single system that works for air, land, and sea vehicles in conjunction may be the ultimate goal.
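To make the idea concrete, here is a hypothetical sketch of the kind of flight-plan record such a platform might accept and approve. The field names and approval rule are illustrative assumptions, not drawn from any real UTM specification.

```python
# Hypothetical flight-plan record for a UTM platform; field names are
# illustrative, not taken from any real UTM specification.
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class FlightPlan:
    operator_id: str                      # remote identification of the operator
    drone_id: str                         # remote identification of the aircraft
    waypoints: List[Tuple[float, float]]  # (latitude, longitude) route
    altitude_m: float                     # planned altitude above ground level
    start_time_utc: str                   # e.g., "2017-09-01T14:00:00Z"
    duration_min: int
    beyond_visual_line_of_sight: bool = False

def request_approval(plan: FlightPlan) -> bool:
    """Toy approval check: a real UTM service would deconflict the plan against
    other filed plans, airspace restrictions, and manned traffic."""
    return plan.altitude_m <= 120 and plan.duration_min <= 60
```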

Robotics as a Service

While advancements in software are improving robot functionality, new services will also be a growth catalyst for the industry. Given that certain robot-related costs remain high, many companies are starting to offer robotics as a service (RaaS) so that more businesses can benefit from the advantages of robot technology. Under this model, customers can either lease robots or have the RaaS provider visit to perform a specific task with a robot and deliver the data gathered. This service model is common within the commercial drone industry, for example, where businesses may not have the pilot expertise to operate a drone and are primarily after the data gathered. We believe RaaS is gaining momentum across multiple robotics domains, and we view the offering as an attractive value proposition for businesses of all sizes.

Delivery Robots

Package delivery via drone or ground robot represents one of the largest, if not the largest, opportunities for robotics services. While we do not see robot package delivery being deployed at scale in the US for at least five years due to current regulations, we believe package delivery is real and represents a multi-billion-dollar market opportunity. While most package delivery will be by drone, we see a meaningful opportunity for ground robots to participate as well. For packages to be delivered by robot, we believe two things need to occur over the next couple of years: First, the US and other countries need to implement favorable regulation to allow drones to fly over people and beyond visual line of sight. Second, an unmanned traffic management system needs to be put in place allowing all stakeholders to track all robots in use. Both issues will eventually be resolved in the US, but due to more favorable regulation internationally, we believe robot deliveries could occur much sooner in foreign markets.

Bottom Line

We estimate that robot hardware will represent a $73.0B market opportunity by 2025; however, we believe many robots will be commoditized over time. Instead, the sustaining value-add in robotics will come from software and supporting services. For that reason, we believe investing in companies with a software- or service-focused business model will be the more attractive way to play the growing robotics theme, given these approaches are more scalable and, arguably, more defensible. We believe the largest software opportunities will be companies that improve the speed at which robots learn, as well as companies turning the data gathered by robots into actionable insights that businesses can use to improve efficiency and productivity. In addition, we believe companies working on unmanned traffic management platforms are crucial to unlocking new multi-billion-dollar market opportunities, e.g., drone delivery. We believe advancements in robotics software platforms will rely on innovation in artificial intelligence, cloud computing, and computer vision. With regards to services, we believe the RaaS model is gaining momentum, allowing more businesses to adopt robot technologies.

We believe a cultural shift is underway and robots are playing an increasingly crucial role in our everyday lives. Improvements in robot hardware, software and services will positively impact the industry and drive robot adoption globally.

Austin Bohlig contributed to this note. 

Disclaimer: We actively write about the themes in which we invest: artificial intelligence, robotics, virtual reality, and augmented reality. From time to time, we will write about companies that are in our portfolio. Content on this site including opinions on specific themes in technology, market estimates, and estimates and commentary regarding publicly traded or private companies is not intended for use in making investment decisions. We hold no obligation to update any of our projections. We express no warranties about any estimates or opinions we make.

Bad Culture Doesn’t Scale

The most important lesson from Uber’s travails is that bad culture doesn’t scale. Talented teams with bad culture can build fantastic businesses, but not businesses that last. A unicorn with bad culture is a unicorn with a bomb strapped to its back — it’s only a matter of time before the culture catches up and forces disruptive change. Sometimes bad culture rears its ugly head quickly, as it did with Zenefits. Sometimes it doesn’t happen until after a multi-billion-dollar-per-year business is established, as it did with Uber.

The culture at Uber wasn’t a secret. It had always been known as an aggressive one, and that culture deserves some credit for helping Uber transform the ride-hailing industry; however, the bigger and more established a company becomes, the harder it is to sustain a bad culture. Rumors spread, lawsuits happen, and good hires leave because it wasn’t what they signed up for. The media will report every painstaking detail. High-profile companies like Uber also face public backlash from customers, impacting revenue. If Uber were a publicly traded company, the stock would be down at least 30% in the past month given the CEO turmoil, and maybe down 50% for the year when adding in the Google lawsuit and other well-publicized troubles.

During our time as public equity analysts, we’ve had the opportunity to cover some great, lasting companies like Apple, Amazon, Google, and Facebook. A common thread among all four of those companies is great culture. When Steve Jobs passed away, we wrote that his greatest achievement wasn’t the iPhone, the iPod, or the Mac, but Apple itself. He left behind a culture of good people driving revolutionary innovation. That might sound simple, but not compromising on your values and consistently hiring the right people who share those values is hard. It’s especially hard for a startup trying to build quickly while bearing the pressure of venture investor expectations.

It’s hard to determine the long-term fallout of Uber’s culture problem. The company has “verbed” itself, much like Google, which allows it a significant brand advantage. One of our teammates has joked that he would, “uber us a Lyft.” With broad leadership change, including the departure of its CEO, Uber has a chance to grow new roots and overcome the negative culture that’s now detracting far more than it ever added.

Disclaimer: We actively write about the themes in which we invest: artificial intelligence, robotics, virtual reality, and augmented reality. From time to time, we will write about companies that are in our portfolio. Content on this site including opinions on specific themes in technology, market estimates, and estimates and commentary regarding publicly traded or private companies is not intended for use in making investment decisions. We hold no obligation to update any of our projections. We express no warranties about any estimates or opinions we make.