AirPods Are More Important Than The Apple Watch

At this point, it might not even be that crazy to say it, but we think AirPods are going to be a bigger product for Apple than the Watch. After using AirPods for the past month, the Loup Ventures team is addicted. The seamless pairing and unpairing with our phones and hands-free access to Siri have meaningfully improved the way we work and consume content. AirPods are a classic example of Apple not doing something first, but doing it better. And they look cool. We think there are three reasons that AirPods are more important than the Apple Watch.

AI-First World
Google has been talking about designing products for an AI-first world for about a year now. In our view, an AI-first world is about more natural interfaces for our screen-less future. Speech is an important component of the next interface. Siri, Alexa, Google Assistant, and Cortana are making rapid improvements in terms of voice commands they understand and what they can help us with.

We view AirPods as a natural extension of Siri that will encourage people to rely more on the voice assistant. As voice assistants become capable of having deeper two-way conversations to convey more information to users, AirPods could replace a meaningful amount of interaction with the phone itself. By contrast, using Siri on the Apple Watch is less natural because it requires you to hold it up to your face. Additionally, the screen is so small that interaction with it and information conveyed by it is not that much richer than an AI voice-based interface.


The Five Senses of Computing

The trend in computing towards more natural user interfaces is unmistakable. Graphical user interfaces have long been dominant, but machines driven by more intuitive inputs, like touch and voice, are now mainstream. Today, audio, motion, and even our thoughts are the basis for the most innovative computer-user interaction models, powered by advanced sensor technology. Each computing paradigm maps to one or more of the five human senses; exploring each sense gives us an indication of the direction in which technology is heading.

Sight – Graphical User Interface

The introduction of the graphical user interface (GUI) drove a step function change in computers as productivity tools, because users could rely heavily on sight, our dominant sense. The GUI was then carried forward and built on with the advent of touchscreen devices. The next frontier for visual user interfaces lies in virtual reality and augmented reality. Innovations within these themes will further carry forward the GUI paradigm. VR and AR rely heavily on sight, but combine it more artfully with other inputs like audio, motion, and touch to create immersive interfaces.

Touch – Touchscreen Devices

PCs leveraged basic touch as a foundational input via the keyboard and the mouse. The iPhone then ushered in a computing era dominated by touch, rejecting the stylus in favor of, as Steve Jobs put it, “the best pointing device in the world” – our fingers. Haptics have pushed touchscreen technology further, making it more sensory, but phones and tablets fall well short of truly immersive computing. Bret Victor summarized the shortcomings of touchscreen devices in his 2011 piece, A Brief Rant on the Future of Interaction Design, which holds up well to this day.

More fully integrating our sense of touch will be critical for the user interfaces of the future. We think that haptic suits are a step we will take on the journey to full immersion, but the best way to trick the user into believing he or she is actually feeling something in VR is to manipulate the neurochemistry of the brain. This early field is known as neurohaptics.

Hearing – Digital Assistants & Hearables

Computers have been capable of understanding a limited human spoken vocabulary since the 1960s. By the 1990s, dictation software was available to the masses. Aside from limited audio feedback and rudimentary speech-to-text transcription, computers did not start widely leveraging sound as an interface until digital assistants began to be integrated into phones.

As digital assistants continue to improve, more and more users are integrating them into their daily routines. In our Robot Fear Index, we found that 43% of Americans had used a digital assistant in the last three months. However, our study of Amazon Echo vs. Google Home showed that Google Home answered just 39.1% of queries correctly vs. the Echo at 34.4%. Clearly, we’re early in the transition to audio as a dominant input for computing.

Hearables, like Apple’s AirPods, represent the next step forward for audio as a user interface.


AirPods: The First Mass Market Hearable

Apple’s AirPods are a step towards the future of computing. Starting with the move from the keyboard and mouse to the touchscreen, computing continues to move towards more intuitive user interfaces. Audio and motion capture are the basis for most innovative computer hardware today, including wearables. And we think that voice-controlled wearables, like AirPods, show a lot of promise. We call them hearables.

In order to look forward to the impact of AirPods on the future of computing, it helps to first look back at how sound has evolved as an interface. Computers have been capable of understanding a limited human vocabulary since the 1960s. By the 1990s, dictation software was available to the masses. Aside from limited audio feedback and rudimentary speech-to-text transcription, computers did not widely leverage sound as an input or an interface until natural language processing matured in the early 2000s.

As digital assistants continue to improve, more and more users are integrating them into their daily routines, and AirPods make that even more convenient. In our Robot Fear Index, we found that 43% of Americans had used a digital assistant in the last three months. However, our study of Amazon Echo vs. Google Home using 800 different everyday queries showed that Google Home answered just 39.1% of the queries correctly vs. the Echo at 34.4%. We’re early in the transition to audio as a dominant input for computing.

Hearables, like Apple’s AirPods, represent a giant leap forward for audio as a user interface. Now Siri is always available while your phone stays in your pocket – or at least comes out of your purse or pocket less frequently. AirPods can handle information requests, dictation, media control, and phone calls; meanwhile, quick glances at the Apple Watch on your wrist suffice for most notifications. As Jason Calacanis declared just yesterday, “AirPods are the new smartphone.” And we believe audio as a UI is a key enabler of AR technology. AirPods may not be perfect, but they’ll get better, smarter, and easier to use. They are just the beginning for hearables and a new wave of computing.

We surveyed 55 AirPods users to better understand the state of the early hearables market. We were surprised that AirPods received a net promoter score (NPS) of -2, which means that detractors, passives, and promoters were split roughly into thirds, with slightly more detractors than promoters.
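For readers unfamiliar with how NPS is computed: respondents rate likelihood to recommend on a 0–10 scale, with 9–10 counted as promoters, 7–8 as passives, and 0–6 as detractors; NPS is the percentage of promoters minus the percentage of detractors. A minimal sketch, using a hypothetical 55-response sample (not our actual survey data) split roughly into thirds with one more detractor than promoter:

```python
def nps(scores):
    """Net promoter score from 0-10 'likelihood to recommend' ratings.

    Promoters rate 9-10, detractors 0-6; passives (7-8) count only
    in the denominator. Result ranges from -100 to +100.
    """
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return round(100 * (promoters - detractors) / len(scores))

# Hypothetical sample: 18 promoters, 18 passives, 19 detractors.
sample = [10] * 18 + [7] * 18 + [3] * 19
print(nps(sample))  # -> -2
```

With 18 promoters and 19 detractors out of 55, the score works out to 100 × (18 − 19) / 55 ≈ -2, matching the headline figure.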

While an NPS of -2 could actually represent relative outperformance within the Bluetooth headphones category, AirPods clearly have some kinks to work out if hearables are going to perform more of our daily computing. Among detractors and passives, half identified the ear fit as the opportunity for improvement, not the software or functionality. So, we remain convinced that AirPods and other hearables will play a big role in shaping the future of how we interface with our devices and with each other.

Disclaimer: We actively write about the themes in which we invest: artificial intelligence, robotics, virtual reality, and augmented reality. From time to time, we will write about companies that are in our portfolio.  Content on this site including opinions on specific themes in technology, market estimates, and estimates and commentary regarding publicly traded or private companies is not intended for use in making investment decisions. We hold no obligation to update any of our projections. We express no warranties about any estimates or opinions we make.

Manifesto

The Future Perfect: Rediscovering Utopia

Technology ruined our utopia. When did humans have it better than the dawn of time? We were born with everything and nothing. The entire world was ours. Since everyone had nothing, everyone had everything. We had no property, no houses. We were free to roam and inhabit as we pleased. We only worried about survival — finding enough food and avoiding dangerous predators. We didn’t have to worry about 401(k)s or what car the neighbors just bought. There wasn’t a 1 percent or 99 percent. We didn’t have politics. We just had survival. Humanity at its purest.

Then invention doomed us. It was innocent at first. Innovation made it easier to survive. Food and safety became essentially guaranteed, so we needed to find other things to define our lives. Then inventions became those things. Things not for survival, but for status. For having something someone else didn’t. For benefitting unequally based on that ownership. For handing down to the next generation so they didn’t start with nothing. Things became the new goal of survival and they defined our differences. People who had things treated people without them differently. We went to war with others who had things we wanted.

Now most of us are born with nothing. We have to earn everything, buy everything. We’re trapped in a system that forces us to chase things that maintain our differences. We innovated ourselves out of utopia and into industry, and innovation is the only way to get our utopia back.
