Innovate Like Apple, With Baby Steps

When Steve Jobs introduced the iPhone 10 years ago, he talked about how fortunate he had been to introduce three revolutionary products over the course of his career: the Macintosh, the iPod and the iPhone. For good reason, Apple is known for major leaps forward in innovation. But as we’ve watched the company’s progress over the last two decades, we’ve recognized a pattern of smaller, more incremental changes: baby steps. These are the steps that often disappoint Apple watchers. Sometimes the steps forward appear to be giant leaps, but those are the exception to the rule. Even Apple’s most revolutionary products have been the sum of the baby steps that preceded them. Nobody has articulated this better than Kirby Ferguson in Everything is a Remix.

Remember the Motorola ROKR E1? It was the phone we had all been dreaming of: a combination iPod and mobile phone that made it easy to load music from iTunes. Baby step. Or worse: stumble. We had all been sandwiching our Motorola RAZRs and our iPod nanos together, praying Apple would combine them. After the launch of the ROKR, Playlist summed it up nicely in their review of the device: “While I’m pleased that the phone finally saw the light of day, my pleasure just about ends there. As a phone, it’s hardly cutting edge. And as a music player, it’s a poor substitute for an iPod.” But the iPhone development team learned a lot from that 2005 project with Motorola as they prepared for the 2007 launch of the iPhone. And consumers learned to load music onto their devices, getting in the habit of using their phones as music players.

Remember the watch bands made for the 6th gen iPod nano (2010)? Apple created beautiful watch faces for the device and Phil Schiller even highlighted the trend during a keynote. Baby step. Five years later, Apple introduced Apple Watch. In the interim, the company had sold over 100M iPods and over half a billion iPhones. The Apple Watch was made possible by the technical and production capabilities Apple developed over the course of the iPod’s lifecycle in combination with the addressable market Apple created with the iPhone, which does much of the heavy lifting for the Apple Watch.

In hindsight it seems obvious, but these baby steps were critical. There are many benefits to recognizing and leveraging how gradual innovation truly is. We categorize them in two groups:

  1. Train your customers. As much as we think we want something totally new, we’re creatures of habit. We want new capabilities that fit into and streamline our routines. We want new tools that are immediately understandable. New skills require training, and your customers are no different. Baby steps in features, innovations and user interfaces help to train your customers. And they help you commercialize increasingly complex technology. The first Apple TV, for example, synced media via iTunes in exactly the same way we had been syncing media with iPods for years. We were well trained.
  2. Ramp your production capability. You learn a lot when you build your first product. You learn even more when you scale your first product to 10,000 units. But if each new product or feature is completely different, you can’t transfer the learnings from one to another. The iPhone would not exist if not for the iPod. Even though the iPhone today makes the iPod look irrelevant, it stands on its shoulders. Apple ramped its iPhone production capability to the incomprehensible level it’s at today because of the baby steps it took with the iPod line, ramping capability for flash storage, mobile displays, camera lenses, etc. Especially with physical products, ramping supply chain and production capability happens in baby steps.

We love to look carefully at the baby steps we see Apple taking today and predict what they mean for the future. Looking at the iPhone 7 Plus’s dual cameras and software features, we see Apple building a huge competitive advantage in augmented reality. Portrait mode is training customers and helping to drive demand for the hardware; meanwhile, Apple is ramping its own technical and production capability with dual lens devices. The iPhone 7 Plus will have huge implications for 3D mapping and real-time image processing in an AR world. Similarly, the haptic home button on the iPhone 7 and 7 Plus is training customers to respond to haptic button presses rather than mechanical ones. We will be well trained for the eventual iPhones that have no home button and no bezel, but use on-screen buttons with haptic feedback for home button functions.

AirPods (our new favorite toys), in combination with Apple Watch, are clear steps towards a post-mobile world. The watch takes care of notifications, nearly eliminating the need to take your phone out of your pocket for any reason other than a phone call, and AirPods eliminate the need to take your phone out of your pocket for calls. Glance at your wrist to read a text. Double tap an AirPod to initiate a call-back, talk, then double tap an AirPod to end the call. Even though the iPhone still bears the majority of the processing and wireless burden, Apple is clearly taking steps towards post-mobile computing through the wearables it’s already shipping.

And baby steps work for startups too. Amazon started solely as a bookseller, only to evolve into the Everything Store. Facebook started as a network for college students at Harvard, then Boston, and ultimately became a network for the entire world. Airbnb started as a place to rent a couch; now it replaces hotel rooms for many. Baby steps let Apple and others figure out the market and optimize for it before they changed the world. Prove the concept first, dial it in, then scale it. Revolutionary products seem to sneak up on us, but the steps to get there are usually apparent in hindsight.

Disclaimer: We actively write about the themes in which we invest: virtual reality, augmented reality, artificial intelligence, and robotics. From time to time, we will write about companies that are in our portfolio.  Content on this site including opinions on specific themes in technology, market estimates, and estimates and commentary regarding publicly traded or private companies is not intended for use in making investment decisions. We hold no obligation to update any of our projections. We express no warranties about any estimates or opinions we make.

Apple, Augmented Reality, Philosophy
VR Over The Holidays: What To Expect For Oculus and Vive Sales In Q4

VR headsets may not have been the “it” gift of the holiday season, but there was definitely a pickup in consumer interest. Based on our analysis of Google Trends data, we believe that Oculus sold about 55k Rifts in Q4 and HTC sold about 65k Vives. We note that based on Facebook’s quarterly reports in Q2 and Q3, Oculus was selling around 40k units per quarter. Thus the holiday season appears to have driven roughly a 40% increase in unit volume for Oculus.
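The back-of-envelope math behind that growth figure can be sketched as follows. This is a minimal illustration using the post’s own estimates (40k/quarter baseline, 55k in Q4), not Facebook-reported numbers:

```python
# Back-of-envelope check of the holiday unit-growth estimate.
# Both figures are this post's own estimates, not reported numbers.
baseline_quarterly_units = 40_000  # estimated Oculus run rate in Q2/Q3
holiday_quarter_units = 55_000     # estimated Oculus Rift units in Q4

growth = (holiday_quarter_units - baseline_quarterly_units) / baseline_quarterly_units
print(f"Holiday unit growth: {growth:.0%}")  # ~38%, i.e. roughly 40%
```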

While Facebook does not report Oculus unit sales (we back into them), management will likely offer some additional VR-related commentary on February 1st. We would expect management to maintain their consistent commentary that VR is still early. Beyond that, we may get an updated number of Gear VR users (the company reported 1 million monthly actives on the Q2 call) or additional color on future VR investment (they announced an incremental $250 million in VR investment during the Q3 call). Net-net, we expect Facebook’s Q4 update to be a modest positive to the VR ecosystem.


Facebook, News, Virtual Reality
Seeing What We Say: Improving Siri And Alexa

We’ve been talking a lot about digital assistants lately. They were a big theme at CES, and a recent survey of ours showed that US consumers view digital assistants as the fourth most frustrating tech product, behind devices, poor Internet service, and automated telephone systems. Here’s a view into how we might be able to improve digital assistants in the future.

Humans are non-verbal communicators by nature. Almost 60% of human-to-human communication is through body language, but our current natural language interfaces only use voice. This means robot assistants miss 60% of the information we send to them. How often do you say thanks to Siri or Alexa after you get a right answer? How often do you curse at them when you get a wrong one? But how often do you nod your head when Siri or Alexa gives you a right answer? How often do you scrunch your face up in anger when they give you a wrong one?

The most obvious answer to this problem would seem to be some sort of computer vision implementation. This would solve part of the body language problem, as the digital assistant could see any obvious gestures we make in response to its answers, but that’s not all the device would need to know. The assistant would also need to know who’s talking if there are multiple people in a room, and what the speaker’s facial expressions mean in the context of the answer. You might frown at bad news, even if that was the correct answer to your question. You might smile at the hilarity of a wrong answer. This means the robot needs to build a model of what humans may interpret as good or bad or associated with some other emotion, and that model must be specific to the user. Good and bad are subjective to the individual; politics is a dangerous example.

Another potential solution to help digital assistants interpret body language might be to connect with a sensor on your body. Sensors could help address one issue with computer vision solutions: that we aren’t always in the robot’s line of sight. Some of these sensors are already built into watches or advanced fitness trackers and detect biomarkers like a change in heart rate or blood pressure. A rise in blood pressure might signify anger at a wrong response. A decrease in body temperature may imply sadness.
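One way to picture how an assistant might use such signals is a simple rule-based mapping from biomarker changes to a guessed reaction. This is a toy sketch, not a real device API; the thresholds, function name, and labels are invented for illustration:

```python
# Toy sketch: guess a user's reaction to an assistant's answer from
# changes in wearable biomarkers relative to the user's baseline.
# Thresholds and labels are invented for illustration, not drawn
# from any real device, API, or study.

def infer_reaction(delta_heart_rate_bpm: float, delta_skin_temp_c: float) -> str:
    """Map biomarker deltas to a coarse guess about the user's reaction."""
    if delta_heart_rate_bpm > 10:   # sharp rise: possible frustration or anger
        return "negative"
    if delta_skin_temp_c < -0.5:    # notable drop: possible sadness
        return "negative"
    return "neutral-or-positive"    # no strong signal either way

print(infer_reaction(12.0, 0.0))   # "negative" (heart rate spiked)
print(infer_reaction(2.0, 0.1))    # "neutral-or-positive"
```

A real system would of course need per-user baselines and far richer models than fixed thresholds, which is exactly the personalization problem described above.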

Both of these solutions raise the privacy question. Are we comfortable with allowing our robot assistants to see us and our physical data? Privacy tends to be a point of contention for every evolution of technology. It was an issue for Facebook as it grew to be indispensable for over a billion users. It was an issue for Google Glass as the most recognizable wearable with a camera. Our belief is that we already live in a post-privacy world. One of the key trade-offs we make for the convenience of many technologies we use today is that we give up privacy. We trade privacy for the benefit of connecting with people on social platforms. We trade privacy for better recommendations in search and shopping. Yes, there will be some noise about the intrusion on privacy that comes with incorporating body language and body data into digital assistants, but we expect that concern to go about as far as it did with Facebook.

The bottom line is this: adding the ability to read and interpret body language would result in a step-function change in our experiences with digital assistants. Incorporating body language is a crucial step in being able to create robots that can truly understand humans, allowing them to perform complex, human-like tasks. We view this as a complex and extremely interesting AI and robotics opportunity and a problem we need to solve as we pursue The Future Perfect.


Amazon, Apple, Artificial Intelligence, Future, Google