These are sensory experiences. They are human sensory experiences. These are emotional triggers and instinctual reactions. These are not attributes one would normally ascribe to a computer-driven experience.
As our computers learn from and augment our connected lives, they are learning to mirror our behaviors, quirks, and honesty. Google Now holds incredible power - the eventual power to deliver many of these sensory experiences. But it is still an appendage, separate from our natural lives. Using Google Now requires that the user lift a phone, push a button, swipe to unlock, and then swipe to load.
This isn't natural. It isn't integrated. And this friction is holding back the true potential of the platform.
Wearables are only the first step towards better information capture and delivery. We need to create an invisible operating system and interactive layer - computing that requires no additional effort. We need to learn to publish and communicate differently. We need to learn to develop and adopt an API for ever-present access and non-invasive communications.
We spent the last 30 years proving to the world that computing was worth the distraction from life. We will spend the next 10 years learning how to seamlessly augment life itself.