Understanding the Natural World
Sometimes, the storm of change comes as a real storm. Visit Half Moon Bay, California, a sleepy coastside town near San Francisco and Silicon Valley, and you'll probably meet "Farmer John." That's John Muller, who moved there in 1947 to open a small family-run farm that is still operating, mostly growing pumpkins. We met him a while back, and he told us that when he started his farm, he had to guess the weather, mostly by "feel." Today, he told us, his son relies on satellite imagery and AI models to know what tomorrow's weather will hold, rather than holding his finger to the wind and trying to guess if a storm is on the way. Knowing the weather lets him better plan when to hire people to plant or harvest, and that knowledge saves his family real money. Knowing a storm is coming a few days in advance can save a great deal. Knowing that a heavy rainstorm is coming, one that might flood fields, can save millions in seed costs, particularly in places like Kansas. Knowing a tornado will soon hit a neighborhood can save many lives.
Soon, Farmer John's family will be warned of changing weather through Augmented Reality glasses, and they will be able to see storms in 3D instead of watching Twitter accounts the way many farmers around the world do today (we built a Twitter list of meteorologists and scientists to watch our changing climate better at https://twitter.com/Scobleizer/lists/climate-and-weather1). The government and others have spent billions on this infrastructure, and it is hardly the only expenditure humans have made to understand our changing environment.
In 2003, we had a discussion with Bill Hill, a computer scientist at Microsoft who invented the font smoothing technology we now use on all of our devices, and he told us how he came up with the technique. Growing up in poverty in Scotland, he developed an interest in animals by reading tons of books, and he learned to track them through forests and meadows by looking at their footprints and other signs. While doing that, he realized that humans evolved over millions of years to do exactly what he was doing as a hobby. Those who survived had deep visual skills and could see subtle patterns, especially in the green grass and trees our ancestors lived among. If you couldn't, you were attacked and eaten by an animal camouflaged in those trees, and your DNA was taken out of the gene pool. He used that knowledge to figure out that he could hide extra visual information in the red color fringes that surround each letter on your screen. Get out a magnifying glass and look at fonts on Microsoft Windows and you'll see his work.
Today, we no longer need to worry about a lion hiding in the grass waiting to eat us, but we face new challenges that keep pushing us to develop new tools, tools that can help us carry on this great human experiment we are all part of.
The lions of today might be the capital markets pushing our companies to be more profitable. Or they might be a disability that slows down our ability to be our best. We think that's all important, but what did our ancestors do after they survived a day of walking the jungle to find food for their families? They gathered around fires and told each other stories. We can still see some of those stories today as drawings on cave walls, drawings that show brave hunters facing down animals and dealing with the challenges of crossing mountains and streams. Today, we don't need to hunt to eat, but some people still do these things as sport, and what they are doing with technology demonstrates that we are under a Prime Directive to build better tools and experiences, even for things humans have done for many generations.
Let's look at another example of how people are using technology to better understand their environment and augment their experiences. Vail, Colorado, resident Ryan Thousand, https://www.linkedin.com/in/ryanthousand/, is an IT administrator for a healthcare company, Vail Health. Many evenings after work and on weekends, he is a passionate fly fisherman who loves catching fish, not to eat, but to photograph for his Instagram channel and brag to his friends before releasing them back into the stream for others to experience the same joy. He saw a need to track what kinds of lures he was using as stream conditions changed, so now he's building 3D models of the streams he fishes in. While wading in the water, he wears smart boots that measure water flow and temperature and can capture other data. That data is then collected on his phone and streamed to a database he's building, and also to the government, which uses data from sportspeople to track environmental ecosystems.
His early attempts to build useful tools for fly fishermen brought him to Oakley's ski goggles, which had a tiny monitor built into them. He sees a world, coming soon, where he can move away from viewing data on a tiny monitor that gets in the way of the beauty he is usually surrounded by, and instead wear a headset like Microsoft's HoloLens while fishing, so he can see stream conditions overlaid on the actual stream. Soon, he believes, people will be wearing Spatial Computing glasses that include a virtual guide, one that will not just show you potential fishing spots in rivers and lakes before you go on a fishing vacation, but will also help you learn how to properly cast a fly and even tell you where to stand in the stream to have the best chance of catching a fish. Thousand's Prime Directive is to make fishing better and to help everyone who visits Vail learn more about the environment, which will help them preserve it for many future generations.
These changes, he told us, could transform every human activity, even one as familiar as grabbing a fishing rod for a relaxing day in a stream. That's due, in part, to things like new 3D sensors and new optics that will let us augment the real world with new kinds of digital displays.