Anthropologists believe that wild grains were first domesticated about 11,000 years ago in the Fertile Crescent of the Middle East. It was a transformative development, allowing humans to give up foraging and start building something like civilization.
Agriculture was itself transformed some 6,000 years later with the invention of the plow circa 3000 BC, a tool that made it possible for one farmer to feed many, and freed many to pursue new developments. Fast-forward nearly 4,900 years to 1892, when Iowa farmer John Froelich invented the gasoline-powered tractor, a transformative development making it possible for one farmer to feed hundreds and prompting millions to cultivate new fields of endeavor. Just 125 years later it’s estimated that less than two percent of Americans are directly involved in agriculture, more than 60 percent of American farming is accomplished using hyper-efficient GPS-guided semi-autonomous tractors, and experts predict that fully autonomous combine harvesters will be feeding much of the country by 2025.
The quickening march of agriculture neatly illustrates something forward-thinkers and backward-thinkers alike have labeled “accelerating change,” a perceived principle by which the pace of technological change increases with each technological advance, and society is transformed apace. The better things get, the faster things get better, so to speak, and the more often we all feel like strangers in a strange land. The evidence for accelerating change is abundant and persuasive, and the principle works for just about everything.
One fine day in 1600, and presumably in between patients, English physician William Gilbert was messing around with magnets and coined the term “electricus” to describe the little-understood force that animated them. In 1876, Alexander Graham Bell wired electricus to a speaker and transformed communication with the telephone. Exactly 97 years later Motorola took the “sound telegraph” mobile, and the smartphone debuted just 34 years after that. It’s been a technological rocket ride like no other, and since 2007 the iPhone and its touch-screen brethren have transformed a lot more than communication. On the other hand, we’ve been limping along with most of our lives and much of the world at our fingertips for an interminable decade now, and it’s clear we’re overdue for some serious transformation. Many industry analysts contend that smartphone innovation peaked at least two years ago, and lately manufacturers spend most of their time trifling with aesthetic tweaks while awaiting new developments that will point them in new and profitable directions.
Fact is, the next era of communication is already well under way in the Internet of Things (IoT). It’s getting harder to buy any powered appliance these days that can’t be connected to the cloud and manipulated through a smartphone, and the number and variety of IoT-capable devices are growing by leaps and bounds. But the IoT concept is less about having the ability to minutely control your physical and intellectual environments than about not having to. Within the Internet of Things, people don’t just talk to their car, their TV, their water heater and their latte machine; those things talk to each other, too, in concept forming a cooperatively self-directed bubble of all-but-autonomous comfort and convenience around their blithe human dependent.
The rub, of course, is that individual parameters of comfort and convenience must be occasionally communicated to the IoT by fingertips that might be more comfortably and conveniently employed doing something else. Happily, science is even now working hard to free us from the drudgery of touch-screen technology, and serious transformation is right around the corner.
First in the lineup is a next-generation virtual assistant of the Siri persuasion, only one contained in a disc about the size of a silver dollar strapped to your wrist, or possibly distributed throughout the beads of a stylish necklace, or maybe even sewn into the fabric of a garment. Possessing all the computing power of a smartphone, enhanced voice recognition software, seamless IoT connectivity and, it is expected, the ability to project a functioning keyboard onto any flat surface for your anywhere-typing pleasure, that hands-free cyber helper is merely the first dagger in the smartphone’s back.
Microsoft, Facebook and Google are all working hard to deliver the coup de grâce, which is a headset capable of projecting detailed, three-dimensional images directly onto your retinas. No clunky helmet visors, these, but light and comfortable eyewear that won’t replace the world we see, but rather “augment” it by deftly overlaying images onto real life within the privacy of our own eyeballs. If, or rather when, they succeed, it could very well spell doom for anything with a screen, including your television set. Together, those two coming technologies should herald a truly transformative age of “augmented reality,” a hybrid realm occupying the twilit space between Nature and Technology. And before you get all weird about it, those drawing a paycheck on augmented reality’s account are quick to reassure that watching TV inside your head will “reduce technological distractions,” and that such a collision of the physical and digital worlds will most certainly result in “greater balance.”
More balance and fewer distractions sure would be nice, and one can only imagine the serene equilibrium that will reign once Elon Musk rolls out a retail version of Neuralink, an ultra-thin mesh implanted in the brain to provide a direct interface between Man and Machine. With such advancements on the near horizon, the smartphone doesn’t stand a chance. Even so, it’s difficult to feature just what kind of miraculous phone could one day supplant the one buried in your bean. There’ll be one, though, because the principle of accelerating change demands it.
Indeed, the latest in agricultural transformation is the robotic home farming system. FarmBot’s Genesis XL, for example, lets John and Jane Q. Self-Sufficient cultivate a wide variety of produce from the comfort of their couch. With a greenhouse in the back yard and squeaky-clean fingernails, they can simply drag-and-drop their wishes on a user-friendly app and then sit back while FarmBot brings in a bumper crop.
“It is not merely in the number of facts or sorts of knowledge that progress lies,” pronounced American urban designer Daniel Burnham, addressing London’s Town Planning Conference on the subject of accelerating change in 1910. “It is still more in the geometric ratio of sophistication, in the geometric widening of the sphere of knowledge, which every year is taking in a larger percentage of people. Our pace of development having immensely accelerated, our sons and grandsons are going to demand and get results that would stagger us.”
Prepare to be staggered. As the interval between transformative technological breakthroughs halves and halves again, futurists conversant on the topic, most famously Ray Kurzweil, calculate that “technological singularity” will be achieved no later than 2045. For those less conversant on the topic, technological singularity is the point at which technology becomes autonomously self-improving, sparking a runaway cycle of instant upgrade and throwing human society into a perpetual state of transformation.
Some folks think technological singularity will be great. Some others think it will be Hell on Earth. A lot of folks dismiss the hypothesis out of hand. They all agree on one thing, though. From here on in we’re all strangers in a strange land.