After electricity, we got light bulbs, clothes dryers, ovens, electric heaters and refrigerators. We also got hot running water, toasters, vacuum cleaners, air conditioners and a thousand other home appliances.
Then, in addition to applying electrical power to comfort and convenience, all-new things emerged that few even dreamed of when electricity first became ubiquitous. TV, the internet and smartphones, to name a few big ones. When the first cities were wired for electrical power, few could envision Wi-Fi, social networking and GPS.
Electricity made the existing things in use more powerful and convenient, and it made new things possible that nobody could have imagined.
In hindsight, it would have been absurd in the year 1900 to compare the light bulb with electricity as technologies. The light bulb is a specific set of applications with a specific purpose (to light and sometimes to heat). But electricity is an underlying, foundational technology.
Commentators make the same error today when they compare, say, augmented reality with A.I. The two aren't in the same league. Not even close. In fact, no emerging technology is truly comparable to A.I.
While augmented reality (mixed reality, 360 video, virtual reality and all the rest) will have a giant impact on human life (as did the light bulb), A.I. is the technology that will change absolutely everything – including augmented reality.
Like electricity, A.I. will prove to be an underlying, foundational technology that “powers” everything else.
If you were to list the emerging technologies that will define the coming few decades, these might include augmented and virtual reality, self-driving cars, gene therapy and many others. But none of these are possible without A.I.
What is A.I., anyway?
Everybody talks about A.I., but misconceptions dominate those discussions. First of all, A.I. is a vague, non-specific label that even experts don't fully agree on. It has also become a marketing term, adding to the confusion.
In general, A.I. happens when a computer does a task that used to require human intelligence. These include learning, decision-making, speech and visual recognition, natural speech and language translation, as well as pattern recognition.
It’s a clumsy definition, though, because A.I. also does things no human could ever do.
You can also look at A.I. as an umbrella term that covers a wide range of computer science specialties, including machine learning, deep learning and many others. For example, some in the industry talk about machine learning as separate or different from A.I. In fact, machine learning is a subset of A.I. (and deep learning is a subset of machine learning).
The public and press often talk about A.I. as a future or edge-case technology (as with chess-playing computers), when in fact most of us already encounter A.I. every day.