The Human Brain vs. Computers

Should we fear artificial intelligence?

Stephen Hawking has said, "The development of full AI could spell the end of the human race." Elon Musk has tweeted that AI is a greater threat to humans than nuclear weapons. When extremely intelligent people are concerned about the threat of AI, one can't help but wonder what's in store for humanity.

As of 2017, brains still have a leg up on AI. By some comparisons, human brains can process far more information than the fastest computers. In fact, in the 2000s, the complexity of the entire Internet was compared to a single human brain. This might surprise you. After all, computers are better at activities that we equate with smarts, like beating Garry Kasparov in chess or calculating square roots. Brains, however, are great at parallel processing and sorting information. They are so good at some activities that we take their strengths for granted, like being able to recognize a cat, tell a joke, or make a jump shot. Brains are also about 100,000 times more energy-efficient than computers, but that will change as technology advances.

Estimates are that computers will surpass the capability of human brains around the year 2040, plus or minus a few decades. Whenever computers reach "human capacity," they may just keep right on improving. They are not burdened by the constraints that hold back brains. Neurons, for example, are the brain's building blocks and can only fire about 200 times per second, or 200 hertz. Computer processors are measured in gigahertz: billions of cycles per second. Signals in neurons travel at about one-millionth of the speed of signals in fiber optic cables. And don't forget, brains have to be small enough to fit inside skulls, and they inconveniently tire, forget, and die.
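To make the scale of those gaps concrete, here is a minimal back-of-envelope sketch in Python. The processor clock speed and nerve-conduction speed are assumptions chosen only for illustration, not measurements.

# Rough, illustrative comparison using the kinds of figures cited above.
neuron_rate_hz = 200           # a neuron fires at most ~200 times per second (per the text)
cpu_clock_hz = 3e9             # assume a ~3 GHz desktop processor, for illustration

nerve_signal_m_per_s = 100.0   # assume a fast nerve signal of ~100 meters per second
fiber_signal_m_per_s = 2e8     # light in optical fiber travels roughly 200,000 km per second

print(f"Clock cycles per neuron firing: {cpu_clock_hz / neuron_rate_hz:,.0f}")
print(f"Fiber-optic vs. nerve signal speed: {fiber_signal_m_per_s / nerve_signal_m_per_s:,.0f}x")
# Roughly 15 million clock cycles per neuron spike, and a signal-speed gap in the
# millions, the same ballpark as the comparisons in the paragraph above.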

When it comes to storing information, however, biology once again shows that technology has a long way to go. This might surprise you, as well. After all, a computer hooked up to the Internet can beat human Jeopardy champions, and computers are great at memorizing things like phone books. But consider DNA as memory storage. Each of your six trillion cells contains all of the information to make your whole body. DNA can hold more data in a smaller space than any of today's digital memories. According to one estimate, all of the information on every computer in 2015 coded onto DNA could "fit in the back of an SUV." In fact, DNA can already be used to store non-biological information. In 2015, the works of Shakespeare were encoded into DNA.
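To see why a claim like "the back of an SUV" is plausible, here is a rough arithmetic sketch in Python. Both the storage-density figure (one published laboratory estimate) and the assumed size of the world's data are illustrative assumptions, not the exact numbers behind the estimate above.

# Illustrative DNA storage-density estimate; the inputs are assumptions.
dna_bytes_per_gram = 2.15e17   # ~215 petabytes per gram, one published lab estimate
world_data_bytes = 1e22        # assume ~10 zettabytes of global data, for illustration

grams_needed = world_data_bytes / dna_bytes_per_gram
print(f"Grams of DNA needed: {grams_needed:,.0f}")
# Under these assumptions, roughly 47 kilograms of DNA would hold it all --
# easily small enough to fit in the back of an SUV.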

The essence of memory, of course, lies in its durability. DVDs and hard drives degrade after 20 or 30 years. However, scientists have sequenced 30,000-year-old Neanderthal DNA. (The Neanderthal who left us her personal data may have paid with her life, but unless she sends us a bill, the data storage was free!) Intact DNA has been found that is close to a million years old. Who would have imagined that in 2015 I could bring my son a Bitcoin encoded on a fragment of DNA as a birthday present?

Brains and DNA show us that our methods of storing and processing digital information still have a lot of runway to keep getting better. This potential will be realized by new approaches, such as quantum computing or 3D neural processing.

Computer scientists like Ray Kurzweil contend that AI will breeze past human intelligence and keep on learning. AI and humans will work side by side to turbocharge the speed of invention. Kurzweil and others call this the "singularity," a term used to describe phenomena that approach infinity in some way. The singularity is a self-stoking cycle of machines using their own AI to make even smarter machines. There is plenty of speculation about what the singularity will look like, when it will arrive, or whether it will even occur. The notion of the singularity might seem pretty abstract, but super-smart AI might represent a real danger, as Stephen Hawking and Elon Musk have warned. In 2015, Hawking and Musk joined Apple co-founder Steve Wozniak and about 1,000 other robotics and AI researchers in an open letter warning of "a military artificial intelligence arms race."

It is hard to know whether to lie awake at night worrying about AI's threat to humanity, but the idea that machines can get much smarter is important to all of us. Learning machines are fundamentally different from other technologies. Steamships can't make themselves into better steamships, but smart machines can make themselves smarter.

In many ways, machine learning is already a reality, though many people might not realize it. Any interaction you have with Siri, Google, Netflix, or Amazon is influenced by machines that make themselves better. At Starwood, we used machine learning to improve our targeted special offers and hotel revenue management systems. Machine learning today is helping companies interpret data, learn from missed forecasts, and find new correlations. Though the analytics may be sophisticated, so far the interactions with people are nowhere near "human." Siri can access a lot of information, but she is still pretty robotic.

Digital technology is the ultimate story of an accelerating trend line. Computers used to be rare, expensive, and hard to use. Now, smart machines are cheap and ubiquitous. Soon, we will be online all the time, along with most of our appliances, tools, and vehicles. Sensors that monitor our health and alert us to potential dangers will be everywhere. Thinking machines and the Internet will connect seamlessly with our lives and become a natural component of how we make sense of the world.

The trend line is clear, but the "headlines" are hard to see coming. Back in 2000, for example, we had a decent estimate of the advances in computing power over the next several years. Even with that information, though, no one could have identified "headline" disruptors like Facebook or the App Store. In the mid-1990s, my wife brought home a book about the future of technology that was so advanced it came with its own CD-ROM. Although the book shared insights about potential new uses of computers, it was later criticized for having said little about the World Wide Web. The author was very smart and quite familiar with technology businesses. His name was Bill Gates.

We all (as individuals, companies, and countries) have to get ready for disruptors that we cannot foresee. Technology lies behind global development and alters how people live. It affects every job, every human activity. It makes services, health care, and information available in ways that were unimaginable just a few decades ago. Along the way, it upsets social norms, disrupts industries, and dislocates workers. The pace of ever-improving technology shows no signs of letting up. Advancing AI can seem scary, but it also presents great opportunity. Every business will have to think about what it means for them. What will the next couple of decades bring?

This post is an excerpt from The Disruptors' Feast by Frits van Paasschen, released January 16, 2017.

www.fritsvanpaasschen.com
