Posts Tagged ‘artificial intelligence’

Apocalypse in the Wings—Hint: It’s Not Zombies, Part 2

April 17, 2017

Two rants ago I promised to return to the subject of the dangers of artificial intelligence. I made the assertion, likely outrageous to some (and dissonant with my inclinations as a technophile), that the threat it will pose is much more proximate than global warming. Years ago, based on the changes in technology we’d seen up to that point, I extrapolated what seemed to be its inevitable evolution, born of the exponential advancement in both computer hardware and software. Prior treatments of runaway technology in word and cinema are legion, and purveying it as entertainment, just like natural disaster and post-apocalyptic visions, frightens and desensitizes us to it at the same time. It’s a “Whoa—what if …? Nah!” reaction. The sad truth is, zombies excluded, the scenarios all have terrifying plausibility and are so overwhelming that it’s easier and more palatable to shrug and say, “Good thing it’s not for real.” Unfortunately, while EMPs, asteroid collisions, and global pandemics are few and far between, wars and malevolent AI are neither. The former we’re familiar with because, large or small, they impact every generation. As our martial technologies progress they become more deadly, but they are the devil we know. Untethered artificial intelligence is something we write about but have yet to experience, because we’re approaching a nexus never before seen, what Ray Kurzweil calls the Singularity. In his book of the same name, written 12 years ago, he painstakingly outlines the multiple technological changes, each advancing geometrically, that will converge in the not-too-distant future to provide unimaginable benefits in terms of our health, energy availability, longevity, and much more (you can listen to him here). Artificial intelligence will augment human intelligence a thousand-, or perhaps a million-fold.
He sees it not as a competing force but as a symbiosis between the human and the artificial, one that will be realized when we reach the point of Singularity sometime in the 2040s. The 20-20 hindsight of the last 12 years perhaps exposes his timeline as a bit aggressive, but it remains more on point than my prior estimates. I, like others, even the experts, suffer from the same shortcoming of thinking linearly, not geometrically, in assessing change. But the truth is that, as Mr. Kurzweil documents, we are at the knee of the technological growth curve, where everything before appears to have progressed linearly and everything after will take off like a Saturn rocket. If you closely examine the last ten years, this is already becoming evident: powerful computers in every pocket, incredible advances in voice recognition, and medical advances on the brink of a major paradigm shift, to name a few.
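The linear-versus-geometric point is easy to see with a little arithmetic (a minimal illustration of my own, not a figure from Kurzweil’s book): thirty linear steps get you to thirty, while thirty doublings get you past a billion.

```python
# Thirty linear steps versus thirty doublings: the gap that makes
# linear intuition so badly underestimate exponential change.
linear = 30 * 1        # step size 1, taken thirty times
exponential = 2 ** 30  # doubling, thirty times

print(f"30 linear steps: {linear}")          # 30
print(f"30 doublings:    {exponential:,}")   # 1,073,741,824
```

Our instincts are tuned to the first line; the technology curve follows the second.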

So this is a good thing, right? Yes, and no. With the unfathomable good comes an equally disturbing downside. Look at the bellwether of the Internet to see this more clearly: The old “World Wide Web” changed our lives forever in ways big and small. Information-gathering, interacting, and commerce are instantaneous and global. With it come the thieves, the scammers, and the predators. With each new threat comes another solution, the good a half-step ahead of the bad. We talk about the risks of cyberattacks on the financial sector and the infrastructure, but thus far we have managed to limit these to skirmishes rather than all-out war, at increasing cost and with the recognition of growing risk with every passing year.

Now let’s move the above reality into the arena of AI. As computers continue to grow in computational power at an accelerating rate and the software that controls them follows suit, program complexity also soars. Programming languages layer one atop another, increasing both ease of use and the automation of programming functions. In the near future, programs will write and improve programs. This is not conjecture but inevitability. At some point, and I argue this is within a few decades, the intellect and speed involved will be so far beyond human reach that only machine intelligence will be able to write and debug code. Alongside this, genetic manipulation, bio- and nanotechnology, and robotics will improve in parallel, becoming faster and more nimble in executing the changes in their own evolution. Recognizing the risk, we will of course build multiple layers of safeguards. History has shown, however, that there will always be bad people hijacking technology for their own foul purposes, and again we will witness the war between good and evil. Unfortunately, the risks and the stakes in this new war will be unlike anything the human race has previously encountered. When the atomic bomb was introduced, we entered an era where, for the first time, mass destruction became possible. To date, we’ve been able to narrowly avert that catastrophe. With malevolent, runaway code combined with high-tech machines, we may not be in a position to decide.

The promise and risk of the next few decades is enormous and unprecedented. May God give us the wisdom to prevail.

***

Shameless plug alert: I’ve just released my second novel, The Nidus, which pertains to this topic, on Amazon as a Kindle ebook. It highlights near-future events as they might happen. I know that it’s by no means the first treatment of this subject, only the first to get it right (yes, I am privy to future events, the stock market, unfortunately, excluded).

Apocalypse in the Wings—Hint: It’s Not Zombies

January 23, 2017

The turns of history predict we’re approaching a Crisis in the next 5-15 years. Many candidates have been proposed. If you were to watch popular television, it’s gluttonous zombies. If you’re PC, it’s global warming, repackaged as climate change. If you’re a historian, war and/or economic collapse rate high on your list. If your leanings are more to the celestial, it’s that pesky rogue asteroid or a well-aimed electromagnetic pulse (EMP) flaring from the sun. They’re all plausible speculations (well, maybe not the perambulating re-vivified carcasses), but let’s examine them in the light of reason.

War and economic collapse certainly occupy the 1 and 2 slots, in either order. The Middle East is a hotbed, we’re doing our best to fight, in as limited a fashion as possible, the neo-Nazi neo-Caliphate, and we’re printing money and borrowing cash as breathlessly as we can to keep up with our insatiable urge to create a more utopian society and bolster a standard of living we always seem to be just one or two paces behind (if only ancient Rome had had the Federal Reserve!).

If these weren’t enough, we’ve got the specter of global cooling to deal with (oops! That was the 1970s). There is evidence that we’ve had progressive warming of the planet, AKA climate change. Some of my more expert acquaintances on the subject tell me that longer-term evidence on past climate patterns does not jibe with the short-term temperature records used to define the trend. Other analyses suggest that many scientists who support the concept of global warming don’t necessarily feel the evidence supports the level of short-term risk trumpeted in the media. But the mainstream warns that such views are tantamount to denying the Holocaust. Accepting as fact that we’re in a long-term warming trend, and that the cause is an increase in atmospheric CO2, the second proclaimed non-controversy is that mankind is the culprit. Assuming this too as fact, we must deal with (or ignore, which is safer in this political climate, pun intended) the issue that, by some experts’ calculations, imposing all the carbon restrictions the world has proposed in recent edicts would have a minuscule effect on the trend but a major impact on the world economy. So, in my cataclysmic conjecture, that brings us back to economic collapse. No matter where the truth lies, we can all agree that reducing carbon emissions and the associated pollution isn’t a bad idea. The solutions, I believe, will come not from arbitrarily imposed carbon restrictions but from technology, which I expect to be the source of abundant, reduced-carbon and carbon-free energy much sooner than people think. Unfortunately, as I will point out, this technology boon or boom comes at the cost of one of our greatest threats. So, for your edification and convenience, I provide the true and incontrovertible risk assessment for the next Apocalypse (drum roll, please):

  • 1/2. War (including cyber warfare and man-made EMP attacks).
  • 1/2. Economic collapse.
  • 3. Artificial intelligence.
  • 4. EMP from the sun.
  • 5. Climate change (of the hot variety).
  • 6. Asteroid collision.
  • 7. The Walking Dead.

AI as number 3, you ask? Too much sci-fi in my entertainment diet, right? Scoff if you will. It’s true that science fiction has given about the same emphasis to numbers 3 and 7, trivializing and desensitizing us to the former. That has been a tragic mistake. Because number 3 is very real, and coming at us like a freight train (or, more apropos, a hurtling asteroid).

I strongly recommend viewing Sam Harris’s brief but excellent TED talk on the subject here, then rejoin me at your leisure (if we’re all still here) for additional thoughts on the matter.