
FlaglerLive

No Bull, no Fluff, No Smudges


AI Is an Existential Threat. But Not the Way You Think.

July 5, 2023 | FlaglerLive | 2 Comments

Deep Blue, a computer similar to this one, defeated chess world champion Garry Kasparov in May 1997. It was the first computer to win a match against a world champion. Photo taken at the Computer History Museum. (Wikimedia Commons)

By Nir Eisikovits

The rise of ChatGPT and similar artificial intelligence systems has been accompanied by a sharp increase in anxiety about AI. For the past few months, executives and AI safety researchers have been offering predictions, dubbed “P(doom),” about the probability that AI will bring about a large-scale catastrophe.

Worries peaked in May 2023 when the nonprofit research and advocacy organization Center for AI Safety released a one-sentence statement: “Mitigating the risk of extinction from A.I. should be a global priority alongside other societal-scale risks, such as pandemics and nuclear war.” The statement was signed by many key players in the field, including the leaders of OpenAI, Google and Anthropic, as well as two of the so-called “godfathers” of AI: Geoffrey Hinton and Yoshua Bengio.

You might ask how such existential fears are supposed to play out. One famous scenario is the “paper clip maximizer” thought experiment articulated by Oxford philosopher Nick Bostrom. The idea is that an AI system tasked with producing as many paper clips as possible might go to extraordinary lengths to find raw materials, like destroying factories and causing car accidents.

A less resource-intensive variation has an AI tasked with procuring a reservation to a popular restaurant shutting down cellular networks and traffic lights in order to prevent other patrons from getting a table.

Office supplies or dinner, the basic idea is the same: AI is fast becoming an alien intelligence, good at accomplishing goals but dangerous because it won’t necessarily align with the moral values of its creators. And, in its most extreme version, this argument morphs into explicit anxieties about AIs enslaving or destroying the human race.

A paper clip-making AI running amok is one variant of the AI apocalypse scenario.

Actual harm

In the past few years, my colleagues and I at UMass Boston’s Applied Ethics Center have been studying the impact of engagement with AI on people’s understanding of themselves, and I believe these catastrophic anxieties are overblown and misdirected.

Yes, AI’s ability to create convincing deep-fake video and audio is frightening, and it can be abused by people with bad intent. In fact, that is already happening: Russian operatives likely attempted to embarrass Kremlin critic Bill Browder by ensnaring him in a conversation with an avatar for former Ukrainian President Petro Poroshenko. Cybercriminals have been using AI voice cloning for a variety of crimes – from high-tech heists to ordinary scams.

AI decision-making systems that offer loan approval and hiring recommendations carry the risk of algorithmic bias, since the training data and decision models they run on reflect long-standing social prejudices.

These are big problems, and they require the attention of policymakers. But they have been around for a while, and they are hardly cataclysmic.

Not in the same league

The statement from the Center for AI Safety lumped AI in with pandemics and nuclear weapons as a major risk to civilization. There are problems with that comparison. COVID-19 resulted in almost 7 million deaths worldwide, brought on a massive and continuing mental health crisis and created economic challenges, including chronic supply chain shortages and runaway inflation.

Nuclear weapons probably killed more than 200,000 people in Hiroshima and Nagasaki in 1945, claimed many more lives from cancer in the years that followed, generated decades of profound anxiety during the Cold War and brought the world to the brink of annihilation during the Cuban Missile Crisis in 1962. They have also changed the calculations of national leaders on how to respond to international aggression, as is currently playing out with Russia’s invasion of Ukraine.

AI is simply nowhere near gaining the ability to do this kind of damage. The paper clip scenario and others like it are science fiction. Existing AI applications execute specific tasks rather than making broad judgments. The technology is far from being able to decide on and then plan out the goals and subordinate goals necessary for shutting down traffic in order to get you a seat in a restaurant, or blowing up a car factory in order to satisfy your itch for paper clips.

Not only does the technology lack the complicated capacity for multilayer judgment that’s involved in these scenarios, it also does not have autonomous access to sufficient parts of our critical infrastructure to start causing that kind of damage.

What it means to be human

Actually, there is an existential danger inherent in using AI, but that risk is existential in the philosophical rather than apocalyptic sense. AI in its current form can alter the way people view themselves. It can degrade abilities and experiences that people consider essential to being human.

As algorithms take over many decisions, such as hiring, people could gradually lose the capacity to make them. (AndreyPopov/iStock via Getty Images)

For example, humans are judgment-making creatures. People rationally weigh particulars and make daily judgment calls at work and during leisure time about whom to hire, who should get a loan, what to watch and so on. But more and more of these judgments are being automated and farmed out to algorithms. As that happens, the world won’t end. But people will gradually lose the capacity to make these judgments themselves. The fewer of them people make, the worse they are likely to become at making them.

Or consider the role of chance in people’s lives. Humans value serendipitous encounters: coming across a place, person or activity by accident, being drawn into it and retrospectively appreciating the role accident played in these meaningful finds. But the role of algorithmic recommendation engines is to reduce that kind of serendipity and replace it with planning and prediction.

Finally, consider ChatGPT’s writing capabilities. The technology is in the process of eliminating the role of writing assignments in higher education. If it does, educators will lose a key tool for teaching students how to think critically.

Not dead but diminished

So, no, AI won’t blow up the world. But the increasingly uncritical embrace of it, in a variety of narrow contexts, means the gradual erosion of some of humans’ most important skills. Algorithms are already undermining people’s capacity to make judgments, enjoy serendipitous encounters and hone critical thinking.

The human species will survive such losses. But our way of existing will be impoverished in the process. The fantastic anxieties around the coming AI cataclysm, singularity, Skynet, or however you might think of it, obscure these more subtle costs. Recall T.S. Eliot’s famous closing lines of “The Hollow Men”: “This is the way the world ends,” he wrote, “not with a bang but a whimper.”

 

Nir Eisikovits is Professor of Philosophy and Director of the Applied Ethics Center at the University of Massachusetts, Boston. 

The Conversation arose out of deep-seated concerns for the fading quality of our public discourse and recognition of the vital role that academic experts could play in the public arena. Information has always been essential to democracy. It’s a societal good, like clean water. But many now find it difficult to put their trust in the media and experts who have spent years researching a topic. Instead, they listen to those who have the loudest voices. Those uninformed views are amplified by social media networks that reward those who spark outrage instead of insight or thoughtful discussion. The Conversation seeks to be part of the solution to this problem, to raise up the voices of true experts and to make their knowledge available to everyone. The Conversation publishes nightly at 9 p.m. on FlaglerLive.
Reader Interactions

Comments

  1. Bill C says

    July 5, 2023 at 9:27 pm

    “Quantum computing enhances AI by increasing its speed, efficiency and accuracy. It utilizes qubits and operates non-linearly, outperforming conventional computers. This breakthrough enables quantum computing to be applied in various AI use cases. ”
    https://www.techopedia.com/can-quantum-computing-impact-the-applications-of-artificial-intelligence

  2. Dennis C Rathsam says

    July 6, 2023 at 8:14 am

A.I. will be the destruction of America! Remember 2001: A Space Odyssey.

