How Facebook’s ‘Dangerous’ Algorithms Can Manipulate You

October 7, 2021 | FlaglerLive

They see you. (Glen Carrie on Unsplash)

By Filippo Menczer

Former Facebook product manager Frances Haugen testified before the U.S. Senate on Oct. 5, 2021, that the company’s social media platforms “harm children, stoke division and weaken our democracy.”

Haugen was the primary source for a Wall Street Journal exposé on the company. She called Facebook’s algorithms dangerous, said Facebook executives were aware of the threat but put profits before people, and called on Congress to regulate the company.

Social media platforms rely heavily on people’s behavior to decide on the content that you see. In particular, they watch for content that people respond to or “engage” with by liking, commenting and sharing. Troll farms, organizations that spread provocative content, exploit this by copying high-engagement content and posting it as their own, which helps them reach a wide audience.

As a computer scientist who studies the ways large numbers of people interact using technology, I understand the logic of using the wisdom of the crowds in these algorithms. I also see substantial pitfalls in how the social media companies do so in practice.

From lions on the savanna to likes on Facebook

The concept of the wisdom of crowds assumes that using signals from others’ actions, opinions and preferences as a guide will lead to sound decisions. For example, collective predictions are normally more accurate than individual ones. Collective intelligence is used to predict financial markets, sports, elections and even disease outbreaks.

Throughout millions of years of evolution, these principles have been coded into the human brain in the form of cognitive biases that come with names like familiarity, mere exposure and bandwagon effect. If everyone starts running, you should also start running; maybe someone saw a lion coming and running could save your life. You may not know why, but it’s wiser to ask questions later.

Your brain picks up clues from the environment – including your peers – and uses simple rules to quickly translate those signals into decisions: Go with the winner, follow the majority, copy your neighbor. These rules work remarkably well in typical situations because they are based on sound assumptions. For example, they assume that people often act rationally, it is unlikely that many are wrong, the past predicts the future, and so on.

Technology allows people to access signals from much larger numbers of other people, most of whom they do not know. Artificial intelligence applications make heavy use of these popularity or “engagement” signals, from selecting search engine results to recommending music and videos, and from suggesting friends to ranking posts on news feeds.

Not everything viral deserves to be

Our research shows that virtually all web technology platforms, such as social media and news recommendation systems, have a strong popularity bias. When applications are driven by cues like engagement rather than explicit search engine queries, popularity bias can lead to harmful unintended consequences.

Social media like Facebook, Instagram, Twitter, YouTube and TikTok rely heavily on AI algorithms to rank and recommend content. These algorithms take as input what you like, comment on and share – in other words, content you engage with. The goal of the algorithms is to maximize engagement by finding out what people like and ranking it at the top of their feeds.

On the surface this seems reasonable. If people like credible news, expert opinions and fun videos, these algorithms should identify such high-quality content. But the wisdom of the crowds makes a key assumption here: that recommending what is popular will help high-quality content “bubble up.”

We tested this assumption by studying an algorithm that ranks items using a mix of quality and popularity. We found that in general, popularity bias is more likely to lower the overall quality of content. The reason is that engagement is not a reliable indicator of quality when few people have been exposed to an item. In these cases, engagement generates a noisy signal, and the algorithm is likely to amplify this initial noise. Once the popularity of a low-quality item is large enough, it will keep getting amplified.

Algorithms aren’t the only thing affected by engagement bias – it can affect people too. Evidence shows that information is transmitted via “complex contagion,” meaning the more times people are exposed to an idea online, the more likely they are to adopt and reshare it. When social media tells people an item is going viral, their cognitive biases kick in and translate into the irresistible urge to pay attention to it and share it.

Not-so-wise crowds

We recently ran an experiment using a news literacy app called Fakey. It is a game developed by our lab that simulates a news feed like those of Facebook and Twitter. Players see a mix of current articles from fake news, junk science, hyperpartisan and conspiratorial sources, as well as mainstream sources. They get points for sharing or liking news from reliable sources and for flagging low-credibility articles for fact-checking.

We found that players are more likely to like or share, and less likely to flag, articles from low-credibility sources when they can see that many other users have engaged with those articles. Exposure to the engagement metrics thus creates a vulnerability.

Chart: The Conversation, CC-BY-ND. Source: Avram et al.

The wisdom of the crowds fails because it is built on the false assumption that the crowd is made up of diverse, independent sources. There may be several reasons this is not the case.

First, because of people’s tendency to associate with similar people, their online neighborhoods are not very diverse. The ease with which social media users can unfriend those with whom they disagree pushes people into homogeneous communities, often referred to as echo chambers.

Second, because many people’s friends are friends of one another, they influence one another. A famous experiment demonstrated that knowing what music your friends like affects your own stated preferences. Your social desire to conform distorts your independent judgment.

Third, popularity signals can be gamed. Over the years, search engines have developed sophisticated techniques to counter so-called “link farms” and other schemes to manipulate search algorithms. Social media platforms, on the other hand, are just beginning to learn about their own vulnerabilities.

People aiming to manipulate the information market have created fake accounts, like trolls and social bots, and organized fake networks. They have flooded the network to create the appearance that a conspiracy theory or a political candidate is popular, tricking both platform algorithms and people’s cognitive biases at once. They have even altered the structure of social networks to create illusions about majority opinions.

Dialing down engagement

What to do? Technology platforms are currently on the defensive. They are becoming more aggressive during elections in taking down fake accounts and harmful misinformation. But these efforts can be akin to a game of whack-a-mole.

A different, preventive approach would be to add friction, that is, to slow down the process of spreading information. High-frequency behaviors such as automated liking and sharing could be inhibited by CAPTCHA tests, which require a human to respond, or by fees. Not only would this reduce opportunities for manipulation; with less information flowing, people would be able to pay more attention to what they see, leaving less room for engagement bias to affect their decisions.

It would also help if social media companies adjusted their algorithms to rely less on engagement signals and more on quality signals to determine the content they serve you. Perhaps the whistleblower revelations will provide the necessary impetus.

Filippo Menczer is Luddy Distinguished Professor of Informatics and Computer Science at Indiana University.

The Conversation

The Conversation arose out of deep-seated concerns for the fading quality of our public discourse and recognition of the vital role that academic experts could play in the public arena. Information has always been essential to democracy. It’s a societal good, like clean water. But many now find it difficult to put their trust in the media and experts who have spent years researching a topic. Instead, they listen to those who have the loudest voices. Those uninformed views are amplified by social media networks that reward those who spark outrage instead of insight or thoughtful discussion. The Conversation seeks to be part of the solution to this problem, to raise up the voices of true experts and to make their knowledge available to everyone. The Conversation publishes nightly at 9 p.m. on FlaglerLive.

Comments

  1. Dennis says

    October 8, 2021 at 5:58 am

Everyone complains but Facebook keeps growing. Just deactivate your account and move on. Try it, you might be surprised you really didn’t need Facebook. Too much crap on the site.

  2. Greg says

    October 8, 2021 at 6:01 am

    Too much money here. Congress will be bought off and Facebook wins as usual. Pretty sure Zuck lied to Congress several times? Money buys what is needed.

  3. Concerned Citizen says

    October 8, 2021 at 2:15 pm

    I use messenger to stay in touch with family. And more of us are switching to Slack.

    You control what social media does. Not the other way around. Don’t like it? Don’t use it. I haven’t in years. And don’t miss the silliness one bit. I surely don’t let those programs control my daily life.

At nearly 60 years old I look back fondly on the days when we weren’t tied to a cell phone. It seems like it was a simpler time, when you couldn’t make a call until you got to the house, the office or a pay phone.
