Unveiling the power and responsibility of “Psychotechnology”

Pittsboro, NC – In a rapidly evolving world driven by technology, where artificial intelligence (AI) increasingly permeates our daily lives, William Ammerman, author of “The Invisible Brand,” took center stage for a presentation on September 14. In the final portion of his Innovate Technology talk, he explored the concept of “psychotechnology” and the profound ways it influences us psychologically.

Psychotechnology: The Emerging Frontier

Psychotechnology may not be a term that’s on everyone’s lips just yet, but according to Ammerman, it’s a concept that deserves our attention. He coined this term to describe the technology that wields a psychological impact on our lives. It’s not about mere gadgets and gizmos; it’s about the way these technologies influence our thoughts, behaviors, and emotions.

Ammerman’s definition of psychotechnology hinges on four key characteristics: personalization, persuasion, learning, and human-likeness. These attributes make it a powerful force in shaping our lives and altering our perspectives.

Modern remake of the creation of Adam (Image by rawpixel.com on Freepik)

The Spider-Man Principle: Great Power, Great Responsibility

Ammerman likens the power of psychotechnology to the famous Spider-Man principle coined by Stan Lee: “With great power comes great responsibility.” He reminds us that the technological advancements we are currently witnessing rival the significance of historical innovations like fire and electricity. Therefore, the responsibility to understand and navigate these changes falls on every individual’s shoulders.

Groupthink Automation: Shaping Our Beliefs

One of the most intriguing concepts presented by Ammerman is “groupthink automation.” It delves into how AI has the capability to create groupthink among specific segments of society. By exploiting our vulnerabilities, AI can control the flow of information and keep us addicted to our devices.

Ammerman identifies several cognitive vulnerabilities that psychotechnology targets. Chief among them are cognitive biases, systematic errors in thinking such as confirmation bias, our tendency to favor information that aligns with our existing beliefs. Psychotechnology also thrives on the “filter bubble” phenomenon, in which algorithms surround us with information that confirms our biases, ultimately creating an echo chamber.

The Dark Side of Social Media

Social media platforms play a pivotal role in the propagation of psychotechnology. Algorithms meticulously curate content, ensuring that it aligns with our preconceived notions, thereby fostering division and polarization. We’re drawn to content that validates our beliefs, and this, in turn, creates a cycle of confirmation and polarization.
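This confirmation-and-polarization cycle can be sketched as a toy simulation (entirely illustrative, not from Ammerman's talk): a feed that always serves the post whose stance is closest to a user's current leaning, while each agreeable post nudges that leaning a little further out. All names and parameters here are invented for the sketch.

```python
import random

def recommend(posts, belief):
    # Engagement-driven ranking: surface the post whose stance
    # (a number in [-1, 1]) most closely matches the user's belief.
    return min(posts, key=lambda stance: abs(stance - belief))

def simulate(steps=200, seed=42):
    rng = random.Random(seed)
    belief = 0.2  # a mild initial leaning on a -1..1 spectrum
    history = [belief]
    for _ in range(steps):
        # Each round, a fresh batch of posts spans the whole spectrum.
        posts = [rng.uniform(-1, 1) for _ in range(20)]
        chosen = recommend(posts, belief)
        # Consuming agreeable content pulls the belief toward the post
        # and reinforces its current direction, then clamps to [-1, 1].
        belief = belief + 0.1 * (chosen - belief) + 0.05 * chosen
        belief = max(-1.0, min(1.0, belief))
        history.append(belief)
    return history

history = simulate()
```

Even though every batch of posts is balanced, the feedback loop drives the simulated user's leaning toward an extreme: the algorithm never has to "want" polarization for polarization to emerge.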

Framing: Changing Our Perception

The way information is framed can drastically alter our perception of it. Ammerman emphasizes the importance of framing in shaping our beliefs. Even something as seemingly trivial as font choice can impact how we interpret information. He demonstrates this with a compelling visual example, driving home the idea that framing is a powerful tool that can be wielded by psychotechnology.

The Prince William Effect: A Cautionary Tale

To illustrate the impact of framing and social media, Ammerman presents the “Prince William Effect.” In this case, an innocent photograph of Prince William addressing a crowd is framed in a way that incites outrage. However, the actual context was quite different; Prince William was joyfully announcing the birth of his third child. This example serves as a stark reminder of how easily we can be manipulated by the way information is presented.

AI and the Perils of Division

Ammerman closed his talk by highlighting the potential consequences of AI-induced societal division. He contends that AI may not directly cause wars but can fuel animosity to the point where conflict becomes increasingly likely. Thus, the responsibility to navigate these treacherous waters lies with each one of us.

The Call to Awareness and Responsibility

As we navigate the age of AI and psychotechnology, it’s essential to recognize the power they wield over our minds. Psychotechnology’s ability to manipulate our vulnerabilities and shape our beliefs demands our vigilance. In his presentation, William Ammerman shed light on the profound impact of psychotechnology and its potential consequences for society. To truly harness the great power at our fingertips, we must also embrace the great responsibility it entails.

Quick Notes

🧠 Psychotechnology: The impact of technology on human psychology, including cognitive biases and addiction.
📱 Groupthink Automation: AI’s power to create groupthink by exploiting biases and controlling information flow.
🌐 Filter Bubbles: Media algorithms reinforce biases by showing information that confirms beliefs.
🧲 Framing Effects: How information is framed changes how it’s perceived and influences beliefs.
🌐 Social Media Polarization: AI-driven algorithms amplify out-group hatred and confirmation bias.
📷 The Prince William Effect: An example of how limited perspectives on social media can fuel anger and misinformation.
🌍 AI’s Influence on Perception: AI has been shaping our worldview for a decade, impacting how we perceive reality.
🔥 The Intelligence Revolution: The significance of AI’s impact on society compared to past innovations.

Watch part 3 of “Living with AI” with William Ammerman, author of “The Invisible Brand,” on YouTube

00:16 Psychotechnology is a powerful technology that affects us psychologically.

Psychotechnology refers to the psychological relationship between humans and machines.

It is personalized, persuasive, able to learn, and human-like.

It has the potential to change society and requires great responsibility.

Understanding and experiencing psychotechnology is important for everyone.

02:39 AI exploits cognitive biases and filter bubbles to create groupthink

Groupthink automation is a power AI has to create groupthink among select groups in society

AI controls the flow of information to keep people addicted to their devices

Cognitive biases, such as confirmation bias, make people attracted to facts that confirm their bias

Filter bubbles created by media algorithms further reinforce people’s existing beliefs

05:01 AI exploits our vulnerability to confirmation bias

Media algorithms create filter bubbles that confirm our biases

We are chemically addicted to the pleasure of having our biases confirmed

AI frames information in a way that appeals to us

07:22 Framing changes perceptions

Changing the framing of information alters its meaning

Framing can be influenced by factors like font choice

Framing affects perceptions of physical and mental models

Negative posts about out-groups on social media are shared roughly twice as often as posts about in-groups

09:44 Confirmation bias drives us to hate others and to believe negative information about them.

Shared information targets out-groups more frequently.

Social media platforms reinforce bias and create divisions.

12:01 The internet reflects your own attitudes

Social media is a reflection of your personality

Tim Berners-Lee’s insights on the impact of love and hatred on social media

The dangers of AI leading to conflicts and wars

The role of media algorithms in creating filter bubbles and confirmation bias

14:16 Social media algorithms amplify and confirm existing opinions, fuelling anger and animosity.

The dopamine released in our brains when we feel validated on social media keeps us coming back for more.

An image of Prince William was shared and circulated on social media, confirming existing biases and stoking anger towards the royal family.

The image actually captured Prince William announcing the birth of his third child, a touching and endearing moment.

16:47 AI frames the way we see the world

Perspective influences perception

AI has been shaping our perspective for 10 years

Different perspectives can lead to disagreement

There are limits to our ability to perceive the world