READ MORE: Understanding Social Media: An Increasingly Reflexive Extension of Humanity (Tech Policy Press)
We may be aware that on social media we exist in an echo chamber of news and opinions that seemingly reflect our own, but we may not quite understand how it all works. That matters if you believe the information disseminated on social media does more harm than good, and if you want to do something about it.
A pair of media academics have done the legwork, writing about it at Tech Policy Press.
Richard Reisman, “an independent media-tech innovator,” and Chris Riley, senior fellow for internet governance at the R Street Institute, lean on the core ideas expounded by media theorist Marshall McLuhan half a century ago.
“We shape our tools and thereafter our tools shape us” becomes a vicious circle as algorithms dictate what we are exposed to in our own increasingly self-contained echo chamber.
Riley and Reisman call it “reflexivity.”
“Modern media increasingly does more than merely reflect the world it presents,” they explain. “It shapes it, such that content and context are inextricably interwoven, and the result is a reflexive chain of collaborative transformation.”
The reflexivity generated by “this increasingly rapid and potentially universal feedback process is making social media as transformative of a change in civilization as language, writing, the printing press, and mass media,” they argue.
The change could be positive or negative, but it’s pretty clear the authors think that if nothing is done to re-wire social media then we’re all in for a rough ride.
Problem is, it’s not enough to simply put in place content moderation filters.
“Social media do not behave like other media,” they explain. “Speech is not primarily broadcast, as through megaphones and amplification, but rather propagates more like word-of-mouth, from person to person. Feedback loops of reinforcing interactions by other users can snowball, or they can just fizzle out if ignored.”
Understanding how to modulate the harmful aspects of these wild messaging cascades requires stepping back. “Social media virality flows more like rumors, relayed step by step,” the authors write. Rather than viewing social media posts as individual items of content, we should see them “as stages in reflexive flows in which we and these new media tools shape each other.”
In other words, we shape our social media — as do all of those who interact with and propagate each message — and, thereafter, our social media shapes us.
Contrary to popular opinion, social media is not an unregulated forum of free speech — at least not in the way that Twitter owner Elon Musk says he wants it to be.
Social media is moderated, filtered and regulated by algorithms. This filtering function, the authors argue, draws from the decisions and actions of millions of human users and “breaks the pairing of sender and receiver, of speaker and audience, that characterize traditional and mass media.”
They continue, “the omnipresence of ranking and recommendation systems has dramatically increased the reflexivity of modern media, with little attention to its dangers and how to limit them.”
The cycle of selection and feedback that emerges from user-engagement “is automated, instant, and global” and it often “collapses context and identity, destroys nuance, and incentivizes performative behavior.”
Anyone can express a view online, potentially to millions of people, which drives engagement with social media. But that same dynamic gets in the way of due consideration and deliberation about what is posted or read and, ultimately, obscures the truth.
Riley and Reisman warn that these changes are about to get more embedded in society. All media will effectively become “social media,” they say, accelerating “faster and more pervasive reaction and reflex, with the same danger of offering fewer opportunities to introduce friction and deliberation.”
As more of our reality is dominated by the digital in the form of the metaverse (beginning with smart watches, moving through VR, and ending with brain-computer and neural interfaces), “we will increasingly depend on automated systems to filter and prioritize the ever-more-overwhelming information flow and become more reliant on systems that continuously learn to amplify our own signals, whether shaped for us as we choose or manipulatively used against us.”
VR and AR may seem clumsy and artificial today, when “it’s still easy for us to separate our understanding of humanity and its digital extensions.” But as the connections to machines grow more seamless, more personal, and simultaneously more complex, “they will increasingly blur the lines between man and machine.”
This is the era of hyper-reflexivity where we will no longer be able to think ourselves out of the box.
So, what’s to be done? The first step is to recognize the problem, which media theorists like these are determined to help us do.
The second is to somehow re-insert some of the friction that has been removed from our social media experience in a bid to make people think twice about the messages they are sending and receiving.
We Shape Our Tools and Then They Shape Us
By Adrian Pennington
“The medium is the message” has become such a hackneyed quote as to have been rendered meaningless. That’s a problem, since it blinds us to critical analysis of the true impact of media technology, according to political analyst Ezra Klein, writing in The New York Times.
Noticing the recent backlash against social media, Klein writes, “The visionaries inventing our future have been recast as the Machiavellians undermining our present. My frustration with these narratives, both then and now, is that they focus on people and companies, not technologies.”
Marshall McLuhan remains the most famous academic to call out the fundamental role of technology itself in shaping our media. We’ve been told, underlines Klein, that mediums are neutral and content is king.
Another McLuhan quote, from 1964, makes the same point in a more pointed way: “The ‘content’ of a medium is like the juicy piece of meat carried by the burglar to distract the watchdog of the mind.”
McLuhan’s view is that mediums matter more than content; it’s the common rules that govern all creation and consumption across a medium that change people and society.
“Oral culture teaches us to think one way, written culture another,” restates Klein. “Television turned everything into entertainment, and social media taught us to think with the crowd.”
Klein gives cable TV as an example. CNN and Fox News and MSNBC may be ideologically different, he says, but cable news in all its forms carries a sameness: the look of the anchors, the gloss of the graphics, the aesthetics of urgency and threat, the speed, the immediacy, the conflict, the conflict, the conflict.
“There is a grammar and logic to the medium, enforced by internal culture and by ratings reports broken down by the quarter-hour. You can do better cable news or worse cable news, but you are always doing cable news.”
In other words, and again to paraphrase McLuhan and his acolytes, don’t just look at the way things are being expressed; look at how the way things are expressed determines what’s actually expressible.
To Klein this means that the medium itself blocks certain messages or certain types of people (while promoting others).
The obvious example is the impact television has had on our politics and on the people who “would be locked out of politics because they couldn’t command the screen.”
Ronald Reagan charmed the public through TV to ascend to the presidency in a trend that has reached its apotheosis with Donald Trump.
On reality TV, “Trump perfected the persona of a ruthlessly effective executive with a particular talent for hiring and firing.” Without “The Apprentice,” would there have been a Trump presidency?
The same can be said of Volodymyr Zelensky, the president of Ukraine, who “secured his job by playing an Everyman who becomes president of Ukraine on a sitcom.”
Klein is worried that the prevailing narrative suggests that if there’s a fault with the echo chambers of social media warping our politics it is the responsibility of tech leaders to alter the algorithm.
He picks out Marc Andreessen, co-creator of the Mosaic browser, co-founder of Netscape, and co-founder of the venture capital firm Andreessen Horowitz (a16z), for attention. Andreessen has suggested that Twitter is like “a giant X-ray machine” in that it unmasks the private feelings of public figures, but Klein disagrees.
“I don’t even think this is true for Andreessen, who strikes me as very different off Twitter from on. There is no stable, unchanging self. People are capable of cruelty and altruism, farsightedness and myopia. We are who we are, in this moment, in this context, mediated in these ways.”
He warns, “It is an abdication of responsibility for technologists to pretend that the technologies they make have no say in who we become. Where he sees an X-ray, I see a mold.”
This is not an argument against technology but for “taking technology as seriously as it deserves to be taken, for recognizing, as McLuhan’s friend and colleague John M. Culkin put it, ‘we shape our tools, and thereafter, they shape us.’ ”
Re-Intermediate for Reflection
One example is to create space for institutional mediators such as journalists and NGOs to weigh in with real-time, third-party perspectives. Traditional models of authority and institutional validation can be reintroduced, the authors suggest, but (in theory at least) in a manner that better matches the scale and pace of modern technology.
“This could create a new marketplace of ideas with supporting institutions, including new and revitalized business opportunities as well as new levers for social good.”
Clearly this isn’t easy to put into practice, even assuming the will to do so exists. The authors suggest that voluntary action by digital platforms, some amount of carefully tuned government intervention, growing pressure from critical community voices, and even the organized actions of employees could all help advance the cause of “re-intermediation and a systemic response to reflexivity.”
The aim, as former President Barack Obama put it in a recent speech on technology and democracy, is to ensure that “tools don’t control us. We control them, and we can remake them.”