Written by: James Marlin, Executive Contributor
Like many of us, I can’t help but notice the tension and divisiveness that seem to be everywhere we turn. This divisiveness has rooted itself in almost everything, from the highest seats in our government to our own kitchen tables.
This can manifest itself in various ways, from simple bickering to outright violence. You may no longer speak to your cousins because they’ve "gone off the deep end", or you could be an innocent teenage kid who gets shot for accidentally knocking on the wrong door.
Although divisiveness and conflict are certainly not new to society, the way they spread has been accelerated by our use of technology. Now, I’m not one of those who think every technological advance spells the end for humanity. But I am also not naive enough to overlook that it certainly has that potential.
Technology is a tool, or even a vehicle, plain and simple. And much like any other vehicle, you can use it to drive your family to beautiful destinations, or you can use it to run over pedestrians on your way to driving off a cliff. The outcome depends on how you use it. And nowadays, its use seems more in line with the latter. But technology is so ingrained in our modern-day lives that saying we need to eliminate it altogether is as ridiculous as it is impossible.
So, what can we do? How is technology contributing to or causing this polarization? And how do we use it to bridge these current divides? In February, I took a trip to San Francisco for the Designing Tech for Social Cohesion conference to find out.
The conference was organized by members of some of the global leaders in peacebuilding, like Shamil Idriss, CEO of Search for Common Ground, and Dr. Lisa Schirch of the Toda Peace Institute. It featured people like Daanish Masood, a technologist at the United Nations Department of Political and Peacebuilding Affairs, and Calista Small of More In Common.
The goal of this conference was to put these peacebuilders in the same room as those who design our technology so we can find ways to embed social cohesion into the code, so we can drive this vehicle toward building empathy and closing perception gaps. Attendees included Tristan Harris of the Center for Humane Technology, Colin Megill of Polis, Andrew Konya of Remesh, and my two favorites, Waidehi Gokhale of Soliya and Lisa Conn of Gatheround.
The road
Steve Jobs once pointed out that computers are like bicycles. They enhance what we can do. Although we can steer the bike, it can only go where the road is paved. And the road the bike travels on is paved in code.
Since the road is currently being paved for profit, the goal is to keep your eyes on the screen. Period. That’s how these big tech companies make money. Unfortunately, there is more profit in our collective dysfunction because, as it turns out, outrage mixed with splashes of vindication is very compelling and satisfying. It will keep you tuned in and coming back for more.
At the beginning of the conference, Shamil Idriss got on stage and laid out the three steps to peacebuilding. Connection leads to collaboration, which leads to breakthroughs. Sounds simple, right? But when I look at where we are today:
1. Connection? We are increasingly divided.
2. Collaboration? We are working against each other, not with each other.
3. Breakthroughs? Things are breaking down.
According to Dr. Schirch, “The road to hell is paved in code.” As she puts it, the platforms we use today—Twitter, Facebook, YouTube, etc.—are designed like “digital coliseums” where “gladiators can fight it out with a passive and anonymous audience cheering and commenting.” The algorithms don’t create an environment for constructive conversation but rather an arena for battle.
The major tech companies formed Trust and Safety groups to create a set of business practices to reduce the risk of exposure to harmful and fraudulent content. But that didn’t last long. The end of 2022 and the beginning of 2023 saw companies like Meta, Amazon, Alphabet, and Twitter drastically reduce the size of the teams that fight online hate and misinformation to cut costs and maximize profit. And more recently, Elon Musk and Twitter started rolling back misinformation rules, which Mark Zuckerberg and Meta quickly followed.
While they were involved, these Trust and Safety groups found that content moderation alone cannot (and will never be able to) keep pace with the amount of hate and misinformation online. We need to build from the ground up. It has to be embedded in the code. It has to be written into the very structure of these platforms. We must start paving this road toward social cohesion rather than the coliseum. We have to design our technology to be a more benevolent manipulator instead of a shit-stirring instigator.
Because, at this point, none of us are blind to the fact that online hate does “spill over” into the real world, with real physical violence. And it’s not just violence that is “spilling over”. A recent warning from the Surgeon General of the United States, Dr. Vivek Murthy, explained the dangers of social media for the mental health of today’s youth. His report points out that social media use among young people ages 13 to 17 is almost universal, at 95%. More than a third of them say they use it “almost constantly.”
This age represents a critical stage in brain and social development. While they are in this critical stage and engaging almost constantly with technology, they “are exposed to harmful content on social media, ranging from violent and sexual content to bullying and harassment,” according to Dr. Murthy.
The point here is that ALL tech design involves social engineering. The consistent use of these devices and platforms embeds habits and values in their users. As Meagan Mitchell from the New Republic said at the conference, “Tech is shaping what it is to be a human right now.” It is time to accept that. And accept what we must do to fix it.
We need to accept and address the fact that many of the tech designers who built this coliseum deliberately engineered design features to make it as addictive as possible. Again, keeping your eyes on the screen is the one and only goal.
And we can see how this is panning out.
Heightened outrage and anger lead to polarization, which means more individuals are being pushed to the extremes of their ideologies. The way the code is written now taps into emotional contagion while stifling emotional communication. It then compounds the effect by enabling it to become a social contagion.
What does that mean?
Emotional contagion is a phenomenon that occurs when the emotional state of one person is transferred to another via unconscious sensory input rather than communication. It gets you riled up about something before you’ve had a chance to think about it logically.
Social contagion is when this emotion spreads from one large group of people to another, rather than from person to person. This means your moods and perspectives, as a group, can influence the moods and perspectives of another group, spreading the contagion farther, faster.
These contagions are one reason we say things can “go viral.” These algorithms and platforms enable and amplify these types of contagions. And as Gustave Le Bon pointed out over 100 years ago in his book The Crowd, groups experiencing an emotional situation can cause the emotional intensity felt by each individual to intensify even further.
This all leads us, as a whole, to begin to bypass things like empathy and understanding. We become locked in our echo chambers with our like-minded tribe. The algorithm feeds either our outrage at the other or our vindication with our brother. Proving to us repeatedly how right we are and how wrong they are.
One of the most important things I learned at the conference was the concept of “perception gaps.” I had never heard of this term before but soon realized we all have them.
Perception gaps – more in common
At one point in the conference, Dr. Schirch invited everyone to get up from where we had been sitting and find another seat at a different table to converse with someone we had not spoken with yet. That is when I met Calista Small from More In Common.
She casually threw out this phrase, “perception gaps,” which initially sounded interesting, but I had no idea what it meant. So, she gave me a link to the Perception Gap Quiz designed by More In Common, which lays it all out right in front of you.
It is quick, easy, and incredibly eye-opening. You start by defining yourself. Are you a Republican, Democrat, or Independent? Male or female, etc. Once they know where you are coming from, they ask for your opinion on gun rights, immigration, and sexual assault, among other things. Then they ask you to guess what percentage of “the other side” would agree or disagree with your feelings on these topics.
For example, “What percentage of Republicans would agree that properly controlled immigration would be good for America?” So, you guess what you think they think. This shouldn’t be hard because we all make these assumptions about “the other side” and what they think all the time.
Next, they take your answers about what you think they think and match them up with their actual views. These views are taken from a survey conducted among 2,100 US adults (974 male, 1,126 female, mean age 49) from November 7th to 10th, 2018—the week immediately following the 2018 midterm elections.
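To make the mechanics concrete, here is a minimal sketch of the arithmetic involved, in Python. The statements and percentages below are hypothetical placeholders of my own, not figures from the More In Common survey.

```python
# A minimal sketch of the perception-gap arithmetic: the gap is simply
# the difference between what you guessed "the other side" believes and
# what the survey says they actually believe. All numbers below are
# hypothetical placeholders, not figures from the More In Common report.

# Your guesses: "What % of the other side agrees with this statement?"
guessed = {
    "Properly controlled immigration can be good for America": 40,
    "Racism still exists in America": 35,
}

# The other side's actual agreement rates, as measured by a survey.
actual = {
    "Properly controlled immigration can be good for America": 85,
    "Racism still exists in America": 80,
}

for statement, guess in guessed.items():
    gap = actual[statement] - guess  # positive: you underestimated agreement
    print(f"{statement!r}: guessed {guess}%, actual {actual[statement]}%, "
          f"gap {gap:+d} points")
```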
This is where things get really interesting. The quiz spits out a graph showing you exactly how far off your assumptions about “the other side” actually are; their report is full of examples.
As the results make clear, a good chunk of our assumptions about each other are just plain wrong. And how we engage with each other on these platforms only fortifies these incorrect assumptions, a.k.a. perception gaps.
If we don’t make an effort to break out of our own echo chambers and engage in curiosity-based conversation with those we disagree with, then how are we ever going to come together and come up with a national course of action on things like gun control, abortion rights, or climate change?
I’m not suggesting that there is a solution that will please everyone. But as the studies by More In Common suggest, there are solutions we can reach that can include more people from both sides rather than one side versus the other. There are issues where our values are closer than we think.
We are not as divided as we may feel.
The inherent problem with placating one side and enraging the other is that it guarantees any progress made in one direction can and will be dismantled after the next election. One step forward, two steps back. Or, as the folks at More In Common put it, “preventing progress on shared concerns, and undermining Americans’ faith in democracy.”
We have to re-energize this idea of a “United” States of America.
Bridging algorithms
So how can we build these platforms in a way that would bring opposing sides together rather than pit them against each other? How can we expose these perception gaps for what they really are?
One suggestion comes from Aviv Ovadya, a technologist and researcher focusing on the impacts of internet platforms and artificial intelligence on society and democracy: what he calls Bridging-Based Ranking. He wrote a report on it in May 2022 for the Technology and Public Purpose Project, published by the Harvard Kennedy School's Belfer Center for Science and International Affairs.
In his report, Aviv explains that platforms like Facebook currently use algorithms that amplify or boost content in your feed by promoting whatever receives engagement in the form of likes, shares, or comments.
As mentioned earlier, the best way to get someone to engage is to enrage them. The second-best way is to echo their currently held beliefs. Again, they want your eyes on the screen for as long as possible. They don't care if the reason you are staring at the screen is good or bad. They also do not care if that information is true or false. So they "feed" you what angers you and what you agree with—further driving the divide with both fury and justification.
As Aviv puts it in his report, these platforms currently reward this divisive content with top placement in your stream of information, "resulting in significant impacts on the quality of our decision-making, our capacity to cooperate, the likelihood of violent conflict, and the robustness of democracy."
He further explains, "In summary, sensationalism and divisiveness attract attention. This drives engagement. Engagement-based ranking systems reward this with more attention. This combination of human psychology (what we pay attention to) and societal incentives (our desire for attention and its rewards) leads to harm; engagement-based recommendations are just a particular way to increase the reward and, thus, the harm. All of this leads rapidly to a race to the bottom."
His suggestion: "We can call such a system a bridging recommender—a recommendation system that rewards content that helps bridge divides. It would use bridging-based ranking to reward content that leads to positive interactions across diverse audiences, even when the topic may be divisive."
In other words, these algorithms would reward posts and articles that would help opposing sides understand each other. And, in my opinion, that should be the goal here. Not to get opposing sides to necessarily agree with each other but rather to help them understand each other.
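For a rough feel of the difference, here is a toy sketch in Python. The scoring rules are my own simplification for illustration, not the formulas from Aviv's report or any platform's actual algorithm: the engagement score counts every reaction, while the bridging score only rewards posts that earn approval from both groups.

```python
# Toy contrast between engagement-based and bridging-based ranking.
# The scoring rules are simplified illustrations of the concept, not
# the actual formulas from Ovadya's report or any platform's code.

# Each post records positive reactions from two opposing groups, A and B.
posts = {
    "outrage_bait":  {"A": 900, "B": 50},    # thrills one group, repels the other
    "bridging_post": {"A": 300, "B": 280},   # positive reactions from both groups
}

def engagement_score(reactions):
    # Engagement-based ranking: total attention, regardless of who gives it.
    return sum(reactions.values())

def bridging_score(reactions):
    # Bridging-based ranking: reward approval across diverse audiences.
    # Taking the minimum across groups means a post only scores well
    # if BOTH sides respond positively to it.
    return min(reactions.values())

for name, reactions in posts.items():
    print(f"{name}: engagement={engagement_score(reactions)}, "
          f"bridging={bridging_score(reactions)}")

# Engagement ranking puts 'outrage_bait' on top (950 vs. 580);
# bridging ranking prefers 'bridging_post' (280 vs. 50).
```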
Because there will always be groups within our society that inherently disagree with each other. But, as Julia Kamin of the Civic Health Project states, “It’s not about getting rid of conflict. It’s about getting rid of conflict that is generated to sell shirts and ad space.” Conflict can be useful for growth. But there is a difference between constructive conflict and destructive conflict. It is how these conflicts are engaged in and resolved that will either promote growth or divisiveness.
It is imperative that we identify where and how technology is diminishing our cross-group communication. Where is it causing our ability to have constructive conversations with someone we disagree with to atrophy? And I believe Aviv and his group have pinpointed a key element that could prove pivotal in turning things around if it is embraced and utilized.
Conflict is inevitable. But nothing resolves conflict and builds empathy better than communication. That’s right. A simple conversation could be the key to this whole thing. But how can conversation be instigated and cultivated through technology? How can platforms be built to explore the complexities and nuance of an issue and expose the interdependence our different tribes rely on to make our society function in a healthy manner? And how can they do this without engaging in the aspects of technology that drive destructive conflict and polarization?
Polis
One of the more intriguing ways in which technology is being designed to help bridge these gaps is called Polis. Polis was founded by Colin Megill and is supported by a group called The Computational Democracy Project.
As they describe on their site, “Polis is a real-time system for gathering, analyzing, and understanding what large groups of people think in their own words, enabled by advanced statistics and machine learning.” They work to bring data science to “deliberative democracy,” hoping to create policies that better reflect public will.
Polis generally works by taking a question or statement, such as “How can we affect climate change?” and allowing the participants to submit short statements reflecting their views. They can also vote to agree, disagree, or pass on statements made by other participants. As these statements and responses accumulate, Polis’ algorithm starts to group participants into clusters of “like-minded” responders.
But my favorite part is that the algorithm can identify “consensus statements” across these diverse clusters and show where these different groups actually agree with each other. Participants can see how this plays out over the days and weeks this conversation is being run. It also allows them to create new statements that respond to what they see in that overview.
They do all this without engaging in a “comments” section, which on most platforms is the seat of the divisiveness and hate we find online. As this conversation evolves, the statements become more specific, realistic, and workable.
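For a feel of the kind of computation involved, here is a heavily simplified sketch. The clustering method (k-means on a vote matrix) and the 60% consensus threshold are my own illustrative assumptions, not Polis's actual implementation.

```python
# A heavily simplified sketch of Polis-style analysis: cluster
# participants by their voting patterns, then surface "consensus
# statements" that clear an agreement threshold in every cluster.
# The method and threshold are illustrative assumptions only.
import numpy as np
from sklearn.cluster import KMeans

# Rows = participants, columns = statements.
# +1 = agree, -1 = disagree, 0 = pass/unseen.
votes = np.array([
    [ 1,  1, -1,  1],
    [ 1,  1, -1,  1],
    [-1, -1,  1,  1],
    [-1, -1,  1,  1],
    [ 1, -1, -1,  1],
])

# Group participants into "like-minded" clusters based on how they voted.
clusters = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(votes)

# A statement is a consensus statement if every cluster's mean vote
# on it clears the agreement threshold.
THRESHOLD = 0.6
for statement in range(votes.shape[1]):
    per_cluster = [votes[clusters == c, statement].mean()
                   for c in np.unique(clusters)]
    if all(mean >= THRESHOLD for mean in per_cluster):
        print(f"Statement {statement} is a consensus statement "
              f"(cluster means: {per_cluster})")
```

Here the two clusters disagree sharply on most statements, but the last column draws agreement from both, so it surfaces as the consensus statement.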
Now, those who have designed this conversation have real-world feedback that can help make practical, far-reaching, and tangible solutions available to decision-makers. These conversations and results have already been applied in places like Bowling Green, Kentucky, and New York City, all the way to Taiwan, to get a real-world consensus on what citizens think and feel about real issues they face in their communities. The local governments can take this feedback and make their decisions based on what the participants have to say. All of this occurs while the algorithm points towards the consensus statements that are reached across the different viewpoints.
Soliya
Another presenter at the conference who piqued my interest was Waidehi Gokhale. At the time of the conference, she was the CEO of Soliya. After the conference, I was able to speak with her, and it was one of the best conversations that came from this event.
Soliya is an international non-profit with headquarters in Manhattan and Tunisia. Soliya has been in operation for over 20 years and was founded by Lucas Welch and Liza Chambers. At the outset, they understood that to bridge divides, they needed to get more people to come together and have meaningful exchanges across those divides. They design each experience to drive dialogue through a virtual exchange in a way that leans toward empathy and curiosity.
Soliya’s original focus was on the 18 to 25 demographic because it is a period of personality crystallization. It is a time in most people’s lives when they find their “tribe” and are exposed to others. Waidehi says, “We thrive in tribe; it is human nature. But wouldn’t it be wonderful to keep the walls of these tribes permeable so we can bend and flex as things inevitably change?”
Soliya is a fee-for-service business in which anyone can come to them and say, “Our community is having trouble with this particular issue.” Or, a professor can come to them and say, “I would like for my students to have a facilitated discussion about this topic.” They have four global versions and one US-based, so no matter where you are, they should be able to accommodate you.
They will do an initial consultation and a design process, then deliver you a customized program with a facilitator. The applications of this are extensive. From the professor in the classroom to a community leader who would like to get 20 people together to address an issue. Or, a company that has expanded to three different regions and is having a hard time getting its people to communicate.
They can also provide facilitator training for those who wish to guide others in difficult conversations. Now you can learn how to have these conversations, and you can have them with others, who can then have them with others, etc.
These dialogue experiences are meant to be facilitated by someone from Soliya who serves as the “invisible hand.” The facilitators are there, but they are not contributing. Because, as Waidehi points out, “Can people come together for conversations? Absolutely. Can these conversations remain productive? Less absolutely.” These facilitators serve as guardrails to keep things on track. And this is important because these topics and issues can be difficult for some to navigate without getting emotional.
“We are not driving to common ground,” she explains. “We are driving through the mud.”
“Let’s ask the hard questions and have the tough conversations, and at the end of it, if we can find agreements, that’s wonderful. But what we are trying to accomplish is for those participating to understand and respect the differences.”
“We can still walk away thinking differently; we can still walk away voting differently, but I am now more human to you, and you know that I have heard and respected what you are saying and feeling.”
“This helps us not just put people in drawers and then close them.” Which is something that we all do.
Keep in mind that although stereotypes do come from somewhere, it is important to recognize, and not weaponize, our differences.
My conversation with Waidehi left me optimistic and energized. It was a relief to know that people like her are working on the solution rather than just identifying the problem.
Gatheround
With things like “alternative facts” these days, it’s hard for these polarized groups to agree on common truths. For example, to have a conversation about shapes, we all have to agree that a square has four equal sides and four right angles. Without this shared understanding, nothing productive or meaningful will come from our interaction. And, more likely, this interaction will devolve into something more destructive.
After the conference, I spoke with the CEO of Gatheround, Lisa Conn. Lisa studied at the MIT Sloan School of Management and the MIT Media Lab. She also worked in community partnerships at Facebook, now Meta.
During her time at Facebook, Lisa gained excellent insight. She noticed that when groups would form, let’s say, “Hiking With Your Kids,” to use her example, the members would begin to interact. People from different backgrounds and beliefs would join under this “shared container”, understanding that they liked hiking with their kids. But, as these groups evolved, Lisa began to notice that, naturally, differences would arise.
They may start talking about the weather during hikes, leading to talks about climate change and differences in opinion. Then the group admin would shut these conversations down, saying something like, “No politics here,” etc.
Lisa’s insight was that these groups are containers of shared understanding and are exactly the space where conversations could be the most productive. But the tools to facilitate this didn’t exist on Facebook. So, she left and co-founded Gatheround.
When developing Gatheround, Lisa decided to implement her platform in the workplace. Because the workplace is where most adults spend most of their time, it is also the place that will have the greatest likelihood of “cross-cutting interactions,” as she puts it. These are interactions between people who do not share the same ideology.
Even though you may not share all of the beliefs of your co-workers, you are more likely to have shared values and a common mission as far as the goals of the workplace are concerned. This helps the walls of the different tribes to be more permeable and easily traversed.
The tools used in Gatheround were developed over her time spent with different peacebuilding organizations, and her approach boils down to Structure + Facilitation = Engagement. In using the tools provided in Gatheround to brainstorm solutions to a business problem, you also learn the tactics of listening, storytelling, compassion, and empathy, with the hope that these will carry over into other parts of your life.
As she gave me a live demo of the platform, I began to get more and more excited. The tools and tactics used are simple and very effective in building camaraderie and empathy. She showed me how a simple question like, “What was your favorite food growing up?” could uncover things like the environment in which you were raised, your socioeconomic background, and your personal preference, ultimately making you a better colleague and collaborator.
Gatheround, as it is today, is the result of many iterations. What started as a list of questions is now a series of quizzes, polls, flashcards, and more. It even has a little photo booth. The experience is highly engaging, fun, and very effective. They now have a “library of experiences”, as she calls them, that you can implement with your team.
This platform is brilliant. For me, this was one of the gems I stumbled upon at the conference. This is how technology can be used to expose the interdependence our different tribes rely on to make our society function in a healthy manner. And it does this without engaging in the aspects of technology that drive destructive conflict and polarization.
What’s next?
I went to the Designing Tech for Social Cohesion conference not knowing what I would find. And although, at times, the problem we face with technology’s role in our collective dysfunction can seem insurmountable, there are ways to correct it. Not only that, there are people who recognize the problem and are implementing effective solutions.
We still have a long way to go. More people need to get involved. More solutions have to be identified and executed. More light has to be shed, not only on the problem but also on the people working to remedy it. We cannot continue to blindly consume every new toy the tech companies throw at us and not examine how it affects us.
As a species, our present and future are now inextricably linked to technology. It will and does modify our behavior, thoughts, and emotions. Ignoring this fact means we have no control over how it affects us moving forward.
At this volatile and shifting point in our history, I feel the late science fiction writer Isaac Asimov may have put it best: “It is change—continuing change—inevitable change—that is the dominant factor in society today. No sensible decision can be made any longer without considering not only the world as it is, but the world as it will be.”
Dig deeper
The Associated Press’ initiative “Not Real News: A look at what didn’t happen this week” publishes weekly roundups exposing untrue stories and visuals.
James Marlin, Executive Contributor Brainz Magazine
James Marlin is a professional questioner, storyteller, dad, and husband with a passion for investigating. He works to distill complex findings into actionable and relatable information through his written articles and keynote talks. Having battled and overcome addiction, James firmly believes in the power of change. In the last five years, James has dedicated himself to investigating our beliefs, emotions, the conscious and subconscious minds, addiction, ADHD, mental illness, and the impact of technology on society. James is enrolled in a Modern Journalism course with NYU in partnership with Rolling Stone Magazine. Alongside his studies, he works as an investigator in the city of New York.