SOCIAL MEDIA’S DAYS OF RECKONING: HOW FACEBOOK AND TWITTER MAY CHANGE

The president’s supporters invaded the Capitol as lawmakers prepared to affirm Joe Biden’s victory.

I do not think it’s possible to overstate the impact of social media, and really the Internet, on what happened. On January 6th, rioters stormed the US Capitol building in an attempt to overturn the results of a free and fair election.

I mean, I don’t think we’ve ever faced a situation where, you know, the president of the United States is effectively advocating armed insurrection against the Congress as it certifies the electoral vote that defeated him.

These events were inspired by President Trump, organized and promoted on the platforms of publicly traded companies, most notably Facebook and Twitter. Five people died. Twitter has just blocked the president of the United States.

To avoid further violence, those companies, and then many more thereafter, banned or blocked President Trump’s access to the megaphone they provide. I’m definitely concerned that a world leader was removed from one of the two platforms which basically contain almost all of our speech today.

This exposed a major flaw in the business model of many social media platforms: share first, think later. There’s nothing magic about January 20. The people who stormed the Capitol, there are many more people like that in other cities.

Without a check on what information is shared, lies spread like wildfire and are just as difficult to stop. Backpedaling is just kind of spineless. It didn’t change the actual perspective of whether there was voter fraud or not. There are many answers, but no consensus on how to protect free speech while respecting facts and keeping these companies profitable.

We know that the government wasn’t going to be overthrown. At the same time, we know that our political leaders have a game plan for political instability, but it’s not as important as that of the corporate leaders, who seem to have a better game plan. You know, if the feedback loop to Facebook was, fix this problem or we will sell your stock, they would have fixed it a long time ago.

So will social media platforms stop the spread of fake news? And how did big tech manage to anger politicians and business leaders alike?

BANNING A PRESIDENT

The idea that a president would ever stoop to the level that President Trump has is hard for most Americans to wrap their minds around, let alone the policy apparatus. Since Section 230 was enacted as part of the Telecommunications Act of 1996, social media platforms have had the ability to moderate what kind of speech is posted on their sites.

So Twitter and Facebook both have the standard of imminent harm. If you post something that’s going to lead to imminent harm on their platforms, they will suspend you and sometimes they’ll ban you.

What was the implication of doing that, of limiting free speech? And what is the implication of doing that to the president of the United States? I think we can’t underestimate in this situation the fact that we are going to have a different president in a couple of weeks.

Taking Trump off the platform is a desperation act, and I think at this point probably very important. I commend Facebook for doing that. But I would point out that I don’t think it solves the problem: essentially, they’ve allowed this kind of behavior to build for so long that it’s tipped over into the real world. This isn’t just an issue in the US. Facebook has previously admitted it was used to incite violence in the Rohingya genocide.

SHORT-TERM PROFITABILITY

This whole thing has become such a huge problem that we really need to rethink the role of Internet platforms in our society. Their product is based entirely on virality. They want you to share content based off of instinct. You hit the retweet button, you hit the share button, and you don’t really think a lot about the information that you’re passing along. So it ends up that the content that does well on that platform is stuff that plays to emotion and plays to instinct, and not thoughtfulness.

We optimized for short-term profitability at the expense of our democracy. And what we left in tatters was any sense that there was any sort of moral or ethical imperative that would govern decision making at that company. And so that saddens me. It saddens me for the people that work there, and as a businessperson.

I think we’re also slightly to blame, because we’ve sort of, you know, said that that’s OK, because we’ve been enamored with the short-term profitability of Facebook. The platforms over the last couple of years have consistently adjusted policies to really make way for Donald Trump to say what he wants to say. You’ve seen it over and over. You’ve seen it with the creation of different labels and checkmarks and different ways of limiting the ability of the information to spread, but still showing the information.

And it turns out, in human nature, that hate speech and disinformation and conspiracy theories are particularly engaging. They force us to look; it’s part of our survival instinct, and that business model is the problem. It’s the amplification of dangerous content that I would like to restrict. Changing the business model to better highlight thoughtful instead of emotional content could help fight the spread of misinformation. But users and the news media play a role in protecting facts, too.

There is no doubt that Donald Trump has attracted people to Twitter. But if you really look at his social media superpower, I don’t actually think it’s the tweets that go up on Twitter. It’s the fact that those tweets then become the news. Users can do more to think about what it is that they’re sharing, because it’s so easy to, like, not have any hesitation at all and just click based off of emotion. This is an example I like to give. You know, if you see something that says Hillary Clinton is a space alien, you’re going to know that’s not true.

But if you don’t like Hillary, you might hit that share button and end up passing it along to all your followers. But if you actually had to copy and paste that and put it under your own name, you might have second thoughts. So I think there are two parts to this. One is, it’s on us to be smarter about the way that we share. But two is, it’s on the platforms to put some speed bumps in, to get to a place where we don’t end up having so much instinctual sharing and we end up having more thoughtful, even-keeled sharing.

PUBLISHERS AND ANTITRUST

I think Section 230 needs to change, and I think that these social media companies need to be dealt with as publishers. I think at the front of the line is Facebook, because they are algorithmically deciding what people see. There are people inside of that company that are building these things, that are amplifying the lobotomization, the intellectual cornering of people, so that they cannot learn what’s really happening, so that their worst fears and their worst concerns are amplified. And we need to do a better job of understanding that that diet is unhealthy.

If they are regulated as publishers, it would entirely change how the companies operate, requiring approval before every post. They are a publisher, just like you and I.

We have a responsibility to tell the truth to our audiences and they have the responsibility with so many people turning to these platforms for news and information to make sure the news and information they get in front of them is true and accurate. And they just failed at that. And so many of these things, these conspiracies that were promoted by the president start on the dark corners of the Internet and kind of filter up from the dark corners to Twitter to Reddit to Facebook.

And before you know it, the mainstream media is covering it. Now that Democrats have taken the Senate, the House and the White House, I find that there’s very little chance of anything happening to Section 230. Joe Biden has expressed some interest in looking at the law. But I think that when you’re a candidate and when you’re a president, you experience two different realities. Many social media giants have ballooned well past their original intentions.

Facebook, which famously started in Mark Zuckerberg’s dorm room, reported over 196 million daily active users across the US and Canada. In the third quarter of 2020, its revenue was $21.47 billion. The really central thing for investors is that regulation is definitely coming. I think you’re going to see it in three areas: safety, which really is about having accountability for products, to make sure that they make an effort to prevent harm.

Section 230 and all of that is a part of that problem. Then you’re going to see regulation of privacy. We’ve seen California, and we’ve also seen Apple, implement really important privacy measures. And then lastly, antitrust, where we’re very far along.

Yesterday, lost in all of the news, Facebook announced that it will require WhatsApp users to agree to a merging of their WhatsApp and Facebook data. This is a challenge to antitrust regulators. Mark and the team there have been saying for years, we’re going to integrate the back end of all these services, one of the effects of which will be to make it virtually impossible to pull them apart.

And then on the other side, you’ve got folks like Apple, who are charging, you know, a 30 percent vig, and I’m sure they wouldn’t appreciate the mafia term vig, but a 30 percent charge on all subscriptions and anything you buy on iOS. It’s hard to argue that’s not leverage in the marketplace and a monopoly, because if you didn’t have a monopoly, you wouldn’t be able to charge 30 percent.

So I think they’re going to be coming at them from lots of different angles. And I think they should be rightly concerned.

ECHO CHAMBERS

I think the platforms really need to take a look at themselves and what they’re causing society to do, and say, is there anything we can fix within our product that’s going to make a difference here?

I think that we have to make a very clear distinction here. I think Twitter and Facebook are two very different products. The thing with Twitter is, I actually think they’re less to blame. And the reason is because the product is designed for everybody to be able to follow everybody else.

So there is an incredible free flow of information. It does create a different problem, which is, how do you construct a healthy social media diet? But that’s what the problem is on Twitter. The problem on Facebook is different, which is that it amplifies echo chambers on purpose and by design. And even if you block a user from Facebook or Twitter, they can find other digital communities outside the purview and regulation of big tech.

When you ban someone from one platform, they don’t just vanish from the Internet; they go somewhere else. And we’ve seen a real migration from places like Facebook and Twitter to free-speech social networks like Gab and Parler, where content moderation is not enforced. So you end up seeing a lot of the QAnon-type stuff that is not allowed on the bigger platforms anymore migrate over there and get more radical, because people are in an echo chamber and they just keep talking to each other and amplifying their own views. And there’s really nothing else except for that imagined reality.

I think you’re going to see a lot of the people who are leaving Parler say, find me on Telegram or find me on Gab, and put up their contact information. And there will just be this migration. And at a certain point, you do hit a place where you are outside the purview of big tech. Gab, for instance, set up its own servers, which is why it’s still running while Parler is down.

I don’t think they’re going to be channeled elsewhere. We’ve seen so many attempts to create alternatives to Facebook and Twitter for these kinds of people who feel like they’re being censored on the major platforms. Parler is still a small player. We’ve seen Gab try and fail to become the Twitter for the alt-right. They never work out. And that’s because these platforms have gotten so large there’s nowhere else to go.

A lot of this is about enforcement. At the end of the day, in many cases the sites have the right policies and have articulated them over time. And the line between allowing for pretty extensive political speech and wide, robust, open debate, and the move to violent insurrection, can sometimes, unfortunately, be quite thin.
