Twitter announced today the selection of two proposals to study the health of its network.
The selection comes after a public request for proposals in March to study the network's "health metrics," which Twitter CEO Jack Dorsey said was part of the company's commitment to "increas[ing] the collective health, openness, and civility of public conversation."
We’re committing Twitter to help increase the collective health, openness, and civility of public conversation, and to hold ourselves publicly accountable towards progress.
— jack (@jack) March 1, 2018
On Friday, Twitter held its Q2 2018 earnings call, where it reported a drop of one million monthly active users -- a decline partially attributable to its sweeping removal of accounts from the site.
That account removal was also part of the company's larger effort to improve the user experience by eliminating accounts belonging to trolls, spammers, and malicious bots.
In the days since the earnings call, Twitter's stock has dropped by as much as 27%.
[Chart: Twitter's stock price. Source: Google]
While it's unclear if the timing of today's announcement was in any way a response to the fallout from its earnings call, it doesn't seem to have boosted investor confidence.
Here's a look at the proposals, as well as the public perception of them.
"Examining Echo Chambers and Uncivil Discourse"
The first study -- which will be led by researchers from four universities -- will examine how different "communities" of Twitter users come together when political discussions take shape, and the issues that sometimes arise from them.
One of those issues is the formation of these communities into digital echo chambers, which is what often happens "when discussions involve only like-minded people and perspectives," as Twitter's official statement about the proposal describes it.
That can cause a greater reluctance to hear or try to understand other points of view (and those who hold them), leading to the uncivil discourse alluded to in the study's title.
Researchers say such discourse chiefly comprises two types of "problematic" behavior on Twitter.
The first is "incivility," which the statement essentially chalks up to rude behavior and dialogue among platform users.
The second is "intolerant discourse," which is more severe -- and includes things like hate speech and racism.
One of the projected outcomes of this study is the development of algorithms that can differentiate incivility from intolerant discourse. While one is impolite, the statement says, the other "is inherently threatening to democracy."
Additionally, the study seeks to measure just how much Twitter users actually acknowledge and participate in conversations with those who hold other viewpoints. Whether or not it will also measure the nature of that discourse -- and how uncivil or intolerant it is -- remains unclear.
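To make that distinction concrete, here is a minimal, purely illustrative sketch of how "incivility" versus "intolerant discourse" could be framed as a text-classification problem. This is not the researchers' method (which has not been published); it assumes a standard scikit-learn setup, and the example tweets and labels are invented.

```python
# A toy, purely illustrative sketch -- NOT the researchers' actual approach --
# of framing "incivility" vs. "intolerant discourse" as a labeling problem.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Tiny invented training set; real research would need large, carefully labeled corpora.
tweets = [
    "Your argument is idiotic and you should be embarrassed.",  # rude, but not group-targeted
    "Nobody asked for your dumb opinion.",                      # rude, but not group-targeted
    "People like you don't belong in this country.",            # targets a group
    "That whole group is subhuman and should be silenced.",     # targets a group
]
labels = ["incivility", "incivility", "intolerant", "intolerant"]

# TF-IDF features feeding a logistic regression classifier.
model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(tweets, labels)

# Classify a new message; with a corpus this small the output is illustrative only.
print(model.predict(["Your opinion is dumb and you should be embarrassed."]))
```

In practice, the hard part is exactly what the study proposes to work out: defining the boundary between rudeness and speech that is "inherently threatening to democracy" consistently enough that labeled data -- and any model trained on it -- is meaningful at Twitter's scale.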
"Bridging Gaps Between Communities on Twitter"
The second study -- which appears to complement the first -- seeks to determine to what extent user-to-user engagement with different viewpoints can decrease prejudice and discrimination.
Led by researchers from the University of Oxford and the University of Amsterdam, this study builds upon previous findings that when discourse between different groups includes exposure to different perspectives -- among other factors, like critical thinking -- it can reduce prejudice.
An update! We’ve selected 2 partners from 230 idea submissions. Our first goal is working to measure the “health” of public conversation, and that measurement be open and defined by third parties (not by us). https://t.co/QjUg5P1RLZ
— jack (@jack) July 30, 2018
Greater Context
This announcement comes after months of criticism of Twitter's efforts to improve the user experience on its network.
As recently as Sunday, for example, it was discovered that some users' accounts were locked for including the words "Elon Musk" (the name of the Tesla and SpaceX CEO) in their names or handles.
If you put "Elon Musk" in your handle, the very next Twitter screen you click to will be a notice that your account has been locked.
— April Daniels (@1aprildaniels) July 29, 2018
They have the technology to clamp down on Nazis using Nazi buzzwords to organize. They could do that. They chose not to. pic.twitter.com/diDjfW1DpG
HubSpot Art Director Tyler Littwin had a similar experience.
"Twitter was abuzz with all these 'don't change your name to Elon Musk' jokes, and I was curious as to whether this was real or not," he says. "Long story short: it is."
While Twitter is punishing that behavior, many say that, at the same time, the network has not only allowed users who are arguably far more detrimental to conversational health -- like white supremacists -- to use the platform freely, but has also verified their accounts.
Littwin said that discrepancy contributed to his confusion. "I tried to change my name to 'Real Elon Musk' and was immediately locked out of my account," he explains. "Not a major hassle to deal with, but it still seems like a weird policy for Twitter to aggressively pursue."
That points to a potential flaw in the selection of these proposals. While the metrics the researchers aim to develop are critical measures of Twitter's conversational health, neither proposal appears to offer solutions to the problems it might uncover. As Kia Kokalitcheva of Axios writes, "This won't solve some of the big criticisms of the company, including its policies and enforcement regarding abusive and harassing behavior."
To get a better idea of the public perception of these studies, we asked 717 internet users across the U.S., UK, and Canada: Which study should be prioritized?
[Chart: survey results. Data collected with Lucid. Survey participants were provided with a description of each study.]
The results point to the idea that the studies work in tandem. One looks to measure to what extent Twitter users engage with other points of view, while the other seeks to determine if that engagement can reduce prejudice and discrimination.
But again -- if the answer to these questions ultimately is found to be "not much" and "no" -- then what?
Twitter has acknowledged that these studies are "ambitious," perhaps implying that reaching potential solutions -- even with detailed metrics from independent researchers -- could be a prolonged exercise.
As for whether or not it will result in tangible change -- that remains to be seen.
"We simply can’t and don’t want to do this alone," Dorsey tweeted when the request for proposals was first announced in March. "This will take time."