Hard Truths From Hashtags: Learning From the Twitter Files

In 2016, 98 percent of college students in the Western world had a Facebook account. Despite this, it seems we rarely think about the forces that dictate what we see and say on social media.

Consider, for example, the Twitter Files: a set of internal documents about Twitter’s content moderation decisions on a multitude of controversial issues, such as tweets about Covid-19 that conflicted with White House policy, the New York Post’s coverage of material on Hunter Biden’s laptop, and United States government accounts attempting to promote pro-U.S. interests in the Middle East.

The Twitter Files showcase instances of moderation decisions that favor certain viewpoints or philosophies. This is at odds with Twitter’s prior vow to enforce its rules “impartially and consistently, considering the context involved.” Moreover, the lack of clarity in moderation practices — as evidenced by “secret blacklists” — contradicts Twitter’s older commitments to transparency and accountability.

While we believe that social media companies like Twitter are entitled to set their policies as they wish, these practices should be clearly communicated and uniformly enforced so that users can make an informed decision as to whether they will use the platform.

When it comes to censorship, there should be a free market of sorts. Private institutions may censor or promote whatever they like, but only at their own peril. If patrons object to a company’s moderation policies, they can move to another platform whose guidelines they prefer.

Key to this paradigm, however, are consistency and transparency. Twitter failed on both counts. Its user base needed to understand the full extent of the site’s actions and moderation policies, so that they could have knowledgeably chosen to abandon the platform in case of objection.

In contrast, we commend Apple, which, in 2016, stuck by its stated policies and principles on data security even under intense scrutiny from government and law enforcement — for example, when asked to unlock the iPhone of a terrorist. By being clear and consistent in its policies, Apple has empowered consumers to make informed decisions about their engagement with the company’s products.

This transparency also helps patrons determine how they should best conduct themselves on a given platform. When censorious policies are clear, users can take individual steps to avoid consequences like being shadow banned or removed from the platform.

This issue of speech policies lacking clarity is shared by Harvard as well. Our University has vague policies governing student conduct, use of computer networks, and student organizations.

It is legitimate to have such policies; these are all aspects of the student experience that universities have reason to moderate. However, when these policies are ill-defined, there exists opportunity for administrative abuse and overreach.

Our use of computers and networks policy provides some hard rules for what appropriate usage allows, but leaves a lot of room for discretion — which it fills by telling us to be “careful, honest, responsible, and civil.” These are commendable principles, but what does it actually mean to be “civil,” for example? We should not have to ponder philosophical questions like this before determining if our speech online may be restricted or punished.

As an example of a better approach, American University’s responsible use of technology policy, while acknowledging that there may be cases not covered by the policy, makes it far more explicit what is allowed and what is not. The university bans the use of networks to “harass, stalk, intimidate, or impersonate another person,” dox individuals, or exchange unauthorized copyrighted materials, to name three of its eleven listed prohibited actions. Furthermore, American University makes clear what the consequences of violating these policies may be.

As phone-addicted college students, we are subject to speech codes in every waking moment, whether or not we know it. Understanding how algorithms on social media and written rules at our own University shape what we are exposed to should be taken very seriously.

Our entanglement with social media is not just a passive doom-scroll through streams of content — it’s an active engagement with platforms that wield significant power over public discourse. It’s imperative that we demand the transparency and consistency necessary to be informed consumers.

Milo J. Clark ’24 is a Physics concentrator in Lowell House. Tyler S. Young ’26 is a joint concentrator in Electrical Engineering and Chemistry & Physics in Leverett House. Their column, “Voices Unbound,” runs bi-weekly on Tuesdays.
