Why Digital Rights Are Important to Avoid Ubiquitous Oppression from Data
In response to the latest Facebook whistleblower, digital rights and ethics in technology are taking center stage. Ever since 2013, National Security Agency (NSA) whistleblower Edward Snowden has called for new international laws to protect data privacy, arguing that now that citizens know about mass data surveillance, it is time to “assert our traditional and digital rights so that we can protect them.”
What Are Digital Rights?
Digital rights refer to technological human rights that allow people to “access, use, create, and publish digital media, as well as access and use computers, other electronic devices, and communications networks.” Digital rights are closely linked to the United Nations' Universal Declaration of Human Rights, applying the declaration to the online world. Types of digital rights include universal and equal access to the internet; freedom of expression, information, and communication; privacy and data protection; the right to anonymity; the right to be forgotten (removed from internet searches); the protection of minors; and intellectual property.
However, the existence of digital rights may also bring about their exploitation, and the field of digital ethics developed to prevent violations of these rights. This idea led to the creation of the Electronic Frontier Foundation (EFF), a non-profit organization founded by Internet activists John Perry Barlow, Mitch Kapor, and John Gilmore. EFF's work defending digital rights cases in court laid the foundation for the international recognition of cyber-rights. For example, in 1996, Barlow published an article titled A Declaration of the Independence of Cyberspace in which he noted that mail sent through the postal service was inviolable but email was not, highlighting discrepancies between the Bill of Rights and violations of citizens' rights on the Internet.
The Facebook Whistleblower
Frances Haugen, known colloquially as the Facebook Whistleblower, worked at Facebook as a product manager for civic misinformation. She studied how the social network's algorithm amplified misinformation and was exploited by foreign adversaries. She had previously worked at companies such as Google, Yelp, and Pinterest, but the problems she saw at Facebook were far worse than at any of the other companies. Before Haugen left the company in May, she copied tens of thousands of pages of internal research, which she then gave to the Wall Street Journal for its series about Facebook. Haugen has since spoken about her decision on 60 Minutes and testified before the Senate Commerce Subcommittee on Consumer Protection. Haugen also filed complaints with the Securities and Exchange Commission, hinging on the claim that Facebook misled investors and advertisers by misrepresenting what the company knew about how the platform was used to spread misinformation and the measures (or lack thereof) the company was taking to fix the problem. A turning point for Haugen came when the company dissolved its civic integrity team after the 2020 U.S. election.
“I saw that Facebook repeatedly encountered conflicts between its own profits and our safety,” Haugen said in her testimony. “Facebook consistently resolved those conflicts in favor of its own profits. The result has been a system that amplifies division, extremism, and polarization and undermines societies around the world. I came forward because I recognized a frightening truth: almost no one outside of Facebook knows what happens inside Facebook. The company’s leadership keeps vital information from the public, the U.S. government, its shareholders, and governments around the world.”
NPR reports that Haugen's lawyers allege that Facebook violated U.S. securities laws by lying to investors. They have also filed eight complaints with the Securities and Exchange Commission concerning Facebook's public statements about what the company knew about how organizers of the Jan. 6 Capitol siege used its platform; how effective the site was at removing hate speech; and how Instagram amplified body image issues among teens and young users.
The Future of Digital Rights
Non-profit research initiatives such as Ranking Digital Rights have found that internet and telecommunications companies are failing, to varying degrees, to respect their users' rights to digital privacy and freedom of expression.
Most requests for user information, such as Facebook users' IP addresses and account details, have come from US law enforcement agencies. Brands, meanwhile, want to examine the content people create and share, their social media profile information, and location data from their mobile phones to figure out how each person spends their money.
A legislative framework is needed to protect digital rights. Unlike human rights, which are anchored in the UN Declaration of Human Rights, digital rights lack an international consensus, leading each country to develop its own digital rights charter. In 2018, the EU's General Data Protection Regulation (GDPR) came into force to protect citizens' personal data while allowing the free movement of data. More than 120 countries worldwide now have some form of legislation protecting personal data and access to information on the Internet. The US, by contrast, has no uniform federal data protection law; each state makes its own. Countries such as the US need to take legislation on digital rights seriously and rein in powerful tech giants such as Facebook.