The Dartmouth

Stanescu-Bellu: Ethics of Data

Facebook's recent scandals have irrevocably damaged users' trust.

Data breaches are not a novelty. They’ve existed ever since humans first started recording data and are an inherent risk of storing information. The digital age, however, has made them far easier to execute: in theory, someone on the opposite side of the world could infiltrate a company’s database, siphon off the records of millions of people with a few nifty lines of code and sell them, all in a few clicks. Since 2005, more than 8,000 data breaches have been made public, exposing over ten billion records. The recent Facebook-Cambridge Analytica scandal at first seems like just another unfortunate statistic, but its implications are ominous for the future of digital privacy.

One of the first lectures in the first computer science class I took at the College introduced a notion that has stayed with me as I pursue the computer science major: perhaps the greatest asset of studying computer science at Dartmouth, a liberal arts institution, is learning not only to write good code but also to understand the ethical implications of that code and to remain cognizant of its impact on the world. Mark Zuckerberg either did not consider or did not care about the ethical implications of Facebook making user data so readily accessible, and that approach has now come back to haunt him.

Specifically, the Cambridge Analytica scandal did not involve a “breach” of user data at all; it was a gross abuse of personal data shared with third-party applications on Facebook. People willingly shared their information with an app that funneled data to Cambridge Analytica, but were manipulated into allowing that app to extract data from their unwitting friends as well. There was no malicious hacker sitting behind a computer, gleefully executing scripts. In this case, the “hacker” was Alexander Nix: the CEO of Cambridge Analytica, an Eton College graduate, a former financial analyst and the mastermind who engineered the mass extraction of data.

Facebook deserves more than a slap on the wrist for this infraction. Privacy, which should be a basic human right, was forgone in favor of a revenue strategy that did not take users’ rights into account. How can people now trust that the information they share on Facebook is safe? The photos people liked, the pages they follow, their entire friend lists — things they once thought were entirely under their own control — were compromised and used against them to build psychological profiles, find weak spots and manipulate them. Facebook knew for two years about the loophole that allowed Cambridge Analytica to obtain this information and did nothing about it. No apology from Mark Zuckerberg can erase the negligence that led to this situation.

For me, this raises the question: what is happening to the data I have stored in other applications? Where does my Google Chrome browsing history go? Where are my Siri questions stored? Some might say that by using these websites and applications, people renounce control over their personal data, that it is impossible to truly regulate what happens to that data or how it is analyzed and used, and that occasional breaches of privacy should simply be accepted. But why should this be the norm? As technology advances and data becomes more accessible, for both good and malicious purposes, it also becomes easier to protect. With greater transparency and stricter limits on who can access sensitive user information at the companies entrusted with it, users could feel comfortable sharing details about their lives and using these applications.

If the U.S. had something similar to Europe’s soon-to-be-implemented General Data Protection Regulation, which gives consumers greater insight into and control over how their data is used, maybe Facebook would have been more stringent and diligent in policing the flow of data on its website. But dwelling on counterfactuals is pointless. There is no going back from this scandal. Trust has been broken, and the shape of the web will be irrevocably altered. Whether that is for better or for worse remains to be seen.