In March 2018, The New York Times and The Observer of London published explosive investigations revealing that Cambridge Analytica, a British political consulting firm, had obtained the personal data of up to 87 million Facebook users without their consent. The data had been harvested through a third-party application on Facebook's platform, then used for political profiling and targeted advertising during the 2016 U.S. presidential election and the Brexit referendum. The scandal exposed catastrophic failures in third-party data governance and led to the largest privacy enforcement action in history.
How the Data Was Harvested
In 2014, Dr. Aleksandr Kogan, a psychology researcher at the University of Cambridge, created a Facebook application called "This Is Your Digital Life." The app presented itself as a personality quiz and was installed by approximately 270,000 Facebook users, who consented to share their profile data for academic research purposes.
However, Facebook's platform policies at the time allowed third-party app developers to access not only the data of users who installed the app, but also the data of all of those users' Facebook friends. Through this mechanism, Kogan's app harvested the personal data of approximately 87 million Facebook users — more than 300 times the number who actually consented to use the app.
The collected data included public profile information, page likes, birthdays, and current cities, and in some cases the content of private messages and news feed posts. Kogan then transferred this dataset to Cambridge Analytica, violating Facebook's terms of service, which prohibited developers from selling or sharing collected data.
Facebook's Failure of Third-Party Oversight
Facebook learned about the unauthorized data transfer to Cambridge Analytica in December 2015, when a Guardian journalist first reported on the connection. Facebook demanded that both Kogan and Cambridge Analytica certify they had deleted the data. Both parties provided written certifications.
However, Facebook did not verify that the data had actually been deleted. It did not conduct an audit of Cambridge Analytica's systems, did not notify the affected users, and did not report the incident to regulators. It was not until the 2018 investigative reporting by the Times and the Observer — more than two years later — that the full scope of the data misuse became public.
This failure of vendor oversight is a textbook example of inadequate third-party risk management. Facebook operated a platform that granted third-party developers access to vast amounts of user data but lacked the controls, monitoring, and enforcement mechanisms to ensure that data was handled responsibly.
Regulatory and Legal Consequences
The fallout from the Cambridge Analytica scandal was unprecedented in the technology industry:
- FTC Fine: $5 Billion. In July 2019, the U.S. Federal Trade Commission imposed a $5 billion penalty on Facebook, the largest privacy fine in history and approximately 20 times larger than any privacy or data security penalty previously imposed anywhere in the world. The settlement also required Facebook to establish an independent privacy committee on its board of directors and submit to regular third-party audits of its privacy practices.
- SEC Settlement: $100 Million. The U.S. Securities and Exchange Commission fined Facebook $100 million for misleading investors about the risk of third-party misuse of user data.
- UK ICO Fine: £500,000. The UK Information Commissioner's Office fined Facebook the maximum allowable amount under the pre-GDPR Data Protection Act 1998. (Had GDPR been in force at the time of the breach, the fine could have been up to 4% of global annual revenue.)
- Cambridge Analytica dissolved. Cambridge Analytica filed for bankruptcy in May 2018, just two months after the scandal broke.
- Congressional testimony. Facebook CEO Mark Zuckerberg testified before the U.S. Senate Commerce and Judiciary Committees in April 2018, facing questions about the company's data practices and third-party oversight.
Impact on Global Privacy Regulation
The Cambridge Analytica scandal became a catalyst for privacy regulation worldwide. While the EU's General Data Protection Regulation (GDPR) had already been adopted in 2016, the scandal provided powerful public momentum for its enforcement when it took effect in May 2018. The incident also accelerated the development and passage of the California Consumer Privacy Act (CCPA) in 2018, Brazil's LGPD, and privacy legislation in numerous other jurisdictions.
For the third-party risk management community, the scandal was transformative. It demonstrated that data shared with third parties can be re-shared, repurposed, and weaponized in ways the original platform never intended or anticipated. It made clear that vendor risk does not end at the point of data transfer — it persists for as long as the data exists.
Lessons for Third-Party Risk Management
The Facebook-Cambridge Analytica case provides critical guidance for TPRM programs governing data-sharing relationships:
- Minimize data access. Grant third parties the minimum data necessary for their stated purpose. Facebook's policy of allowing apps to access friends' data was a systemic design failure.
- Verify, don't trust. Written certifications from vendors are not sufficient. Organizations must conduct periodic audits to verify that third parties are handling data in accordance with agreements.
- Implement data flow monitoring. Automated systems should track how data moves through third-party relationships and flag unauthorized transfers or access patterns.
- Plan for downstream sharing. TPRM assessments must account for the possibility that a vendor shares data with fourth parties, creating cascading risk that is difficult to control.
- Respond promptly to violations. Facebook's two-year delay in addressing the known data misuse dramatically amplified the damage. Incident response procedures must include timely notification of affected individuals and regulators.
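The data flow monitoring lesson above can be sketched in code. The snippet below is a minimal, illustrative example, not any real platform's API: it assumes a hypothetical allowlist of approved vendor/data-category pairs and per-category volume ceilings, and flags access events that fall outside either. The names `AccessEvent`, `APPROVED_ACCESS`, and `VOLUME_LIMITS` are invented for this sketch.

```python
from dataclasses import dataclass

# Hypothetical record of a single third-party data access event.
@dataclass(frozen=True)
class AccessEvent:
    vendor: str
    data_category: str
    record_count: int

# Allowlist: which data categories each vendor is contractually
# approved to access. Vendor names and categories are illustrative.
APPROVED_ACCESS = {
    "quiz_app": {"public_profile", "page_likes"},
}

# Per-category volume ceilings; exceeding them may indicate bulk harvesting.
VOLUME_LIMITS = {"public_profile": 10_000, "page_likes": 50_000}

def flag_events(events):
    """Return (event, reason) pairs for out-of-scope or excessive access."""
    flagged = []
    for e in events:
        approved = APPROVED_ACCESS.get(e.vendor, set())
        if e.data_category not in approved:
            flagged.append((e, "unapproved data category"))
        elif e.record_count > VOLUME_LIMITS.get(e.data_category, 0):
            flagged.append((e, "volume exceeds limit"))
    return flagged

events = [
    AccessEvent("quiz_app", "public_profile", 270_000),  # bulk pull
    AccessEvent("quiz_app", "private_messages", 1_500),  # out of scope
]
for event, reason in flag_events(events):
    print(f"ALERT: {event.vendor} / {event.data_category}: {reason}")
```

A real implementation would feed such checks from platform audit logs in near real time; the point is that both scope violations (a category never granted) and volume anomalies (a "quiz" pulling hundreds of thousands of profiles) are detectable signals that were available in principle in 2014.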
The Facebook-Cambridge Analytica scandal remains a defining moment in the history of data privacy and third-party risk management. It proved that even the world's largest technology platform can lose control of user data when third-party governance is treated as an afterthought rather than a core security function.
Sources & References
- FTC Imposes $5 Billion Penalty and Sweeping New Privacy Restrictions on Facebook - Federal Trade Commission, July 2019
- Disinformation and 'Fake News': Final Report - UK Parliament Digital, Culture, Media and Sport Committee, February 2019
- How Trump Consultants Exploited the Facebook Data of Millions - The New York Times, March 2018
- Facebook to Pay $100 Million for Misleading Investors About Data Misuse - U.S. Securities and Exchange Commission, July 2019
- Facebook Ireland Ltd Monetary Penalty Notice - UK Information Commissioner's Office, October 2018