April 10, 2018 · Breach

The Facebook-Cambridge Analytica Data Scandal

In March 2018, The New York Times and The Observer of London published explosive investigations revealing that Cambridge Analytica, a British political consulting firm, had obtained the personal data of up to 87 million Facebook users without their consent. The data had been harvested through a third-party application on Facebook's platform, then used for political profiling and targeted advertising during the 2016 U.S. presidential election and the Brexit referendum. The scandal exposed catastrophic failures in third-party data governance and ultimately led to the largest privacy penalty in FTC history.

How the Data Was Harvested

In 2014, Dr. Aleksandr Kogan, a psychology researcher at the University of Cambridge, created a Facebook application called "This Is Your Digital Life." The app presented itself as a personality quiz and was installed by approximately 270,000 Facebook users, who consented to share their profile data for academic research purposes.

However, Facebook's platform policies at the time allowed third-party app developers to access not only the data of users who installed the app, but also the data of all of those users' Facebook friends. Through this mechanism, Kogan's app harvested the personal data of approximately 87 million Facebook users — more than 300 times the number of users who had actually installed the app and consented.
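The scale of that amplification can be checked with simple arithmetic using the figures reported above: a rough back-of-the-envelope sketch, nothing more.

```python
# Friend-data amplification, using the figures cited in this article.
installers = 270_000      # users who installed "This Is Your Digital Life"
affected = 87_000_000     # profiles ultimately harvested via friend access

amplification = affected / installers
print(f"Each install exposed ~{amplification:.0f} profiles on average")
# prints "Each install exposed ~322 profiles on average"
```

Each consenting installer, in other words, inadvertently exposed the data of several hundred friends who never interacted with the app at all.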

The collected data included public profiles, page likes, birthdays, current cities, and in some cases, the content of private messages and news feed posts. Kogan then transferred this dataset to Cambridge Analytica, in violation of Facebook's terms of service, which prohibited developers from selling or sharing collected data.

Facebook's Failure of Third-Party Oversight

Facebook learned about the unauthorized data transfer to Cambridge Analytica in December 2015, when a Guardian journalist first reported on the connection. Facebook demanded that both Kogan and Cambridge Analytica certify they had deleted the data. Both parties provided written certifications.

However, Facebook did not verify that the data had actually been deleted. It did not conduct an audit of Cambridge Analytica's systems, did not notify the affected users, and did not report the incident to regulators. It was not until the 2018 investigative reporting by the Times and the Observer — more than two years later — that the full scope of the data misuse became public.

This failure of vendor oversight is a textbook example of inadequate third-party risk management. Facebook operated a platform that granted third-party developers access to vast amounts of user data but lacked the controls, monitoring, and enforcement mechanisms to ensure that data was handled responsibly.

TPRM Lesson Learned: The Facebook-Cambridge Analytica scandal demonstrates that granting third parties access to data is only the first step — organizations must continuously monitor and verify how that data is used, stored, and shared. A vendor's written certification of compliance is not a substitute for auditing and verification. Effective third-party risk management requires enforceable data-sharing agreements, automated monitoring of data flows, regular audits of third-party data handling practices, and clear incident response procedures when violations are discovered. Simply asking a vendor "did you delete the data?" is not TPRM.
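The distinction between certification and verification can be sketched in a few lines of code. This is a minimal illustration of the principle, not any real TPRM product's API; the class and function names here are hypothetical.

```python
from dataclasses import dataclass

# Illustrative sketch: a vendor's written self-certification should never
# close out a data-deletion request on its own. Names are hypothetical.

@dataclass
class VendorAttestation:
    vendor: str
    certified_deleted: bool   # vendor's written self-certification
    audit_completed: bool     # independent audit of the vendor's systems

def is_deletion_verified(a: VendorAttestation) -> bool:
    """Deletion counts as verified only when the certification
    is backed by an independent audit."""
    return a.certified_deleted and a.audit_completed

# The Cambridge Analytica pattern: certification with no follow-up audit.
ca = VendorAttestation("Cambridge Analytica",
                       certified_deleted=True,
                       audit_completed=False)
print(is_deletion_verified(ca))  # prints "False"
```

Had Facebook applied even this trivial rule in December 2015, the written certifications it collected would have been flagged as unverified, triggering the audit it never performed.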

Regulatory and Legal Consequences

The fallout from the Cambridge Analytica scandal was unprecedented in the technology industry:

- In July 2019, the U.S. Federal Trade Commission imposed a $5 billion penalty on Facebook, along with sweeping new privacy oversight requirements, the largest privacy penalty in FTC history.
- The same month, the U.S. Securities and Exchange Commission fined Facebook $100 million for misleading investors about the risk of misuse of user data.
- In October 2018, the UK Information Commissioner's Office fined Facebook £500,000, the maximum penalty available under the pre-GDPR Data Protection Act 1998.
- Mark Zuckerberg was called to testify before the U.S. Congress in April 2018, and the UK Parliament's Digital, Culture, Media and Sport Committee examined the scandal in its February 2019 report on disinformation.
- Cambridge Analytica itself filed for insolvency and ceased operations in May 2018.

Impact on Global Privacy Regulation

The Cambridge Analytica scandal became a catalyst for privacy regulation worldwide. While the EU's General Data Protection Regulation (GDPR) had already been adopted in 2016, the scandal provided powerful public momentum for its enforcement when it took effect in May 2018. The incident also accelerated the development and passage of the California Consumer Privacy Act (CCPA) in 2018, Brazil's LGPD, and privacy legislation in numerous other jurisdictions.

For the third-party risk management community, the scandal was transformative. It demonstrated that data shared with third parties can be re-shared, repurposed, and weaponized in ways the original platform never intended or anticipated. It made clear that vendor risk does not end at the point of data transfer — it persists for as long as the data exists.

Lessons for Third-Party Risk Management

The Facebook-Cambridge Analytica case provides critical guidance for TPRM programs governing data-sharing relationships: contractually restrict how shared data may be used and re-shared, verify vendor claims rather than accepting written certifications at face value, audit third-party data handling on a regular cadence, and treat data-deletion requests as unresolved until independently confirmed.

The Facebook-Cambridge Analytica scandal remains a defining moment in the history of data privacy and third-party risk management. It proved that even the world's largest technology platform can lose control of user data when third-party governance is treated as an afterthought rather than a core security function.

Protect Your Organization from Third-Party Risk

Fair TPRM is a free, open-source platform for vendor risk management, GRC compliance, and FAIR risk quantification.


Sources & References

  1. FTC Imposes $5 Billion Penalty and Sweeping New Privacy Restrictions on Facebook - Federal Trade Commission, July 2019
  2. Disinformation and 'Fake News': Final Report - UK Parliament Digital, Culture, Media and Sport Committee, February 2019
  3. How Trump Consultants Exploited the Facebook Data of Millions - The New York Times, March 2018
  4. Facebook to Pay $100 Million for Misleading Investors About Data Misuse - U.S. Securities and Exchange Commission, July 2019
  5. Facebook Ireland Ltd Monetary Penalty Notice - UK Information Commissioner's Office, October 2018