{"id":2568,"date":"2019-07-26T01:00:00","date_gmt":"2019-07-25T18:00:00","guid":{"rendered":"https:\/\/www.amnesty.or.th\/en\/the-great-hack-cambridge-analytica-is-just-the-tip-of-the-iceberg\/"},"modified":"2024-11-11T18:43:13","modified_gmt":"2024-11-11T11:43:13","slug":"the-great-hack-cambridge-analytica-is-just-the-tip-of-the-iceberg","status":"publish","type":"post","link":"https:\/\/www.amnesty.or.th\/en\/blog\/2019\/07\/the-great-hack-cambridge-analytica-is-just-the-tip-of-the-iceberg\/","title":{"rendered":"\u2018The Great Hack\u2019: Cambridge Analytica is just the tip of the iceberg"},"content":{"rendered":"\n<p>It was the scandal which finally exposed the dark side of the big data economy underpinning the internet. The inside story of how one company, Cambridge Analytica, misused intimate personal Facebook data to micro-target and manipulate swing voters in the US election, is compellingly told in \u201c<a href=\"https:\/\/www.netflix.com\/gb\/title\/80117542\">The Great Hack<\/a>\u201d, a new documentary out today.<\/p>\n\n\n<blockquote class=\"blockquote is-lined\"><p><strong style=\"font-weight: 400;font-family: amnestytradegothicbold, dindanmaibold, Helvetica, Arial, sans-serif\"><span style=\"font-size: large\">One of the most urgent and uncomfortable questions raised in The Great Hack is: to what extent are we susceptible to such behavioural manipulation?&nbsp;<\/span><\/strong><\/p>\n<\/blockquote>\n\n\n<p>But as the former CEO of the now-defunct Cambridge Analytica tells the film-makers, this is \u201cnot just about one company\u201d. The film goes further to open our eyes to the way our lives are constantly monitored &#8211; and controlled &#8211; through digital technology. 
And it goes to the heart of how the entire business model of some Big Tech companies may deeply threaten our human rights.<\/p>\n\n\n\n<figure class=\"wp-block-embed is-type-rich is-provider-embed-handler wp-block-embed-embed-handler\"><div class=\"wp-block-embed__wrapper\">\nhttps:\/\/www.youtube.com\/watch?v=iX8GxLP1FHo\n<\/div><\/figure>\n\n\n\n<p>In the online and digital world, everything you do leaves a trace of \u201cdata exhaust\u201d \u2013 a record of everything, from what time you put petrol in your car, to what websites you visited. When combined, even seemingly innocuous data points can reveal a LOT about a person.<\/p>\n\n\n\n<p>Cambridge Analytica bragged that it had up to&nbsp;<a href=\"https:\/\/www.bbc.co.uk\/news\/technology-46822439\">5,000 data points<\/a>&nbsp;on every US voter. By applying \u201cpsychographic\u201d analytics to its dataset, it claimed to be able to determine people\u2019s personality type and then individually micro-target messages to influence their behaviour. The most important source of the data was Facebook. Via a third-party app, Cambridge Analytica improperly obtained data from&nbsp;<a href=\"https:\/\/www.wired.com\/story\/facebook-exposed-87-million-users-to-cambridge-analytica\/\">up to 87 million<\/a>&nbsp;Facebook profiles \u2013 including status updates, likes and even private messages.<\/p>\n\n\n\n<p>But the incident was not an aberration: it was an inevitable consequence of a system founded on harvesting and monetising our information &#8211; the business model that academic Shoshana Zuboff dubs \u201c<a href=\"https:\/\/www.theguardian.com\/technology\/2019\/jan\/20\/shoshana-zuboff-age-of-surveillance-capitalism-google-facebook\">surveillance capitalism<\/a>\u201d. 
The model\u2019s fundamental characteristics are: aggregating vast amounts of data on people, using it to infer incredibly detailed profiles on their lives and behaviour, and monetising it by selling these predictions to others such as advertisers. Cambridge Analytica simply deployed the same basic model to target voters rather than consumers.<\/p>\n\n\n\n<p>This model has become core to the data economy, and underpins a complex ecosystem of tech companies,&nbsp;<a href=\"https:\/\/www.amnesty.org\/en\/latest\/research\/2017\/02\/muslim-registries-big-data-and-human-rights\/\">data brokers<\/a>, advertisers and beyond. But it is the model\u2019s pioneers Google and Facebook that have unparalleled access to tracking and monetising our lives, by controlling the primary gateways \u2013 outside China \u2013 to the online world (between them Google Search, Chrome, Android, YouTube, Instagram and WhatsApp).<\/p>\n\n\n<blockquote class=\"blockquote is-lined\"><p><span style=\"font-size: large\">Facebook and Google have amassed data vaults with an unprecedented volume of information on human beings. This goes far beyond the data that you choose to share on their platforms to include the vast amounts of data tracked as you engage with the digital world.&nbsp;<\/span><\/p>\n<\/blockquote>\n\n\n<p>Facebook and Google of course have long&nbsp;<a href=\"https:\/\/globalnetworkinitiative.org\/gni-principles\/\">affirmed their commitment<\/a>&nbsp;to respecting human rights. But increasingly, we are being forced to ask whether the internet\u2019s surveillance model itself inherently conflicts with our human rights.<\/p>\n\n\n\n<p>Facebook and Google have amassed data vaults with an unprecedented volume of information on human beings. This goes far beyond the data that you choose to share on their platforms to include the vast amounts of data tracked as you engage with the digital world. 
Mass corporate surveillance on such a scale threatens the very essence of the right to privacy. Indeed, in 2010, Facebook CEO Mark Zuckerberg famously admitted that social networking had already changed privacy as a \u201csocial norm\u201d.<\/p>\n\n\n\n<p>But harvesting the data is only the first part of the story. The next step is using sophisticated analytics powered by machine learning to profile people \u2013 and thereby influence their behaviour. In the furore over Cambridge Analytica, Facebook\u2019s own profiling practices largely escaped scrutiny. The company has explored&nbsp;<a href=\"http:\/\/www.bbc.co.uk\/news\/technology-43869911\">personality<\/a>&nbsp;profiling, how to&nbsp;<a href=\"https:\/\/www.theguardian.com\/technology\/2014\/jun\/29\/facebook-users-emotions-news-feeds\">manipulate emotions<\/a>, and how to target people based on&nbsp;<a href=\"https:\/\/www.wired.com\/2017\/05\/welcome-next-phase-facebook-backlash\/\">psychological vulnerabilities<\/a>&nbsp;such as when they felt \u201cworthless\u201d or \u201cinsecure\u201d. Google developed a tool to target ads so precisely that they can sway people\u2019s beliefs and change behaviour through \u201c<a href=\"https:\/\/www.nytimes.com\/2019\/07\/07\/opinion\/google-ads.html\">social engineering<\/a>\u201d \u2013 while initially developed to counter Islamic extremism, the tool is publicly available for anyone to (mis)use.<\/p>\n\n\n\n<p>One of the most urgent and uncomfortable questions raised in The Great Hack is: to what extent are we susceptible to such behavioural manipulation? Ultimately, if these capabilities are as powerful as the companies and their customers claim, they pose a real threat to our ability to make our own autonomous decisions and even to our right to opinion, undermining the fundamental value of dignity that underpins all human rights. 
Advertising and propaganda aren\u2019t new, but there is no precedent for targeting individuals in such intimate depth, and at the scale of whole populations.<\/p>\n\n\n<blockquote class=\"blockquote is-lined\"><p><span style=\"font-size: large\">The push to grab users\u2019 attention and to keep them on platforms can also encourage the current toxic trend towards the politics of demonization.&nbsp;<\/span><\/p>\n<\/blockquote>\n\n\n<p>The model may also be helping to fuel discrimination. Companies \u2013 and governments \u2013 could&nbsp;<a href=\"https:\/\/www.amnesty.org\/en\/latest\/research\/2017\/02\/muslim-registries-big-data-and-human-rights\/\">easily abuse data analytics<\/a>&nbsp;to target people based on their race, ethnicity, religion, gender, or other protected characteristics. The push to grab users\u2019 attention and to keep them on platforms can also encourage the current toxic trend towards the&nbsp;<a href=\"https:\/\/www.amnesty.org\/en\/latest\/news\/2017\/02\/amnesty-international-annual-report-201617\/\">politics of demonization<\/a>. People are more likely to click on sensationalist or incendiary material, meaning platforms&nbsp;<a href=\"https:\/\/www.nytimes.com\/2018\/03\/10\/opinion\/sunday\/youtube-politics-radical.html\">systematically privilege conspiracy theories<\/a>, misogyny, and racism.<\/p>\n\n\n\n<p>What is to be done? The data-driven business model presents a systemic and structural issue that will not be easy to address and requires a mix of political and regulatory solutions. 
Stronger data protection is certainly part of the answer: properly enforcing Europe\u2019s General Data Protection Regulation, which has international reach, and using it as a model in other countries, would mitigate the extent of data-mining and profiling.<\/p>\n\n\n\n<p>More radical calls to break up the Big Tech companies have now become&nbsp;<a href=\"https:\/\/medium.com\/@teamwarren\/heres-how-we-can-break-up-big-tech-9ad9e0da324c\">commonplace<\/a>, and the industry is already being examined by competition authorities in various jurisdictions. A recent decision by Germany\u2019s Federal Cartel Office to limit data sharing and aggregation between Facebook and WhatsApp is an example of a precise measure to counter the concentration of power in the hands of the big players.<\/p>\n\n\n\n<p>Whatever regulatory tools are deployed, it is vital that they are grounded in an analysis of the human rights risks posed by the model. Human rights provide the only international, legally binding framework that can capture the multi-faceted ways in which the business model is impacting our lives and what it means to be human \u2013 and hold the companies to account.<\/p>\n\n\n\n<p>What is clear is that current efforts are not tackling the root causes of the problem. Two weeks ago, US regulators approved&nbsp;<a href=\"https:\/\/www.wsj.com\/articles\/ftc-approves-roughly-5-billion-facebook-settlement-11562960538?mod=hp_lead_pos1\">a record $5bn settlement against Facebook<\/a>&nbsp;over Cambridge Analytica. But after news of the fine broke, Facebook\u2019s share price went UP.<\/p>\n\n\n\n<p>The lesson: the company and its investors would be happy for this to remain an isolated incident. 
It will pay some relatively nominal fines &#8211; $5bn is a drop in the ocean for a company that makes&nbsp;<a href=\"https:\/\/www.theverge.com\/2019\/7\/12\/20692524\/facebook-five-billion-ftc-fine-embarrassing-joke\">$22bn in pure profit a year<\/a>&nbsp;\u2013 and make a few improvements to its privacy protections, but then go back to business as usual.<\/p>\n\n\n\n<p>We cannot let this happen. It is high time to confront the human rights impacts of \u201csurveillance capitalism\u201d itself.<\/p>\n\n\n\n<p><a href=\"https:\/\/www.amnesty.or.th\/about-us\/\" target=\"_blank\" rel=\"noreferrer noopener\">Learn more about Amnesty<\/a><br><a href=\"https:\/\/www.amnesty.or.th\/donate\/\" target=\"_blank\" rel=\"noreferrer noopener\">Donate to support Amnesty<\/a><\/p>\n","protected":false},"excerpt":{"rendered":"<p>It was the scandal which finally exposed the dark side of the big data economy underpinning the internet. The inside story of how one company, Cambridge Analytica, misused intimate personal Facebook data to micro-target and manipulate swing voters in the US election, is compellingly told in \u201cThe Great Hack\u201d, a new documentary out today. 
But [&hellip;]<\/p>\n","protected":false},"featured_media":2569,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"_yoast_wpseo_focuskw":"","_yoast_wpseo_title":"","_yoast_wpseo_metadesc":"","_yoast_wpseo_meta-robots-noindex":"","_yoast_wpseo_meta-robots-nofollow":"","_yoast_wpseo_canonical":"","_yoast_wpseo_opengraph-title":"","_yoast_wpseo_opengraph-description":"","_yoast_wpseo_opengraph-image":"","_yoast_wpseo_opengraph-image-id":0,"_yoast_wpseo_twitter-title":"","_yoast_wpseo_twitter-description":"","_yoast_wpseo_twitter-image":"","_yoast_wpseo_twitter-image-id":0,"_hero_title":"","_hero_content":"","_hero_cta_text":"","_hero_cta_link":"","_hero_alignment":"","_hero_background":"","_hero_size":"","_hero_show":"","_hero_type":"","_hero_embed":"","_hero_video_id":0,"_hero_hide_image_caption":true,"_hero_hide_image_copyright":false,"_nav_style":"","_disable_share_icons":false,"_disable_sidebar":false,"_display_author_info":false,"_hide_featured_image":false,"_hide_featured_image_caption":true,"_maximize_post_content":false,"_reduce_content_width":false,"_sidebar_id":0,"_stretch_thumbnail":false,"byline_context":"","byline_entity":"","byline_is_author":false,"disable_related_content":false,"download_id":0,"download_text":"","show_published_date":true,"show_updated_date":true,"term_slider":"","amnesty_index_number":"","recipients":"","recipients_refresh":"","recipients_refreshed":"","amnesty_umbraco_data":"","document_ref":"","amnesty_updated":"","footnotes":""},"category":[1584],"location":[1588,1587,1589],"resourceType":[],"topic":[],"class_list":["post-2568","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-blog","location-southeast-asia","location-thailand","location-world"],"datePosted":"July 26, 
2019","mlpRelationships":{"1":2265,"2":2568},"_links":{"self":[{"href":"https:\/\/www.amnesty.or.th\/en\/wp-json\/wp\/v2\/posts\/2568","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.amnesty.or.th\/en\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.amnesty.or.th\/en\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.amnesty.or.th\/en\/wp-json\/wp\/v2\/users\/14"}],"replies":[{"embeddable":true,"href":"https:\/\/www.amnesty.or.th\/en\/wp-json\/wp\/v2\/comments?post=2568"}],"version-history":[{"count":0,"href":"https:\/\/www.amnesty.or.th\/en\/wp-json\/wp\/v2\/posts\/2568\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.amnesty.or.th\/en\/wp-json\/wp\/v2\/media\/2569"}],"wp:attachment":[{"href":"https:\/\/www.amnesty.or.th\/en\/wp-json\/wp\/v2\/media?parent=2568"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.amnesty.or.th\/en\/wp-json\/wp\/v2\/category?post=2568"},{"taxonomy":"location","embeddable":true,"href":"https:\/\/www.amnesty.or.th\/en\/wp-json\/wp\/v2\/location?post=2568"},{"taxonomy":"resource-type","embeddable":true,"href":"https:\/\/www.amnesty.or.th\/en\/wp-json\/wp\/v2\/resourceType?post=2568"},{"taxonomy":"topic","embeddable":true,"href":"https:\/\/www.amnesty.or.th\/en\/wp-json\/wp\/v2\/topic?post=2568"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}