FAQ

About the film

Filming for The Social Dilemma took place between 2018 and 2019. After our premiere at the 2020 Sundance Film Festival, we made revisions in March and April 2020, which allowed us to incorporate additional material as COVID-19 started to hit.

We were drawn to tell the stories of our changing glaciers and changing coral reefs because they were powerful signs of a huge global issue facing humanity: climate change. When we started talking with Tristan Harris and the Center for Humane Technology, we saw a direct parallel between the threat posed by the fossil fuel industry and the threat posed by our technology platforms. Tristan calls this “the climate change of culture,” an invisible force that is shaping how the world gets its information and understands truth. Our hope has always been to work on big issues, and we now see “the social dilemma” as a problem beneath all our other problems.

While interviewing tech insiders over the past two years, we kept learning about what actually drives the algorithms on the other side of our screens. Many of these conversations revealed a highly technical, nearly invisible force. As with Chasing Ice and Chasing Coral, our hope was to reveal the invisible, to bring the hidden story to the surface. We wanted to bring the algorithms to life, and to give viewers a new way to see and understand these tech platforms. Additionally, by following the family in the film, we are able to see the different ways these platforms cause real-world harm, from Isla’s mental health struggles to Ben’s political polarization to the family’s inability to connect with one another.

All of the big tech platforms (Facebook, Google, Twitter, YouTube, etc.) have a digital model of you. All of the information they collect is gathered into that model, and their programs are constantly testing it to see what works on you. In the film, we bring this to life through a virtual avatar representing the vast amount of data being collected on each and every one of us. As they collect more data, the model becomes more and more accurate. For example, a 2016 ProPublica report found some 29,000 different criteria for each individual Facebook user, and those models are only getting better.
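To make the idea concrete, here is a deliberately simplified sketch in Python (our own illustration, not any platform’s actual code) of how such a model might work: each observed behavior nudges a per-user profile, and the profile is then used to guess which post will hold that user’s attention longest. Real systems track thousands of signals with far more sophisticated models, but the loop is the same in spirit.

```python
# A toy illustration of a "digital model of you": behavioral signals are folded
# into a per-user profile, and the profile is used to guess which post will
# hold the user's attention longest. Real systems track thousands of signals.
from collections import defaultdict


class ToyUserModel:
    def __init__(self):
        # topic -> running engagement score (exponential moving average)
        self.interest = defaultdict(float)

    def observe(self, topic: str, seconds_spent: float) -> None:
        """Fold one observed behavior (e.g. time spent on a post) into the model."""
        self.interest[topic] = 0.9 * self.interest[topic] + 0.1 * seconds_spent

    def predict_engagement(self, topic: str) -> float:
        """Rough guess of how long the user would linger on a post about this topic."""
        return self.interest[topic]

    def pick_next_post(self, candidate_topics: list[str]) -> str:
        """Recommend whichever candidate the model expects to hold attention longest."""
        return max(candidate_topics, key=self.predict_engagement)


model = ToyUserModel()
for topic, seconds in [("sports", 5), ("politics", 40), ("politics", 55), ("cooking", 2)]:
    model.observe(topic, seconds)

print(model.pick_next_post(["sports", "politics", "cooking"]))  # -> politics
```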

We wanted to focus on the root causes of the problem as told by the people who contributed to its creation. That led us to former employees of the tech giants. However, we can’t rely on the people who created the problem to be the ones to solve it. Our impact campaign will pass the mic to the activists, organizations, and survivors of exploitative technology whose work and experiences will be instrumental in growing the humane tech movement.

There are countless positive things that have come from social media, and many more will continue to come from it. But our point in the film is less about any one issue or campaign than about the system as a whole.

For years, we have heard only the positive aspects, as broadcast by the platforms themselves. The promise to keep us all connected has given rise to many unintended consequences that we are now seeing on a global scale. By focusing the film on how these technologies prey on human weaknesses, we can shine a light on the devil’s bargain we’re forced to make when using these platforms and create the collective will to change them.

Many of the film subjects are working to advance our understanding of the problem or accelerating momentum around solutions. We plan to amplify their work as part of our impact campaign and encourage you to follow along at TheSocialDilemma.com. You can reference a list of the initiatives that they are involved in on The Film page.

Yes, absolutely. We are each on our own individual journey, but having worked on this issue now for nearly three years, the team has spent a lot of time reflecting on our own habits. Some of us have stopped using social media entirely, while others have embraced new norms around more mindful use. We encourage you to check out the resources on our site and subscribe to our newsletter for more information about how you can reboot your relationship with social media and extractive tech.

We’ve addressed this question in our Code of Ethics.

In The Social Dilemma there is a clip of Hong Kong anti-surveillance protesters taking down a “smart” lamppost in protest of facial recognition and surveillance technology. Many months later, this clip was repurposed online with a misleading caption claiming that 5G is connected to the spread of COVID-19, inaccurately presenting the clip as showing the takedown of a 5G tower. Because some have asked about the use of this shot, we want to clarify that it was intentionally chosen as one of many examples of misinformation, to reinforce Tristan Harris’ point during that moment in the film that “we are being bombarded with rumors.” We invite you to watch the full film to see this in context and to understand the many ways misinformation can spread and context can be skewed.

Some have asked about a news clip reporting that social media “accounts were deliberately and specifically attempting to sow political discord in Hong Kong” (sources included below). According to reports from Twitter, Facebook, and Google in August of 2019, fake social media accounts were created in the People’s Republic of China to deliberately undermine the legitimacy of the Hong Kong protests, a pro-democracy movement. While we acknowledge the positive role technology has played in social movements, including the ongoing Hong Kong protests, we included the Hong Kong protests in The Social Dilemma as an important example of how social media can also be weaponized, especially by those with power or resources, to propagate alternative agendas, thus undermining and threatening legitimate movements like the Hong Kong protests and others across the world.
Additional Sources:
  • Twitter Blog, Information operations directed at Hong Kong, Link
  • Facebook, Removing Coordinated Inauthentic Behavior From China, Link
  • Google, Maintaining the Integrity of Our Platforms, Link
  • Vox, How China used Facebook, Twitter, and YouTube to spread disinformation about the Hong Kong protests, Link
  • Tech Crunch, Twitter says accounts linked to China tried to ‘sow political discord’ in Hong Kong, Link

After spending three years making a film about unethical uses of personal data, our team takes privacy very seriously. Our use of cookies on the site is strictly limited to those that help us understand and improve how the site is being used for our impact campaign. This includes gathering basic website analytics (e.g., pages visited) and tracking associated with views of our trailer. We do not use any of this data for advertising or to track users offsite in any way, and we will never share or sell your data.

We believe in providing our users transparency and control over their data, which is why there is a clear notification regarding our cookie policy upon arriving on the site, and users have an opportunity to opt in or out. If a user does not accept our default cookie settings, there will be no cookies dropped on their system, and they will not be tracked. They also have the ability to click on “Cookie Settings” to choose which categories of cookies they would like to accept.

In the spirit of the film, we are constantly reflecting on our website’s use of cookies and other tools to respect our visitors’ data and privacy and will continue to make improvements.

For viewers streaming The Social Dilemma in the USA, the film is categorized as “Provocative, Investigative.” It is rated PG-13 “for some thematic elements, disturbing/violent images and suggestive material.”

For more information on Netflix maturity ratings in your country, please visit: How does Netflix decide maturity ratings?

While Netflix, Hulu, HBO, Amazon Prime and other services employ recommendation algorithms, there’s an important distinction between streaming platforms and the social media and search companies we critique in the film: on streaming platforms, users pay for the content, and therefore the business model is more aligned with our best interests.

Jaron Lanier and many of our other interview subjects have advocated that one solution to extractive technology is for users to pay for the service with actual money, rather than personal data. With their monthly subscription models, Netflix and other streaming services are one such example: they make the same amount of money whether users spend 10 hours a day or one hour a week on the platform. Furthermore, many streaming services hold their content to rigorous fact-checking standards and have it curated by real people.

Thanks for your interest in repping the film and spurring conversations IRL! While we do not currently have The Social Dilemma merchandise available for purchase, we’ll be sure to update our newsletter and website when we do.

About the dilemma

In the broadest sense, an algorithm is simply a set of instructions or calculations to be carried out, often to perform a mathematical function. Artificial intelligence (AI) is a very broad term that applies to the many different advanced uses of algorithms to mimic and/or replace the need for human intelligence. Unlike a simple fixed algorithm, AI uses a system of algorithms and can create or modify algorithms without human intervention by continually optimizing for better and better results, a process often referred to as machine learning.
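As a rough illustration of the distinction (a toy example written for this FAQ, not production code), the first function below is a fixed algorithm: a human wrote the rule and it never changes. The second is a bare-bones machine learning loop: it starts with a guess and repeatedly adjusts its own parameter to better fit the examples it is given, with no human rewriting the rule.

```python
# A toy contrast between a fixed algorithm and a learning one. The fixed rule
# never changes; the learning rule repeatedly nudges its own parameter to fit
# the examples it is shown, which is the essence of machine learning.

def fixed_celsius_to_fahrenheit(c: float) -> float:
    """A fixed algorithm: the same human-written calculation every time."""
    return c * 9 / 5 + 32


def learn_scale_factor(pairs: list[tuple[float, float]], steps: int = 1000) -> float:
    """A bare-bones learning loop: start from a guess and adjust it to reduce error."""
    w = 0.0  # the parameter the algorithm tunes for itself
    for _ in range(steps):
        for x, y in pairs:
            error = w * x - y
            w -= 0.01 * error * x  # nudge w in the direction that shrinks the error
    return w


print(fixed_celsius_to_fahrenheit(100.0))            # always 212.0
print(learn_scale_factor([(1, 2), (2, 4), (3, 6)]))  # converges to roughly 2.0
```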

As the surplus of data generated by our digital lives grows, corporations are increasingly building AI algorithms that draw upon this data to model our behavior, target us, and make complex business decisions. The promise of an algorithm’s objectivity has engendered our trust in these data-driven approaches. However, when used outside of a purely mathematical context, algorithms reflect logic programmed by a human, logic that frequently reflects individual bias or the interests of the company they represent.

The attention extraction economy refers to technology platforms that profit from the monetization of human attention and engagement. This includes, but is not limited to, Facebook (which owns Instagram), Twitter, TikTok and companies like Google (which owns YouTube) that profit from keeping users hooked on their platforms because more engagement means more advertising dollars.

Surveillance capitalism is a term popularized by film subject Shoshana Zuboff in her book The Age of Surveillance Capitalism. It refers to the mass surveillance of our online activity in ways we are often unaware of, and to the commodification of this data for commercial purposes. The unprecedented scale of the data collected by these online companies, and its use to predict and influence our purchases, behaviors, and thoughts, has made them some of the richest companies in the history of the world.

Major tech companies have been using targeted advertising since the early 2000s, but things changed dramatically with the advent of the smartphone. Now, instead of a shared family computer, the surveillance capabilities are tied directly to the individual user, allowing far more data to be collected. Platforms are not just collecting the basic demographic information we willingly provide upon signup; they are tracking every click, every like, every photo, even whether we end our posts with an exclamation point or a question mark, and then combining this knowledge with powerful AI. The bigger problem is how the data feeding the underlying algorithms is being used to model and predict human behavior, giving the highest bidder the ability to influence us at a scale we have never seen before, from determining elections to sparking revolutions.
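The sketch below, invented purely for illustration and not drawn from any company’s actual pipeline, shows how a stream of fine-grained behavioral events, down to the punctuation of a post, might be logged and rolled up into a per-person profile of inferred interests, the raw material that targeting and prediction systems are built on.

```python
# A schematic sketch of fine-grained behavioral tracking: individual events
# (clicks, likes, pauses, even punctuation) are logged and rolled up into a
# per-person profile of inferred interests.
import json
import time
from collections import Counter

event_log: list[dict] = []


def track(user_id: str, event: str, **details) -> None:
    """Record one behavioral signal with a timestamp and arbitrary details."""
    event_log.append({"user": user_id, "event": event, "time": time.time(), **details})


def build_profile(user_id: str) -> dict:
    """Roll the raw event stream up into inferred interests for one person."""
    topics = Counter(
        e["post_topic"] for e in event_log if e["user"] == user_id and "post_topic" in e
    )
    return {"user": user_id, "inferred_interests": dict(topics)}


# A few seconds of ordinary phone use generates a stream of signals like this.
track("user-42", "like", post_topic="politics")
track("user-42", "pause_on_photo", seconds=7.5, post_topic="fitness")
track("user-42", "post_created", ends_with="?")  # even punctuation is a signal

print(json.dumps(build_profile("user-42"), indent=2))
```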


If you believe that technology is contributing to mental health issues, we encourage you to consult with a psychologist. If you are in need of urgent support or are experiencing suicidal thoughts, you can anonymously contact the Suicide Prevention Lifeline 24 hours a day, 7 days a week, at 1-800-273-8255 or use their online chat feature.

We’re currently developing resources to help people realign their personal relationships with technology. Subscribe to our newsletter to stay tuned. We also recommend connecting with members of your community and/or trusted family members or friends to talk about what you’re going through using our discussion guide (particularly pages 7, 8, and 13). Finally, please refer to this document for mental health and counseling resources during COVID-19.

Film subject Jaron Lanier says near the end of The Social Dilemma:

“If you are privileged enough to be able to get away with not being on social media, you have a positive responsibility to get out of it, because you being there is only perpetuating a system that’s abusing other people. So if you’re privileged enough to get by, do it. Get out of the system.”

For a longer answer, see Lanier’s book, Ten Arguments for Deleting Your Social Media Accounts Right Now, which we highly recommend.

If you can’t delete your accounts due to work or other reasons, or aren’t ready to let go, there are other ways to take action. It’s also important to note that deleting your own accounts won’t solve the systemic problem (though it will likely improve your life!). Film subject Tristan Harris has emphasized that even people who aren’t on social media are affected by what the business model is doing to society. While individuals reducing or eliminating their usage and support of companies is a first step, we need systemic changes to how technology is designed and regulated to fully address and reverse our shared dilemma.

If you rely on social media to promote your product, connect with other professionals, and/or as a platform for activism and need to keep your or your company’s accounts, you might consider: 

  • Focusing your social media strategy on organic forms of engagement rather than paying for ads, which sustain this extractive business model. You could also experiment with alternative platforms like Patreon, Yelp, or Kickstarter. 
  • Writing up and linking to an easy-to-read social media “Code of Ethics” for your business, in which you outline your privacy policy, social media strategy, and commitments to protecting your customers’ data and wellbeing. (See The Social Dilemma’s Code of Ethics as an example.)
  • Getting more engaged in personal forms of action to help dismantle the system. Your voice as a business owner on these platforms has power, and you can use it to call for change from tech companies and/or legislators. Visit our Take Action page for how to get involved.

Our team will continue to compile more suggestions and resources, and we encourage you to subscribe to our newsletter for updates. 

As you consider which platforms to use and support, you might ask yourself the following questions: 

  • How is the company collecting and using your data? Is it possible to use it against you – for example, to hook or manipulate you? 
  • How is the company making money? Companies that charge subscription fees and/or sell hardware in addition to software tend to treat your data and wellbeing more ethically because they don’t exclusively rely on advertising revenue. Is the service free? If so, remember that you’re the product. 
  • Do people behave civilly on the platform? 
  • Who and what does the platform connect you with? Services that foster authentic connection and relationships, like video calls and texting, are a safer bet than platforms that use an algorithm to recommend new people, products, and posts.  

We believe deeply in free speech: that people are entitled to hold and express their own opinions. But, as film subjects Aza Raskin and Renée DiResta have noted, “there’s a difference between ‘freedom of speech’ and ‘freedom of reach.’”

Tech workers will be a critical force for changing our current reality and creating a more humane future for all. Because our small team is composed of filmmakers, not technology workers, we’re not in a position to provide feedback on or support with new technology ideas. However, as a starting point we recommend: 

Hosting an event

You can access the film on Netflix beginning September 9th at 12:00am PST.

There is no cost to screen the film beyond a Netflix subscription. However, for group screenings, a one-time grant of permission is required. You can obtain a grant of permission by registering your event and agreeing to Netflix’s Terms & Conditions.

Groups are free to promote their event in conjunction with our Social Dilemma Virtual Tour as a way to engage others in conversation about the issues featured in the film. We ask that these events be virtual or kept to small groups with strict social distancing protocols in place. To curb the spread of COVID-19, we are not supporting large-scale in-person events at this time.

For those who register with us, we are offering a series of planning, discussion and technical guides as well as exclusive bonus clips that dive deeper into the different dilemmas examined in the film. We also plan to promote some events on our Virtual Tour page for those interested in making their event details public.

A free 40-minute version of the film and 3-4 minute supplemental clips are available as of October 15, 2021. You can access them by registering your event.

If you have access to an iPad you can download the film to your iPad from Netflix. Once downloaded, you can connect your iPad to either a TV or projector using an HDMI cable, which will allow you to share the film without internet connection or worry of streaming interruptions.

Register your event to receive our technical guide, which includes other tips and tricks.

While the film is available in 190+ countries on Netflix and subtitled in many languages, we currently do not have resources in any language but English. Be sure to subscribe to our newsletter, and we’ll let you know if that changes.

Instructions for changing your audio or subtitles to an alternate language on Netflix can be found here.

Due to COVID-19 we are not able to participate in any in-person festivals until further notice. We are participating in a limited number of virtual festivals, but, to protect the intellectual property of the film, we are focusing on screening clips and/or participating in festival panels.