Critical Hearings, Whistleblower Testimony and Research
Why is it so hard to hold social media platforms liable?
The brief answer: In 1996, Congress wanted online platforms to grow as venues for commerce and communication while also proactively curbing harmful behavior online. If platforms were treated as publishers, they could be held liable for the content posted on them. So Congress enacted the Communications Decency Act to protect online companies from liability for what is posted on their platforms. In short, Section 230 of that act has come to provide these companies near-complete immunity for whatever is posted. Click HERE to understand what the original intent of Section 230 was and why it has morphed into a blanket immunity shield for these companies.
While this may have made sense in the 1990s, no one could have predicted the explosion of social media and its grip on daily life. Nor could the general public have foreseen that these platforms would come to operate primarily through algorithms that generate and push content to end users without regard for the appropriateness of that content or the potential dangers that come with it.
Still, courts continue to interpret Section 230 as a complete immunity shield for online platforms. And that emboldens platforms not to care, because they face no liability for the potential consequences.
Click HERE to understand why Section 230 needs to be amended.
Where does that leave us now?
Since 2021, state and federal legislatures appear to have finally awakened to this issue and have started both to investigate (via hearings) and to propose laws, though the hurdle of Section 230 appears almost insurmountable.
Click on the links below to learn more about what has been (and continues to be) uncovered with respect to social media’s harms in general, and particularly the harms it causes to our kids, as well as federal and state attempts to hold social media platforms accountable.
2021:
- Protecting Kids Online led by Senate Subcommittee on Consumer Protection, Product Safety and Data Security
- The Facebook Files a Wall Street Journal Investigation of internal research showing how Facebook has amplified hate speech and misinformation and how Instagram has harmed teen mental health. (Subscription required to read the actual WSJ investigation. A summary can be found HERE)
- 60 Minutes Segment: “Whistleblower” — Facebook is misleading the public on its progress against hate speech, violence, and misinformation.
- Protecting Kids Online: Testimony from a Facebook Whistleblower — Frances Haugen testifies at the third Senate subcommittee hearing, candidly calling this Facebook’s “Big Tobacco moment.”
- Protecting Kids Online: Snapchat, TikTok and YouTube — Executives from Snapchat, TikTok and YouTube discuss safety features for young people on their apps. But when grilled by Senators about those safety features, as well as unsolicited disturbing content showing up in social media feeds, the social media representatives side-stepped questions, and some outright denied that their platforms maintained and promoted harmful content to our youth.
- 52 State Attorneys General form a bipartisan coalition to take aim at social media giants and hold them accountable.
- 10 State Attorneys General launch an investigation into Meta-owned Instagram’s impact on kids and the potential harms caused by the platform.
(02-2023: STILL UNDER CONSTRUCTION - STAY TUNED FOR ADDITIONAL MATERIAL FROM 2022 and 2023!)