Friday, March 27


An American civil court jury this week found that Meta and YouTube designed their social media services to be addictive, and awarded $6 million in damages to a young woman who said features such as infinite scroll, beauty filters and algorithmic recommendations drove her into compulsive use, depression and body dysmorphia. A day earlier, another US jury ordered Meta to pay $375 million for misleading users about child safety. The sums are, by Big Tech standards, modest. But for the first time, a jury has held social media companies liable not for what users post but for how the product is built — the theory of harm that broke Big Tobacco in the 1990s, when it was found to have knowingly manufactured addictive products while concealing the health risks.


Lawyers for the plaintiff in the latest case, a 20-year-old identified as KGM, reframed the claim from speech — long shielded by US federal law — to product design. Internal Meta documents presented at trial showed executives discussed the harm their platforms caused children while actively courting young users: One memo urged the company to “bring them in as tweens”; data showed 11-year-olds were four times as likely to return to Instagram as to rival apps, despite a minimum age requirement of 13. KGM testified she began using Instagram at nine and developed body dysmorphia she traces to the platform’s beauty filters. Instagram’s head, Adam Mosseri, rejected the word “addiction” at trial, preferring “problematic use”. The jury found both companies had acted with malice.

For years, society has been a petri dish for technology companies engineering every feature — from autoplay to algorithmic recommendations — to maximise engagement regardless of cost. The consequences, from an anxiety epidemic among adolescents to the erosion of shared facts in democratic discourse, are well documented. The harm to young users is where the damage is most visible and the corporate defence most vulnerable. India, with one of the world’s youngest populations and over 400 million social media users, is deep inside that experiment.

The verdict is a beginning, and appeals will follow. But this week’s ruling cuts an important legal path that must be leveraged to drive more structural regulatory interventions. The tobacco precedent is instructive beyond the courtroom: Litigation in the 1990s led not just to damages but to enforceable bans on advertising to minors and restrictions that made targeting children commercially unviable. A jury has now found that social media companies did much the same thing by different means. Governments, India’s included, should be asking why the regulatory consequences remain so far behind.
