Newswise — During today’s meeting with the Senate’s consumer protection subcommittee, Facebook’s global head of safety faced questions from senators about concerns that the photo-sharing app Instagram has caused mental and emotional harm.
Brooke Erin Duffy, professor of communication, studies the intersection of media, culture and technology. Duffy says Big Tech’s self-regulation mechanisms continue to fail users.
Duffy says:
“Facebook’s Antigone Davis deflected questions about the impact of the Facebook-owned platform on teens’ mental health and young women’s body image – despite the WSJ’s recent release of internal data correlating platform use with body dissatisfaction and suicidal ideation. Lawmakers expressed particular concern with the features that ‘quantify popularity’ – such as likes, favorites, and comments – and with the lack of standards for influencer marketing.
“Senator Markey’s references to traditional media’s regulation of material for children – including limitations on advertising that have long guided the television industry – attest to a growing recognition that external regulation of the platforms is critical. While Big Tech has long touted its mechanisms of self-regulation, these have failed – and continue to fail – its users.
“One key takeaway from Davis’s testimony was her refusal to make a long-term commitment to abandon plans to further develop Instagram for Kids. After all, the initiative is part of a long-term strategy by Big Tech to court younger – and less witting – users from whom the platforms can inevitably collect data.
“Lawmakers also drew attention to a lack of transparency and an absence of accountability in the face of ‘black-boxed’ algorithms and inconsistent systems of content moderation. While the Senate hearings focused on the implications for children and teens, those who rely on these platforms for income report marked disparities along gender, race and class lines.”