Currently traveling the country with her “Special” tour, Lizzo still finds time to fight for mental health.

On a panel discussion hosted by the Dove Self-Esteem Project, Lizzo joined teenagers, Dove representatives and experts to advocate for better education about social media and its harmful effects.

“We believe no young person should be held back from reaching their full potential. However, low body confidence and anxieties over appearance keep young people from being their best selves, affecting their health, friendships, and even performance at school,” says the Dove campaign on its website.

Teenagers shared stories of how quickly they became addicted to TikTok, Instagram and other social media apps, often looking for validation and acceptance from their peers. They described the FOMO (fear of missing out) they feel when they don’t check the apps regularly, the anxiety it causes and how it eventually sends them down a rabbit hole of content.

“I started seeing all these influencers who were getting a lot of recognition, and I didn’t look like them,” said Chioma, a teenager at the event. “They were people who were mainly skinny and white, and it definitely lowered my self-esteem, and it changed how I felt about myself.”

The National Organization for Women reports that 53% of American girls are unhappy with their bodies by age 13, a figure that grows to 78% by the time they reach 17. Dove is partnering with Common Sense Media, Parents Together Action, and Lizzo to advance the 2023 Kids Online Safety Act (KOSA).

The bill is the first written specifically to set standards and safeguards that protect kids online and limit their exposure to toxic content.

According to Parents.com, the bill would:

  • Disable addictive product features and allow users to opt out of algorithmic recommendations.
  • Create a “duty of care” for platforms to “act in the best interests” of minors using their sites when it comes to content promoting self-harm, suicide, eating disorders, substance abuse and sexual exploitation. Platforms would also need to “take reasonable measures in its design and operation of products and services to prevent and mitigate” harm.
  • Require social media platforms to perform an annual independent audit assessing risks to minors.
  • Provide experts with access to critical data to foster research on harms to the safety and well-being of minors.

“There are a number of key civil, privacy, and human rights organizations that are putting forward policy recommendations that can hold the tech industry to rigorous consumer safety and anti-discriminatory standards, such as the Algorithmic Accountability Act and other important bills,” Safiya Noble, Ph.D., the author of “Algorithms of Oppression: How Search Engines Reinforce Racism” and the director of UCLA’s Center on Race & Digital Justice, told the panel.