
Social media advertising changes amid pushback over digital spaces and teen mental health.


Lawmakers, big tech companies and parents are thinking hard about the impact of social media and digital spaces on teens and kids. In a purely digital environment built around content sharing, young people are subjected to the same content moderation as adults, including content focused on politics, health and wellness, corporate advertising and news. A 2018 Pew Research study found that "...95% of teens have access to a smartphone, and 45% say they are online 'almost constantly'" (Pew 2018), and a later study noted that "About half of Hispanic (52%) and Black Americans (49%)" use platforms like Instagram, "compared with smaller shares of White Americans (35%) who say the same" (Pew 2021).


Image: Teens, Social Media and Technology 2018, Pew Research Center. Image by Drew Angerer/Getty Images News via Getty Images.


Foremost among the issues related to teens and kids using these social platforms is the question of advertising: content strategically generated to solicit profit and action. Should teens be marketed to the same way companies market to adults? According to some lawmakers, no.


In fact, advertising to kids online through social media and apps shows a clear strategy by organizations to "...use manipulative approaches like showing a sparkling present, which takes the child to an ad when clicked" (CNN 2021), or to incentivize viewing by enabling kids to "earn rewards (such as virtual candy) for watching ads," with "ad viewing" on apps and games often "...tak[ing] up more time than playing the game itself" (CNN 2021). There are currently no laws restricting advertising frequency or duration, though some lawmakers would like to know more about ways of protecting kids in digital spaces from malicious and harmful advertising.


Meanwhile, plans were in the works to launch an Instagram platform designed specifically for kids. Some lawmakers hope to stop CEO and founder Mark Zuckerberg's efforts entirely until the law can catch up to the industry. House Democrats and others have doubled down on calls for the platform to abandon the controversial plans after an 18-month internal investigation, described as a "teen mental health deep dive" that ran until this spring, concluded that "social comparison is worse on Instagram" (The Hill).


As a result of growing concern and pressure from lawmakers and parents, some progress has been made while researchers gather data to help support a safer space for kids online. Most notably, Facebook has decided it will limit targeted advertising to users younger than 18 on its current platforms: Facebook, Instagram and WhatsApp. Prior to this commitment, Alphabet, Google's parent company, decided to limit how much influence brands can have on young people through interest-based targeting. And just yesterday, September 15th, 2021, "A total of 13 advertising, public relations and influencer agencies have agreed to stick to the rules, which include using platforms with technology to check a person’s age—or alternatively using influencers who are at least 25 years old and who 'primarily appeal to adult audiences'" (Campaign 2021).


As questions about mental health, safety and digital citizenship for teens and kids run up against profitability for big tech developers, we are only now confronting what impacts these platforms may have on developing minds, more than a decade and a half after Facebook made its debut to college students in 2004. That's an entire youth generation who grew up in an unregulated, unmonitored, burgeoning digital space.

Joshua Smith, M.S.







