
Social media bills address multiple issues

David Colburn
Posted 3/20/24

REGIONAL- As the social media and artificial intelligence landscapes continue to evolve, legislators are playing catch-up, reacting to problems stemming from technology that threatens online privacy and safety, enables the exploitation of children for financial gain, and opens the door to political campaign manipulation through deepfake audio and video.
Bills addressing each of these issues are being considered in the Minnesota Legislature this session as lawmakers strive to tip the scales back in favor of those affected by high-tech practices that foist unwanted and sometimes fraudulent content on unsuspecting users and increasingly turn kids into lucrative social media influencers promoting sponsored, i.e. paid, content.
Privacy concerns
House File 4400, titled the “Prohibiting Social Media Manipulation Act” and authored by Rep. Zack Stephenson, DFL-Coon Rapids, would flip the script for social media companies such as Facebook, X, Instagram, and TikTok by requiring them to enable strict privacy protections by default when new accounts are created. While all of these platforms have privacy safeguards, users typically have to opt in to them by going to the app settings and choosing which safeguards to enable. HF 4400 would mandate that all safeguards be automatically enabled for new accounts, and a user would have to choose to opt out of them.
The goal is to prevent social media platforms from using their algorithms to push content that doesn’t align with what users want to see. Algorithms, increasingly powered by artificial intelligence (AI), track users’ interactions with an app’s content and push more of that type of content into their feeds with the intent of increasing the user’s time and engagement with the app. If a user clicks on political content on Facebook, for example, the algorithm will attempt to show more political posts in that user’s news feed.
Loose privacy defaults have particularly impacted child users of social media apps, who are often oblivious to the apps’ opt-in safety features.
“You sign up (for these apps) to talk to your friends and you don’t realize that anyone in the world may be able to contact you,” said Ravi Iyer, a former Facebook executive who now leads research at the University of Southern California Marshall Center for Ethical Leadership and Decision Making.
The bill would enact numerous safeguards for Minnesota social media users, including:
• Requiring an interface that allows users to indicate what content they do or do not want. Apps would be specifically prohibited from pushing content to users that does not align with those preferences.
• Default privacy settings that focus on keeping user-generated content within their own chosen social network and not available to the general public. Platforms would be prohibited from using generative AI to scrape and utilize a user’s content without a user’s consent. A user would be able to change their privacy settings.
• Daily usage limits that would restrict the amount of time new users can be on the app and would also restrict highly active account holders to limit their impact on Minnesota users. This guards against distortion or domination by a small minority of users who may exert undue influence on content and on how algorithms disseminate it.
• Requiring apps to provide specific heightened protections, such as prohibiting features that encourage increased usage, including auto-play of the next video and infinite content feeds, as well as prohibiting visible counts of “likes” on user-generated content. Heightened protections would be opt-in, and a user who chooses heightened protections as a device option on a cellphone or tablet would automatically have them applied to any apps used on that device. If a parent has enabled parental controls on the device, parental controls would be enabled on the apps as well.
Multiple trade industry groups voiced opposition to the bill, including written comments from TechNet and the Computer and Communications Industry Association. Comments by Robert Singleton, director of policy and public affairs for California and the Western U.S. at the Chamber of Progress, a tech industry trade group, echoed the overall industry concerns. Singleton said the bill is vaguely written, would have a chilling effect on social media platforms, would “broadly infringe” on the First Amendment, and would be “destined to lose in court.”
Platforms would self-censor out of fear of litigation and decline to show any content that might possibly contradict a user’s preference, Singleton said.
The committee voted to refer the bill to the Judiciary Finance and Civil Law Committee for further consideration.
Child influencers
A new type of occupation spawned by social media is that of the social media “influencer,” an individual or group with a large number of followers who creates content to share expertise on a particular topic and partners with companies to promote their products or services. Mega influencers have over one million followers, while nano influencers, at the lower end, have 10,000 or fewer.
Companies pay influencers handsomely for their promotions, and in the world of influencers kids are generating lots of business. For example, 12-year-old YouTube influencer Ryan Kaji has over 36 million followers who have viewed his 2,600 videos more than 45 billion times as of January 2023. Kaji’s channel became popular for his toy reviews and has branched out to include personal vlogs and reviews with other members of his family. Kaji became so popular that he has his own Ryan’s World toy line and a Race with Ryan game for video gaming platforms. His income exceeds $30 million annually, and his net worth has been estimated at $140 million, fifth highest among all YouTube influencers.
Some Minnesota legislators want to be sure that kid influencers in the state aren’t being financially exploited by their parents. Stephenson authored this bill as well; it would require parents to establish a trust account, maintained until the minor turns 18, for payments received for content using the minor’s likeness. Records would have to be kept on minors who appear in at least 30 percent of a content creator’s videos, detailing which videos generate income, how much compensation was generated, and how much was paid into the trust account.
The bill also bars children under 14 from appearing in more than 30 percent of a creator’s videos.
The bill provides that a minor 14 or older is allowed to do such work under state law, and that a minor is entitled to have any content deleted at their request once they turn 13. A minor, or an adult previously depicted as a minor, would have the right to sue for damages if any provisions of the law are violated.
“It’s over a $1 billion industry in the United States, and I think it’s time for it to have some guardrails,” Stephenson said.
Teen Vogue reporter Fortesa Latifi told FOX 9 News in February that child influencers aren’t covered under child labor laws.
“It’s totally legal in 49 states in this country for these kids to basically have full-time jobs their entire childhood and adolescence and get to 18 and have nothing to show for it,” Latifi said.
The Minnesota bill is modeled after one passed in Illinois last year.
Deepfake teeth
Last year the Legislature passed a law regulating deepfakes, video and audio representations that mimic real people, depicting them doing and saying things they never did, typically as misinformation that reflects negatively on them.
The unusually rapid advance of AI over the past year and a half has put the power of creating extremely realistic, low-cost deepfakes into the hands of anyone. A quick Google search revealed over 40 deepfake apps available for use on phones or online. A recent highly publicized use of a deepfake for political manipulation was an AI-generated recording of President Joe Biden’s voice used for robocalls prior to the New Hampshire primary telling Democrats not to go to the polls.
The Minnesota law criminalized the creation of sex-related deepfake photos, videos, and audio, as well as prohibited political deepfakes intended to influence elections.
This year, a bill in the Senate written by Sen. Erin May Quade, DFL-Apple Valley, seeks to put more teeth into the political portion of the law by increasing the severity of the penalties.
“This is a new frontier for all of us. We are really grappling with technology that you’re looking and seeing and hearing something that did not happen from somebody who did not do it,” Quade told WCCO News. “There has never been a time in our life where we could look at something and be so sure we’re looking at the real thing, and it absolutely is not. It tests our sense of reality in a way that’s really troubling.”
Content-manipulated deepfakes created without the consent of the person depicted and with the intent of hurting a candidate would be a crime in the 90 days before a general election and the 30 days before state, local, and presidential primaries and political party nominating conventions.
A candidate or other person who has created and disseminated a deepfake would be subject to imprisonment for up to 90 days and payment of a fine up to $1,000. If the deepfake was intended to cause violence or bodily harm, the maximum penalties would increase to 364 days and $3,000.
A repeat offense within five years of a prior conviction could get someone up to five years in prison or a $10,000 fine, or both.
A candidate convicted of using a deepfake who won the vote would be disqualified from office and could not be appointed to fill that office should a vacancy occur during its regular term.
The bill has been referred to the Judiciary and Public Safety Committee for additional deliberation.