
    Children’s Groups Want FTC to Ban ‘Unfair’ Online Manipulation of Kids

    By Natasha Singer

    2022-11-17
    A loading screen from My Talking Tom, a wildly popular app game targeted at very young children. (Handout via The New York Times)

    My Talking Tom, an animated video game featuring a pet cat, is one of the most popular apps for young children. To advance through the game, youngsters must care for a wide-eyed virtual cat, earning points for each task they complete.

    The app, which has been downloaded more than 1 billion times from the Google Play Store, also bombards children with marketing. It is crowded with ads, constantly offers players extra points in exchange for viewing ads and encourages them to buy virtual game accessories.

    “Every screen has multiple traps for your little one to click on,” Josiah Ostley, a parent, wrote in a critical review of the app on the Google Play Store last month, adding that he was deleting the app.

    Now some prominent children’s advocacy, privacy and health groups want to ban user engagement techniques that, they say, unfairly steer the behavior of minors and hijack their attention. On Thursday morning, a coalition of more than 20 groups filed a petition asking the Federal Trade Commission to prohibit video games such as My Talking Tom, as well as social networks such as TikTok and other online services, from employing certain attention-grabbing practices that may hook children online.

    In particular, the groups asked regulators to prohibit online services from offering unpredictable rewards — a technique that slot machines use — to keep children online.

    The groups also asked the agency to prohibit online services from using social pressure techniques, such as displaying the number of likes that children’s social media posts garner, and from using endless content feeds that may keep children online longer than they intended.

    The petition to federal regulators warned that such practices might foster or exacerbate anxiety, depression, eating disorders or self-harm among children and teenagers.

    “Design features that maximize minors’ time and activities online are deeply harmful to minors’ health and safety,” the children’s activists wrote in the petition. “The F.T.C. can and must establish rules of the road to clarify when these design practices cross the line into unlawful unfairness, thus protecting vulnerable users from unfair harms.”

    The coalition was led by Fairplay, a nonprofit children’s advocacy group, and the Center for Digital Democracy, a children’s privacy and digital rights group. Other signatories included the American Academy of Pediatrics and the Network for Public Education.

    A level-up screen from My Talking Tom, a wildly popular app game targeted at very young children. (Handout via The New York Times)

    Outfit7, the developer of My Talking Tom, did not immediately return an email seeking comment.

    The FTC petition comes at a moment when legislators, regulators and health leaders in the United States and abroad are increasingly scrutinizing the online tracking and attention-hacking practices of popular online platforms — and trying to mitigate the potential risks to children. In doing so, they are challenging the business model of apps and sites whose main revenue comes from digital advertising.

    Online services such as TikTok, Instagram and YouTube routinely employ data-harvesting techniques and compelling design elements — such as content-recommendation algorithms, smartphone notifications or videos that automatically play one after another — to drive user engagement. The more time people spend on an app or site, the more ads they are likely to view.

    Now legislators, regulators and children’s groups are taking a new approach to try to curb the use of such attention-hacking practices on minors. They are trying to hold online services to the same kinds of basic safety standards as the automobile industry — essentially requiring apps and sites to install the digital equivalent of seat belts and air bags for younger users.

    Last year, for instance, Britain instituted comprehensive online safeguards for young people, known as the Children’s Code. The new rules require social media and video game platforms likely to be used by minors to turn off certain features that could be detrimental — such as barraging users with notifications at all hours of the night — by default for younger users.

    Before the British rules went into effect, TikTok, YouTube, Instagram and other popular services bolstered their safeguards for younger users worldwide.

    In September, California also enacted a law requiring sites and apps likely to be used by minors to install wide-ranging safeguards for users younger than 18. Members of Congress have introduced two bills — the Kids Online Safety Act and the Children and Teens’ Online Privacy Protection Act — intended to bolster online protections for youngsters.

    Civil liberties experts have argued that the safeguards could also have deleterious consequences. The measures, they argue, could subject children to increased surveillance, potentially deterring vulnerable young people from finding online resources on sensitive issues such as reproductive health or gender identity.

    Young people themselves report mixed feelings about their online activities. In a survey of roughly 1,300 teenagers in the United States, published Wednesday by the Pew Research Center, 80% said social media made them feel more connected to their friends’ lives. About 30% also said they felt that social media had a negative effect on people their age.

    With Congress split after the midterm elections, the FTC may end up regulating attention-hacking techniques on children before federal legislators do.

    In August, the agency posted a notice asking the public to weigh in on whether new rules were needed to protect consumers from “commercial surveillance” — that is, software services that amass data on individual consumers and use it to try to steer their behavior. The notice posed a series of questions specifically related to children.

    “Do techniques that manipulate consumers into prolonging online activity,” such as quantifying the number of likes on social media posts, “facilitate commercial surveillance of children and teenagers?” the regulators asked. “Is it an unfair or deceptive practice when a company uses these techniques despite evidence or research linking them to clinical depression, anxiety, eating disorders or suicidal ideation among children and teenagers?”

    This article originally appeared in The New York Times.
