Alexis Hiniker is a computer scientist at the University of Washington Information School. Alexis studies the ways that digital technology products can manipulate children or help them thrive. She is sharing her findings with regulators to fight for developmentally appropriate online environments. Annie Brookman-Byrne talks with Alexis about exploitative designs, supporting children’s wellbeing, and improving technology.
Annie Brookman-Byrne: You research “dark patterns” – what are they?
Alexis Hiniker: Dark patterns are interface designs used by for-profit technology companies to manipulate user behavior. These exploitative designs in the software encourage kids to disclose data about themselves, make purchases, and stay online longer—all in instances when they wouldn’t otherwise do so. These patterns show up in a lot of different ways, from advertisements that look like games, to on-screen characters that exert peer pressure and push kids into making in-app purchases through whining, teasing, or shaming. I am trying to understand how dark patterns can undermine kids’ flourishing.
ABB: What else do you study?
AH: My work also showcases the many ways in which technology can support kids’ wellbeing and growth. Together with many wonderful collaborators, I have built apps and other technologies that lead kids to bond with siblings, learn new socioemotional skills, learn new academic skills, independently self-regulate their use of technology, and experience joy. For example, the Superhero Zip chatbot teaches children the practice of positive self-talk, a way of treating yourself with compassion that leads to positive outcomes. And Coco’s Videos supports children as young as three in being intentional about when to watch online videos and what content to choose.
My lab has also evaluated existing commercial products and shown how certain design patterns lead children to share digital experiences with family members, develop greater executive function, play and joke with friends, and learn new things. There is a lot of great innovation in this space, but unfortunately, a fair amount of manipulative design and exploitation as well. I try to tease those apart and get precise about which is which and what those differences look like from a design perspective.
ABB: What changes have you seen in companies and consumers over time?
AH: There’s been an arms race on both sides of this equation. Manipulative design has become both increasingly sophisticated and increasingly pervasive. A/B testing makes it possible for companies to learn a great deal about which designs are most effective at getting kids hooked, serving the most advertisements, or driving the greatest number of purchases. In a recent analysis, my collaborators and I found that 80% of apps for children contain manipulative designs. A manipulative pattern in one popular app leads to copycats, and exploitative designs can quickly become pervasive across the ecosystem of digital content for kids. The widespread use of a particular pattern then normalizes that practice in consumers’ minds, in the same way we don’t think twice about the common practice of charging $9.99 instead of $10.00 for an item to subtly make consumers feel like they’re spending less than they really are.
At the same time, consumers and regulators are becoming more savvy and less tolerant of these tactics. We’ve seen an increase in recent years in legislation that prohibits dark patterns and manipulative design, and I think that’s only going to grow.
“There is a lot of great innovation in this space, but unfortunately, a fair amount of manipulative design and exploitation as well.”
ABB: Do you hope your research will help to improve technology so that it has children’s thriving at the forefront?
AH: Some of my work involves inventing new designs and products to support children’s wellbeing, and I hope that it will inspire products that support autonomy, self-reflection, playfulness, and close interpersonal relationships—the things kids need to thrive. But I often think that designers don’t really need my help at all to come up with fantastic ways to support children; when I talk to product teams, it is clear that the industry is filled with incredibly talented designers who care a great deal about children’s wellbeing. What they need is a level playing field where it is commercially viable to create products that are great for kids and where there is no need for a race to the bottom that extracts data, time, and money from children in coercive ways.
So, increasingly, I am trying to identify problematic designs in a precise way and to share these patterns with regulators. Once bad practices are off the table, I think designers will be free to create really fabulous content for kids, and products of that kind will be able to succeed.
ABB: What are the biggest mysteries when it comes to technology and kids?
AH: I think a lot of the practical questions that families want answers to still can’t be answered in a straightforward way. I don’t think there’s one right answer to questions like, “when should my child get a phone?” or “how much TV is ok for a school-aged kid?” But I do think that by pushing for more respectful, developmentally appropriate online environments, those questions will start to matter less and less.
“I’d love to see companies competing to outdo each other in the value their products offer children.”
ABB: What are your hopes for the future?
AH: I’d love to see companies competing to outdo each other in the value their products offer children, rather than competing to be most successful in extracting resources from kids. I’d also love to see more integration between research and practice, where researchers take up questions that are highly relevant to industry and feed findings back into products. All of that feels increasingly within reach, so it’s a very exciting time to be in this space.
Footnotes
This interview has been edited for clarity.