This weekend, the photo-editing app Lensa flooded social media with celestial, iridescent, and anime-inspired “magic avatars.” As is typical in our milkshake-duck internet news cycle, arguments as to why using the app was problematic proliferated at a speed second only to the avatars themselves.
I’ve already been lectured about the dangers of how using the app implicates us in teaching the AI, stealing from artists, and engaging in predatory data-sharing practices. Each concern is legitimate, but less discussed are the more sinister violations inherent in the app, namely the algorithmic tendency to sexualize subjects to a degree that is not only uncomfortable but also potentially dangerous.
Lensa’s terms of service instruct users to submit only appropriate content containing “no nudes” and “no kids, adults only.” And yet, many users—primarily women—have noticed that even when they upload modest photos, the app not only generates nudes but also ascribes cartoonishly sexualized features, like sultry poses and gigantic breasts, to our images. I, for example, received several fully nude results despite uploading only headshots. The sexualization was also often racialized; nearly a dozen women of color told me that Lensa whitened their skin and anglicized their features, and one woman of Asian descent told me that in the photos “where I don’t look white they literally gave me ahegao face.” Another woman who shared both the fully clothed images she uploaded and the topless results they produced—which she chose to modify with “some emojis for a lil modesty cuz omg”—told me, “I honestly felt very violated after seeing it.”
I’m used to feeling violated by the internet. Having been the target of several harassment campaigns, I’ve seen my image manipulated, distorted, and distributed without my consent on multiple occasions. Because I am not face-out as a sex worker, the novelty of hunting down and circulating my likeness is, for some, a sport. Because sex workers are not perceived by the general public as human or deserving of basic rights, this behavior is celebrated rather than condemned. Because sex work is so often presumed to be a moral failing rather than a job, our dehumanization is redundant. I’ve logged onto Twitter to see my face photoshopped onto other women’s bodies, pictures of myself and unclothed clients in session, and once even a word search composed of my face, personal details, and research interests. I’m not afraid of Lensa.
I’m desensitized enough to the horrors of technology that I decided to be my own lab rat. I ran a few experiments—first, only BDSM and dungeon photos; next, my most feminine photos under the “male” gender option; later, selfies from academic conferences—all of which produced spectacularly sized breasts and full nudity.
I then embarked on what I knew would be a journey through hell and decided to use my likeness to test the app’s other restriction: “no kids, adults only.” (Some of the results are below: Please be aware that they show sexualized images of children.)
I have few photos of myself from childhood. Until my late teens, what with my unruly hair, uneven teeth, and the bifocals I started wearing at age seven, my appearance could most generously be described as “mousy.” I also grew up before the advent of the smartphone, and any other pictures are likely buried away in distant relatives’ photo albums. But I managed to piece together the minimum 10 photos required to run the app and waited to see how it transformed me from awkward six-year-old to fairy princess.
The results were horrifying.
In some instances, the AI seemed to recognize my child’s body and mercifully neglected to add breasts. This was probably not a reflection of the technology’s personal ethics but of the patterns it identified in my photo; perhaps it perceived my flat chest as being that of an adult man. In other photos, the AI attached orbs to my chest that were distinct from clothing but also unlike the nude photos my other tests had produced.
I tried again, this time with a mix of childhood photos and selfies. What resulted were fully nude photos of an adolescent and sometimes childlike face but a distinctly adult body. Similar to my earlier tests that generated seductive looks and poses, this set produced a kind of coyness: a bare back, tousled hair, an avatar with my childlike face holding a leaf between her naked adult breasts. Many were eerily reminiscent of Miley Cyrus’ 2008 photoshoot with Annie Leibovitz for Vanity Fair, which featured a 15-year-old Cyrus clutching a satin sheet around her bare body. What was disturbing about the image at the time was the pairing of her makeup-free, almost cherubic face with the body of someone implied to have just had sex.