While working as a nanny in the UK in her twenties, Laura Bates noticed how the young girls she cared for were shaped by the marketing aimed at them. In 2012, the London-based feminist writer and activist launched the Everyday Sexism Project, dedicated to documenting and fighting misogyny and gender-based violence, with an emphasis on insidious everyday incidents around the world: invisible labor, women being referred to as "girls," comments on their outfits in professional settings. The website became a book in 2014.
Since then, the sexual harassment of women has spread into online spaces, and Bates herself became a victim of deepfake porn, an experience that prompted her to write her new book, The New Age of Sexism: How AI and Emerging Technologies Are Reshaping Misogyny, published by Sourcebooks on September 9.
Bates told Wired that while gender-based violence is still mostly committed by people close to their victims, fast, easy, cheap, if not free, access to AI "is lowering the bar for access to this particular form of abuse." "Now anyone with an internet connection can … create hugely abusive, pornographic imagery of any woman or girl whose photo can be lifted from the internet."
Through first-hand research, which involved talking to tech creators and to women victimized by AI and deepfake technologies, as well as trying out companion chatbots and sex robots herself, The New Age of Sexism maps how AI is breaching women's boundaries in new ways, largely unchecked and urgently in need of oversight.
"I know people are going to think, 'She sounds like a pearl-clutching, uptight feminist,' but if you look at the top of the big tech companies, men at those levels are saying the exact same thing," Bates said, pointing to Jan Leike, who left OpenAI last year accusing the company of prioritizing "shiny products" over safety. "These warnings are coming from people embedded in the top leadership of these companies. The question is whether we are ready to listen."
Bates also talked with Wired about how AI girlfriends and virtual assistants can instill misogyny in children, how AI's environmental footprint disproportionately affects women, and how new technologies absorb the biases of their creators and users.
This interview has been condensed and edited for length and clarity.
Wired: One thing that struck me was the idea that new technologies tend to develop into tools of misogyny. Do you think that's fair?
Laura Bates: It's a long-established pattern. We saw it with the internet, we saw it with social media, we saw it with online porn. Almost invariably, when we are privileged enough to gain access to a new form of technology, a large portion of it is soon tailored to harassing women, abusing women, subjugating women, and maintaining patriarchal control over women. The reason is that technology itself is not inherently good or bad or anything; it is encoded with its creators' biases. It reflects historical, societal forms of misogyny, but it gives them new life. It offers new ways of achieving the same ends, and new forms of abuse. What is particularly concerning about AI, and generative AI in particular, is that these new technologies don't just replicate those existing forms of abuse, they amplify them, giving abusers new avenues for threats, harassment, and control.