Wed. Jan 26th, 2022

Six months ago, pilot Hana Khan saw her picture on an app that appeared to “auction” dozens of Muslim women in India. The app was quickly taken down, no one was charged, and the matter was shelved – until a similar app appeared on New Year’s Day.

Khan did not feature on the new app, Bulli Bai – a slur for Muslim women – which hawked activists, journalists, an actor, politicians and Nobel laureate Malala Yousafzai as maids.

Amid growing outrage, the app was taken down, and four suspects were arrested last week.

Police officers escort Niraj Bishnoi, a 20-year-old man arrested in the eastern state of Assam for allegedly creating the Bulli Bai app [Reuters]

The fake auctions, widely shared on social media, are just the latest examples of how technology is being used – often with ease, speed and low cost – to endanger women through online abuse, privacy violations or sexual exploitation.

For Muslim women in India, who are regularly abused online, it is an everyday risk, as social media is used to incite hatred and discrimination against their minority community.

“When I saw my photo on the app, my world shook. I was upset and angry that someone could do this to me, and I got angrier when I realised this anonymous person was getting away with it,” said Khan, who filed a police complaint against the first app, Sulli Deals, another pejorative term for Muslim women.

“This time, I felt so much fear and despair that it was happening again to my friends, to Muslim women like me. I do not know how to stop it,” Khan, a commercial pilot in her 30s, told the Thomson Reuters Foundation.

Mumbai police said they were investigating whether the Bulli Bai app was “part of a larger conspiracy”.

A spokesperson for GitHub, which hosted both apps, said it had “long-standing policies against content and behavior involving harassment, discrimination and incitement to violence.

“We have suspended a user account after investigating reports of such activities, all of which violate our policies.”

Misconception

Advances in technology have heightened the risks for women around the world, whether through trolling, doxxing of their personal details, surveillance cameras, location tracking or deepfake pornographic videos made with doctored images.

Deepfakes – artificial intelligence-generated synthetic media – are used to create pornography, with apps that let users digitally strip women’s clothing or superimpose their faces onto explicit videos.

Digital abuse of women is pervasive because “everyone has a device and a digital presence,” said Adam Dodge, CEO of EndTAB, a United States-based nonprofit that tackles technology-enabled abuse.

“The violence has become easier to commit, as you can reach someone anywhere in the world. The scale of the damage is also greater because you can upload something and show it to the world in a matter of seconds,” he said.

“And there’s a permanence to it, because that photo or video exists online forever,” he added.

Police officers escort a man and a woman after a court appearance in Mumbai following their arrest over alleged involvement in the Bulli Bai app [Niharika Kulkarni/Reuters]

The emotional and psychological effects of such abuse are “just as exhausting” as physical abuse, with the consequences being exacerbated by the virality, public nature and permanence of the content online, said Noelle Martin, an Australian activist.

At 17, Martin discovered her image had been digitally altered and distributed in pornographic material. Her campaign against image-based abuse has helped change the law in Australia.

But victims are struggling to be heard, she said.

“There is a dangerous misconception that the harm of technology-facilitated abuse is not as real, serious or potentially fatal as abuse with a physical element,” she said.

“For victims, this misconception makes speaking out, seeking support and accessing justice much more difficult.”

Persecution

Lone creators and rogue coders are hard to trace, and technology platforms tend to shield anonymous users, who can easily create a fake email or social media profile.

Even lawmakers are not spared: in November, the U.S. House of Representatives censured Republican Paul Gosar over a digitally altered anime video showing him killing Democrat Alexandria Ocasio-Cortez. He later retweeted the video.

“With any new technology, we need to immediately think about how and when it will be abused and weaponised to harm girls and women online,” Dodge said.

“Technology platforms have created a very unbalanced atmosphere for victims of online abuse, and the traditional ways of seeking help when we are harmed in the physical world are not as available when the abuse takes place online,” he said.

Some technology companies are taking action.

Following reports that its AirTags – tracking devices that can be attached to keys and wallets – were being used to track women, Apple launched an app to help users protect their privacy.

In India, the women featured on the auction apps are still shaken.

Ismat Ara, a journalist put up for “sale” on Bulli Bai, called it “nothing less than online harassment”.

It was “violent, threatening and intended to create a feeling of fear and shame in my mind, as well as in the minds of women in general and the Muslim community,” Ara said in a police complaint posted on social media.

Arfa Khanum Sherwani, who was also put up for “sale”, wrote on Twitter: “The auction may be fake, but the persecution is real.”
