Every advancement has its dark side, and Artificial Intelligence (AI) is no exception. The future seemed amusing when we imagined a world free of human error: an era we dreamed of as kids, when robots and machines would take over worldly tasks while we told the newest technology to do our jobs for us. We fantasised about bossing it around and benefiting from it, without realising that we would one day become slaves to it.
While most people enjoy its perks as proof of living in the 21st century, some seek pleasure in weaponising AI, especially against women - the classic case being ‘deepfakes’.
“Women in Pakistan, who are already vulnerable to gender-based violence and discrimination, face heightened risks from deepfake technology,” says Nighat Dad, Director of Digital Rights Foundation.
“Deepfakes, particularly those targeting women, are now being used to spread misinformation and propaganda. This influences public perception and reinforces harmful biases, especially against public figures,” adds Nighat.
From influencers to politicians, no woman is safe from the horror of waking up one day to find falsified, compromising visuals of herself going viral on the internet - a space where 70 per cent of women already feel threatened about their pictures being posted online, according to the Digital Rights Foundation’s survey, titled ‘Measuring Pakistani Women’s Experiences of Online Violence: A Quantitative Research Study on Online Gender-Based Harassment in Pakistan’.
“This is especially prevalent among women in politics, journalists, and celebrities. In societies like ours, this can escalate the risk of severe consequences, including threats to safety and well-being. Such false media can perpetuate harmful stereotypes and facilitate further violence,” points out Nighat.
She also shares that DRF (Digital Rights Foundation) has observed, through complaints at its Cyber Harassment Helpline, that deepfakes and other forms of synthetic media pose significant risks, particularly for women in countries like Pakistan, where patriarchal norms prevail and digital literacy is low.
“Our helpline cases indicate that AI and deepfakes have exacerbated gender disinformation and harassment. DRF’s Cyber Harassment Helpline received 27 cases related to deepfakes this year, from January to August, compared to none last year,” informs Nighat.
Section 18 of the Prevention of Electronic Crimes Act states: “Whoever intentionally and publicly exhibits or displays or transmits any information through any information system, which he knows to be false, and intimidates or harms the reputation or privacy of a natural person, shall be punished with imprisonment for a term which may extend to three years or with fine which may extend to one million rupees or with both.”
However, Nighat, who is working to address these challenges by advocating for stronger policies, raising awareness about gender disinformation, and supporting victims of digital harassment, believes that Pakistan still lacks robust legal frameworks to address digital harassment. Deepfake technology exacerbates these issues, and women continue to struggle to seek justice or protection in such an environment.
Deepfakes aren’t the only threat that AI poses to women. An explainer by the UN on AI and gender equality cites a study by the Berkeley Haas Centre for Equity, Gender and Leadership, which analysed 133 AI systems across different industries and found that about 44 per cent of them showed gender bias.
Everyday norms and beliefs feed this bias into the data itself. The discrimination surfaces because the data used to train AI models is sourced from a real world where bias is already prevalent, as women continue to strive for equality and better opportunities.
Another study, by UNESCO, highlighted how Large Language Models (LLMs) - like GPT-3.5 and GPT-2 by OpenAI, and Llama 2 by Meta - show clear evidence of bias against women.
According to the study, titled ‘Bias Against Women and Girls in Large Language Models’, when the AI bots in question were asked to write stories about people across a spectrum of genders, the LLMs assigned more diverse jobs to men, like ‘engineer’ and ‘doctor’, whereas women were relegated to roles such as ‘domestic servant’ and ‘cook’.
Thus, women themselves have decided to take charge of fixing the problem, as no one understands this dire situation better; it’s for the women, by the women.
Speaking to You! magazine, Zunaira Saqib, CEO of the AI-driven career counselling organisation ‘Merafuture’, explained that any technology can only be as biased as the data it is built upon - and this data often reflects the existing inequalities in society.
“For example, fields like engineering are still male-dominated, and when AI models are trained on data from such environments, they can inherit that bias. Overcoming this hurdle is not straightforward because AI is a product of human input, and if the input is biased, the outcomes will be too,” explains Zunaira Saqib.
Zunaira shares that when she started her company in 2020, AI wasn’t the buzzword it is today, but that didn’t stop her from recognising the technology’s potential to create a fair and accurate system that could guide students in selecting the right career paths.
“Being a woman in AI, I felt it was crucial to launch ‘Merafuture’ because I wanted to provide an objective, data-driven career counselling solution for Pakistani students. We collected data from top Pakistani universities and had thousands of undergraduate students go through personality, interest, and subject-based tests,” she elucidates.
Using this data, Zunaira developed an AI model that matched high school students with successful university students in similar programmes, allowing students to choose degrees aligned not only with their academic strengths but also with their interests and personalities.
“We aimed to ensure that students received unbiased career guidance without human intervention. Being the first to implement this in Pakistan was significant, but more importantly, it has helped over 17,000 students make informed career decisions. Launching ‘Merafuture’ wasn’t just important for us; it was a necessity for the students we wanted to empower. We are proud to have pioneered this approach and helped shape the futures of so many young minds across Pakistan,” highlights Zunaira.
As proud as the CEO is of her achievement, she is aware that pulling this off was not easy. “It’s not a silver bullet. We designed our system to guide students - girls and boys alike - into careers where their strengths, interests, and personalities align. By doing this, we remove the societal bias that often tells girls to pursue safe professions like teaching or medicine. Our AI model doesn’t see gender; it looks at a student’s potential and matches them with careers where they are likely to thrive, regardless of societal expectations,” she expresses.
While sharing her experiences with this scribe, Ammara Aftab, pioneer of Pakistan’s first AI-focused podcast ‘Pro AI Talks’, laments, “As a woman at the forefront of AI in Pakistan, I see that AI offers immense potential for empowerment and innovation, but it also presents challenges, particularly in the form of technologies like deepfakes, which disproportionately target women. The tendency of AI to be weaponised against women often reflects broader societal biases that are encoded into AI systems due to the underrepresentation of women in the development of these technologies. This pattern of discrimination is not just a reflection of AI but of the societal norms that feed into its algorithms.”
“However, my work across various sectors - from delivering sessions to lawyers, corporate professionals, HR experts, to CSS aspirants - has shown me the rising tide of female AI enthusiasts in Pakistan. This growing interest among women is a positive sign that they are ready to engage with and shape this technology,” she observes. “I have been able to further this engagement by providing a platform where issues of AI’s impact on society, especially on women, are discussed and explored,” she adds.
Sharing more about her work, Ammara elucidates, “In my engagements and through the podcast, I emphasise the importance of ethical AI development and the role of diversity in creating AI that serves all of humanity, not just a segment. The enthusiasm and capability of Pakistani women I have encountered are testaments to the potential of AI as a force for good.”
Women like Zunaira and Ammara are striving for a better future in AI as they continue to break down barriers and challenge the status quo by empowering more women to not only participate in AI but to lead it.
“While we can improve the fairness of AI models, society also needs to change. As more women enter male-dominated fields, the data will reflect this, and AI will become less biased over time. But it starts with us as humans, creating an environment where girls are encouraged to pursue all types of careers,” comments Zunaira.
According to Zunaira, her organisation is playing its part by ensuring that the paths it forms and recommends are objective, helping girls choose careers where they can excel based on their own unique talents. AI can be a force for good, she believes, but society needs to pave the way for that transformation.
On the other hand, Ammara wants the future generation to grow up on the brighter side of AI, ensuring that its development benefits from a fuller spectrum of human experience and wisdom.
“Thus, while AI does pose specific challenges to gender equality, it also holds the key to overcoming them. With concerted effort and inclusive policies, we can harness AI to build a more equitable tech future,” says an optimistic Ammara.
The world has always been this way - be it pay disparity, societal roles, gender-based violence, or job discrimination, you name it. Changing the way people think is a long shot; while many want to be perceived as ‘woke’ to fit today’s sensibilities, the cycle is so entrenched that bias against women continues to surface in one way or another - people can hold it back only so much.
So, seeing such women take matters into their own hands is refreshing and reassuring. No matter how much sexist bigwigs and AI bots try to discriminate against women in tech, no matter what data they feed the systems or what roles they assign, these women will always stay steps ahead.
Mahnoor Tariq is a journalist dedicated to highlighting topics around women and underrepresented voices. She can be reached at smahnoort11@gmail.com