This year, the UN's annual Internet Governance Forum was convened virtually. Members of the DemTech team attended panels to hear discussions and analysis from global experts on the current state of global governance of the internet, AI, and data. If you missed any panels or want to learn more, all sessions were recorded and are available on IGF's YouTube page. Here are some of our takeaways from the event.
Harnessing tech for good:
A major theme at this year's IGF was the intersection of technology, environmental sustainability, and social justice. Several panels highlighted technological breakthroughs that could greatly improve medical care, social services, and climate action.
IGF's main session on the environment highlighted that digital technologies can be used to limit deforestation, decrease carbon emissions, reduce fertilizer usage, and better manage sustainable energy sources. The ICT sector in particular has the potential to abate three times as much energy consumption as it produces.

5G technology presents new opportunities for remote supervision and work in the medical field, which could help societies recover from the coronavirus pandemic. Infrared temperature measurement, wider availability of remote psychological counseling and telehealth, and automated checks to ensure health workers' PPE fully covers them are just a handful of potential 5G applications.

Two panel discussions on technology for smart cities focused on opportunities to strengthen the privacy and transparency of AI and data systems, and on how local officials can exercise democratic control over AI as cities use technology to manage public spaces, protect vulnerable communities, and improve service delivery.
Rethinking rights and justice in AI:
Institutional frameworks have huge impacts on the governance of emerging technologies, artificial intelligence, and data. Around the world, we now see automated decision-making systems codify systems of inequality rather than confront injustice. In the panel "Internet Diverse - People United!", speakers called our current framework one of "techno-imperialism" and stressed the importance of incorporating concepts of justice into our frameworks for AI and optimization technologies. One speaker emphasized the importance of considering feminist perspectives as we think about internet and AI governance: our data are our bodies, and our bodies are our data.
The panel discussed the need to democratize the algorithms that control so much of our day-to-day lives, since the people who make automation decisions, and the frameworks within which they make them, play a crucial role in healthcare, employment, welfare, policing, and other areas that affect marginalized communities. Some feel that the mantra of non-discrimination, transparency, and accountability has been captured by large tech companies, which use such language while falling short of deeper work. The visible willingness of Google and Facebook to talk about racial justice and accountability may distract from these companies' lack of effort to address larger structural and systemic problems.
Another speaker added that today, we often mask larger social problems as technological problems as algorithms become embedded in more and more sensitive parts of our lives. If, for example, a job database repeatedly ranks unemployed women as having lower job prospects than men, it is a matter of both correcting for that algorithm's bias and addressing the inequality that existed before that technology formalized it.
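The job-database example can be made concrete with a toy sketch. Everything here is hypothetical, not any system discussed at IGF: a scorer that leans on historical hiring outcomes simply formalizes the existing gap, which is why the fix is partly technical and partly social.

```python
# Hypothetical illustration of bias formalized by an algorithm.
# The data, groups, and "models" below are invented for this sketch.

historical = [
    # (group, years_experience, hired)
    ("w", 5, 0), ("w", 6, 0), ("w", 7, 1),
    ("m", 5, 1), ("m", 6, 1), ("m", 7, 1),
]

def hire_rate(group):
    outcomes = [hired for g, _, hired in historical if g == group]
    return sum(outcomes) / len(outcomes)

# A "model" that scores candidates by their group's historical hire
# rate reproduces the past inequality as a ranking.
def naive_score(group):
    return hire_rate(group)

# One technical correction: score on experience alone, ignoring the
# group-level signal. As the panel noted, this addresses the algorithm
# but not the underlying inequality the data recorded.
def debiased_score(years_experience):
    return years_experience / 10
```

With this data, `naive_score("w")` is lower than `naive_score("m")` even for identical candidates, while `debiased_score` treats equal experience equally; the social question of why the historical outcomes differed remains.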
Panelists emphasized moving beyond discussions of privacy to focus on technology's role in advancing social rights. While privacy protects our basic identities, a rights-based framing lets us bring material aspects of justice, such as fair labor and wage practices, into our thinking. We can steer the conversation toward reclaiming our data and making sure technologies actually serve the public interest rather than amplify injustices.
Addressing violence against women online:
Twenty-five years after the Beijing Declaration was adopted to advance women's rights around the world, technology and innovation are a top priority for gender equality, as the internet increasingly shapes societies and gender relations. As distinctions between violence against women in the real world and the digital world have disappeared, "Safe Digital Spaces: a dialogue on countering cyberviolence against women" posed the question: what would it mean to abolish misogyny and reclaim women's rights online in the data age?
Cyber-violence campaigns against women share a common strategy: they deploy personal attacks against politically active women to suppress their free expression, whether political, creative, or sexual. These attacks undermine women's and minorities' rights to expression online. Addressing the gender digital divide should be everyone's responsibility: government, civil society, and tech platforms. Making digital spaces safer for women will require a redesign that makes our entire society safer and more inclusive. The companies whose platforms are most affected by online gender-based violence must be brought to the table and make better use of data to protect women. A Facebook representative described the company's ongoing efforts to develop automated monitoring tools that can detect coordinated attacks and cyber-mobbing, as well as machine-learning models that can identify harmful language and introduce friction before users post misogynistic content.
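The "friction" idea the Facebook representative described can be sketched in miniature. This is a hypothetical toy, not Facebook's actual system: real platforms use trained ML classifiers rather than the placeholder word list below, but the flow is the same, flag likely-harmful language and insert a confirmation step before the post goes through.

```python
# Toy sketch of friction-before-posting (hypothetical, illustrative only).
# A real system would replace this lexicon with a machine-learning model.
ABUSIVE_TERMS = {"slur1", "slur2"}  # placeholder terms for the sketch

def needs_friction(post: str) -> bool:
    """Flag a post if it contains any term from the placeholder lexicon."""
    words = {w.strip(".,!?").lower() for w in post.split()}
    return bool(words & ABUSIVE_TERMS)

def submit(post: str, confirmed: bool = False) -> str:
    """Insert a confirmation step (friction) before flagged posts go live."""
    if needs_friction(post) and not confirmed:
        return "friction: are you sure you want to post this?"
    return "posted"
```

The design point is that friction does not block speech outright; it interrupts the impulse, and the user can still confirm and post.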
Another panel, titled "Stop stalkerware: tackling digital stalking helps victims of domestic violence," focused on stalkerware and its consequences for survivors of domestic violence. Stalkerware gives abusers new avenues of power and control over their victims, exposing women's photos, messages, search history, and more. Women who talk to family and friends or call hotlines for help may face even more violence if abusers discover it through these applications. Speakers emphasized survivors' rights to safe access to the internet, phones, and technology, and to live free from fear of tracking by their abusers. NGOs cannot solve this problem on their own, yet because many communities around the globe view police forces as oppressive, survivors are unlikely to turn to the authorities for help when facing digital stalking. Trust needs to be built through a multi-stakeholder approach, and the voices of survivors and other marginalized people must be at the table to develop long-term solutions that hold digital abusers accountable. At the end of the day, any solution needs to empower survivors to make the best decisions for themselves.