The excesses of 2021 demonstrated how digital technology can threaten future “human flourishing,” as philosophers put it. The topic has already received attention in the first days of the new year: consider MIT Technology Review’s list of the worst technology excesses and Fast Company’s best tech moments. When things go wrong, technology has only limited power to put them right.
As we approach 2022, it is clear that technology’s unchecked disruption of societal institutions and conventions is becoming less acceptable. This year, governments across multiple jurisdictions will adopt legislation to limit the impact of digital technology on society, covering many emerging and existing technologies. Many initiatives are under way, including the EU AI Act and Digital Services Act, the UK Online Safety Bill, and the US SAFE TECH Act.
While legislation is an indicator of society’s concern, it is clear that ordinary people, non-specialists, have an increasingly sophisticated understanding of technology and society. Love it or hate it, the satire Don’t Look Up, which debuted in December, shows how much the goals that big tech sets for itself shape our society. Its dilemma of “Do we save the planet?” versus “Do we save the precious mineral resources of the comet?” translates readily into “Do we make technology work for corporate goals or for social ones?”
Peter Haff, a Duke professor, described this complex environment as the “technosphere.” It is time for a change, and there is no easy technical fix. It will require us to think beyond current frameworks and to make global changes that support and complement one another. For example, we need to avoid creating “Switzerlands”: technology-development domains that are not subject to regulation.
Joseph Stiglitz, an economist, pointed out that regulation brought us regulatory capture. This should prompt us to look for ways to redress, rather than entrench, the imbalance between corporate and societal power. As the anthropologist David Graeber put it: “The ultimate, hidden truth of the world is that it is something that we make, and could just as easily make differently.”
We believe in bold new beginnings and big thinking. Here are seven goals, approaches, and behaviors that the technology sector could adopt in 2022 to support global society rather than corporate interests.
1. Reduce the sector’s carbon footprint
The technology sector has long wished to be recognized as a global leader in decarbonization, helping other sectors become more efficient. Yet research shows that technology companies still under-report their carbon footprints by failing to account for emissions across the value chain, from raw-material extraction through to end-product usage. The apparent frivolousness of blockchain applications such as Dogecoin and non-fungible tokens (NFTs), along with metaverses and entrepreneur-driven moonshots, is evidence of a value system that fails to recognize the immense energy overhead underpinning all technology.
2. Be transparent about technological progress
It is still common to misrepresent the reality of technologies such as artificial intelligence in order to support a futurist vision, and to fail to acknowledge when real-world technology does not meet expectations. Digital contact-tracing apps have not proved as effective as hoped, autonomous vehicles still cause more injuries than human-driven ones, and new products such as Web 3.0 are often surrounded by confusing hype.
3. Work with regulators
Regulators around the world are moving to rein in multinational technology companies. The EU AI Act attempts to establish an international standard for developing trustworthy AI through risk-based classifications. The UK Online Safety Bill and the EU Digital Services Act require transparency reporting. The US SAFE TECH Act seeks to affirm civil rights, victims’ rights, and consumer protections.
California, Virginia, and Colorado were among the first US states to adopt privacy-protection legislation. But investigative journalism uncovered a “lobbying juggernaut” that targets international privacy regulation and gives companies such as Amazon enormous influence over its drafting. Meta (formerly Facebook) is outspoken in its desire for regulation. However, informed observers suggest that the company wants to maintain credibility while steering legislators towards areas where it is more comfortable with tighter government control.
Regulators are responding by hiring the brightest and most experienced people from industry and from ethics research. The FTC, for example, has strengthened its AI Strategy Group by bringing in academic experts in policy, economics, law, and technology. Because they understand both the technologies and their technical architectures, these people are well equipped to tackle this difficult task. Their work towards societal goals would benefit from industry support.
4. Cocreate better practices
Ethical frameworks, most recently the UNESCO guidelines, will continue to go unadopted without industry support. The parable of Alexa versus the penny, as Meredith Whittaker points out, shows that the fundamental problems in the relationship between AI and society cannot be solved by relying on engineering hygiene or algorithmic auditing alone.
However, measures that help us better understand how technology works are gaining traction. These include the algorithmic transparency standard created by the UK Government for public-sector bodies. Moving from transparency to recourse will lead to accountability mechanisms and to a relational dynamic between regulators, affected communities, and public-sector bodies that allows people to take part in the design and deployment of technologies. Some more radical ideas, such as requiring interoperability between platforms and data portability to rebalance power towards users, may change technologies, business models, and people’s relationships with digital tools and services.
5. Cooperate with independent research
Simply put, corporate research capture means you are beholden if you take tech-company dollars, and you will be excluded from access to technologies and data if you do not. Facebook revoked the access of NYU researchers looking into political-advertising targeting data. And Google, in its problematic relationship with ethics research, forced out Timnit Gebru after her work undermined the company’s carefully constructed narratives. This power balance must be reset so that technology companies cannot undervalue or co-opt peer-reviewed research.
New, independent, community-rooted organizations, such as Gebru’s DAIR Institute, the Minderoo Foundation’s frontier-technology network nodes in Oxford, Cambridge, Los Angeles, and Western Australia, and the Ada Lovelace Institute in London (where I work), are essential players in monitoring bad practice and applying pressure to effect change. To be effective, they will need access to the practices, policies, and infrastructures of technology companies.
6. Value and understand your employees
Whistle-blowers continue to emerge from among tech workers. Frances Haugen gave evidence to the US Congress and the European Parliament that Facebook knew its platform services were spreading misinformation. But these brave acts alone are not enough to change the power imbalance between workers and corporate structures, which is perpetuated by ingrained corporate culture and practices. To disrupt corporate control of ethics, we need a better understanding of how tech workers view morality and ethics, and that understanding must be mobilized to enable tech workers to follow their moral compasses within the company.
7. Build trustworthy technologies that foster trust
Trust is a key topic in discussions about the public understanding of technology. The framing is often that the public must be taught to trust technology, yet research shows that distrust is frequently a rational response. It would be better to shift the focus to improving industry practices and to building technologies, and companies, that deserve to be trusted.
Many technology companies genuinely want to create products that promote human flourishing. But if their primary goals are guided by shareholder interests, they risk a backlash in public trust (2021 Pew Research Center data show that 56% of Americans support greater regulation of technology companies) and an “AI winter.”
If legislation is introduced in 2022, it will be a significant step towards rebalancing power dynamics, for example towards historically under-resourced and disadvantaged communities. But such rebalancing can be achieved only with the support and participation of technology companies, especially those with access to the code and processes that shape technology and the power to create, or to block, change.