
MAGIC 365

Mouriesse Alick Global Internet Consulting

Your Computer is on Fire

Updated: Mar 30, 2022

Technology touches pretty much every aspect of people’s lives today, yet few stop to think about its origins or where it might take society. Many forget that humans are involved at every stage of technological design, development and deployment, and that these individuals’ worldviews shape the technology they create.



This collection of essays by authors from the social sciences and STEM backgrounds argues against the view that technology is “neutral” in nature, and reveals the biases and power structures inherent in technological designs and tools.


Take-Aways

  • Technological progress without input from the social sciences is dangerous because it ignores technology’s political, social and economic implications.

  • The seemingly virtual world of e-commerce and technology is profoundly physical.

  • Artificial intelligence is never purely artificial – it always depends on human input.

  • Sexism brought the UK computing industry to its knees.

  • Technology is neither neutral nor objective, but, rather, often reinforces negative racial and gender stereotypes.

  • The QWERTY keyboard serves as a site of both exclusion and creative possibility.

  • Contrary to what the tech industry wants you to believe, it is not a meritocracy.

Technological progress without input from the social sciences is dangerous because it ignores technology’s political, social and economic implications.


Technological development always has the potential to improve lives. But even when an innovation aims to solve a serious societal problem, it often generates unintended consequences: Google played a key role in mapping the internet for the masses, but it also opened the door to invasive data mining and user privacy violations. Usually, people with little or no power in society suffer the most negative fallout from new technologies.

“We are witnessing a period in which it is becoming ever more urgent to recognize that technological progress without social accountability is not real progress.”

Tech developers’ lack of concern for the widespread effects of their creations stems, in part, from a lack of input from other disciplines, such as the social and political sciences. Leaving technical experts without sufficient knowledge of history or psychology – or even just a sufficient incentive to fact-check – in charge of systems that touch every aspect of people’s lives can lead to dangerous results. Think, for example, of how a lack of governmental oversight allowed Boeing to install flawed flight-control software in its 737 MAX airliners – resulting in the deaths of hundreds of people.

Many of today’s technologies and systems have power disparities and discrimination built into their makeup. Facebook founder Mark Zuckerberg’s first social media platform was voyeuristic and objectified his female classmates, so Facebook’s problematic history of privacy violations should not come as a surprise. Early computers and technology helped win wars and put people into space, but people often forget the women – in particular the Black women – involved in these ventures. Computing history is therefore also a history of the dominance of affluent white males.


The seemingly virtual world of e-commerce and technology is profoundly physical.


The cloud is a collection of technology resources that people can access via the internet, such as web hosting, server-based applications or data warehousing. People often think of the cloud as something without any physical presence, because of the seamless service it usually provides, and the invisibility of the labor and equipment involved. Yet the cloud is essentially a large factory, with a complex and extensive physical infrastructure. It is one of the largest consumers of electricity and water, and one of the largest sources of pollution in the world. Cloud servers and computers depend on rare materials, such as lithium, tin or cobalt, which companies source from all over the world.


“In rendering invisible the material infrastructure that makes possible the digital economy, the metaphor of the cloud allows the computer industry to conceal and externalize a whole host of problems, from energy costs to e-waste pollution.”

The cloud’s global supply chain raises political, social and environmental issues, ranging from worker safety to political and economic relations between countries. It also raises questions as to why big e-commerce companies are exempt from certain social, political and environmental controls. For example, despite being, in essence, a modern version of a mail-order catalog firm, Amazon receives tax subsidies by self-identifying as an e-commerce entrepreneur.


Artificial intelligence is never purely artificial – it always depends on human input.


All social media sites rely on artificial intelligence (AI) to monitor and moderate content. They use basic AI to filter keywords, IPs and URLs, as well as much more advanced tools such as hashing technologies, sentiment and forecasting tools, pixel analysis and machine learning. Using AI might seem to remove potential biases from the process – but instead, it removes accountability. People forget that human decisions and actions lie behind the algorithms, programs and processes that comprise AI.
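The basic filtering layer described above can be sketched as a simple rule-based moderator. The keywords, IP addresses and URL fragments below are hypothetical placeholders, not real blocklist entries:

```python
# Minimal sketch of a rule-based moderation filter. All blocklist
# entries are invented examples (the IPs come from documentation ranges).
BLOCKED_KEYWORDS = {"spamword", "scamlink"}
BLOCKED_IPS = {"203.0.113.7"}
BLOCKED_URL_FRAGMENTS = ("malware.example",)

def flag_post(text: str, author_ip: str) -> bool:
    """Return True if a post should be routed to human review."""
    lowered = text.lower()
    if any(word in lowered for word in BLOCKED_KEYWORDS):
        return True
    if author_ip in BLOCKED_IPS:
        return True
    if any(frag in lowered for frag in BLOCKED_URL_FRAGMENTS):
        return True
    return False

print(flag_post("Check out this scamlink now!", "198.51.100.1"))  # True
print(flag_post("A harmless post", "198.51.100.1"))               # False
```

Even this trivial filter illustrates the chapter’s point: a person decided which words and addresses count as suspect, so the system’s judgments are human judgments in automated form.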

The quality of AI decision-making depends on the technology’s source materials. If you put poor information in, you get poor information out. For example, policing technologies tend to rely on crime statistics. Yet crime statistics often give a skewed picture because of the over-policing of lower-income neighborhoods, and the resulting frequency of arrests of non-white people. In recent years, most of the big tech firms, including Facebook, Twitter and YouTube, have had to increase their human workforce for content moderation because of scandals around election influence, fake accounts or automated content recommendations.


“AI is presently and will always be both made up of and reliant upon human intelligence and intervention.”

Human input is critical in dealing with crimes such as child pornography, for instance. While algorithms can detect a potentially pornographic image, the decision about whether it qualifies as child pornography relies completely on human interpretation and review. In addition, there are known problems with image recognition software: For example, most of these tools struggle to accurately identify racial minorities. This is because many of the data sets used to train the algorithms use white male adults as a default.


Sexism brought the UK computing industry to its knees.


The first computer operators and programmers in the UK were women, who, during World War II, worked on code-breaking computers. Women held these roles not because there were no men available to do them, but because people considered computing akin to factory work.

“With computing work becoming aligned with power, women computer workers who possessed all of the technical skills to perform the jobs found themselves increasingly squeezed out by new hiring rubrics that favored untested men trainees with no technical skills.”

When computers became a more important part of the economy in the 1960s, the government changed its hiring practices for technical jobs. Women who possessed technical skills had to train male colleagues, who then took on managerial roles. Still, men perceived computing work as feminized, so they often completed the government training and then left to take up managerial, non-computing positions elsewhere. This led to a massive labor shortage, but the government still refused to employ or promote women for technical managerial positions. One such woman was Stephanie “Steve” Shirley.

After being passed over for promotion in the Civil Service several times, Shirley decided to start her own software company, drawing on the large pool of female technical talent, whom she employed on a flexible basis. Because the government and other big businesses were short of talent, they began to outsource computing work to companies such as Shirley’s.

Ultimately, to tackle the continued labor shortage, the government decided to invest in a solution that would require fewer workers. The government forced British computer companies to merge into a single company – International Computers Limited (ICL) – and demanded it focus production on large, technologically advanced mainframes in return for government grants and contracts. But by the time these mainframes were ready, the world had moved on to smaller and more flexible solutions, leaving ICL – and therefore the British computer industry – behind.


Technology is neither neutral nor objective, but, rather, often reinforces negative racial and gender stereotypes.


Speech technologies make many people’s lives easier. Yet those who don’t speak standard English find themselves excluded from using services that depend on these technologies, as their design reinforces accent bias. Research has found that people perceive those who speak with an accent as less intelligent, loyal and competent, and that non-native speech often leads to discrimination in housing, employment or the justice system. Language has always been a tool of imperialism; the language of the colonizer became the language of power – which the colonized had to adopt if they wanted to have a voice. Similarly, the inability of speech technologies to understand non-standard English forces non-native or accented speakers to change the way they speak.
