Restart: the most powerful vote we have

With each cultural shift and technological advancement, we are constantly being forced to re-evaluate our ethics and policies, traverse new possibilities, and deal with unintended consequences, says Chris Ategeka

It’s not your imagination: you’re correct in thinking that every week there’s a new scandal about ethics and the tech industry. Even as the tech industry is trying to establish concrete practices and institutions around ethics, hard lessons are being learned about the wide gap between the practices of ‘doing ethics’ and what people think of as ‘ethical.’ Here are some examples.

Democratised technology

Gene editing: Is it ethical to eliminate species? Is it ethical to design a baby with features and qualities to your liking?

3D printing (additive manufacturing): This is the process whereby a physical object is constructed from a 3D computer model by a machine that extrudes material to build the object, usually layer by layer. For small-batch production, these machines are extremely affordable relative to the manufacturing equipment we have relied on until now. Flexibility was designed into their architecture from the beginning: a 3D printer enables someone to make almost any design a reality. Today, 3D-printable items already range from the mundane, like plastic toys, to the life-changing, like affordable housing. The first airplane with 3D-printed parts took flight in 2014, and the world’s first 3D-printed heart was announced in April 2019.
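
As a rough illustration of the ‘layer by layer’ idea, here is a toy slicing sketch in Python. The shape, layer height, and numbers are invented for illustration only; real slicers work on full 3D meshes and output machine instructions rather than printed text.

```python
# Illustrative only: a toy "slicer" that cuts a simple solid (a cone)
# into horizontal layers, mimicking how a 3D printer builds an object
# layer by layer. Real slicers work on triangle meshes and emit G-code.

def cone_radius_at(height_mm: float, total_height_mm: float, base_radius_mm: float) -> float:
    """Radius of a cone's cross-section at a given height."""
    return base_radius_mm * (1 - height_mm / total_height_mm)

def slice_cone(total_height_mm: float, base_radius_mm: float, layer_height_mm: float):
    """Yield (layer_number, z_height, cross_section_radius) for each printed layer."""
    z = 0.0
    layer = 1
    while z < total_height_mm:
        yield layer, z, cone_radius_at(z, total_height_mm, base_radius_mm)
        z += layer_height_mm
        layer += 1

if __name__ == "__main__":
    # Hypothetical part: a 20 mm tall cone with a 10 mm base, printed in 0.2 mm layers.
    for layer, z, r in slice_cone(total_height_mm=20.0, base_radius_mm=10.0, layer_height_mm=0.2):
        print(f"Layer {layer:3d}: z = {z:5.1f} mm, extrude a circle of radius {r:4.1f} mm")
```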

Simply put, 3D printing is democratising the production of anything. On its face, this is amazing. Imagine eliminating the organ-transplant waiting list. But the superpowers that allow us to bypass controls that have existed in supply chains for generations are also potentially dangerous if left unchecked.

Is it ethical to 3D print a human (not possible yet, though we are on our way) rather than going through the traditional birth process? Is it ethical to 3D print a gun? The model for the Liberator, a 3D-printable plastic gun, was downloaded more than 100,000 times before a federal judge blocked the posting of 3D gun blueprints online, according to an article in USA Today.

Tech surveillance as a revenue model

In 2021, personal information on more than 533 million Facebook users across 106 countries was exposed in a data leak, according to an article in The Verge.

Another incident was the 2018 Cambridge Analytica data scandal. Like most large technology conglomerates, Facebook collects a lot of personal data and uses it to determine which ads a user should be targeted with. Its algorithms are so advanced that they can predict almost everything about you: what recipe you want to try out in your new air fryer, what you should get your dad for his birthday, even whether or not you’re in a relationship, including when you haven’t shared that information publicly. Although it’s no secret that Facebook collects a lot of personal data, its most serious ethical lapses have involved how it handles this data and how (or whether) other large companies are able to access it, as discussed in a 2020 article in the MIT Sloan Management Review.
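
To make the idea of inferring undisclosed traits from behavioural data concrete, here is a deliberately simplified sketch in Python. Every signal, weight, and threshold below is invented for illustration; real ad-targeting models are learned from enormous datasets and are vastly more complex than this hand-written toy.

```python
# Illustrative only: a hand-written, toy version of how a platform might
# infer an attribute a user never stated (e.g. "in a relationship") from
# behavioural signals. The features, weights, and threshold are invented;
# real systems learn such weights from data rather than hard-coding them.

TOY_WEIGHTS = {
    "tags_same_person_often": 2.0,            # repeatedly photographed with one person
    "browses_gift_pages": 0.8,
    "checks_in_at_restaurants_for_two": 1.2,
    "joined_singles_groups": -2.5,            # evidence pointing the other way
}

def relationship_score(user_signals: dict) -> float:
    """Sum weighted behavioural signals into a single score."""
    return sum(TOY_WEIGHTS[name] * value for name, value in user_signals.items())

def predict_in_relationship(user_signals: dict, threshold: float = 1.5) -> bool:
    """Predict the undisclosed attribute if the score clears a threshold."""
    return relationship_score(user_signals) >= threshold

if __name__ == "__main__":
    # Hypothetical user who never declared a relationship status.
    signals = {
        "tags_same_person_often": 1,
        "browses_gift_pages": 1,
        "checks_in_at_restaurants_for_two": 1,
        "joined_singles_groups": 0,
    }
    print("Predicted 'in a relationship':", predict_in_relationship(signals))
```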

YouTube’s recommendation algorithm promotes conspiracy-theory videos as a way to boost ad revenue, misinforming users in harmful ways, according to an article in the Independent.
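
The dynamic can be sketched with a toy ranking function: if a recommender scores videos purely by predicted watch time (a stand-in for ad revenue), whatever holds attention longest rises to the top, regardless of accuracy. The titles and numbers below are invented, and YouTube’s actual system is far more elaborate.

```python
# Illustrative only: a toy recommender that ranks videos purely by
# predicted watch time (a proxy for ad revenue). If conspiratorial
# content happens to hold attention longest, this objective promotes it.
# All titles and numbers are invented.

from dataclasses import dataclass

@dataclass
class Video:
    title: str
    predicted_watch_minutes: float  # what the platform's model expects a user to watch

def recommend(videos: list, top_n: int = 3) -> list:
    """Rank purely by predicted engagement, with no penalty for misinformation."""
    return sorted(videos, key=lambda v: v.predicted_watch_minutes, reverse=True)[:top_n]

if __name__ == "__main__":
    catalogue = [
        Video("Local news round-up", 2.1),
        Video("How vaccines are tested", 3.4),
        Video("THE TRUTH THEY DON'T WANT YOU TO SEE", 11.8),  # sensational, high watch time
        Video("Cooking with leftovers", 4.0),
    ]
    for v in recommend(catalogue):
        print(f"{v.predicted_watch_minutes:5.1f} min  {v.title}")
```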

A wide variety of ethical issues remain with smart wearable glasses. Experts have identified concerns around privacy, safety, justice, changes in human agency, accountability, responsibility, social interaction, power, and ideology.

The future of ethical technologies

The reality is that most tech companies practice market fundamentalism, a term (often used pejoratively by critics) for the strong belief that unregulated, laissez-faire free-market capitalism can solve most economic and social problems. Although tech companies don’t choose profit over social good in every instance, the organisational resources necessary for morality to win out have to be justified in market-friendly terms. When push comes to shove, profit usually wins.

In addition, most tech companies practice technological solutionism: the idea that all problems have tractable technical fixes. That belief has been reinforced by the rewards the industry has reaped for producing technology it believes does solve problems. Even when the negative effects are glaringly obvious, companies justify them with ‘it’s all in service of a greater good.’

Another reality is that most technology companies would like to build predictable processes and outcomes that serve the bottom line, constantly weighing how much ‘ethics’ adds to the cost of doing business. They need a great deal of external pressure before they respond to ethical considerations.

The third reality is that until recently, the education system was busy cranking out programmers to build technology companies but did not spend enough time on ethical considerations. Today, there is some momentum in universities around the world to develop courses that bridge technology and ethics. At Harvard University, for example, courses are tackling questions such as: Are software developers morally obligated to design for inclusion? Should social media companies suppress the spread of fake news on their platforms? Should search engines be transparent about how they rank results? Should we think about electronic privacy as a right?

Establishing an ethics curriculum in schools, or ethics policies in companies, is critical for creating ethical technology. The contents of such a policy should be specific to the values, goals, and culture of the institution or company and the community it serves. One size does not fit all.

Are universal ethical principles in technology possible? The answer is no. Nor should universality be the goal. Ethics are culturally dependent; thus, expecting them to all be the same is setting yourself up for failure.

As a leader and a human being, think hard about the ethical implications of what you bring into this world. What are you building? Who plans to use it? What can it potentially be used for—intended and unintended? What are some of the worst-case scenarios if bad actors get their hands on it? What fail-safes can you put into place to mitigate or expose that? Will the systems you work for or create be used to hurt, control, or profile others?

If you were born in another country, would you feel differently about your contribution to this system? What effects on the planet will your project have? Is your system susceptible to bias?

With each cultural shift and technological advancement, we are constantly being forced to re-evaluate our ethics and policies, traverse new possibilities, and deal with unintended consequences. Knowing too well that our world is interconnected and interdependent, let’s strive together for a better world and for each other.

There is a lot that individuals can do—locally, nationally, internationally—to mitigate the unintended consequences of technology proactively and reactively. In order to get there, we need a reset. A reboot. A restart.

This is an edited extract from The Unintended Consequences of Technology: Solutions, Breakthroughs and the Restart We Need by Chris Ategeka (published by Wiley, 2021)

Chris Ategeka is an engineer, entrepreneur, and philanthropist. He is the founder of the Center for the Unintended Consequences of Technology (UCOT), a company that focuses on finding solutions to the challenges at the intersection of technology and humanity. He was named to Forbes magazine’s 30 Under 30 in 2014, is an Echoing Green Fellow and a TED Fellow, and was recently recognised as a Young Global Leader by the World Economic Forum. To learn more, visit helloucot.com/ or christopherategeka.com/, or follow @chrisategeka.
