It shows in our use of language – the way the human effort needed for digitalisation and AI to function is rendered invisible. We say things like:
- Order food online with a few clicks, as if no chef were needed to prepare it and no courier to deliver it to the door.
- Reserve a book from the library online, as if it then immediately leaves its shelf and ends up where it can be collected.
- Ask a chatbot a question, as if a machine has thought up the answers.
Two minutes out of five hours
Julia Ravanis holds a doctorate in the history of technology and has an interdisciplinary background in the history of ideas and theoretical physics.
In her doctoral thesis, presented at Chalmers University of Technology earlier this year, she sets out to turn the usual image of computers and data processing on its head.
“There is an abstract way of thinking behind digitalisation that makes the underlying work invisible.
“Google Books, for instance, publishes digitised texts. To do that, they employ low-wage workers who are scanning pages manually,” she tells the Nordic Labour Journal.
In her thesis, Julia Ravanis describes how data processing was organised between 1955 and 1975 at the Swedish National Defence Research Institute, FOA. The keywords used were rationality, efficiency and automation, even though the work was, in practice, dominated by the manual handling of data.
These words shaped the image of the computer as an abstract system in need of logical order rather than material care, she notes.
“To make a computer work, it was essential that humans sorted, prepared and carried boxes of result lists and punch cards between departments and machines.
“It would be a catastrophe if a box were dropped and the contents spilt out onto the floor. This, in turn, led to experience-based routines, such as numbering documents.”
This manual work was not particularly valued, even though a five-hour data-processing task at FOA in this period required only two minutes of mainframe time. The remaining four hours and 58 minutes depended on human hands.
This is evident from the 1967 payroll records of a data centre that Julia Ravanis studied.
“The assistants were the lowest paid workers – the punch card operators, office messengers, dispatch guards and switchboard operators. The operations manager earned four times as much.”
Digitalisation – nothing but a success story?
Today, digitalisation also includes artificial intelligence, but that does not mean invisible labour is eliminated. On the contrary, Lisa Reutter Larsen tells the Nordic Labour Journal.
“The need continues to grow. All AI models are in some way highly dependent on human labour.”
After completing her PhD at the Norwegian University of Science and Technology (NTNU) in 2023, she was awarded a post-doctoral position at the University of Copenhagen – first at the Center for Tracking and Society, and from this year at CAISA, Denmark’s national centre for AI in society.

Lisa Reutter Larsen has worked as a researcher in both Norway and Denmark. Photo: Private.
The focus of Lisa Reutter Larsen’s research lies in the link between society and technology in public administration.
“My core interest is in the relationship between democracy, people and technology. I am a cross-disciplinary researcher.”
Today, Lisa Reutter Larsen belongs to a group of researchers working in critical data and algorithm studies – a field built on the premise that data is created by humans and is therefore not neutral.
“There is an idea that merely using data somehow eliminates human values, which is not correct. We researchers focus on the entire value chain and try to decode the choices that are being made when data is collected, managed and used.”
Being critical of digitalisation and AI is not the same as being a technophobe, argues the Norwegian researcher. It is more about understanding the complexity that new technology brings.
“Digitalisation has become so all-pervasive and is seen as synonymous with progress. But it also brings other factors that we need to understand.”
The perception of AI as a success story is, according to Lisa Reutter Larsen, created by the tech giants.
“They want us to believe AI will take care of all the cumbersome and irritating tasks, while we enjoy the good life.”
The user interfaces, often both functional and elegantly designed, help automation appear seamless, argues Lisa Reutter Larsen.
“But human input is necessary and often takes place in non-Nordic contexts, while systems are increasingly integrated into our everyday lives.
“It’s the old story of outsourcing and globalisation that contributes to this invisibility, and that invisibility is very important for the companies. Perhaps they want to create an illusion of perfect, fully automatic systems,” wonders Lisa Reutter Larsen.
Exposed micro workers
“From self-driving cars to virtual assistants, the AI industry thrives on data. This data needs to be meticulously labelled, categorised, and annotated. This requires human intelligence and labour – both of which still cannot be replaced by machines.”
That is how two representatives of the UN’s labour agency, the International Labour Organization (ILO), put it in the article The Artificial Intelligence Illusion: How invisible workers fuel the “automated” economy.
The authors warn against celebrating AI as a driving force for automation, when its success depends on what they describe as “an invisible workforce performing low-paid, precarious tasks under challenging conditions”.
This happens because tasks that must be carried out by humans are often outsourced to digital work platforms.
Those who perform the tasks are referred to in the article as “crowdworkers”, a term that reflects how complex jobs are broken down into micro-tasks – a fragmentation the authors warn against.
Examples of tasks that need human input include text prediction, object recognition, audio transcription, meeting scheduling, and the fine-tuning of language model responses, including the mitigation of bias, toxicity and disturbing content.
Beyond poor pay, the authors point to other risks arising from the AI industry’s dependence on human labour. Workers might:
- lack social benefits
- face poor working conditions
- risk losing their skills
- be hindered in developing their skills
- be routinely exposed to various forms of graphic violence and hate content
- be removed from work without a clear explanation
The authors call the hidden reality behind AI’s automation drive a “human-in-the-loop” model. The model depends on human interaction and has what they call “profound implications for workers and society”.
This human-in-the-loop model underscores the need for a more nuanced understanding of what is often assumed to be “automated”, and of AI’s actual impact on the labour market.
The authors call for genuine cooperation between humans and machines, based on visible labour carried out under decent working conditions.
This is what is needed for AI to reach its full potential while, as they write, securing a more just and sustainable future for all.
Crowdwork Code of Conduct
According to the ILO authors, some crowdwork platforms have signed a “Crowdwork Code of Conduct” to improve working conditions, and some companies offer health and well-being services for workers who deal with upsetting content.
Meanwhile, guidelines for good practice and codes of conduct are being developed, including independent initiatives such as the Fairwork AI Principles.





