Does technology help or hurt employment? | MIT News


This is part 2 of a two-part MIT News feature examining new job creation in the U.S. since 1940, based on new research from Ford Professor of Economics David Autor. Part 1 is available here.

Ever since the Luddites destroyed mechanized looms, it has been obvious that new technologies can wipe out jobs. But technical innovations also create new jobs: Consider a computer programmer, or someone installing solar panels on a roof.

Overall, does technology replace more jobs than it creates? What is the net balance between these two things? Until now, that has not been measured. But a new research project led by MIT economist David Autor has developed an answer, at least for U.S. history since 1940.

The study uses new methods to examine how many jobs have been lost to machine automation, and how many have been generated through “augmentation,” in which technology creates new tasks. On net, the study finds, technology has replaced more U.S. jobs than it has generated, particularly since 1980.

“There does appear to be a faster rate of automation, and a slower rate of augmentation, in the last four decades, from 1980 to the present, than in the four decades prior,” says Autor, co-author of a newly published paper detailing the results.

However, that finding is only one of the study’s advances. The researchers have also developed an entirely new method for studying the issue, based on an analysis of tens of thousands of U.S. census job categories in relation to a comprehensive look at the text of U.S. patents over the last century. That has allowed them, for the first time, to quantify the effects of technology on both job loss and job creation.

Previously, scholars had largely just been able to quantify job losses produced by new technologies, not job gains.

“I feel like a paleontologist who was looking for dinosaur bones that we thought must have existed, but had not been able to find until now,” Autor says. “I think this research breaks ground on things that we suspected were true, but we did not have direct proof of them before this study.”

The paper, “New Frontiers: The Origins and Content of New Work, 1940-2018,” appears in the Quarterly Journal of Economics. The co-authors are Autor, the Ford Professor of Economics; Caroline Chin, a PhD student in economics at MIT; Anna Salomons, a professor in the School of Economics at Utrecht University; and Bryan Seegmiller SM ’20, PhD ’22, an assistant professor at the Kellogg School of Management at Northwestern University.

Automation versus augmentation

The study finds that overall, about 60 percent of jobs in the U.S. represent new types of work, which have been created since 1940. A century ago, that computer programmer may have been working on a farm.

To determine this, Autor and his colleagues combed through about 35,000 job categories listed in the U.S. Census Bureau reports, tracking how they emerge over time. They also used natural language processing tools to analyze the text of every U.S. patent filed since 1920. The research examined how words were “embedded” in the census and patent documents to unearth related passages of text. That allowed them to determine links between new technologies and their effects on employment.
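The linking idea can be illustrated with a toy sketch. The study itself uses trained word embeddings over the full census and patent corpora; the version below substitutes a simple bag-of-words vector and cosine similarity, and the job description and patent excerpts are invented for illustration. The point is only the mechanism: a patent whose text sits close to an occupation's task content gets flagged as related to that occupation.

```python
import math
from collections import Counter

def embed(text, vocab):
    """Toy bag-of-words 'embedding': term counts over a shared vocabulary.
    (The study uses trained word embeddings; this is a stand-in.)"""
    counts = Counter(text.lower().split())
    return [counts[w] for w in vocab]

def cosine(u, v):
    """Cosine similarity between two vectors; 0.0 if either is all-zero."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv) if nu and nv else 0.0

# Hypothetical texts: one census occupation description, two patent excerpts.
job = "operate machine loom to weave textile fabric"
patent_related = "automatic loom machine that weaves textile fabric without operator"
patent_unrelated = "method for storing data on magnetic tape"

vocab = sorted(set((job + " " + patent_related + " " + patent_unrelated).lower().split()))
job_vec = embed(job, vocab)

# The related patent shares task vocabulary with the job, so it scores higher;
# at scale, such scores link technologies to the occupations they touch.
score_related = cosine(job_vec, embed(patent_related, vocab))
score_unrelated = cosine(job_vec, embed(patent_unrelated, vocab))
assert score_related > score_unrelated
```

In the actual research, a further step distinguishes whether a linked patent automates an occupation's existing tasks or augments it with new ones; the sketch stops at the matching step.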

“You can think of automation as a machine that takes a job’s inputs and does it for the worker,” Autor explains. “We think of augmentation as a technology that increases the variety of things that people can do, the quality of things people can do, or their productivity.”

From about 1940 through 1980, for instance, jobs like elevator operator and typesetter tended to get automated. But at the same time, more workers filled roles such as shipping and receiving clerks, buyers and department heads, and civil and aeronautical engineers, where technology created a need for more employees. 

From 1980 through 2018, the ranks of cabinetmakers and machinists, among others, have been thinned by automation, while, for instance, industrial engineers, and operations and systems researchers and analysts, have enjoyed growth.

Ultimately, the research suggests that the negative effects of automation on employment were more than twice as great in the 1980-2018 period as in the 1940-1980 period. There was a more modest, and positive, change in the effect of augmentation on employment in 1980-2018, as compared to 1940-1980.

“There’s no law these things have to be one-for-one balanced, although there’s been no period where we haven’t also created new work,” Autor observes.

What will AI do?

The research also uncovers many nuances in this process, since automation and augmentation often occur within the same industries. It is not just that technology decimates the ranks of farmers while creating air traffic controllers. Within the same large manufacturing firm, for example, there may be fewer machinists but more systems analysts.

Relatedly, over the last 40 years, technological trends have exacerbated a gap in wages in the U.S., with highly educated professionals being more likely to work in new fields, which themselves are split between high-paying and lower-income jobs.

“The new work is bifurcated,” Autor says. “As old work has been erased in the middle, new work has grown on either side.”

As the research also shows, technology is not the only thing driving new work. Demographic shifts also lie behind growth in numerous sectors of the service industries. Intriguingly, the new research suggests that large-scale consumer demand also drives technological innovation. Inventions are not just supplied by bright people thinking outside the box; they also arise in response to clear societal needs.

The 80 years of data also suggest that future pathways for innovation, and the employment implications, are hard to forecast. Consider the possible uses of AI in workplaces.

“AI is really different,” Autor says. “It may substitute some high-skill expertise but may complement decision-making tasks. I think we’re in an era where we have this new tool and we don’t know what it’s good for. New technologies have strengths and weaknesses, and it takes a while to figure them out. GPS was invented for military purposes, and it took decades for it to be in smartphones.”

He adds: “We’re hoping our research approach gives us the ability to say more about that going forward.”

As Autor recognizes, there is room for the research team’s methods to be further refined. For now, he believes the research opens up new ground for study.

“The missing link was documenting and quantifying how much technology augments people’s jobs,” Autor says. “All the prior measures just showed automation and its effects on displacing workers. We were amazed we could identify, classify, and quantify augmentation. So that itself, to me, is pretty foundational.”

Support for the research was provided, in part, by The Carnegie Corporation; Google; Instituut Gak; the MIT Work of the Future Task Force; Schmidt Futures; the Smith Richardson Foundation; and the Washington Center for Equitable Growth.

