
LinkedIn secretly ran experiments on millions of users for five years

LinkedIn conducted experiments on more than 20 million users over five years that, while intended to improve the way the platform works for members, could affect some people’s livelihoods, the New York Times reports.

In experiments conducted around the world from 2015 to 2019, LinkedIn randomized the proportion of weak and strong contacts suggested by its People You May Know algorithm—the company’s automated system for recommending new connections to its users. Researchers from LinkedIn, MIT, Stanford and Harvard Business School later analyzed aggregated data from the tests in a study published this month in the journal Science.
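To make that design concrete, here is a minimal sketch of how a recommender might randomize the mix of weak- and strong-tie suggestions per user. This is not LinkedIn’s actual code; the function names, variant fractions and hashing scheme are assumptions for illustration only.

```python
import hashlib
import random
from typing import List

# Hypothetical experiment arms: each variant fixes the fraction of
# weak-tie suggestions shown. These fractions are illustrative.
VARIANTS = [0.2, 0.4, 0.6, 0.8]

def assign_variant(user_id: str) -> float:
    """Deterministically assign a user to one experiment arm by hashing
    the user ID, so the same user always sees the same treatment."""
    bucket = int(hashlib.sha256(user_id.encode()).hexdigest(), 16) % len(VARIANTS)
    return VARIANTS[bucket]

def recommend(user_id: str, weak_ties: List[str],
              strong_ties: List[str], k: int = 10) -> List[str]:
    """Return k connection suggestions with the arm's weak-tie share."""
    weak_share = assign_variant(user_id)
    n_weak = round(k * weak_share)
    picks = random.sample(weak_ties, min(n_weak, len(weak_ties)))
    picks += random.sample(strong_ties, min(k - len(picks), len(strong_ties)))
    return picks
```

Hashing the user ID, rather than drawing a fresh random number on each visit, is a common way to keep each user’s treatment stable for the duration of an experiment.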

LinkedIn’s algorithmic experiments may come as a surprise to millions of people because the company didn’t notify users that the tests were underway.

Tech giants like LinkedIn, the world’s largest professional network, routinely conduct large-scale experiments in which they test different versions of app features, web designs and algorithms on different people. The long-standing practice, called A/B testing, aims to improve consumer experiences and keep them engaged, which helps companies make money through premium membership fees or advertising. Users often have no idea that companies are running tests on them. (The New York Times uses such tests to evaluate headline text and make decisions about the products and features the company publishes.)
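As a rough illustration of how such tests are typically evaluated, the sketch below compares an engagement metric between two arms with a standard two-proportion z-test. The numbers are made up, and this is a generic textbook method, not LinkedIn’s or the Times’s actual methodology.

```python
from math import sqrt

def two_proportion_z(success_a: int, n_a: int,
                     success_b: int, n_b: int) -> float:
    """z-statistic for the difference between two conversion rates,
    using the pooled standard error."""
    p_a, p_b = success_a / n_a, success_b / n_b
    pooled = (success_a + success_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Arm A: current design; arm B: candidate change (hypothetical counts).
z = two_proportion_z(success_a=4_210, n_a=100_000,
                     success_b=4_480, n_b=100_000)
print(f"z = {z:.2f}")  # |z| > 1.96 is significant at the 5% level
```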

But the changes LinkedIn made indicate how such tweaks to widely used algorithms can become potentially life-changing experiments in social engineering for many people. Experts who study the social impacts of computing say that conducting long-running, large-scale experiments on people that could affect their job prospects, in ways invisible to them, raises questions about industry transparency and research oversight.

“The findings suggest that some users had better access to job opportunities or a significant difference in access to job opportunities,” said Michael Zimmer, associate professor of computer science and director of Marquette University’s Center for Data, Ethics and Society. “These are the kinds of long-term consequences that need to be considered when we think about the ethics of engaging in this kind of big data research.”
