
The Ivory Tower Can’t Keep Ignoring Tech

Photo credit: John Taggart/Bloomberg

These days, big data, artificial intelligence and the tech platforms that put them to work have huge influence and power. Algorithms choose the information we see when we go online, the jobs we get, the colleges to which we’re admitted and the credit cards and insurance we are issued. It goes without saying that when computers are making decisions, a lot can go wrong.

Our lawmakers desperately need this explained to them in an unbiased way so they can appropriately regulate, and tech companies need to be held accountable for their influence over all elements of our lives. But academics have been asleep at the wheel, leaving the responsibility for this education to well-paid lobbyists and employees who’ve abandoned the academy.

That means our main source of information on the downside of bad technology — often after something’s gone disastrously awry, such as when we learned that fake news dominated our social media feeds before last year’s presidential election, threatening our democracy — is the media. But this coverage often misses everyday issues and tends to be far too credulous when it does exist. Much of what should concern us is more nuanced and small scale — and much less understood — than what we see in the headlines. Moreover, we shouldn’t have to depend on journalism to do the tedious, serious work of understanding the problems with algorithms any more than we depend on it to pursue the latest questions in sociology or environmental science.

We need academia to step up to fill in the gaps in our collective understanding about the new role of technology in shaping our lives. We need robust research on hiring algorithms that seem to filter out people with mental health disorders, sentencing algorithms that fail twice as often for black defendants as for white defendants, statistically flawed public teacher assessments or oppressive scheduling algorithms. And we need research to ensure that the same mistakes aren’t made again and again. It’s absolutely within the abilities of academic research to study such examples and to push against the most obvious statistical, ethical or constitutional failures and dedicate serious intellectual energy to finding solutions. And whereas professional technologists working at private companies are not in a position to critique their own work, academics theoretically enjoy much more freedom of inquiry.

To be fair, there are real obstacles. Academics largely don’t have access to the mostly private, sensitive personal data that tech companies collect; indeed, even when they study data-driven subjects, they work with data and methods that typically predict more abstract things, like disease or economic trends, than human behavior, so they’re naïve about the effects such modeling choices can have. The academics who do get close to the big companies in terms of technique are quickly plucked out of academia to work for them, with much higher salaries to boot. That means professors working in computer science and robotics departments — or law schools — often find themselves in situations in which positing any skeptical message about technology could present a professional conflict of interest.

The many data science institutes around the country, which have created lucrative master’s programs to train data scientists, are more focused on trying to get a piece of the big data pie — in the form of collaborations and jobs for their graduates — than they are on asking how the pie should be made. We won’t find any help there. Indeed, while West Coast schools like Stanford and the University of California, Berkeley, are renowned for creating factories that churn out the future engineers and data scientists of Silicon Valley, there are very few coveted permanent, tenure-track jobs in the country devoted to algorithmic accountability.


By Cathy O’Neil

Source: https://www.nytimes.com/2017/11/14/opinion/academia-tech-algorithms.html

