Each big step of progress in computing — from the mainframe to the personal computer to the internet to the smartphone — has opened new opportunities for more people to invent on the digital frontier.
But there is growing concern that the trend is being reversed at tech’s new leading edge, artificial intelligence.
Computer scientists say A.I. research is becoming increasingly expensive, requiring complex calculations done in giant data centers. That leaves fewer people with easy access to the computing firepower needed to develop the technology behind futuristic products like self-driving cars or digital assistants that can see, talk and reason.
The risk, they say, is that pioneering artificial intelligence research will become a field of haves and have-nots. And the haves will be mainly a few big tech companies like Google, Microsoft, Amazon and Facebook, each of which spends billions a year building out its data centers.
In the have-not camp, they warn, will be university labs, which have traditionally been a wellspring of innovations that eventually power new products and services.
“The huge computing resources these companies have pose a threat — the universities cannot compete,” said Craig Knoblock, executive director of the Information Sciences Institute, a research lab at the University of Southern California.
The research scientists’ warnings come amid rising concern about the power of the big tech companies. Most of the focus has been on the current generation of technology — search, online advertising, social media and e-commerce. But the scientists are worried about a barrier to exploring the technological future, when that exploration requires staggering amounts of computing.