Google I/O: The search giant is expanding its quantum computing ambitions with a new research center
Google has started building a bigger quantum computing research center that will eventually employ hundreds of people, the latest sign that the race to turn these radical new machines into practical tools is intensifying. The center is in Santa Barbara, California, where Google’s first quantum computing lab already employs dozens of researchers and engineers.
Operations at Google’s new Quantum AI campus, which has already opened with some initial researchers, will give Google a bigger role in building its own machines, the company said Tuesday. In-house manufacturing, combined with a growing number of quantum computers, should accelerate development.
A top job at Google’s new quantum computing center is making the key data processing elements, called qubits, more reliable, said Jeff Dean, senior vice president of Google Research and Health, who helped develop some of Google’s most important technologies, including search, advertising and AI. Qubits are easily perturbed by outside forces that derail calculations, but error correction technology should let quantum computers run long enough to become genuinely useful.
“Hopefully the timeline is that in the next year or two we will have a demonstration of qubit error correction,” Dean told CNET in a briefing ahead of the conference.
Quantum computers could bring enormous power to bear on complex problems, such as developing new drugs or materials, that stump classical machines. Quantum machines, however, rely on the strange physical laws governing ultrasmall particles. Many tech giants and startups are pursuing quantum computer development, but their efforts so far remain expensive research projects that haven’t proven their potential.
“We hope to someday create an error-corrected quantum computer,” Sundar Pichai, chief executive of Google parent company Alphabet, said in the Google I/O keynote speech.
Error correction combines multiple error-prone physical qubits into a single, more dependable virtual qubit called a logical qubit. In Google’s approach, it would take nearly 1,000 physical qubits to produce a single logical qubit that can reliably hold its data. Google then expects about 1,000 logical qubits will be needed to do actual computing work. The resulting total of a million physical qubits is a far cry from Google’s current quantum computers, which have only dozens.
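The arithmetic behind those figures is straightforward; here is a back-of-the-envelope sketch in Python. The two 1,000 figures come from the article, while the current-machine size is an assumed stand-in for “only dozens” of qubits:

```python
# Back-of-the-envelope check on Google's error-correction math.
# The two 1,000 figures come from the article; current_physical is an
# assumed stand-in for a machine with "only dozens" of qubits.
physical_per_logical = 1_000   # physical qubits per logical qubit
logical_needed = 1_000         # logical qubits needed for useful work
current_physical = 54          # assumption: a Sycamore-class machine

total_physical = physical_per_logical * logical_needed
print(f"Physical qubits required: {total_physical:,}")             # 1,000,000
print(f"Scale-up vs. today: ~{total_physical // current_physical:,}x")
```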
Google spotlighted its quantum computing work at Google I/O, a conference aimed chiefly at the programmers who build on the search giant’s Android phone software, Chrome web browser and other products. The conference gives Google a chance to show off the scale of its infrastructure, burnish its reputation for innovation and generally geek out. Google also used the show to tout new AI technology that brings computers somewhat closer to human intelligence and to share details of its custom hardware for accelerating AI.
As one of Google’s top engineers, Dean is a major force in the computing industry, a rare example of a programmer profiled in The New Yorker magazine. He was instrumental in developing key technologies such as MapReduce, which helped propel Google to the top of the search engine business, and TensorFlow, software that underpins much of the company’s artificial intelligence work. He also now faces cultural and political challenges, most notably the high-profile departure of AI researcher Timnit Gebru.
Google’s TPU AI accelerators
At I/O, Dean also revealed new details of Google’s AI acceleration hardware, the custom processors it calls tensor processing units. Dean described how the company links 4,096 of its fourth-generation TPUs into a single pod that is 10 times more powerful than its predecessor built from TPU v3 chips.
“A single pod is an incredibly large amount of computational power,” Dean said. “Many of them are deployed in different data centers now, and we expect dozens of them to be deployed by the end of the year.” Google uses TPU pods chiefly for AI training, the computationally intense process that produces the AI models that later show up on our phones, smart speakers and other devices.
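For a rough sense of that scale, here is an illustrative calculation. The chip count comes from the article, while the per-chip throughput is an assumed round number, not a figure quoted here:

```python
# Rough scale estimate for a TPU v4 pod. chips_per_pod is from the
# article; tflops_per_chip is an order-of-magnitude assumption.
chips_per_pod = 4_096
tflops_per_chip = 250          # assumption, not an article figure

pod_exaflops = chips_per_pod * tflops_per_chip / 1_000_000
print(f"Pod throughput: ~{pod_exaflops:.2f} exaflops")   # ~1.02 exaflops
```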
Previous pod designs used a fixed collection of TPUs, but with TPU v4, Google connects the processors with fast fiber-optic links so different modules can be flexibly grouped together. That means modules that are down for maintenance can simply be routed around, Dean said.
Google’s TPU v4 pods are reserved for its own use for now, but they will become available to the company’s cloud computing customers this year, Pichai said.
That strategy of tolerating failures has been crucial to Google’s success. While some computing companies buy expensive, highly reliable equipment, Google has relied on cheaper hardware since its early days, designing its infrastructure so it keeps working even when individual elements fail.
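The underlying idea can be shown in a few lines: when assembling a compute group, select only healthy modules and skip the rest. This is a minimal, hypothetical sketch; the module names and health flags are invented for illustration and say nothing about Google’s actual scheduler:

```python
# Minimal sketch of the "route around failures" idea: pick only healthy
# modules when assembling a compute group. Names and the health flag
# are hypothetical, not Google's actual scheduler.
from dataclasses import dataclass

@dataclass
class Module:
    name: str
    healthy: bool

def assemble_group(modules, needed):
    """Return `needed` healthy modules, skipping any that are down."""
    healthy = [m for m in modules if m.healthy]
    if len(healthy) < needed:
        raise RuntimeError("not enough healthy modules for this group")
    return healthy[:needed]

# Module 2 is down for maintenance; the group forms around it.
modules = [Module(f"tpu-module-{i}", healthy=(i != 2)) for i in range(6)]
group = assemble_group(modules, needed=4)
print([m.name for m in group])   # tpu-module-2 is skipped
```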
Google is also trying to improve its AI software with a technique called multimodality. Today, separate AI systems are trained to recognize text, speech, images and video; Google wants a broader AI that handles all of those inputs at once. Such a system would, for example, recognize a leopard whether it sees a picture of one or hears someone speak the word, Dean said.
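A common way to frame that goal is a shared embedding space: separate encoders map each input type to vectors, and the same concept lands in the same neighborhood regardless of modality. The toy sketch below is purely illustrative, with made-up vectors standing in for trained encoders; it is not a description of Google’s model:

```python
# Toy illustration of a shared multimodal embedding space: the same
# concept lands nearby whether it arrives as an image or as speech.
# The vectors are made up, standing in for trained encoder outputs.
import numpy as np

def cosine(a, b):
    # Cosine similarity between two embedding vectors.
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

embed = {
    ("image", "leopard"): np.array([0.90, 0.10, 0.00]),
    ("speech", "leopard"): np.array([0.85, 0.15, 0.05]),
    ("image", "bicycle"): np.array([0.00, 0.20, 0.95]),
}

# Query with the spoken word and retrieve the closest image embedding.
query = embed[("speech", "leopard")]
images = [k for k in embed if k[0] == "image"]
best = max(images, key=lambda k: cosine(embed[k], query))
print(best)   # ('image', 'leopard')
```

Because the spoken “leopard” and the leopard photo sit close together in the shared space, a nearest-neighbor lookup across modalities retrieves the right concept.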