How much faster can computers get? UC San Diego is leading a $50M effort to find out
A group of 10 universities led by the University of California, San Diego is undertaking a $50.5 million effort to greatly improve the speed and efficiency of computers, work that could do everything from making drug discovery faster to creating better weather forecasts.
The coalition, which includes such schools as Stanford and UCLA, is focused on making advances in software and next-generation computer chips. Among other things, both are needed to move data more rapidly from memory to processors.
“Right now, it takes an average of 6.5 years and tremendous computing power to determine which pharmaceutical compounds should be tested in clinical trials – and more than 90% of the trials fail,” said Tajana Šimunić Rosing, the UCSD computer engineering professor who is leading the project.
“We plan to shrink this timeline so that drug discovery will take days rather than years, and results will be more accurate.”
The Semiconductor Research Corp., a North Carolina-based consortium that brings industry, government and universities together on major projects, will provide $35 million of the funding. The rest will come from the schools involved in the project.
UCSD was given a leadership role, in part, because it is one of the largest computer and engineering centers in the country. The campus is home to the San Diego Supercomputer Center, the Halicioglu Data Science Institute, and the Jacobs School of Engineering, which has nearly 10,000 students.
The university recently opened a $180 million research facility that has a heavy focus on chip development, with significant backing from San Diego’s Qualcomm, one of the world’s largest chipmakers. Such research got a lift from the Biden administration in August with the approval of a $52 billion chip development bill that’s intended to make companies better able to compete in the global semiconductor industry. The bill is specifically targeted at helping the U.S. compete with China.