If the amount of energy needed for computing keeps climbing, in just eight years up to 20% of the world’s power could be consumed by data centers, wireless networks, consumer electronics and a growing number of other devices.
That prediction from the company Enerdata – which pegs today’s computing energy consumption at up to 9% of the world’s power – is one reason researchers at the University of Virginia’s School of Engineering and Applied Science are working to make computing more energy efficient. That projected growth, coupled with power grids already strained by weather-related events and an economy transitioning from fossil fuels to renewables, has engineers working urgently to flatten computing’s energy-demand curve.
Members of Jon Ihlefeld’s multifunctional thin film group are doing their part. They are investigating a material system that could allow the semiconductor industry to put both memory and computation on a single chip.
“Right now we have a computer chip that does its computing activities with a little bit of memory on it,” said Ihlefeld, associate professor of materials science and engineering and electrical and computer engineering.
Every time the computer chip wants to talk to the larger memory bank, it sends a signal down the line, and that requires energy. The longer the distance, the more energy it takes. Today that distance can be several centimeters, which, by computing standards, is very far.