Researchers at the US Department of Energy’s Oak Ridge National Laboratory (ORNL) are developing software that will automatically calibrate models for simulating building energy use.

Based on recent figures, building owners in the United States spend well over $400 billion on energy. Their buildings consume 41 per cent of the nation’s total energy and are responsible for 40 per cent of its carbon dioxide emissions.

Building Energy Modeling (BEM) uses computer simulations to estimate energy use and guide the design of new buildings as well as energy improvements to existing buildings.

The Department of Energy’s flagship whole-building energy simulation tool, EnergyPlus, estimates energy usage based on weather data and thousands of input parameters related to heating, ventilation and air conditioning (HVAC) systems, water heating, lighting, weather interaction, occupancy schedules and more.

With up to 3,000 parameters used when modelling a building’s energy use, tweaking and optimizing each element can be a complex and time-consuming process.

“Currently, the biggest barrier is the cost of getting an accurate model of the pre-retrofit building because it requires hiring an expert,” said Jibonananda Sanyal, one of the lead researchers.

The “Autotune” calibration software being developed at ORNL will, it is hoped, significantly reduce the time and expertise needed to optimize building parameters for cost and energy savings.

Collaboration with universities and industry has been an integral part of the process, helping to make the approach accessible to more building engineers and owners.

The Autotune methodology uses multi-parameter optimization techniques, along with artificial intelligence agents informed by large-scale data mining, to automatically modify software inputs until simulation output matches measured data.

“Instead of having a human change the knobs, so to speak, the Autotune methodology does that for you,” Sanyal said.
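The “turning the knobs” loop Sanyal describes can be sketched as a search that perturbs parameters and keeps any change that brings simulation output closer to measurements. Everything below is illustrative: the toy simulator, the parameter names and the random-search strategy are stand-ins under assumption, not Autotune’s actual algorithms.

```python
import random

# Toy stand-in for a whole-building energy simulator: maps two hypothetical
# parameters to twelve months of energy use. Real tools such as EnergyPlus
# take thousands of inputs; this is for illustration only.
def simulate(params):
    return [30.0 / params["insulation"] + params["setpoint"] * (1 + 0.1 * month)
            for month in range(12)]

def rmse(simulated, measured):
    # Root-mean-square mismatch between simulated and measured monthly values.
    n = len(measured)
    return (sum((s - m) ** 2 for s, m in zip(simulated, measured)) / n) ** 0.5

def autotune(measured, bounds, iterations=5000, seed=0):
    """Random-search calibration: perturb one parameter at a time and keep
    any change that brings simulation output closer to the measured data."""
    rng = random.Random(seed)
    best = {name: rng.uniform(lo, hi) for name, (lo, hi) in bounds.items()}
    best_err = rmse(simulate(best), measured)
    for _ in range(iterations):
        candidate = dict(best)
        name = rng.choice(sorted(bounds))
        lo, hi = bounds[name]
        # Gaussian nudge, clamped to the parameter's physical bounds.
        candidate[name] = min(hi, max(lo, candidate[name] + rng.gauss(0, 0.1 * (hi - lo))))
        err = rmse(simulate(candidate), measured)
        if err < best_err:
            best, best_err = candidate, err
    return best, best_err

# "Measured" data generated from known ground-truth parameters, so the
# search has a recoverable answer.
truth = {"insulation": 2.5, "setpoint": 21.5}
measured = simulate(truth)
params, err = autotune(measured, {"insulation": (0.5, 5.0), "setpoint": (18.0, 26.0)})
```

In practice the simulation call is the expensive step, which is why the real project needed supercomputers to explore the parameter space at scale.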

Development of the software has required extremely powerful hardware to crunch all the possible scenarios, including ORNL’s 27-petaflop Titan supercomputer and the National Institute for Computational Sciences’ Nautilus system. These machines have performed millions of simulations for a range of standard building types, generating hundreds of terabytes of data.

Using the supercomputer, the team has been able to run annual energy simulations for more than half a million buildings in less than one hour.

The process has been kept focused, however, with building technology experts brought on board to identify the 150 most important parameters, ensuring a high degree of accuracy while reducing unnecessary computational load.
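One common way to narrow thousands of parameters down to the most influential few is a sensitivity screen. The sketch below uses a generic one-at-a-time perturbation approach with an invented toy simulator; the article does not say how ORNL’s experts ranked parameters, so this is an assumed technique, not their method.

```python
# One-at-a-time sensitivity screen: rank parameters by how much perturbing
# each one (here by 5 per cent) changes the simulated annual energy use.
def sensitivity_ranking(simulate, baseline, step=0.05):
    base = simulate(baseline)
    scores = {}
    for name, value in baseline.items():
        perturbed = dict(baseline)
        perturbed[name] = value * (1 + step)
        scores[name] = abs(simulate(perturbed) - base)
    # Most influential parameters first.
    return sorted(scores, key=scores.get, reverse=True)

# Toy simulator: annual energy use as a function of three made-up parameters.
def toy_simulate(p):
    return 100 / p["insulation"] + 5 * p["setpoint"] + 0.1 * p["lighting"]

baseline = {"insulation": 2.0, "setpoint": 21.0, "lighting": 10.0}
ranking = sensitivity_ranking(toy_simulate, baseline)
```

Only the top-ranked parameters would then be passed to the calibration stage, cutting the search space the way the 150-parameter shortlist does in the real project.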

The software then uses machine learning algorithms to “learn” successful versus unsuccessful paths to optimization. This means that when similar building input parameters are introduced later, results are optimized more quickly because paths that failed before are ignored.
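One simple way such learning can pay off is a warm-start cache: remember which calibrated parameters worked for previously tuned buildings and reuse them as starting points for similar ones. The class, feature names and nearest-neighbour lookup below are hypothetical illustrations, not Autotune’s actual machine learning.

```python
import math

class CalibrationMemory:
    """Remembers (building features -> best calibrated parameters) pairs and
    suggests a starting point for a new, similar building."""

    def __init__(self):
        self.records = []  # list of (features dict, best-parameters dict)

    def remember(self, features, best_params):
        self.records.append((dict(features), dict(best_params)))

    def suggest_start(self, features, default):
        # With no history, fall back to a default starting point.
        if not self.records:
            return dict(default)
        # Euclidean distance over the shared feature keys.
        def distance(record):
            past = record[0]
            return math.sqrt(sum((past[k] - features[k]) ** 2 for k in features))
        return dict(min(self.records, key=distance)[1])

memory = CalibrationMemory()
memory.remember({"floor_area": 1000, "windows": 20}, {"insulation": 2.4})
memory.remember({"floor_area": 5000, "windows": 80}, {"insulation": 3.1})
start = memory.suggest_start({"floor_area": 1200, "windows": 25},
                             default={"insulation": 1.0})
```

Starting the search near a previously successful solution means fewer expensive simulations before the model matches the measured data.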

Autotune’s fully automated process has routinely calibrated models to an error below one per cent on all building types tested.
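The article does not say which error metric is used, but calibration error in building energy modelling is commonly reported as the coefficient of variation of the root-mean-square error (CVRMSE), the metric used in ASHRAE Guideline 14. A minimal sketch, with made-up monthly data:

```python
def cvrmse(simulated, measured):
    """Coefficient of variation of RMSE: the RMS mismatch between simulated
    and measured values, expressed as a percentage of the measured mean."""
    n = len(measured)
    mean = sum(measured) / n
    rmse = (sum((s - m) ** 2 for s, m in zip(simulated, measured)) / n) ** 0.5
    return 100 * rmse / mean

# Illustrative monthly energy use (arbitrary units), measured vs simulated.
measured = [120, 110, 95, 80, 70, 65, 90, 100, 85, 75, 95, 115]
simulated = [118, 112, 96, 79, 71, 64, 91, 99, 86, 74, 94, 116]
error_pct = cvrmse(simulated, measured)
```

Here the simulated series tracks the measured one to within about 1.3 per cent, which is the scale of agreement the Autotune team reports.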

With such precision, an overnight Autotune run is far less costly than the expert time it would take to calibrate a model manually.

Although the computing power needed to generate the data and train the software was ‘super’, the public can rest assured that they will not require such equipment to benefit from the new Autotune solution.

“We commonly run the software on a laptop,” Sanyal said.

The team is currently making Autotune capabilities available to a limited set of beta testers through a web service and anticipates making it publicly available in September 2015.