Deep-learning neural-network architectures and methods: Using component-based models in building-design energy prediction

Research output: Contribution to journal › Article › Research › peer review

Authors

  • Sundaravelpandian Singaravel
  • Johan Suykens
  • Philipp Florian Geyer

External Research Organisations

  • KU Leuven

Details

Original language: English
Pages (from-to): 81-90
Number of pages: 10
Journal: Advanced engineering informatics
Volume: 38
Early online date: 18 Jun 2018
Publication status: Published - Oct 2018
Externally published: Yes

Abstract

Increasing sustainability requirements make evaluating different design options for identifying energy-efficient designs ever more important. These requirements demand simulation models that are not only accurate but also fast. Machine Learning (ML) enables effective mimicry of Building Performance Simulation (BPS) while generating results much faster than BPS. Component-Based Machine Learning (CBML) enhances the capabilities of a monolithic ML model. Extending the monolithic ML approach, the paper presents deep-learning architectures and component development methods, and evaluates their suitability for design space exploration in building design. Results indicate that deep learning improves model performance over simple artificial neural network models. Methods such as transfer learning and multi-task learning make the component development process more efficient. Testing the deep-learning model on 201 new design cases indicates that its cooling energy prediction (R²: 0.983) is similar to BPS, while its heating energy prediction errors (R²: 0.848) are higher than those of BPS. The higher heating energy prediction error can be resolved by collecting heating data with better design space sampling methods that cover the heating demand distribution effectively. Given that the accuracy of the deep-learning model for heating predictions can be increased, the major advantage of deep-learning models over BPS is their high computation speed. BPS required 1145 s to simulate 201 design cases, whereas the deep-learning model delivers similar results in 0.9 s. This high computation speed makes deep-learning models suitable for design space exploration.
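
To make the multi-task idea concrete, the sketch below shows a shared-trunk network with separate heating and cooling regression heads, trained jointly on design cases. It is a minimal illustration in Keras, not the architecture or data from the paper: the input features, layer sizes, training settings, and placeholder data are assumptions.

# Illustrative multi-task sketch (assumed, not the paper's model): a shared
# trunk with two regression heads predicting heating and cooling energy.
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

n_features = 12  # assumed number of design parameters per case

inputs = keras.Input(shape=(n_features,), name="design_parameters")

# Shared representation learned jointly for both tasks (multi-task learning).
x = layers.Dense(64, activation="relu")(inputs)
x = layers.Dense(64, activation="relu")(x)

# Task-specific heads for the two energy outputs.
heating = layers.Dense(16, activation="relu")(x)
heating = layers.Dense(1, name="heating_energy")(heating)
cooling = layers.Dense(16, activation="relu")(x)
cooling = layers.Dense(1, name="cooling_energy")(cooling)

model = keras.Model(inputs=inputs, outputs=[heating, cooling])
model.compile(optimizer="adam",
              loss={"heating_energy": "mse", "cooling_energy": "mse"})

# Placeholder arrays standing in for BPS-generated design cases and targets.
X = np.random.rand(1000, n_features)
y_heat = np.random.rand(1000, 1)
y_cool = np.random.rand(1000, 1)
model.fit(X, {"heating_energy": y_heat, "cooling_energy": y_cool},
          epochs=10, batch_size=32, verbose=0)

Because both heads share the trunk, training data for either output also shapes the representation used by the other, which is one way multi-task learning can make component development more efficient, as the abstract suggests.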

Keywords

    Building performance simulation, LSTM, Multi-task learning, Performance gap, Sustainability, Transfer learning

Cite this

Deep-learning neural-network architectures and methods: Using component-based models in building-design energy prediction. / Singaravel, Sundaravelpandian; Suykens, Johan; Geyer, Philipp Florian.
In: Advanced engineering informatics, Vol. 38, 10.2018, p. 81-90.

Research output: Contribution to journal › Article › Research › peer review

Singaravel S, Suykens J, Geyer PF. Deep-learning neural-network architectures and methods: Using component-based models in building-design energy prediction. Advanced engineering informatics. 2018 Oct;38:81-90. Epub 2018 Jun 18. doi: 10.1016/j.aei.2018.06.004
Singaravel, Sundaravelpandian; Suykens, Johan; Geyer, Philipp Florian. / Deep-learning neural-network architectures and methods: Using component-based models in building-design energy prediction. In: Advanced engineering informatics. 2018; Vol. 38. pp. 81-90.
BibTeX
@article{2378e070992a485bae4ba93dcfc6818a,
title = "Deep-learning neural-network architectures and methods: Using component-based models in building-design energy prediction",
abstract = "Increasing sustainability requirements make evaluating different design options for identifying energy-efficient designs ever more important. These requirements demand simulation models that are not only accurate but also fast. Machine Learning (ML) enables effective mimicry of Building Performance Simulation (BPS) while generating results much faster than BPS. Component-Based Machine Learning (CBML) enhances the capabilities of a monolithic ML model. Extending the monolithic ML approach, the paper presents deep-learning architectures and component development methods, and evaluates their suitability for design space exploration in building design. Results indicate that deep learning improves model performance over simple artificial neural network models. Methods such as transfer learning and multi-task learning make the component development process more efficient. Testing the deep-learning model on 201 new design cases indicates that its cooling energy prediction (R²: 0.983) is similar to BPS, while its heating energy prediction errors (R²: 0.848) are higher than those of BPS. The higher heating energy prediction error can be resolved by collecting heating data with better design space sampling methods that cover the heating demand distribution effectively. Given that the accuracy of the deep-learning model for heating predictions can be increased, the major advantage of deep-learning models over BPS is their high computation speed. BPS required 1145 s to simulate 201 design cases, whereas the deep-learning model delivers similar results in 0.9 s. This high computation speed makes deep-learning models suitable for design space exploration.",
keywords = "Building performance simulation, LSTM, Multi-task learning, Performance gap, Sustainability, Transfer learning",
author = "Sundaravelpandian Singaravel and Johan Suykens and Geyer, {Philipp Florian}",
note = "Funding Information: The research is funded by STG-14-00346 at KUL and by Deutsche Forschungsgemeinschaft (DFG) in the Researcher Unit 2363 “Evaluation of building design variants in early phases using adaptive levels of development” in Subproject 4 “System-based Simulation of Energy Flows.” The authors acknowledge the support of ERC AdG A-DATADRIVE-B (290923), KUL: GOA/10/09 MaNet, CoE PFV/10/002 (OPTEC), BIL12/11T; FWO: G.0377.12, G.088114N, G0A4917N; IUAPP7/19 DYSCO. ",
year = "2018",
month = oct,
doi = "10.1016/j.aei.2018.06.004",
language = "English",
volume = "38",
pages = "81--90",
journal = "Advanced engineering informatics",
issn = "1474-0346",
publisher = "Elsevier Ltd.",

}

RIS

TY - JOUR

T1 - Deep-learning neural-network architectures and methods

T2 - Using component-based models in building-design energy prediction

AU - Singaravel, Sundaravelpandian

AU - Suykens, Johan

AU - Geyer, Philipp Florian

N1 - Funding Information: The research is funded by STG-14-00346 at KUL and by Deutsche Forschungsgemeinschaft (DFG) in the Researcher Unit 2363 “Evaluation of building design variants in early phases using adaptive levels of development” in Subproject 4 “System-based Simulation of Energy Flows.” The authors acknowledge the support of ERC AdG A-DATADRIVE-B (290923), KUL: GOA/10/09 MaNet, CoE PFV/10/002 (OPTEC), BIL12/11T; FWO: G.0377.12, G.088114N, G0A4917N; IUAPP7/19 DYSCO.

PY - 2018/10

Y1 - 2018/10

N2 - Increasing sustainability requirements make evaluating different design options for identifying energy-efficient designs ever more important. These requirements demand simulation models that are not only accurate but also fast. Machine Learning (ML) enables effective mimicry of Building Performance Simulation (BPS) while generating results much faster than BPS. Component-Based Machine Learning (CBML) enhances the capabilities of a monolithic ML model. Extending the monolithic ML approach, the paper presents deep-learning architectures and component development methods, and evaluates their suitability for design space exploration in building design. Results indicate that deep learning improves model performance over simple artificial neural network models. Methods such as transfer learning and multi-task learning make the component development process more efficient. Testing the deep-learning model on 201 new design cases indicates that its cooling energy prediction (R²: 0.983) is similar to BPS, while its heating energy prediction errors (R²: 0.848) are higher than those of BPS. The higher heating energy prediction error can be resolved by collecting heating data with better design space sampling methods that cover the heating demand distribution effectively. Given that the accuracy of the deep-learning model for heating predictions can be increased, the major advantage of deep-learning models over BPS is their high computation speed. BPS required 1145 s to simulate 201 design cases, whereas the deep-learning model delivers similar results in 0.9 s. This high computation speed makes deep-learning models suitable for design space exploration.

AB - Increasing sustainability requirements make evaluating different design options for identifying energy-efficient designs ever more important. These requirements demand simulation models that are not only accurate but also fast. Machine Learning (ML) enables effective mimicry of Building Performance Simulation (BPS) while generating results much faster than BPS. Component-Based Machine Learning (CBML) enhances the capabilities of a monolithic ML model. Extending the monolithic ML approach, the paper presents deep-learning architectures and component development methods, and evaluates their suitability for design space exploration in building design. Results indicate that deep learning improves model performance over simple artificial neural network models. Methods such as transfer learning and multi-task learning make the component development process more efficient. Testing the deep-learning model on 201 new design cases indicates that its cooling energy prediction (R²: 0.983) is similar to BPS, while its heating energy prediction errors (R²: 0.848) are higher than those of BPS. The higher heating energy prediction error can be resolved by collecting heating data with better design space sampling methods that cover the heating demand distribution effectively. Given that the accuracy of the deep-learning model for heating predictions can be increased, the major advantage of deep-learning models over BPS is their high computation speed. BPS required 1145 s to simulate 201 design cases, whereas the deep-learning model delivers similar results in 0.9 s. This high computation speed makes deep-learning models suitable for design space exploration.

KW - Building performance simulation

KW - LSTM

KW - Multi-task learning

KW - Performance gap

KW - Sustainability

KW - Transfer learning

UR - http://www.scopus.com/inward/record.url?scp=85048729300&partnerID=8YFLogxK

U2 - 10.1016/j.aei.2018.06.004

DO - 10.1016/j.aei.2018.06.004

M3 - Article

AN - SCOPUS:85048729300

VL - 38

SP - 81

EP - 90

JO - Advanced engineering informatics

JF - Advanced engineering informatics

SN - 1474-0346

ER -