Details
| Field | Value |
| --- | --- |
| Original language | English |
| Pages (from-to) | 81-90 |
| Number of pages | 10 |
| Journal | Advanced engineering informatics |
| Volume | 38 |
| Early online date | 18 Jun 2018 |
| Publication status | Published - Oct 2018 |
| Externally published | Yes |
Abstract

Increasing sustainability requirements make evaluating different design options for identifying energy-efficient design ever more important. These requirements demand simulation models that are not only accurate but also fast. Machine Learning (ML) enables effective mimicry of Building Performance Simulation (BPS) while generating results much faster than BPS. Component-Based Machine Learning (CBML) enhances the capabilities of the monolithic ML model. Extending the monolithic ML approach, the paper presents deep-learning architectures and component development methods, and evaluates their suitability for design space exploration in building design. Results indicate that deep learning increases the performance of models over simple artificial neural network models. Methods such as transfer learning and Multi-Task Learning make the component development process more efficient. Testing the deep-learning model on 201 new design cases indicates that its cooling energy prediction (R2: 0.983) is similar to BPS, while errors for heating energy predictions (R2: 0.848) are higher than BPS. The higher heating energy prediction error can be resolved by collecting heating data using better design space sampling methods that cover the heating demand distribution effectively. Given that the accuracy of the deep-learning model for heating predictions can be increased, the major advantage of deep-learning models over BPS is their high computation speed. BPS required 1145 s to simulate 201 design cases; using the deep-learning model, similar results can be obtained in 0.9 s. High computation speed makes deep-learning models suitable for design space exploration.
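To make the abstract's combination of transfer learning and multi-task learning concrete, the following is a minimal, purely illustrative sketch in Python/Keras. It is not the authors' implementation or architecture: the feature count, layer sizes, synthetic data, and the names `component`, `encoder`, and `mtl_model` are assumptions chosen only to show how a pre-trained component could be frozen (transfer learning) and extended with separate heating and cooling heads (multi-task learning).

```python
# Illustrative sketch only, NOT the method from the paper. Assumed layer sizes,
# feature count, and placeholder data; shows the general pattern of reusing a
# pre-trained component (transfer learning) with two output heads (multi-task).
import numpy as np
from tensorflow.keras import layers, Model

n_features = 8  # hypothetical design parameters (geometry, glazing, set-points, ...)

# 1) Pre-train a small component model on source data (random placeholder data here).
inp = layers.Input(shape=(n_features,))
hidden = layers.Dense(32, activation="relu")(inp)
hidden = layers.Dense(32, activation="relu")(hidden)
out = layers.Dense(1)(hidden)
component = Model(inp, out)
component.compile(optimizer="adam", loss="mse")
X_src = np.random.rand(256, n_features).astype("float32")
y_src = np.random.rand(256, 1).astype("float32")
component.fit(X_src, y_src, epochs=2, verbose=0)

# 2) Transfer learning: freeze the learned hidden representation ...
encoder = Model(inp, hidden)   # shares the already-trained layers
encoder.trainable = False

# ... and add task-specific heads (multi-task learning for heating and cooling).
new_inp = layers.Input(shape=(n_features,))
shared = encoder(new_inp)
heating = layers.Dense(1, name="heating")(shared)
cooling = layers.Dense(1, name="cooling")(shared)
mtl_model = Model(new_inp, [heating, cooling])
mtl_model.compile(optimizer="adam", loss="mse")

# 3) Fine-tune only the new heads on (placeholder) target design cases.
X_tgt = np.random.rand(128, n_features).astype("float32")
y_heat = np.random.rand(128, 1).astype("float32")
y_cool = np.random.rand(128, 1).astype("float32")
mtl_model.fit(X_tgt, [y_heat, y_cool], epochs=2, verbose=0)
```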
Keywords
- Building performance simulation
- LSTM
- Multi-task learning
- Performance gap
- Sustainability
- Transfer learning
ASJC Scopus subject areas
- Computer Science (all)
- Information Systems
- Artificial Intelligence
Cite this
In: Advanced engineering informatics, Vol. 38, 10.2018, p. 81-90.
Research output: Contribution to journal › Article › Research › peer-review
TY - JOUR
T1 - Deep-learning neural-network architectures and methods
T2 - Using component-based models in building-design energy prediction
AU - Singaravel, Sundaravelpandian
AU - Suykens, Johan
AU - Geyer, Philipp Florian
N1 - Funding Information: The research is funded by STG-14-00346 at KUL and by Deutsche Forschungsgemeinschaft (DFG) in the Researcher Unit 2363 “Evaluation of building design variants in early phases using adaptive levels of development” in Subproject 4 “System-based Simulation of Energy Flows.” The authors acknowledge the support of ERC AdG A-DATADRIVE-B (290923), KUL: GOA/10/09 MaNet, CoE PFV/10/002 (OPTEC), BIL12/11T; FWO: G.0377.12, G.088114N, G0A4917N; IUAPP7/19 DYSCO.
PY - 2018/10
Y1 - 2018/10
N2 - Increasing sustainability requirements make evaluating different design options for identifying energy-efficient design ever more important. These requirements demand simulation models that are not only accurate but also fast. Machine Learning (ML) enables effective mimicry of Building Performance Simulation (BPS) while generating results much faster than BPS. Component-Based Machine Learning (CBML) enhances the capabilities of the monolithic ML model. Extending the monolithic ML approach, the paper presents deep-learning architectures and component development methods, and evaluates their suitability for design space exploration in building design. Results indicate that deep learning increases the performance of models over simple artificial neural network models. Methods such as transfer learning and Multi-Task Learning make the component development process more efficient. Testing the deep-learning model on 201 new design cases indicates that its cooling energy prediction (R2: 0.983) is similar to BPS, while errors for heating energy predictions (R2: 0.848) are higher than BPS. The higher heating energy prediction error can be resolved by collecting heating data using better design space sampling methods that cover the heating demand distribution effectively. Given that the accuracy of the deep-learning model for heating predictions can be increased, the major advantage of deep-learning models over BPS is their high computation speed. BPS required 1145 s to simulate 201 design cases; using the deep-learning model, similar results can be obtained in 0.9 s. High computation speed makes deep-learning models suitable for design space exploration.
AB - Increasing sustainability requirements make evaluating different design options for identifying energy-efficient design ever more important. These requirements demand simulation models that are not only accurate but also fast. Machine Learning (ML) enables effective mimicry of Building Performance Simulation (BPS) while generating results much faster than BPS. Component-Based Machine Learning (CBML) enhances the capabilities of the monolithic ML model. Extending the monolithic ML approach, the paper presents deep-learning architectures and component development methods, and evaluates their suitability for design space exploration in building design. Results indicate that deep learning increases the performance of models over simple artificial neural network models. Methods such as transfer learning and Multi-Task Learning make the component development process more efficient. Testing the deep-learning model on 201 new design cases indicates that its cooling energy prediction (R2: 0.983) is similar to BPS, while errors for heating energy predictions (R2: 0.848) are higher than BPS. The higher heating energy prediction error can be resolved by collecting heating data using better design space sampling methods that cover the heating demand distribution effectively. Given that the accuracy of the deep-learning model for heating predictions can be increased, the major advantage of deep-learning models over BPS is their high computation speed. BPS required 1145 s to simulate 201 design cases; using the deep-learning model, similar results can be obtained in 0.9 s. High computation speed makes deep-learning models suitable for design space exploration.
KW - Building performance simulation
KW - LSTM
KW - Multi-task learning
KW - Performance gap
KW - Sustainability
KW - Transfer learning
UR - http://www.scopus.com/inward/record.url?scp=85048729300&partnerID=8YFLogxK
U2 - 10.1016/j.aei.2018.06.004
DO - 10.1016/j.aei.2018.06.004
M3 - Article
AN - SCOPUS:85048729300
VL - 38
SP - 81
EP - 90
JO - Advanced engineering informatics
JF - Advanced engineering informatics
SN - 1474-0346
ER -