Deep Leaf-Bootstrapping Generative Adversarial Network for Structural Image Data Augmentation

Employing Deep Learning (DL) technologies to solve Civil Engineering problems has emerged as an active research topic in recent years. However, due to the lack of labeled data, it is difficult to obtain accurate results with DL. One commonly used method to tackle this issue is affine transformation for dataset augmentation, but it can only generate new images that are highly correlated with the original ones. Moreover, unlike images of common natural objects, the distribution of structural images is much more complex and mixed. To address these challenges, a Generative Adversarial Network (GAN) is a feasible choice. We adopt a specific generative model, the Deep Convolutional Generative Adversarial Network (DCGAN), and propose a Leaf-Bootstrapping algorithm to improve its performance. To quantitatively evaluate the quality of the synthetic images generated by the DCGAN and complement human evaluation, two metrics are proposed: the Self-Inception Score (SIS) and Generalization Ability (GA). Finally, computer experiments with the proposed methods were conducted for two scenarios, scene-level identification and damage-state checking, and the results demonstrate the effectiveness and robustness of the proposed methods.
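For readers unfamiliar with the DCGAN architecture mentioned above, a minimal PyTorch sketch of a generator/discriminator pair is shown below. This is for illustration only; the layer widths, 64x64 output resolution, and 100-dimensional latent vector are assumptions and do not necessarily match the configuration used in the paper or this repository.

```python
# Minimal DCGAN sketch (illustrative only; hyperparameters are assumptions,
# not taken from the paper or this repository).
import torch
import torch.nn as nn

class Generator(nn.Module):
    """Maps a latent vector z to a 64x64 RGB image via transposed convolutions."""
    def __init__(self, z_dim=100, feat=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.ConvTranspose2d(z_dim, feat * 8, 4, 1, 0, bias=False),     # 4x4
            nn.BatchNorm2d(feat * 8), nn.ReLU(True),
            nn.ConvTranspose2d(feat * 8, feat * 4, 4, 2, 1, bias=False),  # 8x8
            nn.BatchNorm2d(feat * 4), nn.ReLU(True),
            nn.ConvTranspose2d(feat * 4, feat * 2, 4, 2, 1, bias=False),  # 16x16
            nn.BatchNorm2d(feat * 2), nn.ReLU(True),
            nn.ConvTranspose2d(feat * 2, feat, 4, 2, 1, bias=False),      # 32x32
            nn.BatchNorm2d(feat), nn.ReLU(True),
            nn.ConvTranspose2d(feat, 3, 4, 2, 1, bias=False),             # 64x64
            nn.Tanh(),
        )

    def forward(self, z):
        return self.net(z)

class Discriminator(nn.Module):
    """Scores a 64x64 RGB image as real or synthetic via strided convolutions."""
    def __init__(self, feat=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, feat, 4, 2, 1, bias=False),                      # 32x32
            nn.LeakyReLU(0.2, inplace=True),
            nn.Conv2d(feat, feat * 2, 4, 2, 1, bias=False),               # 16x16
            nn.BatchNorm2d(feat * 2), nn.LeakyReLU(0.2, inplace=True),
            nn.Conv2d(feat * 2, feat * 4, 4, 2, 1, bias=False),           # 8x8
            nn.BatchNorm2d(feat * 4), nn.LeakyReLU(0.2, inplace=True),
            nn.Conv2d(feat * 4, feat * 8, 4, 2, 1, bias=False),           # 4x4
            nn.BatchNorm2d(feat * 8), nn.LeakyReLU(0.2, inplace=True),
            nn.Conv2d(feat * 8, 1, 4, 1, 0, bias=False),                  # 1x1
            nn.Sigmoid(),
        )

    def forward(self, x):
        return self.net(x).view(-1)

# Example: generate a batch of synthetic images from random noise and score them.
if __name__ == "__main__":
    G, D = Generator(), Discriminator()
    z = torch.randn(8, 100, 1, 1)
    fake = G(z)        # shape: (8, 3, 64, 64)
    scores = D(fake)   # shape: (8,), probability each image is judged "real"
    print(fake.shape, scores.shape)
```

In the paper's setting, the generator and discriminator would be trained adversarially on the structural image dataset, with the Leaf-Bootstrapping algorithm controlling how the training data are partitioned and resampled; that procedure is not reproduced in this sketch.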

Reference

Gao Y, Kong B, Mosalam KM. Deep leaf-bootstrapping generative adversarial network for structural image data augmentation. Comput Aided Civ Inf. 2019;1–19. https://doi.org/10.1111/mice.12458
