TY - GEN
T1 - Semantic Relation Detection between Construction Entities to Support Safe Human-Robot Collaboration in Construction
AU - Kim, Daeho
AU - Goyal, Ankit
AU - Newell, Alejandro
AU - Lee, Sang Hyun
AU - Deng, Jia
AU - Kamat, Vineet R.
N1 - Funding Information:
The work presented in this paper was supported financially by a National Science Foundation Award (No. IIS-1734266, ‘Scene Understanding and Predictive Monitoring for Safe Human-Robot Collaboration in Unstructured and Dynamic Construction Environment’). Any opinions, findings, and conclusions or recommendations expressed in this paper are those of the authors and do not necessarily reflect the views of the National Science Foundation.
Publisher Copyright:
© 2019 American Society of Civil Engineers.
PY - 2019
Y1 - 2019
N2 - Construction robots have drawn increased attention as a potential means of improving construction safety and productivity. However, ensuring safe human-robot collaboration in dynamic and unstructured construction workspaces remains challenging. On construction sites, multiple entities collaborate dynamically with one another, and the situational context between them evolves continually. Construction robots must therefore be equipped to visually understand a scene's context (i.e., the semantic relations among surrounding entities) so that they can collaborate safely with humans, much as the human vision system does. Toward this end, this study builds a unique deep neural network architecture and develops a construction-specialized model by experimenting with multiple fine-tuning scenarios. This study also evaluates the model's performance on data from real construction operations to examine its potential for real-world application. The results showed promising performance for the tuned model: recall@5 on the training and validation datasets reached 92% and 67%, respectively. The proposed method, which provides construction co-robots with holistic scene understanding, is expected to promote safer human-robot collaboration in construction.
AB - Construction robots have drawn increased attention as a potential means of improving construction safety and productivity. However, ensuring safe human-robot collaboration in dynamic and unstructured construction workspaces remains challenging. On construction sites, multiple entities collaborate dynamically with one another, and the situational context between them evolves continually. Construction robots must therefore be equipped to visually understand a scene's context (i.e., the semantic relations among surrounding entities) so that they can collaborate safely with humans, much as the human vision system does. Toward this end, this study builds a unique deep neural network architecture and develops a construction-specialized model by experimenting with multiple fine-tuning scenarios. This study also evaluates the model's performance on data from real construction operations to examine its potential for real-world application. The results showed promising performance for the tuned model: recall@5 on the training and validation datasets reached 92% and 67%, respectively. The proposed method, which provides construction co-robots with holistic scene understanding, is expected to promote safer human-robot collaboration in construction.
UR - http://www.scopus.com/inward/record.url?scp=85090860782&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85090860782&partnerID=8YFLogxK
M3 - Conference contribution
AN - SCOPUS:85090860782
T3 - Computing in Civil Engineering 2019: Data, Sensing, and Analytics - Selected Papers from the ASCE International Conference on Computing in Civil Engineering 2019
SP - 265
EP - 272
BT - Computing in Civil Engineering 2019
A2 - Cho, Yong K.
A2 - Leite, Fernanda
A2 - Behzadan, Amir
A2 - Wang, Chao
PB - American Society of Civil Engineers (ASCE)
T2 - ASCE International Conference on Computing in Civil Engineering 2019: Data, Sensing, and Analytics, i3CE 2019
Y2 - 17 June 2019 through 19 June 2019
ER -