Human-centered Steel Bridge Inspection enabled by Augmented Reality and Artificial Intelligence

General Information
Study Number: TPF-5(535)
Former Study Number:
Lead Organization: Kansas Department of Transportation
Solicitation Number: 1597
Partners: CA, KS, NC, TX
Status: Cleared by FHWA
Est. Completion Date:
Contract/Other Number:
Last Updated: Jul 16, 2024
Contract End Date:
Financial Summary
Contract Amount:
Suggested Contribution:
Total Commitments Received: $600,000.00
100% SP&R Approval: Approved
Contact Information
Lead Study Contact(s): David Behzadpour
David.Behzadpour@ks.gov
Phone: 785-291-3847
FHWA Technical Liaison(s): Hoda Azari
hoda.azari@dot.gov
Phone: 202-493-3064
Study Champion(s): Mark Hurt
Mark.Hurt@ks.gov
Phone: 785-296-8905
Commitments by Organizations
Organization Year Commitments Technical Contact Name Funding Contact Name Contact Number Email Address
California Department of Transportation 2024 $0.00 Shawn Hart Sang Le (916)701-3998 sang.le@dot.ca.gov
California Department of Transportation 2025 $120,000.00 Shawn Hart Sang Le (916)701-3998 sang.le@dot.ca.gov
Kansas Department of Transportation 2024 $0.00 Mark Hurt David Behzadpour 785-291-3847 David.Behzadpour@ks.gov
Kansas Department of Transportation 2025 $80,000.00 Mark Hurt David Behzadpour 785-291-3847 David.Behzadpour@ks.gov
Kansas Department of Transportation 2026 $80,000.00 Mark Hurt David Behzadpour 785-291-3847 David.Behzadpour@ks.gov
Kansas Department of Transportation 2027 $80,000.00 Mark Hurt David Behzadpour 785-291-3847 David.Behzadpour@ks.gov
North Carolina Department of Transportation 2024 $40,000.00 David Snoke Curtis Bradley 919-707-6661 cbradley8@ncdot.gov
North Carolina Department of Transportation 2025 $40,000.00 David Snoke Curtis Bradley 919-707-6661 cbradley8@ncdot.gov
North Carolina Department of Transportation 2026 $40,000.00 David Snoke Curtis Bradley 919-707-6661 cbradley8@ncdot.gov
Texas Department of Transportation 2024 $0.00 Justin Wilson Ned Mattila 512-416-4727 ned.mattila@txdot.gov
Texas Department of Transportation 2025 $40,000.00 Justin Wilson Ned Mattila 512-416-4727 ned.mattila@txdot.gov
Texas Department of Transportation 2026 $40,000.00 Justin Wilson Ned Mattila 512-416-4727 ned.mattila@txdot.gov
Texas Department of Transportation 2027 $40,000.00 Justin Wilson Ned Mattila 512-416-4727 ned.mattila@txdot.gov

Study Description

State DOTs currently rely on trained inspectors to visually inspect bridge components for structural deterioration and damage, an approach that can be limited in accuracy, speed, repeatability, and reliability. Computer vision (CV), on the other hand, can see what human eyes cannot, and artificial intelligence (AI) techniques such as deep learning have shown a tremendous ability to conceptualize and generalize. By integrating CV and Augmented Reality (AR), a recent NCHRP Highway IDEA project (Li et al., 2022) completed by this project team successfully demonstrated how a human-centered AR environment and automated CV algorithms can empower bridge inspectors to perform more accurate and efficient field inspections of steel bridges for fatigue cracks.

As illustrated in Figure 1, an inspector wearing an AR headset (Microsoft HoloLens 2) examines the steel bridge and records a short video of the target structural surface through the headset. The video is automatically uploaded to a server, where a computer vision algorithm detects and analyzes surface motion through feature points (pink dots in the upper-right panel of the figure). These feature points are then projected in near real time in front of the inspector's eyes as holograms through the AR headset, and the inspector can interact with the hologram through a virtual menu to examine the results under different crack-detection threshold values, enabling human-in-the-loop decision-making.
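Purely as an illustration of the video-based workflow described above (not the project's actual implementation), the sketch below tracks surface feature points across frames with OpenCV's Shi-Tomasi corner detection and Lucas-Kanade optical flow and flags points whose motion exceeds an inspector-adjustable threshold; the parameters, file name, and thresholding rule are assumptions made for this example.

```python
# Illustrative sketch only: track surface feature points in a short inspection
# video and flag locations whose motion exceeds a user-adjustable threshold.
# Parameters and the simple thresholding rule are assumptions, not the
# project's actual crack-detection algorithm.
import cv2
import numpy as np

def track_surface_motion(video_path, threshold_px=1.0):
    cap = cv2.VideoCapture(video_path)
    ok, first = cap.read()
    if not ok:
        raise IOError(f"Cannot read {video_path}")
    prev_gray = cv2.cvtColor(first, cv2.COLOR_BGR2GRAY)

    # Detect Shi-Tomasi corners as the surface feature points ("pink dots")
    p0 = cv2.goodFeaturesToTrack(prev_gray, maxCorners=500,
                                 qualityLevel=0.01, minDistance=7)
    start = p0.copy()

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        # Lucas-Kanade optical flow tracks each point into the new frame
        p1, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, gray, p0, None,
                                                 winSize=(21, 21), maxLevel=3)
        keep = status.ravel() == 1
        p0, start = p1[keep], start[keep]
        prev_gray = gray
    cap.release()

    # Total displacement of each surviving point over the video
    disp = np.linalg.norm((p0 - start).reshape(-1, 2), axis=1)
    # Points moving more than the threshold are candidate damage indicators;
    # in the AR tool the inspector would tune this threshold interactively.
    return start.reshape(-1, 2), disp, disp > threshold_px
```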

The NCHRP Highway IDEA project successfully demonstrated the concept of human-centered bridge inspection by integrating CV and AR on an AR headset as the hardware platform. However, further development is needed before the tool can be adopted in practical bridge inspections. In addition, the idea of human-centered bridge inspection would have a broader impact if realized on a wider range of mobile platforms, such as tablet devices. The goal of this proposed pooled fund study is to develop a full-fledged AR-based bridge inspection tool that leverages CV and AI to support field detection, quantification, and documentation of various types of damage and deterioration in steel bridges.

Objectives

The main objective of this proposed research is to provide state DOTs with practical tools for human-centered steel bridge inspection with real-time defect (e.g., fatigue crack and corrosion) detection, documentation, tracking, and decision-making. The proposed research will not only bridge the gaps identified in the IDEA project, but also expand the existing capability by developing AI algorithms for crack and corrosion detection. In addition to AR headsets, the project will develop AR-based inspection capability for tablet devices. A tablet can be used to perform AR-based inspection directly, in a similar way to the AR headset. It can also leverage Unmanned Aerial Vehicles (UAVs) for remote image and video acquisition during inspections, enabling bridge inspections from a distance in a human-centered manner, as illustrated in Figure 2.

Scope of Work

The scope of work includes three main tasks: development of CV and AI algorithms for steel fatigue crack and corrosion detection and quantification (Task 1); design and development of AR-based software to facilitate human-centered damage detection, visualization, documentation, tracking, and decision-making (Task 2); and laboratory and field implementation, testing, and evaluation (Task 3).

Task 1: CV and AI algorithms for crack and corrosion inspection

Two types of algorithms will be included in the AR inspection tool. The first is based on video analysis and will improve upon the NCHRP IDEA product in accuracy and sensitivity. In addition, this research will include image-based deep learning algorithms to enable classification, detection, and segmentation of cracks and corrosion, as illustrated in Figure 3 for the case of crack identification, using images taken by the AR headset, tablet, or UAV. Focus will be placed on minimizing the complexity of the deep learning model to reduce computation, with the goal of enabling real-time image processing and damage inference for practical inspections. With the two methods available, the inspector can first use the image-based deep learning method to identify and segment regions where cracks or corrosion may exist, then apply the video-based algorithm to further examine the crack region for a refined result.
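As a rough illustration of the kind of lightweight, image-based segmentation pass Task 1 targets (not the project's actual network), the sketch below runs a small MobileNetV3-based LR-ASPP model from torchvision on a single inspection image; the backbone choice, the three-class label set, and the weights path are assumptions made for this example.

```python
# Illustrative sketch only: lightweight per-pixel segmentation of crack and
# corrosion regions in one inspection image. Architecture, labels, and weights
# file are assumptions, not the project's actual model.
import torch
from torchvision.models.segmentation import lraspp_mobilenet_v3_large
from torchvision import transforms
from PIL import Image

CLASSES = ("background", "crack", "corrosion")  # assumed label set

def load_model(weights_path=None, device="cpu"):
    model = lraspp_mobilenet_v3_large(weights=None, num_classes=len(CLASSES))
    if weights_path:  # hypothetical fine-tuned weights
        model.load_state_dict(torch.load(weights_path, map_location=device))
    return model.eval().to(device)

@torch.no_grad()
def segment(model, image_path, device="cpu"):
    prep = transforms.Compose([
        transforms.Resize((512, 512)),
        transforms.ToTensor(),
        transforms.Normalize(mean=[0.485, 0.456, 0.406],
                             std=[0.229, 0.224, 0.225]),
    ])
    x = prep(Image.open(image_path).convert("RGB")).unsqueeze(0).to(device)
    logits = model(x)["out"]                 # shape: (1, num_classes, H, W)
    mask = logits.argmax(dim=1).squeeze(0)   # per-pixel class indices
    return mask.cpu().numpy()                # 0=background, 1=crack, 2=corrosion
```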

Task 2: AR-based software for human-centered bridge inspection

This task will develop an AR-based software environment and user interface to enable human-in-the-loop decision-making during field inspections. A process will be developed to convert the damage detection results into holograms and deploy them in the 3D real-world environment with accurate anchorage onto the structural surface. A cloud database will be created to store inspection results; this capability is key to documentation, allowing bridge damage to be compared and tracked in space and time. Building upon the user interface developed in the NCHRP Highway IDEA project, a more comprehensive virtual menu will be created to provide a smooth and user-friendly interface for human-centered bridge inspection. In addition, the AR headset software will be adapted to enable AR-based inspection on a tablet device. When a UAV is used to facilitate bridge inspection from a distance, the tablet will receive the damage detection results so the inspector can perform human-centered documentation and decision-making, as illustrated in Figure 2.
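As a purely illustrative sketch of the documentation piece described above, each detection result could be packaged as a time-stamped, spatially anchored inspection record and posted to a cloud endpoint for later comparison and tracking. The record fields, the InspectionRecord name, and the endpoint URL below are hypothetical, not the project's actual schema or API.

```python
# Illustrative sketch only: package a detection result as a time-stamped,
# spatially anchored inspection record and upload it for tracking over time.
# All field names and the endpoint URL are hypothetical.
import json
import datetime
import urllib.request
from dataclasses import dataclass, asdict, field
from typing import Optional

@dataclass
class InspectionRecord:
    bridge_id: str                       # e.g., structure/inventory number
    component: str                       # e.g., "girder 3, web gap at floor beam 7"
    defect_type: str                     # "fatigue_crack" or "corrosion"
    anchor_xyz: tuple                    # hologram anchor in the AR world frame (m)
    crack_length_mm: Optional[float] = None
    corrosion_area_mm2: Optional[float] = None
    image_uri: Optional[str] = None      # link to the source image or video
    inspector: str = ""
    timestamp: str = field(
        default_factory=lambda: datetime.datetime.utcnow().isoformat())

def upload_record(record: InspectionRecord,
                  endpoint="https://example.org/api/inspections"):  # hypothetical
    body = json.dumps(asdict(record)).encode("utf-8")
    req = urllib.request.Request(endpoint, data=body,
                                 headers={"Content-Type": "application/json"},
                                 method="POST")
    with urllib.request.urlopen(req) as resp:
        return resp.status  # e.g., 201 on successful creation
```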

Task 3: Laboratory and field testing

The developed AR software and AI algorithms will be tested extensively in both laboratory and field settings. A large-scale girder bridge subassemblage with realistic fatigue and corrosion damage will be established in the structural testing laboratory at the University of Kansas for testing the developed AR inspection tools. In addition, several bridges in the inventory of KDOT and other participating member states will be selected for field testing and validation. The team will work closely with the KDOT inspection crew to ensure the tools are relevant and address practical challenges.

This project will deliver user-friendly AR software packages for participating member states, empowered by AI algorithms for automated damage detection, that bridge inspectors can readily adopt to perform AI- and AR-assisted bridge inspections using both AR headsets and tablet devices. In addition, quarterly reports and a final report will be generated in MS Word format. The team will hold quarterly online progress meetings with participating parties during the project and plans to hold one in-person mid-project participant meeting in Year 3. The team will also disseminate the findings and results from this research through journal and conference publications.

Comments


·       Funding requested: $40,000/year per participating state for 3 years.

·       5 states (total budget: $600,000)

Please see figures in the enclosed complete proposal in the attachment:

Figure 1: Human-centered fatigue crack inspection tool developed under NCHRP IDEA 223

Figure 2: Human-centered bridge inspection enabled by integrating AI, AR, and UAV

Figure 3: Classification, detection, and segmentation of cracks using deep learning

Documents Attached
Title File/Link Document Category Document Type Privacy Document Date
SPR Waiver Memo Approval SPR Waiver Memo #1597.pdf TPF Study Documentation Solicitation Public 2024-07-16
Acceptance Letter acceptance letter for TPF-5(535).pdf TPF Study Documentation Work Plan/Scope/Charter Public 2024-07-08
Proposal Proposal.pdf TPF Study Documentation Work Plan/Scope/Charter Public 2024-07-08
Laboratory demonstration using a large-scale steel bridge girder specimen TPF 1597 Lab Demo.mp4 Other Other Public 2024-04-02
Human-centered Steel Bridge Inspection enabled by Augmented Reality and Artificial Intelligence Proposal.pdf TPF Study Documentation Work Plan/Scope/Charter Public 2023-04-10
