Deep convolutional neural network processing of aerial stereo imagery to monitor vulnerable zones near power lines
Main Authors: , , ,
Format: Article
Published: SPIE, 2018
Online Access: https://www.scopus.com/inward/record.uri?eid=2-s2.0-85042224373&doi=10.1117%2f1.JRS.12.014001&partnerID=40&md5=9dc68a460bc04e8a22c161588a2c656c ; http://eprints.utp.edu.my/21912/
Institution: Universiti Teknologi Petronas
Summary: The monitoring of vegetation near high-voltage transmission power lines and poles is tedious. Blackouts present a major challenge to power distribution companies and often occur due to tree growth in hilly and rural areas. Numerous methods exist for monitoring hazardous overgrowth, but they are expensive and time-consuming. Accurate estimation of tree and vegetation heights near power poles can prevent the disruption of power transmission in vulnerable zones. This paper presents a cost-effective approach based on a convolutional neural network (CNN) algorithm to compute the height (depth maps) of objects proximal to power poles and transmission lines. The proposed CNN extracts and classifies features by feeding convolutional and pooling outputs into fully connected layers that capture prominent features from stereo image patches. Unmanned aerial vehicle or satellite stereo image datasets can thus provide a feasible and cost-effective means of identifying threat levels based on height and distance estimates of hazardous vegetation and other objects. Results were compared with existing disparity map estimation techniques, such as graph cut, dynamic programming, belief propagation, and area-based methods. The proposed method achieved an accuracy rate of 90%. © 2018 Society of Photo-Optical Instrumentation Engineers (SPIE).
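The abstract describes the architecture only at a high level: convolution and pooling stages whose outputs feed fully connected layers that score stereo image patches. As an illustration only, below is a minimal sketch of one common way such a patch-matching CNN is built, a Siamese two-tower network in PyTorch. The class name PatchMatchCNN, the 11 x 11 patch size, and all layer sizes are assumptions for this sketch, not the paper's actual design.

```python
import torch
import torch.nn as nn


class PatchMatchCNN(nn.Module):
    """Hypothetical sketch of a stereo patch-matching CNN.

    Scores how well a left-image patch matches a candidate
    right-image patch; scanning candidates along the epipolar
    line yields a per-pixel disparity, hence a depth map.
    """

    def __init__(self):
        super().__init__()
        # Shared convolution + pooling tower applied to each patch.
        # For an 11x11 input: 11 -> 9 -> 7 (3x3 convs), then 7 -> 3 (pool).
        self.features = nn.Sequential(
            nn.Conv2d(1, 32, kernel_size=3), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=3), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Flatten(),  # 64 channels * 3 * 3 = 576 features per patch
        )
        # Fully connected layers decide whether the two patches match.
        self.classifier = nn.Sequential(
            nn.Linear(2 * 576, 128), nn.ReLU(),
            nn.Linear(128, 1), nn.Sigmoid(),
        )

    def forward(self, left_patch, right_patch):
        f_l = self.features(left_patch)
        f_r = self.features(right_patch)
        return self.classifier(torch.cat([f_l, f_r], dim=1))


if __name__ == "__main__":
    left = torch.randn(8, 1, 11, 11)    # batch of left-image patches
    right = torch.randn(8, 1, 11, 11)   # candidate right-image patches
    match_prob = PatchMatchCNN()(left, right)  # shape (8, 1), in [0, 1]
    print(match_prob.shape)
```

Once a disparity d is selected per pixel, depth follows from standard stereo triangulation, Z = f * B / d, where f is the focal length and B the stereo baseline; object heights near a pole can then be estimated from the resulting depth map.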