Multi-view vision-based bridge three-dimensional deformation monitoring method
A technology combining three-dimensional deformation measurement with multi-view vision, applied in the field of measuring devices, instruments, biological neural network models, etc. It solves the problems of expensive equipment, high measurement cost, and difficult measurement, and achieves immunity to ambient-light interference, high sensitivity, and strong adaptability to the experimental environment.
Embodiment
[0061] As shown in Figure 1, this multi-view vision-based bridge three-dimensional deformation monitoring method includes the following steps:
[0062] (1) Acquire one image of the object from each of multiple cameras; each point on the object corresponds to a two-dimensional image coordinate in every camera image. Obtain several feature points on a calibration board;
[0063] (2) Taking the camera image coordinates and the corresponding three-dimensional world coordinate of each feature point as a training sample, establish a mapping model based on a BP neural network (a minimal training sketch is given after this step list);
[0064] (3) From the camera images obtained in step (1), extract the camera image coordinates of the bridge edge feature points;
[0065] (4) Use the RANSAC algorithm to eliminate mismatches from the extracted feature point pairs and obtain the correct feature point pairs (see the RANSAC sketch after this list);
[0066] (5) Perform the three-dimensional calculation on the feature points through the BP neural network ...
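The mapping model of step (2) can be illustrated with a short sketch. This is a minimal, hypothetical example rather than the patented implementation: it assumes a two-camera setup, synthetic calibration data, and scikit-learn's MLPRegressor as a stand-in for the BP (back-propagation) network; the network size, activation, and training settings are assumptions.

```python
# Minimal sketch of the BP-network mapping model, assuming two cameras and
# synthetic calibration-board data (all values below are stand-ins).
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Hypothetical calibration samples: for each calibration-board feature point,
# the input is its pixel coordinates in every camera image (u1, v1, u2, v2)
# and the target is its known 3D world coordinate (X, Y, Z).
n_points = 500
image_coords = rng.uniform(0, 1920, size=(n_points, 4))    # stand-in pixel data
world_coords = rng.uniform(-1.0, 1.0, size=(n_points, 3))  # stand-in 3D targets

# BP (back-propagation) neural network mapping model:
# pixel coordinates in all views -> 3D world coordinates.
model = MLPRegressor(hidden_layer_sizes=(64, 64), activation="tanh",
                     solver="adam", max_iter=2000, random_state=0)
model.fit(image_coords, world_coords)

# Once trained, matched bridge feature points observed by the cameras can be
# mapped directly to 3D coordinates without an explicit camera model.
xyz = model.predict(image_coords[:5])
print(xyz)
```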
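The mismatch elimination of step (4) can be sketched with OpenCV's RANSAC-based model fitting. The homography model, the synthetic matches, and the threshold below are illustrative assumptions; the patent text only specifies that RANSAC is used to reject incorrect feature point pairs.

```python
# Minimal sketch of RANSAC-based rejection of mismatched feature point pairs,
# assuming the points have already been matched between two camera views.
import numpy as np
import cv2

rng = np.random.default_rng(0)

# Hypothetical matched pixel coordinates from two camera views (N x 2 arrays);
# the second view is a shifted copy of the first, with a few injected mismatches.
pts_cam1 = rng.uniform(0, 1920, size=(100, 2)).astype(np.float32)
pts_cam2 = pts_cam1 + np.float32([40.0, 15.0])           # correct matches
pts_cam2[:10] = rng.uniform(0, 1920, size=(10, 2))       # simulated mismatches

# RANSAC fits a geometric model (a homography in this illustration) and flags
# point pairs that do not agree with it; the mask marks inliers with 1.
H, mask = cv2.findHomography(pts_cam1, pts_cam2, cv2.RANSAC, 3.0)
inliers = mask.ravel().astype(bool)

good_pts_cam1 = pts_cam1[inliers]
good_pts_cam2 = pts_cam2[inliers]
print(f"kept {inliers.sum()} of {len(inliers)} matched pairs")
```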