
Faster-RCNN target detection method based on FPGA

A target detection technology applied in the field of intelligent recognition, which solves the problem of slow recognition speed and accelerates faster-RCNN target detection

Pending Publication Date: 2020-10-27
REDNOVA INNOVATIONS INC

AI Technical Summary

Problems solved by technology

[0005] The purpose of the present invention is to provide an FPGA-based faster-RCNN target detection method. Building on existing deep learning networks and computer vision technology, the FPGA is deeply customized according to the faster-RCNN model, so that objects can be recognized accurately while the problem of slow recognition speed is also solved: the faster-RCNN model is computed in parallel on the FPGA, thereby accelerating faster-RCNN target detection.



Examples


Embodiment 1

[0036] As shown in Figure 1, the present invention is an FPGA-based faster-RCNN object detection method, which specifically comprises the following steps:

[0037] Step 1: Obtain an existing data set for target detection and preprocess the data set;

[0038] Step 2: Build a faster-RCNN model;

[0039] Step 3: Load the dataset from step 1 into the faster-RCNN model, and customize the FPGA according to the faster-RCNN model;

[0040] Step 4: Use the customized FPGA to train the faster-RCNN model in step 3;

[0041] Step 5: Set an average precision (AP) threshold and test according to the training results of the faster-RCNN model. If the test result is below the AP threshold, modify the parameters and repeat step 4, testing the training results again until the test result reaches the threshold;

[0042] Step 6: Input the picture to be detected and use the trained faster-RCNN model for target recognition (a minimal end-to-end sketch of steps 1 to 6 follows below).
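The six steps above map onto a standard detection training and evaluation loop. The following is a minimal, hypothetical Python sketch of that flow, using torchvision's reference Faster R-CNN implementation as a stand-in for the patent's faster-RCNN model; the dataset is replaced by a single synthetic sample, and the FPGA customization of steps 3 and 4 appears only as comments, since the patent does not disclose a concrete FPGA toolchain here.

```python
# Hypothetical sketch of steps 1-6; torchvision's Faster R-CNN stands in for
# the patent's faster-RCNN model, and FPGA-specific steps are comments only.
import torch
from torchvision.models.detection import fasterrcnn_resnet50_fpn

# Step 1: obtain and preprocess a detection dataset.
# A single synthetic image/box pair stands in for a real preprocessed dataset.
images = [torch.rand(3, 600, 800)]
targets = [{"boxes": torch.tensor([[100.0, 100.0, 300.0, 260.0]]),
            "labels": torch.tensor([1])}]

# Step 2: build the faster-RCNN model (Conv backbone + RPN + RoI head).
model = fasterrcnn_resnet50_fpn(weights=None, num_classes=2)

# Steps 3-4: in the patent, the model is mapped onto a deeply customized FPGA
# and trained there; on a CPU/GPU host, the equivalent training loop is:
optimizer = torch.optim.SGD(model.parameters(), lr=0.005, momentum=0.9)
model.train()
for epoch in range(1):                   # a real run would use many epochs
    loss_dict = model(images, targets)   # dict of RPN and RoI-head losses
    loss = sum(loss_dict.values())
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

# Step 5: evaluate; if mean AP falls below the chosen threshold, adjust the
# parameters and repeat step 4 (AP computation omitted for brevity).
AP_THRESHOLD = 0.7

# Step 6: run inference on a picture to be detected.
model.eval()
with torch.no_grad():
    detections = model([torch.rand(3, 600, 800)])
print(detections[0]["boxes"], detections[0]["labels"], detections[0]["scores"])
```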

[0043] In...

Embodiment 2

[0046] This embodiment is a further description of the present invention.

[0047] The steps for building the faster-RCNN model in step 2 are:

[0048] Step 21: Build Conv layers for extracting feature maps of images, including conv, pooling, and relu layers;

[0049] Step 22: Build the region generation network layer and use it to generate detection boxes, that is, to initially extract target candidate regions in the picture;

[0050] Step 23: Build the region-of-interest pooling layer, take the feature map from step 21 and the target candidate regions from step 22, and extract candidate feature maps after combining this information;

[0051] Step 24: Build the classification layer, use bounding-box regression to obtain the final precise position of the detection frame, and determine the target category from the candidate feature map (a hypothetical assembly of these four components is sketched below).
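Steps 21 to 24 correspond to the four standard components of a Faster R-CNN: a convolutional backbone, a region generation (proposal) network, region-of-interest pooling, and a classification/regression head. Below is a hypothetical sketch that assembles these components from torchvision building blocks; the backbone choice (MobileNetV2), anchor sizes, and other hyperparameters are illustrative assumptions, not values taken from the patent.

```python
# Hypothetical assembly of the four components in steps 21-24.
import torch
import torchvision
from torchvision.models.detection import FasterRCNN
from torchvision.models.detection.rpn import AnchorGenerator
from torchvision.ops import MultiScaleRoIAlign

# Step 21: Conv layers (conv + relu + pooling) that extract the feature map.
backbone = torchvision.models.mobilenet_v2(weights=None).features
backbone.out_channels = 1280  # FasterRCNN needs the feature-map channel count

# Step 22: region generation network that proposes candidate detection boxes.
anchor_generator = AnchorGenerator(sizes=((32, 64, 128, 256, 512),),
                                   aspect_ratios=((0.5, 1.0, 2.0),))

# Step 23: region-of-interest pooling that combines the feature map with the
# candidate regions to produce fixed-size candidate feature maps.
roi_pooler = MultiScaleRoIAlign(featmap_names=["0"], output_size=7,
                                sampling_ratio=2)

# Step 24: the classification/box-regression head is built internally by
# FasterRCNN from the pooled features; it refines box positions and predicts
# the target category for each candidate feature map.
model = FasterRCNN(backbone,
                   num_classes=2,
                   rpn_anchor_generator=anchor_generator,
                   box_roi_pool=roi_pooler)

model.eval()
with torch.no_grad():
    out = model([torch.rand(3, 600, 800)])
print(out[0].keys())  # boxes, labels, scores
```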

[0052] In this embodiment, the faster-RCNN model includes Conv layers, region generation network layer,...



Abstract

The invention discloses an FPGA (Field Programmable Gate Array)-based faster-RCNN (Region-based Convolutional Neural Network) target detection method, which relates to the field of intelligent identification. The method comprises the following steps: preprocessing a data set; constructing a faster-RCNN model; loading the data set into the faster-RCNN model and customizing an FPGA according to the faster-RCNN model; training the faster-RCNN model using the customized FPGA; testing according to the training result of the faster-RCNN model, and if the result is lower than an average precision (AP) threshold, modifying the parameters, training again, and testing until the result reaches the threshold; and inputting a picture to be detected and performing target recognition with the trained faster-RCNN model. According to the invention, each processing module of the FPGA is customized according to the faster-RCNN model; the method can therefore recognize objects accurately through the faster-RCNN model while also solving the problem of its slow recognition speed, achieving higher detection speed, higher detection precision, better performance, and lower power consumption.

Description

Technical field

[0001] The invention relates to the field of intelligent identification, in particular to an FPGA-based faster-RCNN target detection method.

Background technique

[0002] With the development of intelligent recognition technology, terminal systems need to detect surrounding objects, especially in the field of automatic driving, where target recognition must be fast and accurate for the sake of personal safety. It is therefore of great practical significance to construct a fast and accurate object detection method.

[0003] Existing target detection for autonomous driving requires both high precision and high speed. Existing deep-learning-based target detection algorithms such as SSD and YOLO are fast but not accurate enough, while the faster-RCNN algorithm is accurate enough but not fast enough. Each type of detection therefore has its own specific problems; for example, in the actual driving process, once the target detection is...

Claims


Application Information

IPC(8): G06K 9/62; G06N 3/04; G06N 3/08
CPC: G06N 3/08; G06V 2201/07; G06N 3/047; G06N 3/045; G06F 18/241; G06F 18/25; G06F 18/2415
Inventor: 王堃, 王铭宇, 吴晨
Owner: REDNOVA INNOVATIONS INC