
Machine vision-based famous high-quality tea picking point position information acquisition method

A technology combining position information and machine vision, applied to picking machines, instruments, harvesters, etc. It addresses problems such as low positioning accuracy, mutual occlusion of tea leaves, and low efficiency, with the effect of reducing interference and improving accuracy, efficiency, and integrity.

Pending Publication Date: 2021-05-28
ZHEJIANG SCI-TECH UNIV

AI Technical Summary

Problems solved by technology

Famous high-quality tea is light, so wind or the movement of the picking machine causes the leaves to sway; the tea-garden environment is complex, and tea leaves occlude one another; and when the light is too strong or too dark, tender shoots are hard to distinguish from old leaves.
These factors make identifying and locating bud picking points very difficult, which seriously limits the automated picking of famous high-quality tea.
The positioning methods currently used for tea-bud picking points have low accuracy and low efficiency. To identify and locate picking points rapidly while meeting the efficiency and quality requirements of machine picking of famous tea, a method for acquiring the position information of famous-tea picking points needs to be developed.




Embodiment Construction

[0072] The present invention is further explained below in conjunction with the embodiments shown in the drawings. However, the present invention is not limited to the following examples.

[0073] A machine vision-based method for acquiring the position information of picking points on famous high-quality tea, comprising the following steps:

[0074] Step 1: A tea image is acquired from the tea garden (figure 2), and Gaussian filtering with a 3 × 3 convolution kernel is applied to remove noise (see image 3 for the processed picture).
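The 3 × 3 Gaussian denoising of Step 1 can be sketched in pure NumPy. This is a minimal illustration, not the patent's implementation; the kernel uses the standard binomial approximation of a Gaussian, and `gaussian_blur_3x3` is an illustrative name:

```python
import numpy as np

def gaussian_blur_3x3(img: np.ndarray) -> np.ndarray:
    """Denoise a grayscale image with a 3x3 Gaussian kernel (binomial weights)."""
    kernel = np.array([[1, 2, 1],
                       [2, 4, 2],
                       [1, 2, 1]], dtype=np.float64) / 16.0
    # Replicate border pixels so the output has the same shape as the input.
    padded = np.pad(img.astype(np.float64), 1, mode="edge")
    out = np.zeros(img.shape, dtype=np.float64)
    h, w = img.shape
    for dy in range(3):
        for dx in range(3):
            out += kernel[dy, dx] * padded[dy:dy + h, dx:dx + w]
    return out
```

Edge replication is a common padding choice when a library routine such as OpenCV's `cv2.GaussianBlur` is not available.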

[0075] In the RGB model, the G-B component map of the image is first obtained, and the OTSU algorithm is used to initially segment the tender buds; a morphological erosion operation is then performed to filter out fine contours caused by noise and the like.
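The G-B segmentation described here can be illustrated with a small NumPy sketch. `otsu_threshold` and `segment_shoots` are hypothetical helper names, and the morphological erosion step is omitted for brevity:

```python
import numpy as np

def otsu_threshold(gray: np.ndarray) -> int:
    """Return the Otsu threshold (0-255) maximizing between-class variance."""
    hist = np.bincount(gray.ravel(), minlength=256).astype(np.float64)
    prob = hist / hist.sum()
    omega = np.cumsum(prob)                   # class-0 probability up to each level
    mu = np.cumsum(prob * np.arange(256))     # class-0 cumulative mean
    mu_t = mu[-1]                             # global mean
    with np.errstate(divide="ignore", invalid="ignore"):
        sigma_b = (mu_t * omega - mu) ** 2 / (omega * (1.0 - omega))
    sigma_b = np.nan_to_num(sigma_b)          # undefined where a class is empty
    return int(np.argmax(sigma_b))

def segment_shoots(rgb: np.ndarray) -> np.ndarray:
    """Binarize tender shoots from the G-B component of an RGB image."""
    g_minus_b = np.clip(rgb[..., 1].astype(int) - rgb[..., 2].astype(int),
                        0, 255).astype(np.uint8)
    t = otsu_threshold(g_minus_b)
    return (g_minus_b > t).astype(np.uint8)
```

In practice the erosion step would follow, e.g. with a small structuring element, to remove the fine noise contours the paragraph mentions.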

[0076] Step 2: To reduce the influence of interfering points and improve the real-time performance of image processing, a separate ROI is set for each tender shoot obtained from the image, so that when the picking point is identified...
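Setting a separate ROI per shoot amounts to taking the bounding box of each connected component in the binary mask. A minimal sketch, assuming 4-connectivity; `shoot_rois` is an illustrative name, not from the patent:

```python
from collections import deque

import numpy as np

def shoot_rois(mask: np.ndarray):
    """Return one bounding box (y0, x0, y1, x1) per 4-connected blob in a binary mask."""
    h, w = mask.shape
    seen = np.zeros((h, w), dtype=bool)
    rois = []
    for y in range(h):
        for x in range(w):
            if mask[y, x] and not seen[y, x]:
                # Breadth-first flood fill to collect this blob's pixels.
                q = deque([(y, x)])
                seen[y, x] = True
                ys, xs = [], []
                while q:
                    cy, cx = q.popleft()
                    ys.append(cy)
                    xs.append(cx)
                    for ny, nx in ((cy - 1, cx), (cy + 1, cx),
                                   (cy, cx - 1), (cy, cx + 1)):
                        if 0 <= ny < h and 0 <= nx < w \
                                and mask[ny, nx] and not seen[ny, nx]:
                            seen[ny, nx] = True
                            q.append((ny, nx))
                # Half-open box: mask[y0:y1, x0:x1] covers the blob.
                rois.append((min(ys), min(xs), max(ys) + 1, max(xs) + 1))
    return rois
```

Each returned box can then be cropped and processed independently, which is what limits the per-shoot computation and improves real-time performance.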



Abstract

The invention relates to the field of image processing algorithms. According to the technical scheme, the machine vision-based method for obtaining the position information of famous high-quality tea picking points comprises the following steps: 1) obtaining a tea image from a tea garden and removing noise with Gaussian filtering using a 3 × 3 convolution kernel; 2) setting a separate ROI for each tender shoot obtained from the image; 3) converting each ROI from the RGB color space into the HSV color space and extracting the features of the tender shoots and of the branches bearing their growing points; 4) performing a second binarization segmentation on the extracted tender-shoot and branch areas using the Otsu algorithm; 5) refining the binarized picture from the previous step with an improved Zhang thinning algorithm and extracting its skeleton; 6) using the Shi-Tomasi algorithm to search for bifurcation points of the tender shoots and branches as feature corner points on the refined skeleton; 7) fitting the lowest point of the contour and the corner point into a linear segment. The method can improve the precision and efficiency of locating tea tender-shoot picking points.
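Step 5 of the abstract names an "improved Zhang thinning algorithm". As a rough reference point only, the classic Zhang-Suen thinning pass (not the patent's improved variant) can be written as:

```python
import numpy as np

def zhang_suen_thin(img: np.ndarray) -> np.ndarray:
    """Classic Zhang-Suen thinning of a binary image (1 = foreground)."""
    im = img.astype(np.uint8).copy()
    h, w = im.shape
    changed = True
    while changed:
        changed = False
        for step in (0, 1):  # the two alternating sub-iterations
            to_clear = []
            for y in range(1, h - 1):
                for x in range(1, w - 1):
                    if im[y, x] != 1:
                        continue
                    # Neighbours P2..P9, clockwise starting from north.
                    p = [im[y-1, x], im[y-1, x+1], im[y, x+1], im[y+1, x+1],
                         im[y+1, x], im[y+1, x-1], im[y, x-1], im[y-1, x-1]]
                    b = sum(p)                       # B(P1): foreground neighbours
                    if not (2 <= b <= 6):
                        continue
                    # A(P1): number of 0 -> 1 transitions around the ring.
                    a = sum(p[i] == 0 and p[(i + 1) % 8] == 1 for i in range(8))
                    if a != 1:
                        continue
                    if step == 0:
                        if p[0] * p[2] * p[4] or p[2] * p[4] * p[6]:
                            continue  # P2*P4*P6 != 0 or P4*P6*P8 != 0
                    else:
                        if p[0] * p[2] * p[6] or p[0] * p[4] * p[6]:
                            continue  # P2*P4*P8 != 0 or P2*P6*P8 != 0
                    to_clear.append((y, x))
            for y, x in to_clear:    # delete simultaneously after the scan
                im[y, x] = 0
                changed = True
    return im
```

The resulting one-pixel-wide skeleton is what steps 6 and 7 operate on when searching for bifurcation corner points and fitting the picking-point segment.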

Description

Technical field[0001] The present invention relates to the fields of artificial intelligence, machine vision, and image processing algorithms, and in particular provides a machine vision-based method for acquiring the position information of picking points on famous high-quality tea. Background technique[0002] In recent years, vision-based automatic picking robots have been applied to tea picking, and automatic identification of picking points has become a key bottleneck limiting their development. Famous high-quality tea is light, so wind or the movement of the picking machine causes the leaves to sway; the tea-garden environment is complex, and tea leaves occlude one another; when the light is too strong or too dark, tender shoots and old leaves are hard to distinguish. These factors make identifying and locating tender-shoot picking points very difficult, seriously limiting the automated picking of famous tea. Existing positioning methods for tea tender-shoot picking points have low accuracy and low efficiency; in order to achieve rapid ...

Claims


Application Information

Patent Type & Authority Applications(China)
IPC (8): G06K9/00, G06K9/32, G06K9/34, G06K9/38, G06K9/46, A01D46/04
CPC: A01D46/04, G06V20/00, G06V10/25, G06V10/267, G06V10/28, G06V10/44, G06V10/56
Inventor 邹浪张雷武传宇陈建能
Owner ZHEJIANG SCI-TECH UNIV