
Object volume calculating method based on Kinect

An object volume calculation technology in the field of computer vision, which addresses problems such as the high cost and long measurement time of existing methods, and achieves the effects of meeting automation requirements, reducing labor intensity, and improving measurement accuracy.

Active Publication Date: 2017-08-18
HOHAI UNIV CHANGZHOU

AI Technical Summary

Problems solved by technology

[0005] Purpose: To address the deficiencies of the prior art, the present invention provides a Kinect-based object volume calculation method. As a non-contact measurement method, it neither damages the measurement target nor disturbs the natural state of the measured object. It can be used under a range of conditions and is easy to disassemble and install, which solves the long measurement time and high cost of traditional measurement methods.


Examples


Embodiment 1

[0062] (1a) Use Kinect to collect a foreground depth image and a foreground color image containing the measured object, and a background depth image and a background color image of the measurement platform without the measured object;

[0063] (1b) Perform checkerboard calibration on the Kinect color camera, computing the camera's intrinsic and extrinsic parameters from the relationship between the image coordinate system and the world coordinate system. The general camera model can be expressed as formula (I):

[0064] s [x y 1]^T = K [R t] [X Y Z 1]^T    (I)

[0065] In formula (I), K depends only on the internal structure of the camera and is called the intrinsic parameter matrix; R and t depend only on the orientation of the camera relative to the world coordinate system and are called the extrinsic parameters, where R is the rotation matrix and t is the translation vector; [x y] is the pixel coordinate of the point in the image, and [X Y Z] is the coordinate of the point in the world coordinate system.
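
The camera model of formula (I) can be illustrated with a small NumPy sketch that projects a world point to pixel coordinates. The intrinsic values (focal length 525 px, principal point (320, 240), roughly Kinect-like) and the identity extrinsics are illustrative assumptions, not values from the patent.

```python
import numpy as np

def project_point(K, R, t, world_pt):
    """Formula (I): s * [x, y, 1]^T = K [R | t] [X, Y, Z, 1]^T.

    Returns the pixel coordinate [x, y] of a 3-D world point [X, Y, Z].
    """
    p_cam = R @ world_pt + t      # world coordinates -> camera coordinates
    p_img = K @ p_cam             # camera coordinates -> homogeneous image coordinates
    return p_img[:2] / p_img[2]   # divide out the scale factor s

# Hypothetical intrinsic matrix: fx = fy = 525 px, principal point (320, 240)
K = np.array([[525.0,   0.0, 320.0],
              [  0.0, 525.0, 240.0],
              [  0.0,   0.0,   1.0]])
R = np.eye(3)    # camera axes aligned with the world axes (assumption)
t = np.zeros(3)  # camera at the world origin (assumption)

# A point on the optical axis projects to the principal point
print(project_point(K, R, t, np.array([0.0, 0.0, 1.0])))  # → [320. 240.]
```

In practice the intrinsics and extrinsics would come from checkerboard calibration (step (1b)); this sketch only demonstrates how the calibrated K, R, and t map world coordinates to pixels.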



Abstract

The invention discloses an object volume calculation method based on Kinect. The method comprises the steps of: (1) acquiring a depth image and a color image by means of Kinect; (2) calibrating the color camera of the Kinect; (3) setting an ROI of the depth image, performing image segmentation using a foreground color image that contains the measured object and a background color image of the measurement platform that does not contain it, and obtaining a binary image of the measured object; (4) converting the ROI of the background depth image into a background distance matrix, preprocessing the background distance matrix by filling in its zero-valued elements, and converting the ROI of the foreground depth image into a foreground distance matrix; (5) obtaining a height matrix from the difference between the foreground distance matrix and the background distance matrix; and (6) calculating the length, width, height, and volume of the object. The method effectively solves the problems of high labor intensity and long measurement time in traditional manual measurement. As a non-contact measurement means, it prevents damage to the measured object, satisfies the requirement for automation, and improves measurement precision.
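
Steps (4) to (6) of the abstract can be sketched as follows. The function name `object_volume`, the zero-filling strategy, the noise-floor threshold, and the toy matrices are hypothetical illustrations (assuming the Kinect looks straight down at the platform and distances are in metres), not the patent's actual implementation.

```python
import numpy as np

def object_volume(fg_dist, bg_dist, pixel_area, noise_floor=0.005):
    """Volume from per-pixel height = background distance - foreground distance.

    fg_dist, bg_dist: distance matrices (metres) from the depth-image ROIs.
    pixel_area: ground area covered by one pixel (m^2), from calibration.
    """
    bg = bg_dist.astype(float).copy()
    zeros = bg == 0                          # invalid (missing) depth readings
    bg[zeros] = np.mean(bg[~zeros])          # simplified preprocessing: fill zeros with the mean of valid pixels
    height = np.clip(bg - fg_dist, 0.0, None)   # step (5): height matrix
    height[height < noise_floor] = 0.0           # suppress sensor noise outside the object
    return float(np.sum(height) * pixel_area)    # step (6): integrate height over the footprint

# Toy 4x4 scene: a 2x2-pixel box of height 0.1 m on a platform 1 m from the camera
bg = np.full((4, 4), 1.0)
fg = bg.copy()
fg[1:3, 1:3] = 0.9

# 4 pixels * 0.1 m height * 0.01 m^2 per pixel
print(round(object_volume(fg, bg, pixel_area=0.01), 6))  # → 0.004
```

The patent additionally derives the object's length, width, and height from the binary segmentation mask of step (3); this sketch only covers the volume integration.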

Description

technical field [0001] The invention relates to a Kinect-based object volume calculation method and belongs to the field of computer vision. Background technique [0002] With the development of digital signal processing and computer technology, the technology of converting images of the external environment acquired by a camera into digital signals and carrying out the entire visual-information-processing workflow on a computer has come to be called computer vision. Applying computer vision to the precise measurement and positioning of spatial geometric dimensions yields visual measurement technology. As a modern high technology, visual measurement has developed rapidly on the basis of continually maturing image processing and computer technology, and has been widely applied in product inspection, reverse engineering, robot navigation, and other fields. Computer vision measurement technology uses image sensors as a means to detect the three-dimensional ...

Claims


Application Information

Patent Type & Authority Applications(China)
IPC(8): G06T7/62; G06T7/80
CPC: G06T2207/10024; G06T2207/20036
Inventor 刘波李凌陈荔庄兴昌李奎周军廖华丽王婷婷谢小敏杨跟
Owner HOHAI UNIV CHANGZHOU