
Electronic pet based on Kinect technology

An electronic pet technology, applied in the fields of biological models, user/computer interaction input/output, instruments, etc., which can solve problems such as single feedback, the absence of a learning function in electronic pets, and monotony.

Status: Inactive; Publication Date: 2018-01-26
DALIAN INST OF SCI & TECH

AI Technical Summary

Problems solved by technology

[0002] At present, artificial intelligence is developing rapidly, yet current electronic pets perform only a single action and cannot interact with people. They cannot accurately identify the surrounding environment, provide only a single feedback to a user's action, and are therefore monotonous and lacking in emotional communication with people. Users generally expect electronic pets to be more intelligent: able to distinguish between the environment and people, and able to recognize human actions and expressions. In addition, current electronic pets have no learning function and cannot learn and be continuously updated.



Examples


Embodiment 1

[0023] As shown in Figures 1-2, the present invention provides an electronic pet based on Kinect technology, comprising a logic layer, an image recognition layer and a control layer. The image recognition layer captures image information through a camera and transmits it to the logic layer; after the logic layer performs its processing, the corresponding action command is sent to the control layer; and the control layer sends movement signals to the limbs of the electronic pet to control their movement.
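The three-layer split described in [0023] can be illustrated with a short sketch. The class and method names below (ImageRecognitionLayer, LogicLayer, ControlLayer, capture, decide, execute) are hypothetical illustrations, not names from the patent; the sketch only shows the direction of data flow from camera frame to action command to limb movement.

```python
# Minimal sketch of the three-layer architecture (hypothetical names, not from the patent).
from dataclasses import dataclass


@dataclass
class Frame:
    pixels: list          # raw image data captured by the camera


@dataclass
class ActionCommand:
    limb: str             # e.g. "front_left_leg"
    motion: str           # e.g. "step_forward"


class ImageRecognitionLayer:
    def capture(self) -> Frame:
        # placeholder for the Kinect/camera capture call
        return Frame(pixels=[])


class LogicLayer:
    def decide(self, frame: Frame) -> ActionCommand:
        # placeholder for the recognition-target / interactive-object logic
        return ActionCommand(limb="head", motion="turn_toward_user")


class ControlLayer:
    def execute(self, command: ActionCommand) -> None:
        # placeholder for sending a movement signal to the limb actuators
        print(f"moving {command.limb}: {command.motion}")


def main_loop() -> None:
    recognition, logic, control = ImageRecognitionLayer(), LogicLayer(), ControlLayer()
    frame = recognition.capture()          # image recognition layer -> logic layer
    command = logic.decide(frame)          # logic layer -> control layer
    control.execute(command)               # control layer -> limbs


if __name__ == "__main__":
    main_loop()
```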

[0024] Further, in the above technical solution, the logic layer includes the following steps (see the sketch after this list):

[0025] S1: Define the recognition target. The logic layer decomposes the image information into individual pixels and connects adjacent pixels in the image that show light intensity changes and color changes to form a recognition target;

[0026] S2: Judge the interactive object. The movement status of the recognition target is judged, and the motionless recognition target is defined as the ...
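Steps S1 and S2 can be read as a connected-component grouping followed by a motion check. The sketch below is a hedged interpretation, assuming NumPy and SciPy are available; the thresholds and the frame-differencing motion test are illustrative assumptions, since the patent text gives no concrete values and S2 is cut off mid-sentence.

```python
# Hedged sketch of S1-S2: group adjacent changed pixels into targets, then split
# moving targets (interactive objects) from motionless ones (environment).
import numpy as np
from scipy import ndimage


def find_targets(frame: np.ndarray, change_threshold: float = 20.0):
    """S1: connect adjacent pixels with strong intensity/colour changes into targets."""
    gray = frame.mean(axis=2)                       # collapse colour channels
    gy, gx = np.gradient(gray)                      # local intensity change
    change_mask = np.hypot(gx, gy) > change_threshold
    labels, n_targets = ndimage.label(change_mask)  # adjacent changed pixels -> one target
    return labels, n_targets


def split_interactive(prev_frame, curr_frame, labels, n_targets, motion_threshold=5.0):
    """S2: a target whose region changes between frames is treated as an interactive object."""
    diff = np.abs(curr_frame.mean(axis=2) - prev_frame.mean(axis=2))
    interactive, environment = [], []
    for target_id in range(1, n_targets + 1):
        region = labels == target_id
        if diff[region].mean() > motion_threshold:
            interactive.append(target_id)           # moving target: interact with it
        else:
            environment.append(target_id)           # motionless target: environment
    return interactive, environment
```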

Embodiment 2

[0035] The difference between this embodiment and Embodiment 1 is that, while still based on Kinect recognition technology, the program flow is changed and the program is simplified and optimized to run faster.

[0036] The software design is divided into a logic layer and a recognition layer. The logic layer is implemented mainly with the .NET Core framework; .NET is also the underlying framework used to control the Kinect, and both are developed by Microsoft. .NET Core is a cross-platform framework that Microsoft redeveloped from the concepts of the earlier .NET Framework. Its main feature is being cross-platform: it supports UWP development and Xamarin cross-platform mobile development, Microsoft has open-sourced it to the community, and it is currently developing rapidly.

[0037] The image recognition layer is implemented in the Python language, which also allows the use of Google's deep learning frame...
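The patent specifies the languages of the two layers (.NET Core for the logic layer, Python for the image recognition layer) but not how they exchange data, and the sentence above breaks off before naming the deep learning framework. As one possible arrangement only, a minimal sketch of the Python side handing recognition results to the logic layer over newline-delimited JSON on a local TCP port (the host, port and message fields are assumptions, not from the patent):

```python
import json
import socket


def send_recognition_result(result: dict, host: str = "127.0.0.1", port: int = 9000) -> None:
    """Push one recognition result (e.g. a gesture or face label) to the logic layer."""
    payload = json.dumps(result).encode("utf-8")
    with socket.create_connection((host, port)) as conn:
        conn.sendall(payload + b"\n")  # newline-delimited JSON, one message per line


if __name__ == "__main__":
    # Requires something listening on the logic-layer side (e.g. a .NET Core TcpListener).
    send_recognition_result({"object_id": 3, "gesture": "wave", "confidence": 0.92})
```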


Abstract

The invention discloses an electronic pet based on Kinect technology. The electronic pet comprises a logic layer, an image recognition layer and a control layer. The image recognition layer captures image information through a camera and transmits the image information to the logic layer; after the logic layer performs its operation, a corresponding movement command is sent to the control layer; and the control layer emits a movement signal to the limbs of the electronic pet so as to control the limb movement of the electronic pet. The electronic pet can develop the personality of the pet according to its contact time with people, for example developing hobbies and acting in an affection-seeking way; it has excellent characteristics such as "decoupling", "redundancy", "expansibility", "flexibility", "peak value processing capacity" and "restorability"; it has basic face recognition, gesture recognition and movement recognition functions; and it exhibits learnability and basic AI (artificial intelligence).

Description

Technical field

[0001] The invention relates to the technical field of electronic pets, in particular to an electronic pet based on Kinect technology.

Background technique

[0002] At present, artificial intelligence is developing rapidly, yet current electronic pets perform only a single action and cannot interact with people. They cannot accurately identify the surrounding environment, provide only a single feedback to a user's action, and are therefore monotonous and lacking in emotional communication with people. Users generally expect electronic pets to be more intelligent: able to distinguish between the environment and people, and able to recognize human actions and expressions. In addition, current electronic pets have no learning function and cannot learn and be continuously updated.

[0003] Therefore, it is necessary to invent an electronic pet based on Kinect technology to solve the above problems.

Contents of the invention

[0004] The purp...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06F3/01; G06N3/00
Inventors: 王荔, 徐国旭, 龚嘉成, 单慧, 孙仲璞, 孙朕子
Owner: DALIAN INST OF SCI & TECH