
Method of deep neural network based on discriminable region for dish image classification

A deep neural network and image technology, applied in the field of convolutional neural networks. It addresses the problems that dish images have high inter-class similarity and large intra-class variation, which make classification very difficult, and achieves the effects of improved training efficiency, faster network convergence, and verified effectiveness and real-time performance.

Active Publication Date: 2018-01-12
UNIV OF ELECTRONICS SCI & TECH OF CHINA
Cites: 7 · Cited by: 45

AI Technical Summary

Problems solved by technology

[0004] Chinese dishes are rich in categories, with ever-changing ingredients, high inter-class similarity, and large intra-class variation. At the same time, unlike object detection, dish classification is very difficult because there are no specific spatial features or structural information to serve as an aid.


Examples


Embodiment Construction

[0025] The present invention uses a deep neural network based on a discriminable region for the learning and testing of dish image classification. The whole work can be divided into the following five steps:

[0026] Step 1. Build the database: First, we built an image database of 90 common and popular dishes, including braised pork, pork rib and radish soup, twice-cooked pork, shredded fried potatoes, scrambled eggs with tomato, minced-meat eggplant, stir-fried broccoli, etc., with 1500 images per category collected from recipe websites. We randomly select 1200 of them as training samples and use the remaining 300 as test samples. All images are normalized to the same size. Because the network has a large number of parameters and the number of samples is small, images randomly cropped during training are used for network training to enlarge the sample set and avoid overfitting.

[0027] Step 2. Classify the d...



Abstract

The invention discloses a deep-neural-network method based on a discriminable region for dish image classification. The method relates to the field of image processing, integrates a salient-spectrum pooling operation, and fuses low-level and high-level features in the network. It adopts a convolution-kernel filling operation that effectively preserves the important information of the feature maps and matches the data dimensions of the fully connected layer, so that the fully connected layer can reuse the VGG-16 pre-trained model during training, thereby improving training efficiency and network convergence speed. Each image to be classified is normalized according to the model learned on the constructed database; the image is then tested with the trained convolutional neural network, the classification quality is measured with the Softmax loss, and the classification result of the image is obtained. Comparing the true and predicted categories of the targets in all test images yields the classification accuracy. The method is tested on the self-built dataset CFOOD90, and its effectiveness and real-time performance are verified.
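The evaluation described in the abstract — measuring predictions with the Softmax loss and computing accuracy by comparing predicted and true categories over all test images — can be sketched generically. This is a standard formulation, not the authors' exact implementation:

```python
import numpy as np

def softmax(logits: np.ndarray) -> np.ndarray:
    """Row-wise softmax with the usual max-shift for numerical stability."""
    z = logits - logits.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def softmax_loss(logits: np.ndarray, labels: np.ndarray) -> float:
    """Mean cross-entropy of the softmax probabilities at the true labels."""
    p = softmax(logits)
    n = logits.shape[0]
    return float(-np.log(p[np.arange(n), labels]).mean())

def accuracy(logits: np.ndarray, labels: np.ndarray) -> float:
    """Fraction of test images whose predicted category matches the true one."""
    return float((logits.argmax(axis=1) == labels).mean())

# toy check: two images, two classes, confident correct predictions
logits = np.array([[10.0, 0.0], [0.0, 10.0]])
labels = np.array([0, 1])
```

In the patent's setting the logits would be the 90-way outputs of the trained network for each test image, and accuracy would be averaged over the 300 test samples per category.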

Description

technical field
[0001] The invention relates to the field of image processing, and in particular to a novel application method of a convolutional neural network for dish image classification.

Background technique
[0002] In recent years, healthy eating has become a focus of social attention alongside the fitness boom, and accurately classifying food images is an important part of intelligent healthy-diet management. Although nutrition experts can provide professional dietary analysis and advice to help fitness enthusiasts and patients manage their calorie and nutrient intake, the high expense and time cost limit its popularization among the general public. There is therefore an urgent need for a method that can automatically recognize food photos taken with mobile devices such as smartphones and tablet computers, and then classify the food and manage personal diets.
[0003] Algorithms based on deep convolutional neural networks have made breakthrough achi...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K9/62; G06N3/04
Inventors: 李宏亮, 陈雅丽, 方清, 姚晓宇, 杨燕平
Owner UNIV OF ELECTRONICS SCI & TECH OF CHINA