Evaluation of pork color by using computer vision

Citation
J. Lu et al., Evaluation of pork color by using computer vision, MEAT SCI, 56(1), 2000, pp. 57-60
Number of citations
10
Subject Categories
Food Science/Nutrition
Journal title
MEAT SCIENCE
ISSN journal
0309-1740
Volume
56
Issue
1
Year of publication
2000
Pages
57 - 60
Database
ISI
SICI code
0309-1740(200009)56:1<57:EOPCBU>2.0.ZU;2-A
Abstract
The objective of this study was to determine the potential of computer vision technology for evaluating fresh pork loin color. Software was developed to segment pork loin images into background, muscle and fat. Color image features were then extracted from segmented images. Features used in this study included mean and standard deviation of the red, green, and blue bands of the segmented muscle area. Sensory scores were obtained for the color characteristics of the lean meat from a trained panel using a 5-point color scale. The scores were based on visual perception and ranged from 1 to 5. Both statistical and neural network models were employed to predict the color scores by using the image features as inputs. The statistical model used the partial least squares technique to derive latent variables. The latent variables were subsequently used in a multiple linear regression. The neural network used a back-propagation learning algorithm. Correlation coefficients between predicted and original sensory scores were 0.75 and 0.52 for the neural network and statistical models, respectively. Prediction error was the difference between the average sensory score and the predicted color score. An error of 0.6 or lower was considered negligible from a practical viewpoint. For 93.2% of the 44 pork loin samples, prediction error was lower than 0.6 in neural network modeling. In addition, 84.1% of the samples gave an error lower than 0.6 in the statistical predictions. Results of this study showed that an image processing system in conjunction with a neural network is an effective tool for evaluating fresh pork color. (C) 2000 Elsevier Science Ltd. All rights reserved.
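For orientation, the pipeline the abstract outlines (RGB mean and standard deviation over the segmented muscle region as inputs, a PLS-based statistical model and a back-propagation network predicting the panel color score, and an error tolerance of 0.6) can be sketched roughly as below. This is a minimal illustration assuming scikit-learn and NumPy; the function names, number of latent variables, and hidden-layer size are assumptions for the sketch, not values reported in the paper.

import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.neural_network import MLPRegressor

def muscle_color_features(image, muscle_mask):
    # Mean and standard deviation of the R, G and B bands over the
    # segmented muscle pixels -- six features per loin image.
    pixels = image[muscle_mask]              # (n_muscle_pixels, 3)
    return np.concatenate([pixels.mean(axis=0), pixels.std(axis=0)])

def fit_models(X, y, n_latent=3, hidden=8):
    # X: (n_samples, 6) image features; y: panel color scores (1-5 scale).
    # PLSRegression derives latent variables and regresses the score on
    # them; MLPRegressor stands in for a back-propagation-trained network.
    pls = PLSRegression(n_components=n_latent).fit(X, y)
    nn = MLPRegressor(hidden_layer_sizes=(hidden,), max_iter=5000,
                      random_state=0).fit(X, y)
    return pls, nn

def prediction_error(model, X, y):
    # Absolute difference between panel score and predicted score;
    # the study treats errors of 0.6 or lower as practically negligible.
    return np.abs(np.ravel(model.predict(X)) - np.asarray(y))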