Evaluation of Fiber Degree for Cooked Fish Muscle Based on the Convolutional Neural Network
Affiliation: (1. State Key Laboratory of Food Science and Resources, Jiangnan University, Wuxi 214122, Jiangsu; 2. Key Laboratory of Refrigeration and Conditioning Aquatic Products Processing, Ministry of Agriculture and Rural Affairs, Xiamen 361022, Fujian; 3. School of Food Science and Technology, Jiangnan University, Wuxi 214122, Jiangsu; 4. Fujian Provincial Key Laboratory of Refrigeration and Conditioning Aquatic Products Processing, Xiamen 361022, Fujian; 5. Anjoy Foods Group Co., Ltd., Xiamen 361022, Fujian)

    Abstract:

    An evaluation method for the fiber degree of cooked fish was constructed based on a convolutional neural network, addressing the lack of a systematic method for evaluating the fibrous texture of cooked fish. A dataset was established by collecting microscopic images of cooked fish samples with different fiber degrees and was randomly divided into training and testing sets at a ratio of 8:2. The constructed models were trained on the training set, and their recognition performance was evaluated on the testing set. The results showed that, among the network depths compared, the 34-layer ResNet model converged fastest and achieved the highest accuracy. The four ResNet models of different depths all exceeded the AlexNet, VGG-16 and GoogLeNet models in best recognition accuracy. The average accuracy, precision, sensitivity, specificity and AUC of ResNet-34 were 96.94%, 91.26%, 91.00%, 98.13% and 99.19%, respectively, demonstrating that the evaluation method based on the ResNet-34 model can objectively and accurately identify the fiber degree of cooked fish.
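    The following is a minimal sketch, not the authors' released code, of how such a pipeline could be reproduced with PyTorch and torchvision: microscopic images grouped by fiber-degree class, an 8:2 random split, a ResNet-34 trained on the training set, and the reported accuracy and AUC computed on the testing set. The dataset path, image size, epoch count and hyperparameters are assumptions made for illustration.

```python
# Hypothetical reproduction sketch; paths, hyperparameters and class layout are assumptions.
import torch
from torch import nn
from torch.utils.data import DataLoader, random_split
from torchvision import datasets, models, transforms
from sklearn.metrics import roc_auc_score

tf = transforms.Compose([
    transforms.Resize((224, 224)),   # standard ResNet input size
    transforms.ToTensor(),
])

# Microscopic images stored in sub-folders, one folder per fiber-degree class (assumed layout).
full_set = datasets.ImageFolder("fish_micrographs/", transform=tf)
n_train = int(0.8 * len(full_set))   # 8:2 train/test split, as in the abstract
train_set, test_set = random_split(full_set, [n_train, len(full_set) - n_train])
train_loader = DataLoader(train_set, batch_size=32, shuffle=True)
test_loader = DataLoader(test_set, batch_size=32)

model = models.resnet34(weights=None)
model.fc = nn.Linear(model.fc.in_features, len(full_set.classes))
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
criterion = nn.CrossEntropyLoss()

for epoch in range(20):              # epoch count is illustrative only
    model.train()
    for x, y in train_loader:
        optimizer.zero_grad()
        loss = criterion(model(x), y)
        loss.backward()
        optimizer.step()

# Evaluation: collect softmax scores and labels, then derive the headline metrics.
model.eval()
scores, labels = [], []
with torch.no_grad():
    for x, y in test_loader:
        scores.append(torch.softmax(model(x), dim=1))
        labels.append(y)
scores, labels = torch.cat(scores), torch.cat(labels)
preds = scores.argmax(dim=1)
accuracy = (preds == labels).float().mean().item()
auc = roc_auc_score(labels.numpy(), scores.numpy(), multi_class="ovr")
print(f"accuracy={accuracy:.4f}, macro AUC={auc:.4f}")
```

    Per-class precision, sensitivity and specificity, as reported in the abstract, follow from the confusion matrix of `preds` against `labels` (e.g. via sklearn.metrics.confusion_matrix) and are omitted here for brevity.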

History
  • Received: September 2, 2022
  • Online: November 22, 2023