Research Article Open Access
Automatic Controls of Food Quality Using the NI Vision System and Multispectral Images Technologies
Yuriy Vashpanov1,2*, Kae-Dal Kwack1 and Jung-Young Son1
1Department of Electrical and Computer Engineering, Hanyang Institute of Technology, Hanyang University, 17 Haengdang-dong, Sung-dong-gu, 133-791 Seoul, South Korea
2Department of automation and robotics system, Odessa National Academy of Food Technologies, Kanatna 112, 65000, Odessa, Ukraine
*Corresponding author: Yuriy Vashpanov, Department of Electrical and Computer Engineering, Hanyang Institute of Technology, Hanyang University, 17 Haengdang-dong, Sung-dong-gu, 133-791 Seoul, South Korea, E-mail: @
Received: April 11, 2018; Accepted: May 16, 2018; Published: May 23, 2018
Citation: Vashpanov Y, Kwack KD, Son JY (2018) Automatic Controls of Food Quality Using the NI Vision System and Multispectral Images Technologies. Int J Adv Robot Automn 3(2): 1-5. DOI: 10.15226/2473-3032/3/2/00134
Abstract
We analyze a combined method for diagnosing grain surfaces using multispectral imaging in polarized light for the automatic evaluation of grain quality. In our experiments, polarized light enhances the spectral effect. We propose parameters for quantitatively characterizing sample damage in a streaming-data automation process built on the NI Vision platform with special narrow band-pass filters from Asahi Spectra Co. Ltd, Japan. The acquisition of streaming data on grain damage can be fully automated on the NI Vision platform.

Keywords: Robot vision; Automation; Multispectral imaging in polarized light; Narrow band-pass Asahi filters; NI Vision
Introduction
Computer vision is a rapid and objective method of control with applications in many industries, particularly in the food industry [1]. Imaging technologies in the visible and near-infrared regions have been used effectively for the diagnostics and monitoring of surfaces and are currently the leading technology for morphological studies (see, for example, [2-5]). Many scientists use computer vision systems to assess food quality and safety (see, for example, [6-9]). As a rule, polarized light enhances optical effects in the study of various biological objects [10-12].

At present, food safety is an important problem (see, for example, the encyclopedia of food safety [13]). This encyclopedia collects numerous research results in the field. A large number of methods exist for determining the properties of food (see, for example, [14]). In most cases, obtaining reliable results requires special equipment and considerable analysis time, which makes automation and robotization of the technological process impossible. The importance of this problem grows significantly as the volume of the food industry increases. New methods of analyzing the quality of various products are needed that permit automation of the technological process.

Filters from the Japanese company Asahi Spectra Co. Ltd, with a very narrow bandwidth (10 nm), make it possible to register very specific spectral properties of numerous objects [15]. The human eye and an ordinary photo camera cannot detect such spectral features. Previously, we investigated multispectral technologies with these Asahi Spectra filters for medical applications [16,17]. Our goal is to expand the range of applications of Asahi Spectra filters to the early detection of changes in biological food products. The use of these filters could open a new direction in automatic digital image processing by industrial robots in the food industry.

Machine vision technology is developing intensively thanks to the National Instruments (NI) Company (USA) [18]. The NI Company is a leader in this field of robotics and automation technologies. NI hardware and software products today make it possible to solve many automation problems: identification and recognition, non-destructive testing, measurement of geometric parameters, counting of products, and so on.

NI Vision systems provide specialized hardware and software for computer vision, consisting of the NI IMAQ (IMage ACQuisition) image-capture technologies and processing and analysis software based on LabVIEW™. The NI IMAQ hardware includes modules for capturing images from almost all video sources (digital photo cameras of different standards and configurations). These modules are designed so that most functions can be controlled by software, which greatly simplifies image input from video converters. With the NI IMAQ hardware we can work with images over wide spectral ranges, at different speeds, and with different digitizing depths. The NI IMAQ interfaces also provide software for synchronizing video data with control systems.
Experimental Methods and Materials
As samples, we chose a mixture of healthy grains and grains with incipient damage that show no optical difference visible to the human eye. This means that conventional optical methods (for example, determining the color of an object [8]) are not suitable for identifying the damaged grains.

Our experimental setup for multispectral studies in polarized light is shown in Figure 1. We used a Redlake Mega Plus II model EC4020 digital photo camera with a resolution of 2048x2048 pixels, a source of polarized light, and the special filters from Asahi Spectra Co. Ltd [19]. The digital camera was connected to a personal computer via an IEEE-1394 interface, as shown in Figure 1. The Redlake Mega Plus II model EC4020 camera is sensitive to the infrared, visible, and ultraviolet regions of the light spectrum. The IMAQ module has four camera inputs, Ethernet, a high-speed IEEE 1394 interface, a DVI output, and contacts for remote control.
Figure 1: Experimental setup for multispectral studies of surfaces in polarized light with special narrow band-pass filters in the 400 to 900 nm range (Asahi Spectra Co. Ltd, Japan)
Figure 2: Transmission coefficients of Asahi Spectra filters [15]
The light source was a special halogen lamp. A polarizer is installed after the light source (Figure 1), and an analyzer is installed in front of the camera. The spectral Asahi Spectra filter is placed between the analyzer and the Redlake Mega Plus II camera. The depth of focus of our optical system was sufficient. We investigated the spectral range from 450 to 900 nm at 10 nm intervals using the Asahi Spectra filters (25 mm diameter). The transmission coefficient of each filter is in the range of 69 to 82% and decreases when approaching 400 nm (Figure 2).

The sensor of the Redlake EC4020 MegaPlus II camera is photosensitive over a wide range of wavelengths, from infrared to ultraviolet. The absolute quantum efficiency of this sensor is shown in Figure 3 [19].

The spectral power distribution of the halogen lamp is shown in Figure 4 [20]. The intensity of the light reflected from the sample surface was measured with a Newport 1830-C optical power meter [21]. The Newport 1830-C optical sensor was located at the level of the light source (Figure 1).
Figure 3: Absolute quantum efficiency of the Redlake EC4020 MegaPlus II camera [19]
Figure 4: Spectral power distribution of the halogen lamp [20]
Calibration of the optical system provides data on the light-source intensity at discrete wavelengths. We can adjust and equalize the light intensity at any wavelength. The calibration accuracy of the optical system ranges from 2% to 7%, depending on the wavelength. The light intensity in the visible and infrared regions of the spectrum was normalized using the LabVIEW program. The computer system also equalizes camera sensitivity across the spectrum through automatic gain control. We used the NI Vision software with the LabVIEW program. The NI IMAQ LabVIEW module communicates with the computer through the high-speed IEEE 1394 interface.
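The equalization step described above can be illustrated outside of LabVIEW. The sketch below (Python with NumPy; the function name and the data layout are our own illustration, not part of the NI Vision toolkit) divides each spectral image by the relative lamp power and filter transmission at its wavelength, so that images taken through different filters become directly comparable:

```python
import numpy as np

def normalize_images(images, lamp_power, filter_transmission):
    """Equalize effective illumination across wavelengths.

    images: dict mapping wavelength (nm) -> 2D intensity array
    lamp_power: dict mapping wavelength -> relative lamp output
    filter_transmission: dict mapping wavelength -> transmission (0..1)
    """
    normalized = {}
    for wl, img in images.items():
        # Divide out the lamp spectrum and the filter transmission so
        # that intensities at different wavelengths are comparable.
        scale = lamp_power[wl] * filter_transmission[wl]
        normalized[wl] = img / scale
    return normalized
```

After this step, a difference in pixel intensity between two spectral images reflects the sample, not the lamp or the filter.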

For the multispectral studies in polarized light, we chose a variety of grains whose surfaces either did or did not contain biological changes due to disease. The samples were placed on a special table at the bottom of the system (Figure 1).
Results and Discussion
It is impossible to determine in advance at which wavelengths of reflected light the maximum change in reflection from the damaged parts of the grains will be observed. This requires a whole set of filters and, therefore, the acquisition of a set of multispectral images. If a specific optical response is found at a certain wavelength of light (that is, with a certain Asahi Spectra filter), that filter can then be used to automatically detect spoiled products.

Figure 5 shows images of grains in polarized light acquired through the 450, 500, 550, 600, 650, and 700 nm narrow band-pass filters with NI Vision in the LabVIEW™ software. The magnitude of the reflected light beam characterizes the degree of surface damage. The most damaged regions show the maximum intensity of reflected light at 550 nm. Thus, NI Vision with the narrow band-pass Asahi Spectra filters can be used to detect changes in the surface properties of biological objects (in our case, damage during storage). The 550 nm Asahi Spectra filter can therefore be used for automatic diagnosis of grain quality as the grains flow along a conveyor.
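The search for the most informative filter can also be expressed in code. A minimal sketch (Python with NumPy; the function name and data layout are hypothetical, and the reference intensity I0 is assumed to come from an undamaged surface) selects the wavelength whose maximum reflected intensity, relative to the reference, is largest:

```python
import numpy as np

def best_wavelength(images, i0):
    """Pick the filter wavelength with the strongest damage response.

    images: dict mapping wavelength (nm) -> 2D intensity array
    i0: reference intensity reflected from an undamaged surface
    """
    # The wavelength where damaged regions stand out most strongly
    # relative to the intact-surface reference.
    return max(images, key=lambda wl: images[wl].max() / i0)
```

In our experiment this selection would return 550 nm, the filter then used for automatic diagnosis on the conveyor.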

Thus, the black-and-white image (obtained with the Redlake Mega Plus II camera) provides information on the intensity of the reflected polarized light in a very narrow spectral band (through the Asahi Spectra Co. Ltd filters). The maximum intensities of the reflected beam correspond to the damaged surfaces of the grains. Using this optical effect, we can identify the damaged grains more clearly. The damage level differs from grain to grain.

The 3D surface plots of the polarized-light images of grains through the 450, 500, 550, 600, 650, and 700 nm narrow band-pass filters, produced in the LabVIEW program, are shown in Figure 6. A quantitative description of such food transformations is very important in practice. With it, we can quantitatively characterize the object of study during the degradation of its food properties.
Figure 5: Grayscale images of grains in polarized light through narrow band-pass filters at 450, 500, 550, 600, 650, and 700 nm, acquired with NI Vision and the LabVIEW™ program
The intensity of light reflected from the damaged and undamaged areas can be used to assess the degree of damage to the grain surface. Since the attenuation coefficient of each filter is known, the intensity of the transmitted light can be equalized. The intensity of light from an undamaged surface can be used for calibration.

A more accurate method of estimation is the calculation of the average light-intensity distribution of the image:
Figure 6: 3D surface plots of polarized-light images of grains through narrow band-pass filters at 450, 500, 550, 600, 650, and 700 nm, produced with NI Vision
a = \frac{\int_S I(x,y)\, dx\, dy}{s \times I_0}          (1)
where I(x,y) is the distribution of light intensity, s is the image area, and I_0 is a reference magnitude for samples without damage. Computation of the parameter a demands significant machine time.
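In discrete image coordinates, the integral in Eq. (1) becomes a sum over pixels. A minimal sketch (Python with NumPy; the function and variable names are our own illustration, not part of NI Vision):

```python
import numpy as np

def parameter_a(image, i0):
    """Average normalized intensity of an image, Eq. (1).

    image: 2D array of reflected-light intensities I(x, y)
    i0: reference intensity of an undamaged surface
    """
    s = image.size                  # image area s, in pixels
    return image.sum() / (s * i0)   # a = (sum of I(x, y)) / (s * I0)
```

For a fully undamaged sample, every pixel equals i0 and the function returns 1.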

As a first approximation, the damage level can be calculated as the ratio of the maximum intensity of light from the damaged region to the intensity of light reflected from the intact region:
b = \frac{I_{max}}{I_0}          (2)
where I_max is the maximal intensity of light reflected from a damaged region.
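Eq. (2) reduces to a single maximum over the image, which is why it is so much cheaper than Eq. (1). A sketch (Python with NumPy; names are illustrative):

```python
import numpy as np

def parameter_b(image, i0):
    """Ratio of the maximum reflected intensity to the intact-surface
    reference, Eq. (2). b = 1 means no detectable damage."""
    return image.max() / i0   # I_max / I0
```

Unlike the parameter a, computing b needs only one pass over the pixels with no summation, which makes it suitable for real-time use.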

Figure 7 shows the dependence of the parameters a and b on the wavelength λ of the spectrum transmitted by the corresponding filters. The two parameter estimates almost coincide. We use the magnitude I_0 of the samples without damage as the reference; in this case the parameter b equals 1, and the damage degree corresponds to zero percent.
Figure 7: The dependence of the parameters a and b on the wavelength λ
In our experiment, the maximal magnitude of the parameter b at 550 nm is about 9. We take this to correspond to a damage degree of 100% (maximal damage for grains).

An NI LabVIEW module was designed to automatically determine the coefficient b from a picture in JPEG format. Figure 8 shows the NI LabVIEW block diagram of this module. The calculation of the parameter b does not take much time, so the module can be used on a video signal for real-time measurements.
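The block diagram in Figure 8 is graphical, but its logic can be sketched in ordinary code. The version below (Python with NumPy; the linear scale from b = 1 at 0% to b = 9 at 100% follows the text, while the function name and the clipping behavior are our own assumptions) maps a decoded grayscale frame to a damage degree in percent:

```python
import numpy as np

# From our experiment: b = 1 -> 0 % damage, b = 9 -> 100 % damage.
B_MIN, B_MAX = 1.0, 9.0

def damage_percent(frame, i0):
    """Map a grayscale frame to a damage degree in percent.

    frame: 2D array from the 550 nm channel (e.g. a JPEG decoded with
           np.asarray(PIL.Image.open("grain.jpg").convert("L")))
    i0: reference intensity of an undamaged surface
    """
    b = frame.max() / i0                        # Eq. (2)
    pct = 100.0 * (b - B_MIN) / (B_MAX - B_MIN) # linear damage scale
    return float(np.clip(pct, 0.0, 100.0))      # keep within 0..100 %
```

Because the computation is a single maximum and a linear rescaling, it easily keeps up with a conveyor-rate video stream.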
Figure 8: NI LabVIEW block diagram for automatic determination of the parameter b
Conclusion
Experimentally, we observed the maximum spectral effect in the grain damage area using a special 550 nm filter from Asahi Spectra Co. Ltd, which can be used for automatic quality control of grain based on the modern hardware-software multispectral NI Vision system. The proposed combined method for diagnosing the surfaces of biological objects, using multispectral imaging and polarized-light technology, allows the spectral changes of the studied food materials to be detected more clearly.

Automatic measurement of the grain damage level by formula 1 or 2 can be organized without contact with the studied materials and directly in their stream. The proposed parameter b is useful for the quantitative characterization of such food samples. The acquisition of streaming data on the damage level of grains can be fully automated using NI Vision.
References
  1. Robotics and automation in the food industry. Current and future technologies. Caldwell DG, editor. Woodhead Publishing Limited; 2013. p. 75-108.
  2. Mahendran R, Jayashree GC, Alagusundaram K. Application of Computer Vision Technique on Sorting and Grading of Fruits and Vegetables. J Food Process Technol. 2012;S1-001:1-7. doi:10.4172/2157-7110.S1-001
  3. Levenson R, Fornari A, Loda M. Multispectral imaging and pathology: seeing and doing more. J Expert Opin Mol Diagn. 2008;2(9):1067-1081. doi: 10.1517/17530059.2.9.1067
  4. Khodabakhshian R, Emadi B, Khojastehpour M, Golzarian MR, Sazgarnia A. Development of a multispectral imaging system for online quality assessment of pomegranate fruit. International Journal of Food Properties. 2017;20(1):107-118.
  5. Bejinariu SI, Rotaru F, Niţă CD, Costin M. Morphological Wavelets for Panchromatic and Multispectral Image Fusion. Soft Computing Applications. 2013;195:573-583.
  6. Su WH, Sun DW. Multispectral Imaging for Plant Food Quality Analysis and Visualization. Comprehensive Reviews in Food Science and Food Safety. 2018;17(1):220–239.
  7. Qin J, Chao K, Kim MS, Lu R, Burks TF. Hyperspectral and multispectral imaging for evaluating food safety and quality. J Food Eng. 2013;118(2):157–171.
  8. Huang H, Liu L, Ngadi MO. Recent Developments in Hyperspectral Imaging for Assessment of Food Quality and Safety. Sensors (Basel). 2014;14(4):7248-7276. doi: 10.3390/s140407248
  9. Prince JL, Links JM. Medical Imaging, Signals, and Systems. 2nd ed. New York: Prentice Hall; 2006.
  10. Levenson RM, Mansfield JR. Multispectral imaging in biology and medicine: Slices of life. CytometryA. 2006;69(8):748-758.
  11. Arimoto H, Furukawa H. Retinal blood oxygen saturation mapping by multispectral imaging and morphological angiography. Conf Proc IEEE Eng Med Biol Soc. 2007;2007:1627-1630.
  12. Thévenaz P, Blu T, Unser M. Image interpolation and resampling. Handbook of medical imaging, processing and analysis. 2000: 393-420.
  13. Encyclopedia of food safety. 2nd ed. Yasmine Motarjemi, Gerald Moy, Ewen Todd, editors. Elsevier, 2014. vol.1; p. 580.
  14. Handbook of Food Analysis. 3rd ed. Leo M.L. Nollet, editor. CRC Press; 2015. p. 1568.
  15. Narrow band-pass Asahi filters. Asahi Spectra Co Ltd. Available from:  http://www.asahi-spectra.com/
  16. Vashpanov Y, Son JY, Kwack KD. Multispectral images in polarized light for medical applications. Proc of IEEE International Conference on Multisensor Fusion and Integration for Intelligent Systems. 2008:86-89.
  17. Son JY, Vashpanov YA, Jung DH, Lee DS, Kwack Kae-Dal. Polarized Light for Measuring a Human Skin Feature Indicating Aging. Japanese Journal of Applied Physics. 2009;48.
  18. National Instrument Corporation (USA). Available from: http://www.ni.com/vision/
  19. MegaPlus: ES4020, Princeton Instruments. Available from: www.paicton.com; MegaPlus II Controller, Redlake. Available from: www.readlake.com
  20. MacDonald L, Giacometti A, Campagnolo A, Robson S, Weyrich T, Terras M, et al. Multispectral Imaging of Degraded Parchment. Computational Color Imaging. 2013:143-157.
  21. Picowatt Digital Optical Power Meter Model 1830-C Newport. Available from: www.newport.com