Visual Inspection of Biscuit Packages on Packaging Lines

By Raghava Kashyapa

Published on 27-10-2022

The customer is a multinational food processing company with a diverse product line comprising biscuits, confectioneries, powdered beverages, and so on. Qualitas designed a cutting-edge integrated machine vision solution to automate and streamline their packaging inspection processes.

Problem Definition

The customer’s production process combined manual and automated steps for final packaging before the goods were dispatched. There were two production lines running in sequence – an infeed line where biscuit slugs were manually placed in packaging buckets, and an outfeed line where the final packaging took place. The existing manual inspections made the process inefficient, as they were error-prone and labor-intensive.

Errors resulting from manual inspection were the following:

  • Placing biscuit slugs into buckets on the infeed line was a manual process. As a result, there were occasional instances of the wrong number of slugs going into a bucket as well as a mix-up of biscuit flavors.
  • The loaded buckets were inserted into final packaging cartons on the outfeed line. Manual inspection of these cartons for errors such as excessive flap gaps, misalignment and incorrect/illegible code printing was inadequate.

Solution and Implementation

Qualitas was tasked with developing a machine vision system that would automate the process of identifying and segregating defective inner and outer packaging as they passed through the corresponding infeed and outfeed conveyor lines.

AI and machine vision-based technologies were used to automate the inspection process on both conveyor lines. Qualitas’s 4I methodology – Install, Instruct, Inspect, and Improve – was used to achieve the desired outcome.

Install

A Qualitas EagleEye® camera with appropriate lighting and a proximity sensor was installed on the infeed line to inspect incoming slugs in the buckets for miscounts and mismatches. Figure 1 shows the EagleEye camera setup in the infeed line.

Fig – 1 (One EagleEye camera setup in the infeed line)
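For readers who want a concrete picture of this step, the sketch below shows a proximity-triggered capture loop in Python with OpenCV. It is illustrative only: the sensor read is stubbed out, the file layout is assumed, and it does not use the actual EagleEye capture API.

```python
import time
from pathlib import Path

import cv2

CAPTURE_DIR = Path("infeed_captures")
CAPTURE_DIR.mkdir(exist_ok=True)


def bucket_present() -> bool:
    """Stand-in for the proximity-sensor read; in the real system this comes from the line's I/O hardware."""
    return True  # replace with a digital-input read (PLC, GPIO, etc.)


def capture_loop(camera_index: int = 0, max_frames: int = 100, debounce_s: float = 0.5) -> None:
    """Grab one frame per bucket that trips the proximity sensor and save it for inspection."""
    cam = cv2.VideoCapture(camera_index)
    try:
        saved = 0
        while saved < max_frames:
            if bucket_present():
                ok, frame = cam.read()
                if ok:
                    cv2.imwrite(str(CAPTURE_DIR / f"bucket_{int(time.time() * 1000)}.png"), frame)
                    saved += 1
                time.sleep(debounce_s)  # crude debounce: roughly one image per bucket
    finally:
        cam.release()


if __name__ == "__main__":
    capture_loop()
```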

Here’s a video of the outfeed line, which shows three cameras positioned with adequate lighting to capture multiple images of biscuit cartons.

Figure 2 shows the three-camera setup in the outfeed line.

Fig – 2 (Three-camera setup in the outfeed line)

Images from both lines were captured and sent to a Cloud repository, where human operators examined them using the EagleEye software application. This process ran in iterations until a sufficient number and variety of validated images had been collected.
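As a rough illustration of the capture-to-cloud hand-off, the snippet below pushes saved frames to an S3-compatible bucket using boto3. The bucket name and key layout are assumptions for the sketch; the actual EagleEye Cloud ingestion interface is not described here.

```python
from pathlib import Path

import boto3

s3 = boto3.client("s3")
BUCKET = "biscuit-line-captures"  # placeholder bucket name


def upload_batch(local_dir: str, line_name: str) -> int:
    """Upload every captured image in a folder, keyed by production line, and return the count."""
    count = 0
    for image in Path(local_dir).glob("*.png"):
        s3.upload_file(str(image), BUCKET, f"{line_name}/{image.name}")
        count += 1
    return count


# Example: push the infeed line's latest captures for human review.
# upload_batch("infeed_captures", "infeed")
```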

Instruct

This collection of validated images was then moved into the EagleEye software pipeline’s annotation and training phases. The first phase was a one-of-a-kind pre-processing step: a fully automated image-processing operation intended to boost the success of downstream neural network training. Figure 3 shows how pre-processing was applied to mask out a distracting detail on certain images.

Fig – 3 (Pre-processing of images by removing distracting and irrelevant image details)
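The kind of masking shown in Figure 3 can be reproduced with a few lines of OpenCV. The sketch below simply blacks out a fixed rectangular region before the image reaches annotation and training; the coordinates are placeholders, not values from the actual deployment.

```python
import cv2
import numpy as np


def mask_distraction(image: np.ndarray, box: tuple[int, int, int, int]) -> np.ndarray:
    """Black out a fixed rectangular region (x, y, width, height) that would distract the model."""
    x, y, w, h = box
    masked = image.copy()
    masked[y:y + h, x:x + w] = 0
    return masked


# Example: mask an assumed 200x80-pixel area in the top-left corner of each frame.
frame = cv2.imread("bucket_0001.png")
if frame is not None:
    cv2.imwrite("bucket_0001_masked.png", mask_distraction(frame, (50, 30, 200, 80)))
```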

Human operators annotated the pre-processed images in preparation for training a few Deep Learning neural networks. The operators used the EagleEye software application’s powerful and semi-automated annotation tools. Figure 4 shows the annotation tool in use, with an operator annotating a biscuit slug.

Fig – 4 (Annotation of images for Deep Learning)

Figure 5 shows examples of images representing defective and good packaging.

Fig – 5 (Pass and Fail image examples)

Figure 6 shows yet another example of applying Qualitas’s annotation software to these images.

Fig – 6 (Labelling of the defects in order to teach the downstream software)

This was followed by the actual Deep Learning model training phase. Neural network models appropriate for the required defect detections were selected, set up, and fed with these annotated images. Qualitas’s QEP software implements an automatic image augmentation capability at this stage. Specifically, each human-annotated image can generate one or more variations of itself through image-processing operations such as brightness/contrast changes, horizontal flipping, image shearing, and others. This augmentation beneficially amplifies the image training set.
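As a rough illustration of what such augmentation looks like, the snippet below generates brightness-shifted, flipped, and sheared variants of one image with OpenCV. The exact operations and parameters used by QEP are not public, so these values are placeholders.

```python
import cv2
import numpy as np


def augment(image: np.ndarray) -> list[np.ndarray]:
    """Return simple brightness, flip, and shear variants of one annotated training image."""
    h, w = image.shape[:2]
    variants = []

    # Brightness/contrast change: new_pixel = alpha * pixel + beta.
    variants.append(cv2.convertScaleAbs(image, alpha=1.2, beta=20))

    # Horizontal flip.
    variants.append(cv2.flip(image, 1))

    # Mild horizontal shear via an affine warp.
    shear = np.float32([[1, 0.15, 0], [0, 1, 0]])
    variants.append(cv2.warpAffine(image, shear, (w, h)))

    return variants
```

Note that when geometric transforms such as flips or shears are applied, the bounding-box annotations must be transformed along with the image, which is why augmentation is normally handled inside the training pipeline rather than by hand.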

The training and testing stages were repeated until adequate model accuracies were achieved.

Figure 7 shows the EagleEye Deep Learning training dashboard.

Fig – 7 (EagleEye’s Deep Learning dashboard)

Inspect

The neural networks from the previous stage were downloaded, optimized, and installed on the production-floor Vision Controller computers for live deployment on both lines. The production line cameras captured images of both the biscuit slug buckets and the cartons as they moved along their respective production lines. The Vision Controller’s inference neural networks processed these images to detect the defects for which they were trained. In addition, custom-developed graphical reporting dashboards on the Vision Controller’s display provided real-time updates and feedback on progress to production line operators.
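Conceptually, the live inference step reduces to a loop that feeds each captured frame through the optimized network and returns a pass/fail verdict with a confidence score. The sketch below assumes a model exported to ONNX and run with onnxruntime, which is one common deployment route, not necessarily the stack used on the Vision Controller; the model file name, input size, and class order are assumptions.

```python
import cv2
import numpy as np
import onnxruntime as ort

session = ort.InferenceSession("carton_inspector.onnx")  # assumed exported model file
INPUT_NAME = session.get_inputs()[0].name


def infer_pass_fail(frame: np.ndarray, threshold: float = 0.5) -> tuple[bool, float]:
    """Run one frame through the network and return (passed, confidence in the verdict)."""
    # Assumed preprocessing: resize to the model's input size, scale to [0, 1], NCHW layout.
    blob = cv2.resize(frame, (224, 224)).astype(np.float32) / 255.0
    blob = np.transpose(blob, (2, 0, 1))[np.newaxis, ...]
    scores = session.run(None, {INPUT_NAME: blob})[0][0]
    fail_prob = float(scores[1])  # assumed class order: [pass, fail]
    return fail_prob < threshold, 1.0 - fail_prob
```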

Figure 8 shows the inspection dashboard displaying the infeed line inspection for miscount and mismatch of slugs.

Fig – 8 (Dashboard displaying the infeed line inspection for miscount and mismatch of slugs)

Here’s a video that shows the outfeed line inspection for flap gaps, flap misalignment, OCR, and barcode verification.

Figure 9 shows the EagleEye® Cloud dashboard of the outfeed line inspection for flap gaps, flap misalignment, OCR, and barcode verification.

Fig – 9 (Dashboard displaying the outfeed line inspection for flap gaps, flap misalignment, barcode, and OCR)

Qualitas integrated this inspection system into the customer’s factory automation processes so that pass/fail judgements made by the inferencing software could trigger the appropriate factory-floor actions. A ‘fail’ signal on the outfeed inspection triggered the mechanical rejection of the offending biscuit carton.
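The integration boils down to sending a signal to the rejection station whenever the verdict is ‘fail’. Here is a minimal sketch, assuming the reject actuator is driven by a simple TCP command to a PLC; the address and command format are placeholders, as the case study does not describe the actual interface.

```python
import socket

PLC_ADDRESS = ("192.168.0.50", 9000)  # placeholder address for the line's PLC
REJECT_COMMAND = b"REJECT\n"          # placeholder command understood by the rejection station


def handle_result(passed: bool) -> None:
    """Send a reject command to the factory-automation controller when inspection fails."""
    if passed:
        return
    with socket.create_connection(PLC_ADDRESS, timeout=1.0) as conn:
        conn.sendall(REJECT_COMMAND)
```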

Here’s a video that shows a visual signal indicating whether the biscuit carton has passed the inspection.

Another video shows the production line rigged with the factory automation rejection system, which pushes rejected packages off the conveyor line before final dispatch.

Improve

Improve, the final step of the 4I process, was a continual-improvement, closed-feedback-loop mechanism known as grading. As the Inspect step’s inference determined the pass/fail status of collected images, a confidence score was assigned to each. The purpose of Improve was to collect the subset of images with low confidence scores and compare the system’s verdict with the judgement of a human expert. Images whose automated verdict differed from the expert’s judgement were collected and used to retrain the pertinent neural networks, resulting in continuous inspection improvement. Figure 10 shows the grading mechanism, in which human operators marked inferences as ‘judged correctly’ or ‘misjudged’ based on the processed and graded values.

Fig – 10 (Grading process)
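The grading loop can be pictured as a filter over inference records: keep the low-confidence verdicts, route them to a human expert, and flag disagreements for retraining. The sketch below is illustrative only; the record fields and the confidence cutoff are assumptions.

```python
from dataclasses import dataclass


@dataclass
class InferenceRecord:
    image_path: str
    predicted_pass: bool
    confidence: float  # model confidence in its own verdict, 0.0-1.0


def select_for_grading(records: list[InferenceRecord], cutoff: float = 0.8) -> list[InferenceRecord]:
    """Pick the low-confidence verdicts that a human expert should re-check."""
    return [r for r in records if r.confidence < cutoff]


def retraining_set(graded: list[tuple[InferenceRecord, bool]]) -> list[str]:
    """Keep images where the expert's pass/fail verdict disagrees with the model ('misjudged')."""
    return [rec.image_path for rec, expert_pass in graded if expert_pass != rec.predicted_pass]
```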

The complete ‘acquire-process-judge-act’ pipeline increased production efficiency by 50% and reduced false acceptance by 95%. Because of automated rejection, overall equipment efficiency increased by 55%. Detection of variant mismatches and of missing slugs in the infeed buckets was 100% accurate.

Conclusion

The Food and Beverage sector has grown highly competitive as consumers continue to seek superior quality and presentation. Qualitas’s machine vision technology helped our customer meet the standards of efficiency and consistent quality in economic production. The application of machine vision technologies in packaging optimized production and quality-control processes, increased quality assurance, expedited shipments, and improved customer satisfaction.
