Building Sustainable Livestock Farming Computer Vision Models on Edge Devices
Faromatics is a Precision Livestock Farming company and the creator of ChickenBoy, the world’s first ceiling-suspended robot that monitors broiler chickens and helps farmers improve both animal welfare and farm productivity. In this two-month Omdena Challenge, our collaborators built a machine-learning-based solution that detects chickens in camera footage.
Today’s livestock production is very challenging for animals. Large flocks – driven by low margins for farmers, but also by a push to use fewer inputs for more sustainable production – mean less attention per animal, more disease, and more welfare challenges. The solution is to assist farmers with automated surveillance of the animals. Through continuous monitoring, we can detect the animals’ needs quickly and reliably. Meeting those needs helps farmers improve the health and welfare of their animals and become more competitive.
Globally, we raise 60 billion chickens every year for meat. Of these, 3 billion don’t survive the rearing process, and 1.6 billion are rejected at the slaughterhouse because of illness, scratches, bruises, and other signs of welfare failures, so better monitoring makes a huge difference for the animals. Add to this the fact that we could save 3 million tons of feed every year with better digestion management, and it is clear there is a chance to make chicken products that are not only healthy but can also be purchased in good conscience by those who wish to continue consuming meat.
With the ChickenBoy robot, we are already going a long way towards this goal, but we need another push to increase our ability to detect anomalies in chicken behavior.
The project solution
The goal was to implement an individual chicken detector based on a machine learning model (such as YOLO or R-CNN) and then track the movement of each chicken individually, frame by frame.
The model runs on the hardware of the Faromatics robot, which operates with an 800×600 RGB camera and a Raspberry Pi 4 with a Google Coral Edge TPU. The developed algorithm must run in real time (at a minimum of 5 FPS).
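The frame-by-frame tracking step can be illustrated with a minimal sketch that greedily matches each new detection to the existing track it overlaps most, by Intersection over Union. The box format (x1, y1, x2, y2) and the function names are our assumptions for illustration, not the project’s actual code:

```python
def iou(a, b):
    """Intersection over Union of two (x1, y1, x2, y2) boxes."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    union = area_a + area_b - inter
    return inter / union if union else 0.0

def update_tracks(tracks, detections, min_iou=0.3):
    """Assign each detection in the current frame to the existing track
    it overlaps most (IoU above min_iou), or start a new track.
    `tracks` maps track id -> last known box."""
    next_id = max(tracks, default=-1) + 1
    assigned = {}
    free = dict(tracks)  # tracks not yet matched this frame
    for det in detections:
        best_id, best = None, min_iou
        for tid, box in free.items():
            score = iou(det, box)
            if score > best:
                best_id, best = tid, score
        if best_id is None:
            best_id = next_id  # no sufficient overlap: new track
            next_id += 1
        else:
            free.pop(best_id)  # each track matches at most one detection
        assigned[best_id] = det
    return assigned
```

For example, starting from one track at (0, 0, 10, 10), a detection at (1, 1, 11, 11) keeps that track’s id, while a far-away detection spawns a new id. A production tracker would typically add motion prediction and handle missed detections, which this greedy sketch omits.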
Faromatics provided several videos recorded with the camera. The project team delivered the following:
The Python code (chicken detector + chicken tracker)
The parameters learned by the algorithm
The labeled image database on which the model was trained
To ensure the quality of the algorithms, the following metrics were used for the chicken detection algorithm:
Intersection over Union (IoU): quantifies the similarity between the ground truth bounding box and the predicted one.
Recall: the probability of ground truth objects being correctly detected.
Precision: the probability of the predicted bounding boxes matching actual ground truth boxes.
The minimum performance expected for the released model is 0.75 for both IoU and Recall, and 0.9 for Precision. This is evaluated on a test set of images independent of the training set.
To evaluate the tracking part, we used the following metric:
Multiple Object Tracking Precision (MOTP): the average dissimilarity (in pixels) between all true positives and their corresponding ground truth targets. This value must be less than 5% of the image diagonal (in pixels). The score is evaluated on a test set of images independent of the training set.
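As a sketch of how that acceptance threshold works out for the robot’s camera: with distance-based dissimilarity between matched box centers, MOTP is just the mean pixel distance over all true positives, and for an 800×600 frame the diagonal is 1000 px, so the bound is 50 px. The center-distance formulation is our assumption for illustration:

```python
import math

def motp(matches):
    """matches: list of (pred_center, gt_center) pairs over all true
    positives across frames. Returns the mean pixel distance (MOTP
    with a distance-based dissimilarity)."""
    if not matches:
        return 0.0
    dists = [math.dist(p, g) for p, g in matches]
    return sum(dists) / len(dists)

# Acceptance threshold for the 800x600 camera: 5% of the diagonal.
W, H = 800, 600
threshold = 0.05 * math.hypot(W, H)  # = 50.0 px
```

For example, two true positives whose centers are 5 px and 0 px from their targets give a MOTP of 2.5 px, comfortably under the 50 px bound.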
Below is an example image captured with one of the cameras.