The poultry industry faces growing challenges in ensuring animal welfare, preventing disease outbreaks, and addressing labor shortages. Traditional machine vision-based behavior monitoring requires intensive manual labeling, which is impractical for large-scale operations, necessitating advanced, efficient, and scalable solutions. Using artificial intelligence technologies to monitor poultry behavior accurately in real time can revolutionize animal welfare practices and enable proactive management. Among the many aspects of poultry management in commercial houses, such as broiler houses and cage-free layer houses, monitoring normal and abnormal behaviors is paramount (Figure 1).  

Figure 1. Active laying hens in cage-free houses (photo credit: United Egg Producers)

Researchers at the University of Georgia (UGA) have tested different machine vision methods to monitor applied poultry behaviors. Recently, UGA and University of Arkansas researchers developed a new method (Figure 2) for monitoring chickens’ applied behaviors (Figure 3; e.g., feeding, foraging, and piling) more quickly by using semi-supervised auto-labeling and zero-shot recognition.

Figure 2. Semi-supervised learning framework with zero-shot text embedding for auto-labeling.

Figure 3. Applied poultry behaviors detected in cage-free facilities on UGA research farm.
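The auto-labeling idea can be sketched as a simple loop: a zero-shot, text-prompted detector scores candidate behaviors on unlabeled images, and only detections above a confidence threshold are kept as pseudo-labels. The sketch below is a minimal illustration, not the authors' implementation; the `zero_shot_detect` stub is hypothetical and stands in for a text-prompt model such as YOLO-World.

```python
# Minimal sketch of semi-supervised auto-labeling with a text-prompted,
# zero-shot detector. `zero_shot_detect` is a hypothetical stand-in for a
# model (e.g., YOLO-World) that scores text-described behaviors on an image.

BEHAVIOR_PROMPTS = ["feeding", "foraging", "piling"]  # text prompts
CONFIDENCE_THRESHOLD = 0.5  # keep only detections scored >= 50%

def zero_shot_detect(image, prompts):
    """Hypothetical detector: returns (behavior, confidence) pairs."""
    # In practice this would run a text-prompt model on the image;
    # here we simply read the scores stored with the toy image.
    return [(b, image["scores"].get(b, 0.0)) for b in prompts]

def auto_label(unlabeled_images, prompts, threshold):
    """Keep only high-confidence detections as pseudo-labels."""
    pseudo_labels = []
    for image in unlabeled_images:
        for behavior, conf in zero_shot_detect(image, prompts):
            if conf >= threshold:
                pseudo_labels.append((image["id"], behavior, conf))
    return pseudo_labels

# Toy unlabeled images with per-behavior detector scores.
images = [
    {"id": "img001", "scores": {"feeding": 0.92, "piling": 0.31}},
    {"id": "img002", "scores": {"foraging": 0.67}},
]
labels = auto_label(images, BEHAVIOR_PROMPTS, CONFIDENCE_THRESHOLD)
# Only the two detections at or above 50% confidence survive.
```

The surviving pseudo-labels would then be merged with the manually labeled images to retrain the detector, which is the semi-supervised half of the framework.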

The Semi-YOLOWorld model for text-prompt detection and auto-labeling of images showed significant improvements in behavior detection and labeling accuracy, demonstrating the model’s potential to automate image annotation based on text input (Figure 4). This research successfully integrated semi-supervised learning and zero-shot models to improve poultry behavior detection in large, dynamic datasets, comparing performance across supervised, semi-supervised, and zero-shot learning approaches. The YOLOv9 model achieved strong baseline results (81.1% precision, 77.9% recall, 79.5% F1-score), while the semi-supervised I-640 model with 2,500 pseudo-labeled images further improved performance to 86.9% precision, 84.3% recall, and 84.9% F1-score. Using semi-supervised models with a confidence score of at least 50% alongside the YOLOv9 model accelerated data labeling while maintaining accuracy.
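The reported F1-scores follow the standard harmonic mean of precision and recall; the YOLOv9 baseline numbers above reproduce the reported 79.5% exactly. (The semi-supervised model's 84.9% sits slightly below the harmonic mean of its overall precision and recall, which may reflect averaging across behavior classes; that interpretation is an assumption, not stated in the source.)

```python
# F1 is the harmonic mean of precision and recall:
#   F1 = 2 * P * R / (P + R)

def f1_score(precision, recall):
    """Harmonic mean of precision and recall (values in percent)."""
    return 2 * precision * recall / (precision + recall)

# YOLOv9 baseline from the study: 81.1% precision, 77.9% recall.
baseline_f1 = f1_score(81.1, 77.9)
print(round(baseline_f1, 1))  # 79.5, matching the reported F1-score
```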

Figure 4. Text-prompt-based detection and auto-labeling of behaviors in cage-free hens.

Further reading:

Bist, R., L. Chai, S. Subedi, Y. Tian, and D. Wang. 2025. Enhancing poultry multi-behavior detection with semi-supervised auto-labeling and prompt-driven zero-shot recognition. Computers and Electronics in Agriculture, 111178.

https://doi.org/10.1016/j.compag.2025.111178