While we are most commonly known as the Bee Lab, a more accurate name is the Social Insect Behavior Lab: our overarching goal is to study the behavior of social insects, which include ants as well as bees. Studying the behavior of another species is not easy. For obvious reasons, we cannot ask the animals directly about their activities. Instead, we must actively observe them and, from those observations, extract information that helps answer our research question. One way we make this process easier in the Social Insect Behavior Lab is to record videos of our ant experiments so that we can observe them at a later, more convenient time.

However, we still face a larger problem: actually watching the ant videos. We currently have 360 hours of recorded ant experiments, which is a LOT of video to watch. If a poor student watched ant videos for 8 hours a day, it would take them 45 days to get through it all; that’s about 7 weeks, or half a semester at Mudd … 🤔 In addition, since ants are not very active creatures, large portions of many of the videos don’t have much going on. So although the data we hope to collect could lead to very interesting conclusions, we must collect them by combing through hours of boring still-frames.
*Rajan watches ants.*
For the past few weeks I have been working to solve this problem computationally. We aim to use computer vision to track ants and, from that information, figure out where each ant is going. MATLAB is commonly used for image processing, so it already has built-in functions for object tracking. Since our videos are taken from a stationary camera and contain little noise, the tracking algorithm does not need to distinguish much detail. However, since we need to follow multiple ants at once, we must apply an algorithm that can track multiple objects. After some research, I found that an effective and efficient method for this problem is automatic detection combined with motion-based tracking. The functions I applied to our ant videos decompose the problem into three sub-problems:
1. Motion detection
2. Blob analysis
3. Matching moving objects between consecutive frames
Consider sub-problem one: motion detection. The MATLAB function for motion detection uses a background subtraction algorithm. The algorithm first estimates a background frame, which is updated every K frames. It then computes the pixel-wise difference between the current frame and the current background frame. If the difference at a particular pixel is nonzero, that pixel is marked as foreground.
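The idea can be shown in a few lines. This is a minimal Python sketch of frame differencing, not the MATLAB function our pipeline actually uses; `foreground_mask` and its `threshold` parameter are illustrative names (a threshold of 0 corresponds to the "nonzero difference" rule above, while a small positive value would suppress sensor noise):

```python
def foreground_mask(frame, background, threshold=0):
    """Mark a pixel as foreground if it differs from the background.

    frame, background: 2-D grayscale images as nested lists of ints.
    """
    return [
        [1 if abs(p - b) > threshold else 0
         for p, b in zip(frame_row, bg_row)]
        for frame_row, bg_row in zip(frame, background)
    ]

# A stationary background with one bright "ant" pixel in the current frame:
background = [[10, 10, 10],
              [10, 10, 10]]
frame      = [[10, 80, 10],
              [10, 10, 10]]
mask = foreground_mask(frame, background)
# mask → [[0, 1, 0], [0, 0, 0]]
```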
Left: a frame of a cropped region of an ant video. Right: black and white mask of the same frame, black denoting the background and white denoting the object.
Since the only objects that are moving in our videos are ants, motion detection is sufficient to find ants even though the algorithm isn’t looking for ants specifically.
The second sub-problem we tackle is blob analysis. A “blob” is a region of connected foreground pixels. A simple breadth-first search finds all the blobs in a frame, and each blob is assigned an ID; a bounding box, the smallest rectangle containing all of the blob’s pixels; and a centroid, the center of the blob. In our ant videos, a blob corresponds to an individual ant, so after this step we have located the ants in a single frame of video.
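Here is a small Python sketch of that breadth-first search over a binary mask. The function name `find_blobs` and the dictionary layout are my own illustration, not MATLAB’s blob analysis interface:

```python
from collections import deque

def find_blobs(mask):
    """BFS flood-fill over a binary mask; returns one dict per blob with an
    id, a bounding box (min_x, min_y, max_x, max_y), and a centroid (x, y)."""
    rows, cols = len(mask), len(mask[0])
    seen = [[False] * cols for _ in range(rows)]
    blobs = []
    for sy in range(rows):
        for sx in range(cols):
            if mask[sy][sx] == 0 or seen[sy][sx]:
                continue
            # BFS from this unvisited foreground pixel collects one blob
            queue = deque([(sy, sx)])
            seen[sy][sx] = True
            pixels = []
            while queue:
                y, x = queue.popleft()
                pixels.append((y, x))
                for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    ny, nx = y + dy, x + dx
                    if 0 <= ny < rows and 0 <= nx < cols \
                            and mask[ny][nx] and not seen[ny][nx]:
                        seen[ny][nx] = True
                        queue.append((ny, nx))
            ys = [y for y, _ in pixels]
            xs = [x for _, x in pixels]
            blobs.append({
                "id": len(blobs),
                "bbox": (min(xs), min(ys), max(xs), max(ys)),
                "centroid": (sum(xs) / len(xs), sum(ys) / len(ys)),
            })
    return blobs

mask = [[0, 1, 1, 0],
        [0, 1, 1, 0],
        [0, 0, 0, 1]]
blobs = find_blobs(mask)   # two blobs: a 2x2 "ant" and a lone pixel
```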
Left: a frame of a cropped region of an ant video. Right: black and white mask of the same frame, with yellow box bounding the detected ant.
The last sub-problem, and arguably the most important, is to match moving objects between consecutive frames. This is also the most difficult sub-problem, because multiple ants could bunch together or cross paths, introducing errors into the tracking output. Associating blobs across consecutive frames creates “tracks”, each represented by a matrix of x- and y-coordinates and a unique identifier. A Kalman filter predicts the motion of each track, and each blob is then assigned to its best-matching track. For each frame:
- If a blob is assigned to a track, update the track’s information.
- If a blob is not assigned to any track (a new blob appears in the frame), start a new track.
- If a track is left unassigned (its blob disappears from the frame), mark the track “invisible”.
- If a track has been invisible for enough consecutive frames, delete it (the blob has surely left the frame and is assumed not to come back).
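The per-frame bookkeeping above can be sketched as follows. This is an illustrative Python version, with a greedy nearest-centroid match standing in for the Kalman-filter prediction and assignment step; the function name `update_tracks`, the track dictionary layout, and the two tuning constants are my own assumptions, not the MATLAB implementation:

```python
import math

MAX_INVISIBLE = 10   # frames a track may go unseen before deletion (illustrative)
MAX_DISTANCE = 5.0   # gating distance for assigning a blob to a track

def update_tracks(tracks, centroids, next_id):
    """One frame of track bookkeeping.

    tracks: list of {"id", "positions", "invisible"} dicts, mutated in place.
    centroids: blob centroids detected in the current frame.
    """
    unmatched = list(range(len(centroids)))
    for track in tracks:
        last = track["positions"][-1]
        best, best_d = None, MAX_DISTANCE
        for i in unmatched:
            d = math.dist(last, centroids[i])
            if d < best_d:
                best, best_d = i, d
        if best is not None:
            # Blob assigned to this track: update the track's information
            track["positions"].append(centroids[best])
            track["invisible"] = 0
            unmatched.remove(best)
        else:
            # No blob for this track: its blob disappeared this frame
            track["invisible"] += 1
    # Blobs with no matching track: start new tracks
    for i in unmatched:
        tracks.append({"id": next_id, "positions": [centroids[i]], "invisible": 0})
        next_id += 1
    # Delete tracks invisible for too long (the ant has left the frame)
    tracks[:] = [t for t in tracks if t["invisible"] <= MAX_INVISIBLE]
    return next_id

# Two frames: one ant moves slightly, a second ant enters the frame
tracks = []
next_id = update_tracks(tracks, [(0.0, 0.0)], 0)
next_id = update_tracks(tracks, [(1.0, 0.0), (50.0, 50.0)], next_id)
```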
This is an example of the tracking algorithm applied to our ant videos:
The resulting output is a data frame of the coordinates of each detected ant, tracked from when it enters the frame until it leaves the frame. From this information, we can then determine how each ant is moving within our experimental setup and, from that, infer the ant colony’s behavioral patterns.
There is still much more work to be done before we can successfully and dependably use this program to analyze all of our ant videos. The next steps in my research include:
- Process the output from the tracking function into a useful format for data analysis
- Adjust parameters in each step of the motion-based tracking algorithm in order to optimize it for our purposes
- Streamline the pipeline for convenient user input and output
- Perform accuracy analysis by comparing the program’s results with the results from students’ counts
While the ant tracking program is still in its early stages, I am optimistic that, in the near future, we can begin using it to collect meaningful data quickly and accurately, while saving future students from watching endless ant videos.
Further reading:
Research paper on the method I used for tracking ants:
X. Li, K. Wang, W. Wang and Y. Li, "A multiple object tracking method using Kalman filter," The 2010 IEEE International Conference on Information and Automation, Harbin, 2010, pp. 1862-1866. doi: 10.1109/ICINFA.2010.5512258

Any luck on using OpenCV integration in MATLAB for blob analysis? OpenCV does a lot of the heavy lifting for you these days.
https://www.mathworks.com/discovery/matlab-opencv.html