This summer we are working to build a system that creates flower density maps of large areas. These bee foraging maps can be used to track bee nutritional resources over time, to understand bee behavior, and to greatly speed up ecological research that currently requires time-consuming ground surveys. To accomplish this, we are using a drone and camera to take aerial images of the Bernard Field Station (BFS). We then stitch these images together and use machine learning to calculate flower densities.
Research is defined as “investigating systematically” by Google, but based on this summer, perhaps it should be defined as being systematically wrong. Since our project started, something has gone wrong in nearly every aspect of it. Our original drone failed to remain airborne. Our code contained countless bugs. Our images couldn’t easily be stitched into useful aerial maps. And though we have only just started to experiment with flower recognition, I have no doubt that we will encounter many more obstacles as we progress toward our goal. Of course, we never expected things to work perfectly the first time, and each setback has forced us to understand the project more deeply. Ultimately, I think I’ve learned more from the things that didn’t go right than from those that did.
We originally purchased a 3DR Iris+ and a GoPro to take aerial pictures of the BFS. However, we quickly discovered that this drone could not maintain a constant altitude while carrying the weight of the stabilizing gimbal and GoPro. We tried many fixes. It flew very well without the camera attached, which was fun, because flying a drone is entertaining in and of itself, but not particularly useful. We tried mounting the camera without the gimbal, but the drone still wouldn’t hold a constant altitude. We tried to swap in longer propellers, but didn’t have the necessary parts. Finally, we decided it wasn’t worth the time.
Abandoning our first drone certainly paid off, because our next drone, a DJI Phantom 2 Vision+, flew perfectly. The only issue we have run into in several weeks of flying is inaccuracy in its ground-altitude readings. That is easy to adjust for, though, and after finding an app that allows more control than the DJI Vision app, we finally have our mapping system down. Flying at about 40 feet with 20 feet between passes, we are able to map large sections of the field station. Having a drone that works well without any modifications lets us focus on other parts of the project instead of constantly fighting hardware problems.
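For anyone curious how those flight parameters translate into image overlap, here is a back-of-the-envelope sketch in Python. The pinhole-camera model and the 60° field of view are illustrative assumptions, not the Vision+’s actual specs; plug in your own camera’s numbers.

```python
import math

def ground_footprint(altitude_ft, fov_deg):
    """Width of ground covered by one nadir (straight-down) image.

    Assumes a simple pinhole camera; fov_deg is the field of view
    along the axis of interest.
    """
    return 2 * altitude_ft * math.tan(math.radians(fov_deg) / 2)

def sidelap(altitude_ft, pass_spacing_ft, fov_deg):
    """Fraction of each image shared with the neighboring flight pass."""
    width = ground_footprint(altitude_ft, fov_deg)
    return 1 - pass_spacing_ft / width

# Our parameters: ~40 ft altitude, ~20 ft between passes.
# The 60-degree FOV is a placeholder -- check your camera's spec sheet.
print(sidelap(40, 20, 60))  # ~0.57, i.e. roughly 57% overlap between passes
```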
DJI Phantom 2 Vision+ flying over a patch of buckwheat in the BFS
Once our drone was working, we were very excited to start mapping! The first few flights were a little rough: some images had too much overlap, some had too little, and a few extra-exciting flights had near misses with the taller trees in the field station. Eventually we found flying parameters that worked well, and we have stuck with them ever since.
Our first map-stitching attempts failed, largely because most of the field station looks the same. A typical shot shows brown dirt with some small plants and possibly a few shrubs, which makes it hard to tell images apart or to work out where each one belongs in the map. We are using Microsoft Image Composite Editor (ICE) to stitch our images together. While several dedicated drone-mapping programs are available, including OpenDroneMap, Pix4D, and Menci Software, ICE is free and easy to use. After experimenting with different settings and parameters, we are finally able to create maps, though with a little extra effort on our part: each row of the flight has to be stitched individually, and the resulting strips are then cropped and stitched together to create a final image of the entire area. To produce its best maps, ICE needs the user to have some idea of the overlap and alignment of the images, so we fly over extra trees as landmarks, which lets us measure how much the images overlap and check that ICE’s output is actually representative of the field station.
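ICE is a GUI tool, so there is no script behind our actual pipeline, but the same row-by-row strategy can be sketched with OpenCV’s high-level stitcher. Everything below, from the folder layout to the choice of SCANS mode, is a hypothetical illustration of the approach, not what ICE does internally.

```python
# Minimal sketch of row-by-row stitching using OpenCV (assumed layout:
# one folder of JPEGs per flight row, e.g. flight/row_01, flight/row_02, ...).
import cv2
import glob

def stitch_images(images):
    # SCANS mode suits flat, top-down imagery better than the default
    # spherical-panorama mode.
    stitcher = cv2.Stitcher_create(cv2.Stitcher_SCANS)
    status, result = stitcher.stitch(images)
    if status != cv2.Stitcher_OK:
        raise RuntimeError(f"stitching failed with status {status}")
    return result

# First stitch each flight row on its own...
row_mosaics = []
for row_dir in sorted(glob.glob("flight/row_*")):
    images = [cv2.imread(p) for p in sorted(glob.glob(f"{row_dir}/*.jpg"))]
    row_mosaics.append(stitch_images(images))

# ...then stitch the row mosaics into one map. In practice each row
# would be cropped first, as described above.
full_map = stitch_images(row_mosaics)
cv2.imwrite("bfs_map.jpg", full_map)
```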
Stitched map from the BFS

Areas of the Bernard Field Station that have been mapped are highlighted in green; areas that we plan to map in the near future are highlighted in blue.
Next up, determining flower densities in these images.