
Friday, July 23, 2021

A bird’s eye view: drone surveying and flower mapping

Sitting there in the sun, drone on and ready to go, controller connected, flight map loaded and cleared, I press the start mission button and get a message that haunts me to this day: “SD card not found”. I groan and check the drone’s small microSD card compartment; sure enough, I had forgotten the card at the lab. I pack up my setup and begin the long trek back from the Bernard Field Station to Harvey Mudd College across the street, where my SD card lay waiting for me.


[1] Phantom 4 RTK drone and controller


Drone flying, and more specifically drone surveying, was a large part of my work with the HMC Bee Lab over the summer. Surveying an area means gathering information about the things inside that area and their relative locations. My project used a Phantom 4 Advanced drone to survey large fields of flowering plants and employed machine learning to identify those plants, applying this information to better understand bee foraging behavior. The Bernard Field Station, a 5C wildlife reserve right across the street from Harvey Mudd, was the perfect place to find fields of flowering plants, with California buckwheat and white sage in the middle of their blooming season. My drone flying trips to the Field Station were really enjoyable, so I will share what a day out in the field looked like.


[2] Map of the Bernard Field Station

Social Insect Behavior Room, 8:00 am

I go through my checklist of items as I prepare to go out to the field. SD cards, batteries, cables: missing any one of these means losing precious morning time and doubling the amount of walking in the day, neither of which I’m particularly keen on. After packing the accessories, I check the batteries I left charging the night before. I grab one for each flight I have planned plus one extra; each battery only lasts about 25 minutes, meaning each of our 20-minute flights needs a fresh battery. I then check the preliminary flight plans on the drone’s dedicated tablet. We used an app called DroneDeploy to generate automatic flight plans from a marked region of interest. Apart from the region of flight, we can specify the height, speed, and overlap between each of the images taken by the drone. This sort of automated flight allows for a consistency and stability that a human pilot would have difficulty producing on their own. With these flight plans prepped, I grab the Phantom 4 drone and head out.
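Those height and overlap settings aren't arbitrary; together they determine how much ground each photo covers and how far apart the drone triggers its camera. DroneDeploy does this math internally, but here's a rough back-of-the-envelope sketch of the relationships, using nominal Phantom 4 Advanced camera specs (1-inch sensor, 13.2 mm wide, 8.8 mm focal length, 5472 px across); the exact numbers are assumptions for illustration:

```python
# Rough sketch of how flight height and overlap settings translate into
# ground coverage and photo spacing. Camera numbers are nominal Phantom 4
# Advanced specs, used here purely for illustration.

SENSOR_WIDTH_MM = 13.2   # long side of the 1-inch sensor
FOCAL_LENGTH_MM = 8.8
IMAGE_WIDTH_PX = 5472

def ground_sampling_distance(height_m):
    """Ground size of one image pixel, in cm, at a given flight height."""
    return (SENSOR_WIDTH_MM * height_m * 100) / (FOCAL_LENGTH_MM * IMAGE_WIDTH_PX)

def footprint_width(height_m):
    """Width of ground covered by a single photo, in meters."""
    return SENSOR_WIDTH_MM * height_m / FOCAL_LENGTH_MM

def photo_spacing(height_m, overlap):
    """Distance between photo centers for a given fractional overlap."""
    return footprint_width(height_m) * (1 - overlap)

height = 40      # meters above ground (example value)
overlap = 0.75   # 75% overlap between consecutive images (example value)
print(f"GSD: {ground_sampling_distance(height):.2f} cm/px")
print(f"Footprint: {footprint_width(height):.1f} m")
print(f"Photo spacing: {photo_spacing(height, overlap):.1f} m")
```

The takeaway is the trade-off: flying lower gives finer detail per pixel but a smaller footprint, so the drone needs more passes and more photos to cover the same field.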



[3] DroneDeploy webpage for flight planning

Bernard Field Station, 8:30 am

I check in at the gate and walk along it until I reach the Central-Eastern CSS border, a tree-filled lower section of the Field Station. I find a flat spot for the drone to take off from and connect the tablet to the drone’s controller, pulling up the preliminary flight plan I had worked out with my professors. I start up the drone and fly it around the planned flight path, making sure there aren’t any large trees or power lines in the way. Satisfied, I activate the mission on DroneDeploy and watch it run through a full system check covering satellite connection, camera focus, and flight plan validity. Everything checks out, and the drone flies off to start its mission while I keep an eye on it, making sure it doesn’t miss a section of the path or get knocked off course by a strong gust of wind. I also watch the live feed from the camera to confirm that certain key spots in the field are actually captured along the flight path. The flight finishes and the drone returns to its takeoff spot, where I pick it up and head over to the next area for mapping.


[4] Drone picture of California buckwheat plants while in flight.

Sontag Dorm, 9:45 am

With my teammates using the lab space for 3D printing, I head over to my room to process my data. The goal is to stitch the pictures together into one large image called an orthomosaic that we can later run our computer vision algorithms on. I load the images into a program called Agisoft Metashape, which looks for common points between the drone pictures and stitches them together. After a couple of hours, it outputs a full orthomosaic of the field that our machine learning pipeline can analyze to determine the type and location of the flowering plants within it.
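An orthomosaic is far too large to feed to a classifier in one piece, so a common first step (a sketch of the general approach, not our exact pipeline; `tile_orthomosaic` is a hypothetical helper name) is to slice it into fixed-size tiles that a model can process one at a time:

```python
import numpy as np

def tile_orthomosaic(ortho, tile=256, stride=256):
    """Slice an orthomosaic array of shape (H, W, C) into fixed-size tiles.

    Edge strips narrower than `tile` are simply dropped here;
    a real pipeline might pad them out instead.
    """
    h, w = ortho.shape[:2]
    tiles = []
    for y in range(0, h - tile + 1, stride):
        for x in range(0, w - tile + 1, stride):
            tiles.append(ortho[y:y + tile, x:x + tile])
    return tiles

# Toy example: a fake 1024x768 RGB "orthomosaic" in place of real imagery.
ortho = np.zeros((768, 1024, 3), dtype=np.uint8)
patches = tile_orthomosaic(ortho)
print(len(patches))  # each patch would then go to the flower classifier
```

Setting `stride` smaller than `tile` produces overlapping patches, which helps avoid cutting a buckwheat or sage plant in half right at a tile boundary.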


[5] Full orthomosaic stitched together from 400 drone pictures.


Though I still forgot parts every once in a while, by the end of summer my drone flying trips were running smoothly. My hikes to the Bernard Field Station and the outdoor data collection ended up being a really good complement to the coding I would do in the afternoons and made me realize how much I value a hands-on component in my work. This was also my first time working on a project with a drone, giving me a lot of experience with drone photography, flight, and data processing. This has only increased my interest in the application of computer vision to drone technology and I’m looking forward to learning more about how the two can intersect in ethical and interesting ways.

Further Reading
Overview of drone surveying by industry leader DJI
https://enterprise-insights.dji.com/blog/all-about-drone-surveying

Introduction to QGIS for map analysis
https://www.qgistutorials.com/en/


Media Credits

[1] Photo by Berlin Paez

[2] Figure by Berlin Paez, made with QGIS

[3] Figure by Berlin Paez, made with DroneDeploy

[4] Photo by Berlin Paez, made with DJI Phantom 4 RTK drone at the Bernard Field Station

[5] Figure by Berlin Paez, made using Agisoft Metashape
