Selected projects


A few of the projects I have led or contributed to, drawn from Antarctic field campaigns, time-critical autonomy research, biosecurity surveys, and crop monitoring. Each one had to deliver in the field, not only on a benchmark.

Drone launch on Antarctic terrain at ASPA135

Antarctic biodiversity

Mapping Antarctic moss and lichen with hyperspectral drones

I lead UAV-based vegetation mapping at Antarctic Specially Protected Area 135 (Robinson Ridge) and at Bunger Hills, combining hyperspectral and multispectral imaging with deep learning to classify moss and lichen at sub-centimetre resolution.

The work feeds directly into long-term conservation baselines for the continent, and runs as part of Securing Antarctica's Environmental Future (SAEF). Field campaigns to date have validated the workflow against ground spectra and quadrat surveys across multiple seasons.

Collaborators: QUT Centre for Robotics, SAEF, University of Wollongong, Australian Antarctic Division

UAV flying low over a mannequin used for human-detection field tests

Search and rescue

Autonomous drones for time-critical search

For my PhD, I built a decision-making framework that lets small UAVs reason about object-detection uncertainty in cluttered indoor and outdoor environments, including collapsed buildings and bushland.

The framework fuses RGB and thermal evidence in real time, plans the next viewpoint to confirm or reject a detection, and reduces the human bias that creeps in when pilots scout under pressure. Field tests used full-body mannequins as stand-in victims to validate detection across viewpoint, scale, illumination, and occlusion changes.
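The confirm-or-reject loop can be sketched as a recursive Bayes update over "victim present", with each sensor's detection treated as an independent observation. The sensor models, prior, and decision thresholds below are illustrative placeholders, not the framework's actual parameters.

```python
def bayes_update(prior: float, lik_present: float, lik_absent: float) -> float:
    """Posterior probability a victim is present, given one detection event
    with likelihoods P(detection | present) and P(detection | absent)."""
    num = lik_present * prior
    return num / (num + lik_absent * (1.0 - prior))

belief = 0.5  # uninformative prior before the first look

# Hypothetical per-sensor detection models (P(fire | present), P(fire | absent)).
rgb = (0.80, 0.15)      # RGB detector fires: fairly reliable in good light
thermal = (0.90, 0.05)  # thermal detector fires: strong cue for a warm body

for lik_present, lik_absent in (rgb, thermal):
    belief = bayes_update(belief, lik_present, lik_absent)

# Decide: confirm, reject, or fly to another viewpoint for more evidence.
decision = ("confirm" if belief > 0.95 else
            "reject" if belief < 0.05 else
            "replan viewpoint")
```

When the belief lands between the two thresholds, the planner picks the next viewpoint expected to be most informative, which is where the viewpoint-planning half of the framework takes over.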

Collaborators: QUT Centre for Robotics, CSIRO Data61

Hyperspectral myrtle-rust signature, Vanegas et al. 2018

Biosecurity

Hyperspectral UAVs for myrtle rust detection in native flora

Myrtle rust is a fungal pathogen that has spread aggressively through Australian native flora, and early-stage infection rarely shows up in the canopy in time for ground crews to act. I worked on UAS-based hyperspectral monitoring to surface the spectral signatures of infected leaves before symptoms become visible to the eye, giving land managers a head start on containment.

The same project arc carried into later campaigns on declared invasive species, including mouse-ear hawkweed in the Australian alps with Charles Sturt University, and bitou bush along Queensland and New South Wales coastal dunes. Each one uses the same playbook: low-altitude drone surveys, multispectral or hyperspectral imagery, and supervised classifiers tuned to the target species.
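The classifier step of that playbook can be sketched per pixel. The band reflectances, class separation, and random-forest choice below are illustrative assumptions standing in for labelled survey data, not the classifiers used in these campaigns.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# Synthetic stand-in for per-pixel reflectances in four bands
# (e.g. green, red, red-edge, NIR). Real surveys extract these from
# orthomosaics with ground-truthed quadrats; values here are made up.
n = 500
target = rng.normal(loc=[0.08, 0.06, 0.30, 0.55], scale=0.03, size=(n, 4))
background = rng.normal(loc=[0.10, 0.12, 0.18, 0.35], scale=0.03, size=(n, 4))

X = np.vstack([target, background])
y = np.array([1] * n + [0] * n)  # 1 = target species, 0 = everything else

clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)

# Score a new pixel: spectra close to the target cluster score high.
pixel = np.array([[0.08, 0.06, 0.31, 0.56]])
prob = clf.predict_proba(pixel)[0, 1]
```

Swapping the target species mostly means swapping the training labels and tuning the bands, which is why the same pipeline carried from myrtle rust to hawkweed to bitou bush.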

Collaborators: QUT Centre for Robotics, Charles Sturt University, Biosecurity Queensland

Disease detection results, Amarasingam et al. 2022

Precision agriculture

UAV detection of sugarcane white leaf disease

White leaf disease devastates sugarcane yields and spreads before symptoms are visible from the ground. With Sugar Research Australia and partners in Sri Lanka, I helped benchmark off-the-shelf deep learning detectors against UAV RGB imagery so growers can flag affected stools early in the season.

The study compared real-time and two-stage architectures across cane varieties and flight altitudes, and documented a workflow growers can run on commercial drones without specialised sensors.

Collaborators: QUT Centre for Robotics, Sugar Research Australia, South Eastern University of Sri Lanka, Wayamba University of Sri Lanka

Collaborate

Want to work on something at the edge of what drones can do?

I'm open to projects across academia, industry, and government, especially where autonomy, remote sensing, and field reliability all need to land together.