ORTUS PROJECT

THE RAWFIE PLATFORM

RAWFIE enables the automated, remote operation of a large number of robotic devices for assessing the performance of different technologies in the networking, sensing and mobile/autonomic application domains.
Devices are hosted on different testbeds, exposing a vast test infrastructure to experimenters.
The project delivers the software required for experiment management, data collection and post-analysis.
The vision of Experimentation-as-a-Service will be realized by virtualizing the provided software, making the framework available to any experimenter around the globe.

For further details, visit www.rawfie.eu

THE ORTUS PROJECT

The ORTUS project is the experiment run by W.P. FORMAT on the RAWFIE Platform.
The aim of the project is to experiment with a deep learning algorithm that automatically recognizes objects in aerial video images taken by UAVs.
The RAWFIE system made it possible to build a dedicated dataset of aerial images to train a newly implemented Neural Network (NN), and to test the NN itself on recognizing objects in a video stream.

THE EXPERIMENT


The first step of the experiment is simply to get video footage of unmanned ground vehicles (UGVs) from an aerial perspective. The UAV-acquired video is used to train our neural network to recognize the different vehicles.

The second step is to create a useful image dataset to feed the Neural Network (NN). These images are extracted from the video footage taken by the UAVs, covering a variety of UGV types and a wide range of image scales, resolutions, and compositions.
Specific open-source tools will be used to label all Ground Vehicle instances in this dataset, creating ground-truth bounding boxes.
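The source does not name the labeling tool or annotation format used, so as an illustration only, here is a minimal sketch assuming a YOLO-style label file, where each line stores a class index and a bounding box as normalized center coordinates and size:

```python
# Sketch of converting one ground-truth label line in an assumed YOLO-style
# format ("class_id x_center y_center width height", all normalized to [0, 1])
# into pixel-coordinate box corners. The format is an assumption: the source
# does not specify which open-source labeling tool ORTUS actually used.

def label_to_pixel_box(line, img_w, img_h):
    """Parse 'class cx cy w h' and return (class_id, x_min, y_min, x_max, y_max)."""
    class_id, cx, cy, w, h = line.split()
    cx, cy, w, h = (float(v) for v in (cx, cy, w, h))
    x_min = int((cx - w / 2) * img_w)
    y_min = int((cy - h / 2) * img_h)
    x_max = int((cx + w / 2) * img_w)
    y_max = int((cy + h / 2) * img_h)
    return int(class_id), x_min, y_min, x_max, y_max

# Example: a UGV centered in a 1280x720 frame, spanning 10% x 20% of it.
print(label_to_pixel_box("0 0.5 0.5 0.1 0.2", 1280, 720))
# → (0, 576, 288, 704, 432)
```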
The third step of the experiment is the Neural Network (NN) training.

The NN still needs to be trained and the correct parameters need to be determined.
The batch size, momentum, learning rate, decay, iteration number, and detection thresholds are all task-specific parameters (defined by the experimenter) that need to be input into the platform.
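The parameters listed above can be pictured as a small configuration passed to the platform. The values below are hypothetical, not the ones used in ORTUS, and the inverse-time decay rule is just one common way the decay parameter can act on the learning rate:

```python
# Illustrative experimenter-defined training parameters (hypothetical values),
# together with a simple inverse-time learning-rate decay:
#   lr_t = lr_0 / (1 + decay * t)

config = {
    "batch_size": 64,
    "momentum": 0.9,
    "learning_rate": 1e-3,
    "decay": 5e-4,
    "iterations": 10_000,
    "detection_threshold": 0.25,  # minimum confidence to keep a detection
}

def lr_at(iteration, cfg):
    """Learning rate after `iteration` steps under inverse-time decay."""
    return cfg["learning_rate"] / (1.0 + cfg["decay"] * iteration)

print(lr_at(0, config))                      # base rate at the first iteration
print(lr_at(config["iterations"], config))   # decayed rate at the last iteration
```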

The fourth step consists of repeating the first step. The UAV flies again while UGVs move on the ground; this time, the UAV performs real-time object recognition on its video feed through the trained Neural Network.
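The per-frame recognition loop can be sketched as follows. The `detect` function below is a stand-in stub, not the actual ORTUS model; the point is only how the experimenter's detection threshold filters the NN's candidate detections on each frame:

```python
# Minimal sketch of the real-time recognition loop: for each video frame,
# the trained NN returns candidate detections as (label, confidence) pairs,
# and only those at or above the detection threshold are kept.

DETECTION_THRESHOLD = 0.25

def detect(frame):
    # Stand-in for the trained NN inference call on one frame (stub data).
    return [("car", 0.91), ("truck", 0.40), ("bush", 0.10)]

def recognize_stream(frames):
    """Yield, per frame, the detections that pass the confidence threshold."""
    for frame in frames:
        yield [(label, conf) for label, conf in detect(frame)
               if conf >= DETECTION_THRESHOLD]

for detections in recognize_stream(["frame-0"]):
    print(detections)
# → [('car', 0.91), ('truck', 0.40)]
```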
The fifth step is to evaluate and report the accuracy of object detection and classification in aerial images. This is the validation step, in which we assess the reliability of our neural network application.
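The source does not state the exact metric ORTUS reports, but a common way to score detections against ground-truth boxes is intersection-over-union (IoU), shown here as an illustrative sketch:

```python
# Intersection-over-union (IoU) between two axis-aligned boxes, each given
# as (x_min, y_min, x_max, y_max) in pixels. A detection typically counts
# as correct when its IoU with a same-class ground-truth box is >= 0.5.

def iou(a, b):
    """Return the IoU of boxes a and b in [0, 1]."""
    ix = max(0, min(a[2], b[2]) - max(a[0], b[0]))  # horizontal overlap
    iy = max(0, min(a[3], b[3]) - max(a[1], b[1]))  # vertical overlap
    inter = ix * iy
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    union = area_a + area_b - inter
    return inter / union if union else 0.0

# Two 10x10 boxes overlapping by half their width: IoU = 50 / 150 = 1/3.
print(iou((0, 0, 10, 10), (5, 0, 15, 10)))
# → 0.3333333333333333
```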

THE DELIVERABLES

Experiment report

Description of the methodology, the experiment execution, and the experiment results

Aerial images dataset release

The dataset of aerial images built to train the NN

Video showcase

The video showcasing the experiment results, with object detection and recognition boxes overlaid

This project has received funding from "HORIZON 2020", the European Union's Framework Programme
for research, technological development and demonstration, under grant agreement no 645220
Contact us to learn more!