This page contains a list of selected projects.

Mapping and Perception for Autonomous Yard Trucks

In part of 2023, I worked at Outrider as a Principal Engineer, focusing on state estimation, perception, and LIDAR mapping.

SLAM for Advanced Robot Mowers

In 2022 and part of 2023, I worked at Electric Sheep Robotics as a Principal Engineer and Head of R&D, leading the development of SLAM (fusing LIDAR, camera, IMU, and GPS) for advanced and efficient robot mowers. I also led R&D activities on navigation algorithms and simulation frameworks.

Camera features and LIDAR point cloud projection
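The LIDAR point cloud projection mentioned above can be sketched as follows. This is an illustrative example only, not the production code: it assumes a simple pinhole camera model, with hypothetical intrinsics `K` and a hypothetical LIDAR-to-camera rigid transform `T_cam_lidar`.

```python
# Illustrative sketch: projecting LIDAR points into a camera image
# with a pinhole model. K and T_cam_lidar are hypothetical example values.
import numpy as np

def project_lidar_to_image(points_lidar, K, T_cam_lidar):
    """Project (N, 3) LIDAR points into pixel coordinates.

    points_lidar: (N, 3) points in the LIDAR frame.
    K:            (3, 3) camera intrinsics.
    T_cam_lidar:  (4, 4) rigid transform from LIDAR frame to camera frame.
    Returns (M, 2) pixel coordinates of the points in front of the camera.
    """
    # Homogeneous coordinates, then transform into the camera frame.
    pts_h = np.hstack([points_lidar, np.ones((points_lidar.shape[0], 1))])
    pts_cam = (T_cam_lidar @ pts_h.T).T[:, :3]
    # Keep only points in front of the camera (positive depth).
    pts_cam = pts_cam[pts_cam[:, 2] > 0]
    # Pinhole projection: u = fx*x/z + cx, v = fy*y/z + cy.
    uv = (K @ pts_cam.T).T
    return uv[:, :2] / uv[:, 2:3]

# Example with identity extrinsics and simple intrinsics.
K = np.array([[500.0,   0.0, 320.0],
              [  0.0, 500.0, 240.0],
              [  0.0,   0.0,   1.0]])
T = np.eye(4)
points = np.array([[0.0, 0.0, 2.0],   # straight ahead -> principal point
                   [1.0, 0.0, 2.0]])  # offset along the camera x axis
pixels = project_lidar_to_image(points, K, T)
print(pixels)  # first point lands at the principal point (320, 240)
```

In a real pipeline the projected pixels would then be used, for instance, to associate LIDAR depths with camera features or to colorize the point cloud.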

VR/AR Tracking 

From 2020 to April 2022, I worked at Arcturus Industries, building with the team the next generation of computer vision perception technologies for VR/AR (Virtual and Augmented Reality). In particular, the focus was on 6-DOF inside-out positional tracking for AR/VR applications, with the goal of providing an accurate and robust tracking experience by fusing visual and inertial information. More info is available on the Arcturus Industries website.

Circus Maximus Mixed-Reality Experience

In the period 2018-2019, during my experience at Inglobe Technologies, I worked on the development of the Circus Maximus Experience. This mixed-reality experience consists of an immersive tour that brings you through all the historical phases of the Circus Maximus. To prepare the AR and VR experiences, the Circus Maximus archeological site was accurately reconstructed in 3D, and the resulting 3D model was registered to the 3D maps used by the AR devices. These operations were made possible by my software framework PLVS, which I developed privately. PLVS is an RGB-D and Stereo SLAM System with Keypoints, Keylines, Volumetric Mapping, and 3D Incremental Segmentation. On this occasion, it was my pleasure to let Inglobe Technologies use PLVS. I personally collected the video data and then produced the 3D reconstruction and the localization of reference spots by using PLVS with a ZED camera.


DORIS Robotic Rehabilitation System

In 2019, I collaborated with the pediatric hospital Bambino Gesù on the development of DORIS (Dynamic Oriented Rehabilitative Integrated System). DORIS is an advanced robotic system that allows postural assessment and the training of gait and equilibrium. The system integrates 1) a Stewart platform, 2) a Vicon system, 3) an EMG system, and 4) a Virtual Reality environment that we developed under Unreal Engine 4. ROS (Robot Operating System) is used as the main integration middleware. In particular, during each working session, all the sensory data and exchanged messages can be recorded and then played back (in a time-synchronized fashion) for further advanced post-processing and analysis.

uARe Augmented Reality Platform 

In the period 2018-2019, during my experience at Inglobe Technologies, I worked on the design and development of uARe. This tool consists of:
(i) an editing platform that allows fast and easy creation of AR and VR content on the cloud,
(ii) a customizable mobile application that, at any time, can playback the created interactive and immersive experiences.
uARe is officially used by the Mondadori Group for creating mixed-reality experiences in many of its magazines. 

TRADR EU Project

In the period 2016-2018 (University of Rome “La Sapienza”), I worked on the TRADR EU project, focusing on perception, motion planning, and control for UGVs. In particular, I led the TRADR work package WP4: Persistent models for multi-robot collaboration.

The TRADR EU project (EU FP7 ICT 609763) developed technology for human-robot teams to assist in disaster response efforts over multiple missions; the novel challenge was how to make the experience persistent. In the TRADR scenario, heterogeneous robots collaborate with human team members to explore the environment and gather physical samples. Throughout this collaborative effort, the team gradually develops its understanding of the disaster area over multiple, possibly asynchronous, missions (persistent environment models), improves team members’ understanding of how to work in the area (persistent multi-robot action models), and improves teamwork (persistent human-robot teaming). The TRADR use cases involved the response to a medium to large-scale industrial accident by teams consisting of human rescuers and several robots (both ground and airborne).
This is a link to some TRADR activities. Further details are available on the official TRADR website.


Forza NEC UAVs
During my experience at Selex ES MUAS (now Leonardo), I designed and developed real-time computer vision and visual servoing applications for the Unmanned Aerial Systems CREX-B, ASIO-B, and SPYBALL-B. With my team, I obtained the Military Type Certificate for those systems according to the Italian AER.P-2 regulation. In 2011, together with part of my team, I won the SELEX Galileo Innovation Awards for the UAS CREX-B. See more on my LinkedIn profile.


LYRA Program
The LYRA program represented an important evolution of the European VEGA launcher. During my experience at UTRI, I developed, with my team, a trajectory control system for the LYRA launch vehicle using innovative auto-scheduled control synthesis algorithms.


PHRIENDS EU Project

The main goal of the PHRIENDS project (EU FP6-IST 045359) was to develop key components of the next generation of robots, including industrial robots and assist devices, designed to share the environment with people and to physically interact with them. During my research period at the DIS Robotics Lab (University of Rome “La Sapienza”), I developed a sensor-based exploration method for general robotic systems equipped with multiple sensors. See this page for more details about my contributions.