Circus Maximus Mixed-Reality Experience
In 2018-2019, during my experience at Inglobe Technologies, I worked on the development of the Circus Maximus Experience, a mixed-reality immersive tour that takes you through all the historical phases of the Circus Maximus. To prepare the AR and VR experiences, the Circus Maximus archaeological site was accurately reconstructed in 3D, and the resulting 3D model was registered to the 3D maps used by the AR devices. These operations were made possible by PLVS, a software framework I privately developed. PLVS is an RGB-D and stereo SLAM system with keypoints, keylines, volumetric mapping, and 3D incremental segmentation. On this occasion, it was a pleasure to let Inglobe Technologies use PLVS. I personally collected the video data and then produced the 3D reconstruction and the localization of reference spots by using PLVS with a ZED camera.
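Registering a reconstructed 3D model to the map frame of an AR device boils down to estimating a rigid transform between corresponding reference points in the two frames. As a generic illustration of that step (a minimal Kabsch-style alignment, not the actual PLVS pipeline):

```python
import numpy as np

def rigid_align(src, dst):
    """Estimate the rotation R and translation t that map the 3D points
    in src onto their correspondences in dst (Kabsch algorithm)."""
    src, dst = np.asarray(src, float), np.asarray(dst, float)
    mu_s, mu_d = src.mean(axis=0), dst.mean(axis=0)
    S, D = src - mu_s, dst - mu_d
    H = S.T @ D                                  # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))       # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = mu_d - R @ mu_s
    return R, t
```

Given at least three non-collinear reference spots localized in both the reconstruction and the AR map, the returned `(R, t)` pair places the whole 3D model in the device's coordinate frame.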
DORIS Rehabilitation Robotic System
In 2019, I collaborated with the Bambino Gesù pediatric hospital on the development of DORIS (Dynamic Oriented Rehabilitative Integrated System). DORIS is an advanced robotic system for postural assessment and for the training of gait and equilibrium. It integrates (1) a Stewart platform, (2) a Vicon motion-capture system, (3) an EMG system, and (4) a virtual-reality environment (developed in Unreal Engine 4). ROS (Robot Operating System) is used as the main integration middleware. In particular, during each working session, all the sensory data and exchanged messages can be recorded and then played back in a time-synchronized fashion for further advanced post-processing and analysis.
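Time-synchronized playback of a recorded session amounts to replaying messages in timestamp order while preserving their relative timing, which is what ROS bag tooling does for recorded topics. A minimal ROS-independent sketch of the idea, with hypothetical names (`Message`, `SessionLog`):

```python
import time
from dataclasses import dataclass, field

@dataclass
class Message:
    topic: str
    stamp: float     # capture time, in seconds
    payload: object

@dataclass
class SessionLog:
    messages: list = field(default_factory=list)

    def record(self, topic, payload, stamp=None):
        """Append a message; stamp defaults to the current wall-clock time."""
        t = time.time() if stamp is None else stamp
        self.messages.append(Message(topic, t, payload))

    def play(self, callback, rate=1.0):
        """Replay messages in stamp order, preserving their relative
        timing (scaled by 1/rate), like a bag-file player would."""
        ordered = sorted(self.messages, key=lambda m: m.stamp)
        if not ordered:
            return
        t0, start = ordered[0].stamp, time.monotonic()
        for m in ordered:
            delay = (m.stamp - t0) / rate - (time.monotonic() - start)
            if delay > 0:
                time.sleep(delay)
            callback(m)
```

Because every message from the Stewart platform, Vicon, EMG, and VR topics carries its capture timestamp, the same replay logic keeps all streams mutually aligned during post-processing.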
uARe Augmented Reality Platform
In 2018-2019, during my experience at Inglobe Technologies, I worked on the design and development of uARe. This tool consists of (i) an editing platform that allows fast and easy creation of AR and VR content in the cloud, and (ii) a customizable mobile application that can play back the created interactive and immersive experiences at any time. uARe is officially used by the Mondadori Group to create mixed-reality experiences in many of its magazines.
TRADR EU Project
In 2016-2018, I worked on the TRADR EU project (EU FP7 ICT 609763), focusing on perception, motion planning, and control for UGVs. In particular, I led TRADR work package WP4: Persistent models for multi-robot collaboration.
TRADR develops technology for human-robot teams that assist in disaster response efforts over multiple missions; the novel challenge is how to make experience persistent. In the TRADR scenario, various kinds of robots collaborate with human team members to explore the environment and gather physical samples. Throughout this collaborative effort, the team gradually develops its understanding of the disaster area over multiple, possibly asynchronous, missions (persistent environment models), improves team members' understanding of how to work in the area (persistent multi-robot action models), and improves teamwork (persistent human-robot teaming). The TRADR use cases involve the response to a medium- to large-scale industrial accident by teams consisting of human rescuers and several robots (both ground and airborne).
Forza NEC UAVs
During my experience at Selex ES MUAS (now Leonardo), I designed and developed real-time computer vision and visual-servoing applications for the unmanned aerial systems CREX-B, ASIOB, and SPYBALL-B. My team obtained the Military Type Certificate for those systems according to the Italian AER.P-2 regulation. In 2011, together with part of my team, I won the SELEX Galileo Innovation Awards for the UAS CREX-B. See more on my LinkedIn profile.
LYRA Launch Vehicle
The LYRA program represents an important evolution of the European VEGA launcher. During my experience at UTRI, my team and I developed a trajectory controller for the LYRA launch vehicle based on innovative, synthesized auto-scheduled algorithms.
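Auto-scheduled (gain-scheduled) control adapts the controller gains along the trajectory by interpolating them from a table keyed by a scheduling variable such as Mach number or dynamic pressure. A purely illustrative sketch of that interpolation step (not the actual LYRA design):

```python
def scheduled_gains(schedule, x):
    """Linearly interpolate a gain tuple (e.g., (Kp, Kd)) from a table
    keyed by a scheduling variable x, clamping outside the table range."""
    pts = sorted(schedule)
    if x <= pts[0]:
        return schedule[pts[0]]
    if x >= pts[-1]:
        return schedule[pts[-1]]
    for lo, hi in zip(pts, pts[1:]):
        if lo <= x <= hi:
            w = (x - lo) / (hi - lo)
            return tuple((1 - w) * a + w * b
                         for a, b in zip(schedule[lo], schedule[hi]))

# Hypothetical gain table: scheduling variable -> (Kp, Kd)
table = {0.0: (1.0, 0.10), 0.8: (2.2, 0.25), 1.6: (3.0, 0.40)}
```

At each control step, the current value of the scheduling variable selects the local gains, so a single controller structure covers the vehicle's widely varying flight conditions.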
PHRIENDS EU Project
The main goal of the PHRIENDS project (EU FP6-IST 045359) was to develop key components of the next generation of robots, including industrial robots and assist devices, designed to share the environment and physically interact with people. During my research period at the DIS Robotics Lab, I developed a sensor-based exploration method for general robotic systems equipped with multiple sensors. See this page for more details about my contributions.
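Sensor-based exploration methods typically drive the robot toward the boundary between known and unknown space. As a loose, generic illustration of that idea (frontier detection on an occupancy grid, not the specific method I developed for PHRIENDS):

```python
def frontier_cells(grid):
    """Return the (row, col) coordinates of frontier cells: free cells
    that are 4-adjacent to unknown space in an occupancy grid where
    0 = free, 1 = occupied, -1 = unknown."""
    rows, cols = len(grid), len(grid[0])
    frontiers = []
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] != 0:
                continue
            for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                nr, nc = r + dr, c + dc
                if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == -1:
                    frontiers.append((r, c))
                    break
    return frontiers
```

An exploration loop would repeatedly sense, update the grid, pick a frontier cell (e.g., the nearest reachable one), and move there until no frontiers remain.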