Ioannis Marios Stavropoulos, BSc

PhD candidate

First Class Honours

Teaching Associate

CU Computational Robotics team


Cardiff University

School of Computer Science and Informatics

Cardiff, Wales (UK)


Robot Shared Autonomy via Immersive Interfaces
Under Communication Delays and Bandwidth Constraints

PhD (September 2023 - present).

I am developing a supervisory-control-based robot teleoperation system for remote manipulation in space. The standard approach of directly teleoperating the robot with shared control (i.e. adjusting and correcting the operator's continuous input to some extent) is unsuitable in scenarios where the communication medium between the local (operator's) and remote (robot's) environments is challenged by long delays and limited communication windows or bandwidth. My system immerses the operator in Augmented Reality, where they can see a visualization of the robot's environment, monitor and supervise task execution by setting high-level goals for the robot, and correct any errors the robot reports back during execution.
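To illustrate the core idea (the message names and fields below are my own, for exposition only, not the system's actual API): in supervisory control, the operator sends compact, delay-tolerant high-level goals across the constrained link instead of a continuous command stream, and the robot executes them autonomously, reporting back only outcomes and errors.

```python
# Illustrative sketch of the supervisory loop; all names and message
# fields are assumptions for exposition, not the actual system's API.
from dataclasses import dataclass

@dataclass
class Goal:                 # set by the operator in AR
    part: str               # which module to place
    pose: tuple             # target pose in the robot's frame

@dataclass
class ExecutionReport:      # sent back by the robot after execution
    goal: Goal
    success: bool
    error: str | None       # e.g. "grasp failed", visualized in AR

def supervise(link, goals):
    """Operator-side loop: send goals, await reports, correct on errors."""
    for goal in goals:
        link.send(goal)            # one small message per goal
        report = link.receive()    # may arrive seconds or minutes later
        if not report.success:
            # The operator inspects the reported error in AR and
            # re-issues a corrected goal over the same link.
            link.send(goal)
```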

There is an ongoing collaboration with the SpaceR Space Robotics Research Group at the University of Luxembourg to test my system in a space robotics use case, in which the robot has to assemble a modular solar panel structure in space. The proof of concept was presented at iSpaRo (the International Conference on Space Robotics) in Luxembourg in June 2024. The system has the potential to be generalized to nuclear and underwater robotics scenarios.

Visit my profile on the official Cardiff University staff website.

Proof of concept (iSpaRo'24 conference workshop paper)

The proof of concept demonstrated the feasibility of my proposed approach and served as a validation step in my research. NASA, Airbus, and DLR (the German Aerospace Center) have made clear that autonomy will be a core feature of any future in-space robotics system.

Setting high-level goals (green placeholders) for the virtual surrogate of the Franka Emika manipulator robot in a structure assembly task.

iSpaRo'24 conference proof of concept.


1st Iteration of proposed supervisory control scheme

The development of the proof of concept and the feedback received led to the first iteration of my proposed solution, featuring high-level goal scene specification, and error visualization and correction in Augmented Reality. Further, the original system has been completely rewritten to support ROS 2 and now uses Behavior Trees as the underlying control architecture. Behavior Trees are dynamically generated at runtime from the goal scene specification, and can also be modified automatically at runtime in response to user feedback when errors occur during execution.
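As a minimal sketch of what runtime tree generation can look like (assuming the py_trees library and an illustrative goal-scene format; the leaf behaviours are placeholders, not the system's actual components):

```python
# Minimal sketch: build a behavior tree at runtime from a goal scene.
# Assumes the py_trees library; leaf behaviours and the goal format
# are illustrative placeholders, not the system's actual components.
import py_trees

class MoveTo(py_trees.behaviour.Behaviour):
    """Placeholder leaf: move the manipulator to a part's goal pose."""
    def __init__(self, part, pose):
        super().__init__(name=f"MoveTo({part})")
        self.pose = pose

    def update(self):
        # A real leaf would command the robot and report RUNNING/FAILURE.
        return py_trees.common.Status.SUCCESS

class Attach(py_trees.behaviour.Behaviour):
    """Placeholder leaf: attach the part to the structure."""
    def __init__(self, part):
        super().__init__(name=f"Attach({part})")

    def update(self):
        return py_trees.common.Status.SUCCESS

def tree_from_goal_scene(goal_scene):
    """One MoveTo/Attach pair per part the operator placed in AR."""
    root = py_trees.composites.Sequence(name="AssembleStructure", memory=True)
    for goal in goal_scene:
        root.add_children([MoveTo(goal["part"], goal["pose"]),
                           Attach(goal["part"])])
    return py_trees.trees.BehaviourTree(root)

# Example goal scene: two connectors placed by the operator in AR.
tree = tree_from_goal_scene([
    {"part": "connector_1", "pose": (0.4, 0.1, 0.2)},
    {"part": "connector_2", "pose": (0.4, 0.3, 0.2)},
])
tree.tick()  # on error, the tree can be regenerated or patched at runtime
```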



By-products of the system's development

3D model of one of the connector parts in the simulation.


Robot Navigation Commands via Augmented Reality Interfaces  

BSc graduation project (7 February 2023 - 22 March 2023)


Unitree A1 quadruped robot


The goal of this project is to investigate how Augmented Reality (AR) can facilitate interaction with and control of a robot, in particular the Unitree A1 quadruped. The project explores the different modalities that can be used to give navigation commands to a robot via an AR Head-Mounted Display (HMD), and investigates how intuitive AR is for understanding robot-generated data by displaying holographic content superimposed on the real environment of the person controlling the robot.

The system comprises the following main components:

  • AR interface
  • Robot Operating System (ROS)
  • Microsoft HoloLens 2 headset



Four modalities enable the user to give navigation commands to the robot.

The user uses a near-interaction menu to control which modalities are active at any time, and to publish, cancel, or unset a goal. Once the user has set a goal in the real environment using any of the modalities, they can publish it to the robot, which then moves to the corresponding coordinates in virtual space (a rough sketch of this step follows below). Because of how the system is implemented, the virtual surrogate robot can be swapped for the real one in real-life applications (e.g. search and rescue missions in charred ruins, or exploration of radioactive areas).
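For instance, publishing a goal pose selected in AR could look roughly like this (a sketch assuming ROS 1's rospy and the navigation stack's conventional /move_base_simple/goal topic; the project's actual topic names may differ):

```python
# Rough sketch: publish a goal pose chosen in the AR interface so the
# (virtual or real) robot navigates to it. Assumes ROS 1 / rospy and
# the conventional /move_base_simple/goal topic.
import rospy
from geometry_msgs.msg import PoseStamped

rospy.init_node("ar_goal_publisher")
pub = rospy.Publisher("/move_base_simple/goal", PoseStamped, queue_size=1)

def publish_goal(x, y):
    """Send a 2D navigation goal in the map frame."""
    goal = PoseStamped()
    goal.header.frame_id = "map"
    goal.header.stamp = rospy.Time.now()
    goal.pose.position.x = x
    goal.pose.position.y = y
    goal.pose.orientation.w = 1.0  # identity orientation
    pub.publish(goal)

rospy.sleep(1.0)        # give the publisher time to connect
publish_goal(2.0, 0.5)  # coordinates the user placed in AR
```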

Technologies involved:

  • Robot Operating System (ROS)
  • Unity Engine & Editor
  • Microsoft’s Mixed Reality Toolkit (MRTK)
  • C#
  • Gazebo Simulation
  • Python
  • Octomap
  • Unity Robotics’ ROS TCP Connector
  • Unity Robotics’ Visualizations
  • Cardiff University Computational Robotics software

Project: Boost - devStav

[Game Under Development]

Side project to practice game development in Unity with C#.


Play online or Download from the Play Store

Self-Hosting

This website was developed using the MERN stack in TypeScript. It is hosted on a Raspberry Pi 4 and served under my stavro.dev domain. Cloudflare Zero Trust tunnels are used so that no open ports on my home network are exposed to the internet. My ioannismarios.com domain currently redirects to stavro.dev.
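For illustration, a cloudflared ingress configuration of the kind this setup relies on might look roughly as follows (the tunnel ID, credentials path, and local port are placeholders, not my actual values):

```yaml
# Illustrative cloudflared config; IDs, paths, and port are placeholders.
tunnel: <TUNNEL_ID>
credentials-file: /home/pi/.cloudflared/<TUNNEL_ID>.json

ingress:
  # Route the public hostname to the locally running server over the
  # outbound tunnel -- no inbound ports are opened on the home network.
  - hostname: stavro.dev
    service: http://localhost:3000
  # Reject anything that does not match a rule above.
  - service: http_status:404
```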

Kernel Core

Kernel Core is a sample e-commerce website written in Python using Flask. It was first developed as a Year 1 university project and has since been refactored to use the NoSQL MongoDB database rather than a relational one. It can be accessed via kc.ioannismarios.com.
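A minimal sketch of the Flask-plus-MongoDB shape of that refactor (the route, database, and collection names here are illustrative assumptions, not Kernel Core's actual code):

```python
# Illustrative sketch: a Flask endpoint backed by MongoDB. Route,
# database, and collection names are assumptions for exposition.
from flask import Flask, jsonify
from pymongo import MongoClient

app = Flask(__name__)
products = MongoClient("mongodb://localhost:27017")["kernelcore"]["products"]

@app.route("/products")
def list_products():
    # Documents replace the rows of the original relational schema;
    # _id is excluded because ObjectId is not JSON-serializable by default.
    return jsonify(list(products.find({}, {"_id": 0})))

if __name__ == "__main__":
    app.run(debug=True)
```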

Neighbourfy

Neighbourfy is a Year 2 university group project on which I was a co-developer. It is a sample gardening tool sharing website. My responsibilities included helping design and implement the database, integrating the ArcGIS maps API, implementing the personal profile page, and retouching the aesthetics of the website. It can be accessed via neighbourfy.com.

Computer Vision powered IoT security camera

A concept for a security alarm system consisting of a door-mounted device that detects movement and is disarmed using hand gestures. An admin control panel displaying processed data and statistics is also provided. I was a co-developer of this Year 2 university group project.
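As a rough sketch of the disarm-by-gesture idea (the libraries and the open-palm rule below are my assumptions, not necessarily what the project used):

```python
# Hypothetical sketch: disarm when an open palm is detected, using
# MediaPipe Hands. Library choice and the open-palm heuristic are
# assumptions for illustration, not the project's actual code.
import cv2
import mediapipe as mp

hands = mp.solutions.hands.Hands(max_num_hands=1, min_detection_confidence=0.7)

def is_open_palm(frame_bgr) -> bool:
    """Return True if a single open palm is visible in the frame."""
    results = hands.process(cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2RGB))
    if not results.multi_hand_landmarks:
        return False
    lm = results.multi_hand_landmarks[0].landmark
    # Fingers index..pinky count as extended when the tip sits above the
    # joint two landmarks below it (image y grows downwards); the tips
    # are landmarks 8, 12, 16, and 20.
    extended = sum(1 for tip in (8, 12, 16, 20) if lm[tip].y < lm[tip - 2].y)
    return extended == 4

cap = cv2.VideoCapture(0)  # door-mounted camera
armed = True
while armed:
    ok, frame = cap.read()
    if ok and is_open_palm(frame):
        armed = False  # gesture recognized: disarm the alarm
cap.release()
```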