A production-grade mobile robotics project demonstrating sim-to-real architecture with
NVIDIA Isaac Sim 5.0 and ROS 2 Jazzy. The robot autonomously navigates a sequence of waypoints using
closed-loop control with real-time odometry feedback. The control stack is fully decoupled from the
simulator: the same code would drive a real JetBot without modification.
Both sides follow the "functional core, imperative shell" pattern:
pure math and state-machine logic are isolated from the ROS 2 and Isaac Sim integration,
making the core algorithms unit-testable without infrastructure.
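As a sketch of what that "functional core" separation can look like (the function name, gains, and tolerances below are illustrative assumptions, not this project's actual API), a pure waypoint-following step takes the current odometry pose and a target and returns velocity commands, with no ROS 2 imports anywhere:

```python
import math
from dataclasses import dataclass

@dataclass(frozen=True)
class Pose2D:
    x: float
    y: float
    theta: float  # heading in radians

def waypoint_cmd(pose, waypoint, tol=0.05, k_lin=0.5, k_ang=1.5):
    """Pure controller step: return (linear_vel, angular_vel, reached).

    Being a pure function of its inputs, this is unit-testable without
    a simulator; the imperative shell feeds in odometry and publishes
    the result as a velocity command.
    """
    dx = waypoint[0] - pose.x
    dy = waypoint[1] - pose.y
    dist = math.hypot(dx, dy)
    if dist < tol:
        return 0.0, 0.0, True
    heading_err = math.atan2(dy, dx) - pose.theta
    # Wrap the heading error into [-pi, pi].
    heading_err = math.atan2(math.sin(heading_err), math.cos(heading_err))
    # Turn in place when badly misaligned; otherwise drive while correcting.
    linear = k_lin * dist if abs(heading_err) < math.pi / 4 else 0.0
    return linear, k_ang * heading_err, False
```

The imperative shell then reduces to subscribing to `/odom`, calling this function, publishing the result on `/cmd_vel`, and advancing to the next waypoint when `reached` is true.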
- Phase 4: Nav2 stack, SLAM Toolbox, behavior trees
- Phase 5: Unit tests, Docker, GitHub Actions CI/CD
🤖 Other Work
sEMG-Based Gesture Control for Shadow Robot Hand
University of Hertfordshire · Jun 2024 – Sep 2024
Developed a real-time ROS-based control pipeline for a 25-DOF Shadow Robot Hand,
integrating sEMG-driven ML inference for gesture-based manipulation.
- Implemented low-latency sEMG signal acquisition using the Myo armband
- Designed and trained ML models (SVM, CNN, LSTM) for gesture recognition
- Integrated ML inference directly into the control loop
- Validated the pipeline in Gazebo simulation and on physical hardware
ROS · Shadow Robot Hand (25-DOF) · sEMG · PyTorch · Gazebo
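One practical concern when putting gesture inference directly into a control loop is that single misclassified frames must not twitch the hand. A common mitigation, sketched below with hypothetical names and window size (this README does not describe the project's actual smoothing strategy), is majority-vote filtering over recent predictions:

```python
from collections import Counter, deque

class GestureFilter:
    """Majority-vote smoothing over a sliding window of per-frame
    classifier predictions, so transient misclassifications are
    suppressed before commands reach the hand controller."""

    def __init__(self, window=5):
        self.history = deque(maxlen=window)

    def update(self, predicted_label):
        """Feed one classifier output; return a gesture to act on,
        or None while no label dominates the window."""
        self.history.append(predicted_label)
        label, count = Counter(self.history).most_common(1)[0]
        return label if count > len(self.history) // 2 else None

f = GestureFilter(window=5)
# A single noisy "pinch" frame amid "fist" frames never reaches the hand.
outputs = [f.update(g) for g in ["fist", "fist", "pinch", "fist", "fist"]]
```

The same idea generalizes to confidence thresholds or dwell times; the trade-off is a few frames of added latency in exchange for stable commands.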
🌱 Vision
My long-term goal is to develop affordable, real-world robotics systems
that can learn from their environment through observation and interaction.
I'm particularly interested in learning-driven robots that improve their
behaviour over time using perception and feedback, with a focus on practical, everyday applications.
The aim is to reduce complexity and cost while enabling intelligent robotic systems
that can operate effectively in real-world environments.