Hi everyone!
It has been a long while since I last posted to LMR; I hope you're all doing well and spreading the cult of hobby robotics!!
Lately I've been thinking that it's been too long since I last shared any work of mine (seems like my last activity was 4 years ago, MENTAL!!).
That said, I remember the community being very welcoming, ready to help and driven!!
So here's my latest idea - LabAtar. LabAtar is a fully immersive teleoperation robotics project. This robotic control application lets the operator be present from the robot's point of view: the user remotely sees what the robot's cameras observe, and the operator's body movements are mapped directly onto the humanoid, all in real time.
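To give a flavour of the "body movements mapped directly to the humanoid" idea, here's a minimal Python sketch of the kind of head-tracking to neck-servo mapping involved. This is not the actual LabAtar code; the joint limits, the fake tracker reading and all the names are hypothetical placeholders just to illustrate the 1:1 mapping concept.

```python
# Minimal sketch of a head-tracking -> neck-servo mapping (NOT the actual
# LabAtar code). Joint limits and the stand-in tracker are hypothetical.
import time

PAN_RANGE = (-90.0, 90.0)   # hypothetical neck pan limits (degrees)
TILT_RANGE = (-45.0, 45.0)  # hypothetical neck tilt limits (degrees)

def clamp(value, lo, hi):
    """Keep a commanded angle inside the servo's mechanical limits."""
    return max(lo, min(hi, value))

def map_head_pose(yaw_deg, pitch_deg):
    """Map the operator's head yaw/pitch (e.g. from a headset IMU)
    directly onto the robot's neck pan/tilt servos, 1:1."""
    pan = clamp(yaw_deg, *PAN_RANGE)
    tilt = clamp(pitch_deg, *TILT_RANGE)
    return pan, tilt

def fake_tracker_reading(t):
    """Stand-in for a real head-tracker read; sweeps the head left/right."""
    return (60.0 * ((t % 2.0) - 1.0), 10.0)

if __name__ == "__main__":
    for step in range(10):
        yaw, pitch = fake_tracker_reading(step * 0.2)
        pan, tilt = map_head_pose(yaw, pitch)
        # In a real system these commands would be streamed to the servo
        # controller every frame; here we just print them.
        print("pan=%.1f deg, tilt=%.1f deg" % (pan, tilt))
        time.sleep(0.05)
```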
This undertaking is my individual engineering project as part of the MEng Mechatronics and Robotics course at the University of Leeds. Due to module timing arrangements, hands-on work only began in January and the submission deadline was in early May, which gave me about 4 months to complete the development on my own, in addition to attending other University modules. For a project of this scale with such tight time limits, it was crucial to choose tools that were not only simple to use but also reliable and powerful enough to meet all the goals I had set. I used two programming languages side by side to get it all working: NI LabVIEW and Python 2.7.
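The post doesn't spell out how the LabVIEW and Python sides talk to each other, but one common way to bridge the two is a small TCP socket link, since LabVIEW ships with TCP read/write VIs. Below is a hedged sketch of that pattern: the port number and the comma-separated "joint,angle" message format are assumptions for illustration, not LabAtar's actual protocol.

```python
# Sketch of a Python TCP server that a LabVIEW TCP client could connect to.
# Port and message format ("joint,angle\n") are assumptions, not LabAtar's protocol.
import socket

HOST, PORT = "0.0.0.0", 5005  # hypothetical listening address

def handle_line(line):
    """Parse a hypothetical 'joint,angle' command and acknowledge it."""
    joint, angle = line.strip().split(",")
    print("set %s to %s degrees" % (joint, angle))
    return "OK\n"

def serve():
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    srv.bind((HOST, PORT))
    srv.listen(1)
    conn, addr = srv.accept()
    print("LabVIEW side connected from %s" % (addr,))
    buf = ""
    while True:
        data = conn.recv(1024)
        if not data:
            break
        buf += data.decode("ascii")
        # Process one newline-terminated command at a time.
        while "\n" in buf:
            line, buf = buf.split("\n", 1)
            conn.sendall(handle_line(line).encode("ascii"))
    conn.close()
    srv.close()

if __name__ == "__main__":
    serve()
```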
If you have any questions please ask below and I will happily explain everything that I can!
For more info see: https://decibel.ni.com/content/docs/DOC-42699
Thanks as always!
Controlled immersively: FPV video feedback projected to the Oculus Rift in real time.
This is a companion discussion topic for the original entry at https://community.robotshop.com/robots/show/labatar-humanoid-robot-teleoperation-system