Hello, I'm

Muhammad Umar

I am an Erasmus Mundus Scholar and a Master's student in Intelligent Field Robotic Systems (IFRoS), a joint master's program taught at the Universitat de Girona, Spain, and Eötvös Loránd University, Budapest. I am currently a Master's thesis student at Eurecat Technology Center, Barcelona, working on ground robotic systems; my research topic is vision-based reactive navigation for agricultural robotics applications. I am passionate about space robotics, and my ultimate goal is to work with a space program and help revolutionize the space industry through a mix of robotics and AI.
View Resume

Projects


Package Delivery Robot using AgileX Scout Robot

Created a Gazebo environment for the newly arrived AgileX Scout robots at ELTE, Budapest. By integrating the Xsens 600 INU module, we obtain both IMU and GPS data, which are fused with laser odometry and the robot's wheel odometry through SLAM for navigation. Path planning and motion control are handled by the move_base package in ROS. We also integrated road segmentation in ROS and OpenCV for future applications where the robot should navigate only on roads.
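
For illustration, a minimal sketch of how such a road-segmentation node could be wired into ROS with OpenCV; the topic names and the HSV threshold below are assumptions for the sketch, not the values used on the Scout:

#!/usr/bin/env python3
# Minimal road-segmentation node sketch; topic names and HSV range are assumed.
import rospy
import cv2
import numpy as np
from sensor_msgs.msg import Image
from cv_bridge import CvBridge

bridge = CvBridge()

def image_cb(msg):
    # Convert the ROS image to an OpenCV BGR array.
    frame = bridge.imgmsg_to_cv2(msg, desired_encoding="bgr8")
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    # Assumed HSV range for grey asphalt; tune for the actual camera and scene.
    mask = cv2.inRange(hsv, (0, 0, 60), (180, 40, 200))
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    pub.publish(bridge.cv2_to_imgmsg(mask, encoding="mono8"))

if __name__ == "__main__":
    rospy.init_node("road_segmentation")
    pub = rospy.Publisher("/road_mask", Image, queue_size=1)
    rospy.Subscriber("/camera/image_raw", Image, image_cb, queue_size=1)
    rospy.spin()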

Frontier-Based Exploration using Hybrid-A* Planner

Implemented frontier-based exploration and Hybrid-A* from scratch in Python using LiDAR in ROS Noetic. The code was tested both in Gazebo simulation and in the real world on TurtleBot3 robots. Extended the traditional A* algorithm to non-holonomic vehicles by incorporating vehicle dynamics inspired by Reeds-Shepp curves.
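
The core difference from grid A* is the node expansion: successors are generated by forward-integrating a simple bicycle model over a small set of steering angles, in forward and reverse as in Reeds-Shepp motions. A minimal sketch (parameter values are illustrative, not the ones used on the TurtleBot3):

import math

# Illustrative parameters only.
WHEELBASE = 0.3                 # m
STEP = 0.2                      # arc length per expansion, m
STEERING = [-0.35, 0.0, 0.35]   # candidate steering angles, rad

def expand(x, y, theta, reverse=False):
    """Generate kinematically feasible successors of a Hybrid-A* node."""
    direction = -1.0 if reverse else 1.0
    successors = []
    for delta in STEERING:
        # Bicycle-model Euler step over one motion primitive.
        nx = x + direction * STEP * math.cos(theta)
        ny = y + direction * STEP * math.sin(theta)
        ntheta = theta + direction * STEP * math.tan(delta) / WHEELBASE
        ntheta = math.atan2(math.sin(ntheta), math.cos(ntheta))  # wrap to [-pi, pi]
        successors.append((nx, ny, ntheta))
    return successors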

EKF-SLAM using Corner Features

Implemented EKF-SLAM from scratch by detecting the corners of a room with a 360° LiDAR in an unknown environment. Corners are detected by intersecting line segments fitted to the points of the laser scan.
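
A minimal sketch of the corner-extraction step, i.e. intersecting two fitted line segments; the line fitting itself and the EKF prediction/update are omitted:

import numpy as np

def line_intersection(p1, p2, p3, p4):
    """Corner candidate: intersection of segment (p1, p2) with segment (p3, p4).

    Each point is a 2D numpy array in the robot frame; returns None when the
    segments are (nearly) parallel.
    """
    d1, d2 = p2 - p1, p4 - p3
    denom = d1[0] * d2[1] - d1[1] * d2[0]
    if abs(denom) < 1e-9:
        return None
    t = ((p3[0] - p1[0]) * d2[1] - (p3[1] - p1[1]) * d2[0]) / denom
    return p1 + t * d1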

Object Recognition using PyTorch in ROS

Implemented object detection using transfer learning on the COCO dataset and analyzed its performance with Voxel51. The inference detector was later integrated into ROS for a pick-and-place robot application.
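
For illustration, a minimal inference sketch with a COCO-pretrained detector from torchvision; the detector actually fine-tuned in the project may differ:

import torch
import torchvision

# Assumed architecture for the sketch: a COCO-pretrained Faster R-CNN from
# torchvision (weights="DEFAULT" requires torchvision >= 0.13).
model = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT")
model.eval()

image = torch.rand(3, 480, 640)  # placeholder for a camera frame (C, H, W in [0, 1])
with torch.no_grad():
    prediction = model([image])[0]

keep = prediction["scores"] > 0.5
print(prediction["boxes"][keep], prediction["labels"][keep])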

TRACK-E: Smartphone's IMU-Based Human Following Robot (Bachelor's Final Year Project)

Designed a robot capable of tracking and following a person through the IMU sensors of a smartphone. Used raw accelerometer and magnetometer data, transferred over UDP, for distance and heading estimation. Implemented tilt compensation and a dual PID controller on a Raspberry Pi to follow the person using the sensor data.
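
The heading estimate relies on standard tilt compensation: roll and pitch are recovered from gravity, and the magnetometer reading is projected onto the horizontal plane. A minimal sketch, with axis and sign conventions assumed (they depend on the phone's sensor frame):

import math

def tilt_compensated_heading(ax, ay, az, mx, my, mz):
    """Heading (rad) from accelerometer (ax, ay, az) and magnetometer (mx, my, mz).

    One common formulation; the correct signs depend on the sensor axis
    conventions of the particular phone.
    """
    roll = math.atan2(ay, az)
    pitch = math.atan2(-ax, math.hypot(ay, az))
    # Project the magnetic field onto the horizontal plane.
    mx_h = mx * math.cos(pitch) + mz * math.sin(pitch)
    my_h = (mx * math.sin(roll) * math.sin(pitch)
            + my * math.cos(roll)
            - mz * math.sin(roll) * math.cos(pitch))
    return math.atan2(my_h, mx_h)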

Stereo Vision: Matching and Display

Implemented stereo matching techniques and 3D display using the Point Cloud Library (PCL). The following stereo matching algorithms were implemented (a minimal naive-matching sketch follows the list):
  • Naive Stereo Matching
  • Dynamic Programming
  • OpenCV's StereoSGBM
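
As referenced above, a minimal sketch of the naive block-matching step, scoring each candidate disparity with the sum of absolute differences (SAD); the window size and disparity range are illustrative. Dynamic programming adds a per-scanline smoothness cost on top of the same per-pixel matching cost.

import numpy as np

def naive_stereo_sad(left, right, max_disp=64, win=5):
    """Naive block matching on rectified greyscale float images.

    For each pixel, pick the disparity with the smallest SAD over a square
    window; max_disp and win are illustrative defaults.
    """
    h, w = left.shape
    half = win // 2
    disparity = np.zeros((h, w), dtype=np.float32)
    for y in range(half, h - half):
        for x in range(half + max_disp, w - half):
            patch_l = left[y - half:y + half + 1, x - half:x + half + 1]
            best_d, best_cost = 0, np.inf
            for d in range(max_disp):
                patch_r = right[y - half:y + half + 1,
                                x - d - half:x - d + half + 1]
                cost = np.abs(patch_l - patch_r).sum()
                if cost < best_cost:
                    best_cost, best_d = cost, d
            disparity[y, x] = best_d
    return disparity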

Image Filters using Point Cloud Library with QT5/VTK GUI

Implemented several image filtering and upsampling filters, and built a 3D display GUI using VTK and Qt5. The following filters were implemented (a brief bilateral-filter sketch follows the list):
  • Bilateral Filter
  • Bilateral Median Filter
  • Joint Bilateral and Joint Bilateral Median Filter
  • Joint Bilateral Upsampling and Joint Bilateral Median Upsampling
  • Iterative Upsampling
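
As referenced above, a minimal sketch of the plain bilateral filter, which weights each neighbour by both spatial distance and intensity difference; parameter defaults are illustrative. In the joint variants, the range weight is computed from a guidance image instead of the filtered image itself.

import numpy as np

def bilateral_filter(img, win=5, sigma_s=2.0, sigma_r=0.1):
    """Plain bilateral filter on a greyscale float image in [0, 1]."""
    half = win // 2
    ys, xs = np.mgrid[-half:half + 1, -half:half + 1]
    spatial = np.exp(-(xs ** 2 + ys ** 2) / (2.0 * sigma_s ** 2))
    padded = np.pad(img, half, mode="edge")
    out = np.zeros_like(img)
    h, w = img.shape
    for y in range(h):
        for x in range(w):
            patch = padded[y:y + win, x:x + win]
            # Range kernel: penalise neighbours with different intensities.
            rng = np.exp(-((patch - img[y, x]) ** 2) / (2.0 * sigma_r ** 2))
            weights = spatial * rng
            out[y, x] = (weights * patch).sum() / weights.sum()
    return out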

Point Cloud Reconstruction using ICP and TrICP

Implemented the Iterative Closest Point (ICP) and Trimmed ICP (TrICP) algorithms for point cloud reconstruction. The following components were implemented (an SVD-alignment sketch follows the list):
  • Nearest Neighbour search (kd-trees using nanoflann)
  • Rotation Estimation (using SVD)
  • Iterative Closest Point (ICP)
  • Trimmed Iterative Closest Point (TrICP)
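
As referenced above, a sketch of the SVD-based rigid alignment step run inside every ICP iteration; in TrICP, only the best-matching fraction of correspondences (smallest residuals) is kept before this step:

import numpy as np

def best_fit_transform(src, dst):
    """Rigid transform (R, t) aligning matched points src -> dst via SVD.

    src and dst are (N, 3) arrays of corresponding points; this is the
    least-squares step of each ICP / TrICP iteration.
    """
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)   # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:              # avoid reflections
        Vt[-1, :] *= -1
        R = Vt.T @ U.T
    t = dst_c - R @ src_c
    return R, t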

Deep Learning Projects

  • Trained an image classifier on the Fruits-360 dataset with VGG16, using PyTorch and transfer learning (a fine-tuning sketch follows this list).
  • Fine-tuned Mask R-CNN on a custom dataset (sunflowers) for segmentation.
  • Implemented an encoder architecture and fine-tuned an encoder-decoder model on the COCO image captioning dataset.
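
As referenced in the first item, a minimal sketch of the VGG16 fine-tuning setup: freeze the convolutional backbone and replace the classifier head. The class count shown matches one version of Fruits-360 but is an assumption for the sketch.

import torch.nn as nn
import torchvision

# Assumed setup: freeze the VGG16 features and retrain only the classifier head.
model = torchvision.models.vgg16(weights="DEFAULT")
for param in model.features.parameters():
    param.requires_grad = False

num_classes = 131  # illustrative; depends on the Fruits-360 version used
model.classifier[6] = nn.Linear(model.classifier[6].in_features, num_classes)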

Research Work


Smartphone's IMU-Based Human Following Robot
M. Umar, M. Ijaz, M. Naqvi, A. Ashraf
Department of Electrical Engineering, University of Engineering and Technology, Lahore