Murad Mebrahtu

Autonomous Systems & Perception Engineer
Perception • Field Testing • ROS2 • CARLA • Autonomous Vehicles


🚗 About

I work on perception systems and vehicle-level validation for autonomous vehicles, with hands-on experience spanning development, simulation, and real-world field testing. My focus is on building robust LiDAR–camera perception pipelines and validating them across CARLA simulation and on-vehicle deployments.

I regularly work with ROS2, CARLA, sensor calibration, ROS bag analysis, and simulation tools to ensure perception outputs are reliable in dynamic, safety-critical environments.

Currently at the Autonomous Vehicles Lab (AV-Lab), Khalifa University.


🔧 What I Work On

  • LiDAR–camera perception pipelines for autonomous vehicles
  • 3D detection, tracking, and trajectory prediction in ROS2
  • CARLA-based simulation testing for perception and autonomy validation
  • Field testing & system validation: sensor calibration, data collection, debugging
  • Simulation-to-real workflows using Gazebo, RViz, and CARLA
  • Real-time and edge deployment of perception models
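The LiDAR–camera fusion work above ultimately rests on projecting LiDAR points into the camera image plane. A minimal NumPy sketch of that projection, assuming a known LiDAR-to-camera extrinsic transform `T_cam_lidar` and intrinsic matrix `K` (illustrative names, not tied to any repo here):

```python
import numpy as np

def project_lidar_to_image(points_lidar, T_cam_lidar, K):
    """Project LiDAR points into pixel coordinates.

    points_lidar: (N, 3) points in the LiDAR frame.
    T_cam_lidar:  (4, 4) homogeneous extrinsic transform, LiDAR -> camera.
    K:            (3, 3) camera intrinsic matrix.
    Returns (M, 2) pixel coordinates for the points in front of the camera.
    """
    n = points_lidar.shape[0]
    homog = np.hstack([points_lidar, np.ones((n, 1))])   # (N, 4) homogeneous
    points_cam = (T_cam_lidar @ homog.T).T[:, :3]        # (N, 3) in camera frame
    points_cam = points_cam[points_cam[:, 2] > 0]        # keep points with z > 0
    pixels = (K @ points_cam.T).T                        # perspective projection
    return pixels[:, :2] / pixels[:, 2:3]                # normalize by depth
```

With identity extrinsics and a 500 px focal length centered at (320, 240), a point 10 m straight ahead lands at the principal point; a real calibration would supply `T_cam_lidar` and `K` from the sensor setup.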

🧰 Tech Stack

Languages & Frameworks

  • Python, C++
  • ROS2 (ROS1), PyTorch

Simulation & Autonomy

  • CARLA Simulator
  • Gazebo, RViz

Perception & Deployment

  • LiDAR & camera fusion
  • YOLO-based detection, tracking
  • Trajectory prediction
  • TensorRT, Docker

Systems

  • Linux

🚀 Featured Work

End-to-End Autonomous Vehicle Perception Pipeline
LiDAR–camera perception system validated in CARLA simulation, Gazebo, and real-world field testing, supporting detection, tracking, and prediction.

▶️ Demo: https://www.youtube.com/watch?v=ZYhhkAWVly0
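As a toy illustration of the prediction stage in such a pipeline, a constant-velocity extrapolation over tracked positions is the usual baseline (this is a generic sketch, not the pipeline's actual prediction model):

```python
import numpy as np

def predict_constant_velocity(track, horizon, dt=0.1):
    """Extrapolate a track of past (x, y) positions forward in time.

    track:   (T, 2) observed positions sampled at dt spacing.
    horizon: number of future steps to predict.
    Returns (horizon, 2) predicted positions under constant velocity.
    """
    track = np.asarray(track, dtype=float)
    velocity = (track[-1] - track[-2]) / dt          # last-step velocity estimate
    steps = np.arange(1, horizon + 1)[:, None] * dt  # (horizon, 1) future offsets
    return track[-1] + steps * velocity
```

Learned predictors are benchmarked against exactly this kind of baseline, since it is hard to beat for short horizons in straight-line traffic.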


🔗 Links

  • Probabilistic_Pedestrian_Trajectory_Prediction-PPTP (Python): thesis project on probabilistic pedestrian trajectory prediction.
  • AV-Lab/emt-dataset (Python): EMT, a comprehensive dataset for autonomous driving research containing 57 minutes of diverse urban traffic footage from the Gulf Region.
  • turtlebot4-navigation-testing (Python): 🤖 automated TurtleBot4 navigation testing suite with ROS 2 Jazzy, featuring statistical performance analysis, path tracking, Docker containerization, and comprehensive YAML reporting for Nav2 validation.
  • AV-Lab/Sensor_Setup (Python): code to configure, test, start, and calibrate camera and LiDAR sensors.
  • planner (Python): D* Lite for global planning and A* for local planning.
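For context on the planner repo's local-planning side, a minimal grid-based A* looks like the following (a self-contained sketch with a Manhattan heuristic, not the repo's actual implementation):

```python
import heapq
import itertools

def astar(grid, start, goal):
    """A* search on a 2D occupancy grid (0 = free, 1 = blocked).

    grid: list of rows; start, goal: (row, col) tuples.
    Returns the path as a list of cells, or None if unreachable.
    """
    rows, cols = len(grid), len(grid[0])

    def h(cell):  # Manhattan distance, admissible for 4-connected unit moves
        return abs(cell[0] - goal[0]) + abs(cell[1] - goal[1])

    tie = itertools.count()                    # tiebreaker for heap ordering
    open_set = [(h(start), next(tie), 0, start, None)]
    came_from = {}                             # closed set: cell -> parent
    g_best = {start: 0}                        # best known cost-to-come
    while open_set:
        _, _, g, cell, parent = heapq.heappop(open_set)
        if cell in came_from:                  # already expanded with lower cost
            continue
        came_from[cell] = parent
        if cell == goal:                       # reconstruct path via parents
            path = []
            while cell is not None:
                path.append(cell)
                cell = came_from[cell]
            return path[::-1]
        r, c = cell
        for nxt in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            nr, nc = nxt
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                ng = g + 1
                if ng < g_best.get(nxt, float("inf")):
                    g_best[nxt] = ng
                    heapq.heappush(open_set, (ng + h(nxt), next(tie), ng, nxt, cell))
    return None
```

D* Lite, used for global planning in the repo, extends this idea with incremental replanning so edge-cost changes do not force a full search from scratch.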