Autonomous Systems & Perception Engineer
Perception • Field Testing • ROS2 • CARLA • Autonomous Vehicles
I work on perception systems and vehicle-level validation for autonomous vehicles, with hands-on experience spanning development, simulation, and real-world field testing. My focus is on building robust LiDAR–camera perception pipelines and validating them across CARLA simulation and on-vehicle deployments.
I work day-to-day with ROS2, CARLA, sensor calibration, and ROS bag analysis to ensure perception outputs stay reliable in dynamic, safety-critical environments.
Currently at the Autonomous Vehicles Lab (AVLAB), Khalifa University.
- LiDAR–camera perception pipelines for autonomous vehicles
- 3D detection, tracking, and trajectory prediction in ROS2
- CARLA-based simulation testing for perception and autonomy validation
- Field testing & system validation: sensor calibration, data collection, debugging
- Simulation-to-real workflows using Gazebo, RViz, and CARLA
- Real-time and edge deployment of perception models
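To give a flavor of the prediction work listed above, here is a deliberately minimal constant-velocity baseline, the simplest predictor a tracking stack gets compared against. It is an illustrative sketch only, not the learned models used in the lab:

```python
import numpy as np

def predict_constant_velocity(track, horizon):
    """Extrapolate a tracked object's 2D path assuming constant velocity.

    track:   (T, 2) array of past (x, y) positions, oldest first.
    horizon: number of future steps to predict.
    Returns a (horizon, 2) array of predicted positions.
    """
    track = np.asarray(track, dtype=float)
    # Per-step velocity estimated from the last two observed positions
    v = track[-1] - track[-2]
    # Roll the last position forward 1..horizon steps along that velocity
    steps = np.arange(1, horizon + 1)[:, None]
    return track[-1] + steps * v
```

For a target moving one meter per step along x, `predict_constant_velocity([[0, 0], [1, 0], [2, 0]], 3)` continues the line to `[[3, 0], [4, 0], [5, 0]]`.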
Languages & Frameworks
- Python, C++
- ROS2 (ROS1), PyTorch
Simulation & Autonomy
- CARLA Simulator
- Gazebo, RViz
Perception & Deployment
- LiDAR & camera fusion
- YOLO-based detection, tracking
- Trajectory prediction
- TensorRT, Docker
Systems
- Linux
End-to-End Autonomous Vehicle Perception Pipeline
LiDAR–camera perception system validated in CARLA simulation, Gazebo, and real-world field testing, supporting detection, tracking, and prediction.
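The core geometric step in any LiDAR–camera fusion pipeline is projecting 3D LiDAR points into the camera image. The sketch below shows that step with a standard pinhole model and NumPy; the extrinsic and intrinsic matrices are placeholders you would obtain from calibration, not values from this project:

```python
import numpy as np

def project_lidar_to_camera(points_lidar, T_cam_lidar, K):
    """Project Nx3 LiDAR points into pixel coordinates.

    points_lidar: (N, 3) points in the LiDAR frame.
    T_cam_lidar:  (4, 4) extrinsic transform, LiDAR frame -> camera frame.
    K:            (3, 3) camera intrinsic matrix.
    Returns (pixels, depths) for points in front of the camera.
    """
    # Homogeneous coordinates, then transform into the camera frame
    pts_h = np.hstack([points_lidar, np.ones((len(points_lidar), 1))])
    pts_cam = (T_cam_lidar @ pts_h.T).T[:, :3]

    # Keep only points with positive depth (in front of the camera)
    front = pts_cam[:, 2] > 0
    pts_cam = pts_cam[front]

    # Pinhole projection: apply intrinsics, then divide by depth
    uv = (K @ pts_cam.T).T
    pixels = uv[:, :2] / uv[:, 2:3]
    return pixels, pts_cam[:, 2]
```

With an identity extrinsic and a camera whose principal point is (320, 240), a point 10 m straight ahead on the optical axis lands on the principal point, which is a quick sanity check after any recalibration.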
- 📄 CV — https://murdism.github.io/resume/
- 💼 LinkedIn — https://linkedin.com/in/murad-s-mebrahtu-0311a0181
- 🌐 Portfolio — https://murdism.github.io
- 📧 Email — muradsmebrahtu@gmail.com