ROS2 Neural Network Robotics

Sensor fusion and spatial reasoning with neural networks

Fall 2025 · California State University, Stanislaus · Independent Study

Python · ROS2

Problem Statement

Robots need to understand not just what objects are present, but where they are in 3D space. This requires fusing data from multiple sensors with neural network predictions and mapping everything to the robot's coordinate frame.
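Mapping a detection into the robot's frame reduces to a 2D rigid-body transform given the robot's pose. The function below is a minimal sketch (not the project's actual code) assuming a pose of `(x, y, theta)` with heading in radians:

```python
import math

def world_to_robot(obj_x, obj_y, robot_x, robot_y, robot_theta):
    """Express a world-frame point in the robot's coordinate frame.

    robot_theta is the robot's heading in radians, measured in the
    world frame. We translate by the robot's position, then rotate
    by -theta so the robot's forward axis becomes +x.
    """
    dx, dy = obj_x - robot_x, obj_y - robot_y
    cos_t, sin_t = math.cos(robot_theta), math.sin(robot_theta)
    return (cos_t * dx + sin_t * dy, -sin_t * dx + cos_t * dy)
```

For example, a robot at (1, 1) facing +y (theta = π/2) sees an object at world (1, 2) as one unit straight ahead, i.e. (1, 0) in its own frame.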

Solution

Built a ROS2 environment for sensor data fusion. Developed occupancy-grid tooling to determine object positions from the robot's position and orientation. Implemented the midpoint circle algorithm for efficient spatial cell filtering. Created a socket-based CNN server for generating object predictions. Implemented distance calculations using both a weighted Euclidean distance and radial basis functions.

Impact

Demonstrated advanced robotics concepts combining perception (CNN), localization, and spatial reasoning. The system can accurately place detected objects in the robot's coordinate frame for navigation and planning.

Technical Highlights

  • Implemented ROS2 sensor fusion pipeline
  • Built occupancy grid system for spatial reasoning
  • Optimized spatial queries with midpoint circle algorithm
  • Created socket-based CNN inference server
  • Implemented multiple distance metrics for object localization
  • Integrated neural network predictions with spatial data
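The two distance metrics named above can be sketched in a few lines. This is an illustrative version under assumed signatures, not the project's code: a weighted Euclidean distance lets some dimensions (e.g. bearing vs. range) count more than others, while a Gaussian radial basis function turns distance into a bounded similarity score in (0, 1]:

```python
import math

def weighted_euclidean(p, q, w):
    """Weighted Euclidean distance: sqrt(sum_i w_i * (p_i - q_i)^2)."""
    return math.sqrt(sum(wi * (pi - qi) ** 2
                         for pi, qi, wi in zip(p, q, w)))

def rbf_similarity(p, q, gamma=1.0):
    """Gaussian RBF: exp(-gamma * ||p - q||^2).

    Returns 1.0 for identical points and decays smoothly toward 0,
    with gamma controlling how quickly similarity falls off.
    """
    sq = sum((pi - qi) ** 2 for pi, qi in zip(p, q))
    return math.exp(-gamma * sq)
```

With unit weights the weighted distance reduces to the ordinary Euclidean distance, so e.g. the distance from (0, 0) to (3, 4) is 5.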