
Probability-Based Map Fusion — Research Paper

Published method for fusing occupancy maps with neural network object labels

2025 · Research · California State University, Stanislaus · Author, Researcher
Python · ROS2 · Keras

Problem Statement

SLAM-based robotic mapping systems produce geometric representations of the environment but lack semantic understanding of what objects are present and where. Bridging this gap by incorporating neural network object labels into the map calls for a lightweight probabilistic approach that remains tractable on resource-constrained robotic hardware.

Solution

Developed a probability-based fusion algorithm that aggregates neural network object detections over time. Each observation is weighted by its distance from stored map points using radial basis functions, and the midpoint circle algorithm efficiently filters which spatial cells a detection can influence. The method produces a probabilistic object map incrementally, without expensive re-computation of past observations.
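A minimal sketch of what such an incremental fusion update could look like, assuming a 2-D grid map keyed by cell coordinates and a Gaussian radial basis function. The function names, the `sigma` parameter, and the blending rule are illustrative assumptions, not the paper's exact formulation:

```python
import math

def rbf_weight(cell, detection_center, sigma=1.5):
    """Gaussian RBF: weight decays with squared distance from the
    detection center (sigma is a hypothetical tuning parameter)."""
    d2 = (cell[0] - detection_center[0]) ** 2 + (cell[1] - detection_center[1]) ** 2
    return math.exp(-d2 / (2 * sigma ** 2))

def fuse_detection(object_map, cells, detection_center, label, confidence):
    """Fold one detection into per-cell, per-label probabilities with a
    running weighted update, so past observations never need revisiting."""
    for cell in cells:
        w = rbf_weight(cell, detection_center)
        probs = object_map.setdefault(cell, {})
        prior = probs.get(label, 0.0)
        # Incremental update: blend prior belief with the weighted observation.
        probs[label] = prior + w * confidence * (1.0 - prior)
    return object_map
```

Because each update only reads and writes the current belief for the affected cells, the cost per detection is linear in the number of filtered cells, which is what keeps the approach tractable on embedded hardware.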

Impact

Demonstrated that lightweight, probability-based methods can produce richer, context-aware maps that support object search, counting, and location estimation — improving robot decision-making without sacrificing performance.

Technical Highlights

  • Probability-based aggregation of object detections over time
  • Weighted distance using radial basis functions for spatial inference
  • Midpoint circle algorithm for efficient occupancy cell traversal
  • Fuses SLAM occupancy grid data with CNN object detection labels (YOLO, EfficientNet, MobileNet)
  • Supports searchable object databases and occurrence estimation in mapped regions
  • Designed for onboard efficiency on resource-constrained robotic hardware
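The midpoint circle algorithm mentioned above can be sketched as follows; this is the standard integer-only rasterization, used here (as an assumption about how it plugs into the pipeline) to bound which occupancy cells fall on the perimeter of a detection's influence radius:

```python
def midpoint_circle_cells(cx, cy, r):
    """Integer-only midpoint circle algorithm: returns the grid cells on
    a circle of radius r centered at (cx, cy). Avoids floating-point
    distance checks across the whole grid."""
    cells = set()
    x, y = r, 0
    err = 1 - r
    while x >= y:
        # Mirror the computed octant point into all eight octants.
        for dx, dy in ((x, y), (y, x), (-x, y), (-y, x),
                       (x, -y), (y, -x), (-x, -y), (-y, -x)):
            cells.add((cx + dx, cy + dy))
        y += 1
        if err < 0:
            err += 2 * y + 1
        else:
            x -= 1
            err += 2 * (y - x) + 1
    return cells
```

Since the algorithm uses only integer addition and comparison, it maps cleanly onto the resource-constrained hardware the method targets.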

Key Metrics

Venue: CSU Stanislaus Research
Domain: Robotics / Computer Vision
Method: Probabilistic Sensor Fusion