Conversation
Codecov Report

❌ Patch coverage is

Additional details and impacted files

```
@@            Coverage Diff             @@
##             main     #408      +/-   ##
==========================================
+ Coverage   87.46%   87.69%   +0.23%
==========================================
  Files          53       52       -1
  Lines        5957     6129     +172
  Branches      657      680      +23
==========================================
+ Hits         5210     5375     +165
- Misses        728      735       +7
  Partials       19       19
```
Pull request overview
This pull request adds betweenness centrality (BC) computation for both nodes and edges in the network graph, and integrates these metrics into the agent routing probability calculations. The implementation uses Brandes' algorithm for efficient computation of centrality measures.
Changes:
- Implemented Brandes' algorithm for computing node and edge betweenness centralities in the Network class
- Added betweenness centrality attributes to Node and Edge base classes with corresponding getters/setters
- Modified agent transition probability formula in RoadDynamics to incorporate betweenness centrality and refactored to use speed/length ratios
- Removed unused `isStation()` virtual method from Node base class (partially - see issue below)
Reviewed changes
Copilot reviewed 7 out of 7 changed files in this pull request and generated 11 comments.
| File | Description |
|---|---|
| test/mobility/Test_graph.cpp | Added comprehensive test cases for node and edge betweenness centrality computation covering linear chains, star topology, diamond graphs, single edges, and disconnected graphs |
| test/base/Test_node.cpp | Removed test for isStation() method that no longer exists in Node base class |
| src/dsf/mobility/Station.hpp | Commented out 'final' keyword on isStation() method (incomplete refactoring - see issues) |
| src/dsf/mobility/RoadDynamics.hpp | Modified agent transition probability formula to incorporate betweenness centrality and changed from multiplicative speeds to speed/length ratios |
| src/dsf/base/Node.hpp | Added m_betweennessCentrality member variable with getter/setter, updated copy operations, modernized return types to use 'auto', removed isStation() virtual method |
| src/dsf/base/Network.hpp | Implemented computeBetweennessCentralities and computeEdgeBetweennessCentralities template methods using Brandes' algorithm with Dijkstra's shortest path |
| src/dsf/base/Edge.hpp | Added m_betweennessCentrality member variable with getter/setter |
💡 Add Copilot custom instructions for smarter, more guided reviews. Learn how to get started.
```cpp
          sigma[w] = sigma[v];
          P[w] = {v};
          pq.push({newDist, w});
        } else if (std::abs(newDist - dist[w]) < 1e-12 * std::max(1.0, dist[w])) {
```
The tolerance used for floating-point comparison (1e-12 * std::max(1.0, dist[w])) is relative to the distance value. However, when dist[w] is exactly 0 (which happens for the source node), this becomes 1e-12, which may be too strict for some edge weights. Consider using a more robust epsilon like 1e-12 * (1.0 + std::abs(dist[w])) or a small absolute tolerance to handle this edge case consistently.
Suggested change:

```diff
- } else if (std::abs(newDist - dist[w]) < 1e-12 * std::max(1.0, dist[w])) {
+ } else if (std::abs(newDist - dist[w]) < 1e-12 * (1.0 + std::abs(dist[w]))) {
```
```cpp
        } else if (std::abs(newDist - dist[w]) < 1e-12 * std::max(1.0, dist[w])) {
          sigma[w] += sigma[v];
          P[w].push_back({v, eId});
```
The same floating-point comparison tolerance issue exists here as in computeBetweennessCentralities. When dist[w] is 0, the tolerance becomes 1e-12 which may be too strict. Consider using a more robust epsilon for consistency.
Suggested change:

```diff
- } else if (std::abs(newDist - dist[w]) < 1e-12 * std::max(1.0, dist[w])) {
-   sigma[w] += sigma[v];
-   P[w].push_back({v, eId});
+ } else {
+   double diff = std::abs(newDist - dist[w]);
+   double scale = std::max(std::abs(newDist), std::abs(dist[w]));
+   double eps = 1e-9 * (1.0 + scale);
+   if (diff <= eps) {
+     sigma[w] += sigma[v];
+     P[w].push_back({v, eId});
+   }
```
```cpp
      auto const bcNext{pStreetOut->betweennessCentrality().value_or(1.0)};
      double const stationaryWeightNext = pStreetOut->stationaryWeight();
      auto const weightRatio{stationaryWeightNext /
                             stationaryWeightCurrent};  // SQRT (p_i / p_j)
```
The probability formula has been significantly changed from 'speedCurrent * speedNext * sqrt(weightRatio)' to 'sqrt((bcCurrent * bcNext) * (speedCurrent / lengthCurrent) * (speedNext / lengthNext) * weightRatio)'. This is a fundamental change to the transition probability model. The new formula incorporates betweenness centrality and changes how speed and length are factored in. Please ensure this change is documented in a research paper, technical specification, or commit message explaining the rationale behind this new model, as it affects the core dynamics of agent routing behavior.
Suggested change:

```diff
  stationaryWeightCurrent};  // SQRT (p_i / p_j)
+ // NOTE ON TRANSITION MODEL:
+ // Historically, transition probabilities were proportional to:
+ //   speedCurrent * speedNext * sqrt(weightRatio)
+ // where `weightRatio = p_j / p_i` is derived from the stationary weights
+ // of the Markov chain over edges. This favored faster streets but did
+ // not normalize by edge length nor account for topological importance.
+ //
+ // The current model intentionally incorporates:
+ //   - betweenness centrality (bcCurrent, bcNext) to bias movement toward
+ //     structurally important edges in the network,
+ //   - speed normalized by length (speed / length) for both current and
+ //     candidate edges, which approximates a preference for short, fast
+ //     connections rather than speed alone,
+ //   - the stationary weight ratio (weightRatio) to remain compatible
+ //     with the underlying stationary distribution used elsewhere.
+ //
+ // Concretely, the base (unnormalized) transition probability is:
+ //   P(e_current → e_next) ∝
+ //     sqrt( bcCurrent * bcNext
+ //           * (speedCurrent / lengthCurrent)
+ //           * (speedNext / lengthNext)
+ //           * weightRatio )
+ //
+ // This square-root formulation moderates extreme values of centrality
+ // and speed/length, providing a smoother distribution over outgoing
+ // edges while still emphasizing highly central, efficient routes. This
+ // change fundamentally alters routing behavior by coupling geometric,
+ // dynamic (speed), and topological (betweenness) information into a
+ // single probabilistic model.
```
```cpp
      double speedCurrent{1.0};
      double lengthCurrent{1.0};
      double stationaryWeightCurrent = 1.0;
      double bcCurrent{1.0};
```
When an agent has no current street (e.g., when just spawned), speedCurrent, lengthCurrent, and bcCurrent are all set to default value 1.0. However, this may produce unexpected behavior in the probability calculation. Consider whether these default values make sense semantically - for example, should a newly spawned agent with no street history have a betweenness centrality of 1.0, or should this case be handled differently?
Suggested change:

```diff
- double bcCurrent{1.0};
+ double bcCurrent{0.0};
```
```cpp
  /// @brief Compute edge betweenness centralities for all edges using Brandes' algorithm
  /// @tparam WeightFunc A callable type that takes a const reference to a unique_ptr<edge_t> and returns a double representing the edge weight
  /// @param getEdgeWeight A callable that takes a const reference to a unique_ptr<edge_t> and returns a double (must be positive)
  /// @details Implements Brandes' algorithm for directed weighted graphs.
  ///          The computed centrality for each edge e is:
  ///            C_B(e) = sum_{s != t} sigma_st(e) / sigma_st
  ///          where sigma_st is the number of shortest paths from s to t,
  ///          and sigma_st(e) is the number of those paths using edge e.
  ///          Results are stored via Edge::setBetweennessCentrality.
```
Similar to the node betweenness centrality documentation, consider adding a citation to Brandes' original paper to help future maintainers understand and verify the algorithm.
```cpp
      // Calculate base probability
      auto const speedNext{pStreetOut->maxSpeed()};
      auto const lengthNext{pStreetOut->length()};
      auto const bcNext{pStreetOut->betweennessCentrality().value_or(1.0)};
```
Similar to the concern above, using value_or(1.0) for edge betweenness centrality means uncomputed values are treated as having moderate importance. Consider documenting this design choice or using a different approach to handle uncomputed values.
```cpp
      while (!S.empty()) {
        Id w = S.top();
        S.pop();
```
The code performs division by sigma[w] without checking if it's zero. While the algorithm design ensures sigma[w] should be positive for all nodes in stack S (since unreachable nodes aren't added to S), consider adding an assertion or defensive check (e.g., assert(sigma[w] > 0)) to catch potential logic errors during development and make the assumption explicit.
Suggested change:

```diff
  S.pop();
+ assert(sigma[w] > 0.0);
```
```diff
  /// @brief Check if the node is a station
  /// @return True
- constexpr bool isStation() const noexcept final { return true; }
+ constexpr bool isStation() const noexcept /*final*/ { return true; }
```
The 'final' specifier has been commented out from the isStation() method, but the corresponding virtual method declaration has been removed from the base Node class. This creates an inconsistency: Station::isStation() is now a non-virtual method that doesn't override anything. Either restore the virtual isStation() method in the Node base class (returning false by default), or remove the isStation() method entirely from Station if it's no longer needed. The current state suggests incomplete refactoring.
```cpp
  /// @brief Compute betweenness centralities for all nodes using Brandes' algorithm
  /// @tparam WeightFunc A callable type that takes a const reference to a unique_ptr<edge_t> and returns a double representing the edge weight
  /// @param getEdgeWeight A callable that takes a const reference to a unique_ptr<edge_t> and returns a double (must be positive)
  /// @details Implements Brandes' algorithm for directed weighted graphs.
  ///          The computed centrality for each node v is:
  ///            C_B(v) = sum_{s != v != t} sigma_st(v) / sigma_st
  ///          where sigma_st is the number of shortest paths from s to t,
  ///          and sigma_st(v) is the number of those paths passing through v.
  ///          Results are stored via Node::setBetweennessCentrality.
```
The documentation mentions 'Brandes' algorithm' but doesn't include a reference or citation. Consider adding a citation to the original paper (U. Brandes, 'A faster algorithm for betweenness centrality', Journal of Mathematical Sociology, 2001) to help future maintainers understand the algorithm's theoretical foundation and verify its correctness.
```cpp
      while (!S.empty()) {
        Id w = S.top();
        S.pop();
        for (auto const& [v, eId] : P[w]) {
```
Same concern as in computeBetweennessCentralities: consider adding an assertion to verify sigma[w] > 0 to make the algorithm's assumptions explicit and catch potential logic errors.
Suggested change (the guard belongs before the predecessor loop, since `sigma[w]` is fixed for a given `w` and a `continue` inside the loop would re-test the same value for every predecessor):

```diff
+ // sigma[w] should be > 0 whenever P[w] is non-empty (Brandes' invariant).
+ // Assert this to make the assumption explicit and avoid division by zero.
+ assert(sigma[w] > 0.0);
+ if (sigma[w] == 0.0) {
+   continue;
+ }
  for (auto const& [v, eId] : P[w]) {
```