Motivation
Direct ILP formulation for MinimumWeightDecoding. Companion issue for #923.
Source
MinimumWeightDecoding
Target
ILP
Reference
Standard binary ILP formulation for minimum weight codeword.
Reduction Algorithm
Input: n×m binary matrix H, binary syndrome vector s of length n.
- Binary variables x_j ∈ {0, 1} for j = 1, ..., m.
- For each row i of H, the GF(2) constraint Σ_j H_{ij} x_j ≡ s_i (mod 2) is linearized: introduce integer slack variable k_i ≥ 0 and add constraint Σ_j H_{ij} x_j = s_i + 2 k_i.
- Objective: minimize Σ_j x_j (Hamming weight).
Solution extraction: read the optimal x_j values as the minimum-weight vector; the slack variables k_i are discarded.
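The reduction above can be sketched as follows. This is an illustrative brute-force search over the same constraint set, not a production reduction (in practice the model would be handed to an ILP solver such as CBC or Gurobi); the function name is hypothetical.

```python
# Sketch: minimize sum(x) over binary x subject to the linearized GF(2)
# constraints  sum_j H[i][j] * x[j] = s[i] + 2 * k[i],  k[i] >= 0 integer.
# Enumeration stands in for an ILP solver and is only viable for small m.
from itertools import product

def min_weight_decode(H, s):
    """Return (x, slacks) minimizing Hamming weight, or None if infeasible."""
    n, m = len(H), len(H[0])
    best = None
    for x in product((0, 1), repeat=m):
        ks = []
        feasible = True
        for i in range(n):
            lhs = sum(H[i][j] * x[j] for j in range(m))
            diff = lhs - s[i]
            # The slack k_i must be a nonnegative integer: lhs - s_i = 2 k_i.
            if diff < 0 or diff % 2 != 0:
                feasible = False
                break
            ks.append(diff // 2)
        if feasible and (best is None or sum(x) < sum(best[0])):
            best = (x, ks)
    return best

# Instance from the Example section: H = [[1,0,1],[0,1,1]], s = [1,1].
x, k = min_weight_decode([[1, 0, 1], [0, 1, 1]], [1, 1])
# -> x == (0, 0, 1), k == [0, 0], weight 1
```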
Size Overhead
| Code metric | Formula |
| --- | --- |
| num_variables | m + n |
| num_constraints | n |

Where n = number of rows and m = number of columns of H.
Validation Method
Closed-loop test: solve the ILP, extract x, and verify that H x ≡ s (mod 2) componentwise and that the weight matches an independently computed minimum.
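The syndrome half of the closed-loop check can be sketched as below (a minimal sketch; the helper name is illustrative), using the instance from the Example section.

```python
# Closed-loop verification sketch: after solving the ILP and extracting x,
# confirm the original GF(2) condition H x = s (mod 2) componentwise.

def syndrome_matches(H, s, x):
    """True iff H x equals s over GF(2)."""
    return all(
        sum(H[i][j] * x[j] for j in range(len(x))) % 2 == s[i]
        for i in range(len(H))
    )

ok = syndrome_matches([[1, 0, 1], [0, 1, 1]], [1, 1], (0, 0, 1))
# -> True
```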
Example
Source: H = [[1,0,1],[0,1,1]], s = [1,1].
Optimal: x = (0, 0, 1), weight 1; column 3 of H alone reproduces the syndrome, so the minimum weight is 1.