
SnareNet: Flexible Repair Layers for Neural Networks with Hard Constraints

About

Neural networks are increasingly used as surrogate solvers and control policies, but unconstrained predictions can violate physical, operational, or safety requirements. We propose SnareNet, a feasibility-controlled architecture for learning mappings whose outputs must satisfy input-dependent nonlinear constraints. SnareNet appends a differentiable repair layer that navigates in the constraint map's range space, steering iterates toward feasibility and producing a repaired output that satisfies constraints to a user-specified tolerance. To stabilize end-to-end training, we introduce adaptive relaxation, which designs a relaxed feasible set that snares the neural network at initialization and shrinks it into the feasible set, enabling early exploration and strict feasibility later in training. On optimization-learning and trajectory planning benchmarks, SnareNet consistently attains improved objective quality while satisfying constraints more reliably than prior work.
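The abstract's two ingredients, a differentiable repair loop that descends the constraint violation until a tolerance is met, and an adaptive relaxation schedule that shrinks that tolerance over training, can be sketched as follows. This is an illustrative reconstruction, not the paper's actual implementation: the function names (`repair`, `relaxed_tol`), the violation-descent update, and the geometric tolerance schedule are all assumptions made for exposition.

```python
import numpy as np

def repair(y, g, grad_g, tol=1e-6, max_iter=100, step=0.1):
    """Hypothetical repair loop: move y toward the feasible set
    {y : g(y) <= 0} by descending the squared constraint violation
    0.5 * ||max(g(y), 0)||^2. Names and update rule are illustrative,
    not the paper's actual layer."""
    for _ in range(max_iter):
        v = np.maximum(g(y), 0.0)            # per-constraint violation
        if np.all(v <= tol):
            break
        # gradient of 0.5 * ||max(g(y), 0)||^2 with respect to y
        y = y - step * (grad_g(y).T @ v)
    return y

def relaxed_tol(epoch, total_epochs, tol0=1.0, tol_final=1e-6):
    """Illustrative adaptive-relaxation schedule: the allowed violation
    shrinks geometrically from tol0 at initialization to tol_final,
    so early training explores a relaxed feasible set and late training
    enforces strict feasibility."""
    frac = epoch / max(total_epochs - 1, 1)
    return tol0 * (tol_final / tol0) ** frac

# Toy example: repair a prediction onto the unit ball {y : ||y||^2 <= 1}.
g = lambda y: np.array([y @ y - 1.0])        # one inequality constraint
grad_g = lambda y: 2.0 * y[np.newaxis, :]    # Jacobian, shape (1, d)
y_repaired = repair(np.array([2.0, 0.0]), g, grad_g)
```

The toy call pulls the infeasible point `[2, 0]` back inside the unit ball; in the paper's setting the repair steps would instead be taken through the constraint map's range space, and the relaxed set would be designed to contain the network's outputs at initialization.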

Ya-Chi Chu, Alkiviades Boukas, Madeleine Udell • 2026

Related benchmarks

| Task | Dataset | Metric | Result | Rank |
| --- | --- | --- | --- | --- |
| Non-Convex Programming Optimization | NCP (test) | # Equality Violations | 0.00e+0 | 8 |
| QCQP Optimization | QCQP (test) | # Equality Violations | 0.00e+0 | 7 |
| Constrained Optimization | QCQP with 100 inequality constraints (test) | # Equality Violations | 0.00e+0 | 4 |
| Constrained Optimization | QCQP with 10 inequality constraints (test) | # Equality Violations | 0.00e+0 | 4 |
| Quadratically Constrained Quadratic Program | QCQP with 50 inequality constraints (test) | # Equality Violations | 0.00e+0 | 4 |
