TY - GEN
T1 - Just-in-Time Logic Enforcement
T2 - 24th ACM Workshop on Hot Topics in Networks, HotNets 2025
AU - Hè, Hongyu
AU - Apostolaki, Maria
N1 - Publisher Copyright:
© 2025 Copyright held by the owner/author(s).
PY - 2025/11/17
Y1 - 2025/11/17
N2 - While ML can greatly aid network management, it often makes glaring mistakes that contradict common sense or domain-specific constraints, undermining its trustworthiness and hindering adoption. To address this mismatch, this paper advocates for enforcing logic during ML inference (i.e., Just-In-Time), rather than during training or post-inference as in prior work. We find that this approach offers correctness guarantees without sacrificing statistical fidelity, thereby maximizing the benefits of both ML and formal reasoning. To achieve Just-in-Time Logic Enforcement, we interleave an SMT solver into the language model's inference process, guiding generation step by step to enforce domain-specific rules. Our proof-of-concept implementation, LeJIT, turns a generic GPT-2 model at inference time into either a synthetic data generator or a telemetry imputer by applying different sets of logic rules, and performs on par with task-specific SOTA systems. LeJIT paves the way for a networking foundation model that can be repurposed through logic rules instead of costly retraining or fine-tuning.
AB - While ML can greatly aid network management, it often makes glaring mistakes that contradict common sense or domain-specific constraints, undermining its trustworthiness and hindering adoption. To address this mismatch, this paper advocates for enforcing logic during ML inference (i.e., Just-In-Time), rather than during training or post-inference as in prior work. We find that this approach offers correctness guarantees without sacrificing statistical fidelity, thereby maximizing the benefits of both ML and formal reasoning. To achieve Just-in-Time Logic Enforcement, we interleave an SMT solver into the language model's inference process, guiding generation step by step to enforce domain-specific rules. Our proof-of-concept implementation, LeJIT, turns a generic GPT-2 model at inference time into either a synthetic data generator or a telemetry imputer by applying different sets of logic rules, and performs on par with task-specific SOTA systems. LeJIT paves the way for a networking foundation model that can be repurposed through logic rules instead of costly retraining or fine-tuning.
KW - formal methods
KW - inference-time reasoning
KW - LLM
KW - network management
KW - neuro-symbolic AI
KW - SMT solver
UR - https://www.scopus.com/pages/publications/105023662156
UR - https://www.scopus.com/pages/publications/105023662156#tab=citedBy
U2 - 10.1145/3772356.3772406
DO - 10.1145/3772356.3772406
M3 - Conference contribution
AN - SCOPUS:105023662156
T3 - HotNets 2025 - Proceedings of the 2025 24th ACM Workshop on Hot Topics in Networks
SP - 184
EP - 192
BT - HotNets 2025 - Proceedings of the 2025 24th ACM Workshop on Hot Topics in Networks
PB - Association for Computing Machinery, Inc
Y2 - 17 November 2025 through 18 November 2025
ER -