Symbolic Distillation for Learned TCP Congestion Control

Accepted at NeurIPS 2022

[ Paper ] [ Poster ]

Introduction

Recent advances in TCP congestion control (CC) have achieved tremendous success with deep reinforcement learning (RL) approaches, which use feedforward neural networks (NNs) to learn complex environment conditions and make better decisions. However, such "black-box" policies lack interpretability and reliability, and they often must operate outside the traditional TCP datapath due to the use of complex NNs. This paper proposes a novel two-stage solution to achieve the best of both worlds: first train a deep RL agent, then distill its over-parameterized NN policy into white-box, lightweight rules in the form of symbolic expressions that are much easier to understand and to implement in constrained environments. At the core of our proposal is a novel symbolic branching algorithm that makes the rules aware of the context in terms of various network conditions, eventually converting the NN policy into a symbolic tree. The distilled symbolic rules preserve, and often improve, performance over state-of-the-art NN policies while being faster and simpler than a standard neural network. We validate the performance of our distilled symbolic rules in both simulation and emulation environments.

Figure: Overview of a congestion control agent's role in the network.
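
To make the two-stage pipeline concrete, below is a minimal, self-contained sketch of the distillation step: sample (state, action) pairs from a trained teacher policy, then greedily search for a single symbolic branch with a linear expression in each leaf. Everything here (the stand-in `teacher_policy`, the two network statistics, the depth-1 tree, the linear leaf form) is assumed for illustration; it is not the repository's actual API, and the paper's full algorithm builds a deeper symbolic tree with richer expressions.

```python
# Illustrative sketch of symbolic distillation via branching (not the repo's API).
import numpy as np

rng = np.random.default_rng(0)

def teacher_policy(state):
    # Stand-in for the trained RL agent's NN policy: maps network statistics
    # (latency ratio, send ratio) to a sending-rate adjustment.
    latency_ratio, send_ratio = state[:, 0], state[:, 1]
    return np.where(latency_ratio > 1.05, -0.5 * latency_ratio, 0.3 * send_ratio)

# Stage 1 output: sample states and record the teacher's actions.
states = rng.uniform(0.8, 1.3, size=(2048, 2))
actions = teacher_policy(states)

def fit_leaf(X, y):
    # Least-squares fit of a linear symbolic expression a*x0 + b*x1 + c.
    A = np.column_stack([X, np.ones(len(X))])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coef

# Stage 2: greedy symbolic branching -- search for the feature/threshold split
# that minimizes the distillation error of the two linear leaves.
design = np.column_stack([states, np.ones(len(states))])
best = None
for feat in range(states.shape[1]):
    for thr in np.quantile(states[:, feat], np.linspace(0.1, 0.9, 17)):
        left = states[:, feat] <= thr
        if left.sum() < 10 or (~left).sum() < 10:
            continue  # skip degenerate splits
        cl = fit_leaf(states[left], actions[left])
        cr = fit_leaf(states[~left], actions[~left])
        pred = np.where(left, design @ cl, design @ cr)
        err = np.mean((pred - actions) ** 2)
        if best is None or err < best[0]:
            best = (err, feat, thr, cl, cr)

err, feat, thr, cl, cr = best
print(f"if x{feat} <= {thr:.3f}: a = {cl[0]:+.3f}*x0 {cl[1]:+.3f}*x1 {cl[2]:+.3f}")
print(f"else:             a = {cr[0]:+.3f}*x0 {cr[1]:+.3f}*x1 {cr[2]:+.3f}")
print(f"distillation MSE: {err:.5f}")
```

Running this prints a single if/else rule that closely tracks the stand-in policy. The printed rule is the whole point of the approach: unlike the NN it was distilled from, it can be read, audited, and implemented in a constrained TCP datapath.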

Results

Emulation Performance on Lossy Network Conditions
Emulation Performance under Network Dynamics
Link Utilization and Network Sensitivities

Usage

Training RL Agents

TODO

Symbolic Distillation

TODO

Citation

If you find our implementation helpful for your own research or work, please cite our paper.

@inproceedings{sharan2022symbolic,
    title={Symbolic Distillation for Learned {TCP} Congestion Control},
    author={S P Sharan and Wenqing Zheng and Kuo-Feng Hsu and Jiarong Xing and Ang Chen and Zhangyang Wang},
    booktitle={Thirty-Sixth Conference on Neural Information Processing Systems},
    year={2022},
    url={https://openreview.net/forum?id=rDT-n9xysO}
}

Contact

For any queries, please raise an issue or contact S P Sharan.

License

This project is open-sourced under the MIT License.
