🚀 GNN-DT: Graph Neural Network Enhanced Decision Transformer for Efficient Optimization in Dynamic Environments
GNN-DT (Graph Neural Network Enhanced Decision Transformer) is an AI framework that combines Graph Neural Networks (GNNs) with Decision Transformers (DTs) to improve decision-making in dynamic environments.
🔹 Why GNN-DT?
- 🌟 Tackles scalability issues and sparse rewards
- 🔄 Adapts to ever-changing state-action spaces
- ⚡ Achieves higher rewards with significantly fewer training trajectories than standard DT and RL baselines
💡 By leveraging the permutation-equivariant nature of GNNs and a novel residual connection mechanism, GNN-DT delivers strong optimization performance and sample efficiency in AI-driven decision-making.
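As a rough illustration of the mechanism described above, the sketch below shows a permutation-equivariant GNN state encoder with a residual connection, whose pooled output could serve as a state token for a Decision Transformer. This is a minimal sketch, not the paper's implementation; the class and layer names (e.g., `GNNStateEncoder`) are hypothetical.

```python
# Minimal, illustrative sketch (plain PyTorch, not the authors' code):
# a permutation-equivariant GNN encodes a variable-size state graph, a residual
# connection adds the aggregated messages back onto the node embeddings, and mean
# pooling yields a fixed-size state token for a Decision Transformer.
import torch
import torch.nn as nn


class GNNStateEncoder(nn.Module):
    def __init__(self, node_dim: int, hidden_dim: int):
        super().__init__()
        self.node_proj = nn.Linear(node_dim, hidden_dim)
        self.message_mlp = nn.Sequential(
            nn.Linear(hidden_dim, hidden_dim),
            nn.ReLU(),
            nn.Linear(hidden_dim, hidden_dim),
        )

    def forward(self, node_feats: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # node_feats: (num_nodes, node_dim); adj: (num_nodes, num_nodes), row-normalized.
        h = self.node_proj(node_feats)
        # One round of mean-aggregation message passing (permutation equivariant).
        msg = self.message_mlp(adj @ h)
        # Residual connection: aggregated messages are added to the node embeddings.
        h = h + msg
        # Mean pooling over nodes gives a fixed-size state token, regardless of node count.
        return h.mean(dim=0)
```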
📌 This repository provides everything you need to explore and implement GNN-DT, including:
- ✅ Dataset generation
- ✅ Model training
- ✅ Evaluation scripts
Whether applied to electric vehicle (EV) charging optimization or other complex decision-making tasks, GNN-DT is your gateway to AI-driven optimization. 🚀
🔹 Key Features
- 📈 Learns from previously collected trajectories, reducing the need for extensive online interactions.
- 🎯 Effectively addresses the sparse rewards limitation of traditional RL algorithms.
- 🌍 GNN-based embeddings allow for effective adaptation to unseen environments.
- 🔄 Handles dynamic state-action spaces with varying numbers of entities over time.
- 🏆 Outperforms standard DT and RL baselines on real-world optimization tasks.
- 🚀 Requires significantly fewer training trajectories while achieving higher rewards.
- 🔢 Maintains performance across different problem sizes without retraining (see the usage sketch after this list).
- ⚙️ Efficiently scales from small-scale to large-scale environments, as demonstrated in EV charging applications.
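Because the encoder sketched above is permutation equivariant and pools over nodes, the same weights can in principle be applied to problems with different numbers of entities (e.g., EVs or chargers). A hypothetical usage of that sketch:

```python
# Hypothetical usage of the GNNStateEncoder sketch above: the same module handles
# 5, 50, or 500 entities without retraining, since pooling yields a fixed-size token.
encoder = GNNStateEncoder(node_dim=4, hidden_dim=64)
for num_entities in (5, 50, 500):
    feats = torch.randn(num_entities, 4)                                 # per-entity features
    adj = torch.full((num_entities, num_entities), 1.0 / num_entities)   # toy row-normalized adjacency
    token = encoder(feats, adj)
    assert token.shape == (64,)  # fixed-size state token regardless of problem size
```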
If you find this repository useful in your research, please cite our paper:
    @misc{orfanoudakis2025gnndt,
      title={GNN-DT: Graph Neural Network Enhanced Decision Transformer for Efficient Optimization in Dynamic Environments},
      author={Stavros Orfanoudakis and Nanda Kishor Panda and Peter Palensky and Pedro P. Vergara},
      year={2025},
      eprint={2502.01778},
      archivePrefix={arXiv},
      primaryClass={cs.LG},
      url={https://arxiv.org/abs/2502.01778},
    }
For any inquiries or contributions, feel free to open an issue or submit a pull request! 💡