The Triton Inference Server provides an optimized cloud and edge inferencing solution.
FireSim: Fast and Effortless FPGA-accelerated Hardware Simulation with On-Prem and Cloud Flexibility
Using network observability to operate and design healthier networks
Disseminated, Distributed OS for Hardware Resource Disaggregation. USENIX OSDI 2018 Best Paper.
Automated, multi-region container deployment
Advanced Linux RAM drive and caching kernel modules. Dynamically allocate RAM as block devices; use them as standalone drives or map them as caching nodes in front of slower local disks. Access those volumes locally or export them across an NVMe Target network, and manage it all from a web API (see the sketch after this list).
AMD OpenNIC Project Overview
CloudSimPy: Datacenter job scheduling simulation framework
Toolkit to accelerate Azure adoption for enterprise customers
AMD OpenNIC Shell includes the HDL source files
Kubernetes Scheduler Simulator
API to automate IP network management, resource allocation, and provisioning.
Collaborative Datacenter Simulation and Exploration for Everybody
Lists of locations & IP addresses of Valve servers
A platform to test reinforcement learning policies in the datacenter setting.
DPU-Powered File System Virtualization over virtio-fs
Freely distributed official SR Linux container image
AMD OpenNIC driver includes the Linux kernel driver
Run speed tests for all DigitalOcean datacenters faster than ever.
Network probing tool crafted for datacenters (but not limited to them)
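
The RAM-drive entry above mentions driving block-device allocation and cache mapping through a web API. As a rough illustration only, the sketch below shows how such a management API could be called from Python; the base URL, endpoint paths, and JSON fields are assumptions made for this example, not the project's documented interface.

# Illustrative sketch only: the endpoints and fields below are hypothetical,
# not the actual API of the project listed above.
import requests

BASE_URL = "http://localhost:9118"  # hypothetical management service address

def create_ram_volume(size_mb: int) -> dict:
    """Ask the (hypothetical) service to allocate a RAM-backed block device."""
    resp = requests.post(f"{BASE_URL}/volumes", json={"size_mb": size_mb}, timeout=10)
    resp.raise_for_status()
    return resp.json()

def map_as_cache(volume_id: str, backing_device: str) -> dict:
    """Map the RAM volume as a cache node in front of a slower local disk."""
    resp = requests.post(
        f"{BASE_URL}/volumes/{volume_id}/cache",
        json={"backing_device": backing_device},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()

if __name__ == "__main__":
    vol = create_ram_volume(size_mb=1024)  # request a 1 GiB RAM block device
    print("created:", vol)
    print("cache map:", map_as_cache(vol["id"], "/dev/sdb"))

The point of the sketch is the workflow (allocate a RAM block device, then attach it as a cache for a slower disk), not the specific endpoints; consult the project's own API documentation for the real routes and payloads.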