GPU Performance Benchmarks

Evaluate and compare the compute capabilities of leading cloud GPUs, and make data-driven infrastructure decisions for your LLM training, fine-tuning, and inference workloads.

Compute Power (FP16/BF16 TFLOPS)

Memory Bandwidth (GB/s)

Complete Hardware Statistics

| GPU Model | Architecture | VRAM | Bandwidth | FP16 TFLOPS | INT8 TOPS |
| --- | --- | --- | --- | --- | --- |
| NVIDIA H100 (SXM) | Hopper | 80 GB HBM3 | 3,350 GB/s | 1,979 | 3,958 |
| NVIDIA A100 | Ampere | 80 GB HBM2e | 2,039 GB/s | 624 | 1,248 |
| NVIDIA L40S | Ada Lovelace | 48 GB GDDR6 | 864 GB/s | 362 | 733 |
| NVIDIA RTX 4090 | Ada Lovelace | 24 GB GDDR6X | 1,008 GB/s | 82.6 | 660 |
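One way to put the bandwidth column to work: batch-1 LLM decoding is typically memory-bound, so a rough upper bound on tokens per second is memory bandwidth divided by model size in bytes. The sketch below is a simplified estimate under that assumption (it ignores KV-cache traffic, batching, and kernel overheads); the function name and the 7B FP16 example are illustrative, not part of any benchmark suite.

```python
# Rough batch-1 decode ceiling: each generated token must stream every model
# weight from VRAM once, so tokens/sec <= bandwidth / model size in bytes.
# Bandwidth figures taken from the table above (GB/s).
GPU_BANDWIDTH_GBPS = {
    "H100 (SXM)": 3350,
    "A100": 2039,
    "L40S": 864,
    "RTX 4090": 1008,
}

def decode_ceiling_tok_s(bandwidth_gbps: float, params_b: float,
                         bytes_per_param: int = 2) -> float:
    """Upper-bound tokens/sec for single-stream decoding of a dense model."""
    model_gb = params_b * bytes_per_param  # e.g. 7B params in FP16 -> 14 GB
    return bandwidth_gbps / model_gb

for gpu, bw in GPU_BANDWIDTH_GBPS.items():
    est = decode_ceiling_tok_s(bw, params_b=7)  # 7B model, FP16 weights
    print(f"{gpu}: ~{est:.0f} tok/s ceiling")
```

Real throughput lands well below this ceiling, but the ratio between GPUs is a useful first-pass comparison before renting hardware.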

Don't let hardware limit your potential. Find out which providers have these GPUs in stock right now.

Browse Providers in Stock