A100 SXM 80GB and 40GB instances
At the forefront of digital intelligence
Our servers exclusively use the SXM4 form factor with NVLink, which offers memory bandwidth of over 2 TB/s and up to 600 GB/s of peer-to-peer (P2P) bandwidth.
The A100 80GB provides up to 1.3 TB of unified memory per node and delivers up to a 3X throughput increase over the A100 40GB.
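If you need to confirm which variant an instance exposes, you can check the device name and total memory from inside the VM. A minimal sketch, assuming PyTorch with a CUDA build is installed on the instance (not something any particular image is guaranteed to ship):

```python
# Minimal check of which A100 variant (40 GB or 80 GB) is visible on an instance.
# Assumes PyTorch with a CUDA build is installed; nothing Verda-specific is used.
import torch

if torch.cuda.is_available():
    for i in range(torch.cuda.device_count()):
        props = torch.cuda.get_device_properties(i)
        vram_gb = props.total_memory / 1024**3
        print(f"GPU {i}: {props.name}, {vram_gb:.0f} GB VRAM")
else:
    print("No CUDA device visible")
```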
| GPU | Spot price | Pay As You Go price |
|---|---|---|
| A100 80GB | $0.58/h | $1.29/h |
| A100 40GB | $0.22/h | $0.72/h |
80GB vs 40GB
Push the limits of compute

| GPU Model | A100 SXM4 80GB | A100 SXM4 40GB |
|---|---|---|
| Memory type | HBM2e | HBM2 |
| Memory clock (effective data rate) | 3.2 Gbps | 2.4 Gbps |
| Memory bandwidth | 2,039 GB/s | 1,555 GB/s |
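The bandwidth figures above can be sanity-checked with a simple device-to-device copy loop. This is a rough sketch assuming PyTorch with CUDA; the result is an indicative effective bandwidth, not a formal benchmark:

```python
# Rough device-to-device copy benchmark; reports effective bandwidth in GiB/s.
# Assumes PyTorch with a CUDA build and a GPU with at least a few GiB free.
import time
import torch

x = torch.empty(1024**3 // 4, dtype=torch.float32, device="cuda")  # 1 GiB buffer
y = torch.empty_like(x)

torch.cuda.synchronize()
start = time.perf_counter()
for _ in range(100):
    y.copy_(x)                     # each iteration reads 1 GiB and writes 1 GiB
torch.cuda.synchronize()
elapsed = time.perf_counter() - start

gib_moved = 100 * 2                # reads + writes, in GiB
print(f"Effective bandwidth: {gib_moved / elapsed:.0f} GiB/s")
```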
NVIDIA A100 virtual machines
Powered by 2nd Gen AMD EPYC Rome processors and NVLink v3, suitable for demanding AI and HPC applications.
| GPU model | Instance name | vCPUs | RAM (GB) | VRAM (GB) | Pay As You Go price |
|---|---|---|---|---|---|
| 8x A100 SXM4 80GB | 8A100.176V | 176 | 960 | 640 | $10.32/h |
| 4x A100 SXM4 80GB | 4A100.88V | 88 | 480 | 320 | $5.16/h |
| 2x A100 SXM4 80GB | 2A100.44V | 44 | 240 | 160 | $2.58/h |
| 1x A100 SXM4 80GB | 1A100.22V | 22 | 120 | 80 | $1.29/h |
| 8x A100 SXM4 40GB | 8A100.40S.176V | 176 | 960 | 320 | $5.77/h |
| 1x A100 SXM4 40GB | 1A100.40S.22V | 22 | 120 | 40 | $0.72/h |
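To map a workload onto these sizes, one option is to filter by total VRAM and take the cheapest instance that fits. The sketch below hard-codes the table above; the `smallest_fit` helper and the selection rule are illustrative only, not part of any Verda tooling:

```python
# Pick the cheapest instance from the table above with enough total VRAM for a workload.
# Instance data is copied from the table; the helper is illustrative, not a Verda SDK call.
INSTANCES = [
    # (instance name, vCPUs, RAM GB, VRAM GB, Pay As You Go $/h)
    ("1A100.40S.22V",   22, 120,  40,  0.72),
    ("1A100.22V",       22, 120,  80,  1.29),
    ("2A100.44V",       44, 240, 160,  2.58),
    ("4A100.88V",       88, 480, 320,  5.16),
    ("8A100.40S.176V", 176, 960, 320,  5.77),
    ("8A100.176V",     176, 960, 640, 10.32),
]

def smallest_fit(required_vram_gb: float):
    """Return the cheapest instance offering at least the requested VRAM, or None."""
    candidates = [inst for inst in INSTANCES if inst[3] >= required_vram_gb]
    return min(candidates, key=lambda inst: inst[4]) if candidates else None

print(smallest_fit(100))   # -> ('2A100.44V', 44, 240, 160, 2.58)
```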
- Pricing per A100 SXM4 80GB GPU (see the monthly cost sketch below)
- Pay As You Go: $1.29/h
- Spot: $0.58/h
- 1-month commitment: $1.26/h (-2%)
- 1-year commitment: $1.19/h (-8%)
- 2-year commitment: $0.97/h (-25%)
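These hourly rates translate into rough monthly figures as follows. A minimal sketch, assuming an always-on workload at roughly 730 hours per month:

```python
# Worked monthly-cost comparison for one A100 SXM4 80GB GPU, using the rates listed above.
# 730 hours/month (always-on) is an assumption; scale down for intermittent workloads.
HOURS_PER_MONTH = 730

rates_per_hour = {
    "Pay As You Go":  1.29,
    "Spot":           0.58,
    "1-month (-2%)":  1.26,
    "1-year (-8%)":   1.19,
    "2-year (-25%)":  0.97,
}

for plan, hourly in rates_per_hour.items():
    print(f"{plan:15s} ${hourly:.2f}/h  ≈ ${hourly * HOURS_PER_MONTH:,.0f}/month")
```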
Verda Cloud
Where speed meets simplicity in GPU solutions.

- Fast: Dedicated hardware for maximum speed and security
- Productive: Start, stop, and hibernate instances instantly via the dashboard or API (see the sketch below)
- Reliable: A historical uptime of over 99.9%
- Protected: Verda is ISO 27001 certified and GDPR compliant
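Programmatic control of an instance's power state typically looks something like the following. This is a hypothetical sketch with a placeholder endpoint, payload, and action names; it is not Verda's documented API, so consult the actual API reference for the real calls:

```python
# Hypothetical sketch of toggling an instance's power state over a REST API.
# BASE_URL, the path, and the action values are placeholders, not Verda's documented API.
import os
import requests

BASE_URL = "https://api.example.com/v1"                          # placeholder endpoint
HEADERS = {"Authorization": f"Bearer {os.environ['API_TOKEN']}"} # token from the dashboard

def set_instance_state(instance_id: str, action: str) -> None:
    """Send a state change: 'start', 'shutdown', or 'hibernate' (illustrative values)."""
    resp = requests.put(
        f"{BASE_URL}/instances/{instance_id}",
        json={"action": action},
        headers=HEADERS,
    )
    resp.raise_for_status()

# Example: hibernate a running instance, then start it again later.
# set_instance_state("my-instance-id", "hibernate")
# set_instance_state("my-instance-id", "start")
```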
Reserve your perfect setup today
GPU clusters tailored to your needs
Looking for something different?
Contact us