AI GPU

Nvidia's plan to build on its AI-driven success

Nvidia has announced a new generation of artificial intelligence chips and software for running AI models, headlined by the Blackwell B200 GPU.

Blackwell B200 GPU

The Blackwell B200 is the successor to Nvidia’s Hopper H100 and H200 GPUs.

It represents a massive generational leap in computational power.

AI Performance: Nvidia claims the B200 delivers up to 4 times the AI training performance and up to 30 times the inference performance of its Hopper predecessor.

Transistor Count: It packs an impressive 208 billion transistors, roughly two and a half times the 80 billion of the existing H100.

Memory: The B200 features 192GB of HBM3e memory with an impressive bandwidth of 8 TB/s.
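The 8 TB/s figure is a theoretical peak. As a rough illustration (not B200-specific, and runnable on any CUDA GPU), the following sketch times a device-to-device copy to estimate the achieved memory bandwidth; the 1 GiB buffer size is an arbitrary choice.

#include <cstdio>
#include <cuda_runtime.h>

// Minimal sketch: estimate achieved device-memory bandwidth by timing
// a device-to-device copy. Measured numbers will sit below any quoted
// theoretical peak such as the B200's 8 TB/s.
int main() {
    const size_t bytes = 1ULL << 30;  // 1 GiB test buffer (arbitrary)
    void *src = nullptr, *dst = nullptr;
    cudaMalloc(&src, bytes);
    cudaMalloc(&dst, bytes);

    cudaEvent_t start, stop;
    cudaEventCreate(&start);
    cudaEventCreate(&stop);

    // Warm-up copy so the timed run excludes first-touch overhead.
    cudaMemcpy(dst, src, bytes, cudaMemcpyDeviceToDevice);

    cudaEventRecord(start);
    cudaMemcpy(dst, src, bytes, cudaMemcpyDeviceToDevice);
    cudaEventRecord(stop);
    cudaEventSynchronize(stop);

    float ms = 0.0f;
    cudaEventElapsedTime(&ms, start, stop);

    // A device-to-device copy both reads and writes every byte,
    // so the total traffic is 2 * bytes.
    double gbps = (2.0 * bytes) / (ms / 1000.0) / 1e9;
    printf("Achieved device-memory bandwidth: %.1f GB/s\n", gbps);

    cudaFree(src);
    cudaFree(dst);
    return 0;
}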

Architecture: The Blackwell architecture succeeds the Hopper architecture used in the H100 and H200.

*Dual-Die Configuration: The B200 is not a single GPU in the traditional sense. Instead, it consists of two tightly coupled dies functioning as one unified CUDA GPU (see the device-query sketch below). These dies are linked via a 10 TB/s NV-HBI connection to ensure coherent operation.

*Dual-die packaging technology places two integrated circuit dies in a single package module, effectively doubling the amount of silicon available in one chip.
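Because the two dies present themselves as one unified CUDA GPU, standard device enumeration should report a single device with the full 192 GB of memory. The sketch below uses the stock CUDA runtime API to print this; it is an illustration based on Nvidia's description rather than B200-specific code.

#include <cstdio>
#include <cuda_runtime.h>

// Minimal sketch: list CUDA devices and their basic properties.
// On a dual-die B200 the expectation (per Nvidia's description) is a
// single device entry reporting the full 192 GB of HBM3e.
int main() {
    int count = 0;
    cudaGetDeviceCount(&count);
    for (int i = 0; i < count; ++i) {
        cudaDeviceProp prop;
        cudaGetDeviceProperties(&prop, i);
        printf("Device %d: %s\n", i, prop.name);
        printf("  Global memory: %.1f GB\n", prop.totalGlobalMem / 1e9);
        printf("  Multiprocessors: %d\n", prop.multiProcessorCount);
    }
    return 0;
}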

Process Node: The B200 utilizes TSMC’s 4NP process node, a refined version of the 4N process used by Hopper H100 and Ada Lovelace architecture GPUs.

The Blackwell B200 is designed for data centres and AI workloads, but consumer versions are expected to follow in the future, although these may differ significantly from the data centre model.

Grace Blackwell GB200 Superchip:

Nvidia’s GB200 Grace Blackwell Superchip combines two B200 graphics processors with one Arm-based central processor.

This superchip pairs the Grace CPU architecture with the updated Blackwell GPU.

It’s another addition to Nvidia’s lineup, combining CPU and GPU power for advanced computing tasks.

Nvidia continues to push the boundaries of accelerated computing, and these new GPUs promise remarkable performance improvements for AI and other workloads.

Onwards and upwards for Nvidia and the advancement of AI.
