Tinygrad 0.12 Brings Mesa NIR/NAK Support for Open-Source NVIDIA Deep Learning

Hardware Reporter

The 0.12 release of Tinygrad introduces a Mesa NIR backend targeting NVK/NAK, enabling a fully open-source software stack for NVIDIA GPUs alongside new AMD Instinct support.

The Tinygrad project has released version 0.12, bringing a significant development for open-source GPU compute: native support for Mesa's NIR intermediate representation. This update, led by George Hotz, positions Tinygrad to work seamlessly with the emerging open-source NVIDIA Vulkan driver stack.

NIR Backend Targets NVK and NAK

The headline feature is the Mesa NIR backend, which initially focuses on the NVK Vulkan driver and its NAK compiler. This combination represents the core of Mesa's open-source NVIDIA Vulkan implementation. By targeting NIR, Tinygrad can now execute deep learning workloads through a completely free software stack on NVIDIA hardware, bypassing proprietary drivers entirely.

NIR (Mesa's "new intermediate representation") is the common compiler IR shared by Mesa's graphics and compute drivers, letting them reuse optimization passes and code generation. NAK is the new NVIDIA-specific compiler backend within Mesa that translates NIR into NVIDIA's instruction set. With Tinygrad 0.12, the framework can emit NIR that NAK then compiles for execution on NVIDIA GPUs.
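At a high level this is a two-stage lowering chain: framework-level operations are lowered to a shared IR, and a hardware-specific backend then turns that IR into machine instructions. The toy sketch below is purely illustrative of that structure; every name in it is hypothetical, and none of it is Tinygrad or Mesa code.

```python
# Toy two-stage compiler pipeline: ops -> generic IR -> "machine" code.
# Illustrative only; names and structure are hypothetical, not tinygrad/Mesa APIs.
from dataclasses import dataclass

@dataclass
class IRInstr:
    op: str          # IR-level opcode, e.g. "fadd"
    args: tuple      # operand names

# Stage 1: lower framework-level ops into a generic IR (the NIR analogue).
def lower_to_ir(graph: list[tuple]) -> list[IRInstr]:
    table = {"add": "fadd", "mul": "fmul"}
    return [IRInstr(table[op], args) for op, args in graph]

# Stage 2: a target backend (the NAK analogue) emits target instructions.
def codegen(ir: list[IRInstr]) -> list[str]:
    return [f"{i.op.upper()} {', '.join(i.args)}" for i in ir]

graph = [("add", ("a", "b")), ("mul", ("t0", "c"))]
print(codegen(lower_to_ir(graph)))
# -> ['FADD a, b', 'FMUL t0, c']
```

The payoff of the shared-IR design is that every frontend feeding NIR benefits from the same optimization passes, and every backend (NAK included) only has to handle the final translation step.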

The backend also includes LLVMpipe support, providing a CPU-based fallback using Mesa's software renderer. This ensures compatibility even when discrete GPUs aren't available or for testing purposes.

AMD Instinct MI300 and MI350 Support

Beyond NVIDIA open-source support, Tinygrad 0.12 expands AMD GPU compatibility. The release adds support for AMD's latest Instinct accelerators: the MI300 series and the upcoming MI350 series. These additions enhance Tinygrad's utility for high-performance computing environments that rely on AMD's CDNA architecture.

The AMD backend (AM) in Tinygrad handles the translation of deep learning operations to AMD's GPU compute stack. With MI300 and MI350 support, users can leverage AMD's most advanced AI accelerators through Tinygrad's Python-based framework.

Implications for Homelab Builders

For homelab enthusiasts and open-source advocates, Tinygrad 0.12 represents a meaningful step toward vendor-neutral deep learning infrastructure. The ability to run modern neural networks on NVIDIA GPUs using only open-source drivers removes a major barrier for privacy-conscious users and those who prioritize software freedom.

The NIR/NAK path is particularly interesting because it leverages Mesa's mature compiler infrastructure. NIR has been refined over years of graphics driver development, bringing proven optimization techniques to compute workloads. This shared foundation means Tinygrad benefits from Mesa's ongoing compiler improvements.

Getting Started

Tinygrad is available on GitHub at https://github.com/tinygrad/tinygrad. The 0.12 release includes updated documentation for configuring the NIR backend with Mesa's NVK driver. Users will need a recent Mesa build with NAK support and compatible NVIDIA hardware (generally RTX 20-series and newer for full feature support).
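A source install follows the project's standard workflow. Note that the environment variable for selecting the NIR backend in the last line is an assumption; consult the 0.12 documentation for the actual name.

```shell
# Install tinygrad from source (editable install, per the project README)
git clone https://github.com/tinygrad/tinygrad.git
cd tinygrad
python3 -m pip install -e .

# Tinygrad selects compute backends via environment variables; the variable
# for the Mesa NIR backend is assumed here -- check the 0.12 docs.
# NIR=1 python3 examples/beautiful_mnist.py
```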

The project maintains its focus on code simplicity and readability while delivering competitive performance. With 0.12, that philosophy now extends to supporting the open-source GPU stack that many homelab builders prefer for their privacy and freedom benefits.

What Comes Next

The NIR backend opens intriguing possibilities for future Mesa driver integration. While the initial focus is NVK/NAK, the architecture suggests potential support for other Mesa Vulkan drivers like RADV (AMD) or ANV (Intel). As Mesa's compute capabilities mature, Tinygrad could become a unified interface for open-source GPU acceleration across all major hardware vendors.

For now, 0.12 delivers on the immediate goal: enabling fully open-source deep learning on NVIDIA hardware through the proven Mesa driver stack.
