South Korean AI chip startup Rebellions is setting its sights on global expansion after establishing a strong foothold in its domestic market, pitching an air-cooled, rack-scale compute platform that needs no liquid-cooling infrastructure as it looks to challenge established players like Nvidia and AMD in enterprise AI infrastructure.

Founded in late 2020, Rebellions has spent the past few years building its reputation in South Korea, deploying AI accelerators across various applications from call centers and customer service to national highway surveillance systems. Now, with offices opened in Japan, Saudi Arabia, Taiwan, and the United States, the company is ready to bring its technology to a wider audience.
The Rebel100: A Challenger to Nvidia's Dominance
The company's latest AI accelerator, the Rebel100 (formerly known as Rebel Quad), represents a significant technological achievement. The chip delivers one petaFLOP of dense 16-bit floating point math or double that at FP8 precision, positioning it as a direct competitor to Nvidia's H200 accelerators from late 2023.
What sets the Rebel100 apart is its chiplet architecture. Rather than using a monolithic compute die like Nvidia's H200, Rebellions employs four compute dies manufactured and packaged by Samsung. This approach offers several advantages:
- Improved yields: Smaller dies typically have better manufacturing yields
- Reduced competition for fab capacity: Manufacturing with Samsung means Rebellions avoids the scramble for TSMC's constrained leading-edge capacity
- Strategic partnerships: Close ties with Samsung and SK Hynix for HBM memory supply
The processor is fed by four HBM3e stacks totaling 144 GB of capacity and 4.8 TB/s of aggregate bandwidth. While memory supply remains a challenge industry-wide, Rebellions' South Korean roots provide it with preferential access to the world's largest HBM suppliers.
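The per-card memory figures are easy to sanity-check: if four stacks supply 144 GB and 4.8 TB/s in aggregate, each HBM3e stack contributes 36 GB and 1.2 TB/s. A quick back-of-the-envelope sketch (the per-stack split is our inference, not a figure Rebellions has published):

```python
# Sanity-check the Rebel100's published per-card HBM3e figures.
STACKS = 4
TOTAL_CAPACITY_GB = 144    # capacity quoted in the article
TOTAL_BANDWIDTH_TBS = 4.8  # aggregate bandwidth quoted in the article

per_stack_gb = TOTAL_CAPACITY_GB / STACKS
per_stack_tbs = TOTAL_BANDWIDTH_TBS / STACKS

print(per_stack_gb)   # 36.0 GB per stack
print(per_stack_tbs)  # 1.2 TB/s per stack
```

Those per-stack numbers are consistent with current 12-high HBM3e parts, which is what you would expect given the company's proximity to SK Hynix and Samsung.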
Enterprise-Friendly Design Philosophy
One of Rebellions' key differentiators is its focus on enterprise compatibility. Unlike many modern AI accelerators that require liquid cooling or ultra-power-dense racks, the Rebel100 comes as a PCIe card with a 600-watt TDP that can be deployed in standard air-cooled infrastructure.
The company's reference design packs eight of these cards into a single air-cooled node, with the RebelRack featuring four such nodes for a total of 32 accelerators. This configuration delivers 64 petaFLOPS of FP8 compute, 4.6 TB of HBM3e, and 153.6 TB/s of aggregate memory bandwidth.
For larger deployments, Rebellions is developing the RebelPod, which can scale from eight to 128 nodes, each with eight Rebel100 accelerators interconnected using 800 Gbps Ethernet. This scalability allows enterprises to start small and expand as their AI workloads grow.
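The rack and pod figures above follow directly from the per-card specs. A minimal sketch, using the card-level numbers quoted earlier and the stated 8-to-128-node pod range (the helper function and its name are ours, for illustration):

```python
# Aggregate compute and memory for the RebelRack and RebelPod,
# derived from the per-card Rebel100 specs quoted in the article.
CARD_FP8_PFLOPS = 2.0  # 1 petaFLOP dense FP16, doubled at FP8
CARD_HBM_GB = 144
CARD_BW_TBS = 4.8
CARDS_PER_NODE = 8

def aggregate(nodes):
    """Scale per-card specs up to a deployment of `nodes` eight-card nodes."""
    cards = nodes * CARDS_PER_NODE
    return {
        "cards": cards,
        "fp8_pflops": cards * CARD_FP8_PFLOPS,
        "hbm_tb": cards * CARD_HBM_GB / 1000,  # decimal terabytes
        "bw_tbs": cards * CARD_BW_TBS,
    }

print(aggregate(4))    # RebelRack: 32 cards, 64 PFLOPS, ~4.6 TB HBM, 153.6 TB/s
print(aggregate(8))    # smallest RebelPod: 64 cards
print(aggregate(128))  # largest RebelPod: 1,024 cards
```

The four-node rack works out to 32 cards, 64 petaFLOPS of FP8, 4.6 TB of HBM3e, and 153.6 TB/s of bandwidth, matching the figures Rebellions quotes; a maxed-out 128-node pod would house 1,024 accelerators.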
Software Stack: Betting on Open Source
Rebellions is taking a pragmatic approach to software, building its platform on established open source frameworks. The system runs on vLLM, PyTorch, and Triton, with llm-d for disaggregated inference workloads. This means enterprises can leverage their existing expertise with these tools rather than learning proprietary software stacks.
"Everything's open source, from vLLM compiler all the way up to the very highest level of stack, Red Hat, OpenShift, and everything in between," says Marshall Choy, Rebellions' chief business officer. "If you've used any of these technologies in any other context, you already know how to use Rebellions."
This strategy mirrors successful approaches taken by other companies in the space, though Choy acknowledges that similar claims from other chipmakers haven't always lived up to expectations. The company's membership in the PyTorch Foundation provides additional credibility to its commitment to open source ecosystems.
Financial Backing and IPO Plans
Rebellions is well-funded for its global ambitions, having raised $400 million in a pre-IPO funding round led by Mirae Asset Financial Group and the Korea National Growth Fund. This capital will support both the company's westward expansion and the development of more capable and efficient AI accelerators and systems.
According to recent reports, Rebellions could file for an IPO as soon as this year or early next year, positioning itself as one of the first AI chip startups to go public. This move would provide additional capital for research and development while giving investors exposure to the growing AI infrastructure market.
The Competitive Landscape
The AI chip market is becoming increasingly crowded, with established players like Nvidia and AMD facing competition from numerous startups. However, Rebellions believes its combination of technological innovation, enterprise-friendly design, and strategic partnerships gives it a competitive edge.
The company's focus on air cooling and standard form factors addresses a significant pain point for enterprises looking to deploy AI infrastructure without massive datacenter overhauls. This approach could prove particularly attractive in regions where liquid cooling infrastructure is less common or where enterprises prefer to avoid the complexity and cost of specialized cooling systems.
As the AI infrastructure market continues to evolve, Rebellions' success will depend on its ability to execute on its global expansion plans while continuing to innovate in both hardware and software. With substantial funding, a differentiated product strategy, and a pragmatic approach to software, the company appears well-positioned to challenge the established players in the years ahead.
