A new paper introduces EGGROLL, a low‑rank perturbation technique that turns Evolution Strategies, traditionally too expensive at this scale, into a viable tool for training billion‑parameter neural networks. By replacing each full‑rank noise matrix with the product of two small matrices, the method cuts memory and compute costs while matching the performance of classic ES on reinforcement learning and large‑language‑model benchmarks.
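The core idea can be sketched in a few lines of numpy. This is an illustrative reconstruction, not the paper's exact algorithm: the function name, the rank, and the variance scaling below are assumptions chosen to show why two thin factors `A` and `B` are far cheaper to store and sample than a full m by n Gaussian matrix.

```python
import numpy as np

rng = np.random.default_rng(0)

def lowrank_perturbation(m, n, rank, sigma, rng):
    """Sample a low-rank ES perturbation as A @ B.T instead of a full
    m x n Gaussian matrix (hypothetical helper, not from the paper).

    Storing A (m x rank) and B (n x rank) takes rank*(m+n) numbers
    instead of m*n, which is the memory/compute saving at low rank.
    """
    A = rng.standard_normal((m, rank))
    B = rng.standard_normal((n, rank))
    # Each entry of A @ B.T has variance `rank`, so divide by
    # sqrt(rank) to make entries have variance sigma**2, matching
    # the scale of classic full-rank ES noise.
    E = (sigma / np.sqrt(rank)) * (A @ B.T)
    return E, A, B

m, n, rank, sigma = 1024, 512, 4, 0.01
E, A, B = lowrank_perturbation(m, n, rank, sigma, rng)

full_params = m * n          # full-rank noise: one number per weight
lowrank_params = rank * (m + n)  # low-rank noise: only the two factors
print(full_params, lowrank_params)
```

In an ES loop, each candidate would evaluate the network at `W + E` (often paired antithetically with `W - E`) and weight the perturbations by fitness; the point of the low-rank form is that only `A` and `B` ever need to be generated, stored, or communicated per candidate.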