Our Article Published in Neural Processing Letters (Nature Portfolio)
- ABCD Lab.

- Mar 16
On March 16, 2026, our article "Deep Learning with Zero Initialization: Revisiting Symmetry Breaking and Gradient Flow" was published in Neural Processing Letters (Nature Portfolio) and is now available online.
In this work, we revisit the long-standing assumption in deep learning that zero initialization prevents effective learning because of symmetry and gradient-flow issues.
We show that this is not necessarily the case. Under certain conditions, neural networks can still learn effectively even when all weights are initialized to zero. Through experiments across multiple architectures (MLPs, CNNs, ResNets, ViTs, and MLP-Mixers), we find that zero initialization can achieve performance comparable to conventional random initialization.
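To make the conventional assumption concrete, here is a minimal numpy sketch of the textbook argument the article revisits (this is illustrative only, not code or results from the paper): in a two-layer ReLU network with every weight set to zero, the backpropagated gradients for all weight matrices are exactly zero, so plain gradient descent appears unable to move away from the zero point.

```python
import numpy as np

# Illustrative sketch of the classic concern about zero initialization
# (not taken from the paper): 2-layer ReLU MLP, squared loss, all
# weights zero -> every weight gradient is exactly zero.
rng = np.random.default_rng(0)
x = rng.normal(size=(4,))        # input vector
t = 1.0                          # scalar target
W1 = np.zeros((3, 4))            # hidden weights, zero-initialized
W2 = np.zeros((1, 3))            # output weights, zero-initialized

h = np.maximum(W1 @ x, 0.0)      # hidden activations -> all zero
y = (W2 @ h)[0]                  # network output -> zero
dy = y - t                       # dL/dy for 0.5*(y - t)^2 loss

dW2 = dy * h[None, :]            # dL/dW2 = dy * h^T -> zero (h is zero)
dh = (W2.T * dy).ravel()         # dL/dh -> zero (W2 is zero)
dW1 = np.outer(dh * (h > 0), x)  # dL/dW1 -> zero

print(np.abs(dW1).max(), np.abs(dW2).max())  # 0.0 0.0
```

The article's contribution is to examine the conditions under which this intuition fails to hold in practice across modern architectures.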

We also propose a conceptual framework that reinterprets random initialization as one point in a broader space of initialization schemes, suggesting that zero initialization is a viable and underexplored alternative.
Big congratulations to Jongwoo Seo for his great work on this project!
Seo, Jongwoo, and Wuhyun Koh. "Deep Learning with Zero Initialization: Revisiting Symmetry Breaking and Gradient Flow." Neural Processing Letters (2026).
