Co-inventor of the Transformer architecture, the foundation of modern LLMs.
VP Engineering, Gemini Co-lead / Google DeepMind
Noam Shazeer is a renowned AI researcher and engineer, widely recognized as a co-author of the seminal "Attention Is All You Need" paper and a co-inventor of key components of the Transformer architecture. He is currently VP Engineering and Gemini Co-lead at Google DeepMind, having returned to Google after the company struck a licensing deal with his startup, Character.AI, which he co-founded and led as CEO. His inventions also include the Sparsely-Gated Mixture-of-Experts layer and Mesh-TensorFlow, making him a pivotal figure in the large language model revolution.


VP Engineering, Gemini Co-lead
Chief Executive Officer


Principal Software Engineer


Software Engineer


Co-inventor of the Transformer architecture, the foundation of modern LLMs.
Co-founder and former CEO of Character.AI; Google's licensing deal with the startup was valued at $2.7B.
VP Engineering and Co-lead of the Gemini AI project at Google DeepMind.
Inventor of the Sparsely-Gated Mixture-of-Experts (MoE) layer and Mesh-TensorFlow.
Noam Shazeer is a pivotal figure in the AI revolution, known for inventing core technologies that power large language models, including the Transformer architecture and Mixture of Experts. He returned to Google to co-lead the Gemini project after a licensing deal with his startup, Character.AI, valued at $2.7 billion.
AI Pioneer
Transformer Inventor
Character.AI Founder