Noam Shazeer

Co-inventor of the transformer and co-founder of Character.AI

  • Co-authored "Attention Is All You Need" (2017), introducing the transformer architecture.
  • Spent nearly two decades at Google contributing to foundational AI research, including early work on large-scale language models.
  • Co-founded Character.AI in 2021, a conversational AI platform, and returned to Google in 2024 as part of a deal in which Google licensed Character.AI's technology and hired him back.

Noam Shazeer (born c. 1976) is an American computer scientist and entrepreneur who was one of Google’s most prolific AI researchers. He co-authored the transformer paper and contributed to numerous advances in large-scale language modeling during his long tenure at Google. He co-founded Character.AI in 2021 before returning to Google DeepMind.

Milestones

  • The Architecture That Ate AI (2017)
    A team at Google introduced the Transformer, a deceptively simple attention-based model that would become the foundation of virtually every major AI breakthrough that followed.