BERT
software
1 mention from 1 source
Bidirectional Encoder Representations from Transformers - Google's language model that revolutionized NLP through bidirectional training.
"It's kind of similar to the BERT models by Google. Like, when you go back to the original transformer, they were the encoder and the decoder."
From: State of AI in 2026: LLMs, Coding, Scaling Laws, China, Agents, GPUs, AGI | Lex Fridman Podcast #490 • 2:30:23 • Jan 2026
Attribution: Sebastian uses BERT as an analogy to explain how text diffusion models generate tokens in parallel.
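The analogy rests on the attention mask: a decoder-style model generates tokens left to right under a causal mask, while a BERT-style encoder sees the whole sequence at once, which is what allows masked positions to be filled in parallel. A minimal illustrative sketch (not from the source; function names are my own) of the two mask shapes:

```python
# Illustrative sketch: the two attention-mask shapes behind the analogy.
# A causal (decoder) mask forces left-to-right generation; a full
# (bidirectional, BERT-style) mask lets every position see every other,
# so masked tokens can be predicted in parallel.

def causal_mask(n):
    # Position i may attend only to positions 0..i (its left context).
    return [[1 if j <= i else 0 for j in range(n)] for i in range(n)]

def bidirectional_mask(n):
    # Every position attends to every position, left and right.
    return [[1] * n for _ in range(n)]

print(causal_mask(3))         # [[1, 0, 0], [1, 1, 0], [1, 1, 1]]
print(bidirectional_mask(3))  # [[1, 1, 1], [1, 1, 1], [1, 1, 1]]
```

In a real transformer these 0/1 matrices would be applied (as additive -inf or multiplicative masks) to the attention scores before the softmax.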