Sebastian Raschka

77 recommendations

NVIDIA GPUs uses product
"Even back when I was a grad student, I was in a lab doing biophysical simulations, molecular dynamics, and we had a Tesla GPU back then just for the computations. It was about 15 years ago now."
Claude Code uses software
"I try Claude Code on the web every three to six months, which is just prompting a model to make an update to some GitHub repository that I have"
Tesla GPU uses product
"We had a Tesla GPU back then just for the computations. It was about 15 years ago now"
Zelda uses other
"Sometimes for a pastime I play video games, video games with puzzles, like Zelda and Metroid."
Metroid uses other
"Sometimes for a pastime I play video games, video games with puzzles, like Zelda and Metroid."
VS Code uses software
"So, I use the Codeium plugin for VS Code."
Codeium uses software
"So, I use the Codeium plugin for VS Code. You know, it's very convenient. It's just like a plugin, and then it's a chat interface that has access to your repository."
OLMo uses software
"What I would recommend doing, or what I also do, is if I want to understand, for example, how OLMo is implemented, I would look at the weights in the model hub, the config file, and then you can see, 'Oh, they used so many layers.'"
Qwen 3 uses software
"I can give you also a hands-on example. I was training the Qwen 3 base model with RLVR on MATH-500. The base model had an accuracy of about 15%. Just 50 steps, like in a few minutes with RLVR, the model went from 15% to 50% accuracy."
OLMo 3 uses software
"Sometimes it takes me a day. With OLMo 3, the challenge was RoPE for the position embeddings. They had a YaRN extension and there was some custom scaling there, and I couldn't quite match these things."
MATH-500 uses other
"I was training the Qwen 3 base model with RLVR on MATH-500. The base model had an accuracy of about 15%."
Pulse uses software
"I used that feature before, and I always feel bad because it does that every day, and I rarely check it out"
Exa uses software
"Exa is my preferred search provider"
Cursor Composer uses software
"I should say I use Composer a lot because one of the benefits it has is that it's fast"
Recursive Language Model recommends technique
"The Recursive Language Model paper, that is one of the papers that tries to kind of address the long context thing"
ChatGPT recommends software
"So I suggested, 'Hey, let's try ChatGPT.' We copied the text into ChatGPT, and it fixed them. Instead of two hours going from link to link fixing that, it made that type of work much more seamless."
SGLang mentions software
"even Transformers, the library, is not used in production. People use SGLang or vLLM, and it adds another layer of complexity."
vLLM mentions software
"even Transformers, the library, is not used in production. People use SGLang or vLLM, and it adds another layer of complexity."
GPT-2 mentions software
"And then you start, let's say, with your GPT-2 model and add these things."
RoPE mentions technique
"With OLMo 3, the challenge was RoPE for the position embeddings. They had a YaRN extension and there was some custom scaling there, and I couldn't quite match these things."
YaRN mentions technique
"They had a YaRN extension and there was some custom scaling there, and I couldn't quite match these things."
LoRA mentions technique
"For the character training thing, I think this research is built on fine-tuning about 7 billion parameter models with LoRA, which is essentially only fine-tuning a small subset of the weights of the model."
Stable Diffusion mentions software
"And listeners may know diffusion models from image generation, like Stable Diffusion popularized it."
GANs mentions technique
"There was a paper on generating images. Back then, people used GANs, Generative Adversarial Networks."
BERT mentions software
"It's kind of similar to the BERT models by Google. Like, when you go back to the original transformer, they were the encoder and the decoder."
Gemini Diffusion mentions software
"But there was an announcement by Google, a site where they said they are launching Gemini Diffusion, and they put it into context of their Gemini Nano 2 model, and they said basically: for the same quality on most benchmarks, we can generate things much faster."
Gemini Nano 2 mentions software
"they put it into context of their Gemini Nano 2 model"
Apple Foundation Models mentions software
"Like what Apple tried to do with the Apple Foundation models, putting them on the phone, where they learn from experience."
LoRA adapters mentions technique
"One thing people still use is LoRA adapters. These are basically, instead of updating the whole weight matrix, there are two smaller weight matrices"
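The quote above sketches the core idea of LoRA: freeze the full weight matrix and train only two much smaller matrices whose product forms a low-rank update. A minimal NumPy sketch of that idea (hypothetical shapes and scaling factor, not any specific model's implementation):

```python
import numpy as np

# LoRA adapter sketch: instead of updating the full d_out x d_in matrix W,
# train two small matrices A (r x d_in) and B (d_out x r) of rank r.
rng = np.random.default_rng(0)
d_in, d_out, r = 64, 64, 4

W = rng.standard_normal((d_out, d_in))      # pretrained weight, kept frozen
A = rng.standard_normal((r, d_in)) * 0.01   # trainable down-projection
B = np.zeros((d_out, r))                    # trainable up-projection, zero-init

def lora_forward(x, alpha=8.0):
    # Output = frozen path + scaled low-rank update (B @ A) @ x
    return W @ x + (alpha / r) * (B @ (A @ x))

x = rng.standard_normal(d_in)
y = lora_forward(x)

# With B zero-initialized, the adapter starts as an exact no-op.
assert np.allclose(y, W @ x)

# Trainable parameters are a small fraction of the full matrix:
full_params = W.size            # 4096
lora_params = A.size + B.size   # 512, i.e. 12.5% of the full matrix
```

Zero-initializing `B` is the standard trick: training starts from the pretrained model's behavior, and only the adapter weights (a small fraction of the total) receive gradient updates.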
Nemotron 3 mentions software
"With Nemotron 3, they found a good ratio of how many attention layers do you need for the global information compared to having these compressed states"
DeepSeek-V3.2 mentions software
"DeepSeek-V3.2, where they had a sparse attention mechanism where they have essentially a very efficient, small, lightweight indexer"
World Models mentions technique
"There was a paper by Meta, a paper called World Models. So where they basically apply the concept of world models to LLMs again"
CASP mentions other
"There is a competition called CASP, I think, where they do protein structure prediction"
AlphaFold mentions software
"AlphaFold, when it came out, it crushed this benchmark"
RTX mentions technique
"There's some work in this area like RTX, I think it was a few years ago, where people are starting to do that"
Hugging Face mentions software
"I think when I was at Hugging Face, I was trying to get this to happen, but it was too early. It's like these open robotic models on Hugging Face"
AI2027 report mentions media
"I don't know if you like the originally titled AI 2027 report. They focus more on code and research taste, so the target there is the superhuman coder"
Harmonic mentions software
"I think there are startups—maybe Harmonic is one—where they're going all in on language models plus Lean for math"
Lean mentions software
"language models plus Lean for math"
Memory mentions software
"We talked about Memory, which saves across chats. Its first implementation is kind of odd, where it'll mention my dog's name or something in a chat"
Slack mentions software
"You want to add a new tab in Slack that you want to use, and I think AI will be able to do that pretty well"
Microsoft Word mentions software
"take something like Slack or Microsoft Word. I think if organizations allow it, AI could very easily implement features end-to-end"
Transformer mentions technique
"The word 'transformer' could still be known. I would guess that deep learning is definitely still known, but the transformer might be evolved away from in 100 years with AGI researchers everywhere."
ChatGPT memory feature mentions software
"ChatGPT has a memory feature, right? And so you may have a subscription and you use it for personal stuff, but I don't know if you want to use that same thing at work."
Mistral AI mentions software
"Let's throw in Mistral AI, Gemma..."
Gemma mentions software
"Let's throw in Mistral AI, Gemma..."
gpt-oss-120b mentions software
"gpt-oss, the open weight model by OpenAI... gpt-oss-120b is actually a very strong model and does some things that other models don't do very well."
NVIDIA Nemotron 3 mentions software
"Actually, NVIDIA had a really cool one, Nemotron 3."
Substack mentions software
"For example, if you read a Substack article, I could maybe ask an LLM to give me opinions on that, but I wouldn't even know what to ask."
Bing Sydney mentions software
"I would love to have tried Bing Sydney. Did that have more voice? Because it would so often go off the rails, like telling a reporter to leave his wife, which is obviously scary in hindsight. That's a crazy model to potentially put into general adoption."
GPT-4o mentions software
"There was a lot of backlash last year with GPT-4o getting removed, and I've personally never used the model, but I've talked to people at OpenAI where they get emails from users that might be detecting subtle differences in the deployments in the middle of the night."
TikTok mentions software
"We see this with TikTok. You open it... I don't use TikTok, but supposedly in five minutes the algorithm gets you. It's locked in."
Anthropic mentions software
"A lot of researchers at these companies are so well-motivated, and definitely Anthropic and OpenAI culturally want to do good for the world."
OpenAI mentions software
"A lot of researchers at these companies are so well-motivated, and definitely Anthropic and OpenAI culturally want to do good for the world."
Spotify mentions software
"my wife the other day—she has a podcast for book discussions, a book club, and she was transferring the show notes from Spotify to YouTube, and then the links somehow broke."
YouTube mentions software
"my wife the other day—she has a podcast for book discussions, a book club, and she was transferring the show notes from Spotify to YouTube, and then the links somehow broke."
MMLU mentions other
"even something simpler like MMLU, which is a multiple-choice benchmark. If you just change the format slightly, like, I don't know, if you use a dot instead of a parenthesis or something like that, the model accuracy will vastly differ."
Hugging Face Transformers mentions software
"When you code these from scratch, you can take an existing model from the Hugging Face Transformers library. The library is great, but if you want to learn about LLMs, it's not the best place to start because the code is so complex to fit so many use cases."
ChatGPT mentions software
"But I do still think that eventually, something like ChatGPT would have happened and a build-out like this would have happened, but it probably would not have been as fast."
AlexNet mentions technique
"I think it only happened because you could purchase those GPUs."
Reflection AI mentions other
"We hear about Reflection AI, where they say their two billion dollar fundraise is dedicated to building US open models"
Black Forest Labs mentions software
"They're signing licensing deals with Black Forest Labs, which is an image generation company"
Midjourney mentions software
"signing licensing deals with Black Forest Labs, which is an image generation company, or Midjourney"
Groq mentions product
"We are starting to see some types of consolidation with Groq for $20 billion"
Scale AI mentions software
"Scale AI for almost $30 billion and countless other deals like this"
Perplexity mentions software
"I think there will be some other multi-billion dollar acquisitions, like Perplexity"
Vera Rubin mentions product
"That's part of what Vera Rubin is: they have a new chip with no high-bandwidth memory, which is one of the most expensive pieces"
TPUs mentions product
"Like, Google obviously can make TPUs"
CUDA mentions software
"The moat of NVIDIA is probably not just the GPU. It's more like the CUDA ecosystem, and that has evolved over two decades"
AI2027 report mentions media
"We should say that the AI 2027 report kind of predicts, one of the things it does from a narrative perspective is that there will be a lot of centralization."
Groq mentions product
"That's supposed to be the point of the Groq acquisition."
ADAM Project created other
"The ADAM Project started as me calling it the American DeepSeek Project"
"The goal with the book is basically to understand how the LLM works. It's not going to be a production-level LLM, but once you have that, you can understand the production-level LLM."
MLxtend created software
"I have a repository called MLxtend that I developed as a student around 10 years ago, and it is a reasonably popular library still for certain algorithms, I think especially like frequent data mining stuff."