Peter Coates
1 min read · Feb 5, 2025

DeepSeek is an awesome achievement, but I think people are missing something critical about it. It doesn't replace the heavyweight LLM computations; it rides on them. It isn't trained on the original sources but on the pre-digested weights of the big models, and that's the point people are missing. GPT and similarly vast models (or their equivalents) won't go away in any plausible near future. If they did, DeepSeek and the hundred versions of it that are sure to appear immediately would have nothing to feed on.

What DeepSeek does, however, is change the economics. Already, you can run it on a freakin' laptop! Making AI that easy to deliver is huge, but the very fact that the Chinese team created it so cheaply indicates (and I mean no disrespect to them) that it is provably not hard to do. The big boys and girls will do the same, but better, since they control the large LLMs that support DeepSeek. I suspect Nvidia and its peers will be huge winners here, because they make the increasingly dirt-cheap hardware that can run a capable LLM on something as cheap as a toy.

Written by Peter Coates

I was an artist until my thirties, when I discovered computers and jumped ship for a few decades. Now I'm back to it. You can probably find some of my work on Instagram.
