
I finally got my small language model to run locally on a 5-year-old laptop.

It took only 4 minutes to generate a full paragraph, which surprised me; I'd assumed it would need a new GPU. Has anyone else gotten decent results on older hardware?
2 comments

lucas551
25d ago
I remember running a basic chatbot on my old desktop; it felt like watching paint dry. Makes you wonder what else we've been told needs new hardware when it really doesn't. Old tech can still surprise you sometimes.
3
ross.river
25d ago
My 2012 laptop still runs a local LLM just fine.
6