Can we talk about hitting 10,000 tokens in a single training run?
I was just trying to fine-tune a small model on my own data and it processed way more tokens than I thought it could handle. Has anyone else pushed their local setup past what they thought was its limit?
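If you want to check how many tokens a run is actually chewing through, here is a minimal sketch, assuming the Hugging Face transformers tokenizer API; the model name and example texts are placeholders, so swap in whatever you're fine-tuning:

```python
# Minimal sketch: count how many tokens a training batch really contains.
# Assumes the Hugging Face "transformers" library is installed.
from transformers import AutoTokenizer

# "gpt2" is just an illustrative model name; use the model you fine-tune.
tokenizer = AutoTokenizer.from_pretrained("gpt2")

# Placeholder examples standing in for your own training data.
texts = [
    "First training example...",
    "Second training example...",
]

# Tokenize each example and sum the lengths to get the batch size in tokens.
total_tokens = sum(len(tokenizer.encode(t)) for t in texts)
print(f"Batch contains {total_tokens} tokens")
```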
3 comments
jessicaw11 · 6d ago
My old 3060 handled a 15k token batch last week, which totally blew my mind. I guess these cards have more headroom than we give them credit for.
mary239 · 5d ago
Wow, @jessicaw11, I got my 3060 to do that once by turning the fans way up.
elizabetht56 · 7h ago
Honestly, I always thought the 3060 was pretty much maxed out. But seeing it handle that many tokens, maybe I was wrong.