
Can we talk about hitting 10,000 tokens in a single training run?

I was just trying to fine-tune a small model on my own data and it processed way more than I thought it could handle. Has anyone else pushed their local setup past what you thought was its limit?
3 comments

jessicaw11
My old 3060 handled a 15k token batch last week, which totally blew my mind. I guess these cards have more headroom than we give them credit for.
mary239
5d ago
Wow, @jessicaw11, I got my 3060 to do that once by turning the fans way up.
elizabetht56
Honestly, I always thought the 3060 was pretty much maxed out. But seeing it handle that many tokens, maybe I was wrong.