

If I was Indian I would be so good at this
@fluffykittycat@furry.engineer
This shit shouldn’t have existed in the first place
It’s the generation speed. Internally, LLMs use tokens, which represent either words or parts of words, and map them to integer values. The model then does its prediction on which integer is most likely to come after the input. How the words are split up is an implementation detail that varies from model to model
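A minimal sketch of what that token-to-integer mapping looks like, using a tiny made-up vocabulary (real models learn subword vocabularies like BPE with tens of thousands of entries, so the IDs below are purely illustrative):

```python
# Hypothetical toy vocabulary: each piece of text maps to an integer ID.
# Real tokenizers split words into subword pieces; this one just splits
# on whitespace to show the idea.
vocab = {"the": 0, "cat": 1, "sat": 2, "on": 3, "mat": 4}

def encode(text):
    # Map each whitespace-separated piece to its integer ID.
    return [vocab[w] for w in text.split()]

def decode(ids):
    # Reverse the mapping to recover the text.
    inverse = {i: w for w, i in vocab.items()}
    return " ".join(inverse[i] for i in ids)

ids = encode("the cat sat on the mat")
print(ids)          # [0, 1, 2, 3, 0, 4]
print(decode(ids))  # the cat sat on the mat
```

The model itself only ever sees the integer sequence; generating a reply means repeatedly predicting the next ID and appending it, which is why output arrives token by token rather than word by word.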
They program it to do that on purpose so that Elon Musk can go to the media and claim it wasn’t using full self-driving mode before the collision. It’s wrong but arguably technically correct, and you’d have to dig into it to learn the truth, which is what he wants. We shouldn’t give him the benefit of the doubt because we know what he’s up to
One study suggested all that burn-up is bad for the ozone layer though
I think there used to be a difference, but nowadays they just slap an expensive label on white-label goods