What Happens in Next-token Generation in an LLM During Inference?

By Bbenzon @bbenzon

“That which gives rise to the next token” is a space far exceeding the scope of the next token itself:
- What’s in the output so far
- An embedding space of thousands of dimensions
- The dynamics of “semantic motion” (@stephen_wolfram) in a subconceptual space of hundreds of billions of parameters

— Charles Wang (@charleswangb) February 21, 2023
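To make the tweet’s point concrete, here is a minimal sketch of a single step of next-token generation, using GPT-2 through the Hugging Face transformers library (the tweet names no model; GPT-2 and the sampling choice are illustrative assumptions, not anyone’s specific setup). Note that the model does not emit “the next word” directly: it produces a probability distribution over its entire vocabulary, conditioned on everything in the context so far, and one token is then drawn from that distribution.

```python
# A minimal sketch of one step of next-token generation.
# Assumes: pip install torch transformers; GPT-2 stands in for any causal LM.
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")
model.eval()

prompt = "That which gives rise to the next token"
inputs = tokenizer(prompt, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# logits has shape (batch, sequence_length, vocab_size); the last position
# scores every token in the vocabulary as a possible continuation.
next_token_logits = outputs.logits[0, -1, :]
probs = torch.softmax(next_token_logits, dim=-1)

# Sampling (rather than taking the argmax) makes it plain that the output
# is a distribution, not a single predetermined "next word".
next_token_id = torch.multinomial(probs, num_samples=1)
print(tokenizer.decode(next_token_id))
```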

We need a theory of such “dynamics of subsemantic-subconceptual motion”, one that can afford us ideas similar to or better than transformers and the higher-order programs running on them (e.g., RLHF, CAI, ICL, CoT).

— Charles Wang (@charleswangb) February 21, 2023
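What would the raw material for such a theory look like? One directly observable quantity is the trajectory a context’s representation traces through the model’s layers. The sketch below (again assuming GPT-2 as a stand-in) pulls out those per-layer hidden states and measures how far the representation moves from layer to layer; a crude first glimpse of the “semantic motion” Wang is pointing at.

```python
# A minimal sketch of extracting hidden-state trajectories, i.e. the
# per-layer "motion" of a context through embedding space.
# Assumes: pip install torch transformers; GPT-2 is illustrative only.
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")
model.eval()

inputs = tokenizer("The cat sat on the", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs, output_hidden_states=True)

# hidden_states is a tuple of (num_layers + 1) tensors, each of shape
# (batch, sequence_length, hidden_size); entry 0 is the input embedding.
# We track the representation at the final context position.
trajectory = torch.stack([h[0, -1, :] for h in outputs.hidden_states])

# One crude "dynamics" statistic: how far that representation moves
# between successive layers.
step_sizes = (trajectory[1:] - trajectory[:-1]).norm(dim=-1)
dim = trajectory.shape[-1]
for layer, step in enumerate(step_sizes, start=1):
    print(f"layer {layer:2d}: moved {step.item():.2f} in {dim}-dim space")
```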

See my recent post: “The idea that ChatGPT is simply ‘predicting’ the next word is, at best, misleading.”
