I Just Spent 2 Hours Creating Seinfeld Jokes. Why Is It Forgetting Stuff?
Remember tokens from earlier? ChatGPT has a limited amount of memory, and its memory is measured in *drum roll please*:
Tokens!
The ChatGPT website can remember a conversation of about 8,000 tokens; it used to be 4,000 before OpenAI upgraded it.
Think of tokens like the memory (RAM) in your old computer: the more tokens a model can hold, the more of your conversation it remembers!
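Want to see tokens for yourself? Here's a minimal sketch using OpenAI's open-source "tiktoken" library (install it with "pip install tiktoken"); the joke text is just an example:

```python
import tiktoken  # OpenAI's open-source tokenizer library: pip install tiktoken

# Load the tokenizer that gpt-3.5-turbo uses.
encoding = tiktoken.encoding_for_model("gpt-3.5-turbo")

joke = "What's the deal with airline food? It's not a meal, it's a dare!"

# Encode the text into tokens (integers) and count them.
tokens = encoding.encode(joke)
print(f"Token count: {len(tokens)}")
print(f"First few tokens: {tokens[:5]}")
```

Every word (and sometimes piece of a word) becomes a token, and every token eats into the model's memory.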
It gets more interesting if you pay for ChatGPT Plus. The "gpt-3.5-turbo" model is trained on billions of parameters bla bla bla datasets.
I know, I know, I lost you there. But there's more to ChatGPT than meets the eye.
Here's what you need to know:
What's a model? It's like a brain. ChatGPT's brain is called "gpt-3.5-turbo", and you can get an even better brain called "gpt-4" if you pay $20 a month for ChatGPT Plus!
Models have token limits! Here are a few (see the sketch after this list):
"gpt-3.5-turbo" has about 4,000 tokens (up to 8,000 in some versions).
"gpt-4" has about 8,000 tokens.
"gpt-4-32k" has about 32,000 tokens.
"claude-100k" has about 100,000 tokens.
These models learn "parameters"
Really, another buzzword? Yup!
Models like "gpt-3.5-turbo" are fed vast amounts of text data (books, internet pages, and so on). The parameters are what the model learns from all that data. They enable the model to understand different topics, grammar, and context, much like reading books and browsing the internet expands our own understanding.
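To make "parameters" less abstract: they're just numbers (weights) that the model adjusts while it learns. Here's a toy sketch that counts the parameters in a miniature made-up network; the layer sizes are hypothetical, purely for illustration:

```python
def layer_parameter_count(inputs: int, outputs: int) -> int:
    """A fully connected layer has one weight per input-output pair, plus one bias per output."""
    weights = inputs * outputs
    biases = outputs
    return weights + biases

# A made-up toy network: three layers of hypothetical sizes.
layer_sizes = [(768, 3072), (3072, 768), (768, 50257)]
total = sum(layer_parameter_count(i, o) for i, o in layer_sizes)
print(f"Toy model parameters: {total:,}")  # about 43 million
```

Real models stack many, many layers like these, which is how the counts climb into the billions.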
Tell me more about the evolution of ChatGPT!
"gpt-1" - June 2018 - 117 million parameters
"gpt-2" - February 2019 - 1.5 billion parameters
"gpt-3" - May 2020 - 175 billion parameters
"gpt-3.5-turbo" - January 2022 - 1.5 billion parameters
"gpt-4" - March 2023 - 170 trillion parameters