What does it mean for Google Gemini 1.5 Pro to come with a 1 million token context window?


Today, Google announced the release of Gemini 1.5 Pro, its next-generation LLM.

Sundar Pichai posted: “Gemini 1.5 Pro, our mid-sized model, will soon come standard with a 128K-token context window, but starting today, developers + customers can sign up for the limited Private Preview to try out 1.5 Pro with a groundbreaking and experimental 1 million token context window!”

What does it mean to have a 1 million token context window, and how does it compare with the previous Gemini 1.0 Pro and OpenAI’s GPT-4?


3 Answers

Anonymous

Large language models convert your input into a sequence of tokens, which is what the model actually processes. Depending on the implementation and context, each token can represent an entire word, a word fragment, or a single character. Think of the tokens as the vocabulary of the LLM: if you type the word ‘red’, it will likely be converted into a single token representing the word red, rather than being ingested as the letters r-e-d individually.
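
To get a rough feel for this, here is a small sketch using OpenAI’s open-source tiktoken tokenizer. Gemini’s own tokenizer isn’t publicly available, so treat this as a stand-in that shows the general idea rather than Gemini’s exact token counts:

```python
# Illustration of tokenization using OpenAI's tiktoken library.
# (Gemini uses its own tokenizer, so counts will differ in detail,
# but the principle -- text becomes a sequence of token IDs -- is the same.)
import tiktoken

enc = tiktoken.get_encoding("cl100k_base")  # encoding used by GPT-4 models

for text in ["red", "unbelievable", "Gemini 1.5 Pro"]:
    tokens = enc.encode(text)                 # list of integer token IDs
    pieces = [enc.decode([t]) for t in tokens]  # the text each token stands for
    print(f"{text!r} -> {len(tokens)} token(s): {pieces}")
```

A common word like ‘red’ typically maps to a single token, while longer or rarer strings get split into several fragments. The context window is measured in these tokens, so a 1 million token window means the model can take in roughly 700,000+ English words of input at once.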
