I’ve been chatting with AI a lot lately. And I mean a lot.
Sometimes I find myself scrolling back through endless messages, trying to find where we left off on a particular topic.
And sometimes I get responses that seem totally off-base, as if the AI has forgotten what we were talking about.
Sound familiar?
If you’re diving into the world of large language models (LLMs) like ChatGPT or Claude, you might’ve run into similar issues. Today, I want to talk about why this happens and how you can get the most out of your AI chats. It all comes down to something called the context window.
What’s a context window?
The context window is like the AI’s short-term memory.
It’s the amount of conversation the AI can “see” and use to form its responses. When you chat with an AI, it’s not remembering everything you’ve ever said to it. It’s working with a limited snapshot of your conversation.
This window isn’t infinite. It’s measured in tokens (roughly, pieces of words), and once you hit that limit, older parts of the conversation start falling out of view.
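If you’re wondering what a token actually looks like, here’s a minimal sketch using OpenAI’s open-source tiktoken library. The encoding name is an assumption on my part, since every model has its own tokeniser, so treat the counts as approximate:

```python
# A rough sketch of "measuring in tokens" using OpenAI's open-source
# tiktoken library. "cl100k_base" is an assumption here; every model
# has its own tokeniser, so the exact counts will vary.
import tiktoken

encoding = tiktoken.get_encoding("cl100k_base")

message = (
    "Sometimes I find myself scrolling back through endless messages, "
    "trying to find where we left off on a particular topic."
)

tokens = encoding.encode(message)
print(f"{len(message.split())} words -> {len(tokens)} tokens")
```

Run it on a few of your own messages and you’ll typically see around three-quarters of a word per token in English, which is why the word counts below are smaller than the token counts.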
Why does this matter?
- Relevance: The AI can only respond based on what it “sees” in the context window. If important information falls out, its answers might become less relevant.
- Hallucinations: When the AI loses context, it might start making things up to fill in the gaps. We call this hallucinating, and it can lead to some pretty bizarre responses.
- Efficiency: A cluttered context window full of old, irrelevant chat can make it harder for the AI to focus on your current question.
Different models, different windows
Not all AIs are created equal. Some have larger context windows than others:
- GPT-3.5: About 4,000 tokens (roughly 3,000 words)
- GPT-4: Up to 32,000 tokens (about 24,000 words)
- Gemini 1.5: Up to 1 million tokens (roughly 750,000 words)
But bigger isn’t always better. A larger window means more potential for confusion if you’re not managing your conversations well.
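To put those numbers in perspective, here’s a toy sketch that estimates how much of each window a long conversation would fill. The window sizes are the rough figures above, and the “about four characters per token” rule is only a heuristic, not a real tokeniser:

```python
# Toy sketch: estimate how much of each model's context window a long
# conversation would use. The window sizes are the approximate figures
# above, and "about 4 characters per token" is only a rule of thumb.
APPROX_WINDOW_SIZES = {
    "GPT-3.5": 4_000,
    "GPT-4": 32_000,
    "Gemini 1.5": 1_000_000,
}

def estimate_tokens(text: str) -> int:
    return max(1, len(text) // 4)  # rough heuristic, not a real tokeniser

# Pretend this is a long transcript pasted in from a chat.
conversation = (
    "You: How do context windows work?\n"
    "AI: Think of the window as my short-term memory...\n"
) * 200

used = estimate_tokens(conversation)
for model, window in APPROX_WINDOW_SIZES.items():
    print(f"{model}: ~{used:,} of {window:,} tokens "
          f"({100 * used / window:.1f}% of the window)")
```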
When should you start a new chat?
- Changing topics: If you’re switching gears to a new subject, start fresh.
- Long conversations: If you’ve been chatting for a while, consider a reset.
- Getting odd responses: If the AI seems confused, it might be losing context.
- Complex tasks: For big projects, break them into smaller, focused chats.
Practical tips for better AI chats
- Be specific: Clear, focused questions help the AI understand what you need.
- Summarise: If you’re in a long chat, briefly recap the important points.
- Edit: Remove irrelevant parts of the conversation if you’re reusing a chat (the sketch after this list shows the same trimming idea in code).
- Start fresh: Don’t be afraid to begin a new chat. It’s often more efficient.
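And if you ever move from the chat box to building with an API, the “Summarise” and “Edit” tips become code you write yourself. Here’s a minimal sketch of the common drop-the-oldest approach, with a made-up message format and a crude token estimate standing in for a real tokeniser:

```python
# Minimal sketch of keeping a chat inside a token budget by dropping the
# oldest turns first. The message format and the 4-characters-per-token
# estimate are simplifying assumptions, not any particular API's rules.

def estimate_tokens(text: str) -> int:
    return max(1, len(text) // 4)  # rough heuristic, not a real tokeniser

def trim_to_budget(messages: list[dict], budget: int) -> list[dict]:
    """Keep the first (system) message, then drop the oldest turns until we fit."""
    system, rest = messages[0], messages[1:]
    while rest and (
        estimate_tokens(system["content"])
        + sum(estimate_tokens(m["content"]) for m in rest)
    ) > budget:
        rest.pop(0)  # the oldest turn "falls out of view"
    return [system] + rest

history = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Let's plan a trip to Japan. " * 50},
    {"role": "assistant", "content": "Great! Here's a two-week itinerary... " * 50},
    {"role": "user", "content": "Now, back to that budget question."},
]

trimmed = trim_to_budget(history, budget=200)
print(f"Kept {len(trimmed)} of {len(history)} messages")
```

The design choice is deliberately simple: the system message and the newest turns are usually the most valuable, so the oldest turns are the first to go, which is exactly the “falling out of view” behaviour described earlier.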
What’s in it for you?
By understanding how context windows work, you can:
- Get more accurate and relevant responses
- Avoid frustrating misunderstandings
- Save time by structuring your chats more effectively
- Make the most of the AI’s capabilities
The bottom line
AI is a powerful tool, but it’s not magic. Understanding its limitations helps you use it more effectively. Next time you’re deep in a chat and things start to go off the rails, remember the context window. A fresh start might be just what you need to get back on track.
And remember, practice makes perfect. The more you chat with AI, the better you’ll get at managing these conversations. So go on, start a new chat, and see the difference for yourself.