ChatGPT recently broke the internet with an absurdly detailed 100-step plan to open a jar of peanut butter, and it's hilariously revealing about how AI models approach simple tasks when you don't give them constraints.
What Makes the 100-Step Plan So Funny?
The viral Reddit post showcasing this interaction perfectly captures what happens when you ask an AI to be "thorough" without setting boundaries. Instead of saying "twist the lid counterclockwise," you get steps like:
- Step 1: Acknowledge the need to access peanut butter
- Step 2: Locate the jar in your pantry
- Step 3: Assess the jar's position relative to other items
- Step 4: Plan your approach vector
- Step 5: Extend your dominant hand
You get the idea. What should take 3 seconds becomes a military operation.
Why AI Models Overthink Simple Tasks
This 100-step plan to open a jar of peanut butter isn't a bug; it's a feature of how large language models work. When you ask for a "plan" without parameters, AI defaults to maximum detail because:
It has no common sense filter. Humans naturally know that "open a jar" doesn't require documenting every muscle movement. AI doesn't have that contextual understanding of what level of detail is socially appropriate.
It optimizes for completeness. Language models are trained to be helpful and thorough. Without constraints, they assume more detail equals better output.
It lacks real-world experience. You've opened thousands of jars. The AI has never opened one. It's theorizing from text descriptions, not lived experience.
What This Teaches Us About Workflow Design
Here's where it gets interesting: the peanut butter plan is funny, but it's also a perfect metaphor for bad process documentation in businesses.
How many times have you encountered:
- Employee onboarding documents that take 40 pages to explain how to request time off
- Standard operating procedures that break down every click in excruciating detail
- Project plans that have 15 steps where 3 would do
The Right Level of Detail
Good workflow documentation should match the complexity of the task and the expertise of your audience:
For simple tasks: High-level steps only. "Open jar, spread peanut butter, close jar."
For complex tasks: Break it down, but focus on decision points and potential failure modes, not obvious micro-actions.
For training beginners: More detail is appropriate, but organize it so experts can skip ahead.
How to Get Better Output from AI Tools
The viral 100-step plan to open a jar of peanut butter shows why prompt engineering matters. Here's how to avoid getting encyclopedic responses when you need practical ones:
Set Clear Constraints
Instead of: "Give me a plan to open a jar of peanut butter"
Try: "Give me a 3-step process to open a jar of peanut butter"
The word count or step limit forces the AI to prioritize.
Specify Your Audience
"Explain how to open a jar of peanut butter for someone who has opened jars before" produces drastically different output than asking for a comprehensive tutorial.
Define the Purpose
Are you looking for:
- Quick reference steps?
- Troubleshooting guidance for stuck lids?
- Accessibility adaptations for people with limited grip strength?
Your purpose should shape the response length and focus.
Use Examples
Show the AI what "good" looks like: "Write a brief guide to opening a jar, similar to instructions you'd find on a product label—just the essentials."
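The four techniques above can be baked into a prompt template instead of remembered ad hoc. Here's a minimal sketch: the function name, parameter names, and template wording are illustrative choices, not part of any particular AI tool's API, and you'd pass the resulting string to whatever LLM client you actually use.

```python
def build_prompt(task, max_steps=3,
                 audience="someone who has done this before",
                 purpose="a quick reference",
                 style_example=None):
    """Assemble a constrained prompt instead of an open-ended 'give me a plan'."""
    lines = [
        f"Give me a {max_steps}-step process to {task}.",   # set clear constraints
        f"Write it for {audience}.",                        # specify your audience
        f"The goal is {purpose}, so keep it brief.",        # define the purpose
    ]
    if style_example:
        # show the AI what "good" looks like
        lines.append(f"Match the style of {style_example}.")
    return " ".join(lines)

prompt = build_prompt(
    "open a jar of peanut butter",
    style_example="instructions you'd find on a product label",
)
print(prompt)
```

The point of the template is that the constraints travel with every request, so you never fall back to the unbounded "give me a plan" phrasing that produced the 100-step monster in the first place.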
The Productivity Paradox
The viral spread of this 100-step plan to open a jar of peanut butter reveals something about our relationship with productivity culture. We're simultaneously:
Obsessed with optimization. We want systems, frameworks, and step-by-step plans for everything.
Exhausted by over-complication. We recognize that sometimes 100 steps is 97 steps too many.
The sweet spot? Intentional complexity. Add steps where they prevent errors or improve outcomes. Remove them everywhere else.
Real Situations That Need Detailed Plans
Not every task deserves the 100-step treatment, but some genuinely do:
- High-stakes procedures: Surgery checklists, aircraft pre-flight checks, nuclear plant protocols
- Infrequent complex tasks: Annual tax filing, disaster recovery procedures, product launches
- Knowledge transfer: When an expert is documenting their process for someone with zero context
- Compliance requirements: When regulations demand documented evidence of each action
The difference? These scenarios have real consequences for missing steps. Peanut butter does not.
Creating Your Own "Right-Sized" Plans
Whether you're building workflows in Flowi or just trying to document your team's processes, use this framework:
The Three-Tier Test
Tier 1: Can an expert do this from memory? If yes, you need a checklist, not a manual. 3-7 high-level steps maximum.
Tier 2: Does this require decisions or judgment? Document the decision criteria and branch points. Maybe 10-20 steps with conditional logic.
Tier 3: Is this genuinely complex with many dependencies? Now you can justify detailed documentation, but organize it hierarchically so people can zoom to their level.
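For teams that like their frameworks executable, the three-tier test can be sketched as a tiny helper. The function name, argument names, and return strings below are illustrative, not a Flowi feature; it just encodes the three questions above in order.

```python
def pick_documentation_tier(expert_can_do_from_memory,
                            needs_judgment_calls,
                            many_dependencies):
    """Suggest a documentation style using the three-tier test."""
    if expert_can_do_from_memory:
        # Tier 1: a checklist, not a manual
        return "checklist (3-7 high-level steps)"
    if many_dependencies:
        # Tier 3: genuinely complex; organize hierarchically
        return "hierarchical detailed documentation"
    if needs_judgment_calls:
        # Tier 2: document decision criteria and branch points
        return "10-20 steps with decision criteria"
    # Default to the lightest form when none of the flags apply
    return "checklist (3-7 high-level steps)"

# A decision-heavy but self-contained task lands in Tier 2:
print(pick_documentation_tier(False, True, False))
```

Note the ordering: the memory check wins even for complex tasks, because if an expert truly holds the whole process in their head, a checklist is still the right artifact.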
Test It
Hand your process document to someone and watch them try to follow it. Where do they get confused? Where do they roll their eyes at obvious over-explanation? That's your feedback.
The Takeaway
The 100-step plan to open a jar of peanut butter is comedy gold precisely because it violates our expectations about proportional response. It's a $1 problem getting a $1,000 solution.
Before you create your next workflow, process document, or AI prompt, ask yourself: Am I solving for the right level of complexity? More steps don't make a plan better—they make it harder to follow.
When you're building workflows in Flowi, start minimal. Add complexity only where it prevents actual problems. Your team will thank you, and you'll avoid becoming the real-world equivalent of a viral AI meme.
Your next step: Review one process document you created in the last month. Could you cut it in half without losing anything important? Try it.