Limited Focus

Obstacle

Description

LLMs have limited attention. Everything you load into context competes for that attention.
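Why more context means less focus per item follows from the softmax normalization at the heart of attention: weights sum to 1, so every token added to context takes probability mass from the rest. The toy calculation below is a deliberate simplification (real attention is per-head and per-layer over learned scores), but it shows the weight on one relevant item collapsing as filler grows:

```python
import math

def softmax(scores):
    """Normalize raw scores into weights that sum to 1, as attention does."""
    exps = [math.exp(s) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

# One clearly relevant item (score 3.0) competing with growing mild filler.
for n_filler in (4, 40, 400):
    scores = [3.0] + [1.0] * n_filler
    print(f"{n_filler:>3} filler items -> weight on the relevant item: "
          f"{softmax(scores)[0]:.3f}")
# -> roughly 0.65, 0.16, and 0.02: same item, same score, far less attention.
```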

When too much is loaded at once, the model either:

  • Dilutes attention across everything (stays shallow)
  • Fixates on the wrong parts (misses what matters)

Impact

  • Worse performance on all tasks when context is too broad
  • Even explicit ground rules get ignored
  • A longer, focused context outperforms a shorter, scattered one (a selection sketch follows this list)
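A minimal sketch of what "focused" means in practice: rank what is available and load only the most relevant pieces, rather than concatenating everything. The keyword-overlap scorer here is a hypothetical stand-in for whatever retrieval signal you actually have (embedding similarity, BM25, etc.); the point is the selection step, not the scorer:

```python
def keyword_overlap(task: str, doc: str) -> int:
    """Stand-in relevance score: count of words shared with the task."""
    return len(set(task.lower().split()) & set(doc.lower().split()))

def build_context(task: str, documents: list[str], budget: int = 2) -> str:
    """Assemble a focused context: the few most relevant documents, not all."""
    ranked = sorted(documents, key=lambda d: keyword_overlap(task, d), reverse=True)
    return "\n\n".join([task] + ranked[:budget])

docs = [
    "Payment retries are capped at three attempts per invoice.",
    "The office coffee machine descaling schedule.",
    "Refunds for failed payments are issued within five days.",
]
print(build_context("How are failed payment retries handled?", docs))
# Keeps the two payment documents and drops the coffee-machine note.
```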

Documented by