Attention Is All You Need, Ashish Vaswani, Noam Shazeer, Niki Parmar, Jakob Uszkoreit, Llion Jones, Aidan N. Gomez, Łukasz Kaiser, Illia Polosukhin, 2017, Advances in Neural Information Processing Systems, Vol. 30 (Curran Associates, Inc.) - Introduces the Transformer architecture, the foundation of Large Language Models; its attention mechanism over input sequences defines the conceptual context window.
OpenAI Platform - Models, OpenAI, 2024 - Official documentation listing the available Large Language Models, their context window sizes (in tokens), and other API limits, all of which are crucial for managing application reliability and cost.