# Building with LLMs

Apr 6, 2025 · 3 min · updated May 8, 2026
*Figure: Emerging LLM App Stack. Credits: a16z.com*

## Design Pattern: In-Context Learning

Betting on the LLM's context window growing doesn't pay off: as the input approaches the limits of the context window, inference slows down and accuracy degrades. Instead, the typical in-context-learning workflow is:

1. **Data pre-processing/embedding.** Compute embeddings of the private data and store them in a vector database.
2. **Prompt construction/retrieval.** On user input, compile a prompt from a hard-coded template with few-shot examples, information retrieved from external APIs, and a set of relevant documents retrieved from the vector database.
...
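The two steps above can be sketched as follows. This is a toy illustration, not a production implementation: a bag-of-words `Counter` stands in for a real embedding model, and a plain in-memory list stands in for the vector database; the documents, the `embed`/`build_prompt` helpers, and the prompt template are all invented for the example.

```python
import math
from collections import Counter

def embed(text: str) -> Counter:
    # Toy "embedding": bag-of-words counts. A real app would call an
    # embedding model here and get back a dense vector.
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    # Cosine similarity between two sparse term-count vectors.
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

# Step 1: pre-process private data into a "vector database"
# (here just a list of (document, embedding) pairs).
documents = [
    "Refunds are processed within 5 business days.",
    "Our office is open Monday through Friday.",
    "Premium support is available by email.",
]
index = [(doc, embed(doc)) for doc in documents]

# Step 2: on user input, retrieve the most relevant documents and
# compile the prompt from a hard-coded template.
def build_prompt(query: str, k: int = 2) -> str:
    q = embed(query)
    top = sorted(index, key=lambda item: cosine(q, item[1]), reverse=True)[:k]
    context = "\n".join(doc for doc, _ in top)
    return (
        "Answer using only the context below.\n\n"
        f"Context:\n{context}\n\n"
        f"Question: {query}\nAnswer:"
    )

print(build_prompt("How long do refunds take?"))
```

The prompt that comes out pulls the refund-policy document to the top of the context, which is the whole point of the pattern: only the few documents relevant to the query consume context-window budget, instead of the entire corpus.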