
Chain-of-Thought Prompting

A prompting technique that asks an AI to explain its reasoning step-by-step before giving a final answer.

Chain-of-thought (CoT) is a simple but powerful prompting technique. Instead of asking for just the answer, you ask the model to explain its reasoning:

Weak: "Is 17 * 24 = 408?"

Strong: "Is 17 * 24 = 408? Let's work through this step by step."
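The weak-vs-strong contrast amounts to appending a CoT cue to the question. A minimal sketch of that prompt construction (the helper name `with_cot` is illustrative, not a library function; the model call itself is out of scope here):

```python
# Zero-shot CoT: append a "think step by step" cue to a plain question.
# with_cot is a hypothetical helper; only prompt construction is shown.

COT_SUFFIX = "Let's work through this step by step."

def with_cot(question: str) -> str:
    """Turn a bare question into a chain-of-thought prompt."""
    return f"{question} {COT_SUFFIX}"

weak = "Is 17 * 24 = 408?"
strong = with_cot(weak)
print(strong)
# → Is 17 * 24 = 408? Let's work through this step by step.
```

The resulting string is what you would send to the model in place of the bare question.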

With CoT, the model generates intermediate reasoning, which:

- Catches errors (the model can self-correct during reasoning).
- Improves complex reasoning (math, logic, code).
- Makes outputs interpretable (you see the model's logic).

CoT is most effective for tasks requiring multi-step reasoning. For factual questions, it's less impactful. The technique is sometimes called "thinking out loud."

Variants include zero-shot CoT ("Let's think step by step..."), few-shot CoT (showing examples), and more advanced methods like self-consistency (sampling multiple reasoning paths and voting on answers).
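The voting step of self-consistency is just a majority vote over the final answers from independently sampled reasoning paths. A minimal sketch, assuming the sampled answers are already extracted as strings (the sample list below is stubbed; in practice each answer comes from a separate model run):

```python
from collections import Counter

def self_consistency(answers: list[str]) -> str:
    """Return the most common final answer across sampled CoT paths."""
    counts = Counter(answers)
    answer, _ = counts.most_common(1)[0]
    return answer

# Stubbed final answers from five hypothetical CoT samples:
sampled = ["408", "408", "407", "408", "418"]
print(self_consistency(sampled))  # → 408
```

Sampling more paths makes the vote more robust, at the cost of more model calls.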

Example

CoT for math:

"Q: If you have 3 apples and 2 oranges, how many pieces of fruit do you have? Let's think step by step.

We have 3 + 2 = 5 pieces of fruit."