Temperature

A setting that controls how creative or deterministic an AI model's output is. Lower values give predictable answers; higher values increase randomness.

Temperature is a parameter you pass to an LLM that controls the randomness of its output. It works by rescaling the model's probability distribution over next tokens before sampling. At temperature 0, the model always picks the most likely next token, producing deterministic, predictable text. At higher temperatures (0.7–1.0 and above), the model samples from a flatter distribution, producing more varied and creative output.
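A minimal sketch of this mechanism in plain Python (the function names are illustrative, not any particular library's API): logits are divided by the temperature, converted to probabilities with a softmax, and then a token is sampled. Temperature 0 is treated as greedy decoding, i.e. always taking the argmax.

```python
import math
import random

def sample_token(logits, temperature):
    """Sample a token index from raw logits after temperature scaling.

    temperature == 0 is treated as greedy decoding (pick the argmax).
    """
    if temperature == 0:
        return max(range(len(logits)), key=lambda i: logits[i])
    # Divide logits by temperature, then softmax.
    # Subtracting the max keeps exp() numerically stable.
    scaled = [l / temperature for l in logits]
    m = max(scaled)
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]
    # Sample one index from the resulting distribution.
    return random.choices(range(len(logits)), weights=probs, k=1)[0]

# Example logits for a 3-token vocabulary.
logits = [2.0, 1.0, 0.1]
print(sample_token(logits, 0))  # greedy: always index 0
```

At temperatures just above 0 the distribution is so sharp that sampling behaves almost identically to greedy decoding; as the temperature rises, lower-ranked tokens win more often.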

For code generation, lower temperatures (0-0.3) are usually better because you want correct, consistent code. For creative tasks like brainstorming, writing marketing copy, or generating ideas, higher temperatures (0.7-1.0) produce more diverse results.
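To see why these ranges behave so differently, a quick comparison (using an illustrative softmax helper, with made-up logits) shows how much probability the top token keeps at a low versus a high temperature:

```python
import math

def softmax_with_temperature(logits, temperature):
    """Convert logits to probabilities after dividing by temperature."""
    scaled = [l / temperature for l in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    return [e / total for e in exps]

logits = [3.0, 1.5, 1.0, 0.5]  # hypothetical scores for 4 candidate tokens
low = softmax_with_temperature(logits, 0.2)
high = softmax_with_temperature(logits, 1.0)

# At 0.2, nearly all probability mass sits on the top token;
# at 1.0, the alternatives keep meaningful probability.
print(round(low[0], 3), round(high[0], 3))  # → 0.999 0.694
```

With the top token taking ~99.9% of the mass at temperature 0.2 but only ~69% at 1.0, low temperatures yield the same code on nearly every run, while high temperatures regularly surface the alternatives you want when brainstorming.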

Most vibe coding tools handle temperature automatically. Cursor and Claude Code use low temperatures for code edits and higher ones for open-ended chat. When building AI features in your own apps, temperature is one of the first knobs to tune.
