Temperature
A setting that controls how random or creative an AI's responses are. Low temperature = predictable and focused. High temperature = diverse and creative.
Temperature is one of the most important generation parameters you can adjust. It typically ranges from 0 to 2 and controls the randomness of the model's token selection: the model's logits are divided by the temperature before the softmax, so low values sharpen the probability distribution and high values flatten it. At temperature 0, the model always picks the highest-probability next token (deterministic). At higher temperatures, it considers lower-probability tokens too (creative but riskier).
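The mechanics fit in a few lines. The sketch below is a minimal illustration of temperature-scaled sampling, not any particular library's implementation: logits are divided by the temperature, passed through a softmax, and a token index is drawn from the resulting distribution.

```python
import math
import random

def sample_with_temperature(logits, temperature):
    """Sample a token index from logits scaled by temperature.

    Temperature 0 means greedy (argmax) selection; higher values
    flatten the distribution so unlikely tokens get picked more often.
    """
    if temperature == 0:
        # Deterministic: always pick the highest-probability token.
        return max(range(len(logits)), key=lambda i: logits[i])
    scaled = [l / temperature for l in logits]
    # Softmax with max-subtraction for numerical stability.
    m = max(scaled)
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]
    return random.choices(range(len(logits)), weights=probs, k=1)[0]
```

Dividing by a small temperature exaggerates the gap between logits (nearly deterministic); dividing by a large one shrinks it (nearly uniform).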
Practical guidelines: use low temperature (0-0.3) for factual tasks, code generation, data extraction, and anything where consistency matters. Use medium temperature (0.5-0.7) for general writing and conversation. Use high temperature (0.8-1.2) for creative writing, brainstorming, and when you want surprising outputs.
Temperature interacts with other parameters like top_p (nucleus sampling) and frequency_penalty. Most AI applications set these behind the scenes, but understanding temperature helps you use AI tools with adjustable settings more effectively.
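Top-p works alongside temperature: after temperature scaling produces a probability distribution, nucleus sampling restricts the draw to the smallest set of tokens whose cumulative probability reaches top_p. A minimal sketch of that filtering step (a hypothetical helper, not a library API):

```python
def nucleus_filter(probs, top_p):
    """Return indices of the smallest token set whose cumulative
    probability reaches top_p (nucleus / top-p sampling)."""
    order = sorted(range(len(probs)), key=lambda i: probs[i], reverse=True)
    kept, cum = [], 0.0
    for i in order:
        kept.append(i)
        cum += probs[i]
        if cum >= top_p:
            break
    return kept
```

In a full sampler, the model would then renormalize the kept probabilities and sample only among those tokens, so temperature controls the shape of the distribution while top_p trims its tail.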
Real-World Example
When ChatGPT gives you a bland, predictable answer — the temperature is probably set low. When it gives wildly creative but occasionally nonsensical output — temperature is high.
FAQ
What is Temperature?
A setting that controls how random or creative an AI's responses are. Low temperature = predictable and focused. High temperature = diverse and creative.
How is Temperature used in practice?
When ChatGPT gives you a bland, predictable answer — the temperature is probably set low. When it gives wildly creative but occasionally nonsensical output — temperature is high.
What concepts are related to Temperature?
Key related concepts include Top-p (Nucleus Sampling), Token, and LLM (Large Language Model). Understanding these together gives a more complete picture of how temperature fits into the AI landscape.