
Temperature and Top_p in ChatGPT Text Generation: The Interplay

January 13, 2024 | AI

In natural language processing (NLP) and text generation, two parameters known as temperature and top_p play a crucial role in shaping the output of language models such as Generative Pre-trained Transformers (GPTs). These two settings allow us to control the level of randomness, creativity, and coherence in the generated text. In this article, we will explore the relationship between temperature and top_p, experiment by assigning them low and high values, and observe the output GPT generates.

Understanding Temperature and Top_p

Before we delve into the relationship between GPT’s temperature and top_p, let’s briefly review what these two settings represent:

Temperature

Temperature is a hyperparameter that affects the probability distribution over the tokens generated by the GPT model. When we adjust the temperature, we control the diversity and creativity of the generated text. According to the OpenAI documentation, temperature ranges between 0 and 2, with a default value of 1:

Temperature determines the creativity and diversity of the text generated by the GPT model:
A higher temperature value (e.g., 1.5) leads to more diverse and creative text, while a lower value (e.g., 0.4) results in more fact-focused and deterministic text.

Mathematically, the temperature is incorporated into the softmax function, which converts the logits (raw output scores) produced by the GPT model into probabilities:
P(token_i) = exp(logits(token_i) / T) / Σ_j exp(logits(token_j) / T)

Where:

  1. P(token_i) is the probability of generating token i
  2. logits(token_i) is the logit (raw output score) for token i
  3. T is the temperature
  4. Σ_j is the sum over all tokens in the vocabulary

The temperature controls how much weight the model gives to low-probability words when generating the next token in the sequence – by tuning the temperature, we strike a balance between the diversity and the quality of the generated text.
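
To make the formula concrete, here is a minimal Python sketch of the temperature-scaled softmax above (the logits are made-up illustrative values, not real model outputs):

import numpy as np

def softmax_with_temperature(logits, temperature=1.0):
    # Scale the logits by the temperature, then apply the standard softmax.
    scaled = np.array(logits) / temperature
    exps = np.exp(scaled - np.max(scaled))  # subtract the max for numerical stability
    return exps / exps.sum()

logits = [2.0, 1.0, 0.5, -1.0]  # hypothetical raw scores for four candidate tokens
print(softmax_with_temperature(logits, temperature=0.4))  # sharp: favors the top token
print(softmax_with_temperature(logits, temperature=1.5))  # flat: spreads probability around

At a temperature of 0.4 the distribution concentrates most of the probability on the highest-scoring token, while at 1.5 the low-scoring tokens keep a realistic chance of being sampled – exactly the deterministic-versus-creative trade-off described above.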

Top_p (Nucleus Sampling)

Top_p is a numerical parameter designed to control the diversity of generated text by restricting the pool of candidate next tokens. It can be set between 0 and 1:

The top_p (nucleus) sampling parameter maintains a balance between diversity and high-probability words by sampling only from the smallest set of most probable tokens whose cumulative probability mass reaches the threshold p. This kind of selection helps ensure that the output is both diverse and relevant to the given context.
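
Here is a minimal sketch of that selection step in Python, using a hand-written probability distribution rather than real model output:

import numpy as np

def top_p_filter(probs, p=0.9):
    # Keep the smallest set of most probable tokens whose cumulative probability reaches p.
    order = np.argsort(probs)[::-1]              # token indices, most probable first
    cumulative = np.cumsum(probs[order])
    cutoff = np.searchsorted(cumulative, p) + 1  # number of tokens in the nucleus
    filtered = np.zeros_like(probs)
    filtered[order[:cutoff]] = probs[order[:cutoff]]
    return filtered / filtered.sum()             # renormalize; the model samples from this

probs = np.array([0.5, 0.3, 0.1, 0.07, 0.03])  # hypothetical next-token probabilities
print(top_p_filter(probs, p=0.75))  # only the two most probable tokens survive

With p = 0.75 the nucleus consists of the two most probable tokens; raising p toward 1 lets progressively rarer tokens back into the candidate pool.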

Experimenting With Different Settings For Temperature and Top_p

Mixing temperature and top_p has a significant influence on text generation, as together they control the balance between creativity and coherence in the generated text. Let’s explore various combinations of temperature and top_p and their effects on different prompts, such as: “Can you describe the process of photosynthesis?”, “How is the Weather in Barcelona During May?”, “Who is Michael Jackson?”, and “Define exercise.”
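
Before looking at the outputs, here is a rough sketch of how such an experiment could be scripted with the OpenAI Python client – the model name and prompt below are placeholders to adapt to your own setup:

from openai import OpenAI

client = OpenAI()  # expects the OPENAI_API_KEY environment variable to be set

combinations = [
    (0.2, 0.2),  # low temperature, low top_p
    (1.5, 0.2),  # high temperature, low top_p
    (1.5, 0.9),  # high temperature, high top_p
    (0.2, 0.9),  # low temperature, high top_p
]

for temperature, top_p in combinations:
    response = client.chat.completions.create(
        model="gpt-3.5-turbo",  # placeholder; any chat-capable model works
        messages=[{"role": "user", "content": "Define exercise."}],
        temperature=temperature,
        top_p=top_p,
    )
    print(f"temperature={temperature}, top_p={top_p}:")
    print(response.choices[0].message.content)

Because sampling is stochastic, each run will produce different wording for the higher-temperature combinations; the outputs quoted below are single representative samples.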

Low Temperature and Low Top_p

With a low temperature there is very little randomness in text generation: the output becomes highly focused, predictable, and close to deterministic, as there is extremely low variability in word selection. By setting top_p to a low value we additionally restrict the model to a small nucleus of the most probable tokens – at very low values, effectively just the single most likely word at each step.

When both temperature and top_p are set to very low values, the outcome is text that lacks creativity, diversity, and spontaneity. It will closely resemble simple template-based responses or repetitive patterns. While this setting may be useful for applications where consistency and predictability are paramount, it can also produce robotic and unnatural-sounding text.

High Temperature and High Top_p

With a high temperature (e.g., 1.5), we invite a sense of exploration and creativity into the text generation process – the model introduces more randomness into word selection, allowing for unexpected and imaginative phrasings. This setting can lead to more diverse and playful text, making it ideal for generating creative content like poetry, storytelling, or brainstorming. And with a high top_p we allow the model to choose from nearly all possible words for the next token, leading to a broad range of options.

So, yes, when both temperature and top_p are set to high values, the generated text tends to be highly creative and unpredictable. This combination encourages the model to explore a wide range of word choices and sentence structures, leading to imaginative and diverse text. It may result in serendipitous and novel word combinations, which can be useful for brainstorming, creative writing, or generating playful content.

High Temperature and Low Top_p

When we combine a high temperature (for example, temperature = 1.5) with a low top_p (e.g., top_p = 0.5), we strike a balance between creativity and coherence. The high temperature still encourages creativity, but the low top_p restricts the model to a subset of highly likely words, ensuring that the generated text maintains a degree of logical flow and coherence. This combination is suitable for generating content that is both creative and structured, such as storytelling or content generation for marketing purposes.

Low Temperature and High Top_p

A low temperature (e.g., temperature = 0.2) with a high top_p (e.g., top_p = 0.8) emphasizes coherence and predictability in the generated text – the low temperature reduces randomness, resulting in near-deterministic generation, while the high top_p still lets the model choose from a broader range of words within that structured context. This makes the combination well suited to generating informative or educational content.

Prompt Output Examples with Different Temperature and Top_p Settings:

Prompt: Define exercise

In the following example, we asked ChatGPT to respond to the prompt “Define exercise” using four different combinations of temperature and top_p settings:

Low Temperature: 0.2, Low Top_p: 0.2
Output: “Exercise is physical activity that promotes health.”

High Temperature: 1.5, Low Top_p: 0.2
Output: “Exercise, in its essence, can be described as a remarkable and invigorating undertaking that involves a diverse range of physical activities, all aimed at enhancing one’s overall well-being.”

High Temperature: 1.5, High Top_p: 0.9
Output: “Exercise, a multifaceted endeavor, encompasses a broad spectrum of physical activities, including but not limited to running, swimming, weightlifting, yoga, and dance. It plays a pivotal role in fostering both physical and mental health, promoting vitality and a sense of accomplishment.”

Low Temperature: 0.2, High Top_p: 0.9
Output: “Exercise is a holistic practice that involves various forms of physical activity, such as jogging, cycling, and strength training. It is instrumental in improving not only physical fitness but also mental well-being, offering a comprehensive approach to health enhancement.”

Prompt: Who is Michael Jackson?

In the following example, we provide four outputs to the prompt “Who is Michael Jackson?” using different combinations of temperature and top_p settings:

Low Temperature: 0.2, Low Top_p: 0.2
Output: “Michael Jackson was a famous American musician and entertainer. He was known for his contributions to the music industry, particularly in the pop genre.”

High Temperature: 1.5, Low Top_p: 0.2
Output: “Michael Jackson, the legendary King of Pop, was an iconic American singer, songwriter, dancer, and performer. His unparalleled talent and groundbreaking contributions to music and dance revolutionized the entertainment industry.”

High Temperature: 1.5, High Top_p: 0.9
Output: “Michael Jackson, often referred to as the ‘King of Pop,’ was a transcendent American artist. His legacy encompasses a remarkable career spanning music, dance, and humanitarian efforts. Jackson’s immense influence extends globally, making him one of the most celebrated and influential figures in modern history.”

Low Temperature: 0.2, High Top_p: 0.9
Output: “Michael Jackson, an extraordinary American entertainer, was not just a musician but also an iconic performer and humanitarian. His unmatched contributions to the world of music and dance have left an indelible mark, making him a legendary figure whose influence is felt worldwide.”

Prompt: How is the Weather in Barcelona During May?

Below you’ll find four outputs to the prompt “How is the Weather in Barcelona During May?” using different combinations of temperature and top_p settings:

Low Temperature: 0.2, Low Top_p: 0.2
Output: “The weather in Barcelona during May is generally mild with pleasant temperatures and occasional rain.”

High Temperature: 1.5, Low Top_p: 0.2
Output: “May in Barcelona brings a delightful change as the city enjoys warmer and sunnier days, with temperatures averaging around 24°C to 27°C. It’s a perfect time to explore the beautiful outdoors.”

High Temperature: 1.5, High Top_p: 0.9
Output: “Barcelona experiences fantastic weather in May, with temperatures soaring to around 25°C to 28°C on average. Expect lots of sunshine and perfect conditions for outdoor activities, beach visits, and soaking up the Mediterranean charm.”

Low Temperature: 0.2, High Top_p: 0.9
Output: “During May, Barcelona typically enjoys pleasant weather with temperatures ranging from 21°C to 24°C. It’s an excellent time to visit this vibrant city and take advantage of the sunny days for sightseeing and enjoying the beaches.”

Conclusion

The interplay between temperature and top_p in language generation models is a dynamic and versatile tool that allows us to fine-tune the output to suit different needs and preferences. By understanding how these settings work together, we can craft text that ranges from precise and controlled to creative and exploratory. So, what have we learned?

Temperature:

  1. If the generated text is too random and lacks coherence, lower the temperature value.
  2. If the generated text is too focused and repetitive, increase the temperature value.

Top_p Sampling:

  1. If the generated text is too narrow in scope and lacks diversity, increase the probability threshold (p).
  2. If the generated text is too diverse and includes irrelevant words, decrease the probability threshold (p).

Low temperature and low top_p settings are useful when we need factual, concise, and contextually relevant information. This combination ensures a conservative and deterministic output, ideal for tasks like answering questions or providing straightforward explanations. As we can see from our experiment, outputs with low temperature and low top_p settings were the most factual and concise, without any irrelevant information or unnecessary words. No mumbo-jumbo.

On the other hand, high temperature and low top_p settings encourage the model to generate more creative and varied responses while retaining some structure. This can be valuable in scenarios where we want to stimulate creativity, generate diverse text, or inspire new ideas. And when we push both temperature and top_p to higher values, we unleash the full creative potential of language models. This combination can produce imaginative and even speculative content, making it suitable for tasks like storytelling, brainstorming, and generating rich, diverse narratives. However, the output may sacrifice factual accuracy.

Lastly, a low temperature with a high top_p strikes a balance between control and diversity. It helps ensure that the generated text adheres closely to the provided prompt while still allowing for some variation and creativity within those constraints.

In essence, the judicious manipulation of temperature and top-p settings empowers us to harness the full potential of language models, adapting their outputs to the specific requirements of our tasks. By mastering this interplay, we can leverage these models effectively in a wide range of applications, from generating informative content to inspiring innovative ideas and narratives.

So, what have we learned? We can start with the default values and then adjust them based on the quality of the generated text and the specific requirements of our application.