Generated Knowledge
Description
Generated Knowledge Prompting improves the accuracy of language model predictions by having the model generate relevant knowledge before answering. By integrating this additional context, the technique boosts performance on tasks that require commonsense reasoning.
Methodology:
- Knowledge Generation: First, the model is prompted to generate specific background knowledge related to the question, for example, factual statements about subjects such as geography, biology, or common misconceptions.
- Integration with Queries: Next, the generated knowledge is combined with the original question in a question-and-answer format, so the model's final response is guided by that added context (see the sketch after this list).
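A minimal sketch of the two-step flow, assuming a placeholder call_model function standing in for whatever LLM client you use; the prompt wording and the number of knowledge statements are illustrative choices, not part of any fixed API.

```python
def call_model(prompt: str) -> str:
    """Hypothetical LLM call; replace with your provider's client."""
    raise NotImplementedError("plug in an actual model call here")


def generate_knowledge(question: str, n: int = 3) -> list[str]:
    """Step 1: prompt the model to produce relevant background facts."""
    prompt = (
        "Generate a short, factual statement relevant to the question below.\n"
        f"Question: {question}\n"
        "Knowledge:"
    )
    return [call_model(prompt) for _ in range(n)]


def answer_with_knowledge(question: str) -> str:
    """Step 2: prepend the generated knowledge to the question and answer it."""
    knowledge = "\n".join(generate_knowledge(question))
    prompt = (
        f"Knowledge:\n{knowledge}\n\n"
        f"Question: {question}\n"
        "Using the knowledge above, answer the question.\n"
        "Answer:"
    )
    return call_model(prompt)
```

Sampling several knowledge statements (n > 1) and letting the final answer draw on all of them is one common way to make the added context more robust than a single generated fact.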