Generative Question Answering

Build retrieval-augmented generative QA systems with Konko.

Large Language Models (LLMs) have a data freshness problem. Even the most capable LLMs, like Llama 2, know nothing about events that happened after their training data was collected.

An LLM's knowledge is frozen in time: it is a static snapshot of the world as it existed in its training data.

A solution to this problem is retrieval augmentation: we retrieve relevant information from an external knowledge base and pass it to the LLM alongside the user's question. In this notebook we will learn how to do that with a Konko endpoint, as sketched below.
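
To make the retrieve-then-prompt flow concrete, here is a minimal sketch. The keyword-overlap retriever, the endpoint URL, and the model identifier are illustrative assumptions, not Konko's exact API surface; a production system would use an embedding model plus a vector index for retrieval, and you should consult Konko's documentation for the current endpoint and model names.

```python
import os
import requests

# Tiny in-memory "knowledge base" (stand-in for a real vector store).
DOCUMENTS = [
    "Llama 2 is a family of open-weight large language models released by Meta in 2023.",
    "Retrieval augmentation supplies an LLM with relevant external documents at query time.",
    "Konko provides hosted endpoints for running open-source LLMs such as Llama 2.",
]

def retrieve(query: str, docs: list[str], top_k: int = 2) -> list[str]:
    """Naive keyword-overlap retrieval; real systems use embeddings + a vector index."""
    query_terms = set(query.lower().split())
    scored = sorted(
        docs,
        key=lambda d: len(query_terms & set(d.lower().split())),
        reverse=True,
    )
    return scored[:top_k]

def answer(query: str) -> str:
    # 1) Retrieve relevant context from the external knowledge base.
    context = "\n".join(retrieve(query, DOCUMENTS))

    # 2) Ground the LLM by placing the retrieved context in the prompt.
    prompt = (
        "Answer the question using only the context below.\n\n"
        f"Context:\n{context}\n\n"
        f"Question: {query}"
    )

    # 3) Call a hosted chat-completions endpoint (URL and model name are assumptions).
    response = requests.post(
        "https://api.konko.ai/v1/chat/completions",      # assumed endpoint URL
        headers={"Authorization": f"Bearer {os.environ['KONKO_API_KEY']}"},
        json={
            "model": "meta-llama/Llama-2-13b-chat-hf",   # assumed model identifier
            "messages": [{"role": "user", "content": prompt}],
        },
        timeout=60,
    )
    response.raise_for_status()
    return response.json()["choices"][0]["message"]["content"]

print(answer("What is retrieval augmentation?"))
```

The key design point is that the model never has to "know" the answer from its training data; the retriever supplies fresh context at query time, and the prompt constrains the model to answer from that context.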

📌 Code Notebook