The recent explosion of Generative AI and Large Language Models (LLMs) is creating a vast array of opportunities, and there is more than one way to leverage this breakthrough. You can build a chatbot by “plugging” directly into an LLM and asking it questions. However, LLMs are limited to the information contained in the data they saw during training. So how do you build a chatbot that can answer questions about documents the model has never seen? One solution is a technique called “Retrieval Augmented Generation” (RAG). After an introduction to Generative AI and transformers, we will walk through the RAG logic and explore its benefits and potential drawbacks. The talk is intended for anyone interested in generative AI and its potential applications.
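To give a flavour of the retrieve-then-generate idea the talk will cover, here is a minimal sketch of the RAG flow. The `embed` and `generate` callables are hypothetical placeholders for an embedding model and an LLM; a real implementation would swap in concrete providers and a proper vector store.

```python
# Minimal RAG sketch: retrieve the most relevant documents, then ask the LLM
# to answer using only that retrieved context.
# NOTE: `embed` and `generate` are hypothetical placeholders, not a specific API.
from typing import Callable, List
import numpy as np


def retrieve(question: str, docs: List[str],
             embed: Callable[[str], np.ndarray], k: int = 3) -> List[str]:
    """Return the k documents most similar to the question (cosine similarity)."""
    q = embed(question)
    scores = []
    for doc in docs:
        d = embed(doc)
        scores.append(float(np.dot(q, d) / (np.linalg.norm(q) * np.linalg.norm(d))))
    top = sorted(range(len(docs)), key=lambda i: scores[i], reverse=True)[:k]
    return [docs[i] for i in top]


def answer(question: str, docs: List[str],
           embed: Callable[[str], np.ndarray],
           generate: Callable[[str], str]) -> str:
    """Augment the prompt with retrieved context before calling the LLM."""
    context = "\n\n".join(retrieve(question, docs, embed))
    prompt = (
        "Answer the question using only the context below.\n\n"
        f"Context:\n{context}\n\n"
        f"Question: {question}"
    )
    return generate(prompt)
```

The point of the sketch is the prompt construction step: the model never needs to have seen your documents during training, because the relevant passages are retrieved and injected into the prompt at query time.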
- Maxime Huvet, Lead Data Scientist
Practical applications of Generative AI: building a chatbot answering questions about your documents.
Online webinar
Tuesday 13th August 2024,
1:00 pm - 2:00 pm BST