LLM Application - Q/A Assistant

In this blog post, I build an interactive command-line Q&A assistant that uses an Ollama LLM together with a vector database to answer user questions through a retrieval-augmented generation (RAG) approach.
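Below is a minimal sketch of how such a RAG loop can be wired up. It assumes the `ollama` Python client and `chromadb` as the vector database, with illustrative model names (`nomic-embed-text`, `llama3`); the actual implementation described in the post may use different libraries and models.

```python
# Minimal RAG sketch: embed documents, store them in a vector DB,
# retrieve the most relevant one, and answer with an Ollama model.
# Assumptions: `ollama` Python client and `chromadb`; model names are illustrative.
import ollama
import chromadb

EMBED_MODEL = "nomic-embed-text"   # assumed embedding model
CHAT_MODEL = "llama3"              # assumed chat model

documents = [
    "Ollama runs large language models locally.",
    "A vector database stores embeddings for similarity search.",
]

# Index the documents in an in-memory Chroma collection.
client = chromadb.Client()
collection = client.create_collection(name="docs")
for i, doc in enumerate(documents):
    embedding = ollama.embeddings(model=EMBED_MODEL, prompt=doc)["embedding"]
    collection.add(ids=[str(i)], embeddings=[embedding], documents=[doc])

def answer(question: str) -> str:
    # Retrieve the most relevant document for the question.
    q_emb = ollama.embeddings(model=EMBED_MODEL, prompt=question)["embedding"]
    results = collection.query(query_embeddings=[q_emb], n_results=1)
    context = results["documents"][0][0]
    # Generate an answer grounded in the retrieved context.
    prompt = f"Answer using only this context:\n{context}\n\nQuestion: {question}"
    reply = ollama.chat(model=CHAT_MODEL, messages=[{"role": "user", "content": prompt}])
    return reply["message"]["content"]

# Interactive command-line loop.
if __name__ == "__main__":
    while True:
        q = input("Ask a question (or 'quit'): ")
        if q.strip().lower() == "quit":
            break
        print(answer(q))
```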

Medium Blog Link