Nagarjuna325/RAG-LLM-Chat-Bot

This project uses Groq, an inference platform that serves many of the openly available LLMs. Groq runs on its LPU (Language Processing Unit), a purpose-built hardware and software stack for fast, efficient inference; it delivers lower latency than typical GPU setups largely thanks to high memory bandwidth. By creating an API key on Groq Cloud, we can call the hosted open-source models from our application.
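
As context, here is a minimal sketch of calling a Groq-hosted model with the official `groq` Python SDK. The model name and prompt are placeholders and are not necessarily what this repository uses.

```python
# Sketch (assumption): calling a Groq-hosted open-source model via the
# official `groq` Python SDK. The model name below is only a placeholder.
import os
from groq import Groq

client = Groq(api_key=os.environ["GROQ_API_KEY"])  # key created on Groq Cloud

response = client.chat.completions.create(
    model="llama-3.1-8b-instant",  # placeholder; any model Groq hosts works
    messages=[{"role": "user", "content": "Explain what an LPU is in one sentence."}],
)
print(response.choices[0].message.content)
```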

We use Streamlit session state variables to store the chat memory (conversation history) so it persists across the reruns Streamlit performs on every interaction.
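
A typical Streamlit pattern for this looks roughly like the sketch below; the key name `messages` is an assumption, not necessarily the one used in `app.py`.

```python
# Sketch (assumption): keeping chat history in Streamlit session state so it
# survives script reruns.
import streamlit as st

if "messages" not in st.session_state:     # initialize once per session
    st.session_state.messages = []

prompt = st.chat_input("Ask a question")
if prompt:
    st.session_state.messages.append({"role": "user", "content": prompt})
    # ... run retrieval + the Groq model here, then append the assistant reply ...

for msg in st.session_state.messages:      # replay the stored conversation
    st.chat_message(msg["role"]).write(msg["content"])
```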

Steps to run

  1. Clone the repository.
  2. pip install -r requirements.txt
  3. streamlit run app.py (substitute the filename of the Streamlit app if it differs; see the note on the API key below)
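
Before step 3 the Groq API key has to be available to the app. One common approach, assumed here rather than confirmed from this repository, is an environment variable or a `.env` file loaded with `python-dotenv`:

```python
# Sketch (assumption): loading the Groq API key from a .env file before the
# client is created. The repository may instead hard-code or prompt for it,
# and python-dotenv may need to be added to requirements.txt.
import os
from dotenv import load_dotenv

load_dotenv()                    # reads GROQ_API_KEY=... from a local .env file
api_key = os.getenv("GROQ_API_KEY")
if not api_key:
    raise RuntimeError("Set GROQ_API_KEY before running `streamlit run app.py`.")
```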
