A small local research agent that runs on your machine using LangChain + Ollama. It can search Wikipedia, DuckDuckGo, Arxiv, Crossref, Europe PMC, Open Library, check Unpaywall OA links, do quick math, keep notes in a local ChromaDB, pull in fresh news via AskNews, search Google via SerpAPI, search via SearxNG, and append segments to ongoing Markdown reports.
- Local LLM chat (Ollama)
- Tools: wiki, duckduckgo, arxiv, crossref, europe_pmc, openlibrary, unpaywall (OA lookup), math, date
- Extras: serp_search (Google via SerpAPI), searx_search (SearxNG), save_md_plus (append to report)
- Memory: stores/retrieves snippets in a local ChromaDB folder
- Optional: AskNews for recent events (needs free API creds)
- Utilities: save markdown locally (`save_md_locally`) and summarize long text (`summarize_text`)
## Prereqs

- macOS or Linux (Windows WSL is fine too)
- Python 3.10+
- Ollama installed and running
## Install Ollama

- macOS (Homebrew):

  ```bash
  brew install ollama
  ollama serve
  ```

- Or grab the app: https://ollama.com
## Pull the models this project expects

- Chat model used in `agent.py`:

  ```bash
  ollama pull gpt-oss:120b-cloud
  ```

- Embedding model used for memory in `retriever.py`:

  ```bash
  ollama pull nomic-embed-text
  ```
Note: If `gpt-oss:120b-cloud` isn’t available on your system, you can swap it for a common model like `llama3:8b` or `qwen2:7b` by editing `agent.py`: `llm = ChatOllama(model="llama3:8b", temperature=0.7)`.
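For reference, a minimal sketch of what that swap might look like (depending on your LangChain version, the import may come from `langchain_community.chat_models` instead):

```python
from langchain_ollama import ChatOllama

# Any chat model you have pulled locally works here, e.g. "llama3:8b" or "qwen2:7b".
llm = ChatOllama(model="llama3:8b", temperature=0.7)
```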
## Set up the project

```bash
python3 -m venv .venv
source .venv/bin/activate
pip install -r requirements.txt
```
(Optional, but recommended) Add a `.env` for AskNews, polite headers, and the optional web search tools.

The agent imports the AskNews tool on startup. To avoid errors, add your free AskNews keys:

```
# .env
ASKNEWS_CLIENT_ID=your_client_id
ASKNEWS_CLIENT_SECRET=your_client_secret
```
Don’t want news? Temporarily remove `asknews_search` from the `tools=[...]` list in `agent.py` and the import `from tools.getNews import asknews_search`.

(Optional) To enable SerpAPI (Google) search, also add:
```
SERPAPI_API_KEY=your_serpapi_key
```
Don’t want SerpAPI? Remove `serp_search` from the `tools=[...]` list and the import `from tools.serpSearch import serp_search`.

(Optional) To enable SearxNG (self-hosted/private metasearch), add:
```
SEARX_INSTANCE_URL=https://your-searx-instance
SEARX_TOP_K_RESULTS=5
```
Don’t want SearxNG? Remove `searx_search` from the `tools=[...]` list and the import `from tools.SearxNG import searx_search`.

(Optional but polite) To include an email contact in request headers for Crossref/Unpaywall and satisfy Unpaywall’s required email parameter:
```
CONTACT_EMAIL=[email protected]    # used in User-Agent
UNPAYWALL_EMAIL=[email protected]  # used by unpaywall_lookup if inline email not provided
```
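These keys are read from the environment when the agent starts. If you’re poking at the tools from your own script, the usual pattern is a quick `.env` load; a minimal sketch, assuming `python-dotenv` is installed (`pip install python-dotenv` if it isn’t already pulled in by `requirements.txt`):

```python
import os
from dotenv import load_dotenv

load_dotenv()  # copies the key/value pairs from .env into the process environment

# All of these are optional: they come back as None/empty if you skipped that tool.
serpapi_key = os.getenv("SERPAPI_API_KEY")
searx_url = os.getenv("SEARX_INSTANCE_URL")
contact_email = os.getenv("CONTACT_EMAIL", "")  # goes into the polite User-Agent header
```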
## Run it

```bash
python agent.py
```
You’ll see a prompt:
```
Ask Kurama 🦊
```

Kurama will Analyze -> Prompt -> Research -> Reason -> Gather Info -> Use Tools -> Save Report -> Preview Report.
- Vector store lives in `./chromadb_store` (created automatically)
- You can wipe it by deleting that folder if you want a clean slate
- Markdown reports saved via the tool are in `./LocalStore` (created automatically)
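If you want to peek at what the agent has remembered, here is a minimal sketch for querying the store directly (assuming the `langchain-chroma` and `langchain-ollama` packages; pass `collection_name=...` if `retriever.py` uses a non-default collection):

```python
from langchain_chroma import Chroma
from langchain_ollama import OllamaEmbeddings

# Open the persisted store with the same embedding model the agent uses.
store = Chroma(
    persist_directory="./chromadb_store",
    embedding_function=OllamaEmbeddings(model="nomic-embed-text"),
)

# Print the three stored snippets most similar to a query.
for doc in store.similarity_search("transformer architectures", k=3):
    print(doc.page_content[:200], "\n---")
```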
## Tests

There are a few quick tests under `tests/`.

```bash
pip install -U pytest
pytest -k "not asknews"   # skip news tests if you didn’t set .env
```

## Troubleshooting

- “model not found” → make sure you ran `ollama pull gpt-oss:120b-cloud` and `ollama pull nomic-embed-text`. If `gpt-oss:120b-cloud` isn’t available, switch the model in `agent.py` to something you have (e.g., `llama3:8b`).
- “AskNews Error / Missing env” → add `ASKNEWS_CLIENT_ID` and `ASKNEWS_CLIENT_SECRET` to your `.env` or remove the AskNews tool from the agent.
- “Ollama not running” → start it with `ollama serve`. On macOS, the app can also run a background service.
- “Chroma DB issues” → delete the `chromadb_store` folder and try again.
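Two quick checks that narrow down most Ollama problems (assuming the default local API port 11434):

```bash
ollama list                           # shows which models are pulled locally
curl http://localhost:11434/api/tags  # returns the model list if the server is running
```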
## Project layout

- `agent.py` – creates the agent and wires up tools
- `retriever.py` – embeddings + ChromaDB add/query
- `tools/` – wiki, duckduckgo, serp_search (SerpAPI), searx_search (SearxNG), arxiv, crossref, europe_pmc, openlibrary, unpaywall, math, date, asknews, save_md, save_md_plus, summarize_text
- `prompts/research_prompt.py` – the system prompt
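Adding your own tool usually means dropping a decorated function into `tools/` and appending it to the `tools=[...]` list in `agent.py`. A minimal sketch (the file name and function below are made up for illustration):

```python
# tools/wordCount.py (hypothetical example)
from langchain_core.tools import tool

@tool
def word_count(text: str) -> str:
    """Count the words in a piece of text."""
    return f"{len(text.split())} words"
```

Then import `word_count` in `agent.py` and add it to the `tools=[...]` list alongside the existing tools.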
That’s it. Keep it simple, keep it local, have fun.





