Eleven study modes across three categories. Runs entirely on your machine via Ollama, with optional BYOK (bring-your-own-key) support for cloud models like Grok. Your facts, drafts, and uploaded outlines never leave your computer unless you explicitly opt in.
Pick an area of law (Contracts, Torts, ConLaw, Crim, BizAssoc, ...). Then choose what kind of work you want to do.
Paste a fact pattern or court opinion → get a structured response.
Drills, multiple choice, essay grading, Socratic dialogue. Most modes include a ⚡ Generate hypo button so you don't need to bring your own fact pattern — pick area, pick topic (Agency, Negligence, 4th Amendment, etc.), and the LLM writes a fresh MEE-style hypo.
Manage outlines, browse history, switch LLM provider.
Standard IRAC compresses the whole rule into one step. IRREAC splits it — and the split is worth points.
Download and run Ollama — it serves the local LLM that powers all analysis. Single drag-and-drop on macOS, single binary on Linux/Windows.
```
git clone https://codeberg.org/russkysong/IRAC-Maker.git
cd IRAC-Maker
bash setup.sh
streamlit run app.py
```
First run downloads the ~5.6 GB qwen3.5:9b model and bundles it into a custom irac-maker Ollama model with the IRREAC system prompt. Subsequent launches are instant.
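The bundling step roughly corresponds to an Ollama Modelfile. A minimal sketch of what such a file might look like — the base tag, system prompt, and the expansion of IRREAC shown here are illustrative assumptions, not the repo's actual file:

```
# Hypothetical Modelfile — illustration only, not the one setup.sh generates
FROM qwen3.5:9b
SYSTEM """You are a legal-analysis assistant. Respond in IRREAC format:
state the Issue, the Rule, a Rule Explanation, the Application, and the
Conclusion, keeping the rule statement and its explanation as separate steps."""
```

Given such a file, `ollama create irac-maker -f Modelfile` registers the custom model, which is why later launches skip the download and prompt-bundling work.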
Out of the box, every fact pattern, IRAC, case brief, uploaded outline, and saved history entry lives only on your machine. Nothing is uploaded, nothing is sent to a server, nothing is committed to git. The Ollama model runs locally; the app is single-user; the storage is plain JSON files under ~/.iracmaker/.
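"Plain JSON files" means the history store is trivial to inspect or back up. A minimal sketch of how such a store could work — the `history.json` file name and entry schema are hypothetical illustrations, not the app's actual code:

```python
import json
from pathlib import Path


def save_history_entry(entry: dict, root: Path) -> None:
    """Append one study-session entry to a plain JSON file under `root`
    (the app would pass something like Path.home() / ".iracmaker")."""
    root.mkdir(parents=True, exist_ok=True)
    history_file = root / "history.json"  # hypothetical file name
    entries = json.loads(history_file.read_text()) if history_file.exists() else []
    entries.append(entry)
    history_file.write_text(json.dumps(entries, indent=2))


def load_history(root: Path) -> list:
    """Read back all saved entries; returns [] if nothing saved yet."""
    history_file = root / "history.json"
    return json.loads(history_file.read_text()) if history_file.exists() else []
```

Everything stays readable with a text editor, and deleting ~/.iracmaker/ wipes all app data.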
BYOK is opt-in. If you switch to a cloud model (Grok, GPT-4, etc.) in Settings, the app shows a red privacy notice and auto-flips the outline source away from your uploads to keep purchased commercial outlines off the network.
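The auto-flip amounts to a simple guard on the provider setting. An illustrative sketch — the function, setting keys, and provider names here are hypothetical, not the app's actual implementation:

```python
# Hypothetical registry of BYOK cloud providers (illustration only)
CLOUD_PROVIDERS = {"grok", "gpt-4"}


def apply_provider(settings: dict, provider: str) -> dict:
    """Switch LLM provider; when it's a cloud model, flag the privacy
    notice and move the outline source off user uploads so purchased
    commercial outlines never reach the network."""
    updated = dict(settings)
    updated["provider"] = provider
    if provider in CLOUD_PROVIDERS:
        updated["show_privacy_notice"] = True
        updated["outline_source"] = "generated"  # away from uploads
    return updated
```

Switching back to a local model leaves the outline source untouched, so uploads are only ever read by the on-device model.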
Outlines you upload are your responsibility. The app does not bundle any commercial outlines. If you upload copyrighted materials (Glannon, Emanuel, etc.), they're stored locally for your personal study use only.