AIBC · Week 9
Week 9 – Local LLMs
Running models on your own hardware
Focus for this week
Explore running models locally so you can prototype offline, reduce cost, or keep sensitive data in-house.
Skills & concepts
- Installing Ollama or LM Studio
- Downloading and running at least one local model (see the sketch after this list)
- Comparing outputs against hosted GPT-style models
- Understanding the trade-offs: speed, quality, and hardware requirements
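Once a model is pulled and Ollama is running, you can query it from a short script instead of the interactive prompt. The sketch below is a minimal example using Ollama's local REST endpoint; the model name `llama3` and the default port 11434 are assumptions, so substitute whatever model you actually downloaded.

```python
# Minimal sketch: query a model served by a local Ollama instance.
# Assumes Ollama is running on its default port (11434) and that a model
# such as "llama3" has already been pulled, e.g. via `ollama pull llama3`.
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"

def ask_local_model(prompt: str, model: str = "llama3") -> str:
    payload = json.dumps({
        "model": model,
        "prompt": prompt,
        "stream": False,  # ask for one complete response instead of a token stream
    }).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL, data=payload, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

if __name__ == "__main__":
    print(ask_local_model("Explain in one sentence what a local LLM is."))
```

Running the same prompt against a hosted GPT-style model and placing the two answers side by side is a quick way to get a feel for the quality and speed trade-offs listed above.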
Weekly deliverable
A local model demo that performs a specific task (e.g., drafting a short summary) entirely on your own machine.
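A sketch of what that deliverable could look like, reusing the same local endpoint as above; the model name, port, and sample text are placeholders for whatever you choose to demo.

```python
# Sketch of the weekly deliverable: a short summarization demo that runs
# entirely against a local Ollama model. Model name and port are assumptions.
import json
import urllib.request

def summarize_locally(text: str, model: str = "llama3") -> str:
    prompt = f"Summarize the following text in two sentences:\n\n{text}"
    payload = json.dumps(
        {"model": model, "prompt": prompt, "stream": False}
    ).encode("utf-8")
    req = urllib.request.Request(
        "http://localhost:11434/api/generate",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

if __name__ == "__main__":
    sample = (
        "Local LLMs run entirely on your own hardware, trading some speed and "
        "quality for privacy, offline access, and zero per-token cost."
    )
    print(summarize_locally(sample))
```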