Local AI: Your Data Stays With You
Install and use AI on your workstation without sending any data to the cloud: Ollama, AnythingLLM, and RAG over your own documents. Digital sovereignty starts here.
What you'll be able to do
Understand how a local LLM works and its privacy advantages
Use Ollama to interact with different AI models locally
Choose the right model for each type of task
Use an AI agent to interact with local documents via RAG
Apply prompting best practices for relevant results
Program
Why local AI + setup
- Cloud vs local: what goes online vs what stays on your machine
- Installing Ollama and downloading models
- Installed models: Mistral (excellent in French), Llama 3.1 8B (fast)
- First test: asking a question to a local model
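The "first test" above can be sketched against Ollama's local REST API (served on port 11434 by default). A minimal sketch, assuming Ollama is running (`ollama serve`) and the `mistral` model has been pulled; the helper names are ours, not part of Ollama:

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default local endpoint

def build_payload(prompt: str, model: str = "mistral") -> dict:
    """Assemble the JSON body for a one-shot, non-streaming completion."""
    return {"model": model, "prompt": prompt, "stream": False}

def ask_local_model(prompt: str, model: str = "mistral") -> str:
    """Send the prompt to the local Ollama server and return its answer.

    Requires Ollama running and the model pulled (`ollama pull mistral`).
    The request never leaves localhost: nothing is sent to the cloud.
    """
    req = urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(build_payload(prompt, model)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

# Example (needs a running Ollama):
# print(ask_local_model("Summarize the GDPR in three sentences."))
```

In day-to-day use you would type the same question in a terminal with `ollama run mistral` or in the AnythingLLM interface; the API form just makes it visible that everything happens on localhost.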
AnythingLLM: AI with a real interface
- Installing and configuring AnythingLLM
- Creating your first workspace
- Asking business questions in the interface
- Comparing models: Mistral vs Llama
Writing effective prompts
- The winning structure: context → role → task → format → constraints
- Before/after examples on your use cases
- Configuring a system prompt in AnythingLLM
- Writing 5 optimized prompts for your recurring tasks
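The five-part structure can be made concrete with a small template helper. The function and the example wording below are illustrative only, not part of AnythingLLM:

```python
def build_prompt(context: str, role: str, task: str, fmt: str, constraints: str) -> str:
    """Assemble a prompt in the recommended order:
    context -> role -> task -> format -> constraints."""
    return "\n".join([
        f"Context: {context}",
        f"Role: {role}",
        f"Task: {task}",
        f"Format: {fmt}",
        f"Constraints: {constraints}",
    ])

prompt = build_prompt(
    context="I am preparing a client meeting about an inheritance case.",
    role="You are a legal assistant specialized in estate matters.",
    task="List the documents I should request from the client beforehand.",
    fmt="A numbered checklist, one item per line.",
    constraints="Stay under 10 items; flag anything uncertain explicitly.",
)
```

Pasting the resulting text into AnythingLLM, or saving the first two parts as a workspace system prompt, gives noticeably more consistent answers than an unstructured question.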
Working on your documents with RAG
- RAG: AI reads your documents and answers with sources
- Drag and drop a full folder into a workspace
- Querying documents: extraction, summary, cross-referencing
- Producing a summary note from multiple documents
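Under the hood, the retrieval step of RAG works roughly like this toy sketch: documents are cut into chunks, the chunks most similar to the question are selected, and only those are handed to the model alongside the question. Real systems such as AnythingLLM use vector embeddings for similarity; we use simple word overlap here purely to keep the sketch self-contained:

```python
def chunk(text: str, size: int = 40) -> list[str]:
    """Cut a document into fixed-size word chunks."""
    words = text.split()
    return [" ".join(words[i:i + size]) for i in range(0, len(words), size)]

def score(question: str, passage: str) -> float:
    """Crude relevance score: fraction of question words found in the passage."""
    q = set(question.lower().split())
    p = set(passage.lower().split())
    return len(q & p) / max(len(q), 1)

def retrieve(question: str, documents: list[str], top_k: int = 2) -> list[str]:
    """Return the top_k chunks most relevant to the question."""
    chunks = [c for doc in documents for c in chunk(doc)]
    return sorted(chunks, key=lambda c: score(question, c), reverse=True)[:top_k]

docs = [
    "The lease starts on 1 March and runs for nine years. Rent is 1200 euros per month.",
    "The tenant must insure the premises. Subletting requires written consent.",
]
context = retrieve("When does the lease start?", docs, top_k=1)
# The retrieved chunk(s) are then prepended to the question and sent to the
# local model, which answers using only that context and can cite its source.
```

This is why answers come back "with sources": the system knows exactly which chunk of which file it handed to the model.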
Organizing daily use
- Creating workspaces per client folder
- Decision tree: local vs cloud by sensitivity
- Daily launch procedure
- Demo of advanced tools: Claude Code + Ollama, Claude Cowork
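The local-vs-cloud decision tree can be pictured as a single routing rule. The sensitivity categories below are illustrative; in the training, each trainee defines their own list for their role:

```python
SENSITIVE = {"personal data", "client files", "health records", "legal files", "hr records"}

def route(data_category: str) -> str:
    """Toy decision rule: route a task to local or cloud AI by data sensitivity.

    Hypothetical categories for illustration; anything sensitive stays local,
    everything else may use a cloud model.
    """
    if data_category.lower() in SENSITIVE:
        return "local"   # never leaves the workstation
    return "cloud"       # generic, non-confidential tasks
```

The point of the exercise is not the code but the habit: decide once, per category of data, and stop re-deciding for every prompt.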
Practical info
Duration: 1 day (7 hours)
Audience: professionals handling sensitive data (regulated professions, finance, legal, HR, healthcare)
Prerequisites: regular use of a computer (Mac or PC); no programming skills required
Group size: 1 to 4 people
Method: 100% hands-on on the trainee's workstation. The trainer installs and configures; the trainee learns to use. Exercises on real or fictional documents adapted to the role.
Assessment: entry-level assessment, practical exercises, end-of-training knowledge evaluation
Trainer: Colombani.ai, AI consultant and trainer
Lead time: 2 weeks minimum between enrollment and training start
Pricing
€700 / person / day
On request
Pricing on request. Each program is adapted to your situation, contact us for a personalized quote.
Accessibility
Program accessible to people with disabilities. Contact the disability officer in advance to discuss accommodations.
Ulysse Trin — [email protected] — 06 58 58 37 11
Post-training support
Frequently asked questions
Do I need a powerful computer to run local AI?
A recent computer with 8 GB of RAM is enough for lightweight models (Mistral 7B, Llama 3.1 8B). For larger models, 16 GB is recommended. We check compatibility beforehand.
Does my data really stay on my machine?
Yes, 100%. Ollama runs entirely on your machine, and AnythingLLM, configured with a local model, never calls external servers. No data is sent over the internet. That's the whole point of local AI.
Is local AI as powerful as ChatGPT?
For most business tasks (summarization, extraction, writing), local models give excellent results. The training teaches you to choose the right model for each task.
Related programs
Working with AI Agents
Configure an AI agent for your role, automate recurring tasks, connect AI to your tools. All without writing a single line of code.
AI in Business: Executive Briefing
What AI concretely changes in business. Live demos, use cases by function, regulatory framework. The entry point for decision-makers.
Request the full program
Bespoke program, tailored to your industry. First call is free.