Use DeepSeek-R1 to Chat with Your Files Privately: 100% Local AI Assistant with Ollama



Subscribe to MLExpert Pro for live “AI Engineering” bootcamp sessions (07-09 Feb): https://www.mlexpert.io/

Use DeepSeek-R1 (with Ollama) instead of sending sensitive documents to a cloud-based AI! In this tutorial, you’ll create a fully offline AI assistant that answers questions about your private files (PDF, txt, markdown) using a local LLM!
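The core idea can be sketched in a few lines of Python: read your local text/markdown files, put their contents into the system prompt, and send the conversation to a DeepSeek-R1 model served locally by Ollama. This is an illustrative sketch, not the exact code from the video's repository; the function names and prompt wording are assumptions, and only the stdlib is used (Ollama's default HTTP endpoint at `localhost:11434` handles the model call).

```python
import json
import urllib.request
from pathlib import Path

OLLAMA_URL = "http://localhost:11434/api/chat"  # Ollama's default local endpoint
MODEL = "deepseek-r1"  # pulled beforehand with `ollama pull deepseek-r1`


def load_files(paths: list[str]) -> str:
    """Concatenate the text of local .txt/.md files into one context string."""
    parts = []
    for p in paths:
        parts.append(f"--- {p} ---\n{Path(p).read_text(encoding='utf-8')}")
    return "\n\n".join(parts)


def build_messages(context: str, history: list[dict], question: str) -> list[dict]:
    """System prompt carries the file contents; history keeps the conversation."""
    system = (
        "You are a helpful assistant. Answer only using these documents:\n\n"
        + context
    )
    return [
        {"role": "system", "content": system},
        *history,
        {"role": "user", "content": question},
    ]


def chat(messages: list[dict]) -> str:
    """Send the chat request to the local Ollama server (requires `ollama serve`)."""
    payload = json.dumps(
        {"model": MODEL, "messages": messages, "stream": False}
    ).encode()
    req = urllib.request.Request(
        OLLAMA_URL, data=payload, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["message"]["content"]
```

Everything runs on your machine: the files never leave it, and the only network call is to the local Ollama server.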

Ollama model(s): https://ollama.com/library/deepseek-r1

AI Bootcamp: https://www.mlexpert.io/
LinkedIn: https://www.linkedin.com/in/venelin-valkov/
Follow me on X: https://twitter.com/venelin_valkov
Discord: https://discord.gg/UaNPxVD6tv
Subscribe: http://bit.ly/venelin-subscribe
GitHub repository: https://github.com/curiousily/AI-Bootcamp

👍 Don’t Forget to Like, Comment, and Subscribe for More Tutorials!

00:00 – Demo
00:36 – Welcome
01:04 – Model on Ollama
01:32 – Project structure and config
03:36 – Read local files
05:45 – Chatbot (Ollama, prompts and chat history)
11:04 – App UI with Streamlit
13:19 – Run the app
13:46 – Local files preview
14:50 – Test the app (chat with the files)
16:51 – Live “AI Engineering” Bootcamp on MLExpert.io
17:49 – Conclusion
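The chatbot step in the outline (prompts and chat history) mostly comes down to keeping the conversation list from growing without bound, so the prompt stays inside the model's context window. Here is a minimal, hypothetical sketch of a sliding-window history; the helper names and the window size are illustrative choices, not the video's exact implementation.

```python
def trim_history(history: list[dict], max_turns: int = 5) -> list[dict]:
    """Keep only the most recent `max_turns` exchanges (user + assistant pairs)."""
    return history[-2 * max_turns:]


def add_turn(history: list[dict], question: str, answer: str,
             max_turns: int = 5) -> list[dict]:
    """Append one question/answer exchange, then trim to the window."""
    history = history + [
        {"role": "user", "content": question},
        {"role": "assistant", "content": answer},
    ]
    return trim_history(history, max_turns)
```

A bigger window gives the model more conversational memory but a longer prompt on every request, which matters for local models with limited context.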

Join this channel to get access to the perks and support my work:
https://www.youtube.com/channel/UCoW_WzQNJVAjxo4osNAxd_g/join

#deepseek #llm #artificialintelligence #chatgpt #chatbot #python #streamlit

