Running an AI LLM locally – Testing out DeepSeek

I have been working with AI in various aspects of my life, experimenting with different applications. I’ve changed the way I approach AI in strategy and business. One of the advantages of being in the technology field is not only using the technology but also teaching others about it.

Recently, many people I’ve spoken with have been asking about DeepSeek. They’ve heard about it in the news and want to know more about what it is.

When discussing this, I also mention that I’ve been running my AI models locally. This approach has allowed me to fine-tune, train, and experiment with AI more effectively.

To share my knowledge, I created a quick video demonstrating how someone can use DeepSeek locally. In this video, I run Ollama and DeepSeek on a MacBook, which enables me to keep all my prompts and data on my local machine.
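For anyone who wants to try this themselves, the basic setup looks something like the following. This is a minimal sketch; the exact model tag (here `deepseek-r1`) may vary depending on which DeepSeek variant you want from Ollama's model library:

```shell
# Install Ollama on macOS (Homebrew is one option; ollama.com also has an installer)
brew install ollama

# Start the Ollama server (it listens locally on port 11434 by default)
ollama serve &

# Pull and run a DeepSeek model locally -- prompts and data stay on your machine
ollama run deepseek-r1
```

Once the model is running, anything typed at the `ollama run` prompt is processed entirely on the local machine, so nothing is sent to an external service.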
