# How to Run DeepSeek R1 on Mac and Windows for Free
DeepSeek R1 has quickly become one of the most talked-about AI tools, climbing to the top of app store rankings and even surpassing ChatGPT in popularity. Its strength lies in reasoning capabilities comparable to ChatGPT’s, reportedly delivered at a markedly lower training cost. Although the app is available for iPhone, Android, and the web, running it locally on your Mac or Windows PC brings several benefits, such as improved privacy and freedom from built-in censorship. Here’s a thorough guide to installing and running DeepSeek R1 on your computer for free.
---
## Reasons to Run DeepSeek R1 Locally
Before we jump into the installation process, it’s crucial to grasp why running DeepSeek R1 locally might be a preferable choice:
1. **Data Privacy**: DeepSeek has come under fire for transmitting user information to servers located in China. By running the AI locally, your data remains on your device, giving you complete control over your privacy.
2. **Bypassing Censorship**: The cloud-based version of the app may be subjected to real-time censorship. Operating it locally removes these limitations, empowering you to fully explore its capabilities.
3. **No Cost Involved**: DeepSeek R1 is open-source, allowing you to download and run it on your computer free of charge. This is especially attractive to developers, researchers, and enthusiasts aiming to experiment with the AI.
---
## Prerequisites
To run DeepSeek R1 on your local machine, you will require the following:
- A computer running **Windows**, **Mac**, or **Linux**.
- Software to manage and operate the AI models, such as **LM Studio** or **Ollama**.
- Sufficient hardware resources, based on the size of the AI model you opt to run.
---
## Step-by-Step Instructions for Running DeepSeek R1 Locally
### 1. **Download Necessary Software**
- **LM Studio**: A user-friendly application built to run AI models locally. It is free and supports several distillations of DeepSeek R1.
  - Download it from [LM Studio’s official site](https://lmstudio.ai/).
- **Ollama**: An alternative for running AI models locally, driven from Command Prompt (Windows) or Terminal (Mac). It supports smaller AI models, making it suitable for systems with limited hardware; a quick way to confirm it is running is sketched after this list.
  - Get it from [Ollama’s site](https://ollama.com/download/).
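If you choose the Ollama route, it helps to confirm that its background service is actually running before you pull any models. The Python sketch below is a minimal check, assuming Ollama’s default local API port of 11434; it asks the `/api/tags` endpoint which models are already installed.

```python
import requests

# Ollama exposes a local HTTP API (default port 11434) once the app/service is running.
# The /api/tags endpoint lists models that are already downloaded locally.
try:
    resp = requests.get("http://localhost:11434/api/tags", timeout=5)
    resp.raise_for_status()
    models = resp.json().get("models", [])
    if models:
        print("Ollama is running. Installed models:")
        for m in models:
            print(" -", m.get("name"))
    else:
        print("Ollama is running, but no models are installed yet.")
except requests.ConnectionError:
    print("Could not reach Ollama on localhost:11434 - is the app running?")
```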
### 2. **Select a DeepSeek R1 Model**
DeepSeek R1 presents multiple distillations of its AI model, from lightweight versions to comprehensive implementations. Here’s a brief overview:
- **DeepSeek R1 Distill (Qwen 7B)**: Needs about 5GB of storage and 8GB of RAM. A great starting point for most modern computers.
- **Smaller models (1.5B parameters)**: These need as little as 1.1GB of RAM and are well suited to older or less powerful systems.
- **Larger models (up to 70B parameters)**: These deliver better performance but require significantly more RAM and storage.
You can find these models on platforms like [Hugging Face](https://huggingface.co/), where DeepSeek R1 distillations are available.
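If you prefer to fetch a model file manually from Hugging Face (for example, to import a GGUF file into LM Studio yourself), the `huggingface_hub` Python package can handle the download. The repository and file names below are placeholders rather than a specific published repo; substitute the exact DeepSeek R1 distillation and quantization you choose. Both LM Studio and Ollama can also download models for you, so treat this as optional.

```python
from huggingface_hub import hf_hub_download

# Downloads a single GGUF file from a Hugging Face repository.
# The repo_id and filename below are placeholders - replace them with the
# exact DeepSeek R1 distillation repo and quantized file you pick.
model_path = hf_hub_download(
    repo_id="your-chosen-org/DeepSeek-R1-Distill-Qwen-7B-GGUF",  # placeholder repo
    filename="deepseek-r1-distill-qwen-7b-q4_k_m.gguf",          # placeholder file
)
print("Model saved to:", model_path)
```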
### 3. **Install the AI Model**
- If you’re using **LM Studio**, simply search for the DeepSeek R1 model you want within the application and download it. The interface is user-friendly, so getting started is straightforward.
- For **Ollama**, you download the model and run it through Command Prompt or Terminal (a scripted version of this step follows below). This takes slightly more technical know-how, but it supports smaller models that are easier to run on limited hardware.
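For the Ollama path, downloading a model comes down to a single `ollama pull` command. The sketch below simply wraps that command in Python with `subprocess`; the `deepseek-r1:7b` tag reflects Ollama’s naming for these distillations, but check Ollama’s model library for the exact tags available.

```python
import subprocess

# Downloads a DeepSeek R1 distillation through the Ollama CLI.
# Equivalent to typing `ollama pull deepseek-r1:7b` in Terminal/Command Prompt.
# Swap the tag (e.g. deepseek-r1:1.5b) for a smaller model on limited hardware.
model_tag = "deepseek-r1:7b"
result = subprocess.run(["ollama", "pull", model_tag])
if result.returncode == 0:
    print(f"{model_tag} downloaded; start chatting with: ollama run {model_tag}")
else:
    print("Pull failed - make sure Ollama is installed and running.")
```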
### 4. **Start Interacting with DeepSeek R1**
- Once the model is installed, you can start chatting with the AI directly in LM Studio’s interface or via the command line in Ollama; LM Studio can also serve the model over a local API, as shown below.
- Experiment with different models to find the one that best fits your needs and hardware.
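Beyond its built-in chat window, LM Studio can expose the loaded model through a local OpenAI-compatible HTTP server (started from its server panel, with 1234 as the usual default port). The sketch below sends a single chat message to that endpoint using plain `requests`; the model name is a placeholder, so use whatever identifier LM Studio shows for the model you loaded.

```python
import requests

# Sends one chat message to LM Studio's local OpenAI-compatible server.
# Start the server inside LM Studio first; port 1234 is its usual default.
payload = {
    "model": "deepseek-r1-distill-qwen-7b",  # placeholder - use the name LM Studio shows
    "messages": [
        {"role": "user", "content": "Explain step by step why the sky is blue."}
    ],
    "temperature": 0.7,
}
resp = requests.post("http://localhost:1234/v1/chat/completions", json=payload, timeout=300)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])
```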
---
## Hardware Specifications
The hardware requirements for operating DeepSeek R1 locally vary based on the model you decide to use:
- **1.5B parameters**: Needs around 1.1GB of RAM and minimal storage. Ideal for older systems.
- **7B parameters**: Needs roughly 8GB of RAM and about 5GB of storage. Suitable for most modern computers.
- **70B parameters**: Requires substantial hardware, including very high RAM and storage capacity. Recommended for advanced users with powerful systems.
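As a rough sanity check before downloading anything, you can compare your machine’s installed RAM against the tiers above. This sketch uses the `psutil` package and the thresholds from this list; treat them as coarse guidelines rather than hard limits.

```python
import psutil

# Rough guide: map installed RAM to the DeepSeek R1 distillation tiers listed above.
total_gb = psutil.virtual_memory().total / (1024 ** 3)
print(f"Installed RAM: {total_gb:.1f} GB")

if total_gb >= 64:
    print("You can experiment with the larger distillations (up to 70B parameters).")
elif total_gb >= 8:
    print("The 7B distillation is a comfortable fit.")
else:
    print("Stick to the 1.5B distillation or another small model.")
```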
---
## Performance Overview
DeepSeek R1’s different distillations provide varying performance levels. Based on benchmarks published by DeepSeek, the smaller models are faster and far lighter to run, but they trail the larger distillations and the full model on reasoning accuracy, so pick the largest model your hardware can comfortably handle.