Come and get the benefits, Tencent Cloud Cloud Studio big benefits!!! Part Two

Correction: after testing, the author found that the Cloud Studio high-performance basic version does not support external services for some reason, which means that by default it can only be used in dialogue mode from its command line.
As shown in the figure:
image

It cannot provide an API service, but the author has tested other machine types, including HAI machines, and they are supported.
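If you want to verify this yourself, Ollama serves a REST API on port 11434 by default; a quick probe like the one below lists the installed models, but on the high-performance basic version it only answers from inside the instance itself:

# List locally installed models through Ollama's REST API (default port 11434)
curl http://127.0.0.1:11434/api/tags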

Following up on the previous article, let's keep collecting the benefits of Tencent Cloud Cloud Studio!!!
Click here for the previous article
Today we will use Tencent Cloud Cloud Studio to set up the DeepSeek R1 large model for free. Let's get started.

  1. Open the Ollama model page
    image
    The model page lists the models that Ollama supports for one-click installation. For example, if we need to install DeepSeek-R1, we can click this link directly to open the model's detail page. If there isn't a suitable one, enter the model keyword in the search bar to search.

  2. Model detail page
    image
    From here we can see that the currently supported sizes include 1.5b, 7b, 8b, 14b, 32b, 70b, 671b, etc. We can switch directly to the one we need, as shown in the figure.

image
We choose to install the 32b model. If not all models are listed, you can click "view all" as shown in the image.
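Each size corresponds to its own tag of the same model, so switching sizes only means changing the tag at the end of the command; for example, on a machine with less memory you could run the 7b tag instead of the 32b one we use below:

# A smaller tag of the same model, for machines with less memory
ollama run deepseek-r1:7b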

  3. Install the model
    We copy the command, then open the terminal in Cloud Studio and enter it to install and run the large model. A quick note: if we only want to pull the model without running it, we can use ollama pull xxx instead, as sketched below. Ollama also supports HF models, but they need to be converted first, which will not be covered here.
    ollama run deepseek-r1:32b
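    If you prefer to download first and run later, a minimal sketch (using the same tag we chose above) looks like this:

    # Download the model weights only, without starting an interactive session
    ollama pull deepseek-r1:32b
    # Check that the model is now available locally
    ollama list
    # Start the interactive session once the download has finished
    ollama run deepseek-r1:32b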

image

image
Press Enter and wait for the download to complete.
At this point, the waiting time is a bit longer since the model is being downloaded directly from the internet. The larger the model, the longer the download time, but theoretically, larger models are smarter.
When you see the prompt "send a message," it indicates that our large model is running normally and waiting for our input.

image

Let's ask a question.

image
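A small tip for the interactive session: lines starting with a slash are handled by Ollama itself rather than sent to the model, for example:

# Show the built-in session commands
/?
# Exit the session and return to the shell
/bye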
Some friends may ask, how can I call this on the web or in a third-party client?
Stay tuned for our next article.
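As a small preview: third-party clients generally talk to Ollama through its REST API on port 11434, the same service mentioned in the correction at the top. A minimal sketch of such a call from the instance itself looks like this (a web page or client on another machine would replace 127.0.0.1 with the instance's address, which is exactly what the high-performance basic version does not allow):

# Ask the running model a question through Ollama's REST API
curl http://127.0.0.1:11434/api/generate -d '{
  "model": "deepseek-r1:32b",
  "prompt": "Hello, who are you?",
  "stream": false
}'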

Big benefits from Tencent Cloud Cloud Studio!!! Part One

Big benefits from Tencent Cloud Cloud Studio!!! Part Two

Big benefits from Tencent Cloud Cloud Studio!!! Part Three

Big benefits from Tencent Cloud Cloud Studio!!! Part Four
