Come get the benefits, Tencent Cloud Cloud Studio big benefits!!! Part One

Correction: after the author's testing, the Cloud Studio high-performance basic edition does not, for some reason, support serving external traffic, which means that by default it only supports interactive use from its own command line.
As shown in the figure below:
[screenshot]

It cannot expose an API service to the outside, but the author has tested the other machine types, including HAI machines, and they are supported.

Tencent Cloud is indeed a conscientious cloud provider, and this wave of benefits must be seized.
Today, let's introduce our protagonist
Tencent Cloud Cloud Studio

Introduction:

Cloud Studio is a browser-based integrated development environment (IDE) that provides developers with an uninterrupted cloud workstation. Users can use Cloud Studio without installation and can program online anytime and anywhere by simply opening a browser.

[screenshot]

As an online IDE, Cloud Studio includes basic IDE features such as code highlighting, auto-completion, Git integration, and terminal, while also supporting real-time debugging and plugin extensions, helping developers quickly complete the development, compilation, and deployment of various applications.
[Screenshot: the online IDE interface]

For a more detailed introduction, see the official documentation.

Why mention this? Because Tencent Cloud gives developers a generous amount of free usage time.
Initially it offered 3,000 minutes (general)
and 10,000 minutes (high performance); the general allowance has now been raised to 50,000 minutes. Alright, without further ado, let's get started.
Today's tutorial shows everyone how to deploy large models on Cloud Studio using Ollama.

Step 1: Register a Tencent Cloud account. There's nothing much to say here; just follow the prompts to register and verify your identity, which is very quick.
Step 2: Open our protagonist Tencent Cloud Cloud Studio
The interface is as follows:

[screenshot]

Step 3: Create a new space
As shown in the figure, first click High-performance Space in the menu on the left, then click Create.

[screenshot]

Some may wonder: the banner above already says you can create an Ollama workspace directly, pre-loaded with the red-hot DeepSeek, so why go through the trouble of explaining how to install Ollama ourselves? Because teaching you to fish beats giving you a fish: once you know how to install Ollama, you will be able to install and run other large models yourself in the future. Alright, let's continue.

[screenshot]

Select the appropriate specification, click Next, and the machine will start being created. This may take a few minutes, after which you will be redirected to the IDE page automatically.
Since another wave of resources was released today, I directly selected the high-performance space in the lower left corner. After all, it is a GPU server, yes, a GPU server you can try for free.

Step 4: Install Ollama
Then, in the editor's menu bar, select Terminal ==> New Terminal

[screenshot]

The opened terminal appears in the lower-right part of the editor, where we can enter ordinary Linux commands. Let's install the Ollama large-model deployment framework; a single command is all it takes.

[screenshot]

curl -fsSL https://ollama.com/install.sh | sh

Copy the command above into the terminal, press Enter, and the automatic installation will begin.

After a moment, the installation will complete.
The interface may vary, but the commands are the same.
After installation succeeds, enter ollama --help in the terminal.
(Due to network issues, I will use a copy that was already downloaded.)

[screenshot]

The commonly used commands:
ollama run xxxx: download (on first use) and run the xxxx model
ollama stop xxxx: stop the running xxxx model
ollama list: list all downloaded models
ollama ps: list running models
ollama rm xxxx: delete a model (stop it first, then delete)
For others, refer to the documentation.
Ollama supports mainstream open-source large models; you can check the official website for specifics.
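Once a model is running, Ollama also listens on local port 11434 and exposes an HTTP API; this is the "API service" mentioned in the correction at the top. Below is a minimal sketch using Ollama's documented /api/generate endpoint. The model name deepseek-r1:7b is just an example (use whatever ollama list shows), and remember that on the high-performance basic edition this only works from the machine's own terminal, since external access is not supported there.

```shell
# Ask a locally running model a question via Ollama's HTTP API.
# Assumes the Ollama service is up and the model has already been pulled.
curl http://localhost:11434/api/generate -d '{
  "model": "deepseek-r1:7b",
  "prompt": "Why is the sky blue?",
  "stream": false
}'
```

The response is a JSON object whose "response" field contains the model's answer; with "stream" set to true (the default), the answer arrives as a stream of JSON lines instead.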

[screenshot]

