
How to Run Large AI Language Models (LLMs) Locally: Offline AI for Beginners


If you follow the world of artificial intelligence (AI), you may have noticed that the open-source community is releasing more and more large language models (LLMs). These models now compete head-on with paid giants such as ChatGPT, Gemini, or Claude Sonnet.

And the best part? You can run them completely offline, so your data stays with you while you use AI without limits.

In this short guide we look at:

  • Why it is worth running LLMs locally
  • The best tools for the job
  • How to choose a model that fits your hardware

Why run AI locally?

Instead of relying on a cloud AI service, you stay in control. Here is why I tend to run LLMs locally:

🛫 No internet required – Use AI without a network connection, whether on a plane, in a remote place, or when you cannot reach ChatGPT.


🔒 Privacy and control – Local AI does not send your data anywhere: no one logs your conversations or uses them for training later. You have complete control.

💬 Unlimited, free use – Forget API limits, paywalled features, and subscriptions. Use it as much as you want – completely free.

💡 AI development – Using AI is not only about ChatGPT. Think of agents, image generation, speech-to-text and text-to-speech, and more. LocalAI from Server.hu can help kick-start similar local development projects; contact them for a no-obligation consultation.

When you use a cloud service, every interaction may be stored, and your code or company data may end up training someone else's model. Run it locally, however, and you stay in charge: no data is lost or shared.


How can you run LLMs? – The best tools for beginners and advanced users

Running AI locally is much easier than you think – some tools require no coding at all! Here are a few tips depending on how deep into the topic you are:

1. LM Studio (the simplest, truly zero code!)

LM Studio is the easiest way to get started with local LLMs. You can download models, chat with them, and even upload documents for the model to answer questions about.

💡 Pro tip: load PDF, CSV, or DOCX files (up to 30 MB) into the model's context. It works like a super mini-RAG for summarizing reports and retrieving information.

2. Ollama (for developers and power users)

This is a command-line tool for downloading and running models very conveniently. If you like working in a terminal, this was invented for you. But it can also be paired with a graphical front end such as Open WebUI, which at first glance gives you the same interface as ChatGPT. You can upload documents, build whole knowledge bases, even talk to it by voice, and pick a different LLM for different tasks and questions.
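A minimal sketch of the Ollama workflow from the terminal (the model name `llama3.3` is an example; this assumes Ollama is installed and its local server is running):

```shell
# Download a model from the Ollama library
ollama pull llama3.3

# Chat with it interactively in the terminal
ollama run llama3.3

# Ollama also exposes a local REST API (port 11434 by default) --
# this is what front ends like Open WebUI connect to:
curl http://localhost:11434/api/generate -d '{
  "model": "llama3.3",
  "prompt": "Why run LLMs locally?",
  "stream": false
}'
```

Because everything goes through that local API, you can swap the chat window for Open WebUI, a script, or your own application without changing the model setup.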

3. vLLM (for lightning speed)

Built by UC Berkeley's Sky Computing lab, it is fast and can serve many requests at the same time – so if speed matters to you, this will be your friend.
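A minimal sketch of vLLM's offline inference API (the model name is an example; this assumes `pip install vllm` and a GPU with enough memory for the chosen model, so it is not something to run on a laptop):

```python
from vllm import LLM, SamplingParams

# Load a model; vLLM fetches it from Hugging Face by name.
llm = LLM(model="meta-llama/Llama-3.1-8B-Instruct")  # example model
params = SamplingParams(temperature=0.7, max_tokens=128)

# vLLM batches these prompts and processes them concurrently --
# this continuous batching is where its speed advantage comes from.
outputs = llm.generate(
    ["Why run LLMs locally?", "What is quantization?"],
    params,
)
for out in outputs:
    print(out.outputs[0].text)
```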

4. Manual installation (for AI enthusiasts and Python developers)

If you want to keep everything in your own hands, download GGUF models from Hugging Face and use libraries such as Transformers, fine-tuning models if you like. This is the advanced level, but with modest programming skills you can manage it smoothly. Remember: AI will even help you write the code!
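One common way to run a GGUF file from Python is the `llama-cpp-python` library; a minimal sketch (the model path is a placeholder – it assumes `pip install llama-cpp-python` and a `.gguf` file already downloaded from Hugging Face):

```python
from llama_cpp import Llama

# Load a quantized GGUF model from disk; n_ctx sets the context window.
llm = Llama(model_path="./models/model-q4_k_m.gguf", n_ctx=4096)

# Plain text completion; the result mirrors the OpenAI-style response dict.
result = llm("Q: Why run LLMs locally? A:", max_tokens=128)
print(result["choices"][0]["text"])
```

The same library can also serve an OpenAI-compatible local API, so tools built for cloud endpoints can talk to your local model instead.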


Which model should you choose?

You can choose from plenty of premium open-source LLMs. The decision depends on your hardware and on what you want to use the model for. A few favorites:

  • DeepSeek R1
  • Gemma 3
  • DeepSeek V3
  • QwQ 32B
  • Llama 3.3 (my favorite!)

If you don’t know where to start, check out the Chatbot Arena LLM leaderboard, where real users rate the models.


Match the model to your hardware

The next step is to choose a model that fits your machine. This is where quantization comes in – a technique that makes models smaller and easier to run.

💡 The key factor: RAM!
LLMs eat a lot of memory; here is a quick reference:

  • 🖥 8–16 GB RAM → the smallest models (3B–7B) with aggressive quantization
  • 🖥 32 GB RAM → small and medium models (7B–13B) with medium quantization
  • 🖥 64 GB+ RAM → larger models (up to 30B–70B) at higher precision

If you have a CUDA-compatible GPU, look for GPU-optimized models – it makes a big difference. For professional use, CUDA GPUs are practically required, often two or more graphics cards. The amount of VRAM is the key! You can already do a lot of AI work with 20–24 GB of VRAM, but for more serious AI workloads you need at least 40–48 GB+.

🧠 Hint: start small, then scale up if your hardware allows!
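The RAM guideline above can be sketched as a back-of-the-envelope calculation: weight memory ≈ parameter count × bytes per weight, plus some runtime overhead. The function name and the ~20% overhead factor are illustrative assumptions, not exact figures:

```python
def model_memory_gb(params_billion: float, bits_per_weight: int,
                    overhead: float = 1.2) -> float:
    """Rough RAM estimate (in GB) for loading an LLM's weights.

    params_billion  -- parameter count in billions (e.g. 7 for a 7B model)
    bits_per_weight -- quantization level: 16 = fp16, 8, or 4 (aggressive)
    overhead        -- ~20% extra for KV cache and runtime buffers (assumed)
    """
    bytes_per_weight = bits_per_weight / 8
    # billions of weights x bytes per weight gives gigabytes directly
    return params_billion * bytes_per_weight * overhead

# A 7B model at 4-bit quantization needs roughly 4 GB of RAM,
# while the same model at fp16 needs roughly 17 GB -- which is why
# quantization is what makes 8-16 GB machines usable at all.
print(model_memory_gb(7, 4))   # ~4.2
print(model_memory_gb(7, 16))  # ~16.8
```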


What are the limits of running locally?

Although the benefits are many, there are some trade-offs:

Time and energy – unlike ChatGPT, a local (professional) AI workstation takes a lot of work, perseverance, and testing to set up.
You need a stronger machine – large models require lots of RAM and multiple GPUs, so a laptop may be enough to try things out, but for really heavy work it will certainly fall short.

These drawbacks are easily offset, however, by the fact that you have full control, there is no subscription, and new, improved models arrive constantly!


Conclusion: take control!

If you run AI offline, there is truly no paywall, no data collection, and no need for a network.

📌 For beginners: LM Studio – simple, yet capable of a lot

📌 What works for me: Ollama + Open WebUI – a fully professional setup
📌 Favorite model: Llama 3.3 – it performs amazingly well in Hungarian
📌 What I love: I can easily use AI for bigger projects, such as license-plate recognition

Open-source LLMs are strong enough that you no longer have to rely on subscriptions. Use AI the way you like!

Disclosure: this article was written by László Fésűs from server.hu. I believe anyone who wants to do search engine optimization (SEO) at a professional level needs deep knowledge of language models, and local AI is an important part of that. That is why I thought it important to present this topic on it.hu – I could not have written a clearer summary myself. But local AI also offers opportunities beyond SEO; one concrete example is license-plate recognition with AI.

I also tried LM Studio, but my machine could not really handle it – in any case it was easy to install and it did work, just very slowly (Intel i3-4150). Better and better open-source models keep coming out, including from OpenAI this summer. That is why I have just invested in a better desktop with a CUDA-compatible NVIDIA graphics card.

Once I have hands-on experience, I will share tips.
