How to Install LM Studio to Run LLMs Offline in Linux


LM Studio is a user-friendly desktop application that allows you to download, install, and run large language models (LLMs) locally on your Linux machine.

Using LM Studio, you can break free from the limitations and privacy concerns associated with cloud-based AI models, while still enjoying a familiar ChatGPT-like interface.

In this article, we’ll guide you through installing LM Studio on Linux using the AppImage format, and provide an example of running a specific LLM model locally.

System Requirements

The minimum hardware and software requirements for running LM Studio on Linux are:

  • A dedicated NVIDIA or AMD graphics card with at least 8GB of VRAM.
  • A processor that supports AVX2 and at least 16GB of RAM (see the quick check below).
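
If you are unsure whether your machine meets these requirements, the following commands are a quick, generic Linux check; they are not specific to LM Studio.

grep -o -m1 avx2 /proc/cpuinfo   # prints "avx2" if the CPU supports the instruction set
free -h                          # shows total installed RAM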

Installing LM Studio in Linux

To get started, you need to download the latest LM Studio AppImage from the official website or repository.

Download LM Studio AppImage
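
If you prefer working from the terminal, you can also fetch the AppImage with wget. The URL and version number below are placeholders, so copy the actual download link from the LM Studio website before running the command.

# Placeholder URL - replace with the real AppImage link from lmstudio.ai
wget "https://example.com/LM_Studio-x.y.z.AppImage"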

Once the AppImage is downloaded, make it executable and extract its contents, which will unpack into a directory named squashfs-root.

chmod u+x LM_Studio-*.AppImage
./LM_Studio-*.AppImage --appimage-extract

Now navigate to the extracted squashfs-root directory and set the appropriate ownership and permissions on the chrome-sandbox file, a helper binary that the application needs to run securely.

cd squashfs-root
sudo chown root:root chrome-sandbox
sudo chmod 4755 chrome-sandbox

Now you can run the LM Studio application directly from the extracted files.

./lm-studio
LM Studio On Ubuntu

That’s it! LM Studio is now installed on your Linux system, and you can start exploring and running local LLMs.
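
Optionally, if you would rather launch LM Studio from your desktop's application menu instead of the terminal, you can create a simple .desktop entry for it. The sketch below assumes the extracted squashfs-root directory lives in your home directory; adjust the Exec path to wherever you actually placed it.

# Create a desktop entry so LM Studio appears in the application menu
cat > ~/.local/share/applications/lm-studio.desktop << EOF
[Desktop Entry]
Name=LM Studio
Exec=$HOME/squashfs-root/lm-studio
Type=Application
Categories=Utility;
EOF

After saving the file, the LM Studio entry should appear in your application launcher, though some desktops may need a log out and back in to pick it up.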

Running a Language Model Locally in Linux

After successfully installing and running LM Studio, you can start using it to run language models locally.

For example, to run an open pre-trained model such as Llama or Mistral, click on the search bar at the top, type the model's name, and download one of the listed versions.

Download LLM Model in LM Studio
Downloading LLM Model

Once the download is complete, click on the “Chat” tab in the left pane, then click on the dropdown menu at the top and select the model you just downloaded.

You can now start chatting with the model by typing your messages in the input field at the bottom of the chat window.

Load LLM Model in LM Studio

The model will process your messages and generate responses based on its training. Keep in mind that the response time varies with your system’s hardware and the size of the model you downloaded.
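
In addition to the built-in chat window, LM Studio can expose the loaded model through a local, OpenAI-compatible HTTP server that listens on port 1234 by default. Assuming you have started that server from within the app and loaded a model, a request like the one below is one way to talk to it from the command line; the "local-model" name in the payload is just an illustrative placeholder.

# Send a simple chat request to LM Studio's local OpenAI-compatible endpoint
# (assumes the local server is running on the default port 1234)
curl http://localhost:1234/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "local-model",
    "messages": [{"role": "user", "content": "Hello! What can you run locally?"}]
  }'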

Conclusion

By installing LM Studio on your Linux system using the AppImage format, you can easily download, install, and run large language models locally without relying on cloud-based services.

This gives you greater control over your data and privacy while still enjoying the benefits of advanced AI models. Remember to always respect intellectual property rights and adhere to the terms of use for the LLMs you download and run using LM Studio.
