Run your own PRIVATE AI locally on Docker and integrate it with VSCode: Ollama, Docker, VMware

#ai #docker #server #debian11
Welcome to our Digital Mirror, where we embark on the exciting journey of building a robust remote AI server. Using Debian as our foundational operating system, we leverage the power of Docker to containerize and manage our applications, ensuring a seamless development and deployment process. Our tutorials guide you through integrating Ollama, an open-source framework for running large language models locally, to give your server its own private AI capabilities.

We also delve into the world of virtualization with ESXi, demonstrating how to efficiently manage and run multiple virtual machines on a single physical server, optimizing resources and improving scalability. Throughout our series, we employ Visual Studio Code (VSCode) as our editor of choice, showcasing its versatility and powerful features for coding in various programming languages and managing Docker containers.

Whether you're a seasoned developer looking to expand your skills in server management and AI, or a hobbyist eager to explore the world of virtualization and containerization, our channel provides step-by-step tutorials, best practices, and tips to help you build and maintain a high-performance remote AI server. Join us on this technical adventure to unlock new possibilities and take your projects to the next level!

Timestamps:
0:00 Intro
2:13 Table of Contents
3:09 ESXi VM Setup
4:53 Configuring Debian 11
6:51 Linux Basic Tools Config
7:49 Docker Installation
8:16 NVIDIA Drivers Installation
10:54 Installing Portainer
11:50 Ollama AI Docker Server Installation
13:40 VSCode Integration with Ollama Server
15:32 Bonus FREE AI Tools
16:43 Outro

Server Specs
Gigabyte Z370 HD3P
6 CPUs x Intel(R) Core(TM) i7-8700K CPU @ 3.70GHz
64 GB DDR4 RAM
NVIDIA ZOTAC GTX 1080
ESXi 8.0

List of Commands used in the video
– Prep the Debian image

apt install sudo &&
usermod -aG sudo digitalmirror &&
logout

sudo apt update &&
sudo apt upgrade &&
sudo apt install cifs-utils -y &&
sudo apt install build-essential -y &&
sudo apt-get install software-properties-common -y &&
sudo apt-get install linux-headers-$(uname -r) -y &&
sudo apt install curl -y

– Installing Docker (via Docker's convenience script)

curl -fsSL https://get.docker.com | sh &&
sudo systemctl enable --now docker

sudo usermod -aG docker digitalmirror
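
Log out and back in so the docker group change takes effect, then, as an optional sanity check, confirm Docker runs without sudo using Docker's standard test image:

docker run --rm hello-world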

– Installing the NVIDIA drivers and the NVIDIA Container Toolkit

distribution=$(. /etc/os-release;echo $ID$VERSION_ID) &&
curl -fsSL https://nvidia.github.io/libnvidia-container/gpgkey | sudo gpg --dearmor -o /usr/share/keyrings/nvidia-container-toolkit-keyring.gpg &&
curl -s -L https://nvidia.github.io/libnvidia-container/$distribution/libnvidia-container.list |
sed 's#deb https://#deb [signed-by=/usr/share/keyrings/nvidia-container-toolkit-keyring.gpg] https://#g' |
sudo tee /etc/apt/sources.list.d/nvidia-container-toolkit.list

sudo apt-get update
sudo apt-get install -y nvidia-container-toolkit
sudo nvidia-ctk runtime configure --runtime=docker
sudo systemctl restart docker
sudo docker run --rm --runtime=nvidia --gpus all nvidia/cuda:11.6.2-base-ubuntu20.04 nvidia-smi

sudo apt install nvtop
sudo apt install htop

Check that the drivers are working:
nvidia-smi
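
The Portainer (10:54) and Ollama server (11:50) steps are not in the command list above. The commands below are a sketch based on each project's documented Docker image usage; treat the ports, volume names, and model choice as assumptions to adapt to your setup.

– Installing Portainer (Community Edition)

sudo docker volume create portainer_data
sudo docker run -d -p 9443:9443 --name portainer --restart=always -v /var/run/docker.sock:/var/run/docker.sock -v portainer_data:/data portainer/portainer-ce:latest

– Running the Ollama server on the GPU and pulling a code model

sudo docker run -d --gpus=all -v ollama:/root/.ollama -p 11434:11434 --name ollama ollama/ollama
sudo docker exec -it ollama ollama pull codellama:7b-code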

Config for Cody (add to VSCode settings.json; point the URL at your Ollama server, which listens on port 11434 by default)
"cody.autocomplete.experimental.ollamaOptions": {
  "url": "http://<ollama-server-ip>:11434",
  "model": "codellama:7b-code"
}
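
To check the endpoint from your workstation before wiring up Cody, you can call Ollama's documented REST API (replace the placeholder IP with your server's address):

curl http://<ollama-server-ip>:11434/api/generate -d '{"model": "codellama:7b-code", "prompt": "def fibonacci(n):", "stream": false}'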

Amazon Affiliate Disclaimer

“As an Amazon Associate I earn from qualifying purchases.”

Learn more about the Amazon Affiliate Program