AI in the Company: Container-Based Language Models as a Secure Alternative to the Cloud

11 February 2025

The next generation of business laptops is already shipping with dedicated AI accelerators. But you can run powerful AI applications on your existing hardware today – without monthly cloud costs.

 

The revolution in local AI usage

The landscape of AI language models is evolving rapidly. While cloud services like ChatGPT and Claude offer impressive capabilities, increasingly powerful open-source alternatives are emerging. With Docker containers and the user-friendly Portainer management interface, companies can now deploy these models locally – simply and securely.

 

Advantages of the container-based AI solution

  1. Centralized administration and user-friendliness

Portainer provides you with a clear web interface for managing your AI containers. Complex command-line operations become unnecessary – all functions are available at the click of a mouse.

  2. Flexibility when changing hardware

Container technology makes migrating to new hardware child's play: when switching to a new laptop, simply export your container configurations and import them on the new system.

  3. Automated maintenance

Updates and maintenance work can be planned and carried out centrally. The system proactively informs you about available updates.

  4. Resource monitoring

Keep an eye on CPU, RAM, and network utilization. Portainer shows you the performance of your AI applications in real-time.

  5. Cost efficiency through local AI usage

Let’s look at a typical team of five employees: common cloud providers such as OpenAI and Anthropic charge $25 per user per month (with annual billing) for their team subscriptions. For the entire team, this adds up to $125 per month, or $1,500 per year.

In comparison, a local installation incurs only electricity costs of around $5-10 per month for the entire team, on top of the one-off set-up effort. The resulting savings of roughly $100 per month – or $3,600 over a typical 36-month hardware cycle – can be reinvested: around an additional $700 per workstation becomes available for more powerful hardware.
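As a quick plausibility check, here is the same arithmetic in a few lines of Python – the team size, subscription price, and electricity estimate are simply the assumptions from the example above, and the article rounds the results down conservatively:

    # Back-of-the-envelope comparison: cloud team subscriptions vs. local AI.
    # All figures are the assumptions from the example above.
    team_size = 5
    cloud_price_per_user = 25        # USD per user per month, annual billing
    local_power_cost = 10            # USD per month for the whole team (upper estimate)
    hardware_cycle_months = 36

    cloud_monthly = team_size * cloud_price_per_user          # 125 USD
    monthly_savings = cloud_monthly - local_power_cost        # 115 USD
    cycle_savings = monthly_savings * hardware_cycle_months   # 4,140 USD
    extra_per_workstation = cycle_savings / team_size         # ~830 USD

    print(f"Cloud subscriptions per month: ${cloud_monthly}")
    print(f"Savings per month: ${monthly_savings}")
    print(f"Savings over {hardware_cycle_months} months: ${cycle_savings:,}")
    print(f"Extra budget per workstation: ${extra_per_workstation:,.0f}")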

 

Key benefits of the local setup:

  • Unlimited use at no additional cost
  • Parallel use of different AI models
  • Independence from an internet connection
  • No cloud availability problems
  • Full control over data and models
  • Flexible scalability without additional costs

 

Hardware requirements and future-proofing

Good news: your current business laptop is probably already AI-ready!

DeepSeek-R1:

  • Available in several sizes: 1.5b, 7b, 8b, 14b, 32b, and 70b, plus the full 671b model
  • The 70b version requires around 43GB of memory
  • Performance comparable to OpenAI o1

Llama 3.3:

  • 70B version
  • 43GB memory requirement
  • Delivers performance comparable to the much larger Llama 3.1 405B

Mistral:

  • Large version: 123b with 73GB memory requirement
  • Small version: 24b with 14GB memory requirement
  • Especially good for code, math, and multilingual applications

The new generation of AI laptops with dedicated NPUs (Neural Processing Units) will boost performance further, but they are not essential for getting started. If you want to know whether a particular model fits on your current machine, a rough rule of thumb is sketched below.
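The memory figures quoted in this section make the check straightforward: compare a model's footprint with the RAM actually installed. A minimal Python sketch – the model tags and the 4 GB headroom are illustrative assumptions, and psutil is a third-party package (pip install psutil):

    # Check whether a model's approximate memory footprint fits into this
    # machine's total RAM (figures as quoted in this article).
    import psutil

    MODEL_SIZES_GB = {
        "deepseek-r1:70b": 43,
        "llama3.3:70b": 43,
        "mistral-large:123b": 73,
        "mistral-small:24b": 14,
    }

    def fits_in_ram(model: str, headroom_gb: float = 4.0) -> bool:
        """Return True if the model plus some working headroom fits into RAM."""
        total_gb = psutil.virtual_memory().total / 1024**3
        return MODEL_SIZES_GB[model] + headroom_gb <= total_gb

    for name in MODEL_SIZES_GB:
        status = "fits" if fits_in_ram(name) else "needs more RAM"
        print(f"{name}: {status}")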

 

Installation in four simple steps

For detailed step-by-step instructions, or if you have any questions about the installation, please contact us. We will support you in setting up and optimizing your local AI environment.

  1. Docker Desktop: The basic software for container management. Available for Windows, Mac, and Linux; free for individuals and smaller businesses (larger organizations require a paid subscription).
  2. Portainer: The user-friendly management interface for your containers. Enables easy monitoring and control of all AI applications.
  3. Ollama: The AI engine that powers your local language models. Optimized for efficient use of your existing hardware.
  4. Open WebUI: The ChatGPT-like user interface for interacting with your local AI models. Intuitive and user-friendly. (A minimal scripted alternative for starting these components is sketched below.)
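For readers who prefer scripting to the Portainer interface, the two AI containers can also be started with the Docker SDK for Python (pip install docker). This is only a minimal sketch: the container names, host ports, volume name, and the Open WebUI environment variable are illustrative defaults, not fixed requirements.

    # Start Ollama and Open WebUI programmatically via the Docker SDK for Python.
    import docker

    client = docker.from_env()

    # Ollama: the local model engine, with a named volume for downloaded models
    client.containers.run(
        "ollama/ollama",
        name="ollama",
        ports={"11434/tcp": 11434},
        volumes={"ollama": {"bind": "/root/.ollama", "mode": "rw"}},
        restart_policy={"Name": "always"},
        detach=True,
    )

    # Open WebUI: the ChatGPT-like front end, pointed at the Ollama container
    client.containers.run(
        "ghcr.io/open-webui/open-webui:main",
        name="open-webui",
        ports={"8080/tcp": 3000},
        environment={"OLLAMA_BASE_URL": "http://host.docker.internal:11434"},
        extra_hosts={"host.docker.internal": "host-gateway"},
        restart_policy={"Name": "always"},
        detach=True,
    )

    print("Open WebUI should now be reachable at http://localhost:3000")

In day-to-day use, Portainer remains the more convenient way to monitor and manage exactly these containers.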

After installation, you can start using your local AI models immediately. Container technology ensures that all components run securely and in isolation.
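For example, a pulled model can be queried directly via Ollama's local HTTP API – here a minimal Python sketch, assuming Ollama is listening on its default port 11434 and that a model such as "mistral-small" has already been downloaded (the model name is only an example):

    # Send a prompt to the local Ollama instance and print the answer.
    import requests

    OLLAMA_URL = "http://localhost:11434/api/generate"

    payload = {
        "model": "mistral-small",
        "prompt": "Summarize the advantages of running language models locally.",
        "stream": False,  # return the complete answer in a single response
    }

    response = requests.post(OLLAMA_URL, json=payload, timeout=300)
    response.raise_for_status()
    print(response.json()["response"])

No data leaves the laptop during this exchange – the request goes straight to the local container.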

 

Data security as a key advantage

Running the AI models locally offers decisive security advantages:

  • No data transfer to external servers
  • Full control over data access and processing
  • Easier GDPR compliance thanks to local data storage
  • Protection of business secrets

 

Conclusion

Entering the world of AI doesn’t have to be expensive or complicated. With Docker and Portainer, you can use local AI models securely and cost-effectively – your existing hardware is usually completely sufficient.

Start small and grow as required. The beauty of container technology: you simply add new digital specialists to your team whenever you need them. Like well-trained employees, each container performs its specific task and works seamlessly with the others.

AI hardware is developing rapidly and the models are becoming increasingly powerful. Companies that take the first step today not only gain valuable know-how – they also actively shape their digital future.
