Coding with LLMs on Felix

Discover how to use and configure the local generative AIs on Felix (felix.loria.fr) as coding assistants.

A few weeks ago, we made two machines (tigre and jaguar) available on the local network, accessible via felix.loria.fr, for running generative AIs: https://sed-nge.inria.fr/platforms/ollama/index.html

In this techno tutorial, we will cover the following points:

  • Using the AIs via Open WebUI: configuration, system-prompt customization, using functions, documents, etc.
  • Local installation of Open WebUI on your machines and configuration to leverage knowledge bases (RAG).
  • Querying the servers via the Ollama client from Python or another programming language.
  • Using Grid'5000 (G5K) to run heavy workloads.
  • Configuring coding assistants like aider and Continue.
  • Developing a complete small web application with the assistant: code, tests, documentation.

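As a preview of the Ollama client point above, here is a minimal sketch of querying an Ollama server over its HTTP API from Python. The host, port, and model name are assumptions, not confirmed details of the Felix setup; check the platform page for the actual values.

```python
import json

# Assumed endpoint: 11434 is Ollama's default port; the host and model
# name below are placeholders -- adapt them to the actual Felix setup.
OLLAMA_URL = "http://felix.loria.fr:11434"

def build_generate_request(model: str, prompt: str, stream: bool = False) -> dict:
    """Build the JSON body expected by Ollama's /api/generate endpoint."""
    return {"model": model, "prompt": prompt, "stream": stream}

payload = build_generate_request("qwen2.5-coder", "Write a hello-world in Python")
print(json.dumps(payload))

# To actually send the request (requires network access to the server):
#   import urllib.request
#   req = urllib.request.Request(
#       OLLAMA_URL + "/api/generate",
#       data=json.dumps(payload).encode(),
#       headers={"Content-Type": "application/json"},
#   )
#   print(urllib.request.urlopen(req).read().decode())
```

Setting `stream` to `False` asks the server to return the full completion in one JSON response instead of a stream of chunks, which keeps a first experiment simple.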
Prerequisites:

  • Basic programming skills
  • Understanding the limitations of LLMs (text generation)

Technologies used:

  • Your favorite IDE: Visual Studio (Windows) or Visual Studio Code (Linux)
  • Programming language: Python or another of your choice.
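
For the coding-assistant configuration mentioned in the topics above, a minimal sketch of pointing aider at a remote Ollama server looks like the following. The host, port, and model name are assumptions to adapt to the Felix setup.

```shell
# Sketch: configure aider to use a model served by a remote Ollama instance.
# Host/port and model name are assumptions -- adapt to the actual setup.
export OLLAMA_API_BASE=http://felix.loria.fr:11434
aider --model ollama/qwen2.5-coder
```
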
Date: April 25, 2025
Place: Inria Nancy
Authors: frederic.beck@inria.fr, theo.biasutto-lervat@inria.fr & laurent.pierron@inria.fr
Source: https://gitlab.inria.fr/tutos-technos/openwebui-ollama