AI-generated illustration: a future in which technology is omnipresent, but people remain at the center (created in cooperation between humans and artificial intelligence).

Developing generative AI assistants with LLM, RAG and cloud services

Online
3 days
German
€ 1,890 plus VAT
€ 2,249.10 incl. VAT
Booking number: 41720
Venue: Online
4 dates
Become a certified Machine Learning Engineer
This course is part of the certified Master Class "Machine Learning Engineer". If you book the entire Master Class, you save over 15 percent compared to booking this individual module.
To the Master Class
In-house training
In-house training just for your employees - exclusive and effective.
Inquiries
Large language models (LLMs) and retrieval-augmented generation (RAG) take chatbots to the next level: as intelligent AI assistants, they answer customer questions, generate content and automate workflows. In this three-day online live training, you will learn how to design such assistants, implement them as prototypes and operate them reliably with cloud services. You will understand embeddings, the Transformer architecture and prompt design, work hands-on with the OpenAI and Anthropic APIs, host open-source LLMs with Ollama, and combine all of this in RAG workflows, agentic AI patterns and multimodal applications. A preconfigured GPU lab environment with Jupyter notebooks lets you try out each step immediately in small exercises. You will leave with an actionable blueprint for scalable, cost-efficient and governance-compliant AI assistants.
Contents

1. Technical basics of generative language models

  • The evolution of chatbots into LLM agents
  • Text embeddings: From text to vectors (see the sketch after this list)
  • Efficient language modeling with the Transformer architecture
  • How modern chat models work
  • Evaluation of language models
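
The "from text to vectors" step can be pictured with a short sketch. It assumes the open-source sentence-transformers library and the all-MiniLM-L6-v2 model purely as examples; the course does not prescribe a specific library or model:

```python
# pip install sentence-transformers
from sentence_transformers import SentenceTransformer, util

# Example model; any sentence-embedding model can be substituted.
model = SentenceTransformer("all-MiniLM-L6-v2")

sentences = [
    "How do I reset my password?",
    "Password reset instructions",
    "Opening hours of the Freiburg office",
]

# encode() returns one dense vector per sentence (here 384 dimensions).
embeddings = model.encode(sentences)
print(embeddings.shape)  # (3, 384)

# Cosine similarity shows that the first two sentences are semantically close,
# while the third one is not.
print(util.cos_sim(embeddings[0], embeddings[1]))
print(util.cos_sim(embeddings[0], embeddings[2]))
```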

2. LLMs via API and cloud integration

  • Use proprietary APIs (OpenAI, Anthropic, Google AI; see the sketch after this list)
  • LLMs in the cloud (Azure OpenAI Service and others)
  • Security, data protection, cost control for API calls
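
As a rough illustration of calling a proprietary model over an API, here is a minimal sketch using the official OpenAI Python SDK (v1.x). The model name is an arbitrary example and the API key is expected in the OPENAI_API_KEY environment variable:

```python
# pip install openai  (SDK v1.x); expects OPENAI_API_KEY in the environment
from openai import OpenAI

client = OpenAI()

response = client.chat.completions.create(
    model="gpt-4o-mini",  # example model name
    messages=[
        {"role": "system", "content": "You are a helpful support assistant."},
        {"role": "user", "content": "How can I change my delivery address?"},
    ],
    temperature=0.2,
)

print(response.choices[0].message.content)
```

The Anthropic and Google AI SDKs follow a broadly similar request/response pattern, and with Azure OpenAI essentially only the endpoint and authentication change.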

3. Using open-source language models

  • Host open-source language models yourself
  • Easy access to open-source LLMs using the example of Ollama (see the sketch after this list)
  • Building your own AI chat application with OpenWebUI and Ollama
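
To give an idea of how a locally hosted model is addressed, here is a minimal sketch against Ollama's REST API. It assumes a running Ollama server on the default port 11434 and an already pulled model (llama3 is only an example):

```python
# Assumes a local Ollama server (default: http://localhost:11434)
# and a pulled model, e.g.:  ollama pull llama3
import requests

response = requests.post(
    "http://localhost:11434/api/chat",
    json={
        "model": "llama3",  # example model name
        "messages": [{"role": "user", "content": "Explain RAG in one sentence."}],
        "stream": False,    # return a single JSON object instead of a token stream
    },
    timeout=120,
)

print(response.json()["message"]["content"])
```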

4. Best practices in LLM development

  • Use LLM frameworks: LangChain, LlamaIndex, Haystack and others
  • Generate structured data with LLMs
  • Structured JSON outputs and data extraction (see the sketch after this list)
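
A sketch of structured data extraction: the model is asked for a JSON object (using the json_object response format of the OpenAI Chat Completions API) and the reply is validated with Pydantic. The Invoice schema and the model name are illustrative only:

```python
# pip install openai pydantic
import json
from openai import OpenAI
from pydantic import BaseModel

class Invoice(BaseModel):
    vendor: str
    total: float
    currency: str

client = OpenAI()

response = client.chat.completions.create(
    model="gpt-4o-mini",  # example model name
    response_format={"type": "json_object"},
    messages=[
        {"role": "system",
         "content": "Extract vendor, total and currency from the text. "
                    "Answer only with a JSON object containing these three keys."},
        {"role": "user", "content": "Invoice from ACME GmbH, total amount 119.00 EUR."},
    ],
)

# Validation fails loudly if the model returns an unexpected structure.
invoice = Invoice.model_validate(json.loads(response.choices[0].message.content))
print(invoice)
```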

5. Customizing LLMs: RAG and fine-tuning

  • Basics: Retrieval Augmented Generation (RAG) in projects
  • Building a simple RAG system with LlamaIndex (see the sketch after this list)
  • When and how fine-tuning is used in relation to RAG
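
A minimal sketch of such a RAG setup with LlamaIndex, assuming the package layout of llama-index 0.10 or later and the default OpenAI backend for embeddings and generation (both can be replaced, for example with local models via Ollama):

```python
# pip install llama-index  (assumes the >= 0.10 package layout and OPENAI_API_KEY set)
from llama_index.core import SimpleDirectoryReader, VectorStoreIndex

# 1. Load documents from a local folder (PDF, Markdown, text, ...).
documents = SimpleDirectoryReader("docs/").load_data()

# 2. Chunk and embed them into an in-memory vector index.
index = VectorStoreIndex.from_documents(documents)

# 3. Retrieve relevant chunks and let the LLM answer on top of them.
query_engine = index.as_query_engine()
answer = query_engine.query("What does our return policy say?")
print(answer)
```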

6. Advanced concepts and practice

  • Introduction to multimodal models (text & vision)
  • Quantized models for limited memory resources
  • Agentic AI: using language models for complex tasks (see the sketch after this list)
  • Practical project: Development of a chatbot (end-to-end)
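
To illustrate the agentic pattern, here is a deliberately simplified tool-calling loop using the OpenAI API: the model may decide to call a local Python function, the result is fed back, and the model then formulates the final answer. The get_weather tool and the model name are hypothetical placeholders:

```python
# pip install openai; expects OPENAI_API_KEY in the environment
import json
from openai import OpenAI

client = OpenAI()

def get_weather(city: str) -> str:
    """Hypothetical local tool; replace with a real data source."""
    return f"Sunny, 22 °C in {city}"

tools = [{
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Get the current weather for a city",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}]

messages = [{"role": "user", "content": "What is the weather like in Freiburg?"}]
response = client.chat.completions.create(model="gpt-4o-mini", messages=messages, tools=tools)
message = response.choices[0].message

if message.tool_calls:  # the model chose to use the tool
    call = message.tool_calls[0]
    result = get_weather(**json.loads(call.function.arguments))
    messages += [message, {"role": "tool", "tool_call_id": call.id, "content": result}]
    final = client.chat.completions.create(model="gpt-4o-mini", messages=messages, tools=tools)
    print(final.choices[0].message.content)
else:
    print(message.content)
```
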
Your benefit

Overview of models and technologies: You will gain an overview and detailed knowledge of the different approaches in the development of AI solutions - with open source or proprietary models, via API or locally hosted, with fine-tuning or RAG, with or without cloud services.

 

Developing practice-ready assistants: After covering the basics and the technology, you will develop an AI assistant yourself in hands-on exercises.

 

Make well-founded technology decisions: You compare public APIs, Azure OpenAI and your own GPU clusters in terms of cost, latency and control.

 

Scale without surprises: Quantization, security measures and local hosting allow you to keep operating costs and risks under control.

 

Support for learning transfer: Cloud lab, source code, Jupyter notebooks and a deployment blueprint ensure transfer to your everyday work.

Trainer
Dr. Christian Staudt
Methods

This training is conducted in a group of a maximum of 12 participants using the Zoom video conferencing software.

 

In the training, you work in a cloud-based lab environment provided by the trainers. All you need is a web browser and an internet connection; no additional software needs to be installed.

 

Interactive Jupyter notebooks serve as learning material and working environment. You get access to source code, documentation, references and links. A powerful server for working with current AI models is provided.

 

The course is held in German; due to the focus on programming, the course materials are mostly in English.

 

There is room for your questions - individual support from the trainers is guaranteed.

 

You can access further materials in your personal learning environment.

Final examination
Recommended for

Developers, ML engineers, data scientists, solution architects and consultants who want to design, implement and operate AI assistants.

 

Basic knowledge of Python is required, as the examples are analyzed and implemented in code. Existing data science knowledge is helpful, but not essential.

Start dates and details

20.10.2025 - Online - Places available - Guaranteed to run
20.1.2026 - Online - Places available - Guaranteed to run
22.4.2026 - Online - Places available - Guaranteed to run
27.7.2026 - Online - Places available - Guaranteed to run
Do you have questions about the training?
Call us on +49 761 595 33900 or write to us at service@haufe-akademie.de or use the contact form.