Building Generative AI Plugins with Spring AI: A Step-by-Step Guide — Part II

Halil Ural
8 min read · Oct 24, 2024

Introduction

In the last article, we prepared our data: we downloaded the dataset and loaded it into the database that our Spring AI application will query. In this article, we'll focus on setting up Ollama and using it in the Spring AI application.

Before moving to the application, I'd like to discuss what Ollama is and what it offers.

What is Ollama?

Ollama is an open-source tool for running large language models (LLMs) on your local machine. I'm not going to cover how to install it here; Sridevi Panneerselvam wrote a good article about that, so please follow her article to install Ollama.

I’ll explain how it works anyway.

Ollama is a command-line application, so you can interact with it directly from the terminal. Open a terminal and type this command:

ollama

Running the command above prints the set of available commands:

Usage:
  ollama [flags]
  ollama [command]

Available Commands:
  serve     Start ollama
  create    Create a model from a Modelfile
  show      Show information for a model
  run       Run a model
  stop      Stop a running model
  pull      Pull a model from a…
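Once a model has been pulled (for example with ollama pull, followed by a model name such as llama3 — that name is only an illustration), a Spring AI application can talk to the local Ollama server. As a minimal sketch, assuming the spring-ai-ollama-spring-boot-starter dependency is on the classpath, the connection can be configured in application.yml:

```yaml
spring:
  ai:
    ollama:
      # Ollama's REST API listens on this address by default
      base-url: http://localhost:11434
      chat:
        options:
          # Any model you have pulled locally; llama3 is just an example
          model: llama3
```

The base-url shown is Ollama's default; change it if you run ollama serve elsewhere. These property names follow the Spring AI Ollama documentation at the time of writing — check them against your Spring AI version, since the project's configuration keys have changed between milestones.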

