Commit 936b388

Remove Ollama from the nav menu
1 parent 1781381 commit 936b388

3 files changed

Lines changed: 13 additions & 12 deletions


documentation/modules/ROOT/nav.adoc

Lines changed: 1 addition & 1 deletion
@@ -28,5 +28,5 @@
 ** xref:19_agents_tools.adoc[Agents/Tools]
 ** xref:20_embed_documents.adoc[Embedding Documents]
 ** xref:21_podman_ai.adoc[Working with Podman Desktop AI]
-** xref:22_local_models.adoc[Working with local models]
+//** xref:22_local_models.adoc[Working with local models]
 //** xref:23_kubernetes_kafka_ai.adoc[Bringing Kubernetes and Kafka to the party]

documentation/modules/ROOT/pages/21_podman_ai.adoc

Lines changed: 9 additions & 9 deletions
@@ -22,7 +22,7 @@ Go ahead and click on it, and in the AI Lab, select the "Catalog"
 
 image::podman-desktop-ai-catalog.png[]
 
-You should now see a list of available AI Models choose from. You can also import different ones (eg. from Huggingface), but we will use one of the InstructLab models that are already available.
+You should now see a list of available AI Models to choose from. You can also import different ones (e.g. from Huggingface), but we will use one of the InstructLab models that are already available.
 
 NOTE: If you haven't heard of https://developers.redhat.com/articles/2024/05/07/instructlab-open-source-generative-ai[Instructlab], it's an open source project for enhancing large language models (LLMs) used in generative artificial intelligence (gen AI) applications. You can even contribute to it yourself!
 
@@ -95,7 +95,7 @@ quarkus.langchain4j.openai.log-responses=true
 
 Let's create an interface for our AI service.
 
-Create a new `AssistantForInstructLab.java` Java interface in `src/main/java` in the `com.redhat.developers` package with the following contents:
+Create a new `Assistant.java` Java interface in `src/main/java` in the `com.redhat.developers` package with the following contents:
 
 [.console-input]
 [source,java]
@@ -109,7 +109,7 @@ import jakarta.enterprise.context.SessionScoped;
 
 @RegisterAiService()
 @SessionScoped
-public interface AssistantForInstructLab {
+public interface Assistant {
 
 @SystemMessage({
 "You are a Java developer who likes to over engineer things" //<1>
@@ -123,7 +123,7 @@ public interface AssistantForInstructLab {
 
 Now we're going to implement another REST resource that accepts prompts
 
-Create a new `InstructLabResource.java` Java class in `src/main/java` in the `com.redhat.developers` package with the following contents:
+Create a new `AIResource.java` Java class in `src/main/java` in the `com.redhat.developers` package with the following contents:
 
 [.console-input]
 [source,java]
@@ -137,11 +137,11 @@ import jakarta.ws.rs.Produces;
 import jakarta.ws.rs.QueryParam;
 import jakarta.ws.rs.core.MediaType;
 
-@Path("/instructlab")
-public class InstructLabResource {
+@Path("/ai")
+public class AIResource {
 
 @Inject
-AssistantForInstructLab assistant;
+Assistant assistant;
 
 @GET
 @Produces(MediaType.TEXT_PLAIN)
@@ -157,14 +157,14 @@ public class InstructLabResource {
 
 Let's ask our model to create a class that returns the square root of a given number:
 
-You can check your prompt implementation by pointing your browser to http://localhost:8080/instructlab[window=_blank]
+You can check your prompt implementation by pointing your browser to http://localhost:8080/ai[window=_blank]
 
 You can also run the following command:
 
 [.console-input]
 [source,bash]
 ----
-curl http://localhost:8080/instructlab
+curl http://localhost:8080/ai
 ----
 
 An example of output (remember, your result will likely be different):
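
For reference, the renamed interface assembled from the hunks above would read roughly as follows after this commit. This is a sketch only: the exact import paths for the LangChain4j annotations and the `chat` method signature are assumptions, since the diff does not show those lines.

```java
package com.redhat.developers;

// Assumed imports: the diff only shows jakarta.enterprise.context.SessionScoped
import io.quarkiverse.langchain4j.RegisterAiService;
import dev.langchain4j.service.SystemMessage;
import dev.langchain4j.service.UserMessage;
import jakarta.enterprise.context.SessionScoped;

@RegisterAiService()
@SessionScoped
public interface Assistant {

    @SystemMessage({
        "You are a Java developer who likes to over engineer things" //<1>
    })
    // Hypothetical method: the actual signature is outside the diff context
    String chat(@UserMessage String question);
}
```

The `AIResource` class then injects this `Assistant` and exposes it at `/ai`, as shown in the `@@ -137,11 +137,11 @@` hunk.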

documentation/modules/ROOT/pages/22_local_models.adoc

Lines changed: 3 additions & 2 deletions
@@ -2,9 +2,10 @@
 
 :project-ollama-name: quarkus-ollama-app
 
-Throughout this tutorial, we've been working with remote models. Let's switch now to a local model.
+Throughout this tutorial, we've been working with remote or containerized models. Let's switch now to a model running natively on our local machine.
 
-There are various options out there, and we'll work with Ollama.
+There are various options out there. In our case we'll work with Ollama, an open-source project that serves as a powerful
+and user-friendly platform for running LLMs on your local machine.
 
 
 == Installing Ollama
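
For readers following the tutorial, a typical Ollama install-and-run flow looks roughly like the commands below. This is a sketch based on Ollama's documented CLI; the install script covers Linux (macOS and Windows use installers from ollama.com), and the model name is only an example.

```
# Install Ollama on Linux (macOS/Windows: download the installer from ollama.com)
curl -fsSL https://ollama.com/install.sh | sh

# Pull a model and run it locally; "llama3" here is an example model name
ollama pull llama3
ollama run llama3 "Write a Java class that returns the square root of a number"
```

Once a model is running, Ollama serves an HTTP API on localhost that the Quarkus application can point at instead of a remote endpoint.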
