
Commit fc11e8a

remove custom prompt information from overview docs page
1 parent 7e070d8 commit fc11e8a

2 files changed

Lines changed: 1 addition & 90 deletions

src/pages/docs/administration/octopus-ai-assistant/custom-prompts.md

Lines changed: 1 addition & 1 deletion
@@ -36,7 +36,7 @@ To add custom prompts to your Octopus AI Assistant:
 1. Open the Octopus Deploy web portal
 2. On the main page for the space, click **Variable Sets**
 3. Click **Add Variable Set**
-4. Enter `Octopus AI Assistant Prompts` for the variable set name
+4. Enter `OctoAI Prompts` for the variable set name
 5. Add variables in the new variable set using the naming convention below
 
 ### Variable naming convention

src/pages/docs/administration/octopus-ai-assistant/index.md

Lines changed: 0 additions & 89 deletions
Original file line numberDiff line numberDiff line change
@@ -59,95 +59,6 @@ On-premises Octopus instances must allow HTTP requests from the IP address `51.8
 It is not possible to integrate Octopus AI Assistant with an on-premises Octopus instance that cannot accept HTTP requests from this public IP address.
 :::
 
-## Adding custom prompts
-
-Octopus AI Assistant will present custom prompts defined in a Library Variable Set called `Octopus AI Assistant Prompts`. The Library Variable Set contains variables named:
-
-- `PageName[#].Prompt` - The prompt displayed in the UI and passed to the LLM
-- `PageName[#].SystemPrompt` - Additional prompt instructions passed to the LLM but not shown in the UI
-
-Where:
-
-- `PageName` is one of the pages listed in the table below
-- `#` is a number from 0 to 4 inclusive, for up to 5 prompts per page
-
-For example:
-
-- `Project.Deployment[0].Prompt` - A prompt displayed when a project deployment is viewed
-- `Project.Deployment[0].SystemPrompt` - The system prompt passed to the LLM when the project deployment is viewed
-
-| Page Name | Description |
-|-----------------------------------------|-----------------------------------------------------|
-| `Dashboard` | The main dashboard |
-| `Tasks` | The tasks overview |
-| `Project` | The project dashboard |
-| `Project.Settings` | The project settings |
-| `Project.VersionControl` | The project version control settings |
-| `Project.ITSMProviders` | The project ITSM settings |
-| `Project.Channels` | The project channels |
-| `Project.Triggers` | The project triggers |
-| `Project.Process` | The project deployment process editor |
-| `Project.Step` | An individual step in the deployment process editor |
-| `Project.Variables` | The project variables editor |
-| `Project.AllVariables` | The overview of all the project variables |
-| `Project.PreviewVariables` | The preview of all the project variables |
-| `Project.VariableSets` | The project library variable sets |
-| `Project.TenantVariables` | The project tenant variables |
-| `Project.Operations` | The project runbooks dashboard |
-| `Project.Operations.Triggers` | The runbook triggers |
-| `Project.Deployment` | The project deployments |
-| `Project.Release` | The project releases |
-| `Project.Runbooks` | The project runbooks |
-| `Project.Runbook.Runbook` | An individual runbook |
-| `Project.Runbook.Run` | A runbook run |
-| `LibraryVariableSets` | The library variable sets |
-| `LibraryVariableSet.LibraryVariableSet` | An individual library variable set |
-| `Machines` | The targets dashboard |
-| `Machine.Machine` | An individual target |
-| `Accounts` | The accounts dashboard |
-| `Account.Account` | An individual account |
-| `Workers` | The workers dashboard |
-| `WorkerPools` | The worker pools dashboard |
-| `MachinePolicies` | The machine policies dashboard |
-| `MachineProxies` | The machine proxies dashboard |
-| `Feeds` | The feeds dashboard |
-| `GitCredentials` | The Git credentials dashboard |
-| `GitConnections` | The GitHub App dashboard |
-| `Lifecycles` | The lifecycles dashboard |
-| `Packages` | The built-in feed dashboard |
-| `ScriptModules` | The script modules dashboard |
-| `StepTemplates` | The step templates dashboard |
-| `TagSets` | The tag sets dashboard |
-| `TagSets.TagSet` | An individual tag set |
-| `Tenants` | The tenants dashboard |
-| `Tenant.Tenant` | An individual tenant |
-| `Certificates` | The certificates dashboard |
-| `Environments` | The environments dashboard |
-| `Environment.Environment` | An individual environment |
-| `Infrastructure` | The infrastructure dashboard |
-| `BuildInformation` | The build information dashboard |
-
-## Writing custom prompts
-
-To write a custom prompt, you need to define the prompt variable, which is in the format `PageName[#].Prompt`. The prompt variable represents what an Octopus user might write themselves when interacting with Octopus AI Assistant.
-
-You can optionally define the system prompt variable, which is in the format `PageName[#].SystemPrompt`. The system prompt variable is used to provide additional context to the LLM, usually to capture unique business knowledge. The system prompt is not shown to the user.
-
-For example, the prompt variable `Project.Deployment[0].Prompt`, which is displayed when a project deployment is viewed, might be:
-
-> Why did the deployment fail? If the deployment didn't fail, say so. Provide suggestions for resolving the issue.
-
-On its own, this prompt variable relies on the knowledge built into the LLM to provide an answer based on the context. The context for a project deployment is:
-
-- The deployment logs
-- The deployment process
-
-To improve the response, you can add a system prompt variable `Project.Deployment[0].SystemPrompt`:
-
-> If the logs indicate that a Docker image is missing, you must only provide the suggestion that the user must visit <https://help/missingdocker> to get additional instructions to resolve missing Docker containers. You will be penalized for offering generic suggestions to resolve a missing Docker image. You will be penalized for offering script suggestions to resolve a missing Docker image. You will be penalized for suggesting step retries to resolve a missing Docker image.
-
-The system prompt allows you to embed business knowledge that guides the LLM to a more accurate response. In this example, we instructed the LLM to determine whether the deployment logs indicate that a Docker image is missing and, if so, to provide a custom link to internal documentation. We also instructed the LLM not to offer generic suggestions, script suggestions, or step retries to resolve a missing Docker image.
 ## FAQ
 
 Q: What data is collected?
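The variable naming convention in the removed docs (`PageName[#].Prompt` and `PageName[#].SystemPrompt`, with `#` from 0 to 4 for up to 5 prompts per page) can be sketched as a small validator. This is an illustrative sketch, not part of Octopus Deploy; the function name is hypothetical.

```python
import re

# Convention from the removed docs: <PageName>[<n>].Prompt or
# <PageName>[<n>].SystemPrompt, where <n> is 0-4 (up to 5 prompts per page).
# Page names are dotted identifiers such as Project.Deployment (see the table).
_PROMPT_VAR = re.compile(
    r"^(?P<page>[A-Za-z]+(?:\.[A-Za-z]+)*)"  # dotted page name
    r"\[(?P<index>[0-4])\]"                  # prompt slot 0-4
    r"\.(?P<kind>SystemPrompt|Prompt)$"      # visible prompt or hidden system prompt
)

def parse_prompt_variable(name: str):
    """Return (page, index, kind) for a well-formed prompt variable name, else None."""
    match = _PROMPT_VAR.match(name)
    if match is None:
        return None
    return match.group("page"), int(match.group("index")), match.group("kind")
```

For example, `parse_prompt_variable("Project.Deployment[0].Prompt")` yields `("Project.Deployment", 0, "Prompt")`, while an out-of-range index such as `[5]` is rejected.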
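The removed "Writing custom prompts" guidance describes two inputs: a user-visible prompt and an optional hidden system prompt, both passed to the LLM alongside page context (for a deployment, the logs and the process). A minimal sketch of how such inputs might be combined into a chat-style request; the function name and message shape are assumptions, not Octopus AI Assistant's actual payload.

```python
def build_llm_messages(prompt, system_prompt, context):
    """Combine the user-visible prompt, the optional hidden system prompt, and
    the page context (e.g. deployment logs and process) into a chat-style
    message list. Illustrative only: the real request format is undocumented."""
    system_parts = [context]
    if system_prompt:  # SystemPrompt variables are optional
        system_parts.append(system_prompt)
    return [
        {"role": "system", "content": "\n\n".join(system_parts)},
        {"role": "user", "content": prompt},
    ]
```

The design mirrors the docs: the system prompt and context shape the answer without ever being shown to the user, while the prompt is exactly what the user sees in the UI.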

0 commit comments
