Connecting an AI tool to Ongoing WMS
Introduction
You can connect AI tools such as ChatGPT and Gemini to Ongoing WMS. Once connected, the AI tool can ask Ongoing WMS for information.
On a technical level, the protocol used by the AI tool to communicate with Ongoing WMS is the Model Context Protocol (MCP). MCP is an open-source standard for connecting AI applications to external systems. It functions as a standardized "gateway" between the AI tool and the external system.
Currently, Ongoing WMS allows AI tools to ask for article data on a per-goods-owner basis. For instance, the AI tool can query for the stock balance of an article belonging to a particular goods owner.
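To make such a query concrete: MCP clients and servers exchange JSON-RPC 2.0 messages, and a tool invocation uses the "tools/call" method. The sketch below shows the general shape of such a message. The tool name and arguments are hypothetical placeholders for illustration; the actual tools and their parameters are advertised by the Ongoing WMS server itself.

```python
# Hedged sketch of the JSON-RPC 2.0 message an AI tool sends over MCP
# when invoking a tool. The tool name "GetArticleStockBalance" and the
# "articleNumber" argument are made up for illustration only.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "GetArticleStockBalance",        # hypothetical tool name
        "arguments": {"articleNumber": "ART-001"},  # hypothetical argument
    },
}
```

The AI tool builds and sends messages like this automatically; you never write them by hand. They are shown here only to illustrate what travels between the AI tool and Ongoing WMS.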
Preparing the MCP connection in Ongoing WMS
When adding an MCP connection to an AI tool, typically three things have to be specified:
- Type of transport. This should be "Streamable HTTP".
- The API URL.
- The Authorization header.
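As a rough illustration, these three settings often end up together in one configuration entry along the following lines. The exact field names vary between AI tools, and the server name, URL, and header value below are placeholders, not actual Ongoing WMS values:

```python
# Hedged sketch of how the three settings might look in an AI tool's
# MCP configuration. Field names differ between tools; all values here
# are placeholders.
mcp_server = {
    "ongoing-wms": {                       # placeholder server name
        "transport": "streamable-http",    # type of transport
        "url": "https://example.com/mcp",  # the API URL from Ongoing WMS
        "headers": {
            "Authorization": "<value from Ongoing WMS>",  # keep secret
        },
    },
}
```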
To find out the API URL and Authorization header, follow these steps:
- As administrator, go to Administration ⇒ API for goods owners.
- Click on the goods owner that you want the AI client to be able to access.
- Click "Add new user for AI tool integrations".
- Follow the instructions, and at the end you will be given the API URL and Authorization header.
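Once you have the API URL and Authorization header, you can sanity-check them outside the AI tool if you wish. The sketch below builds an MCP "initialize" request using only Python's standard library, under the assumption that the endpoint accepts JSON-RPC 2.0 over Streamable HTTP; the URL and header values are placeholders, and actually sending the request is left as a final step:

```python
# Hedged sketch: build (but do not yet send) an MCP "initialize" request.
# Replace API_URL and AUTH_HEADER with the values from the setup steps.
import json
import urllib.request

API_URL = "https://example.com/mcp"  # placeholder for the real API URL
AUTH_HEADER = "<value from Ongoing WMS>"  # placeholder; treat as a secret

payload = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "initialize",
    "params": {
        "protocolVersion": "2025-03-26",
        "capabilities": {},
        "clientInfo": {"name": "connection-test", "version": "0.1"},
    },
}

req = urllib.request.Request(
    API_URL,
    data=json.dumps(payload).encode(),
    headers={
        "Content-Type": "application/json",
        "Accept": "application/json, text/event-stream",
        "Authorization": AUTH_HEADER,
    },
)
# To send: urllib.request.urlopen(req) — a successful response indicates
# the URL and Authorization header are correct.
```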
Adding the connection in the AI tool
Exactly how the connection is added varies between AI tools. Here is the documentation for adding an MCP connection in some popular AI tools:
General pointers
Ongoing WMS' MCP connection does not use OAuth. Instead, we use an Authorization header, as discussed above. Treat the Authorization header the same way you would treat a password: store it securely. Most AI tools provide secure storage for API keys and passwords; save the Authorization header there rather than in plain-text configuration.
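If you script against the endpoint yourself, the same principle applies: keep the header out of your source code, for instance by reading it from an environment variable. The variable name below is illustrative, not something Ongoing WMS prescribes:

```python
# Hedged sketch: read the Authorization header from an environment
# variable instead of hardcoding it. "ONGOING_WMS_AUTH_HEADER" is an
# illustrative name chosen for this example.
import os

auth_header = os.environ.get("ONGOING_WMS_AUTH_HEADER", "")
if not auth_header:
    print("Set ONGOING_WMS_AUTH_HEADER before connecting.")

headers = {"Authorization": auth_header}
```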