Spring AI ResOs
by pacphi
This project provides a Spring AI-enhanced restaurant booking system built with an API-first approach. It includes a code-generated client for the ResOs API, a Spring AI implementation, an MCP server and client, and a ReactJS chatbot UI.
What is Spring AI ResOs?
Spring AI ResOs is a multi-module project that demonstrates how to integrate Spring AI with a restaurant booking system (ResOs). It provides a chatbot interface for users to search for restaurants and make reservations using natural language.
How to use Spring AI ResOs?
To use Spring AI ResOs, you need to clone the repository, build the project using Maven, and configure the necessary API keys for ResOs and an LLM provider (e.g., Groq Cloud, OpenRouter, or OpenAI). You can then run the backend server, MCP server, and chatbot UI separately, or integrate with Claude Desktop using the MCP client configuration.
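The typical workflow looks roughly like the sketch below. The repository URL, Maven wrapper, and module names (backend, mcp-server) are assumptions for illustration; consult the project README for the exact commands and module layout.

```bash
# Clone and build all modules (repository URL assumed from the project and author names)
git clone https://github.com/pacphi/spring-ai-resos.git
cd spring-ai-resos
./mvnw clean install

# Run the backend API server, then (in a separate terminal) the MCP server,
# activating the Spring profile for your chosen LLM provider
./mvnw spring-boot:run -pl backend
./mvnw spring-boot:run -pl mcp-server -Dspring-boot.run.profiles=groq-cloud
```

The chatbot UI is started separately and points at the running backend.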
Key features of Spring AI ResOs
Spring AI integration
API-first design
Code-generated client
MCP server and client
ReactJS chatbot UI
Support for multiple LLM providers (Groq Cloud, OpenRouter, OpenAI)
Use cases of Spring AI ResOs
Restaurant booking via chatbot
Integrating AI into existing restaurant management systems
Demonstrating Spring AI capabilities
Building conversational interfaces for APIs
Prototyping AI-powered booking systems
FAQ from Spring AI ResOs
What is ResOs?
ResOs is a restaurant operating system API for managing reservations and restaurant information.
What is MCP?
MCP stands for Model Context Protocol. In this context, it refers to the server and client pair that expose the restaurant booking tools to MCP hosts such as Claude Desktop.
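Claude Desktop discovers MCP servers through its claude_desktop_config.json file. An entry along the following lines would launch this project's MCP server so Claude can call its booking tools; the server name, command, and jar path are placeholders, and the exact launch command for this project may differ.

```json
{
  "mcpServers": {
    "spring-ai-resos": {
      "command": "java",
      "args": ["-jar", "/path/to/the/mcp-server.jar"]
    }
  }
}
```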
Do I need a ResOs API key?
You only need a ResOs API key if you intend to register as a restaurateur and interact directly with the ResOs v1.2 API. Otherwise, you can use the provided backend server.
Which LLM providers are supported?
The project supports Groq Cloud, OpenRouter, and OpenAI. You will need to obtain an API key for whichever provider you choose.
How do I configure the chatbot to use a specific LLM provider?
You can configure the chatbot by activating the corresponding Spring profile (e.g., openai, groq-cloud, or openrouter) and providing the necessary API keys in the creds.yml file.
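As a rough illustration, a creds.yml for the openai profile might supply the standard Spring AI key property shown below. The exact property names and file layout this project expects are assumptions here, so defer to the repository documentation; the groq-cloud and openrouter profiles typically also override the OpenAI-compatible base URL and model.

```yaml
# creds.yml -- illustrative sketch only; the keys this project actually reads may differ
spring:
  ai:
    openai:
      api-key: ${OPENAI_API_KEY}
```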