A Spring Boot application demonstrating a multi-agent system architecture using Spring AI and Ollama for intelligent travel, shopping, and dining assistance.
- Travel Agent: Flight bookings, hotel recommendations, itinerary planning
- Shopping Agent: Product recommendations and shopping advice
- Restaurant Agent: Dining recommendations and reservation assistance
- Smart Routing: AI-powered query analysis to route requests to appropriate agents
- Multi-Agent Coordination: Combine responses from multiple agents for complex queries
- Sequential Processing: Process queries through agents in optimal order
- Parallel Processing: Get responses from all agents simultaneously (see the sketch below)
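The parallel mode works by fanning the same query out to every agent at once and merging the answers. A rough sketch of that idea in plain Java is shown below; the `AgentService` interface and `askAllAgents` method are illustrative, not the classes used in this repository.

```java
import java.util.List;
import java.util.concurrent.CompletableFuture;
import java.util.stream.Collectors;

// Hypothetical sketch of the "parallel processing" mode: send the same query
// to every agent concurrently and join the answers. Not the project's actual code.
public class ParallelFanOutSketch {

    // Assumed minimal agent contract, for illustration only.
    interface AgentService {
        String name();
        String ask(String query);
    }

    static String askAllAgents(List<AgentService> agents, String query) {
        List<CompletableFuture<String>> futures = agents.stream()
                .map(agent -> CompletableFuture.supplyAsync(
                        () -> agent.name() + ": " + agent.ask(query)))
                .collect(Collectors.toList());

        // Wait for every agent to finish, then concatenate the individual answers.
        return futures.stream()
                .map(CompletableFuture::join)
                .collect(Collectors.joining("\n\n"));
    }
}
```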
The application follows a microservices-inspired architecture with:
- Agent Handler: REST API endpoints for agent interactions
- Orchestrator Service: Intelligent query routing and multi-agent coordination
- Individual Agent Services: Specialized services for travel, shopping, and restaurant domains
- Spring AI Integration: Leverages Ollama for LLM capabilities (see the sketch after this list)
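As a concrete illustration of the last two points, an individual agent service built on the Spring AI `ChatClient` might look roughly like the sketch below (assuming the Spring AI 1.x fluent API; the class name and prompt text are illustrative, not the actual code in this repository).

```java
import org.springframework.ai.chat.client.ChatClient;
import org.springframework.stereotype.Service;

// Illustrative sketch of a domain agent: a system prompt scopes the LLM to the
// travel domain, and each user query is sent through the Spring AI ChatClient.
@Service
public class TravelAgentSketch {

    private final ChatClient chatClient;

    // Spring AI auto-configures a ChatClient.Builder when the Ollama chat
    // starter is on the classpath.
    public TravelAgentSketch(ChatClient.Builder builder) {
        this.chatClient = builder
                .defaultSystem("You are a travel agent. Help with flights, hotels, and itineraries.")
                .build();
    }

    public String handle(String query) {
        return chatClient.prompt()
                .user(query)
                .call()
                .content();
    }
}
```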
- Java 17+
- Maven 3.6+
- Ollama installed and running
- Llama3 model pulled in Ollama
```bash
# Install Ollama (macOS)
brew install ollama

# Start Ollama service
ollama serve

# Pull the Llama3 model
ollama pull llama3
```

```bash
# Clone the repository
git clone <repository-url>
cd demo-travel-agents

# Run the application
./mvnw spring-boot:run
```

The application will start on http://localhost:8080.
```
POST /api/agents/travel
Content-Type: application/json

"Plan a 3-day trip to Paris"
```

```
POST /api/agents/shopping
Content-Type: application/json

"Find the best laptop under $1000"
```

```
POST /api/agents/restaurant
Content-Type: application/json

"Recommend Italian restaurants in downtown"
```

```
POST /api/agents/smart-orchestrate
Content-Type: application/json

"I need to plan a trip to Tokyo and buy travel gear"
```

```
POST /api/agents/sequential
Content-Type: application/json

"Plan a business trip with shopping and dining"
```

```
POST /api/agents/all
Content-Type: application/json

"Help me with vacation planning"
```

```
GET /api/agents/health
```

```
GET /api/agents
```
```properties
# Server configuration
server.port=8080

# Ollama configuration
spring.ai.ollama.base-url=http://localhost:11434
spring.ai.ollama.chat.model=llama3
spring.ai.ollama.chat.options.temperature=0.7

# Actuator endpoints
management.endpoints.web.exposure.include=health,info,metrics
```
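These properties drive Spring AI's Ollama auto-configuration. If shared defaults (such as a common system prompt) are preferred in code, the `AppConfig` class could expose a `ChatClient` bean along the lines sketched below; this assumes the Spring AI 1.x `ChatClient.Builder` and is not necessarily what `AppConfig.java` contains.

```java
import org.springframework.ai.chat.client.ChatClient;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

// Illustrative configuration: expose a ChatClient built from the
// auto-configured builder (backed by the Ollama model defined in
// application.properties) with a shared default system prompt.
@Configuration
public class AppConfigSketch {

    @Bean
    ChatClient chatClient(ChatClient.Builder builder) {
        return builder
                .defaultSystem("You are a helpful assistant for travel, shopping, and dining questions.")
                .build();
    }
}
```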
Once the application is running, access the interactive API documentation at:
- Swagger UI: http://localhost:8080/swagger-ui.html
- OpenAPI Spec: http://localhost:8080/v3/api-docs
```bash
curl -X POST http://localhost:8080/api/agents/travel \
  -H "Content-Type: application/json" \
  -d "\"Book a flight from New York to London\""
```

```bash
curl -X POST http://localhost:8080/api/agents/smart-orchestrate \
  -H "Content-Type: application/json" \
  -d "\"I'm planning a business trip to San Francisco. I need flight recommendations, hotel suggestions, good restaurants for client dinners, and shopping for professional attire.\""
```

A typical response looks like:

```json
{
  "agentType": "Travel Agent",
  "response": "I'd be happy to help you book a flight from New York to London...",
  "modelUsed": "Travel LLM",
  "metadata": null
}
```

- Framework: Spring Boot 3.5.5
- AI Integration: Spring AI with Ollama
- LLM: Llama3 (via Ollama)
- Build Tool: Maven
- Java Version: 17
- Documentation: SpringDoc OpenAPI
- Utilities: Lombok for boilerplate reduction
```
src/main/java/com/samshodan/agents/
├── TravelAgentApp.java                  # Main application class
├── config/
│   └── AppConfig.java                   # Application configuration
├── handler/
│   └── AgentHandler.java                # REST API endpoints
├── model/
│   └── AgentResponse.java               # Response model
└── service/
    ├── AgentService.java                # Base service class
    └── impl/
        ├── OrchestratorService.java     # Multi-agent orchestration
        ├── TravelAgentService.java      # Travel domain logic
        ├── ShoppingAgentService.java    # Shopping domain logic
        └── RestaurantAgentService.java  # Restaurant domain logic
```
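Judging from the sample response earlier and the use of Lombok, `AgentResponse` is a simple data holder. A minimal sketch is shown below; the field names come from the example JSON, everything else is illustrative.

```java
import java.util.Map;

import lombok.AllArgsConstructor;
import lombok.Data;
import lombok.NoArgsConstructor;

// Sketch of the response model: field names mirror the sample JSON
// (agentType, response, modelUsed, metadata); Lombok generates the
// getters, setters, and constructors.
@Data
@NoArgsConstructor
@AllArgsConstructor
public class AgentResponse {
    private String agentType;
    private String response;
    private String modelUsed;
    private Map<String, Object> metadata;
}
```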
```bash
# Compile and package
./mvnw clean package

# Run tests
./mvnw test

# Run with specific profile
./mvnw spring-boot:run -Dspring-boot.run.profiles=dev
```
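For `./mvnw test`, even a minimal smoke test is useful because it verifies that the Spring context (including the Spring AI wiring) starts cleanly. The sketch below is illustrative and not necessarily the test that ships with this repository.

```java
import org.junit.jupiter.api.Test;
import org.springframework.boot.test.context.SpringBootTest;

// Minimal smoke test: if the application context fails to start
// (missing beans, bad configuration), this test fails.
@SpringBootTest
class TravelAgentAppSmokeTest {

    @Test
    void contextLoads() {
        // Intentionally empty: the assertion is that startup succeeds.
    }
}
```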
The application includes Spring Boot Actuator for monitoring:
- Health: http://localhost:8080/actuator/health
- Metrics: http://localhost:8080/actuator/metrics
- Info: http://localhost:8080/actuator/info
- Fork the repository
- Create a feature branch (`git checkout -b feature/amazing-feature`)
- Commit your changes (`git commit -m 'Add amazing feature'`)
- Push to the branch (`git push origin feature/amazing-feature`)
- Open a Pull Request
This project is licensed under the MIT License - see the LICENSE file for details.
- Spring AI for LLM integration
- Ollama for local LLM hosting
- Spring Boot for the application framework
Note: This is a demonstration project showcasing multi-agent system architecture patterns. For production use, consider implementing proper error handling, authentication, rate limiting, and monitoring.