This example demonstrates how to use function calling with a language model served by Ollama, via the langchaingo library. It implements a simple weather information retrieval flow.
What it does
Sets up an Ollama language model client with JSON output format (see the client sketch after this list).
Defines the set of tools (functions) the model can call (a schema sketch follows this list):
getCurrentWeather: Retrieves weather information for a given location.
finalResponse: Provides the final response to the user query.
Sends a user query about the weather in Beijing.
Parses each model response, which may contain a function call rather than plain text.
Dispatches each function call to the matching local logic and feeds the result back to the model.
Continues the conversation until a final response is produced or the maximum number of retries is reached (a loop sketch follows this list).
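A minimal sketch of the client setup, assuming the github.com/tmc/langchaingo/llms/ollama package (the library this example is built on):

```go
package main

import (
	"log"

	"github.com/tmc/langchaingo/llms/ollama"
)

// newClient builds an Ollama-backed LLM that emits JSON. Constraining
// the output format is what makes the model's function-call payloads
// reliably parseable.
func newClient(model string) *ollama.LLM {
	llm, err := ollama.New(
		ollama.WithModel(model),
		ollama.WithFormat("json"),
	)
	if err != nil {
		log.Fatal(err)
	}
	return llm
}
```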
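The two tools can be described with JSON-Schema-style parameter maps. This sketch uses langchaingo's llms.Tool type; the descriptions and parameter shapes are illustrative, not copied from the example:

```go
package main

import "github.com/tmc/langchaingo/llms"

// Tool schemas for the two functions the model may call.
var tools = []llms.Tool{
	{
		Type: "function",
		Function: &llms.FunctionDefinition{
			Name:        "getCurrentWeather",
			Description: "Get the current weather for a location",
			Parameters: map[string]any{
				"type": "object",
				"properties": map[string]any{
					"location": map[string]any{
						"type":        "string",
						"description": "The city, e.g. Beijing",
					},
				},
				"required": []string{"location"},
			},
		},
	},
	{
		Type: "function",
		Function: &llms.FunctionDefinition{
			Name:        "finalResponse",
			Description: "Deliver the final answer to the user",
			Parameters: map[string]any{
				"type": "object",
				"properties": map[string]any{
					"response": map[string]any{"type": "string"},
				},
				"required": []string{"response"},
			},
		},
	},
}
```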
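And a sketch of the dispatch loop. The reply shape ({"tool": ..., "tool_input": ...}), the canned weather result, the helper name converse, and the retry budget of 3 are all assumptions for illustration; see ollama_functions_example.go for the real logic:

```go
package main

import (
	"context"
	"encoding/json"
	"errors"

	"github.com/tmc/langchaingo/llms"
	"github.com/tmc/langchaingo/llms/ollama"
)

// call mirrors an assumed JSON reply shape: {"tool": ..., "tool_input": ...}.
type call struct {
	Tool  string          `json:"tool"`
	Input json.RawMessage `json:"tool_input"`
}

func converse(ctx context.Context, llm *ollama.LLM, query string) (string, error) {
	msgs := []llms.MessageContent{
		llms.TextParts(llms.ChatMessageTypeHuman, query),
	}
	const maxRetries = 3
	for i := 0; i < maxRetries; i++ {
		resp, err := llm.GenerateContent(ctx, msgs)
		if err != nil {
			return "", err
		}
		content := resp.Choices[0].Content
		// Keep the model's turn in the transcript.
		msgs = append(msgs, llms.TextParts(llms.ChatMessageTypeAI, content))

		var c call
		if err := json.Unmarshal([]byte(content), &c); err != nil {
			continue // malformed JSON counts against the retry budget
		}
		switch c.Tool {
		case "getCurrentWeather":
			// Dispatch to local logic; a canned result stands in here.
			msgs = append(msgs, llms.TextParts(llms.ChatMessageTypeTool,
				`{"temperature": 22, "unit": "C"}`))
		case "finalResponse":
			var fin struct {
				Response string `json:"response"`
			}
			if err := json.Unmarshal(c.Input, &fin); err != nil {
				continue
			}
			return fin.Response, nil
		}
	}
	return "", errors.New("no final response within retry budget")
}
```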
Key Features
Function Calling: Demonstrates how to define and use custom functions with Ollama.
Conversation Flow: Manages a multi-turn conversation between the user and the model.
Error Handling: Includes retry logic and validation of function calls.
Customization: Allows specifying a custom Ollama model via the OLLAMA_TEST_MODEL environment variable (see the snippet below).
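A small sketch of that override; the fallback model name is a placeholder, not necessarily the example's default:

```go
package main

import "os"

// modelName honors the OLLAMA_TEST_MODEL override described above.
// The default ("llama3") is an assumption for illustration.
func modelName() string {
	if v := os.Getenv("OLLAMA_TEST_MODEL"); v != "" {
		return v
	}
	return "llama3"
}
```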
How to Run
Ensure you have Ollama set up and running on your system.
Run the example with: go run ollama_functions_example.go
Use the -v flag for verbose output: go run ollama_functions_example.go -v
Note
This example is a good starting point for implementing function calling with Ollama and for managing multi-turn, tool-using conversations with AI models. It can be extended with additional tools and with queries beyond weather information.