# OpenAI Function Call Example
Welcome to this cheerful example of using OpenAI's function calling capabilities with the LangChain Go library!
## What does this example do?
This example demonstrates how to use OpenAI's GPT-3.5-turbo model to generate responses and make function calls based on user input. It's like having a smart assistant that can not only answer questions but also fetch real-time information for you!
Here's a breakdown of what happens in this exciting journey (a code sketch follows the list):
- We set up an OpenAI language model using the LangChain Go library.
- We ask the model about the weather in Boston and Chicago.
- The model recognizes that it needs to fetch weather information and makes a function call to `getCurrentWeather`.
- We simulate getting the weather data (it's always sunny in this example!).
- We provide the weather information back to the model.
- Finally, we ask the model to compare the weather in both cities.
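To make these steps concrete, here is a minimal sketch of the setup and the initial query using langchaingo's `llms` and `llms/openai` packages. This is not the example's exact source: the tool-definition types and options shown here (`llms.Tool`, `llms.FunctionDefinition`, `openai.WithModel`, `llms.WithTools`) are assumed from recent versions of the library and may differ slightly in older releases, and `OPENAI_API_KEY` is expected to be set in the environment.

```go
package main

import (
	"context"
	"fmt"
	"log"

	"github.com/tmc/langchaingo/llms"
	"github.com/tmc/langchaingo/llms/openai"
)

func main() {
	ctx := context.Background()

	// Create the OpenAI-backed model; the API key is read from OPENAI_API_KEY.
	llm, err := openai.New(openai.WithModel("gpt-3.5-turbo"))
	if err != nil {
		log.Fatal(err)
	}

	// Describe getCurrentWeather so the model knows it may "call" it.
	tools := []llms.Tool{{
		Type: "function",
		Function: &llms.FunctionDefinition{
			Name:        "getCurrentWeather",
			Description: "Get the current weather in a given location",
			Parameters: map[string]any{
				"type": "object",
				"properties": map[string]any{
					"location": map[string]any{
						"type":        "string",
						"description": "The city, e.g. Boston",
					},
				},
				"required": []string{"location"},
			},
		},
	}}

	// Step 1: ask about the weather. The model has no weather data of its own,
	// so it should respond with tool calls rather than a plain-text answer.
	history := []llms.MessageContent{
		llms.TextParts(llms.ChatMessageTypeHuman,
			"What is the weather like in Boston and in Chicago?"),
	}
	resp, err := llm.GenerateContent(ctx, history, llms.WithTools(tools))
	if err != nil {
		log.Fatal(err)
	}
	for _, tc := range resp.Choices[0].ToolCalls {
		fmt.Printf("model requested %s(%s)\n", tc.FunctionCall.Name, tc.FunctionCall.Arguments)
	}
}
```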
## Key Features
- Uses OpenAI's GPT-3.5-turbo model
- Demonstrates function calling capabilities
- Simulates weather data retrieval
- Shows how to manage conversation context and message history
## How it Works
1. **Initial Query**: We ask about the weather in Boston and Chicago.
2. **Function Recognition**: The model recognizes it needs to call the `getCurrentWeather` function.
3. **Data Retrieval**: We simulate fetching weather data for both cities.
4. **Context Update**: We update the conversation context with the weather information.
5. **Comparison**: We ask the model to compare the weather, and it provides a human-like response (steps 3 to 5 are sketched in code after this list).
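Steps 3 through 5 might look roughly like the continuation below (inside the same `main` function as the sketch above, with `strings` added to the imports). The `ToolCall` and `ToolCallResponse` content parts are again assumptions based on recent langchaingo versions, and the substring-based city lookup is a deliberately naive stand-in for properly unmarshalling the JSON arguments.

```go
// Step 3: simulate the weather service with hard-coded data.
simulated := map[string]string{
	"Boston":  "72 and sunny",
	"Chicago": "65 and windy",
}

// Step 4: record the assistant's tool calls in the history, then answer
// each one with a tool response so the model can see the "results".
assistantMsg := llms.MessageContent{Role: llms.ChatMessageTypeAI}
for _, tc := range resp.Choices[0].ToolCalls {
	assistantMsg.Parts = append(assistantMsg.Parts, tc)
}
history = append(history, assistantMsg)

for _, tc := range resp.Choices[0].ToolCalls {
	weather := "no data"
	for city, report := range simulated {
		// Naive lookup: a real implementation would json.Unmarshal the arguments.
		if strings.Contains(tc.FunctionCall.Arguments, city) {
			weather = report
		}
	}
	history = append(history, llms.MessageContent{
		Role: llms.ChatMessageTypeTool,
		Parts: []llms.ContentPart{
			llms.ToolCallResponse{
				ToolCallID: tc.ID,
				Name:       tc.FunctionCall.Name,
				Content:    weather,
			},
		},
	})
}

// Step 5: with the weather now in the conversation, ask for the comparison.
history = append(history, llms.TextParts(llms.ChatMessageTypeHuman,
	"How does the weather in the two cities compare?"))
final, err := llm.GenerateContent(ctx, history, llms.WithTools(tools))
if err != nil {
	log.Fatal(err)
}
fmt.Println(final.Choices[0].Content)
```

Because the tool responses are now part of the message history, the final `GenerateContent` call can answer the comparison question in plain text instead of requesting another function call.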
## Fun Fact
In this example, Boston is always 72 and sunny, while Chicago is 65 and windy. Looks like Boston is winning the weather game today!
So, grab your virtual sunglasses and enjoy exploring this example of AI-powered weather inquiries!