groq

package v0.27.2-beta

Published: Sep 12, 2024 License: MIT Imports: 8
README

---
title: "Groq"
lang: "en-US"
draft: false
description: "Learn how to set up a VDP Groq component https://github.com/instill-ai/instill-core"
---

The Groq component is an AI component that allows users to connect to the AI models served on GroqCloud.
It can carry out the following tasks:

- [Text Generation Chat](#text-generation-chat)



## Release Stage

`Alpha`



## Configuration

The component configuration is defined and maintained [here](https://github.com/instill-ai/component/blob/main/ai/groq/v0/config/definition.json).




## Setup




In order to communicate with Groq, the following connection details need to be
provided. You may specify them directly in a pipeline recipe as key-value pairs
within the component's `setup` block, or you can create a **Connection** from
the [**Integration Settings**](https://www.instill.tech/docs/vdp/integration)
page and reference the whole `setup` as `setup:
${connection.<my-connection-id>}`.

| Field | Field ID | Type | Note |
| :--- | :--- | :--- | :--- |
| API Key | `api-key` | string | Fill in your GroqCloud API key. To find your keys, visit the GroqCloud API Keys page. |
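For example, a recipe that references a previously created Connection (the connection ID `my-groq-connection` below is a hypothetical name) replaces the whole `setup` block with a single reference:

```yaml
setup: ${connection.my-groq-connection}
```

The inline alternative, with the API key supplied directly as a key-value pair, is shown in the full recipe under [Example Recipes](#example-recipes).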




## Supported Tasks

### Text Generation Chat

Provide text outputs in response to text inputs.


| Input | ID | Type | Description |
| :--- | :--- | :--- | :--- |
| Task ID (required) | `task` | string | `TASK_TEXT_GENERATION_CHAT` |
| Model (required) | `model` | string | The OSS model to be used |
| Prompt (required) | `prompt` | string | The prompt text |
| System message | `system-message` | string | The system message helps set the behavior of the assistant. For example, you can modify the personality of the assistant or provide specific instructions about how it should behave throughout the conversation. By default, the model’s behavior is set using a generic message as "You are a helpful assistant." |
| Prompt Images | `prompt-images` | array[string] | The prompt images (Note: Only a subset of OSS models support image inputs) |
| Chat history | `chat-history` | array[object] | Incorporate external chat history, specifically previous messages within the conversation. Please note that System Message will be ignored and will not have any effect when this field is populated. Each message should adhere to the format: \{"role": "The message role, i.e. 'system', 'user' or 'assistant'", "content": "message content"\} |
| Seed | `seed` | integer | The seed |
| Temperature | `temperature` | number | The temperature for sampling |
| Top K | `top-k` | integer | Integer to define the top tokens considered within the sample operation to create new text |
| Max new tokens | `max-new-tokens` | integer | The maximum number of tokens for the model to generate |
| Top P | `top-p` | number | Float that defines the tokens considered within the sample operation of text generation. Tokens are added to the sample from most probable to least probable until the sum of their probabilities exceeds top-p (default=0.5) |
| User | `user` | string | The user name passed to the Groq platform |



| Output | ID | Type | Description |
| :--- | :--- | :--- | :--- |
| Text | `text` | string | Model Output |
| Usage (optional) | `usage` | object | Token usage on the GroqCloud platform text generation models |






## Example Recipes

Recipe for the [Groq Interview Helper](https://instill.tech/instill-ai/pipelines/groq-interview-helper/playground) pipeline.

```yaml
version: v1beta
component:
    groq-0:
        type: groq
        task: TASK_TEXT_GENERATION_CHAT
        input:
            max-new-tokens: 300
            model: llama3-groq-70b-8192-tool-use-preview
            prompt: |-
                Rewrite this experience using the STAR (Situation, Task, Action, Result) method for a resume or CV:

                ${variable.experience}
            system-message: You are a helpful resume assistant.
            temperature: 0.05
            top-k: 10
            top-p: 0.5
            user: instill-ai
        setup:
            api-key: ${secret.INSTILL_SECRET}
variable:
    experience:
        title: experience
        description: describe your work experience
        instill-format: string
        instill-ui-multiline: true
output:
    resume_format:
        title: resume_format
        value: ${groq-0.output.text}
```

Documentation

Index

Constants

const (
	Endpoint = "https://api.groq.com"
)
const (
	TaskTextGenerationChat = "TASK_TEXT_GENERATION_CHAT"
)

Variables

This section is empty.

Functions

func Init

func Init(bc base.Component) *component

Types

type ChatMessage

type ChatMessage struct {
	Role    string              `json:"role"`
	Content []MultiModalContent `json:"content"`
}

type ChatRequest

type ChatRequest struct {
	FrequencyPenalty  float32                    `json:"frequency_penalty,omitempty"`
	MaxTokens         int                        `json:"max_tokens"`
	Model             string                     `json:"model"`
	Messages          []GroqChatMessageInterface `json:"messages"`
	N                 int                        `json:"n,omitempty"`
	PresencePenalty   float32                    `json:"presence_penalty,omitempty"`
	ParallelToolCalls bool                       `json:"parallel_tool_calls,omitempty"`
	Seed              int                        `json:"seed,omitempty"`
	Stop              []string                   `json:"stop"`
	Stream            bool                       `json:"stream,omitempty"`
	Temperature       float32                    `json:"temperature,omitempty"`
	TopP              float32                    `json:"top_p,omitempty"`
	User              string                     `json:"user,omitempty"`
}

type ChatResponse

type ChatResponse struct {
	ID      string       `json:"id"`
	Object  string       `json:"object"`
	Created int          `json:"created"`
	Model   string       `json:"model"`
	Choices []GroqChoice `json:"choices"`
	Usage   GroqUsage    `json:"usage"`
}

type GroqChatContent

type GroqChatContent struct {
	ImageURL *GroqURL            `json:"image_url,omitempty"`
	Text     string              `json:"text"`
	Type     GroqChatContentType `json:"type,omitempty"`
}

type GroqChatContentType

type GroqChatContentType string
const (
	GroqChatContentTypeText  GroqChatContentType = "text"
	GroqChatContentTypeImage GroqChatContentType = "image"
)

type GroqChatMessage

type GroqChatMessage struct {
	Role    string            `json:"role"`
	Content []GroqChatContent `json:"content"`
}

type GroqChatMessageInterface

type GroqChatMessageInterface interface {
}

type GroqChoice

type GroqChoice struct {
	Index        int                 `json:"index"`
	Message      GroqResponseMessage `json:"message"`
	FinishReason string              `json:"finish_reason"`
}

type GroqClient

type GroqClient struct {
	// contains filtered or unexported fields
}

func NewClient

func NewClient(token string, logger *zap.Logger) *GroqClient

func (*GroqClient) Chat

func (c *GroqClient) Chat(request ChatRequest) (ChatResponse, error)

type GroqClientInterface

type GroqClientInterface interface {
	Chat(ChatRequest) (ChatResponse, error)
}
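Exposing the client behind `GroqClientInterface` lets call sites be exercised without network access. A sketch of that pattern (the request/response types are stubbed locally, and `execute` is a hypothetical caller, so the example stands alone):

```go
package main

import "fmt"

// Trimmed stubs of the package types for a standalone sketch.
type ChatRequest struct {
	Model string
}
type ChatResponse struct {
	ID string
}

type GroqClientInterface interface {
	Chat(ChatRequest) (ChatResponse, error)
}

// mockClient satisfies GroqClientInterface without calling the API.
type mockClient struct{}

func (mockClient) Chat(req ChatRequest) (ChatResponse, error) {
	return ChatResponse{ID: "mock-" + req.Model}, nil
}

// execute depends only on the interface, so a real client or a mock
// can be passed interchangeably.
func execute(c GroqClientInterface) (string, error) {
	resp, err := c.Chat(ChatRequest{Model: "llama3"})
	return resp.ID, err
}

func main() {
	id, _ := execute(mockClient{})
	fmt.Println(id)
}
```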

type GroqResponseMessage

type GroqResponseMessage struct {
	Role    string `json:"role"`
	Content string `json:"content"`
}

type GroqSystemMessage

type GroqSystemMessage struct {
	Role    string `json:"role"`
	Content string `json:"content"`
}

type GroqURL

type GroqURL struct {
	URL    string `json:"url"`
	Detail string `json:"detail,omitempty"`
}

type GroqUsage

type GroqUsage struct {
	PromptTokens     int     `json:"prompt_tokens"`
	CompletionTokens int     `json:"completion_tokens"`
	TotalTokens      int     `json:"total_tokens"`
	PromptTime       float32 `json:"prompt_time"`
	CompletionTime   float32 `json:"completion_time"`
	TotalTime        float32 `json:"total_time"`
}

type MultiModalContent

type MultiModalContent struct {
	ImageURL URL    `json:"image-url"`
	Text     string `json:"text"`
	Type     string `json:"type"`
}

type TaskTextGenerationChatInput

type TaskTextGenerationChatInput struct {
	ChatHistory  []ChatMessage `json:"chat-history"`
	MaxNewTokens int           `json:"max-new-tokens"`
	Model        string        `json:"model"`
	Prompt       string        `json:"prompt"`
	PromptImages []string      `json:"prompt-images"`
	Seed         int           `json:"seed"`
	SystemMsg    string        `json:"system-message"`
	Temperature  float32       `json:"temperature"`
	TopK         int           `json:"top-k"`

	// additional parameters
	TopP float32 `json:"top-p"`
	User string  `json:"user"`
}

type TaskTextGenerationChatOuput

type TaskTextGenerationChatOuput struct {
	Text  string                      `json:"text"`
	Usage TaskTextGenerationChatUsage `json:"usage"`
}

type TaskTextGenerationChatUsage

type TaskTextGenerationChatUsage struct {
	InputTokens  int `json:"input-tokens"`
	OutputTokens int `json:"output-tokens"`
}

type URL

type URL struct {
	URL string `json:"url"`
}
