go-gpt-3-encoder
Go BPE tokenizer (Encoder+Decoder) for GPT2 and GPT3.
About
GPT2 and GPT3 use byte pair encoding (BPE) to turn text into a sequence of integers that is fed into the model. This is a Go implementation of OpenAI's original Python encoder/decoder.
This code was inspired by a JavaScript implementation and was partially generated by OpenAI itself!
Install
go get github.com/samber/go-gpt-3-encoder
Usage
import (
    "fmt"
    "log"

    tokenizer "github.com/samber/go-gpt-3-encoder"
)

encoder, err := tokenizer.NewEncoder()
if err != nil {
    log.Fatal(err)
}

str := "This is an example sentence to try encoding out on!"

// Encode turns the string into a slice of BPE token IDs.
encoded, err := encoder.Encode(str)
if err != nil {
    log.Fatal(err)
}

fmt.Println("We can look at each token and what it represents:")
for _, token := range encoded {
    fmt.Printf("%d -- %s\n", token, encoder.Decode([]int{token}))
}

// Decode turns the token IDs back into the original string.
decoded := encoder.Decode(encoded)
fmt.Printf("We can decode it back into: %s\n", decoded)
Contribute
Some corner cases are not covered by this library. See the @TODO comments in the tests.