tlm

command module
v0.0.0-...-a823697
Warning: This package is not in the latest version of its module.

Published: Oct 22, 2024 License: Apache-2.0 Imports: 5 Imported by: 0

README

tlm - Local CLI Copilot, powered by CodeLLaMa (DolphinCoder). 💻🐬

[!TIP] Starcoder2 3B model option coming soon to support workstations with limited resources.


tlm is your CLI companion, and it requires nothing except your workstation. It runs the efficient and powerful CodeLLaMa (DolphinCoder) model in your local environment to provide the best possible command-line suggestions.

Suggest

Explain

Features

  • 💸 No API key or subscription is required (unlike ChatGPT, GitHub Copilot, Azure OpenAI, etc.)

  • 📡 No internet connection is required.

  • 💻 Works on macOS, Linux, and Windows.

  • 👩🏻‍💻 Automatic shell detection (PowerShell, Bash, Zsh).

  • 🚀 One-liner generation and command explanation.
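Automatic shell detection can be pictured with a rough sketch like the one below. This is illustrative only, based on the SHELL environment variable; tlm's actual detection logic may differ.

```shell
# Illustrative sketch of automatic shell detection (not tlm's actual
# code): inspect the SHELL environment variable and fall back to bash.
case "${SHELL:-}" in
  */zsh)  echo "detected: zsh" ;;
  */bash) echo "detected: bash" ;;
  *)      echo "detected: unknown (falling back to bash)" ;;
esac
```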

Installation

tlm can be installed in two ways: with the installation script (recommended) or via go install.

Prerequisites

Ollama is required to download the necessary models. It can be installed as follows on different platforms.

  • On macOS and Windows:

Follow the download instructions at: https://ollama.com/download

  • On Linux:
curl -fsSL https://ollama.com/install.sh | sh
  • Or using the official Docker images 🐳:
# CPU Only
docker run -d -v ollama:/root/.ollama -p 11434:11434 --name ollama ollama/ollama

# With GPU (Nvidia & AMD)
docker run -d --gpus=all -v ollama:/root/.ollama -p 11434:11434 --name ollama ollama/ollama
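Once Ollama is installed (or the container is running), a quick sanity check helps before deploying models. The sketch below only looks for the ollama binary on PATH; the API itself listens on port 11434 by default, as in the Docker commands above.

```shell
# Sanity check: is the ollama binary on PATH? (When running via Docker,
# the binary may not be installed locally even though the API is up.)
if command -v ollama >/dev/null 2>&1; then
  echo "ollama binary found: $(command -v ollama)"
else
  echo "ollama binary not found; install it or use the Docker image"
fi
```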
Installation Script

The installation script is the recommended way to install tlm. It detects your platform and architecture, downloads the matching build, and executes the install command for you.

Linux and macOS:

Download and execute the installation script with the following command:

curl -fsSL https://raw.githubusercontent.com/hellfiveosborn/tlm/main/install.sh | sudo bash -E
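For reference, the platform detection such an install script typically performs looks roughly like the sketch below. This is illustrative, not the script's actual contents; the os/arch naming convention is an assumption.

```shell
# Illustrative sketch of install-script platform detection (not the
# actual install.sh): map uname output to Go-style os/arch names.
os="$(uname -s | tr '[:upper:]' '[:lower:]')"   # e.g. linux or darwin
arch="$(uname -m)"                              # e.g. x86_64, arm64
case "$arch" in
  x86_64)        arch="amd64" ;;
  aarch64|arm64) arch="arm64" ;;
esac
echo "would fetch a build for: ${os}/${arch}"
```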
Windows (PowerShell 5.1 or higher)

Download and execute the installation script with the following command:

Invoke-RestMethod -Uri https://raw.githubusercontent.com/hellfiveosborn/tlm/main/install.ps1 | Invoke-Expression
Go Install

If you have Go 1.21 or higher installed on your system, you can install tlm with the following command:

go install github.com/HellFiveOsborn/tlm@latest

Then, deploy the tlm modelfiles.

πŸ“ Note: If you have Ollama deployed on somewhere else. Please first run tlm config and configure Ollama host.

tlm deploy

Verify the installation with the following command:

tlm help
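With the installation verified, tlm can be used for the Suggest and Explain features shown above. The subcommand invocations below are assumptions based on those feature names; confirm the real subcommands with tlm help.

```shell
# Hypothetical usage sketch -- the "suggest" and "explain" subcommand
# names are assumed from the feature list; verify with `tlm help`.
if command -v tlm >/dev/null 2>&1; then
  tlm suggest "find files larger than 100MB under the current directory"
  tlm explain "tar -xzvf archive.tar.gz"
else
  echo "tlm not found on PATH; see the Installation section above"
fi
```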

Uninstall

On Linux and macOS:

rm /usr/local/bin/tlm

On Windows:

Remove-Item -Recurse -Force "C:\Users\$env:USERNAME\AppData\Local\Programs\tlm"

Contributors

Documentation

There is no documentation for this package.
