Updates
[!NOTE]
September 15, 2024 — Lots of new stuff!
- Fabric now supports calling the new `o1-preview` model using the `-r` switch (which stands for raw). Normal queries won't work with `o1-preview` because OpenAI disabled System access and doesn't allow setting `Temperature`.
- We have early support for Raycast! Under the `/patterns` directory there's a `raycast` directory with scripts that can be called from Raycast. If you add a scripts directory within Raycast and point it to your `~/.config/fabric/patterns/raycast` directory, you'll be able to 1) invoke Raycast and type the name of the script, then 2) paste in the content to be passed, and the results will return in Raycast. There's currently only one script in there, but I (Daniel) am adding more.
- Go Migration: The following command-line options were changed during the migration to Go:
  - You now need to use `-c` instead of `-C` to copy the result to the clipboard.
  - You now need to use `-s` instead of `-S` to stream results in realtime.
  - The following command-line options have been removed: `--agents` (`-a`), `--gui`, `--clearsession`, `--remoteOllamaServer`, and `--sessionlog`.
- You can now configure an Ollama server via setup (`-S`).
- We're working on a GUI rewrite in Go as well.
Intro videos
Keep in mind that many of these were recorded when Fabric was Python-based, so remember to use the current install instructions below.
What and why
Since the start of 2023 and the rise of GenAI, we've seen a massive number of AI applications for accomplishing tasks. The technology is powerful, but it's not easy to integrate it into our lives.
In other words, AI doesn't have a capabilities problem—it has an integration problem.
Fabric was created to address this by enabling everyone to granularly apply AI to everyday challenges.
Philosophy
AI isn't a thing; it's a magnifier of a thing. And that thing is human creativity.
We believe the purpose of technology is to help humans flourish, so when we talk about AI we start with the human problems we want to solve.
Breaking problems into components
Our approach is to break problems into individual pieces and then apply AI to them one at a time; see below for some examples.
Too many prompts
Prompts are good for this, but the biggest challenge I faced in 2023 (one that still exists today) is the sheer number of AI prompts out there. We all have prompts that are useful, but it's hard to discover new ones, know whether they're good, and manage different versions of the ones we like.
One of `fabric`'s primary features is helping people collect and integrate prompts, which we call Patterns, into various parts of their lives.
Fabric has Patterns for all sorts of life and work activities, including:
- Extracting the most interesting parts of YouTube videos and podcasts
- Writing an essay in your own voice with just an idea as an input
- Summarizing opaque academic papers
- Creating perfectly matched AI art prompts for a piece of writing
- Rating the quality of content to see if you want to read/watch the whole thing
- Getting summaries of long, boring content
- Explaining code to you
- Turning bad documentation into usable documentation
- Creating social media posts from any content input
- And a million more…
Installation
To install Fabric, make sure Go is installed, and then run the following command:
```shell
# Install Fabric directly from the repo
go install github.com/danielmiessler/fabric@latest
```
Environment Variables
You may need to set some environment variables in your `~/.bashrc` file on Linux or `~/.zshrc` file on macOS to be able to run the `fabric` command. Here is an example of what you can add:
For Intel-based Macs or Linux

```shell
# Golang environment variables
export GOROOT=/usr/local/go
export GOPATH=$HOME/go

# Update PATH to include GOPATH and GOROOT binaries
export PATH=$GOPATH/bin:$GOROOT/bin:$HOME/.local/bin:$PATH
```
For Apple Silicon-based Macs

```shell
# Golang environment variables
# (Homebrew keeps Go's GOROOT under the package's libexec directory;
# /opt/homebrew/bin/go is the binary itself, not a valid GOROOT)
export GOROOT=/opt/homebrew/opt/go/libexec
export GOPATH=$HOME/go
export PATH=$GOPATH/bin:$GOROOT/bin:$HOME/.local/bin:$PATH
```
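If you source your shell rc file repeatedly, these lines can stack duplicate PATH entries. As an optional sketch (not required by Fabric), you can guard the update so it stays idempotent:

```shell
# Append $GOPATH/bin to PATH only if it's not already there
export GOPATH="${GOPATH:-$HOME/go}"
case ":$PATH:" in
  *":$GOPATH/bin:"*) ;;                   # already present; do nothing
  *) export PATH="$GOPATH/bin:$PATH" ;;   # prepend once
esac
```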
Setup
Now run the following command:

```shell
# Run the setup to set up your directories and keys
fabric --setup
```
If everything works, you're good to go.
Migration
If you have the Legacy (Python) version installed and want to migrate to the Go version, here's how you do it. It's basically two steps: 1) uninstall the Python version, and 2) install the Go version.
```shell
# Uninstall the legacy Fabric
pipx uninstall fabric

# Clear any old Fabric aliases (check your .bashrc, .zshrc, etc.)

# Install the Go version
go install github.com/danielmiessler/fabric@latest

# Run setup for the new version (important because things have changed)
fabric --setup
```
Then set your environment variables as shown above.
Upgrading
The great thing about Go is that it's super easy to upgrade. Just run the same command you used to install it in the first place and you'll always get the latest version.
```shell
go install github.com/danielmiessler/fabric@latest
```
Usage
Once you have it all set up, here's how to use it.
```shell
fabric -h
```

```
Usage:
  fabric [OPTIONS]

Application Options:
  -p, --pattern=                  Choose a pattern
  -v, --variable=                 Values for pattern variables, e.g. -v=$name:John -v=$age:30
  -C, --context=                  Choose a context
      --session=                  Choose a session
  -S, --setup                     Run setup
      --setup-skip-update-patterns  Skip update patterns at setup
  -t, --temperature=              Set temperature (default: 0.7)
  -T, --topp=                     Set top P (default: 0.9)
  -s, --stream                    Stream
  -P, --presencepenalty=          Set presence penalty (default: 0.0)
  -r, --raw                       Use the defaults of the model without sending chat options
                                  (like temperature etc.) and use the user role instead of the
                                  system role for patterns
  -F, --frequencypenalty=         Set frequency penalty (default: 0.0)
  -l, --listpatterns              List all patterns
  -L, --listmodels                List all available models
  -x, --listcontexts              List all contexts
  -X, --listsessions              List all sessions
  -U, --updatepatterns            Update patterns
  -c, --copy                      Copy to clipboard
  -m, --model=                    Choose model
  -o, --output=                   Output to file
  -n, --latest=                   Number of latest patterns to list (default: 0)
  -d, --changeDefaultModel        Change default model
  -y, --youtube=                  YouTube video URL to grab transcript and comments from and send to chat
      --transcript                Grab transcript from YouTube video and send to chat
      --comments                  Grab comments from YouTube video and send to chat
      --dry-run                   Show what would be sent to the model without actually sending it
  -u, --scrape_url=               Scrape website URL to markdown using Jina AI
  -q, --scrape_question=          Search question using Jina AI

Help Options:
  -h, --help                      Show this help message
```
Our approach to prompting
Fabric Patterns are different from most prompts you'll see.
- First, we use Markdown to help ensure maximum readability and editability. This not only helps the creator make a good one, but also anyone who wants to deeply understand what it does. Importantly, this also includes the AI you're sending it to! Here's an example of a Fabric Pattern: https://github.com/danielmiessler/fabric/blob/main/patterns/extract_wisdom/system.md
- Next, we are extremely clear in our instructions, and we use the Markdown structure to emphasize what we want the AI to do, and in what order.
- And finally, we tend to use the System section of the prompt almost exclusively. In over a year of being heads-down with this stuff, we've simply seen more efficacy from doing that. If that changes, or we're shown data that says otherwise, we will adjust.
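To make that structure concrete, here's a sketch of what a Pattern's Markdown typically looks like (section names modeled on `extract_wisdom`; the exact content is illustrative):

```markdown
# IDENTITY and PURPOSE

You are an expert at summarizing content into its most important points.

# STEPS

- Fully read and understand the input.
- Extract the most surprising and insightful ideas.

# OUTPUT INSTRUCTIONS

- Output the ideas as a Markdown bulleted list.
- Do not repeat ideas.
```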
Examples
Now let's look at some things you can do with Fabric.
- Run the `summarize` Pattern based on input from `stdin`; in this case, the body of an article.

  ```shell
  pbpaste | fabric --pattern summarize
  ```
- Run the `analyze_claims` Pattern with the `--stream` option to get immediate, streaming results.

  ```shell
  pbpaste | fabric --stream --pattern analyze_claims
  ```
- Run the `extract_wisdom` Pattern with the `--stream` option to get immediate, streaming results from any YouTube video (much like in the original introduction video).

  ```shell
  yt --transcript https://youtube.com/watch?v=uXs-zPc63kM | fabric --stream --pattern extract_wisdom
  ```
- Create your own Patterns: write a `.md` file containing the pattern and save it to `~/.config/fabric/patterns/[yourpatternname]`.
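As a sketch, here's one way to do that from the shell (the `my_pattern` name and its contents are illustrative; the `system.md` filename mirrors the layout of the built-in Patterns like `extract_wisdom`):

```shell
# Create a personal Pattern directory and a minimal system.md inside it
PATTERN_DIR="$HOME/.config/fabric/patterns/my_pattern"
mkdir -p "$PATTERN_DIR"

cat > "$PATTERN_DIR/system.md" <<'EOF'
# IDENTITY and PURPOSE
You summarize input text into exactly three bullet points.

# OUTPUT INSTRUCTIONS
- Output only the three Markdown bullets, nothing else.
EOF
```

You could then run it like any other Pattern, e.g. `pbpaste | fabric --pattern my_pattern`.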
Just use the Patterns
If you're not looking to do anything fancy and you just want a lot of great prompts, you can navigate to the `/patterns` directory and start exploring!
We hope that even if you use nothing else from Fabric, the Patterns alone will make the project useful to you.
You can use any of the Patterns you see there in any AI application that you have, whether that's ChatGPT or some other app or website. Our plan and prediction is that people will soon be sharing many more than those we've published, and they will be way better than ours.
The wisdom of crowds for the win.
Custom Patterns
You may want to use Fabric to create your own custom Patterns—but not share them with others. No problem!
Just make a directory in `~/.config/custompatterns/` (or wherever you like) and put your `.md` files in there. When you're ready to use them, copy them into `~/.config/fabric/patterns/`.
You can then use them like any other Patterns, but they won't be public unless you explicitly submit them as Pull Requests to the Fabric project. So don't worry—they're private to you.
This feature works with all OpenAI and Ollama models but does NOT work with Claude. You can specify your model with the `-m` flag.
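A minimal sketch of that workflow (the `my_private_pattern` name and file contents are illustrative):

```shell
# Keep private Patterns in a staging directory outside the managed Patterns tree
mkdir -p ~/.config/custompatterns/my_private_pattern
printf '# IDENTITY and PURPOSE\nA private pattern.\n' \
  > ~/.config/custompatterns/my_private_pattern/system.md

# Copy it into Fabric's Patterns directory when you want to use it
mkdir -p ~/.config/fabric/patterns
cp -r ~/.config/custompatterns/my_private_pattern ~/.config/fabric/patterns/
```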
Helper Apps
Fabric also makes use of some core helper apps (tools) to make it easier to integrate with your various workflows. Here are some examples:
`yt` is a helper command that extracts the transcript from a YouTube video. You can use it like this:
```shell
yt https://www.youtube.com/watch?v=lQVcbY52_gY
```
This will return the transcript from the video, which you can then pipe into Fabric like this:
```shell
yt https://www.youtube.com/watch?v=lQVcbY52_gY | fabric --pattern extract_wisdom
```
yt Installation

To install `yt`, install it the same way you install Fabric, just with a different repo name:
```shell
go install github.com/danielmiessler/yt@latest
```
Be sure to add your `YOUTUBE_API_KEY` to `~/.config/fabric/.env`.
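For example (the key value is a placeholder; substitute your own API key):

```shell
# Append the YouTube API key to Fabric's .env file (placeholder value)
mkdir -p ~/.config/fabric
printf 'YOUTUBE_API_KEY=%s\n' 'YOUR_KEY_HERE' >> ~/.config/fabric/.env
```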
to_pdf
`to_pdf` is a helper command that converts LaTeX files to PDF format. You can use it like this:
```shell
to_pdf input.tex
```
This will create a PDF file from the input LaTeX file in the same directory.
You can also use it with stdin, which works perfectly with the `write_latex` pattern:
```shell
echo "ai security primer" | fabric --pattern write_latex | to_pdf
```
This will create a PDF file named `output.pdf` in the current directory.
to_pdf Installation

To install `to_pdf`, install it the same way you install Fabric, just with a different repo name:
```shell
go install github.com/danielmiessler/fabric/to_pdf/to_pdf@latest
```
Make sure you have a LaTeX distribution (like TeX Live or MiKTeX) installed on your system, as `to_pdf` requires `pdflatex` to be available in your system's PATH.
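A quick sketch for checking that prerequisite before running `to_pdf`:

```shell
# Helper: check whether a required tool is on PATH (to_pdf needs pdflatex)
have() { command -v "$1" >/dev/null 2>&1; }

if have pdflatex; then
  echo "pdflatex found; to_pdf should work"
else
  echo "pdflatex missing; install TeX Live or MiKTeX first" >&2
fi
```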
[!NOTE]
Special thanks to the following people for their inspiration and contributions!
- Jonathan Dunn for being the absolute MVP dev on the project, including spearheading the new Go version, as well as the GUI! All this while also being a full-time medical doctor!
- Caleb Sima for pushing me over the edge of whether to make this a public project or not.
- Eugen Eisler and Frederick Ros for their invaluable contributions to the Go version.
- Joel Parish for super useful input on the project's GitHub directory structure.
- Joseph Thacker for the idea of a `-c` context flag that adds pre-created context in the `./config/fabric/` directory to all Pattern queries.
- Jason Haddix for the idea of a stitch (chained Pattern) to filter content using a local model before sending it on to a cloud model, i.e., cleaning customer data using `llama2` before sending it on to `gpt-4` for analysis.
- Andre Guerra for assisting with numerous components to make things simpler and more maintainable.
Primary contributors
`fabric` was created by Daniel Miessler in January of 2024.