parallel-inference-generator

Parallel Inference Generator feeds prompts to your hosted LLM endpoints and outputs an Excel sheet with the required metrics collected from each endpoint. It is assumed that the hosted endpoint serves Hugging Face's TGI (Text Generation Inference) API.

Source files: main.go