Function LLM for JavaScript

Use local LLMs in your browser and Node.js apps. This package is designed to patch OpenAI and Anthropic clients for running inference locally, using predictors hosted on Function.

Tip

We offer a similar package for use in Python. Check out fxn-llm.

Important

This package is still a work in progress, so the API could change significantly between releases.

Caution

Never embed access keys client-side (i.e. in the browser). Instead, create a proxy URL in your backend.
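Below is a minimal sketch of that proxy pattern, assuming an Express backend, a hypothetical /fxn-proxy route, and FXN_API_URL and FXN_ACCESS_KEY environment variables. None of these names come from fxn-llm itself, so check Function's documentation for the supported proxy configuration.

import express from "express"

const app = express()
app.use(express.json())

// Forward inference requests to the upstream API, attaching the access
// key on the server so it never reaches the browser.
app.post("/fxn-proxy", async (req, res) => {
  const response = await fetch(process.env.FXN_API_URL, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      "Authorization": `Bearer ${process.env.FXN_ACCESS_KEY}`
    },
    body: JSON.stringify(req.body)
  });
  res.status(response.status).json(await response.json());
});

app.listen(3000);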

Installing Function LLM

Function LLM is distributed on NPM. Open a terminal and run the following command:

# Run this in Terminal
$ npm install fxn-llm

Important

Make sure to create an access key by signing in to Function. You'll need it to fetch the predictor at runtime.

Using the OpenAI Client Locally

To run text generation and embedding models locally using the OpenAI client, patch your OpenAI instance with the locally function:

import { locally } from "fxn-llm"
import { OpenAI } from "openai"

// 💥 Create your OpenAI client
let openai = new OpenAI({ apiKey: "fxn", dangerouslyAllowBrowser: true });

// 🔥 Make it local
openai = locally(openai, {
  accessKey: process.env.NEXT_PUBLIC_FXN_ACCESS_KEY
});

// 🚀 Generate embeddings
const embeddings = await openai.embeddings.create({
  model: "@nomic/nomic-embed-text-v1.5-quant",
  input: "search_query: Hello world!"
});

Warning

Currently, only openai.embeddings.create is supported. Text generation is coming soon!
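Once you have embeddings, you can compare them directly, for example with cosine similarity. The sketch below is illustrative only: it assumes the patched client returns the standard OpenAI embeddings response shape (response.data[0].embedding), and the embed helper and example strings are hypothetical.

// Embed a single string and return its vector (assumes the standard
// OpenAI response shape: { data: [{ embedding: number[] }] })
const embed = async (text) => {
  const response = await openai.embeddings.create({
    model: "@nomic/nomic-embed-text-v1.5-quant",
    input: text
  });
  return response.data[0].embedding;
};

// Cosine similarity between a query embedding and a document embedding
const a = await embed("search_query: What is the capital of France?");
const b = await embed("search_document: Paris is the capital of France.");
const dot = a.reduce((sum, x, i) => sum + x * b[i], 0);
const norm = v => Math.sqrt(v.reduce((sum, x) => sum + x * x, 0));
console.log("cosine similarity:", dot / (norm(a) * norm(b)));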


Useful Links

Function is a product of NatML Inc.