
# FrontLLM


FrontLLM is your public gateway to LLMs. Query LLMs directly from your front-end code. No backend needed.

📝 Check the documentation for more details.

## 👀 Demos

## 🚀 Installation

### NPM

To use FrontLLM in your project, install it via npm:

```bash
npm install frontllm
```

Now you can import the library and create an instance of the gateway with your specific gateway ID:

```js
import { frontLLM } from 'frontllm';

const gateway = frontLLM('<gateway_id>');
```

### CDN

To use FrontLLM via a CDN, include the following script tag in your HTML file:

```html
<script src="https://cdn.jsdelivr.net/npm/frontllm/dist/index.umd.js"></script>
```

This exposes the `frontLLM` function globally, which you can use to create an instance of the gateway:

```html
<script>
  const gateway = frontLLM('<gateway_id>');
  // ...
</script>
```
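
Putting the pieces together, a complete minimal page might look like the sketch below. The gateway ID is a placeholder for your own, the script URL is left unpinned here for brevity, and `complete` is described in the Usage section that follows:

```html
<!DOCTYPE html>
<html>
  <head>
    <script src="https://cdn.jsdelivr.net/npm/frontllm/dist/index.umd.js"></script>
  </head>
  <body>
    <script>
      // '<gateway_id>' is a placeholder for your own gateway ID.
      const gateway = frontLLM('<gateway_id>');

      // Short syntax; requires a default model configured in the gateway.
      gateway
        .complete('Hello world!')
        .then((response) => console.log(response.choices[0].message.content));
    </script>
  </body>
</html>
```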

## 🎬 Usage

### Chat Completion

```js
// Short syntax - requires the default model configured in the gateway
const response = await gateway.complete('Hello world!');
```

```js
// Full syntax
const response = await gateway.complete({
  model: 'gpt-4',
  messages: [
    { role: 'user', content: 'Hello world!' }
  ],
  temperature: 0.7
});

// Output the generated response text to the console.
console.log(response.choices[0].message.content);
```
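
The snippets above assume the request succeeds. A minimal defensive sketch, assuming `complete` rejects its returned promise on network or gateway errors (this README does not document the error behavior):

```js
// A minimal sketch, assuming complete() rejects on failure.
try {
  const response = await gateway.complete('Hello world!');
  console.log(response.choices[0].message.content);
} catch (error) {
  // Network problems or gateway-side rejections would land here (assumption).
  console.error('Completion failed:', error);
}
```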

### Chat Completion with Streaming

```js
// Short syntax - requires the default model configured in the gateway
const response = await gateway.completeStreaming('Where is Europe?');
```

```js
// Full syntax
const response = await gateway.completeStreaming({
  model: 'gpt-4',
  messages: [
    { role: 'user', content: 'Where is Europe?' }
  ],
  temperature: 0.7
});

// Output the generated response text to the console.
for (;;) {
  const { finished, chunks } = await response.read();
  for (const chunk of chunks) {
    console.log(chunk.choices[0].delta.content);
  }
  if (finished) {
    break;
  }
}
```
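
Each chunk carries only a delta, so reconstructing the full response text means concatenating the deltas as they arrive. A sketch using only the `read()` loop shown above; the `?? ''` guard is an assumption that some chunks may arrive without content:

```js
// Collect the streamed deltas into a single string.
let fullText = '';
for (;;) {
  const { finished, chunks } = await response.read();
  for (const chunk of chunks) {
    // Some chunks may carry no content (assumption); fall back to ''.
    fullText += chunk.choices[0].delta.content ?? '';
  }
  if (finished) {
    break;
  }
}
console.log(fullText);
```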

## 💡 License

This project is released under the MIT license.
