Getting Started

To start using byorg.ai, you don't need an extensive setup. All you need is an LLM provider of your choice.

Prerequisites

  • An API key for an LLM provider supported by Vercel's AI SDK (e.g. OpenAI, Anthropic, ...); see the AI SDK documentation for the complete list of supported providers.
TIP

You can even use a self-hosted LLM, as long as it is compatible with the Vercel AI SDK.
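
For example, a self-hosted endpoint that exposes an OpenAI-compatible API can be wrapped with the AI SDK's OpenAI provider. The sketch below is only an illustration: the @ai-sdk/openai provider package (installed separately), the base URL, the API key, and the model name are all placeholders you would swap for your own setup.

```ts
import { createOpenAI } from '@ai-sdk/openai';

// Point the OpenAI-compatible provider at a self-hosted endpoint.
// The URL, API key, and model name below are placeholders.
const localProvider = createOpenAI({
  baseURL: 'http://localhost:11434/v1',
  apiKey: 'not-needed-for-local',
});

export const model = localProvider('llama-3.1-8b');
```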

Installation

The easiest way to begin is by installing the byorg.ai packages along with the ai package, using your preferred package manager.

npm:
npm install @callstack/byorg-core ai

yarn:
yarn add @callstack/byorg-core ai

pnpm:
pnpm add @callstack/byorg-core ai

bun:
bun add @callstack/byorg-core ai

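If you use a hosted provider, you will typically also install that provider's AI SDK package. The sketch below assumes OpenAI via the @ai-sdk/openai package (installed the same way, e.g. npm install @ai-sdk/openai); the model name is only an example, and the API key is read from the environment.

```ts
import { openai } from '@ai-sdk/openai';

// The OpenAI provider reads the OPENAI_API_KEY environment variable by default.
// This model instance is what you'll plug into the core library in the next section.
export const chatModel = openai('gpt-4o-mini');
```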
In the next section, we'll demonstrate how to use our core library to handle your requests.