You can use any AI provider supported by Vercel's AI SDK. This includes both LLM-as-a-service providers such as OpenAI and Anthropic, and locally hosted LLMs. We are also open to extending support to other types of chat models, such as LangChain's runnables.
After instantiating the provider client, wrap it with our VercelAdapter class:
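For example, with the OpenAI provider from the Vercel AI SDK (a minimal sketch — the `VercelAdapter` import path and constructor signature are assumptions, since they depend on this library's package layout):

```typescript
// The OpenAI provider from the Vercel AI SDK.
import { openai } from "@ai-sdk/openai";
// Hypothetical import path — adjust to wherever this library
// actually exports VercelAdapter.
import { VercelAdapter } from "./adapters";

// Instantiate the provider client: a language model handle
// for a specific model ID.
const model = openai("gpt-4o");

// Wrap it with the adapter so the rest of the system can use it
// as a chat model. The exact constructor signature may differ.
const chatModel = new VercelAdapter(model);
```

The same pattern applies to other providers (e.g. `@ai-sdk/anthropic`): instantiate the provider's model handle, then pass it to `VercelAdapter`.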
Now that the `chatModel` is ready, let's discuss the `systemPrompt` function.