feat: add config for embedding model (#12120)

This PR adds the ability to configure the embedding model used by Captain AI.
Previously, the embedding model was hardcoded, which led to errors when using a
different API provider that did not support that specific model.
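
As a usage illustration (not part of the commit): since the model is now read from the CAPTAIN_EMBEDDING_MODEL installation config, an operator could point Captain AI at a provider-supported model from a Rails console. This is a minimal sketch assuming InstallationConfig behaves like a standard ActiveRecord model, as the service diff below suggests; the model name is illustrative.

  # Hypothetical Rails console usage (assumption: standard ActiveRecord API).
  config = InstallationConfig.find_or_initialize_by(name: 'CAPTAIN_EMBEDDING_MODEL')
  config.value = 'text-embedding-ada-002' # any model your API provider supports
  config.save!
  # Captain AI will then embed content with this model instead of the hardcoded default.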

Co-authored-by: Shivam Mishra <scm.mymail@gmail.com>
Author: YashRaj
Date: 2025-08-25 11:33:00 +05:30
Committed by: GitHub
Parent: 7d6a43fc72
Commit: be721c2b50
3 changed files with 9 additions and 2 deletions

@@ -175,6 +175,10 @@
   display_title: 'OpenAI API Endpoint (optional)'
   description: 'The OpenAI endpoint configured for use in Captain AI. Default: https://api.openai.com/'
   locked: false
+- name: CAPTAIN_EMBEDDING_MODEL
+  display_title: 'Embedding Model (optional)'
+  description: 'The embedding model configured for use in Captain AI. Default: text-embedding-3-small'
+  locked: false
 - name: CAPTAIN_FIRECRAWL_API_KEY
   display_title: 'FireCrawl API Key (optional)'
   description: 'The FireCrawl API key for the Captain AI service'

@@ -3,9 +3,11 @@ require 'openai'
 
 class Captain::Llm::EmbeddingService < Llm::BaseOpenAiService
   class EmbeddingsError < StandardError; end
-  DEFAULT_MODEL = 'text-embedding-3-small'.freeze
+  def self.embedding_model
+    @embedding_model = InstallationConfig.find_by(name: 'CAPTAIN_EMBEDDING_MODEL')&.value.presence || OpenAiConstants::DEFAULT_EMBEDDING_MODEL
+  end
 
-  def get_embedding(content, model: DEFAULT_MODEL)
+  def get_embedding(content, model: self.class.embedding_model)
     response = @client.embeddings(
       parameters: {
         model: model,
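
For illustration only (the hunk above is truncated and this is not part of the diff): a sketch of how a caller might use the service after this change. It assumes a no-argument constructor inherited from Llm::BaseOpenAiService; the sample strings and the alternative model name are made up.

  # Hypothetical usage sketch; only the class and method names come from the diff.
  service = Captain::Llm::EmbeddingService.new

  # With no explicit model, the keyword default now resolves through
  # self.class.embedding_model (installation config, else the constant).
  vector = service.get_embedding('How do I reset my password?')

  # Passing a model still overrides the configured default, because it simply
  # replaces the keyword argument's default value.
  vector = service.get_embedding('How do I reset my password?', model: 'text-embedding-3-large')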

@@ -3,4 +3,5 @@
 module OpenAiConstants
   DEFAULT_MODEL = 'gpt-4.1-mini'
   DEFAULT_ENDPOINT = 'https://api.openai.com'
+  DEFAULT_EMBEDDING_MODEL = 'text-embedding-3-small'
 end
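
Taken together, the resolution order is: the CAPTAIN_EMBEDDING_MODEL installation config when present, otherwise OpenAiConstants::DEFAULT_EMBEDDING_MODEL. A hedged console sketch of that behaviour (the override value is purely illustrative):

  # No override configured: the service falls back to the new constant.
  InstallationConfig.find_by(name: 'CAPTAIN_EMBEDDING_MODEL')&.value
  # => nil
  Captain::Llm::EmbeddingService.embedding_model
  # => 'text-embedding-3-small'   (OpenAiConstants::DEFAULT_EMBEDDING_MODEL)

  # Override configured: the stored value wins on the next call. The method
  # assigns rather than memoizing with ||=, so no restart is needed.
  InstallationConfig.find_by(name: 'CAPTAIN_EMBEDDING_MODEL')&.update(value: 'nomic-embed-text')
  Captain::Llm::EmbeddingService.embedding_model
  # => 'nomic-embed-text'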