# Embeddings API

Types and functions for the Embeddings API.
## Embeddings Object

### `UniLM.Embeddings` — Type
```julia
Embeddings(input::String; service=OPENAIServiceEndpoint, model="text-embedding-3-small")
Embeddings(input::Vector{String}; service=OPENAIServiceEndpoint, model="text-embedding-3-small")
```

Create an embedding request for one or more texts. Defaults to OpenAI's `text-embedding-3-small` (1536 dimensions), but works with any provider via the `service` parameter: Ollama, Gemini, Mistral, or any OpenAI-compatible server.
The `embeddings` field is pre-allocated and filled in place by `embeddingrequest!`.
**Fields**

- `service::ServiceEndpointSpec`: LLM provider (default: `OPENAIServiceEndpoint`).
- `model::String`: The embedding model name.
- `input::Union{String,Vector{String}}`: Text(s) to embed.
- `embeddings::Union{Vector{Float64},Vector{Vector{Float64}}}`: Pre-allocated embedding vector(s).
- `user::Union{String,Nothing}`: Optional end-user identifier.
**Example**

```julia
emb = Embeddings("Julia is a great language")
embeddingrequest!(emb)
emb.embeddings  # => Float64[...] (1536 dims)

# With Ollama
emb = Embeddings("test"; service=OllamaEndpoint(), model="nomic-embed-text")
```

## Construction
```julia
using UniLM

# Single input
emb = Embeddings("Julia is a great language")
println("Model: ", emb.model)
println("Embedding dims: ", length(emb.embeddings))

# Batch input
batch = Embeddings(["Hello", "World", "Julia"])
println("Batch size: ", length(batch.input))
println("Each embedding dims: ", length(batch.embeddings[1]))
```

```
Model: text-embedding-3-small
Embedding dims: 1536
Batch size: 3
Each embedding dims: 1536
```

## Request Function
### `UniLM.embeddingrequest!` — Function

```julia
embeddingrequest!(emb::Embeddings)
```

Send a request to the configured provider to generate an embedding for the `input` in `emb`. The resulting embedding is stored in the pre-allocated `embeddings` field.

```julia
@kwdef struct Embeddings
    service::ServiceEndpointSpec = OPENAIServiceEndpoint
    model::String = "text-embedding-3-small"
    input::Union{String,Vector{String}}
    embeddings::Union{Vector{Float64},Vector{Vector{Float64}}} = zeros(Float64, 1536)
    user::Union{String,Nothing} = nothing
end
```
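The struct is immutable, yet `embeddingrequest!` can still fill the buffer in place, because the contents of the pre-allocated `Vector` remain mutable. A standalone sketch of that pattern, using a simplified stand-in type and a hypothetical `fillembeddings!` helper (neither is part of UniLM):

```julia
# An immutable struct can still have its embedding buffer filled in
# place: the binding `embeddings` cannot be reassigned, but the vector
# it points to can be overwritten element-wise.
Base.@kwdef struct EmbeddingsSketch
    model::String = "text-embedding-3-small"
    input::String
    embeddings::Vector{Float64} = zeros(Float64, 1536)
end

# Hypothetical stand-in for the response-parsing step of
# embeddingrequest!: copy returned values into the existing buffer.
function fillembeddings!(emb::EmbeddingsSketch, values::Vector{Float64})
    copyto!(emb.embeddings, values)
    return emb
end

emb = EmbeddingsSketch(input = "hello")
fillembeddings!(emb, rand(1536))
length(emb.embeddings)  # 1536, same buffer as before the fill
```

This is why the docs can promise an in-place update without `mutable struct`: only the vector's elements change, never the struct's fields.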
## Model Constants

```julia
println("Default embedding model: ", UniLM.GPTTextEmbedding3Small)
```

```
Default embedding model: text-embedding-3-small
```

## Usage Example
```julia-repl
julia> emb = Embeddings("Julia is a high-performance programming language for technical computing.");

julia> embeddingrequest!(emb);

julia> emb.embeddings[1:5]  # first 5 dimensions
5-element Vector{Float64}:
 -0.039474
 -0.009283
  0.001706
 -0.028087
  0.063363

julia> sqrt(sum(x^2 for x in emb.embeddings))  # L2 norm ≈ 1.0
1.0
```
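Once the `embeddings` field is filled, vectors can be compared directly. A minimal sketch of cosine similarity, using short hypothetical vectors in place of real 1536-dimensional embeddings:

```julia
using LinearAlgebra  # dot, norm

# Cosine similarity between two embedding vectors. For L2-normalized
# embeddings (the REPL example above shows a norm of 1.0), this
# reduces to a plain dot product.
cosine_similarity(a::Vector{Float64}, b::Vector{Float64}) =
    dot(a, b) / (norm(a) * norm(b))

# Hypothetical 3-dim stand-ins for real embeddings:
a = [0.6, 0.8, 0.0]
b = [0.8, 0.6, 0.0]
cosine_similarity(a, b)  # ≈ 0.96
```

Higher values mean more semantically similar texts; ranking a corpus of embeddings by this score against a query embedding is the usual building block for semantic search.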