Package
Language Models
A tiny package, but a clear public signal that Foundry model-proxy access is being normalized around the `PlatformClient`.
- Wraps `PlatformClient` auth and fetch behavior for OpenAI and Anthropic proxy endpoints.
- Does not ship an agent framework; it standardizes connection plumbing only.
- Matters because it turns Foundry LLM proxy access into normal app code rather than an ad hoc helper.
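To make that plumbing concrete, here is a minimal sketch of the pattern the bullets describe. The `PlatformClient` shape is inferred from the source snippet quoted later in this note (`baseUrl`, `tokenProvider`, `fetch`); the `authFetch` helper and the stub client are hypothetical illustrations, not part of the package.

```typescript
// Illustrative sketch only. The PlatformClient shape below is inferred from
// the utils.ts snippet in this note; authFetch is a hypothetical helper
// showing the kind of plumbing being standardized, not the package's API.
interface PlatformClient {
  baseUrl: string;
  tokenProvider: () => Promise<string>;
  fetch: typeof globalThis.fetch;
}

// Wrap the client's fetch so every outgoing request carries the Foundry
// bearer token obtained from the same token provider the client already uses.
function authFetch(client: PlatformClient): typeof globalThis.fetch {
  return async (input, init) => {
    const token = await client.tokenProvider();
    const headers = new Headers(init?.headers);
    headers.set("Authorization", `Bearer ${token}`);
    return client.fetch(input, { ...init, headers });
  };
}

// Stub client standing in for a real Foundry connection. Its fetch echoes
// the Authorization header back so the wrapping is observable.
const client: PlatformClient = {
  baseUrl: "https://example.palantirfoundry.com/api",
  tokenProvider: async () => "stub-token",
  fetch: async (_input, init) => {
    const auth = new Headers(init?.headers).get("Authorization") ?? "";
    return new Response(auth);
  },
};

authFetch(client)("https://example.palantirfoundry.com/api/ping")
  .then((res) => res.text())
  .then((body) => console.log(body)); // "Bearer stub-token"
```

The point of the sketch is that no new auth surface is invented: the token provider and fetch implementation already living on the client are simply reused.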
Relevant code
Most revealing source snippet (published as `build/esm/utils.js`): `packages/language-models/src/utils.ts`
```ts
export function createFetch(client: PlatformClient): typeof globalThis.fetch {
  return client.fetch;
}

export function getApiBaseUrl(client: PlatformClient): string {
  return client.baseUrl;
}

export async function getAccessToken(client: PlatformClient): Promise<string> {
  return client.tokenProvider();
}
```
Notes
Why it matters
The package is technically small, but strategically meaningful.
What is new is not complex orchestration logic. It is the decision to expose a clean public path for using existing model SDKs through Foundry’s proxy by reusing the same PlatformClient token provider and fetch implementation.
That makes `@osdk/language-models` relevant as a direction-of-travel signal: Palantir is telling developers to keep using normal OpenAI/Anthropic clients, but route them through Foundry’s auth and proxy surface.
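A hedged sketch of what that direction of travel looks like in practice: reuse the three helpers to assemble the constructor options (`baseURL`, `apiKey`, `fetch`) that OpenAI- and Anthropic-style SDK constructors commonly accept. The `/models/openai/v1` proxy path and the `modelSdkOptions` helper are assumptions for illustration; only `createFetch`, `getApiBaseUrl`, and `getAccessToken` come from the quoted snippet.

```typescript
// Hypothetical wiring sketch. modelSdkOptions and the "/models/openai/v1"
// proxy path are assumptions for illustration; the three helpers mirror the
// utils.ts snippet quoted in this note.
interface PlatformClient {
  baseUrl: string;
  tokenProvider: () => Promise<string>;
  fetch: typeof globalThis.fetch;
}

function createFetch(client: PlatformClient): typeof globalThis.fetch {
  return client.fetch;
}
function getApiBaseUrl(client: PlatformClient): string {
  return client.baseUrl;
}
async function getAccessToken(client: PlatformClient): Promise<string> {
  return client.tokenProvider();
}

// Build options in the shape most OpenAI/Anthropic-compatible SDK
// constructors take, pointed at an assumed Foundry proxy path.
async function modelSdkOptions(client: PlatformClient) {
  return {
    baseURL: `${getApiBaseUrl(client)}/models/openai/v1`, // assumed proxy path
    apiKey: await getAccessToken(client),                 // Foundry token as API key
    fetch: createFetch(client),                           // reuse client transport
  };
}

// Stub client standing in for a real Foundry connection.
const foundryClient: PlatformClient = {
  baseUrl: "https://example.palantirfoundry.com/api",
  tokenProvider: async () => "stub-token",
  fetch: globalThis.fetch,
};

modelSdkOptions(foundryClient).then((opts) => {
  console.log(opts.baseURL); // https://example.palantirfoundry.com/api/models/openai/v1
  console.log(opts.apiKey);  // stub-token
});
```

The design choice worth noting: the model SDK stays stock; only its transport and credentials are swapped, which is exactly the "connection plumbing only" stance the package takes.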