From 0ad3200ac656e941305a0ec58bb305d4177659fb Mon Sep 17 00:00:00 2001 From: David Meadows Date: Fri, 27 Jun 2025 12:35:15 -0400 Subject: [PATCH 01/18] add draft contributing.md --- CONTRIBUTING.md | 209 ++++++++++++++++++++++++++++++++++++++++++++++++ 1 file changed, 209 insertions(+) create mode 100644 CONTRIBUTING.md diff --git a/CONTRIBUTING.md b/CONTRIBUTING.md new file mode 100644 index 00000000..5b5abbfa --- /dev/null +++ b/CONTRIBUTING.md @@ -0,0 +1,209 @@ +# Contributing to OpenAI Java SDK + +## Setting up the environment + +This repository uses [Gradle](https://gradle.org/) with Kotlin DSL for building and dependency management. The project requires Java 8+ to run, but development requires JDK 21 for the Kotlin toolchain. + +## Project Structure + +This is a multi-module Gradle project with the following modules: + +- **`openai-java-core/`** - Core SDK functionality with client implementations and models +- **`openai-java-client-okhttp/`** - OkHttp-based HTTP client implementation +- **`openai-java/`** - Main SDK module that aggregates other modules +- **`openai-java-example/`** - Example applications and usage demonstrations + +## Modifying/Adding code + +Most of the SDK is generated code. Modifications to code will be persisted between generations, but may +result in merge conflicts between manual patches and changes from the generator. The generator will never +modify the contents of the `openai-java-example/` directories. + +## Adding and running examples + +All files in the `openai-java-example/` directory are not modified by the generator and can be freely edited or added to. + +```java +// add an example to openai-java-example/src/main/java/com/openai/example/.java + +package com.openai.example; + +public class YourExample { + public static void main(String[] args) { + // ... + } +} +``` + +```sh +$ ./gradlew :openai-java-example:run -PmainClass=com.openai.example.YourExample +``` + +## Using the repository from source + +If you'd like to use the repository from source, you can either install from git or link to a cloned repository: + +To use a local version of this library from source in another project, you can publish it to your local Maven repository: + +```sh +$ ./gradlew publishToMavenLocal +``` + +Then in your project's `build.gradle.kts` or `pom.xml`, reference the locally published version: + +```kotlin +implementation("com.openai:openai-java:2.9.1") +``` + +```xml + + com.openai + openai-java + 2.9.1 + +``` + +Alternatively, you can build and install the JAR files directly: + +```sh +$ ./gradlew build +# JAR files will be available in each module's build/libs/ directory +``` + +## Running tests + +Most tests require [our mock server](https://github.com/stoplightio/prism) to be running against the OpenAPI spec to work. 
+ +The test script will automatically start the mock server for you (if it's not already running) and run the tests against it: + +```sh +$ ./scripts/test +``` + +You can also manually start the mock server if you want to run tests repeatedly: + +```sh +$ ./scripts/mock +``` + +Then run the tests: + +```sh +$ ./scripts/test + +# or directly with Gradle + +$ ./gradlew test + +``` + +### Test Configuration + +- Tests run in parallel for better performance +- Mock server runs on `localhost:4010` +- You can disable mock server tests with `SKIP_MOCK_TESTS=true` +- You can target a custom API URL with `TEST_API_BASE_URL=` + +## Linting and formatting + +This repository uses [Spotless](https://github.com/diffplug/spotless) with Palantir Java Format for code formatting and various linting tools. + +To check formatting and run lints: + +```sh +$ ./scripts/lint +``` + +This will compile all modules and run static analysis checks. You can also use the Gradle wrapper directly: + +```sh +$ ./gradlew build +``` + +To format and fix all formatting issues automatically: + +```sh +$ ./scripts/format +``` + +You can also run these directly with Gradle: + +```sh +$ ./gradlew spotlessCheck # Check formatting +$ ./gradlew spotlessApply # Apply formatting +``` + +## Building + +To build all modules: + +```sh +$ ./gradlew build +``` + +To build a specific module: + +```sh +$ ./gradlew :openai-java-core:build +``` + +## Publishing and releases + +Changes made to this repository via the automated release PR pipeline should publish to Maven Central automatically. If +the changes aren't made through the automated pipeline, you may want to make releases manually. + +### Publish with a GitHub workflow + +You can release to package managers by using [the `Publish Sonatype` GitHub action](https://www.github.com/openai/openai-java/actions/workflows/publish-sonatype.yml). This requires setup organization or repository secrets to be configured. + +### Publish manually + +If you need to manually release a package, you can run: + +```sh +$ ./gradlew publishToSonatype closeAndReleaseSonatypeStagingRepository +``` + +This requires the following environment variables to be set: + +- `SONATYPE_USER` - Your Sonatype Central Portal username +- `SONATYPE_PASSWORD` - Your Sonatype Central Portal password +- `GPG_SIGNING_KEY` - Your GPG private key for signing artifacts +- `GPG_SIGNING_PASSWORD` - Your GPG key passphrase + +Note: for now you'll need to comment out the line for `signAllPublications()` here: buildSrc/src/main/kotlin/openai.publish.gradle.kts + +## Development Tools + +### IDE Setup + +This project works well with IntelliJ IDEA and other IDEs that support Gradle and Kotlin. The repository includes: + +- Gradle build scripts with Kotlin DSL +- Spotless formatting configuration +- JUnit 5 test configuration + +### Available Gradle Tasks + +Some useful Gradle tasks: + +```sh +$ ./gradlew tasks # List all available tasks +$ ./gradlew build # Build all modules +$ ./gradlew test # Run all tests +$ ./gradlew spotlessApply # Format code +$ ./gradlew publishToMavenLocal # Publish to local Maven repository +$ ./gradlew dependencies # Show dependency tree +``` + +### Testing Framework + +The project uses: + +- **JUnit 5** for test framework +- **Mockito** for mocking +- **AssertJ** for fluent assertions +- **WireMock** for HTTP service mocking +- **Custom TestServerExtension** for mock server management + +Tests are organized by service in the `src/test/kotlin/com/openai/services/` directory with both blocking and async variants. 
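
To make the test layout described above concrete, here is a minimal illustrative sketch of a mock-server-backed service test. It assumes the JUnit 5 `TestServerExtension` is applied with `@ExtendWith` and that the client is built against `TestServerExtension.BASE_URL`; the specific `models().list()` call is only a stand-in for whichever service a real test targets, and the real suites under `src/test/kotlin/com/openai/services/` cover both blocking and async variants.

```kotlin
package com.openai.services

import com.openai.TestServerExtension
import com.openai.client.okhttp.OpenAIOkHttpClient
import org.assertj.core.api.Assertions.assertThat
import org.junit.jupiter.api.Test
import org.junit.jupiter.api.extension.ExtendWith

// Illustrative sketch: TestServerExtension ensures the Prism mock server is available
// (or disables the test when SKIP_MOCK_TESTS=true) before any test in this class runs.
@ExtendWith(TestServerExtension::class)
internal class ExampleServiceTest {

    @Test
    fun list() {
        // Build a client that talks to the mock server instead of the real API.
        val client =
            OpenAIOkHttpClient.builder()
                .baseUrl(TestServerExtension.BASE_URL)
                .apiKey("test-api-key")
                .build()

        // Hypothetical service call; real tests exercise each generated service.
        val models = client.models().list()

        assertThat(models).isNotNull()
    }
}
```

When `SKIP_MOCK_TESTS=true` is set, the extension's execution condition disables these tests rather than failing on a missing mock server.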
From d063ba7c0d3f2b17b211b98eedf475dc982ee4c3 Mon Sep 17 00:00:00 2001 From: David Meadows Date: Fri, 27 Jun 2025 12:37:32 -0400 Subject: [PATCH 02/18] remove a few things --- CONTRIBUTING.md | 10 ---------- 1 file changed, 10 deletions(-) diff --git a/CONTRIBUTING.md b/CONTRIBUTING.md index 5b5abbfa..ce57b70a 100644 --- a/CONTRIBUTING.md +++ b/CONTRIBUTING.md @@ -175,14 +175,6 @@ Note: for now you'll need to comment out the line for `signAllPublications()` he ## Development Tools -### IDE Setup - -This project works well with IntelliJ IDEA and other IDEs that support Gradle and Kotlin. The repository includes: - -- Gradle build scripts with Kotlin DSL -- Spotless formatting configuration -- JUnit 5 test configuration - ### Available Gradle Tasks Some useful Gradle tasks: @@ -205,5 +197,3 @@ The project uses: - **AssertJ** for fluent assertions - **WireMock** for HTTP service mocking - **Custom TestServerExtension** for mock server management - -Tests are organized by service in the `src/test/kotlin/com/openai/services/` directory with both blocking and async variants. From 88f4f5ac4fbe2e564d520b0601f12c0a2aecef72 Mon Sep 17 00:00:00 2001 From: dtmeadows Date: Wed, 2 Jul 2025 09:10:12 -0400 Subject: [PATCH 03/18] Apply suggestions from code review Co-authored-by: Tomer Aberbach --- CONTRIBUTING.md | 11 +++++------ 1 file changed, 5 insertions(+), 6 deletions(-) diff --git a/CONTRIBUTING.md b/CONTRIBUTING.md index ce57b70a..6df03267 100644 --- a/CONTRIBUTING.md +++ b/CONTRIBUTING.md @@ -2,9 +2,9 @@ ## Setting up the environment -This repository uses [Gradle](https://gradle.org/) with Kotlin DSL for building and dependency management. The project requires Java 8+ to run, but development requires JDK 21 for the Kotlin toolchain. +This repository uses [Gradle](https://gradle.org/) with Kotlin DSL for building and dependency management. The SDK requires Java 8, but development requires JDK 21 for the Kotlin toolchain. -## Project Structure +## Project structure This is a multi-module Gradle project with the following modules: @@ -13,19 +13,18 @@ This is a multi-module Gradle project with the following modules: - **`openai-java/`** - Main SDK module that aggregates other modules - **`openai-java-example/`** - Example applications and usage demonstrations -## Modifying/Adding code +## Modifying or adding code Most of the SDK is generated code. Modifications to code will be persisted between generations, but may result in merge conflicts between manual patches and changes from the generator. The generator will never -modify the contents of the `openai-java-example/` directories. +modify the contents of the `openai-java-example/` directory. ## Adding and running examples All files in the `openai-java-example/` directory are not modified by the generator and can be freely edited or added to. 
```java -// add an example to openai-java-example/src/main/java/com/openai/example/.java - +// openai-java-example/src/main/java/com/openai/example/YourExample.java package com.openai.example; public class YourExample { From a9752cf7822e90fe8aaabf6da4d7cefe06928b68 Mon Sep 17 00:00:00 2001 From: David Meadows Date: Tue, 15 Jul 2025 12:58:36 -0400 Subject: [PATCH 04/18] address more comments --- CONTRIBUTING.md | 119 +++++++++++++++++++++++------------------------- 1 file changed, 58 insertions(+), 61 deletions(-) diff --git a/CONTRIBUTING.md b/CONTRIBUTING.md index ce57b70a..f25d4025 100644 --- a/CONTRIBUTING.md +++ b/CONTRIBUTING.md @@ -4,44 +4,29 @@ This repository uses [Gradle](https://gradle.org/) with Kotlin DSL for building and dependency management. The project requires Java 8+ to run, but development requires JDK 21 for the Kotlin toolchain. -## Project Structure +## Project structure -This is a multi-module Gradle project with the following modules: +The SDK consists of three artifacts: -- **`openai-java-core/`** - Core SDK functionality with client implementations and models -- **`openai-java-client-okhttp/`** - OkHttp-based HTTP client implementation -- **`openai-java/`** - Main SDK module that aggregates other modules -- **`openai-java-example/`** - Example applications and usage demonstrations +- `openai-java-core` + - Contains core SDK logic + - Does not depend on [OkHttp](https://square.github.io/okhttp) + - Exposes [`OpenAIClient`](openai-java-core/src/main/kotlin/com/openai/client/OpenAIClient.kt), [`OpenAIClientAsync`](openai-java-core/src/main/kotlin/com/openai/client/OpenAIClientAsync.kt), [`OpenAIClientImpl`](openai-java-core/src/main/kotlin/com/openai/client/OpenAIClientImpl.kt), and [`OpenAIClientAsyncImpl`](openai-java-core/src/main/kotlin/com/openai/client/OpenAIClientAsyncImpl.kt), all of which can work with any HTTP client +- `openai-java-client-okhttp` + - Depends on [OkHttp](https://square.github.io/okhttp) + - Exposes [`OpenAIOkHttpClient`](openai-java-client-okhttp/src/main/kotlin/com/openai/client/okhttp/OpenAIOkHttpClient.kt) and [`OpenAIOkHttpClientAsync`](openai-java-client-okhttp/src/main/kotlin/com/openai/client/okhttp/OpenAIOkHttpClientAsync.kt), which provide a way to construct [`OpenAIClientImpl`](openai-java-core/src/main/kotlin/com/openai/client/OpenAIClientImpl.kt) and [`OpenAIClientAsyncImpl`](openai-java-core/src/main/kotlin/com/openai/client/OpenAIClientAsyncImpl.kt), respectively, using OkHttp +- `openai-java` + - Depends on and exposes the APIs of -## Modifying/Adding code +## Modifying/adding code Most of the SDK is generated code. Modifications to code will be persisted between generations, but may result in merge conflicts between manual patches and changes from the generator. The generator will never modify the contents of the `openai-java-example/` directories. -## Adding and running examples - -All files in the `openai-java-example/` directory are not modified by the generator and can be freely edited or added to. - -```java -// add an example to openai-java-example/src/main/java/com/openai/example/.java - -package com.openai.example; - -public class YourExample { - public static void main(String[] args) { - // ... 
- } -} -``` - -```sh -$ ./gradlew :openai-java-example:run -PmainClass=com.openai.example.YourExample -``` - ## Using the repository from source -If you'd like to use the repository from source, you can either install from git or link to a cloned repository: +If you'd like to use the repository from source, you can either [install from git](https://jitpack.io/) or link to a cloned repository. To use a local version of this library from source in another project, you can publish it to your local Maven repository: @@ -49,8 +34,12 @@ To use a local version of this library from source in another project, you can p $ ./gradlew publishToMavenLocal ``` +Note: for now you'll need to comment out the line for `signAllPublications()` here: buildSrc/src/main/kotlin/openai.publish.gradle.kts + Then in your project's `build.gradle.kts` or `pom.xml`, reference the locally published version: + + ```kotlin implementation("com.openai:openai-java:2.9.1") ``` @@ -63,13 +52,16 @@ implementation("com.openai:openai-java:2.9.1") ``` + + Alternatively, you can build and install the JAR files directly: ```sh $ ./gradlew build -# JAR files will be available in each module's build/libs/ directory ``` +JAR files will be available in each module's build/libs/ directory + ## Running tests Most tests require [our mock server](https://github.com/stoplightio/prism) to be running against the OpenAPI spec to work. @@ -91,19 +83,25 @@ Then run the tests: ```sh $ ./scripts/test -# or directly with Gradle - -$ ./gradlew test - ``` -### Test Configuration +### Test configuration - Tests run in parallel for better performance - Mock server runs on `localhost:4010` - You can disable mock server tests with `SKIP_MOCK_TESTS=true` - You can target a custom API URL with `TEST_API_BASE_URL=` +### Testing framework + +The project uses: + +- **JUnit 5** for test framework +- **Mockito** for mocking +- **AssertJ** for fluent assertions +- **WireMock** for HTTP service mocking +- **Custom TestServerExtension** for mock server management + ## Linting and formatting This repository uses [Spotless](https://github.com/diffplug/spotless) with Palantir Java Format for code formatting and various linting tools. @@ -114,11 +112,7 @@ To check formatting and run lints: $ ./scripts/lint ``` -This will compile all modules and run static analysis checks. You can also use the Gradle wrapper directly: - -```sh -$ ./gradlew build -``` +This will compile all modules and run static analysis checks. To format and fix all formatting issues automatically: @@ -126,11 +120,10 @@ To format and fix all formatting issues automatically: $ ./scripts/format ``` -You can also run these directly with Gradle: +You can also check formatting directly with Gradle: ```sh $ ./gradlew spotlessCheck # Check formatting -$ ./gradlew spotlessApply # Apply formatting ``` ## Building @@ -147,6 +140,22 @@ To build a specific module: $ ./gradlew :openai-java-core:build ``` +## Adding and running examples + +All files in the `openai-java-example/` directory are not modified by the generator and can be freely edited or added to. + +```java +// add an example to openai-java-example/src/main/java/com/openai/example/.java + +package com.openai.example; + +public class YourExample { + public static void main(String[] args) { + // ... + } +} +``` + ## Publishing and releases Changes made to this repository via the automated release PR pipeline should publish to Maven Central automatically. 
If @@ -171,29 +180,17 @@ This requires the following environment variables to be set: - `GPG_SIGNING_KEY` - Your GPG private key for signing artifacts - `GPG_SIGNING_PASSWORD` - Your GPG key passphrase -Note: for now you'll need to comment out the line for `signAllPublications()` here: buildSrc/src/main/kotlin/openai.publish.gradle.kts - -## Development Tools +## Development tools -### Available Gradle Tasks +### Available gradle tasks Some useful Gradle tasks: ```sh -$ ./gradlew tasks # List all available tasks -$ ./gradlew build # Build all modules -$ ./gradlew test # Run all tests -$ ./gradlew spotlessApply # Format code -$ ./gradlew publishToMavenLocal # Publish to local Maven repository -$ ./gradlew dependencies # Show dependency tree +$ ./gradlew tasks # List all available tasks +$ ./gradlew build # Build all modules +$ ./gradlew test # Run all tests +$ ./gradlew spotlessApply # Format code +$ ./gradlew publishToMavenLocal # Publish to local Maven repository +$ ./gradlew dependencies # Show dependency tree ``` - -### Testing Framework - -The project uses: - -- **JUnit 5** for test framework -- **Mockito** for mocking -- **AssertJ** for fluent assertions -- **WireMock** for HTTP service mocking -- **Custom TestServerExtension** for mock server management From 39d98f6dd769b203b725a18db8c23d1e9daafd1c Mon Sep 17 00:00:00 2001 From: Julien Dubois Date: Wed, 16 Jul 2025 11:42:23 +0200 Subject: [PATCH 05/18] Set up TestContainers instead of running the mock server from a script. - This removes the need to have NPM installed - This should be faster as there is no "npm install" - This removes the risk of having a mock server that keeps running in the background after the tests have benn run Fix #54 --- .github/workflows/ci.yml | 2 +- openai-java-core/build.gradle.kts | 2 + .../kotlin/com/openai/TestServerExtension.kt | 128 ++++++++++++++---- scripts/test | 56 -------- 4 files changed, 104 insertions(+), 84 deletions(-) delete mode 100755 scripts/test diff --git a/.github/workflows/ci.yml b/.github/workflows/ci.yml index 0ebdc70c..ebdbd517 100644 --- a/.github/workflows/ci.yml +++ b/.github/workflows/ci.yml @@ -57,7 +57,7 @@ jobs: uses: gradle/gradle-build-action@v2 - name: Run tests - run: ./scripts/test + run: ./gradlew test examples: timeout-minutes: 10 name: examples diff --git a/openai-java-core/build.gradle.kts b/openai-java-core/build.gradle.kts index 720aa139..9a3a84a1 100644 --- a/openai-java-core/build.gradle.kts +++ b/openai-java-core/build.gradle.kts @@ -42,4 +42,6 @@ dependencies { testImplementation("org.mockito:mockito-core:5.14.2") testImplementation("org.mockito:mockito-junit-jupiter:5.14.2") testImplementation("org.mockito.kotlin:mockito-kotlin:4.1.0") + testImplementation("org.testcontainers:testcontainers:1.19.8") + testImplementation("org.testcontainers:junit-jupiter:1.19.8") } diff --git a/openai-java-core/src/test/kotlin/com/openai/TestServerExtension.kt b/openai-java-core/src/test/kotlin/com/openai/TestServerExtension.kt index 7dfdef2b..4e82a14a 100644 --- a/openai-java-core/src/test/kotlin/com/openai/TestServerExtension.kt +++ b/openai-java-core/src/test/kotlin/com/openai/TestServerExtension.kt @@ -1,41 +1,122 @@ package com.openai import java.lang.RuntimeException -import java.net.URL +import java.io.File import org.junit.jupiter.api.extension.BeforeAllCallback import org.junit.jupiter.api.extension.ConditionEvaluationResult import org.junit.jupiter.api.extension.ExecutionCondition import org.junit.jupiter.api.extension.ExtensionContext +import 
org.testcontainers.containers.GenericContainer +import org.testcontainers.containers.wait.strategy.Wait +import org.testcontainers.utility.DockerImageName +import org.testcontainers.utility.MountableFile +import java.time.Duration class TestServerExtension : BeforeAllCallback, ExecutionCondition { - override fun beforeAll(context: ExtensionContext?) { - try { - URL(BASE_URL).openConnection().connect() - } catch (e: Exception) { - throw RuntimeException( - """ - The test suite will not run without a mock Prism server running against your OpenAPI spec. + companion object { + private const val INTERNAL_PORT = 4010 // Port inside the container - You can set the environment variable `SKIP_MOCK_TESTS` to `true` to skip running any tests - that require the mock server. + val BASE_URL: String + get() = "http://${prismContainer.host}:${prismContainer.getMappedPort(INTERNAL_PORT)}" + + const val SKIP_TESTS_ENV: String = "SKIP_MOCK_TESTS" + private const val PRISM_IMAGE = "stoplight/prism:5" + private const val API_SPEC_PATH = "/app/openapi.yml" // Path inside the container + + // Track if the container has been started + private var containerStarted = false + + private fun getOpenApiSpecPath(): String { + // First check environment variable + val envPath = System.getenv("OPENAPI_SPEC_PATH") + if (envPath != null) { + return envPath + } + + // Try to read from .stats.yml file + try { + val statsFile = File("../.stats.yml") + if (statsFile.exists()) { + val content = statsFile.readText() + val urlLine = content.lines().find { it.startsWith("openapi_spec_url:") } + if (urlLine != null) { + val url = urlLine.substringAfter("openapi_spec_url:").trim() + if (url.isNotEmpty()) { + return url + } + } + } + } catch (e: Exception) { + println("Could not read .stats.yml fails, fall back to default. Error is: ${e.message}") + } + return "/tmp/openapi.yml" + } + + private val prismContainer: GenericContainer<*> by lazy { + val apiSpecPath = getOpenApiSpecPath() + println("Using OpenAPI spec path: $apiSpecPath") + val isUrl = apiSpecPath.startsWith("http://") || apiSpecPath.startsWith("https://") + + // Create container with or without copying the file based on whether apiSpecPath is a URL + val container = GenericContainer(DockerImageName.parse(PRISM_IMAGE)) + .withExposedPorts(INTERNAL_PORT) + .withCommand("mock", apiSpecPath, "--host", "0.0.0.0", "--port", INTERNAL_PORT.toString()) + .withReuse(true) - To fix: + // Only copy the file to the container if apiSpecPath is a local file + if (!isUrl) { + try { + val file = File(apiSpecPath) + if (file.exists()) { + container.withCopyToContainer(MountableFile.forHostPath(apiSpecPath), API_SPEC_PATH) + } else { + println("OpenAPI spec file not found at: $apiSpecPath") + throw RuntimeException("OpenAPI spec file not found at: $apiSpecPath") + } + } catch (e: Exception) { + println("Error reading OpenAPI spec file: ${e.message}") + throw RuntimeException("Error reading OpenAPI spec file: $apiSpecPath", e) + } + } - 1. Install Prism (requires Node 16+): + // Add waiting strategy + container.waitingFor( + Wait.forLogMessage(".*Prism is listening.*", 1) + .withStartupTimeout(Duration.ofSeconds(300)) + ) - With npm: - $ npm install -g @stoplight/prism-cli + // Start the container here once during lazy initialization + container.start() + containerStarted = true + println("Prism container started at: ${container.host}:${container.getMappedPort(INTERNAL_PORT)}") - With yarn: - $ yarn global add @stoplight/prism-cli + container + } - 2. 
Run the mock server + // Method to ensure container is started, can be called from beforeAll + fun ensureContainerStarted() { + if (!containerStarted) { + // This will trigger lazy initialization and start the container + prismContainer + } + } + } - To run the server, pass in the path of your OpenAPI spec to the prism command: - $ prism mock path/to/your.openapi.yml + override fun beforeAll(context: ExtensionContext?) { + try { + // Use the companion method to ensure container is started only once + ensureContainerStarted() + } catch (e: Exception) { + throw RuntimeException( """ - .trimIndent(), + Failed to connect to Prism mock server running in TestContainer. + + You can set the environment variable `SKIP_MOCK_TESTS` to `true` to skip running any tests + that require the mock server. + + You may also need to set `OPENAPI_SPEC_PATH` to the path of your OpenAPI spec file. + """.trimIndent(), e, ) } @@ -52,11 +133,4 @@ class TestServerExtension : BeforeAllCallback, ExecutionCondition { ) } } - - companion object { - - val BASE_URL = System.getenv("TEST_API_BASE_URL") ?: "http://localhost:4010" - - const val SKIP_TESTS_ENV: String = "SKIP_MOCK_TESTS" - } } diff --git a/scripts/test b/scripts/test deleted file mode 100755 index 6b750a74..00000000 --- a/scripts/test +++ /dev/null @@ -1,56 +0,0 @@ -#!/usr/bin/env bash - -set -e - -cd "$(dirname "$0")/.." - -RED='\033[0;31m' -GREEN='\033[0;32m' -YELLOW='\033[0;33m' -NC='\033[0m' # No Color - -function prism_is_running() { - curl --silent "http://localhost:4010" >/dev/null 2>&1 -} - -kill_server_on_port() { - pids=$(lsof -t -i tcp:"$1" || echo "") - if [ "$pids" != "" ]; then - kill "$pids" - echo "Stopped $pids." - fi -} - -function is_overriding_api_base_url() { - [ -n "$TEST_API_BASE_URL" ] -} - -if ! is_overriding_api_base_url && ! prism_is_running ; then - # When we exit this script, make sure to kill the background mock server process - trap 'kill_server_on_port 4010' EXIT - - # Start the dev server - ./scripts/mock --daemon -fi - -if is_overriding_api_base_url ; then - echo -e "${GREEN}✔ Running tests against ${TEST_API_BASE_URL}${NC}" - echo -elif ! prism_is_running ; then - echo -e "${RED}ERROR:${NC} The test suite will not run without a mock Prism server" - echo -e "running against your OpenAPI spec." 
- echo - echo -e "To run the server, pass in the path or url of your OpenAPI" - echo -e "spec to the prism command:" - echo - echo -e " \$ ${YELLOW}npm exec --package=@stoplight/prism-cli@~5.3.2 -- prism mock path/to/your.openapi.yml${NC}" - echo - - exit 1 -else - echo -e "${GREEN}✔ Mock prism server is running with your OpenAPI spec${NC}" - echo -fi - -echo "==> Running tests" -./gradlew test From 3786a1e88ee2691b6e7783ca825f391dd780798c Mon Sep 17 00:00:00 2001 From: dtmeadows Date: Wed, 16 Jul 2025 12:37:51 -0400 Subject: [PATCH 06/18] Update CONTRIBUTING.md Co-authored-by: Tomer Aberbach --- CONTRIBUTING.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/CONTRIBUTING.md b/CONTRIBUTING.md index 60179411..a90937c3 100644 --- a/CONTRIBUTING.md +++ b/CONTRIBUTING.md @@ -53,7 +53,7 @@ To use a local version of this library from source in another project, you can p $ ./gradlew publishToMavenLocal ``` -Note: for now you'll need to comment out the line for `signAllPublications()` here: buildSrc/src/main/kotlin/openai.publish.gradle.kts +Note: for now you'll need to comment out the line for `signAllPublications()` here: `buildSrc/src/main/kotlin/openai.publish.gradle.kts` Then in your project's `build.gradle.kts` or `pom.xml`, reference the locally published version: From abe671e74674c11bb7993bfa9839eb559914305a Mon Sep 17 00:00:00 2001 From: dtmeadows Date: Wed, 16 Jul 2025 12:38:24 -0400 Subject: [PATCH 07/18] Apply suggestions from code review Co-authored-by: Tomer Aberbach --- CONTRIBUTING.md | 4 ++-- 1 file changed, 2 insertions(+), 2 deletions(-) diff --git a/CONTRIBUTING.md b/CONTRIBUTING.md index a90937c3..8332cecf 100644 --- a/CONTRIBUTING.md +++ b/CONTRIBUTING.md @@ -79,7 +79,7 @@ Alternatively, you can build and install the JAR files directly: $ ./gradlew build ``` -JAR files will be available in each module's build/libs/ directory +JAR files will be available in each module's `build/libs/` directory. ## Running tests @@ -133,7 +133,7 @@ $ ./scripts/lint This will compile all modules and run static analysis checks. 
-To format and fix all formatting issues automatically: +To fix all formatting issues automatically: ```sh $ ./scripts/format From 190570349808ab08acd7195659426670f8bcb055 Mon Sep 17 00:00:00 2001 From: David Meadows Date: Wed, 16 Jul 2025 17:21:19 -0400 Subject: [PATCH 08/18] pr comments --- CONTRIBUTING.md | 6 ++++-- 1 file changed, 4 insertions(+), 2 deletions(-) diff --git a/CONTRIBUTING.md b/CONTRIBUTING.md index 8332cecf..b2884477 100644 --- a/CONTRIBUTING.md +++ b/CONTRIBUTING.md @@ -16,7 +16,8 @@ The SDK consists of three artifacts: - Depends on [OkHttp](https://square.github.io/okhttp) - Exposes [`OpenAIOkHttpClient`](openai-java-client-okhttp/src/main/kotlin/com/openai/client/okhttp/OpenAIOkHttpClient.kt) and [`OpenAIOkHttpClientAsync`](openai-java-client-okhttp/src/main/kotlin/com/openai/client/okhttp/OpenAIOkHttpClientAsync.kt), which provide a way to construct [`OpenAIClientImpl`](openai-java-core/src/main/kotlin/com/openai/client/OpenAIClientImpl.kt) and [`OpenAIClientAsyncImpl`](openai-java-core/src/main/kotlin/com/openai/client/OpenAIClientAsyncImpl.kt), respectively, using OkHttp - `openai-java` - - Depends on and exposes the APIs of + - Depends on and exposes the APIs of both `openai-java-core` and `openai-java-client-okhttp` + - Does not have its own logic ## Modifying or adding code @@ -53,7 +54,8 @@ To use a local version of this library from source in another project, you can p $ ./gradlew publishToMavenLocal ``` -Note: for now you'll need to comment out the line for `signAllPublications()` here: `buildSrc/src/main/kotlin/openai.publish.gradle.kts` +> [!NOTE] +> For now, to publish locally, you'll need to comment out the line for `signAllPublications()` here: `buildSrc/src/main/kotlin/openai.publish.gradle.kts` Then in your project's `build.gradle.kts` or `pom.xml`, reference the locally published version: From 3239c2de360864456786043a2a3ffb1a71ac0a45 Mon Sep 17 00:00:00 2001 From: "stainless-app[bot]" <142633134+stainless-app[bot]@users.noreply.github.com> Date: Wed, 16 Jul 2025 21:54:20 +0000 Subject: [PATCH 09/18] chore(internal): allow running specific example from cli --- openai-java-example/build.gradle.kts | 9 ++++++++- 1 file changed, 8 insertions(+), 1 deletion(-) diff --git a/openai-java-example/build.gradle.kts b/openai-java-example/build.gradle.kts index 534eb16d..48509e3c 100644 --- a/openai-java-example/build.gradle.kts +++ b/openai-java-example/build.gradle.kts @@ -18,5 +18,12 @@ tasks.withType().configureEach { } application { - mainClass = "com.openai.example.Main" + // Use `./gradlew :openai-java-example:run` to run `Main` + // Use `./gradlew :openai-java-example:run -Dexample=Something` to run `SomethingExample` + mainClass = "com.openai.example.${ + if (project.hasProperty("example")) + "${project.property("example")}Example" + else + "Main" + }" } From a00c39b9b1e06a15fa3a0b2b495adfff86cddd10 Mon Sep 17 00:00:00 2001 From: "stainless-app[bot]" <142633134+stainless-app[bot]@users.noreply.github.com> Date: Thu, 17 Jul 2025 17:49:06 +0000 Subject: [PATCH 10/18] fix(client): ensure error handling always occurs --- .../com/openai/core/handlers/ErrorHandler.kt | 26 +- .../services/async/BatchServiceAsyncImpl.kt | 25 +- .../async/CompletionServiceAsyncImpl.kt | 17 +- .../async/ContainerServiceAsyncImpl.kt | 21 +- .../async/EmbeddingServiceAsyncImpl.kt | 10 +- .../services/async/EvalServiceAsyncImpl.kt | 25 +- .../services/async/FileServiceAsyncImpl.kt | 27 +- .../services/async/ImageServiceAsyncImpl.kt | 31 +- .../services/async/ModelServiceAsyncImpl.kt 
| 19 +- .../async/ModerationServiceAsyncImpl.kt | 10 +- .../async/ResponseServiceAsyncImpl.kt | 36 +- .../services/async/UploadServiceAsyncImpl.kt | 22 +- .../async/VectorStoreServiceAsyncImpl.kt | 29 +- .../async/audio/SpeechServiceAsyncImpl.kt | 11 +- .../audio/TranscriptionServiceAsyncImpl.kt | 15 +- .../audio/TranslationServiceAsyncImpl.kt | 10 +- .../beta/realtime/SessionServiceAsyncImpl.kt | 10 +- .../TranscriptionSessionServiceAsyncImpl.kt | 10 +- .../chat/ChatCompletionServiceAsyncImpl.kt | 31 +- .../completions/MessageServiceAsyncImpl.kt | 10 +- .../async/containers/FileServiceAsyncImpl.kt | 22 +- .../files/ContentServiceAsyncImpl.kt | 11 +- .../async/evals/RunServiceAsyncImpl.kt | 25 +- .../evals/runs/OutputItemServiceAsyncImpl.kt | 13 +- .../async/finetuning/JobServiceAsyncImpl.kt | 33 +- .../alpha/GraderServiceAsyncImpl.kt | 14 +- .../checkpoints/PermissionServiceAsyncImpl.kt | 16 +- .../jobs/CheckpointServiceAsyncImpl.kt | 10 +- .../responses/InputItemServiceAsyncImpl.kt | 11 +- .../async/uploads/PartServiceAsyncImpl.kt | 11 +- .../vectorstores/FileBatchServiceAsyncImpl.kt | 19 +- .../vectorstores/FileServiceAsyncImpl.kt | 28 +- .../services/blocking/BatchServiceImpl.kt | 25 +- .../blocking/CompletionServiceImpl.kt | 17 +- .../services/blocking/ContainerServiceImpl.kt | 21 +- .../services/blocking/EmbeddingServiceImpl.kt | 10 +- .../services/blocking/EvalServiceImpl.kt | 25 +- .../services/blocking/FileServiceImpl.kt | 24 +- .../services/blocking/ImageServiceImpl.kt | 31 +- .../services/blocking/ModelServiceImpl.kt | 19 +- .../blocking/ModerationServiceImpl.kt | 10 +- .../services/blocking/ResponseServiceImpl.kt | 36 +- .../services/blocking/UploadServiceImpl.kt | 22 +- .../blocking/VectorStoreServiceImpl.kt | 29 +- .../blocking/audio/SpeechServiceImpl.kt | 8 +- .../audio/TranscriptionServiceImpl.kt | 15 +- .../blocking/audio/TranslationServiceImpl.kt | 10 +- .../beta/realtime/SessionServiceImpl.kt | 10 +- .../TranscriptionSessionServiceImpl.kt | 10 +- .../chat/ChatCompletionServiceImpl.kt | 31 +- .../chat/completions/MessageServiceImpl.kt | 10 +- .../blocking/containers/FileServiceImpl.kt | 22 +- .../containers/files/ContentServiceImpl.kt | 8 +- .../services/blocking/evals/RunServiceImpl.kt | 25 +- .../evals/runs/OutputItemServiceImpl.kt | 13 +- .../blocking/finetuning/JobServiceImpl.kt | 33 +- .../finetuning/alpha/GraderServiceImpl.kt | 14 +- .../checkpoints/PermissionServiceImpl.kt | 16 +- .../finetuning/jobs/CheckpointServiceImpl.kt | 10 +- .../responses/InputItemServiceImpl.kt | 11 +- .../blocking/uploads/PartServiceImpl.kt | 11 +- .../vectorstores/FileBatchServiceImpl.kt | 19 +- .../blocking/vectorstores/FileServiceImpl.kt | 28 +- .../com/openai/services/ErrorHandlingTest.kt | 974 +++++++++++++++++- 64 files changed, 1515 insertions(+), 640 deletions(-) diff --git a/openai-java-core/src/main/kotlin/com/openai/core/handlers/ErrorHandler.kt b/openai-java-core/src/main/kotlin/com/openai/core/handlers/ErrorHandler.kt index 90486647..1748c1b7 100644 --- a/openai-java-core/src/main/kotlin/com/openai/core/handlers/ErrorHandler.kt +++ b/openai-java-core/src/main/kotlin/com/openai/core/handlers/ErrorHandler.kt @@ -20,7 +20,7 @@ import com.openai.errors.UnprocessableEntityException import com.openai.models.ErrorObject @JvmSynthetic -internal fun errorHandler(jsonMapper: JsonMapper): Handler { +internal fun errorBodyHandler(jsonMapper: JsonMapper): Handler { val handler = jsonHandler(jsonMapper) return object : Handler { @@ -36,52 +36,52 @@ internal fun errorHandler(jsonMapper: 
JsonMapper): Handler { } @JvmSynthetic -internal fun Handler.withErrorHandler(errorHandler: Handler): Handler = - object : Handler { - override fun handle(response: HttpResponse): T = +internal fun errorHandler(errorBodyHandler: Handler): Handler = + object : Handler { + override fun handle(response: HttpResponse): HttpResponse = when (val statusCode = response.statusCode()) { - in 200..299 -> this@withErrorHandler.handle(response) + in 200..299 -> response 400 -> throw BadRequestException.builder() .headers(response.headers()) - .error(errorHandler.handle(response)) + .error(errorBodyHandler.handle(response)) .build() 401 -> throw UnauthorizedException.builder() .headers(response.headers()) - .error(errorHandler.handle(response)) + .error(errorBodyHandler.handle(response)) .build() 403 -> throw PermissionDeniedException.builder() .headers(response.headers()) - .error(errorHandler.handle(response)) + .error(errorBodyHandler.handle(response)) .build() 404 -> throw NotFoundException.builder() .headers(response.headers()) - .error(errorHandler.handle(response)) + .error(errorBodyHandler.handle(response)) .build() 422 -> throw UnprocessableEntityException.builder() .headers(response.headers()) - .error(errorHandler.handle(response)) + .error(errorBodyHandler.handle(response)) .build() 429 -> throw RateLimitException.builder() .headers(response.headers()) - .error(errorHandler.handle(response)) + .error(errorBodyHandler.handle(response)) .build() in 500..599 -> throw InternalServerException.builder() .statusCode(statusCode) .headers(response.headers()) - .error(errorHandler.handle(response)) + .error(errorBodyHandler.handle(response)) .build() else -> throw UnexpectedStatusCodeException.builder() .statusCode(statusCode) .headers(response.headers()) - .error(errorHandler.handle(response)) + .error(errorBodyHandler.handle(response)) .build() } } diff --git a/openai-java-core/src/main/kotlin/com/openai/services/async/BatchServiceAsyncImpl.kt b/openai-java-core/src/main/kotlin/com/openai/services/async/BatchServiceAsyncImpl.kt index 81fb29e1..e2bc77ff 100644 --- a/openai-java-core/src/main/kotlin/com/openai/services/async/BatchServiceAsyncImpl.kt +++ b/openai-java-core/src/main/kotlin/com/openai/services/async/BatchServiceAsyncImpl.kt @@ -5,17 +5,17 @@ package com.openai.services.async import com.openai.core.ClientOptions import com.openai.core.RequestOptions import com.openai.core.checkRequired +import com.openai.core.handlers.errorBodyHandler import com.openai.core.handlers.errorHandler import com.openai.core.handlers.jsonHandler -import com.openai.core.handlers.withErrorHandler import com.openai.core.http.HttpMethod import com.openai.core.http.HttpRequest +import com.openai.core.http.HttpResponse import com.openai.core.http.HttpResponse.Handler import com.openai.core.http.HttpResponseFor import com.openai.core.http.json import com.openai.core.http.parseable import com.openai.core.prepareAsync -import com.openai.models.ErrorObject import com.openai.models.batches.Batch import com.openai.models.batches.BatchCancelParams import com.openai.models.batches.BatchCreateParams @@ -70,7 +70,8 @@ class BatchServiceAsyncImpl internal constructor(private val clientOptions: Clie class WithRawResponseImpl internal constructor(private val clientOptions: ClientOptions) : BatchServiceAsync.WithRawResponse { - private val errorHandler: Handler = errorHandler(clientOptions.jsonMapper) + private val errorHandler: Handler = + errorHandler(errorBodyHandler(clientOptions.jsonMapper)) override fun withOptions( modifier: 
Consumer @@ -79,8 +80,7 @@ class BatchServiceAsyncImpl internal constructor(private val clientOptions: Clie clientOptions.toBuilder().apply(modifier::accept).build() ) - private val createHandler: Handler = - jsonHandler(clientOptions.jsonMapper).withErrorHandler(errorHandler) + private val createHandler: Handler = jsonHandler(clientOptions.jsonMapper) override fun create( params: BatchCreateParams, @@ -98,7 +98,7 @@ class BatchServiceAsyncImpl internal constructor(private val clientOptions: Clie return request .thenComposeAsync { clientOptions.httpClient.executeAsync(it, requestOptions) } .thenApply { response -> - response.parseable { + errorHandler.handle(response).parseable { response .use { createHandler.handle(it) } .also { @@ -110,8 +110,7 @@ class BatchServiceAsyncImpl internal constructor(private val clientOptions: Clie } } - private val retrieveHandler: Handler = - jsonHandler(clientOptions.jsonMapper).withErrorHandler(errorHandler) + private val retrieveHandler: Handler = jsonHandler(clientOptions.jsonMapper) override fun retrieve( params: BatchRetrieveParams, @@ -131,7 +130,7 @@ class BatchServiceAsyncImpl internal constructor(private val clientOptions: Clie return request .thenComposeAsync { clientOptions.httpClient.executeAsync(it, requestOptions) } .thenApply { response -> - response.parseable { + errorHandler.handle(response).parseable { response .use { retrieveHandler.handle(it) } .also { @@ -145,7 +144,6 @@ class BatchServiceAsyncImpl internal constructor(private val clientOptions: Clie private val listHandler: Handler = jsonHandler(clientOptions.jsonMapper) - .withErrorHandler(errorHandler) override fun list( params: BatchListParams, @@ -162,7 +160,7 @@ class BatchServiceAsyncImpl internal constructor(private val clientOptions: Clie return request .thenComposeAsync { clientOptions.httpClient.executeAsync(it, requestOptions) } .thenApply { response -> - response.parseable { + errorHandler.handle(response).parseable { response .use { listHandler.handle(it) } .also { @@ -182,8 +180,7 @@ class BatchServiceAsyncImpl internal constructor(private val clientOptions: Clie } } - private val cancelHandler: Handler = - jsonHandler(clientOptions.jsonMapper).withErrorHandler(errorHandler) + private val cancelHandler: Handler = jsonHandler(clientOptions.jsonMapper) override fun cancel( params: BatchCancelParams, @@ -204,7 +201,7 @@ class BatchServiceAsyncImpl internal constructor(private val clientOptions: Clie return request .thenComposeAsync { clientOptions.httpClient.executeAsync(it, requestOptions) } .thenApply { response -> - response.parseable { + errorHandler.handle(response).parseable { response .use { cancelHandler.handle(it) } .also { diff --git a/openai-java-core/src/main/kotlin/com/openai/services/async/CompletionServiceAsyncImpl.kt b/openai-java-core/src/main/kotlin/com/openai/services/async/CompletionServiceAsyncImpl.kt index a178172d..ec054da8 100644 --- a/openai-java-core/src/main/kotlin/com/openai/services/async/CompletionServiceAsyncImpl.kt +++ b/openai-java-core/src/main/kotlin/com/openai/services/async/CompletionServiceAsyncImpl.kt @@ -5,14 +5,15 @@ package com.openai.services.async import com.openai.core.ClientOptions import com.openai.core.JsonValue import com.openai.core.RequestOptions +import com.openai.core.handlers.errorBodyHandler import com.openai.core.handlers.errorHandler import com.openai.core.handlers.jsonHandler import com.openai.core.handlers.mapJson import com.openai.core.handlers.sseHandler -import com.openai.core.handlers.withErrorHandler import 
com.openai.core.http.AsyncStreamResponse import com.openai.core.http.HttpMethod import com.openai.core.http.HttpRequest +import com.openai.core.http.HttpResponse import com.openai.core.http.HttpResponse.Handler import com.openai.core.http.HttpResponseFor import com.openai.core.http.StreamResponse @@ -21,7 +22,6 @@ import com.openai.core.http.map import com.openai.core.http.parseable import com.openai.core.http.toAsync import com.openai.core.prepareAsync -import com.openai.models.ErrorObject import com.openai.models.completions.Completion import com.openai.models.completions.CompletionCreateParams import java.util.concurrent.CompletableFuture @@ -59,7 +59,8 @@ class CompletionServiceAsyncImpl internal constructor(private val clientOptions: class WithRawResponseImpl internal constructor(private val clientOptions: ClientOptions) : CompletionServiceAsync.WithRawResponse { - private val errorHandler: Handler = errorHandler(clientOptions.jsonMapper) + private val errorHandler: Handler = + errorHandler(errorBodyHandler(clientOptions.jsonMapper)) override fun withOptions( modifier: Consumer @@ -69,7 +70,7 @@ class CompletionServiceAsyncImpl internal constructor(private val clientOptions: ) private val createHandler: Handler = - jsonHandler(clientOptions.jsonMapper).withErrorHandler(errorHandler) + jsonHandler(clientOptions.jsonMapper) override fun create( params: CompletionCreateParams, @@ -87,7 +88,7 @@ class CompletionServiceAsyncImpl internal constructor(private val clientOptions: return request .thenComposeAsync { clientOptions.httpClient.executeAsync(it, requestOptions) } .thenApply { response -> - response.parseable { + errorHandler.handle(response).parseable { response .use { createHandler.handle(it) } .also { @@ -100,9 +101,7 @@ class CompletionServiceAsyncImpl internal constructor(private val clientOptions: } private val createStreamingHandler: Handler> = - sseHandler(clientOptions.jsonMapper) - .mapJson() - .withErrorHandler(errorHandler) + sseHandler(clientOptions.jsonMapper).mapJson() override fun createStreaming( params: CompletionCreateParams, @@ -129,7 +128,7 @@ class CompletionServiceAsyncImpl internal constructor(private val clientOptions: return request .thenComposeAsync { clientOptions.httpClient.executeAsync(it, requestOptions) } .thenApply { response -> - response.parseable { + errorHandler.handle(response).parseable { response .let { createStreamingHandler.handle(it) } .let { streamResponse -> diff --git a/openai-java-core/src/main/kotlin/com/openai/services/async/ContainerServiceAsyncImpl.kt b/openai-java-core/src/main/kotlin/com/openai/services/async/ContainerServiceAsyncImpl.kt index fdda439b..68445392 100644 --- a/openai-java-core/src/main/kotlin/com/openai/services/async/ContainerServiceAsyncImpl.kt +++ b/openai-java-core/src/main/kotlin/com/openai/services/async/ContainerServiceAsyncImpl.kt @@ -6,9 +6,9 @@ import com.openai.core.ClientOptions import com.openai.core.RequestOptions import com.openai.core.checkRequired import com.openai.core.handlers.emptyHandler +import com.openai.core.handlers.errorBodyHandler import com.openai.core.handlers.errorHandler import com.openai.core.handlers.jsonHandler -import com.openai.core.handlers.withErrorHandler import com.openai.core.http.HttpMethod import com.openai.core.http.HttpRequest import com.openai.core.http.HttpResponse @@ -17,7 +17,6 @@ import com.openai.core.http.HttpResponseFor import com.openai.core.http.json import com.openai.core.http.parseable import com.openai.core.prepareAsync -import com.openai.models.ErrorObject 
import com.openai.models.containers.ContainerCreateParams import com.openai.models.containers.ContainerCreateResponse import com.openai.models.containers.ContainerDeleteParams @@ -79,7 +78,8 @@ class ContainerServiceAsyncImpl internal constructor(private val clientOptions: class WithRawResponseImpl internal constructor(private val clientOptions: ClientOptions) : ContainerServiceAsync.WithRawResponse { - private val errorHandler: Handler = errorHandler(clientOptions.jsonMapper) + private val errorHandler: Handler = + errorHandler(errorBodyHandler(clientOptions.jsonMapper)) private val files: FileServiceAsync.WithRawResponse by lazy { FileServiceAsyncImpl.WithRawResponseImpl(clientOptions) @@ -96,7 +96,6 @@ class ContainerServiceAsyncImpl internal constructor(private val clientOptions: private val createHandler: Handler = jsonHandler(clientOptions.jsonMapper) - .withErrorHandler(errorHandler) override fun create( params: ContainerCreateParams, @@ -114,7 +113,7 @@ class ContainerServiceAsyncImpl internal constructor(private val clientOptions: return request .thenComposeAsync { clientOptions.httpClient.executeAsync(it, requestOptions) } .thenApply { response -> - response.parseable { + errorHandler.handle(response).parseable { response .use { createHandler.handle(it) } .also { @@ -128,7 +127,6 @@ class ContainerServiceAsyncImpl internal constructor(private val clientOptions: private val retrieveHandler: Handler = jsonHandler(clientOptions.jsonMapper) - .withErrorHandler(errorHandler) override fun retrieve( params: ContainerRetrieveParams, @@ -148,7 +146,7 @@ class ContainerServiceAsyncImpl internal constructor(private val clientOptions: return request .thenComposeAsync { clientOptions.httpClient.executeAsync(it, requestOptions) } .thenApply { response -> - response.parseable { + errorHandler.handle(response).parseable { response .use { retrieveHandler.handle(it) } .also { @@ -162,7 +160,6 @@ class ContainerServiceAsyncImpl internal constructor(private val clientOptions: private val listHandler: Handler = jsonHandler(clientOptions.jsonMapper) - .withErrorHandler(errorHandler) override fun list( params: ContainerListParams, @@ -179,7 +176,7 @@ class ContainerServiceAsyncImpl internal constructor(private val clientOptions: return request .thenComposeAsync { clientOptions.httpClient.executeAsync(it, requestOptions) } .thenApply { response -> - response.parseable { + errorHandler.handle(response).parseable { response .use { listHandler.handle(it) } .also { @@ -199,7 +196,7 @@ class ContainerServiceAsyncImpl internal constructor(private val clientOptions: } } - private val deleteHandler: Handler = emptyHandler().withErrorHandler(errorHandler) + private val deleteHandler: Handler = emptyHandler() override fun delete( params: ContainerDeleteParams, @@ -220,7 +217,9 @@ class ContainerServiceAsyncImpl internal constructor(private val clientOptions: return request .thenComposeAsync { clientOptions.httpClient.executeAsync(it, requestOptions) } .thenApply { response -> - response.parseable { response.use { deleteHandler.handle(it) } } + errorHandler.handle(response).parseable { + response.use { deleteHandler.handle(it) } + } } } } diff --git a/openai-java-core/src/main/kotlin/com/openai/services/async/EmbeddingServiceAsyncImpl.kt b/openai-java-core/src/main/kotlin/com/openai/services/async/EmbeddingServiceAsyncImpl.kt index 4f9f4888..a8ccfd32 100644 --- a/openai-java-core/src/main/kotlin/com/openai/services/async/EmbeddingServiceAsyncImpl.kt +++ 
b/openai-java-core/src/main/kotlin/com/openai/services/async/EmbeddingServiceAsyncImpl.kt @@ -4,17 +4,17 @@ package com.openai.services.async import com.openai.core.ClientOptions import com.openai.core.RequestOptions +import com.openai.core.handlers.errorBodyHandler import com.openai.core.handlers.errorHandler import com.openai.core.handlers.jsonHandler -import com.openai.core.handlers.withErrorHandler import com.openai.core.http.HttpMethod import com.openai.core.http.HttpRequest +import com.openai.core.http.HttpResponse import com.openai.core.http.HttpResponse.Handler import com.openai.core.http.HttpResponseFor import com.openai.core.http.json import com.openai.core.http.parseable import com.openai.core.prepareAsync -import com.openai.models.ErrorObject import com.openai.models.embeddings.CreateEmbeddingResponse import com.openai.models.embeddings.EmbeddingCreateParams import java.util.concurrent.CompletableFuture @@ -42,7 +42,8 @@ class EmbeddingServiceAsyncImpl internal constructor(private val clientOptions: class WithRawResponseImpl internal constructor(private val clientOptions: ClientOptions) : EmbeddingServiceAsync.WithRawResponse { - private val errorHandler: Handler = errorHandler(clientOptions.jsonMapper) + private val errorHandler: Handler = + errorHandler(errorBodyHandler(clientOptions.jsonMapper)) override fun withOptions( modifier: Consumer @@ -53,7 +54,6 @@ class EmbeddingServiceAsyncImpl internal constructor(private val clientOptions: private val createHandler: Handler = jsonHandler(clientOptions.jsonMapper) - .withErrorHandler(errorHandler) override fun create( params: EmbeddingCreateParams, @@ -71,7 +71,7 @@ class EmbeddingServiceAsyncImpl internal constructor(private val clientOptions: return request .thenComposeAsync { clientOptions.httpClient.executeAsync(it, requestOptions) } .thenApply { response -> - response.parseable { + errorHandler.handle(response).parseable { response .use { createHandler.handle(it) } .also { diff --git a/openai-java-core/src/main/kotlin/com/openai/services/async/EvalServiceAsyncImpl.kt b/openai-java-core/src/main/kotlin/com/openai/services/async/EvalServiceAsyncImpl.kt index 4bb6bfca..4010dc18 100644 --- a/openai-java-core/src/main/kotlin/com/openai/services/async/EvalServiceAsyncImpl.kt +++ b/openai-java-core/src/main/kotlin/com/openai/services/async/EvalServiceAsyncImpl.kt @@ -5,17 +5,17 @@ package com.openai.services.async import com.openai.core.ClientOptions import com.openai.core.RequestOptions import com.openai.core.checkRequired +import com.openai.core.handlers.errorBodyHandler import com.openai.core.handlers.errorHandler import com.openai.core.handlers.jsonHandler -import com.openai.core.handlers.withErrorHandler import com.openai.core.http.HttpMethod import com.openai.core.http.HttpRequest +import com.openai.core.http.HttpResponse import com.openai.core.http.HttpResponse.Handler import com.openai.core.http.HttpResponseFor import com.openai.core.http.json import com.openai.core.http.parseable import com.openai.core.prepareAsync -import com.openai.models.ErrorObject import com.openai.models.evals.EvalCreateParams import com.openai.models.evals.EvalCreateResponse import com.openai.models.evals.EvalDeleteParams @@ -87,7 +87,8 @@ class EvalServiceAsyncImpl internal constructor(private val clientOptions: Clien class WithRawResponseImpl internal constructor(private val clientOptions: ClientOptions) : EvalServiceAsync.WithRawResponse { - private val errorHandler: Handler = errorHandler(clientOptions.jsonMapper) + private val errorHandler: 
Handler =
+            errorHandler(errorBodyHandler(clientOptions.jsonMapper))

         private val runs: RunServiceAsync.WithRawResponse by lazy {
             RunServiceAsyncImpl.WithRawResponseImpl(clientOptions)
@@ -103,7 +104,7 @@ class EvalServiceAsyncImpl internal constructor(private val clientOptions: Clien
         override fun runs(): RunServiceAsync.WithRawResponse = runs

         private val createHandler: Handler =
-            jsonHandler(clientOptions.jsonMapper).withErrorHandler(errorHandler)
+            jsonHandler(clientOptions.jsonMapper)

         override fun create(
             params: EvalCreateParams,
@@ -121,7 +122,7 @@ class EvalServiceAsyncImpl internal constructor(private val clientOptions: Clien
             return request
                 .thenComposeAsync { clientOptions.httpClient.executeAsync(it, requestOptions) }
                 .thenApply { response ->
-                    response.parseable {
+                    errorHandler.handle(response).parseable {
                         response
                             .use { createHandler.handle(it) }
                             .also {
@@ -135,7 +136,6 @@ class EvalServiceAsyncImpl internal constructor(private val clientOptions: Clien

         private val retrieveHandler: Handler =
             jsonHandler(clientOptions.jsonMapper)
-                .withErrorHandler(errorHandler)

         override fun retrieve(
             params: EvalRetrieveParams,
@@ -155,7 +155,7 @@ class EvalServiceAsyncImpl internal constructor(private val clientOptions: Clien
             return request
                 .thenComposeAsync { clientOptions.httpClient.executeAsync(it, requestOptions) }
                 .thenApply { response ->
-                    response.parseable {
+                    errorHandler.handle(response).parseable {
                         response
                             .use { retrieveHandler.handle(it) }
                             .also {
@@ -168,7 +168,7 @@ class EvalServiceAsyncImpl internal constructor(private val clientOptions: Clien
         }

         private val updateHandler: Handler =
-            jsonHandler(clientOptions.jsonMapper).withErrorHandler(errorHandler)
+            jsonHandler(clientOptions.jsonMapper)

         override fun update(
             params: EvalUpdateParams,
@@ -189,7 +189,7 @@ class EvalServiceAsyncImpl internal constructor(private val clientOptions: Clien
             return request
                 .thenComposeAsync { clientOptions.httpClient.executeAsync(it, requestOptions) }
                 .thenApply { response ->
-                    response.parseable {
+                    errorHandler.handle(response).parseable {
                         response
                             .use { updateHandler.handle(it) }
                             .also {
@@ -203,7 +203,6 @@ class EvalServiceAsyncImpl internal constructor(private val clientOptions: Clien

         private val listHandler: Handler =
             jsonHandler(clientOptions.jsonMapper)
-                .withErrorHandler(errorHandler)

         override fun list(
             params: EvalListParams,
@@ -220,7 +219,7 @@ class EvalServiceAsyncImpl internal constructor(private val clientOptions: Clien
             return request
                 .thenComposeAsync { clientOptions.httpClient.executeAsync(it, requestOptions) }
                 .thenApply { response ->
-                    response.parseable {
+                    errorHandler.handle(response).parseable {
                         response
                             .use { listHandler.handle(it) }
                             .also {
@@ -241,7 +240,7 @@ class EvalServiceAsyncImpl internal constructor(private val clientOptions: Clien
         }

         private val deleteHandler: Handler =
-            jsonHandler(clientOptions.jsonMapper).withErrorHandler(errorHandler)
+            jsonHandler(clientOptions.jsonMapper)

         override fun delete(
             params: EvalDeleteParams,
@@ -262,7 +261,7 @@ class EvalServiceAsyncImpl internal constructor(private val clientOptions: Clien
             return request
                 .thenComposeAsync { clientOptions.httpClient.executeAsync(it, requestOptions) }
                 .thenApply { response ->
-                    response.parseable {
+                    errorHandler.handle(response).parseable {
                         response
                             .use { deleteHandler.handle(it) }
                             .also {
diff --git a/openai-java-core/src/main/kotlin/com/openai/services/async/FileServiceAsyncImpl.kt b/openai-java-core/src/main/kotlin/com/openai/services/async/FileServiceAsyncImpl.kt
index 7f366622..223661d5 100644
--- a/openai-java-core/src/main/kotlin/com/openai/services/async/FileServiceAsyncImpl.kt
+++ b/openai-java-core/src/main/kotlin/com/openai/services/async/FileServiceAsyncImpl.kt
@@ -5,9 +5,9 @@ package com.openai.services.async
 import com.openai.core.ClientOptions
 import com.openai.core.RequestOptions
 import com.openai.core.checkRequired
+import com.openai.core.handlers.errorBodyHandler
 import com.openai.core.handlers.errorHandler
 import com.openai.core.handlers.jsonHandler
-import com.openai.core.handlers.withErrorHandler
 import com.openai.core.http.HttpMethod
 import com.openai.core.http.HttpRequest
 import com.openai.core.http.HttpResponse
@@ -17,7 +17,6 @@ import com.openai.core.http.json
 import com.openai.core.http.multipartFormData
 import com.openai.core.http.parseable
 import com.openai.core.prepareAsync
-import com.openai.models.ErrorObject
 import com.openai.models.files.FileContentParams
 import com.openai.models.files.FileCreateParams
 import com.openai.models.files.FileDeleteParams
@@ -81,7 +80,8 @@ class FileServiceAsyncImpl internal constructor(private val clientOptions: Clien
     class WithRawResponseImpl internal constructor(private val clientOptions: ClientOptions) :
         FileServiceAsync.WithRawResponse {

-        private val errorHandler: Handler = errorHandler(clientOptions.jsonMapper)
+        private val errorHandler: Handler =
+            errorHandler(errorBodyHandler(clientOptions.jsonMapper))

         override fun withOptions(
             modifier: Consumer
@@ -91,7 +91,7 @@ class FileServiceAsyncImpl internal constructor(private val clientOptions: Clien
             )

         private val createHandler: Handler =
-            jsonHandler(clientOptions.jsonMapper).withErrorHandler(errorHandler)
+            jsonHandler(clientOptions.jsonMapper)

         override fun create(
             params: FileCreateParams,
@@ -109,7 +109,7 @@ class FileServiceAsyncImpl internal constructor(private val clientOptions: Clien
             return request
                 .thenComposeAsync { clientOptions.httpClient.executeAsync(it, requestOptions) }
                 .thenApply { response ->
-                    response.parseable {
+                    errorHandler.handle(response).parseable {
                         response
                             .use { createHandler.handle(it) }
                             .also {
@@ -122,7 +122,7 @@ class FileServiceAsyncImpl internal constructor(private val clientOptions: Clien
         }

         private val retrieveHandler: Handler =
-            jsonHandler(clientOptions.jsonMapper).withErrorHandler(errorHandler)
+            jsonHandler(clientOptions.jsonMapper)

         override fun retrieve(
             params: FileRetrieveParams,
@@ -142,7 +142,7 @@ class FileServiceAsyncImpl internal constructor(private val clientOptions: Clien
             return request
                 .thenComposeAsync { clientOptions.httpClient.executeAsync(it, requestOptions) }
                 .thenApply { response ->
-                    response.parseable {
+                    errorHandler.handle(response).parseable {
                         response
                             .use { retrieveHandler.handle(it) }
                             .also {
@@ -156,7 +156,6 @@ class FileServiceAsyncImpl internal constructor(private val clientOptions: Clien

         private val listHandler: Handler =
             jsonHandler(clientOptions.jsonMapper)
-                .withErrorHandler(errorHandler)

         override fun list(
             params: FileListParams,
@@ -173,7 +172,7 @@ class FileServiceAsyncImpl internal constructor(private val clientOptions: Clien
             return request
                 .thenComposeAsync { clientOptions.httpClient.executeAsync(it, requestOptions) }
                 .thenApply { response ->
-                    response.parseable {
+                    errorHandler.handle(response).parseable {
                         response
                             .use { listHandler.handle(it) }
                             .also {
@@ -194,7 +193,7 @@ class FileServiceAsyncImpl internal constructor(private val clientOptions: Clien
         }

         private val deleteHandler: Handler =
-            jsonHandler(clientOptions.jsonMapper).withErrorHandler(errorHandler)
+            jsonHandler(clientOptions.jsonMapper)

         override fun delete(
             params: FileDeleteParams,
@@ -215,7 +214,7 @@ class FileServiceAsyncImpl internal constructor(private val clientOptions: Clien
             return request
                 .thenComposeAsync { clientOptions.httpClient.executeAsync(it, requestOptions) }
                 .thenApply { response ->
-                    response.parseable {
+                    errorHandler.handle(response).parseable {
                         response
                             .use { deleteHandler.handle(it) }
                             .also {
@@ -242,9 +241,9 @@ class FileServiceAsyncImpl internal constructor(private val clientOptions: Clien
                     .build()
                     .prepareAsync(clientOptions, params, deploymentModel = null)
             val requestOptions = requestOptions.applyDefaults(RequestOptions.from(clientOptions))
-            return request.thenComposeAsync {
-                clientOptions.httpClient.executeAsync(it, requestOptions)
-            }
+            return request
+                .thenComposeAsync { clientOptions.httpClient.executeAsync(it, requestOptions) }
+                .thenApply { response -> errorHandler.handle(response) }
         }
     }
 }
diff --git a/openai-java-core/src/main/kotlin/com/openai/services/async/ImageServiceAsyncImpl.kt b/openai-java-core/src/main/kotlin/com/openai/services/async/ImageServiceAsyncImpl.kt
index b5383141..6c585712 100644
--- a/openai-java-core/src/main/kotlin/com/openai/services/async/ImageServiceAsyncImpl.kt
+++ b/openai-java-core/src/main/kotlin/com/openai/services/async/ImageServiceAsyncImpl.kt
@@ -6,14 +6,15 @@ import com.openai.core.ClientOptions
 import com.openai.core.JsonValue
 import com.openai.core.MultipartField
 import com.openai.core.RequestOptions
+import com.openai.core.handlers.errorBodyHandler
 import com.openai.core.handlers.errorHandler
 import com.openai.core.handlers.jsonHandler
 import com.openai.core.handlers.mapJson
 import com.openai.core.handlers.sseHandler
-import com.openai.core.handlers.withErrorHandler
 import com.openai.core.http.AsyncStreamResponse
 import com.openai.core.http.HttpMethod
 import com.openai.core.http.HttpRequest
+import com.openai.core.http.HttpResponse
 import com.openai.core.http.HttpResponse.Handler
 import com.openai.core.http.HttpResponseFor
 import com.openai.core.http.StreamResponse
@@ -23,7 +24,6 @@ import com.openai.core.http.multipartFormData
 import com.openai.core.http.parseable
 import com.openai.core.http.toAsync
 import com.openai.core.prepareAsync
-import com.openai.models.ErrorObject
 import com.openai.models.images.ImageCreateVariationParams
 import com.openai.models.images.ImageEditParams
 import com.openai.models.images.ImageEditStreamEvent
@@ -89,7 +89,8 @@ class ImageServiceAsyncImpl internal constructor(private val clientOptions: Clie
     class WithRawResponseImpl internal constructor(private val clientOptions: ClientOptions) :
         ImageServiceAsync.WithRawResponse {

-        private val errorHandler: Handler = errorHandler(clientOptions.jsonMapper)
+        private val errorHandler: Handler =
+            errorHandler(errorBodyHandler(clientOptions.jsonMapper))

         override fun withOptions(
             modifier: Consumer
@@ -99,7 +100,7 @@ class ImageServiceAsyncImpl internal constructor(private val clientOptions: Clie
             )

         private val createVariationHandler: Handler =
-            jsonHandler(clientOptions.jsonMapper).withErrorHandler(errorHandler)
+            jsonHandler(clientOptions.jsonMapper)

         override fun createVariation(
             params: ImageCreateVariationParams,
@@ -121,7 +122,7 @@ class ImageServiceAsyncImpl internal constructor(private val clientOptions: Clie
             return request
                 .thenComposeAsync { clientOptions.httpClient.executeAsync(it, requestOptions) }
                 .thenApply { response ->
-                    response.parseable {
+                    errorHandler.handle(response).parseable {
                         response
                             .use {
createVariationHandler.handle(it) } .also { @@ -134,7 +135,7 @@ class ImageServiceAsyncImpl internal constructor(private val clientOptions: Clie } private val editHandler: Handler = - jsonHandler(clientOptions.jsonMapper).withErrorHandler(errorHandler) + jsonHandler(clientOptions.jsonMapper) override fun edit( params: ImageEditParams, @@ -156,7 +157,7 @@ class ImageServiceAsyncImpl internal constructor(private val clientOptions: Clie return request .thenComposeAsync { clientOptions.httpClient.executeAsync(it, requestOptions) } .thenApply { response -> - response.parseable { + errorHandler.handle(response).parseable { response .use { editHandler.handle(it) } .also { @@ -169,9 +170,7 @@ class ImageServiceAsyncImpl internal constructor(private val clientOptions: Clie } private val editStreamingHandler: Handler> = - sseHandler(clientOptions.jsonMapper) - .mapJson() - .withErrorHandler(errorHandler) + sseHandler(clientOptions.jsonMapper).mapJson() override fun editStreaming( params: ImageEditParams, @@ -194,7 +193,7 @@ class ImageServiceAsyncImpl internal constructor(private val clientOptions: Clie return request .thenComposeAsync { clientOptions.httpClient.executeAsync(it, requestOptions) } .thenApply { response -> - response.parseable { + errorHandler.handle(response).parseable { response .let { editStreamingHandler.handle(it) } .let { streamResponse -> @@ -209,7 +208,7 @@ class ImageServiceAsyncImpl internal constructor(private val clientOptions: Clie } private val generateHandler: Handler = - jsonHandler(clientOptions.jsonMapper).withErrorHandler(errorHandler) + jsonHandler(clientOptions.jsonMapper) override fun generate( params: ImageGenerateParams, @@ -231,7 +230,7 @@ class ImageServiceAsyncImpl internal constructor(private val clientOptions: Clie return request .thenComposeAsync { clientOptions.httpClient.executeAsync(it, requestOptions) } .thenApply { response -> - response.parseable { + errorHandler.handle(response).parseable { response .use { generateHandler.handle(it) } .also { @@ -244,9 +243,7 @@ class ImageServiceAsyncImpl internal constructor(private val clientOptions: Clie } private val generateStreamingHandler: Handler> = - sseHandler(clientOptions.jsonMapper) - .mapJson() - .withErrorHandler(errorHandler) + sseHandler(clientOptions.jsonMapper).mapJson() override fun generateStreaming( params: ImageGenerateParams, @@ -273,7 +270,7 @@ class ImageServiceAsyncImpl internal constructor(private val clientOptions: Clie return request .thenComposeAsync { clientOptions.httpClient.executeAsync(it, requestOptions) } .thenApply { response -> - response.parseable { + errorHandler.handle(response).parseable { response .let { generateStreamingHandler.handle(it) } .let { streamResponse -> diff --git a/openai-java-core/src/main/kotlin/com/openai/services/async/ModelServiceAsyncImpl.kt b/openai-java-core/src/main/kotlin/com/openai/services/async/ModelServiceAsyncImpl.kt index 62ecc8b0..37c96a10 100644 --- a/openai-java-core/src/main/kotlin/com/openai/services/async/ModelServiceAsyncImpl.kt +++ b/openai-java-core/src/main/kotlin/com/openai/services/async/ModelServiceAsyncImpl.kt @@ -5,17 +5,17 @@ package com.openai.services.async import com.openai.core.ClientOptions import com.openai.core.RequestOptions import com.openai.core.checkRequired +import com.openai.core.handlers.errorBodyHandler import com.openai.core.handlers.errorHandler import com.openai.core.handlers.jsonHandler -import com.openai.core.handlers.withErrorHandler import com.openai.core.http.HttpMethod import 
com.openai.core.http.HttpRequest +import com.openai.core.http.HttpResponse import com.openai.core.http.HttpResponse.Handler import com.openai.core.http.HttpResponseFor import com.openai.core.http.json import com.openai.core.http.parseable import com.openai.core.prepareAsync -import com.openai.models.ErrorObject import com.openai.models.models.Model import com.openai.models.models.ModelDeleteParams import com.openai.models.models.ModelDeleted @@ -63,7 +63,8 @@ class ModelServiceAsyncImpl internal constructor(private val clientOptions: Clie class WithRawResponseImpl internal constructor(private val clientOptions: ClientOptions) : ModelServiceAsync.WithRawResponse { - private val errorHandler: Handler = errorHandler(clientOptions.jsonMapper) + private val errorHandler: Handler = + errorHandler(errorBodyHandler(clientOptions.jsonMapper)) override fun withOptions( modifier: Consumer @@ -72,8 +73,7 @@ class ModelServiceAsyncImpl internal constructor(private val clientOptions: Clie clientOptions.toBuilder().apply(modifier::accept).build() ) - private val retrieveHandler: Handler = - jsonHandler(clientOptions.jsonMapper).withErrorHandler(errorHandler) + private val retrieveHandler: Handler = jsonHandler(clientOptions.jsonMapper) override fun retrieve( params: ModelRetrieveParams, @@ -93,7 +93,7 @@ class ModelServiceAsyncImpl internal constructor(private val clientOptions: Clie return request .thenComposeAsync { clientOptions.httpClient.executeAsync(it, requestOptions) } .thenApply { response -> - response.parseable { + errorHandler.handle(response).parseable { response .use { retrieveHandler.handle(it) } .also { @@ -107,7 +107,6 @@ class ModelServiceAsyncImpl internal constructor(private val clientOptions: Clie private val listHandler: Handler = jsonHandler(clientOptions.jsonMapper) - .withErrorHandler(errorHandler) override fun list( params: ModelListParams, @@ -124,7 +123,7 @@ class ModelServiceAsyncImpl internal constructor(private val clientOptions: Clie return request .thenComposeAsync { clientOptions.httpClient.executeAsync(it, requestOptions) } .thenApply { response -> - response.parseable { + errorHandler.handle(response).parseable { response .use { listHandler.handle(it) } .also { @@ -145,7 +144,7 @@ class ModelServiceAsyncImpl internal constructor(private val clientOptions: Clie } private val deleteHandler: Handler = - jsonHandler(clientOptions.jsonMapper).withErrorHandler(errorHandler) + jsonHandler(clientOptions.jsonMapper) override fun delete( params: ModelDeleteParams, @@ -166,7 +165,7 @@ class ModelServiceAsyncImpl internal constructor(private val clientOptions: Clie return request .thenComposeAsync { clientOptions.httpClient.executeAsync(it, requestOptions) } .thenApply { response -> - response.parseable { + errorHandler.handle(response).parseable { response .use { deleteHandler.handle(it) } .also { diff --git a/openai-java-core/src/main/kotlin/com/openai/services/async/ModerationServiceAsyncImpl.kt b/openai-java-core/src/main/kotlin/com/openai/services/async/ModerationServiceAsyncImpl.kt index 8aca6c89..afce8586 100644 --- a/openai-java-core/src/main/kotlin/com/openai/services/async/ModerationServiceAsyncImpl.kt +++ b/openai-java-core/src/main/kotlin/com/openai/services/async/ModerationServiceAsyncImpl.kt @@ -4,17 +4,17 @@ package com.openai.services.async import com.openai.core.ClientOptions import com.openai.core.RequestOptions +import com.openai.core.handlers.errorBodyHandler import com.openai.core.handlers.errorHandler import com.openai.core.handlers.jsonHandler -import 
com.openai.core.handlers.withErrorHandler import com.openai.core.http.HttpMethod import com.openai.core.http.HttpRequest +import com.openai.core.http.HttpResponse import com.openai.core.http.HttpResponse.Handler import com.openai.core.http.HttpResponseFor import com.openai.core.http.json import com.openai.core.http.parseable import com.openai.core.prepareAsync -import com.openai.models.ErrorObject import com.openai.models.moderations.ModerationCreateParams import com.openai.models.moderations.ModerationCreateResponse import java.util.concurrent.CompletableFuture @@ -42,7 +42,8 @@ class ModerationServiceAsyncImpl internal constructor(private val clientOptions: class WithRawResponseImpl internal constructor(private val clientOptions: ClientOptions) : ModerationServiceAsync.WithRawResponse { - private val errorHandler: Handler = errorHandler(clientOptions.jsonMapper) + private val errorHandler: Handler = + errorHandler(errorBodyHandler(clientOptions.jsonMapper)) override fun withOptions( modifier: Consumer @@ -53,7 +54,6 @@ class ModerationServiceAsyncImpl internal constructor(private val clientOptions: private val createHandler: Handler = jsonHandler(clientOptions.jsonMapper) - .withErrorHandler(errorHandler) override fun create( params: ModerationCreateParams, @@ -75,7 +75,7 @@ class ModerationServiceAsyncImpl internal constructor(private val clientOptions: return request .thenComposeAsync { clientOptions.httpClient.executeAsync(it, requestOptions) } .thenApply { response -> - response.parseable { + errorHandler.handle(response).parseable { response .use { createHandler.handle(it) } .also { diff --git a/openai-java-core/src/main/kotlin/com/openai/services/async/ResponseServiceAsyncImpl.kt b/openai-java-core/src/main/kotlin/com/openai/services/async/ResponseServiceAsyncImpl.kt index 1ebda153..329cdf3f 100644 --- a/openai-java-core/src/main/kotlin/com/openai/services/async/ResponseServiceAsyncImpl.kt +++ b/openai-java-core/src/main/kotlin/com/openai/services/async/ResponseServiceAsyncImpl.kt @@ -7,11 +7,11 @@ import com.openai.core.JsonValue import com.openai.core.RequestOptions import com.openai.core.checkRequired import com.openai.core.handlers.emptyHandler +import com.openai.core.handlers.errorBodyHandler import com.openai.core.handlers.errorHandler import com.openai.core.handlers.jsonHandler import com.openai.core.handlers.mapJson import com.openai.core.handlers.sseHandler -import com.openai.core.handlers.withErrorHandler import com.openai.core.http.AsyncStreamResponse import com.openai.core.http.HttpMethod import com.openai.core.http.HttpRequest @@ -24,7 +24,6 @@ import com.openai.core.http.map import com.openai.core.http.parseable import com.openai.core.http.toAsync import com.openai.core.prepareAsync -import com.openai.models.ErrorObject import com.openai.models.responses.Response import com.openai.models.responses.ResponseCancelParams import com.openai.models.responses.ResponseCreateParams @@ -106,7 +105,8 @@ class ResponseServiceAsyncImpl internal constructor(private val clientOptions: C class WithRawResponseImpl internal constructor(private val clientOptions: ClientOptions) : ResponseServiceAsync.WithRawResponse { - private val errorHandler: Handler = errorHandler(clientOptions.jsonMapper) + private val errorHandler: Handler = + errorHandler(errorBodyHandler(clientOptions.jsonMapper)) private val inputItems: InputItemServiceAsync.WithRawResponse by lazy { InputItemServiceAsyncImpl.WithRawResponseImpl(clientOptions) @@ -122,7 +122,7 @@ class ResponseServiceAsyncImpl internal 
constructor(private val clientOptions: C override fun inputItems(): InputItemServiceAsync.WithRawResponse = inputItems private val createHandler: Handler = - jsonHandler(clientOptions.jsonMapper).withErrorHandler(errorHandler) + jsonHandler(clientOptions.jsonMapper) override fun create( params: ResponseCreateParams, @@ -140,7 +140,7 @@ class ResponseServiceAsyncImpl internal constructor(private val clientOptions: C return request .thenComposeAsync { clientOptions.httpClient.executeAsync(it, requestOptions) } .thenApply { response -> - response.parseable { + errorHandler.handle(response).parseable { response .use { createHandler.handle(it) } .also { @@ -153,9 +153,7 @@ class ResponseServiceAsyncImpl internal constructor(private val clientOptions: C } private val createStreamingHandler: Handler> = - sseHandler(clientOptions.jsonMapper) - .mapJson() - .withErrorHandler(errorHandler) + sseHandler(clientOptions.jsonMapper).mapJson() override fun createStreaming( params: ResponseCreateParams, @@ -182,7 +180,7 @@ class ResponseServiceAsyncImpl internal constructor(private val clientOptions: C return request .thenComposeAsync { clientOptions.httpClient.executeAsync(it, requestOptions) } .thenApply { response -> - response.parseable { + errorHandler.handle(response).parseable { response .let { createStreamingHandler.handle(it) } .let { streamResponse -> @@ -197,7 +195,7 @@ class ResponseServiceAsyncImpl internal constructor(private val clientOptions: C } private val retrieveHandler: Handler = - jsonHandler(clientOptions.jsonMapper).withErrorHandler(errorHandler) + jsonHandler(clientOptions.jsonMapper) override fun retrieve( params: ResponseRetrieveParams, @@ -217,7 +215,7 @@ class ResponseServiceAsyncImpl internal constructor(private val clientOptions: C return request .thenComposeAsync { clientOptions.httpClient.executeAsync(it, requestOptions) } .thenApply { response -> - response.parseable { + errorHandler.handle(response).parseable { response .use { retrieveHandler.handle(it) } .also { @@ -230,9 +228,7 @@ class ResponseServiceAsyncImpl internal constructor(private val clientOptions: C } private val retrieveStreamingHandler: Handler> = - sseHandler(clientOptions.jsonMapper) - .mapJson() - .withErrorHandler(errorHandler) + sseHandler(clientOptions.jsonMapper).mapJson() override fun retrieveStreaming( params: ResponseRetrieveParams, @@ -253,7 +249,7 @@ class ResponseServiceAsyncImpl internal constructor(private val clientOptions: C return request .thenComposeAsync { clientOptions.httpClient.executeAsync(it, requestOptions) } .thenApply { response -> - response.parseable { + errorHandler.handle(response).parseable { response .let { retrieveStreamingHandler.handle(it) } .let { streamResponse -> @@ -267,7 +263,7 @@ class ResponseServiceAsyncImpl internal constructor(private val clientOptions: C } } - private val deleteHandler: Handler = emptyHandler().withErrorHandler(errorHandler) + private val deleteHandler: Handler = emptyHandler() override fun delete( params: ResponseDeleteParams, @@ -288,12 +284,14 @@ class ResponseServiceAsyncImpl internal constructor(private val clientOptions: C return request .thenComposeAsync { clientOptions.httpClient.executeAsync(it, requestOptions) } .thenApply { response -> - response.parseable { response.use { deleteHandler.handle(it) } } + errorHandler.handle(response).parseable { + response.use { deleteHandler.handle(it) } + } } } private val cancelHandler: Handler = - jsonHandler(clientOptions.jsonMapper).withErrorHandler(errorHandler) + 
jsonHandler(clientOptions.jsonMapper) override fun cancel( params: ResponseCancelParams, @@ -314,7 +312,7 @@ class ResponseServiceAsyncImpl internal constructor(private val clientOptions: C return request .thenComposeAsync { clientOptions.httpClient.executeAsync(it, requestOptions) } .thenApply { response -> - response.parseable { + errorHandler.handle(response).parseable { response .use { cancelHandler.handle(it) } .also { diff --git a/openai-java-core/src/main/kotlin/com/openai/services/async/UploadServiceAsyncImpl.kt b/openai-java-core/src/main/kotlin/com/openai/services/async/UploadServiceAsyncImpl.kt index 522f43c5..7f895d21 100644 --- a/openai-java-core/src/main/kotlin/com/openai/services/async/UploadServiceAsyncImpl.kt +++ b/openai-java-core/src/main/kotlin/com/openai/services/async/UploadServiceAsyncImpl.kt @@ -5,17 +5,17 @@ package com.openai.services.async import com.openai.core.ClientOptions import com.openai.core.RequestOptions import com.openai.core.checkRequired +import com.openai.core.handlers.errorBodyHandler import com.openai.core.handlers.errorHandler import com.openai.core.handlers.jsonHandler -import com.openai.core.handlers.withErrorHandler import com.openai.core.http.HttpMethod import com.openai.core.http.HttpRequest +import com.openai.core.http.HttpResponse import com.openai.core.http.HttpResponse.Handler import com.openai.core.http.HttpResponseFor import com.openai.core.http.json import com.openai.core.http.parseable import com.openai.core.prepareAsync -import com.openai.models.ErrorObject import com.openai.models.uploads.Upload import com.openai.models.uploads.UploadCancelParams import com.openai.models.uploads.UploadCompleteParams @@ -66,7 +66,8 @@ class UploadServiceAsyncImpl internal constructor(private val clientOptions: Cli class WithRawResponseImpl internal constructor(private val clientOptions: ClientOptions) : UploadServiceAsync.WithRawResponse { - private val errorHandler: Handler = errorHandler(clientOptions.jsonMapper) + private val errorHandler: Handler = + errorHandler(errorBodyHandler(clientOptions.jsonMapper)) private val parts: PartServiceAsync.WithRawResponse by lazy { PartServiceAsyncImpl.WithRawResponseImpl(clientOptions) @@ -81,8 +82,7 @@ class UploadServiceAsyncImpl internal constructor(private val clientOptions: Cli override fun parts(): PartServiceAsync.WithRawResponse = parts - private val createHandler: Handler = - jsonHandler(clientOptions.jsonMapper).withErrorHandler(errorHandler) + private val createHandler: Handler = jsonHandler(clientOptions.jsonMapper) override fun create( params: UploadCreateParams, @@ -100,7 +100,7 @@ class UploadServiceAsyncImpl internal constructor(private val clientOptions: Cli return request .thenComposeAsync { clientOptions.httpClient.executeAsync(it, requestOptions) } .thenApply { response -> - response.parseable { + errorHandler.handle(response).parseable { response .use { createHandler.handle(it) } .also { @@ -112,8 +112,7 @@ class UploadServiceAsyncImpl internal constructor(private val clientOptions: Cli } } - private val cancelHandler: Handler = - jsonHandler(clientOptions.jsonMapper).withErrorHandler(errorHandler) + private val cancelHandler: Handler = jsonHandler(clientOptions.jsonMapper) override fun cancel( params: UploadCancelParams, @@ -134,7 +133,7 @@ class UploadServiceAsyncImpl internal constructor(private val clientOptions: Cli return request .thenComposeAsync { clientOptions.httpClient.executeAsync(it, requestOptions) } .thenApply { response -> - response.parseable { + 
errorHandler.handle(response).parseable { response .use { cancelHandler.handle(it) } .also { @@ -146,8 +145,7 @@ class UploadServiceAsyncImpl internal constructor(private val clientOptions: Cli } } - private val completeHandler: Handler = - jsonHandler(clientOptions.jsonMapper).withErrorHandler(errorHandler) + private val completeHandler: Handler = jsonHandler(clientOptions.jsonMapper) override fun complete( params: UploadCompleteParams, @@ -168,7 +166,7 @@ class UploadServiceAsyncImpl internal constructor(private val clientOptions: Cli return request .thenComposeAsync { clientOptions.httpClient.executeAsync(it, requestOptions) } .thenApply { response -> - response.parseable { + errorHandler.handle(response).parseable { response .use { completeHandler.handle(it) } .also { diff --git a/openai-java-core/src/main/kotlin/com/openai/services/async/VectorStoreServiceAsyncImpl.kt b/openai-java-core/src/main/kotlin/com/openai/services/async/VectorStoreServiceAsyncImpl.kt index 9b14d234..3045db3c 100644 --- a/openai-java-core/src/main/kotlin/com/openai/services/async/VectorStoreServiceAsyncImpl.kt +++ b/openai-java-core/src/main/kotlin/com/openai/services/async/VectorStoreServiceAsyncImpl.kt @@ -5,18 +5,18 @@ package com.openai.services.async import com.openai.core.ClientOptions import com.openai.core.RequestOptions import com.openai.core.checkRequired +import com.openai.core.handlers.errorBodyHandler import com.openai.core.handlers.errorHandler import com.openai.core.handlers.jsonHandler -import com.openai.core.handlers.withErrorHandler import com.openai.core.http.Headers import com.openai.core.http.HttpMethod import com.openai.core.http.HttpRequest +import com.openai.core.http.HttpResponse import com.openai.core.http.HttpResponse.Handler import com.openai.core.http.HttpResponseFor import com.openai.core.http.json import com.openai.core.http.parseable import com.openai.core.prepareAsync -import com.openai.models.ErrorObject import com.openai.models.vectorstores.VectorStore import com.openai.models.vectorstores.VectorStoreCreateParams import com.openai.models.vectorstores.VectorStoreDeleteParams @@ -109,7 +109,8 @@ class VectorStoreServiceAsyncImpl internal constructor(private val clientOptions class WithRawResponseImpl internal constructor(private val clientOptions: ClientOptions) : VectorStoreServiceAsync.WithRawResponse { - private val errorHandler: Handler = errorHandler(clientOptions.jsonMapper) + private val errorHandler: Handler = + errorHandler(errorBodyHandler(clientOptions.jsonMapper)) private val files: FileServiceAsync.WithRawResponse by lazy { FileServiceAsyncImpl.WithRawResponseImpl(clientOptions) @@ -131,7 +132,7 @@ class VectorStoreServiceAsyncImpl internal constructor(private val clientOptions override fun fileBatches(): FileBatchServiceAsync.WithRawResponse = fileBatches private val createHandler: Handler = - jsonHandler(clientOptions.jsonMapper).withErrorHandler(errorHandler) + jsonHandler(clientOptions.jsonMapper) override fun create( params: VectorStoreCreateParams, @@ -150,7 +151,7 @@ class VectorStoreServiceAsyncImpl internal constructor(private val clientOptions return request .thenComposeAsync { clientOptions.httpClient.executeAsync(it, requestOptions) } .thenApply { response -> - response.parseable { + errorHandler.handle(response).parseable { response .use { createHandler.handle(it) } .also { @@ -163,7 +164,7 @@ class VectorStoreServiceAsyncImpl internal constructor(private val clientOptions } private val retrieveHandler: Handler = - 
jsonHandler(clientOptions.jsonMapper).withErrorHandler(errorHandler) + jsonHandler(clientOptions.jsonMapper) override fun retrieve( params: VectorStoreRetrieveParams, @@ -184,7 +185,7 @@ class VectorStoreServiceAsyncImpl internal constructor(private val clientOptions return request .thenComposeAsync { clientOptions.httpClient.executeAsync(it, requestOptions) } .thenApply { response -> - response.parseable { + errorHandler.handle(response).parseable { response .use { retrieveHandler.handle(it) } .also { @@ -197,7 +198,7 @@ class VectorStoreServiceAsyncImpl internal constructor(private val clientOptions } private val updateHandler: Handler = - jsonHandler(clientOptions.jsonMapper).withErrorHandler(errorHandler) + jsonHandler(clientOptions.jsonMapper) override fun update( params: VectorStoreUpdateParams, @@ -219,7 +220,7 @@ class VectorStoreServiceAsyncImpl internal constructor(private val clientOptions return request .thenComposeAsync { clientOptions.httpClient.executeAsync(it, requestOptions) } .thenApply { response -> - response.parseable { + errorHandler.handle(response).parseable { response .use { updateHandler.handle(it) } .also { @@ -233,7 +234,6 @@ class VectorStoreServiceAsyncImpl internal constructor(private val clientOptions private val listHandler: Handler = jsonHandler(clientOptions.jsonMapper) - .withErrorHandler(errorHandler) override fun list( params: VectorStoreListParams, @@ -251,7 +251,7 @@ class VectorStoreServiceAsyncImpl internal constructor(private val clientOptions return request .thenComposeAsync { clientOptions.httpClient.executeAsync(it, requestOptions) } .thenApply { response -> - response.parseable { + errorHandler.handle(response).parseable { response .use { listHandler.handle(it) } .also { @@ -272,7 +272,7 @@ class VectorStoreServiceAsyncImpl internal constructor(private val clientOptions } private val deleteHandler: Handler = - jsonHandler(clientOptions.jsonMapper).withErrorHandler(errorHandler) + jsonHandler(clientOptions.jsonMapper) override fun delete( params: VectorStoreDeleteParams, @@ -294,7 +294,7 @@ class VectorStoreServiceAsyncImpl internal constructor(private val clientOptions return request .thenComposeAsync { clientOptions.httpClient.executeAsync(it, requestOptions) } .thenApply { response -> - response.parseable { + errorHandler.handle(response).parseable { response .use { deleteHandler.handle(it) } .also { @@ -308,7 +308,6 @@ class VectorStoreServiceAsyncImpl internal constructor(private val clientOptions private val searchHandler: Handler = jsonHandler(clientOptions.jsonMapper) - .withErrorHandler(errorHandler) override fun search( params: VectorStoreSearchParams, @@ -330,7 +329,7 @@ class VectorStoreServiceAsyncImpl internal constructor(private val clientOptions return request .thenComposeAsync { clientOptions.httpClient.executeAsync(it, requestOptions) } .thenApply { response -> - response.parseable { + errorHandler.handle(response).parseable { response .use { searchHandler.handle(it) } .also { diff --git a/openai-java-core/src/main/kotlin/com/openai/services/async/audio/SpeechServiceAsyncImpl.kt b/openai-java-core/src/main/kotlin/com/openai/services/async/audio/SpeechServiceAsyncImpl.kt index bee7d0e1..a9e21cb4 100644 --- a/openai-java-core/src/main/kotlin/com/openai/services/async/audio/SpeechServiceAsyncImpl.kt +++ b/openai-java-core/src/main/kotlin/com/openai/services/async/audio/SpeechServiceAsyncImpl.kt @@ -4,6 +4,7 @@ package com.openai.services.async.audio import com.openai.core.ClientOptions import com.openai.core.RequestOptions 
+import com.openai.core.handlers.errorBodyHandler import com.openai.core.handlers.errorHandler import com.openai.core.http.HttpMethod import com.openai.core.http.HttpRequest @@ -11,7 +12,6 @@ import com.openai.core.http.HttpResponse import com.openai.core.http.HttpResponse.Handler import com.openai.core.http.json import com.openai.core.prepareAsync -import com.openai.models.ErrorObject import com.openai.models.audio.speech.SpeechCreateParams import java.util.concurrent.CompletableFuture import java.util.function.Consumer @@ -38,7 +38,8 @@ class SpeechServiceAsyncImpl internal constructor(private val clientOptions: Cli class WithRawResponseImpl internal constructor(private val clientOptions: ClientOptions) : SpeechServiceAsync.WithRawResponse { - private val errorHandler: Handler = errorHandler(clientOptions.jsonMapper) + private val errorHandler: Handler = + errorHandler(errorBodyHandler(clientOptions.jsonMapper)) override fun withOptions( modifier: Consumer @@ -64,9 +65,9 @@ class SpeechServiceAsyncImpl internal constructor(private val clientOptions: Cli deploymentModel = params.model().toString(), ) val requestOptions = requestOptions.applyDefaults(RequestOptions.from(clientOptions)) - return request.thenComposeAsync { - clientOptions.httpClient.executeAsync(it, requestOptions) - } + return request + .thenComposeAsync { clientOptions.httpClient.executeAsync(it, requestOptions) } + .thenApply { response -> errorHandler.handle(response) } } } } diff --git a/openai-java-core/src/main/kotlin/com/openai/services/async/audio/TranscriptionServiceAsyncImpl.kt b/openai-java-core/src/main/kotlin/com/openai/services/async/audio/TranscriptionServiceAsyncImpl.kt index 779e2f2b..31104a16 100644 --- a/openai-java-core/src/main/kotlin/com/openai/services/async/audio/TranscriptionServiceAsyncImpl.kt +++ b/openai-java-core/src/main/kotlin/com/openai/services/async/audio/TranscriptionServiceAsyncImpl.kt @@ -5,12 +5,12 @@ package com.openai.services.async.audio import com.openai.core.ClientOptions import com.openai.core.MultipartField import com.openai.core.RequestOptions +import com.openai.core.handlers.errorBodyHandler import com.openai.core.handlers.errorHandler import com.openai.core.handlers.jsonHandler import com.openai.core.handlers.mapJson import com.openai.core.handlers.sseHandler import com.openai.core.handlers.stringHandler -import com.openai.core.handlers.withErrorHandler import com.openai.core.http.AsyncStreamResponse import com.openai.core.http.HttpMethod import com.openai.core.http.HttpRequest @@ -23,7 +23,6 @@ import com.openai.core.http.multipartFormData import com.openai.core.http.parseable import com.openai.core.http.toAsync import com.openai.core.prepareAsync -import com.openai.models.ErrorObject import com.openai.models.audio.transcriptions.Transcription import com.openai.models.audio.transcriptions.TranscriptionCreateParams import com.openai.models.audio.transcriptions.TranscriptionCreateResponse @@ -64,7 +63,8 @@ class TranscriptionServiceAsyncImpl internal constructor(private val clientOptio class WithRawResponseImpl internal constructor(private val clientOptions: ClientOptions) : TranscriptionServiceAsync.WithRawResponse { - private val errorHandler: Handler = errorHandler(clientOptions.jsonMapper) + private val errorHandler: Handler = + errorHandler(errorBodyHandler(clientOptions.jsonMapper)) override fun withOptions( modifier: Consumer @@ -75,7 +75,6 @@ class TranscriptionServiceAsyncImpl internal constructor(private val clientOptio private val createJsonHandler: Handler = 
jsonHandler(clientOptions.jsonMapper) - .withErrorHandler(errorHandler) private val createStringHandler: Handler = object : Handler { @@ -111,7 +110,7 @@ class TranscriptionServiceAsyncImpl internal constructor(private val clientOptio if (params.responseFormat().getOrNull()?.isJson() != false) createJsonHandler else createStringHandler - response.parseable { + errorHandler.handle(response).parseable { response .use { handler.handle(it) } .also { @@ -124,9 +123,7 @@ class TranscriptionServiceAsyncImpl internal constructor(private val clientOptio } private val createStreamingHandler: Handler> = - sseHandler(clientOptions.jsonMapper) - .mapJson() - .withErrorHandler(errorHandler) + sseHandler(clientOptions.jsonMapper).mapJson() override fun createStreaming( params: TranscriptionCreateParams, @@ -153,7 +150,7 @@ class TranscriptionServiceAsyncImpl internal constructor(private val clientOptio return request .thenComposeAsync { clientOptions.httpClient.executeAsync(it, requestOptions) } .thenApply { response -> - response.parseable { + errorHandler.handle(response).parseable { response .let { createStreamingHandler.handle(it) } .let { streamResponse -> diff --git a/openai-java-core/src/main/kotlin/com/openai/services/async/audio/TranslationServiceAsyncImpl.kt b/openai-java-core/src/main/kotlin/com/openai/services/async/audio/TranslationServiceAsyncImpl.kt index 8a3f934d..42a7d630 100644 --- a/openai-java-core/src/main/kotlin/com/openai/services/async/audio/TranslationServiceAsyncImpl.kt +++ b/openai-java-core/src/main/kotlin/com/openai/services/async/audio/TranslationServiceAsyncImpl.kt @@ -4,17 +4,17 @@ package com.openai.services.async.audio import com.openai.core.ClientOptions import com.openai.core.RequestOptions +import com.openai.core.handlers.errorBodyHandler import com.openai.core.handlers.errorHandler import com.openai.core.handlers.jsonHandler -import com.openai.core.handlers.withErrorHandler import com.openai.core.http.HttpMethod import com.openai.core.http.HttpRequest +import com.openai.core.http.HttpResponse import com.openai.core.http.HttpResponse.Handler import com.openai.core.http.HttpResponseFor import com.openai.core.http.multipartFormData import com.openai.core.http.parseable import com.openai.core.prepareAsync -import com.openai.models.ErrorObject import com.openai.models.audio.translations.TranslationCreateParams import com.openai.models.audio.translations.TranslationCreateResponse import java.util.concurrent.CompletableFuture @@ -42,7 +42,8 @@ class TranslationServiceAsyncImpl internal constructor(private val clientOptions class WithRawResponseImpl internal constructor(private val clientOptions: ClientOptions) : TranslationServiceAsync.WithRawResponse { - private val errorHandler: Handler = errorHandler(clientOptions.jsonMapper) + private val errorHandler: Handler = + errorHandler(errorBodyHandler(clientOptions.jsonMapper)) override fun withOptions( modifier: Consumer @@ -53,7 +54,6 @@ class TranslationServiceAsyncImpl internal constructor(private val clientOptions private val createHandler: Handler = jsonHandler(clientOptions.jsonMapper) - .withErrorHandler(errorHandler) override fun create( params: TranslationCreateParams, @@ -75,7 +75,7 @@ class TranslationServiceAsyncImpl internal constructor(private val clientOptions return request .thenComposeAsync { clientOptions.httpClient.executeAsync(it, requestOptions) } .thenApply { response -> - response.parseable { + errorHandler.handle(response).parseable { response .use { createHandler.handle(it) } .also { diff --git 
a/openai-java-core/src/main/kotlin/com/openai/services/async/beta/realtime/SessionServiceAsyncImpl.kt b/openai-java-core/src/main/kotlin/com/openai/services/async/beta/realtime/SessionServiceAsyncImpl.kt index 7e201289..e4c89bb2 100644 --- a/openai-java-core/src/main/kotlin/com/openai/services/async/beta/realtime/SessionServiceAsyncImpl.kt +++ b/openai-java-core/src/main/kotlin/com/openai/services/async/beta/realtime/SessionServiceAsyncImpl.kt @@ -4,18 +4,18 @@ package com.openai.services.async.beta.realtime import com.openai.core.ClientOptions import com.openai.core.RequestOptions +import com.openai.core.handlers.errorBodyHandler import com.openai.core.handlers.errorHandler import com.openai.core.handlers.jsonHandler -import com.openai.core.handlers.withErrorHandler import com.openai.core.http.Headers import com.openai.core.http.HttpMethod import com.openai.core.http.HttpRequest +import com.openai.core.http.HttpResponse import com.openai.core.http.HttpResponse.Handler import com.openai.core.http.HttpResponseFor import com.openai.core.http.json import com.openai.core.http.parseable import com.openai.core.prepareAsync -import com.openai.models.ErrorObject import com.openai.models.beta.realtime.sessions.SessionCreateParams import com.openai.models.beta.realtime.sessions.SessionCreateResponse import java.util.concurrent.CompletableFuture @@ -48,7 +48,8 @@ class SessionServiceAsyncImpl internal constructor(private val clientOptions: Cl class WithRawResponseImpl internal constructor(private val clientOptions: ClientOptions) : SessionServiceAsync.WithRawResponse { - private val errorHandler: Handler = errorHandler(clientOptions.jsonMapper) + private val errorHandler: Handler = + errorHandler(errorBodyHandler(clientOptions.jsonMapper)) override fun withOptions( modifier: Consumer @@ -59,7 +60,6 @@ class SessionServiceAsyncImpl internal constructor(private val clientOptions: Cl private val createHandler: Handler = jsonHandler(clientOptions.jsonMapper) - .withErrorHandler(errorHandler) override fun create( params: SessionCreateParams, @@ -78,7 +78,7 @@ class SessionServiceAsyncImpl internal constructor(private val clientOptions: Cl return request .thenComposeAsync { clientOptions.httpClient.executeAsync(it, requestOptions) } .thenApply { response -> - response.parseable { + errorHandler.handle(response).parseable { response .use { createHandler.handle(it) } .also { diff --git a/openai-java-core/src/main/kotlin/com/openai/services/async/beta/realtime/TranscriptionSessionServiceAsyncImpl.kt b/openai-java-core/src/main/kotlin/com/openai/services/async/beta/realtime/TranscriptionSessionServiceAsyncImpl.kt index e276677d..1877c3f2 100644 --- a/openai-java-core/src/main/kotlin/com/openai/services/async/beta/realtime/TranscriptionSessionServiceAsyncImpl.kt +++ b/openai-java-core/src/main/kotlin/com/openai/services/async/beta/realtime/TranscriptionSessionServiceAsyncImpl.kt @@ -4,18 +4,18 @@ package com.openai.services.async.beta.realtime import com.openai.core.ClientOptions import com.openai.core.RequestOptions +import com.openai.core.handlers.errorBodyHandler import com.openai.core.handlers.errorHandler import com.openai.core.handlers.jsonHandler -import com.openai.core.handlers.withErrorHandler import com.openai.core.http.Headers import com.openai.core.http.HttpMethod import com.openai.core.http.HttpRequest +import com.openai.core.http.HttpResponse import com.openai.core.http.HttpResponse.Handler import com.openai.core.http.HttpResponseFor import com.openai.core.http.json import 
com.openai.core.http.parseable import com.openai.core.prepareAsync -import com.openai.models.ErrorObject import com.openai.models.beta.realtime.transcriptionsessions.TranscriptionSession import com.openai.models.beta.realtime.transcriptionsessions.TranscriptionSessionCreateParams import java.util.concurrent.CompletableFuture @@ -53,7 +53,8 @@ internal constructor(private val clientOptions: ClientOptions) : TranscriptionSe class WithRawResponseImpl internal constructor(private val clientOptions: ClientOptions) : TranscriptionSessionServiceAsync.WithRawResponse { - private val errorHandler: Handler = errorHandler(clientOptions.jsonMapper) + private val errorHandler: Handler = + errorHandler(errorBodyHandler(clientOptions.jsonMapper)) override fun withOptions( modifier: Consumer @@ -64,7 +65,6 @@ internal constructor(private val clientOptions: ClientOptions) : TranscriptionSe private val createHandler: Handler = jsonHandler(clientOptions.jsonMapper) - .withErrorHandler(errorHandler) override fun create( params: TranscriptionSessionCreateParams, @@ -83,7 +83,7 @@ internal constructor(private val clientOptions: ClientOptions) : TranscriptionSe return request .thenComposeAsync { clientOptions.httpClient.executeAsync(it, requestOptions) } .thenApply { response -> - response.parseable { + errorHandler.handle(response).parseable { response .use { createHandler.handle(it) } .also { diff --git a/openai-java-core/src/main/kotlin/com/openai/services/async/chat/ChatCompletionServiceAsyncImpl.kt b/openai-java-core/src/main/kotlin/com/openai/services/async/chat/ChatCompletionServiceAsyncImpl.kt index de60f5a3..fd7dec52 100644 --- a/openai-java-core/src/main/kotlin/com/openai/services/async/chat/ChatCompletionServiceAsyncImpl.kt +++ b/openai-java-core/src/main/kotlin/com/openai/services/async/chat/ChatCompletionServiceAsyncImpl.kt @@ -6,14 +6,15 @@ import com.openai.core.ClientOptions import com.openai.core.JsonValue import com.openai.core.RequestOptions import com.openai.core.checkRequired +import com.openai.core.handlers.errorBodyHandler import com.openai.core.handlers.errorHandler import com.openai.core.handlers.jsonHandler import com.openai.core.handlers.mapJson import com.openai.core.handlers.sseHandler -import com.openai.core.handlers.withErrorHandler import com.openai.core.http.AsyncStreamResponse import com.openai.core.http.HttpMethod import com.openai.core.http.HttpRequest +import com.openai.core.http.HttpResponse import com.openai.core.http.HttpResponse.Handler import com.openai.core.http.HttpResponseFor import com.openai.core.http.StreamResponse @@ -22,7 +23,6 @@ import com.openai.core.http.map import com.openai.core.http.parseable import com.openai.core.http.toAsync import com.openai.core.prepareAsync -import com.openai.models.ErrorObject import com.openai.models.chat.completions.ChatCompletion import com.openai.models.chat.completions.ChatCompletionChunk import com.openai.models.chat.completions.ChatCompletionCreateParams @@ -105,7 +105,8 @@ internal constructor(private val clientOptions: ClientOptions) : ChatCompletionS class WithRawResponseImpl internal constructor(private val clientOptions: ClientOptions) : ChatCompletionServiceAsync.WithRawResponse { - private val errorHandler: Handler = errorHandler(clientOptions.jsonMapper) + private val errorHandler: Handler = + errorHandler(errorBodyHandler(clientOptions.jsonMapper)) private val messages: MessageServiceAsync.WithRawResponse by lazy { MessageServiceAsyncImpl.WithRawResponseImpl(clientOptions) @@ -121,7 +122,7 @@ internal 
constructor(private val clientOptions: ClientOptions) : ChatCompletionS override fun messages(): MessageServiceAsync.WithRawResponse = messages private val createHandler: Handler = - jsonHandler(clientOptions.jsonMapper).withErrorHandler(errorHandler) + jsonHandler(clientOptions.jsonMapper) override fun create( params: ChatCompletionCreateParams, @@ -139,7 +140,7 @@ internal constructor(private val clientOptions: ClientOptions) : ChatCompletionS return request .thenComposeAsync { clientOptions.httpClient.executeAsync(it, requestOptions) } .thenApply { response -> - response.parseable { + errorHandler.handle(response).parseable { response .use { createHandler.handle(it) } .also { @@ -152,9 +153,7 @@ internal constructor(private val clientOptions: ClientOptions) : ChatCompletionS } private val createStreamingHandler: Handler> = - sseHandler(clientOptions.jsonMapper) - .mapJson() - .withErrorHandler(errorHandler) + sseHandler(clientOptions.jsonMapper).mapJson() override fun createStreaming( params: ChatCompletionCreateParams, @@ -181,7 +180,7 @@ internal constructor(private val clientOptions: ClientOptions) : ChatCompletionS return request .thenComposeAsync { clientOptions.httpClient.executeAsync(it, requestOptions) } .thenApply { response -> - response.parseable { + errorHandler.handle(response).parseable { response .let { createStreamingHandler.handle(it) } .let { streamResponse -> @@ -196,7 +195,7 @@ internal constructor(private val clientOptions: ClientOptions) : ChatCompletionS } private val retrieveHandler: Handler = - jsonHandler(clientOptions.jsonMapper).withErrorHandler(errorHandler) + jsonHandler(clientOptions.jsonMapper) override fun retrieve( params: ChatCompletionRetrieveParams, @@ -216,7 +215,7 @@ internal constructor(private val clientOptions: ClientOptions) : ChatCompletionS return request .thenComposeAsync { clientOptions.httpClient.executeAsync(it, requestOptions) } .thenApply { response -> - response.parseable { + errorHandler.handle(response).parseable { response .use { retrieveHandler.handle(it) } .also { @@ -229,7 +228,7 @@ internal constructor(private val clientOptions: ClientOptions) : ChatCompletionS } private val updateHandler: Handler = - jsonHandler(clientOptions.jsonMapper).withErrorHandler(errorHandler) + jsonHandler(clientOptions.jsonMapper) override fun update( params: ChatCompletionUpdateParams, @@ -250,7 +249,7 @@ internal constructor(private val clientOptions: ClientOptions) : ChatCompletionS return request .thenComposeAsync { clientOptions.httpClient.executeAsync(it, requestOptions) } .thenApply { response -> - response.parseable { + errorHandler.handle(response).parseable { response .use { updateHandler.handle(it) } .also { @@ -264,7 +263,6 @@ internal constructor(private val clientOptions: ClientOptions) : ChatCompletionS private val listHandler: Handler = jsonHandler(clientOptions.jsonMapper) - .withErrorHandler(errorHandler) override fun list( params: ChatCompletionListParams, @@ -285,7 +283,7 @@ internal constructor(private val clientOptions: ClientOptions) : ChatCompletionS return request .thenComposeAsync { clientOptions.httpClient.executeAsync(it, requestOptions) } .thenApply { response -> - response.parseable { + errorHandler.handle(response).parseable { response .use { listHandler.handle(it) } .also { @@ -307,7 +305,6 @@ internal constructor(private val clientOptions: ClientOptions) : ChatCompletionS private val deleteHandler: Handler = jsonHandler(clientOptions.jsonMapper) - .withErrorHandler(errorHandler) override fun delete( params: 
ChatCompletionDeleteParams, @@ -328,7 +325,7 @@ internal constructor(private val clientOptions: ClientOptions) : ChatCompletionS return request .thenComposeAsync { clientOptions.httpClient.executeAsync(it, requestOptions) } .thenApply { response -> - response.parseable { + errorHandler.handle(response).parseable { response .use { deleteHandler.handle(it) } .also { diff --git a/openai-java-core/src/main/kotlin/com/openai/services/async/chat/completions/MessageServiceAsyncImpl.kt b/openai-java-core/src/main/kotlin/com/openai/services/async/chat/completions/MessageServiceAsyncImpl.kt index 2581be26..7d3b7ea5 100644 --- a/openai-java-core/src/main/kotlin/com/openai/services/async/chat/completions/MessageServiceAsyncImpl.kt +++ b/openai-java-core/src/main/kotlin/com/openai/services/async/chat/completions/MessageServiceAsyncImpl.kt @@ -5,16 +5,16 @@ package com.openai.services.async.chat.completions import com.openai.core.ClientOptions import com.openai.core.RequestOptions import com.openai.core.checkRequired +import com.openai.core.handlers.errorBodyHandler import com.openai.core.handlers.errorHandler import com.openai.core.handlers.jsonHandler -import com.openai.core.handlers.withErrorHandler import com.openai.core.http.HttpMethod import com.openai.core.http.HttpRequest +import com.openai.core.http.HttpResponse import com.openai.core.http.HttpResponse.Handler import com.openai.core.http.HttpResponseFor import com.openai.core.http.parseable import com.openai.core.prepareAsync -import com.openai.models.ErrorObject import com.openai.models.chat.completions.messages.MessageListPageAsync import com.openai.models.chat.completions.messages.MessageListPageResponse import com.openai.models.chat.completions.messages.MessageListParams @@ -44,7 +44,8 @@ class MessageServiceAsyncImpl internal constructor(private val clientOptions: Cl class WithRawResponseImpl internal constructor(private val clientOptions: ClientOptions) : MessageServiceAsync.WithRawResponse { - private val errorHandler: Handler = errorHandler(clientOptions.jsonMapper) + private val errorHandler: Handler = + errorHandler(errorBodyHandler(clientOptions.jsonMapper)) override fun withOptions( modifier: Consumer @@ -55,7 +56,6 @@ class MessageServiceAsyncImpl internal constructor(private val clientOptions: Cl private val listHandler: Handler = jsonHandler(clientOptions.jsonMapper) - .withErrorHandler(errorHandler) override fun list( params: MessageListParams, @@ -75,7 +75,7 @@ class MessageServiceAsyncImpl internal constructor(private val clientOptions: Cl return request .thenComposeAsync { clientOptions.httpClient.executeAsync(it, requestOptions) } .thenApply { response -> - response.parseable { + errorHandler.handle(response).parseable { response .use { listHandler.handle(it) } .also { diff --git a/openai-java-core/src/main/kotlin/com/openai/services/async/containers/FileServiceAsyncImpl.kt b/openai-java-core/src/main/kotlin/com/openai/services/async/containers/FileServiceAsyncImpl.kt index fbe52f89..c96e92bf 100644 --- a/openai-java-core/src/main/kotlin/com/openai/services/async/containers/FileServiceAsyncImpl.kt +++ b/openai-java-core/src/main/kotlin/com/openai/services/async/containers/FileServiceAsyncImpl.kt @@ -6,9 +6,9 @@ import com.openai.core.ClientOptions import com.openai.core.RequestOptions import com.openai.core.checkRequired import com.openai.core.handlers.emptyHandler +import com.openai.core.handlers.errorBodyHandler import com.openai.core.handlers.errorHandler import com.openai.core.handlers.jsonHandler -import 
com.openai.core.handlers.withErrorHandler import com.openai.core.http.HttpMethod import com.openai.core.http.HttpRequest import com.openai.core.http.HttpResponse @@ -18,7 +18,6 @@ import com.openai.core.http.json import com.openai.core.http.multipartFormData import com.openai.core.http.parseable import com.openai.core.prepareAsync -import com.openai.models.ErrorObject import com.openai.models.containers.files.FileCreateParams import com.openai.models.containers.files.FileCreateResponse import com.openai.models.containers.files.FileDeleteParams @@ -80,7 +79,8 @@ class FileServiceAsyncImpl internal constructor(private val clientOptions: Clien class WithRawResponseImpl internal constructor(private val clientOptions: ClientOptions) : FileServiceAsync.WithRawResponse { - private val errorHandler: Handler = errorHandler(clientOptions.jsonMapper) + private val errorHandler: Handler = + errorHandler(errorBodyHandler(clientOptions.jsonMapper)) private val content: ContentServiceAsync.WithRawResponse by lazy { ContentServiceAsyncImpl.WithRawResponseImpl(clientOptions) @@ -96,7 +96,7 @@ class FileServiceAsyncImpl internal constructor(private val clientOptions: Clien override fun content(): ContentServiceAsync.WithRawResponse = content private val createHandler: Handler = - jsonHandler(clientOptions.jsonMapper).withErrorHandler(errorHandler) + jsonHandler(clientOptions.jsonMapper) override fun create( params: FileCreateParams, @@ -117,7 +117,7 @@ class FileServiceAsyncImpl internal constructor(private val clientOptions: Clien return request .thenComposeAsync { clientOptions.httpClient.executeAsync(it, requestOptions) } .thenApply { response -> - response.parseable { + errorHandler.handle(response).parseable { response .use { createHandler.handle(it) } .also { @@ -131,7 +131,6 @@ class FileServiceAsyncImpl internal constructor(private val clientOptions: Clien private val retrieveHandler: Handler = jsonHandler(clientOptions.jsonMapper) - .withErrorHandler(errorHandler) override fun retrieve( params: FileRetrieveParams, @@ -156,7 +155,7 @@ class FileServiceAsyncImpl internal constructor(private val clientOptions: Clien return request .thenComposeAsync { clientOptions.httpClient.executeAsync(it, requestOptions) } .thenApply { response -> - response.parseable { + errorHandler.handle(response).parseable { response .use { retrieveHandler.handle(it) } .also { @@ -170,7 +169,6 @@ class FileServiceAsyncImpl internal constructor(private val clientOptions: Clien private val listHandler: Handler = jsonHandler(clientOptions.jsonMapper) - .withErrorHandler(errorHandler) override fun list( params: FileListParams, @@ -190,7 +188,7 @@ class FileServiceAsyncImpl internal constructor(private val clientOptions: Clien return request .thenComposeAsync { clientOptions.httpClient.executeAsync(it, requestOptions) } .thenApply { response -> - response.parseable { + errorHandler.handle(response).parseable { response .use { listHandler.handle(it) } .also { @@ -210,7 +208,7 @@ class FileServiceAsyncImpl internal constructor(private val clientOptions: Clien } } - private val deleteHandler: Handler = emptyHandler().withErrorHandler(errorHandler) + private val deleteHandler: Handler = emptyHandler() override fun delete( params: FileDeleteParams, @@ -236,7 +234,9 @@ class FileServiceAsyncImpl internal constructor(private val clientOptions: Clien return request .thenComposeAsync { clientOptions.httpClient.executeAsync(it, requestOptions) } .thenApply { response -> - response.parseable { response.use { deleteHandler.handle(it) } } + 
errorHandler.handle(response).parseable { + response.use { deleteHandler.handle(it) } + } } } } diff --git a/openai-java-core/src/main/kotlin/com/openai/services/async/containers/files/ContentServiceAsyncImpl.kt b/openai-java-core/src/main/kotlin/com/openai/services/async/containers/files/ContentServiceAsyncImpl.kt index cbd87e0e..8f828e93 100644 --- a/openai-java-core/src/main/kotlin/com/openai/services/async/containers/files/ContentServiceAsyncImpl.kt +++ b/openai-java-core/src/main/kotlin/com/openai/services/async/containers/files/ContentServiceAsyncImpl.kt @@ -5,13 +5,13 @@ package com.openai.services.async.containers.files import com.openai.core.ClientOptions import com.openai.core.RequestOptions import com.openai.core.checkRequired +import com.openai.core.handlers.errorBodyHandler import com.openai.core.handlers.errorHandler import com.openai.core.http.HttpMethod import com.openai.core.http.HttpRequest import com.openai.core.http.HttpResponse import com.openai.core.http.HttpResponse.Handler import com.openai.core.prepareAsync -import com.openai.models.ErrorObject import com.openai.models.containers.files.content.ContentRetrieveParams import java.util.concurrent.CompletableFuture import java.util.function.Consumer @@ -39,7 +39,8 @@ class ContentServiceAsyncImpl internal constructor(private val clientOptions: Cl class WithRawResponseImpl internal constructor(private val clientOptions: ClientOptions) : ContentServiceAsync.WithRawResponse { - private val errorHandler: Handler = errorHandler(clientOptions.jsonMapper) + private val errorHandler: Handler = + errorHandler(errorBodyHandler(clientOptions.jsonMapper)) override fun withOptions( modifier: Consumer @@ -69,9 +70,9 @@ class ContentServiceAsyncImpl internal constructor(private val clientOptions: Cl .build() .prepareAsync(clientOptions, params, deploymentModel = null) val requestOptions = requestOptions.applyDefaults(RequestOptions.from(clientOptions)) - return request.thenComposeAsync { - clientOptions.httpClient.executeAsync(it, requestOptions) - } + return request + .thenComposeAsync { clientOptions.httpClient.executeAsync(it, requestOptions) } + .thenApply { response -> errorHandler.handle(response) } } } } diff --git a/openai-java-core/src/main/kotlin/com/openai/services/async/evals/RunServiceAsyncImpl.kt b/openai-java-core/src/main/kotlin/com/openai/services/async/evals/RunServiceAsyncImpl.kt index 7e151c7f..65f4a6a1 100644 --- a/openai-java-core/src/main/kotlin/com/openai/services/async/evals/RunServiceAsyncImpl.kt +++ b/openai-java-core/src/main/kotlin/com/openai/services/async/evals/RunServiceAsyncImpl.kt @@ -5,17 +5,17 @@ package com.openai.services.async.evals import com.openai.core.ClientOptions import com.openai.core.RequestOptions import com.openai.core.checkRequired +import com.openai.core.handlers.errorBodyHandler import com.openai.core.handlers.errorHandler import com.openai.core.handlers.jsonHandler -import com.openai.core.handlers.withErrorHandler import com.openai.core.http.HttpMethod import com.openai.core.http.HttpRequest +import com.openai.core.http.HttpResponse import com.openai.core.http.HttpResponse.Handler import com.openai.core.http.HttpResponseFor import com.openai.core.http.json import com.openai.core.http.parseable import com.openai.core.prepareAsync -import com.openai.models.ErrorObject import com.openai.models.evals.runs.RunCancelParams import com.openai.models.evals.runs.RunCancelResponse import com.openai.models.evals.runs.RunCreateParams @@ -89,7 +89,8 @@ class RunServiceAsyncImpl internal 
constructor(private val clientOptions: Client class WithRawResponseImpl internal constructor(private val clientOptions: ClientOptions) : RunServiceAsync.WithRawResponse { - private val errorHandler: Handler = errorHandler(clientOptions.jsonMapper) + private val errorHandler: Handler = + errorHandler(errorBodyHandler(clientOptions.jsonMapper)) private val outputItems: OutputItemServiceAsync.WithRawResponse by lazy { OutputItemServiceAsyncImpl.WithRawResponseImpl(clientOptions) @@ -105,7 +106,7 @@ class RunServiceAsyncImpl internal constructor(private val clientOptions: Client override fun outputItems(): OutputItemServiceAsync.WithRawResponse = outputItems private val createHandler: Handler = - jsonHandler(clientOptions.jsonMapper).withErrorHandler(errorHandler) + jsonHandler(clientOptions.jsonMapper) override fun create( params: RunCreateParams, @@ -126,7 +127,7 @@ class RunServiceAsyncImpl internal constructor(private val clientOptions: Client return request .thenComposeAsync { clientOptions.httpClient.executeAsync(it, requestOptions) } .thenApply { response -> - response.parseable { + errorHandler.handle(response).parseable { response .use { createHandler.handle(it) } .also { @@ -140,7 +141,6 @@ class RunServiceAsyncImpl internal constructor(private val clientOptions: Client private val retrieveHandler: Handler = jsonHandler(clientOptions.jsonMapper) - .withErrorHandler(errorHandler) override fun retrieve( params: RunRetrieveParams, @@ -160,7 +160,7 @@ class RunServiceAsyncImpl internal constructor(private val clientOptions: Client return request .thenComposeAsync { clientOptions.httpClient.executeAsync(it, requestOptions) } .thenApply { response -> - response.parseable { + errorHandler.handle(response).parseable { response .use { retrieveHandler.handle(it) } .also { @@ -174,7 +174,6 @@ class RunServiceAsyncImpl internal constructor(private val clientOptions: Client private val listHandler: Handler = jsonHandler(clientOptions.jsonMapper) - .withErrorHandler(errorHandler) override fun list( params: RunListParams, @@ -194,7 +193,7 @@ class RunServiceAsyncImpl internal constructor(private val clientOptions: Client return request .thenComposeAsync { clientOptions.httpClient.executeAsync(it, requestOptions) } .thenApply { response -> - response.parseable { + errorHandler.handle(response).parseable { response .use { listHandler.handle(it) } .also { @@ -215,7 +214,7 @@ class RunServiceAsyncImpl internal constructor(private val clientOptions: Client } private val deleteHandler: Handler = - jsonHandler(clientOptions.jsonMapper).withErrorHandler(errorHandler) + jsonHandler(clientOptions.jsonMapper) override fun delete( params: RunDeleteParams, @@ -236,7 +235,7 @@ class RunServiceAsyncImpl internal constructor(private val clientOptions: Client return request .thenComposeAsync { clientOptions.httpClient.executeAsync(it, requestOptions) } .thenApply { response -> - response.parseable { + errorHandler.handle(response).parseable { response .use { deleteHandler.handle(it) } .also { @@ -249,7 +248,7 @@ class RunServiceAsyncImpl internal constructor(private val clientOptions: Client } private val cancelHandler: Handler = - jsonHandler(clientOptions.jsonMapper).withErrorHandler(errorHandler) + jsonHandler(clientOptions.jsonMapper) override fun cancel( params: RunCancelParams, @@ -270,7 +269,7 @@ class RunServiceAsyncImpl internal constructor(private val clientOptions: Client return request .thenComposeAsync { clientOptions.httpClient.executeAsync(it, requestOptions) } .thenApply { response -> - 
response.parseable { + errorHandler.handle(response).parseable { response .use { cancelHandler.handle(it) } .also { diff --git a/openai-java-core/src/main/kotlin/com/openai/services/async/evals/runs/OutputItemServiceAsyncImpl.kt b/openai-java-core/src/main/kotlin/com/openai/services/async/evals/runs/OutputItemServiceAsyncImpl.kt index 32e6c0fe..8ca10fa7 100644 --- a/openai-java-core/src/main/kotlin/com/openai/services/async/evals/runs/OutputItemServiceAsyncImpl.kt +++ b/openai-java-core/src/main/kotlin/com/openai/services/async/evals/runs/OutputItemServiceAsyncImpl.kt @@ -5,16 +5,16 @@ package com.openai.services.async.evals.runs import com.openai.core.ClientOptions import com.openai.core.RequestOptions import com.openai.core.checkRequired +import com.openai.core.handlers.errorBodyHandler import com.openai.core.handlers.errorHandler import com.openai.core.handlers.jsonHandler -import com.openai.core.handlers.withErrorHandler import com.openai.core.http.HttpMethod import com.openai.core.http.HttpRequest +import com.openai.core.http.HttpResponse import com.openai.core.http.HttpResponse.Handler import com.openai.core.http.HttpResponseFor import com.openai.core.http.parseable import com.openai.core.prepareAsync -import com.openai.models.ErrorObject import com.openai.models.evals.runs.outputitems.OutputItemListPageAsync import com.openai.models.evals.runs.outputitems.OutputItemListPageResponse import com.openai.models.evals.runs.outputitems.OutputItemListParams @@ -53,7 +53,8 @@ class OutputItemServiceAsyncImpl internal constructor(private val clientOptions: class WithRawResponseImpl internal constructor(private val clientOptions: ClientOptions) : OutputItemServiceAsync.WithRawResponse { - private val errorHandler: Handler = errorHandler(clientOptions.jsonMapper) + private val errorHandler: Handler = + errorHandler(errorBodyHandler(clientOptions.jsonMapper)) override fun withOptions( modifier: Consumer @@ -64,7 +65,6 @@ class OutputItemServiceAsyncImpl internal constructor(private val clientOptions: private val retrieveHandler: Handler = jsonHandler(clientOptions.jsonMapper) - .withErrorHandler(errorHandler) override fun retrieve( params: OutputItemRetrieveParams, @@ -91,7 +91,7 @@ class OutputItemServiceAsyncImpl internal constructor(private val clientOptions: return request .thenComposeAsync { clientOptions.httpClient.executeAsync(it, requestOptions) } .thenApply { response -> - response.parseable { + errorHandler.handle(response).parseable { response .use { retrieveHandler.handle(it) } .also { @@ -105,7 +105,6 @@ class OutputItemServiceAsyncImpl internal constructor(private val clientOptions: private val listHandler: Handler = jsonHandler(clientOptions.jsonMapper) - .withErrorHandler(errorHandler) override fun list( params: OutputItemListParams, @@ -131,7 +130,7 @@ class OutputItemServiceAsyncImpl internal constructor(private val clientOptions: return request .thenComposeAsync { clientOptions.httpClient.executeAsync(it, requestOptions) } .thenApply { response -> - response.parseable { + errorHandler.handle(response).parseable { response .use { listHandler.handle(it) } .also { diff --git a/openai-java-core/src/main/kotlin/com/openai/services/async/finetuning/JobServiceAsyncImpl.kt b/openai-java-core/src/main/kotlin/com/openai/services/async/finetuning/JobServiceAsyncImpl.kt index d49c92ca..f5f96ab8 100644 --- a/openai-java-core/src/main/kotlin/com/openai/services/async/finetuning/JobServiceAsyncImpl.kt +++ 
b/openai-java-core/src/main/kotlin/com/openai/services/async/finetuning/JobServiceAsyncImpl.kt @@ -5,17 +5,17 @@ package com.openai.services.async.finetuning import com.openai.core.ClientOptions import com.openai.core.RequestOptions import com.openai.core.checkRequired +import com.openai.core.handlers.errorBodyHandler import com.openai.core.handlers.errorHandler import com.openai.core.handlers.jsonHandler -import com.openai.core.handlers.withErrorHandler import com.openai.core.http.HttpMethod import com.openai.core.http.HttpRequest +import com.openai.core.http.HttpResponse import com.openai.core.http.HttpResponse.Handler import com.openai.core.http.HttpResponseFor import com.openai.core.http.json import com.openai.core.http.parseable import com.openai.core.prepareAsync -import com.openai.models.ErrorObject import com.openai.models.finetuning.jobs.FineTuningJob import com.openai.models.finetuning.jobs.JobCancelParams import com.openai.models.finetuning.jobs.JobCreateParams @@ -104,7 +104,8 @@ class JobServiceAsyncImpl internal constructor(private val clientOptions: Client class WithRawResponseImpl internal constructor(private val clientOptions: ClientOptions) : JobServiceAsync.WithRawResponse { - private val errorHandler: Handler = errorHandler(clientOptions.jsonMapper) + private val errorHandler: Handler = + errorHandler(errorBodyHandler(clientOptions.jsonMapper)) private val checkpoints: CheckpointServiceAsync.WithRawResponse by lazy { CheckpointServiceAsyncImpl.WithRawResponseImpl(clientOptions) @@ -120,7 +121,7 @@ class JobServiceAsyncImpl internal constructor(private val clientOptions: Client override fun checkpoints(): CheckpointServiceAsync.WithRawResponse = checkpoints private val createHandler: Handler = - jsonHandler(clientOptions.jsonMapper).withErrorHandler(errorHandler) + jsonHandler(clientOptions.jsonMapper) override fun create( params: JobCreateParams, @@ -138,7 +139,7 @@ class JobServiceAsyncImpl internal constructor(private val clientOptions: Client return request .thenComposeAsync { clientOptions.httpClient.executeAsync(it, requestOptions) } .thenApply { response -> - response.parseable { + errorHandler.handle(response).parseable { response .use { createHandler.handle(it) } .also { @@ -151,7 +152,7 @@ class JobServiceAsyncImpl internal constructor(private val clientOptions: Client } private val retrieveHandler: Handler = - jsonHandler(clientOptions.jsonMapper).withErrorHandler(errorHandler) + jsonHandler(clientOptions.jsonMapper) override fun retrieve( params: JobRetrieveParams, @@ -171,7 +172,7 @@ class JobServiceAsyncImpl internal constructor(private val clientOptions: Client return request .thenComposeAsync { clientOptions.httpClient.executeAsync(it, requestOptions) } .thenApply { response -> - response.parseable { + errorHandler.handle(response).parseable { response .use { retrieveHandler.handle(it) } .also { @@ -185,7 +186,6 @@ class JobServiceAsyncImpl internal constructor(private val clientOptions: Client private val listHandler: Handler = jsonHandler(clientOptions.jsonMapper) - .withErrorHandler(errorHandler) override fun list( params: JobListParams, @@ -202,7 +202,7 @@ class JobServiceAsyncImpl internal constructor(private val clientOptions: Client return request .thenComposeAsync { clientOptions.httpClient.executeAsync(it, requestOptions) } .thenApply { response -> - response.parseable { + errorHandler.handle(response).parseable { response .use { listHandler.handle(it) } .also { @@ -223,7 +223,7 @@ class JobServiceAsyncImpl internal constructor(private val 
clientOptions: Client } private val cancelHandler: Handler = - jsonHandler(clientOptions.jsonMapper).withErrorHandler(errorHandler) + jsonHandler(clientOptions.jsonMapper) override fun cancel( params: JobCancelParams, @@ -244,7 +244,7 @@ class JobServiceAsyncImpl internal constructor(private val clientOptions: Client return request .thenComposeAsync { clientOptions.httpClient.executeAsync(it, requestOptions) } .thenApply { response -> - response.parseable { + errorHandler.handle(response).parseable { response .use { cancelHandler.handle(it) } .also { @@ -258,7 +258,6 @@ class JobServiceAsyncImpl internal constructor(private val clientOptions: Client private val listEventsHandler: Handler = jsonHandler(clientOptions.jsonMapper) - .withErrorHandler(errorHandler) override fun listEvents( params: JobListEventsParams, @@ -278,7 +277,7 @@ class JobServiceAsyncImpl internal constructor(private val clientOptions: Client return request .thenComposeAsync { clientOptions.httpClient.executeAsync(it, requestOptions) } .thenApply { response -> - response.parseable { + errorHandler.handle(response).parseable { response .use { listEventsHandler.handle(it) } .also { @@ -299,7 +298,7 @@ class JobServiceAsyncImpl internal constructor(private val clientOptions: Client } private val pauseHandler: Handler = - jsonHandler(clientOptions.jsonMapper).withErrorHandler(errorHandler) + jsonHandler(clientOptions.jsonMapper) override fun pause( params: JobPauseParams, @@ -320,7 +319,7 @@ class JobServiceAsyncImpl internal constructor(private val clientOptions: Client return request .thenComposeAsync { clientOptions.httpClient.executeAsync(it, requestOptions) } .thenApply { response -> - response.parseable { + errorHandler.handle(response).parseable { response .use { pauseHandler.handle(it) } .also { @@ -333,7 +332,7 @@ class JobServiceAsyncImpl internal constructor(private val clientOptions: Client } private val resumeHandler: Handler = - jsonHandler(clientOptions.jsonMapper).withErrorHandler(errorHandler) + jsonHandler(clientOptions.jsonMapper) override fun resume( params: JobResumeParams, @@ -354,7 +353,7 @@ class JobServiceAsyncImpl internal constructor(private val clientOptions: Client return request .thenComposeAsync { clientOptions.httpClient.executeAsync(it, requestOptions) } .thenApply { response -> - response.parseable { + errorHandler.handle(response).parseable { response .use { resumeHandler.handle(it) } .also { diff --git a/openai-java-core/src/main/kotlin/com/openai/services/async/finetuning/alpha/GraderServiceAsyncImpl.kt b/openai-java-core/src/main/kotlin/com/openai/services/async/finetuning/alpha/GraderServiceAsyncImpl.kt index dfedf9ba..f20fd900 100644 --- a/openai-java-core/src/main/kotlin/com/openai/services/async/finetuning/alpha/GraderServiceAsyncImpl.kt +++ b/openai-java-core/src/main/kotlin/com/openai/services/async/finetuning/alpha/GraderServiceAsyncImpl.kt @@ -4,17 +4,17 @@ package com.openai.services.async.finetuning.alpha import com.openai.core.ClientOptions import com.openai.core.RequestOptions +import com.openai.core.handlers.errorBodyHandler import com.openai.core.handlers.errorHandler import com.openai.core.handlers.jsonHandler -import com.openai.core.handlers.withErrorHandler import com.openai.core.http.HttpMethod import com.openai.core.http.HttpRequest +import com.openai.core.http.HttpResponse import com.openai.core.http.HttpResponse.Handler import com.openai.core.http.HttpResponseFor import com.openai.core.http.json import com.openai.core.http.parseable import 
com.openai.core.prepareAsync -import com.openai.models.ErrorObject import com.openai.models.finetuning.alpha.graders.GraderRunParams import com.openai.models.finetuning.alpha.graders.GraderRunResponse import com.openai.models.finetuning.alpha.graders.GraderValidateParams @@ -51,7 +51,8 @@ class GraderServiceAsyncImpl internal constructor(private val clientOptions: Cli class WithRawResponseImpl internal constructor(private val clientOptions: ClientOptions) : GraderServiceAsync.WithRawResponse { - private val errorHandler: Handler = errorHandler(clientOptions.jsonMapper) + private val errorHandler: Handler = + errorHandler(errorBodyHandler(clientOptions.jsonMapper)) override fun withOptions( modifier: Consumer @@ -61,7 +62,7 @@ class GraderServiceAsyncImpl internal constructor(private val clientOptions: Cli ) private val runHandler: Handler = - jsonHandler(clientOptions.jsonMapper).withErrorHandler(errorHandler) + jsonHandler(clientOptions.jsonMapper) override fun run( params: GraderRunParams, @@ -79,7 +80,7 @@ class GraderServiceAsyncImpl internal constructor(private val clientOptions: Cli return request .thenComposeAsync { clientOptions.httpClient.executeAsync(it, requestOptions) } .thenApply { response -> - response.parseable { + errorHandler.handle(response).parseable { response .use { runHandler.handle(it) } .also { @@ -93,7 +94,6 @@ class GraderServiceAsyncImpl internal constructor(private val clientOptions: Cli private val validateHandler: Handler = jsonHandler(clientOptions.jsonMapper) - .withErrorHandler(errorHandler) override fun validate( params: GraderValidateParams, @@ -111,7 +111,7 @@ class GraderServiceAsyncImpl internal constructor(private val clientOptions: Cli return request .thenComposeAsync { clientOptions.httpClient.executeAsync(it, requestOptions) } .thenApply { response -> - response.parseable { + errorHandler.handle(response).parseable { response .use { validateHandler.handle(it) } .also { diff --git a/openai-java-core/src/main/kotlin/com/openai/services/async/finetuning/checkpoints/PermissionServiceAsyncImpl.kt b/openai-java-core/src/main/kotlin/com/openai/services/async/finetuning/checkpoints/PermissionServiceAsyncImpl.kt index 10f6b97e..28dd3a60 100644 --- a/openai-java-core/src/main/kotlin/com/openai/services/async/finetuning/checkpoints/PermissionServiceAsyncImpl.kt +++ b/openai-java-core/src/main/kotlin/com/openai/services/async/finetuning/checkpoints/PermissionServiceAsyncImpl.kt @@ -5,17 +5,17 @@ package com.openai.services.async.finetuning.checkpoints import com.openai.core.ClientOptions import com.openai.core.RequestOptions import com.openai.core.checkRequired +import com.openai.core.handlers.errorBodyHandler import com.openai.core.handlers.errorHandler import com.openai.core.handlers.jsonHandler -import com.openai.core.handlers.withErrorHandler import com.openai.core.http.HttpMethod import com.openai.core.http.HttpRequest +import com.openai.core.http.HttpResponse import com.openai.core.http.HttpResponse.Handler import com.openai.core.http.HttpResponseFor import com.openai.core.http.json import com.openai.core.http.parseable import com.openai.core.prepareAsync -import com.openai.models.ErrorObject import com.openai.models.finetuning.checkpoints.permissions.PermissionCreatePageAsync import com.openai.models.finetuning.checkpoints.permissions.PermissionCreatePageResponse import com.openai.models.finetuning.checkpoints.permissions.PermissionCreateParams @@ -63,7 +63,8 @@ class PermissionServiceAsyncImpl internal constructor(private val clientOptions: class 
WithRawResponseImpl internal constructor(private val clientOptions: ClientOptions) : PermissionServiceAsync.WithRawResponse { - private val errorHandler: Handler = errorHandler(clientOptions.jsonMapper) + private val errorHandler: Handler = + errorHandler(errorBodyHandler(clientOptions.jsonMapper)) override fun withOptions( modifier: Consumer @@ -74,7 +75,6 @@ class PermissionServiceAsyncImpl internal constructor(private val clientOptions: private val createHandler: Handler = jsonHandler(clientOptions.jsonMapper) - .withErrorHandler(errorHandler) override fun create( params: PermissionCreateParams, @@ -100,7 +100,7 @@ class PermissionServiceAsyncImpl internal constructor(private val clientOptions: return request .thenComposeAsync { clientOptions.httpClient.executeAsync(it, requestOptions) } .thenApply { response -> - response.parseable { + errorHandler.handle(response).parseable { response .use { createHandler.handle(it) } .also { @@ -122,7 +122,6 @@ class PermissionServiceAsyncImpl internal constructor(private val clientOptions: private val retrieveHandler: Handler = jsonHandler(clientOptions.jsonMapper) - .withErrorHandler(errorHandler) override fun retrieve( params: PermissionRetrieveParams, @@ -147,7 +146,7 @@ class PermissionServiceAsyncImpl internal constructor(private val clientOptions: return request .thenComposeAsync { clientOptions.httpClient.executeAsync(it, requestOptions) } .thenApply { response -> - response.parseable { + errorHandler.handle(response).parseable { response .use { retrieveHandler.handle(it) } .also { @@ -161,7 +160,6 @@ class PermissionServiceAsyncImpl internal constructor(private val clientOptions: private val deleteHandler: Handler = jsonHandler(clientOptions.jsonMapper) - .withErrorHandler(errorHandler) override fun delete( params: PermissionDeleteParams, @@ -188,7 +186,7 @@ class PermissionServiceAsyncImpl internal constructor(private val clientOptions: return request .thenComposeAsync { clientOptions.httpClient.executeAsync(it, requestOptions) } .thenApply { response -> - response.parseable { + errorHandler.handle(response).parseable { response .use { deleteHandler.handle(it) } .also { diff --git a/openai-java-core/src/main/kotlin/com/openai/services/async/finetuning/jobs/CheckpointServiceAsyncImpl.kt b/openai-java-core/src/main/kotlin/com/openai/services/async/finetuning/jobs/CheckpointServiceAsyncImpl.kt index e033cb35..7ba5aa60 100644 --- a/openai-java-core/src/main/kotlin/com/openai/services/async/finetuning/jobs/CheckpointServiceAsyncImpl.kt +++ b/openai-java-core/src/main/kotlin/com/openai/services/async/finetuning/jobs/CheckpointServiceAsyncImpl.kt @@ -5,16 +5,16 @@ package com.openai.services.async.finetuning.jobs import com.openai.core.ClientOptions import com.openai.core.RequestOptions import com.openai.core.checkRequired +import com.openai.core.handlers.errorBodyHandler import com.openai.core.handlers.errorHandler import com.openai.core.handlers.jsonHandler -import com.openai.core.handlers.withErrorHandler import com.openai.core.http.HttpMethod import com.openai.core.http.HttpRequest +import com.openai.core.http.HttpResponse import com.openai.core.http.HttpResponse.Handler import com.openai.core.http.HttpResponseFor import com.openai.core.http.parseable import com.openai.core.prepareAsync -import com.openai.models.ErrorObject import com.openai.models.finetuning.jobs.checkpoints.CheckpointListPageAsync import com.openai.models.finetuning.jobs.checkpoints.CheckpointListPageResponse import 
com.openai.models.finetuning.jobs.checkpoints.CheckpointListParams @@ -44,7 +44,8 @@ class CheckpointServiceAsyncImpl internal constructor(private val clientOptions: class WithRawResponseImpl internal constructor(private val clientOptions: ClientOptions) : CheckpointServiceAsync.WithRawResponse { - private val errorHandler: Handler = errorHandler(clientOptions.jsonMapper) + private val errorHandler: Handler = + errorHandler(errorBodyHandler(clientOptions.jsonMapper)) override fun withOptions( modifier: Consumer @@ -55,7 +56,6 @@ class CheckpointServiceAsyncImpl internal constructor(private val clientOptions: private val listHandler: Handler = jsonHandler(clientOptions.jsonMapper) - .withErrorHandler(errorHandler) override fun list( params: CheckpointListParams, @@ -75,7 +75,7 @@ class CheckpointServiceAsyncImpl internal constructor(private val clientOptions: return request .thenComposeAsync { clientOptions.httpClient.executeAsync(it, requestOptions) } .thenApply { response -> - response.parseable { + errorHandler.handle(response).parseable { response .use { listHandler.handle(it) } .also { diff --git a/openai-java-core/src/main/kotlin/com/openai/services/async/responses/InputItemServiceAsyncImpl.kt b/openai-java-core/src/main/kotlin/com/openai/services/async/responses/InputItemServiceAsyncImpl.kt index 22bed7ec..1cd778ac 100644 --- a/openai-java-core/src/main/kotlin/com/openai/services/async/responses/InputItemServiceAsyncImpl.kt +++ b/openai-java-core/src/main/kotlin/com/openai/services/async/responses/InputItemServiceAsyncImpl.kt @@ -5,16 +5,16 @@ package com.openai.services.async.responses import com.openai.core.ClientOptions import com.openai.core.RequestOptions import com.openai.core.checkRequired +import com.openai.core.handlers.errorBodyHandler import com.openai.core.handlers.errorHandler import com.openai.core.handlers.jsonHandler -import com.openai.core.handlers.withErrorHandler import com.openai.core.http.HttpMethod import com.openai.core.http.HttpRequest +import com.openai.core.http.HttpResponse import com.openai.core.http.HttpResponse.Handler import com.openai.core.http.HttpResponseFor import com.openai.core.http.parseable import com.openai.core.prepareAsync -import com.openai.models.ErrorObject import com.openai.models.responses.inputitems.InputItemListPageAsync import com.openai.models.responses.inputitems.InputItemListParams import com.openai.models.responses.inputitems.ResponseItemList @@ -44,7 +44,8 @@ class InputItemServiceAsyncImpl internal constructor(private val clientOptions: class WithRawResponseImpl internal constructor(private val clientOptions: ClientOptions) : InputItemServiceAsync.WithRawResponse { - private val errorHandler: Handler = errorHandler(clientOptions.jsonMapper) + private val errorHandler: Handler = + errorHandler(errorBodyHandler(clientOptions.jsonMapper)) override fun withOptions( modifier: Consumer @@ -54,7 +55,7 @@ class InputItemServiceAsyncImpl internal constructor(private val clientOptions: ) private val listHandler: Handler = - jsonHandler(clientOptions.jsonMapper).withErrorHandler(errorHandler) + jsonHandler(clientOptions.jsonMapper) override fun list( params: InputItemListParams, @@ -74,7 +75,7 @@ class InputItemServiceAsyncImpl internal constructor(private val clientOptions: return request .thenComposeAsync { clientOptions.httpClient.executeAsync(it, requestOptions) } .thenApply { response -> - response.parseable { + errorHandler.handle(response).parseable { response .use { listHandler.handle(it) } .also { diff --git 
a/openai-java-core/src/main/kotlin/com/openai/services/async/uploads/PartServiceAsyncImpl.kt b/openai-java-core/src/main/kotlin/com/openai/services/async/uploads/PartServiceAsyncImpl.kt index 504741cf..5f598711 100644 --- a/openai-java-core/src/main/kotlin/com/openai/services/async/uploads/PartServiceAsyncImpl.kt +++ b/openai-java-core/src/main/kotlin/com/openai/services/async/uploads/PartServiceAsyncImpl.kt @@ -5,17 +5,17 @@ package com.openai.services.async.uploads import com.openai.core.ClientOptions import com.openai.core.RequestOptions import com.openai.core.checkRequired +import com.openai.core.handlers.errorBodyHandler import com.openai.core.handlers.errorHandler import com.openai.core.handlers.jsonHandler -import com.openai.core.handlers.withErrorHandler import com.openai.core.http.HttpMethod import com.openai.core.http.HttpRequest +import com.openai.core.http.HttpResponse import com.openai.core.http.HttpResponse.Handler import com.openai.core.http.HttpResponseFor import com.openai.core.http.multipartFormData import com.openai.core.http.parseable import com.openai.core.prepareAsync -import com.openai.models.ErrorObject import com.openai.models.uploads.parts.PartCreateParams import com.openai.models.uploads.parts.UploadPart import java.util.concurrent.CompletableFuture @@ -44,7 +44,8 @@ class PartServiceAsyncImpl internal constructor(private val clientOptions: Clien class WithRawResponseImpl internal constructor(private val clientOptions: ClientOptions) : PartServiceAsync.WithRawResponse { - private val errorHandler: Handler = errorHandler(clientOptions.jsonMapper) + private val errorHandler: Handler = + errorHandler(errorBodyHandler(clientOptions.jsonMapper)) override fun withOptions( modifier: Consumer @@ -54,7 +55,7 @@ class PartServiceAsyncImpl internal constructor(private val clientOptions: Clien ) private val createHandler: Handler = - jsonHandler(clientOptions.jsonMapper).withErrorHandler(errorHandler) + jsonHandler(clientOptions.jsonMapper) override fun create( params: PartCreateParams, @@ -75,7 +76,7 @@ class PartServiceAsyncImpl internal constructor(private val clientOptions: Clien return request .thenComposeAsync { clientOptions.httpClient.executeAsync(it, requestOptions) } .thenApply { response -> - response.parseable { + errorHandler.handle(response).parseable { response .use { createHandler.handle(it) } .also { diff --git a/openai-java-core/src/main/kotlin/com/openai/services/async/vectorstores/FileBatchServiceAsyncImpl.kt b/openai-java-core/src/main/kotlin/com/openai/services/async/vectorstores/FileBatchServiceAsyncImpl.kt index f579924c..b35b930a 100644 --- a/openai-java-core/src/main/kotlin/com/openai/services/async/vectorstores/FileBatchServiceAsyncImpl.kt +++ b/openai-java-core/src/main/kotlin/com/openai/services/async/vectorstores/FileBatchServiceAsyncImpl.kt @@ -5,18 +5,18 @@ package com.openai.services.async.vectorstores import com.openai.core.ClientOptions import com.openai.core.RequestOptions import com.openai.core.checkRequired +import com.openai.core.handlers.errorBodyHandler import com.openai.core.handlers.errorHandler import com.openai.core.handlers.jsonHandler -import com.openai.core.handlers.withErrorHandler import com.openai.core.http.Headers import com.openai.core.http.HttpMethod import com.openai.core.http.HttpRequest +import com.openai.core.http.HttpResponse import com.openai.core.http.HttpResponse.Handler import com.openai.core.http.HttpResponseFor import com.openai.core.http.json import com.openai.core.http.parseable import 
com.openai.core.prepareAsync -import com.openai.models.ErrorObject import com.openai.models.vectorstores.filebatches.FileBatchCancelParams import com.openai.models.vectorstores.filebatches.FileBatchCreateParams import com.openai.models.vectorstores.filebatches.FileBatchListFilesPageAsync @@ -76,7 +76,8 @@ class FileBatchServiceAsyncImpl internal constructor(private val clientOptions: class WithRawResponseImpl internal constructor(private val clientOptions: ClientOptions) : FileBatchServiceAsync.WithRawResponse { - private val errorHandler: Handler = errorHandler(clientOptions.jsonMapper) + private val errorHandler: Handler = + errorHandler(errorBodyHandler(clientOptions.jsonMapper)) override fun withOptions( modifier: Consumer @@ -87,7 +88,6 @@ class FileBatchServiceAsyncImpl internal constructor(private val clientOptions: private val createHandler: Handler = jsonHandler(clientOptions.jsonMapper) - .withErrorHandler(errorHandler) override fun create( params: FileBatchCreateParams, @@ -109,7 +109,7 @@ class FileBatchServiceAsyncImpl internal constructor(private val clientOptions: return request .thenComposeAsync { clientOptions.httpClient.executeAsync(it, requestOptions) } .thenApply { response -> - response.parseable { + errorHandler.handle(response).parseable { response .use { createHandler.handle(it) } .also { @@ -123,7 +123,6 @@ class FileBatchServiceAsyncImpl internal constructor(private val clientOptions: private val retrieveHandler: Handler = jsonHandler(clientOptions.jsonMapper) - .withErrorHandler(errorHandler) override fun retrieve( params: FileBatchRetrieveParams, @@ -149,7 +148,7 @@ class FileBatchServiceAsyncImpl internal constructor(private val clientOptions: return request .thenComposeAsync { clientOptions.httpClient.executeAsync(it, requestOptions) } .thenApply { response -> - response.parseable { + errorHandler.handle(response).parseable { response .use { retrieveHandler.handle(it) } .also { @@ -163,7 +162,6 @@ class FileBatchServiceAsyncImpl internal constructor(private val clientOptions: private val cancelHandler: Handler = jsonHandler(clientOptions.jsonMapper) - .withErrorHandler(errorHandler) override fun cancel( params: FileBatchCancelParams, @@ -191,7 +189,7 @@ class FileBatchServiceAsyncImpl internal constructor(private val clientOptions: return request .thenComposeAsync { clientOptions.httpClient.executeAsync(it, requestOptions) } .thenApply { response -> - response.parseable { + errorHandler.handle(response).parseable { response .use { cancelHandler.handle(it) } .also { @@ -205,7 +203,6 @@ class FileBatchServiceAsyncImpl internal constructor(private val clientOptions: private val listFilesHandler: Handler = jsonHandler(clientOptions.jsonMapper) - .withErrorHandler(errorHandler) override fun listFiles( params: FileBatchListFilesParams, @@ -232,7 +229,7 @@ class FileBatchServiceAsyncImpl internal constructor(private val clientOptions: return request .thenComposeAsync { clientOptions.httpClient.executeAsync(it, requestOptions) } .thenApply { response -> - response.parseable { + errorHandler.handle(response).parseable { response .use { listFilesHandler.handle(it) } .also { diff --git a/openai-java-core/src/main/kotlin/com/openai/services/async/vectorstores/FileServiceAsyncImpl.kt b/openai-java-core/src/main/kotlin/com/openai/services/async/vectorstores/FileServiceAsyncImpl.kt index f8ac69ed..6c738a3d 100644 --- a/openai-java-core/src/main/kotlin/com/openai/services/async/vectorstores/FileServiceAsyncImpl.kt +++ 
b/openai-java-core/src/main/kotlin/com/openai/services/async/vectorstores/FileServiceAsyncImpl.kt @@ -5,18 +5,18 @@ package com.openai.services.async.vectorstores import com.openai.core.ClientOptions import com.openai.core.RequestOptions import com.openai.core.checkRequired +import com.openai.core.handlers.errorBodyHandler import com.openai.core.handlers.errorHandler import com.openai.core.handlers.jsonHandler -import com.openai.core.handlers.withErrorHandler import com.openai.core.http.Headers import com.openai.core.http.HttpMethod import com.openai.core.http.HttpRequest +import com.openai.core.http.HttpResponse import com.openai.core.http.HttpResponse.Handler import com.openai.core.http.HttpResponseFor import com.openai.core.http.json import com.openai.core.http.parseable import com.openai.core.prepareAsync -import com.openai.models.ErrorObject import com.openai.models.vectorstores.files.FileContentPageAsync import com.openai.models.vectorstores.files.FileContentPageResponse import com.openai.models.vectorstores.files.FileContentParams @@ -95,7 +95,8 @@ class FileServiceAsyncImpl internal constructor(private val clientOptions: Clien class WithRawResponseImpl internal constructor(private val clientOptions: ClientOptions) : FileServiceAsync.WithRawResponse { - private val errorHandler: Handler = errorHandler(clientOptions.jsonMapper) + private val errorHandler: Handler = + errorHandler(errorBodyHandler(clientOptions.jsonMapper)) override fun withOptions( modifier: Consumer @@ -105,7 +106,7 @@ class FileServiceAsyncImpl internal constructor(private val clientOptions: Clien ) private val createHandler: Handler = - jsonHandler(clientOptions.jsonMapper).withErrorHandler(errorHandler) + jsonHandler(clientOptions.jsonMapper) override fun create( params: FileCreateParams, @@ -127,7 +128,7 @@ class FileServiceAsyncImpl internal constructor(private val clientOptions: Clien return request .thenComposeAsync { clientOptions.httpClient.executeAsync(it, requestOptions) } .thenApply { response -> - response.parseable { + errorHandler.handle(response).parseable { response .use { createHandler.handle(it) } .also { @@ -140,7 +141,7 @@ class FileServiceAsyncImpl internal constructor(private val clientOptions: Clien } private val retrieveHandler: Handler = - jsonHandler(clientOptions.jsonMapper).withErrorHandler(errorHandler) + jsonHandler(clientOptions.jsonMapper) override fun retrieve( params: FileRetrieveParams, @@ -166,7 +167,7 @@ class FileServiceAsyncImpl internal constructor(private val clientOptions: Clien return request .thenComposeAsync { clientOptions.httpClient.executeAsync(it, requestOptions) } .thenApply { response -> - response.parseable { + errorHandler.handle(response).parseable { response .use { retrieveHandler.handle(it) } .also { @@ -179,7 +180,7 @@ class FileServiceAsyncImpl internal constructor(private val clientOptions: Clien } private val updateHandler: Handler = - jsonHandler(clientOptions.jsonMapper).withErrorHandler(errorHandler) + jsonHandler(clientOptions.jsonMapper) override fun update( params: FileUpdateParams, @@ -206,7 +207,7 @@ class FileServiceAsyncImpl internal constructor(private val clientOptions: Clien return request .thenComposeAsync { clientOptions.httpClient.executeAsync(it, requestOptions) } .thenApply { response -> - response.parseable { + errorHandler.handle(response).parseable { response .use { updateHandler.handle(it) } .also { @@ -220,7 +221,6 @@ class FileServiceAsyncImpl internal constructor(private val clientOptions: Clien private val listHandler: Handler = 
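// A minimal sketch of the refactor these hunks repeat: instead of composing error handling
// into every body handler via `withErrorHandler(...)`, the raw response is passed through a
// single `errorHandler.handle(response)` at the boundary (synchronously, or inside
// `.thenApply { ... }` for the async services) and only then parsed. All names below
// (SketchHandler, SketchResponse, SketchApiException, sketch*) are hypothetical stand-ins
// used for illustration, not the SDK's real types.

import java.util.concurrent.CompletableFuture

fun interface SketchHandler<T> {
    fun handle(response: SketchResponse): T
}

data class SketchResponse(val statusCode: Int, val body: String)

class SketchApiException(message: String) : RuntimeException(message)

// One handler owns error checking; on success it returns the response unchanged.
val sketchErrorHandler: SketchHandler<SketchResponse> = SketchHandler { response ->
    if (response.statusCode !in 200..299) {
        throw SketchApiException("HTTP ${response.statusCode}: ${response.body}")
    }
    response
}

// Body handlers now only parse; they no longer wrap themselves with an error handler.
val sketchBodyHandler: SketchHandler<String> = SketchHandler { response -> response.body.trim() }

// Blocking shape, mirroring the `return errorHandler.handle(response).parseable { ... }` hunks.
fun sketchParse(response: SketchResponse): String =
    sketchErrorHandler.handle(response).let { checked -> sketchBodyHandler.handle(checked) }

// Async shape, mirroring the `.thenApply { response -> errorHandler.handle(response)... }` hunks.
fun sketchParseAsync(pending: CompletableFuture<SketchResponse>): CompletableFuture<String> =
    pending.thenApply { response -> sketchBodyHandler.handle(sketchErrorHandler.handle(response)) }

fun main() {
    println(sketchParse(SketchResponse(200, " ok ")))                                            // ok
    println(sketchParseAsync(CompletableFuture.completedFuture(SketchResponse(200, " ok "))).join())
    runCatching { sketchParse(SketchResponse(404, "not found")) }
        .onFailure { println(it.message) }                                                       // HTTP 404: not found
}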
jsonHandler(clientOptions.jsonMapper) - .withErrorHandler(errorHandler) override fun list( params: FileListParams, @@ -241,7 +241,7 @@ class FileServiceAsyncImpl internal constructor(private val clientOptions: Clien return request .thenComposeAsync { clientOptions.httpClient.executeAsync(it, requestOptions) } .thenApply { response -> - response.parseable { + errorHandler.handle(response).parseable { response .use { listHandler.handle(it) } .also { @@ -263,7 +263,6 @@ class FileServiceAsyncImpl internal constructor(private val clientOptions: Clien private val deleteHandler: Handler = jsonHandler(clientOptions.jsonMapper) - .withErrorHandler(errorHandler) override fun delete( params: FileDeleteParams, @@ -290,7 +289,7 @@ class FileServiceAsyncImpl internal constructor(private val clientOptions: Clien return request .thenComposeAsync { clientOptions.httpClient.executeAsync(it, requestOptions) } .thenApply { response -> - response.parseable { + errorHandler.handle(response).parseable { response .use { deleteHandler.handle(it) } .also { @@ -304,7 +303,6 @@ class FileServiceAsyncImpl internal constructor(private val clientOptions: Clien private val contentHandler: Handler = jsonHandler(clientOptions.jsonMapper) - .withErrorHandler(errorHandler) override fun content( params: FileContentParams, @@ -331,7 +329,7 @@ class FileServiceAsyncImpl internal constructor(private val clientOptions: Clien return request .thenComposeAsync { clientOptions.httpClient.executeAsync(it, requestOptions) } .thenApply { response -> - response.parseable { + errorHandler.handle(response).parseable { response .use { contentHandler.handle(it) } .also { diff --git a/openai-java-core/src/main/kotlin/com/openai/services/blocking/BatchServiceImpl.kt b/openai-java-core/src/main/kotlin/com/openai/services/blocking/BatchServiceImpl.kt index 7302baba..02515868 100644 --- a/openai-java-core/src/main/kotlin/com/openai/services/blocking/BatchServiceImpl.kt +++ b/openai-java-core/src/main/kotlin/com/openai/services/blocking/BatchServiceImpl.kt @@ -5,17 +5,17 @@ package com.openai.services.blocking import com.openai.core.ClientOptions import com.openai.core.RequestOptions import com.openai.core.checkRequired +import com.openai.core.handlers.errorBodyHandler import com.openai.core.handlers.errorHandler import com.openai.core.handlers.jsonHandler -import com.openai.core.handlers.withErrorHandler import com.openai.core.http.HttpMethod import com.openai.core.http.HttpRequest +import com.openai.core.http.HttpResponse import com.openai.core.http.HttpResponse.Handler import com.openai.core.http.HttpResponseFor import com.openai.core.http.json import com.openai.core.http.parseable import com.openai.core.prepare -import com.openai.models.ErrorObject import com.openai.models.batches.Batch import com.openai.models.batches.BatchCancelParams import com.openai.models.batches.BatchCreateParams @@ -57,7 +57,8 @@ class BatchServiceImpl internal constructor(private val clientOptions: ClientOpt class WithRawResponseImpl internal constructor(private val clientOptions: ClientOptions) : BatchService.WithRawResponse { - private val errorHandler: Handler = errorHandler(clientOptions.jsonMapper) + private val errorHandler: Handler = + errorHandler(errorBodyHandler(clientOptions.jsonMapper)) override fun withOptions( modifier: Consumer @@ -66,8 +67,7 @@ class BatchServiceImpl internal constructor(private val clientOptions: ClientOpt clientOptions.toBuilder().apply(modifier::accept).build() ) - private val createHandler: Handler = - 
jsonHandler(clientOptions.jsonMapper).withErrorHandler(errorHandler) + private val createHandler: Handler = jsonHandler(clientOptions.jsonMapper) override fun create( params: BatchCreateParams, @@ -83,7 +83,7 @@ class BatchServiceImpl internal constructor(private val clientOptions: ClientOpt .prepare(clientOptions, params, deploymentModel = null) val requestOptions = requestOptions.applyDefaults(RequestOptions.from(clientOptions)) val response = clientOptions.httpClient.execute(request, requestOptions) - return response.parseable { + return errorHandler.handle(response).parseable { response .use { createHandler.handle(it) } .also { @@ -94,8 +94,7 @@ class BatchServiceImpl internal constructor(private val clientOptions: ClientOpt } } - private val retrieveHandler: Handler = - jsonHandler(clientOptions.jsonMapper).withErrorHandler(errorHandler) + private val retrieveHandler: Handler = jsonHandler(clientOptions.jsonMapper) override fun retrieve( params: BatchRetrieveParams, @@ -113,7 +112,7 @@ class BatchServiceImpl internal constructor(private val clientOptions: ClientOpt .prepare(clientOptions, params, deploymentModel = null) val requestOptions = requestOptions.applyDefaults(RequestOptions.from(clientOptions)) val response = clientOptions.httpClient.execute(request, requestOptions) - return response.parseable { + return errorHandler.handle(response).parseable { response .use { retrieveHandler.handle(it) } .also { @@ -126,7 +125,6 @@ class BatchServiceImpl internal constructor(private val clientOptions: ClientOpt private val listHandler: Handler = jsonHandler(clientOptions.jsonMapper) - .withErrorHandler(errorHandler) override fun list( params: BatchListParams, @@ -141,7 +139,7 @@ class BatchServiceImpl internal constructor(private val clientOptions: ClientOpt .prepare(clientOptions, params, deploymentModel = null) val requestOptions = requestOptions.applyDefaults(RequestOptions.from(clientOptions)) val response = clientOptions.httpClient.execute(request, requestOptions) - return response.parseable { + return errorHandler.handle(response).parseable { response .use { listHandler.handle(it) } .also { @@ -159,8 +157,7 @@ class BatchServiceImpl internal constructor(private val clientOptions: ClientOpt } } - private val cancelHandler: Handler = - jsonHandler(clientOptions.jsonMapper).withErrorHandler(errorHandler) + private val cancelHandler: Handler = jsonHandler(clientOptions.jsonMapper) override fun cancel( params: BatchCancelParams, @@ -179,7 +176,7 @@ class BatchServiceImpl internal constructor(private val clientOptions: ClientOpt .prepare(clientOptions, params, deploymentModel = null) val requestOptions = requestOptions.applyDefaults(RequestOptions.from(clientOptions)) val response = clientOptions.httpClient.execute(request, requestOptions) - return response.parseable { + return errorHandler.handle(response).parseable { response .use { cancelHandler.handle(it) } .also { diff --git a/openai-java-core/src/main/kotlin/com/openai/services/blocking/CompletionServiceImpl.kt b/openai-java-core/src/main/kotlin/com/openai/services/blocking/CompletionServiceImpl.kt index 65a4f09b..0292b143 100644 --- a/openai-java-core/src/main/kotlin/com/openai/services/blocking/CompletionServiceImpl.kt +++ b/openai-java-core/src/main/kotlin/com/openai/services/blocking/CompletionServiceImpl.kt @@ -5,13 +5,14 @@ package com.openai.services.blocking import com.openai.core.ClientOptions import com.openai.core.JsonValue import com.openai.core.RequestOptions +import com.openai.core.handlers.errorBodyHandler import 
com.openai.core.handlers.errorHandler import com.openai.core.handlers.jsonHandler import com.openai.core.handlers.mapJson import com.openai.core.handlers.sseHandler -import com.openai.core.handlers.withErrorHandler import com.openai.core.http.HttpMethod import com.openai.core.http.HttpRequest +import com.openai.core.http.HttpResponse import com.openai.core.http.HttpResponse.Handler import com.openai.core.http.HttpResponseFor import com.openai.core.http.StreamResponse @@ -19,7 +20,6 @@ import com.openai.core.http.json import com.openai.core.http.map import com.openai.core.http.parseable import com.openai.core.prepare -import com.openai.models.ErrorObject import com.openai.models.completions.Completion import com.openai.models.completions.CompletionCreateParams import java.util.function.Consumer @@ -53,7 +53,8 @@ class CompletionServiceImpl internal constructor(private val clientOptions: Clie class WithRawResponseImpl internal constructor(private val clientOptions: ClientOptions) : CompletionService.WithRawResponse { - private val errorHandler: Handler = errorHandler(clientOptions.jsonMapper) + private val errorHandler: Handler = + errorHandler(errorBodyHandler(clientOptions.jsonMapper)) override fun withOptions( modifier: Consumer @@ -63,7 +64,7 @@ class CompletionServiceImpl internal constructor(private val clientOptions: Clie ) private val createHandler: Handler = - jsonHandler(clientOptions.jsonMapper).withErrorHandler(errorHandler) + jsonHandler(clientOptions.jsonMapper) override fun create( params: CompletionCreateParams, @@ -79,7 +80,7 @@ class CompletionServiceImpl internal constructor(private val clientOptions: Clie .prepare(clientOptions, params, params.model().toString()) val requestOptions = requestOptions.applyDefaults(RequestOptions.from(clientOptions)) val response = clientOptions.httpClient.execute(request, requestOptions) - return response.parseable { + return errorHandler.handle(response).parseable { response .use { createHandler.handle(it) } .also { @@ -91,9 +92,7 @@ class CompletionServiceImpl internal constructor(private val clientOptions: Clie } private val createStreamingHandler: Handler> = - sseHandler(clientOptions.jsonMapper) - .mapJson() - .withErrorHandler(errorHandler) + sseHandler(clientOptions.jsonMapper).mapJson() override fun createStreaming( params: CompletionCreateParams, @@ -118,7 +117,7 @@ class CompletionServiceImpl internal constructor(private val clientOptions: Clie .prepare(clientOptions, params, params.model().toString()) val requestOptions = requestOptions.applyDefaults(RequestOptions.from(clientOptions)) val response = clientOptions.httpClient.execute(request, requestOptions) - return response.parseable { + return errorHandler.handle(response).parseable { response .let { createStreamingHandler.handle(it) } .let { streamResponse -> diff --git a/openai-java-core/src/main/kotlin/com/openai/services/blocking/ContainerServiceImpl.kt b/openai-java-core/src/main/kotlin/com/openai/services/blocking/ContainerServiceImpl.kt index 55da41c7..1634e945 100644 --- a/openai-java-core/src/main/kotlin/com/openai/services/blocking/ContainerServiceImpl.kt +++ b/openai-java-core/src/main/kotlin/com/openai/services/blocking/ContainerServiceImpl.kt @@ -6,9 +6,9 @@ import com.openai.core.ClientOptions import com.openai.core.RequestOptions import com.openai.core.checkRequired import com.openai.core.handlers.emptyHandler +import com.openai.core.handlers.errorBodyHandler import com.openai.core.handlers.errorHandler import com.openai.core.handlers.jsonHandler -import 
com.openai.core.handlers.withErrorHandler import com.openai.core.http.HttpMethod import com.openai.core.http.HttpRequest import com.openai.core.http.HttpResponse @@ -17,7 +17,6 @@ import com.openai.core.http.HttpResponseFor import com.openai.core.http.json import com.openai.core.http.parseable import com.openai.core.prepare -import com.openai.models.ErrorObject import com.openai.models.containers.ContainerCreateParams import com.openai.models.containers.ContainerCreateResponse import com.openai.models.containers.ContainerDeleteParams @@ -76,7 +75,8 @@ class ContainerServiceImpl internal constructor(private val clientOptions: Clien class WithRawResponseImpl internal constructor(private val clientOptions: ClientOptions) : ContainerService.WithRawResponse { - private val errorHandler: Handler = errorHandler(clientOptions.jsonMapper) + private val errorHandler: Handler = + errorHandler(errorBodyHandler(clientOptions.jsonMapper)) private val files: FileService.WithRawResponse by lazy { FileServiceImpl.WithRawResponseImpl(clientOptions) @@ -93,7 +93,6 @@ class ContainerServiceImpl internal constructor(private val clientOptions: Clien private val createHandler: Handler = jsonHandler(clientOptions.jsonMapper) - .withErrorHandler(errorHandler) override fun create( params: ContainerCreateParams, @@ -109,7 +108,7 @@ class ContainerServiceImpl internal constructor(private val clientOptions: Clien .prepare(clientOptions, params, deploymentModel = null) val requestOptions = requestOptions.applyDefaults(RequestOptions.from(clientOptions)) val response = clientOptions.httpClient.execute(request, requestOptions) - return response.parseable { + return errorHandler.handle(response).parseable { response .use { createHandler.handle(it) } .also { @@ -122,7 +121,6 @@ class ContainerServiceImpl internal constructor(private val clientOptions: Clien private val retrieveHandler: Handler = jsonHandler(clientOptions.jsonMapper) - .withErrorHandler(errorHandler) override fun retrieve( params: ContainerRetrieveParams, @@ -140,7 +138,7 @@ class ContainerServiceImpl internal constructor(private val clientOptions: Clien .prepare(clientOptions, params, deploymentModel = null) val requestOptions = requestOptions.applyDefaults(RequestOptions.from(clientOptions)) val response = clientOptions.httpClient.execute(request, requestOptions) - return response.parseable { + return errorHandler.handle(response).parseable { response .use { retrieveHandler.handle(it) } .also { @@ -153,7 +151,6 @@ class ContainerServiceImpl internal constructor(private val clientOptions: Clien private val listHandler: Handler = jsonHandler(clientOptions.jsonMapper) - .withErrorHandler(errorHandler) override fun list( params: ContainerListParams, @@ -168,7 +165,7 @@ class ContainerServiceImpl internal constructor(private val clientOptions: Clien .prepare(clientOptions, params, deploymentModel = null) val requestOptions = requestOptions.applyDefaults(RequestOptions.from(clientOptions)) val response = clientOptions.httpClient.execute(request, requestOptions) - return response.parseable { + return errorHandler.handle(response).parseable { response .use { listHandler.handle(it) } .also { @@ -186,7 +183,7 @@ class ContainerServiceImpl internal constructor(private val clientOptions: Clien } } - private val deleteHandler: Handler = emptyHandler().withErrorHandler(errorHandler) + private val deleteHandler: Handler = emptyHandler() override fun delete( params: ContainerDeleteParams, @@ -205,7 +202,9 @@ class ContainerServiceImpl internal constructor(private val 
clientOptions: Clien .prepare(clientOptions, params, deploymentModel = null) val requestOptions = requestOptions.applyDefaults(RequestOptions.from(clientOptions)) val response = clientOptions.httpClient.execute(request, requestOptions) - return response.parseable { response.use { deleteHandler.handle(it) } } + return errorHandler.handle(response).parseable { + response.use { deleteHandler.handle(it) } + } } } } diff --git a/openai-java-core/src/main/kotlin/com/openai/services/blocking/EmbeddingServiceImpl.kt b/openai-java-core/src/main/kotlin/com/openai/services/blocking/EmbeddingServiceImpl.kt index f7d45a18..e35514b0 100644 --- a/openai-java-core/src/main/kotlin/com/openai/services/blocking/EmbeddingServiceImpl.kt +++ b/openai-java-core/src/main/kotlin/com/openai/services/blocking/EmbeddingServiceImpl.kt @@ -4,17 +4,17 @@ package com.openai.services.blocking import com.openai.core.ClientOptions import com.openai.core.RequestOptions +import com.openai.core.handlers.errorBodyHandler import com.openai.core.handlers.errorHandler import com.openai.core.handlers.jsonHandler -import com.openai.core.handlers.withErrorHandler import com.openai.core.http.HttpMethod import com.openai.core.http.HttpRequest +import com.openai.core.http.HttpResponse import com.openai.core.http.HttpResponse.Handler import com.openai.core.http.HttpResponseFor import com.openai.core.http.json import com.openai.core.http.parseable import com.openai.core.prepare -import com.openai.models.ErrorObject import com.openai.models.embeddings.CreateEmbeddingResponse import com.openai.models.embeddings.EmbeddingCreateParams import java.util.function.Consumer @@ -41,7 +41,8 @@ class EmbeddingServiceImpl internal constructor(private val clientOptions: Clien class WithRawResponseImpl internal constructor(private val clientOptions: ClientOptions) : EmbeddingService.WithRawResponse { - private val errorHandler: Handler = errorHandler(clientOptions.jsonMapper) + private val errorHandler: Handler = + errorHandler(errorBodyHandler(clientOptions.jsonMapper)) override fun withOptions( modifier: Consumer @@ -52,7 +53,6 @@ class EmbeddingServiceImpl internal constructor(private val clientOptions: Clien private val createHandler: Handler = jsonHandler(clientOptions.jsonMapper) - .withErrorHandler(errorHandler) override fun create( params: EmbeddingCreateParams, @@ -68,7 +68,7 @@ class EmbeddingServiceImpl internal constructor(private val clientOptions: Clien .prepare(clientOptions, params, params.model().toString()) val requestOptions = requestOptions.applyDefaults(RequestOptions.from(clientOptions)) val response = clientOptions.httpClient.execute(request, requestOptions) - return response.parseable { + return errorHandler.handle(response).parseable { response .use { createHandler.handle(it) } .also { diff --git a/openai-java-core/src/main/kotlin/com/openai/services/blocking/EvalServiceImpl.kt b/openai-java-core/src/main/kotlin/com/openai/services/blocking/EvalServiceImpl.kt index 06f5116d..77d83fed 100644 --- a/openai-java-core/src/main/kotlin/com/openai/services/blocking/EvalServiceImpl.kt +++ b/openai-java-core/src/main/kotlin/com/openai/services/blocking/EvalServiceImpl.kt @@ -5,17 +5,17 @@ package com.openai.services.blocking import com.openai.core.ClientOptions import com.openai.core.RequestOptions import com.openai.core.checkRequired +import com.openai.core.handlers.errorBodyHandler import com.openai.core.handlers.errorHandler import com.openai.core.handlers.jsonHandler -import com.openai.core.handlers.withErrorHandler import 
com.openai.core.http.HttpMethod import com.openai.core.http.HttpRequest +import com.openai.core.http.HttpResponse import com.openai.core.http.HttpResponse.Handler import com.openai.core.http.HttpResponseFor import com.openai.core.http.json import com.openai.core.http.parseable import com.openai.core.prepare -import com.openai.models.ErrorObject import com.openai.models.evals.EvalCreateParams import com.openai.models.evals.EvalCreateResponse import com.openai.models.evals.EvalDeleteParams @@ -82,7 +82,8 @@ class EvalServiceImpl internal constructor(private val clientOptions: ClientOpti class WithRawResponseImpl internal constructor(private val clientOptions: ClientOptions) : EvalService.WithRawResponse { - private val errorHandler: Handler = errorHandler(clientOptions.jsonMapper) + private val errorHandler: Handler = + errorHandler(errorBodyHandler(clientOptions.jsonMapper)) private val runs: RunService.WithRawResponse by lazy { RunServiceImpl.WithRawResponseImpl(clientOptions) @@ -98,7 +99,7 @@ class EvalServiceImpl internal constructor(private val clientOptions: ClientOpti override fun runs(): RunService.WithRawResponse = runs private val createHandler: Handler = - jsonHandler(clientOptions.jsonMapper).withErrorHandler(errorHandler) + jsonHandler(clientOptions.jsonMapper) override fun create( params: EvalCreateParams, @@ -114,7 +115,7 @@ class EvalServiceImpl internal constructor(private val clientOptions: ClientOpti .prepare(clientOptions, params, deploymentModel = null) val requestOptions = requestOptions.applyDefaults(RequestOptions.from(clientOptions)) val response = clientOptions.httpClient.execute(request, requestOptions) - return response.parseable { + return errorHandler.handle(response).parseable { response .use { createHandler.handle(it) } .also { @@ -127,7 +128,6 @@ class EvalServiceImpl internal constructor(private val clientOptions: ClientOpti private val retrieveHandler: Handler = jsonHandler(clientOptions.jsonMapper) - .withErrorHandler(errorHandler) override fun retrieve( params: EvalRetrieveParams, @@ -145,7 +145,7 @@ class EvalServiceImpl internal constructor(private val clientOptions: ClientOpti .prepare(clientOptions, params, deploymentModel = null) val requestOptions = requestOptions.applyDefaults(RequestOptions.from(clientOptions)) val response = clientOptions.httpClient.execute(request, requestOptions) - return response.parseable { + return errorHandler.handle(response).parseable { response .use { retrieveHandler.handle(it) } .also { @@ -157,7 +157,7 @@ class EvalServiceImpl internal constructor(private val clientOptions: ClientOpti } private val updateHandler: Handler = - jsonHandler(clientOptions.jsonMapper).withErrorHandler(errorHandler) + jsonHandler(clientOptions.jsonMapper) override fun update( params: EvalUpdateParams, @@ -176,7 +176,7 @@ class EvalServiceImpl internal constructor(private val clientOptions: ClientOpti .prepare(clientOptions, params, deploymentModel = null) val requestOptions = requestOptions.applyDefaults(RequestOptions.from(clientOptions)) val response = clientOptions.httpClient.execute(request, requestOptions) - return response.parseable { + return errorHandler.handle(response).parseable { response .use { updateHandler.handle(it) } .also { @@ -189,7 +189,6 @@ class EvalServiceImpl internal constructor(private val clientOptions: ClientOpti private val listHandler: Handler = jsonHandler(clientOptions.jsonMapper) - .withErrorHandler(errorHandler) override fun list( params: EvalListParams, @@ -204,7 +203,7 @@ class EvalServiceImpl internal 
constructor(private val clientOptions: ClientOpti .prepare(clientOptions, params, deploymentModel = null) val requestOptions = requestOptions.applyDefaults(RequestOptions.from(clientOptions)) val response = clientOptions.httpClient.execute(request, requestOptions) - return response.parseable { + return errorHandler.handle(response).parseable { response .use { listHandler.handle(it) } .also { @@ -223,7 +222,7 @@ class EvalServiceImpl internal constructor(private val clientOptions: ClientOpti } private val deleteHandler: Handler = - jsonHandler(clientOptions.jsonMapper).withErrorHandler(errorHandler) + jsonHandler(clientOptions.jsonMapper) override fun delete( params: EvalDeleteParams, @@ -242,7 +241,7 @@ class EvalServiceImpl internal constructor(private val clientOptions: ClientOpti .prepare(clientOptions, params, deploymentModel = null) val requestOptions = requestOptions.applyDefaults(RequestOptions.from(clientOptions)) val response = clientOptions.httpClient.execute(request, requestOptions) - return response.parseable { + return errorHandler.handle(response).parseable { response .use { deleteHandler.handle(it) } .also { diff --git a/openai-java-core/src/main/kotlin/com/openai/services/blocking/FileServiceImpl.kt b/openai-java-core/src/main/kotlin/com/openai/services/blocking/FileServiceImpl.kt index 92eadecc..63318a1b 100644 --- a/openai-java-core/src/main/kotlin/com/openai/services/blocking/FileServiceImpl.kt +++ b/openai-java-core/src/main/kotlin/com/openai/services/blocking/FileServiceImpl.kt @@ -5,9 +5,9 @@ package com.openai.services.blocking import com.openai.core.ClientOptions import com.openai.core.RequestOptions import com.openai.core.checkRequired +import com.openai.core.handlers.errorBodyHandler import com.openai.core.handlers.errorHandler import com.openai.core.handlers.jsonHandler -import com.openai.core.handlers.withErrorHandler import com.openai.core.http.HttpMethod import com.openai.core.http.HttpRequest import com.openai.core.http.HttpResponse @@ -17,7 +17,6 @@ import com.openai.core.http.json import com.openai.core.http.multipartFormData import com.openai.core.http.parseable import com.openai.core.prepare -import com.openai.models.ErrorObject import com.openai.models.files.FileContentParams import com.openai.models.files.FileCreateParams import com.openai.models.files.FileDeleteParams @@ -64,7 +63,8 @@ class FileServiceImpl internal constructor(private val clientOptions: ClientOpti class WithRawResponseImpl internal constructor(private val clientOptions: ClientOptions) : FileService.WithRawResponse { - private val errorHandler: Handler = errorHandler(clientOptions.jsonMapper) + private val errorHandler: Handler = + errorHandler(errorBodyHandler(clientOptions.jsonMapper)) override fun withOptions( modifier: Consumer @@ -74,7 +74,7 @@ class FileServiceImpl internal constructor(private val clientOptions: ClientOpti ) private val createHandler: Handler = - jsonHandler(clientOptions.jsonMapper).withErrorHandler(errorHandler) + jsonHandler(clientOptions.jsonMapper) override fun create( params: FileCreateParams, @@ -90,7 +90,7 @@ class FileServiceImpl internal constructor(private val clientOptions: ClientOpti .prepare(clientOptions, params, deploymentModel = null) val requestOptions = requestOptions.applyDefaults(RequestOptions.from(clientOptions)) val response = clientOptions.httpClient.execute(request, requestOptions) - return response.parseable { + return errorHandler.handle(response).parseable { response .use { createHandler.handle(it) } .also { @@ -102,7 +102,7 @@ class 
FileServiceImpl internal constructor(private val clientOptions: ClientOpti } private val retrieveHandler: Handler = - jsonHandler(clientOptions.jsonMapper).withErrorHandler(errorHandler) + jsonHandler(clientOptions.jsonMapper) override fun retrieve( params: FileRetrieveParams, @@ -120,7 +120,7 @@ class FileServiceImpl internal constructor(private val clientOptions: ClientOpti .prepare(clientOptions, params, deploymentModel = null) val requestOptions = requestOptions.applyDefaults(RequestOptions.from(clientOptions)) val response = clientOptions.httpClient.execute(request, requestOptions) - return response.parseable { + return errorHandler.handle(response).parseable { response .use { retrieveHandler.handle(it) } .also { @@ -133,7 +133,6 @@ class FileServiceImpl internal constructor(private val clientOptions: ClientOpti private val listHandler: Handler = jsonHandler(clientOptions.jsonMapper) - .withErrorHandler(errorHandler) override fun list( params: FileListParams, @@ -148,7 +147,7 @@ class FileServiceImpl internal constructor(private val clientOptions: ClientOpti .prepare(clientOptions, params, deploymentModel = null) val requestOptions = requestOptions.applyDefaults(RequestOptions.from(clientOptions)) val response = clientOptions.httpClient.execute(request, requestOptions) - return response.parseable { + return errorHandler.handle(response).parseable { response .use { listHandler.handle(it) } .also { @@ -167,7 +166,7 @@ class FileServiceImpl internal constructor(private val clientOptions: ClientOpti } private val deleteHandler: Handler = - jsonHandler(clientOptions.jsonMapper).withErrorHandler(errorHandler) + jsonHandler(clientOptions.jsonMapper) override fun delete( params: FileDeleteParams, @@ -186,7 +185,7 @@ class FileServiceImpl internal constructor(private val clientOptions: ClientOpti .prepare(clientOptions, params, deploymentModel = null) val requestOptions = requestOptions.applyDefaults(RequestOptions.from(clientOptions)) val response = clientOptions.httpClient.execute(request, requestOptions) - return response.parseable { + return errorHandler.handle(response).parseable { response .use { deleteHandler.handle(it) } .also { @@ -212,7 +211,8 @@ class FileServiceImpl internal constructor(private val clientOptions: ClientOpti .build() .prepare(clientOptions, params, deploymentModel = null) val requestOptions = requestOptions.applyDefaults(RequestOptions.from(clientOptions)) - return clientOptions.httpClient.execute(request, requestOptions) + val response = clientOptions.httpClient.execute(request, requestOptions) + return errorHandler.handle(response) } } } diff --git a/openai-java-core/src/main/kotlin/com/openai/services/blocking/ImageServiceImpl.kt b/openai-java-core/src/main/kotlin/com/openai/services/blocking/ImageServiceImpl.kt index 5cc08430..fa042bac 100644 --- a/openai-java-core/src/main/kotlin/com/openai/services/blocking/ImageServiceImpl.kt +++ b/openai-java-core/src/main/kotlin/com/openai/services/blocking/ImageServiceImpl.kt @@ -6,13 +6,14 @@ import com.openai.core.ClientOptions import com.openai.core.JsonValue import com.openai.core.MultipartField import com.openai.core.RequestOptions +import com.openai.core.handlers.errorBodyHandler import com.openai.core.handlers.errorHandler import com.openai.core.handlers.jsonHandler import com.openai.core.handlers.mapJson import com.openai.core.handlers.sseHandler -import com.openai.core.handlers.withErrorHandler import com.openai.core.http.HttpMethod import com.openai.core.http.HttpRequest +import com.openai.core.http.HttpResponse 
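// The streaming hunks (`sseHandler(...).mapJson()` losing its `withErrorHandler` wrapper) and
// the raw-content returns (a bare `errorHandler.handle(response)`) apply the same rule as the
// JSON endpoints: validate the status once, before the body is consumed, then hand the still
// unconsumed stream back to the caller. Hypothetical stand-in types below for illustration,
// not the SDK's actual streaming API.

import java.io.BufferedReader
import java.io.StringReader

class SketchStreamResponse(val statusCode: Int, private val reader: BufferedReader) {
    // Lazily yields one event per line; nothing is read until the caller iterates.
    fun events(): Sequence<String> = generateSequence { reader.readLine() }
}

class SketchStreamException(message: String) : RuntimeException(message)

// Fail fast on error statuses; on success return the response untouched so its stream is
// still unread for the downstream handler (mirroring the single up-front check).
fun sketchCheckStream(response: SketchStreamResponse): SketchStreamResponse {
    if (response.statusCode !in 200..299) {
        throw SketchStreamException("stream request failed with HTTP ${response.statusCode}")
    }
    return response
}

fun main() {
    val ok = SketchStreamResponse(200, BufferedReader(StringReader("event-1\nevent-2")))
    sketchCheckStream(ok).events().forEach(::println)                  // event-1, event-2
    runCatching { sketchCheckStream(SketchStreamResponse(500, BufferedReader(StringReader("")))) }
        .onFailure { println(it.message) }                             // stream request failed with HTTP 500
}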
import com.openai.core.http.HttpResponse.Handler import com.openai.core.http.HttpResponseFor import com.openai.core.http.StreamResponse @@ -21,7 +22,6 @@ import com.openai.core.http.map import com.openai.core.http.multipartFormData import com.openai.core.http.parseable import com.openai.core.prepare -import com.openai.models.ErrorObject import com.openai.models.images.ImageCreateVariationParams import com.openai.models.images.ImageEditParams import com.openai.models.images.ImageEditStreamEvent @@ -77,7 +77,8 @@ class ImageServiceImpl internal constructor(private val clientOptions: ClientOpt class WithRawResponseImpl internal constructor(private val clientOptions: ClientOptions) : ImageService.WithRawResponse { - private val errorHandler: Handler = errorHandler(clientOptions.jsonMapper) + private val errorHandler: Handler = + errorHandler(errorBodyHandler(clientOptions.jsonMapper)) override fun withOptions( modifier: Consumer @@ -85,7 +86,7 @@ class ImageServiceImpl internal constructor(private val clientOptions: ClientOpt WithRawResponseImpl(clientOptions.toBuilder().apply(modifier::accept).build()) private val createVariationHandler: Handler = - jsonHandler(clientOptions.jsonMapper).withErrorHandler(errorHandler) + jsonHandler(clientOptions.jsonMapper) override fun createVariation( params: ImageCreateVariationParams, @@ -105,7 +106,7 @@ class ImageServiceImpl internal constructor(private val clientOptions: ClientOpt ) val requestOptions = requestOptions.applyDefaults(RequestOptions.from(clientOptions)) val response = clientOptions.httpClient.execute(request, requestOptions) - return response.parseable { + return errorHandler.handle(response).parseable { response .use { createVariationHandler.handle(it) } .also { @@ -117,7 +118,7 @@ class ImageServiceImpl internal constructor(private val clientOptions: ClientOpt } private val editHandler: Handler = - jsonHandler(clientOptions.jsonMapper).withErrorHandler(errorHandler) + jsonHandler(clientOptions.jsonMapper) override fun edit( params: ImageEditParams, @@ -137,7 +138,7 @@ class ImageServiceImpl internal constructor(private val clientOptions: ClientOpt ) val requestOptions = requestOptions.applyDefaults(RequestOptions.from(clientOptions)) val response = clientOptions.httpClient.execute(request, requestOptions) - return response.parseable { + return errorHandler.handle(response).parseable { response .use { editHandler.handle(it) } .also { @@ -149,9 +150,7 @@ class ImageServiceImpl internal constructor(private val clientOptions: ClientOpt } private val editStreamingHandler: Handler> = - sseHandler(clientOptions.jsonMapper) - .mapJson() - .withErrorHandler(errorHandler) + sseHandler(clientOptions.jsonMapper).mapJson() override fun editStreaming( params: ImageEditParams, @@ -172,7 +171,7 @@ class ImageServiceImpl internal constructor(private val clientOptions: ClientOpt .prepare(clientOptions, params, deploymentModel = null) val requestOptions = requestOptions.applyDefaults(RequestOptions.from(clientOptions)) val response = clientOptions.httpClient.execute(request, requestOptions) - return response.parseable { + return errorHandler.handle(response).parseable { response .let { editStreamingHandler.handle(it) } .let { streamResponse -> @@ -186,7 +185,7 @@ class ImageServiceImpl internal constructor(private val clientOptions: ClientOpt } private val generateHandler: Handler = - jsonHandler(clientOptions.jsonMapper).withErrorHandler(errorHandler) + jsonHandler(clientOptions.jsonMapper) override fun generate( params: ImageGenerateParams, @@ -206,7 
+205,7 @@ class ImageServiceImpl internal constructor(private val clientOptions: ClientOpt ) val requestOptions = requestOptions.applyDefaults(RequestOptions.from(clientOptions)) val response = clientOptions.httpClient.execute(request, requestOptions) - return response.parseable { + return errorHandler.handle(response).parseable { response .use { generateHandler.handle(it) } .also { @@ -218,9 +217,7 @@ class ImageServiceImpl internal constructor(private val clientOptions: ClientOpt } private val generateStreamingHandler: Handler> = - sseHandler(clientOptions.jsonMapper) - .mapJson() - .withErrorHandler(errorHandler) + sseHandler(clientOptions.jsonMapper).mapJson() override fun generateStreaming( params: ImageGenerateParams, @@ -245,7 +242,7 @@ class ImageServiceImpl internal constructor(private val clientOptions: ClientOpt .prepare(clientOptions, params, deploymentModel = null) val requestOptions = requestOptions.applyDefaults(RequestOptions.from(clientOptions)) val response = clientOptions.httpClient.execute(request, requestOptions) - return response.parseable { + return errorHandler.handle(response).parseable { response .let { generateStreamingHandler.handle(it) } .let { streamResponse -> diff --git a/openai-java-core/src/main/kotlin/com/openai/services/blocking/ModelServiceImpl.kt b/openai-java-core/src/main/kotlin/com/openai/services/blocking/ModelServiceImpl.kt index 65f85ff2..cc3c634e 100644 --- a/openai-java-core/src/main/kotlin/com/openai/services/blocking/ModelServiceImpl.kt +++ b/openai-java-core/src/main/kotlin/com/openai/services/blocking/ModelServiceImpl.kt @@ -5,17 +5,17 @@ package com.openai.services.blocking import com.openai.core.ClientOptions import com.openai.core.RequestOptions import com.openai.core.checkRequired +import com.openai.core.handlers.errorBodyHandler import com.openai.core.handlers.errorHandler import com.openai.core.handlers.jsonHandler -import com.openai.core.handlers.withErrorHandler import com.openai.core.http.HttpMethod import com.openai.core.http.HttpRequest +import com.openai.core.http.HttpResponse import com.openai.core.http.HttpResponse.Handler import com.openai.core.http.HttpResponseFor import com.openai.core.http.json import com.openai.core.http.parseable import com.openai.core.prepare -import com.openai.models.ErrorObject import com.openai.models.models.Model import com.openai.models.models.ModelDeleteParams import com.openai.models.models.ModelDeleted @@ -53,7 +53,8 @@ class ModelServiceImpl internal constructor(private val clientOptions: ClientOpt class WithRawResponseImpl internal constructor(private val clientOptions: ClientOptions) : ModelService.WithRawResponse { - private val errorHandler: Handler = errorHandler(clientOptions.jsonMapper) + private val errorHandler: Handler = + errorHandler(errorBodyHandler(clientOptions.jsonMapper)) override fun withOptions( modifier: Consumer @@ -62,8 +63,7 @@ class ModelServiceImpl internal constructor(private val clientOptions: ClientOpt clientOptions.toBuilder().apply(modifier::accept).build() ) - private val retrieveHandler: Handler = - jsonHandler(clientOptions.jsonMapper).withErrorHandler(errorHandler) + private val retrieveHandler: Handler = jsonHandler(clientOptions.jsonMapper) override fun retrieve( params: ModelRetrieveParams, @@ -81,7 +81,7 @@ class ModelServiceImpl internal constructor(private val clientOptions: ClientOpt .prepare(clientOptions, params, params.model().get()) val requestOptions = requestOptions.applyDefaults(RequestOptions.from(clientOptions)) val response = 
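The streaming endpoints (`editStreaming` and `generateStreaming` above, plus the chat and response streaming methods later in the patch) follow the same pattern with one difference: the SSE handler is now a plain `sseHandler(clientOptions.jsonMapper).mapJson()`, and because the returned stream has to keep the connection open, the response is handed to it with `.let` rather than `.use`. The error check therefore must happen on the raw response before any stream is constructed, which is exactly what the up-front `errorHandler.handle(response)` call provides. A rough sketch with stand-in types (not the SDK's `StreamResponse` or `sseHandler`):

```kotlin
// Illustrative stand-ins; not the SDK's sseHandler/StreamResponse API.
class ApiException(message: String) : RuntimeException(message)

data class Response(val statusCode: Int, val lines: Sequence<String>)

// Response-level error check, applied before any stream is built.
fun checkError(response: Response): Response {
    if (response.statusCode !in 200..299) throw ApiException("HTTP ${response.statusCode}")
    return response
}

// Streaming "handler": turns the raw body into a lazily consumed event stream.
// Nothing is closed here (hence `.let` rather than `.use` in the real code);
// the caller owns the stream and closes the response when it is done reading.
fun streamEvents(response: Response): Sequence<String> =
    response.lines
        .filter { it.startsWith("data: ") }
        .map { it.removePrefix("data: ") }

fun main() {
    val response = Response(
        200,
        sequenceOf("data: image.partial", ": keep-alive comment", "data: image.completed"),
    )
    streamEvents(checkError(response)).forEach(::println)
}
```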
clientOptions.httpClient.execute(request, requestOptions) - return response.parseable { + return errorHandler.handle(response).parseable { response .use { retrieveHandler.handle(it) } .also { @@ -94,7 +94,6 @@ class ModelServiceImpl internal constructor(private val clientOptions: ClientOpt private val listHandler: Handler = jsonHandler(clientOptions.jsonMapper) - .withErrorHandler(errorHandler) override fun list( params: ModelListParams, @@ -109,7 +108,7 @@ class ModelServiceImpl internal constructor(private val clientOptions: ClientOpt .prepare(clientOptions, params, deploymentModel = null) val requestOptions = requestOptions.applyDefaults(RequestOptions.from(clientOptions)) val response = clientOptions.httpClient.execute(request, requestOptions) - return response.parseable { + return errorHandler.handle(response).parseable { response .use { listHandler.handle(it) } .also { @@ -128,7 +127,7 @@ class ModelServiceImpl internal constructor(private val clientOptions: ClientOpt } private val deleteHandler: Handler = - jsonHandler(clientOptions.jsonMapper).withErrorHandler(errorHandler) + jsonHandler(clientOptions.jsonMapper) override fun delete( params: ModelDeleteParams, @@ -147,7 +146,7 @@ class ModelServiceImpl internal constructor(private val clientOptions: ClientOpt .prepare(clientOptions, params, params.model().get()) val requestOptions = requestOptions.applyDefaults(RequestOptions.from(clientOptions)) val response = clientOptions.httpClient.execute(request, requestOptions) - return response.parseable { + return errorHandler.handle(response).parseable { response .use { deleteHandler.handle(it) } .also { diff --git a/openai-java-core/src/main/kotlin/com/openai/services/blocking/ModerationServiceImpl.kt b/openai-java-core/src/main/kotlin/com/openai/services/blocking/ModerationServiceImpl.kt index 7ace846f..e9928428 100644 --- a/openai-java-core/src/main/kotlin/com/openai/services/blocking/ModerationServiceImpl.kt +++ b/openai-java-core/src/main/kotlin/com/openai/services/blocking/ModerationServiceImpl.kt @@ -4,17 +4,17 @@ package com.openai.services.blocking import com.openai.core.ClientOptions import com.openai.core.RequestOptions +import com.openai.core.handlers.errorBodyHandler import com.openai.core.handlers.errorHandler import com.openai.core.handlers.jsonHandler -import com.openai.core.handlers.withErrorHandler import com.openai.core.http.HttpMethod import com.openai.core.http.HttpRequest +import com.openai.core.http.HttpResponse import com.openai.core.http.HttpResponse.Handler import com.openai.core.http.HttpResponseFor import com.openai.core.http.json import com.openai.core.http.parseable import com.openai.core.prepare -import com.openai.models.ErrorObject import com.openai.models.moderations.ModerationCreateParams import com.openai.models.moderations.ModerationCreateResponse import java.util.function.Consumer @@ -41,7 +41,8 @@ class ModerationServiceImpl internal constructor(private val clientOptions: Clie class WithRawResponseImpl internal constructor(private val clientOptions: ClientOptions) : ModerationService.WithRawResponse { - private val errorHandler: Handler = errorHandler(clientOptions.jsonMapper) + private val errorHandler: Handler = + errorHandler(errorBodyHandler(clientOptions.jsonMapper)) override fun withOptions( modifier: Consumer @@ -52,7 +53,6 @@ class ModerationServiceImpl internal constructor(private val clientOptions: Clie private val createHandler: Handler = jsonHandler(clientOptions.jsonMapper) - .withErrorHandler(errorHandler) override fun create( params: 
ModerationCreateParams, @@ -68,7 +68,7 @@ class ModerationServiceImpl internal constructor(private val clientOptions: Clie .prepare(clientOptions, params, params.model().toString()) val requestOptions = requestOptions.applyDefaults(RequestOptions.from(clientOptions)) val response = clientOptions.httpClient.execute(request, requestOptions) - return response.parseable { + return errorHandler.handle(response).parseable { response .use { createHandler.handle(it) } .also { diff --git a/openai-java-core/src/main/kotlin/com/openai/services/blocking/ResponseServiceImpl.kt b/openai-java-core/src/main/kotlin/com/openai/services/blocking/ResponseServiceImpl.kt index c57ecd81..edcb4e1d 100644 --- a/openai-java-core/src/main/kotlin/com/openai/services/blocking/ResponseServiceImpl.kt +++ b/openai-java-core/src/main/kotlin/com/openai/services/blocking/ResponseServiceImpl.kt @@ -7,11 +7,11 @@ import com.openai.core.JsonValue import com.openai.core.RequestOptions import com.openai.core.checkRequired import com.openai.core.handlers.emptyHandler +import com.openai.core.handlers.errorBodyHandler import com.openai.core.handlers.errorHandler import com.openai.core.handlers.jsonHandler import com.openai.core.handlers.mapJson import com.openai.core.handlers.sseHandler -import com.openai.core.handlers.withErrorHandler import com.openai.core.http.HttpMethod import com.openai.core.http.HttpRequest import com.openai.core.http.HttpResponse @@ -22,7 +22,6 @@ import com.openai.core.http.json import com.openai.core.http.map import com.openai.core.http.parseable import com.openai.core.prepare -import com.openai.models.ErrorObject import com.openai.models.responses.Response import com.openai.models.responses.ResponseCancelParams import com.openai.models.responses.ResponseCreateParams @@ -87,7 +86,8 @@ class ResponseServiceImpl internal constructor(private val clientOptions: Client class WithRawResponseImpl internal constructor(private val clientOptions: ClientOptions) : ResponseService.WithRawResponse { - private val errorHandler: Handler = errorHandler(clientOptions.jsonMapper) + private val errorHandler: Handler = + errorHandler(errorBodyHandler(clientOptions.jsonMapper)) private val inputItems: InputItemService.WithRawResponse by lazy { InputItemServiceImpl.WithRawResponseImpl(clientOptions) @@ -103,7 +103,7 @@ class ResponseServiceImpl internal constructor(private val clientOptions: Client override fun inputItems(): InputItemService.WithRawResponse = inputItems private val createHandler: Handler = - jsonHandler(clientOptions.jsonMapper).withErrorHandler(errorHandler) + jsonHandler(clientOptions.jsonMapper) override fun create( params: ResponseCreateParams, @@ -119,7 +119,7 @@ class ResponseServiceImpl internal constructor(private val clientOptions: Client .prepare(clientOptions, params, deploymentModel = null) val requestOptions = requestOptions.applyDefaults(RequestOptions.from(clientOptions)) val response = clientOptions.httpClient.execute(request, requestOptions) - return response.parseable { + return errorHandler.handle(response).parseable { response .use { createHandler.handle(it) } .also { @@ -131,9 +131,7 @@ class ResponseServiceImpl internal constructor(private val clientOptions: Client } private val createStreamingHandler: Handler> = - sseHandler(clientOptions.jsonMapper) - .mapJson() - .withErrorHandler(errorHandler) + sseHandler(clientOptions.jsonMapper).mapJson() override fun createStreaming( params: ResponseCreateParams, @@ -158,7 +156,7 @@ class ResponseServiceImpl internal constructor(private val 
clientOptions: Client .prepare(clientOptions, params, deploymentModel = null) val requestOptions = requestOptions.applyDefaults(RequestOptions.from(clientOptions)) val response = clientOptions.httpClient.execute(request, requestOptions) - return response.parseable { + return errorHandler.handle(response).parseable { response .let { createStreamingHandler.handle(it) } .let { streamResponse -> @@ -172,7 +170,7 @@ class ResponseServiceImpl internal constructor(private val clientOptions: Client } private val retrieveHandler: Handler = - jsonHandler(clientOptions.jsonMapper).withErrorHandler(errorHandler) + jsonHandler(clientOptions.jsonMapper) override fun retrieve( params: ResponseRetrieveParams, @@ -190,7 +188,7 @@ class ResponseServiceImpl internal constructor(private val clientOptions: Client .prepare(clientOptions, params, deploymentModel = null) val requestOptions = requestOptions.applyDefaults(RequestOptions.from(clientOptions)) val response = clientOptions.httpClient.execute(request, requestOptions) - return response.parseable { + return errorHandler.handle(response).parseable { response .use { retrieveHandler.handle(it) } .also { @@ -202,9 +200,7 @@ class ResponseServiceImpl internal constructor(private val clientOptions: Client } private val retrieveStreamingHandler: Handler> = - sseHandler(clientOptions.jsonMapper) - .mapJson() - .withErrorHandler(errorHandler) + sseHandler(clientOptions.jsonMapper).mapJson() override fun retrieveStreaming( params: ResponseRetrieveParams, @@ -223,7 +219,7 @@ class ResponseServiceImpl internal constructor(private val clientOptions: Client .prepare(clientOptions, params, deploymentModel = null) val requestOptions = requestOptions.applyDefaults(RequestOptions.from(clientOptions)) val response = clientOptions.httpClient.execute(request, requestOptions) - return response.parseable { + return errorHandler.handle(response).parseable { response .let { retrieveStreamingHandler.handle(it) } .let { streamResponse -> @@ -236,7 +232,7 @@ class ResponseServiceImpl internal constructor(private val clientOptions: Client } } - private val deleteHandler: Handler = emptyHandler().withErrorHandler(errorHandler) + private val deleteHandler: Handler = emptyHandler() override fun delete( params: ResponseDeleteParams, @@ -255,11 +251,13 @@ class ResponseServiceImpl internal constructor(private val clientOptions: Client .prepare(clientOptions, params, deploymentModel = null) val requestOptions = requestOptions.applyDefaults(RequestOptions.from(clientOptions)) val response = clientOptions.httpClient.execute(request, requestOptions) - return response.parseable { response.use { deleteHandler.handle(it) } } + return errorHandler.handle(response).parseable { + response.use { deleteHandler.handle(it) } + } } private val cancelHandler: Handler = - jsonHandler(clientOptions.jsonMapper).withErrorHandler(errorHandler) + jsonHandler(clientOptions.jsonMapper) override fun cancel( params: ResponseCancelParams, @@ -278,7 +276,7 @@ class ResponseServiceImpl internal constructor(private val clientOptions: Client .prepare(clientOptions, params, deploymentModel = null) val requestOptions = requestOptions.applyDefaults(RequestOptions.from(clientOptions)) val response = clientOptions.httpClient.execute(request, requestOptions) - return response.parseable { + return errorHandler.handle(response).parseable { response .use { cancelHandler.handle(it) } .also { diff --git a/openai-java-core/src/main/kotlin/com/openai/services/blocking/UploadServiceImpl.kt 
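For endpoints with no response body, such as `ResponseServiceImpl.delete` above and the container file delete later in the patch, the old code relied entirely on `emptyHandler().withErrorHandler(errorHandler)`; with the wrapper gone, the only meaningful work left is the up-front status check. A small sketch of that shape, again with stand-in types:

```kotlin
// Stand-in types; a sketch of the no-body delete endpoints after this patch.
class ApiException(message: String) : RuntimeException(message)

data class Response(val statusCode: Int)

// All of the useful work is the status check, done once on the response.
fun checkError(response: Response): Response =
    if (response.statusCode in 200..299) response
    else throw ApiException("HTTP ${response.statusCode}")

// Stand-in for emptyHandler(): there is nothing to parse for a delete.
fun emptyHandler(response: Response) { /* body intentionally ignored */ }

fun delete(execute: () -> Response) = emptyHandler(checkError(execute()))

fun main() {
    delete { Response(200) }
    runCatching { delete { Response(404) } }
        .onFailure { println("delete failed: ${it.message}") }
}
```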
b/openai-java-core/src/main/kotlin/com/openai/services/blocking/UploadServiceImpl.kt index eef31eb6..e856be5a 100644 --- a/openai-java-core/src/main/kotlin/com/openai/services/blocking/UploadServiceImpl.kt +++ b/openai-java-core/src/main/kotlin/com/openai/services/blocking/UploadServiceImpl.kt @@ -5,17 +5,17 @@ package com.openai.services.blocking import com.openai.core.ClientOptions import com.openai.core.RequestOptions import com.openai.core.checkRequired +import com.openai.core.handlers.errorBodyHandler import com.openai.core.handlers.errorHandler import com.openai.core.handlers.jsonHandler -import com.openai.core.handlers.withErrorHandler import com.openai.core.http.HttpMethod import com.openai.core.http.HttpRequest +import com.openai.core.http.HttpResponse import com.openai.core.http.HttpResponse.Handler import com.openai.core.http.HttpResponseFor import com.openai.core.http.json import com.openai.core.http.parseable import com.openai.core.prepare -import com.openai.models.ErrorObject import com.openai.models.uploads.Upload import com.openai.models.uploads.UploadCancelParams import com.openai.models.uploads.UploadCompleteParams @@ -56,7 +56,8 @@ class UploadServiceImpl internal constructor(private val clientOptions: ClientOp class WithRawResponseImpl internal constructor(private val clientOptions: ClientOptions) : UploadService.WithRawResponse { - private val errorHandler: Handler = errorHandler(clientOptions.jsonMapper) + private val errorHandler: Handler = + errorHandler(errorBodyHandler(clientOptions.jsonMapper)) private val parts: PartService.WithRawResponse by lazy { PartServiceImpl.WithRawResponseImpl(clientOptions) @@ -71,8 +72,7 @@ class UploadServiceImpl internal constructor(private val clientOptions: ClientOp override fun parts(): PartService.WithRawResponse = parts - private val createHandler: Handler = - jsonHandler(clientOptions.jsonMapper).withErrorHandler(errorHandler) + private val createHandler: Handler = jsonHandler(clientOptions.jsonMapper) override fun create( params: UploadCreateParams, @@ -88,7 +88,7 @@ class UploadServiceImpl internal constructor(private val clientOptions: ClientOp .prepare(clientOptions, params, deploymentModel = null) val requestOptions = requestOptions.applyDefaults(RequestOptions.from(clientOptions)) val response = clientOptions.httpClient.execute(request, requestOptions) - return response.parseable { + return errorHandler.handle(response).parseable { response .use { createHandler.handle(it) } .also { @@ -99,8 +99,7 @@ class UploadServiceImpl internal constructor(private val clientOptions: ClientOp } } - private val cancelHandler: Handler = - jsonHandler(clientOptions.jsonMapper).withErrorHandler(errorHandler) + private val cancelHandler: Handler = jsonHandler(clientOptions.jsonMapper) override fun cancel( params: UploadCancelParams, @@ -119,7 +118,7 @@ class UploadServiceImpl internal constructor(private val clientOptions: ClientOp .prepare(clientOptions, params, deploymentModel = null) val requestOptions = requestOptions.applyDefaults(RequestOptions.from(clientOptions)) val response = clientOptions.httpClient.execute(request, requestOptions) - return response.parseable { + return errorHandler.handle(response).parseable { response .use { cancelHandler.handle(it) } .also { @@ -130,8 +129,7 @@ class UploadServiceImpl internal constructor(private val clientOptions: ClientOp } } - private val completeHandler: Handler = - jsonHandler(clientOptions.jsonMapper).withErrorHandler(errorHandler) + private val completeHandler: Handler = 
jsonHandler(clientOptions.jsonMapper) override fun complete( params: UploadCompleteParams, @@ -150,7 +148,7 @@ class UploadServiceImpl internal constructor(private val clientOptions: ClientOp .prepare(clientOptions, params, deploymentModel = null) val requestOptions = requestOptions.applyDefaults(RequestOptions.from(clientOptions)) val response = clientOptions.httpClient.execute(request, requestOptions) - return response.parseable { + return errorHandler.handle(response).parseable { response .use { completeHandler.handle(it) } .also { diff --git a/openai-java-core/src/main/kotlin/com/openai/services/blocking/VectorStoreServiceImpl.kt b/openai-java-core/src/main/kotlin/com/openai/services/blocking/VectorStoreServiceImpl.kt index 12f402a2..f8965d56 100644 --- a/openai-java-core/src/main/kotlin/com/openai/services/blocking/VectorStoreServiceImpl.kt +++ b/openai-java-core/src/main/kotlin/com/openai/services/blocking/VectorStoreServiceImpl.kt @@ -5,18 +5,18 @@ package com.openai.services.blocking import com.openai.core.ClientOptions import com.openai.core.RequestOptions import com.openai.core.checkRequired +import com.openai.core.handlers.errorBodyHandler import com.openai.core.handlers.errorHandler import com.openai.core.handlers.jsonHandler -import com.openai.core.handlers.withErrorHandler import com.openai.core.http.Headers import com.openai.core.http.HttpMethod import com.openai.core.http.HttpRequest +import com.openai.core.http.HttpResponse import com.openai.core.http.HttpResponse.Handler import com.openai.core.http.HttpResponseFor import com.openai.core.http.json import com.openai.core.http.parseable import com.openai.core.prepare -import com.openai.models.ErrorObject import com.openai.models.vectorstores.VectorStore import com.openai.models.vectorstores.VectorStoreCreateParams import com.openai.models.vectorstores.VectorStoreDeleteParams @@ -106,7 +106,8 @@ class VectorStoreServiceImpl internal constructor(private val clientOptions: Cli class WithRawResponseImpl internal constructor(private val clientOptions: ClientOptions) : VectorStoreService.WithRawResponse { - private val errorHandler: Handler = errorHandler(clientOptions.jsonMapper) + private val errorHandler: Handler = + errorHandler(errorBodyHandler(clientOptions.jsonMapper)) private val files: FileService.WithRawResponse by lazy { FileServiceImpl.WithRawResponseImpl(clientOptions) @@ -128,7 +129,7 @@ class VectorStoreServiceImpl internal constructor(private val clientOptions: Cli override fun fileBatches(): FileBatchService.WithRawResponse = fileBatches private val createHandler: Handler = - jsonHandler(clientOptions.jsonMapper).withErrorHandler(errorHandler) + jsonHandler(clientOptions.jsonMapper) override fun create( params: VectorStoreCreateParams, @@ -145,7 +146,7 @@ class VectorStoreServiceImpl internal constructor(private val clientOptions: Cli .prepare(clientOptions, params, deploymentModel = null) val requestOptions = requestOptions.applyDefaults(RequestOptions.from(clientOptions)) val response = clientOptions.httpClient.execute(request, requestOptions) - return response.parseable { + return errorHandler.handle(response).parseable { response .use { createHandler.handle(it) } .also { @@ -157,7 +158,7 @@ class VectorStoreServiceImpl internal constructor(private val clientOptions: Cli } private val retrieveHandler: Handler = - jsonHandler(clientOptions.jsonMapper).withErrorHandler(errorHandler) + jsonHandler(clientOptions.jsonMapper) override fun retrieve( params: VectorStoreRetrieveParams, @@ -176,7 +177,7 @@ class 
VectorStoreServiceImpl internal constructor(private val clientOptions: Cli .prepare(clientOptions, params, deploymentModel = null) val requestOptions = requestOptions.applyDefaults(RequestOptions.from(clientOptions)) val response = clientOptions.httpClient.execute(request, requestOptions) - return response.parseable { + return errorHandler.handle(response).parseable { response .use { retrieveHandler.handle(it) } .also { @@ -188,7 +189,7 @@ class VectorStoreServiceImpl internal constructor(private val clientOptions: Cli } private val updateHandler: Handler = - jsonHandler(clientOptions.jsonMapper).withErrorHandler(errorHandler) + jsonHandler(clientOptions.jsonMapper) override fun update( params: VectorStoreUpdateParams, @@ -208,7 +209,7 @@ class VectorStoreServiceImpl internal constructor(private val clientOptions: Cli .prepare(clientOptions, params, deploymentModel = null) val requestOptions = requestOptions.applyDefaults(RequestOptions.from(clientOptions)) val response = clientOptions.httpClient.execute(request, requestOptions) - return response.parseable { + return errorHandler.handle(response).parseable { response .use { updateHandler.handle(it) } .also { @@ -221,7 +222,6 @@ class VectorStoreServiceImpl internal constructor(private val clientOptions: Cli private val listHandler: Handler = jsonHandler(clientOptions.jsonMapper) - .withErrorHandler(errorHandler) override fun list( params: VectorStoreListParams, @@ -237,7 +237,7 @@ class VectorStoreServiceImpl internal constructor(private val clientOptions: Cli .prepare(clientOptions, params, deploymentModel = null) val requestOptions = requestOptions.applyDefaults(RequestOptions.from(clientOptions)) val response = clientOptions.httpClient.execute(request, requestOptions) - return response.parseable { + return errorHandler.handle(response).parseable { response .use { listHandler.handle(it) } .also { @@ -256,7 +256,7 @@ class VectorStoreServiceImpl internal constructor(private val clientOptions: Cli } private val deleteHandler: Handler = - jsonHandler(clientOptions.jsonMapper).withErrorHandler(errorHandler) + jsonHandler(clientOptions.jsonMapper) override fun delete( params: VectorStoreDeleteParams, @@ -276,7 +276,7 @@ class VectorStoreServiceImpl internal constructor(private val clientOptions: Cli .prepare(clientOptions, params, deploymentModel = null) val requestOptions = requestOptions.applyDefaults(RequestOptions.from(clientOptions)) val response = clientOptions.httpClient.execute(request, requestOptions) - return response.parseable { + return errorHandler.handle(response).parseable { response .use { deleteHandler.handle(it) } .also { @@ -289,7 +289,6 @@ class VectorStoreServiceImpl internal constructor(private val clientOptions: Cli private val searchHandler: Handler = jsonHandler(clientOptions.jsonMapper) - .withErrorHandler(errorHandler) override fun search( params: VectorStoreSearchParams, @@ -309,7 +308,7 @@ class VectorStoreServiceImpl internal constructor(private val clientOptions: Cli .prepare(clientOptions, params, deploymentModel = null) val requestOptions = requestOptions.applyDefaults(RequestOptions.from(clientOptions)) val response = clientOptions.httpClient.execute(request, requestOptions) - return response.parseable { + return errorHandler.handle(response).parseable { response .use { searchHandler.handle(it) } .also { diff --git a/openai-java-core/src/main/kotlin/com/openai/services/blocking/audio/SpeechServiceImpl.kt b/openai-java-core/src/main/kotlin/com/openai/services/blocking/audio/SpeechServiceImpl.kt index 
9d978c03..92dca2d9 100644 --- a/openai-java-core/src/main/kotlin/com/openai/services/blocking/audio/SpeechServiceImpl.kt +++ b/openai-java-core/src/main/kotlin/com/openai/services/blocking/audio/SpeechServiceImpl.kt @@ -4,6 +4,7 @@ package com.openai.services.blocking.audio import com.openai.core.ClientOptions import com.openai.core.RequestOptions +import com.openai.core.handlers.errorBodyHandler import com.openai.core.handlers.errorHandler import com.openai.core.http.HttpMethod import com.openai.core.http.HttpRequest @@ -11,7 +12,6 @@ import com.openai.core.http.HttpResponse import com.openai.core.http.HttpResponse.Handler import com.openai.core.http.json import com.openai.core.prepare -import com.openai.models.ErrorObject import com.openai.models.audio.speech.SpeechCreateParams import java.util.function.Consumer @@ -34,7 +34,8 @@ class SpeechServiceImpl internal constructor(private val clientOptions: ClientOp class WithRawResponseImpl internal constructor(private val clientOptions: ClientOptions) : SpeechService.WithRawResponse { - private val errorHandler: Handler = errorHandler(clientOptions.jsonMapper) + private val errorHandler: Handler = + errorHandler(errorBodyHandler(clientOptions.jsonMapper)) override fun withOptions( modifier: Consumer @@ -56,7 +57,8 @@ class SpeechServiceImpl internal constructor(private val clientOptions: ClientOp .build() .prepare(clientOptions, params, deploymentModel = params.model().toString()) val requestOptions = requestOptions.applyDefaults(RequestOptions.from(clientOptions)) - return clientOptions.httpClient.execute(request, requestOptions) + val response = clientOptions.httpClient.execute(request, requestOptions) + return errorHandler.handle(response) } } } diff --git a/openai-java-core/src/main/kotlin/com/openai/services/blocking/audio/TranscriptionServiceImpl.kt b/openai-java-core/src/main/kotlin/com/openai/services/blocking/audio/TranscriptionServiceImpl.kt index fdf16a50..9c26e557 100644 --- a/openai-java-core/src/main/kotlin/com/openai/services/blocking/audio/TranscriptionServiceImpl.kt +++ b/openai-java-core/src/main/kotlin/com/openai/services/blocking/audio/TranscriptionServiceImpl.kt @@ -5,12 +5,12 @@ package com.openai.services.blocking.audio import com.openai.core.ClientOptions import com.openai.core.MultipartField import com.openai.core.RequestOptions +import com.openai.core.handlers.errorBodyHandler import com.openai.core.handlers.errorHandler import com.openai.core.handlers.jsonHandler import com.openai.core.handlers.mapJson import com.openai.core.handlers.sseHandler import com.openai.core.handlers.stringHandler -import com.openai.core.handlers.withErrorHandler import com.openai.core.http.HttpMethod import com.openai.core.http.HttpRequest import com.openai.core.http.HttpResponse @@ -21,7 +21,6 @@ import com.openai.core.http.map import com.openai.core.http.multipartFormData import com.openai.core.http.parseable import com.openai.core.prepare -import com.openai.models.ErrorObject import com.openai.models.audio.transcriptions.Transcription import com.openai.models.audio.transcriptions.TranscriptionCreateParams import com.openai.models.audio.transcriptions.TranscriptionCreateResponse @@ -58,7 +57,8 @@ class TranscriptionServiceImpl internal constructor(private val clientOptions: C class WithRawResponseImpl internal constructor(private val clientOptions: ClientOptions) : TranscriptionService.WithRawResponse { - private val errorHandler: Handler = errorHandler(clientOptions.jsonMapper) + private val errorHandler: Handler = + 
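Endpoints that return the raw `HttpResponse` (the audio `speech` endpoint above, and the file-content download endpoints elsewhere in this patch) previously returned `clientOptions.httpClient.execute(request, requestOptions)` directly, with no error-handling pass at all; they now route the response through `errorHandler.handle(response)`, so a non-2xx status raises instead of being handed back as if it were audio or file bytes. A compact sketch with stand-in types:

```kotlin
// Stand-in types, not the SDK's HttpResponse. Sketch of the binary endpoints
// (speech, file content) after this patch: the raw response is still returned,
// but only after the status check.
class ApiException(message: String) : RuntimeException(message)

data class Response(val statusCode: Int, val bytes: ByteArray)

fun handleErrors(response: Response): Response {
    if (response.statusCode !in 200..299) {
        throw ApiException("HTTP ${response.statusCode}: ${response.bytes.decodeToString()}")
    }
    return response
}

// Before this patch the equivalent of `execute()` was returned as-is; a 400
// would have come back looking like audio bytes.
fun createSpeech(execute: () -> Response): Response = handleErrors(execute())

fun main() {
    val ok = createSpeech { Response(200, byteArrayOf(1, 2, 3, 4)) }
    println("got ${ok.bytes.size} audio bytes")

    runCatching { createSpeech { Response(400, "bad request".encodeToByteArray()) } }
        .onFailure { println("failed as expected: ${it.message}") }
}
```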
errorHandler(errorBodyHandler(clientOptions.jsonMapper)) override fun withOptions( modifier: Consumer @@ -69,7 +69,6 @@ class TranscriptionServiceImpl internal constructor(private val clientOptions: C private val createJsonHandler: Handler = jsonHandler(clientOptions.jsonMapper) - .withErrorHandler(errorHandler) private val createStringHandler: Handler = object : Handler { @@ -95,7 +94,7 @@ class TranscriptionServiceImpl internal constructor(private val clientOptions: C .prepare(clientOptions, params, deploymentModel = params.model().toString()) val requestOptions = requestOptions.applyDefaults(RequestOptions.from(clientOptions)) val response = clientOptions.httpClient.execute(request, requestOptions) - return response.parseable { + return errorHandler.handle(response).parseable { val handler = if (params.responseFormat().getOrNull()?.isJson() != false) createJsonHandler else createStringHandler @@ -110,9 +109,7 @@ class TranscriptionServiceImpl internal constructor(private val clientOptions: C } private val createStreamingHandler: Handler> = - sseHandler(clientOptions.jsonMapper) - .mapJson() - .withErrorHandler(errorHandler) + sseHandler(clientOptions.jsonMapper).mapJson() override fun createStreaming( params: TranscriptionCreateParams, @@ -133,7 +130,7 @@ class TranscriptionServiceImpl internal constructor(private val clientOptions: C .prepare(clientOptions, params, deploymentModel = params.model().toString()) val requestOptions = requestOptions.applyDefaults(RequestOptions.from(clientOptions)) val response = clientOptions.httpClient.execute(request, requestOptions) - return response.parseable { + return errorHandler.handle(response).parseable { response .let { createStreamingHandler.handle(it) } .let { streamResponse -> diff --git a/openai-java-core/src/main/kotlin/com/openai/services/blocking/audio/TranslationServiceImpl.kt b/openai-java-core/src/main/kotlin/com/openai/services/blocking/audio/TranslationServiceImpl.kt index abc0dfe2..37524976 100644 --- a/openai-java-core/src/main/kotlin/com/openai/services/blocking/audio/TranslationServiceImpl.kt +++ b/openai-java-core/src/main/kotlin/com/openai/services/blocking/audio/TranslationServiceImpl.kt @@ -4,17 +4,17 @@ package com.openai.services.blocking.audio import com.openai.core.ClientOptions import com.openai.core.RequestOptions +import com.openai.core.handlers.errorBodyHandler import com.openai.core.handlers.errorHandler import com.openai.core.handlers.jsonHandler -import com.openai.core.handlers.withErrorHandler import com.openai.core.http.HttpMethod import com.openai.core.http.HttpRequest +import com.openai.core.http.HttpResponse import com.openai.core.http.HttpResponse.Handler import com.openai.core.http.HttpResponseFor import com.openai.core.http.multipartFormData import com.openai.core.http.parseable import com.openai.core.prepare -import com.openai.models.ErrorObject import com.openai.models.audio.translations.TranslationCreateParams import com.openai.models.audio.translations.TranslationCreateResponse import java.util.function.Consumer @@ -41,7 +41,8 @@ class TranslationServiceImpl internal constructor(private val clientOptions: Cli class WithRawResponseImpl internal constructor(private val clientOptions: ClientOptions) : TranslationService.WithRawResponse { - private val errorHandler: Handler = errorHandler(clientOptions.jsonMapper) + private val errorHandler: Handler = + errorHandler(errorBodyHandler(clientOptions.jsonMapper)) override fun withOptions( modifier: Consumer @@ -52,7 +53,6 @@ class TranslationServiceImpl 
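`TranscriptionServiceImpl.create` above keeps both of its body handlers (`createJsonHandler` and `createStringHandler`) and still picks between them per request with `params.responseFormat().getOrNull()?.isJson() != false`; only the error handling moves out in front. A sketch of that dispatch with hypothetical enum and result types, to make the default-to-JSON behaviour of the null check explicit:

```kotlin
// Hypothetical format and result types; a sketch of the json-vs-string
// dispatch kept by the transcription create call after the error check.
enum class ResponseFormat(val isJson: Boolean) {
    JSON(true), VERBOSE_JSON(true), TEXT(false), SRT(false), VTT(false)
}

sealed interface TranscriptionResult
data class JsonTranscription(val json: String) : TranscriptionResult
data class PlainTranscription(val text: String) : TranscriptionResult

// Mirrors `if (params.responseFormat().getOrNull()?.isJson() != false) ...`:
// an absent format is treated as JSON; only an explicitly non-JSON format
// takes the plain-string path.
fun parseTranscription(body: String, format: ResponseFormat?): TranscriptionResult =
    if (format?.isJson != false) JsonTranscription(body) else PlainTranscription(body)

fun main() {
    println(parseTranscription("""{"text":"hello"}""", null))
    println(parseTranscription("1\n00:00:00,000 --> 00:00:01,000\nhello", ResponseFormat.SRT))
}
```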
internal constructor(private val clientOptions: Cli private val createHandler: Handler = jsonHandler(clientOptions.jsonMapper) - .withErrorHandler(errorHandler) override fun create( params: TranslationCreateParams, @@ -68,7 +68,7 @@ class TranslationServiceImpl internal constructor(private val clientOptions: Cli .prepare(clientOptions, params, deploymentModel = params.model().toString()) val requestOptions = requestOptions.applyDefaults(RequestOptions.from(clientOptions)) val response = clientOptions.httpClient.execute(request, requestOptions) - return response.parseable { + return errorHandler.handle(response).parseable { response .use { createHandler.handle(it) } .also { diff --git a/openai-java-core/src/main/kotlin/com/openai/services/blocking/beta/realtime/SessionServiceImpl.kt b/openai-java-core/src/main/kotlin/com/openai/services/blocking/beta/realtime/SessionServiceImpl.kt index 4671463f..523eb0e7 100644 --- a/openai-java-core/src/main/kotlin/com/openai/services/blocking/beta/realtime/SessionServiceImpl.kt +++ b/openai-java-core/src/main/kotlin/com/openai/services/blocking/beta/realtime/SessionServiceImpl.kt @@ -4,18 +4,18 @@ package com.openai.services.blocking.beta.realtime import com.openai.core.ClientOptions import com.openai.core.RequestOptions +import com.openai.core.handlers.errorBodyHandler import com.openai.core.handlers.errorHandler import com.openai.core.handlers.jsonHandler -import com.openai.core.handlers.withErrorHandler import com.openai.core.http.Headers import com.openai.core.http.HttpMethod import com.openai.core.http.HttpRequest +import com.openai.core.http.HttpResponse import com.openai.core.http.HttpResponse.Handler import com.openai.core.http.HttpResponseFor import com.openai.core.http.json import com.openai.core.http.parseable import com.openai.core.prepare -import com.openai.models.ErrorObject import com.openai.models.beta.realtime.sessions.SessionCreateParams import com.openai.models.beta.realtime.sessions.SessionCreateResponse import java.util.function.Consumer @@ -47,7 +47,8 @@ class SessionServiceImpl internal constructor(private val clientOptions: ClientO class WithRawResponseImpl internal constructor(private val clientOptions: ClientOptions) : SessionService.WithRawResponse { - private val errorHandler: Handler = errorHandler(clientOptions.jsonMapper) + private val errorHandler: Handler = + errorHandler(errorBodyHandler(clientOptions.jsonMapper)) override fun withOptions( modifier: Consumer @@ -58,7 +59,6 @@ class SessionServiceImpl internal constructor(private val clientOptions: ClientO private val createHandler: Handler = jsonHandler(clientOptions.jsonMapper) - .withErrorHandler(errorHandler) override fun create( params: SessionCreateParams, @@ -75,7 +75,7 @@ class SessionServiceImpl internal constructor(private val clientOptions: ClientO .prepare(clientOptions, params, deploymentModel = null) val requestOptions = requestOptions.applyDefaults(RequestOptions.from(clientOptions)) val response = clientOptions.httpClient.execute(request, requestOptions) - return response.parseable { + return errorHandler.handle(response).parseable { response .use { createHandler.handle(it) } .also { diff --git a/openai-java-core/src/main/kotlin/com/openai/services/blocking/beta/realtime/TranscriptionSessionServiceImpl.kt b/openai-java-core/src/main/kotlin/com/openai/services/blocking/beta/realtime/TranscriptionSessionServiceImpl.kt index e8dc4855..3341bd55 100644 --- 
a/openai-java-core/src/main/kotlin/com/openai/services/blocking/beta/realtime/TranscriptionSessionServiceImpl.kt +++ b/openai-java-core/src/main/kotlin/com/openai/services/blocking/beta/realtime/TranscriptionSessionServiceImpl.kt @@ -4,18 +4,18 @@ package com.openai.services.blocking.beta.realtime import com.openai.core.ClientOptions import com.openai.core.RequestOptions +import com.openai.core.handlers.errorBodyHandler import com.openai.core.handlers.errorHandler import com.openai.core.handlers.jsonHandler -import com.openai.core.handlers.withErrorHandler import com.openai.core.http.Headers import com.openai.core.http.HttpMethod import com.openai.core.http.HttpRequest +import com.openai.core.http.HttpResponse import com.openai.core.http.HttpResponse.Handler import com.openai.core.http.HttpResponseFor import com.openai.core.http.json import com.openai.core.http.parseable import com.openai.core.prepare -import com.openai.models.ErrorObject import com.openai.models.beta.realtime.transcriptionsessions.TranscriptionSession import com.openai.models.beta.realtime.transcriptionsessions.TranscriptionSessionCreateParams import java.util.function.Consumer @@ -49,7 +49,8 @@ internal constructor(private val clientOptions: ClientOptions) : TranscriptionSe class WithRawResponseImpl internal constructor(private val clientOptions: ClientOptions) : TranscriptionSessionService.WithRawResponse { - private val errorHandler: Handler = errorHandler(clientOptions.jsonMapper) + private val errorHandler: Handler = + errorHandler(errorBodyHandler(clientOptions.jsonMapper)) override fun withOptions( modifier: Consumer @@ -60,7 +61,6 @@ internal constructor(private val clientOptions: ClientOptions) : TranscriptionSe private val createHandler: Handler = jsonHandler(clientOptions.jsonMapper) - .withErrorHandler(errorHandler) override fun create( params: TranscriptionSessionCreateParams, @@ -77,7 +77,7 @@ internal constructor(private val clientOptions: ClientOptions) : TranscriptionSe .prepare(clientOptions, params, deploymentModel = null) val requestOptions = requestOptions.applyDefaults(RequestOptions.from(clientOptions)) val response = clientOptions.httpClient.execute(request, requestOptions) - return response.parseable { + return errorHandler.handle(response).parseable { response .use { createHandler.handle(it) } .also { diff --git a/openai-java-core/src/main/kotlin/com/openai/services/blocking/chat/ChatCompletionServiceImpl.kt b/openai-java-core/src/main/kotlin/com/openai/services/blocking/chat/ChatCompletionServiceImpl.kt index c9741de7..b078d7a5 100644 --- a/openai-java-core/src/main/kotlin/com/openai/services/blocking/chat/ChatCompletionServiceImpl.kt +++ b/openai-java-core/src/main/kotlin/com/openai/services/blocking/chat/ChatCompletionServiceImpl.kt @@ -6,13 +6,14 @@ import com.openai.core.ClientOptions import com.openai.core.JsonValue import com.openai.core.RequestOptions import com.openai.core.checkRequired +import com.openai.core.handlers.errorBodyHandler import com.openai.core.handlers.errorHandler import com.openai.core.handlers.jsonHandler import com.openai.core.handlers.mapJson import com.openai.core.handlers.sseHandler -import com.openai.core.handlers.withErrorHandler import com.openai.core.http.HttpMethod import com.openai.core.http.HttpRequest +import com.openai.core.http.HttpResponse import com.openai.core.http.HttpResponse.Handler import com.openai.core.http.HttpResponseFor import com.openai.core.http.StreamResponse @@ -20,7 +21,6 @@ import com.openai.core.http.json import com.openai.core.http.map 
import com.openai.core.http.parseable import com.openai.core.prepare -import com.openai.models.ErrorObject import com.openai.models.chat.completions.ChatCompletion import com.openai.models.chat.completions.ChatCompletionChunk import com.openai.models.chat.completions.ChatCompletionCreateParams @@ -97,7 +97,8 @@ class ChatCompletionServiceImpl internal constructor(private val clientOptions: class WithRawResponseImpl internal constructor(private val clientOptions: ClientOptions) : ChatCompletionService.WithRawResponse { - private val errorHandler: Handler = errorHandler(clientOptions.jsonMapper) + private val errorHandler: Handler = + errorHandler(errorBodyHandler(clientOptions.jsonMapper)) private val messages: MessageService.WithRawResponse by lazy { MessageServiceImpl.WithRawResponseImpl(clientOptions) @@ -113,7 +114,7 @@ class ChatCompletionServiceImpl internal constructor(private val clientOptions: override fun messages(): MessageService.WithRawResponse = messages private val createHandler: Handler = - jsonHandler(clientOptions.jsonMapper).withErrorHandler(errorHandler) + jsonHandler(clientOptions.jsonMapper) override fun create( params: ChatCompletionCreateParams, @@ -129,7 +130,7 @@ class ChatCompletionServiceImpl internal constructor(private val clientOptions: .prepare(clientOptions, params, params.model().toString()) val requestOptions = requestOptions.applyDefaults(RequestOptions.from(clientOptions)) val response = clientOptions.httpClient.execute(request, requestOptions) - return response.parseable { + return errorHandler.handle(response).parseable { response .use { createHandler.handle(it) } .also { @@ -141,9 +142,7 @@ class ChatCompletionServiceImpl internal constructor(private val clientOptions: } private val createStreamingHandler: Handler> = - sseHandler(clientOptions.jsonMapper) - .mapJson() - .withErrorHandler(errorHandler) + sseHandler(clientOptions.jsonMapper).mapJson() override fun createStreaming( params: ChatCompletionCreateParams, @@ -168,7 +167,7 @@ class ChatCompletionServiceImpl internal constructor(private val clientOptions: .prepare(clientOptions, params, params.model().toString()) val requestOptions = requestOptions.applyDefaults(RequestOptions.from(clientOptions)) val response = clientOptions.httpClient.execute(request, requestOptions) - return response.parseable { + return errorHandler.handle(response).parseable { response .let { createStreamingHandler.handle(it) } .let { streamResponse -> @@ -182,7 +181,7 @@ class ChatCompletionServiceImpl internal constructor(private val clientOptions: } private val retrieveHandler: Handler = - jsonHandler(clientOptions.jsonMapper).withErrorHandler(errorHandler) + jsonHandler(clientOptions.jsonMapper) override fun retrieve( params: ChatCompletionRetrieveParams, @@ -200,7 +199,7 @@ class ChatCompletionServiceImpl internal constructor(private val clientOptions: .prepare(clientOptions, params, null) val requestOptions = requestOptions.applyDefaults(RequestOptions.from(clientOptions)) val response = clientOptions.httpClient.execute(request, requestOptions) - return response.parseable { + return errorHandler.handle(response).parseable { response .use { retrieveHandler.handle(it) } .also { @@ -212,7 +211,7 @@ class ChatCompletionServiceImpl internal constructor(private val clientOptions: } private val updateHandler: Handler = - jsonHandler(clientOptions.jsonMapper).withErrorHandler(errorHandler) + jsonHandler(clientOptions.jsonMapper) override fun update( params: ChatCompletionUpdateParams, @@ -231,7 +230,7 @@ class 
ChatCompletionServiceImpl internal constructor(private val clientOptions: .prepare(clientOptions, params, null) val requestOptions = requestOptions.applyDefaults(RequestOptions.from(clientOptions)) val response = clientOptions.httpClient.execute(request, requestOptions) - return response.parseable { + return errorHandler.handle(response).parseable { response .use { updateHandler.handle(it) } .also { @@ -244,7 +243,6 @@ class ChatCompletionServiceImpl internal constructor(private val clientOptions: private val listHandler: Handler = jsonHandler(clientOptions.jsonMapper) - .withErrorHandler(errorHandler) override fun list( params: ChatCompletionListParams, @@ -263,7 +261,7 @@ class ChatCompletionServiceImpl internal constructor(private val clientOptions: ) val requestOptions = requestOptions.applyDefaults(RequestOptions.from(clientOptions)) val response = clientOptions.httpClient.execute(request, requestOptions) - return response.parseable { + return errorHandler.handle(response).parseable { response .use { listHandler.handle(it) } .also { @@ -283,7 +281,6 @@ class ChatCompletionServiceImpl internal constructor(private val clientOptions: private val deleteHandler: Handler = jsonHandler(clientOptions.jsonMapper) - .withErrorHandler(errorHandler) override fun delete( params: ChatCompletionDeleteParams, @@ -302,7 +299,7 @@ class ChatCompletionServiceImpl internal constructor(private val clientOptions: .prepare(clientOptions, params, null) val requestOptions = requestOptions.applyDefaults(RequestOptions.from(clientOptions)) val response = clientOptions.httpClient.execute(request, requestOptions) - return response.parseable { + return errorHandler.handle(response).parseable { response .use { deleteHandler.handle(it) } .also { diff --git a/openai-java-core/src/main/kotlin/com/openai/services/blocking/chat/completions/MessageServiceImpl.kt b/openai-java-core/src/main/kotlin/com/openai/services/blocking/chat/completions/MessageServiceImpl.kt index 249f1cbc..69ff2f57 100644 --- a/openai-java-core/src/main/kotlin/com/openai/services/blocking/chat/completions/MessageServiceImpl.kt +++ b/openai-java-core/src/main/kotlin/com/openai/services/blocking/chat/completions/MessageServiceImpl.kt @@ -5,16 +5,16 @@ package com.openai.services.blocking.chat.completions import com.openai.core.ClientOptions import com.openai.core.RequestOptions import com.openai.core.checkRequired +import com.openai.core.handlers.errorBodyHandler import com.openai.core.handlers.errorHandler import com.openai.core.handlers.jsonHandler -import com.openai.core.handlers.withErrorHandler import com.openai.core.http.HttpMethod import com.openai.core.http.HttpRequest +import com.openai.core.http.HttpResponse import com.openai.core.http.HttpResponse.Handler import com.openai.core.http.HttpResponseFor import com.openai.core.http.parseable import com.openai.core.prepare -import com.openai.models.ErrorObject import com.openai.models.chat.completions.messages.MessageListPage import com.openai.models.chat.completions.messages.MessageListPageResponse import com.openai.models.chat.completions.messages.MessageListParams @@ -40,7 +40,8 @@ class MessageServiceImpl internal constructor(private val clientOptions: ClientO class WithRawResponseImpl internal constructor(private val clientOptions: ClientOptions) : MessageService.WithRawResponse { - private val errorHandler: Handler = errorHandler(clientOptions.jsonMapper) + private val errorHandler: Handler = + errorHandler(errorBodyHandler(clientOptions.jsonMapper)) override fun withOptions( modifier: 
Consumer @@ -51,7 +52,6 @@ class MessageServiceImpl internal constructor(private val clientOptions: ClientO private val listHandler: Handler = jsonHandler(clientOptions.jsonMapper) - .withErrorHandler(errorHandler) override fun list( params: MessageListParams, @@ -69,7 +69,7 @@ class MessageServiceImpl internal constructor(private val clientOptions: ClientO .prepare(clientOptions, params, null) val requestOptions = requestOptions.applyDefaults(RequestOptions.from(clientOptions)) val response = clientOptions.httpClient.execute(request, requestOptions) - return response.parseable { + return errorHandler.handle(response).parseable { response .use { listHandler.handle(it) } .also { diff --git a/openai-java-core/src/main/kotlin/com/openai/services/blocking/containers/FileServiceImpl.kt b/openai-java-core/src/main/kotlin/com/openai/services/blocking/containers/FileServiceImpl.kt index 31a5d1f1..45400120 100644 --- a/openai-java-core/src/main/kotlin/com/openai/services/blocking/containers/FileServiceImpl.kt +++ b/openai-java-core/src/main/kotlin/com/openai/services/blocking/containers/FileServiceImpl.kt @@ -6,9 +6,9 @@ import com.openai.core.ClientOptions import com.openai.core.RequestOptions import com.openai.core.checkRequired import com.openai.core.handlers.emptyHandler +import com.openai.core.handlers.errorBodyHandler import com.openai.core.handlers.errorHandler import com.openai.core.handlers.jsonHandler -import com.openai.core.handlers.withErrorHandler import com.openai.core.http.HttpMethod import com.openai.core.http.HttpRequest import com.openai.core.http.HttpResponse @@ -18,7 +18,6 @@ import com.openai.core.http.json import com.openai.core.http.multipartFormData import com.openai.core.http.parseable import com.openai.core.prepare -import com.openai.models.ErrorObject import com.openai.models.containers.files.FileCreateParams import com.openai.models.containers.files.FileCreateResponse import com.openai.models.containers.files.FileDeleteParams @@ -73,7 +72,8 @@ class FileServiceImpl internal constructor(private val clientOptions: ClientOpti class WithRawResponseImpl internal constructor(private val clientOptions: ClientOptions) : FileService.WithRawResponse { - private val errorHandler: Handler = errorHandler(clientOptions.jsonMapper) + private val errorHandler: Handler = + errorHandler(errorBodyHandler(clientOptions.jsonMapper)) private val content: ContentService.WithRawResponse by lazy { ContentServiceImpl.WithRawResponseImpl(clientOptions) @@ -89,7 +89,7 @@ class FileServiceImpl internal constructor(private val clientOptions: ClientOpti override fun content(): ContentService.WithRawResponse = content private val createHandler: Handler = - jsonHandler(clientOptions.jsonMapper).withErrorHandler(errorHandler) + jsonHandler(clientOptions.jsonMapper) override fun create( params: FileCreateParams, @@ -108,7 +108,7 @@ class FileServiceImpl internal constructor(private val clientOptions: ClientOpti .prepare(clientOptions, params, deploymentModel = null) val requestOptions = requestOptions.applyDefaults(RequestOptions.from(clientOptions)) val response = clientOptions.httpClient.execute(request, requestOptions) - return response.parseable { + return errorHandler.handle(response).parseable { response .use { createHandler.handle(it) } .also { @@ -121,7 +121,6 @@ class FileServiceImpl internal constructor(private val clientOptions: ClientOpti private val retrieveHandler: Handler = jsonHandler(clientOptions.jsonMapper) - .withErrorHandler(errorHandler) override fun retrieve( params: 
FileRetrieveParams, @@ -144,7 +143,7 @@ class FileServiceImpl internal constructor(private val clientOptions: ClientOpti .prepare(clientOptions, params, deploymentModel = null) val requestOptions = requestOptions.applyDefaults(RequestOptions.from(clientOptions)) val response = clientOptions.httpClient.execute(request, requestOptions) - return response.parseable { + return errorHandler.handle(response).parseable { response .use { retrieveHandler.handle(it) } .also { @@ -157,7 +156,6 @@ class FileServiceImpl internal constructor(private val clientOptions: ClientOpti private val listHandler: Handler = jsonHandler(clientOptions.jsonMapper) - .withErrorHandler(errorHandler) override fun list( params: FileListParams, @@ -175,7 +173,7 @@ class FileServiceImpl internal constructor(private val clientOptions: ClientOpti .prepare(clientOptions, params, deploymentModel = null) val requestOptions = requestOptions.applyDefaults(RequestOptions.from(clientOptions)) val response = clientOptions.httpClient.execute(request, requestOptions) - return response.parseable { + return errorHandler.handle(response).parseable { response .use { listHandler.handle(it) } .also { @@ -193,7 +191,7 @@ class FileServiceImpl internal constructor(private val clientOptions: ClientOpti } } - private val deleteHandler: Handler = emptyHandler().withErrorHandler(errorHandler) + private val deleteHandler: Handler = emptyHandler() override fun delete( params: FileDeleteParams, @@ -217,7 +215,9 @@ class FileServiceImpl internal constructor(private val clientOptions: ClientOpti .prepare(clientOptions, params, deploymentModel = null) val requestOptions = requestOptions.applyDefaults(RequestOptions.from(clientOptions)) val response = clientOptions.httpClient.execute(request, requestOptions) - return response.parseable { response.use { deleteHandler.handle(it) } } + return errorHandler.handle(response).parseable { + response.use { deleteHandler.handle(it) } + } } } } diff --git a/openai-java-core/src/main/kotlin/com/openai/services/blocking/containers/files/ContentServiceImpl.kt b/openai-java-core/src/main/kotlin/com/openai/services/blocking/containers/files/ContentServiceImpl.kt index 116a4d57..5b6cd6cb 100644 --- a/openai-java-core/src/main/kotlin/com/openai/services/blocking/containers/files/ContentServiceImpl.kt +++ b/openai-java-core/src/main/kotlin/com/openai/services/blocking/containers/files/ContentServiceImpl.kt @@ -5,13 +5,13 @@ package com.openai.services.blocking.containers.files import com.openai.core.ClientOptions import com.openai.core.RequestOptions import com.openai.core.checkRequired +import com.openai.core.handlers.errorBodyHandler import com.openai.core.handlers.errorHandler import com.openai.core.http.HttpMethod import com.openai.core.http.HttpRequest import com.openai.core.http.HttpResponse import com.openai.core.http.HttpResponse.Handler import com.openai.core.prepare -import com.openai.models.ErrorObject import com.openai.models.containers.files.content.ContentRetrieveParams import java.util.function.Consumer import kotlin.jvm.optionals.getOrNull @@ -38,7 +38,8 @@ class ContentServiceImpl internal constructor(private val clientOptions: ClientO class WithRawResponseImpl internal constructor(private val clientOptions: ClientOptions) : ContentService.WithRawResponse { - private val errorHandler: Handler = errorHandler(clientOptions.jsonMapper) + private val errorHandler: Handler = + errorHandler(errorBodyHandler(clientOptions.jsonMapper)) override fun withOptions( modifier: Consumer @@ -68,7 +69,8 @@ class 
ContentServiceImpl internal constructor(private val clientOptions: ClientO .build() .prepare(clientOptions, params, deploymentModel = null) val requestOptions = requestOptions.applyDefaults(RequestOptions.from(clientOptions)) - return clientOptions.httpClient.execute(request, requestOptions) + val response = clientOptions.httpClient.execute(request, requestOptions) + return errorHandler.handle(response) } } } diff --git a/openai-java-core/src/main/kotlin/com/openai/services/blocking/evals/RunServiceImpl.kt b/openai-java-core/src/main/kotlin/com/openai/services/blocking/evals/RunServiceImpl.kt index 101762a3..cc513e79 100644 --- a/openai-java-core/src/main/kotlin/com/openai/services/blocking/evals/RunServiceImpl.kt +++ b/openai-java-core/src/main/kotlin/com/openai/services/blocking/evals/RunServiceImpl.kt @@ -5,17 +5,17 @@ package com.openai.services.blocking.evals import com.openai.core.ClientOptions import com.openai.core.RequestOptions import com.openai.core.checkRequired +import com.openai.core.handlers.errorBodyHandler import com.openai.core.handlers.errorHandler import com.openai.core.handlers.jsonHandler -import com.openai.core.handlers.withErrorHandler import com.openai.core.http.HttpMethod import com.openai.core.http.HttpRequest +import com.openai.core.http.HttpResponse import com.openai.core.http.HttpResponse.Handler import com.openai.core.http.HttpResponseFor import com.openai.core.http.json import com.openai.core.http.parseable import com.openai.core.prepare -import com.openai.models.ErrorObject import com.openai.models.evals.runs.RunCancelParams import com.openai.models.evals.runs.RunCancelResponse import com.openai.models.evals.runs.RunCreateParams @@ -82,7 +82,8 @@ class RunServiceImpl internal constructor(private val clientOptions: ClientOptio class WithRawResponseImpl internal constructor(private val clientOptions: ClientOptions) : RunService.WithRawResponse { - private val errorHandler: Handler = errorHandler(clientOptions.jsonMapper) + private val errorHandler: Handler = + errorHandler(errorBodyHandler(clientOptions.jsonMapper)) private val outputItems: OutputItemService.WithRawResponse by lazy { OutputItemServiceImpl.WithRawResponseImpl(clientOptions) @@ -98,7 +99,7 @@ class RunServiceImpl internal constructor(private val clientOptions: ClientOptio override fun outputItems(): OutputItemService.WithRawResponse = outputItems private val createHandler: Handler = - jsonHandler(clientOptions.jsonMapper).withErrorHandler(errorHandler) + jsonHandler(clientOptions.jsonMapper) override fun create( params: RunCreateParams, @@ -117,7 +118,7 @@ class RunServiceImpl internal constructor(private val clientOptions: ClientOptio .prepare(clientOptions, params, deploymentModel = null) val requestOptions = requestOptions.applyDefaults(RequestOptions.from(clientOptions)) val response = clientOptions.httpClient.execute(request, requestOptions) - return response.parseable { + return errorHandler.handle(response).parseable { response .use { createHandler.handle(it) } .also { @@ -130,7 +131,6 @@ class RunServiceImpl internal constructor(private val clientOptions: ClientOptio private val retrieveHandler: Handler = jsonHandler(clientOptions.jsonMapper) - .withErrorHandler(errorHandler) override fun retrieve( params: RunRetrieveParams, @@ -148,7 +148,7 @@ class RunServiceImpl internal constructor(private val clientOptions: ClientOptio .prepare(clientOptions, params, deploymentModel = null) val requestOptions = requestOptions.applyDefaults(RequestOptions.from(clientOptions)) val response = 
clientOptions.httpClient.execute(request, requestOptions) - return response.parseable { + return errorHandler.handle(response).parseable { response .use { retrieveHandler.handle(it) } .also { @@ -161,7 +161,6 @@ class RunServiceImpl internal constructor(private val clientOptions: ClientOptio private val listHandler: Handler = jsonHandler(clientOptions.jsonMapper) - .withErrorHandler(errorHandler) override fun list( params: RunListParams, @@ -179,7 +178,7 @@ class RunServiceImpl internal constructor(private val clientOptions: ClientOptio .prepare(clientOptions, params, deploymentModel = null) val requestOptions = requestOptions.applyDefaults(RequestOptions.from(clientOptions)) val response = clientOptions.httpClient.execute(request, requestOptions) - return response.parseable { + return errorHandler.handle(response).parseable { response .use { listHandler.handle(it) } .also { @@ -198,7 +197,7 @@ class RunServiceImpl internal constructor(private val clientOptions: ClientOptio } private val deleteHandler: Handler = - jsonHandler(clientOptions.jsonMapper).withErrorHandler(errorHandler) + jsonHandler(clientOptions.jsonMapper) override fun delete( params: RunDeleteParams, @@ -217,7 +216,7 @@ class RunServiceImpl internal constructor(private val clientOptions: ClientOptio .prepare(clientOptions, params, deploymentModel = null) val requestOptions = requestOptions.applyDefaults(RequestOptions.from(clientOptions)) val response = clientOptions.httpClient.execute(request, requestOptions) - return response.parseable { + return errorHandler.handle(response).parseable { response .use { deleteHandler.handle(it) } .also { @@ -229,7 +228,7 @@ class RunServiceImpl internal constructor(private val clientOptions: ClientOptio } private val cancelHandler: Handler = - jsonHandler(clientOptions.jsonMapper).withErrorHandler(errorHandler) + jsonHandler(clientOptions.jsonMapper) override fun cancel( params: RunCancelParams, @@ -248,7 +247,7 @@ class RunServiceImpl internal constructor(private val clientOptions: ClientOptio .prepare(clientOptions, params, deploymentModel = null) val requestOptions = requestOptions.applyDefaults(RequestOptions.from(clientOptions)) val response = clientOptions.httpClient.execute(request, requestOptions) - return response.parseable { + return errorHandler.handle(response).parseable { response .use { cancelHandler.handle(it) } .also { diff --git a/openai-java-core/src/main/kotlin/com/openai/services/blocking/evals/runs/OutputItemServiceImpl.kt b/openai-java-core/src/main/kotlin/com/openai/services/blocking/evals/runs/OutputItemServiceImpl.kt index 5658996c..b656d22c 100644 --- a/openai-java-core/src/main/kotlin/com/openai/services/blocking/evals/runs/OutputItemServiceImpl.kt +++ b/openai-java-core/src/main/kotlin/com/openai/services/blocking/evals/runs/OutputItemServiceImpl.kt @@ -5,16 +5,16 @@ package com.openai.services.blocking.evals.runs import com.openai.core.ClientOptions import com.openai.core.RequestOptions import com.openai.core.checkRequired +import com.openai.core.handlers.errorBodyHandler import com.openai.core.handlers.errorHandler import com.openai.core.handlers.jsonHandler -import com.openai.core.handlers.withErrorHandler import com.openai.core.http.HttpMethod import com.openai.core.http.HttpRequest +import com.openai.core.http.HttpResponse import com.openai.core.http.HttpResponse.Handler import com.openai.core.http.HttpResponseFor import com.openai.core.http.parseable import com.openai.core.prepare -import com.openai.models.ErrorObject import 
com.openai.models.evals.runs.outputitems.OutputItemListPage import com.openai.models.evals.runs.outputitems.OutputItemListPageResponse import com.openai.models.evals.runs.outputitems.OutputItemListParams @@ -52,7 +52,8 @@ class OutputItemServiceImpl internal constructor(private val clientOptions: Clie class WithRawResponseImpl internal constructor(private val clientOptions: ClientOptions) : OutputItemService.WithRawResponse { - private val errorHandler: Handler = errorHandler(clientOptions.jsonMapper) + private val errorHandler: Handler = + errorHandler(errorBodyHandler(clientOptions.jsonMapper)) override fun withOptions( modifier: Consumer @@ -63,7 +64,6 @@ class OutputItemServiceImpl internal constructor(private val clientOptions: Clie private val retrieveHandler: Handler = jsonHandler(clientOptions.jsonMapper) - .withErrorHandler(errorHandler) override fun retrieve( params: OutputItemRetrieveParams, @@ -88,7 +88,7 @@ class OutputItemServiceImpl internal constructor(private val clientOptions: Clie .prepare(clientOptions, params, deploymentModel = null) val requestOptions = requestOptions.applyDefaults(RequestOptions.from(clientOptions)) val response = clientOptions.httpClient.execute(request, requestOptions) - return response.parseable { + return errorHandler.handle(response).parseable { response .use { retrieveHandler.handle(it) } .also { @@ -101,7 +101,6 @@ class OutputItemServiceImpl internal constructor(private val clientOptions: Clie private val listHandler: Handler = jsonHandler(clientOptions.jsonMapper) - .withErrorHandler(errorHandler) override fun list( params: OutputItemListParams, @@ -125,7 +124,7 @@ class OutputItemServiceImpl internal constructor(private val clientOptions: Clie .prepare(clientOptions, params, deploymentModel = null) val requestOptions = requestOptions.applyDefaults(RequestOptions.from(clientOptions)) val response = clientOptions.httpClient.execute(request, requestOptions) - return response.parseable { + return errorHandler.handle(response).parseable { response .use { listHandler.handle(it) } .also { diff --git a/openai-java-core/src/main/kotlin/com/openai/services/blocking/finetuning/JobServiceImpl.kt b/openai-java-core/src/main/kotlin/com/openai/services/blocking/finetuning/JobServiceImpl.kt index 286b0546..b143a6f9 100644 --- a/openai-java-core/src/main/kotlin/com/openai/services/blocking/finetuning/JobServiceImpl.kt +++ b/openai-java-core/src/main/kotlin/com/openai/services/blocking/finetuning/JobServiceImpl.kt @@ -5,17 +5,17 @@ package com.openai.services.blocking.finetuning import com.openai.core.ClientOptions import com.openai.core.RequestOptions import com.openai.core.checkRequired +import com.openai.core.handlers.errorBodyHandler import com.openai.core.handlers.errorHandler import com.openai.core.handlers.jsonHandler -import com.openai.core.handlers.withErrorHandler import com.openai.core.http.HttpMethod import com.openai.core.http.HttpRequest +import com.openai.core.http.HttpResponse import com.openai.core.http.HttpResponse.Handler import com.openai.core.http.HttpResponseFor import com.openai.core.http.json import com.openai.core.http.parseable import com.openai.core.prepare -import com.openai.models.ErrorObject import com.openai.models.finetuning.jobs.FineTuningJob import com.openai.models.finetuning.jobs.JobCancelParams import com.openai.models.finetuning.jobs.JobCreateParams @@ -85,7 +85,8 @@ class JobServiceImpl internal constructor(private val clientOptions: ClientOptio class WithRawResponseImpl internal constructor(private val clientOptions: 
ClientOptions) : JobService.WithRawResponse { - private val errorHandler: Handler = errorHandler(clientOptions.jsonMapper) + private val errorHandler: Handler = + errorHandler(errorBodyHandler(clientOptions.jsonMapper)) private val checkpoints: CheckpointService.WithRawResponse by lazy { CheckpointServiceImpl.WithRawResponseImpl(clientOptions) @@ -101,7 +102,7 @@ class JobServiceImpl internal constructor(private val clientOptions: ClientOptio override fun checkpoints(): CheckpointService.WithRawResponse = checkpoints private val createHandler: Handler = - jsonHandler(clientOptions.jsonMapper).withErrorHandler(errorHandler) + jsonHandler(clientOptions.jsonMapper) override fun create( params: JobCreateParams, @@ -117,7 +118,7 @@ class JobServiceImpl internal constructor(private val clientOptions: ClientOptio .prepare(clientOptions, params, params.model().toString()) val requestOptions = requestOptions.applyDefaults(RequestOptions.from(clientOptions)) val response = clientOptions.httpClient.execute(request, requestOptions) - return response.parseable { + return errorHandler.handle(response).parseable { response .use { createHandler.handle(it) } .also { @@ -129,7 +130,7 @@ class JobServiceImpl internal constructor(private val clientOptions: ClientOptio } private val retrieveHandler: Handler = - jsonHandler(clientOptions.jsonMapper).withErrorHandler(errorHandler) + jsonHandler(clientOptions.jsonMapper) override fun retrieve( params: JobRetrieveParams, @@ -147,7 +148,7 @@ class JobServiceImpl internal constructor(private val clientOptions: ClientOptio .prepare(clientOptions, params, deploymentModel = null) val requestOptions = requestOptions.applyDefaults(RequestOptions.from(clientOptions)) val response = clientOptions.httpClient.execute(request, requestOptions) - return response.parseable { + return errorHandler.handle(response).parseable { response .use { retrieveHandler.handle(it) } .also { @@ -160,7 +161,6 @@ class JobServiceImpl internal constructor(private val clientOptions: ClientOptio private val listHandler: Handler = jsonHandler(clientOptions.jsonMapper) - .withErrorHandler(errorHandler) override fun list( params: JobListParams, @@ -175,7 +175,7 @@ class JobServiceImpl internal constructor(private val clientOptions: ClientOptio .prepare(clientOptions, params, deploymentModel = null) val requestOptions = requestOptions.applyDefaults(RequestOptions.from(clientOptions)) val response = clientOptions.httpClient.execute(request, requestOptions) - return response.parseable { + return errorHandler.handle(response).parseable { response .use { listHandler.handle(it) } .also { @@ -194,7 +194,7 @@ class JobServiceImpl internal constructor(private val clientOptions: ClientOptio } private val cancelHandler: Handler = - jsonHandler(clientOptions.jsonMapper).withErrorHandler(errorHandler) + jsonHandler(clientOptions.jsonMapper) override fun cancel( params: JobCancelParams, @@ -213,7 +213,7 @@ class JobServiceImpl internal constructor(private val clientOptions: ClientOptio .prepare(clientOptions, params, deploymentModel = null) val requestOptions = requestOptions.applyDefaults(RequestOptions.from(clientOptions)) val response = clientOptions.httpClient.execute(request, requestOptions) - return response.parseable { + return errorHandler.handle(response).parseable { response .use { cancelHandler.handle(it) } .also { @@ -226,7 +226,6 @@ class JobServiceImpl internal constructor(private val clientOptions: ClientOptio private val listEventsHandler: Handler = jsonHandler(clientOptions.jsonMapper) - 
.withErrorHandler(errorHandler) override fun listEvents( params: JobListEventsParams, @@ -244,7 +243,7 @@ class JobServiceImpl internal constructor(private val clientOptions: ClientOptio .prepare(clientOptions, params, deploymentModel = null) val requestOptions = requestOptions.applyDefaults(RequestOptions.from(clientOptions)) val response = clientOptions.httpClient.execute(request, requestOptions) - return response.parseable { + return errorHandler.handle(response).parseable { response .use { listEventsHandler.handle(it) } .also { @@ -263,7 +262,7 @@ class JobServiceImpl internal constructor(private val clientOptions: ClientOptio } private val pauseHandler: Handler = - jsonHandler(clientOptions.jsonMapper).withErrorHandler(errorHandler) + jsonHandler(clientOptions.jsonMapper) override fun pause( params: JobPauseParams, @@ -282,7 +281,7 @@ class JobServiceImpl internal constructor(private val clientOptions: ClientOptio .prepare(clientOptions, params, deploymentModel = null) val requestOptions = requestOptions.applyDefaults(RequestOptions.from(clientOptions)) val response = clientOptions.httpClient.execute(request, requestOptions) - return response.parseable { + return errorHandler.handle(response).parseable { response .use { pauseHandler.handle(it) } .also { @@ -294,7 +293,7 @@ class JobServiceImpl internal constructor(private val clientOptions: ClientOptio } private val resumeHandler: Handler = - jsonHandler(clientOptions.jsonMapper).withErrorHandler(errorHandler) + jsonHandler(clientOptions.jsonMapper) override fun resume( params: JobResumeParams, @@ -313,7 +312,7 @@ class JobServiceImpl internal constructor(private val clientOptions: ClientOptio .prepare(clientOptions, params, deploymentModel = null) val requestOptions = requestOptions.applyDefaults(RequestOptions.from(clientOptions)) val response = clientOptions.httpClient.execute(request, requestOptions) - return response.parseable { + return errorHandler.handle(response).parseable { response .use { resumeHandler.handle(it) } .also { diff --git a/openai-java-core/src/main/kotlin/com/openai/services/blocking/finetuning/alpha/GraderServiceImpl.kt b/openai-java-core/src/main/kotlin/com/openai/services/blocking/finetuning/alpha/GraderServiceImpl.kt index 059e9536..0d8b0ae9 100644 --- a/openai-java-core/src/main/kotlin/com/openai/services/blocking/finetuning/alpha/GraderServiceImpl.kt +++ b/openai-java-core/src/main/kotlin/com/openai/services/blocking/finetuning/alpha/GraderServiceImpl.kt @@ -4,17 +4,17 @@ package com.openai.services.blocking.finetuning.alpha import com.openai.core.ClientOptions import com.openai.core.RequestOptions +import com.openai.core.handlers.errorBodyHandler import com.openai.core.handlers.errorHandler import com.openai.core.handlers.jsonHandler -import com.openai.core.handlers.withErrorHandler import com.openai.core.http.HttpMethod import com.openai.core.http.HttpRequest +import com.openai.core.http.HttpResponse import com.openai.core.http.HttpResponse.Handler import com.openai.core.http.HttpResponseFor import com.openai.core.http.json import com.openai.core.http.parseable import com.openai.core.prepare -import com.openai.models.ErrorObject import com.openai.models.finetuning.alpha.graders.GraderRunParams import com.openai.models.finetuning.alpha.graders.GraderRunResponse import com.openai.models.finetuning.alpha.graders.GraderValidateParams @@ -47,7 +47,8 @@ class GraderServiceImpl internal constructor(private val clientOptions: ClientOp class WithRawResponseImpl internal constructor(private val clientOptions: 
ClientOptions) : GraderService.WithRawResponse { - private val errorHandler: Handler = errorHandler(clientOptions.jsonMapper) + private val errorHandler: Handler = + errorHandler(errorBodyHandler(clientOptions.jsonMapper)) override fun withOptions( modifier: Consumer @@ -57,7 +58,7 @@ class GraderServiceImpl internal constructor(private val clientOptions: ClientOp ) private val runHandler: Handler = - jsonHandler(clientOptions.jsonMapper).withErrorHandler(errorHandler) + jsonHandler(clientOptions.jsonMapper) override fun run( params: GraderRunParams, @@ -73,7 +74,7 @@ class GraderServiceImpl internal constructor(private val clientOptions: ClientOp .prepare(clientOptions, params, deploymentModel = null) val requestOptions = requestOptions.applyDefaults(RequestOptions.from(clientOptions)) val response = clientOptions.httpClient.execute(request, requestOptions) - return response.parseable { + return errorHandler.handle(response).parseable { response .use { runHandler.handle(it) } .also { @@ -86,7 +87,6 @@ class GraderServiceImpl internal constructor(private val clientOptions: ClientOp private val validateHandler: Handler = jsonHandler(clientOptions.jsonMapper) - .withErrorHandler(errorHandler) override fun validate( params: GraderValidateParams, @@ -102,7 +102,7 @@ class GraderServiceImpl internal constructor(private val clientOptions: ClientOp .prepare(clientOptions, params, deploymentModel = null) val requestOptions = requestOptions.applyDefaults(RequestOptions.from(clientOptions)) val response = clientOptions.httpClient.execute(request, requestOptions) - return response.parseable { + return errorHandler.handle(response).parseable { response .use { validateHandler.handle(it) } .also { diff --git a/openai-java-core/src/main/kotlin/com/openai/services/blocking/finetuning/checkpoints/PermissionServiceImpl.kt b/openai-java-core/src/main/kotlin/com/openai/services/blocking/finetuning/checkpoints/PermissionServiceImpl.kt index 9cada335..8bbffb9f 100644 --- a/openai-java-core/src/main/kotlin/com/openai/services/blocking/finetuning/checkpoints/PermissionServiceImpl.kt +++ b/openai-java-core/src/main/kotlin/com/openai/services/blocking/finetuning/checkpoints/PermissionServiceImpl.kt @@ -5,17 +5,17 @@ package com.openai.services.blocking.finetuning.checkpoints import com.openai.core.ClientOptions import com.openai.core.RequestOptions import com.openai.core.checkRequired +import com.openai.core.handlers.errorBodyHandler import com.openai.core.handlers.errorHandler import com.openai.core.handlers.jsonHandler -import com.openai.core.handlers.withErrorHandler import com.openai.core.http.HttpMethod import com.openai.core.http.HttpRequest +import com.openai.core.http.HttpResponse import com.openai.core.http.HttpResponse.Handler import com.openai.core.http.HttpResponseFor import com.openai.core.http.json import com.openai.core.http.parseable import com.openai.core.prepare -import com.openai.models.ErrorObject import com.openai.models.finetuning.checkpoints.permissions.PermissionCreatePage import com.openai.models.finetuning.checkpoints.permissions.PermissionCreatePageResponse import com.openai.models.finetuning.checkpoints.permissions.PermissionCreateParams @@ -62,7 +62,8 @@ class PermissionServiceImpl internal constructor(private val clientOptions: Clie class WithRawResponseImpl internal constructor(private val clientOptions: ClientOptions) : PermissionService.WithRawResponse { - private val errorHandler: Handler = errorHandler(clientOptions.jsonMapper) + private val errorHandler: Handler = + 
errorHandler(errorBodyHandler(clientOptions.jsonMapper)) override fun withOptions( modifier: Consumer @@ -73,7 +74,6 @@ class PermissionServiceImpl internal constructor(private val clientOptions: Clie private val createHandler: Handler = jsonHandler(clientOptions.jsonMapper) - .withErrorHandler(errorHandler) override fun create( params: PermissionCreateParams, @@ -97,7 +97,7 @@ class PermissionServiceImpl internal constructor(private val clientOptions: Clie .prepare(clientOptions, params, deploymentModel = null) val requestOptions = requestOptions.applyDefaults(RequestOptions.from(clientOptions)) val response = clientOptions.httpClient.execute(request, requestOptions) - return response.parseable { + return errorHandler.handle(response).parseable { response .use { createHandler.handle(it) } .also { @@ -117,7 +117,6 @@ class PermissionServiceImpl internal constructor(private val clientOptions: Clie private val retrieveHandler: Handler = jsonHandler(clientOptions.jsonMapper) - .withErrorHandler(errorHandler) override fun retrieve( params: PermissionRetrieveParams, @@ -140,7 +139,7 @@ class PermissionServiceImpl internal constructor(private val clientOptions: Clie .prepare(clientOptions, params, deploymentModel = null) val requestOptions = requestOptions.applyDefaults(RequestOptions.from(clientOptions)) val response = clientOptions.httpClient.execute(request, requestOptions) - return response.parseable { + return errorHandler.handle(response).parseable { response .use { retrieveHandler.handle(it) } .also { @@ -153,7 +152,6 @@ class PermissionServiceImpl internal constructor(private val clientOptions: Clie private val deleteHandler: Handler = jsonHandler(clientOptions.jsonMapper) - .withErrorHandler(errorHandler) override fun delete( params: PermissionDeleteParams, @@ -178,7 +176,7 @@ class PermissionServiceImpl internal constructor(private val clientOptions: Clie .prepare(clientOptions, params, deploymentModel = null) val requestOptions = requestOptions.applyDefaults(RequestOptions.from(clientOptions)) val response = clientOptions.httpClient.execute(request, requestOptions) - return response.parseable { + return errorHandler.handle(response).parseable { response .use { deleteHandler.handle(it) } .also { diff --git a/openai-java-core/src/main/kotlin/com/openai/services/blocking/finetuning/jobs/CheckpointServiceImpl.kt b/openai-java-core/src/main/kotlin/com/openai/services/blocking/finetuning/jobs/CheckpointServiceImpl.kt index 2807c36c..9517decd 100644 --- a/openai-java-core/src/main/kotlin/com/openai/services/blocking/finetuning/jobs/CheckpointServiceImpl.kt +++ b/openai-java-core/src/main/kotlin/com/openai/services/blocking/finetuning/jobs/CheckpointServiceImpl.kt @@ -5,16 +5,16 @@ package com.openai.services.blocking.finetuning.jobs import com.openai.core.ClientOptions import com.openai.core.RequestOptions import com.openai.core.checkRequired +import com.openai.core.handlers.errorBodyHandler import com.openai.core.handlers.errorHandler import com.openai.core.handlers.jsonHandler -import com.openai.core.handlers.withErrorHandler import com.openai.core.http.HttpMethod import com.openai.core.http.HttpRequest +import com.openai.core.http.HttpResponse import com.openai.core.http.HttpResponse.Handler import com.openai.core.http.HttpResponseFor import com.openai.core.http.parseable import com.openai.core.prepare -import com.openai.models.ErrorObject import com.openai.models.finetuning.jobs.checkpoints.CheckpointListPage import com.openai.models.finetuning.jobs.checkpoints.CheckpointListPageResponse 
import com.openai.models.finetuning.jobs.checkpoints.CheckpointListParams @@ -43,7 +43,8 @@ class CheckpointServiceImpl internal constructor(private val clientOptions: Clie class WithRawResponseImpl internal constructor(private val clientOptions: ClientOptions) : CheckpointService.WithRawResponse { - private val errorHandler: Handler = errorHandler(clientOptions.jsonMapper) + private val errorHandler: Handler = + errorHandler(errorBodyHandler(clientOptions.jsonMapper)) override fun withOptions( modifier: Consumer @@ -54,7 +55,6 @@ class CheckpointServiceImpl internal constructor(private val clientOptions: Clie private val listHandler: Handler = jsonHandler(clientOptions.jsonMapper) - .withErrorHandler(errorHandler) override fun list( params: CheckpointListParams, @@ -72,7 +72,7 @@ class CheckpointServiceImpl internal constructor(private val clientOptions: Clie .prepare(clientOptions, params, deploymentModel = null) val requestOptions = requestOptions.applyDefaults(RequestOptions.from(clientOptions)) val response = clientOptions.httpClient.execute(request, requestOptions) - return response.parseable { + return errorHandler.handle(response).parseable { response .use { listHandler.handle(it) } .also { diff --git a/openai-java-core/src/main/kotlin/com/openai/services/blocking/responses/InputItemServiceImpl.kt b/openai-java-core/src/main/kotlin/com/openai/services/blocking/responses/InputItemServiceImpl.kt index 5f11218e..643f4068 100644 --- a/openai-java-core/src/main/kotlin/com/openai/services/blocking/responses/InputItemServiceImpl.kt +++ b/openai-java-core/src/main/kotlin/com/openai/services/blocking/responses/InputItemServiceImpl.kt @@ -5,16 +5,16 @@ package com.openai.services.blocking.responses import com.openai.core.ClientOptions import com.openai.core.RequestOptions import com.openai.core.checkRequired +import com.openai.core.handlers.errorBodyHandler import com.openai.core.handlers.errorHandler import com.openai.core.handlers.jsonHandler -import com.openai.core.handlers.withErrorHandler import com.openai.core.http.HttpMethod import com.openai.core.http.HttpRequest +import com.openai.core.http.HttpResponse import com.openai.core.http.HttpResponse.Handler import com.openai.core.http.HttpResponseFor import com.openai.core.http.parseable import com.openai.core.prepare -import com.openai.models.ErrorObject import com.openai.models.responses.inputitems.InputItemListPage import com.openai.models.responses.inputitems.InputItemListParams import com.openai.models.responses.inputitems.ResponseItemList @@ -43,7 +43,8 @@ class InputItemServiceImpl internal constructor(private val clientOptions: Clien class WithRawResponseImpl internal constructor(private val clientOptions: ClientOptions) : InputItemService.WithRawResponse { - private val errorHandler: Handler = errorHandler(clientOptions.jsonMapper) + private val errorHandler: Handler = + errorHandler(errorBodyHandler(clientOptions.jsonMapper)) override fun withOptions( modifier: Consumer @@ -53,7 +54,7 @@ class InputItemServiceImpl internal constructor(private val clientOptions: Clien ) private val listHandler: Handler = - jsonHandler(clientOptions.jsonMapper).withErrorHandler(errorHandler) + jsonHandler(clientOptions.jsonMapper) override fun list( params: InputItemListParams, @@ -71,7 +72,7 @@ class InputItemServiceImpl internal constructor(private val clientOptions: Clien .prepare(clientOptions, params, deploymentModel = null) val requestOptions = requestOptions.applyDefaults(RequestOptions.from(clientOptions)) val response = 
clientOptions.httpClient.execute(request, requestOptions) - return response.parseable { + return errorHandler.handle(response).parseable { response .use { listHandler.handle(it) } .also { diff --git a/openai-java-core/src/main/kotlin/com/openai/services/blocking/uploads/PartServiceImpl.kt b/openai-java-core/src/main/kotlin/com/openai/services/blocking/uploads/PartServiceImpl.kt index 24d63a7d..0f2a9d28 100644 --- a/openai-java-core/src/main/kotlin/com/openai/services/blocking/uploads/PartServiceImpl.kt +++ b/openai-java-core/src/main/kotlin/com/openai/services/blocking/uploads/PartServiceImpl.kt @@ -5,17 +5,17 @@ package com.openai.services.blocking.uploads import com.openai.core.ClientOptions import com.openai.core.RequestOptions import com.openai.core.checkRequired +import com.openai.core.handlers.errorBodyHandler import com.openai.core.handlers.errorHandler import com.openai.core.handlers.jsonHandler -import com.openai.core.handlers.withErrorHandler import com.openai.core.http.HttpMethod import com.openai.core.http.HttpRequest +import com.openai.core.http.HttpResponse import com.openai.core.http.HttpResponse.Handler import com.openai.core.http.HttpResponseFor import com.openai.core.http.multipartFormData import com.openai.core.http.parseable import com.openai.core.prepare -import com.openai.models.ErrorObject import com.openai.models.uploads.parts.PartCreateParams import com.openai.models.uploads.parts.UploadPart import java.util.function.Consumer @@ -39,7 +39,8 @@ class PartServiceImpl internal constructor(private val clientOptions: ClientOpti class WithRawResponseImpl internal constructor(private val clientOptions: ClientOptions) : PartService.WithRawResponse { - private val errorHandler: Handler = errorHandler(clientOptions.jsonMapper) + private val errorHandler: Handler = + errorHandler(errorBodyHandler(clientOptions.jsonMapper)) override fun withOptions( modifier: Consumer @@ -49,7 +50,7 @@ class PartServiceImpl internal constructor(private val clientOptions: ClientOpti ) private val createHandler: Handler = - jsonHandler(clientOptions.jsonMapper).withErrorHandler(errorHandler) + jsonHandler(clientOptions.jsonMapper) override fun create( params: PartCreateParams, @@ -68,7 +69,7 @@ class PartServiceImpl internal constructor(private val clientOptions: ClientOpti .prepare(clientOptions, params, deploymentModel = null) val requestOptions = requestOptions.applyDefaults(RequestOptions.from(clientOptions)) val response = clientOptions.httpClient.execute(request, requestOptions) - return response.parseable { + return errorHandler.handle(response).parseable { response .use { createHandler.handle(it) } .also { diff --git a/openai-java-core/src/main/kotlin/com/openai/services/blocking/vectorstores/FileBatchServiceImpl.kt b/openai-java-core/src/main/kotlin/com/openai/services/blocking/vectorstores/FileBatchServiceImpl.kt index 69523faf..a86991b5 100644 --- a/openai-java-core/src/main/kotlin/com/openai/services/blocking/vectorstores/FileBatchServiceImpl.kt +++ b/openai-java-core/src/main/kotlin/com/openai/services/blocking/vectorstores/FileBatchServiceImpl.kt @@ -5,18 +5,18 @@ package com.openai.services.blocking.vectorstores import com.openai.core.ClientOptions import com.openai.core.RequestOptions import com.openai.core.checkRequired +import com.openai.core.handlers.errorBodyHandler import com.openai.core.handlers.errorHandler import com.openai.core.handlers.jsonHandler -import com.openai.core.handlers.withErrorHandler import com.openai.core.http.Headers import com.openai.core.http.HttpMethod 
import com.openai.core.http.HttpRequest +import com.openai.core.http.HttpResponse import com.openai.core.http.HttpResponse.Handler import com.openai.core.http.HttpResponseFor import com.openai.core.http.json import com.openai.core.http.parseable import com.openai.core.prepare -import com.openai.models.ErrorObject import com.openai.models.vectorstores.filebatches.FileBatchCancelParams import com.openai.models.vectorstores.filebatches.FileBatchCreateParams import com.openai.models.vectorstores.filebatches.FileBatchListFilesPage @@ -75,7 +75,8 @@ class FileBatchServiceImpl internal constructor(private val clientOptions: Clien class WithRawResponseImpl internal constructor(private val clientOptions: ClientOptions) : FileBatchService.WithRawResponse { - private val errorHandler: Handler = errorHandler(clientOptions.jsonMapper) + private val errorHandler: Handler = + errorHandler(errorBodyHandler(clientOptions.jsonMapper)) override fun withOptions( modifier: Consumer @@ -86,7 +87,6 @@ class FileBatchServiceImpl internal constructor(private val clientOptions: Clien private val createHandler: Handler = jsonHandler(clientOptions.jsonMapper) - .withErrorHandler(errorHandler) override fun create( params: FileBatchCreateParams, @@ -106,7 +106,7 @@ class FileBatchServiceImpl internal constructor(private val clientOptions: Clien .prepare(clientOptions, params, deploymentModel = null) val requestOptions = requestOptions.applyDefaults(RequestOptions.from(clientOptions)) val response = clientOptions.httpClient.execute(request, requestOptions) - return response.parseable { + return errorHandler.handle(response).parseable { response .use { createHandler.handle(it) } .also { @@ -119,7 +119,6 @@ class FileBatchServiceImpl internal constructor(private val clientOptions: Clien private val retrieveHandler: Handler = jsonHandler(clientOptions.jsonMapper) - .withErrorHandler(errorHandler) override fun retrieve( params: FileBatchRetrieveParams, @@ -143,7 +142,7 @@ class FileBatchServiceImpl internal constructor(private val clientOptions: Clien .prepare(clientOptions, params, deploymentModel = null) val requestOptions = requestOptions.applyDefaults(RequestOptions.from(clientOptions)) val response = clientOptions.httpClient.execute(request, requestOptions) - return response.parseable { + return errorHandler.handle(response).parseable { response .use { retrieveHandler.handle(it) } .also { @@ -156,7 +155,6 @@ class FileBatchServiceImpl internal constructor(private val clientOptions: Clien private val cancelHandler: Handler = jsonHandler(clientOptions.jsonMapper) - .withErrorHandler(errorHandler) override fun cancel( params: FileBatchCancelParams, @@ -182,7 +180,7 @@ class FileBatchServiceImpl internal constructor(private val clientOptions: Clien .prepare(clientOptions, params, deploymentModel = null) val requestOptions = requestOptions.applyDefaults(RequestOptions.from(clientOptions)) val response = clientOptions.httpClient.execute(request, requestOptions) - return response.parseable { + return errorHandler.handle(response).parseable { response .use { cancelHandler.handle(it) } .also { @@ -195,7 +193,6 @@ class FileBatchServiceImpl internal constructor(private val clientOptions: Clien private val listFilesHandler: Handler = jsonHandler(clientOptions.jsonMapper) - .withErrorHandler(errorHandler) override fun listFiles( params: FileBatchListFilesParams, @@ -220,7 +217,7 @@ class FileBatchServiceImpl internal constructor(private val clientOptions: Clien .prepare(clientOptions, params, deploymentModel = null) val 
requestOptions = requestOptions.applyDefaults(RequestOptions.from(clientOptions)) val response = clientOptions.httpClient.execute(request, requestOptions) - return response.parseable { + return errorHandler.handle(response).parseable { response .use { listFilesHandler.handle(it) } .also { diff --git a/openai-java-core/src/main/kotlin/com/openai/services/blocking/vectorstores/FileServiceImpl.kt b/openai-java-core/src/main/kotlin/com/openai/services/blocking/vectorstores/FileServiceImpl.kt index 5f7b6b54..783ecbe9 100644 --- a/openai-java-core/src/main/kotlin/com/openai/services/blocking/vectorstores/FileServiceImpl.kt +++ b/openai-java-core/src/main/kotlin/com/openai/services/blocking/vectorstores/FileServiceImpl.kt @@ -5,18 +5,18 @@ package com.openai.services.blocking.vectorstores import com.openai.core.ClientOptions import com.openai.core.RequestOptions import com.openai.core.checkRequired +import com.openai.core.handlers.errorBodyHandler import com.openai.core.handlers.errorHandler import com.openai.core.handlers.jsonHandler -import com.openai.core.handlers.withErrorHandler import com.openai.core.http.Headers import com.openai.core.http.HttpMethod import com.openai.core.http.HttpRequest +import com.openai.core.http.HttpResponse import com.openai.core.http.HttpResponse.Handler import com.openai.core.http.HttpResponseFor import com.openai.core.http.json import com.openai.core.http.parseable import com.openai.core.prepare -import com.openai.models.ErrorObject import com.openai.models.vectorstores.files.FileContentPage import com.openai.models.vectorstores.files.FileContentPageResponse import com.openai.models.vectorstores.files.FileContentParams @@ -84,7 +84,8 @@ class FileServiceImpl internal constructor(private val clientOptions: ClientOpti class WithRawResponseImpl internal constructor(private val clientOptions: ClientOptions) : FileService.WithRawResponse { - private val errorHandler: Handler = errorHandler(clientOptions.jsonMapper) + private val errorHandler: Handler = + errorHandler(errorBodyHandler(clientOptions.jsonMapper)) override fun withOptions( modifier: Consumer @@ -94,7 +95,7 @@ class FileServiceImpl internal constructor(private val clientOptions: ClientOpti ) private val createHandler: Handler = - jsonHandler(clientOptions.jsonMapper).withErrorHandler(errorHandler) + jsonHandler(clientOptions.jsonMapper) override fun create( params: FileCreateParams, @@ -114,7 +115,7 @@ class FileServiceImpl internal constructor(private val clientOptions: ClientOpti .prepare(clientOptions, params, deploymentModel = null) val requestOptions = requestOptions.applyDefaults(RequestOptions.from(clientOptions)) val response = clientOptions.httpClient.execute(request, requestOptions) - return response.parseable { + return errorHandler.handle(response).parseable { response .use { createHandler.handle(it) } .also { @@ -126,7 +127,7 @@ class FileServiceImpl internal constructor(private val clientOptions: ClientOpti } private val retrieveHandler: Handler = - jsonHandler(clientOptions.jsonMapper).withErrorHandler(errorHandler) + jsonHandler(clientOptions.jsonMapper) override fun retrieve( params: FileRetrieveParams, @@ -150,7 +151,7 @@ class FileServiceImpl internal constructor(private val clientOptions: ClientOpti .prepare(clientOptions, params, deploymentModel = null) val requestOptions = requestOptions.applyDefaults(RequestOptions.from(clientOptions)) val response = clientOptions.httpClient.execute(request, requestOptions) - return response.parseable { + return 
errorHandler.handle(response).parseable { response .use { retrieveHandler.handle(it) } .also { @@ -162,7 +163,7 @@ class FileServiceImpl internal constructor(private val clientOptions: ClientOpti } private val updateHandler: Handler = - jsonHandler(clientOptions.jsonMapper).withErrorHandler(errorHandler) + jsonHandler(clientOptions.jsonMapper) override fun update( params: FileUpdateParams, @@ -187,7 +188,7 @@ class FileServiceImpl internal constructor(private val clientOptions: ClientOpti .prepare(clientOptions, params, deploymentModel = null) val requestOptions = requestOptions.applyDefaults(RequestOptions.from(clientOptions)) val response = clientOptions.httpClient.execute(request, requestOptions) - return response.parseable { + return errorHandler.handle(response).parseable { response .use { updateHandler.handle(it) } .also { @@ -200,7 +201,6 @@ class FileServiceImpl internal constructor(private val clientOptions: ClientOpti private val listHandler: Handler = jsonHandler(clientOptions.jsonMapper) - .withErrorHandler(errorHandler) override fun list( params: FileListParams, @@ -219,7 +219,7 @@ class FileServiceImpl internal constructor(private val clientOptions: ClientOpti .prepare(clientOptions, params, deploymentModel = null) val requestOptions = requestOptions.applyDefaults(RequestOptions.from(clientOptions)) val response = clientOptions.httpClient.execute(request, requestOptions) - return response.parseable { + return errorHandler.handle(response).parseable { response .use { listHandler.handle(it) } .also { @@ -239,7 +239,6 @@ class FileServiceImpl internal constructor(private val clientOptions: ClientOpti private val deleteHandler: Handler = jsonHandler(clientOptions.jsonMapper) - .withErrorHandler(errorHandler) override fun delete( params: FileDeleteParams, @@ -264,7 +263,7 @@ class FileServiceImpl internal constructor(private val clientOptions: ClientOpti .prepare(clientOptions, params, deploymentModel = null) val requestOptions = requestOptions.applyDefaults(RequestOptions.from(clientOptions)) val response = clientOptions.httpClient.execute(request, requestOptions) - return response.parseable { + return errorHandler.handle(response).parseable { response .use { deleteHandler.handle(it) } .also { @@ -277,7 +276,6 @@ class FileServiceImpl internal constructor(private val clientOptions: ClientOpti private val contentHandler: Handler = jsonHandler(clientOptions.jsonMapper) - .withErrorHandler(errorHandler) override fun content( params: FileContentParams, @@ -302,7 +300,7 @@ class FileServiceImpl internal constructor(private val clientOptions: ClientOpti .prepare(clientOptions, params, deploymentModel = null) val requestOptions = requestOptions.applyDefaults(RequestOptions.from(clientOptions)) val response = clientOptions.httpClient.execute(request, requestOptions) - return response.parseable { + return errorHandler.handle(response).parseable { response .use { contentHandler.handle(it) } .also { diff --git a/openai-java-core/src/test/kotlin/com/openai/services/ErrorHandlingTest.kt b/openai-java-core/src/test/kotlin/com/openai/services/ErrorHandlingTest.kt index f19d5a98..7be10767 100644 --- a/openai-java-core/src/test/kotlin/com/openai/services/ErrorHandlingTest.kt +++ b/openai-java-core/src/test/kotlin/com/openai/services/ErrorHandlingTest.kt @@ -192,18 +192,830 @@ internal class ErrorHandlingTest { ) } + @Test + fun jobsCreate400WithRawResponse() { + val jobService = client.fineTuning().jobs().withRawResponse() + stubFor( + post(anyUrl()) + .willReturn( + 
status(400).withHeader(HEADER_NAME, HEADER_VALUE).withBody(ERROR_JSON_BYTES) + ) + ) + + val e = + assertThrows { + jobService.create( + JobCreateParams.builder() + .model(JobCreateParams.Model.BABBAGE_002) + .trainingFile("file-abc123") + .hyperparameters( + JobCreateParams.Hyperparameters.builder() + .batchSizeAuto() + .learningRateMultiplierAuto() + .nEpochsAuto() + .build() + ) + .addIntegration( + JobCreateParams.Integration.builder() + .wandb( + JobCreateParams.Integration.Wandb.builder() + .project("my-wandb-project") + .entity("entity") + .name("name") + .addTag("custom-tag") + .build() + ) + .build() + ) + .metadata( + JobCreateParams.Metadata.builder() + .putAdditionalProperty("foo", JsonValue.from("string")) + .build() + ) + .method( + JobCreateParams.Method.builder() + .type(JobCreateParams.Method.Type.SUPERVISED) + .dpo( + DpoMethod.builder() + .hyperparameters( + DpoHyperparameters.builder() + .batchSizeAuto() + .betaAuto() + .learningRateMultiplierAuto() + .nEpochsAuto() + .build() + ) + .build() + ) + .reinforcement( + ReinforcementMethod.builder() + .grader( + StringCheckGrader.builder() + .input("input") + .name("name") + .operation(StringCheckGrader.Operation.EQ) + .reference("reference") + .build() + ) + .hyperparameters( + ReinforcementHyperparameters.builder() + .batchSizeAuto() + .computeMultiplierAuto() + .evalIntervalAuto() + .evalSamplesAuto() + .learningRateMultiplierAuto() + .nEpochsAuto() + .reasoningEffort( + ReinforcementHyperparameters.ReasoningEffort + .DEFAULT + ) + .build() + ) + .build() + ) + .supervised( + SupervisedMethod.builder() + .hyperparameters( + SupervisedHyperparameters.builder() + .batchSizeAuto() + .learningRateMultiplierAuto() + .nEpochsAuto() + .build() + ) + .build() + ) + .build() + ) + .seed(42L) + .suffix("x") + .validationFile("file-abc123") + .build() + ) + } + + assertThat(e.statusCode()).isEqualTo(400) + assertThat(e.headers().toMap()).contains(entry(HEADER_NAME, listOf(HEADER_VALUE))) + assertThat(e.body()) + .isEqualTo( + JsonValue.from( + mapOf( + "code" to "code", + "message" to "message", + "param" to "param", + "type" to "type", + ) + ) + ) + } + @Test fun jobsCreate401() { val jobService = client.fineTuning().jobs() stubFor( post(anyUrl()) .willReturn( - status(401).withHeader(HEADER_NAME, HEADER_VALUE).withBody(ERROR_JSON_BYTES) + status(401).withHeader(HEADER_NAME, HEADER_VALUE).withBody(ERROR_JSON_BYTES) + ) + ) + + val e = + assertThrows { + jobService.create( + JobCreateParams.builder() + .model(JobCreateParams.Model.BABBAGE_002) + .trainingFile("file-abc123") + .hyperparameters( + JobCreateParams.Hyperparameters.builder() + .batchSizeAuto() + .learningRateMultiplierAuto() + .nEpochsAuto() + .build() + ) + .addIntegration( + JobCreateParams.Integration.builder() + .wandb( + JobCreateParams.Integration.Wandb.builder() + .project("my-wandb-project") + .entity("entity") + .name("name") + .addTag("custom-tag") + .build() + ) + .build() + ) + .metadata( + JobCreateParams.Metadata.builder() + .putAdditionalProperty("foo", JsonValue.from("string")) + .build() + ) + .method( + JobCreateParams.Method.builder() + .type(JobCreateParams.Method.Type.SUPERVISED) + .dpo( + DpoMethod.builder() + .hyperparameters( + DpoHyperparameters.builder() + .batchSizeAuto() + .betaAuto() + .learningRateMultiplierAuto() + .nEpochsAuto() + .build() + ) + .build() + ) + .reinforcement( + ReinforcementMethod.builder() + .grader( + StringCheckGrader.builder() + .input("input") + .name("name") + .operation(StringCheckGrader.Operation.EQ) + 
.reference("reference") + .build() + ) + .hyperparameters( + ReinforcementHyperparameters.builder() + .batchSizeAuto() + .computeMultiplierAuto() + .evalIntervalAuto() + .evalSamplesAuto() + .learningRateMultiplierAuto() + .nEpochsAuto() + .reasoningEffort( + ReinforcementHyperparameters.ReasoningEffort + .DEFAULT + ) + .build() + ) + .build() + ) + .supervised( + SupervisedMethod.builder() + .hyperparameters( + SupervisedHyperparameters.builder() + .batchSizeAuto() + .learningRateMultiplierAuto() + .nEpochsAuto() + .build() + ) + .build() + ) + .build() + ) + .seed(42L) + .suffix("x") + .validationFile("file-abc123") + .build() + ) + } + + assertThat(e.statusCode()).isEqualTo(401) + assertThat(e.headers().toMap()).contains(entry(HEADER_NAME, listOf(HEADER_VALUE))) + assertThat(e.body()) + .isEqualTo( + JsonValue.from( + mapOf( + "code" to "code", + "message" to "message", + "param" to "param", + "type" to "type", + ) + ) + ) + } + + @Test + fun jobsCreate401WithRawResponse() { + val jobService = client.fineTuning().jobs().withRawResponse() + stubFor( + post(anyUrl()) + .willReturn( + status(401).withHeader(HEADER_NAME, HEADER_VALUE).withBody(ERROR_JSON_BYTES) + ) + ) + + val e = + assertThrows { + jobService.create( + JobCreateParams.builder() + .model(JobCreateParams.Model.BABBAGE_002) + .trainingFile("file-abc123") + .hyperparameters( + JobCreateParams.Hyperparameters.builder() + .batchSizeAuto() + .learningRateMultiplierAuto() + .nEpochsAuto() + .build() + ) + .addIntegration( + JobCreateParams.Integration.builder() + .wandb( + JobCreateParams.Integration.Wandb.builder() + .project("my-wandb-project") + .entity("entity") + .name("name") + .addTag("custom-tag") + .build() + ) + .build() + ) + .metadata( + JobCreateParams.Metadata.builder() + .putAdditionalProperty("foo", JsonValue.from("string")) + .build() + ) + .method( + JobCreateParams.Method.builder() + .type(JobCreateParams.Method.Type.SUPERVISED) + .dpo( + DpoMethod.builder() + .hyperparameters( + DpoHyperparameters.builder() + .batchSizeAuto() + .betaAuto() + .learningRateMultiplierAuto() + .nEpochsAuto() + .build() + ) + .build() + ) + .reinforcement( + ReinforcementMethod.builder() + .grader( + StringCheckGrader.builder() + .input("input") + .name("name") + .operation(StringCheckGrader.Operation.EQ) + .reference("reference") + .build() + ) + .hyperparameters( + ReinforcementHyperparameters.builder() + .batchSizeAuto() + .computeMultiplierAuto() + .evalIntervalAuto() + .evalSamplesAuto() + .learningRateMultiplierAuto() + .nEpochsAuto() + .reasoningEffort( + ReinforcementHyperparameters.ReasoningEffort + .DEFAULT + ) + .build() + ) + .build() + ) + .supervised( + SupervisedMethod.builder() + .hyperparameters( + SupervisedHyperparameters.builder() + .batchSizeAuto() + .learningRateMultiplierAuto() + .nEpochsAuto() + .build() + ) + .build() + ) + .build() + ) + .seed(42L) + .suffix("x") + .validationFile("file-abc123") + .build() + ) + } + + assertThat(e.statusCode()).isEqualTo(401) + assertThat(e.headers().toMap()).contains(entry(HEADER_NAME, listOf(HEADER_VALUE))) + assertThat(e.body()) + .isEqualTo( + JsonValue.from( + mapOf( + "code" to "code", + "message" to "message", + "param" to "param", + "type" to "type", + ) + ) + ) + } + + @Test + fun jobsCreate403() { + val jobService = client.fineTuning().jobs() + stubFor( + post(anyUrl()) + .willReturn( + status(403).withHeader(HEADER_NAME, HEADER_VALUE).withBody(ERROR_JSON_BYTES) + ) + ) + + val e = + assertThrows { + jobService.create( + JobCreateParams.builder() + 
.model(JobCreateParams.Model.BABBAGE_002) + .trainingFile("file-abc123") + .hyperparameters( + JobCreateParams.Hyperparameters.builder() + .batchSizeAuto() + .learningRateMultiplierAuto() + .nEpochsAuto() + .build() + ) + .addIntegration( + JobCreateParams.Integration.builder() + .wandb( + JobCreateParams.Integration.Wandb.builder() + .project("my-wandb-project") + .entity("entity") + .name("name") + .addTag("custom-tag") + .build() + ) + .build() + ) + .metadata( + JobCreateParams.Metadata.builder() + .putAdditionalProperty("foo", JsonValue.from("string")) + .build() + ) + .method( + JobCreateParams.Method.builder() + .type(JobCreateParams.Method.Type.SUPERVISED) + .dpo( + DpoMethod.builder() + .hyperparameters( + DpoHyperparameters.builder() + .batchSizeAuto() + .betaAuto() + .learningRateMultiplierAuto() + .nEpochsAuto() + .build() + ) + .build() + ) + .reinforcement( + ReinforcementMethod.builder() + .grader( + StringCheckGrader.builder() + .input("input") + .name("name") + .operation(StringCheckGrader.Operation.EQ) + .reference("reference") + .build() + ) + .hyperparameters( + ReinforcementHyperparameters.builder() + .batchSizeAuto() + .computeMultiplierAuto() + .evalIntervalAuto() + .evalSamplesAuto() + .learningRateMultiplierAuto() + .nEpochsAuto() + .reasoningEffort( + ReinforcementHyperparameters.ReasoningEffort + .DEFAULT + ) + .build() + ) + .build() + ) + .supervised( + SupervisedMethod.builder() + .hyperparameters( + SupervisedHyperparameters.builder() + .batchSizeAuto() + .learningRateMultiplierAuto() + .nEpochsAuto() + .build() + ) + .build() + ) + .build() + ) + .seed(42L) + .suffix("x") + .validationFile("file-abc123") + .build() + ) + } + + assertThat(e.statusCode()).isEqualTo(403) + assertThat(e.headers().toMap()).contains(entry(HEADER_NAME, listOf(HEADER_VALUE))) + assertThat(e.body()) + .isEqualTo( + JsonValue.from( + mapOf( + "code" to "code", + "message" to "message", + "param" to "param", + "type" to "type", + ) + ) + ) + } + + @Test + fun jobsCreate403WithRawResponse() { + val jobService = client.fineTuning().jobs().withRawResponse() + stubFor( + post(anyUrl()) + .willReturn( + status(403).withHeader(HEADER_NAME, HEADER_VALUE).withBody(ERROR_JSON_BYTES) + ) + ) + + val e = + assertThrows { + jobService.create( + JobCreateParams.builder() + .model(JobCreateParams.Model.BABBAGE_002) + .trainingFile("file-abc123") + .hyperparameters( + JobCreateParams.Hyperparameters.builder() + .batchSizeAuto() + .learningRateMultiplierAuto() + .nEpochsAuto() + .build() + ) + .addIntegration( + JobCreateParams.Integration.builder() + .wandb( + JobCreateParams.Integration.Wandb.builder() + .project("my-wandb-project") + .entity("entity") + .name("name") + .addTag("custom-tag") + .build() + ) + .build() + ) + .metadata( + JobCreateParams.Metadata.builder() + .putAdditionalProperty("foo", JsonValue.from("string")) + .build() + ) + .method( + JobCreateParams.Method.builder() + .type(JobCreateParams.Method.Type.SUPERVISED) + .dpo( + DpoMethod.builder() + .hyperparameters( + DpoHyperparameters.builder() + .batchSizeAuto() + .betaAuto() + .learningRateMultiplierAuto() + .nEpochsAuto() + .build() + ) + .build() + ) + .reinforcement( + ReinforcementMethod.builder() + .grader( + StringCheckGrader.builder() + .input("input") + .name("name") + .operation(StringCheckGrader.Operation.EQ) + .reference("reference") + .build() + ) + .hyperparameters( + ReinforcementHyperparameters.builder() + .batchSizeAuto() + .computeMultiplierAuto() + .evalIntervalAuto() + .evalSamplesAuto() + 
.learningRateMultiplierAuto() + .nEpochsAuto() + .reasoningEffort( + ReinforcementHyperparameters.ReasoningEffort + .DEFAULT + ) + .build() + ) + .build() + ) + .supervised( + SupervisedMethod.builder() + .hyperparameters( + SupervisedHyperparameters.builder() + .batchSizeAuto() + .learningRateMultiplierAuto() + .nEpochsAuto() + .build() + ) + .build() + ) + .build() + ) + .seed(42L) + .suffix("x") + .validationFile("file-abc123") + .build() + ) + } + + assertThat(e.statusCode()).isEqualTo(403) + assertThat(e.headers().toMap()).contains(entry(HEADER_NAME, listOf(HEADER_VALUE))) + assertThat(e.body()) + .isEqualTo( + JsonValue.from( + mapOf( + "code" to "code", + "message" to "message", + "param" to "param", + "type" to "type", + ) + ) + ) + } + + @Test + fun jobsCreate404() { + val jobService = client.fineTuning().jobs() + stubFor( + post(anyUrl()) + .willReturn( + status(404).withHeader(HEADER_NAME, HEADER_VALUE).withBody(ERROR_JSON_BYTES) + ) + ) + + val e = + assertThrows { + jobService.create( + JobCreateParams.builder() + .model(JobCreateParams.Model.BABBAGE_002) + .trainingFile("file-abc123") + .hyperparameters( + JobCreateParams.Hyperparameters.builder() + .batchSizeAuto() + .learningRateMultiplierAuto() + .nEpochsAuto() + .build() + ) + .addIntegration( + JobCreateParams.Integration.builder() + .wandb( + JobCreateParams.Integration.Wandb.builder() + .project("my-wandb-project") + .entity("entity") + .name("name") + .addTag("custom-tag") + .build() + ) + .build() + ) + .metadata( + JobCreateParams.Metadata.builder() + .putAdditionalProperty("foo", JsonValue.from("string")) + .build() + ) + .method( + JobCreateParams.Method.builder() + .type(JobCreateParams.Method.Type.SUPERVISED) + .dpo( + DpoMethod.builder() + .hyperparameters( + DpoHyperparameters.builder() + .batchSizeAuto() + .betaAuto() + .learningRateMultiplierAuto() + .nEpochsAuto() + .build() + ) + .build() + ) + .reinforcement( + ReinforcementMethod.builder() + .grader( + StringCheckGrader.builder() + .input("input") + .name("name") + .operation(StringCheckGrader.Operation.EQ) + .reference("reference") + .build() + ) + .hyperparameters( + ReinforcementHyperparameters.builder() + .batchSizeAuto() + .computeMultiplierAuto() + .evalIntervalAuto() + .evalSamplesAuto() + .learningRateMultiplierAuto() + .nEpochsAuto() + .reasoningEffort( + ReinforcementHyperparameters.ReasoningEffort + .DEFAULT + ) + .build() + ) + .build() + ) + .supervised( + SupervisedMethod.builder() + .hyperparameters( + SupervisedHyperparameters.builder() + .batchSizeAuto() + .learningRateMultiplierAuto() + .nEpochsAuto() + .build() + ) + .build() + ) + .build() + ) + .seed(42L) + .suffix("x") + .validationFile("file-abc123") + .build() + ) + } + + assertThat(e.statusCode()).isEqualTo(404) + assertThat(e.headers().toMap()).contains(entry(HEADER_NAME, listOf(HEADER_VALUE))) + assertThat(e.body()) + .isEqualTo( + JsonValue.from( + mapOf( + "code" to "code", + "message" to "message", + "param" to "param", + "type" to "type", + ) + ) + ) + } + + @Test + fun jobsCreate404WithRawResponse() { + val jobService = client.fineTuning().jobs().withRawResponse() + stubFor( + post(anyUrl()) + .willReturn( + status(404).withHeader(HEADER_NAME, HEADER_VALUE).withBody(ERROR_JSON_BYTES) + ) + ) + + val e = + assertThrows { + jobService.create( + JobCreateParams.builder() + .model(JobCreateParams.Model.BABBAGE_002) + .trainingFile("file-abc123") + .hyperparameters( + JobCreateParams.Hyperparameters.builder() + .batchSizeAuto() + .learningRateMultiplierAuto() + .nEpochsAuto() + 
.build() + ) + .addIntegration( + JobCreateParams.Integration.builder() + .wandb( + JobCreateParams.Integration.Wandb.builder() + .project("my-wandb-project") + .entity("entity") + .name("name") + .addTag("custom-tag") + .build() + ) + .build() + ) + .metadata( + JobCreateParams.Metadata.builder() + .putAdditionalProperty("foo", JsonValue.from("string")) + .build() + ) + .method( + JobCreateParams.Method.builder() + .type(JobCreateParams.Method.Type.SUPERVISED) + .dpo( + DpoMethod.builder() + .hyperparameters( + DpoHyperparameters.builder() + .batchSizeAuto() + .betaAuto() + .learningRateMultiplierAuto() + .nEpochsAuto() + .build() + ) + .build() + ) + .reinforcement( + ReinforcementMethod.builder() + .grader( + StringCheckGrader.builder() + .input("input") + .name("name") + .operation(StringCheckGrader.Operation.EQ) + .reference("reference") + .build() + ) + .hyperparameters( + ReinforcementHyperparameters.builder() + .batchSizeAuto() + .computeMultiplierAuto() + .evalIntervalAuto() + .evalSamplesAuto() + .learningRateMultiplierAuto() + .nEpochsAuto() + .reasoningEffort( + ReinforcementHyperparameters.ReasoningEffort + .DEFAULT + ) + .build() + ) + .build() + ) + .supervised( + SupervisedMethod.builder() + .hyperparameters( + SupervisedHyperparameters.builder() + .batchSizeAuto() + .learningRateMultiplierAuto() + .nEpochsAuto() + .build() + ) + .build() + ) + .build() + ) + .seed(42L) + .suffix("x") + .validationFile("file-abc123") + .build() + ) + } + + assertThat(e.statusCode()).isEqualTo(404) + assertThat(e.headers().toMap()).contains(entry(HEADER_NAME, listOf(HEADER_VALUE))) + assertThat(e.body()) + .isEqualTo( + JsonValue.from( + mapOf( + "code" to "code", + "message" to "message", + "param" to "param", + "type" to "type", + ) + ) + ) + } + + @Test + fun jobsCreate422() { + val jobService = client.fineTuning().jobs() + stubFor( + post(anyUrl()) + .willReturn( + status(422).withHeader(HEADER_NAME, HEADER_VALUE).withBody(ERROR_JSON_BYTES) ) ) val e = - assertThrows { + assertThrows { jobService.create( JobCreateParams.builder() .model(JobCreateParams.Model.BABBAGE_002) @@ -293,7 +1105,7 @@ internal class ErrorHandlingTest { ) } - assertThat(e.statusCode()).isEqualTo(401) + assertThat(e.statusCode()).isEqualTo(422) assertThat(e.headers().toMap()).contains(entry(HEADER_NAME, listOf(HEADER_VALUE))) assertThat(e.body()) .isEqualTo( @@ -309,17 +1121,17 @@ internal class ErrorHandlingTest { } @Test - fun jobsCreate403() { - val jobService = client.fineTuning().jobs() + fun jobsCreate422WithRawResponse() { + val jobService = client.fineTuning().jobs().withRawResponse() stubFor( post(anyUrl()) .willReturn( - status(403).withHeader(HEADER_NAME, HEADER_VALUE).withBody(ERROR_JSON_BYTES) + status(422).withHeader(HEADER_NAME, HEADER_VALUE).withBody(ERROR_JSON_BYTES) ) ) val e = - assertThrows { + assertThrows { jobService.create( JobCreateParams.builder() .model(JobCreateParams.Model.BABBAGE_002) @@ -409,7 +1221,7 @@ internal class ErrorHandlingTest { ) } - assertThat(e.statusCode()).isEqualTo(403) + assertThat(e.statusCode()).isEqualTo(422) assertThat(e.headers().toMap()).contains(entry(HEADER_NAME, listOf(HEADER_VALUE))) assertThat(e.body()) .isEqualTo( @@ -425,17 +1237,17 @@ internal class ErrorHandlingTest { } @Test - fun jobsCreate404() { + fun jobsCreate429() { val jobService = client.fineTuning().jobs() stubFor( post(anyUrl()) .willReturn( - status(404).withHeader(HEADER_NAME, HEADER_VALUE).withBody(ERROR_JSON_BYTES) + status(429).withHeader(HEADER_NAME, 
HEADER_VALUE).withBody(ERROR_JSON_BYTES) ) ) val e = - assertThrows { + assertThrows { jobService.create( JobCreateParams.builder() .model(JobCreateParams.Model.BABBAGE_002) @@ -525,7 +1337,7 @@ internal class ErrorHandlingTest { ) } - assertThat(e.statusCode()).isEqualTo(404) + assertThat(e.statusCode()).isEqualTo(429) assertThat(e.headers().toMap()).contains(entry(HEADER_NAME, listOf(HEADER_VALUE))) assertThat(e.body()) .isEqualTo( @@ -541,17 +1353,17 @@ internal class ErrorHandlingTest { } @Test - fun jobsCreate422() { - val jobService = client.fineTuning().jobs() + fun jobsCreate429WithRawResponse() { + val jobService = client.fineTuning().jobs().withRawResponse() stubFor( post(anyUrl()) .willReturn( - status(422).withHeader(HEADER_NAME, HEADER_VALUE).withBody(ERROR_JSON_BYTES) + status(429).withHeader(HEADER_NAME, HEADER_VALUE).withBody(ERROR_JSON_BYTES) ) ) val e = - assertThrows { + assertThrows { jobService.create( JobCreateParams.builder() .model(JobCreateParams.Model.BABBAGE_002) @@ -641,7 +1453,7 @@ internal class ErrorHandlingTest { ) } - assertThat(e.statusCode()).isEqualTo(422) + assertThat(e.statusCode()).isEqualTo(429) assertThat(e.headers().toMap()).contains(entry(HEADER_NAME, listOf(HEADER_VALUE))) assertThat(e.body()) .isEqualTo( @@ -657,17 +1469,17 @@ internal class ErrorHandlingTest { } @Test - fun jobsCreate429() { + fun jobsCreate500() { val jobService = client.fineTuning().jobs() stubFor( post(anyUrl()) .willReturn( - status(429).withHeader(HEADER_NAME, HEADER_VALUE).withBody(ERROR_JSON_BYTES) + status(500).withHeader(HEADER_NAME, HEADER_VALUE).withBody(ERROR_JSON_BYTES) ) ) val e = - assertThrows { + assertThrows { jobService.create( JobCreateParams.builder() .model(JobCreateParams.Model.BABBAGE_002) @@ -757,7 +1569,7 @@ internal class ErrorHandlingTest { ) } - assertThat(e.statusCode()).isEqualTo(429) + assertThat(e.statusCode()).isEqualTo(500) assertThat(e.headers().toMap()).contains(entry(HEADER_NAME, listOf(HEADER_VALUE))) assertThat(e.body()) .isEqualTo( @@ -773,8 +1585,8 @@ internal class ErrorHandlingTest { } @Test - fun jobsCreate500() { - val jobService = client.fineTuning().jobs() + fun jobsCreate500WithRawResponse() { + val jobService = client.fineTuning().jobs().withRawResponse() stubFor( post(anyUrl()) .willReturn( @@ -1004,6 +1816,122 @@ internal class ErrorHandlingTest { ) } + @Test + fun jobsCreate999WithRawResponse() { + val jobService = client.fineTuning().jobs().withRawResponse() + stubFor( + post(anyUrl()) + .willReturn( + status(999).withHeader(HEADER_NAME, HEADER_VALUE).withBody(ERROR_JSON_BYTES) + ) + ) + + val e = + assertThrows { + jobService.create( + JobCreateParams.builder() + .model(JobCreateParams.Model.BABBAGE_002) + .trainingFile("file-abc123") + .hyperparameters( + JobCreateParams.Hyperparameters.builder() + .batchSizeAuto() + .learningRateMultiplierAuto() + .nEpochsAuto() + .build() + ) + .addIntegration( + JobCreateParams.Integration.builder() + .wandb( + JobCreateParams.Integration.Wandb.builder() + .project("my-wandb-project") + .entity("entity") + .name("name") + .addTag("custom-tag") + .build() + ) + .build() + ) + .metadata( + JobCreateParams.Metadata.builder() + .putAdditionalProperty("foo", JsonValue.from("string")) + .build() + ) + .method( + JobCreateParams.Method.builder() + .type(JobCreateParams.Method.Type.SUPERVISED) + .dpo( + DpoMethod.builder() + .hyperparameters( + DpoHyperparameters.builder() + .batchSizeAuto() + .betaAuto() + .learningRateMultiplierAuto() + .nEpochsAuto() + .build() + ) + .build() + ) + 
.reinforcement( + ReinforcementMethod.builder() + .grader( + StringCheckGrader.builder() + .input("input") + .name("name") + .operation(StringCheckGrader.Operation.EQ) + .reference("reference") + .build() + ) + .hyperparameters( + ReinforcementHyperparameters.builder() + .batchSizeAuto() + .computeMultiplierAuto() + .evalIntervalAuto() + .evalSamplesAuto() + .learningRateMultiplierAuto() + .nEpochsAuto() + .reasoningEffort( + ReinforcementHyperparameters.ReasoningEffort + .DEFAULT + ) + .build() + ) + .build() + ) + .supervised( + SupervisedMethod.builder() + .hyperparameters( + SupervisedHyperparameters.builder() + .batchSizeAuto() + .learningRateMultiplierAuto() + .nEpochsAuto() + .build() + ) + .build() + ) + .build() + ) + .seed(42L) + .suffix("x") + .validationFile("file-abc123") + .build() + ) + } + + assertThat(e.statusCode()).isEqualTo(999) + assertThat(e.headers().toMap()).contains(entry(HEADER_NAME, listOf(HEADER_VALUE))) + assertThat(e.body()) + .isEqualTo( + JsonValue.from( + mapOf( + "code" to "code", + "message" to "message", + "param" to "param", + "type" to "type", + ) + ) + ) + } + @Test fun jobsCreateInvalidJsonBody() { val jobService = client.fineTuning().jobs() From 77f54fdea8bf0a609f90ec511977531bffc1a9b1 Mon Sep 17 00:00:00 2001 From: D Gardner Date: Thu, 17 Jul 2025 21:57:58 +0100 Subject: [PATCH 11/18] feat(client): add `ResponseAccumulator` (#391) * response-accumulator: first draft for review. * response-accumulator: added accumulator and documented streaming support. --- README.md | 62 +++- .../com/openai/helpers/ResponseAccumulator.kt | 325 ++++++++++++++++++ .../openai/helpers/ResponseAccumulatorTest.kt | 166 +++++++++ ...nsesStructuredOutputsStreamingExample.java | 87 +++++ 4 files changed, 635 insertions(+), 5 deletions(-) create mode 100644 openai-java-core/src/main/kotlin/com/openai/helpers/ResponseAccumulator.kt create mode 100644 openai-java-core/src/test/kotlin/com/openai/helpers/ResponseAccumulatorTest.kt create mode 100644 openai-java-example/src/main/java/com/openai/example/ResponsesStructuredOutputsStreamingExample.java diff --git a/README.md b/README.md index 67ea87d5..90f31102 100644 --- a/README.md +++ b/README.md @@ -350,6 +350,53 @@ client.chat() ChatCompletion chatCompletion = chatCompletionAccumulator.chatCompletion(); ``` +The SDK provides conveniences for streamed responses. A +[`ResponseAccumulator`](openai-java-core/src/main/kotlin/com/openai/helpers/ResponseAccumulator.kt) +can record the stream of response events as they are processed and accumulate a +[`Response`](openai-java-core/src/main/kotlin/com/openai/models/responses/Response.kt) +object similar to that which would have been returned by the non-streaming API. 
+ +For a synchronous response add a +[`Stream.peek()`](https://docs.oracle.com/javase/8/docs/api/java/util/stream/Stream.html#peek-java.util.function.Consumer-) +call to the stream pipeline to accumulate each event: + +```java +import com.openai.core.http.StreamResponse; +import com.openai.helpers.ResponseAccumulator; +import com.openai.models.responses.Response; +import com.openai.models.responses.ResponseStreamEvent; + +ResponseAccumulator responseAccumulator = ResponseAccumulator.create(); + +try (StreamResponse streamResponse = + client.responses().createStreaming(createParams)) { + streamResponse.stream() + .peek(responseAccumulator::accumulate) + .flatMap(event -> event.outputTextDelta().stream()) + .forEach(textEvent -> System.out.print(textEvent.delta())); +} + +Response response = responseAccumulator.response(); +``` + +For an asynchronous response, add the `ResponseAccumulator` to the `subscribe()` call: + +```java +import com.openai.helpers.ResponseAccumulator; +import com.openai.models.responses.Response; + +ResponseAccumulator responseAccumulator = ResponseAccumulator.create(); + +client.responses() + .createStreaming(createParams) + .subscribe(event -> responseAccumulator.accumulate(event) + .outputTextDelta().ifPresent(textEvent -> System.out.print(textEvent.delta()))) + .onCompleteFuture() + .join(); + +Response response = responseAccumulator.response(); +``` + ## Structured outputs with JSON schemas Open AI [Structured Outputs](https://platform.openai.com/docs/guides/structured-outputs?api-mode=chat) @@ -527,11 +574,16 @@ For a full example of the usage of _Structured Outputs_ with Streaming and the C see [`StructuredOutputsStreamingExample`](openai-java-example/src/main/java/com/openai/example/StructuredOutputsStreamingExample.java). -At present, there is no accumulator for streaming responses using the Responses API. It is still -possible to derive a JSON schema from a Java class and create a streaming response for a -[`StructuredResponseCreateParams`](openai-java-core/src/main/kotlin/com/openai/models/responses/StructuredResponseCreateParams.kt) -object, but there is no helper for deserialization of the response to an instance of that Java -class. +With the Responses API, accumulate events while streaming using the +[`ResponseAccumulator`](openai-java-core/src/main/kotlin/com/openai/helpers/ResponseAccumulator.kt). +Once accumulated, use `ResponseAccumulator.response(Class)` to convert the accumulated `Response` +into a +[`StructuredResponse`](openai-java-core/src/main/kotlin/com/openai/models/responses/StructuredResponse.kt). +The [`StructuredResponse`] can then automatically deserialize the JSON strings into instances of +your Java class. + +For a full example of the usage of _Structured Outputs_ with Streaming and the Responses API, see +[`ResponsesStructuredOutputsStreamingExample`](openai-java-example/src/main/java/com/openai/example/ResponsesStructuredOutputsStreamingExample.java). 
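As a quick orientation before that full example, here is a condensed sketch of the same pattern. It assumes the `BookList`/`Book` classes and the `client` defined in the example file, and omits error handling:

```java
import com.openai.core.http.StreamResponse;
import com.openai.helpers.ResponseAccumulator;
import com.openai.models.ChatModel;
import com.openai.models.responses.ResponseCreateParams;
import com.openai.models.responses.ResponseStreamEvent;
import com.openai.models.responses.StructuredResponseCreateParams;

ResponseAccumulator accumulator = ResponseAccumulator.create();

// Derive the JSON schema from the BookList class defined in the example.
StructuredResponseCreateParams<BookList> createParams = ResponseCreateParams.builder()
        .input("List some famous late twentieth century novels.")
        .text(BookList.class)
        .model(ChatModel.GPT_4O)
        .build();

try (StreamResponse<ResponseStreamEvent> streamResponse =
        client.responses().createStreaming(createParams)) {
    streamResponse.stream()
            .peek(accumulator::accumulate) // record every event as it streams through
            .flatMap(event -> event.outputTextDelta().stream())
            .forEach(textEvent -> System.out.print(textEvent.delta()));
}

// Convert the accumulated Response to a StructuredResponse and read the parsed objects.
accumulator.response(BookList.class).output().stream()
        .flatMap(item -> item.message().stream())
        .flatMap(message -> message.content().stream())
        .flatMap(content -> content.outputText().stream())
        .flatMap(bookList -> bookList.books.stream())
        .forEach(book -> System.out.println(" - " + book));
```

The `peek()` call records each event without disturbing the delta-printing pipeline, and `response(BookList.class)` only succeeds once a terminal event (completed, incomplete, or failed) has been accumulated.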
### Defining JSON schema properties diff --git a/openai-java-core/src/main/kotlin/com/openai/helpers/ResponseAccumulator.kt b/openai-java-core/src/main/kotlin/com/openai/helpers/ResponseAccumulator.kt new file mode 100644 index 00000000..485442ba --- /dev/null +++ b/openai-java-core/src/main/kotlin/com/openai/helpers/ResponseAccumulator.kt @@ -0,0 +1,325 @@ +package com.openai.helpers + +import com.openai.errors.OpenAIInvalidDataException +import com.openai.models.responses.Response +import com.openai.models.responses.ResponseAudioDeltaEvent +import com.openai.models.responses.ResponseAudioDoneEvent +import com.openai.models.responses.ResponseAudioTranscriptDeltaEvent +import com.openai.models.responses.ResponseAudioTranscriptDoneEvent +import com.openai.models.responses.ResponseCodeInterpreterCallCodeDeltaEvent +import com.openai.models.responses.ResponseCodeInterpreterCallCodeDoneEvent +import com.openai.models.responses.ResponseCodeInterpreterCallCompletedEvent +import com.openai.models.responses.ResponseCodeInterpreterCallInProgressEvent +import com.openai.models.responses.ResponseCodeInterpreterCallInterpretingEvent +import com.openai.models.responses.ResponseCompletedEvent +import com.openai.models.responses.ResponseContentPartAddedEvent +import com.openai.models.responses.ResponseContentPartDoneEvent +import com.openai.models.responses.ResponseCreatedEvent +import com.openai.models.responses.ResponseErrorEvent +import com.openai.models.responses.ResponseFailedEvent +import com.openai.models.responses.ResponseFileSearchCallCompletedEvent +import com.openai.models.responses.ResponseFileSearchCallInProgressEvent +import com.openai.models.responses.ResponseFileSearchCallSearchingEvent +import com.openai.models.responses.ResponseFunctionCallArgumentsDeltaEvent +import com.openai.models.responses.ResponseFunctionCallArgumentsDoneEvent +import com.openai.models.responses.ResponseImageGenCallCompletedEvent +import com.openai.models.responses.ResponseImageGenCallGeneratingEvent +import com.openai.models.responses.ResponseImageGenCallInProgressEvent +import com.openai.models.responses.ResponseImageGenCallPartialImageEvent +import com.openai.models.responses.ResponseInProgressEvent +import com.openai.models.responses.ResponseIncompleteEvent +import com.openai.models.responses.ResponseMcpCallArgumentsDeltaEvent +import com.openai.models.responses.ResponseMcpCallArgumentsDoneEvent +import com.openai.models.responses.ResponseMcpCallCompletedEvent +import com.openai.models.responses.ResponseMcpCallFailedEvent +import com.openai.models.responses.ResponseMcpCallInProgressEvent +import com.openai.models.responses.ResponseMcpListToolsCompletedEvent +import com.openai.models.responses.ResponseMcpListToolsFailedEvent +import com.openai.models.responses.ResponseMcpListToolsInProgressEvent +import com.openai.models.responses.ResponseOutputItemAddedEvent +import com.openai.models.responses.ResponseOutputItemDoneEvent +import com.openai.models.responses.ResponseOutputTextAnnotationAddedEvent +import com.openai.models.responses.ResponseQueuedEvent +import com.openai.models.responses.ResponseReasoningDeltaEvent +import com.openai.models.responses.ResponseReasoningDoneEvent +import com.openai.models.responses.ResponseReasoningSummaryDeltaEvent +import com.openai.models.responses.ResponseReasoningSummaryDoneEvent +import com.openai.models.responses.ResponseReasoningSummaryPartAddedEvent +import com.openai.models.responses.ResponseReasoningSummaryPartDoneEvent +import 
com.openai.models.responses.ResponseReasoningSummaryTextDeltaEvent +import com.openai.models.responses.ResponseReasoningSummaryTextDoneEvent +import com.openai.models.responses.ResponseRefusalDeltaEvent +import com.openai.models.responses.ResponseRefusalDoneEvent +import com.openai.models.responses.ResponseStreamEvent +import com.openai.models.responses.ResponseTextDeltaEvent +import com.openai.models.responses.ResponseTextDoneEvent +import com.openai.models.responses.ResponseWebSearchCallCompletedEvent +import com.openai.models.responses.ResponseWebSearchCallInProgressEvent +import com.openai.models.responses.ResponseWebSearchCallSearchingEvent +import com.openai.models.responses.StructuredResponse + +/** + * An accumulator that constructs a [Response] from a sequence of streamed events. Pass all events + * to [accumulate] and then call [response] to get the final accumulated response. The final + * `Response` will be similar to what would have been received had the non-streaming API been used. + * + * A [ResponseAccumulator] may only be used to accumulate _one_ response. To accumulate another + * response, create another instance of `ResponseAccumulator`. + */ +class ResponseAccumulator private constructor() { + + /** + * The response accumulated from the event stream. This is set when a terminal event is + * accumulated. That single event carries all the response details. + */ + private var response: Response? = null + + companion object { + @JvmStatic fun create() = ResponseAccumulator() + } + + /** + * Gets the final accumulated response. Until the last event has been accumulated, a [Response] + * will not be available. Wait until all events have been handled by [accumulate] before calling + * this method. + * + * @throws IllegalStateException If called before the stream has been completed. + */ + fun response() = checkNotNull(response) { "Completed response is not yet received." } + + /** + * Gets the final accumulated response with support for structured outputs. Until the last event + * has been accumulated, a [StructuredResponse] will not be available. Wait until all events + * have been handled by [accumulate] before calling this method. See that method for more + * details on how the last event is detected. See the + * [SDK documentation](https://github.com/openai/openai-java/#usage-with-streaming) for more + * details and example code. + * + * @param responseType The Java class from which the JSON schema in the request was derived. The + * output JSON conforming to that schema can be converted automatically back to an instance of + * that Java class by the [StructuredResponse]. + * @throws IllegalStateException If called before the last event has been accumulated. + * @throws OpenAIInvalidDataException If the JSON data cannot be parsed to an instance of the + * [responseType] class. + */ + fun response(responseType: Class) = StructuredResponse(responseType, response()) + + /** + * Accumulates a streamed event and uses it to construct a [Response]. When all events have been + * accumulated, the response can be retrieved by calling [response]. The last event is detected + * if one of `ResponseCompletedEvent`, `ResponseIncompleteEvent`, or `ResponseFailedEvent` is + * accumulated. After that event, no more events are expected. + * + * @return The given [event] for convenience, such as when chaining method calls. + * @throws IllegalStateException If [accumulate] is called again after the last event has been + * accumulated. 
A [ResponseAccumulator] can only be used to accumulate a single [Response]. + */ + fun accumulate(event: ResponseStreamEvent): ResponseStreamEvent { + check(response == null) { "Response has already been completed." } + + event.accept( + object : ResponseStreamEvent.Visitor { + // -------------------------------------------------------------------------------- + // The following events _all_ have a `response` property. + + override fun visitCreated(created: ResponseCreatedEvent) { + // The initial response (on creation) has no content, so it is not stored. + } + + override fun visitCompleted(completed: ResponseCompletedEvent) { + response = completed.response() + } + + override fun visitInProgress(inProgress: ResponseInProgressEvent) { + // An in-progress response is not complete, so it is not stored. + } + + override fun visitQueued(queued: ResponseQueuedEvent) { + // A queued response that is awaiting processing is not complete, so it is not + // stored. + } + + override fun visitFailed(failed: ResponseFailedEvent) { + // TODO: Confirm that this is a "terminal" event and will occur _instead of_ + // `ResponseCompletedEvent` or `ResponseIncompleteEvent`. + // Store the response so the reason for the failure can be interrogated. + response = failed.response() + } + + override fun visitIncomplete(incomplete: ResponseIncompleteEvent) { + // TODO: Confirm that this is a "terminal" event and will occur _instead of_ + // `ResponseCompletedEvent` or `ResponseFailedEvent`. + // Store the response so the reason for the incompleteness can be interrogated. + response = incomplete.response() + } + + // -------------------------------------------------------------------------------- + // The following events do _not_ have a `Response` property. + + override fun visitAudioDelta(audioDelta: ResponseAudioDeltaEvent) {} + + override fun visitAudioDone(audioDone: ResponseAudioDoneEvent) {} + + override fun visitAudioTranscriptDelta( + audioTranscriptDelta: ResponseAudioTranscriptDeltaEvent + ) {} + + override fun visitAudioTranscriptDone( + audioTranscriptDone: ResponseAudioTranscriptDoneEvent + ) {} + + override fun visitCodeInterpreterCallCodeDelta( + codeInterpreterCallCodeDelta: ResponseCodeInterpreterCallCodeDeltaEvent + ) {} + + override fun visitCodeInterpreterCallCodeDone( + codeInterpreterCallCodeDone: ResponseCodeInterpreterCallCodeDoneEvent + ) {} + + override fun visitCodeInterpreterCallCompleted( + codeInterpreterCallCompleted: ResponseCodeInterpreterCallCompletedEvent + ) {} + + override fun visitCodeInterpreterCallInProgress( + codeInterpreterCallInProgress: ResponseCodeInterpreterCallInProgressEvent + ) {} + + override fun visitCodeInterpreterCallInterpreting( + codeInterpreterCallInterpreting: ResponseCodeInterpreterCallInterpretingEvent + ) {} + + override fun visitContentPartAdded( + contentPartAdded: ResponseContentPartAddedEvent + ) {} + + override fun visitContentPartDone(contentPartDone: ResponseContentPartDoneEvent) {} + + override fun visitError(error: ResponseErrorEvent) {} + + override fun visitFileSearchCallCompleted( + fileSearchCallCompleted: ResponseFileSearchCallCompletedEvent + ) {} + + override fun visitFileSearchCallInProgress( + fileSearchCallInProgress: ResponseFileSearchCallInProgressEvent + ) {} + + override fun visitFileSearchCallSearching( + fileSearchCallSearching: ResponseFileSearchCallSearchingEvent + ) {} + + override fun visitFunctionCallArgumentsDelta( + functionCallArgumentsDelta: ResponseFunctionCallArgumentsDeltaEvent + ) {} + + override fun 
visitFunctionCallArgumentsDone( + functionCallArgumentsDone: ResponseFunctionCallArgumentsDoneEvent + ) {} + + override fun visitOutputItemAdded(outputItemAdded: ResponseOutputItemAddedEvent) {} + + override fun visitOutputItemDone(outputItemDone: ResponseOutputItemDoneEvent) {} + + override fun visitReasoningSummaryPartAdded( + reasoningSummaryPartAdded: ResponseReasoningSummaryPartAddedEvent + ) {} + + override fun visitReasoningSummaryPartDone( + reasoningSummaryPartDone: ResponseReasoningSummaryPartDoneEvent + ) {} + + override fun visitReasoningSummaryTextDelta( + reasoningSummaryTextDelta: ResponseReasoningSummaryTextDeltaEvent + ) {} + + override fun visitReasoningSummaryTextDone( + reasoningSummaryTextDone: ResponseReasoningSummaryTextDoneEvent + ) {} + + override fun visitRefusalDelta(refusalDelta: ResponseRefusalDeltaEvent) {} + + override fun visitRefusalDone(refusalDone: ResponseRefusalDoneEvent) {} + + override fun visitOutputTextDelta(outputTextDelta: ResponseTextDeltaEvent) {} + + override fun visitOutputTextDone(outputTextDone: ResponseTextDoneEvent) {} + + override fun visitWebSearchCallCompleted( + webSearchCallCompleted: ResponseWebSearchCallCompletedEvent + ) {} + + override fun visitWebSearchCallInProgress( + webSearchCallInProgress: ResponseWebSearchCallInProgressEvent + ) {} + + override fun visitWebSearchCallSearching( + webSearchCallSearching: ResponseWebSearchCallSearchingEvent + ) {} + + override fun visitImageGenerationCallCompleted( + imageGenerationCallCompleted: ResponseImageGenCallCompletedEvent + ) {} + + override fun visitImageGenerationCallGenerating( + imageGenerationCallGenerating: ResponseImageGenCallGeneratingEvent + ) {} + + override fun visitImageGenerationCallInProgress( + imageGenerationCallInProgress: ResponseImageGenCallInProgressEvent + ) {} + + override fun visitImageGenerationCallPartialImage( + imageGenerationCallPartialImage: ResponseImageGenCallPartialImageEvent + ) {} + + override fun visitMcpCallArgumentsDelta( + mcpCallArgumentsDelta: ResponseMcpCallArgumentsDeltaEvent + ) {} + + override fun visitMcpCallArgumentsDone( + mcpCallArgumentsDone: ResponseMcpCallArgumentsDoneEvent + ) {} + + override fun visitMcpCallCompleted( + mcpCallCompleted: ResponseMcpCallCompletedEvent + ) {} + + override fun visitMcpCallFailed(mcpCallFailed: ResponseMcpCallFailedEvent) {} + + override fun visitMcpCallInProgress( + mcpCallInProgress: ResponseMcpCallInProgressEvent + ) {} + + override fun visitMcpListToolsCompleted( + mcpListToolsCompleted: ResponseMcpListToolsCompletedEvent + ) {} + + override fun visitMcpListToolsFailed( + mcpListToolsFailed: ResponseMcpListToolsFailedEvent + ) {} + + override fun visitMcpListToolsInProgress( + mcpListToolsInProgress: ResponseMcpListToolsInProgressEvent + ) {} + + override fun visitOutputTextAnnotationAdded( + outputTextAnnotationAdded: ResponseOutputTextAnnotationAddedEvent + ) {} + + override fun visitReasoningDelta(reasoningDelta: ResponseReasoningDeltaEvent) {} + + override fun visitReasoningDone(reasoningDone: ResponseReasoningDoneEvent) {} + + override fun visitReasoningSummaryDelta( + reasoningSummaryDelta: ResponseReasoningSummaryDeltaEvent + ) {} + + override fun visitReasoningSummaryDone( + reasoningSummaryDone: ResponseReasoningSummaryDoneEvent + ) {} + } + ) + + return event + } +} diff --git a/openai-java-core/src/test/kotlin/com/openai/helpers/ResponseAccumulatorTest.kt b/openai-java-core/src/test/kotlin/com/openai/helpers/ResponseAccumulatorTest.kt new file mode 100644 index 00000000..81d0b833 --- 
/dev/null +++ b/openai-java-core/src/test/kotlin/com/openai/helpers/ResponseAccumulatorTest.kt @@ -0,0 +1,166 @@ +package com.openai.helpers + +import com.openai.core.JsonNull +import com.openai.models.ResponsesModel +import com.openai.models.responses.Response +import com.openai.models.responses.ResponseCompletedEvent +import com.openai.models.responses.ResponseCreatedEvent +import com.openai.models.responses.ResponseFailedEvent +import com.openai.models.responses.ResponseInProgressEvent +import com.openai.models.responses.ResponseIncompleteEvent +import com.openai.models.responses.ResponseOutputItem +import com.openai.models.responses.ResponseOutputMessage +import com.openai.models.responses.ResponseOutputText +import com.openai.models.responses.ResponseStreamEvent +import org.assertj.core.api.Assertions.assertThat +import org.assertj.core.api.Assertions.assertThatNoException +import org.assertj.core.api.Assertions.assertThatThrownBy +import org.junit.jupiter.api.Test + +internal class ResponseAccumulatorTest { + + @Test + fun responseBeforeAccumulation() { + val accumulator = ResponseAccumulator.create() + + assertThatThrownBy { accumulator.response() } + .isExactlyInstanceOf(IllegalStateException::class.java) + .hasMessage("Completed response is not yet received.") + } + + @Test + fun structuredResponseBeforeAccumulation() { + val accumulator = ResponseAccumulator.create() + + assertThatThrownBy { accumulator.response(String::class.java) } + .isExactlyInstanceOf(IllegalStateException::class.java) + .hasMessage("Completed response is not yet received.") + } + + @Test + fun responseAfterAccumulation() { + val accumulator = ResponseAccumulator.create() + + accumulator.accumulate(ResponseStreamEvent.ofCompleted(responseCompletedEvent())) + + assertThatNoException().isThrownBy { accumulator.response() } + assertThat(accumulator.response().id()).isEqualTo("response-id") + } + + @Test + fun structuredResponseAfterAccumulation() { + val accumulator = ResponseAccumulator.create() + + accumulator.accumulate(ResponseStreamEvent.ofCompleted(responseCompletedEvent())) + + // No deserialization is attempted, so the `Class` does not matter. Deserialization is + // beyond the scope of this test; it is tested elsewhere at a lower level. 
+ assertThatNoException().isThrownBy { accumulator.response(String::class.java) } + assertThat(accumulator.response(String::class.java).id()).isEqualTo("response-id") + assertThat(accumulator.response(String::class.java).responseType) + .isEqualTo(String::class.java) + } + + @Test + fun accumulateAfterCompleted() { + val accumulator = ResponseAccumulator.create() + + accumulator.accumulate(ResponseStreamEvent.ofCompleted(responseCompletedEvent())) + + assertThatThrownBy { + accumulator.accumulate(ResponseStreamEvent.ofCompleted(responseCompletedEvent())) + } + .isExactlyInstanceOf(IllegalStateException::class.java) + .hasMessage("Response has already been completed.") + } + + @Test + fun accumulateUntilCompleted() { + val accumulator = ResponseAccumulator.create() + + accumulator.accumulate(ResponseStreamEvent.ofCreated(responseCreatedEvent())) + accumulator.accumulate(ResponseStreamEvent.ofInProgress(responseInProgressEvent())) + accumulator.accumulate(ResponseStreamEvent.ofInProgress(responseInProgressEvent())) + accumulator.accumulate(ResponseStreamEvent.ofInProgress(responseInProgressEvent())) + accumulator.accumulate(ResponseStreamEvent.ofCompleted(responseCompletedEvent())) + + val response = accumulator.response() + + assertThat(response.id()).isEqualTo("response-id") + } + + @Test + fun accumulateUntilIncomplete() { + val accumulator = ResponseAccumulator.create() + + accumulator.accumulate(ResponseStreamEvent.ofCreated(responseCreatedEvent())) + accumulator.accumulate(ResponseStreamEvent.ofInProgress(responseInProgressEvent())) + accumulator.accumulate(ResponseStreamEvent.ofInProgress(responseInProgressEvent())) + accumulator.accumulate(ResponseStreamEvent.ofInProgress(responseInProgressEvent())) + accumulator.accumulate(ResponseStreamEvent.ofIncomplete(responseIncompleteEvent())) + + val response = accumulator.response() + + assertThat(response.id()).isEqualTo("response-id") + } + + @Test + fun accumulateUntilFailed() { + val accumulator = ResponseAccumulator.create() + + accumulator.accumulate(ResponseStreamEvent.ofCreated(responseCreatedEvent())) + accumulator.accumulate(ResponseStreamEvent.ofInProgress(responseInProgressEvent())) + accumulator.accumulate(ResponseStreamEvent.ofInProgress(responseInProgressEvent())) + accumulator.accumulate(ResponseStreamEvent.ofInProgress(responseInProgressEvent())) + accumulator.accumulate(ResponseStreamEvent.ofFailed(responseFailedEvent())) + + val response = accumulator.response() + + assertThat(response.id()).isEqualTo("response-id") + } + + private fun responseCreatedEvent() = + ResponseCreatedEvent.builder().response(response()).sequenceNumber(1L).build() + + private fun responseInProgressEvent() = + ResponseInProgressEvent.builder().response(response()).sequenceNumber(1L).build() + + private fun responseCompletedEvent() = + ResponseCompletedEvent.builder().response(response()).sequenceNumber(1L).build() + + private fun responseFailedEvent() = + ResponseFailedEvent.builder().response(response()).sequenceNumber(1L).build() + + private fun responseIncompleteEvent() = + ResponseIncompleteEvent.builder().response(response()).sequenceNumber(1L).build() + + private fun response() = + Response.builder() + .id("response-id") + .createdAt(System.currentTimeMillis() / 1_000.0) + .error(null) + .incompleteDetails(null) + .instructions(null) + .metadata(null) + .model(ResponsesModel.ResponsesOnlyModel.O1_PRO) + .addOutput(responseOutputItemOfMessage()) + .parallelToolCalls(false) + .temperature(null) + .toolChoice(JsonNull.of()) + .tools(listOf()) + 
.topP(null) + .build() + + private fun responseOutputItemOfMessage() = + ResponseOutputItem.ofMessage(responseOutputMessage()) + + private fun responseOutputMessage() = + ResponseOutputMessage.builder() + .id("message-id") + .addContent(ResponseOutputMessage.Content.ofOutputText(responseOutputText())) + .status(ResponseOutputMessage.Status.COMPLETED) + .build() + + private fun responseOutputText() = + ResponseOutputText.builder().text("Hello World").annotations(listOf()).build() +} diff --git a/openai-java-example/src/main/java/com/openai/example/ResponsesStructuredOutputsStreamingExample.java b/openai-java-example/src/main/java/com/openai/example/ResponsesStructuredOutputsStreamingExample.java new file mode 100644 index 00000000..337e23d4 --- /dev/null +++ b/openai-java-example/src/main/java/com/openai/example/ResponsesStructuredOutputsStreamingExample.java @@ -0,0 +1,87 @@ +package com.openai.example; + +import com.fasterxml.jackson.annotation.JsonIgnore; +import com.fasterxml.jackson.annotation.JsonPropertyDescription; +import com.openai.client.OpenAIClient; +import com.openai.client.okhttp.OpenAIOkHttpClient; +import com.openai.core.http.StreamResponse; +import com.openai.helpers.ResponseAccumulator; +import com.openai.models.ChatModel; +import com.openai.models.responses.ResponseCreateParams; +import com.openai.models.responses.ResponseStreamEvent; +import com.openai.models.responses.StructuredResponseCreateParams; +import java.util.List; + +public final class ResponsesStructuredOutputsStreamingExample { + + public static class Person { + @JsonPropertyDescription("The first name and surname of the person.") + public String name; + + public int birthYear; + + @JsonPropertyDescription("The year the person died, or 'present' if the person is living.") + public String deathYear; + + @Override + public String toString() { + return name + " (" + birthYear + '-' + deathYear + ')'; + } + } + + public static class Book { + public String title; + + public Person author; + + @JsonPropertyDescription("The year in which the book was first published.") + public int publicationYear; + + public String genre; + + @JsonIgnore + public String isbn; + + @Override + public String toString() { + return '"' + title + "\" (" + publicationYear + ") [" + genre + "] by " + author; + } + } + + public static class BookList { + public List books; + } + + private ResponsesStructuredOutputsStreamingExample() {} + + public static void main(String[] args) { + // Configures using one of: + // - The `OPENAI_API_KEY` environment variable + // - The `OPENAI_BASE_URL` and `AZURE_OPENAI_KEY` environment variables + OpenAIClient client = OpenAIOkHttpClient.fromEnv(); + + StructuredResponseCreateParams createParams = ResponseCreateParams.builder() + .input("List some famous late twentieth century novels.") + .text(BookList.class) + .model(ChatModel.GPT_4O) + .build(); + + ResponseAccumulator accumulator = ResponseAccumulator.create(); + + try (StreamResponse streamResponse = + client.responses().createStreaming(createParams)) { + streamResponse.stream() + .peek(accumulator::accumulate) + .flatMap(event -> event.outputTextDelta().stream()) + .forEach(textEvent -> System.out.print(textEvent.delta())); + System.out.println(); + } + + accumulator.response(BookList.class).output().stream() + .flatMap(item -> item.message().stream()) + .flatMap(message -> message.content().stream()) + .flatMap(content -> content.outputText().stream()) + .flatMap(bookList -> bookList.books.stream()) + .forEach(book -> System.out.println(" - " + 
book)); + } +} From 2d185ba387569d90ffffa07adf3337ffce918e3e Mon Sep 17 00:00:00 2001 From: Tomer Aberbach Date: Thu, 17 Jul 2025 17:01:04 -0400 Subject: [PATCH 12/18] chore(client): remove non-existent method --- .../services/async/audio/TranscriptionServiceAsyncImpl.kt | 2 +- .../openai/services/blocking/audio/TranscriptionServiceImpl.kt | 2 +- 2 files changed, 2 insertions(+), 2 deletions(-) diff --git a/openai-java-core/src/main/kotlin/com/openai/services/async/audio/TranscriptionServiceAsyncImpl.kt b/openai-java-core/src/main/kotlin/com/openai/services/async/audio/TranscriptionServiceAsyncImpl.kt index 31104a16..66209215 100644 --- a/openai-java-core/src/main/kotlin/com/openai/services/async/audio/TranscriptionServiceAsyncImpl.kt +++ b/openai-java-core/src/main/kotlin/com/openai/services/async/audio/TranscriptionServiceAsyncImpl.kt @@ -78,7 +78,7 @@ class TranscriptionServiceAsyncImpl internal constructor(private val clientOptio private val createStringHandler: Handler = object : Handler { - private val stringHandler = stringHandler().withErrorHandler(errorHandler) + private val stringHandler = stringHandler() override fun handle(response: HttpResponse): TranscriptionCreateResponse = TranscriptionCreateResponse.ofTranscription( diff --git a/openai-java-core/src/main/kotlin/com/openai/services/blocking/audio/TranscriptionServiceImpl.kt b/openai-java-core/src/main/kotlin/com/openai/services/blocking/audio/TranscriptionServiceImpl.kt index 9c26e557..d9196a09 100644 --- a/openai-java-core/src/main/kotlin/com/openai/services/blocking/audio/TranscriptionServiceImpl.kt +++ b/openai-java-core/src/main/kotlin/com/openai/services/blocking/audio/TranscriptionServiceImpl.kt @@ -72,7 +72,7 @@ class TranscriptionServiceImpl internal constructor(private val clientOptions: C private val createStringHandler: Handler = object : Handler { - private val stringHandler = stringHandler().withErrorHandler(errorHandler) + private val stringHandler = stringHandler() override fun handle(response: HttpResponse): TranscriptionCreateResponse = TranscriptionCreateResponse.ofTranscription( From 6fa070050cb8e288fb791051291627a71b7f48b2 Mon Sep 17 00:00:00 2001 From: Julien Dubois Date: Wed, 16 Jul 2025 11:42:23 +0200 Subject: [PATCH 13/18] Set up TestContainers instead of running the mock server from a script. 
- This removes the need to have NPM installed - This should be faster as there is no "npm install" - This removes the risk of having a mock server that keeps running in the background after the tests have benn run Fix #54 --- .github/workflows/ci.yml | 2 +- openai-java-core/build.gradle.kts | 2 + .../kotlin/com/openai/TestServerExtension.kt | 128 ++++++++++++++---- scripts/test | 56 -------- 4 files changed, 104 insertions(+), 84 deletions(-) delete mode 100755 scripts/test diff --git a/.github/workflows/ci.yml b/.github/workflows/ci.yml index 0ebdc70c..ebdbd517 100644 --- a/.github/workflows/ci.yml +++ b/.github/workflows/ci.yml @@ -57,7 +57,7 @@ jobs: uses: gradle/gradle-build-action@v2 - name: Run tests - run: ./scripts/test + run: ./gradlew test examples: timeout-minutes: 10 name: examples diff --git a/openai-java-core/build.gradle.kts b/openai-java-core/build.gradle.kts index 7c7d6d0e..abeb5621 100644 --- a/openai-java-core/build.gradle.kts +++ b/openai-java-core/build.gradle.kts @@ -43,6 +43,8 @@ dependencies { testImplementation("org.mockito:mockito-core:5.14.2") testImplementation("org.mockito:mockito-junit-jupiter:5.14.2") testImplementation("org.mockito.kotlin:mockito-kotlin:4.1.0") + testImplementation("org.testcontainers:testcontainers:1.19.8") + testImplementation("org.testcontainers:junit-jupiter:1.19.8") } if (project.hasProperty("graalvmAgent")) { diff --git a/openai-java-core/src/test/kotlin/com/openai/TestServerExtension.kt b/openai-java-core/src/test/kotlin/com/openai/TestServerExtension.kt index 7dfdef2b..4e82a14a 100644 --- a/openai-java-core/src/test/kotlin/com/openai/TestServerExtension.kt +++ b/openai-java-core/src/test/kotlin/com/openai/TestServerExtension.kt @@ -1,41 +1,122 @@ package com.openai import java.lang.RuntimeException -import java.net.URL +import java.io.File import org.junit.jupiter.api.extension.BeforeAllCallback import org.junit.jupiter.api.extension.ConditionEvaluationResult import org.junit.jupiter.api.extension.ExecutionCondition import org.junit.jupiter.api.extension.ExtensionContext +import org.testcontainers.containers.GenericContainer +import org.testcontainers.containers.wait.strategy.Wait +import org.testcontainers.utility.DockerImageName +import org.testcontainers.utility.MountableFile +import java.time.Duration class TestServerExtension : BeforeAllCallback, ExecutionCondition { - override fun beforeAll(context: ExtensionContext?) { - try { - URL(BASE_URL).openConnection().connect() - } catch (e: Exception) { - throw RuntimeException( - """ - The test suite will not run without a mock Prism server running against your OpenAPI spec. + companion object { + private const val INTERNAL_PORT = 4010 // Port inside the container - You can set the environment variable `SKIP_MOCK_TESTS` to `true` to skip running any tests - that require the mock server. 
+ val BASE_URL: String + get() = "http://${prismContainer.host}:${prismContainer.getMappedPort(INTERNAL_PORT)}" + + const val SKIP_TESTS_ENV: String = "SKIP_MOCK_TESTS" + private const val PRISM_IMAGE = "stoplight/prism:5" + private const val API_SPEC_PATH = "/app/openapi.yml" // Path inside the container + + // Track if the container has been started + private var containerStarted = false + + private fun getOpenApiSpecPath(): String { + // First check environment variable + val envPath = System.getenv("OPENAPI_SPEC_PATH") + if (envPath != null) { + return envPath + } + + // Try to read from .stats.yml file + try { + val statsFile = File("../.stats.yml") + if (statsFile.exists()) { + val content = statsFile.readText() + val urlLine = content.lines().find { it.startsWith("openapi_spec_url:") } + if (urlLine != null) { + val url = urlLine.substringAfter("openapi_spec_url:").trim() + if (url.isNotEmpty()) { + return url + } + } + } + } catch (e: Exception) { + println("Could not read .stats.yml fails, fall back to default. Error is: ${e.message}") + } + return "/tmp/openapi.yml" + } + + private val prismContainer: GenericContainer<*> by lazy { + val apiSpecPath = getOpenApiSpecPath() + println("Using OpenAPI spec path: $apiSpecPath") + val isUrl = apiSpecPath.startsWith("http://") || apiSpecPath.startsWith("https://") + + // Create container with or without copying the file based on whether apiSpecPath is a URL + val container = GenericContainer(DockerImageName.parse(PRISM_IMAGE)) + .withExposedPorts(INTERNAL_PORT) + .withCommand("mock", apiSpecPath, "--host", "0.0.0.0", "--port", INTERNAL_PORT.toString()) + .withReuse(true) - To fix: + // Only copy the file to the container if apiSpecPath is a local file + if (!isUrl) { + try { + val file = File(apiSpecPath) + if (file.exists()) { + container.withCopyToContainer(MountableFile.forHostPath(apiSpecPath), API_SPEC_PATH) + } else { + println("OpenAPI spec file not found at: $apiSpecPath") + throw RuntimeException("OpenAPI spec file not found at: $apiSpecPath") + } + } catch (e: Exception) { + println("Error reading OpenAPI spec file: ${e.message}") + throw RuntimeException("Error reading OpenAPI spec file: $apiSpecPath", e) + } + } - 1. Install Prism (requires Node 16+): + // Add waiting strategy + container.waitingFor( + Wait.forLogMessage(".*Prism is listening.*", 1) + .withStartupTimeout(Duration.ofSeconds(300)) + ) - With npm: - $ npm install -g @stoplight/prism-cli + // Start the container here once during lazy initialization + container.start() + containerStarted = true + println("Prism container started at: ${container.host}:${container.getMappedPort(INTERNAL_PORT)}") - With yarn: - $ yarn global add @stoplight/prism-cli + container + } - 2. Run the mock server + // Method to ensure container is started, can be called from beforeAll + fun ensureContainerStarted() { + if (!containerStarted) { + // This will trigger lazy initialization and start the container + prismContainer + } + } + } - To run the server, pass in the path of your OpenAPI spec to the prism command: - $ prism mock path/to/your.openapi.yml + override fun beforeAll(context: ExtensionContext?) { + try { + // Use the companion method to ensure container is started only once + ensureContainerStarted() + } catch (e: Exception) { + throw RuntimeException( """ - .trimIndent(), + Failed to connect to Prism mock server running in TestContainer. + + You can set the environment variable `SKIP_MOCK_TESTS` to `true` to skip running any tests + that require the mock server. 
+ + You may also need to set `OPENAPI_SPEC_PATH` to the path of your OpenAPI spec file. + """.trimIndent(), e, ) } @@ -52,11 +133,4 @@ class TestServerExtension : BeforeAllCallback, ExecutionCondition { ) } } - - companion object { - - val BASE_URL = System.getenv("TEST_API_BASE_URL") ?: "http://localhost:4010" - - const val SKIP_TESTS_ENV: String = "SKIP_MOCK_TESTS" - } } diff --git a/scripts/test b/scripts/test deleted file mode 100755 index 6b750a74..00000000 --- a/scripts/test +++ /dev/null @@ -1,56 +0,0 @@ -#!/usr/bin/env bash - -set -e - -cd "$(dirname "$0")/.." - -RED='\033[0;31m' -GREEN='\033[0;32m' -YELLOW='\033[0;33m' -NC='\033[0m' # No Color - -function prism_is_running() { - curl --silent "http://localhost:4010" >/dev/null 2>&1 -} - -kill_server_on_port() { - pids=$(lsof -t -i tcp:"$1" || echo "") - if [ "$pids" != "" ]; then - kill "$pids" - echo "Stopped $pids." - fi -} - -function is_overriding_api_base_url() { - [ -n "$TEST_API_BASE_URL" ] -} - -if ! is_overriding_api_base_url && ! prism_is_running ; then - # When we exit this script, make sure to kill the background mock server process - trap 'kill_server_on_port 4010' EXIT - - # Start the dev server - ./scripts/mock --daemon -fi - -if is_overriding_api_base_url ; then - echo -e "${GREEN}✔ Running tests against ${TEST_API_BASE_URL}${NC}" - echo -elif ! prism_is_running ; then - echo -e "${RED}ERROR:${NC} The test suite will not run without a mock Prism server" - echo -e "running against your OpenAPI spec." - echo - echo -e "To run the server, pass in the path or url of your OpenAPI" - echo -e "spec to the prism command:" - echo - echo -e " \$ ${YELLOW}npm exec --package=@stoplight/prism-cli@~5.3.2 -- prism mock path/to/your.openapi.yml${NC}" - echo - - exit 1 -else - echo -e "${GREEN}✔ Mock prism server is running with your OpenAPI spec${NC}" - echo -fi - -echo "==> Running tests" -./gradlew test From d97e27df6ee79e04ea43108d2ae4f58166077ea8 Mon Sep 17 00:00:00 2001 From: Julien Dubois Date: Fri, 18 Jul 2025 10:44:09 +0200 Subject: [PATCH 14/18] Use Stainless version of Prism --- .../src/test/kotlin/com/openai/TestServerExtension.kt | 7 ++++--- 1 file changed, 4 insertions(+), 3 deletions(-) diff --git a/openai-java-core/src/test/kotlin/com/openai/TestServerExtension.kt b/openai-java-core/src/test/kotlin/com/openai/TestServerExtension.kt index 4e82a14a..10130daa 100644 --- a/openai-java-core/src/test/kotlin/com/openai/TestServerExtension.kt +++ b/openai-java-core/src/test/kotlin/com/openai/TestServerExtension.kt @@ -21,7 +21,8 @@ class TestServerExtension : BeforeAllCallback, ExecutionCondition { get() = "http://${prismContainer.host}:${prismContainer.getMappedPort(INTERNAL_PORT)}" const val SKIP_TESTS_ENV: String = "SKIP_MOCK_TESTS" - private const val PRISM_IMAGE = "stoplight/prism:5" + private const val NODEJS_IMAGE = "node:22" + private const val PRISM_CLI_VERSION = "5.8.5" private const val API_SPEC_PATH = "/app/openapi.yml" // Path inside the container // Track if the container has been started @@ -59,9 +60,9 @@ class TestServerExtension : BeforeAllCallback, ExecutionCondition { val isUrl = apiSpecPath.startsWith("http://") || apiSpecPath.startsWith("https://") // Create container with or without copying the file based on whether apiSpecPath is a URL - val container = GenericContainer(DockerImageName.parse(PRISM_IMAGE)) + val container = GenericContainer(DockerImageName.parse(NODEJS_IMAGE)) .withExposedPorts(INTERNAL_PORT) - .withCommand("mock", apiSpecPath, "--host", "0.0.0.0", "--port", INTERNAL_PORT.toString()) + 
.withCommand("npm", "exec", "--package=@stainless-api/prism-cli@$PRISM_CLI_VERSION", "--", "prism", "mock", apiSpecPath, "--host", "0.0.0.0", "--port", INTERNAL_PORT.toString()) .withReuse(true) // Only copy the file to the container if apiSpecPath is a local file From 5e02837902d3a1587e4698acff6ad437e81b0765 Mon Sep 17 00:00:00 2001 From: Tomer Aberbach Date: Fri, 18 Jul 2025 10:33:48 -0400 Subject: [PATCH 15/18] chore: format --- .../kotlin/com/openai/TestServerExtension.kt | 52 +++++++++++++------ 1 file changed, 37 insertions(+), 15 deletions(-) diff --git a/openai-java-core/src/test/kotlin/com/openai/TestServerExtension.kt b/openai-java-core/src/test/kotlin/com/openai/TestServerExtension.kt index 10130daa..ceb13557 100644 --- a/openai-java-core/src/test/kotlin/com/openai/TestServerExtension.kt +++ b/openai-java-core/src/test/kotlin/com/openai/TestServerExtension.kt @@ -1,7 +1,8 @@ package com.openai -import java.lang.RuntimeException import java.io.File +import java.lang.RuntimeException +import java.time.Duration import org.junit.jupiter.api.extension.BeforeAllCallback import org.junit.jupiter.api.extension.ConditionEvaluationResult import org.junit.jupiter.api.extension.ExecutionCondition @@ -10,7 +11,6 @@ import org.testcontainers.containers.GenericContainer import org.testcontainers.containers.wait.strategy.Wait import org.testcontainers.utility.DockerImageName import org.testcontainers.utility.MountableFile -import java.time.Duration class TestServerExtension : BeforeAllCallback, ExecutionCondition { @@ -46,11 +46,13 @@ class TestServerExtension : BeforeAllCallback, ExecutionCondition { if (url.isNotEmpty()) { return url } - } - } - } catch (e: Exception) { - println("Could not read .stats.yml fails, fall back to default. Error is: ${e.message}") - } + } + } + } catch (e: Exception) { + println( + "Could not read .stats.yml fails, fall back to default. 
Error is: ${e.message}" + ) + } return "/tmp/openapi.yml" } @@ -59,18 +61,35 @@ class TestServerExtension : BeforeAllCallback, ExecutionCondition { println("Using OpenAPI spec path: $apiSpecPath") val isUrl = apiSpecPath.startsWith("http://") || apiSpecPath.startsWith("https://") - // Create container with or without copying the file based on whether apiSpecPath is a URL - val container = GenericContainer(DockerImageName.parse(NODEJS_IMAGE)) - .withExposedPorts(INTERNAL_PORT) - .withCommand("npm", "exec", "--package=@stainless-api/prism-cli@$PRISM_CLI_VERSION", "--", "prism", "mock", apiSpecPath, "--host", "0.0.0.0", "--port", INTERNAL_PORT.toString()) - .withReuse(true) + // Create container with or without copying the file based on whether apiSpecPath is a + // URL + val container = + GenericContainer(DockerImageName.parse(NODEJS_IMAGE)) + .withExposedPorts(INTERNAL_PORT) + .withCommand( + "npm", + "exec", + "--package=@stainless-api/prism-cli@$PRISM_CLI_VERSION", + "--", + "prism", + "mock", + apiSpecPath, + "--host", + "0.0.0.0", + "--port", + INTERNAL_PORT.toString(), + ) + .withReuse(true) // Only copy the file to the container if apiSpecPath is a local file if (!isUrl) { try { val file = File(apiSpecPath) if (file.exists()) { - container.withCopyToContainer(MountableFile.forHostPath(apiSpecPath), API_SPEC_PATH) + container.withCopyToContainer( + MountableFile.forHostPath(apiSpecPath), + API_SPEC_PATH, + ) } else { println("OpenAPI spec file not found at: $apiSpecPath") throw RuntimeException("OpenAPI spec file not found at: $apiSpecPath") @@ -90,7 +109,9 @@ class TestServerExtension : BeforeAllCallback, ExecutionCondition { // Start the container here once during lazy initialization container.start() containerStarted = true - println("Prism container started at: ${container.host}:${container.getMappedPort(INTERNAL_PORT)}") + println( + "Prism container started at: ${container.host}:${container.getMappedPort(INTERNAL_PORT)}" + ) container } @@ -117,7 +138,8 @@ class TestServerExtension : BeforeAllCallback, ExecutionCondition { that require the mock server. You may also need to set `OPENAPI_SPEC_PATH` to the path of your OpenAPI spec file. - """.trimIndent(), + """ + .trimIndent(), e, ) } From fdeac0bfe67be770e32b0c332b4cafba26387cb8 Mon Sep 17 00:00:00 2001 From: Tomer Aberbach Date: Fri, 18 Jul 2025 10:35:24 -0400 Subject: [PATCH 16/18] chore: keep ./scripts/test --- .github/workflows/ci.yml | 2 +- scripts/test | 8 ++++++++ 2 files changed, 9 insertions(+), 1 deletion(-) create mode 100755 scripts/test diff --git a/.github/workflows/ci.yml b/.github/workflows/ci.yml index ebdbd517..0ebdc70c 100644 --- a/.github/workflows/ci.yml +++ b/.github/workflows/ci.yml @@ -57,7 +57,7 @@ jobs: uses: gradle/gradle-build-action@v2 - name: Run tests - run: ./gradlew test + run: ./scripts/test examples: timeout-minutes: 10 name: examples diff --git a/scripts/test b/scripts/test new file mode 100755 index 00000000..0f93c0bd --- /dev/null +++ b/scripts/test @@ -0,0 +1,8 @@ +#!/usr/bin/env bash + +set -e + +cd "$(dirname "$0")/.." 
+ +echo "==> Running tests" +./gradlew test From dd9c8c14a0f5857ab07737f6062ac332718f65e3 Mon Sep 17 00:00:00 2001 From: Tomer Aberbach Date: Fri, 18 Jul 2025 10:36:08 -0400 Subject: [PATCH 17/18] chore: remove dupe test impls --- openai-java-core/build.gradle.kts | 3 --- 1 file changed, 3 deletions(-) diff --git a/openai-java-core/build.gradle.kts b/openai-java-core/build.gradle.kts index c9ae0fb3..abeb5621 100644 --- a/openai-java-core/build.gradle.kts +++ b/openai-java-core/build.gradle.kts @@ -45,9 +45,6 @@ dependencies { testImplementation("org.mockito.kotlin:mockito-kotlin:4.1.0") testImplementation("org.testcontainers:testcontainers:1.19.8") testImplementation("org.testcontainers:junit-jupiter:1.19.8") - testImplementation("org.testcontainers:testcontainers:1.19.8") - testImplementation("org.testcontainers:junit-jupiter:1.19.8") - } if (project.hasProperty("graalvmAgent")) { From ac635a936e5f341361fdbfad0f8d2e813685311a Mon Sep 17 00:00:00 2001 From: Tomer Aberbach Date: Fri, 18 Jul 2025 10:37:55 -0400 Subject: [PATCH 18/18] chore: get rid of ./scripts/mock --- .github/workflows/create-releases.yml | 4 --- CONTRIBUTING.md | 8 +----- scripts/mock | 41 --------------------------- 3 files changed, 1 insertion(+), 52 deletions(-) delete mode 100755 scripts/mock diff --git a/.github/workflows/create-releases.yml b/.github/workflows/create-releases.yml index f7b80580..b7c04246 100644 --- a/.github/workflows/create-releases.yml +++ b/.github/workflows/create-releases.yml @@ -40,10 +40,6 @@ jobs: run: | ./gradlew :openai-java-core:compileJava :openai-java-core:compileTestJava -x test - - name: Run the Prism server - run: | - ./scripts/mock --daemon - - name: Setup GraalVM uses: graalvm/setup-graalvm@v1 with: diff --git a/CONTRIBUTING.md b/CONTRIBUTING.md index b2884477..0c074129 100644 --- a/CONTRIBUTING.md +++ b/CONTRIBUTING.md @@ -87,18 +87,12 @@ JAR files will be available in each module's `build/libs/` directory. Most tests require [our mock server](https://github.com/stoplightio/prism) to be running against the OpenAPI spec to work. -The test script will automatically start the mock server for you (if it's not already running) and run the tests against it: +The test script will automatically start the mock server for you and run the tests against it: ```sh $ ./scripts/test ``` -You can also manually start the mock server if you want to run tests repeatedly: - -```sh -$ ./scripts/mock -``` - Then run the tests: ```sh diff --git a/scripts/mock b/scripts/mock deleted file mode 100755 index d2814ae6..00000000 --- a/scripts/mock +++ /dev/null @@ -1,41 +0,0 @@ -#!/usr/bin/env bash - -set -e - -cd "$(dirname "$0")/.." - -if [[ -n "$1" && "$1" != '--'* ]]; then - URL="$1" - shift -else - URL="$(grep 'openapi_spec_url' .stats.yml | cut -d' ' -f2)" -fi - -# Check if the URL is empty -if [ -z "$URL" ]; then - echo "Error: No OpenAPI spec path/url provided or found in .stats.yml" - exit 1 -fi - -echo "==> Starting mock server with URL ${URL}" - -# Run prism mock on the given spec -if [ "$1" == "--daemon" ]; then - npm exec --package=@stainless-api/prism-cli@5.8.5 -- prism mock "$URL" &> .prism.log & - - # Wait for server to come online - echo -n "Waiting for server" - while ! grep -q "✖ fatal\|Prism is listening" ".prism.log" ; do - echo -n "." - sleep 0.1 - done - - if grep -q "✖ fatal" ".prism.log"; then - cat .prism.log - exit 1 - fi - - echo -else - npm exec --package=@stainless-api/prism-cli@5.8.5 -- prism mock "$URL" -fi
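With `scripts/mock` removed, local mock-server behaviour is controlled entirely by the TestContainers-based `TestServerExtension`. A sketch of the relevant commands, assuming Docker is available on the machine (the spec path shown is a placeholder):

```sh
# Run the suite; TestServerExtension starts the Prism container on demand.
$ ./scripts/test

# Point the extension at an explicit OpenAPI spec (local path or URL)
# instead of the URL read from .stats.yml.
$ OPENAPI_SPEC_PATH=/path/to/openapi.yml ./gradlew test

# Skip all tests that require the mock server.
$ SKIP_MOCK_TESTS=true ./gradlew test
```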