
chore(internal): set up TestContainers for running mock server #537


Open
wants to merge 1 commit into main

Conversation

jdubois (Contributor) commented Jul 16, 2025

  • This removes the need to have NPM installed
  • This should be faster as there is no "npm install"
  • This removes the risk of having a mock server that keeps running in the background after the tests have been run (a sketch of the setup is included below)

Fix #54
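
For context, a minimal sketch of what a TestContainers-based mock server could look like from a Java test. Everything here is illustrative rather than taken from this PR: the stoplight/prism image and tag, the openapi.yml path, the port, and the MockServer helper name are assumptions.

    import org.testcontainers.containers.GenericContainer;
    import org.testcontainers.containers.wait.strategy.Wait;
    import org.testcontainers.utility.DockerImageName;
    import org.testcontainers.utility.MountableFile;

    public final class MockServer {
        private MockServer() {}

        // Hypothetical helper: starts a Prism mock container and returns its base URL.
        public static String start() {
            GenericContainer<?> prism =
                new GenericContainer<>(DockerImageName.parse("stoplight/prism:5"))
                    .withCopyFileToContainer(
                        MountableFile.forHostPath("openapi.yml"), "/tmp/openapi.yml")
                    .withCommand("mock", "-h", "0.0.0.0", "/tmp/openapi.yml")
                    .withExposedPorts(4010) // Prism's default mock port
                    .waitingFor(Wait.forLogMessage(".*Prism is listening.*", 1));
            prism.start();
            // Testcontainers' Ryuk reaper removes the container when the test JVM exits,
            // so no mock server is left running in the background.
            return "http://" + prism.getHost() + ":" + prism.getMappedPort(4010);
        }
    }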

TomerAberbach (Collaborator)

I am realizing that we use our own custom fork of Prism...

npm exec --package=@stainless-api/[email protected] -- prism mock "$URL" &> .prism.log &

Is it true that we'd need to publish our own Docker image of that to use it here?

jdubois (Contributor, Author) commented Jul 17, 2025

Oh, why are you doing this @TomerAberbach? That's a bit surprising, especially as I have all the tests working fine with the "standard" Prism. You can try this PR, everything is OK!
Then if that's what you want, I can still make this work in TestContainers, even without a custom image: it's just a matter of running NPM inside Docker. It'll just be a bit slower (time to pull everything) and not as reproducible (if there's a network issue when pulling an NPM package).
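
Along those lines, a hedged sketch of running the forked Prism via npm inside a stock Node container, with no custom image. The fork's npm coordinate is redacted above, so the --package value below is a placeholder; the node:20-alpine image, spec path, port, and log message are likewise assumptions.

    import java.time.Duration;

    import org.testcontainers.containers.GenericContainer;
    import org.testcontainers.containers.wait.strategy.Wait;
    import org.testcontainers.utility.DockerImageName;
    import org.testcontainers.utility.MountableFile;

    public final class ForkedPrismMock {
        private ForkedPrismMock() {}

        public static GenericContainer<?> start() {
            GenericContainer<?> prism =
                new GenericContainer<>(DockerImageName.parse("node:20-alpine"))
                    .withCopyFileToContainer(
                        MountableFile.forHostPath("openapi.yml"), "/tmp/openapi.yml")
                    // npx downloads and runs the forked Prism package on startup;
                    // "<prism-fork>@<version>" stands in for the fork's real coordinate.
                    .withCommand("npx", "--yes", "--package=@stainless-api/<prism-fork>@<version>",
                        "prism", "mock", "-h", "0.0.0.0", "/tmp/openapi.yml")
                    .withExposedPorts(4010)
                    // Allow extra time for the npm download on first start.
                    .waitingFor(Wait.forLogMessage(".*Prism is listening.*", 1)
                        .withStartupTimeout(Duration.ofMinutes(5)));
            prism.start();
            return prism;
        }
    }

As noted above, this trades startup speed and reproducibility (the npm pull happens at container start) for not having to publish and maintain a dedicated Docker image of the fork.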

@TomerAberbach TomerAberbach changed the title Set up TestContainers instead of running the mock server from a script. chore(internal): set up TestContainers for running mock server Jul 17, 2025
TomerAberbach (Collaborator)

I think the reason is that Prism has a bunch of bugs and weird behaviors that we've had to fix over time.

It's quite possible that none of these issues currently affect OpenAI, but they affect other Stainless customers, and so they may affect OpenAI in the future as the spec is updated.

If it's not too much trouble, making it work with the fork would be amazing 🙏

Successfully merging this pull request may close these issues.

Add TestContainers for running the Mock server