item 39531892

redact207 | 2 years ago

I didn't quite understand why this was made. We create our local test environments using docker-compose, and so I read:

> Creating reliable and fully-initialized service dependencies using raw Docker commands or using Docker Compose requires good knowledge of Docker internals and how to best run specific technologies in a container

This sounds like a <your programming language> abstraction over docker-compose, which lets you define your docker environment without learning the syntax of docker-compose itself. But then

> port conflicts, containers not being fully initialized or ready for interactions when the tests start, etc.

means you'd still need a good understanding of docker networking, dependencies, healthchecks to know if your test environment is ready to be used.

Am I missing something? Is this basically changing what starts your docker test containers?

c0balt|2 years ago

Going to the sections for language interactions shows a lot more, e.g., the first full Go example: https://testcontainers.com/guides/getting-started-with-testc...

Shows how you can embed the declaration of a database for testing in a unit test:

> pgContainer, err := postgres.RunContainer(ctx,
> testcontainers.WithImage("postgres:15.3-alpine"),
> postgres.WithInitScripts(filepath.Join("..", "testdata", "init-db.sql")),
> postgres.WithDatabase("test-db"),
> postgres.WithUsername("postgres"),
> postgres.WithPassword("postgres"),
> testcontainers.WithWaitStrategy(
> wait.ForLog("database system is ready to accept connections").

This does look quite neat for setting up test specific database instances instead of spawning one outside of the test context with docker(compose). It should also make it possible to run tests that require their own instance in parallel.

simonw|2 years ago

On Hacker News you need to indent code examples with four spaces - like this:

    pgContainer, err := postgres.RunContainer(ctx,
        testcontainers.WithImage("postgres:15.3-alpine"),
        postgres.WithInitScripts(filepath.Join("..", "testdata", "init-db.sql")),
        postgres.WithDatabase("test-db"),
        postgres.WithUsername("postgres"),
        postgres.WithPassword("postgres"),
        testcontainers.WithWaitStrategy(
            wait.ForLog("database system is ready to accept connections").

peterldowns|2 years ago

This seems great but is actually quite slow. This will create a new container, with a new postgres server, and a new database in that server, for each test. You'll then need to run migrations in that database. This ends up being a huge pain in the ass.

A better approach is to create a single postgres server one-time before running all of your tests. Then, create a template database on that server, and run your migrations on that template. Now, for each unit test, you can connect to the same server and create a new database from that template. This is not a pain in the ass and it is very fast: you run your migrations one time, and pay a ~20ms cost for each test to get its own database.

I've implemented this for golang here — considering also implementing this for Django and for Typescript if there is enough interest. https://github.com/peterldowns/pgtestdb
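The template approach above can be sketched roughly as follows. This is a minimal Go sketch, not pgtestdb itself: the `uniqueTestDB` and `createFromTemplateSQL` helper names are made up, and actually running the one-time migrations and executing the statement against a server via database/sql is left out.

```go
package main

import (
	"fmt"
	"sync/atomic"
)

var dbCounter int64

// uniqueTestDB returns a database name that is unique within this
// test run, so tests running in parallel never collide.
func uniqueTestDB(prefix string) string {
	n := atomic.AddInt64(&dbCounter, 1)
	return fmt.Sprintf("%s_%d", prefix, n)
}

// createFromTemplateSQL builds the statement that clones a migrated
// template database. CREATE DATABASE ... TEMPLATE copies the template's
// files directly, which is why each clone costs milliseconds instead of
// re-running migrations.
func createFromTemplateSQL(name, template string) string {
	return fmt.Sprintf(`CREATE DATABASE %q TEMPLATE %q`, name, template)
}

func main() {
	// One-time setup (outside this sketch): start a single Postgres
	// server, run migrations against a database such as "app_template".
	// Then, per test:
	name := uniqueTestDB("test")
	fmt.Println(createFromTemplateSQL(name, "app_template"))
	// A real test would execute that statement via database/sql and
	// connect to the freshly created database.
}
```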

ath3nd|2 years ago

As a user of testcontainers I can tell you they are very powerful yet simple.

Indeed, all they do is provide an abstraction for your language, but this is so useful for unit/integration tests.

At my work we have many microservices in both Java and Python, all of which use testcontainers to set up the local env or integration tests. The integration with localstack, and the ability to programmatically set everything up without fighting with compose files, is something I find very useful.

stonecolddevin|2 years ago

Testcontainers is great. It's got seamless JUnit integration and really Just Works. I've never once had to even think about any of the docker aspects of it. There's really not much to it.

mleo|2 years ago

It’s not coming across in your comment, but Testcontainers can work with unit tests to start a container, run the tests, and shut it down afterward. For example, to verify database operations against a real database, the unit test can start an instance of Postgres, run the tests, and then shut it down. If running tests in parallel, each test can start its own container and shut it down at the end.

DanHulton|2 years ago

Wouldn't that just massively, _massively_ slow down your tests, if each test was spinning up its own Postgres container?

I ask because I really like this and would love to use it, but I'm concerned that that would add just an insane amount of overhead to the point where the convenience isn't worth the immense amount of extra time it would take.

marginalia_nu|2 years ago

Testcontainers are for testing individual components, apart from the application.

I built a new service registry recently; its unit tests spin up a zookeeper instance for the duration of the test, and then kill it.

Also very nice with databases. Spin up a clean db, run migrations, then test db code with zero worries about accidentally leaving stuff in a table that poisons other tests.

I guess the killer feature is how well it works.

dns_snek|2 years ago

> Also very nice with databases. Spin up a clean db, run migrations, then test db code with zero worries about accidentally leaving stuff in a table that poisons other tests.

Are you spinning up a new instance between every test case? Because that sounds painfully slow.

I would just define a function which DELETEs all the data and call it between every test.
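A minimal Go sketch of such a reset helper. Assumptions: `resetSQL` is a made-up name, and it uses a single TRUNCATE rather than per-table DELETEs (which also resets sequences); executing the statement against the database in a per-test teardown hook is left out.

```go
package main

import (
	"fmt"
	"strings"
)

// resetSQL builds one statement that wipes the given tables between
// test cases. RESTART IDENTITY resets sequences so every test sees
// fresh ids; CASCADE also truncates tables referencing these via
// foreign keys.
func resetSQL(tables []string) string {
	return fmt.Sprintf("TRUNCATE %s RESTART IDENTITY CASCADE",
		strings.Join(tables, ", "))
}

func main() {
	// In a real suite this would be executed via database/sql in a
	// teardown hook that runs after each test case.
	fmt.Println(resetSQL([]string{"users", "orders"}))
}
```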

codeonline|2 years ago

This looks to be just language-specific bindings over the docker compose syntax. You're right that docker compose handles all of the situations they describe.

mickael-kerjean|2 years ago

The major issue I had with docker compose in my CI environment is flaky tests when a port is already used by another job I don't control. With testcontainers, I haven't seen any false positives, since I can use whatever port is available rather than hardcoding one and hoping it won't conflict with what other people are doing.
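One common way to get a conflict-free port is to let the kernel pick one by listening on port 0, which is similar in spirit to how Testcontainers maps container ports to randomly assigned host ports. A minimal Go sketch; `freePort` is a made-up helper, and note there is a small race window between closing the listener and the container actually binding the port.

```go
package main

import (
	"fmt"
	"net"
)

// freePort asks the kernel for an unused TCP port by listening on
// port 0, reads the assigned port, then releases the listener.
func freePort() (int, error) {
	l, err := net.Listen("tcp", "127.0.0.1:0")
	if err != nil {
		return 0, err
	}
	defer l.Close()
	return l.Addr().(*net.TCPAddr).Port, nil
}

func main() {
	p, err := freePort()
	if err != nil {
		panic(err)
	}
	fmt.Println("publish the container port to host port", p)
}
```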

AlfeG|2 years ago

We create our own DB env for each set of test fixtures to run them in parallel. There is no way I could achieve this with so little friction.

cosmosgenius|2 years ago

I just started using them specifically to test the docker container implementation itself (correctness of the Dockerfile, entrypoint, etc.)