[+] [-] nate908|10 years ago|reply
I stumbled upon Docker Compose yesterday, so please excuse my ignorance (I'm only familiar with it at a high level), but Docker Compose looks very similar to DevLab. How is DevLab different than Docker Compose?
https://docs.docker.com/compose/
[+] [-] TomFrost|10 years ago|reply
Compose is actually part of the reason we wrote this tool. We tried hard to make it work for a TDD workflow, but it was always cumbersome.
Compose wants you to build your application into a container, and build and run that container every time you have a task to run. This takes time and a fair amount of cleanup, especially when you want a clean environment to run your tests that doesn't persist to the next run. DevLab just wants you to specify the environment to plug your application into, and doesn't build a container at all.
The result is:
- No manual cleanup
- No pile of images or processes that stack up
- Your project doesn't have to be a Docker project. It doesn't need a Dockerfile. You can use this for something you plan to clone on an EC2 node and run from an upstart script.
- No Vagrant/Ansible/Chef/Puppet, no server config, nothing. You pick an image that matches your environment's needs (node:4.2, wordpress, go, etc.; hub.docker.com is a great place to start) and DevLab plugs your app into it.
- More development-oriented features beyond the pure Docker ecosystem that Compose stays confined to. For example, Mac users running docker-machine will soon get ports bound to localhost, much like Linux Docker users do [1].
Before DevLab, we got this functionality with a monolithic Makefile that maintained all of this for us. It wasn't DRY and it wasn't smooth. I hope you enjoy DevLab!
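To make the contrast concrete, here is a sketch in plain Docker commands. The image tag, paths, and task are illustrative only, not DevLab's actual syntax:

```
# Compose-style workflow: your project is itself a Docker project.
docker build -t myapp .                      # requires a Dockerfile
docker-compose up -d                         # build and start app + services
docker-compose stop && docker-compose rm -f  # cleanup is on you

# The pattern DevLab automates: mount the project into a stock image,
# run one task in a throwaway container, and let --rm erase it after.
docker run --rm -v "$PWD":/app -w /app node:4.2 npm test
```

The second form leaves nothing behind: no Dockerfile in your repo, no built image, no stopped container.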
[1] https://github.com/TechnologyAdvice/DevLab/pull/10
[+] [-] irickt|10 years ago|reply
"... Between Makefiles that spun up commands and Docker-Compose we were tired of our tooling getting in our way, or creating extra tasks for us to manage. The goal of DevLab is to have a tool with a small footprint, both in application and configuration requirements, which allows for local containerization using Docker. ..."
[+] [-] awalGarg|10 years ago|reply
Sorry, I'm pretty much a noob (especially at containerization) compared to most of the people here, but I can't resist asking: what is the need for this? What exactly does the author mean by "services (like databases) without needing to setup and maintain these environments manually"? Don't we just install node/mongo once and then only apply updates?
My question isn't just about this particular tool, but about every "let's dev in a container" approach, like Python's virtualenv or npm's local copy of all the modules. Does it really happen that your node binary or $language interpreter/compiler gets infected or changed somehow? Or that one of your crucial modules gets borked? And if it does, shouldn't we investigate why that happened in the first place instead of duplicating everything?
As for maintenance, what kind of maintenance does node (or any compiler or interpreter) require? Doesn't it just sit there waiting to be invoked with some code and return output or a compiled binary? The only thing I have ever needed is something like nvm/rvm for using multiple versions of the same interpreter or compiler. Can anyone educate me?
[+] [-] ksafranski|10 years ago|reply
While purely local development works, the problem you often face in a fast-paced development environment is drift between your local system and production. That's not only things like file paths, service versions, and configuration; a long-lived local machine also accumulates state.
With DevLab, you not only get an environment that closely matches (or can be made to closely match) production, you also get a clean, blank-slate installation any time you run tests or any other task against your app. If you develop purely locally, you're prone to database cruft, evolving configuration, pre-existing tempfiles, and other complications that good development tooling can easily avoid.
Now extend that idea to multiple applications: chances are, your team isn't developing just one thing. You probably have multiple development concerns, and if you have an architecture like an SOA, you may have a multitude of microservices. When you update the language they're based on, be it node, go, python, or anything else, you need the ability to test and update your applications individually. You need your entire team to be using the same versions for the same applications. And more than anything, you need your potentially large team to get all these updates without collective hours of manual work to orchestrate those configurations on their own local machines.
My colleague addressed the question of DevLab vs. Compose elsewhere in this topic, but even if you don't use DevLab, I do heavily encourage virtualizing your environment to avoid these pitfalls in the way that makes the most sense for your team.
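As a concrete sketch of the version-pinning point (all names and versions below are illustrative): nvm, which the grandparent mentions, pins just the interpreter, while a container image tag extends the same pin to backing services and gives each test run a blank slate.

```
# Without containers: pin the interpreter via a checked-in .nvmrc
echo "4.2.2" > .nvmrc
nvm use                  # every teammate now runs node 4.2.2

# With containers: image tags pin the runtime *and* the services,
# and each test run starts against a fresh database.
docker run -d --name test-mongo mongo:3.0
docker run --rm -v "$PWD":/app -w /app --link test-mongo:mongo \
  node:4.2.2 npm test
docker rm -f test-mongo  # discard the database state entirely
```

Updating the whole team to a new runtime then becomes a one-line change in version control rather than per-machine work.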
[+] [-] bobfromhuddle|10 years ago|reply
I don't see much use for this project over, say, Compose, but I do use dockerised services for development in my current day job.
I'm an architect helping to move a cluster of monoliths to a proper service-oriented setup; some of the teams work in PHP (Symfony) with MySQL, some in Python 2.7 with Postgres, some in Python 3.4 with [insert hipster technology here]. We use Redis and Couchbase at various versions for different systems, Eventstore for messaging, and Elasticsearch.
That's a lot of moving parts to keep updated, but because we're deploying docker containers to production anyway, it's really simple to get up and running by just running `docker-compose up` from the root of a project.
tl;dr you're correct, unless you have to work in multiple systems that can version their infrastructure separately.
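For reference, a minimal compose file for one of those stacks might look like this (service names and versions are illustrative, in the v1 syntax current at the time):

```
# docker-compose.yml: one Python service plus its backing stores
web:
  build: .
  ports:
    - "8000:8000"
  links:
    - db
    - cache
db:
  image: postgres:9.4
cache:
  image: redis:2.8
```

With that checked in, `docker-compose up` gives every developer the same Postgres and Redis versions with no manual installs.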
[+] [-] Natanael_L|10 years ago|reply
As far as I can tell, the entire point is to ensure your code is platform-independent from the start, instead of making subtle assumptions based on how your development boxes are configured: make the test environment as similar to the deployment environment as you can.
[+] [-] cabirum|10 years ago|reply
I think it just adds complexity by introducing yet another external dependency to the dev environment. I'm not sure it saves time on projects with custom services (ones not available as Docker images with the required configuration) versus running a traditional VM.