Typically we have an app that consists of multiple Docker containers (service A, service B, Redis, Postgres, etc.). How would you instantiate this environment and execute tests during a Jenkins build? E.g. simply run docker-compose on the Jenkins master?
Any tips on testing with a dockerized Jenkins would be appreciated.
Comments:
weitzj:I'd suggest adding an agent (f.k.a. slave) to Jenkins instead of installing any tools on the master or using on-master executors. This approach is better for several reasons, including performance, security, and scalability.
Install Docker, Docker Compose, and any other required tools (Git, etc.) on the agent, then invoke your build process from shell (or PowerShell) scripts. You don't need any Docker plugins to do this.
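As a rough sketch, the build step can be a plain shell script along these lines (the compose file name and the test-runner service are placeholders for whatever your project actually uses):

```bash
#!/usr/bin/env bash
# Hypothetical Jenkins "Execute shell" build step: bring the stack up,
# run the tests, and always tear the stack down again afterwards.
set -euo pipefail

cleanup() {
  docker-compose -f docker-compose.test.yml down -v --remove-orphans
}
trap cleanup EXIT

# Start service A, service B, Redis, Postgres, etc. in the background
docker-compose -f docker-compose.test.yml up -d --build

# Run the test suite against the running stack; its exit code decides
# whether the Jenkins build passes or fails
docker-compose -f docker-compose.test.yml run --rm test-runner
```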
Keep the agent clean by running Docker cleanup scripts on a cron schedule, or as a post-build action.
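For example, a nightly cleanup job can be as small as this (the path and schedule are only illustrative):

```bash
#!/usr/bin/env bash
# Example cleanup script, e.g. saved as /usr/local/bin/docker-cleanup.sh
# and scheduled from cron with: 0 3 * * * /usr/local/bin/docker-cleanup.sh
# Removes stopped containers, unused images, networks, and volumes.
docker system prune -af --volumes
```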
If the agent itself is a container, then you'll need to install Docker and Docker Compose inside the container and bind mount /var/run/docker.sock (Docker on Docker), similar to what I've done in the examples here and here.
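A minimal sketch of that setup, assuming a hypothetical agent image (my-jenkins-agent) built on top of the official inbound-agent image with the Docker CLI and Docker Compose added:

```bash
# Bind mounting the host's Docker socket means containers started from
# inside the agent actually run as siblings on the host's Docker daemon.
# The JENKINS_* variables are the ones the official inbound-agent image
# expects; the URL, agent name, and secret below are placeholders.
docker run -d \
  --name jenkins-docker-agent \
  -v /var/run/docker.sock:/var/run/docker.sock \
  -e JENKINS_URL=https://jenkins.example.com/ \
  -e JENKINS_AGENT_NAME=docker-agent \
  -e JENKINS_SECRET=<agent secret from the node's configuration page> \
  my-jenkins-agent   # hypothetical image: inbound-agent + Docker CLI + Compose
```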
themissedsymphony:Install Docker and Docker Compose on a slave and add docker and docker-compose labels to that node. Then, in your job, use a bash script that runs docker-compose up -d with an exit trap on the script:
```bash
function stopDocker {
  docker-compose stop
}
trap stopDocker EXIT
```
You can do this with either the Docker plugin, a configured host machine, or the Kubernetes plugin. But in general it's very difficult, and every so often you will have to clean up a lot of bad state (containers left running, full hard drives, etc.).
For this reason I recommend Google Container Builder, CircleCI, or indeed Travis CI, as they can run all of the above and clean up after themselves. I had some success with Wercker, but it's very limiting when you want to build a special image (say, one with libvips or other C libraries).
