Leveraging the pipeline concept
CI tools help to clarify how a build should proceed and are structured around the concept of a pipeline. A pipeline is a collection of stages; if any of them fails, the pipeline stops.
Each stage in the pipeline can produce elements that can be used in later stages or made available as the final product of the full build. These final elements are known as artifacts.
Let's look at an example of a pipeline:
The first stage pulls the latest commit from the source control system. Then, we build all the containers and run both the tests and the static analysis. If everything has been successful, we tag the resulting server container and push it to the registry.
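As a rough sketch, assuming the containers are defined as services in a docker-compose.yml file and using placeholder names such as test, static-analysis, and server, this sequential pipeline could look something like the following CI configuration (the initial pull of the latest commit is normally handled by the CI tool itself when it checks out the repository):

services:
  - docker

script:
  # Build all the containers defined in docker-compose.yml
  - docker-compose build
  # Run the tests and the static analysis in their own containers
  - docker-compose run test
  - docker-compose run static-analysis
  # Tag the resulting server image and push it to the registry
  - docker tag server:latest registry.example.com/server:latest
  - docker push registry.example.com/server:latest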
CI tools normally allow a great deal of configuration in their pipelines, including the possibility of running different stages in parallel. For stages to run in parallel, they need to be parallelizable, meaning that they must not change the same elements.
If the chosen CI tool allows running stages in parallel, the pipeline could be defined as follows:
Note that we build the database and the test images in parallel. The next stage builds the rest of the images; since most of their layers are already available in the cache, it will be very quick. Both the tests and the static analysis can then run in parallel, in two different containers.
This may speed up complex builds.
The pipeline is described in a script or configuration file specific to the CI tool. We'll look at an example with Travis CI later.
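As a rough preview of what such a parallel pipeline might look like (the full configuration is covered later), here is a sketch using Travis CI's build stages, where jobs that share a stage name run in parallel and each stage waits for the previous one to finish; the service names (db, test, static-analysis, server) and the registry URL are placeholders:

services:
  - docker

jobs:
  include:
    # These two jobs share a stage, so they run in parallel
    - stage: build db and test images
      script: docker-compose build db
    - stage: build db and test images
      script: docker-compose build test
    # Build the remaining images; with a warm cache this is very quick
    - stage: build the rest
      script: docker-compose build
    # The tests and the static analysis run in parallel, in two containers
    - stage: checks
      script: docker-compose run test
    - stage: checks
      script: docker-compose run static-analysis
    # Tag the server image and push it to the registry as the final artifact
    - stage: push
      script:
        # Rebuild the server image (quick if the cache is available), then tag and push it
        - docker-compose build server
        - docker tag server:latest registry.example.com/server:latest
        - docker push registry.example.com/server:latest

Keep in mind that this is only a sketch; how the Docker build cache is shared between stages depends on the CI tool and its configuration, and often involves pushing intermediate images to a registry or using a dedicated cache.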