Containers. Containers everywhere. I have been traveling the tech trade-show circuit lately, and containers are without a doubt the hottest trending topic. Thanks to Docker, the open-source tooling that makes building, shipping, and running applications via containers easier than ever, containers are starting to eat away at the monolithic world we were living in. The question is no longer “will it run on Windows or Linux?” but simply “does it run on Docker?”
What are Containers?
Containers are standalone packages of software that include everything we need to run an application: code, runtime, settings, libraries, and so on.
Containers isolate software from the environment and system it’s running on, ensuring that the software will always run consistently, regardless of the underlying infrastructure. And although they offer benefits similar to virtual machines, Docker containers are much more efficient and portable. They don’t require virtualizing an entire operating system, just a Docker engine layer.
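To make that concrete, here is a minimal sketch of a Dockerfile for a small Python service. The file names (app.py, requirements.txt) are assumptions, but the pattern is the same for any stack: the image bundles the runtime, the libraries, and the code into one shippable unit.

```
# Start from a slim Python runtime so the container carries its own interpreter
FROM python:3.12-slim

WORKDIR /app

# Install the application's libraries into the image itself
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# Add the application code
COPY . .

# The same command runs identically on a laptop, a CI agent, or a cloud host
CMD ["python", "app.py"]
```

Building and running it is the same everywhere: docker build -t my-service . followed by docker run my-service. Because the image carries its own interpreter and libraries, “works on my machine” stops being a variable.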
The History of Containerization
If you ever want to see how standardization can revolutionize an industry, take a trip back in history to the adoption of physical containerization by shipping companies in the 1950s. Nearly overnight, everything coming off a ship was the exact same size and shape. No more priceless cars that needed extra attention, or barrels of liquid, or bars of soap. Just neat containers lined up in the hold of a ship, ready to be lifted off by the same crane you can find at nearly any port in the world. Not only did this drastically shorten shipping times, reduce port congestion, and cut the amount of lost cargo; it also displaced thousands of longshoremen around the world who had formerly handled “bulk cargo”.
More than half a century later, containers are having the same effect on the software shipping world that they had on the physical shipping world. And just like shipping containers, software containers are drastically changing downstream functions and processes like the cloud, microservices, CI/CD, and, most importantly, testing.
How Containers Are Being Used In Testing
Software testing is going through a revolution of its own as it tries to become faster, smarter, and more embedded in our continuous delivery pipelines. Testing teams no longer have weeks to test new software, only hours or even minutes. Because of this, the QA world has turned to automation and parallelization to relieve time bottlenecks. Virtualization was the earlier answer to parallelization: teams spun up concurrent virtual machines on whatever infrastructure was available or left over, and ran their tests against the application on each one.
With Docker, teams can spin up different containers at will without towers of dedicated infrastructure; a local laptop or a cloud instance is enough. Suddenly, concurrency is possible not just at the level of tens of parallel machines but at the level of hundreds of parallel machines. This has transformed the way testing fits into the CI/CD pipeline, easing the traditional time and resource bottlenecks with a multi-container testing strategy.
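As a rough sketch of what that looks like in practice, the shell loop below fans a test suite out across four containers on a single machine. The my-tests image and the SHARD/TOTAL_SHARDS variables are hypothetical conventions for splitting the suite; the point is that adding concurrency is one more docker run, not one more server.

```
# Fan the suite out across four containers; each container runs one shard.
# "my-tests", SHARD, and TOTAL_SHARDS are assumed names for this sketch.
for shard in 1 2 3 4; do
  docker run --rm -e SHARD=$shard -e TOTAL_SHARDS=4 my-tests &
done
wait  # block until every shard has finished
```

Scaling from four shards to forty is a change to the loop bound, which is the appeal over maintaining fleets of test VMs.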
Microservices and Containers
Containers are often used for microservices, an architectural pattern that splits an application into smaller, independent services. By giving each of these services its own isolated environment, Docker has made it incredibly simple to isolate and remediate bugs and defects. Testing teams now know exactly where to look in their infrastructure or code when tests fail.
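A hedged sketch of why that isolation helps: in a compose file like the one below (the service and image names are invented), each microservice lives in its own container, so when a test against the orders API fails, that one container's image and logs are the first place to look.

```
# Hypothetical docker-compose.yml: one container per microservice,
# so a failing test maps directly to one service's image and logs
services:
  orders:
    image: example/orders-service:latest    # assumed image name
    ports:
      - "8081:8080"
  payments:
    image: example/payments-service:latest  # assumed image name
    ports:
      - "8082:8080"
```

A quick docker compose logs orders then narrows the failure to one service instead of one monolith.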
Although containers and microservices have had many positive impacts on testing, developing and testing applications concurrently has also presented new challenges. Dependent services, data replication, and port clashes are just some of the problems that arise when containers are used inside a continuous delivery pipeline. One way to solve this is service virtualization, which stands in for dependent services and data with virtualized versions, so the pipeline always has something reliable to test against.
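One common way to do that, sketched below, is to run a mock server such as WireMock in its own container and register canned responses in place of the dependent service. The /inventory/42 endpoint and its payload are invented for the example.

```
# Stand in for a downstream dependency with a WireMock container (8080 is its default port)
docker run --rm -d -p 8080:8080 wiremock/wiremock

# Register a canned response so tests get stable data without the real service
curl -X POST http://localhost:8080/__admin/mappings \
  -d '{"request":  {"method": "GET", "url": "/inventory/42"},
       "response": {"status": 200, "jsonBody": {"sku": 42, "inStock": true}}}'
```

Tests then point at the stub instead of the real dependency, which also sidesteps the data-replication problem: the test data lives in the mapping itself.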
Container performance is also a challenge. Because of the way Docker containers and clusters are used, many developers overlook how their application will perform at scale. But running load and performance tests on container infrastructure is critical, as clusters of containers can react differently under periods of intense traffic. Once the application is in production, measuring the performance of your container infrastructure can help you find bottlenecks and defects before your customers experience any lag.
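As one sketch of what that can look like, the snippet below drives load at a containerized service with k6, itself running in a container, while docker stats reports per-container resource usage. The target address is an assumption: host.docker.internal resolves the host on Docker Desktop, while Linux typically needs an extra --add-host=host.docker.internal:host-gateway flag.

```
# Generate load from a k6 container: 50 virtual users for one minute
docker run --rm -i grafana/k6 run - <<'EOF'
import http from 'k6/http';
export const options = { vus: 50, duration: '1m' };
export default function () {
  // assumed address and port of the service under test
  http.get('http://host.docker.internal:8081/');
}
EOF

# In another terminal, watch per-container CPU, memory, and network usage
docker stats
```

Watching the two side by side is a cheap way to see whether a cluster that looks fine at ten requests per second falls over at a thousand.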