
Docker & DevOps - Part 3

How the Understanding of Servers Has Changed

Now that we have taken a look at the concepts of dynamic infrastructure and Infrastructure as Code, let's take a quick look at Docker, Amazon Web Services, and DevOps, as they have had a lasting impact on how servers are understood and used.

Docker as a new standard for software development

Infrastructure as Code is often encountered in connection with Docker. Docker introduces a different way of thinking about what was previously understood by the term "server": it splits an application into smaller parts that run as individual services in Docker containers.

Docker is often compared to a container ship: each part of the application sits in its own container, and the containers can communicate with each other so that the application still runs as a whole.

Modularization of applications

The concept of modularization is not a secret trick, but a standard approach in IT: major problems are solved by dividing them into smaller parts. Subsequently, the individual solutions are connected to solve the larger problem. Of course, this procedure has its own name: divide and conquer (Lat. "divide et impera").

In complex environments, modularization is the key to replacing "luck" with predictability and reliability. But modularization can be accomplished without Docker, so it is not the whole point here. Rather, modularization is a prerequisite for using Docker.

One of the biggest challenges in managing server infrastructures for complex applications is finding matching versions for libraries, base applications, and other third-party components. In other words, it's often a time-consuming puzzle to prepare a server for installing an application.

Since updates are necessary for security, the puzzle starts over with each new version. In addition to compatibility, optimization is also a challenge: a configuration setting that is ideal for module A may slow down module B.

Wouldn't it be a good idea if every part (service) of the application had its own technical platform? This is indeed the core idea of Docker: the environment for each module of the application is called a Docker container. At first this may sound cumbersome, but in practice it is easier to manage several Docker containers than to solve the puzzle. Infrastructure as Code makes it possible to control many Docker containers efficiently.
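
As a rough sketch of this idea, a Docker Compose file could describe an application split into a web frontend, an API service, and a database, each running in its own container. The service names and images below are purely illustrative assumptions, not taken from a real project:

    # docker-compose.yml -- illustrative example only
    # Each service gets its own container and therefore its own environment.
    services:
      web:
        image: nginx:1.25          # web frontend with its own base image
        ports:
          - "80:80"
        depends_on:
          - api
      api:
        build: ./api               # application code bundled with its own runtime
        environment:
          DATABASE_URL: "postgres://app:secret@db:5432/app"
        depends_on:
          - db
      db:
        image: postgres:16         # the database runs isolated in its own container
        environment:
          POSTGRES_USER: app
          POSTGRES_PASSWORD: secret
          POSTGRES_DB: app

A single command such as "docker compose up" then starts all the containers together, while each of them keeps its own libraries and configuration.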

Virtualization of applications

The application, or rather each module of it, is bundled together with the system environment it needs. As mentioned above, this simplifies the setup and administration of the application and its environment. Docker also makes the application independent of the underlying operating system.
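
To illustrate what "bundled with the underlying system" can look like in practice, here is a hypothetical Dockerfile for one such module; the base image, package list, and file names are assumptions made for the example:

    # Dockerfile -- illustrative example for a single application module
    # The module brings its own base system and runtime with it.
    FROM python:3.12-slim

    WORKDIR /app

    # Install exactly the library versions this module needs,
    # independent of what is installed on the host server.
    COPY requirements.txt .
    RUN pip install --no-cache-dir -r requirements.txt

    # Copy the module's own code into the image.
    COPY . .

    # The resulting image runs unchanged on a laptop, a test server, or in the cloud.
    CMD ["python", "main.py"]

The module's dependencies live inside the image, so the host server no longer has to provide matching library versions itself.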

This is essentially the same concept as in the field of virtualization. Running a Windows server as a virtual machine makes it independent of the hardware, and thus of the respective drivers. The hardware the virtual machine sees is always the same "simulated" virtual hardware, and it does not change when the physical server hardware is replaced. Docker combines the benefits of virtualization with the benefits of application modularization.

Cut operating costs with Amazon Web Services

Amazon Web Services uses technology that is conceptually very similar to Docker. With this technology comes a new way of thinking about hosting: you pay for computing time rather than renting a server. In addition, the Docker containers can be tailored precisely to the requirements.

This means that you can choose a cheaper Amazon Web Services product when a smaller one is enough. Amazon really does use clothing sizes: EC2 instance types, for example, come in small, medium, large, xlarge, and 2xlarge variants, and there are also products with different feature sets.

Finding the right-sized product for each Docker container can be a challenge, but it pays off: operating costs are tailored to the level of performance and reliability you actually need.

Better Collaboration Between Software Development and IT Operations Thanks to DevOps

Web software development and server operation used to be two distinct areas of activity. Technologies like Docker and cloud services like Amazon Web Services are leading to a different way of thinking about infrastructure. DevOps means that the software development team takes over some responsibilities from the server administrators.

Because the infrastructure is defined as code, there is a clearly structured interface between software developers and server administrators. The software developers have the deepest knowledge of the application, its modules, and how they interact, so it makes sense that these specialists use their knowledge to define the corresponding parts of the infrastructure.

Do you need support?

Amazon Web Services offers a large number of products, and finding the right ones can feel like navigating a jungle. Setting them up properly is also not always easy. We can help you set up an Amazon Web Services infrastructure, or we can do it for you.

We also support you in Dockerizing your existing application. As part of the DevOps mentality, h.com also monitors and optimizes the infrastructure and adjusts it whenever software updates require it. We are there wherever you need us. Contact us.

Christian Haag, founder and CEO of h.com networkers GmbH

Does that sound interesting? Would you like to know more?

Good software is created in dialogue, and we are happy to exchange ideas with you.