In 2013, Docker was the “it” company. Docker made headlines for the critical role it played in bringing containers into the mainstream, in many ways displacing PaaS as the hot technology of the moment (Heroku, anyone?). Now the company is back in the press with the introduction of a new model for Docker Desktop that requires large organizations to purchase a paid subscription for the tool. The candid reactions to that announcement remind me of the important role Docker played in promoting a model we now know, love, and rely on: containers.

Docker did not invent containers, but it made the technology accessible through an open source tool and reusable images. With Docker, developers could truly build an application once and run it anywhere, whether locally or on a production server.
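
A minimal sketch of what that workflow looks like in practice (the Dockerfile contents, the myapp image name, and app.py are illustrative assumptions, not from the original article):

    # Write a minimal Dockerfile for a hypothetical Python web app
    cat > Dockerfile <<'EOF'
    FROM python:3.11-slim
    COPY app.py /app.py
    CMD ["python", "/app.py"]
    EOF

    # Build the image once on a laptop...
    docker build -t myapp:1.0 .

    # ...run it locally during development...
    docker run --rm -p 8000:8000 myapp:1.0

    # ...and run the exact same image on a production server
    docker run -d --restart=always -p 8000:8000 myapp:1.0

The image, not the machine, carries the environment, which is what makes the “build once, run anywhere” promise hold.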

The fact that the Docker command line tool has displaced years of flashy web interfaces may say something about what developers really want. But to truly understand the impact of Docker, it’s important to go back to the time just before Docker’s container technology first came on the scene.

Looking for the next big thing

By 2009, the value of virtualization was well understood and widely evangelized. Most organizations had already realized the benefits of virtualization or had a roadmap for getting there. The marketing machine had grown tired of virtualization, and people were hungry for the next innovation in IT and software development. It came in the form of Heroku. In fact, PaaS in general, and Heroku in particular, became enormously popular, so much so that it seemed PaaS would take over the world.

At the time, Heroku was huge. You could just go to a portal, develop your applications, and offer them as a service? What’s not to like? Why wouldn’t you develop apps on Heroku?

As it turns out, there were two good reasons not to use Heroku and PaaS platforms like it. First, apps built on Heroku were not portable; they ran only inside Heroku. Developers had to work remotely on the PaaS platform if they wanted to collaborate. Streaming may work for Netflix, but it turns out that developers love to develop locally. If developers wanted to work on their local machines, they still had to recreate the environment by hand.

Second, although the Heroku model was very powerful if you used what was provided out of the box, it was complex behind the scenes. Once your team built something more sophisticated than a simple web application, or needed to customize the infrastructure for security or performance reasons, you had a very difficult, very real engineering problem on your hands.

It was great…until it wasn’t. But in typical IT fashion, a lot of people piled in before realizing that platforms like Heroku have their place, but they’re not the right tool for every job.

Docker swarms in

Containers, on the other hand, solved many of the challenges with PaaS, and Docker was the company that made developers, IT managers, and business leaders see and understand this. In fact, when Docker appeared, its value was startlingly obvious: Everything that was hard with Heroku was easy with Docker, and everything that was easy with Heroku was also easy with Docker. With Docker, you could launch a pre-built service quickly and easily, but you could also develop locally just as easily, and customize services to make them do what you needed them to do.
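
As an illustration of both halves of that claim, here is what launching and then customizing a pre-built service looks like with the Docker CLI (the postgres:15 tag, password, and config path are examples, following the pattern documented for the official PostgreSQL image):

    # Launch a pre-built database service in one command
    docker run -d --name db -e POSTGRES_PASSWORD=devpassword -p 5432:5432 postgres:15

    # Customize it: supply your own configuration instead of the defaults
    docker run -d --name db-tuned \
      -e POSTGRES_PASSWORD=devpassword \
      -v "$(pwd)/postgresql.conf:/etc/postgresql/postgresql.conf" \
      postgres:15 -c config_file=/etc/postgresql/postgresql.conf

On Heroku, that second step, swapping in your own infrastructure configuration, was exactly where things got hard.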

That’s not to say Docker was pretty. It actually leveraged a UX first made popular in the 1970s with Unix! Docker was just a command run in a Linux terminal, a far cry from the glossy graphical interfaces of most PaaS systems. But the Docker command line interface (CLI) was really elegant. In fact, I would argue that the Docker CLI in particular showed the world that bringing a modern sense of UX to CLI development can change the world.
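
A few commands are enough to show that ergonomics (the container and image names here are arbitrary):

    docker run -d --name web nginx   # start a service in the background
    docker ps                        # what's running?
    docker logs -f web               # stream the service's output
    docker stop web && docker rm web # shut it down and clean up

The verbs read like plain English, and each command composes with ordinary shell tools, which is precisely the old Unix UX dressed up with modern polish.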

Docker, and containers in general, provide the core technology for developing cloud-native applications. They have worked, and continue to work, across highly distributed architectures and within the devops and CI/CD (continuous integration and continuous delivery) models required today to meet customers’ ongoing demands for improvements without regressions (bugs, security issues, and so on).
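
As a rough sketch of how containers fit a CI/CD pipeline (the registry host, tag variable, and test script are hypothetical, and real pipelines wrap these steps in a CI system’s own syntax):

    # Build an immutable, versioned artifact
    docker build -t registry.example.com/myapp:"$GIT_SHA" .

    # Test the exact image that will ship, not a lookalike environment
    docker run --rm registry.example.com/myapp:"$GIT_SHA" ./run-tests.sh

    # Publish the tested artifact for deployment
    docker push registry.example.com/myapp:"$GIT_SHA"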

Containers allow developers to change applications quickly, without breaking the functionality that users rely on. Moreover, the ecosystem that has evolved around containers, including the seemingly unstoppable Kubernetes orchestration platform, has enabled organizations to effectively scale and manage growing pools of containers.
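
To make “scale and manage growing pools of containers” concrete, here is a minimal Kubernetes interaction (the deployment name and image are illustrative):

    # Ask Kubernetes to run a containerized service
    kubectl create deployment web --image=nginx

    # Scale the pool up as demand grows; Kubernetes handles placement and recovery
    kubectl scale deployment web --replicas=10
    kubectl get pods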

Developers quickly understood the value of containers. Operations teams quickly understood, and Silicon Valley investors understood too. But it took some work to convince managers, CIOs, and CEOs, accustomed to watching slick demos, that a command-line tool was better than all the bells and whistles of PaaS.

Life in a world full of containers

So here we are in 2021, and the command line tool is still making waves. This is remarkable, to say the least. It even appears there is room for two players in the market for container CLIs (see “Red Hat Enterprise Linux targets edge computing” and “When to use Docker vs. Podman: A developer’s perspective”).
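
Podman underlines the point by deliberately mirroring the Docker CLI, so the same muscle memory works with either tool (the image and names are examples):

    # The familiar verbs, run by a different engine
    podman run -d --name web -p 8080:80 nginx
    podman ps

    # Many teams simply alias one to the other
    alias docker=podman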

Now, thanks to the road paved by container technology, developers can work on-premises or in the cloud more easily than ever before. CIOs and CEOs can expect shorter development cycles, lower downtime risk, and even lower costs for managing applications over their lifecycle.

Docker is not perfect, and neither are containers. Arguably, it is more work to migrate applications into containers than onto virtual machines, but the benefits last for the entire lifecycle of the application, so it is worth the investment. This is especially true for brand-new applications, but it also applies to lift-and-shift migrations, or even to refactoring efforts.

Docker brought container technology front and center, displacing PaaS in the mainstream, and for that reason alone it truly changed the world.

At Red Hat, Scott McCarty helps educate IT professionals, customers, and partners on all aspects of Linux containers, from organizational transformation to technical implementation, and works to advance Red Hat’s go-to-market strategy around containers and related technologies.

New Tech Forum provides a venue to explore and discuss emerging enterprise technology in unprecedented depth and breadth. The selection is subjective, based on our pick of the technologies we believe to be important and of greatest interest to InfoWorld readers. InfoWorld does not accept marketing collateral for publication and reserves the right to edit all contributed content. Send all inquiries to newtechforum@infoworld.com.

Copyright © 2021 IDG Communications, Inc.
