Docker In Production: Deployment, Advantages & Best Practices

By DhineshSunder Ganapathi

Updated on Sep 10, 2022 | 14 min read | 14.2k views

Docker is a containerization platform that allows you to build, ship, and run distributed applications. Docker containers encapsulate all the files that make up an application, including code, runtime, and system libraries. With native integration for Windows and Linux systems, Docker offers a single solution for managing applications across multiple platforms. 

Applications such as MySQL run in a single container, which is a lightweight package containing the OS, application files, and dependencies. Containers are launched from images defined in Dockerfiles. To retain data between restarts, Docker volumes or host folder bindings are used. 

Docker Compose can launch multiple containers with one command using a docker-compose.yml file. Orchestration tools like Docker Swarm and Kubernetes manage container replication in production. This article explores Docker in production, its advantages, and best practices. 
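To make this concrete, here is a minimal, hypothetical docker-compose.yml that runs a web service alongside a MySQL container and uses a named volume so the database survives restarts (service names, the port, and the password are illustrative only):

```yaml
version: "3.8"
services:
  web:
    build: .               # image built from the Dockerfile in this directory
    ports:
      - "8000:8000"
    depends_on:
      - db
  db:
    image: mysql:8.0
    environment:
      MYSQL_ROOT_PASSWORD: example   # placeholder; use secrets in production
    volumes:
      - db-data:/var/lib/mysql       # named volume retains data between restarts
volumes:
  db-data:
```

Running docker compose up then starts both containers with a single command.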

Docker in Production

Before Docker, the software industry struggled with unpredictable application behavior when moving software between servers. Docker simplified this process. Key challenges included: 

  • Dependency Matrix - Guaranteeing the same runtime version (e.g., Java 8 or Python 3.5) across machines was difficult because different systems had different application needs. 
  • Time-consuming Migration - Migrations often introduced bugs, prompting the question, “What is different between this environment and the last one?” 
  • “It Works on My Machine!” - New developers frequently hit setup issues, giving rise to the phrase, “It works on my system.” 

Docker's ecosystem is evolving, with popular images continuously improved for reliability, speed, and security. Docker resolves the "works on my machine" problem by containerizing applications. 

For DevOps, Docker is crucial in managing deployments, setting up infrastructure, and preventing issues during code development. Learn more through Docker Training online and DevOps Certification courses. 

Deploying Containers Across Environments with Docker in Production 

When it comes to Docker images, big is bad! Big means slow and hard to work with. Also, it brings in more potential vulnerabilities and possibly a bigger attack surface. For these reasons, Docker images should be small. The aim of the game is to only ship production images with the stuff needed to run your app in production. 

The problem is that keeping images small has traditionally been hard work. For example, the way you write your Dockerfiles has a huge impact on the size of your images. A common example is that every RUN instruction adds a new layer. As a result, it is usually considered a best practice to combine multiple commands into a single RUN instruction, all glued together with double ampersands (&&) and backslash (\) line breaks. While this isn't rocket science, it requires time and discipline. 

Another issue is that we don't clean up after ourselves. A RUN instruction often pulls in build-time tools, and those tools are still sitting in the image when we ship it to production. This is not ideal. 
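As a sketch of both points (fewer layers, and cleaning up inside the same layer), a hypothetical Debian-based Dockerfile might chain everything into one RUN instruction and purge the build tools before the layer is committed:

```dockerfile
FROM python:3.9-slim
# One RUN instruction = one layer. The compiler is installed, used, and purged
# within the same layer, so it never ends up in the shipped image.
# "some-package" is a placeholder for any dependency that needs a compiler.
RUN apt-get update \
    && apt-get install -y --no-install-recommends gcc \
    && pip install --no-cache-dir some-package \
    && apt-get purge -y gcc \
    && apt-get autoremove -y \
    && rm -rf /var/lib/apt/lists/*
```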

Here, multi-stage builds come to the rescue. Multi-stage builds are all about optimizing builds without adding complexity. These builds use a single Dockerfile containing multiple FROM instructions. Each FROM instruction starts a new build stage that can easily COPY artifacts from previous stages. 

Example

Let us walk through a sample Dockerfile to see how this works. It describes a Linux-based application, so it will only run on a Linux Docker host. It is also quite old, so don't deploy it to an important system, and be sure to delete it as soon as you are finished. 

The Dockerfile is shown below:

FROM node:latest AS storefront 
WORKDIR /usr/src/atsea/app/react-app 
COPY react-app . 
RUN npm install 
RUN npm run build 

FROM maven:latest AS appserver 
WORKDIR /usr/src/atsea 
COPY pom.xml . 
RUN mvn -B -f pom.xml -s /usr/share/maven/ref/settings-docker.xml dependency:resolve 
COPY . . 
RUN mvn -B -s /usr/share/maven/ref/settings-docker.xml package -DskipTests 

FROM java:8-jdk-alpine AS production 
RUN adduser -Dh /home/gordon gordon 
WORKDIR /static 
COPY --from=storefront /usr/src/atsea/app/react-app/build/ . 
WORKDIR /app 
COPY --from=appserver /usr/src/atsea/target/AtSea-0.0.1-SNAPSHOT.jar . 
ENTRYPOINT ["java", "-jar", "/app/AtSea-0.0.1-SNAPSHOT.jar"] 
CMD ["--spring.profiles.active=postgres"]

The first thing to note is that the Dockerfile has three FROM instructions. Each of these constitutes a distinct build stage. Internally, they are numbered from the top starting at 0. However, we have also given each stage a friendly name. 

  • Stage 0 is called storefront. 
  • Stage 1 is called appserver. 
  • Stage 2 is called production. 

Stage 0

The storefront stage pulls the node:latest image, which is over 900MB in size. It sets the working directory, copies in some app code, and uses two RUN instructions to perform some npm magic. This adds three layers and considerable size. The resulting image is even bigger than the base node:latest image, as it contains lots of build stuff and not a lot of app code. 

Stage 1

The appserver stage pulls the maven:latest image, which is over 500MB in size. It adds four layers of content via two COPY instructions and two RUN instructions. This produces another very large image with lots of build tools and very little actual production code. 

Stage 2

The production stage starts by pulling the java:8-jdk-alpine image. This image is approximately 150MB. That is considerably smaller than the node and maven images used by the previous build stages. It adds a user, sets the working directory, and copies in some app code from the image produced by the storefront stage.  

After that, it sets a different working directory and copies in the application code from the image produced by the appserver stage. Finally, it sets the main application for the image to run when it starts as a container. 

An important thing to note is that COPY --from instructions are used to only copy production-related application code from the images built by the previous stages. They do not copy build artifacts that are not needed for production. 

Multi-stage builds were introduced in Docker 17.05 and are an excellent feature for building small, production-worthy images. 

Adopting Docker in a Production Environment: Enterprise Considerations  

The first thing to consider when adopting Docker is the size of your infrastructure. You need a good-sized infrastructure with plenty of RAM, CPU, and disk space, and resource-hungry containers should be spread across hosts rather than crammed onto a single machine. Packing too many containers onto one host can lead to performance issues and make the infrastructure harder to maintain.

Another important consideration is storage. Docker needs a substantial amount of secure storage space, which should be backed up regularly and encrypted so that no one can access it without permission from you or the team members who hold its keys. You must also make sure enough servers are available to run your application; if capacity runs short, the application may fail under high load. 

1. Constantly Changing Docker Ecosystem  

The Docker ecosystem is constantly changing. The most popular Docker images are continuously improved and updated, which makes them more reliable, faster, and more secure. 

If you want to use the latest version of your favorite image, you may also need a recent version of Docker Engine. All the images in the official repositories are available for free, and publishing your own public images on Docker Hub is free as well; paid plans such as Docker Hub Pro are only needed for extras like additional private repositories. 

2. Enforcing Policy and Controls for Docker in Production  

Docker can be configured to be extremely secure. It supports all of the major Linux security technologies, including kernel namespaces, cgroups, capabilities, MAC, and seccomp. It ships with sensible defaults for all of these, but you can customize or even disable them. 

Over and above the general Linux security technologies, Docker includes an extensive set of its own security technologies. Swarm Mode is built on TLS and is extremely simple to configure and customize. Image scanning can perform binary-level scans of images and provide detailed reports of known vulnerabilities. Docker Content Trust lets you sign and verify content, and Docker Secrets allow you to securely share sensitive data with containers and Swarm services. 

The net result is that your Docker environment can be configured to be as secure or insecure as you desire; it all depends on how you configure it.
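Several of these controls can be applied per container. A hedged Compose-style sketch (the image, service name, and tmpfs paths are illustrative assumptions) might drop all capabilities, add back only what the service needs, and block privilege escalation:

```yaml
services:
  web:
    image: nginx:1.25
    read_only: true                   # immutable root filesystem
    tmpfs:                            # writable scratch space nginx needs
      - /var/cache/nginx
      - /var/run
    cap_drop: [ALL]                   # start from zero capabilities
    cap_add: [NET_BIND_SERVICE]       # add back only what is required
    security_opt:
      - no-new-privileges:true        # block setuid-style privilege escalation
```

Tightening defaults this way is cheap insurance: each line removes a class of attack rather than relying on the application to behave.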

Best Practices for Running Docker in Production  

1. Use Specific Docker Image Versions 

When using Docker in production, if we don't pin a specific image version in the build, Docker picks up the latest tag of the image by default. 

Issues with this approach: 

  • you might get a different image version than in the previous build. 
  • the new image version may break things. 
  • the latest tag is unpredictable, causing unexpected behaviour. 

So instead of an arbitrary latest tag, pin the image version. The rule here is: the more specific, the better. 
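As a sketch, pinning can range from a version tag up to an immutable digest (the tag below is just an example; the digest is left as a placeholder):

```dockerfile
# Least predictable: floats to whatever "latest" means today.
# FROM node

# Better: pinned to a specific version tag.
FROM node:18.19.0-alpine

# Most specific: pinned to an immutable content digest.
# FROM node@sha256:<digest>
```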

2. Docker Monitoring and Logging   

To securely manage your Docker deployment, you need to gain visibility into the entire ecosystem. You can achieve this by using a monitoring solution that tracks container instances across the environment and lets you automate responses to fault conditions, as well as testing and deployment. 
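One low-effort starting point is letting Docker itself surface container health and bound log growth. A hypothetical Compose fragment (the image name, endpoint, and limits are assumptions):

```yaml
services:
  api:
    image: example/api:1.4.2          # hypothetical image
    healthcheck:                      # Docker marks the container unhealthy on failures
      test: ["CMD", "curl", "-f", "http://localhost:8080/health"]
      interval: 30s
      timeout: 5s
      retries: 3
    logging:
      driver: json-file
      options:
        max-size: "10m"               # rotate logs so containers can't fill the disk
        max-file: "3"
```

A dedicated monitoring stack can then alert on the unhealthy state and collect the rotated logs.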

3. Cautionary Use of Docker Socket   

In the visualizer service, we mounted the Docker socket /var/run/docker.sock into the container. Bind-mounting the Docker daemon socket gives a container a great deal of power, as it can control the daemon. It must be used with caution and only with containers we trust. Many third-party tools demand that this socket be mounted in order to work. 

You should verify such services with Docker Content Trust and vulnerability management processes before using them. 
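If a trusted tool genuinely needs the socket, it is commonly mounted read-only, as in this sketch using the dockersamples/visualizer image; note that :ro restricts filesystem operations on the socket node rather than the Docker API itself, so it is a hardening habit, not a sandbox:

```yaml
services:
  visualizer:
    image: dockersamples/visualizer
    volumes:
      - /var/run/docker.sock:/var/run/docker.sock:ro   # read-only bind mount
```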

Docker Security Best Practices  

Docker has provided numerous benefits over its competitors. However, most of its components share the host kernel. So, if proper security measures are not taken, the host system can be compromised and an attacker can take control of it. 

1. Host Machine Compromise 

Since containers use the host's kernel as a shared kernel for running processes, a compromised container can exploit or attack the entire host system. 

2. Container Mismanagement 

If a user somehow escapes the container namespace, they can interact with the other processes on the host and stop or kill them. 

3. Maxing out Utilization 

An unrestricted container can consume all the resources of the host machine, starving other services and forcing them to halt. 
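Resource limits are straightforward to set. A hedged Compose fragment (the service name and the limit values are illustrative):

```yaml
services:
  worker:
    image: python:3.9
    mem_limit: 512m     # container is killed if it exceeds this memory cap
    cpus: "1.0"         # at most one CPU's worth of time
    pids_limit: 100     # caps runaway process creation (fork bombs)
```

The same caps exist as docker run flags (--memory, --cpus, --pids-limit).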

4. Issue with Untrusted Images 

Docker allows us to run any image from Docker Hub as well as local builds. So, when an image from an untrusted source is run on a machine, the attacker's malicious program may gain access to the kernel or steal the data present in the container and its mounted volumes. 

5. Best Practices to Mitigate Risks 

No matter what measures you take, security can be breached at any level, and no one can remove the risks entirely. However, we can mitigate them by following some best practices to ensure we close every gate an attacker could use to reach our host machine. 

6. Container Lifecycle Management 

Through the container lifecycle management process, we establish a strong foundation for reviewing the creation, updating, and deletion of containers, taking care of security measures from the moment a container is created. When a container image is updated, we should review all layers again instead of only the updated layer. 

7. Information Management 

Never push sensitive data such as passwords, SSH keys, tokens, or certificates into an image. Such data should be encrypted and kept in a secrets manager. Access to these secrets should be granted explicitly to services, and only while they are running. 
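With Docker Swarm, for example, a secret can be created out-of-band and surfaced to a service only at runtime as an in-memory file. A sketch (the secret name is illustrative; the official postgres image does read *_FILE variables):

```yaml
services:
  db:
    image: postgres:15
    environment:
      POSTGRES_PASSWORD_FILE: /run/secrets/db_password   # read at startup, never baked into the image
    secrets:
      - db_password
secrets:
  db_password:
    external: true   # created beforehand, e.g.: docker secret create db_password -
```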

8. No Root-level Access 

A container should never be run with root-level access. A role-based access control system reduces the possibility of accidental access to other processes running in the same namespace. Many organizations restrict access using Active Directory and grant appropriate access based on user roles. 

In general, we can use Linux’s inbuilt commands to create a temporary non-root user on the fly.

FROM python:3.9 
RUN groupadd -r myuser && useradd -r -g myuser myuser 
# do whatever must run as root here, e.g. installing packages 
USER myuser

or, when running a container from the image, use: 

docker run -u 4000 python:3.9 

This runs the container as a non-root user (UID 4000). 

9. Trusted Image Source 

Check the authenticity of every image pulled from Docker Hub using Docker Content Trust. Docker Content Trust was introduced in Docker 1.8. It is disabled by default, but once enabled it allows you to verify the integrity, authenticity, and publication date of all Docker images pulled from the Docker Hub registry. 

Enable Docker content trust using export DOCKER_CONTENT_TRUST=1 and try to pull this unsigned image docker pull dganapathi/test123.

# docker pull dganapathi/test123 
Using default tag: latest 
Error: remote trust data does not exist for docker.io/dganapathi/test123: notary.docker.io does not have trust data for docker.io/dganapathi/test123

Docker checks whether the image is signed and throws an error if it is not. 

10. Mounting Volumes as Read-only 

One best practice is to mount the host filesystem as read-only if the container does not need to write data. We can do that with the :ro flag when mounting via the -v argument, or with readonly when using the --mount argument. 

Example:

$ docker run -v volume-name:/path/in/container:ro python:3.9

or

$ docker run --mount source=volume-name,destination=/path/in/container,readonly python:3.9

Advantages of Using Docker in Production   

1. Industry Demand  

As Docker promises an equivalent environment in both development and production, companies don’t have to test applications twice in different environments. As a result, Docker adoption by companies increases daily. 

2. Isolation from the Main System  

As a developer, you will always experiment with libraries and different versions of programming languages. For example, if you are testing asyncio support for one application that needs Python 3.7, and you decide not to use it, you might need to uninstall Python 3.7 and reinstall the previous version. With Docker, you simply remove the container, with zero complexity. 

3. Configurations  

Every project requires many different configurations, and maintaining them all is difficult. Docker lets you build images with different configurations and tag each one accordingly. 
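One hypothetical way to do this is a single parameterized Dockerfile, built and tagged once per configuration (the file layout, argument name, and tags are assumptions):

```dockerfile
FROM python:3.9-slim
ARG APP_ENV=production
ENV APP_ENV=${APP_ENV}    # the app reads its configuration from this variable
COPY app/ /app
CMD ["python", "/app/main.py"]
```

Each configuration then gets its own tag, e.g. docker build --build-arg APP_ENV=staging -t myapp:staging .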

4. Docker Hub 

Ever imagined sharing your machine the way you share code on GitHub? Docker Hub provides access to thousands of preconfigured images, so once your code works on your machine, you can build an image and share it all over the internet. 

Conclusion 

Docker is very flexible and can be used in many ways; we have only scratched the surface here. You can use Docker to manage your own data and text files or to act as a communication layer between servers. We hope this article has familiarized you with the basics of using Docker in production and how it can help you. To get certified in Docker and Kubernetes, check out KnowledgeHut's Kubernetes Docker certification. 

