

  • Jan 19, 2017
  • Michael Evangelista
  • Developers

Like any development shop that works on the web, we've faced common challenges when it comes to creating and maintaining a consistent working environment throughout the development process.

From localhost-based code authoring to public staging and client review, and eventually the launch of a completed website on a public-facing server, there have always been hurdles to overcome: differences in operating systems, database and web server versions, and the other variables in each developer's particular setup and coding preferences.

Adding to the complexity, team members usually possess varying levels of familiarity with the required server and database technologies, and local or individually hosted servers often differ from the actual software stack used in production. Our process faced these same challenges, along with the requirement of delivering excellent experiences to our clients and customers.

Enter Docker Swarm

In an ongoing attempt to streamline production and reduce these common points of friction, Lead Architect Eddie Ballisty took on the task of implementing a consistent, standardized development process covering all aspects of the web server stack. The goal was to eliminate guesswork and resolve software version conflicts between team members' local environments, using modern, scalable technologies. After many months of experimentation, exploration and learning, a new paradigm based on the powerful Docker software containerization platform, and eventually Docker Swarm, was put into place, with excellent results.

Docker containers wrap a piece of software in a complete filesystem that contains everything needed to run: code, runtime, system tools, system libraries, and anything else that can be installed on a server. This guarantees that the software will always load and operate the same way, regardless of the environment.

As our team soon discovered, integrating Docker with Amazon Web Services and a code versioning service like GitHub or Bitbucket provides an incredibly agile and manageable set of publishing options. The additional benefit of automatic deployment from git to production servers adds an unmatched level of convenience.

Once configured, the Docker Swarm clustering engine turns a pool of virtual Docker hosts into a single, virtual host environment. By utilizing AWS for scalable server infrastructure and Docker for management of the server's layered technologies, many of the common obstacles to team coding and deployment of versioned code to production can be removed, allowing for a cleaner and more confident development process. Independent, open-source Docker images for Lucee, MySQL and Apache are readily available. Combined with a "data volume" containing project-specific code and any editable files, the Swarm manages all of these containers in a single, concise environment.
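
As a rough sketch of how such a stack might be assembled (the service names, image tags, ports and paths below are illustrative, not taken from our actual configuration), a swarm can be initialized and each piece started as a service on a shared overlay network:

    # initialize the swarm on the first (manager) node
    docker swarm init

    # create an overlay network shared by the services
    docker network create --driver overlay webnet

    # database service from the stock MySQL image
    docker service create --name db --network webnet \
        -e MYSQL_ROOT_PASSWORD=example mysql:5.7

    # Lucee application service, with a named volume for project code
    # (port mapping and volume path are placeholders)
    docker service create --name app --network webnet --publish 80:8888 \
        --mount type=volume,source=project-code,target=/var/www \
        lucee/lucee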

Before adopting Docker as the platform of choice, we relied on Vagrant, a widely used virtualization solution, to achieve a similar consistency across multiple environments. However, as our usage of Vagrant became more sophisticated, some additional requirements became clear. While Vagrant offers a simpler approach to management, its large file sizes and limited options for changing individual parts of the web server stack proved restrictive. The move to Docker, coupled with Docker Swarm and freely available images downloaded from Docker Hub, increased compatibility, minimized loading times, and added a reliable, user-friendly set of tools for managing elastic production servers on the developer-oriented AWS service.

Traditional server configuration can be tedious and repetitive, and setting up local work spaces rarely involves the same exact set of technologies, let alone exactly matching versions and stacking order for the various moving parts. Virtualization and containerization technologies like Docker and Vagrant were born out of the obvious and ubiquitous need for a way to manage all the endless variables, and eliminate the constant repetition and management of base server environments.

With the release of Docker 1.12 in July of 2016, which made the Docker Swarm configuration much more automatic, it became clear, Ballisty says, that "this is the future". Despite a several-month learning curve and trial-and-error process, the overall benefits were apparent, and worth the time and effort to integrate into the company's development cycle.

Monitoring and controlling independent nodes within a running Docker Swarm was greatly simplified with the release of the Docker Swarm Visualizer, which uses the Docker Remote API to display status and service information in a simple visual diagram.


The Docker Swarm Visualizer makes monitoring easy. When a node goes down or is taken offline, a red status icon indicates that the node is unavailable, and details are shown for each service and node in the swarm.
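
As an illustration, the visualizer itself runs as just another swarm service on a manager node. The command below follows the publicly documented example for the visualizer image; the exact image name and published port may differ from our setup:

    docker service create \
        --name=viz \
        --publish=8080:8080 \
        --constraint=node.role==manager \
        --mount=type=bind,src=/var/run/docker.sock,dst=/var/run/docker.sock \
        dockersamples/visualizer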

Docker Hub

Docker Hub, a separate service, provides a centralized resource for common open-source container images (e.g. MySQL, Apache, Lucee) and allows code repositories to be linked to Docker images, along with other workflow automation tools. One of its most notable features, Docker Hub's built-in security scanning, can check images in private repositories against known security vulnerabilities and report the results for each image tag, adding an automatic element of confidence and peace of mind to the entire process.

Costs & Benefits

We pay a nominal monthly fee for Docker Hub, roughly the cost of a private GitHub account or other similar service. This cost, however, is easily recovered in saved hours and reduced frustration, even for a single user, though perhaps not immediately. Fully understanding the options, deciding to make the move from Vagrant, and implementing Docker within an active website publishing process required an investment of both time and patience from each person involved. Over time, wrinkles have been ironed out, questions have been answered, the Docker product itself has been improved and updated, and a "new normal" has taken hold among the development team. Additional cost savings may be realized by those moving from a traditional hardware-based environment to a scalable cloud-based solution, since the Docker Swarm combines multiple nodes, and even multiple project containers, into a single block of server space.

Deployment

Starting with a base image of the Lucee server, assets and code files specific to a given project can be included to extend the original Docker image. Code changes are pushed through a standard git commit process, while the creation of a "tag" in the project's code repository triggers a fresh deployment to the production server. A post-commit listener, or "hook", set up ahead of time, recognizes the new tag and triggers a repackaging and deployment of the updated code, bundling it with a fresh image of Lucee, MySQL, and any other required services, and automatically creating a new Docker image for the entire stack. Changes to database content are also handled automatically, with specialized options for running .sql scripts at startup. Running a single command within the Docker Swarm deploys this image, bringing the new version of the project online with minimal effort and very little room for error.
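
A minimal sketch of that final step might look like the following, with the registry, image and service names invented purely for illustration:

    # repackage the project code on top of the base Lucee image (names illustrative)
    docker build -t registry.example.com/acme/site:1.4.2 .
    docker push registry.example.com/acme/site:1.4.2

    # a single command rolls the running swarm service over to the new image
    docker service update --image registry.example.com/acme/site:1.4.2 site_app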

The Docker Swarm is easily managed and manipulated with simple command-line functions, using a friendly syntax.
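
For example, day-to-day inspection and scaling comes down to a handful of short commands (the service name below is illustrative):

    docker node ls                     # list the nodes participating in the swarm
    docker service ls                  # list the services currently running
    docker service ps site_app         # show where each task of a service is running
    docker service scale site_app=3    # scale a service up to three replicas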

Since the Docker environment is completely portable, it can be used on any standard cloud hosting service capable of running Docker images, such as Google's App Engine, DigitalOcean and others. Adding the Amazon elastic load balancer and related caching services paves the way for a seamless rollout of changes and updates with minimal, if any, downtime while those changes are deployed. An additional hosted service called "Docker Cloud" couples natively with Docker Hub, and may be of special interest to those just getting started or looking for a more rapid entry into a container-based workflow.

For us, implementation of these dynamic services has resulted in reduced friction, increased production, and an enhanced sense of confidence in the overall process. Ballisty puts it simply and concisely: "I just want my developers to do development. Write code, push to git, and let the magic happen."

Mura Docker Image Coming Soon

Eddie and the Blue River Team are also working on a pre-configured Mura CMS image which will be made available on Docker Hub, containing a full open source stack with everything needed to get Mura up and running instantly in a Docker container.

More at MuraCon

In addition to several shorter talks at MuraCon 2017, our annual conference dedicated to the Mura Platform, Eddie will be presenting a full-length Friday session entitled "Mura & Docker Swarm: Enterprise Level Scaling", in which he'll share his results and insights after working with these hot-topic technologies at Blue River.

Using a methodical approach and dedicated determination, he has made notable improvements to the team's development process, and provided a number of impressive architecture options for high-traffic Mura CMS websites deployed via Docker.

Since joining the company over ten years ago, Eddie has worked on all facets of website development and cultivation of Mura CMS itself, most recently in his current role as Lead Architect of Blue River's Professional Services Division. With over 16 years of experience in the industry, Eddie is passionate about rapidly evolving and expanding technologies, and is always looking for opportunities to push the team and its products forward. 

A Sacramento native, Eddie loves getting out and about when time allows. As a devout Raiders fan and avid weekend cyclist, he is known to frequent the Oakland Coliseum as well as various mountain bike trails in Davis and Sacramento counties. When they are not busy producing quality projects or avidly improving the team workflow, he and his wife, Blue River Project Manager, Christine Ballisty, take every opportunity to travel. Recent destinations have included Hong Kong, Tokyo, and several places in Europe. 

Don't miss this opportunity to meet Eddie, Christine and the rest of the Blue River/Mura team at MuraCon 2017, February 9 & 10 in Sacramento!
