When we design new software or assemble a system from existing components, several configuration questions inevitably come up. For example: how will we configure it initially? How will the configuration be loaded? On the software side, we must decide how to structure the configuration, how to find the right level of abstraction, and how to support diverging use cases seamlessly.

Spencer Kotowick
May 27th, 2019

The problem most companies face: Shortcuts

Data commonly plays a role in controlling software logic, in the form of application parameters or configuration data. Unfortunately, the careful controls applied to source code changes are often bypassed here. To keep up with internal change, businesses find ways around the established safeguards: instead of following procedure, they look for shortcuts that provide immediate results. This means that what was once set in stone is now completely reconfigured.

Even though changing code on a whim via shortcuts is considered bad practice, it is still shockingly easy for people to change configuration data exactly this way.

The problem persists even among those with the authority to change configuration data (e.g. IT staff). Restricting access to configuration data is therefore only half of the solution. The other half is to introduce controls over configuration data similar to those we have on source code (revisioning, delivery, etc.).

How to curb risk, increase productivity

The ideal approach to configuration management is to first decide which data belongs in code and which should be parameterized, and then introduce a revisioning and approval system.

Our Problem

As new services were conceptualized to run in Docker containers, we decided to migrate existing services to the same approach. This created the need for a new, common configuration management system: practices varied wildly across the board!

With the entire infrastructure being hosted on AWS, we chose to use AWS Parameter Store for storing the data.

Our Solution

Instead of integrating the AWS Parameter Store SDK directly into 35+ micro-services across multiple languages, we chose an approach that lets us add and remove features without having to modify each micro-service.

Option 1

Store each key/value pair as a separate parameter in AWS Parameter Store. This ran into API rate limits!

Option 2

Store all configuration key/value pairs as a single parameter in AWS Parameter Store. This resolved the API rate limiting.

To achieve this, we built a managed service that acts as a caching proxy between the micro-services and AWS Parameter Store. It lives in the same environment as the micro-services that use it. Each micro-service sends it requests; the proxy parses the request, forwards it to the AWS Parameter Store API, applies any additional features we have in place to the response, and then caches the result for future requests for the same version.
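The proxy's code isn't reproduced in this post, but its core caching behaviour can be sketched roughly as below. The names (cachingProxy, fetcher, the versioned key format) are hypothetical, and an in-memory stub stands in for the AWS SDK's Parameter Store call:

```go
package main

import (
	"fmt"
	"sync"
)

// fetcher abstracts the call to AWS Parameter Store. In production this
// would wrap the AWS SDK; here it is stubbed for illustration.
type fetcher interface {
	Fetch(name string) (string, error)
}

// cachingProxy caches each response per versioned parameter name, so
// repeated requests for the same version never hit the API again.
type cachingProxy struct {
	backend fetcher
	mu      sync.Mutex
	cache   map[string]string
}

func newCachingProxy(b fetcher) *cachingProxy {
	return &cachingProxy{backend: b, cache: make(map[string]string)}
}

// Get returns the configuration blob for a service at a given version,
// consulting the cache first and the backend only on a miss.
func (p *cachingProxy) Get(service, version string) (string, error) {
	key := service + "-" + version // hypothetical versioned name, e.g. "billing-12"
	p.mu.Lock()
	defer p.mu.Unlock()
	if v, ok := p.cache[key]; ok {
		return v, nil
	}
	v, err := p.backend.Fetch(key)
	if err != nil {
		return "", err
	}
	p.cache[key] = v
	return v, nil
}

// stubStore is an in-memory stand-in for AWS Parameter Store that counts
// how many times it is actually called.
type stubStore struct {
	data  map[string]string
	calls int
}

func (s *stubStore) Fetch(name string) (string, error) {
	s.calls++
	v, ok := s.data[name]
	if !ok {
		return "", fmt.Errorf("parameter %q not found", name)
	}
	return v, nil
}

func main() {
	store := &stubStore{data: map[string]string{"billing-12": "db_host: example"}}
	proxy := newCachingProxy(store)
	v1, _ := proxy.Get("billing", "12")
	v2, _ := proxy.Get("billing", "12") // served from cache; no second API call
	fmt.Println(v1 == v2, store.calls)  // true 1
}
```

Because each cached entry is keyed on a specific version, the cache never serves stale data when a new version is released: the new version is simply a new key.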

Technical Details

The Proxy Caching System (a Golang project)

To implement Option 2, we built a standalone service that acts as a middleman between our micro-services and the AWS Parameter Store API.

Option 2 has a caveat: AWS limits the size of a parameter to 2048!

Eventually, this limit caught up with us, because we were storing certificates (large amounts of data) as parameters. So we added a feature to the managed service that allows Parameters to reference other Parameters: the value of a key/value entry can be prefixed with ssm://. The proxy service then treats that value as a parameter name and retrieves the referenced data from AWS Parameter Store.
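The resolution logic for these references is simple to sketch. The function and parameter names below are hypothetical, and a plain callback stands in for the real Parameter Store lookup:

```go
package main

import (
	"fmt"
	"strings"
)

// lookup stands in for a call to AWS Parameter Store; in the real proxy
// this would go through the AWS SDK.
type lookup func(name string) (string, error)

// resolveValue implements the reference feature: a value prefixed with
// "ssm://" is treated as the name of another parameter, whose contents
// are fetched and substituted in; any other value passes through as-is.
func resolveValue(value string, fetch lookup) (string, error) {
	const prefix = "ssm://"
	if !strings.HasPrefix(value, prefix) {
		return value, nil
	}
	return fetch(strings.TrimPrefix(value, prefix))
}

func main() {
	// In-memory stand-in for the store, holding a large cert blob.
	store := map[string]string{"/certs/api-tls": "-----BEGIN CERTIFICATE-----..."}
	fetch := func(name string) (string, error) {
		v, ok := store[name]
		if !ok {
			return "", fmt.Errorf("parameter %q not found", name)
		}
		return v, nil
	}

	plain, _ := resolveValue("db.internal.example.com", fetch)
	cert, _ := resolveValue("ssm:///certs/api-tls", fetch)
	fmt.Println(plain)                                // passes through unchanged
	fmt.Println(strings.HasPrefix(cert, "-----BEGIN")) // true
}
```

This keeps the large blob out of the service's main configuration parameter; only the small reference string counts against that parameter's size limit.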

Luckily, it's easy to add features to this managed service. We avoided having to update 35+ micro-services to support the feature.

Other Moving Parts

We chose to store configuration data for micro-services in Ansible Playbooks (due to team familiarity).

In each micro-service's playbook, all configuration key/values are stored under a parameter: section, alongside a version number and the name of the Parameter to be created in Parameter Store.
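The original example playbook isn't reproduced here; a minimal sketch of what such a vars file might look like, with entirely hypothetical names and values:

```yaml
# Illustrative only -- variable names and values are hypothetical.
- hosts: localhost
  vars:
    parameter_name: /services/billing/config   # name pushed to Parameter Store
    parameter_version: 12                      # bumped with each release
    parameter:                                 # all configuration key/values
      db_host: db.internal.example.com
      db_port: 5432
      tls_cert: ssm:///certs/billing-tls       # reference to another Parameter
```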


We wrote another small wrapper in Go that Ansible uses to push these parameters up.

It became increasingly important to be able to revise configuration data alongside application releases. This lets us update and roll back with an immutable approach: the guessing and checking of whether the configuration data is correct for a rollback is removed from the equation!

Inside the micro-service's Ansible Playbook, we specify a version number (as noted above). This number is unique per release and is added both to the name and to the description of the parameter. It is also important to note that a parameter version that has been pushed to AWS SSM Parameter Store cannot be overridden (by default, that is).
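The wrapper's push logic isn't shown in the post, but the versioning and no-overwrite behaviour can be sketched as below. The function and names are hypothetical, and an in-memory map stands in for the real Parameter Store call (which would refuse the write the same way when overwriting is disallowed):

```go
package main

import "fmt"

// store is an in-memory stand-in for AWS SSM Parameter Store.
type store map[string]string

// pushParameter writes a playbook's configuration blob under a versioned
// name. Because overwriting is disallowed, a version that has already
// been pushed is immutable: re-pushing the same version fails, so a
// rollback always retrieves exactly the data originally released.
func pushParameter(s store, baseName, version, value string) (string, error) {
	name := fmt.Sprintf("%s-%s", baseName, version) // e.g. /services/billing/config-12
	if _, exists := s[name]; exists {
		return "", fmt.Errorf("parameter %s already exists and cannot be overridden", name)
	}
	s[name] = value
	return name, nil
}

func main() {
	s := store{}
	name, err := pushParameter(s, "/services/billing/config", "12", "db_host: example")
	fmt.Println(name, err == nil)

	_, err = pushParameter(s, "/services/billing/config", "12", "db_host: changed")
	fmt.Println(err != nil) // second push of the same version is rejected
}
```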

A member from the team makes changes locally, pushes them to a new branch in GitHub, and creates a pull-request. Another member from the team reviews the pull-request, either approving or requiring changes.

Once approved, any member of the team can then merge the pull-request.

The merged commit is built in CodeShip, which runs the micro-service's playbook and pushes the data into AWS SSM Parameter Store. It's important to note that the Ansible Playbook uses our binary wrapper to perform the push.

The playbook data is stored as a single SecureString in AWS SSM Parameter Store.



This proxy and caching service was written in Go, a language our team wasn't especially familiar with. However, we wanted easily distributable binaries that could be used cross-platform.

Other challenges include:

  • API rate limiting with AWS Parameter Store
  • Parameter size limit in AWS Parameter Store (2048)
  • Finding a method to overcome both of the above

