An introduction to microservices

The idea behind microservices is that some types of applications become easier to build and maintain when they are broken down into smaller, composable pieces which work together. Each component is developed separately, and the application is then simply the sum of its constituent components. This is in contrast to a traditional, “monolithic” application, which is developed all in one piece.

There are many reasons why this approach is considered an easier way to develop large applications, particularly enterprise applications and various types of software delivered as a service over the Internet.

One reason is project engineering: when the different components of an application are separated, they can be developed concurrently. Another is resilience. Rather than relying upon a single virtual or physical machine, components can be spread across multiple servers or even multiple data centers. If a component dies, you spin up another, and the rest of the application can continue to function. This also allows more efficient scaling: rather than scaling up with bigger and more powerful machines, or simply running more copies of the entire application, you can scale out with duplicate copies of only the heaviest-used parts.

Is this a new concept?

The idea of separating applications into smaller parts is nothing new; there are other programming paradigms which address this same concept, such as Service Oriented Architecture (SOA). What may be new are some of the tools and techniques used to deliver on the promise of microservices.

The common definition of microservices generally relies upon each microservice providing an API endpoint, often but not always a stateless REST API which can be accessed over HTTP(S) just like a standard web page. This method for accessing microservices makes them easy for developers to consume, as it requires only tools and methods many developers are already familiar with.
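As a minimal sketch of what such an endpoint can look like (assuming a Python service built with the Flask library, and a hypothetical /status route; not a prescription for any particular stack):

```python
# A minimal, stateless REST endpoint sketch using Flask (hypothetical service).
from flask import Flask, jsonify

app = Flask(__name__)

@app.route("/status", methods=["GET"])
def status():
    # Return a small JSON document, retrievable just like a standard web page.
    return jsonify({"service": "inventory", "status": "ok"})

if __name__ == "__main__":
    # Plain HTTP here; in production this would normally sit behind HTTPS.
    app.run(host="0.0.0.0", port=8080)
```

Any other component, written in any language, can consume this service with an ordinary HTTP client.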

Microservices depend not just on the technology being set up to support this concept, but on an organization having the culture, know-how, and structures in place for development teams to adopt this model. Microservices are part of a larger shift in IT departments towards a DevOps culture, in which development and operations teams work closely together to support an application over its lifecycle, and go through a rapid or even continuous release cycle rather than a more traditional long one.

Why is open source important for microservices?

When you design your applications from the ground up to be modular and composable, you can use drop-in components in many places where in the past you may have required proprietary solutions, whether because of component licensing or specialized requirements. Many application components can be off-the-shelf open source tools.

A focus on microservices may also make it easier for application developers to offer alternative interfaces to your applications. When everything is an API, communications between application components become standardized. All a component has to do to make use of your application and data is authenticate and communicate across those standard APIs. This allows both those inside and, when appropriate, outside your organization to easily develop new ways to utilize your application’s data and services.
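For example (a sketch only, assuming the Python requests library, a hypothetical orders endpoint, and token-based authentication), a consumer inside or outside the organization needs nothing more than standard HTTP plus credentials:

```python
# Consuming a microservice API over standard HTTP with token authentication.
# The URL and token below are placeholders for illustration only.
import requests

API_URL = "https://api.example.com/v1/orders"
TOKEN = "replace-with-a-real-token"

response = requests.get(
    API_URL,
    headers={"Authorization": f"Bearer {TOKEN}"},
    timeout=5,
)
response.raise_for_status()

# Work with the standardized JSON payload like any other data source.
for order in response.json():
    print(order["id"], order["status"])
```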

Where do Docker and container technologies come in?

Many people see Docker or other container technologies as enablers of a microservice architecture.

Unlike virtual machines, containers are designed to be pared down to the minimum viable pieces needed to run the one thing the container is designed to do, rather than packing multiple functions into the same virtual or physical machine. The ease of development that Docker and similar tools provide helps make rapid development and testing of services possible.
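As a rough illustration of that workflow (assuming the Docker Python SDK and a hypothetical inventory-service image; a sketch, not a recommended setup), spinning up a single-purpose service container for a quick test takes only a few lines:

```python
# Start a single-purpose service container for local testing (sketch).
import docker

client = docker.from_env()

# Run the container detached, mapping its port 8080 to the host.
container = client.containers.run(
    "inventory-service:latest",   # hypothetical image name
    detach=True,
    ports={"8080/tcp": 8080},
)

print("started:", container.short_id)

# Tear it down once the test is finished.
container.stop()
container.remove()
```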

Of course, containers are just a tool, and microservice architecture is just a concept. It is entirely possible to build an application that follows a microservices approach without using containers, just as it is possible to build a much more traditional application inside a container (although this may not be a good idea).

How do you orchestrate microservices?

In order to actually run an application based on microservices, you need to be able to monitor, manage, and scale the different constituent parts.

There are a number of different tools that might allow you to accomplish this. For containers, open source tools like Kubernetes, Docker Swarm, or Apache projects like Mesos and ZooKeeper might be part of your solution. For non-container pieces of an application, other tools may be used to orchestrate components: in an OpenStack cloud, for example, you might use Heat to manage application components. Another option is a Platform as a Service (PaaS) tool, which lets developers focus on writing code by abstracting some of the underlying orchestration technology and allowing them to easily select off-the-shelf open source components for certain parts of an application, such as a database storage engine, a logging service, a continuous integration server, or a web server. Some PaaS systems, like OpenShift, directly use upstream projects such as Docker and Kubernetes to manage application components, while others try to re-implement the management tooling themselves.
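As one hedged example of what "scaling the constituent parts" can mean in practice (assuming the official Kubernetes Python client and a hypothetical Deployment named inventory already running in the cluster), scaling a heavily used component out to more replicas might look like this:

```python
# Scale one microservice's Deployment out to more replicas (sketch).
from kubernetes import client, config

# Load credentials from the local kubeconfig (e.g. ~/.kube/config).
config.load_kube_config()

apps = client.AppsV1Api()

# Patch the replica count of the hypothetical "inventory" Deployment.
apps.patch_namespaced_deployment_scale(
    name="inventory",
    namespace="default",
    body={"spec": {"replicas": 5}},
)
```

In day-to-day operation this kind of scaling is more often driven by declarative configuration or autoscaling policies than by ad-hoc scripts, but the principle is the same: each part of the application is managed and scaled independently.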

What about existing applications?

While utilizing microservices may be an important component of an organization’s IT strategy going forward, there are certainly many applications which don’t fit this model, nor is it likely that those applications will be rearchitected overnight to fit the new paradigm. Microservices and traditional applications can work together in the same environments, provided the organization has a solid bi-modal IT strategy.

Bi-modal IT, according to Gartner, is the ability to deliver both traditional IT applications, where the focus is on stability and uptime, and newer, more agile (though possibly less proven) applications built through newer methods, such as developer self-provisioning of machines and short development cycles.

Many, if not most, organizations will need to work with both approaches for many years to come.

source: microservices

François Encrenaz

Cloud Specialist | Technical Leader | Technology Strategist
