What is Docker and how to dockerize an ASP.NET Core microservice?

Docker helps us develop, run, and ship apps anywhere. In this blog post, we will explore what Docker is and how it can be added to a project using Visual Studio Code.

What is Docker?

Docker is an open-source container management service written in Go. Docker containers enable developers to bundle an application and its dependencies and ship everything out as one package that can then be deployed.

A container is just a process that shares the host system's kernel. It is limited in which resources it can access and stops when the process stops. It is much smaller than a virtual machine, which makes it lightweight and fast to start and stop.

You can download Docker from docker.com for Windows or Linux, depending on your machine's OS.

Docker Terminology

You should become familiar with the following terms and definitions.

Container Image: A set of all the dependencies and information needed to create a container.

Container: A container provides a runtime for a system, process, or service. It is an instance of a Docker image.

Tag: A tag is used to label images to identify different versions and environments.

Dockerfile: A text file that holds commands on how to build a Docker image.

Compose: A YAML file format with metadata for creating and running multi-container applications. Docker Compose is also a very handy CLI tool that lets us manage multiple containers with single commands.

Dockerize ASP.NET Core Application

.NET Core runs on both Linux and Windows. To containerize a .NET Core application, add a Dockerfile to your project folder.

The Dockerfile should be in the directory of your project. If you have referenced other projects, you need to place the Dockerfile in the right build context.

Add Dockerfile

In my project, I have placed the Dockerfile in the solution folder because I need to copy the DLLs of my multiple layers: API, domain, service, and data.

Dockerfile Overview

A Dockerfile is a text file that holds commands on how to build a Docker image. It can contain multi-stage builds. The following file has four build stages.
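The file below is a minimal sketch of such a four-stage Dockerfile, assuming the solution layout described above (a User.API project alongside domain, service, and data class libraries); the image tags and project paths are illustrative rather than taken from the original project.

# Base stage: ASP.NET Core runtime image used to run the published app
FROM mcr.microsoft.com/dotnet/aspnet:6.0 AS base
WORKDIR /app
EXPOSE 80
EXPOSE 443

# Build stage: SDK image used to restore and compile the solution
FROM mcr.microsoft.com/dotnet/sdk:6.0 AS build
WORKDIR /src
# Copy everything in the build context (the solution folder) into /src
COPY . .
RUN dotnet restore "User.API/User.API.csproj"
RUN dotnet build "User.API/User.API.csproj" -c Release -o /app/build

# Publish stage: produce the deployable output in /app/publish
FROM build AS publish
RUN dotnet publish "User.API/User.API.csproj" -c Release -o /app/publish

# Final stage: copy the published output onto the lightweight runtime image
FROM base AS final
WORKDIR /app
COPY --from=publish /app/publish .
ENTRYPOINT ["dotnet", "User.API.dll"]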

The base stage pulls the ASP.NET Core runtime image from Docker Hub. It then sets the working directory to '/app' and exposes ports 80 and 443 for network communication with the outside world.

The build stage pulls the SDK image needed to build the application. It sets the working directory to '/src', copies all the project files into this folder, and then restores the NuGet packages of the User.API project.

COPY . . is a very important command. It copies all the files from your Docker build context (the current project directory) into the image's working directory.

The publish stage publishes the application to the /app/publish directory. The parameters '-c Release -o /app/publish' compile the app with the Release configuration and write the output to that path.

In the final stage, we set the working directory back to '/app' because that is where our app will run on the runtime image (see the base stage), and we copy all the files from the /app/publish directory into /app.

When the Docker container starts, ENTRYPOINT executes the 'dotnet project_name.dll' command. In our case, it runs the application via User.API.dll.

Build and Run Docker Image

Now we have to build an image and run that image in a container. Go to the directory where the Dockerfile exists and run the following command:

docker build -t user.api .

The -t flag is optional and is used to tag the image. The '.' tells the docker build command to use the current project directory as the build context.

The following command will run the image in a container.

docker run -p 3000:80 --name userapi-container user.api

With the -p (--publish) option we bind the container's port 80 to port 3000 on the host (using TCP). With the --name option we specify the name of the container. At the end we pass the name of the image.

Conclusion

In this post, I walked you through the core concepts of Docker and how it can be added to a .NET Core project. In a fast-moving DevOps culture, you need to change the application quickly and deploy it to production. Docker makes this very easy.

In the next post, I will explain how you can use powerful commands like docker-compose to deploy multiple services with a single command.

You can download the reference source code from my GitHub.

Mediator Design Pattern with .NET Core

Design patterns are documented and tested solutions for recurring problems. They are used to solve problems of object creation and integration that developers encounter on a day-to-day basis, so it is a best practice to use design patterns as reusable solutions.

It is important to remember that you do not need to overdo design patterns. Design patterns are for projects; projects are not for design patterns. Use a pattern in your project if you think it is a good fit; otherwise, it can make your project messy.

Types of Design Patterns

The Gang of Four categorised design patterns into three main categories:

1. Creational Design Pattern
2. Structural Design Pattern
3. Behavioural Design Pattern

In this blog, we will discuss the Mediator design pattern. It is a behavioural design pattern that promotes loose coupling.

Mediator Design Pattern

It is used to reduce direct communication between objects. The pattern provides a mediator object, and that object handles all the communication between the other objects.

In the following diagram, you can see there are three objects (X, Y, and Z). Suppose object X has to send data to Y, and then Y has to send a message to Z. These objects need to know about each other; they are tightly coupled. In the real world, there are hundreds or thousands of objects that have to call each other, which makes ongoing maintenance of the code an uphill task.

How does the Mediator pattern solve the problem?

If object X has to call objects Y and Z, it doesn't call them directly. It calls the mediator object, and it is the mediator's responsibility to pass the message to the destination object. The mediator promotes loose coupling by keeping objects from referring to and interacting with each other directly.

Implementation of Mediator Pattern

You can install it via the NuGet Package Manager:

Install-Package MediatR

or via the .NET Core CLI:

dotnet add package MediatR

dotnet add package MediatR.Extensions.Microsoft.DependencyInjection

I have installed MediatR and MediatR.Extensions.Microsoft.DependencyInjection in my demo project.

MediatR installation
Register MediatR with the assembly

Next, you have to register MediatR with your assembly and inject IMediator into your controller.
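A minimal sketch of the registration and the constructor injection, assuming a Startup class and a UsersController (the controller name is illustrative):

using MediatR;
using Microsoft.AspNetCore.Mvc;

// Registered once in Startup.ConfigureServices, scanning this assembly for handlers:
//   services.AddMediatR(typeof(Startup).Assembly);

[ApiController]
[Route("api/[controller]")]
public class UsersController : ControllerBase
{
    private readonly IMediator _mediator;

    // IMediator is resolved by the built-in dependency injection container
    public UsersController(IMediator mediator)
    {
        _mediator = mediator;
    }
}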

A mediator call consists of a request and a handler. Every request has its own handler. The request can be anything from a complex object to an integer value.

In the following image, you can see that we are sending a request to the mediator, which takes a CreateUserCommand as its model.

Mediator request call
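A sketch of what that action might look like, assuming the UsersController above and a CreateUserCommand bound from the request body:

// POST api/users: send the command to MediatR, which routes it to its handler
[HttpPost]
public async Task<IActionResult> CreateUser([FromBody] CreateUserCommand command)
{
    var user = await _mediator.Send(command);
    return Ok(user);
}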

Request

Requests are of two types.

1. IRequest<TResponse> – returns a TResponse
2. IRequest – returns nothing

A request is a class that has a handler attached to it. As you can see in the image below, I have implemented IRequest<TResponse>: the CreateUserCommand request returns an AppUser.
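As a sketch, such a request class could look like this (the AppUser entity and the command's properties are assumptions for illustration):

using MediatR;

// CreateUserCommand is a request whose handler returns an AppUser
public class CreateUserCommand : IRequest<AppUser>
{
    public string FirstName { get; set; }
    public string LastName { get; set; }
    public string Email { get; set; }
}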

Handler

Request handlers are of two types.

1. IRequestHandler<TRequest, TResponse> – returns TResponse
2. IRequestHandler<TRequest> – returns nothing

As you can see in the image below, I have implemented IRequestHandler<TRequest, TResponse>. This handler handles the CreateUserCommand request and returns an AppUser.

Mediator handler request
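A sketch of such a handler, assuming a simple repository abstraction (IUserRepository and its AddAsync method are illustrative, not part of MediatR):

using System.Threading;
using System.Threading.Tasks;
using MediatR;

public class CreateUserCommandHandler : IRequestHandler<CreateUserCommand, AppUser>
{
    private readonly IUserRepository _repository; // assumed data-access abstraction

    public CreateUserCommandHandler(IUserRepository repository)
    {
        _repository = repository;
    }

    public async Task<AppUser> Handle(CreateUserCommand request, CancellationToken cancellationToken)
    {
        // Map the command to the entity and persist it
        var user = new AppUser
        {
            FirstName = request.FirstName,
            LastName = request.LastName,
            Email = request.Email
        };

        await _repository.AddAsync(user, cancellationToken);
        return user;
    }
}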

Conclusion

The Mediator design pattern helps us reduce coupling and promotes the SOLID principles. You should use it if you think it is a good fit for your project. In this blog, I have explained the pattern and its implementation. As a reference, you can find the complete source code in my demo app.

CQRS Pattern with ASP.NET Core

CQRS (Command Query Responsibility Segregation) is a pattern that separates read and write operations into different models. Command (insert/update) operations have one model, and query (read) operations have another. Separating them allows each model to focus on the task it is performing.

Traditionally, commands have a large model, since the model maps to a whole database table and carries business logic that validates it before saving it to the database. Queries, on the other hand, have a simple model that returns a dataset shaped to the UI's requirements.

Traditional vs CQRS

In traditional applications, we use the same data model for read and write operations. This is fine for simple CRUD applications, but if you have a large, complex application with a large model, it gets really difficult to manage: every change can cause issues in testing and add bloated code.

When we use CQRS, we can have either one database or two databases, one for commands and one for queries. If we create two databases, we have to keep them in sync, which is additional work.

How to implement it?

You will see two folders, one for commands and another for queries. The query folder has one model class and one handler. I am using the mediator pattern here, which I explain in my other blog post.

Query:

In the following image, you can see that the handler class receives the query model and queries the database to get all users.
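A sketch of the query side, assuming a GetAllUsersQuery model and the same repository abstraction used earlier (both names are illustrative):

using System.Collections.Generic;
using System.Threading;
using System.Threading.Tasks;
using MediatR;

// Query model: no parameters, returns a list of users
public class GetAllUsersQuery : IRequest<List<AppUser>> { }

// Handler: reads from the data store and returns the result, no writes
public class GetAllUsersQueryHandler : IRequestHandler<GetAllUsersQuery, List<AppUser>>
{
    private readonly IUserRepository _repository; // assumed data-access abstraction

    public GetAllUsersQueryHandler(IUserRepository repository)
    {
        _repository = repository;
    }

    public async Task<List<AppUser>> Handle(GetAllUsersQuery request, CancellationToken cancellationToken)
    {
        return await _repository.GetAllAsync(cancellationToken);
    }
}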

Command:

In the following image, you can see that the handler class receives the command model and performs a write operation to save the user in the database.

Conclusion:

There are pros and cons to everything. Some of you may feel that implementing CQRS adds additional complexity to the project. In my opinion, it comes down to the use case of the project and how you use the pattern. I have also made a working demo app on GitHub with the CQRS pattern.

Introduction to Microservices

Microservices is a buzzword that is very popular nowadays. I have spoken to a few developers, and every single one of them defines it differently. Some say 'microservices are just a pattern for breaking a large monolithic application into smaller monolithic applications', while others describe them as 'SOA (service-oriented architecture) done right'. In this blog, we will focus on microservices.

What is a Microservice?

It is an architectural style used to build large, complex applications as small, modular, loosely coupled services that are independent, easily manageable, and deployable. In simple words, microservices help you break a large application into smaller applications that each enforce the SRP (single responsibility principle).

Companies run into trouble when a large application becomes too difficult to manage and upgrade. Microservices can help us break down large, complex applications into small, independent applications that communicate with each other using language-independent interfaces (APIs).

History of Microservices

The term 'microservice' was coined in mid-2011.

The idea behind microservices is just good architecture principles, and it has been around for a few decades. In the early 1980s, Sun Microsystems introduced one of the first distribution technologies, Remote Procedure Calls (RPC).

Who is using it?

Companies like Netflix, Amazon, Uber, eBay, and Spotify use a microservice architecture that helps them serve request-intensive workloads in a scalable manner. Many other companies are also moving their monolithic products to a microservice architecture.

Microservices come with both advantages and disadvantages.

Advantages:

Modular and independent

By design, a microservice application is broken down into multiple service components that can easily be developed by small teams. Each service component focuses on a specific module and can be developed and deployed on Docker independently. This helps new team members understand the functionality in less time.

Decentralised and cross-functional

As Conway's law suggests, the design of the software reflects the social fabric of the organisation, and vice versa. Small teams in which every member is responsible for a business function are the ideal organisation for developing microservices. Architects, developers, QA engineers, product owners, analysts, and DevOps engineers are each responsible for their own piece of the microservices.

Highly Resilient

Microservices are better for fault isolation: if one service fails, the other services continue to work without impacting the whole system. If a monolithic application fails, everything it is responsible for fails along with it. Microservices are therefore highly resilient, though they need an orchestrator and good cloud infrastructure for high availability.

Highly Scalable

Microservices are designed for scaling. Scaling allows an application to react to variable load by increasing and decreasing the number of instances of the services it uses. Microservices are also cost-efficient, since you scale services based on their demand, whereas in a monolithic application you would have to scale the whole application regardless of which service or function is heavily used.

Disadvantages

Data Consistency

Microservices are designed to have their own data storage. This raises the question of how we achieve data consistency. We use an event-driven approach with message queue technology (Service Bus, RabbitMQ, etc.). For example, if the price of a product is changed in the product microservice, we have to update the price in the basket microservice.

Latency

Microservices form a distributed architectural pattern. We have to make calls to different services to get data, which leads to more round trips and higher latency. However, you can minimise this by using an API aggregation design that aggregates the data through an API gateway.

High Complexity

Microservices are small and modular, but when an application consists of hundreds of microservices calling each other, it becomes really complex.

Conclusion

Today, we have discussed what a microservice is, along with its advantages and disadvantages. I have also made a demo app to help you understand the whole concept. You can download the code and play around with it. In the upcoming articles, I will take a more hands-on approach and walk you through how to develop a microservice.