Serverless Architecture Using Docker Container

Serverless Solution to run Docker Container using Amazon Fargate

Client: Mail Sorting Technology Company

Challenge:

  • Difficulty managing the various Docker containers
  • No lifecycle management for the compute infrastructure
  • Limited visibility into the Docker container environment
  • No modern technology implementation
  • No internal security for the Docker containers

Results:

  • SaaS-enabled software in the cloud, delivering a 130 ms response time
  • Mission-critical, 99.9% available, auto-scalable solution on AWS
  • Pay-per-use pricing based on the load on each container
  • Cost-effective, auto-scaling solution
  • No IT support required for server maintenance

Wanted to Develop an Easy-to-Manage Serverless Docker Container

The client wanted to develop an inline web service for customers who run its mail sorting machines, performing data verification on the backend. The client did not have the budget to build a data center or host its own servers, so it needed to embrace modern cloud-based technologies to save on costs. The main challenge was to run the service inline on a machine with a response time under 150 ms. Since the number of customers/machines signing up for the service was unknown, it had to be able to scale up at short notice.

Implemented Modern Technology for Auto-Scalable Solution

Our team of architects built the functional design and followed on with the logical and physical architecture. We used the MEAN (MongoDB, Express.js, Angular.js, Node.js) stack on the Amazon cloud, which provided a very light application server that can be scaled across multiple nodes according to load. Our team then containerized the application using Docker and Kubernetes. The front-processing nodes were packaged as Docker containers and deployed on Amazon's Fargate infrastructure. AWS Fargate provides excellent tools for load balancing and failover across stateless Docker containers. The Bursys DevOps team set up auto-scaling of the Docker containers, so new containers are launched automatically when demand increases.

Delivered a Highly Automated Docker Container with No Manual Intervention

Bursys delivered a serverless Docker container architecture on AWS Fargate. Each container node is lightweight and incurs very little cost on Amazon Fargate, as it requires minimal memory, compute, and disk. Once the number of requests on a Docker container reaches its defined capacity, Kubernetes automatically fires up a new container with no manual intervention.
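The "fire up a new container at defined capacity" behavior described above maps to a Kubernetes HorizontalPodAutoscaler, assuming the containers run under Kubernetes (e.g. EKS on Fargate) as the text mentions. All names and thresholds here are hypothetical; the actual deployment may use different metrics or limits.

```yaml
apiVersion: autoscaling/v2
kind: HorizontalPodAutoscaler
metadata:
  name: front-processor          # hypothetical deployment name
spec:
  scaleTargetRef:
    apiVersion: apps/v1
    kind: Deployment
    name: front-processor
  minReplicas: 1                 # off-hours baseline
  maxReplicas: 20                # daytime peak
  metrics:
    - type: Resource
      resource:
        name: cpu
        target:
          type: Utilization
          averageUtilization: 70 # scale out when average CPU exceeds 70%
```

With a policy like this, new containers launch automatically as load rises and are reclaimed when it falls, with no operator involvement.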

The client pays for additional nodes only during the day, when the machines are active and making large numbers of requests; there is no need to pay for those nodes in off-hours, when request volume drops due to reduced or no machine activity. Our fully automated build process deploys new code builds to the cloud automatically. No IT support staff is required to maintain the servers, and there is no scrambling when new customers sign up and load increases unexpectedly.
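The automated build-and-deploy step mentioned above can be sketched as a short deployment script, assuming an Amazon ECR repository and an ECS service; the registry address, cluster, and service names here are hypothetical placeholders, not the client's actual values.

```shell
#!/bin/sh
set -eu

# Hypothetical image location in Amazon ECR.
IMAGE="123456789012.dkr.ecr.us-east-1.amazonaws.com/front-processor:latest"

# Build and push the new container image.
docker build -t "$IMAGE" .
aws ecr get-login-password | docker login --username AWS --password-stdin "${IMAGE%%/*}"
docker push "$IMAGE"

# Roll the new image out: Fargate replaces the running tasks
# one by one with no manual intervention.
aws ecs update-service \
  --cluster mail-sort \
  --service front-processor \
  --force-new-deployment
```

A CI system runs a script like this on every merged build, which is what removes the need for IT staff to touch the servers.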

Let Us Help You Build a Serverless Platform with Managed Services!
