Thoughts before scaling

Just creating a microservice is not sufficient. We are doing all of this for performance, which in turn means a smooth user experience, which in turn attracts more users to the platform. So, with an increasing number of users, scaling a microservice becomes important, and a feedback loop forms: more users demand more scaling; better scaling gives a better user experience, which brings more users to the platform, which again demands more scaling. With microservices, everything is more granular, including scalability and managing spikes in demand. However you view the challenges of scaling microservices, from the customer or end-user perspective, what ultimately matters is the performance of the app itself.

Let's take a step back. Before doing any dynamic scaling of microservices, there are some key points to think through:

  • Does the infra support it: You need to know whether the infrastructure the system runs on supports dynamic scaling. If it does, we can move forward; if not, the next question is whether it supports manual scaling. If only manual scaling is available, that affects the rest of the decisions we will make about scaling policies.
  • Policy for scaling up and down: What should the policy be for scaling the system up and down? These policies can depend on CPU usage, network traffic in and out, and so on.
  • If a dynamic instance of our server comes up during peak time, our monitoring system should be able to monitor that dynamically added instance as well.
  • Microservice-based architectures are heavily decentralized because they focus on the reusability of individual components, which helps with scaling. Microservices make it easier to scale because each service can scale independently of the others.
  • The increasing number of microservices makes containers the most promising option for deploying them. There are many tools available to containerize your microservice; open source orchestration tools like Kubernetes can be a good choice. Major cloud providers are also adding container support to their platforms, such as Amazon ECS and Google Container Engine (GKE), and they provide mechanisms for easily scaling container-based applications depending on load and business rules.
  • Containerized deployment also increases the challenges at scale. Monitoring is one of them: at a large enough scale of containers, monitoring itself becomes a big data problem. With so much monitoring data, collecting it, making sense of it, and reacting to it is a challenge in itself.
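The scale-up/scale-down policy mentioned above can be sketched as a simple threshold rule on a metric such as CPU usage. The function below is a minimal illustration, not any platform's actual API; the thresholds, step size, and instance limits are assumptions you would tune for your own system.

```python
# A minimal sketch of a threshold-based scaling policy.
# Thresholds, step size, and instance limits are illustrative assumptions.

def desired_instances(current: int, cpu_percent: float,
                      min_instances: int = 2, max_instances: int = 10,
                      scale_up_at: float = 75.0,
                      scale_down_at: float = 25.0) -> int:
    """Return the instance count a scaling policy would target."""
    if cpu_percent > scale_up_at:
        target = current + 1      # busy: scale up one step at a time
    elif cpu_percent < scale_down_at:
        target = current - 1      # idle: scale down one step at a time
    else:
        target = current          # within the comfortable band: no change
    # Clamp to the allowed range so we never scale to zero or past capacity.
    return max(min_instances, min(max_instances, target))

print(desired_instances(current=3, cpu_percent=90.0))  # scale up -> 4
print(desired_instances(current=3, cpu_percent=10.0))  # scale down -> 2
print(desired_instances(current=2, cpu_percent=10.0))  # at the floor -> 2
```

In a real deployment you would not write this loop yourself: an orchestrator such as Kubernetes exposes the same idea declaratively (for example, a Horizontal Pod Autoscaler with a target CPU utilization), but the underlying decision is the same clamp-and-step rule shown here.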

There are a few challenges you will face when scaling containers:

  • As mentioned earlier, there can be many containers, dynamically increasing and decreasing in number based on platform load. It is therefore crucial to collect logs before these containers come down; this is an important consideration when scaling containers.
  • Another point of concern is testing. Containerized deployment at scale magnifies the inherent challenges of container deployment itself. An increased number of containers raises the importance of automated testing, API testing, communication performance testing, and so on.
  • Network security is another issue that becomes a consideration point at scale. How can someone plan or architect a solution where each container has a certain level of network security without affecting the performance of the application?
  • Most importantly, the team should have enough knowledge of the platform and containers that are going to be used in the production environment. This applies not only to the DevOps team; developers should also be comfortable with container technology, aware of what is happening inside a container, and able to debug it.
  • The deployment pipeline also grows and becomes more complex, so a lot of homework is required on CI and CD. In the microservice world, with so many different services developed independently, the number of deployments increases dramatically.
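The point about collecting logs before containers come down can be made concrete: orchestrators such as Kubernetes send SIGTERM to a container before stopping it, giving the process a short window to flush anything it has buffered. The sketch below illustrates that pattern in Python; the in-memory buffer and the `ship_logs()` destination are hypothetical stand-ins for a real log shipper.

```python
# Sketch: flush buffered logs before a container is stopped.
# Orchestrators typically send SIGTERM before killing a container, so a
# service that buffers logs can use that grace period to ship them out.
# The buffer and ship_logs() destination are illustrative assumptions.

import signal
import sys

log_buffer = []

def log(message: str) -> None:
    """Buffer a log line instead of writing it immediately."""
    log_buffer.append(message)

def ship_logs() -> None:
    """Send buffered logs to a (hypothetical) central collector."""
    for line in log_buffer:
        print(line, file=sys.stderr)   # stand-in for a real log shipper
    log_buffer.clear()

def handle_sigterm(signum, frame):
    ship_logs()                        # last chance to get the logs out
    sys.exit(0)

# Register the handler so shutdown triggers a final flush.
signal.signal(signal.SIGTERM, handle_sigterm)
```

In practice, most teams avoid buffering inside the application at all and instead log to stdout/stderr, letting a sidecar or node-level agent collect the stream, but the same lesson applies: the collection path must outlive the container.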

As per studies, only about 19 percent of developers are using container technology in production; other teams are still facing issues with performance, testing, security, and so on. As cloud providers enter the field with container services that address these issues, this number is expected to increase.
