
Harnessing the power of Kubernetes: 7 use cases

According to The 2021 Kubernetes Adoption Report, 68% of respondents claim that their usage of K8s increased due to the recent (pandemic) crisis. The main reasons for choosing this container orchestration tool were cost reduction, increased automation, and more frequent deployments. In this article, you will find practical examples of how else Kubernetes can benefit your projects (and, of course, your company).

Fig. 1: Kubernetes use cases (infographic)

1. Learning Kubernetes by deploying a simple app

The first case where you can make use of Kubernetes may seem controversial, but it is still very useful. Let’s assume that we have a simple three-tier application with a backend written in Python or PHP, a database, and a frontend created in React or Angular. You can use Kubernetes to deploy it. Yes, from a purely practical point of view this is not very reasonable: Kubernetes is complex, and creating a Kubernetes cluster to run one simple app would mean doing unnecessary work. Furthermore, you can deploy such an app using other, less expensive solutions. But there is an educational purpose that shouldn’t be overlooked: in undertaking such a deployment, you will learn how to run a Kubernetes cluster and deploy applications on it.
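To give a feel for what such a learning exercise involves, here is a minimal sketch of deploying just the frontend tier and exposing it inside the cluster. All names and the container image are hypothetical placeholders, not part of the original example:

```yaml
# Minimal sketch: the frontend tier of the simple three-tier app.
# "simple-frontend" and the image reference are hypothetical.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: simple-frontend
spec:
  replicas: 2
  selector:
    matchLabels:
      app: simple-frontend
  template:
    metadata:
      labels:
        app: simple-frontend
    spec:
      containers:
        - name: frontend
          image: registry.example.com/simple-frontend:1.0  # hypothetical image
          ports:
            - containerPort: 80
---
# A Service giving the frontend a stable address inside the cluster.
apiVersion: v1
kind: Service
metadata:
  name: simple-frontend
spec:
  selector:
    app: simple-frontend
  ports:
    - port: 80
      targetPort: 80
```

Applying these two manifests with kubectl apply already walks you through the core concepts: Deployments, Pods, labels and Services.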

There is one more practical, more advanced scenario where we can use Kubernetes to deploy apps. Imagine we work in a creative agency that is developing a marketing webpage for a client in the pharmaceutical industry. Each medicine advertised on the main page requires a separate webpage presenting a leaflet with information about the medicine: its ingredients, dosage, possible adverse effects, etc. Each medicine would also have a dedicated app. In this scenario, we would be well advised to call on the power of Kubernetes. Thanks to the better resource allocation it affords, it will be cheaper to run one dedicated K8s cluster than a separate server for each website. What’s more, it will be much easier to manage such a cluster than to maintain separate hosts.
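One common way to host all those per-medicine sites on a single cluster is to route them through one Ingress, with each hostname pointing at its own Service. This is only a sketch under that assumption; the hostnames, Service names and ingress class below are hypothetical:

```yaml
# Hypothetical sketch: one Ingress routing two medicine sites hosted in the same cluster.
apiVersion: networking.k8s.io/v1
kind: Ingress
metadata:
  name: medicine-sites
spec:
  ingressClassName: nginx            # assumes an NGINX ingress controller is installed
  rules:
    - host: medicine-a.example.com   # hypothetical hostname
      http:
        paths:
          - path: /
            pathType: Prefix
            backend:
              service:
                name: medicine-a-web # hypothetical per-medicine Service
                port:
                  number: 80
    - host: medicine-b.example.com
      http:
        paths:
          - path: /
            pathType: Prefix
            backend:
              service:
                name: medicine-b-web
                port:
                  number: 80
```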


2. Microservices architecture

Deploying a more complicated app with many components that communicate with one another is a classic scenario for Kubernetes. In fact, its origins go back to Google’s need to deploy, manage and scale applications more efficiently by using containers; that’s how the container orchestration platform Kubernetes was born. So, we now have a K8s cluster with one complicated app deployed. This app has numerous components that communicate with one another, and Kubernetes helps you manage this communication.

This is closely related to another important trend in software development: microservices architecture, which I’ll explain using the example of an Internet bookstore. In such a store, we have different functionalities: managing users, ordering books, managing order lists, etc. There can be many such functionalities, and each of them is implemented as a separate app; these small, single-purpose apps are aptly called microservices. All these apps must communicate with each other, and the code enabling that communication and coordination must be written in the programming language of each component.
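On the Kubernetes side, this inter-component communication is typically built on Services: each microservice gets a stable, DNS-resolvable name inside the cluster, regardless of which Pods are currently running behind it. A minimal sketch, assuming a hypothetical "orders" microservice from the bookstore example:

```yaml
# Hypothetical sketch: the "orders" microservice exposed as a Service.
# Other microservices can reach it at http://orders:8080 via cluster DNS.
apiVersion: v1
kind: Service
metadata:
  name: orders
spec:
  selector:
    app: orders          # matches the labels on the orders Pods
  ports:
    - port: 8080
      targetPort: 8080
```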

Here you can clearly see the power of Kubernetes in managing microservices. It handles such tasks for developers as detecting problems in the communication between intra-app components, managing the behaviour of components in the event of a failure, and managing the authentication processes between components. What’s more, as a particular component needs more or fewer resources, Kubernetes automatically scales it up or down. This is a clear advantage of the microservices architecture: scalability. You can scale a single component rather than the whole app.

Kubernetes has built-in tools like the Horizontal Pod Autoscaler, which help ensure that each microservice has the optimal number of replicas. Thanks to this, cluster operators can be sure that the application has enough resources to work smoothly but doesn’t waste valuable ones.
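As an illustration, a Horizontal Pod Autoscaler for the hypothetical "orders" microservice might look like the sketch below; the replica bounds and target CPU utilization are example values only:

```yaml
# Hypothetical sketch: scale the "orders" Deployment between 2 and 10 replicas,
# aiming at roughly 70% average CPU utilization across its Pods.
apiVersion: autoscaling/v2
kind: HorizontalPodAutoscaler
metadata:
  name: orders
spec:
  scaleTargetRef:
    apiVersion: apps/v1
    kind: Deployment
    name: orders
  minReplicas: 2
  maxReplicas: 10
  metrics:
    - type: Resource
      resource:
        name: cpu
        target:
          type: Utilization
          averageUtilization: 70
```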

Of course, at the design stage, it has to be decided which architecture is better for a given app, as there are many different approaches to software development. Microservices are not always the best choice. Still, if microservice architecture is chosen, Kubernetes offers a number of advantages. It simplifies the entire process of managing app components and considerably reduces the work needed to get the app up and running.

3. Lift and shift—from servers to cloud

This scenario occurs frequently today, as software is migrated from on-prem infrastructure to cloud solutions. Let’s imagine the following situation. We have an application deployed on physical servers in a classical data center. For practical or economic reasons, it has been decided to move it to the cloud: either to a Virtual Machine or to big pods in Kubernetes. Of course, moving it to big pods in K8s isn’t a cloud-native approach, but it can be treated as an intermediate phase. First, the big app running outside the cloud is moved as-is into Kubernetes. It is then gradually split into smaller components to become a regular cloud-native app. This methodology is called “lift and shift” and is a good use case where Kubernetes can be used effectively.
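In that intermediate phase, the monolith can simply be wrapped in a single-replica Deployment with resource requests sized to match the old server. This is only a rough sketch; the image reference and resource figures are hypothetical:

```yaml
# Hypothetical sketch: the lifted-and-shifted monolith running as one large Pod.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: legacy-monolith
spec:
  replicas: 1
  selector:
    matchLabels:
      app: legacy-monolith
  template:
    metadata:
      labels:
        app: legacy-monolith
    spec:
      containers:
        - name: monolith
          image: registry.example.com/legacy-monolith:2024.1  # hypothetical image
          resources:
            requests:
              cpu: "8"          # example values sized to the original server
              memory: 32Gi
            limits:
              cpu: "16"
              memory: 64Gi
```

From here, individual functions can be carved out into their own Deployments one at a time, moving the app toward a cloud-native shape.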

4. Cloud-native Network Functions (CNF)

A few years ago, big telco companies had a problem. Their network services were based on hardware such as firewalls or load balancers provided by specialized hardware companies. Of course, this left them dependent on the hardware providers, and gave them little in the way of flexibility. If new functionality was needed, operators had to upgrade existing hardware. When a device firmware update was not possible, additional hardware had to be purchased. To address this disadvantage, the telcos opted to have all these network services as software and use Virtual Machines and OpenStack for network function virtualization (NFV).

A step further is to use containers rather than VMs for the same purpose. This approach is called Cloud-native Network Functions (CNF). Our R&D team has prepared a demo of CNF deployment in a Kubernetes environment, along with an in-depth discussion of related network and operational aspects. The results are now freely available as a Service Function Chaining for Cloud-native Network Functions webinar on CodiLime’s YouTube channel.

Are you wondering how to build CNFs? Check out this article for more details on how to build Cloud-native network functions using the Ligato framework.


5. Machine learning and Kubernetes

Machine learning (ML) techniques are now widely used to solve real-life problems. Successes have come in multiple fields: self-driving cars, image recognition, machine translation, speech recognition and game playing (Go or poker). ML models have beaten even humans at Go, once thought to be too difficult a game for machines to crack. Moreover, AI could lead to real breakthroughs in detecting cancer and in drug discovery. The business world has not failed to get in on the technology, either.

Yet the process of building an effective AI model and using it in production is complicated and time-consuming. Building an app that can reliably recognize whether an image shows a cat or a dog is a case in point. First, a large dataset of images tagged “cat” or “dog” must be uploaded. Then a machine learning model is trained on it to classify the images; trained well enough, that is, to correctly recognize images that appear in neither the training nor the test dataset. Once the model is trained, it is embedded in an app that will be made available to the public.

As you can see, it takes time to get from a trained AI model to an application that uses it. Therefore, many companies would like to simplify this process and make the life of data scientists and ML engineers easier by introducing a toolkit that speeds up the whole workflow. In this way, the number of operations necessary to deploy such an app is significantly reduced, shortening the app’s time-to-market. In this scenario, enterprises can harness the power of Kubernetes, as all the calculations necessary to train the ML model are performed inside the K8s cluster. The data scientist or ML engineer only needs to clean the data and write the code; the rest is handled by a toolkit based on Kubernetes. Such toolkits are already available on the market: Kubeflow by Google and CodiLime spin-off Neptune both come to mind. The increasing demand for AI-powered solutions will surely further promote the adoption of Kubernetes.
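Under the hood, a single training run on a cluster can be expressed as a Kubernetes Job that requests a GPU. The sketch below is an assumption-laden illustration, not how Kubeflow or Neptune work internally; the image, command and the nvidia.com/gpu resource (which requires the NVIDIA device plugin on the cluster) are all hypothetical choices:

```yaml
# Hypothetical sketch: a one-off training run as a Kubernetes Job on a GPU node.
apiVersion: batch/v1
kind: Job
metadata:
  name: cats-vs-dogs-training
spec:
  backoffLimit: 2
  template:
    spec:
      restartPolicy: Never
      containers:
        - name: trainer
          image: registry.example.com/cats-vs-dogs:latest    # hypothetical training image
          command: ["python", "train.py", "--epochs", "20"]  # hypothetical entrypoint
          resources:
            limits:
              nvidia.com/gpu: 1   # assumes the NVIDIA device plugin is installed
```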

6. Computing power for resource-hungry tasks

Recently, a Swiss university broke the Guinness World Record for computing Pi by reaching 62.8 trillion decimal places. Such calculations require huge computing power, and a Kubernetes cluster would be a natural way to manage the distribution of the calculations across multiple computers. Were we to follow in their footsteps, we would only need to write a program to perform the calculations; Kubernetes would handle the rest. Another computation-heavy case that could make use of the power of K8s is drug discovery.
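One way to fan such a computation out across a cluster is a Kubernetes Indexed Job: each Pod receives its own completion index and works on its slice of the problem. A minimal sketch; the worker image and how it interprets the index are hypothetical:

```yaml
# Hypothetical sketch: an Indexed Job spreading a computation over 100 work items,
# 10 Pods at a time; each Pod reads the JOB_COMPLETION_INDEX env var to pick its slice.
apiVersion: batch/v1
kind: Job
metadata:
  name: pi-digits
spec:
  completions: 100
  parallelism: 10
  completionMode: Indexed
  template:
    spec:
      restartPolicy: Never
      containers:
        - name: worker
          image: registry.example.com/pi-worker:latest  # hypothetical worker image
          command: ["/worker"]                          # reads JOB_COMPLETION_INDEX
```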

7. CI/CD—software development lifecycle

Kubernetes also brings considerable benefits to the Continuous Integration/Continuous Delivery (or Continuous Deployment) methodology (you can read more about CI/CD in our blog post). This is a logical continuation of the use cases presented in points 1 and 2. In the cloud-native application development approach, CI/CD pipeline tools must be chosen wisely, because once an app is deployed to production, its operation has to be constantly monitored, in addition to gathering users’ feedback and developing new features. Whether it is for testing, frequent releases or deploying newer versions of an app, Kubernetes makes everything simpler and more manageable.
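A large part of what makes frequent releases painless is the Deployment rolling-update mechanism: new Pods are brought up and old ones retired gradually, so a release doesn’t take the app down. A minimal sketch, with hypothetical names and example surge/unavailability settings:

```yaml
# Hypothetical sketch: rolling updates for zero-downtime releases driven by a CI/CD pipeline.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: webshop-api
spec:
  replicas: 4
  strategy:
    type: RollingUpdate
    rollingUpdate:
      maxSurge: 1          # at most one extra Pod during the rollout
      maxUnavailable: 0    # never drop below the desired replica count
  selector:
    matchLabels:
      app: webshop-api
  template:
    metadata:
      labels:
        app: webshop-api
    spec:
      containers:
        - name: api
          image: registry.example.com/webshop-api:1.4.2  # the pipeline bumps this tag on each release
          ports:
            - containerPort: 8080
```

With such a manifest in place, a pipeline step only needs to update the image tag (for example with kubectl set image deployment/webshop-api api=registry.example.com/webshop-api:1.4.3, names hypothetical) and Kubernetes performs the rollout.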

Kubernetes can also be extended with the Kubernetes Gateway API. You can learn about examples of how to use it in this video:

Conclusion 

Kubernetes, not without reason, is becoming more widely used by IT professionals year after year. This container orchestration solution works well in private and public clouds as well as on on-premises servers.

In the above-mentioned cases, K8s helps ensure application stability: the software can be changed or updated easily and without downtime. Kubernetes can also help reduce your costs, especially if you use significant compute resources.

Original post date 07/02/2019, update date 03/03/2022.


Maciej Sawicki

Technical Leader

Maciej is a member of the cloud development team. He manages data center operations and assets, and is also involved in setting IT policies, technical standards and methods. Maciej is also a certified GCP Cloud Architect, GCP trainer, Certified Kubernetes Application Developer and a member of Google...

Karolina Rusinowicz

Content writer

A content writer with a passion for software development and a unique blend of creativity and technical expertise. Karolina has been crafting engaging and insightful articles in collaboration with seasoned developers. In her writing, Karolina breaks down complex technical concepts into accessible and...

