
Cloud Architect | AWS, GCP, Azure, Python, Kubernetes, Terraform, Ansible


  • Install Python.
  • Install the AWS CLI, then verify the installation:
which aws
aws --version
  • Install and configure Git.
  • Set up a text editor such as Sublime Text. Here I set up Sublime Text as the Git mergetool.
  • Configure Visual Studio Code.


  1. Update requirements.txt to specify the Python modules that are required to deploy the Dataflow jobs using virtualenv in Cloud Shell.
  2. Paste the following list of modules into requirements.txt.
  • This list ensures that the correct Python modules are installed so that you can deploy the Python Dataflow jobs.
  • The list also includes the Faker module and some dependencies that are required when you deploy and test a streaming Dataflow job.
  3. Enter the following…
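A minimal sketch of the virtualenv setup described in step 1 (the environment name df-env is arbitrary, and this assumes requirements.txt has already been populated):

```shell
# Sketch only: create an isolated environment for the Dataflow jobs in Cloud Shell.
# "df-env" is an arbitrary name, not from the original instructions.
python3 -m venv df-env            # create the virtual environment
. df-env/bin/activate             # activate it for this shell session
pip install --upgrade pip         # a recent pip resolves Beam dependencies more reliably
if [ -f requirements.txt ]; then
  pip install -r requirements.txt # install the modules pasted in step 2
fi
```

Deactivate with `deactivate` when you are done; the environment lives entirely in the df-env directory.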


A service mesh is a dedicated infrastructure layer for handling service-to-service communication. It’s responsible for the reliable delivery of requests through the complex topology of services that comprise a modern, cloud native application.

Service mesh solutions have two distinct components that behave somewhat differently:

The data plane is composed of a set of intelligent proxies (Envoy) deployed as sidecars. These proxies mediate and control all network communication between microservices, along with Mixer, a general-purpose policy and telemetry hub.

The control plane manages and configures the proxies to route traffic. …
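As a hedged sketch of how the two planes show up in practice with Istio (assuming a cluster where Istio is already installed; the namespace name "default" is just an example, and the commands are guarded so they are safe to run anywhere):

```shell
# Sketch only: runs the real commands when a cluster is reachable, otherwise explains itself.
if command -v kubectl >/dev/null 2>&1 && kubectl cluster-info >/dev/null 2>&1; then
  # Data plane: label the namespace so Istio injects the Envoy sidecar into new pods.
  kubectl label namespace default istio-injection=enabled --overwrite
  # Control plane: istiod is the component that configures those proxies; check it exists.
  kubectl get deploy istiod -n istio-system || echo "istiod not found in istio-system"
  MESH_STATUS="labeled"
else
  echo "No reachable cluster; run these commands where Istio is installed."
  MESH_STATUS="skipped"
fi
```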


Kubeflow provides a simple, portable, and scalable way of running Machine Learning workloads on Kubernetes.

In this module, we will install Kubeflow on Amazon EKS, run single-node training and inference using TensorFlow, train and deploy a model locally and remotely using Fairing, set up a Kubeflow pipeline, and review how to call AWS managed services such as SageMaker for training and inference.

Increase cluster size

We need more resources to complete this chapter of the EKS Workshop. First, we'll increase the size of our cluster to 6 nodes.
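With eksctl, scaling the cluster to 6 nodes can be sketched as follows (the cluster and nodegroup names are placeholders in the workshop's style; substitute your own, and note the command only runs when the cluster is actually reachable):

```shell
# Placeholder names; replace with your cluster and nodegroup.
CLUSTER=eksworkshop-eksctl
NODEGROUP=nodegroup
if command -v eksctl >/dev/null 2>&1 && eksctl get cluster --name "$CLUSTER" >/dev/null 2>&1; then
  # Scale the nodegroup to 6 nodes, raising the maximum alongside the desired count.
  eksctl scale nodegroup --cluster "$CLUSTER" --name "$NODEGROUP" --nodes 6 --nodes-max 6
else
  echo "Cluster not reachable; would run: eksctl scale nodegroup --cluster $CLUSTER --name $NODEGROUP --nodes 6"
fi
```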

This document will help you try out an EKS/Fargate deployment in a cloud environment.

Create Cloud9 environment

Create a Cloud9 environment and increase the disk size on the Cloud9 instance.

Note: The above command…


TCP: 53, 135, 389, 445, 464, 636, 3268, 3269, 49152–65535
UDP: 53, 88, 135, 389, 445, 464, 636, 3268, 3269, 123, 137, 138

Purpose of the ports:

  • UDP Port 88 for Kerberos authentication.
  • UDP and TCP Port 135 for domain controller-to-domain controller and client-to-domain controller operations.
  • TCP Port 139 and UDP Port 138 for the File Replication Service between domain controllers.
  • TCP and UDP Port 389 for LDAP, to handle normal queries from client computers to the domain controllers.
  • TCP and UDP Port 445 for the File Replication Service.
  • TCP and UDP Port 464 for Kerberos password changes.
  • TCP Port 3268 and 3269 for Global…
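To sanity-check TCP reachability of the fixed ports above from a member server, a loop like this can help (dc.example.com is a placeholder for your domain controller; the UDP ports and the 49152–65535 dynamic range need a different tool, such as nmap):

```shell
# Placeholder host; replace with your domain controller's name or IP.
DC=dc.example.com
# Fixed TCP ports from the list above (the dynamic RPC range is omitted).
for port in 53 135 389 445 464 636 3268 3269; do
  if timeout 2 bash -c "echo > /dev/tcp/$DC/$port" 2>/dev/null; then
    echo "TCP $port reachable"
  else
    echo "TCP $port unreachable or filtered"
  fi
done
```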


  • HTTP pipelining is a technique in which multiple HTTP requests are sent on a single TCP connection without waiting for the corresponding responses.
  • HTTP/1.1 supports pipelining. The only limitation is that the server has to return responses in the same order as the client's requests: if the client makes requests 1, 2, and 3, the server's responses must come back in the order 1, 2, 3. If request 1 takes a long time, this introduces the problem known as head-of-line blocking.
  • In HTTP/2, each request/response tuple is associated with a unique ID, and is called…
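For illustration, the raw bytes of two pipelined HTTP/1.1 requests on a single connection look like this (example.com and the paths are placeholders; you could pipe the output into `nc example.com 80` to observe the responses coming back in request order):

```shell
# Two requests written back-to-back on one connection, before any response arrives.
PIPELINED=$(printf 'GET /first HTTP/1.1\r\nHost: example.com\r\n\r\nGET /second HTTP/1.1\r\nHost: example.com\r\nConnection: close\r\n\r\n')
printf '%s\n' "$PIPELINED"
# To actually send them on one TCP connection:  printf '%s' "$PIPELINED" | nc example.com 80
```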
