
Cloud Architect | AWS, GCP, Azure, Python, Kubernetes, Terraform, Ansible


  • Install Python
brew install python
ls -l /usr/local/bin/python*
ln -s -f /usr/local/bin/python3.9 /usr/local/bin/python
  • Install aws-cli
curl "<AWSCLIV2-pkg-url>" -o "AWSCLIV2.pkg"
sudo installer -pkg AWSCLIV2.pkg -target /
which aws
aws --version
  • Install and configure git.
brew install git
git --version
git config --global user.name "<your username>"
git config --global user.email "<your mail id>"
git config --global credential.helper osxkeychain
  • Set up a text editor such as Sublime Text. Here I set up Sublime Text as the Git mergetool.
git config --global mergetool.sublime.cmd "subl -w \$MERGED"
git config --global mergetool.sublime.trustExitCode false
git config --global merge.tool sublime
git mergetool -y
  • Configure Visual Studio Code
brew install --cask visual-studio-code
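Once everything above is installed, the toolchain can be sanity-checked with a short loop (a sketch: `code` is the CLI shim the VS Code cask links, and `python` is the symlink created earlier):

```shell
# Print the resolved path of each tool from the setup above,
# or "not found" if it is missing from PATH.
for tool in python git aws code; do
  if command -v "$tool" >/dev/null 2>&1; then
    echo "$tool: $(command -v "$tool")"
  else
    echo "$tool: not found"
  fi
done
```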


  1. You need to update the requirements.txt to specify the Python modules that are required to deploy the Dataflow jobs using virtualenv in Cloud Shell.
  2. Paste the following list of modules into requirements.txt.
  • This list ensures that the correct Python modules will be installed to allow you to deploy the Python Dataflow jobs.
  • The list also includes the Faker modules and some dependencies that are required when you deploy and test a streaming Dataflow job.
vi requirements.txt

apache-beam==2.14.0
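The steps above can be sketched as a short non-interactive flow, assuming Cloud Shell with Python 3 and a virtualenv named `df-env` (the name is an assumption); the `pip install` line is left commented because it needs network access:

```shell
# Write the pinned module list, then create an isolated virtualenv
# for deploying the Dataflow jobs.
cat > requirements.txt <<'EOF'
apache-beam==2.14.0
EOF

python3 -m venv df-env
. df-env/bin/activate
# pip install -r requirements.txt   # run this inside Cloud Shell
```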

3. Enter the following…


TCP: 53, 135, 389, 445, 464, 636, 3268, 3269, 49152–65535
UDP: 53, 88, 135, 389, 445, 464, 636, 3268, 3269, 123, 137, 138

Purpose of the ports:

  • UDP Port 88 for Kerberos authentication.
  • TCP and UDP Port 135 for domain controller-to-domain controller and client-to-domain controller operations.
  • TCP Port 139 and UDP Port 138 for the File Replication Service between domain controllers.
  • TCP and UDP Port 389 for LDAP, to handle normal queries from client computers to the domain controllers.
  • TCP and UDP Port 445 for the File Replication Service.
  • TCP and UDP Port 464 for Kerberos password change.
  • TCP Port 3268 and 3269 for Global…
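A quick way to verify that one of these TCP ports is reachable from a client is a small probe using bash's `/dev/tcp` (a sketch; `dc1.example.com` is a placeholder for your domain controller):

```shell
# check_tcp HOST PORT — report whether a TCP connection succeeds
# within 2 seconds, using bash's built-in /dev/tcp pseudo-device.
check_tcp() {
  host=$1; port=$2
  if timeout 2 bash -c "exec 3<>/dev/tcp/$host/$port" 2>/dev/null; then
    echo "TCP $port open"
  else
    echo "TCP $port closed/unreachable"
  fi
}

check_tcp dc1.example.com 389
```

Note that this only covers TCP; the UDP ports (e.g. 88 for Kerberos) need a different tool such as a Kerberos client or a packet capture.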


  • HTTP pipelining is a technique in which multiple HTTP requests are sent on a single TCP connection without waiting for the corresponding responses.
  • HTTP/1.1 supports pipelining. Its one limitation is that the server must return responses in the order the client requested them: if the client makes requests 1, 2, and 3, the server must respond in the order 1, 2, 3. If request 1 takes a long time, this introduces the problem known as head-of-line blocking.
  • In HTTP/2, each request/response tuple is associated with a unique ID, and is called…
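The ordering difference can be illustrated with a toy shell sketch (plain `sleep` and `echo`, not real HTTP): request 1 is slow, and under in-order delivery it holds back responses 2 and 3, while multiplexed delivery lets each response print as soon as it is ready.

```shell
# respond DELAY ID — simulate a request whose response takes DELAY seconds.
respond() { sleep "$1"; echo "response $2"; }

echo "HTTP/1.1 pipelining (responses forced into request order):"
respond 1 1; respond 0 2; respond 0 3

echo "HTTP/2-style multiplexing (responses as each one finishes):"
respond 1 1 & respond 0 2 & respond 0 3 &
wait
```

In the multiplexed run, responses 2 and 3 typically print before the slow response 1, which is exactly what head-of-line blocking prevents in HTTP/1.1.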


  1. You can configure test events, for example Viewer-Request and Origin-Response, that allow you to do unit testing in the AWS console.

2. You can dump the entire event to the CloudWatch log by adding the following line to your function; it converts the event to a JSON string with 4-space indentation.

console.log('Received event:', JSON.stringify(event, null, 4));

3. For end-to-end testing, rename the target file in S3 that your browser points to. This rules out caching on the browser or Lambda@Edge side that could prevent the Lambda function from being invoked.

4. The console.log output of the lambda function…

Red Hat Enterprise Linux (RHEL) / Amazon Linux

  1. Download the latest Qualys Cloud Agent installer from below.

2. Perform installation with the following command:

sudo rpm -ivh qualys-cloud-agent.x86_64.rpm

3. Activate Qualys Cloud Agent with the following command:

sudo /usr/local/qualys/cloud-agent/bin/ ActivationId=1032h37-dd20-4fde-93c8-2q3dwedae34 

4. Update Qualys Proxy setting and restart the service.

echo "qualys_https_proxy=\"http://<proxy-url>:1080\"" > /etc/sysconfig/qualys-cloud-agentsystemctl restart qualys-cloud-agent

5. For troubleshooting, check the agent log path in the Linux environment



Ubuntu / Debian

  1. Download the latest Qualys Cloud Agent installer from below.

2. Perform installation with the following command:

sudo dpkg --install QualysCloudAgent.deb

3. Activate Qualys Cloud Agent with the following command:

sudo /usr/local/qualys/cloud-agent/bin/ ActivationId=10feb827-dd20-4fde-93c8-q234dasds CustomerId=aa41eda0-6643-5564-8045-23edsdsdDs…


  1. Install Docker Desktop for Windows.

2. Install WSL2.

3. Make sure awscli is up to date.

4. Run the following command to log in to the ECR repo.

aws ecr get-login-password --region <aws-region> | docker login --username AWS --password-stdin <ecr-repo-name>

Or if you are using an older awscli, you can try

aws ecr get-login --no-include-email --region <aws-region> > ./

Then run the shell file —

5. If you encounter this error in step 4:

Error saving credentials: error storing credentials - err: exit status 1

then you need to rename the following exe file.

Arun Kumar
