
Steps

  • Install Python
brew install python
ls -l /usr/local/bin/python*
ln -s -f /usr/local/bin/python3.9 /usr/local/bin/python
  • Install aws-cli
curl "https://awscli.amazonaws.com/AWSCLIV2.pkg" -o "AWSCLIV2.pkg"
sudo installer -pkg AWSCLIV2.pkg -target /
which aws
aws --version
  • Install and configure Git.
brew install git
git --version
git config --global user.name "<your username>"
git config --global user.email "<your email address>"
git config --global credential.helper osxkeychain
  • Set up a text editor such as Sublime Text. Here I set up Sublime Text as the Git mergetool.
git config --global mergetool.sublime.cmd "subl -w \$MERGED"
git config --global mergetool.sublime.trustExitCode false
git config --global merge.tool sublime
git mergetool -y
  • Configure Visual Studio Code
brew install --cask visual-studio-code
echo…
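
At this point, a quick sanity check confirms the toolchain is wired up; a minimal sketch that only re-runs version and config queries from the steps above:

# each command should succeed and report the expected version
python --version
aws --version
git --version
# should print "sublime" if the mergetool configuration took effect
git config --get merge.tool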

Steps

  1. You need to update requirements.txt to specify the Python modules that are required to deploy the Dataflow jobs using virtualenv in Cloud Shell.
  2. Paste the following list of modules into requirements.txt.
  • This list ensures that the correct Python modules will be installed so you can deploy the Python Dataflow jobs.
  • The list also includes the Faker module and some dependencies that are required when you deploy and test a streaming Dataflow job. (A sketch of installing these into a virtualenv follows the list.)
vi requirements.txt
apache-beam==2.14.0
google-api-core==1.14.2
google-apitools==0.5.28
google-auth==1.6.3
google-cloud==0.34.0
google-cloud-bigquery==1.17.0
google-cloud-bigtable==0.32.2
google-cloud-core==1.0.0
google-cloud-datastore==1.9.0
google-cloud-pubsub==0.42.1
google-cloud-storage==1.17.0
google-cloud-vision==0.38.0
httplib2==0.12.0
mock==2.0.0
numpy==1.17.0
six==1.12.0
Faker==2.0.0
faker-schema==0.1.4
Cython==0.29.13
fastavro==0.21.24
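
With requirements.txt saved, you can install these modules into an isolated environment; a minimal sketch, assuming virtualenv is available (it is by default in Cloud Shell) and using an arbitrary environment name dataflow-env:

# create and activate an isolated Python environment
virtualenv dataflow-env
source dataflow-env/bin/activate
# install the pinned modules from requirements.txt
pip install -r requirements.txt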

3. Enter the following…


Ports:

TCP: 53, 135, 389, 445, 464, 636, 3268, 3269, 49152–65535
UDP: 53, 88, 135, 389, 445, 464, 636, 3268, 3269, 123, 137, 138

Purpose of the ports:

  • UDP Port 88 for Kerberos authentication.
  • UDP and TCP Port 135 for domain controller-to-domain controller and client-to-domain controller operations.
  • TCP Port 139 and UDP Port 138 for the File Replication Service between domain controllers.
  • TCP and UDP Port 389 for LDAP, to handle normal queries from client computers to the domain controllers.
  • TCP and UDP Port 445 for the File Replication Service.
  • TCP and UDP Port 464 for Kerberos password change.
  • TCP Port 3268 and 3269 for Global…
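
To spot-check that a client can actually reach a domain controller on these ports, here is a rough sketch using netcat; it assumes nc is installed and <dc-host> is a placeholder for your domain controller's hostname or IP:

# test the fixed TCP ports from the list above
# -z only tests the connection; -w 2 sets a 2-second timeout
for p in 53 135 389 445 464 636 3268 3269; do
  nc -z -w 2 <dc-host> $p && echo "TCP $p reachable" || echo "TCP $p blocked"
done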

Pipelining

  • HTTP pipelining is a technique in which multiple HTTP requests are sent on a single TCP connection without waiting for the corresponding responses.
  • HTTP/1.1 supports pipelining. The only limitation is that the server has to return responses in the same order in which the client sent the requests. If the client makes three requests 1, 2, and 3, the server must respond in the order 1, 2, and 3. If request 1 takes a long time, this introduces the problem known as head-of-line blocking.
  • In HTTP/2, each request/response tuple is associated with a unique ID, and is called…
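
To see pipelining at the wire level, here is a rough sketch that pushes two HTTP/1.1 requests onto one TCP connection before any response comes back; it assumes nc is installed and that the target server still honors pipelined requests (many modern servers close the connection instead):

# two GET requests in a single write; the server must answer them in order 1, 2
printf 'GET / HTTP/1.1\r\nHost: example.com\r\n\r\nGET / HTTP/1.1\r\nHost: example.com\r\nConnection: close\r\n\r\n' \
  | nc example.com 80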


Steps

  1. You can configure test events, for example Viewer-Request and Origin-Response, which allow you to do unit testing in the AWS console.

2. You can dump the entire event to the CloudWatch log by adding the following line to your function; it converts the event to a JSON string with 4-space indentation. (A sketch for pulling this entry back out of CloudWatch follows the steps below.)

console.log('Received event:', JSON.stringify(event, null, 4));

3. For end-to-end testing, rename the target file in S3 that your browser points to. This avoids browser-side or Lambda@Edge-side caching issues that could otherwise prevent the Lambda function from being invoked.

4. The console.log output of the lambda function…
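
To pull the dumped event back out of CloudWatch, a sketch using the AWS CLI; note that Lambda@Edge writes to regional log groups named /aws/lambda/us-east-1.<function-name> in the region where the replica executed, and <edge-region> and <function-name> are placeholders:

# search recent log events for the line logged by the function above
aws logs filter-log-events \
  --region <edge-region> \
  --log-group-name "/aws/lambda/us-east-1.<function-name>" \
  --filter-pattern '"Received event"'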


Red Hat Enterprise Linux (RHEL) / Amazon Linux

  1. Download the latest Qualys Cloud Agent installer from the link below.
https://qualysguard.qualys.com/am/help/sensors/cloud_agent.htm

2. Perform installation with the following command:

sudo rpm -ivh qualys-cloud-agent.x86_64.rpm

3. Activate Qualys Cloud Agent with the following command:

sudo /usr/local/qualys/cloud-agent/bin/qualys-cloud-agent.sh ActivationId=1032h37-dd20-4fde-93c8-2q3dwedae34 CustomerId=aa9845nb0-6643-5564-8045-1234wsDDAS

4. Update Qualys Proxy setting and restart the service.

echo "qualys_https_proxy=\"http://<proxy-url>:1080\"" > /etc/sysconfig/qualys-cloud-agentsystemctl restart qualys-cloud-agent

5. For troubleshooting, the log path in a Linux environment is:

/var/log/qualys/qualys-cloud-agent.log
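
Once activated, a quick check that the agent is running and checking in; a sketch assuming a systemd-based host and the service name used above:

# confirm the service is active
systemctl status qualys-cloud-agent
# watch the agent log for activation and check-in messages
tail -f /var/log/qualys/qualys-cloud-agent.log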

Ubuntu

  1. Download the latest Qualys Cloud Agent installer from the link below.
https://qualysguard.qualys.com/am/help/sensors/cloud_agent.htm

2. Perform installation with the following command:

sudo dpkg --install QualysCloudAgent.deb

3. Activate Qualys Cloud Agent with the following command:

sudo /usr/local/qualys/cloud-agent/bin/qualys-cloud-agent.sh ActivationId=10feb827-dd20-4fde-93c8-q234dasds CustomerId=aa41eda0-6643-5564-8045-23edsdsdDs…

Steps

  1. Install Docker Desktop for Windows.
https://hub.docker.com/editions/community/docker-ce-desktop-windows

2. Install WSL2.

https://docs.microsoft.com/en-us/windows/wsl/install-win10

3. Make sure awscli is up to date.

https://docs.aws.amazon.com/cli/latest/userguide/install-cliv2-windows.html

4. Run the following command to log in to the ECR repo.

aws ecr get-login-password --region <aws-region> | docker login --username AWS --password-stdin <account-id>.dkr.ecr.<aws-region>.amazonaws.com

Or, if you are using an older awscli, you can try

aws ecr get-login --no-include-email --region <aws-region> > ./run.sh

Then run the generated shell file, run.sh.
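
Once the login succeeds, pushing an image is the usual tag-then-push flow; a sketch assuming a local image my-image:latest and an existing ECR repository <repo-name>:

# tag the local image with the full ECR registry path
docker tag my-image:latest <account-id>.dkr.ecr.<aws-region>.amazonaws.com/<repo-name>:latest
# push to ECR
docker push <account-id>.dkr.ecr.<aws-region>.amazonaws.com/<repo-name>:latest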

5. If you encounter the following error in step 4:

Error saving credentials: error storing credentials - err: exit status 1

then you need to rename the following exe file.
