
Cloud Architect | AWS, GCP, Azure, Python, Kubernetes, Terraform, Ansible

Overview

A service mesh is a dedicated infrastructure layer for handling service-to-service communication. It’s responsible for the reliable delivery of requests through the complex topology of services that comprise a modern, cloud native application.

Service mesh solutions have two distinct components that behave somewhat differently:

The data plane is composed of a set of intelligent proxies (Envoy) deployed as sidecars. These proxies mediate and control all network communication between microservices, together with Mixer, a general-purpose policy and telemetry hub.

The control plane manages and configures the proxies to route traffic. …
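To make the data-plane idea concrete, here is a toy sketch (not Envoy or Istio — every name in it is invented for illustration): a minimal TCP "sidecar" sits in front of a service, mediates the call, and records simple telemetry counters, which is the role the mesh's data plane plays for every service-to-service hop.

```python
import socket
import threading

# Toy telemetry store; a real data-plane proxy would report these
# to a hub such as Mixer instead of a local dict.
telemetry = {"requests": 0, "bytes_up": 0}

def microservice(server_sock):
    # Stand-in "microservice": uppercases whatever it receives.
    conn, _ = server_sock.accept()
    data = conn.recv(1024)
    conn.sendall(data.upper())
    conn.close()

def sidecar(proxy_sock, upstream_addr):
    # Data-plane proxy: all traffic to the service flows through here,
    # so it can observe, count, and (in a real mesh) apply routing
    # and policy decisions before forwarding upstream.
    conn, _ = proxy_sock.accept()
    data = conn.recv(1024)
    telemetry["requests"] += 1
    telemetry["bytes_up"] += len(data)
    up = socket.create_connection(upstream_addr)
    up.sendall(data)
    conn.sendall(up.recv(1024))
    up.close()
    conn.close()

def listener():
    s = socket.socket()
    s.bind(("127.0.0.1", 0))
    s.listen(1)
    return s

svc_sock, proxy_sock = listener(), listener()
threading.Thread(target=microservice, args=(svc_sock,), daemon=True).start()
threading.Thread(target=sidecar,
                 args=(proxy_sock, svc_sock.getsockname()),
                 daemon=True).start()

# The client talks only to the sidecar, never to the service directly.
client = socket.create_connection(proxy_sock.getsockname())
client.sendall(b"hello")
reply = client.recv(1024)
client.close()
print(reply, telemetry)  # b'HELLO' {'requests': 1, 'bytes_up': 5}
```

The point of the sidecar pattern is exactly this interposition: the application code on either side is unaware of the proxy, yet the mesh gains a uniform place to collect telemetry and enforce policy.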


Introduction

Kubeflow provides a simple, portable, and scalable way of running Machine Learning workloads on Kubernetes.

In this module, we will install Kubeflow on Amazon EKS, run single-node training and inference using TensorFlow, train and deploy a model locally and remotely using Fairing, set up a Kubeflow pipeline, and review how to call AWS managed services such as Amazon SageMaker for training and inference.

Increase cluster size

We need more resources to complete this chapter of the EKS Workshop. First, we’ll increase the size of our cluster to 6 nodes:

export NODEGROUP_NAME=$(eksctl get nodegroups --cluster eksworkshop-eksctl -o json | jq -r '.[0].Name') …


This document will help you try out an EKS/Fargate deployment in a cloud environment.

Create Cloud9 environment

Create Cloud9 environment and increase the disk size on the Cloud9 instance.

pip3.7 install --user --upgrade boto3
export instance_id=$(curl -s http://169.254.169.254/latest/meta-data/instance-id)
python -c "import boto3
import os
from botocore.exceptions import ClientError
ec2 = boto3.client('ec2')
volume_info = ec2.describe_volumes(
    Filters=[
        {
            'Name': 'attachment.instance-id',
            'Values': [
                os.getenv('instance_id')
            ]
        }
    ]
)
volume_id = volume_info['Volumes'][0]['VolumeId']
try:
    resize = ec2.modify_volume(
        VolumeId=volume_id,
        Size=30
    )
    print(resize)
except ClientError as e:
    if e.response['Error']['Code'] == 'InvalidParameterValue':
        print('ERROR MESSAGE: {}'.format(e))"
if [ $? -eq 0 ]; then
  sudo reboot
fi

Note: The above command…


Ports:

TCP: 53, 135, 389, 445, 464, 636, 3268, 3269, 49152–65535
UDP: 53, 88, 135, 389, 445, 464, 636, 3268, 3269, 123, 137, 138

Purpose of the ports:

  • UDP Port 88 for Kerberos authentication
  • UDP and TCP Port 135 for domain controller-to-domain controller and client-to-domain controller operations.
  • TCP Port 139 and UDP Port 138 for File Replication Service between domain controllers.
  • TCP and UDP Port 389 for LDAP to handle normal queries from client computers to the domain controllers.
  • TCP and UDP Port 445 for File Replication Service
  • TCP and UDP Port 464 for Kerberos Password Change
  • TCP Port 3268 and 3269 for Global…
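When validating that the TCP ports above are actually reachable from a client to a domain controller, a simple connect test helps (UDP ports are harder to probe because UDP is connectionless). Below is a minimal sketch; the `tcp_port_open` helper is our own, and the demo probes a throwaway localhost listener rather than a real domain controller:

```python
import socket

def tcp_port_open(host, port, timeout=2.0):
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Self-contained demo: probe a port we control on localhost.
# Against a real DC you would loop over 53, 135, 389, 445, 464,
# 636, 3268, 3269 with the DC's hostname instead.
listener = socket.socket()
listener.bind(("127.0.0.1", 0))
listener.listen(1)
open_port = listener.getsockname()[1]

print(tcp_port_open("127.0.0.1", open_port))  # True while listening
listener.close()
print(tcp_port_open("127.0.0.1", open_port))  # False once closed
```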

Pipelining

  • HTTP pipelining is a technique in which multiple HTTP requests are sent on a single TCP connection without waiting for the corresponding responses.
  • HTTP/1.1 supports pipelining. The only limitation is that the server has to return responses in the same order the client sent the requests: if the client makes requests 1, 2, and 3, the server must respond in the order 1, 2, and 3. If request 1 takes a long time, it delays every response behind it, a problem known as head-of-line blocking.
  • In HTTP/2, each request/response tuple is associated with a unique ID, and is called…
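The HTTP/1.1 ordering rule can be demonstrated with a raw socket: send two requests back-to-back on one connection before reading anything, then observe that the responses come back strictly in request order. This is a sketch against Python's standard-library `http.server`, not a production client:

```python
import socket
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

class EchoPathHandler(BaseHTTPRequestHandler):
    # HTTP/1.1 keep-alive is what makes pipelining possible.
    protocol_version = "HTTP/1.1"

    def do_GET(self):
        body = ("resp:" + self.path).encode()
        self.send_response(200)
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass  # keep the demo quiet

server = HTTPServer(("127.0.0.1", 0), EchoPathHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
port = server.server_address[1]

def request(path):
    return ("GET %s HTTP/1.1\r\nHost: localhost\r\n\r\n" % path).encode()

# Pipelining: both requests go out before we read a single byte back.
sock = socket.create_connection(("127.0.0.1", port))
sock.settimeout(5)
sock.sendall(request("/1") + request("/2"))

data = b""
while b"resp:/2" not in data:
    data += sock.recv(4096)
sock.close()
server.shutdown()

# HTTP/1.1 must answer in request order: /1 strictly before /2.
print(data.index(b"resp:/1") < data.index(b"resp:/2"))  # True
```

If the handler for `/1` were slow, the response for `/2` would still have to wait behind it on the wire — that waiting is the head-of-line blocking described above.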


1. You can configure test events, for example Viewer-Request and Origin-Response, that allow you to do unit testing in the AWS console.

2. You can dump the entire event to a CloudWatch log by adding the following line to your function, which converts the event to a JSON string with 4-space indentation:

console.log('Received event:', JSON.stringify(event, null, 4));

3. For end-to-end testing, make sure you rename the target file in S3 that your browser points to. This avoids caching on the browser or Lambda@Edge side that could prevent the Lambda function from being invoked.

4. The console.log output of the Lambda function…
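The same event-dumping trick works if the function is written in Python instead of Node.js, using `json.dumps` with an indent of 4. A sketch — the sample event below is a hand-built stand-in, not a real CloudFront event:

```python
import json

def handler(event, context):
    # Dump the full event to CloudWatch Logs as pretty-printed JSON,
    # mirroring the console.log/JSON.stringify pattern above.
    print("Received event: " + json.dumps(event, indent=4))
    # For a viewer-request trigger, return the request to pass it through.
    return event["Records"][0]["cf"]["request"]

# Stand-in payload for illustration only.
sample_event = {
    "Records": [
        {
            "cf": {
                "config": {"eventType": "viewer-request"},
                "request": {"uri": "/index.html"},
            }
        }
    ]
}
result = handler(sample_event, None)
print(result)  # {'uri': '/index.html'}
```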


Red Hat Enterprise Linux (RHEL) / Amazon Linux

1. Download the latest Qualys Cloud Agent installer from the link below.
https://qualysguard.qualys.com/am/help/sensors/cloud_agent.htm

2. Perform installation with the following command:

sudo rpm -ivh qualys-cloud-agent.x86_64.rpm

3. Activate Qualys Cloud Agent with the following command:

sudo /usr/local/qualys/cloud-agent/bin/qualys-cloud-agent.sh ActivationId=1032h37-dd20-4fde-93c8-2q3dwedae34 CustomerId=aa9845nb0-6643-5564-8045-1234wsDDAS

4. Update Qualys Proxy setting and restart the service.

echo "qualys_https_proxy=\"http://<proxy-url>:1080\"" > /etc/sysconfig/qualys-cloud-agent
systemctl restart qualys-cloud-agent

5. For troubleshooting, the log file in a Linux environment is located at:

/var/log/qualys/qualys-cloud-agent.log

Ubuntu

1. Download the latest Qualys Cloud Agent installer from the link below.
https://qualysguard.qualys.com/am/help/sensors/cloud_agent.htm

2. Perform installation with the following command:

sudo dpkg --install QualysCloudAgent.deb

3. Activate Qualys Cloud Agent with the following command:

sudo /usr/local/qualys/cloud-agent/bin/qualys-cloud-agent.sh ActivationId=10feb827-dd20-4fde-93c8-q234dasds CustomerId=aa41eda0-6643-5564-8045-23edsdsdDs…
