Top 5 ChatGPT use cases in AWS Solution Architecture role
Welcome
Hey there, my friends! Today, I want to share with you the top 5 use cases of ChatGPT in my daily job. But before we dive in, let me tell you a little story. Recently, I made the switch from Obsidian to Logseq for note-taking, and it got me into the habit of preferring shorter forms in general. So, let's get straight to the point and explore how ChatGPT can help us save time and make our lives easier!
Tools Used in This Episode
Before we begin, let’s quickly go over the tools we’ll be using for these use cases:
- AWS CLI: The command-line interface for interacting with AWS services
- ChatGPT: Our trusty AI assistant
- CloudFormation: AWS's native Infrastructure as Code tool
- ZSH: A powerful shell with cool features
Now, let’s jump into the exciting world of ChatGPT!
Use Case 1: Getting Subnets
Imagine you’re responsible for writing some Landing Zone handover documentation. You have to list numerous subnets across multiple AWS accounts. It can be quite tedious and time-consuming. But fear not! ChatGPT is here to save the day.
To generate a Markdown table listing all the subnets in a specific VPC, use the following command:
List in the form of a Markdown table all the subnets in VPC ID x.
Display the Subnet ID, CIDR Block, and the tag 'Name' in a
column named 'Name'. Use the AWS CLI.
Format the tags as a string, not an array.
This generates:
aws ec2 describe-subnets \
--filters "Name=vpc-id,Values=x" \
--query 'Subnets[*].{SubnetId: SubnetId, CidrBlock: CidrBlock, Tags: Tags[?Key==`Name`].Value | [0]}' \
--output text | awk 'BEGIN{print "| Subnet ID | CIDR Block | Name |"} {printf("| %-12s | %-11s | %-4s |\n", $1, $2, $3)}'
The result? Ta-da! A neatly formatted table with all the necessary details.
| Subnet ID | CIDR Block | Name |
|---|---|---|
| subnet-0a703b1877c7d6b3a | 10.201.10.0/26 | subnet-Application-1-primary-vpc |
| subnet-0e8f05f1b6c45ec51 | 10.201.10.192/26 | subnet-Data-2-primary-vpc |
| subnet-0fd362daf67578490 | 10.201.10.128/26 | subnet-Data-1-primary-vpc |
| subnet-0ef10e7b79ad98de5 | 10.201.10.64/26 | subnet-Application-2-primary-vpc |
| subnet-0d931e05c320692eb | 10.201.8.0/24 | subnet-Public-1-primary-vpc |
| subnet-06144454dbd44ed21 | 10.201.9.0/24 | subnet-Public-2-primary-vpc |
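Since a Landing Zone handover usually spans several accounts, the same listing can be wrapped in a small helper and repeated per AWS CLI profile. Here is a sketch under assumptions: the profile names and the VPC ID in the usage comment are placeholders, and I switched the `--query` to a list projection so the column order stays fixed in text output.

```shell
# to_md_table: turn tab-separated "id cidr name" rows into a Markdown table.
to_md_table() {
  awk -F'\t' 'BEGIN { print "| Subnet ID | CIDR Block | Name |"; print "|---|---|---|" }
              { printf("| %s | %s | %s |\n", $1, $2, $3) }'
}

# list_subnets_md: query one account (via an AWS CLI profile) and format the result.
# A list projection ([SubnetId, CidrBlock, ...]) keeps columns in a stable order.
list_subnets_md() {
  aws ec2 describe-subnets \
    --profile "$1" \
    --filters "Name=vpc-id,Values=$2" \
    --query 'Subnets[*].[SubnetId, CidrBlock, Tags[?Key==`Name`].Value | [0]]' \
    --output text | to_md_table
}

# Usage (profile names and VPC ID are placeholders):
#   for profile in landing-zone-dev landing-zone-prod; do
#     echo "### Account: $profile"
#     list_subnets_md "$profile" vpc-xxxxxxxx
#   done
```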
Use Case 2: Connecting to Running Instances
Do you prefer using AWS Systems Manager (SSM) instead of SSH for accessing instances? If so, here’s a handy alias for you:
alias ssm="aws ssm start-session --target"
But to use this alias, you need a list of running EC2 instances with their IDs. ChatGPT has your back once again!
To generate a Markdown table listing all the running EC2 instances along with their instance IDs, status, and tags (env and project), use the following prompt:
List all running EC2 instances, get instanceid,
status, and tags: env, project.
Based on that build markdown table, which will display
following columns id, status, env, project and fill
it with gathered data. Get tags as string, not array
The generated output was:
aws ec2 describe-instances \
--filters "Name=instance-state-name,Values=running" \
--query 'Reservations[].Instances[].[InstanceId, State.Name, join(`,`,Tags[?Key==`env`].Value), join(`,`,Tags[?Key==`project`].Value)]' \
--output text | awk -F'\t' '{print "| " $1 " | " $2 " | " $3 " | " $4 " |"}'
Voila! You now have a table displaying the essential information about your running instances.
| i-0f9c122fccddaf11f | running | dev | sectree |
| i-0f9c1991ddde81fff | running | dev | sectree |
| i-00dd122f3ade91dds | running | dev | sectree |
This solution has one downside: it's hard to build a zsh alias around it. However, we can use fzf here to make it a bit easier.
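As a minimal sketch of that idea (the function name `ssm_fzf` is my own invention, and it assumes fzf and the SSM Session Manager plugin are installed): pipe the instance list into fzf, pick a row interactively, and extract the instance ID with awk before starting the session.

```shell
# ssm_fzf: pick a running instance interactively with fzf, then open an SSM session.
# Assumes the AWS CLI, the Session Manager plugin, and fzf are installed.
ssm_fzf() {
  local id
  id=$(aws ec2 describe-instances \
         --filters "Name=instance-state-name,Values=running" \
         --query 'Reservations[].Instances[].[InstanceId, join(`,`,Tags[?Key==`Name`].Value)]' \
         --output text | fzf --prompt='instance> ' | awk '{ print $1 }')
  # Only start a session if something was actually selected.
  [ -n "$id" ] && aws ssm start-session --target "$id"
}
```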
Use Case 3: Documentation
Sometimes, we need to write documentation for internal purposes, especially when it comes to S3-like modules. ChatGPT can be a fantastic help in such situations!
To generate README-style documentation on the basic usage of the AWS CLI S3 command, simply ask ChatGPT:
Generate readme style documentation
about basic usage of aws cli s3 command. Print is as markdown
And just like magic, you'll have a comprehensive guide at your fingertips, covering installation, configuration, and basic usage of the AWS CLI S3 command. It looks like this:
# AWS CLI S3 Command - README
This document provides an overview of the basic usage of the AWS Command Line Interface (CLI) S3 command. The AWS CLI allows you to interact with Amazon Simple Storage Service (S3), a highly scalable cloud storage service provided by Amazon Web Services (AWS).
## Installation
To use the AWS CLI, you need to have it installed on your local machine. Follow these steps to install the AWS CLI:
1. Ensure that you have Python 3 installed on your machine. You can check this by running the command `python3 --version` in your terminal.
2. Install the AWS CLI by running the following command in your terminal:
`pip3 install awscli`
3. Once the installation is complete, you can verify it by running the command `aws --version`. It should display the version of the AWS CLI installed.
## Configuration
Before using the AWS CLI S3 command, you need to configure it with your AWS credentials. Follow these steps to configure the AWS CLI:
1. Obtain your AWS access key ID and secret access key from the AWS Management Console.
2. Run the following command in your terminal:
`aws configure`
3. When prompted, enter your AWS access key ID, secret access key, default region name, and default output format. The default region name is the AWS region where you want to perform your S3 operations (e.g., `us-east-1`).
4. Once the configuration is complete, the AWS CLI is ready to use.
## Basic Usage
The AWS CLI S3 command provides a set of subcommands to interact with S3 buckets and objects. Here are some common examples of how to use the AWS CLI S3 command:
### 1. List Buckets
To list all the S3 buckets in your AWS account, use the following command:
`aws s3 ls`
### 2. List Objects in a Bucket
To list all the objects in a specific S3 bucket, use the following command:
`aws s3 ls s3://bucket-name`
Replace `bucket-name` with the name of the bucket you want to list objects from.
### 3. Upload a File to a Bucket
To upload a file to an S3 bucket, use the following command:
`aws s3 cp /path/to/local/file s3://bucket-name/key`
Replace `/path/to/local/file` with the local file path you want to upload, `bucket-name` with the name of the target bucket, and `key` with the desired key/name for the uploaded file in S3.
### 4. Download a File from a Bucket
To download a file from an S3 bucket to your local machine, use the following command:
`aws s3 cp s3://bucket-name/key /path/to/local/file`
Replace `bucket-name` with the name of the source bucket, `key` with the key/name of the file in S3, and `/path/to/local/file` with the local file path where you want to save the downloaded file.
### 5. Delete a File from a Bucket
To delete a file from an S3 bucket, use the following command:
`aws s3 rm s3://bucket-name/key`
Replace `bucket-name` with the name of the bucket containing the file to delete, and `key` with the key/name of the file in S3.
## Additional Resources
For more information on the AWS CLI S3 command and its options, you can refer to the official AWS CLI documentation:
- [AWS CLI S3 Command Reference](https://docs.aws.amazon.com/cli/latest/reference/s3/index.html)
Use Case 4: Improving Security
Are you looking to improve the security and quality of your AWS CloudFormation templates? Look no further! ChatGPT can provide you with some insightful suggestions.
Let’s say you have a CloudFormation template for an EC2 security group that could use some hardening. Ask ChatGPT to improve its security and quality, providing the template as input. You’ll receive an enhanced version that prioritizes security best practices. Use a prompt like this:
Improve security and quality of following AWS CloudFormation template:
Resources:
  MySecurityGroup:
    Type: AWS::EC2::SecurityGroup
    Properties:
      GroupDescription: My Security Group
      SecurityGroupIngress:
        - CidrIp: 0.0.0.0/0
          IpProtocol: -1
      SecurityGroupEgress:
        - CidrIp: 0.0.0.0/0
          IpProtocol: -1
Our result will be:
Resources:
  WebServerSecurityGroup:
    Type: AWS::EC2::SecurityGroup
    Properties:
      GroupDescription: Security Group for Web Server
      SecurityGroupIngress:
        - CidrIp: 10.0.0.0/24
          IpProtocol: tcp
          FromPort: 22
          ToPort: 22
      SecurityGroupEgress:
        - CidrIp: 0.0.0.0/0
          IpProtocol: tcp
          FromPort: 80
          ToPort: 443
Use Case 5: Landing Zone Testing
Have you ever found yourself in need of testing a specific environment? It’s a crucial step in post-implementation checks. But fear not, my fellow adventurers, for I bring you the power of ChatGPT, the friendly AI assistant who’s always here to help!
Imagine this scenario: you want to spin up an AWS CloudFormation template that creates a t2.micro instance in a public subnet, another instance in a private subnet, an ALB (Application Load Balancer), and connects the ALB to the instance in the private subnet. Sounds complex, right? But with ChatGPT by your side, it’s a breeze!
Here’s the magical CloudFormation template that ChatGPT conjured up for us, using the following prompt:
Generate AWS CloudFormation template that will spin one t2.micro
instance in public subnet, second in private subnet,
add ALB and attach it to instance in private subnet.
Additionally: Treat subnet IDs as parameters. Assume
that networking part is done already. Install nginx on
instance in private subnet. Get ALB DNS as output variable,
and based on it generate curl command.
ChatGPT response:
AWSTemplateFormatVersion: "2010-09-09"
Parameters:
  PublicSubnetId:
    Type: String
    Description: ID of the public subnet
  PrivateSubnetId:
    Type: String
    Description: ID of the private subnet
Resources:
  PublicInstance:
    Type: "AWS::EC2::Instance"
    Properties:
      InstanceType: t2.micro
      ImageId: ami-xxxxxxxx # Replace with your desired AMI ID
      SubnetId: !Ref PublicSubnetId
  PrivateInstance:
    Type: "AWS::EC2::Instance"
    Properties:
      InstanceType: t2.micro
      ImageId: ami-xxxxxxxx # Replace with your desired AMI ID
      SubnetId: !Ref PrivateSubnetId
      UserData:
        Fn::Base64: !Sub |
          #!/bin/bash
          yum update -y
          yum install -y nginx
          service nginx start
          chkconfig nginx on
  LoadBalancer:
    Type: "AWS::ElasticLoadBalancingV2::LoadBalancer"
    Properties:
      Name: MyLoadBalancer
      Subnets:
        - !Ref PublicSubnetId
      SecurityGroups:
        - !Ref LoadBalancerSecurityGroup
      Type: application
  TargetGroup:
    Type: "AWS::ElasticLoadBalancingV2::TargetGroup"
    Properties:
      Name: MyTargetGroup
      Port: 80
      Protocol: HTTP
      TargetType: instance
      Targets:
        - Id: !Ref PrivateInstance
          Port: 80
  Listener:
    Type: "AWS::ElasticLoadBalancingV2::Listener"
    Properties:
      LoadBalancerArn: !Ref LoadBalancer
      Port: 80
      Protocol: HTTP
      DefaultActions:
        - Type: forward
          TargetGroupArn: !Ref TargetGroup
Outputs:
  ALBDNS:
    Value: !GetAtt LoadBalancer.DNSName
    Description: DNS name of the Application Load Balancer
Isn’t it amazing? This template will set up the instances, install Nginx on the private instance, create the ALB, and configure everything for you. Just provide the subnet IDs as parameters, and you’re good to go!
But wait, there’s more! ChatGPT even generated a handy curl command for you to test the ALB. Simply deploy the CloudFormation stack, and then use the ALB DNS name in the following command:
$ ALB_DNS=$(aws cloudformation describe-stacks --stack-name <stack-name> --query 'Stacks[0].Outputs[?OutputKey==`ALBDNS`].OutputValue' --output text)
$ curl http://${ALB_DNS}
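One caveat: a freshly created ALB usually needs a minute or two before its targets pass health checks, so running curl immediately after stack creation may fail. A small retry loop, sketched below (the function name and timings are my own choices, not from the generated template), makes the check more robust:

```shell
# wait_for_url: retry curl until the URL responds or the attempt budget runs out.
wait_for_url() {
  url=$1
  attempts=${2:-30}   # default: 30 attempts
  delay=${3:-10}      # default: 10 seconds between attempts
  i=0
  while [ "$i" -lt "$attempts" ]; do
    if curl -sf "$url" >/dev/null 2>&1; then
      echo "up"
      return 0
    fi
    i=$((i + 1))
    sleep "$delay"
  done
  echo "timed out"
  return 1
}

# Usage, combining it with the ALBDNS output from above:
#   ALB_DNS=$(aws cloudformation describe-stacks --stack-name <stack-name> \
#     --query 'Stacks[0].Outputs[?OutputKey==`ALBDNS`].OutputValue' --output text)
#   wait_for_url "http://${ALB_DNS}"
```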
Summary
In this blog post, we explored the question of whether we should fear AI, particularly generative AI. In my opinion, there is no need for fear, at least not yet. Instead, I see AI, and specifically ChatGPT, as a friend in need rather than an enemy. The post challenges the notion that an artificial intelligence system can entirely replace Solution Architects, developers, and designers. Through personal experience, I would like to highlight the importance of learning how to communicate effectively with AI through prompts and how to use them for the right purposes. Ultimately, I believe that AI serves as an invaluable assistant, capable of tackling tedious tasks that were previously too expensive or time-consuming for humans.