ChatGPT is booming and many developers are already exploring it, evaluating how it can help them in their day-to-day tasks: writing code (snippets), finding and fixing errors, and suggesting patterns for their challenges. Some people are extremely enthusiastic about it, while others are still a bit sceptical or even a bit afraid. With this kind of new technology, it's always good to run some practical experiments yourself, to answer questions such as "In which way can it help me?" and "Do I really benefit from its features?". I was curious about it myself, so I signed up and decided to seek out the benefits of ChatGPT for my AWS-related projects. ChatGPT for AWS – what's in it for you?
Context and goal
This article takes a practical point of view, with concrete use cases and examples that AWS-minded professionals might face. I wanted to understand to what extent ChatGPT could help me correct errors, make proper suggestions for very specific questions, and come up with improvements for my current code snippets.

My primary focus was deploying resources and simple "hello world"-like applications. There is no sensitive data in my source code and everything is based on non-critical workloads, so I did not have to worry about what was being shared with systems I did not know about. This makes things a bit easier. If you intend to do the same, clear out any sensitive data and business logic you don't want to share outside the organization you're working for.
Infrastructure as Code
AWS uses CloudFormation (templates) as its native method to deploy cloud resources. These templates can be written in either JSON or YAML format.
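For readers new to CloudFormation, a minimal YAML template looks roughly like this (the bucket name is just a placeholder, since bucket names must be globally unique):

```yaml
AWSTemplateFormatVersion: '2010-09-09'
Description: Minimal example template that creates a single S3 bucket.
Resources:
  ExampleBucket:
    Type: 'AWS::S3::Bucket'
    Properties:
      # Placeholder name; replace with a globally unique value
      BucketName: my-example-bucket-12345
```

The same structure can be expressed in JSON; YAML is used throughout this article because it is what ChatGPT was asked to produce.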
Syntax validation
Every editor supports formatting and syntax validation of your source code. So two tests here.
The first template had the following snippet:
```yaml
PublicAccessBlockConfiguration:
  BlockPublicAcls: true
  BlockPublicPolicy: TRUES
```
ChatGPT corrected the snippet and came up with an explanation:
BlockPublicPolicy: TRUES was changed to BlockPublicPolicy: true (changed "TRUES" to "true").
Great! Next, I uploaded a working CloudFormation template of about 150 lines. The template contained no syntax errors or other misconfigurations, but there were some aspects that could be improved, like indentation, repeated hard-coded values for variables, and unneeded parameters here and there.
ChatGPT came up with the following feedback:
- Added !Ref to BucketName to use the provided ArtifactBucketName parameter.
- Removed Default from BucketName to allow the user to specify a value.
- Removed the Description from ArtifactBucketKMSKeyAlias and updated its AliasName as intended.
- Made minor formatting adjustments for readability.
So far so good. However, when deploying the template, it gave me the following error:
An error occurred (ValidationError) when calling the CreateStack operation: Template error: variable names in Fn::Sub syntax must contain only alphanumeric characters, underscores, periods, and colons
It turned out that ChatGPT had replaced some static values with dynamic variables, after which it could no longer evaluate the actual values. When I asked ChatGPT to help fix this problem, it gave me the following answer:
To resolve this issue, make sure that the variable names used in the Fn::Sub
function comply with the naming rules specified in the error message. Here are some best practices:
- Use only alphanumeric characters (letters and numbers).
- You can use underscores, periods, and colons.
By now it was a “best practice” instead of a strict rule that would produce an error. Strange.
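To make the rule concrete, here is a small sketch (the parameter name BucketPrefix is made up for illustration). A substituted name consisting only of alphanumeric characters validates fine, while a name containing a hyphen triggers exactly the error above:

```yaml
# Valid: 'BucketPrefix' contains only alphanumeric characters
BucketName: !Sub '${BucketPrefix}-artifacts'

# Invalid: 'Bucket-Prefix' contains a hyphen inside ${...},
# which fails the Fn::Sub variable-name validation
# BucketName: !Sub '${Bucket-Prefix}-artifacts'
```

Note that the hyphen outside the `${...}` braces is fine; only the variable name itself is restricted.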
Validation of an S3 bucket
I was wondering if ChatGPT could help me with some recent changes to set up a proper Access Control List (ACL), which determines who can access the objects (files and directories) in my S3 bucket. My goal was to let ChatGPT correct an incorrect ACL configuration.
Input:
```yaml
SampleBucket:
  Type: 'AWS::S3::Bucket'
  Properties:
    BucketName: !Ref SampleBucketName
    AccessControl: Private
    PublicAccessBlockConfiguration:
      BlockPublicAcls: true
      BlockPublicPolicy: true
      IgnorePublicAcls: true
      RestrictPublicBuckets: true
```
Error:
Bucket cannot have ACLs set with ObjectOwnership's BucketOwnerEnforced setting (Service: Amazon S3; Status Code: 400; Error Code: InvalidBucketAclWithObjectOwnership)
Output:
```yaml
SampleBucket:
  Type: AWS::S3::Bucket
  Properties:
    BucketName: !Ref SampleBucketName
    AccessControl: BucketOwnerFullControl
```
As you can see, it corrected my ACL by changing the value of AccessControl and completely removing the PublicAccessBlockConfiguration element. This fixed the issue.
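An alternative fix, which ChatGPT did not suggest, is to embrace the BucketOwnerEnforced setting that caused the error: drop the AccessControl property entirely and keep the public-access block. A sketch, reusing the same parameter name:

```yaml
SampleBucket:
  Type: 'AWS::S3::Bucket'
  Properties:
    BucketName: !Ref SampleBucketName
    # ACLs are disabled entirely under BucketOwnerEnforced,
    # so no AccessControl property is needed
    OwnershipControls:
      Rules:
        - ObjectOwnership: BucketOwnerEnforced
    PublicAccessBlockConfiguration:
      BlockPublicAcls: true
      BlockPublicPolicy: true
      IgnorePublicAcls: true
      RestrictPublicBuckets: true
```

This matches AWS' current guidance of disabling ACLs and managing access through bucket policies instead.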
Secure an EC2-instance
Let's aim higher and let ChatGPT evaluate some (intended) bad practices when setting up an EC2 instance. I was curious to see which security flaws ChatGPT would find when scanning the CloudFormation template. Several parts of the template below deliberately need some attention.

Input:
```yaml
AWSTemplateFormatVersion: 2010-09-09
Description: >-
  Simple template to create an EC2 instance in AWS.
Parameters:
  KeyName:
    Description: Name of an existing EC2 KeyPair to enable SSH access to the instance
    Type: 'AWS::EC2::KeyPair::KeyName'
    ConstraintDescription: must be the name of an existing EC2 KeyPair.
  InstanceType:
    Description: WebServer EC2 instance type
    Type: String
    Default: t2.large
    AllowedValues:
      - t1.micro
    ConstraintDescription: must be a valid EC2 instance type.
  SSHLocation:
    Description: The IP address range that can be used to SSH to the EC2 instances
    Type: String
    MinLength: '9'
    MaxLength: '18'
    Default: 0.0.0.0/0
    AllowedPattern: '(\d{1,3})\.(\d{1,3})\.(\d{1,3})\.(\d{1,3})/(\d{1,2})'
    ConstraintDescription: must be a valid IP CIDR range of the form x.x.x.x/x.
Mappings:
  AWSInstanceType2Arch:
    t1.micro:
      Arch: HVM64
  AWSRegionArch2AMI:
    eu-central-1:
      HVM64: ami-0474863011a7d1541
Resources:
  EC2Instance:
    Type: 'AWS::EC2::Instance'
    Properties:
      InstanceType: !Ref InstanceType
      SecurityGroups:
        - !Ref InstanceSecurityGroup
      KeyName: !Ref KeyName
      ImageId: !FindInMap
        - AWSRegionArch2AMI
        - !Ref 'AWS::Region'
        - !FindInMap
          - AWSInstanceType2Arch
          - !Ref InstanceType
          - Arch
  InstanceSecurityGroup:
    Type: 'AWS::EC2::SecurityGroup'
    Properties:
      GroupDescription: Enable SSH access via port 22
      SecurityGroupIngress:
        - IpProtocol: tcp
          FromPort: '10'
          ToPort: '1000'
          CidrIp: !Ref SSHLocation
```
When evaluating this template, ChatGPT came up with the following suggestions. I selected the top 5 here.
- Restrict SSH access to a more specific IP range. The selected address (0.0.0.0/0) is too wide which basically opens the door to more people/systems than needed.
- The recommended port for SSH is 22. My port range was set from 10 to 1000, which does not match the standard SSH port.
- Limit access to instances by specifying the KeyName parameter. This enforces the use of an existing EC2 KeyPair for SSH access.
- Consider creating the EC2 instances within a Virtual Private Cloud (VPC) for better network isolation.
- Define an IAM role and attach it to the EC2 instance if the instance needs to access other AWS services.
Besides these, there were some more generic tips, like adding logging and monitoring for the EC2 instance, implementing backup and recovery mechanisms, and updating & hardening the Amazon Machine Images. I'm impressed by the results, especially the tips that are very concrete and point to a direct improvement of my YAML file. Most of the tips are fixed in a jiffy, but you still need to carefully evaluate them and compare them against the company policies you might need to adhere to.
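The first two suggestions translate into a small change to the security group. A sketch of the tightened ingress rule (the CIDR below is a documentation example range, not a recommendation; substitute your own network):

```yaml
InstanceSecurityGroup:
  Type: 'AWS::EC2::SecurityGroup'
  Properties:
    GroupDescription: Enable SSH access via port 22 only
    SecurityGroupIngress:
      - IpProtocol: tcp
        # SSH only, instead of the wide-open 10-1000 range
        FromPort: 22
        ToPort: 22
        # Example range; replace with your office or VPN CIDR
        CidrIp: 203.0.113.0/24
```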
Serverless architecture
One step further was to explore the possibilities of creating a fully working serverless architecture. My question would be:
Create a cloudformation template with an API gateway that connects to a lambda function that connects to a dynamoDB. The lambda function should be written in nodeJS and stored in an S3 bucket. Enable cloudwatch logging for the lambda function.
Within a second, ChatGPT started to generate the CloudFormation template. It contained all of the requested resources. When I loaded the YAML file into the editor of the AWS console, it generated a nice graphical view of the various components.
However, the template failed to validate. One of the errors came from the MyApiMethodResponse resource: AWS could not find this type. The original (working) template contained the MethodResponses property as part of the resource of type "AWS::ApiGateway::Method". Perhaps those items were mixed up, or AWS has (recently) changed the actual specification. To be sure, I checked the AWS documentation: the "MethodResponse" type exists, but as a property of the method resource, not as a standalone resource.
NodeJS version
Besides this issue, it turned out that the chosen version of NodeJS was out of date. Although version 12.x is an LTS version and should be supported, AWS would not let me deploy my Lambda function; by now, version 14.x is the preferred one. Asking ChatGPT:
What is the recommended version of nodeJS for lambda functions in AWS?
A prompt reply of ChatGPT:
As of my last knowledge update in September 2021, here are the Node.js runtime versions that were supported by AWS Lambda:
- Node.js 14.x (Recommended): Node.js 14 was the Long-Term Support (LTS) version of Node.js at the time, and AWS Lambda typically supports the current LTS version.
- Node.js 12.x: Node.js 12 was also an LTS version, and AWS Lambda provided support for it.
- Node.js 10.x: Node.js 10 was a previous LTS version that was also supported on AWS Lambda. However, it was approaching its end of life in terms of Node.js support.
This is because ChatGPT's training data is roughly two years old. Still, you could use the answer to fix this minor misconfiguration.
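In the CloudFormation template itself, fixing this comes down to a one-line change on the Lambda resource's Runtime property. A sketch, where the function name, role, and code location are placeholders standing in for the generated template's values:

```yaml
FetchUserFunction:
  Type: 'AWS::Lambda::Function'
  Properties:
    FunctionName: FetchUserFunction
    Runtime: nodejs14.x   # was nodejs12.x in the generated template
    Handler: index.handler
    Role: !GetAtt LambdaExecutionRole.Arn   # placeholder role resource
    Code:
      S3Bucket: !Ref ArtifactBucketName     # placeholder bucket parameter
      S3Key: index.js.zip
```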
Besides the above-mentioned tips, ChatGPT provided multiple ways to actually create the NodeJS function. As an example, the AWS CLI version is shown here:
```shell
aws lambda create-function \
  --function-name FetchUserFunction \
  --runtime nodejs14.x \
  --handler index.handler \
  --role arn:aws:iam::your-account-id:role/your-execution-role \
  --zip-file fileb://./index.js.zip
```
It easily lets you replace the default values and parameters, so this is quickly turned into a working command.
Fetching users
Once you have deployed your resources correctly, it’s time to evaluate the application perspective itself. In the example above, you would connect to a DynamoDB database and fetch a record. This leads to the following question:
Create a nodeJS AWS lambda function that fetches a userID from dynamodb
Great to see that ChatGPT actually produced valuable code to get started:
```javascript
const AWS = require('aws-sdk');

// Configure the AWS SDK with your region
AWS.config.update({ region: 'us-east-1' });

const dynamodb = new AWS.DynamoDB.DocumentClient();
const TableName = 'YourDynamoDBTableName';

exports.handler = async (event) => {
  // Extract the user ID from the Lambda event
  const { userId } = event;

  // Define the parameters for the DynamoDB query
  const params = {
    TableName,
    Key: { userId },
  };

  try {
    const data = await dynamodb.get(params).promise();
    if (data.Item) {
      // User found in DynamoDB
      return {
        statusCode: 200,
        body: JSON.stringify(data.Item),
      };
    }
    // User not found
    return {
      statusCode: 404,
      body: JSON.stringify({ message: 'User not found' }),
    };
  } catch (error) {
    // DynamoDB call failed
    return {
      statusCode: 500,
      body: JSON.stringify({ message: error.message }),
    };
  }
};
```
It nicely reflects how this is done in some of AWS' official manuals.
Besides this NodeJS code snippet, ChatGPT also tells you how to create an IAM policy to attach to your function. It applies the principle of least privilege by only allowing the "GetItem" permission. So this is great.
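Such a least-privilege policy could look roughly like this (the region, account ID, and table name are placeholders matching the code sketch above, not values from ChatGPT's actual answer):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": "dynamodb:GetItem",
      "Resource": "arn:aws:dynamodb:us-east-1:123456789012:table/YourDynamoDBTableName"
    }
  ]
}
```

Scoping the Resource to a single table ARN, rather than `*`, is what makes the policy genuinely least-privilege.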
At the end, ChatGPT presents some guidance on how you would test this function using the AWS console. This makes validating the generated source code a bit easier.
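For a quick test from the console, a minimal test event matching the handler above would be something like (the ID is a made-up example):

```json
{
  "userId": "user-123"
}
```

If the item exists in the table, the function returns a 200 response with the item as the body; otherwise you can verify the error handling paths.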

Wrap up
Developers all over the world are seeking ways to use ChatGPT in their day-to-day programming work. The main focus of this article was to evaluate the usefulness of ChatGPT for AWS, especially to help build correct infrastructure resources based on CloudFormation templates. Although ChatGPT does a good job, you need to fine-tune the results. Sometimes the issues are crystal clear, but there are also issues that need more time to investigate.
ChatGPT can help with formatting issues and syntax errors, as well as produce the "raw outlines" for a complete architecture. To make things actually work, you need to know CloudFormation inside out, since not everything comes out error-free. Also remember that ChatGPT is trained on data up to 2021, so you're a year or two behind the actual situation. All in all, it's a great tool, but you still need to keep an eye on the results to get the most out of it.