Introduction
In my previous post Developing and Testing AWS Lambda Functions Locally with SAM CLI, we explored the foundations of developing and testing Lambda functions locally using SAM CLI. Now, let’s continue by creating a Lambda function that automates the generation of daily AWS cost reports.

Resources
- GitHub: DevOpsBug/AWS_DailyCostReport.
Motivation
Since I am just an individual with a personal account, I want to make absolutely sure to avoid unexpected costs and prevent them from skyrocketing. The primary way AWS lets you monitor your costs is through budgets, which notify you when your spending exceeds a certain amount. However, if you are like me and want more frequent and more detailed insight into your AWS costs, budgets aren't good enough. I wanted daily updates on the actual cost incurred in my AWS account.

Overview
Fortunately, AWS Cost Explorer offers an API that we can access programmatically using the AWS SDK. We can leverage this inside a Lambda function to generate a cost report. By setting up automated daily cost reports, you can monitor your spending patterns and make timely adjustments to stay within your budget. This proactive approach gives you full visibility into your cloud usage, allowing you to optimize resources and avoid unpleasant surprises at the end of the month. It's a simple yet effective way to maintain control over your personal cloud expenditures.

Architecture
We will develop a Lambda function locally to generate the daily cost reports and deploy it using SAM CLI. Then we will create an EventBridge trigger that invokes our Lambda function once a day. The Lambda function will query the Cost Explorer API and generate the cost report, which it will then publish to an SNS topic. Lastly, we will create an email subscription to the SNS topic so that the cost reports are delivered to us by email.

Prerequisites
Before we begin, ensure you have the following:
- AWS Account: Active account with appropriate permissions.
- AWS CLI: Installed and configured on your local machine.
- AWS SDK for Python (Boto3): Installed in your development environment.
- Amazon SNS: Ability to create a topic with an email subscription for receiving reports.

Preparing the SAM Project
In the previous post Developing and Testing AWS Lambda Functions Locally with SAM CLI we set up a HelloWorld SAM project with a Lambda function for local testing. Let's take this project and modify it so that it can generate daily cost reports.
- Right now, the lambda function looks like this:
[local-vm AWS_DailyCostReport]$ cat hello_world/app.py
import json


def lambda_handler(event, context):
    return {
        "statusCode": 200,
        "body": json.dumps({
            "message": "hello world",
        }),
    }
- And the sam template looks like this:
[local-vm AWS_DailyCostReport]$ cat template.yaml
AWSTemplateFormatVersion: '2010-09-09'
Transform: AWS::Serverless-2016-10-31
Description: >
  AWS_DailyCostReport

  Sample SAM Template for AWS_DailyCostReport

Globals:
  Function:
    Timeout: 3
    MemorySize: 128

Resources:
  HelloWorldFunction:
    Type: AWS::Serverless::Function
    Properties:
      CodeUri: hello_world/
      Handler: app.lambda_handler
      Runtime: python3.9
      Architectures:
        - x86_64

Outputs:
  HelloWorldFunction:
    Description: "Hello World Lambda Function ARN"
    Value: !GetAtt HelloWorldFunction.Arn
  HelloWorldFunctionIamRole:
    Description: "Implicit IAM Role created for Hello World function"
    Value: !GetAtt HelloWorldFunctionRole.Arn

Refactor SAM Project
First, let's change the name from HelloWorld to something more descriptive, like CostReport:
sed -i 's#HelloWorld#CostReport#g' template.yaml
sed -i 's#Hello World#Cost Report#g' template.yaml
I also want to change the directory of the lambda function from hello_world to lambda:
sed -i 's#hello_world#lambda#g' template.yaml
mv hello_world/ lambda/
So now the project structure looks like this:
.
├── events
│   └── event.json
├── __init__.py
├── lambda
│   ├── app.py
│   ├── __init__.py
│   └── requirements.txt
├── README.md
├── samconfig.toml
├── template.yaml
└── tests
    ├── __init__.py
    ├── integration
    │   ├── __init__.py
    │   └── test_api_gateway.py
    ├── requirements.txt
    └── unit
        ├── __init__.py
        └── test_handler.py

and the sam template looks like this:
AWSTemplateFormatVersion: '2010-09-09'
Transform: AWS::Serverless-2016-10-31
Description: >
  AWS_DailyCostReport

  Sample SAM Template for AWS_DailyCostReport

Globals:
  Function:
    Timeout: 3
    MemorySize: 128

Resources:
  CostReportFunction:
    Type: AWS::Serverless::Function
    Properties:
      CodeUri: lambda/
      Handler: app.lambda_handler
      Runtime: python3.9
      Architectures:
        - x86_64

Outputs:
  CostReportFunction:
    Description: "Cost Report Lambda Function ARN"
    Value: !GetAtt CostReportFunction.Arn
  CostReportFunctionIamRole:
    Description: "Implicit IAM Role created for Cost Report function"
    Value: !GetAtt CostReportFunctionRole.Arn
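The once-a-day EventBridge trigger from the architecture can later be expressed directly in this template as a Schedule event on the function. A minimal sketch (the event name DailySchedule is illustrative, not part of the repo yet):

```yaml
Resources:
  CostReportFunction:
    Type: AWS::Serverless::Function
    Properties:
      CodeUri: lambda/
      Handler: app.lambda_handler
      Runtime: python3.9
      Events:
        DailySchedule:          # illustrative name
          Type: Schedule        # creates an EventBridge scheduled rule
          Properties:
            Schedule: rate(1 day)
```

With this in place, SAM creates the EventBridge rule and the permission for it to invoke the function, so no separate trigger resource is needed.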

Create Lambda Function to generate Cost Reports
We will use the boto3 Cost Explorer client, boto3.client('ce'), to interact with the Cost Explorer API. You can check out the AWS documentation for a full reference. For our purposes it has three required parameters:
- TimePeriod: the time period for which to calculate the cost.
- Granularity: the level of detail for the cost. 'DAILY' returns one result per day, 'MONTHLY' one per calendar month. Since we want a month-to-date total, our finished function will use 'MONTHLY'.
- Metrics: we want to know the actual cost on our bill, so we will set this to 'UnblendedCost'.
If you are only interested in the total cost, you can call the function like this:
response = client.get_cost_and_usage(
    TimePeriod={
        'Start': start_date.strftime('%Y-%m-%d'),
        'End': end_date.strftime('%Y-%m-%d')
    },
    Granularity='DAILY',
    Metrics=['UnblendedCost']
)

If you are interested in a breakdown by service, you add the GroupBy parameter like this:
response = client.get_cost_and_usage(
    TimePeriod={
        'Start': start_date.strftime('%Y-%m-%d'),
        'End': end_date.strftime('%Y-%m-%d')
    },
    Granularity='MONTHLY',
    Metrics=['UnblendedCost'],
    GroupBy=[
        {
            'Type': 'DIMENSION',
            'Key': 'SERVICE'
        }
    ]
)

The get_cost_and_usage function responds with a complex JSON similar to this:
{
  "GroupDefinitions": [
    {
      "Type": "DIMENSION",
      "Key": "SERVICE"
    }
  ],
  "ResultsByTime": [
    {
      "TimePeriod": {
        "Start": "2025-01-01",
        "End": "2025-01-30"
      },
      "Total": {},
      "Groups": [
        {
          "Keys": [
            "AWS Lambda"
          ],
          "Metrics": {
            "UnblendedCost": {
              "Amount": "0.0000040168",
              "Unit": "USD"
            }
          }
        },
        {
          "Keys": [
            "Amazon DynamoDB"
          ],
          "Metrics": {
            "UnblendedCost": {
              "Amount": "0.000308726",
              "Unit": "USD"
            }
          }
        },
        {
          "Keys": [
            "AWS WAF"
          ],
          "Metrics": {
            "UnblendedCost": {
              "Amount": "1.1763187132",
              "Unit": "USD"
            }
          }
        },
        {
          "Keys": [
            "Amazon Simple Storage Service"
          ],
          "Metrics": {
            "UnblendedCost": {
              "Amount": "0.1594008089",
              "Unit": "USD"
            }
          }
        },
        {
          "Keys": [
            "Tax"
          ],
          "Metrics": {
            "UnblendedCost": {
              "Amount": "0.42",
              "Unit": "USD"
            }
          }
        }
      ],
      "Estimated": true
    }
  ],
  "DimensionValueAttributes": [],
  "ResponseMetadata": {
    "RequestId": "07512ff0-5538-49a0-9140-8da56d49e629",
    "HTTPStatusCode": 200,
    "HTTPHeaders": {
      "date": "Thu, 30 Jan 2025 15:39:43 GMT",
      "content-type": "application/x-amz-json-1.1",
      "content-length": "1912",
      "connection": "keep-alive",
      "x-amzn-requestid": "07512ff0-5538-49a0-9140-8da56d49e629",
      "cache-control": "no-cache"
    },
    "RetryAttempts": 0
  }
}

It is important to note that since we are using Granularity='MONTHLY' and only querying one month, the ResultsByTime array holds only one item. If you set the Granularity to DAILY, you get one list item for each day. Each element (in our case only one) holds another array, Groups, which contains one JSON object per service. So to process this JSON we need to iterate over ResultsByTime and Groups to extract the information we want.
After adding some logic to extract the cost data and filter out all services with zero cost, the finished Lambda function looks like this:
import boto3
from datetime import datetime


def lambda_handler(event, context):
    client = boto3.client('ce')

    # Get first day of current month and today's date
    end_date = datetime.now().date()      # current date
    start_date = end_date.replace(day=1)  # first of month

    response = client.get_cost_and_usage(
        TimePeriod={
            'Start': start_date.strftime('%Y-%m-%d'),
            'End': end_date.strftime('%Y-%m-%d')
        },
        Granularity='MONTHLY',
        Metrics=['UnblendedCost'],
        GroupBy=[
            {
                'Type': 'DIMENSION',
                'Key': 'SERVICE'
            }
        ]
    )

    # Process and format the response
    cost_data = []
    total_cost = 0.00
    unit = 'USD'  # fallback in case no cost data is returned
    for result in response['ResultsByTime']:
        for group in result['Groups']:
            service_name = group['Keys'][0]
            amount = float(group['Metrics']['UnblendedCost']['Amount'])
            unit = group['Metrics']['UnblendedCost']['Unit']
            total_cost += amount      # aggregate total cost
            if amount > 0.00:         # filter out zero-cost services
                cost_data.append({
                    'Service': service_name,
                    'Cost': f"{amount:.2f}",
                    'Currency': unit
                })

    # Sort services by cost (highest to lowest)
    cost_data.sort(key=lambda x: float(x['Cost']), reverse=True)

    return {
        'statusCode': 200,
        'body': {
            'startDate': start_date.strftime('%Y-%m-%d'),
            'endDate': end_date.strftime('%Y-%m-%d'),
            'costs': cost_data,
            'totalCost': f"{total_cost:.2f}",
            'currency': unit  # currency unit
        }
    }
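One caveat in the date logic: on the first day of a month, start_date and end_date are equal, and since Cost Explorer treats the End date as exclusive, that describes an empty range and the API rejects it. A possible guard, as a sketch (month_to_date_range is a hypothetical helper, not part of the repo):

```python
from datetime import date, timedelta


def month_to_date_range(today=None):
    """Return (start, end) dates for a month-to-date Cost Explorer query.

    Cost Explorer's End date is exclusive and must be strictly after
    Start, so on the first day of a month we report the previous full
    month instead of an empty range.
    """
    today = today or date.today()
    if today.day == 1:
        # Step back one day to land in the previous month, then take its 1st
        start = (today - timedelta(days=1)).replace(day=1)
    else:
        start = today.replace(day=1)
    return start, today
```

You could call this helper at the top of lambda_handler instead of computing start_date and end_date inline.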

Now let's test the function locally:
[local-vm AWS_DailyCostReport]$ sam build
[local-vm AWS_DailyCostReport]$ sam local invoke CostReportFunction
Our function returns a nice and handy JSON like this:
{
  "statusCode": 200,
  "body": {
    "startDate": "2025-01-01",
    "endDate": "2025-01-30",
    "costs": [
      {
        "Service": "AWS WAF",
        "Cost": "1.18",
        "Currency": "USD"
      },
      {
        "Service": "AWS Cost Explorer",
        "Cost": "0.87",
        "Currency": "USD"
      },
      {
        "Service": "Tax",
        "Cost": "0.42",
        "Currency": "USD"
      },
      {
        "Service": "Amazon Simple Storage Service",
        "Cost": "0.16",
        "Currency": "USD"
      },
      {
        "Service": "AWS Lambda",
        "Cost": "0.00",
        "Currency": "USD"
      },
      {
        "Service": "Amazon DynamoDB",
        "Cost": "0.00",
        "Currency": "USD"
      }
    ],
    "totalCost": "2.63",
    "currency": "USD"
  }
}

This shows our current month-to-date cost, broken down by service, exactly as we wanted.
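If you later want to turn this JSON into a human-readable message (for example, for the email delivery via SNS), a small formatter could look like this. This is a sketch; format_report is a hypothetical helper, not part of the repo:

```python
def format_report(body):
    """Render the Lambda response body as a plain-text cost report."""
    lines = [f"AWS cost report {body['startDate']} to {body['endDate']}", ""]
    for item in body['costs']:
        lines.append(f"{item['Service']}: {item['Cost']} {item['Currency']}")
    lines.append("")
    lines.append(f"Total: {body['totalCost']} {body['currency']}")
    return "\n".join(lines)
```

Passing the 'body' dict from lambda_handler's return value would yield one line per service followed by the total.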
You can find the code for this tutorial in the accompanying Git Repo.
If you want to deploy the Lambda function to your AWS account, you can run
sam deploy
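Note that sam local invoke runs with your local AWS credentials, but once deployed, the function's execution role also needs permission to call Cost Explorer. One way to grant it is SAM's inline Policies property on the function in template.yaml; a sketch under that assumption (Cost Explorer actions are not resource-scoped, hence the "*" resource):

```yaml
  CostReportFunction:
    Type: AWS::Serverless::Function
    Properties:
      CodeUri: lambda/
      Handler: app.lambda_handler
      Runtime: python3.9
      Policies:
        - Statement:
            - Effect: Allow
              Action:
                - ce:GetCostAndUsage
              Resource: "*"
```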

Conclusion
In this tutorial, we explored how to automate daily AWS cost monitoring by leveraging AWS Lambda, the Cost Explorer API, and AWS SAM. By implementing this solution, you can proactively manage your AWS expenses, gaining timely insights to prevent unexpected charges and optimize resource utilization. We developed and tested a Lambda function locally that uses the boto3 Cost Explorer client in Python to fetch the cost data, which we then aggregated and formatted to get the information we want.

Key Takeaways:
- Automated Monitoring: Utilizing AWS Lambda to schedule daily cost reports ensures consistent and up-to-date expense tracking without manual intervention.
- Detailed Insights: Accessing the Cost Explorer API programmatically allows for customized and detailed cost analysis tailored to your specific needs.
- Proactive Management: Receiving daily reports via Amazon SNS and email notifications empowers you to make informed decisions promptly, helping to maintain control over your AWS spending.
I hope you enjoyed this tutorial and that it helps keep your AWS costs under control.
COMING UP NEXT - Learn how to send the response of a Lambda Function to an SNS Topic and receive E-Mail Reports of your account cost.

🙂 Happy coding!