An application uploads text files to an S3 Bucket. Whenever a file is uploaded to the bucket, a Lambda function is triggered; it reads the content of the file and pushes the data to a Kinesis data stream.
Services Covered
- Amazon Kinesis
- Amazon S3
- AWS Lambda
- IAM
Lab description
An application uploads text files to an S3 Bucket. Whenever a file is uploaded to the bucket, a Lambda function is triggered; it reads the content of the file and pushes the data to a Kinesis data stream. Two consumer Lambda functions consume the data from the stream.
- Creating Lambda functions and setting up Triggers
- Creating Kinesis data stream
- Setting Lambda functions as consumers of Kinesis data stream
- Monitoring events in CloudWatch Logs
Lab diagram
Lab date
25-09-2021
Prerequisites
- AWS account
Lab source
Lab steps
- In IAM, create a Role for the Lambda functions and attach the following permissions policies:
- AmazonS3FullAccess
- CloudWatchFullAccess
- AmazonKinesisFullAccess
Give it a descriptive name; we will attach this Role to the Lambda functions in the next steps.
- Create a Kinesis data stream. Set the number of open shards to 1. After creation, enable server-side encryption under Configuration.
- Create an S3 Bucket. Enable Bucket Versioning and server-side encryption with an Amazon S3 key (SSE-S3).
- Create a Lambda function called “producer”, select Node.js as the runtime, and use the Role created in step 1 as the execution role. producer.js reads the data from the newly uploaded S3 object and sends it to the Kinesis data stream created earlier.
- Create an event notification in the S3 Bucket created earlier and set the suffix filter to ‘.txt’. As the event type choose ‘All object create events’ and as the destination choose the ‘producer’ Lambda function. This will trigger the Lambda each time a new object is uploaded.
- Create two consumer Lambda functions (consumer1 and consumer2) with the same runtime and execution role, and add the Kinesis data stream as a trigger for each.
- Upload a text file to the S3 Bucket. This will trigger the producer Lambda. To check the event logs, go to CloudWatch and under Logs verify that the producer logged its events. The consumer Lambdas should also log the events with the text from the data stream.
Lab files
- producer.js – Lambda function triggered when a file is uploaded to the S3 Bucket; it reads the data and sends it to the Kinesis data stream
- consumer1.js and consumer2.js – consumer Lambda functions that process records from the Kinesis data stream
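The producer's core logic can be sketched as: take the bucket and key from the S3 event record, fetch the object, and forward its text to the stream with a PutRecord call. The dependency-injected `s3` and `kinesis` clients and the stream name are assumptions made so the logic is readable in isolation; the lab's actual producer.js would use the AWS SDK clients directly:

```javascript
// Sketch of the producer logic (clients injected for illustration,
// not the lab's actual producer.js).
async function handleS3Event(event, { s3, kinesis, streamName }) {
  for (const record of event.Records) {
    const bucket = record.s3.bucket.name;
    // S3 event keys are URL-encoded, with spaces as '+'
    const key = decodeURIComponent(record.s3.object.key.replace(/\+/g, ' '));

    // Read the uploaded text file
    const obj = await s3.getObject({ Bucket: bucket, Key: key });
    const body = obj.Body.toString('utf8');

    // Push the file contents to the Kinesis data stream
    await kinesis.putRecord({
      StreamName: streamName,
      PartitionKey: key, // key spreads records across shards if more are added
      Data: body,
    });
  }
}

module.exports = { handleS3Event };
```

Using the object key as the partition key keeps ordering per file while still allowing the stream to scale beyond one shard later.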