How to read an S3 object larger than 32 MB using AWS Lambda and generate the log report

0 votes
Hello guys, I am able to read my S3 object from the Lambda function and see the log in CloudWatch as long as the file in S3 is up to 32 MB. But when the file is larger than 32 MB, my log is not generated even if I allocate 3008 MB of RAM to my function. Please suggest what I should do. I checked here: https://docs.aws.amazon.com/AmazonCloudWatch/latest/logs/AgentReference.html#agent-faq . It says a log event exceeding 256 KB cannot be processed. Please suggest an alternative way to read the file.
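One common approach for this situation is to stream the object and log only a summary instead of the file contents, since CloudWatch Logs rejects individual log events larger than 256 KB regardless of the Lambda's memory size. The sketch below assumes a boto3 Lambda handler; the bucket and key names are hypothetical placeholders, not values from the question:

```python
import io


def summarize_stream(body, chunk_size=1024 * 1024):
    """Count bytes and newline-terminated lines while reading the
    stream in chunks, so the whole object is never held in memory
    or written to the log in one event."""
    total_bytes = 0
    total_lines = 0
    while True:
        chunk = body.read(chunk_size)
        if not chunk:
            break
        total_bytes += len(chunk)
        total_lines += chunk.count(b"\n")
    return total_bytes, total_lines


def lambda_handler(event, context):
    # boto3 ships with the Lambda Python runtime; "my-bucket" and
    # "big-file.csv" are hypothetical example values.
    import boto3
    s3 = boto3.client("s3")
    obj = s3.get_object(Bucket="my-bucket", Key="big-file.csv")
    size, lines = summarize_stream(obj["Body"])
    # Log a short summary rather than the file body: CloudWatch
    # enforces a 256 KB limit per log event.
    print(f"Processed {size} bytes, {lines} lines")
    return {"bytes": size, "lines": lines}
```

`summarize_stream` also works on any file-like object, so the streaming logic can be tested locally with `io.BytesIO` before deploying.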
Sep 29, 2020 in AWS by Jeetz
• 120 points
929 views

1 answer to this question.

0 votes
Hi@Jeetz,

Just follow the steps given in the document. You can rename your existing log file and create a new file with the same name. There may be a size limit on the log file; currently, your log file has reached that limit.
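If the content itself really has to appear in the log, one possible workaround (a sketch under the assumption that the 256 KB per-event limit is what blocks the output, not something from the answer above) is to emit it in slices safely under that limit:

```python
# CloudWatch Logs rejects a single log event larger than 256 KB
# (the limit includes per-event overhead), so stay well below it.
MAX_EVENT = 256 * 1024


def log_in_slices(text, limit=MAX_EVENT // 2):
    """Split a large string into pieces under the per-event limit.
    In a Lambda function each piece would be print()-ed, producing
    several small log events instead of one oversized one."""
    return [text[i:i + limit] for i in range(0, len(text), limit)]
```

Each returned slice can then be printed separately from the handler; CloudWatch treats every `print` call as its own log event.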
answered Sep 29, 2020 by MD
• 95,180 points

Related Questions In AWS

0 votes
2 answers

How to display just the name of files using aws s3 ls command?

aws s3 ls s3://<your_bucket_name>/ | awk '{print ...READ MORE

answered Mar 17, 2019 in AWS by anonymous
11,564 views
0 votes
1 answer

How to download the latest file in a S3 bucket using AWS CLI?

You can use the below command $ aws ...READ MORE

answered Sep 6, 2018 in AWS by Archana
• 4,150 points
11,974 views
0 votes
1 answer

How to decrypt the encrypted S3 file using aws-encryption-cli --decrypt

Use command : aws s3 presign s3://mybucket/abc_count.png you get ...READ MORE

answered Oct 22, 2018 in AWS by Priyaj
• 58,120 points
2,792 views
0 votes
1 answer

How to measure the total size of an S3 bucket using python?

Hi@akhtar, You can do this tasks using Boto. ...READ MORE

answered Apr 15, 2020 in AWS by MD
• 95,180 points
1,769 views