How to read an S3 object larger than 32 MB using AWS Lambda and generate a log report?

0 votes
Hello Guys, I am able to read my S3 object from the Lambda function and see the log in CloudWatch as long as the file in S3 is up to 32 MB. But when the file is larger than 32 MB, my log is not generated, even if I allocate 3008 MB of RAM to my function. Please suggest what I should do. I checked here: " https://docs.aws.amazon.com/AmazonCloudWatch/latest/logs/AgentReference.html#agent-faq ". It says a log event exceeding 256 KB cannot be generated. Please suggest an alternative way to read the file.
Sep 29 in AWS by Jeetz
• 120 points
404 views

1 answer to this question.

0 votes
Hi@Jeetz,

Just follow the steps given in the document you linked. You can rename your existing log file and create a new file with the same name. There is likely a size limit on the log file, and your log file has currently reached that limit. Also note that CloudWatch rejects individual log events larger than 256 KB, so instead of logging the whole object, read it in chunks and log only a summary.
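As a sketch of that idea (assuming a Python Lambda with boto3; the bucket and key names below are hypothetical), you can stream the S3 object in fixed-size chunks and log only a small summary, so no single log event comes anywhere near the 256 KB CloudWatch limit:

```python
import io

def summarize_stream(body, chunk_size=1024 * 1024):
    """Read a file-like object in fixed-size chunks and return
    (total_bytes, line_count). Logging this small summary instead
    of the object contents keeps every log event tiny."""
    total_bytes = 0
    line_count = 0
    for chunk in iter(lambda: body.read(chunk_size), b""):
        total_bytes += len(chunk)
        line_count += chunk.count(b"\n")
    return total_bytes, line_count

# Inside the Lambda handler it might look like this
# (hypothetical bucket/key; requires boto3 and S3 permissions):
#
#   import boto3
#   s3 = boto3.client("s3")
#   obj = s3.get_object(Bucket="my-bucket", Key="big-file.csv")
#   size, lines = summarize_stream(obj["Body"])
#   print(f"read {size} bytes, {lines} lines")  # small, loggable summary

# Local demonstration with an in-memory stream:
size, lines = summarize_stream(io.BytesIO(b"a\nb\nc\n"))
print(size, lines)
```

The key point is that `obj["Body"]` is a streaming response, so the function never holds the whole 32 MB+ object in one string and never tries to emit it as a single log event.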
answered Sep 29 by MD
• 79,930 points

Related Questions In AWS

0 votes
2 answers

How to display just the name of files using aws s3 ls command?

aws s3 ls s3://<your_bucket_name>/ | awk '{print ...READ MORE

answered Mar 17, 2019 in AWS by anonymous
9,820 views
0 votes
1 answer

How to download the latest file in a S3 bucket using AWS CLI?

You can use the below command $ aws ...READ MORE

answered Sep 6, 2018 in AWS by Archana
• 4,150 points
10,388 views
0 votes
1 answer

How to decrypt an encrypted S3 file using aws-encryption-cli --decrypt?

Use command : aws s3 presign s3://mybucket/abc_count.png you get ...READ MORE

answered Oct 22, 2018 in AWS by Priyaj
• 57,700 points
2,319 views