AWS Certified Developer Associate Exam Dumps

Question #101

An application uses Amazon DynamoDB as its data store and must be able to read 100 items per second as strongly consistent reads. Each item is 5 KB in size.
To what value should the table’s provisioned read throughput be set?

  • A. 50 read capacity units
  • B. 100 read capacity units
  • C. 200 read capacity units
  • D. 500 read capacity units

Correct Answer: C
Explanation: A strongly consistent read consumes 1 read capacity unit per 4 KB of item size, rounded up, so each 5 KB item requires 2 RCUs per read. Reading 100 items per second therefore requires 200 RCUs; the worked calculation below confirms this.
Reference:
https://docs.aws.amazon.com/amazondynamodb/latest/developerguide/HowItWorks.ReadWriteCapacityMode.html
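
The required throughput can be checked with a quick calculation (a strongly consistent read consumes 1 RCU per 4 KB of item size, rounded up):

    import math

    item_size_kb = 5        # each item is 5 KB
    reads_per_second = 100  # strongly consistent reads required

    # 1 RCU covers up to 4 KB per strongly consistent read, rounded up per item.
    rcu_per_read = math.ceil(item_size_kb / 4)       # = 2
    required_rcu = rcu_per_read * reads_per_second   # = 200
    print(required_rcu)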

Question #102

A web application is designed to allow new users to create accounts using their email addresses. The application will store attributes for each user, and is expecting millions of users to sign up.
What should the Developer implement to achieve the design goals?

  • A. Amazon Cognito user pools
  • B. AWS Mobile Hub user data storage
  • C. Amazon Cognito Sync
  • D. AWS Mobile Hub cloud logic

Correct Answer: A
Reference:
https://aws.amazon.com/cognito/
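
As an illustration only, the sketch below registers a user against a Cognito user pool with boto3; it assumes a user pool and app client already exist, and the client ID, email, and password are placeholders:

    import boto3

    cognito = boto3.client("cognito-idp")

    # Register a new user by email; the user pool stores the user's attributes
    # and scales to millions of sign-ups without a custom user database.
    cognito.sign_up(
        ClientId="EXAMPLE_APP_CLIENT_ID",   # placeholder app client ID
        Username="user@example.com",        # placeholder email
        Password="Str0ngPassw0rd!",         # placeholder password
        UserAttributes=[{"Name": "email", "Value": "user@example.com"}],
    )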

Question #103

A company needs a new REST API that can return information about the contents of an Amazon S3 bucket, such as a count of the objects stored in it. The company has decided that the new API should be written as a microservice using AWS Lambda and Amazon API Gateway.
How should the Developer ensure that the microservice has the necessary access to the Amazon S3 bucket, while adhering to security best practices?

  • A. Create an IAM user that has permissions to access the Amazon S3 bucket, and store the IAM user credentials in the Lambda function source code.
  • B. Create an IAM role that has permissions to access the Amazon S3 bucket and assign it to the Lambda function as its execution role.
  • C. Create an Amazon S3 bucket policy that specifies the Lambda service as its principal and assign it to the Amazon S3 bucket.
  • D. Create an IAM role, attach the AmazonS3FullAccess managed policy to it, and assign the role to the Lambda function as its execution role.

Correct Answer: B
Explanation: Assigning the Lambda function an execution role that grants only the required Amazon S3 permissions follows the principle of least privilege. The AmazonS3FullAccess managed policy (option D) grants far broader access than the microservice needs, and a bucket policy naming the Lambda service as principal (option C) would not scope access to this specific function.
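
A minimal sketch of such a microservice handler follows, assuming the function's execution role grants s3:ListBucket on the target bucket (the bucket name is a placeholder):

    import boto3

    s3 = boto3.client("s3")
    BUCKET = "example-bucket"  # placeholder bucket name

    def lambda_handler(event, context):
        # Credentials come from the execution role, not from stored keys.
        # Count the objects by paging through the bucket listing.
        count = 0
        paginator = s3.get_paginator("list_objects_v2")
        for page in paginator.paginate(Bucket=BUCKET):
            count += page.get("KeyCount", 0)
        return {"statusCode": 200, "body": str(count)}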

Question #104

An application is running on an EC2 instance. The Developer wants to store an application metric in Amazon CloudWatch.
What is the best practice for implementing this requirement?

  • A. Use the PUT Object API call to send data to an S3 bucket. Use an event notification to invoke a Lambda function to publish data to CloudWatch.
  • B. Publish the metric data to an Amazon Kinesis Stream using a PutRecord API call. Subscribe a Lambda function that publishes data to CloudWatch.
  • C. Use the CloudWatch PutMetricData API call to submit a custom metric to CloudWatch. Provide the required credentials to enable the API call.
  • D. Use the CloudWatch PutMetricData API call to submit a custom metric to CloudWatch. Launch the EC2 instance with the required IAM role to enable the API call.

Correct Answer: D
Explanation: Launching the EC2 instance with an IAM role that allows cloudwatch:PutMetricData lets the SDK obtain temporary credentials automatically; storing or supplying long-term credentials on the instance (option C) is not a best practice.
Reference:
https://docs.aws.amazon.com/AWSEC2/latest/UserGuide/mon-scripts.html
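
A minimal sketch of publishing a custom metric with boto3 is shown below; when the instance is launched with an IAM role that permits the call, the SDK picks up temporary credentials automatically (the namespace and metric name are placeholders):

    import boto3

    cloudwatch = boto3.client("cloudwatch")

    # No credentials appear in code; they come from the instance's IAM role.
    cloudwatch.put_metric_data(
        Namespace="MyApplication",           # placeholder namespace
        MetricData=[{
            "MetricName": "RequestLatency",  # placeholder metric name
            "Value": 123.0,
            "Unit": "Milliseconds",
        }],
    )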

Question #105

Queries to an Amazon DynamoDB table are consuming a large amount of read capacity. The table has a significant number of large attributes. The application does not need all of the attribute data.
How can DynamoDB costs be minimized while maximizing application performance?

  • A. Batch all the writes, and perform the write operations when no or few reads are being performed.
  • B. Create a global secondary index with a minimum set of projected attributes.
  • C. Implement exponential backoffs in the application.
  • D. Load balance the reads to the table using an Application Load Balancer.

Correct Answer: B
Explanation: A global secondary index that projects only the attributes the application actually reads lets queries avoid fetching the table's large attributes, which reduces consumed read capacity while keeping reads fast.
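
As a sketch only, the boto3 call below adds such an index to an existing provisioned-throughput table; the table, index, and attribute names are placeholders:

    import boto3

    dynamodb = boto3.client("dynamodb")

    # Create a GSI that projects only the attributes the application reads,
    # so queries against the index skip the table's large attributes.
    dynamodb.update_table(
        TableName="SessionData",  # placeholder table name
        AttributeDefinitions=[{"AttributeName": "UserId", "AttributeType": "S"}],
        GlobalSecondaryIndexUpdates=[{
            "Create": {
                "IndexName": "UserSummaryIndex",  # placeholder index name
                "KeySchema": [{"AttributeName": "UserId", "KeyType": "HASH"}],
                "Projection": {
                    "ProjectionType": "INCLUDE",
                    "NonKeyAttributes": ["LastLogin"],  # placeholder projected attribute
                },
                "ProvisionedThroughput": {
                    "ReadCapacityUnits": 5,
                    "WriteCapacityUnits": 5,
                },
            }
        }],
    )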

Question #106

AWS CodeBuild builds code for an application, creates the Docker image, pushes the image to Amazon Elastic Container Registry (Amazon ECR), and tags the image with a unique identifier.
If the Developers already have the AWS CLI configured on their workstations, how can the Docker images be pulled to the workstations?

  • A. Run the following: docker pull REPOSITORY URI : TAG
  • B. Run the output of the following: aws ecr get-login and then run: docker pull REPOSITORY URI : TAG
  • C. Run the following: aws ecr get-login and then run: docker pull REPOSITORY URI : TAG
  • D. Run the output of the following: aws ecr get-download-url-for-layer and then run: docker pull REPOSITORY URI : TAG

Correct Answer: B
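
For context, the boto3 sketch below shows roughly what aws ecr get-login wraps: it fetches a temporary authorization token and decodes it into a docker login command, after which the image can be pulled; the repository URI and tag are placeholders:

    import base64
    import boto3

    ecr = boto3.client("ecr")

    # The token decodes to "AWS:<password>"; proxyEndpoint is the registry URL.
    auth = ecr.get_authorization_token()["authorizationData"][0]
    user, password = base64.b64decode(auth["authorizationToken"]).decode().split(":")
    registry = auth["proxyEndpoint"]

    print(f"docker login -u {user} -p {password} {registry}")
    print("docker pull 111122223333.dkr.ecr.us-east-1.amazonaws.com/myapp:abc123")  # placeholder URI:TAG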

Question #107

A company caches session information for a web application in an Amazon DynamoDB table. The company wants an automated way to delete old items from the table.
What is the simplest way to do this?

  • A. Write a script that deletes old records; schedule the script as a cron job on an Amazon EC2 instance.
  • B. Add an attribute with the expiration time; enable the Time To Live feature based on that attribute.
  • C. Each day, create a new table to hold session data; delete the previous day’s table.
  • D. Add an attribute with the expiration time; name the attribute ItemExpiration.

Correct Answer: B
Reference:
https://aws.amazon.com/about-aws/whats-new/2017/02/amazon-dynamodb-now-supports-automatic-item-expiration-with-time-to-live-ttl/
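
A minimal sketch of the TTL approach with boto3 follows; the table, key, and attribute names are placeholders:

    import time
    import boto3

    dynamodb = boto3.client("dynamodb")

    # Enable TTL on an epoch-seconds attribute; DynamoDB deletes expired items automatically.
    dynamodb.update_time_to_live(
        TableName="Sessions",  # placeholder table name
        TimeToLiveSpecification={"Enabled": True, "AttributeName": "ExpiresAt"},
    )

    # Write a session item that expires 24 hours from now.
    dynamodb.put_item(
        TableName="Sessions",
        Item={
            "SessionId": {"S": "abc123"},                       # placeholder key
            "ExpiresAt": {"N": str(int(time.time()) + 86400)},  # epoch seconds
        },
    )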

Question #108

An application is expected to process many files. Each file takes four minutes to process in each AWS Lambda invocation. The Lambda function does not return any important data.
What is the fastest way to process all the files?

  • A. First split the files to make them smaller, then process with synchronous RequestResponse Lambda invocations.
  • B. Make synchronous RequestResponse Lambda invocations and process the files one by one.
  • C. Make asynchronous Event Lambda invocations and process the files in parallel.
  • D. First join all the files, then process it all at once with an asynchronous Event Lambda invocation.

Correct Answer: C
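
A minimal sketch of fanning out the work with asynchronous invocations is shown below; the function name and file keys are placeholders:

    import json
    import boto3

    lambda_client = boto3.client("lambda")

    # "Event" invocations return immediately, so every file is processed in
    # parallel instead of waiting roughly four minutes per file in sequence.
    for file_key in ["file-1.csv", "file-2.csv", "file-3.csv"]:  # placeholder keys
        lambda_client.invoke(
            FunctionName="process-file",   # placeholder function name
            InvocationType="Event",
            Payload=json.dumps({"key": file_key}),
        )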

Question #109

The upload of a 15 GB object to Amazon S3 fails. The error message reads: “Your proposed upload exceeds the maximum allowed object size.”
What technique will allow the Developer to upload this object?

  • A. Upload the object using the multi-part upload API.
  • B. Upload the object over an AWS Direct Connect connection.
  • C. Contact AWS Support to increase the object size limit.
  • D. Upload the object to another AWS region.

Correct Answer: A
Reference:
https://acloud.guru/forums/aws-certified-solutions-architect-associate/discussion/-KACOEWK92oCmeCwuj4t/s3-question-on-multi-part-upload
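
As a sketch, the boto3 high-level transfer API below performs the multipart upload automatically once the file exceeds the configured threshold; the file, bucket, and key names are placeholders:

    import boto3
    from boto3.s3.transfer import TransferConfig

    s3 = boto3.client("s3")

    # upload_file switches to the multipart upload API above the threshold,
    # so objects larger than the 5 GB single-PUT limit can be uploaded.
    config = TransferConfig(
        multipart_threshold=100 * 1024 * 1024,  # 100 MB
        multipart_chunksize=100 * 1024 * 1024,  # 100 MB parts
    )

    s3.upload_file("backup.tar", "example-bucket", "backup.tar", Config=config)  # placeholder names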

Question #110

A company has an AWS CloudFormation template that is stored as a single file. The template is able to launch and create a full infrastructure stack.
Which best practice would increase the maintainability of the template?

  • A. Use nested stacks for common template patterns.
  • B. Embed credentials to prevent typos.
  • C. Remove mappings to decrease the number of variables.
  • D. Use AWS::Include to reference publicly-hosted template files.

Correct Answer: A
Reference:
https://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/best-practices.html
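
For illustration, a parent template can pull a common pattern into a nested stack with an AWS::CloudFormation::Stack resource; the template URL below is a placeholder:

    Resources:
      NetworkStack:
        Type: AWS::CloudFormation::Stack
        Properties:
          TemplateURL: https://s3.amazonaws.com/example-bucket/network.yaml  # placeholder URL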
