Question #121
A company has a legal requirement to store point-in-time copies of its Amazon RDS PostgreSQL database instance in facilities that are at least 200 miles apart.
Which of the following provides the easiest way to comply with this requirement?
- A. Cross-region read replica
- B. Multiple Availability Zone snapshot copy
- C. Multiple Availability Zone read replica
- D. Cross-region snapshot copy
Correct Answer:D
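A minimal boto3 sketch of option D, copying a manual snapshot into a second Region; the Regions, account ID, and snapshot identifiers are illustrative assumptions, not values from the question.

```python
import boto3

# Run the copy from the destination Region's RDS client.
rds_west = boto3.client("rds", region_name="us-west-2")

rds_west.copy_db_snapshot(
    # ARN of the snapshot taken in the source Region (assumed identifiers)
    SourceDBSnapshotIdentifier=(
        "arn:aws:rds:us-east-1:123456789012:snapshot:mydb-snapshot-2024-01-01"
    ),
    TargetDBSnapshotIdentifier="mydb-snapshot-copy-west",
    SourceRegion="us-east-1",  # boto3 uses this to pre-sign the cross-region copy
)
```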
Question #122
After reviewing its logs, a startup company noticed large, random spikes in traffic to its web application. The company wants to configure a cost-efficient Auto Scaling solution to support high availability of the web application.
Which scaling plan should a Solutions Architect recommend to meet the company’s needs?
- A. Dynamic
- B. Scheduled
- C. Manual
- D. Lifecycle
Correct Answer:A
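A sketch of option A using a target-tracking (dynamic) scaling policy on an existing Auto Scaling group; the group name and the 50% CPU target are assumptions for illustration.

```python
import boto3

autoscaling = boto3.client("autoscaling")

autoscaling.put_scaling_policy(
    AutoScalingGroupName="web-app-asg",        # assumed existing group
    PolicyName="cpu-target-tracking",
    PolicyType="TargetTrackingScaling",
    TargetTrackingConfiguration={
        "PredefinedMetricSpecification": {
            "PredefinedMetricType": "ASGAverageCPUUtilization"
        },
        "TargetValue": 50.0,  # add/remove instances to hold average CPU near 50%
    },
)
```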
Question #123
To meet compliance standards, a company must have encrypted archival data storage. The data will be accessed infrequently, with retrieval lead times known well in advance of when the archived data must be recovered. The company requires that the storage be secure, durable, and provided at the lowest price per TB of data stored.
What type of storage should be used?
- A. Amazon S3
- B. Amazon EBS
- C. Amazon Glacier
- D. Amazon EFS
Correct Answer:C
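One way to land data in Glacier (option C) is an S3 lifecycle transition; encryption can be layered on with default bucket encryption as in Question #138. The bucket name, prefix, and 30-day transition below are assumptions.

```python
import boto3

s3 = boto3.client("s3")

s3.put_bucket_lifecycle_configuration(
    Bucket="example-compliance-archive",       # assumed bucket
    LifecycleConfiguration={
        "Rules": [
            {
                "ID": "archive-to-glacier",
                "Filter": {"Prefix": "archive/"},   # only objects under this prefix
                "Status": "Enabled",
                "Transitions": [{"Days": 30, "StorageClass": "GLACIER"}],
            }
        ]
    },
)
```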
Question #124
An online company wants to conduct real-time sentiment analysis about its products from its social media channels using SQL.
Which of the following solutions has the LOWEST cost and operational burden?
- A. Set up a streaming data ingestion application on Amazon EC2 and connect it to a Hadoop cluster for data processing. Send the output to Amazon S3 and use Amazon Athena to analyze the data.
- B. Configure the input stream using Amazon Kinesis Data Streams. Use Amazon Kinesis Data Analytics to write SQL queries against the stream.
- C. Configure the input stream using Amazon Kinesis Data Streams. Use Amazon Kinesis Data Firehose to send data to an Amazon Redshift cluster, and then query directly against Amazon Redshift.
- D. Set up a streaming data ingestion application on Amazon EC2 and send the output to Amazon S3 using Kinesis Data Firehose. Use Athena to analyze the data.
Correct Answer:B
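A sketch of option B: a Kinesis data stream for ingestion, plus the kind of SQL a Kinesis Data Analytics application would run against it. The stream name, shard count, and the `sentiment` column (assumed to exist on incoming records) are illustrative assumptions.

```python
import boto3

kinesis = boto3.client("kinesis")

# Ingestion side: the stream the social media feed writes into.
kinesis.create_stream(StreamName="social-posts", ShardCount=2)

# Analytics side: a one-minute tumbling-window count per sentiment label,
# written in the Kinesis Data Analytics SQL dialect (kept here as a string).
WINDOWED_SENTIMENT_SQL = """
CREATE OR REPLACE STREAM "SENTIMENT_COUNTS" (sentiment VARCHAR(16), post_count INTEGER);
CREATE OR REPLACE PUMP "SENTIMENT_PUMP" AS INSERT INTO "SENTIMENT_COUNTS"
SELECT STREAM sentiment, COUNT(*) AS post_count
FROM "SOURCE_SQL_STREAM_001"
GROUP BY sentiment, STEP("SOURCE_SQL_STREAM_001".ROWTIME BY INTERVAL '60' SECOND);
"""
```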
Question #125
An organization must process a stream of large-volume hashtag data in real time and needs to run custom SQL queries on the data to get insights on certain tags.
The organization needs this solution to be elastic and does not want to manage clusters.
Which of the following AWS services meets these requirements?
- A. Amazon Elasticsearch Service
- B. Amazon Athena
- C. Amazon Redshift
- D. Amazon Kinesis Data Analytics
Correct Answer:D
Reference:
https://aws.amazon.com/blogs/machine-learning/build-a-social-media-dashboard-using-machine-learning-and-bi-services/
Question #126
Which requirements must be met in order for a Solutions Architect to specify that an Amazon EC2 instance should stop rather than terminate when its Spot Instance is interrupted? (Choose two.)
- A. The Spot Instance request type must be one-time.
- B. The Spot Instance request type must be persistent.
- C. The root volume must be an Amazon EBS volume.
- D. The root volume must be an instance store volume.
- E. The launch configuration is changed.
Correct Answer:BC
Reference:
https://docs.aws.amazon.com/AWSEC2/latest/UserGuide/spot-interruptions.html#interruption-behavior
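A sketch combining options B and C: a persistent Spot request with the stop interruption behavior, launched from an EBS-backed AMI. The AMI ID and instance type are assumptions.

```python
import boto3

ec2 = boto3.client("ec2")

ec2.request_spot_instances(
    InstanceCount=1,
    Type="persistent",                       # persistent request is required for "stop"
    InstanceInterruptionBehavior="stop",     # stop instead of terminate on interruption
    LaunchSpecification={
        "ImageId": "ami-0123456789abcdef0",  # assumed EBS-backed AMI (EBS root volume)
        "InstanceType": "m5.large",
    },
)
```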
Question #127
An application hosted on AWS uses object storage for storing internal reports that are accessed daily by the CFO. Currently, these reports are publicly available.
How should a Solutions Architect re-design this architecture to prevent unauthorized access to these reports?
- A. Encrypt the files on the client side and store the files on Amazon Glacier, then decrypt the reports on the client side.
- B. Move the files to Amazon ElastiCache and provide a username and password for downloading the reports.
- C. Specify the use of AWS KMS server-side encryption at the time of an object creation on Amazon S3.
- D. Store the files on Amazon S3 and use the application to generate S3 pre-signed URLs to users.
Correct Answer:D
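A sketch of option D, generating a time-limited pre-signed URL for a private report; the bucket, key, and one-hour expiry are assumptions.

```python
import boto3

s3 = boto3.client("s3")

# The bucket stays private; only holders of this URL can fetch the object
# until it expires.
url = s3.generate_presigned_url(
    ClientMethod="get_object",
    Params={"Bucket": "internal-reports", "Key": "2024/q1-financials.pdf"},
    ExpiresIn=3600,  # seconds
)
print(url)  # returned to the authenticated user (e.g. the CFO's session)
```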
Question #128
A Solutions Architect is designing an application on AWS that will connect to the on-premises data center through a VPN connection. The solution must be able to log network traffic over the VPN.
Which service logs this network traffic?
- A. AWS CloudTrail logs
- B. Amazon VPC flow logs
- C. Amazon S3 bucket logs
- D. Amazon CloudWatch Logs
Correct Answer:B
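A sketch of option B, publishing VPC Flow Logs to CloudWatch Logs; the VPC ID, log group name, and IAM role ARN are assumptions.

```python
import boto3

ec2 = boto3.client("ec2")

ec2.create_flow_logs(
    ResourceIds=["vpc-0123456789abcdef0"],   # assumed VPC terminating the VPN
    ResourceType="VPC",
    TrafficType="ALL",                       # ACCEPT, REJECT, or ALL
    LogGroupName="vpn-traffic-flow-logs",
    DeliverLogsPermissionArn="arn:aws:iam::123456789012:role/flow-logs-role",
)
```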
Question #129
A company wants to durably store data in 8 KB chunks. The company will access the data once every few months. However, when the company does access the data, it must be done with as little latency as possible.
Which AWS service should a Solutions Architect recommend if cost is NOT a factor?
- A. Amazon DynamoDB
- B. Amazon EBS Throughput Optimized HDD Volumes
- C. Amazon EBS Cold HDD Volumes
- D. Amazon ElastiCache
Correct Answer:A
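A sketch of option A, writing and reading an ~8 KB chunk in DynamoDB; the table name and key schema are assumptions.

```python
import boto3

dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table("archive-chunks")   # assumed table with "chunk_id" partition key

# Durable write of an ~8 KB item, and a low-latency read regardless of how
# rarely the item is accessed.
table.put_item(Item={"chunk_id": "chunk-0001", "payload": "x" * 8000})
item = table.get_item(Key={"chunk_id": "chunk-0001"})["Item"]
```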
Question #130
A media company has more than 100 TB of data that must be stored and retrieved infrequently. However, the company occasionally receives requests that require the data to be available within an hour. The company needs a low-cost retrieval method to handle these requests.
Which service meets this requirement?
- A. Amazon S3 Standard
- B. Amazon Glacier standard retrievals
- C. Amazon Glacier bulk retrievals
- D. Amazon S3 Standard Infrequent Access
Correct Answer:D
Reference:
https://aws.amazon.com/blogs/aws/aws-storage-update-s3-glacier-price-reductions/
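A sketch of option D, storing an object directly in the Standard-IA storage class; the bucket, key, and local file name are assumptions. Retrieval latency is the same as S3 Standard, so the one-hour requirement is easily met.

```python
import boto3

s3 = boto3.client("s3")

with open("interview-001.mp4", "rb") as body:
    s3.put_object(
        Bucket="media-archive",
        Key="footage/2023/interview-001.mp4",
        Body=body,
        StorageClass="STANDARD_IA",   # lower storage cost, millisecond retrieval
    )
```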
Question #131
An on-premises database is experiencing significant performance problems when running SQL queries. With 10 users, the lookups perform as expected. As the number of users increases, the lookups take three times longer than expected to return values to the application.
Which action should a Solutions Architect take to maintain performance as the user count increases?
- A. Use Amazon SQS.
- B. Deploy Multi-AZ RDS MySQL.
- C. Configure Amazon RDS with additional read replicas.
- D. Migrate from MySQL to RDS Microsoft SQL Server.
Correct Answer:C
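A sketch of option C, adding a read replica so the growing lookup traffic can be spread away from the primary; the instance identifiers are assumptions. The application then sends its read-only queries to the replica's endpoint.

```python
import boto3

rds = boto3.client("rds")

rds.create_db_instance_read_replica(
    DBInstanceIdentifier="app-db-replica-1",       # assumed replica name
    SourceDBInstanceIdentifier="app-db-primary",   # assumed primary instance
)
```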
Question #132
A team has an application that detects new objects being uploaded into an Amazon S3 bucket. The uploads trigger a Lambda function to write object metadata into an Amazon DynamoDB table and an Amazon RDS PostgreSQL database.
Which action should the team take to ensure high availability?
- A. Enable cross-region replication in the Amazon S3 bucket.
- B. Create a Lambda function for each Availability Zone the application is deployed in.
- C. Enable multi-AZ on the RDS PostgreSQL database.
- D. Create a DynamoDB stream for the DynamoDB table.
Correct Answer:C
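A sketch of option C, converting the existing RDS PostgreSQL instance to Multi-AZ so a standby in a second Availability Zone can take over on failure; the instance identifier is an assumption.

```python
import boto3

rds = boto3.client("rds")

rds.modify_db_instance(
    DBInstanceIdentifier="metadata-postgres",   # assumed existing instance
    MultiAZ=True,
    ApplyImmediately=True,   # otherwise the change waits for the maintenance window
)
```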
Question #133
A media company must store 10 TB of audio recordings. Retrieval happens infrequently and requestors agree on an 8-hour turnaround time.
What is the MOST cost-effective solution to store the files?
- A. Amazon S3 Standard-Infrequent Access (Standard-IA)
- B. EBS Throughput Optimized HDD (st1)
- C. EBS Cold HDD (sc1)
- D. Amazon Glacier
Correct Answer:D
Reference:
https://aws.amazon.com/about-aws/whats-new/2016/11/access-your-amazon-glacier-data-in-minutes-with-new-retrieval-options/
Question #134
A company wants to improve the performance of its web application after receiving customer complaints. An analysis concluded that repeatedly running the same complex database queries was causing the increased latency.
What should a Solutions Architect recommend to improve the application’s performance?
- A. Migrate the database to MySQL.
- B. Use Amazon Redshift to analyze the queries.
- C. Integrate Amazon ElastiCache into the application.
- D. Use a Lambda-triggered request to the backend database.
Correct Answer:C
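A sketch of option C using the cache-aside pattern with an ElastiCache for Redis endpoint in front of the database. The endpoint, credentials, table, and the 300-second TTL are assumptions; `redis-py` and `psycopg2` stand in for whatever clients the application actually uses.

```python
import json
import redis      # redis-py client, pointed at an ElastiCache for Redis endpoint
import psycopg2   # assumed application database driver

cache = redis.Redis(host="my-cache.abc123.use1.cache.amazonaws.com", port=6379)
db = psycopg2.connect(host="app-db.example.internal", dbname="app",
                      user="app", password="REPLACE_ME")

def get_report(report_id: str) -> dict:
    """Cache-aside lookup: serve repeated complex queries from ElastiCache."""
    cached = cache.get(f"report:{report_id}")
    if cached is not None:
        return json.loads(cached)                       # cache hit: no database work
    with db.cursor() as cur:                            # cache miss: run the slow query once
        cur.execute("SELECT payload FROM reports WHERE id = %s", (report_id,))
        result = {"payload": cur.fetchone()[0]}
    cache.setex(f"report:{report_id}", 300, json.dumps(result))   # expire after 5 minutes
    return result
```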
Question #135
Which tool analyzes account resources and provides a detailed inventory of changes over time?
- A. AWS Config
- B. AWS CloudFormation
- C. Amazon CloudWatch
- D. AWS Service Catalog
Correct Answer:A
Reference:
https://docs.aws.amazon.com/config/latest/developerguide/WhatIsConfig.html
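A sketch of option A, pulling the recorded configuration timeline for a single resource from AWS Config; the resource ID is an assumption, and a configuration recorder must already be enabled in the account and Region.

```python
import boto3

config = boto3.client("config")

history = config.get_resource_config_history(
    resourceType="AWS::EC2::Instance",
    resourceId="i-0123456789abcdef0",     # assumed instance
)
for item in history["configurationItems"]:
    print(item["configurationItemCaptureTime"], item["configurationItemStatus"])
```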
Question #136
A Solutions Architect is designing a solution that will include a database in Amazon RDS. Corporate security policy mandates that the database, its logs, and its backups are all encrypted.
Which is the MOST efficient option to fulfill the security policy using Amazon RDS?
- A. Launch an Amazon RDS instance with encryption enabled. Enable encryption for logs and backups.
- B. Launch an Amazon RDS instance. Enable encryption for database, logs and backups.
- C. Launch an Amazon RDS instance with encryption enabled. Logs and backups are automatically encrypted.
- D. Launch an Amazon RDS instance. Enable encryption for backups. Encrypt logs with a database-engine feature.
Correct Answer:C
Reference:
https://docs.aws.amazon.com/AmazonRDS/latest/UserGuide/Overview.Encryption.html
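A sketch of option C: encryption at rest is chosen when the instance is created, and the underlying storage, automated backups, snapshots, and logs stored on that storage are then encrypted without further steps. Identifiers, sizes, and credentials below are assumptions.

```python
import boto3

rds = boto3.client("rds")

rds.create_db_instance(
    DBInstanceIdentifier="secure-app-db",
    Engine="postgres",
    DBInstanceClass="db.m5.large",
    AllocatedStorage=100,
    MasterUsername="dbadmin",
    MasterUserPassword="REPLACE_ME",
    StorageEncrypted=True,   # uses the default aws/rds KMS key unless KmsKeyId is given
)
```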
Question #137
A Solutions Architect is designing a public-facing web application for employees to upload images to their social media account. The application consists of multiple Amazon EC2 instances behind an elastic load balancer, an Amazon S3 bucket where uploaded images are stored, and an Amazon DynamoDB table for storing image metadata.
Which AWS service can the Architect use to automate the process of updating metadata in the DynamoDB table upon image upload?
- A. Amazon CloudWatch
- B. AWS CloudFormation
- C. AWS Lambda
- D. Amazon SQS
Correct Answer:C
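A sketch of option C: a Python Lambda handler wired to the bucket's ObjectCreated event that writes the uploaded object's metadata into DynamoDB. The table name and attribute names are assumptions.

```python
import boto3

dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table("image-metadata")     # assumed table

def handler(event, context):
    for record in event["Records"]:          # one record per uploaded object
        obj = record["s3"]["object"]
        table.put_item(Item={
            "image_key": obj["key"],
            "size_bytes": obj["size"],
            "bucket": record["s3"]["bucket"]["name"],
        })
```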
Question #138
A company’s policy requires that all data stored in Amazon S3 is encrypted. The company wants to use the option with the least overhead and does not want to manage any encryption keys.
Which of the following options will meet the company’s requirements?
- A. AWS CloudHSM
- B. AWS Trusted Advisor
- C. Server-Side Encryption (SSE-S3)
- D. Server-Side Encryption (SSE-KMS)
Correct Answer:C
Reference:
https://docs.aws.amazon.com/AmazonS3/latest/dev/bucket-encryption.html
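A sketch of option C, enabling default bucket encryption with S3-managed keys so every new object is encrypted with AES-256 and no key management falls on the company; the bucket name is an assumption.

```python
import boto3

s3 = boto3.client("s3")

s3.put_bucket_encryption(
    Bucket="company-data",
    ServerSideEncryptionConfiguration={
        "Rules": [
            {"ApplyServerSideEncryptionByDefault": {"SSEAlgorithm": "AES256"}}
        ]
    },
)
```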
Question #139
A company has gigabytes of web log files stored in an Amazon S3 bucket. A Solutions Architect wants to copy those files into Amazon Redshift for analysis. The company’s security policy mandates that data is encrypted at rest both in the Amazon Redshift cluster and the Amazon S3 bucket.
Which process will fulfill the security requirements?
- A. Enable server-side encryption on the Amazon S3 bucket. Launch an unencrypted Amazon Redshift cluster. Copy the data into the Amazon Redshift cluster.
- B. Enable server-side encryption on the Amazon S3 bucket. Copy data from the Amazon S3 bucket into an unencrypted Redshift cluster. Enable encryption on the cluster.
- C. Launch an encrypted Amazon Redshift cluster. Copy the data from the Amazon S3 bucket into the Amazon Redshift cluster. Copy data back to the Amazon S3 bucket in encrypted form.
- D. Enable server-side encryption on the Amazon S3 bucket. Launch an encrypted Amazon Redshift cluster. Copy the data into the Amazon Redshift cluster.
Correct Answer:D
Reference:
https://aws.amazon.com/blogs/big-data/encrypt-your-amazon-redshift-loads-with-amazon-s3-and-aws-kms/
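A sketch of option D: launch the Redshift cluster with encryption at rest enabled, then load the SSE-encrypted S3 objects with COPY. Cluster identifiers, credentials, node sizing, the target table, and the IAM role ARN are assumptions.

```python
import boto3

redshift = boto3.client("redshift")

redshift.create_cluster(
    ClusterIdentifier="weblog-analytics",
    NodeType="ra3.xlplus",
    NumberOfNodes=2,
    MasterUsername="admin",
    MasterUserPassword="REPLACE_ME",
    Encrypted=True,                  # encryption must be set at cluster creation
)

# SQL run against the cluster afterwards: COPY reads the encrypted objects from
# S3 via an IAM role attached to the cluster.
COPY_SQL = """
COPY weblogs FROM 's3://weblog-bucket/logs/'
IAM_ROLE 'arn:aws:iam::123456789012:role/redshift-s3-read'
FORMAT AS JSON 'auto';
"""
```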
Question #140
An application runs on Amazon EC2 instances in an Auto Scaling group. When instances are terminated, the Systems Operations team cannot determine the root cause because the logs reside on the terminated instances and are lost.
How can the root cause be determined?
- A. Use ephemeral volumes to store the log files.
- B. Use a scheduled Amazon CloudWatch Event to take regular Amazon EBS snapshots.
- C. Use an Amazon CloudWatch agent to push the logs to Amazon CloudWatch Logs.
- D. Use AWS CloudTrail to pull the logs from the Amazon EC2 instances.
Correct Answer:C
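A sketch of option C: the "logs" section of a CloudWatch agent configuration that ships an application log file to CloudWatch Logs before the instance can be terminated. The log path, log group name, and the config file location (the typical Linux default) are assumptions.

```python
import json

agent_config = {
    "logs": {
        "logs_collected": {
            "files": {
                "collect_list": [
                    {
                        "file_path": "/var/log/myapp/app.log",   # assumed application log
                        "log_group_name": "/asg/web-app",
                        "log_stream_name": "{instance_id}",      # one stream per instance
                    }
                ]
            }
        }
    }
}

# Write the config where the agent expects it (typical Linux install path).
with open("/opt/aws/amazon-cloudwatch-agent/etc/amazon-cloudwatch-agent.json", "w") as f:
    json.dump(agent_config, f, indent=2)
```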