AWS Solutions Architect Associate: Interview Questions and Answers Part 2

Learn about AWS Solutions Architect Associate interview questions and answers for experienced candidates.

Question 1
A business enterprise is developing a web application to be hosted in AWS. The application requires a data store for session data. As the AWS Solutions Architect, you have to name two ideal services for storing session data. What are they?
Answer:
According to the AWS Documentation, we have to use Amazon DynamoDB (NoSQL database service) and Amazon ElastiCache (in-memory caching service) to store session data.
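As a rough illustration, a session record written to DynamoDB often carries a TTL attribute so expired sessions are purged automatically. The table schema, attribute names, and TTL length below are assumptions for the sketch, not AWS requirements:

```python
import time
import uuid

# A minimal sketch of a session record for DynamoDB. The attribute names
# (session_id, user_id, data, expires_at) are assumed, not mandated by AWS;
# DynamoDB's TTL feature deletes items once the 'expires_at' epoch passes.
def build_session_item(user_id: str, session_data: dict, ttl_seconds: int = 3600) -> dict:
    """Build a DynamoDB item for session storage with a TTL attribute."""
    return {
        "session_id": str(uuid.uuid4()),               # partition key (assumed schema)
        "user_id": user_id,
        "data": session_data,
        "expires_at": int(time.time()) + ttl_seconds,  # TTL attribute in epoch seconds
    }

item = build_session_item("user-123", {"cart": ["sku-1"]}, ttl_seconds=1800)
# With boto3, this item could then be written via:
#   boto3.resource("dynamodb").Table("sessions").put_item(Item=item)
```

The same data could equally live in ElastiCache (e.g. Redis with an EXPIRE), which trades DynamoDB's durability for lower latency.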

Question 2
Imagine there are users who upload images through a mobile application. A business enterprise needs to store these images, and a security measure is required to ensure no loss of data. What should you do to protect against unintended user actions?

Answer:
We have to store the data in an S3 bucket and enable versioning. Amazon S3 offers versioning at the bucket level, and we use it to recover previous versions of an object.
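To make the mechanism concrete, here is a sketch of the request payload that boto3's `put_bucket_versioning` call accepts; the bucket name is hypothetical:

```python
# The S3 API defines the Status values "Enabled" and "Suspended";
# once enabled, overwrites and deletes create new versions instead of
# destroying data, so earlier versions remain recoverable.
versioning_config = {"Status": "Enabled"}

# Applied with boto3 via:
#   boto3.client("s3").put_bucket_versioning(
#       Bucket="mobile-app-images",                 # hypothetical bucket name
#       VersioningConfiguration=versioning_config,
#   )
```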

Question 3
Imagine an application requires a data store hosted in AWS. The data store needs to satisfy the following conditions:
There should be an initial storage capacity of 8 TB.
The data store must be able to accommodate database growth of 8 GB per day.
The data store should support 4 Read Replicas.
Which data store would you choose?

Answer:
We have to use Amazon Aurora (a MySQL- and PostgreSQL-compatible relational database) to satisfy the above requirements.

Question 4
Imagine a scenario where a database must be hosted on an EC2 Instance, and the EBS volume must support 12,000 IOPS. Which Amazon EBS volume type satisfies the performance requirements of this database?

Answer: According to the AWS Documentation, we need to use an EBS Provisioned IOPS SSD (io1/io2) volume to satisfy the above-mentioned performance requirements.

Question 5
The development teams in your business enterprise utilize Amazon S3 buckets to store log files for numerous applications hosted in AWS development environments. The developers intend to keep the logs for a month for troubleshooting and then purge them. Which feature will enable this requirement?

Answer: According to the AWS Documentation, configuring lifecycle configuration rules on the Amazon S3 bucket will enable this requirement.
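A lifecycle rule that expires objects 30 days after creation might look like the sketch below; the rule ID, `logs/` prefix, and bucket name are assumptions for illustration:

```python
# A minimal S3 lifecycle configuration that expires log objects 30 days
# after creation. Keys and structure follow the S3 lifecycle API shape.
lifecycle_config = {
    "Rules": [
        {
            "ID": "expire-app-logs",            # hypothetical rule name
            "Filter": {"Prefix": "logs/"},      # only objects under logs/
            "Status": "Enabled",
            "Expiration": {"Days": 30},         # purge after one month
        }
    ]
}

# Applied with boto3 via:
#   boto3.client("s3").put_bucket_lifecycle_configuration(
#       Bucket="dev-app-logs",                  # hypothetical bucket name
#       LifecycleConfiguration=lifecycle_config,
#   )
```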

Question 6
A legacy application needs a proprietary file system that must be accessible by an EC2 Instance. Which cloud file storage option would you use to store the data?

Answer:
According to the AWS Documentation, we have to use Amazon EFS (Elastic File System).

Question 7
An application uses NGINX and must be scalable at any point in time. Which two services would you choose to host this application?

Answer: According to the AWS Documentation, we should choose Amazon EC2 and AWS Elastic Beanstalk to host the application.

Question 8
There is a need to upload a million images to Amazon S3. What should you do to guarantee excellent performance?

Answer:
According to the AWS Documentation, we have to use a hexadecimal hash as the key name prefix, so that requests are spread across S3 partitions.
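One way to apply the hex-hash naming scheme is to derive a short prefix from each filename; the four-character prefix length below is an arbitrary choice for the sketch:

```python
import hashlib

# Sketch of the hex-hash key-naming scheme: a short hash-derived prefix
# spreads sequentially named uploads across distinct S3 key prefixes.
def hashed_key(filename: str, prefix_len: int = 4) -> str:
    digest = hashlib.md5(filename.encode()).hexdigest()
    return f"{digest[:prefix_len]}/{filename}"

print(hashed_key("photo-000001.jpg"))  # e.g. "<4 hex chars>/photo-000001.jpg"
```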


Question 9
We need to capture the IP addresses of resources accessed in a private subnet. Which Amazon VPC feature should you use for this task?

Answer:
According to the AWS Documentation, we must use the VPC Flow Logs feature.


Question 10
The requirement is to send and process 500 messages in order. Which service would you use in this case?

Answer: According to the AWS Documentation, we have to use Amazon SQS FIFO queues.
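A sketch of the parameters an SQS FIFO `send_message` call takes is shown below. Ordering is preserved per `MessageGroupId`, and `MessageDeduplicationId` suppresses duplicates; the queue URL is hypothetical (FIFO queue names must end in `.fifo`):

```python
import uuid

# Build the keyword arguments for an SQS FIFO send_message call.
def build_fifo_message(body: str, group_id: str) -> dict:
    return {
        "QueueUrl": "https://sqs.us-east-1.amazonaws.com/123456789012/orders.fifo",  # hypothetical
        "MessageBody": body,
        "MessageGroupId": group_id,                   # messages in a group stay ordered
        "MessageDeduplicationId": str(uuid.uuid4()),  # or enable content-based dedup on the queue
    }

msg = build_fifo_message("order-42", "customer-7")
# Sent with boto3 via: boto3.client("sqs").send_message(**msg)
```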

Question 11
We require a database for a two-tier application. The data is expected to go through numerous schema changes, and the database must be resilient and ACID compliant. Any modifications to the database should not lead to downtime. Which storage option would you choose?

Answer: According to the AWS Documentation, we need to use Amazon Aurora (MySQL-compatible database) for data storage.


Question 12
There is currently 60 TB of data in a Redshift cluster. The requirement is to set up a disaster recovery site in a location approximately 600 km away. Which solution would satisfy this requirement?

Answer: We have to enable Cross-Region snapshots for the Redshift cluster.

Question 13
A business enterprise is using a Redshift cluster for their data warehouse. The internal IT security team requires you to encrypt the data in the Redshift database. How would you go about it?

Answer: According to the AWS Documentation, we have to use the default AWS KMS customer master key to encrypt the data.
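As a sketch, KMS encryption is requested at cluster creation time. The identifiers below are hypothetical; with `Encrypted` set to true and no `KmsKeyId` supplied, the AWS-managed default key is used:

```python
# Parameters for an encrypted Redshift cluster, in the shape that boto3's
# create_cluster call accepts. All names/values here are illustrative.
cluster_params = {
    "ClusterIdentifier": "dw-cluster",       # hypothetical
    "NodeType": "ra3.xlplus",
    "MasterUsername": "admin",
    "MasterUserPassword": "REPLACE_ME",      # placeholder, never hardcode real secrets
    "Encrypted": True,                       # encrypt data at rest with KMS
    # "KmsKeyId": "arn:aws:kms:...",         # optional customer-managed key instead
}

# Created with boto3 via: boto3.client("redshift").create_cluster(**cluster_params)
```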

Question 14
We require block-level storage for 500 GB of data, and we also require data encryption. Which block storage device would you use in this scenario?

Answer: We need to use Amazon EBS volumes, which support encryption at rest, for the above scenario.

Question 15
An application needs an EC2 Instance to continuously batch process different activities, with a maximum data throughput of 500 MiB/s. What is the best storage option in this scenario?

Answer: According to the AWS Documentation, we need to use an EBS Throughput Optimized HDD (st1) volume for this scenario.


Question 16
An application needs to access data in another AWS account in the same region. What would you do to ensure that the data can be accessed as and when required?

Answer: According to the AWS Documentation, we have to use VPC Peering between the two accounts.

Question 17
An application is currently using a NAT Instance but is required to use a NAT Gateway. What would you do to achieve this?

Answer: We have to migrate from the NAT Instance to a NAT Gateway and host the NAT Gateway in a public subnet.

Question 18
The following architecture exists for an application:
There are EC2 Instances in numerous AZs behind an ELB.
The EC2 Instances are launched through an Auto Scaling Group.
A single NAT Instance is utilized so that instances can download updates from the internet.
Which of the following options is a limitation of the architecture?
1. The Auto Scaling Group
2. The NAT Instance
3. The ELB
4. The EC2 Instances

Answer: The NAT Instance is a bottleneck, since there is only a single NAT Instance. For high availability, we have to launch NAT Instances in multiple Availability Zones and make them part of an Auto Scaling Group.


Question 19
A business enterprise operates an API that currently receives 1000 requests per second. The company wants to optimize costs while hosting it in AWS. Which solution would satisfy this requirement?

Answer: We have to use API Gateway alongside AWS Lambda to satisfy the above requirement.

Question 20
A database application has many resource-intensive reads and writes, and this database needs to be hosted. Select the most suitable option from the list below.
1. EBS Cold Storage
2. EBS SSD
3. EBS Throughput Optimized
4. EBS Provisioned IOPS

Answer: According to the AWS Documentation, EBS Provisioned IOPS is the most suitable option for the above situation.

Question 21
An application sends images, along with metadata, to Amazon S3. We must save this metadata in persistent storage and also index it. Which storage mechanism would you use for the metadata?

Answer: According to the AWS Documentation, the team should choose the Amazon DynamoDB service.

Question 22
EC2 Instances host an application whose promotional campaign is set to start in 14 days. Management mandates that traffic growth should not lead to performance problems. How should the Auto Scaling Group be configured to satisfy this condition?

Answer: According to the AWS Documentation, we have to configure Dynamic Scaling and use a Target Tracking scaling policy.
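A target tracking policy keeps a chosen metric near a set value. The sketch below tracks average CPU utilization at 50%; the target value and group name are illustrative assumptions:

```python
# Parameters for a target tracking scaling policy, in the shape that boto3's
# put_scaling_policy call accepts. Names and the 50% target are assumed.
policy = {
    "AutoScalingGroupName": "web-asg",           # hypothetical group name
    "PolicyName": "cpu-target-tracking",
    "PolicyType": "TargetTrackingScaling",
    "TargetTrackingConfiguration": {
        "PredefinedMetricSpecification": {
            "PredefinedMetricType": "ASGAverageCPUUtilization"
        },
        "TargetValue": 50.0,                     # keep average CPU near 50%
    },
}

# Applied with boto3 via: boto3.client("autoscaling").put_scaling_policy(**policy)
```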

Question 23
A company is using EBS snapshots to back up their EBS volumes. For business continuity, the snapshots need to be made accessible in a different region. How would you achieve this?

Answer: According to the AWS Documentation, we have to create a snapshot and copy it to the desired region.

Question 24
A business enterprise is hosting an application in AWS, consisting of EC2 Instances behind an ELB. The administration requires the following:
Notifications must be dispatched when read requests exceed 1000 requests per minute.
Notifications must be dispatched when latency exceeds 10 seconds.
Any API activity that calls for sensitive data must be monitored.
What would you do in this scenario?

Answer: According to the AWS Documentation, we need to use CloudTrail to monitor the API activity. We have to use CloudWatch metrics to monitor the metrics in line with the administrative requirements, and create CloudWatch alarms that dispatch notifications when the predetermined metric thresholds are reached.
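The latency requirement can be sketched as a CloudWatch alarm that fires when ELB latency exceeds 10 seconds and publishes to an SNS topic. The load balancer name and topic ARN are hypothetical; the request-count condition would be a second alarm on the `RequestCount` metric:

```python
# Parameters for a CloudWatch alarm, in the shape that boto3's
# put_metric_alarm call accepts. Names and ARNs here are illustrative.
alarm_params = {
    "AlarmName": "elb-high-latency",
    "Namespace": "AWS/ELB",                     # Classic ELB metric namespace
    "MetricName": "Latency",
    "Dimensions": [{"Name": "LoadBalancerName", "Value": "web-elb"}],  # hypothetical
    "Statistic": "Average",
    "Period": 60,                               # evaluate over 1-minute windows
    "EvaluationPeriods": 1,
    "Threshold": 10.0,                          # seconds
    "ComparisonOperator": "GreaterThanThreshold",
    "AlarmActions": ["arn:aws:sns:us-east-1:123456789012:ops-alerts"],  # hypothetical
}

# Created with boto3 via: boto3.client("cloudwatch").put_metric_alarm(**alarm_params)
```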


Question 25
A business enterprise hosts resources in their AWS account. The requirement is to audit API activity for all regions and ensure that the audit also applies to future regions. What would you do in this scenario?

Answer: According to the AWS Documentation, we have to enable one CloudTrail trail for all regions.

Question 26
The requirement is for an iSCSI device, and the legacy application also requires local storage. What do you do in this scenario?

Answer: According to the AWS Documentation, we have to configure a Storage Gateway stored volume.


December 17, 2020