Lead4Pass SAP-C02 dumps | Best AWS Certified Solutions Architect – Professional exam material

SAP-C02 exam

The AWS Certified Solutions Architect – Professional exam (SAP-C02) replaces SAP-C01, so you should choose the latest exam for certification!

Use the best AWS Certified Solutions Architect – Professional exam material: https://www.leads4pass.com/sap-c02.html (SAP-C02 dumps)
It contains 421 of the latest exam questions and answers to help you pass the exam.

The latest SAP-C02 exam questions and answers are shared online

From: Lead4Pass
Number of exam questions: 15
Exam name: AWS Certified Solutions Architect – Professional
Exam code: SAP-C02
Related: SAP-C01 dumps
Question 1:

A company's AWS architecture currently uses access keys and secret access keys stored on each instance to access AWS services. Database credentials are hard-coded on each instance. SSH keys for command-line remote access are stored in a secured Amazon S3 bucket. The company has asked its solutions architect to improve the security posture of the architecture without adding operational complexity.

Which combination of steps should the solutions architect take to accomplish this? (Select THREE.)

A. Use Amazon EC2 instance profiles with an IAM role.

B. Use AWS Secrets Manager to store access keys and secret access keys.

C. Use AWS Systems Manager Parameter Store to store database credentials.

D. Use a secure fleet of Amazon EC2 bastion hosts for remote access.

E. Use AWS KMS to store database credentials.

F. Use AWS Systems Manager Session Manager for remote access.

Correct Answer: ACF

Explanation: https://docs.aws.amazon.com/systems-manager/latest/userguide/session-manager.html
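To illustrate options A and C together, here is a minimal Python (boto3) sketch; the Region and parameter name are hypothetical placeholders. With an instance profile attached, boto3 resolves temporary credentials automatically, so no access keys live on the instance, and the database password is read from a SecureString parameter at startup:

import boto3

# Option A: no hard-coded keys -- the attached instance profile supplies
# temporary credentials to boto3 automatically.
ssm = boto3.client("ssm", region_name="us-east-1")

# Option C: fetch the database password from an encrypted SecureString
# parameter ("/app/db/password" is a hypothetical name).
response = ssm.get_parameter(Name="/app/db/password", WithDecryption=True)
db_password = response["Parameter"]["Value"]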

Question 2:

A solutions architect is working with a company that is extremely sensitive to its IT costs and wishes to implement controls that will result in a predictable AWS spend each month. Which combination of steps can help the company control and monitor its monthly AWS usage to achieve a cost that is as close as possible to the target amount? (Select THREE.)

A. Implement an IAM policy that requires users to specify a 'workload' tag for cost allocation when launching Amazon EC2 instances.

B. Contact AWS Support and ask that they apply limits to the account so that users are not able to launch more than a certain number of instance types.

C. Purchase all upfront Reserved Instances that cover 100% of the account's expected Amazon EC2 usage.

D. Place conditions in the users' IAM policies that limit the number of instances they are able to launch.

E. Define 'workload' as a cost allocation tag in the AWS Billing and Cost Management console.

F. Set up AWS Budgets to alert and notify when a given workload is expected to exceed a defined cost

Correct Answer: AEF
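A minimal Python (boto3) sketch of option F (the account ID, budget name, amount, and email address are hypothetical placeholders):

import boto3

budgets = boto3.client("budgets")

# A monthly cost budget that notifies subscribers when spend is forecast
# to exceed the target amount.
budgets.create_budget(
    AccountId="111122223333",
    Budget={
        "BudgetName": "workload-monthly-budget",
        "BudgetLimit": {"Amount": "1000", "Unit": "USD"},
        "TimeUnit": "MONTHLY",
        "BudgetType": "COST",
    },
    NotificationsWithSubscribers=[
        {
            "Notification": {
                "NotificationType": "FORECASTED",
                "ComparisonOperator": "GREATER_THAN",
                "Threshold": 100.0,
                "ThresholdType": "PERCENTAGE",
            },
            "Subscribers": [
                {"SubscriptionType": "EMAIL", "Address": "finops@example.com"}
            ],
        }
    ],
)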

Question 3:

A company has migrated its forms-processing application to AWS. When users interact with the application, they upload scanned forms as files through a web application. A database stores user metadata and references to files that are stored in Amazon S3. The web application runs on Amazon EC2 instances and uses an Amazon RDS for PostgreSQL database.

When forms are uploaded, the application sends notifications to a team through Amazon Simple Notification Service (Amazon SNS). A team member then logs in and processes each form. The team member performs data validation on the form and extracts relevant data before entering the information into another system that uses an API. A solutions architect needs to automate the manual processing of the forms. The solution must provide accurate form extraction, minimize time to market, and minimize long-term operational overhead.

Which solution will meet these requirements?

A. Develop custom libraries to perform optical character recognition (OCR) on the forms. Deploy the libraries to an Amazon Elastic Kubernetes Service (Amazon EKS) cluster as an application tier. Use this tier to process the forms when forms are uploaded. Store the output in Amazon S3. Parse this output by extracting the data into an Amazon DynamoDB table. Submit the data to the target system's API. Host the new application tier on EC2 instances.

B. Extend the system with an application tier that uses AWS Step Functions and AWS Lambda. Configure this tier to use artificial intelligence and machine learning (AI/ML) models that are trained and hosted on an EC2 instance to perform optical character recognition (OCR) on the forms when forms are uploaded. Store the output in Amazon S3. Parse this output by extracting the data that is required within the application tier. Submit the data to the target system's API.

C. Host a new application tier on EC2 instances. Use this tier to call endpoints that host artificial intelligence and machine learning (AI/ML) models that are trained and hosted in Amazon SageMaker to perform optical character recognition (OCR) on the forms. Store the output in Amazon ElastiCache. Parse this output by extracting the data that is required within the application tier. Submit the data to the target system's API.

D. Extend the system with an application tier that uses AWS Step Functions and AWS Lambda. Configure this tier to use Amazon Textract and Amazon Comprehend to perform optical character recognition (OCR) on the forms when forms are uploaded. Store the output in Amazon S3. Parse this output by extracting the data that is required within the application tier. Submit the data to the target system's API.

Correct Answer: D

Explanation: Amazon Textract is a managed service that is purpose-built for extracting text and structured data from scanned forms, so it provides accurate form extraction without building or training custom OCR models. Orchestrating Textract and Amazon Comprehend with AWS Step Functions and AWS Lambda is serverless, which minimizes both time to market and long-term operational overhead.

Building custom OCR libraries on Amazon EKS (option A) or training and hosting AI/ML models on EC2 or SageMaker (options B and C) adds development time and infrastructure to manage, which conflicts with the stated requirements.

References: Amazon Textract: https://aws.amazon.com/textract/ Amazon Comprehend: https://aws.amazon.com/comprehend/ AWS Step Functions: https://aws.amazon.com/step-functions/
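A minimal Python (boto3) sketch of the Textract call at the heart of option D (the bucket and object names are hypothetical placeholders):

import boto3

textract = boto3.client("textract")

# Extract key-value pairs from an uploaded form image stored in S3.
response = textract.analyze_document(
    Document={"S3Object": {"Bucket": "forms-bucket", "Name": "scan-001.png"}},
    FeatureTypes=["FORMS"],
)

# KEY_VALUE_SET blocks hold the detected form fields.
kv_blocks = [b for b in response["Blocks"] if b["BlockType"] == "KEY_VALUE_SET"]
print(f"Detected {len(kv_blocks)} key/value blocks")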

Question 4:

A company gives users the ability to upload images from a custom application. The upload process invokes an AWS Lambda function that processes and stores the image in an Amazon S3 bucket. The application invokes the Lambda function by using a specific function version ARN.

The Lambda function accepts image processing parameters by using environment variables. The company often adjusts the environment variables of the Lambda function to achieve optimal image processing output.

The company tests different parameters and publishes a new function version with the updated environment variables after validating the results. This update process also requires frequent changes to the custom application to invoke the new function version ARN. These changes cause interruptions for users.

A solutions architect needs to simplify this process to minimize disruption to users.

Which solution will meet these requirements with the LEAST operational overhead?

A. Directly modify the environment variables of the published Lambda function version. Use the $LATEST version to test image processing parameters.

B. Create an Amazon DynamoDB table to store the image processing parameters. Modify the Lambda function to retrieve the image processing parameters from the DynamoDB table.

C. Directly code the image processing parameters within the Lambda function and remove the environment variables. Publish a new function version when the company updates the parameters.

D. Create a Lambda function alias. Modify the client application to use the function alias ARN. Reconfigure the Lambda alias to point to new versions of the function when the company finishes testing.

Correct Answer: D

Explanation: A Lambda function alias allows you to point to a specific version of a function and also can be updated to point to a new version of the function without modifying the client application.

This way, the company can test different versions of the function with different environment variables and, once the optimal parameters are found, update the alias to point to the new version, without the need to update the client application.

By using this approach, the company can simplify the process of updating the environment variables, minimize disruption to users, and reduce operational overhead.

Reference:

AWS Lambda documentation: https://aws.amazon.com/lambda/

AWS Lambda Aliases documentation: https://docs.aws.amazon.com/lambda/latest/dg/aliases-intro.html

AWS Lambda versioning and aliases documentation: https://aws.amazon.com/blogs/compute/versioning-aliases-in-aws-lambda/
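A minimal Python (boto3) sketch of the alias workflow (the function name, alias name, and version numbers are hypothetical placeholders):

import boto3

lam = boto3.client("lambda")

# One-time setup: point a "live" alias at the currently validated version.
lam.create_alias(
    FunctionName="image-processor",
    Name="live",
    FunctionVersion="3",
)

# After testing a new version, promote it without touching the client app,
# which keeps invoking the stable alias ARN (...:function:image-processor:live).
lam.update_alias(
    FunctionName="image-processor",
    Name="live",
    FunctionVersion="4",
)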

Question 5:

A data analytics company has an Amazon Redshift cluster that consists of several reserved nodes. The cluster is experiencing unexpected bursts of usage because a team of employees is compiling a deep audit analysis report. The queries to generate the report are complex read queries and are CPU intensive.

Business requirements dictate that the cluster must be able to service read and write queries at all times. A solutions architect must devise a solution that accommodates the bursts of usage.

Which solution meets these requirements MOST cost-effectively?

A. Provision an Amazon EMR cluster. Offload the complex data processing tasks.

B. Deploy an AWS Lambda function to add capacity to the Amazon Redshift cluster by using a classic resize operation when the cluster's CPU metrics in Amazon CloudWatch reach 80%.

C. Deploy an AWS Lambda function to add capacity to the Amazon Redshift cluster by using an elastic resize operation when the cluster's CPU metrics in Amazon CloudWatch reach 80%.

D. Turn on the Concurrency Scaling feature for the Amazon Redshift cluster.

Correct Answer: D
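Concurrency Scaling (option D) adds transient cluster capacity automatically during read bursts and requires no resize or new cluster, which keeps costs low on a reserved-node cluster. A minimal Python (boto3) sketch of enabling it through the cluster's parameter group (the group name and scaling limit are hypothetical placeholders):

import json
import boto3

redshift = boto3.client("redshift")

# Set the WLM queue's concurrency_scaling mode to "auto" so bursts of
# read queries are routed to transient scaling clusters.
wlm_config = [{"query_group": [], "user_group": [], "concurrency_scaling": "auto"}]

redshift.modify_cluster_parameter_group(
    ParameterGroupName="report-cluster-params",
    Parameters=[
        {
            "ParameterName": "wlm_json_configuration",
            "ParameterValue": json.dumps(wlm_config),
        },
        # Cap how many transient scaling clusters can be added.
        {"ParameterName": "max_concurrency_scaling_clusters", "ParameterValue": "4"},
    ],
)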

Question 6:

A company that has multiple AWS accounts is using AWS Organizations. The company's AWS accounts host VPCs, Amazon EC2 instances, and containers.

The company's compliance team has deployed a security tool in each VPC where the company has deployments. The security tools run on EC2 instances and send information to the AWS account that is dedicated to the compliance team. The company has tagged all the compliance-related resources with a key of “cost center” and a value of “compliance”.

The company wants to identify the cost of the security tools that are running on the EC2 instances so that the company can charge the compliance team's AWS account. The cost calculation must be as accurate as possible.

What should a solutions architect do to meet these requirements?

A. In the management account of the organization, activate the cost center user-defined tag. Configure monthly AWS Cost and Usage Reports to save to an Amazon S3 bucket in the management account. Use the tag breakdown in the report to obtain the total cost for the cost center tagged resources.

B. In the member accounts of the organization, activate the cost center user-defined tag. Configure monthly AWS Cost and Usage Reports to save to an Amazon S3 bucket in the management account. Schedule a monthly AWS Lambda function to retrieve the reports and calculate the total cost for the cost center-tagged resources.

C. In the member accounts of the organization activate the cost center user-defined tag. From the management account, schedule a monthly AWS Cost and Usage Report. Use the tag breakdown in the report to calculate the total cost for the cost center-tagged resources.

D. Create a custom report in the organization view in AWS Trusted Advisor. Configure the report to generate a monthly billing summary for the cost center-tagged resources in the compliance team's AWS account.

Correct Answer: A

Explanation: https://docs.aws.amazon.com/awsaccountbilling/latest/aboutv2/custom-tags.html https://docs.aws.amazon.com/awsaccountbilling/latest/aboutv2/configurecostallocreport.html
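Once the “cost center” tag is activated as a cost allocation tag in the management account, the tagged spend can also be pulled programmatically. A minimal Python (boto3) sketch using the Cost Explorer API (the dates are hypothetical placeholders; the tag key and value come from the question):

import boto3

ce = boto3.client("ce")

# Query one month of cost for resources tagged "cost center" = "compliance".
response = ce.get_cost_and_usage(
    TimePeriod={"Start": "2024-01-01", "End": "2024-02-01"},
    Granularity="MONTHLY",
    Metrics=["UnblendedCost"],
    Filter={"Tags": {"Key": "cost center", "Values": ["compliance"]}},
)

for result in response["ResultsByTime"]:
    print(result["TimePeriod"], result["Total"]["UnblendedCost"]["Amount"])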

Question 7:

A company is running a traditional web application on Amazon EC2 instances. The company needs to refactor the application as microservices that run on containers. Separate versions of the application exist in two distinct environments: production and testing. The load for the application is variable, but the minimum load and the maximum load are known. A solutions architect needs to design the updated application with a serverless architecture that minimizes operational complexity.

Which solution will meet these requirements MOST cost-effectively?

A. Upload the container images to AWS Lambda as functions. Configure a concurrency limit for the associated Lambda functions to handle the expected peak load. Configure two separate Lambda integrations within Amazon API Gateway: one for production and one for testing.

B. Upload the container images to Amazon Elastic Container Registry (Amazon ECR). Configure two auto-scaled Amazon Elastic Container Service (Amazon ECS) clusters with the Fargate launch type to handle the expected load. Deploy tasks from the ECR images. Configure two separate Application Load Balancers to direct traffic to the ECS clusters.

C. Upload the container images to Amazon Elastic Container Registry (Amazon ECR). Configure two auto-scaled Amazon Elastic Kubernetes Service (Amazon EKS) clusters with the Fargate launch type to handle the expected load. Deploy tasks from the ECR images. Configure two separate Application Load Balancers to direct traffic to the EKS clusters.

D. Upload the container images to AWS Elastic Beanstalk. In Elastic Beanstalk, create separate environments and deployments for production and testing. Configure two separate Application Load Balancers to direct traffic to the Elastic Beanstalk deployments.

Correct Answer: B

Explanation: The requirement is a serverless architecture for containerized microservices. Amazon ECS with the Fargate launch type runs containers without any servers to manage and, for this load profile, is more cost-effective than Amazon EKS, which adds a per-cluster management fee. Option A misuses Lambda concurrency limits as a capacity control, and option D (Elastic Beanstalk) provisions and manages EC2 instances, so it is not serverless.
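A minimal Python (boto3) sketch of the auto scaling portion of option B, bounding the service between the known minimum and maximum load (the cluster and service names and the capacity values are hypothetical placeholders):

import boto3

autoscaling = boto3.client("application-autoscaling")

# Register the ECS service's desired count as a scalable target.
autoscaling.register_scalable_target(
    ServiceNamespace="ecs",
    ResourceId="service/prod-cluster/web-service",
    ScalableDimension="ecs:service:DesiredCount",
    MinCapacity=2,
    MaxCapacity=10,
)

# Track average CPU at 60% so the service scales with the variable load.
autoscaling.put_scaling_policy(
    PolicyName="cpu-target-tracking",
    ServiceNamespace="ecs",
    ResourceId="service/prod-cluster/web-service",
    ScalableDimension="ecs:service:DesiredCount",
    PolicyType="TargetTrackingScaling",
    TargetTrackingScalingPolicyConfiguration={
        "TargetValue": 60.0,
        "PredefinedMetricSpecification": {
            "PredefinedMetricType": "ECSServiceAverageCPUUtilization"
        },
    },
)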

Question 8:

A delivery company needs to migrate its third-party route planning application to AWS. The third party supplies a supported Docker image from a public registry. The image can run in as many containers as required to generate the route map.

The company has divided the delivery area into sections with supply hubs so that delivery drivers travel the shortest distance possible from the hubs to the customers. To reduce the time necessary to generate route maps, each section uses its own set of Docker containers with a custom configuration that processes orders only in the section's area. The company needs the ability to allocate resources cost-effectively based on the number of running containers.

Which solution will meet these requirements with the LEAST operational overhead?

A. Create an Amazon Elastic Kubernetes Service (Amazon EKS) cluster on Amazon EC2. Use the Amazon EKS CLI to launch the planning application in pods by using the --tags option to assign a custom tag to the pod.

B. Create an Amazon Elastic Kubernetes Service (Amazon EKS) cluster on AWS Fargate. Use the Amazon EKS CLI to launch the planning application. Use the AWS CLI tag-resource API call to assign a custom tag to the pod.

C. Create an Amazon Elastic Container Service (Amazon ECS) cluster on Amazon EC2. Use the AWS CLI with run-tasks set to true to launch the planning application by using the --tags option to assign a custom tag to the task.

D. Create an Amazon Elastic Container Service (Amazon ECS) cluster on AWS Fargate. Use the AWS CLI run-task command and set enableECSManagedTags to true to launch the planning application. Use the --tags option to assign a custom tag to the task.

Correct Answer: D

Explanation: Amazon Elastic Container Service (ECS) on AWS Fargate is a fully managed service that allows you to run containers without having to manage the underlying infrastructure. When you launch tasks on Fargate, resources are automatically allocated based on the number of tasks running, which reduces the operational overhead.

Using ECS on Fargate allows you to assign custom tags to tasks using the --tags option in the run-task command, as described in the documentation: https://docs.aws.amazon.com/cli/latest/reference/ecs/run-task.html

You can also set enableECSManagedTags to true, which allows the service to automatically add the cluster name and service name as tags: https://docs.aws.amazon.com/AmazonECS/latest/developerguide/task-placement-constraints.html#tag-based-scheduling
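As a minimal Python (boto3) sketch of option D (the cluster, task definition, subnet, and tag value are hypothetical placeholders):

import boto3

ecs = boto3.client("ecs")

# Launch a section's containers as Fargate tasks with a custom tag,
# so costs can be allocated per section.
ecs.run_task(
    cluster="route-planning",
    taskDefinition="route-planner:1",
    launchType="FARGATE",
    count=3,
    enableECSManagedTags=True,
    tags=[{"key": "section", "value": "north-hub"}],
    networkConfiguration={
        "awsvpcConfiguration": {
            "subnets": ["subnet-0abc1234"],
            "assignPublicIp": "ENABLED",
        }
    },
)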

Question 9:

A retail company runs a business-critical web service on an Amazon Elastic Container Service (Amazon ECS) cluster that runs on Amazon EC2 instances. The web service receives POST requests from end users and writes data to a MySQL database that runs on a separate EC2 instance. The company needs to ensure that data loss does not occur.

The current code deployment process includes manual updates of the ECS service. During a recent deployment, end users encountered intermittent 502 Bad Gateway errors in response to valid web requests.

The company wants to implement a reliable solution to prevent this issue from recurring. The company also wants to automate code deployments. The solution must be highly available and must optimize cost-effectiveness.

Which combination of steps will meet these requirements? (Select THREE.)

A. Run the web service on an ECS cluster that has a Fargate launch type. Use AWS CodePipeline and AWS CodeDeploy to perform a blue/green deployment with validation testing to update the ECS service.

B. Migrate the MySQL database to run on an Amazon RDS for MySQL Multi-AZ DB instance that uses Provisioned IOPS SSD (io2) storage.

C. Configure an Amazon Simple Queue Service (Amazon SQS) queue as an event source to receive the POST requests from the web service. Configure an AWS Lambda function to poll the queue. Write the data to the database.

D. Run the web service on an ECS cluster that has a Fargate launch type. Use AWS CodePipeline and AWS CodeDeploy to perform a canary deployment to update the ECS service.

Correct Answer: ABC

Explanation: A blue/green deployment with validation testing (option A) automates releases and prevents the 502 errors from recurring, a Multi-AZ RDS for MySQL DB instance (option B) makes the database highly available, and buffering POST data in an SQS queue (option C) ensures no data is lost. A canary deployment (option D) would still expose a share of users to an unvalidated release.
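A minimal Python (boto3) sketch of the SQS decoupling in option C (the queue and function names are hypothetical placeholders): the queue buffers POST payloads so nothing is lost during deployments or database outages, and the Lambda service polls the queue on the function's behalf:

import boto3

sqs = boto3.client("sqs")
lam = boto3.client("lambda")

# Create the buffer queue and look up its ARN.
queue_url = sqs.create_queue(QueueName="orders-queue")["QueueUrl"]
queue_arn = sqs.get_queue_attributes(
    QueueUrl=queue_url, AttributeNames=["QueueArn"]
)["Attributes"]["QueueArn"]

# The Lambda service polls the queue and invokes the writer function in batches.
lam.create_event_source_mapping(
    EventSourceArn=queue_arn,
    FunctionName="db-writer",
    BatchSize=10,
)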

Question 10:

A company wants to use Amazon WorkSpaces in combination with thin client devices to replace aging desktops. Employees use the desktops to access applications that work with clinical trial data. Corporate security policy states that access to the applications must be restricted to only company branch office locations. The company is considering adding an additional branch office in the next 6 months.

Which solution meets these requirements with the MOST operational efficiency?

A. Create an IP access control group rule with the list of public addresses from the branch offices. Associate the IP access control group with the WorkSpaces directory.

B. Use AWS Firewall Manager to create a web ACL rule with an IPSet with the list of public addresses from the branch office locations. Associate the web ACL with the WorkSpaces directory.

C. Use AWS Certificate Manager (ACM) to issue trusted device certificates to the machines deployed in the branch office locations. Enable restricted access on the WorkSpaces directory.

D. Create a custom WorkSpaces image with a Windows Firewall configured to restrict access to the public addresses of the branch offices. Use the image to deploy the WorkSpaces.

Correct Answer: A

Explanation: WorkSpaces IP access control groups are the built-in way to restrict access to a list of trusted public IP ranges. Associating a group that contains the branch offices' public addresses with the WorkSpaces directory meets the policy, and the new branch office only requires adding another rule to the group. https://docs.aws.amazon.com/workspaces/latest/adminguide/amazon-workspaces-ip-access-control-groups.html
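A minimal Python (boto3) sketch of option A (the CIDR ranges and directory ID are hypothetical placeholders; supporting the future branch office is just one more rule):

import boto3

workspaces = boto3.client("workspaces")

# Allow access only from the branch offices' public CIDR ranges.
group = workspaces.create_ip_group(
    GroupName="branch-offices",
    GroupDesc="Public IPs of company branch offices",
    UserRules=[
        {"ipRule": "203.0.113.0/24", "ruleDesc": "Branch office 1"},
        {"ipRule": "198.51.100.0/24", "ruleDesc": "Branch office 2"},
    ],
)

# Attach the IP access control group to the WorkSpaces directory.
workspaces.associate_ip_groups(
    DirectoryId="d-1234567890",
    GroupIds=[group["GroupId"]],
)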

Question 11:

A company is building an electronic document management system in which users upload their documents. The application stack is entirely serverless and runs on AWS in the eu-central-1 Region.

The system includes a web application that uses an Amazon CloudFront distribution for delivery with Amazon S3 as the origin. The web application communicates with Amazon API Gateway Regional endpoints.

The API Gateway APIs call AWS Lambda functions that store metadata in an Amazon Aurora Serverless database and put the documents into an S3 bucket.

The company is growing steadily and has completed a proof of concept with its largest customer. The company must improve latency outside of Europe.

Which combination of actions will meet these requirements? (Select TWO.)

A. Enable S3 Transfer Acceleration on the S3 bucket. Ensure that the web application uses the Transfer Acceleration signed URLs.

B. Create an accelerator in AWS Global Accelerator. Attach the accelerator to the CloudFront distribution.

C. Change the API Gateway Regional endpoints to edge-optimized endpoints.

D. Provision the entire stack in two other locations that are spread across the world. Use global databases on the Aurora Serverless cluster.

E. Add an Amazon RDS proxy between the Lambda functions and the Aurora Serverless database.

Correct Answer: AC

Explanation: S3 Transfer Acceleration speeds up uploads from users outside Europe, and edge-optimized API Gateway endpoints route API calls through the nearest CloudFront edge location. AWS Global Accelerator cannot be attached to a CloudFront distribution: https://aws.amazon.com/global-accelerator/faqs/
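A minimal Python (boto3) sketch of option A (the bucket name and object key are hypothetical placeholders):

import boto3
from botocore.config import Config

s3 = boto3.client("s3")

# Enable Transfer Acceleration on the upload bucket.
s3.put_bucket_accelerate_configuration(
    Bucket="document-uploads",
    AccelerateConfiguration={"Status": "Enabled"},
)

# Generate upload URLs through the accelerate endpoint so distant users
# upload via the nearest edge location.
s3_accel = boto3.client("s3", config=Config(s3={"use_accelerate_endpoint": True}))
url = s3_accel.generate_presigned_url(
    "put_object",
    Params={"Bucket": "document-uploads", "Key": "scan.pdf"},
    ExpiresIn=3600,
)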

Question 12:

A company has an asynchronous HTTP application that is hosted as an AWS Lambda function. A public Amazon API Gateway endpoint invokes the Lambda function. The Lambda function and the API Gateway endpoint reside in the us-east-1 Region. A solutions architect needs to redesign the application to support failover to another AWS Region.

Which solution will meet these requirements?

A. Create an API Gateway endpoint in the us-west-2 Region to direct traffic to the Lambda function in us-east-1. Configure Amazon Route 53 to use a failover routing policy to route traffic for the two API Gateway endpoints.

B. Create an Amazon Simple Queue Service (Amazon SQS) queue. Configure API Gateway to direct traffic to the SQS queue instead of to the Lambda function. Configure the Lambda function to pull messages from the queue for processing.

C. Deploy the Lambda function to the us-west-2 Region. Create an API Gateway endpoint in us-west-2 to direct traffic to the Lambda function in us-west-2. Configure AWS Global Accelerator and an Application Load Balancer to manage traffic across the two API Gateway endpoints.

D. Deploy the Lambda function and an API Gateway endpoint to the us-west-2 Region. Configure Amazon Route 53 to use a failover routing policy to route traffic for the two API Gateway endpoints.

Correct Answer: D

Explanation: This solution allows for deploying the Lambda function and API Gateway endpoint to another region, providing a failover option in case of any issues in the primary region.

Using Route 53's failover routing policy allows for the automatic routing of traffic to the healthy endpoint, ensuring that the application is available even in case of issues in one region. This solution provides a cost-effective and simple way to implement failover while minimizing operational overhead.
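A minimal Python (boto3) sketch of the failover records in option D (the hosted zone ID, health check ID, domain, and API Gateway hostnames are hypothetical placeholders):

import boto3

route53 = boto3.client("route53")

# PRIMARY record (us-east-1) is served while its health check passes;
# otherwise Route 53 answers with the SECONDARY record (us-west-2).
route53.change_resource_record_sets(
    HostedZoneId="Z0123456789ABC",
    ChangeBatch={
        "Changes": [
            {
                "Action": "UPSERT",
                "ResourceRecordSet": {
                    "Name": "api.example.com",
                    "Type": "CNAME",
                    "TTL": 60,
                    "SetIdentifier": "primary-us-east-1",
                    "Failover": "PRIMARY",
                    "HealthCheckId": "11111111-2222-3333-4444-555555555555",
                    "ResourceRecords": [
                        {"Value": "abc123.execute-api.us-east-1.amazonaws.com"}
                    ],
                },
            },
            {
                "Action": "UPSERT",
                "ResourceRecordSet": {
                    "Name": "api.example.com",
                    "Type": "CNAME",
                    "TTL": 60,
                    "SetIdentifier": "secondary-us-west-2",
                    "Failover": "SECONDARY",
                    "ResourceRecords": [
                        {"Value": "def456.execute-api.us-west-2.amazonaws.com"}
                    ],
                },
            },
        ]
    },
)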

Question 13:

A group of research institutions and hospitals are in a partnership to study 2 PB of genomic data. The institute that owns the data stores it in an Amazon S3 bucket and updates it regularly.

The institute would like to give all of the organizations in the partnership read access to the data. All members of the partnership are extremely cost-conscious, and the institute that owns the account with the S3 bucket is concerned about covering the costs for requests and data transfers from Amazon S3.

Which solution allows for secure data sharing without causing the institute that owns the bucket to assume all the costs for S3 requests and data transfers?

A. Ensure that all organizations in the partnership have AWS accounts. In the account with the S3 bucket, create a cross-account role for each account in the partnership that allows read access to the data. Have the organizations assume and use that read role when accessing the data.

B. Ensure that all organizations in the partnership have AWS accounts. Create a bucket policy on the bucket that owns the data. The policy should allow the accounts in the partnership read access to the bucket. Enable Requester Pays on the bucket. Have the organizations use their AWS credentials when accessing the data.

C. Ensure that all organizations in the partnership have AWS accounts. Configure buckets in each of the accounts with a bucket policy that allows the institute that owns the data the ability to write to the bucket. Periodically sync the data from the institute's account to the other organizations. Have the organizations use their AWS credentials when accessing the data using their accounts.

D. Ensure that all organizations in the partnership have AWS accounts. In the account with the S3 bucket, create a cross-account role for each account in the partnership that allows read access to the data. Enable Requester Pays on the bucket. Have the organizations assume and use that read role when accessing the data.

Correct Answer: B

Explanation: In general, bucket owners pay for all Amazon S3 storage and data transfer costs associated with their bucket. A bucket owner, however, can configure a bucket to be a Requester Pays bucket.

With Requester Pays buckets, the requester instead of the bucket owner pays the cost of the request and the data download from the bucket. The bucket owner always pays the cost of storing data.

If you enable Requester Pays on a bucket, anonymous access to that bucket is not allowed. https://docs.aws.amazon.com/AmazonS3/latest/userguide/RequesterPaysExamples.html
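A minimal Python (boto3) sketch of option B (the bucket and key are hypothetical placeholders): the owner enables Requester Pays once, and each partner acknowledges the charge on its requests:

import boto3

s3 = boto3.client("s3")

# Bucket owner: enable Requester Pays so requesters cover request and
# transfer costs (the owner still pays for storage).
s3.put_bucket_request_payment(
    Bucket="genomics-data",
    RequestPaymentConfiguration={"Payer": "Requester"},
)

# Partner account: read the data with its own credentials and explicitly
# acknowledge that it pays for the request and the download.
obj = s3.get_object(
    Bucket="genomics-data",
    Key="cohort-1/sample.vcf",
    RequestPayer="requester",
)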

Question 14:

A large company runs workloads in VPCs that are deployed across hundreds of AWS accounts. Each VPC consists of public subnets and private subnets that span multiple Availability Zones. NAT gateways are deployed in the public subnets and allow outbound connectivity to the internet from the private subnets.

A solutions architect is working on a hub-and-spoke design. All private subnets in the spoke VPCs must route traffic to the internet through an egress VPC. The solutions architect already has deployed a NAT gateway in an egress VPC in a central AWS account.

Which set of additional steps should the solutions architect take to meet these requirements?

A. Create peering connections between the egress VPC and the spoke VPCs. Configure the required routing to allow access to the internet.

B. Create a transit gateway and share it with the existing AWS accounts. Attach existing VPCs to the transit gateway. Configure the required routing to allow access to the internet.

C. Create a transit gateway in every account. Attach the NAT gateway to the transit gateways. Configure the required routing to allow access to the internet.

D. Create an AWS PrivateLink connection between the egress VPC and the spoke VPCs. Configure the required routing to allow access to the internet.

Correct Answer: B
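Option B centralizes egress through one shared transit gateway. As a minimal Python (boto3) sketch (all IDs, ARNs, and names below are hypothetical placeholders), the central account creates the transit gateway, shares it through AWS RAM, attaches the egress VPC, and each spoke VPC default-routes to it:

import boto3

ec2 = boto3.client("ec2")
ram = boto3.client("ram")

# Create the transit gateway in the central (egress) account.
tgw = ec2.create_transit_gateway(Description="hub-and-spoke egress")
tgw_id = tgw["TransitGateway"]["TransitGatewayId"]
tgw_arn = tgw["TransitGateway"]["TransitGatewayArn"]

# Share it with the organization through AWS RAM so spoke accounts can attach.
ram.create_resource_share(
    name="tgw-share",
    resourceArns=[tgw_arn],
    principals=["arn:aws:organizations::111122223333:organization/o-example"],
)

# Attach the egress VPC; spoke accounts attach their own VPCs the same way.
ec2.create_transit_gateway_vpc_attachment(
    TransitGatewayId=tgw_id,
    VpcId="vpc-0egress123",
    SubnetIds=["subnet-0abc1234"],
)

# In each spoke VPC's private route tables, default-route to the transit gateway.
ec2.create_route(
    RouteTableId="rtb-0spoke123",
    DestinationCidrBlock="0.0.0.0/0",
    TransitGatewayId=tgw_id,
)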

Question 15:

A company has more than 10,000 sensors that send data to an on-premises Apache Kafka server by using the Message Queuing Telemetry Transport (MQTT) protocol. The on-premises Kafka server transforms the data and then stores the results as objects in an Amazon S3 bucket. Recently, the Kafka server crashed.

The company lost sensor data while the server was being restored. A solutions architect must create a new design on AWS that is highly available and scalable to prevent a similar occurrence.

Which solution will meet these requirements?

A. Launch two Amazon EC2 instances to host the Kafka server in an active/standby configuration across two Availability Zones. Create a domain name in Amazon Route 53. Create a Route 53 failover policy. Route the sensors to send the data to the domain name.

B. Migrate the on-premises Kafka server to Amazon Managed Streaming for Apache Kafka (Amazon MSK). Create a Network Load Balancer (NLB) that points to the Amazon MSK broker. Enable NLB health checks. Route the sensors to send the data to the NLB.

C. Deploy AWS IoT Core, and connect it to an Amazon Kinesis Data Firehose delivery stream. Use an AWS Lambda function to handle data transformation. Route the sensors to send the data to AWS IoT Core.

D. Deploy AWS IoT Core, and launch an Amazon EC2 instance to host the Kafka server. Configure AWS IoT Core to send the data to the EC2 instance. Route the sensors to send the data to AWS IoT Core.

Correct Answer: C

Explanation: AWS IoT Core natively accepts MQTT from the sensors and is fully managed and highly available. A Kinesis Data Firehose delivery stream with an AWS Lambda transformation replaces the Kafka transformation step and delivers the results to Amazon S3, removing the single on-premises server whose crash caused the data loss.
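A minimal Python (boto3) sketch of the IoT Core side of option C (the rule name, topic filter, role ARN, and delivery stream name are hypothetical placeholders):

import boto3

iot = boto3.client("iot")

# Route MQTT messages from the sensors into a Kinesis Data Firehose
# delivery stream, which transforms records with Lambda and lands them in S3.
iot.create_topic_rule(
    ruleName="sensors_to_firehose",
    topicRulePayload={
        "sql": "SELECT * FROM 'sensors/+/telemetry'",
        "ruleDisabled": False,
        "actions": [
            {
                "firehose": {
                    "roleArn": "arn:aws:iam::111122223333:role/iot-firehose-role",
                    "deliveryStreamName": "sensor-stream",
                    "separator": "\n",
                }
            }
        ],
    },
)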


The SAP-C02 exam is the latest certification exam for AWS Certified Solutions Architect – Professional, replacing SAP-C01!

Read the latest SAP-C02 exam questions and answers online to increase your knowledge and experience and help you progress!

We recommend that you use the best AWS Certified Solutions Architect – Professional exam material: download the latest SAP-C02 dumps at https://www.leads4pass.com/sap-c02.html (421 Q&A) to help you take the exam easily.

AwsExamDumps is the largest free Amazon dumps community, with the latest and most complete Amazon (AWS Certified Associate, AWS Certified Foundational, AWS Certified Professional, AWS Certified Specialty) dumps. You can take practice tests online, and the latest version of the exam dumps is recommended to help you pass the exam with ease.