leads4pass has updated the latest valid Amazon SAP-C01 exam questions and answers. The AWS Certified Solutions Architect – Professional exam code is "SAP-C01". All exam questions have been verified to help you pass the exam. leads4pass SAP-C01 dumps: https://www.leads4pass.com/aws-solution-architect-professional.html (Total Questions: 777 Q&A). With many years of exam experience, leads4pass maintains a 99.5% exam pass rate. You can try part of the exam practice questions shared by leads4pass online for free.
Free download: part of the Amazon SAP-C01 exam PDF
The free Amazon SAP-C01 exam PDF is shared by leads4pass. You can download it and practice online. To get the complete set of Amazon SAP-C01 exam questions and answers, please choose leads4pass. We update all exam questions and answers in real time throughout the year to ensure they remain valid.
Amazon SAP-C01 exam practice questions and answers, shared in part for free by leads4pass
QUESTION 1
A company has multiple AWS accounts and manages these accounts with AWS Organizations. A developer was given IAM user credentials to access AWS resources. The developer should have read-only access to all Amazon S3 buckets in the account. However, when the developer tries to access the S3 buckets from the console, they receive an access denied error message with no bucket listed. A solutions architect reviews the permissions and finds that the developer's IAM user is listed as having read-only access to all S3 buckets in the account.
Which additional steps should the solutions architect take to troubleshoot the issue? (Choose two.)
A. Check the bucket policies for all S3 buckets.
B. Check the ACLs for all S3 buckets.
C. Check the SCPs set at the organizational units (OUs).
D. Check for the permissions boundaries set for the IAM user.
E. Check if an appropriate IAM role is attached to the IAM user.
Correct Answer: CD
Explanation:
An SCP set at the organizational unit level or a permissions boundary set on the IAM user can each restrict access even when the user's identity-based policy grants read-only access to S3, so both should be checked. IAM roles are not attached to IAM users, so option E does not apply.
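For readers who want to verify this in practice, here is a minimal boto3 sketch of the two checks; the user name and OU ID are hypothetical placeholders, and the Organizations call must run from the management account (or a delegated administrator).

```python
import boto3

# Hypothetical identifiers for illustration only.
USER_NAME = "developer"
OU_ID = "ou-example-12345678"

iam = boto3.client("iam")
org = boto3.client("organizations")

# A permissions boundary, if one is set, is returned with the user details.
user = iam.get_user(UserName=USER_NAME)["User"]
print("Permissions boundary:", user.get("PermissionsBoundary", "none set"))

# List the SCPs attached to the OU that contains the member account.
scps = org.list_policies_for_target(
    TargetId=OU_ID, Filter="SERVICE_CONTROL_POLICY"
)["Policies"]
for policy in scps:
    print("SCP:", policy["Name"], policy["Id"])
```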
QUESTION 2
A company has a web application that securely uploads pictures and videos to an Amazon S3 bucket. The company
requires that only authenticated users are allowed to post content. The application generates a pre-signed URL that is
used to upload objects through a browser interface. Most users are reporting slow upload times for objects larger than
100 MB. What can a Solutions Architect do to improve the performance of these uploads while ensuring only authenticated users are allowed to post content?
A. Set up an Amazon API Gateway with an edge-optimized API endpoint that has a resource as an S3 service proxy. Configure the PUT method for this resource to expose the S3 PutObject operation. Secure the API Gateway using a COGNITO_USER_POOLS authorizer. Have the browser interface use API Gateway instead of the pre-signed URL to upload objects.
B. Set up an Amazon API Gateway with a regional API endpoint that has a resource as an S3 service proxy. Configure the PUT method for this resource to expose the S3 PutObject operation. Secure the API Gateway using an AWS Lambda authorizer. Have the browser interface use API Gateway instead of the pre-signed URL to upload objects.
C. Enable an S3 Transfer Acceleration endpoint on the S3 bucket. Use the endpoint when generating the pre-signed URL. Have the browser interface upload the objects to this URL using the S3 multipart upload API.
D. Configure an Amazon CloudFront distribution for the destination S3 bucket. Enable PUT and POST methods for the CloudFront cache behavior. Update the CloudFront origin to use an origin access identity (OAI). Give the OAI user s3:PutObject permissions in the bucket policy. Have the browser interface upload objects using the CloudFront distribution.
Correct Answer: C
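As a rough illustration of option C, the boto3 sketch below generates a pre-signed PUT URL that routes through the bucket's Transfer Acceleration endpoint; the bucket name is a hypothetical placeholder, and acceleration is assumed to be already enabled on the bucket.

```python
import boto3
from botocore.config import Config

BUCKET = "example-upload-bucket"  # hypothetical; acceleration already enabled

# Route requests through the bucket's Transfer Acceleration endpoint.
s3 = boto3.client("s3", config=Config(s3={"use_accelerate_endpoint": True}))

# Pre-signed PUT URL that authenticated users receive from the application.
url = s3.generate_presigned_url(
    "put_object",
    Params={"Bucket": BUCKET, "Key": "videos/large-object.mp4"},
    ExpiresIn=3600,
)
print(url)  # host will be <bucket>.s3-accelerate.amazonaws.com
```

For the multipart part of the flow, each part would need its own pre-signed URL, generated the same way for the upload_part operation.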
QUESTION 3
A user wants to configure Auto Scaling to scale up when the CPU utilization is above 70% and scale down when the CPU utilization is below 30%.
How can the user configure Auto Scaling for the above-mentioned condition?
A. Configure ELB to notify Auto Scaling on load increase or decrease
B. Use Auto Scaling with a schedule
C. Use Auto Scaling by manually modifying the desired capacity during a condition
D. Use dynamic Auto Scaling with a policy
Correct Answer: D
Explanation:
The user can configure the Auto Scaling group to scale up and then scale down automatically based on the specified conditions. To configure this, the user must set up scaling policies that are triggered by CloudWatch alarms.
Reference: http://docs.aws.amazon.com/AutoScaling/latest/DeveloperGuide/as-scale-based-on-demand.html
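A minimal boto3 sketch of such a policy-plus-alarm pair might look like the following; the Auto Scaling group name is a hypothetical placeholder, and only the scale-up half is shown.

```python
import boto3

ASG_NAME = "example-asg"  # hypothetical Auto Scaling group name

autoscaling = boto3.client("autoscaling")
cloudwatch = boto3.client("cloudwatch")

# Simple dynamic scaling policy: add one instance when triggered.
scale_up = autoscaling.put_scaling_policy(
    AutoScalingGroupName=ASG_NAME,
    PolicyName="scale-up-on-high-cpu",
    AdjustmentType="ChangeInCapacity",
    ScalingAdjustment=1,
    Cooldown=300,
)

# CloudWatch alarm that fires the policy when average CPU exceeds 70%.
cloudwatch.put_metric_alarm(
    AlarmName="cpu-above-70",
    Namespace="AWS/EC2",
    MetricName="CPUUtilization",
    Dimensions=[{"Name": "AutoScalingGroupName", "Value": ASG_NAME}],
    Statistic="Average",
    Period=300,
    EvaluationPeriods=2,
    Threshold=70.0,
    ComparisonOperator="GreaterThanThreshold",
    AlarmActions=[scale_up["PolicyARN"]],
)
# A mirror-image policy (ScalingAdjustment=-1) with an alarm on
# Threshold=30.0 and LessThanThreshold handles the scale-down side.
```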
QUESTION 4
A financial services company receives a regular data feed from its credit card servicing partner. Approximately 5,000
records are sent every 15 minutes in plaintext, delivered over HTTPS directly into an Amazon S3 bucket with server-side encryption. This feed contains sensitive credit card primary account number (PAN) data. The company needs to
automatically mask the PAN before sending the data to another S3 bucket for additional internal processing. The
company also needs to remove and merge specific fields, and then transform the record into JSON format. Additionally, extra feeds are likely to be added in the future, so any design needs to be easily expandable.
Which solution will meet these requirements?
A. Trigger an AWS Lambda function on file delivery that extracts each record and writes it to an Amazon SQS queue.
Trigger another Lambda function when new messages arrive in the SQS queue to process the records, writing the
results to a temporary location in Amazon S3. Trigger a final Lambda function once the SQS queue is empty to
transform the records into JSON format and send the results to another S3 bucket for internal processing.
B. Trigger an AWS Lambda function on file delivery that extracts each record and writes it to an Amazon SQS queue.
Configure an AWS Fargate container application to automatically scale to a single instance when the SQS queue
contains messages. Have the application process each record, and transform the record into JSON format. When the
queue is empty, send the results to another S3 bucket for internal processing and scale down the AWS Fargate
instance.
C. Create an AWS Glue crawler and custom classifier based on the data feed formats and build a table definition to match. Trigger an AWS Lambda function on file delivery to start an AWS Glue ETL job to transform the entire record according to the processing and transformation requirements. Define the output format as JSON. Once complete, have the ETL job send the results to another S3 bucket for internal processing.
D. Create an AWS Glue crawler and custom classifier based upon the data feed formats and build a table definition to
match. Perform an Amazon Athena query on file delivery to start an Amazon EMR ETL job to transform the entire record according to the processing and transformation requirements. Define the output format as JSON. Once complete, send the results to another S3 bucket for internal processing and scale down the EMR cluster.
Correct Answer: C
Explanation:
A single managed AWS Glue ETL job handles the masking, field manipulation, and JSON transformation, and new feeds can be accommodated by adding crawlers and classifiers, which satisfies the requirement that the design be easily expandable.
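As a hedged sketch of the trigger described in option C, the Lambda handler below starts a Glue ETL job run for each delivered file; the job name and argument key are hypothetical, and the masking/merging/JSON logic is assumed to live in the Glue job script itself.

```python
import boto3

glue = boto3.client("glue")
GLUE_JOB_NAME = "mask-pan-and-transform"  # hypothetical Glue job name

def handler(event, context):
    """S3-triggered Lambda: start the Glue ETL job for each delivered file."""
    for record in event["Records"]:
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        glue.start_job_run(
            JobName=GLUE_JOB_NAME,
            # "--input_path" is a hypothetical job argument; the Glue job
            # script is assumed to mask the PAN, merge/remove fields, and
            # write JSON to the internal-processing bucket.
            Arguments={"--input_path": f"s3://{bucket}/{key}"},
        )
```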
QUESTION 5
An eCommerce website running on AWS uses an Amazon RDS for MySQL DB instance with General Purpose SSD
storage. The developers chose an appropriate instance type based on demand and configured 100 GB of storage with
a sufficient amount of free space.
The website ran smoothly for a few weeks until a marketing campaign launched. On the second day of the campaign, users reported long wait times and timeouts. Amazon CloudWatch metrics indicated that both reads and writes to the DB instance were experiencing long response times.
The CloudWatch metrics show 40% to 50% CPU and memory utilization, and sufficient free storage space is still available. The application server logs show no evidence of database connectivity issues.
What could be the root cause of the issue with the marketing campaign?
A. It exhausted the I/O credit balance due to provisioning low disk storage during the setup phase.
B. It caused the data in the tables to change frequently, requiring indexes to be rebuilt to optimize queries.
C. It exhausted the maximum number of allowed connections to the database instance.
D. It exhausted the network bandwidth available to the RDS for MySQL DB instance.
Correct Answer: A
Explanation:
General Purpose SSD (gp2) storage delivers a baseline of 3 IOPS per GiB, so a 100 GB volume has a baseline of only about 300 IOPS and depends on burst credits for anything higher. The campaign traffic exhausted the I/O credit balance, which explains slow reads and writes despite moderate CPU and memory utilization; the absence of connectivity errors in the application logs rules out exhausted connections.
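One way to confirm this diagnosis is to inspect the BurstBalance metric for the DB instance; the sketch below is illustrative, with a hypothetical DB instance identifier.

```python
import boto3
from datetime import datetime, timedelta

DB_INSTANCE = "example-mysql-instance"  # hypothetical DB instance identifier
cloudwatch = boto3.client("cloudwatch")

# BurstBalance reports the remaining gp2 I/O credits as a percentage;
# values near zero during the campaign would confirm the diagnosis.
stats = cloudwatch.get_metric_statistics(
    Namespace="AWS/RDS",
    MetricName="BurstBalance",
    Dimensions=[{"Name": "DBInstanceIdentifier", "Value": DB_INSTANCE}],
    StartTime=datetime.utcnow() - timedelta(hours=24),
    EndTime=datetime.utcnow(),
    Period=3600,
    Statistics=["Minimum"],
)
for point in sorted(stats["Datapoints"], key=lambda p: p["Timestamp"]):
    print(point["Timestamp"], point["Minimum"])
```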
QUESTION 6
While assigning a tag to an instance, which of the below-mentioned options is not a valid tag key/value pair?
A. Key: "aws" Value: "aws"
B. Key: "aws:name" Value: "instance Aws"
C. Key: "Name :aws" Value: "instance Aws"
D. Key: "name Aws" Value: "aws:instance"
Correct Answer: B
Explanation:
In Amazon Web Services, to help manage EC2 instances as well as their usage, the user can tag the instances. Tags are metadata assigned by the user, each consisting of a key and a value. A tag key cannot have the prefix "aws:", although a key of just "aws" is allowed.
Reference: http://docs.aws.amazon.com/AWSEC2/latest/UserGuide/Using_Tags.html
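The behavior is easy to demonstrate with boto3; in this illustrative sketch, the instance ID is a placeholder, and the exact error code returned for the reserved prefix may vary.

```python
import boto3
from botocore.exceptions import ClientError

INSTANCE_ID = "i-0123456789abcdef0"  # hypothetical instance ID
ec2 = boto3.client("ec2")

# A key of plain "aws" is accepted; only the "aws:" prefix is reserved.
ec2.create_tags(
    Resources=[INSTANCE_ID],
    Tags=[{"Key": "aws", "Value": "aws"}],
)

# A key beginning with the reserved "aws:" prefix is rejected by the API.
try:
    ec2.create_tags(
        Resources=[INSTANCE_ID],
        Tags=[{"Key": "aws:name", "Value": "instance"}],
    )
except ClientError as err:
    print("Rejected:", err.response["Error"]["Code"])
```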
QUESTION 7
True or false: In a CloudFormation template, you can reuse the same logical ID several times to reference the resources
in other parts of the template.
A. True, a logical ID can be used several times to reference the resources in other parts of the template.
B. False, a logical ID must be unique within the template.
C. False, you can mention a resource only once and you cannot reference it in other parts of a template.
D. False, you cannot reference other parts of the template.
Correct Answer: B
Explanation:
In AWS CloudFormation, the logical ID must be alphanumeric (A-Za-z0-9) and unique within the template.
You use the logical name to reference the resource in other parts of the template.
Reference: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/concept-resources.html
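Since a CloudFormation template is a JSON (or YAML) document, logical-ID uniqueness can be illustrated with a plain Python dict; the template below is a minimal, hypothetical example that also shows Ref pointing at a logical ID from another resource.

```python
import json
import boto3

# Minimal template: "WebServerSecurityGroup" and "WebServer" are unique
# logical IDs, and Ref references a resource elsewhere in the template.
template = {
    "AWSTemplateFormatVersion": "2010-09-09",
    "Resources": {
        "WebServerSecurityGroup": {
            "Type": "AWS::EC2::SecurityGroup",
            "Properties": {"GroupDescription": "Web server access"},
        },
        "WebServer": {
            "Type": "AWS::EC2::Instance",
            "Properties": {
                "ImageId": "ami-0123456789abcdef0",  # hypothetical AMI ID
                "SecurityGroups": [{"Ref": "WebServerSecurityGroup"}],
            },
        },
    },
}

# A Python dict, like a JSON object, cannot hold two entries with the
# same key, which mirrors the logical-ID uniqueness rule.
boto3.client("cloudformation").validate_template(
    TemplateBody=json.dumps(template)
)
```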
QUESTION 8
Your company hosts a social media site supporting users in multiple countries. You have been asked to provide a highly available design for the application that leverages multiple regions for the most recently accessed content and latency-sensitive portions of the web site. The most latency-sensitive component of the application involves reading user preferences to support website personalization and ad selection.
In addition to running your application in multiple regions, which option will support this application's requirements?
A. Serve user content from S3 and CloudFront, and use Route 53 latency-based routing between ELBs in each region. Retrieve user preferences from a local DynamoDB table in each region and leverage SQS to capture changes to user preferences, with SQS workers propagating updates to each table.
B. Use the S3 Copy API to copy recently accessed content to multiple regions and serve user content from S3 and CloudFront with dynamic content and an ELB in each region. Retrieve user preferences from an ElastiCache cluster in each region and leverage SNS notifications to propagate user preference changes to a worker node in each region.
C. Use the S3 Copy API to copy recently accessed content to multiple regions and serve user content from S3 and CloudFront with Route 53 latency-based routing between ELBs in each region. Retrieve user preferences from a DynamoDB table and leverage SQS to capture changes to user preferences, with SQS workers propagating DynamoDB updates.
D. Serve user content from S3 and CloudFront with dynamic content and an ELB in each region. Retrieve user preferences from an ElastiCache cluster in each region and leverage Simple Workflow (SWF) to manage the propagation of user preferences from a centralized DB to each ElastiCache cluster.
Correct Answer: A
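To make the data flow in option A concrete, here is a small, hypothetical boto3 sketch of a region-local preference read and an SQS message that workers in each region would consume to update their local tables; the table name, queue URL, and attribute names are all placeholders.

```python
import json
import boto3

REGION = "eu-west-1"  # one of the application's regions
QUEUE_URL = "https://sqs.eu-west-1.amazonaws.com/111122223333/pref-updates"

# Low-latency read from the region-local preferences table.
table = boto3.resource("dynamodb", region_name=REGION).Table("UserPreferences")
prefs = table.get_item(Key={"UserId": "user-123"}).get("Item", {})
print(prefs)

# On a preference change, enqueue the update; SQS workers in each
# region consume the message and apply it to their local table.
boto3.client("sqs", region_name=REGION).send_message(
    QueueUrl=QUEUE_URL,
    MessageBody=json.dumps({"UserId": "user-123", "AdCategory": "sports"}),
)
```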
QUESTION 9
A company runs a memory-intensive analytics application using an on-demand Amazon EC2 C5 compute-optimized
instance. The application is used continuously and application demand doubles during working hours. The application
currently scales based on CPU usage. When scaling in occurs, a lifecycle hook is used because the instance requires 4
minutes to clean the application state before terminating.
Because users reported poor performance during working hours, scheduled scaling actions were implemented so
additional instances would be added during working hours. The Solutions Architect has been asked to reduce the cost
of the application.
Which solution is MOST cost-effective?
A. Use the existing launch configuration that uses C5 instances, and update the application AMI to include the Amazon
CloudWatch agent. Change the Auto Scaling policies to scale based on memory utilization. Use Reserved Instances for
the number of instances required after working hours, and use Spot Instances to cover the increased demand during
working hours.
B. Update the existing launch configuration to use R5 instances and update the application AMI to include SSM Agent.
Change the Auto Scaling policies to scale based on memory utilization. Use Reserved Instances for the number of
instances required after working hours, and use Spot Instances with On-Demand Instances to cover the increased
demand during working hours.
C. Use the existing launch configuration that uses C5 instances and update the application AMI to include SSM Agent.
Leave the Auto Scaling policies to scale based on CPU utilization. Use scheduled Reserved Instances for the number of
instances required after working hours, and use Spot Instances to cover the increased demand during working hours.
D. Create a new launch configuration using R5 instances and update the application AMI to include the Amazon
CloudWatch agent. Change the Auto Scaling policies to scale based on memory utilization. Use Reserved Instances for
the number of instances required after working hours, and use Standard Reserved Instances with On-Demand
Instances to cover the increased demand during working hours.
Correct Answer: D
Reference:
https://docs.aws.amazon.com/AWSEC2/latest/UserGuide/monitoring_ec2.html
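For context, scheduled scaling actions like those described in the scenario can be created with boto3 as below; the group name, capacities, and schedule are hypothetical.

```python
import boto3

ASG_NAME = "analytics-asg"  # hypothetical Auto Scaling group name
autoscaling = boto3.client("autoscaling")

# Add capacity just before working hours begin (cron times are UTC)...
autoscaling.put_scheduled_update_group_action(
    AutoScalingGroupName=ASG_NAME,
    ScheduledActionName="working-hours-scale-out",
    Recurrence="0 8 * * 1-5",
    DesiredCapacity=8,
)

# ...and drop back afterwards; the existing lifecycle hook still gives
# each instance its 4 minutes to clean up application state.
autoscaling.put_scheduled_update_group_action(
    AutoScalingGroupName=ASG_NAME,
    ScheduledActionName="after-hours-scale-in",
    Recurrence="0 18 * * 1-5",
    DesiredCapacity=4,
)
```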
QUESTION 10
Which of the following cannot be done using AWS Data Pipeline?
A. Create complex data processing workloads that are fault-tolerant, repeatable, and highly available.
B. Regularly access your data where it's stored, transform and process it at scale, and efficiently transfer the results to
another AWS service.
C. Generate reports over data that has been stored.
D. Move data between different AWS compute and storage services as well as on-premise data sources at specified
intervals.
Correct Answer: C
Explanation:
AWS Data Pipeline is a web service that helps you reliably process and move data between different AWS compute and storage services, as well as on-premise data sources, at specified intervals. With AWS Data Pipeline, you can regularly access your data where it's stored, transform and process it at scale, and efficiently transfer the results to another AWS service.
AWS Data Pipeline helps you easily create complex data processing workloads that are fault-tolerant,
repeatable, and highly available. AWS Data Pipeline also allows you to move and process data that was
previously locked up in on-premise data silos.
Reference: http://aws.amazon.com/datapipeline/
QUESTION 11
You want to define permissions for a role in an IAM policy. Which of the following configuration formats should you
use?
A. An XML document written in the IAM Policy Language
B. An XML document written in a language of your choice
C. A JSON document written in the IAM Policy Language
D. A JSON document written in a language of your choice
Correct Answer: C
Explanation:
You define the permissions for a role in an IAM policy. An IAM policy is a JSON document written in the
IAM Policy Language.
Reference: http://docs.aws.amazon.com/IAM/latest/UserGuide/id_roles_terms-and-concepts.html
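A minimal boto3 sketch makes this concrete: both the trust policy and the permissions policy below are JSON documents in the IAM Policy Language, serialized from Python dicts (the role and policy names are hypothetical).

```python
import json
import boto3

ROLE_NAME = "example-s3-reader"  # hypothetical role name
iam = boto3.client("iam")

# Trust policy: which principal may assume the role.
trust_policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Principal": {"Service": "ec2.amazonaws.com"},
        "Action": "sts:AssumeRole",
    }],
}

# Permissions policy in the IAM Policy Language: a JSON document.
permissions_policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Action": ["s3:GetObject", "s3:ListBucket"],
        "Resource": "*",
    }],
}

iam.create_role(
    RoleName=ROLE_NAME,
    AssumeRolePolicyDocument=json.dumps(trust_policy),
)
iam.put_role_policy(
    RoleName=ROLE_NAME,
    PolicyName="read-only-s3",
    PolicyDocument=json.dumps(permissions_policy),
)
```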
QUESTION 12
Is there any way to own a direct connection to Amazon Web Services?
A. No, AWS only allows access from the public Internet.
B. No, you can create an encrypted tunnel to VPC, but you cannot own the connection.
C. Yes, you can via Amazon Dedicated Connection
D. Yes, you can via AWS Direct Connect.
Correct Answer: D
Explanation:
AWS Direct Connect links your internal network to an AWS Direct Connect location over a standard 1 gigabit or 10 gigabit Ethernet fiber-optic cable. One end of the cable is connected to your router, the other to an AWS Direct Connect router. With this connection in place, you can create virtual interfaces directly to the AWS cloud (for example, to Amazon Elastic Compute Cloud (Amazon EC2) and Amazon Simple Storage Service (Amazon S3)) and to Amazon Virtual Private Cloud (Amazon VPC), bypassing Internet service providers in your network path.
Reference: http://docs.aws.amazon.com/directconnect/latest/UserGuide/Welcome.html
QUESTION 13
A Solutions Architect wants to make sure that only AWS users or roles with suitable permissions can access a new
Amazon API Gateway endpoint. The Solutions Architect wants an end-to-end view of each request to analyze the
latency of the request and create service maps.
How can the Solutions Architect design the API Gateway access control and perform request inspections?
A. For the API Gateway method, set the authorization to AWS_IAM. Then, give the IAM user or role execute-api:Invoke permission on the REST API resource. Enable the API caller to sign requests with AWS Signature Version 4 when accessing the endpoint. Use AWS X-Ray to trace and analyze user requests to API Gateway.
B. For the API Gateway resource, set CORS to enabled and only return the company's domain in Access-Control-Allow-Origin headers. Then, give the IAM user or role execute-api:Invoke permission on the REST API resource. Use Amazon CloudWatch to trace and analyze user requests to API Gateway.
C. Create an AWS Lambda function as the custom authorizer, ask the API client to pass the key and secret when
making the call, and then use Lambda to validate the key/secret pair against the IAM system. Use AWS X-Ray to trace
and analyze user requests to API Gateway.
D. Create a client certificate for API Gateway. Distribute the certificate to the AWS users and roles that need to access
the endpoint. Enable the API caller to pass the client certificate when accessing the endpoint. Use Amazon CloudWatch
to trace and analyze user requests to API Gateway.
Correct Answer: A
Explanation:
Setting the method authorization to AWS_IAM restricts access to IAM users and roles that hold execute-api:Invoke permission and sign their requests with Signature Version 4, while AWS X-Ray provides end-to-end tracing, latency analysis, and service maps for requests to API Gateway. Client certificates (option D) let a backend verify that requests came from API Gateway rather than restricting callers to IAM identities, and CloudWatch does not produce service maps.
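For illustration, a client can sign a request to such an endpoint with Signature Version 4 using botocore's signing helpers; the URL below is a hypothetical placeholder, and the caller's credentials must carry execute-api:Invoke permission.

```python
import boto3
import requests
from botocore.auth import SigV4Auth
from botocore.awsrequest import AWSRequest

# Hypothetical endpoint; the SigV4 service name for API Gateway
# is "execute-api".
URL = "https://abc123.execute-api.us-east-1.amazonaws.com/prod/items"
REGION = "us-east-1"

credentials = boto3.Session().get_credentials()
request = AWSRequest(method="GET", url=URL)
SigV4Auth(credentials, "execute-api", REGION).add_auth(request)

# The caller's IAM identity must hold execute-api:Invoke on the resource.
response = requests.get(URL, headers=dict(request.headers))
print(response.status_code, response.text)
```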
Related AWS Certified Professional exams
Exam Name | Exam PDF | Exam Practice | Advanced Exam Dumps |
SAP-C01 | SAP-C01 PDF | SAP-C01 Exam Practice | https://www.leads4pass.com/aws-solution-architect-professional.html |
DOP-C01 | DOP-C01 PDF | DOP-C01 Exam Practice | https://www.leads4pass.com/aws-devops-engineer-professional.html |
Summary:
The free Amazon SAP-C01 exam practice questions above are drawn from real exam material, so you can experience part of the exam content first. Get the complete SAP-C01 exam dumps at https://www.leads4pass.com/aws-solution-architect-professional.html (PDF + VCE) to help you pass the exam.
leads4pass has two learning modes: PDF and VCE. You can choose according to your preferences.