Q256. A financial services company receives a regular data feed from its credit card servicing partner. Approximately 5,000 records are sent every 15 minutes in plaintext, delivered over HTTPS directly into an Amazon S3 bucket with server-side encryption. This feed contains sensitive credit card primary account number (PAN) data. The company needs to automatically mask the PAN before sending the data to another S3 bucket for additional internal processing. The company also needs to remove and merge specific fields, and then transform the record into JSON format. Additionally, extra feeds are likely to be added in the future, so any design needs to be easily expandable. Which solution will meet these requirements?

A. Trigger an AWS Lambda function on file delivery that extracts each record and writes it to an Amazon SQS queue. Trigger another Lambda function when new messages arrive in the SQS queue to process the records, writing the results to a temporary location in Amazon S3. Trigger a final Lambda function once the SQS queue is empty to transform the records into JSON format and send the results to another S3 bucket for internal processing.
B. Trigger an AWS Lambda function on file delivery that extracts each record and writes it to an Amazon SQS queue. Configure an AWS Fargate container application to automatically scale to a single instance when the SQS queue contains messages. Have the application process each record and transform the record into JSON format. When the queue is empty, send the results to another S3 bucket for internal processing and scale down the AWS Fargate instance.
C. Create an AWS Glue crawler and custom classifier based on the data feed formats, and build a table definition to match. Trigger an AWS Lambda function on file delivery to start an AWS Glue ETL job to transform the entire record according to the processing and transformation requirements. Define the output format as JSON. Once complete, have the ETL job send the results to another S3 bucket for internal processing.
D. Create an AWS Glue crawler and custom classifier based upon the data feed formats, and build a table definition to match. Perform an Amazon Athena query on file delivery to start an Amazon EMR ETL job to transform the entire record according to the processing and transformation requirements. Define the output format as JSON. Once complete, send the results to another S3 bucket for internal processing and scale down the EMR cluster.
Correct answer: A
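For illustration, the PAN-masking step that every option must perform could be sketched as a simple per-record transform. The field layout and the 13-to-16-digit PAN pattern below are assumptions for the sketch, not details given in the question:

```python
import re

# Matches a standalone run of 13-16 digits, a common PAN length range (assumption).
PAN_PATTERN = re.compile(r"\b\d{13,16}\b")

def mask_pan(record: str) -> str:
    """Replace all but the last four digits of any PAN-like number in a record."""
    def _mask(match: re.Match) -> str:
        pan = match.group(0)
        return "*" * (len(pan) - 4) + pan[-4:]
    return PAN_PATTERN.sub(_mask, record)

# Hypothetical plaintext feed record: PAN, name, date
print(mask_pan("4111111111111111,John Doe,2024-01-15"))
# -> ************1111,John Doe,2024-01-15
```

In an actual pipeline this logic would live inside whichever processing component the chosen option uses (a Lambda function, a Fargate task, or a Glue ETL job), applied to each record before the JSON output is written to the second S3 bucket.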
