
A Mule application is designed to fulfil two requirements:

a) Processing files synchronously from an FTPS server to a back-end database, using intermediary VM queues for load balancing of VM events

b) Processing a medium rate of records from a source to a target system using a batch job scope

Considering the processing reliability requirements for the FTPS files, how should VM queues be configured for processing the files, as well as for the batch job scope, if the application is deployed to CloudHub workers?

A.

Use CloudHub persistent queues for FTPS file processing

There is no need to configure VM queues for the batch job scope, as by default it uses the worker's disk for VM queueing

B.

Use CloudHub persistent VM queues for FTPS file processing

There is no need to configure VM queues for the batch job scope, as by default it uses the worker's JVM memory for VM queueing

C.

Use CloudHub persistent VM queues for FTPS file processing

Disable VM queues for the batch job scope

D.

Use VM connector persistent queues for FTPS file processing

Disable VM queues for the batch job scope
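
For context on the configuration these options refer to, here is a minimal Mule 4 sketch; the queue and flow names (such as filesQueue) are illustrative assumptions, not part of the question. The VM connector can declare a queue as PERSISTENT, and when the application runs on CloudHub the durable backing for such queues additionally depends on enabling the persistent queues option for the deployment in Runtime Manager.

<mule xmlns="http://www.mulesoft.org/schema/mule/core"
      xmlns:vm="http://www.mulesoft.org/schema/mule/vm"
      xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
      xsi:schemaLocation="
        http://www.mulesoft.org/schema/mule/core http://www.mulesoft.org/schema/mule/core/current/mule.xsd
        http://www.mulesoft.org/schema/mule/vm http://www.mulesoft.org/schema/mule/vm/current/mule-vm.xsd">

    <!-- Illustrative VM connector configuration: queueType="PERSISTENT" requests durable queueing -->
    <vm:config name="VM_Config">
        <vm:queues>
            <vm:queue queueName="filesQueue" queueType="PERSISTENT" />
        </vm:queues>
    </vm:config>

    <!-- Publisher side: each file read from the FTPS server is pushed onto the VM queue -->
    <flow name="publish-file-to-queue">
        <!-- FTPS listener and file read omitted for brevity -->
        <vm:publish config-ref="VM_Config" queueName="filesQueue" />
    </flow>

    <!-- Consumer side: queued events are picked up and written to the back-end database -->
    <flow name="consume-file-from-queue">
        <vm:listener config-ref="VM_Config" queueName="filesQueue" />
        <!-- Database insert omitted for brevity -->
    </flow>
</mule>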

A leading bank is implementing a new Mule API.

The purpose of the API is to fetch customer account balances from the back-end application and display them on the online banking platform. The online banking platform will send an array of accounts to the Mule API to get the account balances.

As part of the processing, the Mule API needs to insert the data into a database for auditing purposes, and this process should not have any performance-related impact on the account balance retrieval flow.

How should this requirement be implemented to achieve better throughput?

A.

Implement the Async scope to fetch the data from the back-end application and to insert records into the Audit database

B.

Implement a For Each scope to fetch the data from the back-end application and to insert records into the Audit database

C.

Implement a Try-Catch scope to fetch the data from the back-end application and use the Async scope to insert records into the Audit database

D.

Implement a Parallel For Each scope to fetch the data from the back-end application and use the Async scope to insert the records into the Audit database
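
As a point of reference for the Async-based options, a minimal Mule 4 sketch follows; namespace declarations are omitted, and names such as Backend_HTTP_Config, Audit_DB_Config, and the account_audit table are assumed for illustration. The back-end call stays on the main path, while the audit insert is wrapped in an Async scope so it runs on a separate thread and adds no latency to the balance retrieval.

<flow name="get-account-balances">
    <!-- HTTP listener source omitted for brevity -->

    <!-- Main path: fetch the account balances from the back-end application -->
    <http:request method="POST" config-ref="Backend_HTTP_Config" path="/accounts/balances" />

    <!-- The audit insert runs asynchronously so it does not delay the response -->
    <async>
        <db:insert config-ref="Audit_DB_Config">
            <db:sql>INSERT INTO account_audit (payload) VALUES (:auditPayload)</db:sql>
            <db:input-parameters><![CDATA[#[{ auditPayload: write(payload, "application/json") }]]]></db:input-parameters>
        </db:insert>
    </async>
</flow>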

An organization is evaluating using the CloudHub Shared Load Balancer (SLB) vs. creating a CloudHub Dedicated Load Balancer (DLB). They are evaluating how this choice affects the various types of certificates used by CloudHub-deployed Mule applications, including MuleSoft-provided, customer-provided, or Mule application-provided certificates.

What type of restrictions exist on the types of certificates that can be exposed by the CloudHub Shared Load Balancer (SLB) to external web clients over the public internet?

A.

Only MuleSoft-provided certificates are exposed.

B.

Only customer-provided wildcard certificates are exposed.

C.

Only customer-provided self-signed certificates are exposed.

D.

Only underlying Mule application certificates are exposed (pass-through).