
A client has multiple warehouses where orders can be fulfilled. The cost of shipping goods from each warehouse varies by day, due to the number of workers available. The Architect needs to make sure that when an order is shipped, it is shipped from the lowest cost warehouse that is open.

How should this functionality be implemented?

A.

Create a new class as a preference for Magento\InventoryShipping\Plugin\Sales\Shipment\AssignSourceCodeToShipmentPlugin to set the lowest-cost warehouse on a shipment.

B.

Create a new class implementing Magento\InventorySourceSelectionApi\Model\SourceSelectionInterface, which returns open warehouses sorted by cost.

C.

Create an after plugin on Magento\InventoryDistanceBasedSourceSelection\Model\Algorithms\DistanceBasedAlgorithm to sort warehouse sources by cost.
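
For context on option B, a custom source selection algorithm is a class implementing the Magento\InventorySourceSelectionApi\Model\SourceSelectionInterface referenced above and registered through di.xml. The sketch below is an illustration only: the Vendor\Inventory namespace and the WarehouseCostProvider helper are assumptions, not part of the question; GetDefaultSortedSourcesResult is the helper the built-in algorithms use to allocate quantities against a sorted source list.

LowestCostAlgorithm.php (illustrative sketch):

<?php
declare(strict_types=1);

namespace Vendor\Inventory\Model;

use Magento\InventorySourceSelectionApi\Api\Data\InventoryRequestInterface;
use Magento\InventorySourceSelectionApi\Api\Data\SourceSelectionResultInterface;
use Magento\InventorySourceSelectionApi\Model\Algorithms\Result\GetDefaultSortedSourcesResult;
use Magento\InventorySourceSelectionApi\Model\SourceSelectionInterface;

/**
 * Hypothetical algorithm that prefers the open warehouse with the lowest
 * shipping cost for the current day.
 */
class LowestCostAlgorithm implements SourceSelectionInterface
{
    public function __construct(
        private readonly \Vendor\Inventory\Model\WarehouseCostProvider $costProvider, // hypothetical helper
        private readonly GetDefaultSortedSourcesResult $getDefaultSortedSourcesResult
    ) {
    }

    public function execute(InventoryRequestInterface $inventoryRequest): SourceSelectionResultInterface
    {
        // Enabled ("open") sources, cheapest first for today's date (helper is assumed).
        $sortedSources = $this->costProvider->getOpenSourcesSortedByCost();

        // Let the framework allocate the requested quantities against the sorted sources.
        return $this->getDefaultSortedSourcesResult->execute($inventoryRequest, $sortedSources);
    }
}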

An existing Adobe Commerce website is moving to a headless implementation.

The existing website features an "All Brands" page, as well as individual pages for each brand. All brand-related pages are cached in Varnish using tags in the same manner as products and categories.

Two new GraphQL queries have been created to make this information available to the frontend for the new headless implementation:

During testing, the queries sometimes return out-of-date information. How should this problem be solved while maintaining performance?

A.

Specify a @cache(cacheable: false) directive for each GraphQL query, making sure that the data returned is not cached and is up to date.

B.

Specify a @cache(cacheIdentity: Path\\To\\IdentityClass) directive for each GraphQL query, corresponding to a class that adds cache tags for relevant brands and associated products.

C.

Each GraphQL query's resolver class should inject \Magento\GraphQlCache\Model\CacheableQuery and call setCacheValidity(true) on it as part of the resolver's resolve function.
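
For reference on option B, the cacheIdentity argument of the @cache directive points at a class implementing Magento\Framework\GraphQl\Query\Resolver\IdentityInterface, which returns the cache tags Varnish should associate with the query response. A minimal sketch, assuming a hypothetical Vendor_Brand module whose resolver returns brand rows keyed by brand_id:

Identity.php (illustrative sketch):

<?php
declare(strict_types=1);

namespace Vendor\Brand\Model\Resolver\Brand;

use Magento\Framework\GraphQl\Query\Resolver\IdentityInterface;

/**
 * Produces cache tags for the brands in the resolved data so cached responses
 * are purged whenever a brand (or an associated product) changes.
 */
class Identity implements IdentityInterface
{
    /** Hypothetical tag prefix used when brand data is saved or invalidated. */
    private string $cacheTag = 'vendor_brand';

    public function getIdentities(array $resolvedData): array
    {
        $ids = [];
        foreach ($resolvedData['items'] ?? [] as $brand) {
            $ids[] = $this->cacheTag . '_' . $brand['brand_id'];
        }

        return $ids ? array_merge([$this->cacheTag], $ids) : [];
    }
}

The schema.graphqls entry would then reference this class, e.g. @cache(cacheIdentity: "Vendor\\Brand\\Model\\Resolver\\Brand\\Identity").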

Since the last production deployment, customers cannot complete checkout.

The error logs show the following message multiple times:

main.CRITICAL: Report ID: webapi-61b9fe83f0c3e; Message: Infinite loop detected, review the trace for the looping path

The Architect finds a deployed feature that should limit delivery for some specific postcodes.

The Architect sees the following code deployed in etc/webapi_rest/di.xml and etc/frontend/di.xml:

LimitRates.php:

Which step should the Architect perform to solve the issue?

A.

Change the 'after' plugin to an 'around' plugin. The issue is caused by calling the result provider code after the code of the original method.

B.

Replace the injected dependency of \Magento\Checkout\Model\Session with \Magento\Framework\Session\SessionManagerInterface.

C.

Inject an instance of \Magento\Quote\Api\CartRepositoryInterface and retrieve the cart instance via $this->cartRepository->get($this->session->getQuoteId()).
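
To illustrate option C: loading the quote through the cart repository avoids Magento\Checkout\Model\Session::getQuote(), which triggers totals collection and, from inside a shipping-rate plugin, re-enters the same code path and produces the "Infinite loop detected" report. The sketch below is only indicative; the plugin target method and the postcode filtering are assumptions based on the scenario described.

LimitRates.php (illustrative sketch):

<?php
declare(strict_types=1);

namespace Vendor\DeliveryLimit\Plugin;

use Magento\Checkout\Model\Session as CheckoutSession;
use Magento\Quote\Api\CartRepositoryInterface;

class LimitRates
{
    public function __construct(
        private readonly CheckoutSession $session,
        private readonly CartRepositoryInterface $cartRepository
    ) {
    }

    // Hypothetical after plugin on the shipping-rate collection being limited.
    public function afterCollectRates($subject, $result)
    {
        // Repository load does not call collectTotals(), so no recursion occurs.
        $quote = $this->cartRepository->get((int)$this->session->getQuoteId());
        $postcode = (string)$quote->getShippingAddress()->getPostcode();

        // ... filter $result for restricted postcodes (feature-specific, omitted) ...

        return $result;
    }
}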

While reviewing a newly developed pull request that refactors multiple custom payment methods, the Architect notices multiple classes that depend on \Magento\Framework\Encryption\EncryptorInterface to decrypt credentials for sensitive data. The code that is commonly repeated is as follows:

The Architect needs to recommend an optimal solution to avoid redundant dependency and duplicate code among the methods. Which solution should the Architect recommend?

A.

Create a common config service class Vendor\Payment\Gateway\Config\Config under Vendor_Payment and use it as a parent class for all of the Vendor\PaymentModule\Gateway\Config\Config classes, and remove the $scopeConfig and $encryptor dependencies.

B.

Replace all Vendor\PaymentModule\Gateway\Config\Config classes with a virtualType of Magento\Payment\Gateway\Config\Config and set it under config.xml.

C.

Add a plugin after the getValue method of $scopeConfig, remove the $encryptor dependency, and use it in the plugin to decrypt the value if the config name is 'user.secret'.
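
As a point of reference for option A, centralizing the decryption in one shared gateway config class removes the repeated $scopeConfig/$encryptor wiring from every payment method. A minimal sketch, assuming the shared class lives in a hypothetical Vendor_Payment module and extends Magento\Payment\Gateway\Config\Config:

Config.php (illustrative sketch):

<?php
declare(strict_types=1);

namespace Vendor\Payment\Gateway\Config;

use Magento\Framework\App\Config\ScopeConfigInterface;
use Magento\Framework\Encryption\EncryptorInterface;
use Magento\Payment\Gateway\Config\Config as GatewayConfig;

/**
 * Common parent for the individual Vendor\PaymentModule\Gateway\Config\Config
 * classes, so each child no longer needs its own encryptor dependency.
 */
class Config extends GatewayConfig
{
    public function __construct(
        ScopeConfigInterface $scopeConfig,
        private readonly EncryptorInterface $encryptor,
        string $methodCode = '',      // supplied per payment method via di.xml
        string $pathPattern = GatewayConfig::DEFAULT_PATH_PATTERN
    ) {
        parent::__construct($scopeConfig, $methodCode, $pathPattern);
    }

    // Read an encrypted payment config field and decrypt it in one place.
    public function getDecryptedValue(string $field, ?int $storeId = null): string
    {
        return $this->encryptor->decrypt((string)$this->getValue($field, $storeId));
    }
}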

A custom cron job has been added to an Adobe Commerce system to collect data for several reports. Its crontab.xml configuration is as follows:

The job is data intensive and runs for between 20 and 30 minutes each night.

Within a few days of deployment, it is noticed that the site's sitemap.xml file has not been updated since the new job was added.

What should be done to fix this issue?

A.

Change the schedule of the sitemap_generate cron job to 30 0 * * * so that it runs after the gather_reporting_data job has completed.

B.

Create a new cron group for the reporting job, specifying use_separate_process with a value of 1.

C.

Break the data gathering job into a number of smaller jobs, so that each individual job runs for a maximum of 5 minutes
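
For orientation on option B, Adobe Commerce lets a job be moved out of the default cron group (the one that also runs sitemap_generate) and into its own group that runs in a separate process, so a 20-30 minute job cannot starve the default group. The file pair below is an illustrative sketch only; the group id, module, class, and timing values are assumptions:

etc/cron_groups.xml (illustrative sketch):

<config xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
        xsi:noNamespaceSchemaLocation="urn:magento:module:Magento_Cron:etc/cron_groups.xsd">
    <group id="vendor_reporting">
        <!-- Run this group in its own PHP process so it cannot block the default group -->
        <use_separate_process>1</use_separate_process>
        <schedule_generate_every>15</schedule_generate_every>
        <schedule_ahead_for>20</schedule_ahead_for>
        <schedule_lifetime>60</schedule_lifetime>
        <history_cleanup_every>10</history_cleanup_every>
        <history_success_lifetime>60</history_success_lifetime>
        <history_failure_lifetime>600</history_failure_lifetime>
    </group>
</config>

etc/crontab.xml (illustrative sketch):

<config xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
        xsi:noNamespaceSchemaLocation="urn:magento:module:Magento_Cron:etc/crontab.xsd">
    <group id="vendor_reporting">
        <job name="gather_reporting_data" instance="Vendor\Reports\Cron\GatherData" method="execute">
            <schedule>0 0 * * *</schedule>
        </job>
    </group>
</config>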

An Adobe Commerce Architect is troubleshooting an issue on an Adobe Commerce Cloud project that is not yet live.

The developers copied the Staging Database to Production in readiness for Go Live. However, when the developers test their Product Import feature, the new products do not appear on the front end.

The developers suspect the Varnish Cache is not being cleared. Staging seems to work as expected. Production was working before the database migration.

What is the likely cause?

A.

The Fastly credentials in the Production Database are incorrect.

B.

A deployment should have been done on Production to initialize Fastly caching.

C.

The site URLs in the Production Database are the URLs of the Staging Instance and must be updated
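
One quick check that distinguishes options A and C after a database copy is to read the configuration Production actually holds. bin/magento config:show is a standard command; the Fastly configuration path shown is the one used by the Fastly_Cdn module and should be treated as an assumption:

Illustrative commands:

bin/magento config:show web/unsecure/base_url
bin/magento config:show web/secure/base_url

# Fastly service credentials (path assumed from the Fastly_Cdn module)
bin/magento config:show system/full_page_cache/fastly/fastly_service_id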

A developer needs to uninstall two custom modules as well as the database data and schemas. The developer uses the following command: bin/magento module:uninstall Vendor_SampleMinimal Vendor_SampleModifyContent

When the command is run from CLI, the developer fails to remove the database schema and data defined in the module Uninstall class. Which three requirements should the Architect recommend be checked to troubleshoot this issue? (Choose three.)

A.

invoked uninstall() and uninstallSchema() are defined in the Uninstall class

B.

invoked uninstall() method is implemented in the Uninstall class

C.

bin/magento maintenance:enable command should be run in CLI before

D.

--remove-data option is specified as an argument for the CLI command

E.

--remove-schema and --remove-data options are specified as arguments for the CLI command

F.

composer.json file is present and defines the module as a composer package
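
To make the checklist concrete: module:uninstall only removes database data and schema when the module is installed as a Composer package, an Uninstall class implementing Magento\Framework\Setup\UninstallInterface exists in the module's Setup directory, and the --remove-data (-r) option is passed to the command. A minimal sketch of such a class; the table name is an assumption:

Setup/Uninstall.php (illustrative sketch):

<?php
declare(strict_types=1);

namespace Vendor\SampleMinimal\Setup;

use Magento\Framework\Setup\ModuleContextInterface;
use Magento\Framework\Setup\SchemaSetupInterface;
use Magento\Framework\Setup\UninstallInterface;

/**
 * Invoked by "bin/magento module:uninstall --remove-data Vendor_SampleMinimal"
 * when the module is installed as a Composer package.
 */
class Uninstall implements UninstallInterface
{
    public function uninstall(SchemaSetupInterface $setup, ModuleContextInterface $context)
    {
        $setup->startSetup();
        // Drop the module's table and any data it holds (table name is illustrative).
        $setup->getConnection()->dropTable($setup->getTable('vendor_sample_entity'));
        $setup->endSetup();
    }
}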

An Adobe Commerce Architect is supporting a deployment and build tool for on-premises Adobe Commerce projects. The tool executes build scripts on a centralized server and uses an SSH connection to deploy to project servers.

A client reports that users cannot work with Admin Panel because the site breaks every time they change interface locale.

Considering maintainability, which solution should the Architect implement?

A.

Modify project config.php file, configure 'admin_locales_for_deploy' value, and specify all required locales

B.

Edit the project env.php file, configure the 'admin_locales_for_build' value, and specify all required locales

C.

Adjust the tool's build script and specify the required locales during the 'setup:static-content:deploy' command
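
For illustration of option C, the centralized build script would list every storefront and Admin locale explicitly when generating static content, so switching the Admin interface locale never hits missing assets. The locale codes and job count below are placeholders:

Build script excerpt (illustrative sketch):

bin/magento setup:static-content:deploy en_US en_GB de_DE --area adminhtml --area frontend --jobs 4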

An Architect needs to review a custom product feed export module that a developer created for a merchant. During final testing before the solution is deployed, the product feed output is verified as correct. All unit and integration tests for code pass.

However, once the solution is deployed to production, the product price values in the feed are incorrect for several products. The products with incorrect data are all currently part of a content staging campaign where their prices have been reduced.

What did the developer do incorrectly that caused the feed output to be incorrect for products in the content staging campaign?

A.

The developer retrieved product data directly from the database using the entity_id column rather than a collection or repository.

B.

The developer forgot to use the getContentStagingValue method to retrieve the active campaign value of the product data.

C.

The developer did not check for an active content staging campaign and emulate the campaign state when retrieving product data.
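
By way of contrast with the direct-SQL approach in option A, loading product data through the repository (or a product collection) goes through the staged row versioning, so prices reduced by an active content staging campaign are returned correctly. A minimal sketch with an assumed feed provider class:

ProductPriceProvider.php (illustrative sketch):

<?php
declare(strict_types=1);

namespace Vendor\Feed\Model;

use Magento\Catalog\Api\ProductRepositoryInterface;
use Magento\Framework\Api\SearchCriteriaBuilder;

/**
 * Hypothetical feed data source that reads prices via the product repository
 * rather than querying catalog tables by entity_id.
 */
class ProductPriceProvider
{
    public function __construct(
        private readonly ProductRepositoryInterface $productRepository,
        private readonly SearchCriteriaBuilder $searchCriteriaBuilder
    ) {
    }

    /**
     * @param string[] $skus
     * @return array<string, float> map of SKU => current price
     */
    public function getPrices(array $skus): array
    {
        $criteria = $this->searchCriteriaBuilder
            ->addFilter('sku', $skus, 'in')
            ->create();

        $prices = [];
        foreach ($this->productRepository->getList($criteria)->getItems() as $product) {
            $prices[$product->getSku()] = (float)$product->getPrice();
        }

        return $prices;
    }
}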

An Architect is investigating a merchant's Adobe Commerce production environment where all customer session data is randomly being lost. Customer session data has been configured to be persisted using Redis, as are all caches (except full page cache, which is handled via Varnish).

After an initial review, the Architect is able to replicate the loss of customer session data by flushing the Magento cache storage, either via the Adobe Commerce Admin Panel or by running bin/magento cache:flush on the command line. Refreshing all the caches in the Adobe Commerce Admin Panel or running bin/magento cache:clean on the command line does not cause session data to be lost.

What should be the next step?

A.

Check app/etc/env.php and make sure that the Redis configuration for caches and session data use different database numbers.

B.

Educate the merchant to not flush cache storage and only refresh the caches in future.

C.

Set the 'Stores > Configuration' option for 'Store Session Data Separately' to 'Yes' in the Adobe Commerce Admin Panel.
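
For reference on option A, the session and cache sections of app/etc/env.php each carry their own Redis settings; when they point at different Redis database numbers (or separate instances), flushing cache storage can no longer wipe session data. A trimmed sketch with placeholder host and database values:

app/etc/env.php (excerpt, illustrative values only):

<?php
return [
    'session' => [
        'save' => 'redis',
        'redis' => [
            'host' => '127.0.0.1',
            'port' => '6379',
            'database' => '2'   // sessions isolated in their own Redis DB
        ]
    ],
    'cache' => [
        'frontend' => [
            'default' => [
                'backend' => 'Magento\\Framework\\Cache\\Backend\\Redis',
                'backend_options' => [
                    'server' => '127.0.0.1',
                    'port' => '6379',
                    'database' => '0'   // cache storage kept separate from sessions
                ]
            ]
        ]
    ]
];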