[2025-December-New]Braindump2go SC-200 Dumps VCE Free Share[Q313-Q360]

2025/December Latest Braindump2go SC-200 Exam Dumps with PDF and VCE Free Updated Today! Following are some new Braindump2go SC-200 Real Exam Questions!

QUESTION 313
You have a Microsoft 365 subscription that uses Microsoft Defender for Endpoint Plan 2 and contains 500 Windows devices.
As part of an incident investigation, you identify the following suspected malware files:
– File1.sys
– File2.pdf
– File3.docx
– File4.xlsx
You need to create indicator hashes to block users from downloading the files to the devices.
Which files can you block by using the indicator hashes?

A. File1.sys only
B. File1.sys and File3.docx only
C. File1.sys, File3.docx, and File4.xlsx only
D. File2.pdf, File3.docx, and File4.xlsx only
E. File1.sys, File2.pdf, File3.docx, and File4.xlsx

Answer: E
Explanation:
File hash indicators match the hash of the file contents, regardless of the file name or extension, so all four files can be blocked.
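
For illustration, here is a plain-Python sketch that computes the SHA-256 hashes you would enter as indicator values, assuming local copies of the four files exist (the file names are taken from the question; this is not a Defender API call):

import hashlib

def sha256_of(path: str) -> str:
    """Compute the SHA-256 hash of a file, reading it in chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Hypothetical local copies of the suspected files from the question.
for name in ["File1.sys", "File2.pdf", "File3.docx", "File4.xlsx"]:
    print(name, sha256_of(name))  # paste each hash into a file indicator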

QUESTION 314
You have a Microsoft 365 subscription that uses Microsoft Defender for Endpoint and contains a user named User1 and a Microsoft 365 group named Group1. All users are assigned a Defender for Endpoint Plan 1 license.
You enable Microsoft Defender XDR Unified role-based access control (RBAC) for Endpoints & Vulnerability Management.
You need to ensure that User1 can configure alerts that will send email notifications to Group1. The solution must follow the principle of least privilege.
Which permissions should you assign to User1?

A. Defender Vulnerability Management – Remediation handling
B. Alerts investigation
C. Live response capabilities: Basic
D. Manage security settings


[2025-December-New]Braindump2go MB-800 Dumps Free[Q168-Q208]

2025/December Latest Braindump2go MB-800 Exam Dumps with PDF and VCE Free Updated Today! Following are some new Braindump2go MB-800 Real Exam Questions!

QUESTION 168
A company based in the United States uses Dynamics 365 Business Central.
A customer agrees to buy raw materials in the MXN (Mexican Peso) currency.
You need to set up the currency and exchange rates for this purchase.
What should you do?

A. Do not configure the currency MXN and recalculate all entries in USD before posting.
B. Configure the currency MXN, then set up the Currency Exchange Rate Service to upload currency rates automatically.
C. Use the currency MXN as a local currency in the system, then set the exchange rates manually before posting.
D. Use the currency MXN as an additional reporting currency, then set up the Currency Exchange Rate Service to upload rates automatically.

Answer: B
Explanation:
This option allows you to add the MXN (Mexican Peso) currency to Dynamics 365 Business Central and configure it for use in transactions. Setting up the Currency Exchange Rate Service to upload currency rates automatically will ensure that you have accurate and up-to-date exchange rates for converting between the local currency (USD) and the foreign currency (MXN).
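
As a plain-Python illustration of the conversion arithmetic only (the rate below is invented for the example, not a real MXN/USD quote):

# Hypothetical exchange rate: pesos per US dollar (illustrative only).
MXN_PER_USD = 17.50

def mxn_to_usd(amount_mxn: float) -> float:
    """Convert an MXN amount to the USD local currency at the given rate."""
    return round(amount_mxn / MXN_PER_USD, 2)

print(mxn_to_usd(35_000.00))  # 35,000 MXN -> 2000.0 USD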

QUESTION 169
A company uses Dynamics 365 Business Central. The company has a customer that will also be a vendor for the company in the next financial year.
The company plans to consolidate the customer and vendor balances to reduce unnecessary payments on receipts and reduce the amount of transaction fees.
You need to configure the contact card.
Which option should you configure first?

A. Create as Employee
B. Create as Customer
C. Create as Bank
D. Create as Vendor


[2025-December-New]Braindump2go MB-700 Exam Prep Free[Q101-Q156]

2025/December Latest Braindump2go MB-700 Exam Dumps with PDF and VCE Free Updated Today! Following are some new Braindump2go MB-700 Real Exam Questions!

QUESTION 101
Case Study 2 – CoHo Vineyard and Winery
Overview
CoHo Vineyard and Winery is based in the United States. The company has a single vineyard. The company distributes full pallets of wine worldwide and sometimes sells wine by using a private label. Grape growing and wine production operations are owned and operated by a third-party company. The company uses bottles, accessories, and other disposables from a company in China. CoHo Vineyard and Winery opens a second location. This location manages inventory for Wine Club members and includes a wine tasting room. The location also includes a warehouse and distribution center.
The wine club currently has 200 members. The company hopes to increase this number. The company holds events for club members in the tasting room and ships both full cases and individual bottles to members. The tasting room is not treated as a retail store. Sales are to club members only, and purchases must be made on account.
Company structure
The following graphic shows the company structure:

Organization
The current organizational chart and roles are as follows:

Current environment
CoHo Vineyard and Winery currently uses manual processes for most of its operations. The company stores names and phone numbers for club members and prospective club members in a Microsoft Excel workbook. The sales team currently tracks club membership in a separate system. Sales team stock awards are granted on a three-year basis. The solution for the sales team cannot be replaced for three years.
– The types of wine, such as cabernet sauvignon and red blends, are tracked as separate items. Each item has a year associated with it, such as Red Blend 2017, 2018, 2019, etc. The year association on the items is updated annually.
– Order entry volumes at peak times can reach the hundreds. This typically happens when first-shift and second-shift workers overlap, all users are on the system, and orders are being imported.
– Packaging materials, bottles, and accessories are received in the warehouse FOB Destination from the manufacturer in China.
– Each retailer has individual item numbers, barcode placement descriptions, and other additions such as pictures and details of the wine year and flavor notes.
CoHo Vineyard and Winery has outgrown its legacy ERP system and plans to implement Dynamics 365 Finance. CoHo wants to use out-of-the-box tools that are linked to the system where possible and not create new tools.
Licensing and organization
– Only the President/CEO, CFO, COO, Controller and VP Operations users must be able to access finance and warehouse features.
– Warehouse users must have access only to warehouse functions. All other users must have access to finance functions.
– The distribution center must have mobile scanners for the warehouse. The tasting room will not have mobile scanners.
– The inventory cost must be tracked separately by distribution center or tasting room.
Club membership events
– You must ensure that the members created in the existing system are also customers in Dynamics 365 Finance and that the customers do not exist in both systems under multiple record numbers.
– When tasting events are planned, text notifications must be sent to the club members.
– Text notifications must be sent to non-club members to encourage them to join the club.
– During tasting events, CoHo will need access to place orders but will not need cash registers.
Warehouse and shipping
– The system must be able to track inventory throughout the whole warehouse process.
– A Certificate of Origin must accompany each shipment.
– Private label items that are shipped to retailers must be stored and costed as a single item number at CoHo.
– The system must be able to compare year-to-year performance of a single wine type.
Implementation
– Business processes must be documented as step-by-step processes and must align with the process flows in a visual format.
– Data will be migrated and is a requirement for order entry.
– Data migration must be completed prior to testing.
– You must track each step of process validation so that the users have ownership for their individual functional areas. The tracking system used should be integrated with the system where possible.
Issues
– Users are struggling to understand the new system’s processes.
– CoHo is concerned that the go-live may not go smoothly.
You need to create a plan that meets the following requirements:
– Migrate the data to the new system.
– Implement a standardized method for creating items.
– Prevent items from being created in different ways going forward.
Hotspot Question
You need to recommend a tool to meet the solution requirement.
What should you recommend? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.

Answer:


[2025-December-New]Braindump2go MB-330 Exam Dumps PDF Free[Q1-Q71]

2025/December Latest Braindump2go MB-330 Exam Dumps with PDF and VCE Free Updated Today! Following are some new Braindump2go MB-330 Real Exam Questions!

QUESTION 1
A company needs to create new items that can be company owned or vendor owned.
You need to create and set up the items so that they can be used as company owned or consignment.
What should you do?

A. Assign a non-stock service item model group
B. Assign a moving average costing inventory model
C. Activate batch dimension and assign a standard costing inventory model
D. Activate owner dimension and assign a standard costing inventory model

Answer: D
Explanation:
https://docs.microsoft.com/en-us/dynamics365/supply-chain/inventory/set-up-consignment

QUESTION 2
A company uses trade agreements for their customers. Prices for some customers must round to the nearest US dollar.
A customer reports that prices do not round to the nearest US dollar as required.
You need to resolve the issue.
In Trade agreement journals, which option should you use?

A. Adjustment
B. View smart rounding
C. Validate all lines
D. Apply smart rounding

Answer: D
Explanation:
https://technologyblog.rsmus.com/microsoft/use-smart-rounding-microsoft-dynamics-ax-customize-pricing-rules/

QUESTION 3
A company creates several item costing versions.
All new and existing items have costs associated with them. After applying the costs, the company notices the activation date has not been updated.
You need to update the items to the current date for activation.
What should you do?

A. Set the item cost record status to Active
B. Set the from date to today and leave the item cost record status at Pending
C. Set the item cost record status to Pending
D. Set the cost price and date of price on the released product

Answer: A
Explanation:
When a pending cost record is activated, its status changes to Active and its activation date is set to the current date.
https://docs.microsoft.com/en-us/dynamics365/unified-operations/supply-chain/cost-management/costing-versions

QUESTION 4
An employee at a company releases a new product from the Released product maintenance workspace.
An employee in another department is unable to add the product to a sales order. You determine that dimension groups have not been applied to the product.
You need to ensure that the product can be added to the sales order.
Which two inventory dimension groups should you add to the product? Each correct answer presents part of the solution.
NOTE: Each correct selection is worth one point.

A. Tracking dimension group
B. Coverage group
C. Product dimension group
D. Storage dimension group

Answer: AD
Explanation:
The storage dimension group and the tracking dimension group do not have to be associated with a product until after the product has been created; the product dimension group, by contrast, is required as part of the initial product setup. The storage and tracking dimension groups are therefore the two that must still be added before the product can be used on a sales order.
https://docs.microsoft.com/en-us/dynamicsax-2012/appuser-itpro/about-inventory-dimensions-and-dimension-groups

QUESTION 5
An employee at a company needs to lay out the various component builds for bicycles.
You need to identify which constraints the employee should use to set up the bicycles.
Which two types of constraints achieve the goal? Each correct answer presents a complete solution.
NOTE: Each correct selection is worth one point.

A. table constraints that are used generically among product configuration models
B. expression constraints that are used generically among product configuration models
C. expression constraints that are unique to each product configuration model
D. table constraints that are always unique to each product configuration model

Answer: AC
Explanation:
https://docs.microsoft.com/en-us/dynamics365/unified-operations/supply-chain/pim/expression-constraints-table-constraints-product-configuration-models

QUESTION 6
A company manufactures and sells speaker boxes. The speaker boxes can be silver or black with a basic or upgraded wiring harness assembly.
The speaker box must be created in the item master so that the variables for colors and harness type can be assigned at order entry.
You need to create a new item that supports multiple variables.
What should you do?

A. Create a new product. Select predefined variant as the configuration technology.
B. Create a new product master. Select constraint-based configuration as the configuration technology.
C. Create a new product. Select constraint-based configuration as the configuration technology.
D. Create a new product master. Select predefined variant as the configuration technology.

Answer: D

QUESTION 7
A company has items in inventory with two costing methods: FIFO and Standard.
The company needs to calculate the cost of all items at month end and provide a total inventory value to the finance department.
You need to determine the total value of inventory.
Which costing method requires running the inventory close?

A. FIFO and Standard Cost items
B. FIFO items only
C. LIFO, Moving Average, and Date Weighted Average items
D. Standard Cost items only

Answer: B
Explanation:
Moving average and standard cost items do not require an inventory close. FIFO items do: the close settles issue transactions against receipts so that the final inventory value can be calculated.
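
To illustrate why FIFO needs the close, here is a plain-Python sketch (not Dynamics code, illustrative data only) of settling issued quantities against receipt layers in first-in, first-out order to arrive at a remaining inventory value:

# Receipt layers as (quantity, unit_cost), oldest first (illustrative data).
receipts = [(100, 2.00), (100, 2.50), (50, 3.00)]

def fifo_close(receipts, issued_qty):
    """Settle the issued quantity against receipts FIFO; return remaining value."""
    remaining = []
    for qty, cost in receipts:
        consumed = min(qty, issued_qty)
        issued_qty -= consumed
        if qty - consumed > 0:
            remaining.append((qty - consumed, cost))
    return sum(q * c for q, c in remaining)

print(fifo_close(receipts, 150))  # 50 * 2.50 + 50 * 3.00 = 275.0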

QUESTION 8
Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.
After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.
A company is implementing inventory management in Dynamics 365 for Finance and Operations.
The company needs to block inventory and ensure that physical inventory will not be reserved by other outbound transactions.
You need to select the appropriate option to block the inventory in the system.
Solution: Select the full blocking option in the item sampling page.
Does the solution meet the goal?

A. Yes
B. No

Answer: B
Explanation:
The Full blocking option on the item sampling page only determines how much quantity is blocked while a quality order generated from that item sampling is open; on its own, it does not block inventory.

QUESTION 9
Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.
After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.
A company is implementing inventory management in Dynamics 365 for Finance and Operations.
The company needs to block inventory and ensure that physical inventory will not be reserved by other outbound transactions.
You need to select the appropriate option to block the inventory in the system.
Solution: Create a quality order on the quality orders page for the quantity to be blocked.
Does the solution meet the goal?

A. Yes
B. No

Answer: A
Explanation:
When a quality order is created for a quantity of an item, that quantity is blocked for inspection until the quality order is validated, so it cannot be reserved by outbound transactions.

QUESTION 10
Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.
After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.
A company is implementing inventory management in Dynamics 365 for Finance and Operations.
The company needs to block inventory and ensure that physical inventory will not be reserved by other outbound transactions.
You need to select the appropriate option to block the inventory in the system.
Solution: Manually create a transaction on the inventory blocking page.
Does the solution meet the goal?

A. Yes
B. No

Answer: A
Explanation:
Manually creating a transaction on the inventory blocking page is an appropriate way to block physical on-hand inventory. The other blocking options are creating a quality order, using transactions that require a quality order, and assigning a blocking inventory status that makes the inventory unavailable.

QUESTION 11
A company uses Dynamics 365 for Finance and Operations.
An employee notices a discrepancy in inventory.
You need to create the inventory blocking transaction.
What are two possible ways to achieve the goal? Each correct answer presents a complete solution.
NOTE: Each correct selection is worth one point.

A. inventory status
B. quality order
C. batch disposition code
D. manual inventory blocking

Answer: BD
Explanation:
The question asks for ways to create an inventory blocking transaction. Inventory status is a dimension: changing it updates existing inventory transactions rather than creating a new blocking transaction. A quality order and manual inventory blocking both create blocking transactions.

QUESTION 12
A company has revenue items that generate high, medium, or low revenue.
You need to configure ABC classifications as follows:

Which two actions should you perform? Each correct answer presents part of the solution.
NOTE: Each correct selection is worth one point.

A. Define highest, middle, and lowest ABC values as percentages
B. Define highest, middle, and lowest ABC values as amounts
C. Select ABC model of revenue
D. Define internal interest in percentage
E. Select ABC model of value

Answer: AC
Explanation:
Released product details > Manage costs tab > select the ABC classification Revenue.
Inventory management > Periodic > ABC classification: define percentages for A (highest), B (middle), and C (lowest).
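
For illustration, the calculation behind these percentage breaks can be sketched in plain Python (illustrative items and thresholds, not Dynamics code):

# Items as (item_id, annual_value); thresholds are illustrative percentages.
items = [("P1", 70_000), ("P2", 15_000), ("P3", 9_000), ("P4", 6_000)]

def abc_classify(items, a_pct=70, b_pct=20):
    """Assign A/B/C by cumulative share of total value, highest first."""
    total = sum(v for _, v in items)
    result, cumulative = {}, 0.0
    for item_id, value in sorted(items, key=lambda x: -x[1]):
        cumulative += 100 * value / total
        if cumulative <= a_pct:
            result[item_id] = "A"
        elif cumulative <= a_pct + b_pct:
            result[item_id] = "B"
        else:
            result[item_id] = "C"
    return result

print(abc_classify(items))  # {'P1': 'A', 'P2': 'B', 'P3': 'C', 'P4': 'C'}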

QUESTION 13
You are the materials manager at a distribution company.
You are responsible for setting up the ABC classification of all items as follows:
– Class A materials represent 70 percent of the material value.
– Class B materials represent 20 percent of the material value.
– Class C materials represent 10 percent of the material value but are the most commonly used.
You need to assign an ABC classification value model to all items using those values.
What should you do?

A. Run the ABC classification report
B. Run the ABC classification periodic task to update the value model for all items
C. Manually update the Value classification on the Released product record
D. Run the ABC classification periodic task to update the revenue model for all items

Answer: B

QUESTION 14
A company employee is in charge of warehouse operations and controlling inventory adjustments through journals.
The employee needs to add inventory for samples at a specific cost. The samples were shipped by a vendor without a purchase order. The employee needs to be sure that the inventory value goes to a ledger account so that the value of the samples is not mixed in with another inventory value.
You need to ensure that the employee is able to correctly add the inventory.
What should you do?

A. Create a movement journal, add the cost, and specify the offset ledger account on the line.
B. Create an adjustment journal, add the cost, and specify the offset ledger account on the line.
C. Create an arrival journal, add the cost, and specify the offset ledger account on the line.
D. Create a transfer journal, transfer to a different warehouse, and then adjust the cost.

Answer: A
Explanation:
https://docs.microsoft.com/en-us/dynamics365/unified-operations/supply-chain/inventory/inventory-journals

QUESTION 15
A company is implementing sales order functionality in Dynamics 365 for Finance and Operations.
The company has a business requirement to fulfill sales orders by using direct delivery.
You need to enter a direct delivery sales order so that a purchase order is automatically created.
What should you do after you enter the sales order and lines?

A. Set the ship complete toggle to On and confirm the order on the sales order header.
B. Select automatic and confirm the sales order on the line level setup tab in the reservation field.
C. Change the customer’s address to the vendor’s direct delivery address and confirm the sales order.
D. Select the direct delivery option under the sales order action pane and complete the form.

Answer: D

QUESTION 16
Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.
After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.
A vendor is offering a rebate program on bottles of wine that have purchase orders placed within a month. There is a $5.00 rebate on the purchase of 10-100 bottles and a $6.00 rebate for the purchase of 101-200 bottles. Customers can purchase wine by the bottle or by the case. Discounts apply to all varieties of wine sold by the vendor.
You need to create a vendor rebate agreement to ensure that the correct rebate amount is claimed at the end of the month.
Solution: On the rebate agreement, specify each item group assigned to wine. Add a rebate line break of quantity 10-100 and a second rebate line break of quantity 101-200.
Does the solution meet the goal?

A. Yes
B. No

Answer: B

QUESTION 17
Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.
After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.
A vendor is offering a rebate program on bottles of wine that have purchase orders placed within a month. There is a $5.00 rebate on the purchase of 10-100 bottles and a $6.00 rebate for the purchase of 101-200 bottles. Customers can purchase wine by the bottle or by the case. Discounts apply to all varieties of wine sold by the vendor.
You need to create a vendor rebate agreement to ensure that the correct rebate amount is claimed at the end of the month.
Solution: On the rebate agreement, set the calculation date type field to created.
Does the solution meet the goal?

A. Yes
B. No

Answer: A
Explanation:
The scenario states that purchase orders are placed within a month, and the Created calculation date type uses the creation date of the purchase order.
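
A plain-Python sketch of how the two line breaks in this scenario resolve to a rebate amount, assuming for illustration that the rebate is an amount per bottle (illustrative of the agreement logic only):

# Line breaks from the scenario: (from_qty, to_qty, rebate_amount).
# Assumption for illustration: the rebate is an amount per bottle.
BREAKS = [(10, 100, 5.00), (101, 200, 6.00)]

def rebate_for(qty: int) -> float:
    """Return the rebate claimed for a monthly purchase quantity."""
    for from_qty, to_qty, amount in BREAKS:
        if from_qty <= qty <= to_qty:
            return qty * amount
    return 0.0  # below the first break or above the last

print(rebate_for(150))  # 150 bottles -> 900.0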

QUESTION 18
Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.
After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.
A vendor is offering a rebate program on bottles of wine that have purchase orders placed within a month. There is a $5.00 rebate on the purchase of 10-100 bottles and a $6.00 rebate for the purchase of 101-200 bottles. Customers can purchase wine by the bottle or by the case. Discounts apply to all varieties of wine sold by the vendor.
You need to create a vendor rebate agreement to ensure that the correct rebate amount is claimed at the end of the month.
Solution: On the rebate agreement, set the start date to be the first of the month. Set the expiry date to be 30 days.
Does the solution meet the goal?

A. Yes
B. No

Answer: B
Explanation:
The expiry date on a rebate agreement is a specific date; you cannot enter it as a number of days.

QUESTION 19
You configure purchasing policies and oversee purchasing processes for a company.
Users often submit requisitions with incorrect information. Users also select non-approved vendors or incorrect categories.
You need to set up a procurement policy that limits which procurement categories and vendors can be selected.
Which two policy rules should you configure? Each correct answer presents part of the solution.
NOTE: Each correct selection is worth one point.

A. Category access policy rule
B. Purchase requisition control rule
C. Catalog policy rule
D. Category policy rule

Answer: AD
Explanation:
– Category policy rule
The category policy rule defines how users can select vendors for each category. It also defines requirements for the receiving and invoicing processes.
– Category access policy rule
The category access policy rule determines which categories users have access to when they create purchase requisitions. If no rule is specified, all procurement categories can be added to the purchase requisition.
Select the Include parent rule option to apply the category access policy rule of the parent organization to the category.
In the Available categories pane, select the categories that the rule applies to. When you select a category, all categories that are higher in the hierarchy are also added to the Selected categories list.
Select the Include subcategories option to apply the rule to all subcategories of the selected category.

QUESTION 20
A buyer places a purchase requisition for item C0001 from a new vendor.
All purchases from a new vendor must go through an internal workflow approval process.
You need to ensure that a purchase order (PO) is automatically created from the purchase requisition.
Which setup must be in place?

A. Status = approved, item = C0001, vendor populated on the PO
B. Status = in review, item = C0001, vendor populated on the purchase requisition
C. Status = approved, item = C0001, vendor populated on the purchase requisition
D. Status = draft, item = C0001, vendor populated on the PO

Answer: C
Explanation:
A purchase order can be created automatically from a purchase requisition only when the requisition has completed workflow approval (Status = Approved) and the vendor is populated on the purchase requisition line.

QUESTION 21
A company uses Dynamics 365 for Finance and Operations.
A customer returns a product that is defective for a replacement.
You need to process the return order.
Which three actions should you perform? Each correct answer presents part of the solution.
NOTE: Each correct selection is worth one point.

A. Create a return of type Credit Only
B. Set the deadline date according to the company policy
C. Create a credit note for the replaced product
D. Set the delivery address to the customer’s address in the return order
E. Create a return of type Physical Return
F. Set the deadline date to the date the customer returns the defective product
G. Set the delivery address to the company warehouse in the return order

Answer: BEG
Explanation:
Notes:
1. The product is returned and replaced, so the return is of type Physical return.
2. Delivery address: by default, the organization's address is used. If a specific warehouse is selected on the header, the delivery address is changed to the delivery address of the warehouse.
3. Deadline: the default value is calculated as the current date plus the period of validity. The period of validity is set on the Accounts receivable parameters page.
https://docs.microsoft.com/en-us/dynamics365/supply-chain/warehousing/sales-returns

QUESTION 22
A client wants to use Dynamics 365 for Finance and Operations to assist with processing trade. You need to ensure that intercompany sales order payments process correctly when intercompany payable journals are posted.
What should you do?

A. In the intercompany trade parameters for sales order policies, select Post journal automatically
B. In the intercompany trade parameters for purchase order policies, select Post invoice automatically
C. In the intercompany trade parameters for purchase order policies, select Post journal automatically
D. In the intercompany trade parameters for sales order policies, select Allow summary update of documents for original customer

Answer: A
Explanation:
https://docs.microsoft.com/en-us/dynamicsax-2012/appuser-itpro/register-payments-automatically-for-intercompany-customer-invoices

QUESTION 23
You are the customer relations manager at a wholesale company.
You perform promotion planning and must track fund usage.
You need to set up a trade allowance agreement to register and track promotion contracts.
Which two items should you set up prior to creating the agreement? Each correct answer presents part of the solution.
NOTE: Each correct selection is worth one point.

A. Sales category hierarchy
B. Opportunity reasons
C. Customer category hierarchy
D. Trade allowance funds

Answer: CD
Explanation:
https://docs.microsoft.com/en-us/dynamics365/unified-operations/supply-chain/sales-marketing/trade-allowance

QUESTION 24
A company uses Dynamics 365 for Finance and Operations and implements procurement categories.
Purchase requisitions are required for the purchase of procurement category goods.
You need to ensure that the company purchases office supplies only from one specific vendor.
Which two actions should you perform? Each correct answer presents part of the solution.
NOTE: Each correct selection is worth one point.

A. Add the preferred vendor to the office supplies procurement category setup
B. Create a preferred trade agreement for the office supplies vendor
C. In purchasing policies, configure a specific category policy for office supplies
D. In purchasing policies, create a Purchase requisition control rule
E. Configure the purchase requisition workflow to specify the office supplies vendor

Answer: AC
Explanation:
Create a category policy rule that defines the category and sets up conditions for which vendors are available for that category.

QUESTION 25
A company uses the warehouse mobile app for Dynamics 365 for Finance and Operations.
You must create a menu item for reprinting license plate labels. Reprinting a license plate label must not create warehouse work.
You need to configure the warehouse mobile app to add the new menu item.
What should you do?

A. Set the Mode to Work
B. Set the Mode to Indirect
C. Set the Activity code to Cancel work
D. Set the Activity code to None

Answer: B
Explanation:
https://docs.microsoft.com/en-us/dynamics365/unified-operations/supply-chain/warehousing/configure-mobile-devices-warehouse

QUESTION 26
A company plans to implement Dynamics 365 for Finance and Operations shipping manifests.
The company wants to use a multiple-level manifest process.
You need to ensure that the system is configured for multiple-level manifest processing.
What should you validate?

A. All container groups are manifested before the shipment is manifested
B. All containers are of the status open before the group is manifested
C. The allow split picks configuration is enabled
D. All container types are set up with all four attributes

Answer: A
Explanation:
If you use a multiple-level manifest procedure, the following requirements apply:
– All containers must be manifested before the container group is manifested.
– All container groups must be manifested before the shipment is manifested.
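
A plain-Python sketch of this validation order (illustrative data model, not the actual Dynamics tables): containers first, then container groups, then the shipment.

# Illustrative model: a shipment holds groups, each group holds containers.
shipment = {
    "groups": [
        {"manifested": True,  "containers": [{"manifested": True}]},
        {"manifested": False, "containers": [{"manifested": True}]},
    ]
}

def can_manifest_shipment(shipment) -> bool:
    """The shipment may be manifested only after every group is manifested,
    and each group only after every one of its containers is."""
    for group in shipment["groups"]:
        if not all(c["manifested"] for c in group["containers"]):
            return False  # a container is still open
        if not group["manifested"]:
            return False  # a group is still open
    return True

print(can_manifest_shipment(shipment))  # False: second group not manifested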

QUESTION 27
You are implementing containerization functionality.
You must automate containerization so that containers and picking work for shipments are created when a wave is processed. The work lines will be split into quantities to fit required containers by size.
You need to set up a container build template that defines the containerization process.
Which three items should you set up before you create the container build template? Each correct answer presents part of the solution.
NOTE: Each correct selection is worth one point.

A. a wave template that includes the containerize method
B. container packing policies
C. a container group
D. container types
E. container packing strategies

Answer: ACD
Explanation:
https://docs.microsoft.com/en-us/dynamics365/unified-operations/supply-chain/warehousing/tasks/set-up-containerization

QUESTION 28
You are the inventory manager at a large distribution company.
You notice item P0001 has been running out regularly and the on-hand count seems to differ from what is in Dynamics 365 for Finance and Operations. You want cycle count work to be automatically created when the quantity drops below 10 pieces, which is about once a week.
You need to appropriately configure warehouse management to generate cycle count work.
What should you do?

A. Create a cycle count plan for item P0001 to run when the quantity is below 10.
B. Create a cycle count threshold that is percentage based that will generate work when inventory drops below 10% for item P0001.
C. Create a cycle count threshold that is quantity based and specify 10 for the quantity. Add P0001 as a selected item.
D. Create a cycle count plan for item P0001. Generate a batch job that runs once a week.

Answer: C
Explanation:
https://docs.microsoft.com/en-us/dynamics365/unified-operations/supply-chain/warehousing/cycle-counting
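
As a plain-Python sketch of the threshold trigger described here (illustrative only; the quantity check is what the configured threshold evaluates when on-hand inventory changes):

THRESHOLD_QTY = 10  # cycle count threshold configured for item P0001

def should_create_cycle_count(item_id: str, on_hand_qty: int) -> bool:
    """Create cycle counting work when on-hand drops below the threshold."""
    return item_id == "P0001" and on_hand_qty < THRESHOLD_QTY

print(should_create_cycle_count("P0001", 8))   # True: work is generated
print(should_create_cycle_count("P0001", 25))  # False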

QUESTION 29
You are implementing warehousing in Dynamics 365 for Finance and Operations.
You configure and approve one warehouse.
You need to use the established warehouse setup to create additional warehouses.
What should you use?

A. warehouse management parameter setup only
B. warehouse work template
C. warehouse configuration template
D. inventory and warehouse management parameter setup

Answer: C
Explanation:
https://docs.microsoft.com/en-us/dynamics365/unified-operations/supply-chain/inventory/warehouse-template

QUESTION 30
You are the logistics manager at a distribution company.
Your primary carrier service provides rates for transportation between New York City and Colorado. These rates are a flat rate depending on the city or general area of pickup as follows:
– New York City = $500
– Colorado = $450
You need to set up Transportation Management to calculate the rate from New York City to Colorado.
What should you do?

A. Use a Point-to-Point engine based on weight and miles. Assign rates from New York City as the starting location and Colorado as the ending location and break the rates out based on the weight of the package.
B. Set up zones in the Zone Master for New York City and Colorado. Assign rates to each zone in the Zone Master by starting and ending location.
C. Create hubs for both locations. Add a route plan from New York City to Colorado and assign the two charges as spot rates.
D. Set up a Transit Time Engine to track days from New York City to Colorado. Set up rates in the Rate Master tied to day breaks.

Answer: C

QUESTION 31
A company operates a chain of retail coffee shops and a distribution center. Each coffee shop and the distribution center are distinct warehouses.
Cups and lids are replenished from a single distribution center.
You need to configure store replenishment for coffee cup lids.
Which three actions should you perform? Each correct answer presents part of the solution.
NOTE: Each correct selection is worth one point.

A. Set the coverage plan by dimension for warehouse on the Storage dimension group.
B. Set the minimum item coverage to the safety stock quantity. Set the maximum to the maximum quantity of stock to reorder above the safety stock quantity.
C. Set the minimum item coverage to the safety stock quantity. Set the maximum item coverage to 0.
D. Create a coverage group assignment for the item.
E. Set the minimum item coverage to 0. Set the maximum item coverage to the number of lids to keep on hand.

Answer: ABD

QUESTION 32
A company has several warehouse locations. The company acquires a new warehouse.
You must design a new warehouse process workflow for the new warehouse.
You need to configure the workflow.
Which three features should you configure? Each correct answer presents part of the solution.
NOTE: Each correct selection is worth one point.

A. work pools
B. cluster picking
C. outbound wave processing
D. work templates
E. wave templates

Answer: ADE
Explanation:
You must configure components for inbound and outbound warehouse process workflows according to business requirements. The most important components that you must configure are wave templates, work templates, work pools, and location directives.
https://docs.microsoft.com/en-us/dynamics365/supply-chain/warehousing/warehouse-configuration

QUESTION 33
You need to export balances from Microsoft Dynamics 365 for Finance and Operations to an external system. There is a “Use Consolidation Account” Yes/No parameter in the criteria.
In which two circumstances should you choose “Yes” for this parameter? Each correct answer presents a complete solution.

A. You want to export some balances into a different account than the main account of the balances in the subsidiary company.
B. You want to export all balances into a different account than the main account of the balances in the subsidiary company.
C. You want to export all balances into the same account as the main account of the balances in the subsidiary company.
D. You want to export some balances into the same account as the main account of the balances in the subsidiary company.

Answer: AB
Explanation:
https://technet.microsoft.com/en-us/library/aa618539.aspx

QUESTION 34
The controller of your company has received notice from the taxing agency of another state that the sales tax rate on the company’s products sold in that state will increase by 0.5% starting on January 1st and continuing in perpetuity.
You must make all changes necessary in Microsoft Dynamics 365 for Finance and Operations to make these changes take effect on January 1st.
Which object in Microsoft Dynamics 365 for Finance and Operations should be modified to make this change?

A. sales tax code
B. Ledger posting group
C. Sales tax authority
D. Settlement period

Answer: A

QUESTION 35
Drag and Drop Question
You manage a Dynamics 365 for Finance and Operations system for a company.
You need to configure agreements in the system.
Which agreement types should you use? To answer, drag the appropriate agreement types to the appropriate scenarios. Each agreement type may be used once, more than once, or not at all. You may need to drag the split bar between panes or scroll to view content.
NOTE: Each correct selection is worth one point.

Answer:

Explanation:
https://docs.microsoft.com/en-us/learn/modules/configure-use-agreements-dyn365-supply-chain-mgmt/13-summary

QUESTION 36
Drag and Drop Question
A company manufactures wood furniture.
Cabinets can be purchased with different wood finishes including oak and maple.
You need to configure a product attribute to characterize the types of cabinet finishes.
Which three actions should you perform in sequence? To answer, move the appropriate actions from the list of actions to the answer area and arrange them in the correct order.

Answer:

Explanation:
https://docs.microsoft.com/en-us/dynamics365/unified-operations/retail/attribute-attributegroups-lifecycle

QUESTION 37
Drag and Drop Question
You are the product manager at a distribution company. You are responsible for managing product compliance standards and reporting.
Chemical product, C0001 can be sold in all parts of the United States except for the state of California.
You need to set up these compliance requirements for C0001.
Which four actions should you perform in sequence? To answer, move the appropriate actions from the list of actions to the answer area and arrange them in the correct order.

Answer:

QUESTION 38
Hotspot Question
You are configuring pricing for a new item.
Wholesale customers must pay $10.00 for order quantities of up to 9 units. All other customers receive a static price of $14.00 regardless of quantity.
You need to configure sales trade agreements.
In Trade Agreement Setup, which actions should you perform? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.

Answer:

Explanation:
https://financefunction.tech/2018/11/14/sales-prices-in-dynamics-365-for-finance-and-operations/#sales_price_in_trade_agreements
https://docs.microsoft.com/en-us/dynamics365/unified-operations/supply-chain/sales-marketing/tasks/create-new-trade-agreement
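
As an illustration of how the two agreement lines described in this scenario would resolve at order entry (plain Python, illustrative lookup only; Dynamics applies its own trade agreement search precedence, and the assumption here is that wholesale quantities above the break fall back to the static price):

# Illustrative agreement lines: (customer_group, max_qty, unit_price).
# None for customer_group or max_qty means "applies to all".
AGREEMENT_LINES = [
    ("Wholesale", 9, 10.00),   # wholesale customers, up to 9 units
    (None, None, 14.00),       # all other customers, any quantity
]

def resolve_price(customer_group: str, qty: int) -> float:
    """Return the first agreement line matching the customer and quantity."""
    for group, max_qty, price in AGREEMENT_LINES:
        if group not in (None, customer_group):
            continue
        if max_qty is not None and qty > max_qty:
            continue
        return price
    raise ValueError("no price found")

print(resolve_price("Wholesale", 5))   # 10.0
print(resolve_price("Retail", 5))      # 14.0
print(resolve_price("Wholesale", 20))  # 14.0 (falls back to the static price)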

QUESTION 39
Hotspot Question
An airport uses Dynamics 365 for Finance and Operations. You purchase new baggage-sorting hardware.
You must add both the hardware and the service contract for the hardware to the product hierarchy.
You need to configure the category node.
What should you do? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.

Answer:

Explanation:
https://docs.microsoft.com/en-us/dynamicsax-2012/appuser-itpro/key-tasks-set-up-a-category-hierarchy

QUESTION 40
Hotspot Question
Inventory in a warehouse is assigned to an inventory status of available.
You need to set up an inventory status for damaged items so that they are not sold to customers.
Which values should you use? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.

Answer:

Explanation:
https://docs.microsoft.com/en-us/dynamicsax-2012/appuser-itpro/set-up-an-inventory-status#optional-set-up-a-default-inventory-status

QUESTION 41
Hotspot Question
A company sells a new product line. Buyers purchase a large shipment into the distribution center.
The product must be divided among the retail stores equally.
You need to configure buyer push functionality.
Which configuration options should you use? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.

Answer:

QUESTION 60
Drag and Drop Question
A manufacturing company is setting up a new warehouse.
The warehouse must store a product that is currently stored in another warehouse.
You need to create new item coverage for the warehouse.
Which four actions should you perform in sequence? To answer, move the appropriate actions from the list of actions to the answer area and arrange them in the correct order.

Answer:

QUESTION 61
Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.
After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.
You are the purchasing manager at a manufacturing company that makes audio equipment.
You sign an agreement with a vendor to purchase 5,000 speaker cables, item C0001, at a discounted rate of $3.00 per cable. This agreement expires in exactly one year.
You need to set up pricing information and track the fulfillment of the agreement.
Solution:
– Create a purchase agreement of type Product value commitment.
– Add a line for item C0001.
– Enter a product value of $15,000 and enter an expiration date of one year.
Does the solution meet the goal?

A. Yes
B. No

Answer: A
Explanation:
https://docs.microsoft.com/en-us/dynamics365/supply-chain/procurement/purchase-agreements
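
The commitment value follows from the scenario arithmetic: 5,000 cables × $3.00 = $15,000. A plain-Python sketch of tracking fulfillment against such a product value commitment (illustrative only, with hypothetical release orders):

COMMITTED_VALUE = 5_000 * 3.00  # product value commitment: 15,000.00

def remaining_commitment(released_orders) -> float:
    """Subtract the value of release orders from the committed value."""
    used = sum(qty * price for qty, price in released_orders)
    return COMMITTED_VALUE - used

# Two hypothetical release orders placed against the agreement.
print(remaining_commitment([(1_000, 3.00), (500, 3.00)]))  # 10500.0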

QUESTION 62
Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.
After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.
You are the purchasing manager at a manufacturing company that makes audio equipment.
You sign an agreement with a vendor to purchase 5,000 speaker cables, item C0001, at a discounted rate of $3.00 per cable. This agreement expires in exactly one year.
You need to set up pricing information and track the fulfillment of the agreement.
Solution: On the released product, set a price of $3.00. Add the vendor to the vendor account field on the Purchase fast tab.
Does the solution meet the goal?

A. Yes
B. No

Answer: B
Explanation:
https://docs.microsoft.com/en-us/dynamics365/supply-chain/procurement/purchase-agreements

QUESTION 63
Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.
After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.
You are the purchasing manager at a manufacturing company that makes audio equipment.
You sign an agreement with a vendor to purchase 5,000 speaker cables, item C0001, at a discounted rate of $3.00 per cable. This agreement expires in exactly one year.
You need to set up pricing information and track the fulfillment of the agreement.
Solution: Create a purchase agreement for the vendor that specifies a product quantity commitment. Include the quantity, the price, and the expiration date.
Does the solution meet the goal?

A. Yes
B. No

Answer: A
Explanation:
https://docs.microsoft.com/en-us/dynamics365/supply-chain/procurement/purchase-agreements

QUESTION 64
A distribution company wants to set up barcodes in their Dynamics 365 Supply Chain Management system for warehouse scanning.
Barcodes will be entered manually.
You need to minimize the risk of errors on barcode entry.
What should you do?

A. Create a new barcode, enter the value, select the type, and enter the mask.
B. Select an item, select the barcode type, select max length, and enter the value.
C. Select an item and set up security on the barcode field.
D. Create a new barcode, enter the value, select the type, and enter the size and max length.

Answer: D
Explanation:
https://docs.microsoft.com/en-us/dynamics365/supply-chain/inventory/tasks/maintain-barcode-types
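
As a rough illustration of why configuring the size and max length reduces manual-entry errors, here is a plain-Python sketch of the kind of checks such a setup enables (illustrative, not the Dynamics implementation):

def validate_barcode(value: str, size: int, max_length: int) -> bool:
    """Reject manual entries that don't match the configured barcode setup."""
    if len(value) > max_length:
        return False          # longer than the configured maximum
    if len(value) != size:
        return False          # wrong fixed size for this barcode type
    return value.isdigit()    # e.g. EAN/UPC barcodes are numeric

print(validate_barcode("1234567890123", size=13, max_length=13))  # True
print(validate_barcode("12345", size=13, max_length=13))          # False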

QUESTION 65
Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.
After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.
A company plans to simplify interactions between purchasing department employees and vendors.
You need to ensure that employees are redirected to a vendor’s online store to select items for inclusion on purchase requisitions.
Solution: Create a retail product catalog.
Does the solution meet the goal?

A. Yes
B. No

Answer: B
Explanation:
https://docs.microsoft.com/en-us/dynamics365/supply-chain/procurement/set-up-external-catalog-for-punchout

QUESTION 66
Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.
After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.
A company plans to simplify interactions between purchasing department employees and vendors.
You need to ensure that employees are redirected to a vendor’s online store to select items for inclusion on purchase requisitions.
Solution: Create a procurement catalog.
Does the solution meet the goal?

A. Yes
B. No

Answer: B
Explanation:
https://docs.microsoft.com/en-us/dynamics365/supply-chain/procurement/set-up-external-catalog-for-punchout

QUESTION 67
Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.
After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.
A company plans to simplify interactions between purchasing department employees and vendors.
You need to ensure that employees are redirected to a vendor’s online store to select items for inclusion on purchase requisitions.
Solution: Create a vendor catalog.
Does the solution meet the goal?

A. Yes
B. No

Answer: B
Explanation:
A vendor catalog is maintained internally. Redirecting employees to a vendor’s online store requires an external catalog (punchout) instead.
https://docs.microsoft.com/en-us/dynamics365/supply-chain/procurement/set-up-external-catalog-for-punchout

QUESTION 68
A company uses Dynamics 365 Finance.
The finance department processes royalty claims using the accounts payable module.
You need to pass the claims to the accounts payable group for payment.
Which three events will occur? Each correct answer presents part of the solution.
NOTE: Each correct selection is worth one point.

A. A Royalty accrual journal posting reverses the previous interim postings for accrual and expense amounts.
B. A credit is posted to the vendor’s payable account.
C. A vendor invoice for the royalty payment is set to draft.
D. A new vendor invoice for the royalty is created and posted.
E. A hold is put on the amounts held in the royalty fees account.

Answer: ABD
Explanation:
https://docs.microsoft.com/en-us/dynamics365/finance/accounts-payable/royalty-contract

QUESTION 69
A company manufactures and sells custom bicycles. Customers can customize some components to create a custom bicycle.
You need to configure sales orders to support the customization allowed for custom bicycle orders.
What are two possible ways to achieve this goal? Each correct answer presents a complete solution.
NOTE: Each correct selection is worth one point.

A. Create a sales order for the bicycle and make configuration selections on the order for the upgraded seat and handlebars.
The production order will be automatically generated with the correct seat and handlebars.
B. Create a sales order for the bicycle and add separate line items for the upgraded seat and handlebars.
C. Create a sales order for the bicycle. Modify the production order after it has been reported as finished to delete the standard seat and handlebars and add the upgraded seat and handlebars.
D. Configure the product to allow for the seat and handlebars selection to be defined at order creation, automatically adding an upcharge to the sales price.

Answer: AD

QUESTION 70
A company manufactures and sells surround-sound audio systems. A third-party company manufactures the stereo receivers as part of the Bill of materials (BOM) for complete sound systems.
You need to automatically create a purchase order for the stereo receiver from the production order for a sound system.
Which three actions should you perform? Each correct answer presents part of the solution.
NOTE: Each correct selection is worth one point.

A. Assign the Vendor account to the service item BOM line.
B. Set service item BOM line type to Vendor.
C. Set the service item BOM line type to Pegged supply.
D. Link the service items to the vendor on the costing sheet.
E. Attach the service item to the parent item as a BOM line.

Answer: ABE

QUESTION 71
A company plans to implement Dynamics 365 Supply Chain Management shipping manifests.
The company wants to use a multiple-level manifest process.
You need to ensure that the system is configured for multiple-level manifest processing.
What should you validate?

A. All containers are manifested after the container group is manifested.
B. All containers are manifested before the container group is manifested.
C. The allow split picks configuration is enabled.
D. All container types are set up with all four attributes.

Answer: B
Explanation:
https://cloudblogs.microsoft.com/dynamics365/no-audience/2016/12/01/improved-packing-functionality-dynamics-365-for-operations-1611/


Resources From:

1.2025 Latest Braindump2go MB-330 Exam Dumps (PDF & VCE) Free Share:
https://www.braindump2go.com/mb-330.html

2.2025 Latest Braindump2go MB-330 PDF and MB-330 VCE Dumps Free Share:
https://drive.google.com/drive/folders/1eLBlYFhqB3PPLirZ7WrsRUyeRxnhTpW4?usp=sharing

3.2025 Free Braindump2go MB-330 Exam Questions Download:
https://www.braindump2go.com/free-online-pdf/MB-330-VCE-Dumps(1-180).pdf

Free Resources from Braindump2go. We are devoted to helping you 100% pass all exams!

[2025-December-New]Braindump2go MB-280 VCE Questions Free[Q22-Q70]

2025/December Latest Braindump2go MB-280 Exam Dumps with PDF and VCE Free Updated Today! Following are some new Braindump2go MB-280 Real Exam Questions!

QUESTION 22
A company created a new table named Locations.
The sales team needs your help to make the Locations table visible in the Sales Hub.
What should you do?

A. Create a Location Sub Area.
B. Add Location as an Area.
C. Create a Location Group.
D. Add Location to the App Designer.

Answer: D
Explanation:
To make a new table, such as Locations, visible in the Sales Hub, you need to modify the app using the App Designer in Dynamics 365. By adding the Locations table to the Sales Hub via the App Designer, you ensure that users in the Sales Hub can access and interact with the Locations data directly within the application.

QUESTION 23
A battery manufacturer wants to sell their batteries in boxes of 12 and cases of 24 boxes.
You need to set up a unit group so that the manufacturer can sell different quantities.
What should you create first?

A. primary unit
B. related unit
C. base unit


[2025-December-New]Braindump2go MB-230 PDF Free Updated[Q130-Q212]

2025/December Latest Braindump2go MB-230 Exam Dumps with PDF and VCE Free Updated Today! Following are some new Braindump2go MB-230 Real Exam Questions!

QUESTION 130
You are a system administrator for Dynamics 365 for Customer Service.
All child cases must inherit the product, customer name, case title, and case type from the parent case. Parent cases must not be closed until all child cases are closed.
You need to configure cases.
What should you do?

A. Validate that customer and case title fields have not been removed as fields that child cases inherit from parent cases.
Add product and case-type fields to the list.
Set the closure preference setting to Don’t allow parent case closure until all child cases are closed.
B. On the case entity, update the Parent case-Child case 1:N relationship field mapping to include the fields.
Create a business rule on the case entity to prevent the parent from closing if it has one or more open child cases.
C. Create a business rule.
D. Validate that customer and case title fields have not been removed as fields that child cases inherit from the parent cases.
Add product and case-type fields to the list.
The closure preference setting does not need to be changed.
This is default behavior.

Answer: A
Explanation:
https://docs.microsoft.com/en-us/dynamics365/customer-service/define-settings-parent-child-cases

QUESTION 131
A company uses Dynamics 365 Customer Service.
You are configuring the advanced similarity rules. You create a similarity rule on cases and put an exact match for the Modified On field in the Match Fields tab.
You test the rule and discover that exact matches do not appear.
You need to determine why the rule is not working.
What are two possible reasons why the rule is not working? Each correct answer presents a complete solution.
NOTE: Each correct selection is worth one point.

A. A Power Automate flow was not created.
B. The similarity rule is deactivated.
C. The security role is not set to run the similarity rule.
D. The similarity rule was not published.
E. The Modified On field is not set to searchable in the customization of the case entity in the solution.

» Read more

[2025-November-New]Braindump2go DP-700 Dumps with PDF and VCE Free[Q1-Q60]

2025/November Latest Braindump2go DP-700 Exam Dumps with PDF and VCE Free Updated Today! Following are some new Braindump2go DP-700 Real Exam Questions!

QUESTION 1
Case Study 1 – Contoso, Ltd
Overview. Company Overview
Contoso, Ltd. is an online retail company that wants to modernize its analytics platform by moving to Fabric. The company plans to begin using Fabric for marketing analytics.
Overview. IT Structure
The company’s IT department has a team of data analysts and a team of data engineers that use analytics systems.
The data engineers perform the ingestion, transformation, and loading of data. They prefer to use Python or SQL to transform the data.
The data analysts query data and create semantic models and reports. They are qualified to write queries in Power Query and T-SQL.
Existing Environment. Fabric
Contoso has an F64 capacity named Cap1. All Fabric users are allowed to create items.
Contoso has two workspaces named WorkspaceA and WorkspaceB that currently use Pro license mode.
Existing Environment. Source Systems
Contoso has a point of sale (POS) system named POS1 that uses an instance of SQL Server on Azure Virtual Machines in the same Microsoft Entra tenant as Fabric. The host virtual machine is on a private virtual network that has public access blocked. POS1 contains all the sales transactions that were processed on the company’s website.
The company has a software as a service (SaaS) online marketing app named MAR1. MAR1 has seven entities. The entities contain data that relates to email open rates and interaction rates, as well as website interactions. The data can be exported from MAR1 by calling REST APIs. Each entity has a different endpoint.
Contoso has been using MAR1 for one year. Data from prior years is stored in Parquet files in an Amazon Simple Storage Service (Amazon S3) bucket. There are 12 files that range in size from 300 MB to 900 MB and relate to email interactions.
Existing Environment. Product Data
POS1 contains a product list and related data. The data comes from the following three tables:
– Products
– ProductCategories
– ProductSubcategories
In the data, products are related to product subcategories, and subcategories are related to product categories.
Existing Environment. Azure
Contoso has a Microsoft Entra tenant that has the following mail-enabled security groups:
– DataAnalysts: Contains the data analysts
– DataEngineers: Contains the data engineers
Contoso has an Azure subscription.
The company has an existing Azure DevOps organization and creates a new project for repositories that relate to Fabric.
Existing Environment. User Problems
The VP of marketing at Contoso requires analysis on the effectiveness of different types of email content. It typically takes a week to manually compile and analyze the data. Contoso wants to reduce the time to less than one day by using Fabric.
The data engineering team has successfully exported data from MAR1. The team experiences transient connectivity errors, which cause the data exports to fail.
Requirements. Planned Changes
Contoso plans to create the following two lakehouses:
– Lakehouse1: Will store both raw and cleansed data from the sources
– Lakehouse2: Will serve data in a dimensional model to users for analytical queries
Additional items will be added to facilitate data ingestion and transformation.
Contoso plans to use Azure Repos for source control in Fabric.
Requirements. Technical Requirements
The new lakehouses must follow a medallion architecture by using the following three layers: bronze, silver, and gold. There will be extensive data cleansing required to populate the MAR1 data in the silver layer, including deduplication, the handling of missing values, and the standardizing of capitalization.
Each layer must be fully populated before moving on to the next layer. If any step in populating the lakehouses fails, an email must be sent to the data engineers.
Data imports must run simultaneously, when possible.
The use of email data from the Amazon S3 bucket must meet the following requirements:
– Minimize egress costs associated with cross-cloud data access.
– Prevent saving a copy of the raw data in the lakehouses.
Items that relate to data ingestion must meet the following requirements:
– The items must be source controlled alongside other workspace items.
– Ingested data must land in the bronze layer of Lakehouse1 in the Delta format.
– No changes other than changes to the file formats must be implemented before the data lands in the bronze layer.
– Development effort must be minimized and a built-in connection must be used to import the source data.
– In the event of a connectivity error, the ingestion processes must attempt the connection again.
Lakehouses, data pipelines, and notebooks must be stored in WorkspaceA. Semantic models, reports, and dataflows must be stored in WorkspaceB.
Once a week, old files that are no longer referenced by a Delta table log must be removed.
Requirements. Data Transformation
In the POS1 product data, ProductID values are unique. The product dimension in the gold layer must include only active products from the product list. Active products are identified by an IsActive value of 1.
Some product categories and subcategories are NOT assigned to any product. They are NOT analytically relevant and must be omitted from the product dimension in the gold layer.
Requirements. Data Security
Security in Fabric must meet the following requirements:
– The data engineers must have read and write access to all the lakehouses, including the underlying files.
– The data analysts must only have read access to the Delta tables in the gold layer.
– The data analysts must NOT have access to the data in the bronze and silver layers.
– The data engineers must be able to commit changes to source control in WorkspaceA.
You need to ensure that the data analysts can access the gold layer lakehouse.
What should you do?

A. Add the DataAnalyst group to the Viewer role for WorkspaceA.
B. Share the lakehouse with the DataAnalysts group and grant the Build reports on the default semantic model permission.
C. Share the lakehouse with the DataAnalysts group and grant the Read all SQL Endpoint data permission.
D. Share the lakehouse with the DataAnalysts group and grant the Read all Apache Spark permission.

» Read more

[2025-November-New]Braindump2go GH-300 Dumps VCE Free Share[Q1-Q30]

2025/November Latest Braindump2go GH-300 Exam Dumps with PDF and VCE Free Updated Today! Following are some new Braindump2go GH-300 Real Exam Questions!

QUESTION 1
What method can a developer use to generate sample data with GitHub Copilot? (Each correct answer presents part of the solution. Choose two.)

A. Utilizing GitHub Copilot’s ability to create fictitious information from patterns in training data.
B. Leveraging GitHub Copilot’s ability to independently initiate and manage data storage services.
C. Utilize GitHub Copilot’s capability to directly access and use databases to create sample data.
D. Leveraging GitHub Copilot’s suggestions to create data based on API documentation in the repository.

Answer: AD
Explanation:
GitHub Copilot can generate sample data by creating fictitious information based on patterns in its training data and by using suggestions based on API documentation within the repository.
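
As an illustration of option A, a developer can steer Copilot with a descriptive comment and a typed structure, then let Copilot complete fictitious records that match the pattern. The Python sketch below is hypothetical (the names, fields, and values are invented); it shows the kind of prompt-plus-completion this technique relies on, not output from any specific Copilot run.

# Hypothetical sketch of comment-driven sample-data generation.
# A developer writes the comment, the dataclass, and the list header;
# Copilot typically completes the list with fictitious records inferred
# from patterns in its training data.
from dataclasses import dataclass

@dataclass
class Customer:
    name: str
    email: str

# Generate sample customers for testing
SAMPLE_CUSTOMERS = [
    Customer("Ada Lovelace", "ada@example.com"),
    Customer("Grace Hopper", "grace@example.com"),
    Customer("Alan Turing", "alan@example.com"),
]

if __name__ == "__main__":
    for customer in SAMPLE_CUSTOMERS:
        print(customer)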

QUESTION 2
What are the potential risks associated with relying heavily on code generated from GitHub Copilot? (Each correct answer presents part of the solution. Choose two.)

A. GitHub Copilot may introduce security vulnerabilities by suggesting code with known exploits.
B. GitHub Copilot may decrease developer velocity by requiring too much time in prompt engineering.
C. GitHub Copilot’s suggestions may not always reflect best practices or the latest coding standards.
D. GitHub Copilot may increase development lead time by providing irrelevant suggestions.

» Read more

[2025-November-New]Braindump2go DP-600 Practice Exam Free[Q1-Q51]

2025/November Latest Braindump2go DP-600 Exam Dumps with PDF and VCE Free Updated Today! Following are some new Braindump2go DP-600 Real Exam Questions!

QUESTION 1
Case Study 1 – Contoso
Overview
Contoso, Ltd. is a US-based health supplements company. Contoso has two divisions named Sales and Research. The Sales division contains two departments named Online Sales and Retail Sales. The Research division assigns internally developed product lines to individual teams of researchers and analysts.
Existing Environment
Identity Environment
Contoso has a Microsoft Entra tenant named contoso.com. The tenant contains two groups named ResearchReviewersGroup1 and ResearchReviewersGroup2.
Data Environment
Contoso has the following data environment:
– The Sales division uses a Microsoft Power BI Premium capacity.
– The semantic model of the Online Sales department includes a fact table named Orders that uses Import mode. In the system of origin, the OrderID value represents the sequence in which orders are created.
– The Research department uses an on-premises, third-party data warehousing product.
– Fabric is enabled for contoso.com.
– An Azure Data Lake Storage Gen2 storage account named storage1 contains Research division data for a product line named Productline1. The data is in the Delta format.
– A Data Lake Storage Gen2 storage account named storage2 contains Research division data for a product line named Productline2. The data is in the CSV format.
Requirements
Planned Changes
Contoso plans to make the following changes:
– Enable support for Fabric in the Power BI Premium capacity used by the Sales division.
– Make all the data for the Sales division and the Research division available in Fabric.
– For the Research division, create two Fabric workspaces named Productline1ws and Productline2ws.
– In Productline1ws, create a lakehouse named Lakehouse1.
– In Lakehouse1, create a shortcut to storage1 named ResearchProduct.
Data Analytics Requirements
Contoso identifies the following data analytics requirements:
– All the workspaces for the Sales division and the Research division must support all Fabric experiences.
– The Research division workspaces must use a dedicated, on-demand capacity that has per-minute billing.
– The Research division workspaces must be grouped together logically to support OneLake data hub filtering based on the department name.
– For the Research division workspaces, the members of ResearchReviewersGroup1 must be able to read lakehouse and warehouse data and shortcuts by using SQL endpoints.
– For the Research division workspaces, the members of ResearchReviewersGroup2 must be able to read lakehouse data by using Lakehouse explorer.
– All the semantic models and reports for the Research division must use version control that supports branching.
Data Preparation Requirements
Contoso identifies the following data preparation requirements:
– The Research division data for Productline1 must be retrieved from Lakehouse1 by using Fabric notebooks.
– All the Research division data in the lakehouses must be presented as managed tables in Lakehouse explorer.
Semantic Model Requirements
Contoso identifies the following requirements for implementing and managing semantic models:
– The number of rows added to the Orders table during refreshes must be minimized.
– The semantic models in the Research division workspaces must use Direct Lake mode.
General Requirements
Contoso identifies the following high-level requirements that must be considered for all solutions:
– Follow the principle of least privilege when applicable.
– Minimize implementation and maintenance effort when possible.
You need to ensure that Contoso can use version control to meet the data analytics requirements and the general requirements.
What should you do?

A. Store all the semantic models and reports in Data Lake Gen2 storage.
B. Modify the settings of the Research workspaces to use a GitHub repository.
C. Modify the settings of the Research division workspaces to use an Azure Repos repository.
D. Store all the semantic models and reports in Microsoft OneDrive.

Answer: C
Explanation:
Currently, only Git in Azure Repos is supported.
https://learn.microsoft.com/en-us/fabric/cicd/git-integration/intro-to-git-integration#considerations-and-limitations

QUESTION 2
Case Study 1 – Contoso
See Question 1 above for the full case study.
You need to refresh the Orders table of the Online Sales department. The solution must meet the semantic model requirements.
What should you include in the solution?

A. an Azure Data Factory pipeline that executes a Stored procedure activity to retrieve the maximum value of the OrderID column in the destination lakehouse
B. an Azure Data Factory pipeline that executes a Stored procedure activity to retrieve the minimum value of the OrderID column in the destination lakehouse
C. an Azure Data Factory pipeline that executes a dataflow to retrieve the minimum value of the OrderID column in the destination lakehouse
D. an Azure Data Factory pipeline that executes a dataflow to retrieve the maximum value of the OrderID column in the destination lakehouse

Answer: D
Explanation:
To minimize the number of rows added during a refresh, retrieve the maximum OrderID already present in the destination table and load only the rows that are newer. This is an incremental load, and it can be implemented with a dataflow.
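
A minimal PySpark sketch of this incremental pattern, assuming the staging and destination Orders data are Delta tables (the table names Orders_staging and Orders_gold are illustrative, not from the case study):

# Incremental load keyed on the maximum OrderID already in the destination.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()

# Highest OrderID already loaded; default to 0 when the destination is empty.
max_loaded = (
    spark.read.table("Orders_gold").agg(F.max("OrderID")).collect()[0][0] or 0
)

# Append only the rows created since the last refresh.
new_rows = spark.read.table("Orders_staging").filter(F.col("OrderID") > max_loaded)
new_rows.write.format("delta").mode("append").saveAsTable("Orders_gold")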

QUESTION 3
Case Study 1 – Contoso
See Question 1 above for the full case study.
Which syntax should you use in a notebook to access the Research division data for Productline1?

A. spark.read.format("delta").load("Tables/productline1/ResearchProduct")
B. spark.sql("SELECT * FROM Lakehouse1.ResearchProduct")
C. external_table('Tables/ResearchProduct')
D. external_table(ResearchProduct)

Answer: B
Explanation:
The syntax in C and D applies to KQL databases, which is incorrect for this use case. When the shortcut is created, no additional folders are added under the Tables section, so the path in answer A is incorrect. Once the shortcut exists, the statement in answer B accesses the data correctly.
https://learn.microsoft.com/en-us/fabric/onelake/onelake-shortcuts
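
For context, a minimal sketch of the working approach (answer B) as it would run in a Fabric notebook, where a SparkSession named spark is predefined:

# Assumes a Fabric notebook attached to Lakehouse1; `spark` is predefined.
# The ResearchProduct shortcut surfaces as a table, so it can be queried by name.
df = spark.sql("SELECT * FROM Lakehouse1.ResearchProduct")
df.show(5)  # preview the first rows of the Productline1 Delta data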

QUESTION 4
Case Study 1 – Contoso
See Question 1 above for the full case study.
Hotspot Question
You need to recommend a solution to group the Research division workspaces.
What should you include in the recommendation? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.

Answer:

Explanation:
https://learn.microsoft.com/en-us/fabric/governance/domains#configure-domain-settings

QUESTION 5
Case Study 2 – Litware, Inc
Overview
Litware, Inc. is a manufacturing company that has offices throughout North America. The analytics team at Litware contains data engineers, analytics engineers, data analysts, and data scientists.
Existing Environment
Fabric Environment
Litware has been using a Microsoft Power BI tenant for three years. Litware has NOT enabled any Fabric capacities and features.
Available Data
Litware has data that must be analyzed as shown in the following table.

The Product data contains a single table and the following columns.

The customer satisfaction data contains the following tables:
– Survey
– Question
– Response
For each survey submitted, the following occurs:
– One row is added to the Survey table.
– One row is added to the Response table for each question in the survey.
– The Question table contains the text of each survey question. The third question in each survey response is an overall satisfaction score. Customers can submit a survey after each purchase.
User Problems
The analytics team has large volumes of data, some of which is semi-structured. The team wants to use Fabric to create a new data store.
Product data is often classified into three pricing groups: high, medium, and low. This logic is implemented in several databases and semantic models, but the logic does NOT always match across implementations.
Requirements
Planned Changes
Litware plans to enable Fabric features in the existing tenant. The analytics team will create a new data store as a proof of concept (PoC). The remaining Litware users will only get access to the Fabric features once the PoC is complete. The PoC will be completed by using a Fabric trial capacity.
The following three workspaces will be created:
– AnalyticsPOC: Will contain the data store, semantic models, reports, pipelines, dataflows, and notebooks used to populate the data store
– DataEngPOC: Will contain all the pipelines, dataflows, and notebooks used to populate OneLake
– DataSciPOC: Will contain all the notebooks and reports created by the data scientists
The following will be created in the AnalyticsPOC workspace:
– A data store (type to be decided)
– A custom semantic model
– A default semantic model
– Interactive reports
The data engineers will create data pipelines to load data to OneLake either hourly or daily depending on the data source. The analytics engineers will create processes to ingest, transform, and load the data to the data store in the AnalyticsPOC workspace daily. Whenever possible, the data engineers will use low-code tools for data ingestion. The choice of which data cleansing and transformation tools to use will be at the data engineers’ discretion.
All the semantic models and reports in the Analytics POC workspace will use the data store as the sole data source.
Technical Requirements
The data store must support the following:
– Read access by using T-SQL or Python
– Semi-structured and unstructured data
– Row-level security (RLS) for users executing T-SQL queries
Files loaded by the data engineers to OneLake will be stored in the Parquet format and will meet Delta Lake specifications.
Data will be loaded without transformation in one area of the AnalyticsPOC data store. The data will then be cleansed, merged, and transformed into a dimensional model.
The data load process must ensure that the raw and cleansed data is updated completely before populating the dimensional model.
The dimensional model must contain a date dimension. There is no existing data source for the date dimension. The Litware fiscal year matches the calendar year. The date dimension must always contain dates from 2010 through the end of the current year.
The product pricing group logic must be maintained by the analytics engineers in a single location. The pricing group data must be made available in the data store for T-SQL queries and in the default semantic model. The following logic must be used:
– List prices that are less than or equal to 50 are in the low pricing group.
– List prices that are greater than 50 and less than or equal to 1,000 are in the medium pricing group.
– List prices that are greater than 1,000 are in the high pricing group.
Security Requirements
Only Fabric administrators and the analytics team must be able to see the Fabric items created as part of the PoC.
Litware identifies the following security requirements for the Fabric items in the AnalyticsPOC workspace:
– Fabric administrators will be the workspace administrators.
– The data engineers must be able to read from and write to the data store. No access must be granted to datasets or reports.
– The analytics engineers must be able to read from, write to, and create schemas in the data store. They also must be able to create and share semantic models with the data analysts and view and modify all reports in the workspace.
– The data scientists must be able to read from the data store, but not write to it. They will access the data by using a Spark notebook.
– The data analysts must have read access to only the dimensional model objects in the data store. They also must have access to create Power BI reports by using the semantic models created by the analytics engineers.
– The date dimension must be available to all users of the data store.
– The principle of least privilege must be followed.
Both the default and custom semantic models must include only tables or views from the dimensional model in the data store. Litware already has the following Microsoft Entra security groups:
– FabricAdmins: Fabric administrators
– AnalyticsTeam: All the members of the analytics team
– DataAnalysts: The data analysts on the analytics team
– DataScientists: The data scientists on the analytics team
– DataEngineers: The data engineers on the analytics team
– AnalyticsEngineers: The analytics engineers on the analytics team
Report Requirements
The data analysts must create a customer satisfaction report that meets the following requirements:
– Enables a user to select a product to filter customer survey responses to only those who have purchased that product.
– Displays the average overall satisfaction score of all the surveys submitted during the last 12 months up to a selected date.
– Shows data as soon as the data is updated in the data store.
– Ensures that the report and the semantic model only contain data from the current and previous year.
– Ensures that the report respects any table-level security specified in the source data store.
– Minimizes the execution time of report queries.
What should you recommend using to ingest the customer data into the data store in the AnalyticsPOC workspace?

A. a stored procedure
B. a pipeline that contains a KQL activity
C. a Spark notebook
D. a dataflow

Answer: D
Explanation:
Although the case study states that data will be loaded without transformation into one area of the AnalyticsPOC data store, dataflows are the low-code option generally used when transformations follow ingestion. Note that some argue a pipeline Copy activity would be the optimal choice for a pure, untransformed load.

QUESTION 6
Case Study 2 – Litware, Inc
See Question 5 above for the full case study.
Which type of data store should you recommend in the AnalyticsPOC workspace?

A. a data lake
B. a warehouse
C. a lakehouse
D. an external Hive metastore

Answer: C
Explanation:
The data store must handle semi-structured and unstructured data, so a lakehouse is the optimal choice; it supports read access through both T-SQL (via the SQL endpoint) and Python (via Spark notebooks).

QUESTION 7
Case Study 2 – Litware, Inc
See Question 5 above for the full case study.
Hotspot Question
You need to assign permissions for the data store in the AnalyticsPOC workspace. The solution must meet the security requirements.
Which additional permissions should you assign when you share the data store? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.

Answer:

QUESTION 8
Case Study 2 – Litware, Inc
See Question 5 above for the full case study.
Hotspot Question
You need to create a DAX measure to calculate the average overall satisfaction score.
How should you complete the DAX code? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.

Answer:
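
The hotspot image is not reproduced here, but the logic the DAX measure must capture can be sketched outside DAX: average the overall-satisfaction responses (the third question of each survey) over the 12 months ending at a selected date. A minimal pandas illustration with hypothetical column names:

# Hypothetical data; column names are invented for illustration.
import pandas as pd

responses = pd.DataFrame({
    "SurveyDate": pd.to_datetime(["2024-01-15", "2024-06-01", "2023-03-10"]),
    "QuestionNumber": [3, 3, 3],
    "Score": [4, 5, 3],
})

selected_date = pd.Timestamp("2024-06-30")
window_start = selected_date - pd.DateOffset(months=12)

# Overall satisfaction = question 3; keep the 12 months up to the selected date.
mask = (
    (responses["QuestionNumber"] == 3)
    & (responses["SurveyDate"] > window_start)
    & (responses["SurveyDate"] <= selected_date)
)
print(responses.loc[mask, "Score"].mean())  # 4.5 for this sample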

QUESTION 9
Case Study 2 – Litware, Inc
Overview
Litware, Inc. is a manufacturing company that has offices throughout North America. The analytics team at Litware contains data engineers, analytics engineers, data analysts, and data scientists.
Existing Environment
Fabric Environment
Litware has been using a Microsoft Power BI tenant for three years. Litware has NOT enabled any Fabric capacities and features.
Available Data
Litware has data that must be analyzed as shown in the following table.

The Product data contains a single table and the following columns.

The customer satisfaction data contains the following tables:
– Survey
– Question
– Response
For each survey submitted, the following occurs:
– One row is added to the Survey table.
– One row is added to the Response table for each question in the survey.
– The Question table contains the text of each survey question. The third question in each survey response is an overall satisfaction score. Customers can submit a survey after each purchase.
User Problems
The analytics team has large volumes of data, some of which is semi-structured. The team wants to use Fabric to create a new data store.
Product data is often classified into three pricing groups: high, medium, and low. This logic is implemented in several databases and semantic models, but the logic does NOT always match across implementations.
Requirements
Planned Changes
Litware plans to enable Fabric features in the existing tenant. The analytics team will create a new data store as a proof of concept (PoC). The remaining Liware users will only get access to the Fabric features once the PoC is complete. The PoC will be completed by using a Fabric trial capacity
The following three workspaces will be created:
– AnalyticsPOC: Will contain the data store, semantic models, reports pipelines, dataflow, and notebooks used to populate the data store
– DataEngPOC: Will contain all the pipelines, dataflows, and notebooks used to populate OneLake
– DataSciPOC: Will contain all the notebooks and reports created by the data scientists
The following will be created in the AnalyticsPOC workspace:
– A data store (type to be decided)
– A custom semantic model
– A default semantic model
Interactive reports
The data engineers will create data pipelines to load data to OneLake either hourly or daily depending on the data source. The analytics engineers will create processes to ingest, transform, and load the data to the data store in the AnalyticsPOC workspace daily. Whenever possible, the data engineers will use low-code tools for data ingestion. The choice of which data cleansing and transformation tools to use will be at the data engineers’ discretion.
All the semantic models and reports in the Analytics POC workspace will use the data store as the sole data source.
Technical Requirements
The data store must support the following:
– Read access by using T-SQL or Python
– Semi-structured and unstructured data
– Row-level security (RLS) for users executing T-SQL queries
Files loaded by the data engineers to OneLake will be stored in the Parquet format and will meet Delta Lake specifications.
Data will be loaded without transformation in one area of the AnalyticsPOC data store. The data will then be cleansed, merged, and transformed into a dimensional model
The data load process must ensure that the raw and cleansed data is updated completely before populating the dimensional model
The dimensional model must contain a date dimension. There is no existing data source for the date dimension. The Litware fiscal year matches the calendar year. The date dimension must always contain dates from 2010 through the end of the current year.
The product pricing group logic must be maintained by the analytics engineers in a single location. The pricing group data must be made available in the data store for T-SQL queries and in the default semantic model. The following logic must be used:
– List prices that are less than or equal to 50 are in the low pricing group.
– List prices that are greater than 50 and less than or equal to 1,000 are in the medium pricing group.
– List prices that are greater than 1,000 are in the high pricing group.
Security Requirements
Only Fabric administrators and the analytics team must be able to see the Fabric items created as part of the PoC.
Litware identifies the following security requirements for the Fabric items in the AnalyticsPOC workspace:
– Fabric administrators will be the workspace administrators.
– The data engineers must be able to read from and write to the data store. No access must be granted to datasets or reports.
– The analytics engineers must be able to read from, write to, and create schemas in the data store. They also must be able to create and share semantic models with the data analysts and view and modify all reports in the workspace.
– The data scientists must be able to read from the data store, but not write to it. They will access the data by using a Spark notebook.
– The data analysts must have read access to only the dimensional model objects in the data store. They also must have access to create Power BI reports by using the semantic models created by the analytics engineers.
– The date dimension must be available to all users of the data store.
– The principle of least privilege must be followed.
Both the default and custom semantic models must include only tables or views from the dimensional model in the data store.
Litware already has the following Microsoft Entra security groups:
– FabricAdmins: Fabric administrators
– AnalyticsTeam: All the members of the analytics team
– DataAnalysts: The data analysts on the analytics team
– DataScientists: The data scientists on the analytics team
– DataEngineers: The data engineers on the analytics team
– AnalyticsEngineers: The analytics engineers on the analytics team
Report Requirements
The data analysts must create a customer satisfaction report that meets the following requirements:
– Enables a user to select a product to filter customer survey responses to only those who have purchased that product.
– Displays the average overall satisfaction score of all the surveys submitted during the last 12 months up to a selected date.
– Shows data as soon as the data is updated in the data store.
– Ensures that the report and the semantic model only contain data from the current and previous year.
– Ensures that the report respects any table-level security specified in the source data store.
– Minimizes the execution time of report queries.
Hotspot Question
You need to resolve the issue with the pricing group classification.
How should you complete the T-SQL statement? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.

Answer:
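The answer image is not reproduced here. The logic being tested is a single CASE expression kept in one place, for example a view over the product table. A minimal sketch follows, rendered as Spark SQL for consistency with the other examples in this post; the view and column names are assumptions based on the case study:

spark.sql("""
    CREATE OR REPLACE VIEW product_pricing AS
    SELECT ProductID,
           ListPrice,
           CASE
               WHEN ListPrice <= 50 THEN 'Low'
               WHEN ListPrice <= 1000 THEN 'Medium'
               ELSE 'High'
           END AS PricingGroup
    FROM Product  -- hypothetical table name
""")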

QUESTION 11
You are the administrator of a Fabric workspace that contains a lakehouse named Lakehouse1. Lakehouse1 contains the following tables:
Table1: A Delta table created by using a shortcut
Table2: An external table created by using Spark
Table3: A managed table
You plan to connect to Lakehouse1 by using its SQL endpoint.
What will you be able to do after connecting to Lakehouse1?

A. Read Table3.
B. Update the data in Table3.
C. Read Table2.
D. Update the data in Table1.

Answer: A

QUESTION 12
You have a Fabric tenant that contains a warehouse.
You use a dataflow to load a new dataset from OneLake to the warehouse.
You need to add a PowerQuery step to identify the maximum values for the numeric columns.
Which function should you include in the step?

A. Table.MaxN
B. Table.Max
C. Table.Range
D. Table.Profile

Answer: D
Explanation:
https://learn.microsoft.com/en-us/powerquery-m/table-profile

QUESTION 13
You have a Fabric tenant that contains a machine learning model registered in a Fabric workspace.
You need to use the model to generate predictions by using the PREDICT function in a Fabric notebook.
Which two languages can you use to perform model scoring? Each correct answer presents a complete solution.
NOTE: Each correct answer is worth one point.

A. T-SQL
B. DAX
C. Spark SQL
D. PySpark

Answer: CD
Explanation:
https://learn.microsoft.com/en-us/azure/synapse-analytics/machine-learning/tutorial-score-model-predict-spark-pool

QUESTION 14
You are analyzing the data in a Fabric notebook.
You have a Spark DataFrame assigned to a variable named df.
You need to use the Chart view in the notebook to explore the data manually.
Which function should you run to make the data available in the Chart view?

A. displayHTML
B. show
C. write
D. display

Answer: D

QUESTION 15
You have a Fabric tenant that contains a Microsoft Power BI report named Report1. Report1 includes a Python visual.
Data displayed by the visual is grouped automatically and duplicate rows are NOT displayed.
You need all rows to appear in the visual.
What should you do?

A. Reference the columns in the Python code by index.
B. Modify the Sort Column By property for all columns.
C. Add a unique field to each row.
D. Modify the Summarize By property for all columns.

Answer: D
Explanation:
By setting the “Summarize By” property to “None” for all columns, you disable automatic aggregation and ensure all rows, including duplicates, are displayed in the Python visual.

QUESTION 16
You have a Fabric workspace named Workspace1 that contains a dataflow named Dataflow1. Dataflow1 has a query that returns 2,000 rows.
You view the query in Power Query as shown in the following exhibit.

What can you identify about the pickupLongitude column?

A. The column has duplicate values.
B. All the table rows are profiled.
C. The column has missing values.
D. There are 935 values that occur only once.

Answer: A

QUESTION 17
You have a Fabric tenant named Tenant1 that contains a workspace named WS1. WS1 uses a capacity named C1 and contains a dataset named DS1.
You need to ensure read-write access to DS1 is available by using XMLA endpoint.
What should be modified first?

A. the DS1 settings
B. the WS1 settings
C. the C1 settings
D. the Tenant1 settings

Answer: C
Explanation:
https://learn.microsoft.com/en-us/power-bi/enterprise/service-premium-connect-tools

QUESTION 18
You have a Fabric tenant that contains a workspace named Workspace1. Workspace1 is assigned to a Fabric capacity.
You need to recommend a solution to provide users with the ability to create and publish custom Direct Lake semantic models by using external tools. The solution must follow the principle of least privilege.
Which three actions in the Fabric Admin portal should you include in the recommendation? Each correct answer presents part of the solution.
NOTE: Each correct answer is worth one point.

A. From the Tenant settings, set Allow XMLA Endpoints and Analyze in Excel with on-premises datasets to Enabled.
B. From the Tenant settings, set Allow Azure Active Directory guest users to access Microsoft Fabric to Enabled.
C. From the Tenant settings, select Users can edit data model in the Power BI service.
D. From the Capacity settings, set XMLA Endpoint to Read Write.
E. From the Tenant settings, set Users can create Fabric items to Enabled.
F. From the Tenant settings, enable Publish to Web.

Answer: ACD

QUESTION 19
You are creating a semantic model in Microsoft Power BI Desktop.
You plan to make bulk changes to the model by using the Tabular Model Definition Language (TMDL) extension for Microsoft Visual Studio Code.
You need to save the semantic model to a file.
Which file format should you use?

A. PBIP
B. PBIX
C. PBIT
D. PBIDS

Answer: A
Explanation:
Saving as a PBIP creates one file and two folders; the .Dataset folder contains a definition subfolder that hosts the .tmdl files.

QUESTION 20
You plan to deploy Microsoft Power BI items by using Fabric deployment pipelines. You have a deployment pipeline that contains three stages named Development, Test, and Production. A workspace is assigned to each stage.
You need to provide Power BI developers with access to the pipeline. The solution must meet the following requirements:
– Ensure that the developers can deploy items to the workspaces for Development and Test.
– Prevent the developers from deploying items to the workspace for Production.
– Follow the principle of least privilege.
Which three levels of access should you assign to the developers? Each correct answer presents part of the solution.
NOTE: Each correct answer is worth one point.

A. Build permission to the production semantic models
B. Admin access to the deployment pipeline
C. Viewer access to the Development and Test workspaces
D. Viewer access to the Production workspace
E. Contributor access to the Development and Test workspaces
F. Contributor access to the Production workspace

Answer: ADE

QUESTION 21
You have a Fabric workspace that contains a DirectQuery semantic model. The model queries a data source that has 500 million rows.
You have a Microsoft Power BI report named Report1 that uses the model. Report1 contains visuals on multiple pages.
You need to reduce the query execution time for the visuals on all the pages.
What are two features that you can use? Each correct answer presents a complete solution.
NOTE: Each correct answer is worth one point.

A. user-defined aggregations
B. automatic aggregation
C. query caching
D. OneLake integration

Answer: AB

QUESTION 22
You have a Fabric tenant that contains 30 CSV files in OneLake. The files are updated daily.
You create a Microsoft Power BI semantic model named Model1 that uses the CSV files as a data source. You configure incremental refresh for Model1 and publish the model to a Premium capacity in the Fabric tenant.
When you initiate a refresh of Model1, the refresh fails after running out of resources.
What is a possible cause of the failure?

A. Query folding is occurring.
B. Only refresh complete days is selected.
C. XMLA Endpoint is set to Read Only.
D. Query folding is NOT occurring.
E. The data type of the column used to partition the data has changed.

Answer: D
Explanation:
https://learn.microsoft.com/en-us/power-bi/connect-data/incremental-refresh-troubleshoot#problem-loading-data-takes-too-long

QUESTION 23
You have a Fabric tenant that uses a Microsoft Power BI Premium capacity.
You need to enable scale-out for a semantic model.
What should you do first?

A. At the semantic model level, set Large dataset storage format to Off.
B. At the tenant level, set Create and use Metrics to Enabled.
C. At the semantic model level, set Large dataset storage format to On.
D. At the tenant level, set Data Activator to Enabled.

Answer: C
Explanation:
https://learn.microsoft.com/en-us/power-bi/enterprise/service-premium-scale-out-configure

QUESTION 24
You have a Fabric tenant that contains a warehouse. The warehouse uses row-level security (RLS).
You create a Direct Lake semantic model that uses the Delta tables and RLS of the warehouse.
When users interact with a report built from the model, which mode will be used by the DAX queries?

A. DirectQuery
B. Dual
C. Direct Lake
D. Import

Answer: A
Explanation:
Row-level security only applies to queries on a warehouse or SQL analytics endpoint in Fabric. Power BI queries on a warehouse in Direct Lake mode will fall back to DirectQuery mode to abide by row-level security.
https://learn.microsoft.com/en-us/fabric/data-warehouse/row-level-security

QUESTION 25
You have a Fabric tenant that contains a complex semantic model. The model is based on a star schema and contains many tables, including a fact table named Sales.
You need to create a diagram of the model. The diagram must contain only the Sales table and related tables.
What should you use from Microsoft Power BI Desktop?

A. data categories
B. Data view
C. Model view
D. DAX query view

Answer: C
Explanation:
In Model view, you can analyze the semantic model and create additional layouts that contain only selected tables, such as the Sales table and the tables related to it.

QUESTION 26
You have a Fabric tenant that contains a semantic model. The model uses Direct Lake mode.
You suspect that some DAX queries load unnecessary columns into memory.
You need to identify the frequently used columns that are loaded into memory.
What are two ways to achieve the goal? Each correct answer presents a complete solution.
NOTE: Each correct answer is worth one point.

A. Use the Analyze in Excel feature.
B. Use the Vertipaq Analyzer tool.
C. Query the $System.DISCOVER_STORAGE_TABLE_COLUMN_SEGMENTS dynamic management view (DMV).
D. Query the DISCOVER_MEMORYGRANT dynamic management view (DMV).

Answer: BC

QUESTION 27
You have a Fabric tenant that contains a semantic model named Model1. Model1 uses Import mode. Model1 contains a table named Orders. Orders has 100 million rows and the following fields.

You need to reduce the memory used by Model1 and the time it takes to refresh the model.
Which two actions should you perform? Each correct answer presents part of the solution.
NOTE: Each correct answer is worth one point.

A. Split OrderDateTime into separate date and time columns.
B. Replace TotalQuantity with a calculated column.
C. Convert Quantity into the Text data type.
D. Replace TotalSalesAmount with a measure.

Answer: AD

QUESTION 28
You have a Fabric tenant that contains a semantic model.
You need to prevent report creators from populating visuals by using implicit measures.
What are two tools that you can use to achieve the goal? Each correct answer presents a complete solution.
NOTE: Each correct answer is worth one point.

A. Microsoft Power BI Desktop
B. Tabular Editor
C. Microsoft SQL Server Management Studio (SSMS)
D. DAX Studio

Answer: AB
Explanation:
To prevent report creators from populating visuals by using implicit measures, set the model's Discourage Implicit Measures property. You can do this by using:
1. Tabular Editor: set the Discourage Implicit Measures property on the model.
2. Power BI Desktop: enable the equivalent model setting from Model view.

QUESTION 29
You have a Fabric tenant that contains a lakehouse named Lakehouse1. Lakehouse1 contains a table named Table1.
You are creating a new data pipeline.
You plan to copy external data to Table1. The schema of the external data changes regularly.
You need the copy operation to meet the following requirements:
– Replace Table1 with the schema of the external data.
– Replace all the data in Table1 with the rows in the external data.
You add a Copy data activity to the pipeline.
What should you do for the Copy data activity?

A. From the Source tab, add additional columns.
B. From the Destination tab, set Table action to Overwrite.
C. From the Settings tab, select Enable staging.
D. From the Source tab, select Enable partition discovery.
E. From the Source tab, select Recursively.

Answer: B
Explanation:
Setting Table action to Overwrite re-creates the destination table by using the schema of the source data and replaces all the existing rows, which satisfies both requirements.

QUESTION 30
You have a Fabric tenant that contains a lakehouse.
You plan to query sales data files by using the SQL endpoint. The files will be in an Amazon Simple Storage Service (Amazon S3) storage bucket.
You need to recommend which file format to use and where to create a shortcut.
Which two actions should you include in the recommendation? Each correct answer presents part of the solution.
NOTE: Each correct answer is worth one point.

A. Create a shortcut in the Files section.
B. Use the Parquet format.
C. Use the CSV format.
D. Create a shortcut in the Tables section.
E. Use the Delta format.

Answer: BD
Explanation:
Parquet is a columnar file format that is highly optimized for analytical queries and provides efficient storage and query performance.
Shortcuts created in the Tables section of the lakehouse surface the external data as tables that can be queried through the SQL endpoint, whereas shortcuts created in the Files section are not exposed to the SQL endpoint.

QUESTION 31
You have a Fabric tenant that contains a lakehouse named Lakehouse1. Lakehouse1 contains a subfolder named Subfolder1 that contains CSV files.
You need to convert the CSV files into the delta format that has V-Order optimization enabled.
What should you do from Lakehouse explorer?

A. Use the Load to Tables feature.
B. Create a new shortcut in the Files section.
C. Create a new shortcut in the Tables section.
D. Use the Optimize feature.

Answer: A
Explanation:
With "Load to tables", tables are always loaded by using the Delta Lake table format with V-Order optimization enabled.
https://learn.microsoft.com/en-us/fabric/data-engineering/load-to-tables#load-to-table-capabilities-overview

QUESTION 32
You have a Fabric tenant that contains a lakehouse named Lakehouse1. Lakehouse1 contains an unpartitioned table named Table1.
You plan to copy data to Table1 and partition the table based on a date column in the source data.
You create a Copy activity to copy the data to Table1.
You need to specify the partition column in the Destination settings of the Copy activity.
What should you do first?

A. From the Destination tab, set Mode to Append.
B. From the Destination tab, select the partition column.
C. From the Source tab, select Enable partition discovery.
D. From the Destination tabs, set Mode to Overwrite.

Answer: D
Explanation:
When setting up the Copy activity, you need to choose the Overwrite mode to make the partition option appear (it is not visible in Append mode).

QUESTION 33
You have source data in a folder on a local computer.
You need to create a solution that will use Fabric to populate a data store. The solution must meet the following requirements:
– Support the use of dataflows to load and append data to the data store.
– Ensure that Delta tables are V-Order optimized and compacted automatically.
Which type of data store should you use?

A. a lakehouse
B. an Azure SQL database
C. a warehouse
D. a KQL database

Answer: A
Explanation:
To meet the requirements of supporting dataflows to load and append data to the data store while ensuring that Delta tables are V-Order optimized and compacted automatically, you should use a lakehouse in Fabric as your solution.

QUESTION 34
You have a Fabric workspace named Workspace1 that contains a dataflow named Dataflow1. Dataflow1 contains a query that returns the data shown in the following exhibit.

You need to transform the data columns into attribute-value pairs, where columns become rows.
You select the VendorID column.
Which transformation should you select from the context menu of the VendorID column?

A. Group by
B. Unpivot columns
C. Unpivot other columns
D. Split column
E. Remove other columns

Answer: C

QUESTION 35
You have a Fabric tenant that contains a data pipeline.
You need to ensure that the pipeline runs every four hours on Mondays and Fridays.
To what should you set Repeat for the schedule?

A. Daily
B. By the minute
C. Weekly
D. Hourly

Answer: C
Explanation:
The only way to do this is to set Repeat to "Weekly", select Monday and Friday as the days, and manually add six times at four-hour intervals.

QUESTION 36
You have a Fabric tenant that contains a warehouse.
Several times a day, the performance of all warehouse queries degrades. You suspect that Fabric is throttling the compute used by the warehouse.
What should you use to identify whether throttling is occurring?

A. the Capacity settings
B. the Monitoring hub
C. dynamic management views (DMVs)
D. the Microsoft Fabric Capacity Metrics app

Answer: D

QUESTION 37
You have a Fabric tenant that contains a warehouse.
A user discovers that a report that usually takes two minutes to render has been running for 45 minutes and has still not rendered.
You need to identify what is preventing the report query from completing.
Which dynamic management view (DMV) should you use?

A. sys.dm_exec_requests
B. sys.dm_exec_sessions
C. sys.dm_exec_connections
D. sys.dm_pdw_exec_requests

Answer: A
Explanation:
https://learn.microsoft.com/en-us/fabric/data-warehouse/monitor-using-dmv

QUESTION 38
You need to create a data loading pattern for a Type 1 slowly changing dimension (SCD).
Which two actions should you include in the process? Each correct answer presents part of the solution.
NOTE: Each correct answer is worth one point.

A. Update rows when the non-key attributes have changed.
B. Insert new rows when the natural key exists in the dimension table, and the non-key attribute values have changed.
C. Update the effective end date of rows when the non-key attribute values have changed.
D. Insert new records when the natural key is a new value in the table.

Answer: AD
Explanation:
A Type 1 SCD does not preserve history; changed non-key attributes are overwritten in place, so no effective end dates exist for table entries.
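A minimal Spark SQL sketch of the Type 1 pattern (all table and column names below are hypothetical):

# Type 1 SCD: update non-key attributes in place when they change (action A)
# and insert rows whose natural key is new to the dimension (action D).
spark.sql("""
    MERGE INTO dim_customer AS d
    USING stage_customer AS s
        ON d.customer_nk = s.customer_nk
    WHEN MATCHED AND (d.name <> s.name OR d.country <> s.country) THEN
        UPDATE SET d.name = s.name, d.country = s.country
    WHEN NOT MATCHED THEN
        INSERT (customer_nk, name, country)
        VALUES (s.customer_nk, s.name, s.country)
""")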

QUESTION 39
You are analyzing customer purchases in a Fabric notebook by using PySpark.
You have the following DataFrames:
– transactions: Contains five columns named transaction_id, customer_id, product_id, amount, and date and has 10 million rows, with each row representing a transaction.
– customers: Contains customer details in 1,000 rows and three columns named customer_id, name, and country.
You need to join the DataFrames on the customer_id column. The solution must minimize data shuffling.
You write the following code.
from pyspark.sql import functions as F
results =
Which code should you run to populate the results DataFrame?

A. transactions.join(F.broadcast(customers), transactions.customer_id == customers.customer_id)
B. transactions.join(customers, transactions.customer_id == customers.customer_id).distinct()
C. transactions.join(customers, transactions.customer_id == customers.customer_id)
D. transactions.crossJoin(customers).where(transactions.customer_id == customers.customer_id)

Answer: A
Explanation:
https://sparkbyexamples.com/spark/broadcast-join-in-spark/
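For illustration, a minimal runnable sketch of the broadcast-join pattern, with toy data standing in for the scenario's DataFrames:

from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()
transactions = spark.createDataFrame(
    [(1, 101, 9.99), (2, 102, 4.50)],
    ["transaction_id", "customer_id", "amount"])
customers = spark.createDataFrame(
    [(101, "Alice", "US"), (102, "Bob", "CA")],
    ["customer_id", "name", "country"])

# Broadcasting the small customers DataFrame ships a copy to every executor,
# so the large transactions DataFrame is joined without shuffling its rows.
results = transactions.join(
    F.broadcast(customers),
    transactions.customer_id == customers.customer_id)
results.show()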

QUESTION 40
Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.
After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.
You have a Fabric tenant that contains a new semantic model in OneLake.
You use a Fabric notebook to read the data into a Spark DataFrame.
You need to evaluate the data to calculate the min, max, mean, and standard deviation values for all the string and numeric columns.
Solution: You use the following PySpark expression: df.explain()
Does this meet the goal?

A. Yes
B. No

Answer: B

QUESTION 41
Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.
After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.
You have a Fabric tenant that contains a new semantic model in OneLake.
You use a Fabric notebook to read the data into a Spark DataFrame.
You need to evaluate the data to calculate the min, max, mean, and standard deviation values for all the string and numeric columns.
Solution: You use the following PySpark expression: df.show()
Does this meet the goal?

A. Yes
B. No

Answer: B

QUESTION 42
Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.
After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.
You have a Fabric tenant that contains a new semantic model in OneLake.
You use a Fabric notebook to read the data into a Spark DataFrame.
You need to evaluate the data to calculate the min, max, mean, and standard deviation values for all the string and numeric columns.
Solution: You use the following PySpark expression: df.summary()
Does this meet the goal?

A. Yes
B. No

Answer: A
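As a quick illustration of why this solution meets the goal (toy data, not the scenario's model):

from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()
df = spark.createDataFrame([("A", 10), ("B", 30)], ["label", "value"])

# summary() computes count, mean, stddev, min, approximate percentiles,
# and max for every column; mean and stddev are null for string columns.
df.summary().show()

# The statistics can also be restricted to exactly those the question asks for:
df.summary("min", "max", "mean", "stddev").show()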

QUESTION 43
Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.
After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.
You have a Fabric tenant that contains a lakehouse named Lakehouse1. Lakehouse1 contains a Delta table named Customer.
When you query Customer, you discover that the query is slow to execute. You suspect that maintenance was NOT performed on the table.
You need to identify whether maintenance tasks were performed on Customer.
Solution: You run the following Spark SQL statement: DESCRIBE HISTORY customer
Does this meet the goal?

A. Yes
B. No

Answer: A
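A minimal sketch of how this check might look in a Fabric notebook (the Spark session is assumed; the table name comes from the scenario):

# DESCRIBE HISTORY returns one row per operation on the Delta table
# (WRITE, MERGE, OPTIMIZE, VACUUM, ...), so maintenance runs are visible
# in the operation column.
history = spark.sql("DESCRIBE HISTORY customer")
history.select("version", "timestamp", "operation").show(truncate=False)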

QUESTION 44
Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.
After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.
You have a Fabric tenant that contains a lakehouse named Lakehouse1. Lakehouse1 contains a Delta table named Customer.
When you query Customer, you discover that the query is slow to execute. You suspect that maintenance was NOT performed on the table.
You need to identify whether maintenance tasks were performed on Customer.
Solution: You run the following Spark SQL statement: REFRESH TABLE customer
Does this meet the goal?

A. Yes
B. No

Answer: B

QUESTION 45
Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.
After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.
You have a Fabric tenant that contains a lakehouse named Lakehouse1. Lakehouse1 contains a Delta table named Customer.
When you query Customer, you discover that the query is slow to execute. You suspect that maintenance was NOT performed on the table.
You need to identify whether maintenance tasks were performed on Customer.
Solution: You run the following Spark SQL statement: EXPLAIN TABLE customer
Does this meet the goal?

A. Yes
B. No

Answer: B

QUESTION 46
Hotspot Question
You have a data warehouse that contains a table named Stage.Customers. Stage.Customers contains all the customer record updates from a customer relationship management (CRM) system. There can be multiple updates per customer.
You need to write a T-SQL query that will return the customer ID, name, postal code, and the last updated time of the most recent row for each customer ID.
How should you complete the code? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.

Answer:
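The answer image is not reproduced here. The pattern the question targets is ROW_NUMBER() partitioned by the customer key, ordered by the update time descending, keeping row number 1. A minimal sketch, rendered as Spark SQL for consistency with the other examples (the registered table name is hypothetical):

latest = spark.sql("""
    SELECT CustomerID, Name, PostalCode, LastUpdated
    FROM (
        SELECT CustomerID, Name, PostalCode, LastUpdated,
               ROW_NUMBER() OVER (PARTITION BY CustomerID
                                  ORDER BY LastUpdated DESC) AS rn
        FROM stage_customers  -- hypothetical copy of Stage.Customers
    ) AS t
    WHERE rn = 1
""")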

QUESTION 47
Hotspot Question
You have a Fabric tenant.
You plan to create a Fabric notebook that will use Spark DataFrames to generate Microsoft Power BI visuals.
You run the following code.

For each of the following statements, select Yes if the statement is true. Otherwise, select No.
NOTE: Each correct selection is worth one point.

Answer:

QUESTION 48
Drag and Drop Question
You have a Fabric tenant that contains a semantic model. The model contains data about retail stores.
You need to write a DAX query that will be executed by using the XMLA endpoint. The query must return a table of stores that have opened since December 1, 2023.
How should you complete the DAX expression? To answer, drag the appropriate values to the correct targets. Each value may be used once, more than once, or not at all. You may need to drag the split bar between panes or scroll to view content.
NOTE: Each correct selection is worth one point.

Answer:

QUESTION 49
Hotspot Question
You have a Fabric tenant that contains a warehouse named Warehouse1. Warehouse1 contains three schemas named schemaA, schemaB, and schemaC.
You need to ensure that a user named User1 can truncate tables in schemaA only.
How should you complete the T-SQL statement? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.

Answer:

QUESTION 50
Hotspot Question
You have the source data model shown in the following exhibit.

The primary keys of the tables are indicated by a key symbol beside the columns involved in each key.
You need to create a dimensional data model that will enable the analysis of order items by date, product, and customer.
What should you include in the solution? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.

Answer:

QUESTION 51
Hotspot Question
You have a Fabric tenant that contains two lakehouses.
You are building a dataflow that will combine data from the lakehouses. The applied steps from one of the queries in the dataflow are shown in the following exhibit.

Use the drop-down menus to select the answer choice that completes each statement based on the information presented in the graphic.
NOTE: Each correct selection is worth one point.

Answer:


Resources From:

1.2025 Latest Braindump2go DP-600 Exam Dumps (PDF & VCE) Free Share:
https://www.braindump2go.com/dp-600.html

2.2025 Latest Braindump2go DP-600 PDF and DP-600 VCE Dumps Free Share:
https://drive.google.com/drive/folders/1hFMvbs2eQP6DLaCpG93gYnq3xN4l19rB?usp=sharing

3.2025 Free Braindump2go DP-600 Exam Questions Download:
https://www.braindump2go.com/free-online-pdf/DP-600-VCE-Dumps(1-51).pdf

Free Resources from Braindump2go. We are devoted to helping you 100% pass all exams!

[2025-November-New]Braindump2go DP-300 Dumps PDF Free[Q205-Q242]

2025/November Latest Braindump2go DP-300 Exam Dumps with PDF and VCE Free Updated Today! Following are some new Braindump2go DP-300 Real Exam Questions!

QUESTION 205
You are training a new administrator for your company's Azure data services, which include databases deployed on Azure SQL Database, Azure SQL Managed Instance, and SQL Server on Azure virtual machines (VMs).
You need to identify the fixed roles that are supported by Azure SQL Database only.
Which two fixed roles are available with Azure SQL Database only?
Each correct answer presents a complete solution.

A. db_securityadmin
B. sysadmin
C. dbmanager
D. dbcreator
E. loginmanager

Answer: CE

QUESTION 206
You provision an Azure SQL Managed Instance database named MyDevData. The database will be used by in-house development for application development projects.
DevSupport custom database role members will use dynamic management views (DMVs) to retrieve performance and health information about MyDevData.
You need to ensure that DevSupport members can view information through DMVs.
Which statement should you use?
Each correct answer presents part of the solution.

A. GRANT VIEW DATABASE STATE TO DevSupport
B. GRANT VIEW SERVER STATE TO DevSupport
C. GRANT VIEW DEFINITION TO DevSupport
D. GRANT VIEW REFERENCES TO DevSupport

» Read more

[2025-November-New]Braindump2go AZ-801 Dumps PDF Free[Q136-Q156]

2025/November Latest Braindump2go AZ-801 Exam Dumps with PDF and VCE Free Updated Today! Following are some new Braindump2go AZ-801 Real Exam Questions!

QUESTION 136
Hotspot Question
You have a generation 1 Azure virtual machine named VM1 that runs Windows Server and is joined to an Active Directory domain.
You plan to enable BitLocker Drive Encryption (Bit-Locker) on volume C of VM1.
You need to ensure that the BitLocker recovery key for VM1 is stored in Active Directory.
Which two Group Policy settings should you configure first? To answer, select the settings in the answer area.
NOTE: Each correct selection is worth one point.

Answer:

QUESTION 137
Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.
After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.
You have a server named Server1 that runs Windows Server.
You need to ensure that only specific applications can modify the data in protected folders on Server1.
Solution: From App & browser control, you configure Reputation-based protection.
Does this meet the goal?

A. Yes
B. No

» Read more
