Category Archives: Blog
Issuing a Customer Refund in Dynamics 365 Business Central
Introduction

In Dynamics 365 Business Central, processing a customer refund is a simple yet important task that supports accurate financial management and customer satisfaction. Whether a customer has overpaid or requires a refund for another reason, Business Central provides a straightforward process for handling these transactions efficiently. In this guide, we will walk you through the steps to issue a refund: verifying customer details, applying the refund, and processing the payment.

Steps to Process the Refund

Issuing a refund in Dynamics 365 Business Central is straightforward, but it is important to select the correct document type.

1. Verify the customer: Start by confirming the customer who will receive the refund. Open the customer list and click the balance amount to view the customer ledger, which displays only the open or outstanding items. Note the customer number and the amount to be refunded.
2. Navigate to the Payment Journal: Open the Payment Journal and select the appropriate batch. Add a line for the refund to the customer, ensuring that the Document Type is set to "Refund."
3. Apply the refund: To apply the refund to the outstanding payment, click "Apply Entries" in the action bar. Select the line you want to apply the refund to, then click "Set Applies-to ID" in the action bar. This fills in the Applies-to ID field for the chosen line. Click OK to close the page.
4. Process the payment: Finally, process the payment. Whether you are using a computer check or an electronic payment, follow the same steps as you would for paying a vendor. Once everything is completed, post the payment.

Customer ledger entry for the refund (shown above).
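If you prefer to verify the customer's outstanding balance for step 1 programmatically rather than in the client, a sketch like the following against the standard Business Central v2.0 API can help. This is a minimal illustration, assuming an already-acquired Microsoft Entra ID token; the tenant, environment, customer number, and the balanceDue field name are placeholders to verify against your own environment.

```python
import requests

# Placeholder values - replace with your tenant, environment, and token.
BASE_URL = "https://api.businesscentral.dynamics.com/v2.0/<tenant>/<environment>/api/v2.0"
TOKEN = "<access_token>"  # acquired via Microsoft Entra ID (OAuth 2.0)

headers = {"Authorization": f"Bearer {TOKEN}"}

# List companies to find the company id (required by the v2.0 APIs).
companies = requests.get(f"{BASE_URL}/companies", headers=headers).json()["value"]
company_id = companies[0]["id"]

# Look up the customer and check the outstanding balance before refunding.
# "balanceDue" is the v2.0 field name as I recall it; verify in your environment.
params = {"$filter": "number eq '10000'", "$select": "number,displayName,balanceDue"}
resp = requests.get(f"{BASE_URL}/companies({company_id})/customers",
                    headers=headers, params=params)
resp.raise_for_status()
for cust in resp.json()["value"]:
    print(cust["number"], cust["displayName"], cust["balanceDue"])
```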
Conclusion

Processing customer refunds in Dynamics 365 Business Central is a straightforward process that enhances financial accuracy and customer satisfaction. By following the outlined steps (verifying customer details, applying the refund, and processing the payment), you can ensure that refunds are handled efficiently and correctly.

We hope you found this article useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com
Find out about issues with the background processing of the job queue and get notified – Business Central D365
Introduction

Tasks, reports, and batch processes can occasionally take a very long time to finish. Businesses frequently use Business Central's background processing capabilities to keep employees working in the meantime. If something goes wrong and an important background process stops, or is not scheduled for some reason, being notified of the issue can help you react quickly to resolve it.

Pre-requisites

Business Central on Cloud

References

Get notified about issues with job queue background processing | Microsoft Learn

Configuration and Feature overview

To stay informed about issues with job queue processing, you can subscribe to external business events triggered under the following conditions:
– Job queue processing fails after multiple retries.
– Job queue processing fails to schedule.

These external business events can be captured using the "When a business event occurs" trigger in a Power Automate flow. For added flexibility, Microsoft has included a couple of pre-built Power Automate templates with this feature. You also have the option to create a custom Power Automate flow to tailor a notification system that suits your business needs. For example, you can:
– Notify the user whose credentials are used by the job queue.
– Notify recipients specified in the Business Central admin center.

Additionally, new job queue APIs enable automation to address specific job queue issues by restarting or rescheduling the job queue.
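As a rough sketch of what such automation could look like, the snippet below assumes a job queue entries API exposing a bound restart action. The entity and action names here are assumptions for illustration only; confirm the exact API names against the Microsoft Learn page referenced above before using anything like this.

```python
import requests

# Hypothetical endpoint shape - verify against Microsoft Learn before use.
BASE_URL = "https://api.businesscentral.dynamics.com/v2.0/<tenant>/<environment>/api/v2.0"
TOKEN = "<access_token>"
headers = {"Authorization": f"Bearer {TOKEN}"}

company_id = "<company_id>"
entry_id = "<job_queue_entry_id>"

# Restart a failed job queue entry via a bound action (assumed URL shape).
resp = requests.post(
    f"{BASE_URL}/companies({company_id})/jobQueueEntries({entry_id})/Microsoft.NAV.restart",
    headers=headers,
)
print(resp.status_code)
```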
Conclusion

In conclusion, this feature offers a robust solution for monitoring and managing job queue processing issues. By subscribing to external business events and utilizing Power Automate flows, you can ensure that critical notifications are promptly delivered to the right individuals, enhancing your business's responsiveness. With the added flexibility of built-in templates and customizable flows, along with the new job queue APIs, you have the tools needed to efficiently address and mitigate job queue disruptions, ultimately improving operational reliability.
Integration with Finance and Operations – From Basics (Part 2)
Introduction

Finance and Operations provides two major ways for external systems to interact with tables (or data entities) using APIs: Custom Services and Data Entities.

Data entities in D365 Finance and Operations simplify data management by grouping data from multiple tables. They make it easier to import, export, and integrate data with other systems. Custom services in D365 Finance and Operations allow developers to create web services for specific business needs. They enable external systems to interact with D365 F&O by exposing custom logic and operations. This helps in integrating and automating processes with other applications.

In the previous blog, we saw how we can use Data Entities to create APIs. In this blog, we'll see how we can use Custom Services to create APIs.

References

Custom Service Development
Exposing an X++ Class as a Data Contract
Using Data Contracts

Pre-requisites

Configuration

Right-click on the project, click "Add", and then "New Item". Click on Services and select the "Service Group." Add an appropriate name for your Service Group. Do note that this will be part of your endpoint URL.

Once that is done, we'll need to create a new Service as well. Repeat the same steps, but this time select the "Service" object and add an appropriate name.

Once both the Service Group and Service objects are created, we'll need to create request, response, and request-processing objects. For that, right-click on Project > Add > New Item > Code > Class. Add an appropriate name and click "Add".

In the Request object, set the [DataContract] attribute at the class level and add global variables which will be used to send data to the processing object. In the Response object, set the [DataContract] attribute at the class level and add global variables which will be used to return data from the processing object.

In the processing object, write the necessary logic. Here, I'm writing logic that pulls the data from the request object into local variables and then creates a Customer record along with an address entry for that customer. If everything completes successfully, I return a "Success" status along with the customer ID; otherwise, a "Failed" status along with the customer ID. If there is any logic for logging, it can be added to our processing class after the main operation has completed.

Once this is done, we can now add our processing class to our Service object. Open the "Service" object and set the "Class" field to the processing class you have created. Right-click on the Service object in the designer and click "New Service Operation". In the new Service Operation that is created, set the method from the processing class that you want to call in the "Method" field. Set an appropriate name for that method (this will be part of the endpoint). Set the operational domain: whether it will work only for a particular company or across companies. Set the Access Level (access increases as you go down the list).

Now after this, we'll assign our Service object to the Service Group. Open the Service Group in the designer, right-click it, and then click "New Service". In the newly created "ServiceGroupService" entry, set your "Service".

Then, after a rebuild, database sync, and deployment, open Postman and use the following URL template:
<base_url>/api/services/<ServiceGroup>/<Service>/<Method>

Now, if I trigger the POST request, I'll get a "Success" status along with the CustomerId. If I try to recreate the same customer, I'll get a "Failed" status along with the CustomerId.

If you are not sure whether your API exists, you can simply call a GET request on the URL <base_url>/api/services. This returns a list of all the Service Groups present in the system. We can then call a GET request including this Service Group in our URL, which returns a list of all the Services present in the system for that Service Group. We can then call a GET request including this Service in our URL, which returns a list of all the Operations present for that Service. Finally, calling a GET request including this Operation in our URL returns the Request and Response objects for that Service Operation.
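As a quick illustration, here is a minimal Python sketch of calling such a custom service from an external system instead of Postman. The service group, service, and method names and the JSON contract fields below are placeholders for whatever you created above; the Entra ID app registration details are assumed, and F&O expects an OAuth 2.0 token whose scope is the environment URL.

```python
import requests

TENANT_ID = "<tenant_id>"
CLIENT_ID = "<client_id>"
CLIENT_SECRET = "<client_secret>"
FNO_URL = "https://<environment>.operations.dynamics.com"

# Acquire a token via the OAuth 2.0 client credentials flow.
token_resp = requests.post(
    f"https://login.microsoftonline.com/{TENANT_ID}/oauth2/v2.0/token",
    data={
        "grant_type": "client_credentials",
        "client_id": CLIENT_ID,
        "client_secret": CLIENT_SECRET,
        "scope": f"{FNO_URL}/.default",
    },
)
token = token_resp.json()["access_token"]

# Call the custom service: <base_url>/api/services/<ServiceGroup>/<Service>/<Method>.
# Group/service/method names and the request contract fields are illustrative.
resp = requests.post(
    f"{FNO_URL}/api/services/CFSServiceGroup/CFSCustomerService/createCustomer",
    headers={"Authorization": f"Bearer {token}"},
    json={"customerId": "US-101", "customerName": "Contoso Retail"},
)
print(resp.status_code, resp.json())  # e.g. {"status": "Success", "customerId": "US-101"}
```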
Conclusion

Thus, we saw how to create APIs using Custom Services in Finance and Operations. In the next blog, we'll see some advanced API functionalities that are present in Finance and Operations.

We hope you found this article useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com
Integration with Finance and Operations – From Basics (Part 1)
Introduction

Finance and Operations provides two major ways for external systems to interact with tables (or data entities) using APIs: Custom Services and Data Entities.

Data entities in D365 Finance and Operations simplify data management by grouping data from multiple tables. They make it easier to import, export, and integrate data with other systems. Custom services in D365 Finance and Operations allow developers to create web services for specific business needs. They enable external systems to interact with D365 F&O by exposing custom logic and operations. This helps in integrating and automating processes with other applications.

| Feature | Data Entities | Custom Services |
| --- | --- | --- |
| Purpose | Simplify data management tasks like import, export, and integration. | Expose custom business logic and operations as web services. |
| Functionality | Provide structured access to data from multiple tables in a unified format. | Allow external systems to perform actions or retrieve data via API calls. |
| Usage | Used for bulk data operations, data migration, and integration with external systems. | Used for real-time integration, extending functionality, and custom business process automation. |
| Typical Use Cases | Data import/export, data synchronization, and data migration. | Integrating with external applications, custom business processes, and real-time data access. |
| Data Handling | Focuses on data in bulk. | Focuses on specific operations or business logic. |

Pre-requisites

References

Data Entities Overview – Finance and Operations
Build and consume data entities – Finance and Operations
Exposing an X++ class as a Data Contract

Configuration

Here, to understand the creation of APIs in either case, we'll expose the same table using both Data Entities and Custom Services.

Data Entity:

Right-click on the project, click "Add", and then "New Item". Click on Finance and Operations > Dynamics 365 Items > Data Model and then select "Data Entity".

Select the table that you want to expose in the "Primary Data Source" field, the appropriate "Entity Category", "Public Entity Name" and "Public Entity Set Name" (which is what the endpoint will be), and the Staging Table name. Select the necessary fields from the primary data source. You can add related tables by clicking on the small arrow next to the table name, which displays the list of all associated tables. Then you can select the relevant fields from the associated tables.

Once done, you'll get one data entity, two security privileges, and one staging table created. If you want to add new data sources, you can right-click on the Primary Data Source's "Data Sources" node and add a new data source. You can drag fields from any of the data sources into the "Fields" section of the data entity to make them available on the API.

Calling the Data Entity

You can call the <base_url>/data URL to get a list of all the data entities available in the system. From here, if I call a GET request on my Data Entity (the "Public Collection Name" property of the data entity, which we set in the Data Entity wizard), I'll get the corresponding response. Please note that this "Public Collection Name" is case sensitive.

Now, if I need to create a "Customer" record, I can simply pass the same keys into a POST request. And we can see the same in FnO.
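To see what this looks like from client code, here is a minimal Python sketch of the GET and POST calls. TestCustomers is the public collection name used later in this post; the CustomerId/CustomerName fields and the usmf company are illustrative assumptions, and the token is acquired the same way as for any F&O API (client credentials with the environment URL as the scope).

```python
import requests

FNO_URL = "https://<environment>.operations.dynamics.com"
TOKEN = "<access_token>"  # client credentials token with scope f"{FNO_URL}/.default"
headers = {"Authorization": f"Bearer {TOKEN}"}

# GET: list records from the entity's public collection name (case sensitive).
resp = requests.get(f"{FNO_URL}/data/TestCustomers", headers=headers)
print(resp.json()["value"])

# POST: create a record by passing the same keys the GET response returns.
new_customer = {"dataAreaId": "usmf", "CustomerId": "CUST-001", "CustomerName": "Contoso"}
resp = requests.post(f"{FNO_URL}/data/TestCustomers", headers=headers, json=new_customer)
print(resp.status_code)
```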
If we want to update a record, we make a PUT request with the syntax {{base_url}}/data/TestCustomers(dataAreaId='<Company Name>',CustomerId='<Customer Id>'). The URL must include all the Entity Keys defined on the Data Entity; as we only have one field here, we are simply passing that. Passing it without the dataAreaId will throw errors. You can delete the record using the same syntax but with the DELETE request.
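Continuing the sketch above, the PUT and DELETE calls might look like this; the key values and the CustomerName field remain illustrative.

```python
import requests

FNO_URL = "https://<environment>.operations.dynamics.com"
TOKEN = "<access_token>"
headers = {"Authorization": f"Bearer {TOKEN}"}

# The key segment must include every Entity Key defined on the data entity,
# including dataAreaId; omitting it throws an error.
key = "(dataAreaId='usmf',CustomerId='CUST-001')"

# PUT: update the record identified by its full entity key.
resp = requests.put(f"{FNO_URL}/data/TestCustomers{key}",
                    headers=headers, json={"CustomerName": "Contoso (Updated)"})
print(resp.status_code)

# DELETE: same key syntax, DELETE verb.
resp = requests.delete(f"{FNO_URL}/data/TestCustomers{key}", headers=headers)
print(resp.status_code)
```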
Conclusion

In this blog, we explored how to create APIs using Data Entities in Dynamics 365 Finance and Operations, simplifying data management and external system integrations. Data Entities offer an efficient way to handle bulk data operations, while Custom Services provide flexibility for exposing specific business logic. We'll see how to create APIs using Custom Services in the next blog.

We hope you found this article useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com
Foreign Currency Gains and Losses in Microsoft Business Central – Part 1
Introduction

In Microsoft Business Central, businesses that work with different currencies must manage foreign currency gains and losses. These gains or losses arise from changes in exchange rates between when a transaction happens and when it is settled or reported. In this blog, I will show you how to create a sales invoice and payment receipt using foreign currency. This will explain how Microsoft Dynamics 365 Business Central converts foreign currency to local currency and tracks realized or unrealized gains and losses. I will also show how to run the Adjust Exchange Rate batch job in Microsoft Dynamics 365 Business Central.

Understanding Foreign Currency and Setup

Foreign currency transactions involve buying, selling, or holding assets and liabilities in currencies other than the company's main currency. When these transactions are settled or revalued at the end of a period, changes in exchange rates can lead to a gain or loss. This difference must be correctly shown in the financial statements for accurate reporting and compliance. Business Central calculates foreign currency gains and losses based on changes in the exchange rate between the posting date of a sales or purchase invoice and the date of a payment or related entry.

Setting up Foreign Currency with Realized and Unrealized Gains and Losses Accounts

You need to set up a code for each currency you use if you do any of the following:
– Buy or sell in currencies other than your local currency (LCY)
– Record general ledger transactions in both LCY and an additional reporting currency (if required)

You can also search for 'Currencies' in the search toolbar. You can set the corresponding posting G/L Account No. for Realized and Unrealized Gains and Losses entries by editing the Currency Card.

Define Exchange Rate

Exchange rates are used to calculate the local currency (LCY) value of each currency transaction. In this example, I will use USD as the foreign currency (FCY) with the sample setup shown in the Currency Exchange Rates table. The first line of the table indicates that on or after 01.06.2024, USD transactions will be converted to LCY using the relationship 1 USD = 80 INR. This is effective until another entry with a later starting date takes over (i.e., 15.06.2024, where 1 USD = 81 INR, and so on).

Transactions

Creation of Sales Invoice using USD Currency

Amount: USD $5,000
Posting Date: 01.06.2024
Exchange Rate: 1 USD = 80 INR

The invoice below does not include GST or discounts, to make it easier to review. I have also checked that the correct currency code is selected in the Invoice Details section of the sales invoice. By clicking the assist button next to the currency code, you can see the exchange rate used to convert the transaction to local currency (LCY).

Note: If the Fix Exchange Rate Amount field in the Currency Exchange Rates table is set to "Currency" or "Relational Currency," you can change the Exchange Rate or the Relational Exchange Rate Amount on this page. If you don't want these to be changed during a transaction, set the value to "Both."

On the Sales Invoice Statistics page, you will see that the amount has already been converted to the local currency correctly using the exchange rate defined in the setup.

Reviewing General Ledger Entries of Posted Sales Invoice

Also take note of the FCY and LCY amounts posted in the Amount and Amount (LCY) fields of the Customer Ledger Entry.
Creation of Bank Receipt Entry using USD Currency

Amount: USD $5,000
Posting Date: 15.06.2024
Exchange Rate: 1 USD = 81 INR

Reviewing General Ledger Entries of Posted Bank Receipt

Apply Posted Bank Receipt Entry against Posted Sales Invoice

Refer to the screenshot below of the Customer Ledger Entries before applying the posted bank receipt payment entry: the Amount (LCY) differs between the two lines because of the difference in exchange rates. Now, we will apply the payment entry to the posted sales invoice, which was posted on 01.06.2024. Once applied, the Amount (LCY) is updated accordingly, and the balancing amount is transferred to the realized gain/loss account.

Reviewing General Ledger Entries of Posted Bank Receipt – After Application

In the screenshot above, the system has posted a Realized Gain account entry for the difference amount and adjusted the same in the Detailed Customer Ledger Entries to show the exact amount in Amount (LCY).
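A quick sanity check of the numbers, using the sample figures from this post:

```python
# Worked example using the sample figures from this post.
invoice_fcy = 5_000          # USD, posted 01.06.2024
rate_invoice = 80            # 1 USD = 80 INR on the invoice date
rate_receipt = 81            # 1 USD = 81 INR on the receipt date

invoice_lcy = invoice_fcy * rate_invoice   # 400,000 INR booked on the invoice
receipt_lcy = invoice_fcy * rate_receipt   # 405,000 INR booked on the receipt

# On application, the difference is posted to the realized gain account.
realized_gain = receipt_lcy - invoice_lcy
print(realized_gain)  # 5000 INR realized gain
```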
Conclusion

Managing foreign currency transactions in Microsoft Dynamics 365 Business Central is essential for businesses dealing with multiple currencies. By setting up the correct exchange rates and tracking realized and unrealized gains or losses, companies can ensure accurate financial reporting. I will demonstrate how to execute the Adjust Exchange Rates batch job in Part 2.

We hope you found this article useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com
Automating Access Token and Refresh Token Generation Using ADF and Azure Key Vault – Part 2
In continuation of Part 1, welcome to Part 2 of the blog on Automating Access Token and Refresh Token Generation Using ADF and Azure Key Vault. We have already completed the necessary setup in Part 1, so if you haven't read it yet, please do so before proceeding with this part.

Assumptions

Before going further, let's first discuss the assumptions we made:

Now, let's discuss the steps to create a pipeline to refresh the access token:

– Create a web activity to pull the client ID, client secret, and refresh token you created in Part 1.
– For settings, use this setup; the URI is your Azure Key Vault secret's Secret Identifier.
– Similarly, set up web activities for the client ID, client secret, and refresh token.
– For the refresh token, I have done the setup as shown, but you may want to change it according to your API's requirements.

Body:
grant_type=refresh_token&refresh_token=@{activity('Get Refresh Token').output.value}

Authorization:
Basic @{base64(concat(activity('Get Client Id').output.value, ':', activity('Get Client Secret').output.value))}

– After this, use another web activity to refresh the access token using the refresh token and save it to the Azure Key Vault.

Body:
{ "value": "@{activity('Refresh Access Token').output.access_token}" }
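For reference, the same flow expressed as a small Python sketch may make the pipeline's logic easier to follow: read the stored credentials from Key Vault, exchange the refresh token for a new access token (HTTP Basic auth with the client ID and secret, mirroring the Authorization expression above), and write the new token back as a secret. The vault URL, secret names, and token endpoint are placeholders for your own setup.

```python
import requests
from azure.identity import DefaultAzureCredential
from azure.keyvault.secrets import SecretClient

VAULT_URL = "https://<your-vault>.vault.azure.net"
TOKEN_URL = "https://<your-api>/oauth/token"  # your API's token endpoint

# Read the stored credentials (the secrets created in Part 1).
client = SecretClient(vault_url=VAULT_URL, credential=DefaultAzureCredential())
client_id = client.get_secret("client-id").value
client_secret = client.get_secret("client-secret").value
refresh_token = client.get_secret("refresh-token").value

# Exchange the refresh token for a new access token (Basic auth, as in the pipeline).
resp = requests.post(
    TOKEN_URL,
    data={"grant_type": "refresh_token", "refresh_token": refresh_token},
    auth=(client_id, client_secret),
)
resp.raise_for_status()

# Write the new access token back to Key Vault, like the final web activity.
client.set_secret("access-token", resp.json()["access_token"])
```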
Conclusion

This blog provides a comprehensive guide to automating the access token and refresh token generation process using Azure Data Factory and Azure Key Vault. By following the steps outlined, you can ensure seamless token management, reduce manual intervention, and maintain secure access to your resources.

We hope you found this article useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com
Automating HTML Email Notifications in Microsoft Dynamics 365 Business Central
Introduction

In this blog, we will explore how to create HTML-formatted email notifications in Microsoft Dynamics 365 Business Central using AL code. We will guide you through a practical example that sends an HTML email notification when a Posted Purchase Invoice is inserted.

Pre-requisite

– Microsoft Dynamics 365 Business Central (On-premises or Cloud)

Objective

Our goal is to automatically send an HTML email containing purchase order details whenever a new Purchase Invoice Header is created.

Step-by-Step Implementation

Before diving into the code, you need to set up the email functionality in Business Central to ensure the system can send emails.

Step 1: Set Up Email in Business Central

Open Business Central:
– Sign in to your Business Central account.
– Search for "Set Up Email" in the top-right search bar.

Configure Email:
– Choose SMTP as the email type and click "Next."
– Fill in the necessary details, such as the email account and authentication details, then click "Next" to finish the setup.
– Set the email account as the default if you have multiple email addresses.

Step 2: Create the Necessary Fields in Table and Page Extensions

Add a field in a table extension:
– Create a Boolean field named "GRN Notification" in the User Setup table extension. This field will ensure that the email is sent only to the users who require it.

```al
tableextension 51328 UserSetupExt extends "User Setup"
{
    fields
    {
        field(55005; "GRN Notification"; Boolean)
        {
            DataClassification = CustomerContent;
        }
    }
}
```

Add a field in a page extension:
– Add the "GRN Notification" field to the User Setup page extension to allow users to enable or disable notifications.

```al
pageextension 50102 UserSetupPageExt extends "User Setup"
{
    layout
    {
        addafter("Register Time")
        {
            field("GRN Notification"; Rec."GRN Notification")
            {
                ApplicationArea = All;
            }
        }
    }
}
```

Step 3: Create a Table Extension for the Purchase Invoice Header

This is where we extend the Purch. Inv. Header table to trigger a procedure that sends the email when a new record is inserted.

```al
tableextension 50101 PurchaseInvoiceHeader extends "Purch. Inv. Header"
{
    trigger OnInsert()
    begin
        GRNPostingtoPO(Rec);
    end;

    // The GRNPostingtoPO procedure from Step 4 is defined inside this same extension.
}
```

Step 4: Define the GRNPostingtoPO Procedure

This procedure handles the core logic of the email notification:

```al
procedure GRNPostingtoPO(PurchaseInvoiceHeader: Record "Purch. Inv. Header")
var
    UserSetup: Record "User Setup";
    EmailMessage: Codeunit "Email Message";
    Email: Codeunit "Email";
    PurchaseLine: Record "Purchase Line";
    PurchaseHeader: Record "Purchase Header";
    HtmlBody: Text;
begin
    // Find the corresponding Purchase Header using the "Order No."
    PurchaseHeader.SetRange("No.", PurchaseInvoiceHeader."Order No.");
    // If the Purchase Header exists, retrieve the related Purchase Lines.
    if PurchaseHeader.FindFirst() then begin
        PurchaseLine.SetRange("Document No.", PurchaseHeader."No.");
        if PurchaseLine.FindSet() then begin
            // Build the HTML email body with the purchase order details.
            HtmlBody := '<html><body>' +
                '<p>Hello Team,</p>' +
                '<p>Please find the attached purchase order details.</p><br>' +
                '<p>Purchase Order has been created successfully.</p>' +
                '<h3>Purchase Order No. ' + PurchaseInvoiceHeader."No." + '</h3>' +
                '<table border="1" style="border-collapse: collapse; width: 100%;">' +
                '<tr>' +
                '<th style="padding: 8px; text-align: left; background-color: #f2f2f2;">IDS No.</th>' +
                '<th style="padding: 8px; text-align: left; background-color: #f2f2f2;">ITEM No.</th>' +
                '<th style="padding: 8px; text-align: left; background-color: #f2f2f2;">Item Description</th>' +
                '<th style="padding: 8px; text-align: left; background-color: #f2f2f2;">Quantity</th>' +
                '</tr>';
            // Loop through the Purchase Lines and add them to the HTML body.
            repeat
                HtmlBody += '<tr>' +
                    '<td style="padding: 8px;">' + PurchaseLine."Document No." + '</td>' +
                    '<td style="padding: 8px;">' + PurchaseLine."No." + '</td>' +
                    '<td style="padding: 8px;">' + PurchaseLine.Description + '</td>' +
                    '<td style="padding: 8px;">' + Format(PurchaseLine.Quantity) + '</td>' +
                    '</tr>';
            until PurchaseLine.Next() = 0;
            // Close the HTML table and body.
            HtmlBody += '</table>' +
                '<p>This is an auto-generated mail; for any concerns related to purchase, please contact the ERP Team.</p>' +
                '</body></html>';
            // Send the email to every user who has GRN Notification enabled.
            UserSetup.SetRange("GRN Notification", true);
            if UserSetup.FindSet() then
                repeat
                    EmailMessage.Create(UserSetup."E-Mail", 'Purchase Order Posted', HtmlBody, true);
                    Email.Send(EmailMessage, Enum::"Email Scenario"::Default);
                until UserSetup.Next() = 0;
        end;
    end;
end;
```

Output:

Conclusion

By following these steps, you can create HTML-formatted email notifications in Microsoft Dynamics 365 Business Central. This method ensures that users receive detailed and well-structured notifications, which enhances communication and workflow efficiency within your organization.

We hope you found this article useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com
Introduction to Azure Service Bus and Its Use Case
Introduction

Azure Service Bus is a fully managed, multi-tenant cloud messaging service functioning as a brokered messaging system. In a service-oriented architecture (SOA), application components interact through communication protocols over a network, facilitated by the Service Bus. This article provides an overview of Azure Service Bus, highlighting its role in integrating systems like Microsoft Dynamics 365 CRM with third-party e-commerce platforms.

Real-World Scenario: Integrating Dynamics 365 CRM with an E-commerce Platform

Azure Service Bus is instrumental in enabling seamless interaction between Dynamics 365 CRM and external e-commerce applications, enhancing data management and operational efficiency.

– Customer Data Synchronization: Customer data from the e-commerce platform is transferred to Dynamics 365 CRM using Service Bus queues, ensuring the CRM system reflects the latest information.
– Order Processing: When an order is placed, it triggers a message to Dynamics 365 CRM, streamlining order fulfilment and tracking through Service Bus topics and subscriptions.
– Inventory Management: Inventory levels are updated in real time across both systems. Messages sent through Service Bus ensure accurate stock levels, preventing overselling.
– Customer Support Integration: Customer support tickets from the e-commerce platform are channelled to Dynamics 365 CRM, providing a comprehensive view of customer interactions and improving support quality.

Use Case: Real-Time Data Synchronization Between Dynamics 365 CRM and Finance & Operations

Scenario: Imagine a company that uses Microsoft Dynamics 365 CRM for customer relationship management and Dynamics 365 Finance & Operations (F&O) for financial and inventory management. To ensure consistent and accurate data across these systems, especially regarding inventory levels, real-time data synchronization is essential.

Solution: In this integration scenario, the goal is to synchronize inventory levels between Microsoft Dynamics 365 CRM and Finance and Operations (F&O) systems to ensure real-time accuracy. The process starts with Dynamics 365 CRM, where changes in inventory, such as sales or restocking, trigger an event. This event generates a message containing the updated inventory details, which is then sent via Azure Service Bus. Azure Service Bus serves as a reliable messaging service that decouples the CRM and F&O systems, facilitating smooth communication between them.

Once the message reaches Azure Service Bus, it is picked up by an Azure Logic App. The Logic App orchestrates the integration process, potentially using Azure Functions for tasks such as data transformation, validation, or enrichment. For instance, it may convert the message into a format compatible with the F&O system, such as OData, a standard protocol for data exchange. After processing, the transformed data is sent to the F&O system, where the inventory levels are updated accordingly.

This setup ensures that inventory records are synchronized in real time across both systems, preventing issues like overselling by maintaining up-to-date stock levels. The use of Azure Service Bus and Logic Apps not only supports real-time communication but also offers a scalable and flexible integration solution that can adapt to evolving business needs. Key benefits of this approach include real-time updates, fault tolerance through message persistence and retry logic, and the flexibility to scale and integrate systems without tight coupling.
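To make the message flow concrete, here is a minimal sketch using the azure-servicebus Python SDK: the CRM side publishes an inventory-update message to a queue, and an integration worker receives it for forwarding to F&O. The connection string, queue name, and payload fields are illustrative placeholders.

```python
import json
from azure.servicebus import ServiceBusClient, ServiceBusMessage

CONN_STR = "<service-bus-connection-string>"
QUEUE = "inventory-updates"  # illustrative queue name

# Sender side (e.g., triggered by a CRM inventory change): publish the update.
with ServiceBusClient.from_connection_string(CONN_STR) as client:
    with client.get_queue_sender(QUEUE) as sender:
        payload = {"itemId": "ITEM-001", "quantity": 42, "source": "CRM"}
        sender.send_messages(ServiceBusMessage(json.dumps(payload)))

# Receiver side (e.g., the integration worker feeding F&O): process and complete.
with ServiceBusClient.from_connection_string(CONN_STR) as client:
    with client.get_queue_receiver(QUEUE, max_wait_time=5) as receiver:
        for msg in receiver:
            update = json.loads(str(msg))
            print("Forwarding to F&O:", update)
            receiver.complete_message(msg)  # remove from the queue once handled
```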
Azure Service Bus Queues, Topics, and Subscriptions

Azure Service Bus offers Queues, as well as Topics and Subscriptions, as core features, enabling different messaging patterns to suit various use cases. Queues facilitate point-to-point communication, while Topics and Subscriptions support a publish-subscribe model. This flexibility allows for efficient data transfer and processing across applications. Stay tuned for my next post, where we'll explore the specific scenarios in which to use queues versus topics and subscriptions.

Conclusion

Azure Service Bus provides a versatile and reliable messaging solution for building scalable, decoupled distributed applications. By integrating seamlessly with the broader Azure ecosystem, Service Bus empowers developers to create efficient communication channels, enhancing the performance and reliability of their applications. Whether you're modernizing existing systems or developing new cloud-native applications, Azure Service Bus is an essential tool for delivering an excellent user experience.

We hope you found this article useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com
Automating Access Token and Refresh Token Generation Using ADF and Azure Key Vault – Part 1
Introduction

In this blog, I will explain how we can automate the generation of access tokens and refresh tokens. When working with APIs, a common problem is the expiration of the access token or refresh token after some time. We solved this issue by using Azure Data Factory and Azure Key Vault. Azure Key Vault is used for storing API credentials, as it is one of the most secure ways to store keys and secrets in Azure. Azure Data Factory is used to automate the process of generating access tokens for APIs. We are dividing this blog into two parts:

Before we proceed with the blog, please test your API in Postman to identify the API's requirements for generating access tokens. For me, these are the client ID, client secret, and refresh token.

Steps to Set Up Azure Key Vault and Azure Data Factory

– Go to the Azure portal and create a Key Vault resource. Please make sure that your Key Vault and Azure Data Factory are in the same region.
– Create a secret via Generate/Import and enter the required details. I have already created the secrets I need.
– For the access token, you can keep the initial value as anything you want; we will update it later using an ADF pipeline.
– Set up an access policy for the Azure Data Factory to access our Key Vault. To do this, go to "Access Policy" and select the appropriate options.
– Click "Next" and select your Azure Data Factory, where you will be creating a pipeline for refreshing the access token.
– Now, go to Azure Data Factory Studio and set up the linked service for your API in Azure Data Factory.
– The dataset is also pretty straightforward; I prefer to use a parameter for the relative URL so that I can reuse the same dataset and just set the URL of the API I want to call at runtime.
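Once the secrets exist, it is worth confirming they are retrievable by the identity you granted access to. Here is a small sketch using the azure-keyvault-secrets SDK; the vault URL is a placeholder, and the secret names match the ones created above (adjust them to your own naming).

```python
from azure.identity import DefaultAzureCredential
from azure.keyvault.secrets import SecretClient

VAULT_URL = "https://<your-vault>.vault.azure.net"

# DefaultAzureCredential works locally (az login) and with managed identities,
# which is how ADF authenticates to Key Vault once the access policy is set.
client = SecretClient(vault_url=VAULT_URL, credential=DefaultAzureCredential())

for name in ("client-id", "client-secret", "refresh-token", "access-token"):
    secret = client.get_secret(name)
    print(name, "->", "retrieved (value hidden)" if secret.value else "empty")
```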
Conclusion

That's all for the setup in Part 1. We've covered the essential steps to set up Azure Key Vault and Azure Data Factory for securely managing API credentials, laying the groundwork for automating access token generation. These tools provide a reliable and secure way to handle token expiration, ensuring smooth API operations without manual intervention. In Part 2, we will discuss in detail how to automate access token generation using Azure Data Factory.

We hope you found this article useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com
Automating Opportunity Timeline Updates for Owners and Sales Teams in Dynamics 365 using Power Automate
What is an Opportunity in D365 CRM?

The opportunity table represents a potential sale to new or established customers. It helps you to forecast future business demands and sales revenues. You can create a new opportunity in Dynamics 365 for Customer Engagement to monitor an inquiry from an existing customer or convert a qualified lead into an opportunity. Opportunities are frequently used by salespeople to monitor the sales engagements they are presently working on. For more details, please follow the link: https://learn.microsoft.com/en-us/dynamics365/sales/developer/opportunity-entities

What are Notes in the Timeline Section of D365 CRM?

The timeline makes the entire history of activities visible to app users. The timeline control is used to capture activities like notes, appointments, emails, phone calls, and tasks, ensuring that all interactions with the related table are tracked and visible over time. Use the timeline to quickly catch up on all the latest activity details. For more details, please follow the link: https://learn.microsoft.com/en-us/power-apps/maker/model-driven-apps/set-up-timeline-control

Use Case

Using Power Automate, whenever the notes in an Opportunity's timeline are updated, an automated email is sent to the Opportunity Owner and the Sales Team associated with that Opportunity.

Steps

– Log in to make.powerautomate.com with your CRM credentials, and you will land on the home page.
– Once you have landed on the Power Automate page, click on Create and select Automated Cloud Flow.
– Set your trigger as below, since the flow should start working only when notes are added in the timeline of the Opportunity.
– I have also set a condition on this flow: it checks whether there is a non-empty value in _objectid_value and that this value's type is 'opportunities'. The expression returns true if both requirements are satisfied and false otherwise:
@and(not(empty(triggerOutputs()?['body/_objectid_value'])), equals(triggerOutputs()?['body/_objectid_type'], 'opportunities'))
– Initialize a variable for the email addresses.
– Initialize a variable for the notes table.
– Retrieve the Owner's email address. This step is necessary to obtain the Opportunity Owner's email address so that we can send the initial notification email to them.
– List all notes in the Opportunity. We use the List rows action to retrieve the Note records associated with the Opportunity, which gives us access to all the notes within the Opportunity's timeline.
– Get the Opportunity by ID. Here we fetch the complete details of the record to access all the fields and data associated with it, filtering on name and opportunity ID. The row ID is typically obtained from another step in your flow, such as a List rows action or a trigger that provides record details; in this case, the record details come from the Opportunity.
– After retrieving the record, we need to fetch the details of the associated Sales Team. This ensures that whenever a record is linked to the Sales Team, all members of the Sales Team receive an email notification. Thus, we connect the Sales Team to the Opportunity to include them in the notification process. The FetchXML for this can be taken from Advanced Find in CRM.
– In this step, we store the email address of the sender (i.e., the "From" user). We initialized this variable earlier, so here we save the sender's GUID into that variable.
– In this step, we save the email address of the recipient (i.e., the "To" user), storing the recipient's GUID into the variable initialized earlier. A participationtypemask of 2 indicates a specific participant role or type, i.e., the Sales Team members associated with this Opportunity.
– Next, we ensure that the content is structured within a table. As noted above, I created a variable called NotesTable to hold this data; we use it to format the content into an HTML table for the email.
– In this step, we configure the URL link for the Opportunity: include the base URL of the environment and append the unique identifiers for both the Opportunity and the Topic field (a field within the Opportunity).
– Send an Outlook email to the Opportunity Owner and the Sales Team associated with the Opportunity. This Outlook mail is used only if the 'Email Addresses Sales Team' step is skipped.
– In Power Automate, adding a new row typically involves using actions provided by connectors such as Dataverse, SQL Server, or SharePoint, depending on where your data is stored. Here, we create the email body in this action.
– In this step, we use a bound action to send emails within the CRM system.

Output

– Once I click Add Note and wait 5-10 seconds, the email appears within the timeline. Please note that the email is tracked within CRM itself.
– Below is the Opportunity, which contains the Opportunity Owner and the Sales Team associated with that Opportunity. The Owner is CF Admin, and the Sales Team members are Amit Prajapati, Ethan Rebello, and Mithun Varghese.
– The Opportunity Owner and Sales Team will receive notifications about the notes in the timeline. All email interactions are tracked in the Opportunity's timeline, where you can also view all previous notes associated with it.

Conclusion

Automating Opportunity timeline notifications using Power Automate in Dynamics 365 CRM offers an efficient way to streamline your workflows and reduce manual errors. By setting up automated email notifications for updates in the Opportunity's timeline, your sales team and opportunity owners stay informed, ensuring smoother communication and faster response times.

We hope you found this article useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com