Blog Archives - Page 17 of 151

Category Archives: Blog

Gain Business Insights faster by generating Power BI Reports quickly with just 1 click in Dataverse

Hi All, I'm going to show a useful feature that you can leverage to view and create instant Power BI visuals, generated automatically from the current view. Documentation Link

How it looks: (see example screenshot)

Steps to achieve this:

Step 1: Enable the feature in the Model-Driven App itself: Edit Model-Driven App -> Settings -> Features -> 'Enable Power BI quick report visualization on a table'. Save and publish the settings.
Note: You also need the 'TDS endpoint' enabled in the environment feature settings.

Step 2: Refresh your browser and navigate to any table's records view (I used Cases in this example).

Step 3: Click the 'Visualize this view' button on the command bar.
Note: Add the necessary columns to the current view if you want them to appear in the Power BI report.

Step 4: The report is generated automatically within a few minutes. You can save the report if it displays all the information you need.

Hope this helps you gain faster business insights with auto-generated Power BI visuals.

How to use Dataverse Global In-App Notification for Real-Time Notification in Model-Driven PowerApps

Hi All, have you ever wondered how to use Dataverse In-App Notifications? They can be helpful in many scenarios, so I'll give an overview of their usage with a use case. Documentation Link: Business Insights in Real-Time - Documentation

This is the In-App Notification. Let's see how to create one.

Step 1: To use In-App Notifications, we first need to enable them: go to the editor of the Model-Driven App -> click Settings -> click 'Features' -> enable 'In-App Notification'.
Note: In-App Notifications create records in Dataverse, so storage is consumed. There is also a time-based auto-deletion that an Admin can configure.

Step 2: Save and publish your Model-Driven App. You will see a bell icon at the top right of the app.

Step 3: Now we will create the notification. There are three ways to create it:
1. Using JavaScript (used in this example)
2. Using C# plugins (Dataverse SDK)
3. Using an HTTP request (Web API)

Step 4: As an example, I will send a notification to myself, but you can configure it to send to anyone in the organization.
Note: You must have the necessary permissions; an Admin can grant these privileges by customizing security roles for the 'app notification' table. Refer to the documentation link above for the display formats you can use.

Step 5: I created a web resource that triggers when I manually close a case, so when I close a case, here's the output (a minimal code sketch follows below).

Note: You can automate these and send them to anyone. Some examples:
1. Automatically send a notification to the assigned person when a new case is created.
2. Automatically notify the team whenever a critical ticket is raised.
3. Send a notification whenever a business-critical flow fails.

One of the best ways to use this feature is to build a rich user experience with the icons and formatting options available: icons, font styling, and mentioning anyone.

That's how we can achieve In-App Notifications. Hope this helped you!
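The original web resource isn't reproduced on this archive page, so here is a minimal JavaScript sketch of how such a notification could be created with the Dataverse Web API. The column names follow the 'appnotification' table documented by Microsoft; the recipient GUID and the icon/toast option values are placeholders to adapt to your environment.

function sendCaseClosedNotification(executionContext) {
    var formContext = executionContext.getFormContext();
    var caseTitle = formContext.getAttribute("title").getValue();

    // Creates a record in the 'app notification' (appnotification) table.
    // Replace the GUID below with the systemuser id of the recipient.
    var notification = {
        "title": "Case closed",
        "body": "The case '" + caseTitle + "' has been closed.",
        "ownerid@odata.bind": "/systemusers(00000000-0000-0000-0000-000000000000)",
        "icontype": 100000000,   // placeholder choice value (e.g. Info)
        "toasttype": 200000000   // placeholder choice value (e.g. Timed)
    };

    Xrm.WebApi.createRecord("appnotification", notification).then(
        function (result) { console.log("Notification created: " + result.id); },
        function (error) { console.log(error.message); }
    );
}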

Trigger Power Automate Flow using JavaScript – Bi-Directional

Hi All, this blog is a continuation of my previous one: Trigger Power Automate Flow using JavaScript – Uni-Directional.

When sending a request, feedback is essential to determine whether it was performed successfully or failed somewhere. You can accomplish this by sending a response back from the Flow to wherever it was invoked. I'll use the same scenario as in my previous blog, where I send a notification greeting the person by name if it exists, or otherwise greet a friend. Check out my previous blog to learn how to build the Flow and the JavaScript.

Steps to pass the response back within the Flow

Step 1: Add a Response action that sends the result back to where the Flow was invoked.
Quick Tip: I check whether 'Name' is present in my dynamic content. If yes, I greet the person; otherwise I greet a Friend.
Formula: if(contains(triggerBody()?['DynamicData'], 'Name'), triggerBody()?['DynamicData']['Name'], 'Friend')

Steps to add to the JavaScript

Step 1: We initially created the JS to trigger the Flow; now we add a code snippet to accept the response from the Flow (a sketch is shown below). Add the following code:
Step 2: Trigger the JS and watch the output I get as an alert (I used the browser console to trigger my JS for demonstration purposes).

Hope this helps in receiving a response from your Power Automate Flow!
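Since the original snippet isn't reproduced here, below is a minimal sketch of what reading the Response action's output could look like. The flow URL is a placeholder, and the payload shape (a 'DynamicData' object with an optional 'Name') is assumed from the scenario above.

async function triggerFlowWithResponse() {
    // Placeholder: paste the HTTP POST URL copied from the flow trigger here.
    var flowUrl = "<HTTP POST URL from the 'When a HTTP request is received' trigger>";

    var payload = {
        DynamicData: { Name: "John" }   // omit 'Name' to receive the 'Friend' greeting instead
    };

    var response = await fetch(flowUrl, {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify(payload)
    });

    // The body returned by the flow's Response action.
    var result = await response.text();
    alert(result);
}

// Run from the browser console, as in the example above:
// triggerFlowWithResponse();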

Trigger Power Automate Flow using JavaScript – Uni-Directional

Hi All, did you know you can use JavaScript to trigger Power Automate flows and pass input data? In this post, I'll show you how to do that, as well as how to pass strictly structured data and a dynamic schema to Power Automate. In the next blog, I'll cover Trigger Power Automate Flow using JavaScript – Bi-Directional.

Steps to follow for the initial setup

Step 1: Create a Power Automate Flow and define the input JSON schema. Go to Power Automate and create an Instant Flow with the trigger 'When a HTTP request is received'.

Step 2: Outline the input schema, then check the output in a 'Compose' block. I'll describe two types of inputs, strict and dynamic: the strict schema follows a specific pattern that dictates how the input must be provided, while the dynamic schema has an unknown pattern, so the input will not always be fixed. Click 'Use sample payload to generate schema', paste the sample payload, click Done, and you will see the schema in 'Request Body JSON Schema'. Add a Compose block to check the output of the request and save the Flow. A URL is generated and is ready to be used.

Let's now create the JavaScript to trigger this Flow.

Steps to follow for calling the Flow using JavaScript

Since I'll show the code snippet, adjust it as per your use case.
Note: Copy the HTTP POST URL from the trigger, as it will be used in the JavaScript.

Step 1: Type the following code (a sketch is shown below).
Step 2: Execute the JS with 'TriggerFlow.Main()'.
Note: Make sure you pass the Execution Context to the JS.
Step 3: Check your Power Automate Flow run history and open the run.

That's how Power Automate is triggered using JavaScript. Hope this has helped you 🙂

Next blog: Trigger Power Automate Flow using JavaScript – Bi-Directional
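The original snippet isn't reproduced on this archive page, so here is a minimal sketch of what such a script could look like. The HTTP POST URL is a placeholder, and the 'StrictData' property name is illustrative; only 'DynamicData' is confirmed by the follow-up post.

var TriggerFlow = TriggerFlow || {};

TriggerFlow.Main = function (executionContext) {
    var formContext = executionContext.getFormContext();

    // Placeholder: paste the HTTP POST URL copied from the flow trigger here.
    var flowUrl = "<HTTP POST URL from the 'When a HTTP request is received' trigger>";

    // Illustrative payload with one strictly structured part and one dynamic part.
    var payload = {
        StrictData: {
            RecordId: formContext.data.entity.getId()
        },
        DynamicData: {
            Name: "John"
        }
    };

    var request = new XMLHttpRequest();
    request.open("POST", flowUrl, true);
    request.setRequestHeader("Content-Type", "application/json");
    request.send(JSON.stringify(payload));
};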

Get Owners of a Teams Channel Using Power Automate Flow

Posted On May 2, 2023 by Vidit Gholam

With Power Automate, it has become easier to post automated messages, alerts, and approvals in Microsoft Teams. In this blog we will explore some Power Automate actions that help us send these alert messages and approvals to Microsoft Teams channel owners only. So let's begin!

Let's say we have a Teams channel with members, as shown in the snapshot below, and we need to send approvals to the owners of the channel only. Here is how it is done:

Step 1: In the Power Automate flow, search for the Office 365 Groups actions and select the 'List all members' action, then select the Team's name from the dropdown. (Note: when a Team is created, it forms an Office 365 group.)

Step 2: Here we use the Microsoft Graph API to get the owners of the group; more about it in the docs: https://learn.microsoft.com/en-us/graph/api/group-list-owners?view=graph-rest-1.0&tabs=http
API: GET /groups/{id}/owners
To get the group ID, go to https://admin.microsoft.com/ and follow the snapshot below. (Note: you need admin privileges to get the ID of the group.)

Step 3: After we run the flow, we get the output for Step 2 as shown in the snapshot below. We now need the 'mail' property from each item in 'value', so we use a Select action in Power Automate to extract the emails from the Step 2 values.

Step 4: Finally, we use a join expression to combine the emails, separated by ';', so that we can use them in the Outlook action (see the sketch after this post).

Power Automate flow screenshots and output: see the snapshots.

Hope this helps 😉
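To make the data shaping concrete, here is a conceptual JavaScript sketch of what the Select action and the join expression do with the Graph response; the owner names and addresses are made-up sample values for illustration only.

// Shape of the response from GET /groups/{id}/owners (sample values).
var graphResponse = {
    value: [
        { displayName: "Owner One", mail: "owner.one@contoso.com" },
        { displayName: "Owner Two", mail: "owner.two@contoso.com" }
    ]
};

// Select action: pick the 'mail' property from each item in 'value'.
var ownerMails = graphResponse.value.map(function (owner) { return owner.mail; });

// Join expression: combine the addresses with ';' for the Outlook 'To' field.
var toField = ownerMails.join(";");
// toField === "owner.one@contoso.com;owner.two@contoso.com"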

How to Use Solution Checker to identify usage of the OrganizationData.svc endpoint (OData deprecation for web resources)

The Organization Data Service is an OData v2.0 endpoint introduced with Dynamics CRM 2011. It was deprecated with Dynamics 365 Customer Engagement v8.0 in favor of the Web API, an OData v4.0 service. For more details, see OData v2.0 Service removal date announcement | Microsoft Power Apps: https://powerapps.microsoft.com/en-gb/blog/odata-v2-0-service-removal-date-announcement/ To identify usage of the deprecated endpoint in your old JavaScript web resources, follow the steps below (an illustration of the deprecated pattern follows after the steps).

Step 1: Log in to the required Power Apps environment at make.powerapps.com with your username and password, and select your environment accordingly.
Step 2: Go to Solutions and click [+ New solution] in the menu bar.
Step 3: Name your solution and fill in all the details, including the Publisher and the Version.
Step 4: Open your solution and select the 'Add existing' option. Click More and select Web resource.
Step 5: Search for your web resources using your custom publisher prefix. For example, the prefix might be new_ or abc_, depending on how the publisher was named.
Step 6: Select all the web resources you require. Once done, go back to the solution, click the ellipsis (3 dots) on your solution, click Solution checker, and select Run.
Step 7: You can also view the run status of the solution check.
Step 8: Click the ellipsis (3 dots) on the solution again, click Solution checker, and choose Download results. The results are downloaded as an Excel (.xlsx) file. Search that sheet for the issue web-avoid-crm2011-service-data.

Hope this helps!
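For context, here is a hedged illustration of the kind of call the checker flags and a Web API equivalent; the organization URL, table, and column names are placeholders.

// Deprecated OData v2.0 (Organization Data Service) style call, flagged by the solution checker:
//   GET <org url>/XRMServices/2011/OrganizationData.svc/AccountSet?$select=Name
//
// Web API (OData v4.0) replacement:
//   GET <org url>/api/data/v9.2/accounts?$select=name
//
// In form scripts, prefer the built-in client API over hand-built endpoint URLs:
Xrm.WebApi.retrieveMultipleRecords("account", "?$select=name").then(
    function (result) { console.log(result.entities); },
    function (error) { console.log(error.message); }
);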

How to change a Product Number in Dynamics 365 F&O

In this blog, we will learn how to change a product's Product Number. For this blog, I created a new product, Hand Gloves, with Product Number P-000015 and released it into the system.

The path for changing the Product Number is: Product information management > Products > Products.

Step 1: Click Change number under the Maintain tab. Here, my Product Number is P-000015, which I will change to P-000020.
Step 2: Enter the new Product Number and click OK. The screenshot shows that the Product Number has now changed from P-000015 to P-000020.

Note: It is recommended to rename a product only if no transactions exist for it. If the product already has transactions, it is better to create a new product.

Initialize Retail Commerce Scale Unit (Cloud)

In this blog, I am going to show how to initialize a Retail Commerce Scale Unit (cloud). If you're using a Tier-2 sandbox or production environment that has application version 8.1.2.x or later, you must initialize a Commerce Scale Unit (cloud) before you can use retail channel functionality, either for point of sale (POS) operations or for e-Commerce operations that use Retail Server in the cloud. Initialization deploys a Commerce Scale Unit (cloud).

Prerequisites: Deploy a Tier-2 sandbox or production environment that has application version 8.1.2.x or later.

Initialize Commerce Scale Unit as part of a new environment deployment:
1. Make sure the headquarters environment is available and not in maintenance mode. In LCS, on the environment details page, click Manage.
2. Click Initialize.
3. Select your region, click Initialize, and confirm with OK.
4. Inside HQ, go to the channel database; a new channel database is created. Add your channels to that channel database.
5. Go to the channel, click on the channel profile, and select the new channel database. Run the 9999 jobs from the channel database.

I hope this helps!

Business Central 2023 wave 1 (BC22) new features: AL Explorer and AL Home in Visual Studio Code AL extension

Introduction: Business Central 2023 wave 1 (BC22) introduces two new features in the Visual Studio Code AL extension: AL Explorer and AL Home.

Steps:
1. Download and install the next major version of the AL Language extension (v11).
2. Below is the download link for the AL Language extension so you can experience the new development features. Click this link: ALLanguage v11.0.759316. Extract the folder; inside it you will find a VSIX extension file.
3. Uninstall your previous AL Language extension from the Visual Studio Code Extensions marketplace.
4. Go to the Visual Studio Code Extensions view, click the 3 dots, and click 'Install from VSIX'.
5. Select the VSIX extension file and click Install. The AL Language extension is now installed.
6. When we open VS Code, we first see AL Home, shown below.
7. A shortcut for AL Explorer: run 'AL: Explorer' or press Ctrl + Shift + F12. Below is the AL Explorer page.
8. We can view, search, and filter objects:
a. Group By
b. Module
c. Go to Source Code: for a selected object, we can quickly jump to its source code, whether to develop or to read.
d. Bookmark: we can bookmark objects; a bookmarked object is shown below.
e. API: AL Explorer also gives an overview of all APIs.
f. Events: AL Explorer also gives an overview of all events.
g. Extensible Enums: AL Explorer also gives an overview of all extensible enums.

Thank you, I hope this helps!

Remove duplicate values from an array using Power Automate Flow

Hello everyone! In this blog, we will demonstrate how to schedule a flow that sends emails to all the owners of Opportunities using Power Automate, and how to remove duplicate values from an array along the way. Scheduled flows are the best option for actions that need to be automated on a schedule and run at a specific time or date; for instance, you might schedule a daily data upload to Dynamics 365 or SharePoint.

Use case: a Power Automate flow sends a reminder notification to the owners of Opportunities. To avoid sending multiple emails to the same owner, we need to remove duplicate email addresses.

Step 1: Log in to the required Power Apps environment at make.powerapps.com with your username and password, click Create on the left-hand side, and click Scheduled cloud flow.
Step 2: Set the date and time; it will appear as shown.
Step 3: Initialize a variable 'Email_addresses' of type Array, which will store all the email addresses.
Step 4: Use List rows to find the desired records from the required table. Select the table name; I also added FetchXML to fetch the required details of the entity.
Step 5: Add an 'Apply to each' loop and insert the previous step's value, so that the value from every iteration is stored in the array variable. Add a 'Set variable' step to store the email addresses in the array.
Step 6: Add a 'Compose' step and declare an empty array. This helps save time when comparing with larger sets.
Step 7: Apply the union function, which removes the duplicate values (see the sketch after this post): union(variables('Email_addresses'), outputs('Empty_Array'))
Step 8: Apply the join function to extract the email addresses: join(array(outputs('Union_function_to_remove_duplicate_mail_Values_in_Array')), ',')

Hope this helps!
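To illustrate what the union and join expressions do, here is a conceptual JavaScript sketch of the same deduplication; the addresses are made-up sample values, and the Set-based dedup simply mirrors the behavior of the flow expressions above.

// Sample contents of the 'Email_addresses' array variable (made-up values).
var emailAddresses = ["a@contoso.com", "b@contoso.com", "a@contoso.com"];

// Mirrors union(variables('Email_addresses'), outputs('Empty_Array')):
// union with an empty array keeps only the distinct values.
var uniqueAddresses = Array.from(new Set(emailAddresses));

// Mirrors join(array(...), ','): combine the distinct addresses into one string.
var joined = uniqueAddresses.join(",");
// joined === "a@contoso.com,b@contoso.com"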
