Category Archives: Blog
Create a marketing segment in Dynamics 365 for Marketing
Introduction: In this blog we will discuss marketing segments in Dynamics 365 for Marketing (Preview). Marketing is essentially the promotion of products or services that an individual or organization has to offer. The main aim while marketing is to decide on a target audience; once the audience is known, the marketing content can be tailored to target that specific group of people. A market segment is the collection of contacts that you target in a marketing campaign. There are three types of segments, as follows: Static, Dynamic and Compound.

Implementation:

Static segments: Members of a static segment are added explicitly. The marketer can add these members based on the personal interactions he or she has had with the customers.
Step 1: Here we create a new static segment and, on the Definitions tab, select the members using the check boxes as shown below. Initially the segment is in the Draft state. A point to note is that segment names can't contain blank spaces.
Step 2: We then save and click Check for Errors on the top ribbon, which validates the changes made, as shown.
Step 3: After the validation passes successfully, we can click the Go Live button on the top ribbon.

Dynamic segments: The members of a dynamic segment keep changing based on the criteria that are set up. Every time a new contact is added to the system and it fulfils the condition specified in the query, it gets added to the dynamic segment.
Step 1: In the designer, at the bottom of the page, we can see the total segment size, which is the total number of members who will be added to the segment. In dynamic segments we design rules using the Designer, Flow or Query. Once we design the view using the Designer, we can see the query populated on the Query tab. Three operators can be used: is, is not and contains (a conceptual sketch of such a filter follows at the end of this post).

Compound segments: The third and final type of segment is the compound segment. Compound segments make use of static and dynamic segments that are live. As seen in the image below, we first have to select a segment from the drop-down, and only segments that are live will appear in the drop-down list. Once the segment is selected, we click the arrow and then select one of Union/Exclude/Intersect. Finally, at the top of each segment we see how many members are included or excluded.

Conclusion: These segments are used in customer journeys when we want to send marketing emails to a particular group of people.
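The dynamic segment query shown in the designer uses Dynamics 365 for Marketing's own query language, but conceptually it is just a filter over contacts. The hedged sketch below uses the standard Dynamics 365 Web API rather than the segment designer's syntax, and only illustrates the kind of contact filter a dynamic segment keeps re-evaluating; the organization URL, access token and the "Seattle" condition are illustrative placeholders.

```csharp
using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Threading.Tasks;

class DynamicSegmentFilterSketch
{
    static async Task Main()
    {
        // Placeholder organization URL and access token -- replace with your own.
        var orgUrl = "https://yourorg.crm.dynamics.com";
        var accessToken = "<access-token>";

        using (var client = new HttpClient())
        {
            client.DefaultRequestHeaders.Authorization =
                new AuthenticationHeaderValue("Bearer", accessToken);

            // Roughly the same condition a dynamic segment built as
            // "City is Seattle" would evaluate as contacts are added or changed.
            var query = "/api/data/v9.0/contacts?$select=fullname&$filter=address1_city eq 'Seattle'";
            var response = await client.GetAsync(orgUrl + query);
            Console.WriteLine(await response.Content.ReadAsStringAsync());
        }
    }
}
```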
How to Pause SLA in Dynamics 365
Introduction: This blog explains how to pause an SLA in Dynamics 365.
Steps to be followed:
Only an Enhanced SLA can be paused.
Go to Settings –> Administration –> System Settings –> Service.
Select the entity for which the SLA is created. (I have created the SLA for Work Order, so I will select Work Order from that list.)
Now, from Available Values, select the status that will pause the SLA. For the Work Order entity we will create a new status called "On Hold", because it does not have any existing status that we can use to pause the SLA. Save and publish.
Now select "On Hold".
Go to the Work Order entity, set the Status to "On Hold" and save. The SLA is now paused. A hedged sketch of making the same status change from code follows below.
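The steps above pause the SLA from the UI by moving the record into the "On Hold" status. As a hedged sketch only, the same effect can be achieved from code by updating the record's status reason with the Dynamics 365 SDK; the entity logical name, the field and the option value below are placeholders that depend on how the "On Hold" status was created in your environment.

```csharp
using System;
using Microsoft.Xrm.Sdk;

class PauseSlaSketch
{
    // Assumes an already authenticated IOrganizationService instance.
    public static void PutOnHold(IOrganizationService service, Guid workOrderId)
    {
        // Placeholder logical name and option value -- use the entity and the
        // "On Hold" status reason value configured in your own environment.
        var workOrder = new Entity("msdyn_workorder", workOrderId);
        workOrder["statuscode"] = new OptionSetValue(690970003);
        service.Update(workOrder);
    }
}
```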
Power BI Service Live Connection
Introduction: In this blog you will see how you can use an existing dataset in the Power BI service to create a report in Power BI Desktop.
Service Live Connection: You can establish a connection to a shared dataset in the Power BI service and create many different reports from the same dataset. This means you can create your perfect data model in Power BI Desktop, publish it to the Power BI service, and then you and others can create multiple different reports (in separate .pbix files) from that same, common data model. This feature is called the Power BI service live connection.
To create a shared dataset, create a dataset and report and publish them to a workspace that is common to all users. Select the shared workspace where the report needs to be deployed. The report will start publishing to the workspace, and you will get the confirmation below when the report is successfully published.
Establish a Power BI service live connection to the published dataset: If you're not signed in to Power BI, you'll be prompted to do so. Once logged in, you're presented with a window that shows which workspaces you're a member of, and you can select the workspace that contains the dataset to which you want to establish a Power BI service live connection. Click Load; the dataset will be loaded and you can create reports and publish them. A small sketch of listing a workspace's datasets through the Power BI REST API follows at the end of this post.
Below are some known limitations:
Read-only members of a workspace cannot connect to datasets from Power BI Desktop.
Only users who are part of the same Power BI service workspace can connect to a published dataset using the Power BI service live connection. Users can (and often do) belong to more than one workspace.
Try it out, and post your questions below if there is anything.
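Everything above is done from Power BI Desktop, but if you want to check programmatically which datasets a workspace exposes (the same list the live connection dialog shows), here is a hedged sketch using the Power BI REST API. The workspace ID and access token are placeholders, and acquiring the token through Azure AD is out of scope here.

```csharp
using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Threading.Tasks;

class ListWorkspaceDatasetsSketch
{
    static async Task Main()
    {
        // Placeholders -- supply a real workspace (group) ID and an Azure AD access token.
        var workspaceId = "<workspace-id>";
        var accessToken = "<access-token>";

        using (var client = new HttpClient())
        {
            client.DefaultRequestHeaders.Authorization =
                new AuthenticationHeaderValue("Bearer", accessToken);

            // Lists the datasets published to the workspace.
            var url = $"https://api.powerbi.com/v1.0/myorg/groups/{workspaceId}/datasets";
            var response = await client.GetAsync(url);
            Console.WriteLine(await response.Content.ReadAsStringAsync());
        }
    }
}
```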
Error Resolution to “Form.RunModal is not allowed in write transaction” in Microsoft Dynamics NAV
Introduction:
Scenario: I had created an action button, Print, to run a report using REPORT.RUNMODAL. While running this report from the Windows Client, the error below is thrown.
Pre-requisites: Microsoft Dynamics NAV 2017
Cause of this error: RUNMODAL stops the transaction and waits for user interaction. Hence, all users who need the table are blocked. During a transaction, we are not allowed to open an object with RUNMODAL.
Resolution to the error: Use a COMMIT statement before you call REPORT.RUNMODAL.
What does the COMMIT statement do? When the system enters a C/AL codeunit, it automatically enables write transactions to be performed. When the system exits a C/AL code module, it automatically ends the write transaction by committing the updates made by the C/AL code. This means that if you want the C/AL codeunit to perform a single write transaction, the system handles it for you automatically. However, if you want the C/AL codeunit to perform multiple write transactions, you must use the COMMIT function to end one write transaction before you can start the next. The COMMIT function separates write transactions in a C/AL code module.
Example: The metasyntax below contains two write transactions. As execution begins, a write transaction is automatically started. Using the COMMIT function, you tell the system that the first write transaction has ended and prepare the system for the second. Once execution has completed, the system automatically ends the second write transaction.
BeginWriteTransactions
(C/AL Statements) // Transaction 1
COMMIT
(C/AL Statements) // Transaction 2
EndWriteTransactions
Azure Machine Learning Cheat Sheet
Introduction: Microsoft released a PDF cheat sheet of the machine learning algorithms that can be used in Azure Machine Learning Studio. This Microsoft Azure Machine Learning Algorithm Cheat Sheet helps you choose the right machine learning algorithm for your predictive analytics solution from the Microsoft Azure Machine Learning library of algorithms. The algorithms are grouped into five groups:
Regression: For predicting values, for example predicting a stock's price.
Anomaly detection: For finding unusual data points, for example highly unusual credit card spending patterns that deviate from normal credit card spending.
Clustering: The data points have no labels associated with them. Instead, the goal of an unsupervised learning algorithm is to organize the data in some way or to describe its structure, for example discovering companies with similar marketing strategies.
Two-class classification: When there are only two choices, it's called two-class or binomial classification, for example distinguishing between a cat and a dog.
Multi-class classification: For predicting three or more categories, for example predicting the winner of a race.
To read the cheat sheet, read the path and algorithm labels on the chart as "For <path label>, use <algorithm>." For example, "For speed, use two-class logistic regression." Sometimes more than one branch applies; in this case it is better to create scored models with both algorithms and compare their accuracy to decide which algorithm is the better fit. Even a beginner can easily use the cheat sheet to select an algorithm that is apt for their predictive solution. There are some generalizations and oversimplifications, but it points you in a safe direction. It also means that there are many algorithms not listed here, but those that are listed are more than enough to give you a good head start in the ML world.
Blanket Sales Order Dynamics NAV
Introduction: A blanket sales order represents a sales agreement between the company and a customer. It typically involves one item with multiple shipments at predetermined quantities, prices and delivery dates.
Scenario: A customer orders 500 units of an item that will be delivered 100 units per week.
Steps:
1) In the Search box, enter "Blanket Sales Orders" and select the related link.
2) Click New to create a new blanket sales order.
3) On the General FastTab, in the Sell-to Customer No. field, select the customer.
4) Keep the Order Date field blank. When the separate sales orders are created from the blanket order, the program will set the order date of each sales order equal to the current date.
5) On the Lines FastTab, in the Type field, select Item.
6) In the No. field, select the item.
7) In the Quantity field, specify a quantity of 100.
8) Specify a date in the Shipment Date field.
9) Create four more lines and specify a quantity of 100 and a shipment date on each line.
10) Now, in the Qty. to Ship field, keep the quantity of 100 on the first line and delete the quantity to ship on the other four lines.
11) On the Home tab, click Make Order.
12) Click Yes to create an order.
13) You will get a message that a sales order has been created from the blanket order.
14) To open the sales order, select the first line on the blanket order.
15) On the Lines FastTab, point to Line, then to Unposted Lines, and then click Orders.
16) On the Home tab of the Sales Lines page, click Show Document. The sales order will appear.
Conclusion: By using blanket sales orders, an organization can sell a specified quantity or amount using multiple sales orders over time.
Data Loss Prevention in Office 365
Introduction: Data loss prevention (DLP) is a strategy for making sure that end users do not send sensitive information outside the corporate network. You can set up policies to help make sure information in email and documents isn't shared with the wrong people. With a DLP policy, you can identify, track, and protect sensitive information across Office 365.
Create a DLP policy in the Office 365 Security & Compliance center:
Go to Office 365 Admin Center > Security & Compliance > Data Loss Prevention. You can choose to create a policy from a template or create a custom policy.
In the next step, you need to name your policy.
The next step is to choose the location, whether the policy should apply to all locations or only to specific ones. If you select "Let me choose specific locations", you will get the options shown in the image below.
Under policy settings, you can choose the basic setting (Find content that contains) or you can use advanced settings. If you choose advanced settings, you can customize a new rule. By clicking New Rule, you will get options to create a rule.
Provide the conditions and actions. In the conditions you can add the sensitive information types that are available, or you can select a label that has been applied to the document for data classification. Labels need to be created and published first in order to use them in a DLP policy. You can create labels from Office 365 Security & Compliance, and labels can be applied to documents in OneDrive and SharePoint Online. You can also configure other settings such as user notifications, user overrides and incident reports. After creating a rule, save the changes.
In the Conditions option you can see the label (see the image below) which has been applied to the DLP rule "Cloud Sensitive Information"; the label was published first and then applied to the document. The image below shows the label applied to the Cloud DLP policy.
After creating the policy, it may take up to 24 hours for the changes to take effect.
Testing the DLP policy: After creating the policy, if a user tries to share the document with external users, he will get policy tips (as shown in the image below). Also, if you try to send your organization's sensitive information in an email outside your organization, policy tips will be shown (see the image below). If the user overrides the policy tip, he has to enter a business justification or report it as a false positive.
Conclusion: This is how you can create DLP policies and prevent your organization's classified data from leaking.
How to issue and redeem Gift Card in Dynamics 365 For Finance and Operations
Introduction: In this blog we shall see how to issue and redeem a gift card in Dynamics 365 for Finance and Operations.
Issue a gift card:
On the POS, go to Sales.
Go to Actions and select Gift Card.
Click on Issue Gift Card.
Enter the gift card number.
Enter the amount for the gift card.
Select any tender to pay.
You can check the details of the gift card under Gift card in Dynamics 365 for Finance and Operations.
Redeem a gift card:
On the transaction screen, select the products.
To redeem the gift card, click Pay gift card.
Enter the gift card number. You can check the balance by clicking on Check balance.
Proceed to pay. You can manually enter the amount, and the remaining due can be paid by another mode of payment.
You can check the details of the gift card under Gift card in Dynamics 365 for Finance and Operations.
Connect your Azure Machine Learning Predictive Solution to Power BI
Introduction: Azure Machine Learning Studio is an amazing tool that lets us create efficient ML experiments with simple drag-and-drop features. We can predict anything from flight predictions to churn analysis. But what if we want to represent this predicted data in a more visually appealing format? It is possible to do this by representing your predictions in Power BI!
Pre-requisites:
Basic understanding of Azure Machine Learning Studio.
Basic understanding of Power BI.
A blob container created on Azure Storage.
Steps:
Create your experiment in Azure Machine Learning Studio.
Convert your training experiment to a predictive experiment and deploy it as a web service.
We will create a console application in Visual Studio and copy-paste the code from the Batch Execution page. For automation we could create automated data pipelines, but for now we will just use a simple console application. Remove the existing code from the console application and paste in the Batch Execution code.
Install the necessary NuGet packages and update the following parameters (a condensed, hedged sketch of these settings follows at the end of this post):
– BaseURL will remain the same as in the generated code.
– Storage Account Name, Storage Account Key and Storage Container Name are parameters that can be found in the Azure Blob Storage account which was created.
– The API key can be found on the web service page in Azure Machine Learning Studio.
– The input path is the path where you have saved the input CSV file for Batch Execution. Your input CSV file should have all the features which you used to train your experiment.
After you run your console application, a new output1results.csv file should be generated in your blob container. The output should include the labels which your experiment generates in its output, including the Scored Labels and Scored Probabilities columns.
Now you can get your data using Azure Blob Storage as your source in Power BI and use the columns in the output1results.csv file to generate your ML-predicted reports. The report can look something like this.
I hope this blog helps you combine Azure Machine Learning Studio and Power BI to create a powerful predictive solution.
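The full Batch Execution sample is generated for you on the web service page, so it is not reproduced here. The condensed sketch below only shows the settings the post tells you to update and how they typically fit together; every value is a placeholder taken from your own web service page and storage account, and the exact BaseUrl format may differ slightly from what your generated sample contains.

```csharp
using System;
using System.Net.Http;
using System.Net.Http.Headers;

class BatchExecutionSettingsSketch
{
    static void Main()
    {
        // Placeholders -- copy the real values from the Azure ML web service page
        // and from the Azure Storage account that holds your blob container.
        const string BaseUrl = "https://<region>.services.azureml.net/workspaces/<workspace-id>/services/<service-id>/jobs";
        const string ApiKey = "<api-key-from-the-web-service-page>";
        const string StorageAccountName = "<storage-account-name>";
        const string StorageAccountKey = "<storage-account-key>";
        const string StorageContainerName = "<container-name>";
        const string InputFileLocation = @"C:\data\input.csv"; // CSV with the same feature columns used for training

        // Connection string the sample uses to upload the input CSV to blob storage.
        string storageConnectionString =
            $"DefaultEndpointsProtocol=https;AccountName={StorageAccountName};AccountKey={StorageAccountKey}";

        using (var client = new HttpClient())
        {
            // Every call to the Batch Execution endpoint is authenticated with the API key.
            client.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Bearer", ApiKey);

            // The generated sample then uploads the input file to the container,
            // submits the job to BaseUrl, polls until it completes and downloads output1results.csv.
            Console.WriteLine($"Upload {InputFileLocation} to container {StorageContainerName}");
            Console.WriteLine("Submit the batch job to: " + BaseUrl);
            Console.WriteLine("Using storage connection: " + storageConnectionString);
        }
    }
}
```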
Connecting to On-Prem SQL from Azure Web App
Background: When an enterprise transitions to the cloud, it may still need to leave some assets on-premises for technical or security reasons. Typically, SQL databases will be on-premises for most enterprises, but this should not stop the enterprise from having their web apps, APIs, services and mobile apps in the cloud. The major hindrance in this scenario is the feasibility of connecting the cloud-based services to on-prem SQL for a seamless transition. Azure allows you to create a layer on top of these on-prem assets while safely connecting back to them on your premises using Hybrid Connections. Supported assets include MS SQL Server, MySQL or any resource that runs on a static TCP port.
Prerequisites:
Visual Studio 2013 or later
SQL Server 2008/2012 with SQL Server authentication
Azure SDK
Microsoft Azure subscription
Steps:
Create the SQL Server database and table. Create a SQL user which will be used to connect from the .NET application. Also create some sample data in the table.
Create a .NET web application which will read data from the table created in Step 1. The connection string will look something like the hedged sketch at the end of this post.
Host the application on local IIS and ensure it works and can connect to SQL.
Now host the application on Azure as a web app. You can refer to the link below for the steps to create an Azure web app: https://github.com/Microsoft/HealthClinic.biz/wiki/Create-and-deploy-an-ASP.NET-web-app-in-Azure-App-Service
You will notice that the application will throw an error, because it will not be able to connect to the on-prem SQL. We will now create a Hybrid Connection to the SQL database.
Navigate to the App Service we created in Step 4 in Azure, and open Networking. Click on Hybrid Connections > Configure your Hybrid Endpoints, then create a new Hybrid Connection. Enter the details for the Hybrid Connection as shown below. Note: the TCP port number for SQL is usually 1433; please check the port for the SQL instance you are configuring.
Download the Hybrid Connection Manager and install it on the SQL server or any server on the same network.
Open the installed Hybrid Connection Manager UI and enter the connection string of the Hybrid Connection we created in Azure. You can get the connection string of the Hybrid Connection by clicking on it as shown below. Enter the connection string in the Hybrid Connection Manager UI.
If everything is configured properly, you should see the status as Connected, both in the tool and in Azure.
Other notes: If you are facing issues with the connection, you can restart the Hybrid Connection service from local Services. Please comment below in case of queries.
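Since the connection-string screenshot does not reproduce here, below is a hedged sketch of the kind of data-access code the web app in Step 2 might use. The server name, port, database, table and credentials are all placeholders; with the Hybrid Connection in place, the connection string keeps pointing at the on-prem SQL Server's hostname and port (typically 1433) exactly as registered in the Hybrid Connection endpoint.

```csharp
using System;
using System.Data.SqlClient;

class OnPremSqlSketch
{
    static void Main()
    {
        // Placeholder values -- use the on-prem hostname registered in the Hybrid
        // Connection endpoint and the SQL login created in Step 1.
        var connectionString =
            "Server=ONPREMSQL01,1433;Database=DemoDb;User Id=webappuser;Password=<password>;";

        using (var connection = new SqlConnection(connectionString))
        using (var command = new SqlCommand("SELECT TOP 10 * FROM dbo.SampleTable", connection))
        {
            connection.Open();
            using (var reader = command.ExecuteReader())
            {
                while (reader.Read())
                {
                    // Print the first column of each row just to prove connectivity.
                    Console.WriteLine(reader[0]);
                }
            }
        }
    }
}
```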
