
Tag Archives: Dynamics 365 Finance and Operations

Duplicate address record entry through Data entity in Dynamics 365 Finance and Operations

Introduction: In this blog, we will see how to allow the system to accept duplicate addresses for a customer, vendor, or any other party by making a small change in a data entity.

Solution: To allow duplicate entries, we will create one field, IsCreateFromEntity, in LogisticsPostalAddress. The field IsCreateFromEntity will be used as a flag, of type Boolean / NoYesId (EDT).

    [ExtensionOf(tableStr(LogisticsPostalAddressBaseEntity))]
    final class LogisticsPostalAddressBaseEntityCFSJSTable_Extension
    {
        public static LogisticsLocationId resolveRemittanceAddressLocationId(
            LogisticsPostalAddressBaseEntity _postalAddressEntity,
            DirPartyNumber                   _partyNumber)
        {
            LogisticsLocationId locationId;
            LogisticsLocationId locationIdCreateNew; // never assigned, so always empty

            locationId = next resolveRemittanceAddressLocationId(_postalAddressEntity, _partyNumber);

            if (_postalAddressEntity.IsCreateFromEntity)
            {
                // Discard the resolved location so the system creates a new one
                locationId = locationIdCreateNew;
            }

            return locationId;
        }
    }

The code above replaces the standard locationId value with the new locationIdCreateNew variable, which is always empty. Once the standard value has been replaced, the system generates a new locationId in the LogisticsLocation table and creates the duplicate address along with state, city, country, ZIP code, and all other respective fields. We just need to pass Yes in the newly created IsCreateFromEntity field while creating the new address and run the execution. How easy it is!
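For illustration, here is a minimal, hypothetical sketch of how the flag could be set when creating an address through the entity from X++. Everything except IsCreateFromEntity is a placeholder; check your entity mapping for the actual field names.

    // Hypothetical sketch: create a duplicate postal address through the entity.
    // Only IsCreateFromEntity is the real custom field from this post; the rest
    // of the record population is a placeholder that depends on your mapping.
    LogisticsPostalAddressBaseEntity addressEntity;

    ttsbegin;
    addressEntity.clear();
    addressEntity.IsCreateFromEntity = NoYes::Yes; // force a brand-new LogisticsLocation
    // ... populate the party number, street, city, state, ZIP code, country, etc. ...
    addressEntity.insert();
    ttscommit;

Thanks for reading !!!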

Deploy SSRS reports through Windows Powershell in Dynamics 365 Finance and Operations

Introduction: In this blog, we will see how to deploy SSRS reports in Microsoft Dynamics 365 Finance and Operations.

Solution: For an on-premises environment, open Windows PowerShell in administrator mode and run the script below step by step.

    cd C:\AOSService\PackagesLocalDirectory\Plugins\AxReportVmRoleStartupTask\
    .\DeployAllReportsToSsrs.ps1

For a cloud-hosted/dev environment, run the following script instead:

    cd K:\AosService\PackagesLocalDirectory\Plugins\AxReportVmRoleStartupTask\
    .\DeployAllReportsToSSRS.ps1 -PackageInstallLocation "K:\AosService\PackagesLocalDirectory"

The script shows its progress in the PowerShell window while the reports deploy and reports completion once all of them have been published. Thanks for reading !!!

Item Master Data Mass uploading via Data Management in Dynamics 365 Finance and Operations

Item Master Data Mass Uploading via Data Management

In the world of data management, it has become essential to give business users a solution for mass data uploading, since data is often managed by a small number of expert users who need to upload and download data in bulk in very little time. Today I am going to show you how to upload item master data quickly and without errors via Data Management in MS Dynamics Commerce and Retail.

Note: Masters such as item group, tax codes, units, category, and item model group should be pre-configured before you start; otherwise this operation will fail with errors because the masters are not created/configured.

This operation has two steps: first, item master creation; second, item master release in released products. To identify the necessary "importable" fields, we will first export the required fields to Excel, as follows.

Go to Workspaces > Data Management and click the Export button. When the export form opens, create a project with the data filled in as below:
- Project name: any desired name; in my case, "product master fields".
- Generate data package: set to No.

Once the project is created, click "Add entity" and configure it:
- Entity: "Released product creation V2", or whichever "Released product creation" entity is latest for your version of D365.
- Output: EXCEL.
- Use sample file and Skip staging: set to No.
- Very important: set Select fields to "Importable fields".

Click Add to add the entity to the project. Then click "Add entity" again, apply the same settings, select "Released products V2" as the entity name, and click Add.

Now click the Export button. On the export screen, click Refresh until the success mark appears, then click "Download file" to obtain the Excel file. Note: click each entity one by one to download its table fields.

After downloading the files, replace the data with the information you want in your product master so the master data can be imported. (Note: you can hide fields that don't require user input but are required by AX; you will identify these in the released products format.)

Assuming you have created your files for upload, we will now continue with the import. Go to Workspaces > Data Management and click the Import button. When the import form opens, create a project:
- Group name: in my case, "import Master Data".

Once the project is created, click "Add file" and configure it:
- Entity: "Released product creation V2", or whichever is latest for your version of D365.
- Click "Upload data file" and select the appropriate file; in my case "Released product creation V2 - Format.xlsx".
- Select the source data format and click Close.

Click "Add file" again, apply the same settings, select "Released products V2" as the entity name, upload its file, and click Close.

After adding both files it is time to click Import, but first make sure the files are sequenced so that Released product creation runs first and Released products runs second. If they are not in this sequence, the operation will fail. After clicking Import, click the Refresh button until the success mark appears.

Upon success you will see a validation message that the master data has been uploaded and the products have been released for use. In my case, the test product I uploaded was successfully created in the system. Hope this helps!

RSAT (Regression Suite Automation Tool) implementation and configuration for Finance and Operations

Purpose

The Regression suite automation tool (RSAT) significantly reduces the time and cost of user acceptance testing. It enables functional power users to record business tasks with the Finance and Operations Task recorder and convert the recordings into a suite of automated tests without writing source code. Test libraries are stored and distributed in Lifecycle Services (LCS) through Business Process Modeler (BPM) libraries, which are fully integrated with Azure DevOps Services for test execution, reporting, and investigation. Test parameters are decoupled from test steps and stored in Microsoft Excel files.

Prerequisites
- Dynamics 365 for Finance and Operations test environment (demo or tier-2 UAT environment).
- Excel.
- Azure DevOps: you will need an Azure DevOps Test Manager or Test Plans license. For example, if you have a Visual Studio Enterprise subscription, you already have a license to Test Plans. Pricing: https://azure.microsoft.com/en-us/pricing/details/devops/azure-devops-services/ For a demo environment, you don't need to buy any license.
- Authentication certificate: to enable secure authentication, RSAT requires a certificate installed on the RSAT client computer. The RSAT settings dialog box allows you to automatically create and install the authentication certificate.

Installation

Download Regression Suite Automation Tool.msi to your machine. RSAT requires Selenium and web browser driver libraries; RSAT will prompt you if needed libraries are missing and will automatically install them for you.

Configuration for RSAT

Open the RSAT application and select the Settings button in the upper right to configure RSAT. The next steps will help you find the required field inputs.

Go to the LCS project settings for your project, then to Visual Studio Team Services. Here you need to mention the Azure DevOps project in the Azure DevOps site URL field. To do that, go to https://www.visualstudio.com, open Azure DevOps, and create a new organization if there is no existing one. Then create a new project. Next, set up a personal access token via Account info > Security. Once you create the token, save it, as you will not be able to view it again when you want to use it. Go back to the main page and create a new test plan; name it RSAT-TT (or use any name). Right-click RSAT-TT and create a new suite; you can name it "Demo". The Azure DevOps setup is done.

In the Azure DevOps site URL field, mention the organization name you set up in Azure DevOps, and in the Personal access token field paste the token you saved earlier. Click Continue, select the project, continue, and save. Now you need to deploy it to the environment.

Next, open the Regression Suite Automation Tool and go to Settings. Copy the Azure DevOps URL field value from LCS; the access token should be the security token you copied earlier. Click Test connection so the Project name and Test plan fields populate.

Now run the VM. You will find the Hostname and SOAP Hostname by going to IIS and right-clicking AOSService > Edit bindings. Copy both values and paste them into the Hostname and SOAP Hostname fields. The Admin username should be the username you use to log in to your environment.

To generate the Thumbprint, click New, save the certificate at any location, and copy the generated certificate to the VM. Open the copied certificate and install it on the local machine in both the Personal and Trusted Root Certification Authorities stores. Now open the wif file on the VM in Notepad in admin mode, from the given location. In the wif file, check whether an authority with name CN=127.0.0.1 exists. If not, copy an existing authority block, paste it below that block, and modify the lines as follows:

    <authority name="CN=127.0.0.1">
      <keys>
        <add thumbprint="F46D2F16C0FA0EEB5FD414AEC43962AF939BD89A"/>
      </keys>
      <validIssuers>
        <add name="127.0.0.1" />
      </validIssuers>
    </authority>

(Note: add the thumbprint of the installed certificate in the wif file as shown.)

The final steps: copy the thumbprint from RSAT settings (generated when you clicked New) and paste it into the wif file on your VM. Then mention the company name and the working directory, set the default browser to Internet Explorer, save, and click OK.

Next, go to LCS, open Business Process Modeler, and create a new library. Name it RSAT, go to Edit, and rename the process as required; you may add a child node to it by clicking Add process. Now go to Finance and Operations, open the Task recorder, create a recording by clicking "Create recording", perform the operation, and click the Stop button. Name it as per your need, then choose "Save to Lifecycle services" (or "Save to this PC") and click OK. Go back to the LCS project library, click the Requirements tab, and check that it is syncing. Then sync test cases and run VSTS sync.

Next, go to Azure DevOps, open the test cases, and click "Add existing"; then run the query and click "Add test case". Now go back to the Regression Suite Automation Tool, load the test plan, and download the test cases. Select a test, click New, and generate the test execution parameter files. Click the Edit option to edit the values in Excel (the layout differs slightly between older and newer versions). Edit the metadata for the test in the Excel file, then save and close. Now run the test. After this step, the test session is driven automatically by Selenium: the browser performs the steps recorded in the test case. After the test completes successfully, click Upload (note the result as Passed).

How to resolve workflow editor error "Application cannot be started. Contact the application vendor"

Sometimes when you try to open the workflow editor you receive the error "Application cannot be started. Contact the application vendor", as shown in the screenshot. This problem can be caused by multiple versions of the application being present on your system. Let's see how to solve it.

First things first, make sure you are using the Internet Explorer browser for the workflow editor. If you are using Internet Explorer, go to Settings and open Internet options, then try to connect to the application again.

If the above step does not work for you, then there must be multiple versions of the application on your system. To resolve this, go to C:\Users\*YourUserFolder*\AppData\Local\Apps\2.0, search for "workflow", select the second application file, and open its file location. (AppData may be hidden in some cases.) In the opened location, delete all the files other than the folders. After this, download the latest workflow editor; it should work now.

How to create and apply workflow for purchase order in D365 finance and operations

In this blog we will learn how to create workflows in D365 Finance and Operations, taking the purchase order workflow as an example. It will allow us to create a purchase order that is routed to different people for review and approval.

Navigate to Procurement and sourcing >> Setup >> Procurement and sourcing workflows, click New, and select the purchase order workflow. This opens the workflow editor, where you first need to provide login details, the same as those for the environment. Here we need to arrange the various components and set their properties to resolve the validation errors the editor shows.

Components for this type of workflow:
- Start: indicates the start of the workflow design.
- End: indicates the end of the workflow design.
- Review purchase order: assigns reviewers who can complete or return the PO.
- Approve purchase order: assigns the users who need to approve the purchase order.

To design the workflow, follow these steps. In the designer, create the design as shown in the screenshot. Select the Review element, right-click, open Properties, and configure the basic settings. In Assignment, make sure you have set the assignment type (in our case, User) and the user name. You can also escalate to other roles after a certain time (we are not considering this setup for this blog).

Now go back to Approve purchase order, open its properties, and set an automatic action that will approve purchase orders below 10,000 USD. Set the notification for the person who will receive a notification when a particular operation is performed (for example, approved or rejected).

Now click Step 1 to enter the Step 1 section and open its properties. First we assign the user who will approve the purchase order, as the screenshot suggests; you can also set a time limit for approval and a completion policy. You can also add a condition to Step 1 that decides whether this step runs or not. Now close the step and go back to the main design in the designer.

Click Save and close, mention the version notes, and activate the workflow. You can now see the new workflow in Procurement and sourcing workflows. Create a new purchase order, click the Workflow button, and click Submit; you can also check the workflow history. Another user will then complete the purchase order review and add a comment, and the user with approval authority will approve it from Common >> Work items assigned to me.
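If you ever need to submit a purchase order to this workflow from code rather than through the Submit button, the sketch below shows the general pattern. It is only an illustration: the workflow type name PurchTableTemplate and the PO number are assumptions you should verify in your own environment.

    // Illustrative sketch: submit a purchase order to its workflow from X++.
    // The workflow type name PurchTableTemplate and the PO number are assumptions;
    // verify both in your environment before using this pattern.
    PurchTable purchTable = PurchTable::find('PO-000123'); // hypothetical PO number

    if (purchTable.RecId)
    {
        Workflow::activateFromWorkflowType(
            workflowTypeStr(PurchTableTemplate), // standard PO workflow type (verify)
            purchTable.RecId,                    // record being submitted
            'Submitted from code',               // workflow comment
            NoYes::No);                          // not activating from the web
    }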

Model import and export in D365 Finance and Operations using PowerShell

When we want to move customization done in a specific model from one environment to another development environment, we need to export and import the model file.

Steps for model export and import using PowerShell:

Open PowerShell in administrator mode and change the directory to the path of the packages bin folder.

Export command:

    .\ModelUtil.exe -export -metadatastorepath=C:\AOSService\PackagesLocalDirectory -modelname="name of model" -outputpath=<path to store the exported model>

For example, if the model name is TOUpgradeModel and I want to store the model file in C:\Temp\ModelFile, the command will be as follows:

    .\ModelUtil.exe -export -metadatastorepath=K:\AosService\PackagesLocalDirectory -modelname="TOUpgradeModel" -outputpath=C:\Temp\ModelFile

You can see the output file at the specified path.

Import command:

    .\ModelUtil.exe -import -metadatastorepath=C:\AOSService\PackagesLocalDirectory -file=<path of the .axmodel file to import>

For example:

    .\ModelUtil.exe -import -metadatastorepath=C:\AOSService\PackagesLocalDirectory -file=C:\Temp\ModelFile\TOUpgradeModel-Cloudfront.axmodel

(Note: if the model already exists in your environment and you try to import the same model, you will receive the error message "Model already exist". In that case, delete the existing model with the command

    .\ModelUtil.exe -delete -metadatastorepath=C:\AOSService\PackagesLocalDirectory -modelname="TOUpgradeModel"

and then try to import the model again.)

How to resolve error "Exception from HRESULT: 0xC0202009" while exporting data

While exporting data using a data entity in D365 FO, the data project sometimes fails to export with the error "Exception from HRESULT: 0xC0202009", while the event log displays:

    EventData
    methodName: DMFGenerateSSISPackage.generateFileDataV2()
    diagnosticsMessage: System.Exception: Exception from HRESULT: 0xC0202009
      at Microsoft.Dynamics.AX.Framework.Tools.DMF.ServiceProxy.DmfEntitySharedTypesProxy.DoWork[T](Func`1 work)
      at Dynamics.AX.Application.DMFGenerateSSISPackage.`generateFileDataV2(DMFDefinitionGroupExecution _dmfDefinitionGroupExecution, String _defGroupName, DMFFileFormat _fileFormat, DMFDelimiter _rowDelimiter, DMFDelimiter _columnDelimiter, String _codePage, String _locale, NoYes _isFirstRowHeader, NoYes _unicode, String _source, String _textQualifier, DMFXMLStyle _style, String _rootElement, String _filePath, Map _entitySyncVersion, Int32 _previewCount, Boolean @_entitySyncVersion_IsDefaultSet, Boolean @_previewCount_IsDefaultSet) in xppSource://Source/ApplicationFoundation\AxClass_DMFGenerateSSISPackage.xpp:line 1273

In such cases, the reason is usually that some target fields are disabled or are otherwise breaking the mapping. To resolve the problem, perform the following steps:

1. Refresh the entity list: go to Data management >> Framework parameters >> Entity settings, click Refresh entity list, and wait until all entities have been refreshed.
2. Regenerate the mapping: in Data management, select the data entity with the issue (in our case the Sales order header V2 entity) from the data entities list, click Generate mapping, and select Yes on the generation warning.

Note: you can also try disabling/removing fields from the mapping until the export starts working; that way you at least find the problematic field. To be more efficient, disable the first half of the field list: if the export works, the problem lies in one of the disabled fields. I hope this will help you.

Create Azure Connector with ARM (Azure Resource Manager) Configuration

While creating any cloud-hosted environment in LCS, it is necessary to create an Azure connector, for which an ARM (Azure Resource Manager) configuration is required. This article will help you create the Azure connector.

Steps to follow:

Role assignment in the Azure portal. For the Azure connector to work properly, make sure you have set up the role assignment in your Azure portal. Visit the Azure portal with the same credentials as LCS and open the Subscriptions section. Select Access control (IAM), click the Add button, and select Add role assignment. Configure the Add role assignment fields as required and save the configuration.

Authorize the link. Navigate back to LCS, go to Project settings >> Azure connectors, and make sure to authorize the link by clicking the Authorize button.

Add the connector. Click the Add button in the Azure connectors section, add the name and the Azure subscription ID, and toggle the "Configure to use Azure Resource Manager (ARM)" option to Yes. Click Next and check the following page, then click Next again to move to the next step.

Upload the management certificate. Download the management certificate, then upload the downloaded certificate in the Azure portal.

Select a region for the connector. Navigate back to the previous LCS session and complete the setup by selecting the required Azure region. Click Confirm and your Azure connector is created.

Configure Network Printers for Print Management in D3FOE

Introduction: I recently faced an issue where I was trying to print a report directly to a printer, but Print management settings didn't show any printer option. In AX 2012 we had to perform a setup in the AX 2012 Server Configuration utility to print documents or reports on connected printers. In Dynamics 365 for Finance and Operations, Enterprise Edition, you need to install and set up the Document Routing Agent on your system and activate network printers in D3FOE for printing.

Steps:
1. Install the Document Routing Agent.
2. Set up printers in the Document Routing Agent.
3. Manage network printers in D3FOE.

1. Install the Document Routing Agent. Go to Organization administration -> Setup -> Network printers. In the Action Pane, go to Application -> Download document routing agent installer, then install the downloaded setup file "DocumentRoutingAgentSetup".

2. Set up printers in the Document Routing Agent. Go to Settings, enter the details below, and click OK.
- Application ID: a unique application ID, filled in automatically.
- Dynamics 365 URL: the URL of D3FOE.
- Azure AD tenant: the domain name of your Azure Active Directory.
- Run as Windows service: this configures the agent as a Windows service. If you want to print custom-size reports, the agent should run as a desktop app, since it then sends reports to the printers with the help of Adobe Reader instead of sending them to the target folder; the agent running as a service sends them to the target folder.
Sign in with your login credentials and go to Printers. You can see the printers installed on your device; select the ones you want to enable for printing in Operations.

3. Manage network printers in D3FOE. Go to Organization administration -> Setup -> Network printers. You will see the list of printers enabled in the Document Routing Agent. Enable the printers by setting the Active field to Yes.

Conclusion: Once the setup is completed, you can use the network printers from the Print destination - Printer option. Once the agent is installed on one machine and the printers are enabled as network printers, they can be used by anyone using D365 Operations.
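As a side note, once a network printer is active it can also be targeted from X++ when running a report through an SrsReportRunController. The following is a minimal sketch, not a definitive implementation: the report name and the printer path are placeholders to adapt to your environment.

    // Minimal sketch: route a report to an active network printer from X++.
    // The report name and the printer path below are placeholders.
    SrsReportRunController      controller = new SrsReportRunController();
    SRSPrintDestinationSettings printSettings;

    controller.parmReportName(ssrsReportStr(CustTransList, Report)); // placeholder report
    controller.parmShowDialog(false); // skip the print destination dialog

    printSettings = controller.parmReportContract().parmPrintSettings();
    printSettings.printMediumType(SRSPrintMediumType::Printer);
    printSettings.printerName(@'\\PrintServer\OfficePrinter'); // placeholder printer path

    controller.startOperation();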
