
Tag Archives: Integration

Shopify Meets Dynamics 365 Finance and Operations: A Guide to Integration [Part 1]

Introduction

The integration of Shopify with Dynamics 365 Finance and Operations (FnO) starts with creating a secure link. The first step in this process is generating an API token within Shopify, which serves as the credential for verified communication between the two systems. In this blog I will walk you through the steps to create the API token, giving your integration a smooth start.

Pre-requisites

Access to the Shopify Admin account, with appropriate permissions to create private apps or access custom apps.
API access enabled in your Shopify store.
A basic understanding of API concepts and authentication methods. [Available in References]
The URL or endpoint details where the API calls will be directed. [Available in References]

References

Shopify – How to generate an API token
RestfulAPI.net – Basics of REST APIs
Shopify.dev – REST API documentation

Configuration

Step 1: Access the Shopify Admin Portal
Log in to your Shopify store's Admin account. Navigate to Apps from the main menu.

Step 2: Create a Custom App
Click on Develop Apps (available under Apps). Select Create an App and provide a name (e.g., "Dynamics365_Integration"). Assign a developer or admin as the app owner.

Step 3: Configure API Scopes
After creating the app, click on it to open the configuration page. Under the Configuration section, define the API scopes required for integration based on your requirements; you can change these later if required. Click on Save to save the changes.

Step 4: Generate the API Token
Once scopes are set, click on the API credentials tab. Click Install App to generate the credentials. A unique Access Token will be displayed. Copy and securely store this token, as it will not be shown again. If you scroll down, you'll also see the API Key and API Secret; store these values as well.

Step 5: Test the Token
Use a tool like Postman to test the API token. Set up a GET request to an API endpoint with the credentials embedded in the URL, e.g.:

https://<API Key>:<API Secret>@<Store Name>.myshopify.com/admin/api/2023-07/products.json

or simply call the plain endpoint and include the token in the header as X-Shopify-Access-Token:

https://<Store Name>.myshopify.com/admin/api/2023-07/products.json

Verify the response to confirm the token is working correctly. A minimal script version of this test is sketched after the Conclusion below.

Conclusion

The API token is your gateway to integrating Shopify with Dynamics 365 Finance and Operations. By following this guide, you've taken the first critical step toward seamless data flow between your e-commerce platform and back-office operations. In the next blog, we'll explore how to configure Dynamics 365 Finance and Operations to connect with Shopify and start synchronizing data. We hope you found this article useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com
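
As promised in Step 5, here is a minimal sketch of the same token test as a script rather than a Postman request, written in JavaScript (Node 18 or later, which ships a built-in fetch). The store name and token values are placeholders for your own credentials; the endpoint and the X-Shopify-Access-Token header are the ones referenced above.

    // Minimal sketch: verify the Admin API access token from Step 4.
    // "your-store" and the token are placeholders -- substitute your own values.
    const storeName = "your-store";            // <Store Name>.myshopify.com
    const accessToken = "shpat_...";           // the Access Token copied in Step 4

    async function testToken() {
        const response = await fetch(
            `https://${storeName}.myshopify.com/admin/api/2023-07/products.json`,
            { headers: { "X-Shopify-Access-Token": accessToken } }
        );
        console.log("Status:", response.status);   // 200 confirms the token works
        const body = await response.json();
        console.log("Products returned:", body.products.length);
    }

    testToken().catch(console.error);

A 401 or 403 status instead of 200 usually means the token was copied incorrectly or the app's API scopes from Step 3 do not cover the resource being requested.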

Using Automated Testing in POSTMAN for Business Central Web Services

Introduction:

While using Business Central Web Services or APIs, we often use Postman for testing requests and responses. Today we'll see how we can automate this testing to a certain extent using the built-in features of Postman. We can have testing logic that runs before every request, after every request, or logic that tests a particular request. In the demonstration below, we'll write automated tests for the GET, PUT, POST and DELETE operations on a single record in a custom API. Postman itself provides standard procedures and boilerplate code which we can modify as per our requirements, and as this uses JavaScript we can also use additional JS features here.

Pre-requisites:

Postman
Business Central OnCloud or OnPremise

References:

Writing tests | Postman Learning Center
Announcing Postman for the Web, Now in Open Beta | Postman Blog

Configuration:

POST Request – First we are going to create a record in the Customer table with the following fields. Two of the most common things to test with custom APIs are (1) whether the record is created successfully and (2) whether what we send matches what is stored on the record. As we are using JavaScript, the response is stored in the jsonData variable, and we can access any field of the response as a property on the jsonData variable. As the rest of our automated tests are going to be performed on this same record, we need to store the identifier for this record in a variable which exists outside the scope of this request; here we are using a variable with Collection scope. If you want to use the same variable outside of this collection, you can also define Global variables. A sketch of such a test script is shown after the Conclusion below.

GET Request – In a simple GET request, the only thing we are concerned with is whether the request executed successfully. For this we simply check the status code.

PUT Request – In a PUT request, we are going to modify the record that we previously created; here I'm going to update the name of the record. A common test case for PUT requests is to ensure that (a) the request completed successfully and (b) what was sent in the request is what is updated on the record and returned in the response.

DELETE Request – In a simple DELETE request, the only thing we are concerned with is whether the request executed successfully, and here we simply check the returned status code.

Once all the automated tests are written, you can execute them either at the collection level or at a folder level; here we will execute our tests from the folder level. We can also define the run order for the requests. Once the tests have run, we can get a summary of the results as well as a detailed version of the results.

Conclusion:

Thus we saw how to use automated testing in Postman to reduce re-work and increase efficiency while testing. A bonus tip – you can now use the Postman web version to create requests instead of downloading the Postman app; the entire blog above was written using the Postman web app. Do note that not everything that can be done in the Windows app can be done in the web app. Happy coding!
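
As the original screenshots of the test scripts are not reproduced here, below is a minimal sketch of what the Tests tab of the POST request described above might contain. The field name (displayName) and the variable name (customerId) are illustrative assumptions – adapt them to the fields your custom API actually exposes.

    // Sketch of a Postman "Tests" script for the POST request.
    pm.test("POST: status code is 201 - record created", function () {
        pm.response.to.have.status(201);
    });

    // The parsed response body; any response field is available as a property.
    var jsonData = pm.response.json();

    pm.test("POST: name sent matches name stored", function () {
        // displayName is an illustrative field name -- use one from your API.
        var sent = JSON.parse(pm.request.body.raw);
        pm.expect(jsonData.displayName).to.eql(sent.displayName);
    });

    // Persist the new record's identifier in a Collection-scoped variable so
    // the GET, PUT and DELETE requests can target the same record.
    pm.collectionVariables.set("customerId", jsonData.id);

The GET and DELETE tests then reduce to a single status-code assertion, e.g. pm.response.to.have.status(200), exactly as described above.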

Rest API GET call in JSON format in Dynamics 365 Finance and Operations

Introduction: In this blog, we will see how to get a response from a REST API through a GET call in Dynamics 365 Finance and Operations.

Solution: We will use Basic authentication: the username and password are passed in byteStr, and the endpoint is assigned to url in the code below.

    class CFSJSTestRestAPI
    {
        public static void main(Args _args)
        {
            str                             url;
            str                             byteStr;
            str                             jsonString;
            System.Net.HttpWebRequest       request;
            System.Net.HttpWebResponse      response;
            System.IO.Stream                dataStream;
            System.IO.StreamReader          streamRead;
            CLRObject                       clrObj;
            System.Text.Encoding            utf8;
            System.Byte[]                   byteArraynew;
            System.Net.WebHeaderCollection  headers = new System.Net.WebHeaderCollection();

            new InteropPermission(InteropKind::ClrInterop).assert();

            // Basic authentication credentials in "username:password" form.
            byteStr = strFmt('%1:%2', "USERNAME", "PASSWORD");
            url = "http://dummy.restapiexample.com/api/v1/employees";

            // Create and configure the GET request.
            clrObj  = System.Net.WebRequest::Create(url);
            request = clrObj;
            request.set_Method("GET");
            request.set_KeepAlive(true);
            request.set_ContentType("application/json");

            // Base64-encode the credentials and add the Authorization header.
            utf8         = System.Text.Encoding::get_UTF8();
            byteArraynew = utf8.GetBytes(byteStr);
            byteStr      = System.Convert::ToBase64String(byteArraynew);
            headers.Add("Authorization", 'Basic ' + byteStr);
            request.set_Headers(headers);

            System.Net.ServicePointManager::set_Expect100Continue(false);
            System.Net.ServicePointManager::set_SecurityProtocol(System.Net.SecurityProtocolType::Tls12);

            // Execute the call and read the JSON response into a string.
            response   = request.GetResponse();
            dataStream = response.GetResponseStream();
            streamRead = new System.IO.StreamReader(dataStream);
            jsonString = streamRead.ReadToEnd();
            info(strFmt("RESPONSE: %1", jsonString));

            dataStream.Close();
            response.Close();
        }
    }

Thanks for reading!

Migrating Activities Of Type ‘Case Resolution’ Between Two Microsoft Dynamics CRM Environments

Introduction: While migrating Cases, the activities of type 'Case Resolution' must be migrated as well. However, this is complicated by the fact that when the status of a case is updated, the system automatically creates a blank case resolution activity. This system-generated case resolution needs to be deleted; otherwise each case would end up with two case resolution activities after migration – one system-generated and one with the correct migrated data from the source.

Solution: To tackle this issue, follow these steps during migration:

1. Send all Cases (regardless of their status in the source environment) to the target with their status as 'Open'.
2. Send all related activities to the target environment.
3. Update the case status in the target environment to its status in the source environment.
4. For cases with status 'Resolved', a system-generated case resolution activity will be created.
5. In your case resolution migration map, first add a step to delete the existing case resolution in the target, and then insert the case resolution from the source environment (a sketch of this delete step is shown after the Conclusion below).
6. Your cases with status 'Resolved' will now have only one case resolution activity – the one migrated from the source environment with the correct data.

Conclusion: The steps above show how to preserve the integrity of case resolution activity data in your target environment during data migration.
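
For readers implementing this outside a migration tool, here is a hedged sketch of what the delete in step 5 looks like when expressed against the Dataverse Web API in JavaScript (Node 18+). The org URL, bearer token and case id are placeholders, and this assumes the target environment permits deleting the system-generated resolution at this point in the migration; in Scribe Insight or similar tools, this logic is modeled as a pre-delete step in the map instead.

    const orgUrl = "https://yourorg.crm.dynamics.com";   // placeholder org URL
    const token  = "<bearer token>";                     // placeholder OAuth token
    const caseId = "<target case guid>";                 // the migrated case's id

    async function deleteSystemGeneratedResolutions() {
        const headers = { Authorization: `Bearer ${token}`, Accept: "application/json" };

        // 1. Find the case resolution activities attached to the target case.
        const query = `${orgUrl}/api/data/v9.2/incidentresolutions` +
                      `?$select=activityid&$filter=_incidentid_value eq ${caseId}`;
        const { value } = await (await fetch(query, { headers })).json();

        // 2. Delete each one, so only the resolution migrated from the source
        //    remains after the subsequent insert step.
        for (const activity of value) {
            await fetch(`${orgUrl}/api/data/v9.2/incidentresolutions(${activity.activityid})`, {
                method: "DELETE",
                headers: { Authorization: `Bearer ${token}` },
            });
        }
    }

    deleteSystemGeneratedResolutions().catch(console.error);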

Salesforce field not visible in Scribe Insight Integration

Introduction: A scenario may arise where, after creating a connection to Salesforce, a field that exists in Salesforce isn't visible in your Scribe Insight DTS. This can happen for multiple reasons. In this article I have listed the three most likely reasons why your SFDC field is not visible in Scribe Insight.

Three Probable Reasons:

Reason 1 – A custom field was created in your SFDC after you had created your SFDC connection in Scribe Insight. To fix this, follow the steps below:

1. Go to Connections in your DTS.
2. Select your SFDC connection.
3. Select Edit -> DTS Connection Settings -> Adapter Settings and click on Refresh Metadata.

Your custom field should now be visible.

Reason 2 – The field-level security has visibility disabled for that field. To fix this, follow the steps below:

1. Log into SFDC Classic and select Setup.
2. Go to Build -> Customize -> Object (e.g., Accounts, Asset) -> Fields.
3. Click on the field that isn't visible in Scribe Insight and select Set Field-Level Security.
4. Make sure the field is visible for your Profile.

Reason 3 – You are using an old Salesforce API version, and the field did not exist in that version of SFDC; for example, the Asset RecordTypeId field was added in later versions of Salesforce. To fix this, follow the steps below:

1. Go to Connections.
2. Select your SFDC connection and click on Edit.
3. Select Change Connection.
4. Update the version in the Salesforce URL to the latest version that your SFDC supports.

A quick way to check whether a field exists for a given API version is sketched below.
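
To diagnose Reason 3 without clicking through Setup, you can ask the Salesforce REST API directly whether a field is exposed for a given API version, using the standard sObject describe resource. A minimal sketch in JavaScript (Node 18+) follows; the instance URL, token and version number are placeholders.

    const instanceUrl  = "https://yourInstance.salesforce.com";  // placeholder
    const sessionToken = "<session or OAuth token>";             // placeholder

    // Returns true if the object exposes the field in the given API version.
    async function fieldExists(objectName, fieldName, apiVersion) {
        const res = await fetch(
            `${instanceUrl}/services/data/v${apiVersion}/sobjects/${objectName}/describe`,
            { headers: { Authorization: `Bearer ${sessionToken}` } }
        );
        const describe = await res.json();
        return describe.fields.some((f) => f.name === fieldName);
    }

    // Example: RecordTypeId on Asset only appears in later API versions.
    fieldExists("Asset", "RecordTypeId", "52.0").then(console.log);

If the same call returns true on a recent version but false on the version in your Scribe connection URL, updating the connection's version (Reason 3, step 4) should make the field visible.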
