
Category Archives: Azure

Streamlining Build Pipelines with YAML Template Extension: A Practical Guide

In modern development workflows, maintaining consistency across build pipelines is crucial. A well-organized build process ensures reliability and minimizes repetitive configuration. For developers using YAML-based pipelines (e.g., Azure DevOps or GitHub Actions), template extension is a powerful approach to achieve this. This blog explores how to use YAML templates effectively to manage build stages for multiple functions in your project.

What is Template Extension in YAML?

Template extension allows you to define reusable configurations in one place and extend them for specific use cases. Instead of repeating the same build steps for every function or service, you can create a single template with customizable parameters.

Why Use Templates in Build Pipelines?

– Scalability: Add new services or functions without duplicating code (a sketch of this appears at the end of this post).
– Maintainability: Update logic in one place instead of modifying multiple files.
– Consistency: Ensure uniform processes across different builds.

Step-by-Step Implementation

Here is how you can set up a build pipeline using template extension.

1. Create a Reusable Template

A template defines the common steps in your build process. For example, consider the following file named buildsteps-template.yml:

parameters:
- name: buildSteps   # the name of the parameter is buildSteps
  type: stepList     # data type is StepList
  default: []        # default value of buildSteps

stages:
- stage: secure_buildstage
  pool:
    name: Azure Pipelines
    demands:
    - Agent.Name -equals Azure Pipelines x
  jobs:
  - job:
    steps:
    - task: UseDotNet@2
      inputs:
        packageType: 'sdk'
        version: '8.x'
        performMultiLevelLookup: true
    - ${{ each step in parameters.buildSteps }}:
      - ${{ each pair in step }}:
          ${{ pair.key }}: ${{ pair.value }}

2. Reference the Template in the Main Pipeline

This is your main pipeline file:

trigger:
  branches:
    include:
    - TEST {Branch name}
  paths:
    include:
    - {Repository Name}/{Function Name}

variables:
  buildConfiguration: 'Release'

extends:
  template: ..\buildsteps-template.yml   # {Template file name}
  parameters:
    buildSteps:
    - script: dotnet build {Repository Name}/{Function Name}/{Function Name}.csproj --output build_output --configuration $(buildConfiguration)
      displayName: 'Build {Function Name} Project'
    - script: dotnet publish {Repository Name}/{Function Name}/{Function Name}.csproj --output $(build.artifactstagingdirectory)/publish_output --configuration $(buildConfiguration)
      displayName: 'Publish {Function Name} Project'
    - script: (cd $(build.artifactstagingdirectory)/publish_output && zip -r {Function Name}.zip .)
      displayName: 'Zip Files'
    - script: echo "##vso[artifact.upload artifactname={Function Name}]$(build.artifactstagingdirectory)/publish_output/{Function Name}.zip"
      displayName: 'Publish Artifact: {Function Name}'
      condition: succeeded()

Benefits in Action

1. Simplified Updates: When you need to modify the build process (e.g., change the .NET SDK version), you only update buildsteps-template.yml. The change automatically applies to all functions.
2. Customization: Each function can have its own build configuration without duplicating the pipeline logic.
3. Improved Collaboration: By centralizing common configurations, teams can work independently on their functions while adhering to the same build standards.

Final Thoughts

YAML template extension is a game-changer for developers managing multiple services or functions in a project. It simplifies pipeline creation, reduces duplication, and enhances scalability.
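As referenced in the scalability point above, here is a minimal sketch of how a second function in the same repository could get its own pipeline by extending the same template. The placeholder {Second Function Name} is hypothetical and simply stands in for another function folder; only the trigger paths and the genuinely different build steps change, while the stage, agent pool, and .NET SDK setup come from the shared template:

trigger:
  branches:
    include:
    - TEST {Branch name}
  paths:
    include:
    - {Repository Name}/{Second Function Name}

variables:
  buildConfiguration: 'Release'

extends:
  template: ..\buildsteps-template.yml
  parameters:
    buildSteps:
    - script: dotnet build {Repository Name}/{Second Function Name}/{Second Function Name}.csproj --output build_output --configuration $(buildConfiguration)
      displayName: 'Build {Second Function Name} Project'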
By adopting this approach, you can focus on building great software while your pipelines handle the heavy lifting. If you haven’t already, try applying template extension in your next project—it’s a small investment with a big payoff. We hope you found this article useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com


Real-Life Use Case of CRUD Operations with Postman and Azure Logic Apps 

Posted On November 18, 2024 by Bhavika Shetty

Having a robust Customer Relationship Management (CRM) system is crucial for managing customer data and interactions effectively. One way to enhance your CRM capabilities is through seamless integration with Azure Logic Apps, allowing for efficient CRUD (Create, Read, Update, Delete) operations via OData endpoints. In this blog post, we'll dive into a real-life business use case that demonstrates how to perform CRUD operations on a CRM system using Postman and Azure Logic Apps.

What Are CRUD Operations?

CRUD operations form the backbone of any data-driven application. They enable you to:

– Create new records,
– Read existing records,
– Update records as data changes, and
– Delete records that are no longer needed.

The Setup: Using Postman for API Requests

Postman is an incredibly useful tool for testing APIs, and in our case, it will help us interact with our CRM's OData endpoints. Before we begin, ensure that you have the necessary API access and permissions set up.

Creating a New Record in CRM

Step 1: Prepare Your Request

To create a new record, set up a POST request in Postman against the OData endpoint of the entity you want to create, with your authentication in place.

Step 2: Set the Request Body

In the body of your POST request, include the necessary details for the new record. For example, if you're creating a customer record, it might look something like the sketch at the end of this post.

Step 3: Send the Request

Hit the Send button. You should receive a response containing the payload of the newly created record (e.g., in the CustomersV3 entity).

Step 4: Verify Creation in CRM

Next, navigate to your CRM dashboard to verify that the new customer record has been successfully created.

Updating an Existing Record

Step 1: Prepare Your Update Request

To update an existing record, you'll send a PATCH or PUT request to the same OData endpoint, addressing the specific record by its key.

Step 2: Set the Request Body

Include only the changes you wish to make in the request body, for example, updating John Doe's phone number (see the sketch at the end of this post).

Step 3: Send the Request

Once you send the request, you should see a response indicating the payload of the updated record.

Step 4: Verify Update in CRM

Check your CRM to confirm that the changes were applied correctly.

Future Topics: Logic App Creation

In our next blog, we'll dive deeper into the creation of Azure Logic Apps and how they can automate these CRUD operations further, enhancing your CRM's functionality. We'll cover:

– Setting up triggers and actions within Azure Logic Apps.
– Automating data flow between systems.
– Best practices for managing CRM data efficiently.

Conclusion

By leveraging Postman for CRUD operations and integrating with Azure Logic Apps, businesses can significantly enhance their CRM capabilities, streamline operations, and ensure that their customer data remains accurate and accessible. Stay tuned for our upcoming blog, where we'll explore how to create Azure Logic Apps to automate these processes, making your CRM experience even more efficient.

We hope you found this article useful, and if you would like to discuss anything, you can schedule a call with us by clicking the button below.
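As referenced in the steps above, here is a minimal sketch of what the create and update requests could look like. The entity name CustomersV3 comes from the post itself; the endpoint URL, field names, key syntax, and values are hypothetical placeholders that you would replace with your environment's actual OData schema, and both calls also need an Authorization header (for example, a bearer token):

POST https://{your-environment}/data/CustomersV3
Content-Type: application/json

{
  "CustomerAccount": "CUST-001",
  "Name": "John Doe",
  "PrimaryPhone": "555-0100",
  "Email": "john.doe@example.com"
}

PATCH https://{your-environment}/data/CustomersV3('CUST-001')
Content-Type: application/json

{
  "PrimaryPhone": "555-0199"
}

The exact key addressing in the PATCH URL depends on how the entity defines its key fields; some entities use compound keys rather than a single value.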


Sending and Receiving Messages from Azure Service Bus Using Logic Apps

Azure Service Bus, paired with Logic Apps, offers a powerful combination for sending, receiving, and managing messages between different applications and services. In this blog, we'll walk through the process of sending and receiving messages using Azure Service Bus and Logic Apps.

Step 1: Create an Azure Service Bus Namespace

Navigate to the Azure Portal:
– Go to portal.azure.com and log in with your credentials.

Create a Service Bus Namespace:
– In the search bar at the top, type "Service Bus" and select Service Bus from the results.
– Click + Create to start the creation process.
– Fill in the required details, then click Review + Create, and then Create to deploy the namespace.

Step 2: Create a Queue or Topic in the Service Bus Namespace

– After the namespace is deployed, navigate to it by clicking on the resource in the portal.
– Create a Queue or a Topic depending on your use case. In this walkthrough, I am going to use a Queue.

Step 3: Create a Logic App to Send Messages to the Service Bus

Navigate to Logic Apps:
– In the Azure portal, use the search bar to find and select Logic Apps.
– Click + Create to start a new Logic App.

Configure Your Logic App:
– In the Basics tab, provide the required details.
– Click Review + Create, and then Create.

Design the Logic App:
– Once the Logic App is created, open the Logic Apps Designer and add the trigger "When a HTTP request is received", configured for the POST method.
– Add a Compose action and pass in the input parameters.
– Go to the Service Bus namespace –> Shared access policies –> copy the connection string.
– Add the Service Bus "Send message" action and paste the copied connection string when creating the connection.
– Pass the output of the Compose action as the message content.
– Add a Response action and save the Logic App workflow.
– Copy the URL from the trigger, paste it into Postman, and send the request.
– As soon as you hit the URL, you will get the customer ID back as the response body in Postman.
– Now go to the Azure portal and check the run history; you will see that the Date and Status have been added for that particular customer ID.
– Next, verify whether the message actually reached the queue. Go to the queue (in my case the queue name is "receivingqueue") –> Service Bus Explorer –> click Peek from Start.
– To see the content of a message, select its sequence number.

Step 4: Create a Logic App to Receive Messages from the Service Bus

– Create a new Logic App by repeating the steps above, then open the Logic App Designer.
– Add the trigger "When a message is received in a queue".
– Add a Compose action.
– Add a Terminate action set to Succeeded.
– To verify, check the run history of the Logic App; you will see that the content arrives in Base64 format.
– You can decode it and confirm it is the same data that was sent (see the sketch at the end of this post).

Conclusion

We've successfully set up a messaging system with Logic Apps and Azure Service Bus by following these steps. This configuration makes it possible to automate workflows, integrate apps seamlessly, and create reliable cloud solutions. Whether you're working with batch processing or real-time data, Azure's tools give you the strength and flexibility you need to scale your business effectively.
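As mentioned in Step 4, the Service Bus connector delivers the message body Base64-encoded. A minimal sketch of how the receiving Logic App can decode it in the Compose action, using the built-in base64ToString() expression (ContentData is the property the Service Bus trigger exposes for the message body):

base64ToString(triggerBody()?['ContentData'])

For example, if the sending Logic App posted a payload such as the following (an illustrative body, not a prescribed schema):

{
  "customerId": "CUST-001",
  "status": "Created",
  "date": "2024-11-01"
}

then the queue stores it Base64-encoded, and the expression above returns the original JSON text in the run history of the receiving Logic App.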


Posting – Document processing – The remote certificate is invalid according to the validation procedure Error in D365 FNO

Introduction

Encountering errors while working with Sales Orders in Dynamics 365 Finance and Operations (D365FO) can disrupt your workflow, especially in development environments. One common issue is a packing slip posting failure caused by an expired SSL certificate in cloud-hosted environments. SSL certificates in D365FO cloud-hosted setups are valid for one year, after which they need to be renewed for continued security and functionality.

I faced this issue while trying to post the packing slip for a Sales Order on a Dev environment. To maintain security, these certificates must be renewed through rotation. Credential rotation is a critical aspect of enterprise-level cybersecurity, and this process can be managed via LCS. To resolve the issue, follow the process below:

– Log in to the LCS environment.
– Select the Implementation Project and then click on the Full details option.
– Click on the Maintain drop-down button and then select Rotate secrets.
– After that, click on the Rotate SSL certificates option.

This process may take a few minutes to complete and will resolve the issue. After completion, you will see that the status changes to Deployed. The next and final step is to click on the Apply updates option, which applies all the changes and updates.

Conclusion

Rotating SSL certificates in Dynamics 365 Finance and Operations is essential to maintain security and functionality in cloud-hosted environments. By following these steps in LCS, you can ensure that your environment remains secure and that tasks like posting packing slips proceed smoothly. Regularly checking and updating your SSL certificates will help prevent future disruptions and keep your operations running efficiently.

We hope you found this article useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com


Integrating CRM and FNO Using Azure Logic Apps

Posted On October 22, 2024 by Bhavika Shetty

Introduction

Seamless integration between systems is essential for efficient operations and data accuracy. One of the common integration challenges is syncing data between Customer Relationship Management (CRM) systems and Finance and Operations (FNO) systems. Traditionally, dual-write has been a solution for this integration, but it comes with limitations. In this blog, we'll explore a real-life business use case where we replace dual-write with Azure Logic Apps to enable real-time data synchronization between CRM and FNO systems.

Understanding Dual-Write

Dual-write is a framework provided by Microsoft that ensures data consistency between Dynamics 365 Finance and Operations (FNO) and Dynamics 365 Customer Engagement (CRM) applications. It facilitates real-time and bi-directional data synchronization, maintaining records of table and field mappings between FNO and CRM. This ensures that any change made in one system is reflected in the other, providing a unified experience across the enterprise. However, dual-write has its limitations, such as complex setup, limited customization options, and potential performance issues in high-transaction environments. These limitations prompt businesses to seek more flexible and scalable integration solutions.

The Business Use Case: Replacing Dual-Write with Azure Logic Apps

Scenario: A manufacturing company uses Dynamics 365 CRM to manage customer interactions and Dynamics 365 FNO to handle finance and operations. The company relies on dual-write to keep customer data synchronized between the two systems. However, they face issues with the dual-write setup, including occasional synchronization lags and difficulties in customizing data mappings. To overcome these challenges, they decide to implement Azure Logic Apps for real-time data synchronization between CRM and FNO.

Objective: Create a Logic App that enables real-time data synchronization between CRM and FNO, replacing the existing dual-write setup. This Logic App will ensure that any changes in customer data in CRM are immediately reflected in FNO and vice versa, without the complexities and limitations of dual-write.

Steps to Implement the Solution

At a high level, the Logic App listens for changes to customer data in CRM, transforms the record where needed, and writes it to the corresponding FNO entity (a hedged sketch of this flow appears at the end of this post). The detailed, step-by-step implementation is covered in the next blog.

Benefits of Using Azure Logic Apps

– Real-time, event-driven synchronization between CRM and FNO.
– Customizable data mappings without the constraints of the dual-write framework.
– The flexibility and scalability to adapt to future business requirements.

Conclusion

By replacing dual-write with Azure Logic Apps, the manufacturing company can achieve a more reliable and customizable integration between their CRM and FNO systems. This solution not only enhances data consistency and real-time synchronization but also provides the flexibility to adapt to future business requirements. Azure Logic Apps empower businesses to streamline their operations, improve data accuracy, and ultimately deliver better customer experiences.

In our next blog, we will explore in detail how this business use case can be fully implemented using Azure Logic Apps. Stay tuned for a step-by-step guide on setting up the Logic App, configuring connectors, and ensuring seamless real-time data synchronization between CRM and FNO.

We hope you found this article useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com
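As referenced under "Steps to Implement the Solution", here is a minimal sketch of what the Logic App flow could look like, assuming the standard Dataverse trigger and a plain HTTP action against the FNO OData endpoint. The table, column names, entity key, and URL are illustrative placeholders, not the exact mappings of this use case, and the HTTP action would also need Azure AD authentication configured for the FNO environment:

Trigger: Microsoft Dataverse - "When a row is added, modified or deleted" (table: Account)

Action: HTTP
  Method: PATCH
  URI: https://{your-fno-environment}/data/CustomersV3(dataAreaId='usmf',CustomerAccount='@{triggerOutputs()?['body/accountnumber']}')
  Headers:
    Content-Type: application/json
  Body:
  {
    "Name": "@{triggerOutputs()?['body/name']}",
    "PrimaryPhone": "@{triggerOutputs()?['body/telephone1']}"
  }

The reverse direction (FNO to CRM) would mirror this: a trigger on the FNO side feeding a Dataverse "Update a row" action.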


JSON to JSON Transformation using Azure Logic Apps and Liquid

Posted On October 18, 2024 by Deepak Chauhan

Introduction

In this blog post, I'll walk you through the process of transforming JSON to JSON using Azure Logic Apps and the Liquid Template Language. This step-by-step guide will demonstrate how you can use Azure Integration Services to achieve your transformation goals.

What is Liquid Template Language?

The Liquid Template Language (commonly referred to as "Liquid") is a flexible, open-source template language developed by Shopify. It is widely used to render dynamic content in platforms such as Shopify themes, Jekyll websites, and web applications. Liquid uses placeholders, loops, and conditional statements to pull dynamic data into a web template, making it an effective tool for JSON transformation.

Prerequisites

To complete this tutorial, you'll need an Azure subscription, an Azure Integration Account (to hold the Liquid map), a Logic App, and Postman for testing.

Sample Input JSON

We will use the following sample JSON file for this tutorial:

{
  "FirstName": "Deepak",
  "LastName": "Ch",
  "Add1": "T square, Saki Vihar Road, Andheri East",
  "Add2": "Mumbai",
  "Landmark": "Near Car Showroom",
  "PhoneNo1": 9812727261,
  "PhoneNo2": 2121233322
}

Desired Output JSON

The client's requirement is to transform the input JSON into the following format:

{
  "Full Name": "Deepak Ch",
  "Address": "T square, Saki Vihar Road, Andheri East, Mumbai, Near Car Showroom",
  "Phone": "9812727261, 2121233322"
}

Step-by-Step Guide

Step 1: Create a Free Azure Integration Account

Step 2: Add the Liquid Template Map

Step 3: Create a Logic App

Step 4: Transform JSON to JSON using Liquid

Here's the Liquid template used for this transformation:

{
  "Full Name": "{{content.FirstName}} {{content.LastName}}",
  "Address": "{{content.Add1}}, {{content.Add2}}, {{content.Landmark}}",
  "Phone": "{{content.PhoneNo1}}, {{content.PhoneNo2}}"
}

Step 5: Test with Postman

An illustrative example of the test request appears at the end of this post.

Final Output

The output JSON will be:

{
  "Full Name": "Deepak Ch",
  "Address": "T square, Saki Vihar Road, Andheri East, Mumbai, Near Car Showroom",
  "Phone": "9812727261, 2121233322"
}

We hope you found this article useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com
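As mentioned under Step 5, here is a minimal sketch of the Postman test, assuming the Logic App starts with a "When a HTTP request is received" trigger, runs the Transform JSON to JSON (Liquid) action with the map above, and returns the result in a Response action. The URL below is a placeholder for the callback URL you copy from the trigger:

POST {HTTP trigger URL copied from the Logic App designer}
Content-Type: application/json

{
  "FirstName": "Deepak",
  "LastName": "Ch",
  "Add1": "T square, Saki Vihar Road, Andheri East",
  "Add2": "Mumbai",
  "Landmark": "Near Car Showroom",
  "PhoneNo1": 9812727261,
  "PhoneNo2": 2121233322
}

Expected response body:

{
  "Full Name": "Deepak Ch",
  "Address": "T square, Saki Vihar Road, Andheri East, Mumbai, Near Car Showroom",
  "Phone": "9812727261, 2121233322"
}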


Understanding When to Use Azure Service Bus Queues or Topics

Posted On October 14, 2024 by Tanu Prajapati

If you're finding it challenging to decide when to use Azure Service Bus Queues or Topics, this blog is for you! In our previous blog, we explored Azure Service Bus Queues, Topics, and Subscriptions. To recap, Azure Service Bus is a fully managed messaging service provided by Microsoft Azure. It helps decouple and scale applications by allowing different components to communicate with each other through messages. In this blog, we will delve deeper into Azure Service Bus Queues vs. Topics, examining their differences, use cases, and how to choose between them based on your application needs. By understanding these core concepts, we'll be better equipped to design scalable and efficient messaging solutions using Azure Service Bus.

Service Bus Queues

Queues work on a First In, First Out (FIFO) basis. A client receives messages from the queue and processes them in the order in which they were added, and it is the only consumer that processes each message. The queue stores a message until a client is able to process it; to process the message, the client pulls it off the queue.

Purpose: Queues are designed for point-to-point communication. They are ideal when a single consumer needs to process messages from a single sender.

Message Handling: Messages are stored in a queue and processed by a single consumer in a first-in, first-out (FIFO) manner.

Use Case: Best suited for scenarios where a specific task needs to be handled one at a time, for example, an order processing system where each order needs to be managed sequentially.

Fig – Message Queue with Messages

One of the benefits of using queues is that producers and consumers do not need to exchange messages simultaneously. Messages are stored in the queue and are processed only when the consumer retrieves them. This setup enables producers to continue sending messages to the queue independently. Consequently, components within our architecture can be decoupled, as producers and consumers are not required to synchronize their actions. If there is a high volume of messages entering the queue, we can scale up the consumers without needing to scale the producers.

Service Bus Topics

Topics are different from Queues: instead of working with a single consumer, we can have multiple subscribers to our topic, each of which receives its own copy of the message from the topic. This works in a pub/sub pattern, where messages are published to the topic and multiple clients subscribe to that topic.

Purpose: Topics are designed for publish-subscribe communication. They allow messages to be sent to a topic and processed by multiple consumers.

Message Handling: Messages sent to a topic are delivered to multiple subscriptions. Each subscription can have its own filter and process messages independently.

Use Case: Ideal for broadcasting messages to multiple systems. For instance, a CRM system might need to notify various departments (e.g., sales, marketing) about a new customer record.

Fig – Topic with three Subscriptions with Messages

In Topics, consumers don't consume messages directly from the Topic. Instead, we create subscriptions that subscribe to the topic, and our consumers receive a copy of each message from the topic. In Azure Service Bus, we can define filters on these subscriptions that determine the conditions for a message to be delivered to a subscription, along with actions that modify the message metadata (see the sketch below for an example filter).
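As mentioned above, subscription filters decide which messages each subscription receives. A minimal sketch using Service Bus SQL filter rules, assuming the publisher sets a custom application property named "department" on each message (the property name, values, and subscription names are illustrative):

Subscription "sales"  – SQL filter: department = 'sales'
Subscription "audit"  – SQL filter: 1 = 1   (the default rule, which matches every message)

Messages whose application properties match a subscription's filter get a copy delivered to that subscription; all other subscriptions simply never see them.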
Conclusion

In this post, we discussed the differences between Queues and Topics in Azure Service Bus. To summarize, Azure Service Bus Queues are ideal for point-to-point communication in which messages must be handled sequentially by a single consumer. Topics, on the other hand, are suitable for scenarios that need publish-subscribe patterns, as they enable several consumers to process the same message independently. Choosing the proper option depends on your application's individual requirements, ensuring that your messaging system is both scalable and efficient. If your system requires sequential processing by a single consumer, Queues are the best option. However, if your system needs to broadcast messages to several consumers, Topics will provide the necessary flexibility and scalability.

We hope you found this article useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com


Automating Access Token and Refresh Token Generation Using ADF and Azure Key Vault – Part 2

Posted On September 11, 2024 by Deepak Chauhan

In continuation of Part 1, welcome to Part 2 of the blog on Automating Access Token and Refresh Token Generation Using ADF and Azure Key Vault. We have already completed the necessary setup in Part 1, so if you haven't read it yet, please do so before proceeding with this part.

Assumptions

Before going further, we assume the setup from Part 1 is in place: the Key Vault secrets for the client ID, client secret, refresh token, and access token exist, and Azure Data Factory has been granted access to the Key Vault.

Now, let's go through the steps to create a pipeline that refreshes the access token:

– Create a Web activity to pull the client ID, client secret, and refresh token you created in Part 1. In its settings, the URI is the Secret Identifier of the corresponding Azure Key Vault secret (a sketch of these settings appears at the end of this post).
– Similarly, set up Web activities for the client ID, client secret, and refresh token.
– For the token refresh call, I have used the setup below, but you may want to change it according to your API requirements.

Body:

grant_type=refresh_token&refresh_token=@{activity('Get Refresh Token').output.value}

Authorization:

Basic @{base64(concat(activity('Get Client Id').output.value, ':', activity('Get Client Secret').output.value))}

– After this, use another Web activity to save the refreshed access token back to Azure Key Vault.

Body:

{
  "value": "@{activity('Refresh Access Token').output.access_token}"
}

Conclusion

This blog provides a comprehensive guide to automating the access token and refresh token generation process using Azure Data Factory and Azure Key Vault. By following the steps outlined, you can ensure seamless token management, reduce manual interventions, and maintain secure access to your resources.

We hope you found this article useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com
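As referenced in the first step above, here is a minimal sketch of how the "get secret" and "save secret" Web activities could be configured. The secret names are hypothetical; the URL pattern and api-version follow the Azure Key Vault REST API, and the managed identity authenticates against the https://vault.azure.net resource:

Web activity "Get Refresh Token"
  Method: GET
  URL: https://{your-key-vault}.vault.azure.net/secrets/refresh-token?api-version=7.4
  Authentication: System-assigned managed identity, Resource: https://vault.azure.net

Web activity "Save Access Token"
  Method: PUT
  URL: https://{your-key-vault}.vault.azure.net/secrets/access-token?api-version=7.4
  Headers: Content-Type: application/json
  Authentication: System-assigned managed identity, Resource: https://vault.azure.net
  Body:
  {
    "value": "@{activity('Refresh Access Token').output.access_token}"
  }

The GET call returns the secret's current value in the output's value property, which is exactly what the Body and Authorization expressions above consume.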


Introduction to Azure Service Bus and Its Use Case

Posted On September 6, 2024 by Tanu Prajapati

Introduction

Azure Service Bus is a fully managed, multi-tenant cloud messaging service functioning as a brokered messaging system. In a service-oriented architecture (SOA), application components interact through communication protocols over a network, facilitated by the Service Bus. This article provides an overview of Azure Service Bus, highlighting its role in integrating systems like Microsoft Dynamics 365 CRM with third-party e-commerce platforms.

Real-World Scenario: Integrating Dynamics 365 CRM with an E-commerce Platform

Azure Service Bus is instrumental in enabling seamless interaction between Dynamics 365 CRM and external e-commerce applications, enhancing data management and operational efficiency.

– Customer Data Synchronization: Customer data from the e-commerce platform is transferred to Dynamics 365 CRM using Service Bus queues, ensuring the CRM system reflects the latest information.
– Order Processing: When an order is placed, it triggers a message to Dynamics 365 CRM, streamlining order fulfilment and tracking through Service Bus topics and subscriptions.
– Inventory Management: Inventory levels are updated in real time across both systems. Messages sent through Service Bus ensure accurate stock levels, preventing overselling.
– Customer Support Integration: Customer support tickets from the e-commerce platform are channelled to Dynamics 365 CRM, providing a comprehensive view of customer interactions and improving support quality.

Use Case: Real-Time Data Synchronization Between Dynamics 365 CRM and Finance & Operations

Scenario: Imagine a company that uses Microsoft Dynamics 365 CRM for customer relationship management and Dynamics 365 Finance & Operations (F&O) for financial and inventory management. To ensure consistent and accurate data across these systems, especially regarding inventory levels, real-time data synchronization is essential.

Solution: In this integration scenario, the goal is to synchronize inventory levels between Microsoft Dynamics 365 CRM and Finance and Operations (F&O) to ensure real-time accuracy. The process starts with Dynamics 365 CRM, where changes in inventory, such as sales or restocking, trigger an event. This event generates a message containing the updated inventory details, which is then sent via Azure Service Bus (an illustrative message payload appears just after this section). Azure Service Bus serves as a reliable messaging service that decouples the CRM and F&O systems, facilitating smooth communication between them.

Once the message reaches Azure Service Bus, it is picked up by an Azure Logic App. The Logic App orchestrates the integration process, potentially using Azure Functions for tasks such as data transformation, validation, or enrichment. For instance, it may convert the message into a format compatible with the F&O system, such as OData, a standard protocol for data exchange. After processing, the transformed data is sent to the F&O system, where the inventory levels are updated accordingly.

This setup ensures that inventory records are synchronized in real time across both systems, preventing issues like overselling by maintaining up-to-date stock levels. The use of Azure Service Bus and Logic Apps not only supports real-time communication but also offers a scalable and flexible integration solution that can adapt to evolving business needs. Key benefits of this approach include real-time updates, fault tolerance through message persistence and retry logic, and the flexibility to scale and integrate systems without tight coupling.
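To make the use case above concrete, here is a minimal sketch of an inventory-update message the CRM side might publish to a Service Bus queue. The application property and field names are illustrative placeholders, not a prescribed schema:

Application properties:
  messageType = "InventoryUpdate"
  source = "D365-CRM"

Body (JSON):
{
  "productId": "PROD-1024",
  "warehouse": "WH-01",
  "quantityOnHand": 150,
  "updatedOn": "2024-09-06T10:15:00Z"
}

The Logic App that receives the message can use the application properties for routing and transform the body into the OData payload the F&O entity expects.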
Azure Service Bus Queues, Topics, and Subscriptions

Azure Service Bus offers Queues, Topics, and Subscriptions as core features, enabling different messaging patterns to suit various use cases. Queues facilitate point-to-point communication, while Topics and Subscriptions support a publish-subscribe model. This flexibility allows for efficient data transfer and processing across applications. Stay tuned for my next post, where we'll explore the specific scenarios in which to use queues versus topics and subscriptions.

Conclusion

Azure Service Bus provides a versatile and reliable messaging solution for building scalable, decoupled distributed applications. By integrating seamlessly with the broader Azure ecosystem, Service Bus empowers developers to create efficient communication channels, enhancing the performance and reliability of their applications. Whether you're modernizing existing systems or developing new cloud-native applications, Azure Service Bus is an essential tool for delivering an excellent user experience.

We hope you found this article useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com


Automating Access Token and Refresh Token Generation Using ADF and Azure Key Vault – Part 1 

Posted On September 5, 2024 by Deepak Chauhan

Introduction

In this blog, I will explain how we can automate generating access tokens and refresh tokens. When working with APIs, a common problem is the expiration of the access token or refresh token after some time. We solved this issue by using Azure Data Factory and Azure Key Vault.

Azure Key Vault is used for storing API credentials, as it is one of the most secure ways to store keys and secrets in Azure. Azure Data Factory is used to automate the process of generating access tokens for APIs. We are dividing this blog into two parts: the setup (this part) and the pipeline that automates the token refresh (Part 2).

Before we proceed, please test your API in Postman to know what it requires for generating access tokens. For me, it is the client ID, client secret, and refresh token (an illustrative token request appears at the end of this post).

Steps to Set Up Azure Key Vault and Azure Data Factory:

– Go to the Azure portal and create a Key Vault resource. Please make sure that your Key Vault and Azure Data Factory are in the same region.
– Create a secret via Generate/Import and enter the required details. I have already created the secrets I need.
– For the access token, you can keep the initial value as anything you want; we will update it later using an ADF pipeline.
– Set up an access policy for the Azure Data Factory to access our Key Vault. To do this, go to "Access Policy" and select the appropriate options.
– Click "Next" and select your Azure Data Factory, where you will be creating a pipeline for refreshing the access token.
– Now, go to Azure Data Factory Studio and set up the linked service for your API.
– The dataset is also pretty straightforward; I prefer to use a parameter for the relative URL so that I can reuse the same dataset and just set the URL of the API I want to call at runtime.

Conclusion

That's all for the setup in Part 1. We've covered the essential steps to set up Azure Key Vault and Azure Data Factory for securely managing API credentials and laying the groundwork for automating access token generation. These tools provide a reliable and secure way to handle token expiration, ensuring smooth API operations without manual intervention. In Part 2, we will discuss in detail how we can automate access token generation using Azure Data Factory.

We hope you found this article useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com
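As mentioned above, it helps to confirm in Postman exactly what your token endpoint expects before wiring it into ADF. A minimal sketch of a typical OAuth 2.0 refresh-token request follows; the endpoint URL is a placeholder, and some APIs expect the client credentials in the request body rather than in a Basic authorization header:

POST https://{your-auth-server}/oauth/token
Authorization: Basic base64({client_id}:{client_secret})
Content-Type: application/x-www-form-urlencoded

grant_type=refresh_token&refresh_token={refresh_token}

The values in this request map one-to-one to the Key Vault secrets created above and to the Web activity Body and Authorization expressions used in Part 2.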



