Azure and Office 365 Archives - Page 10 of 11

Category Archives: Azure and Office 365

Using D365 App for Outlook for quick Lead capture

The purpose of this article is to help fellow entrepreneurs and sales managers leverage the power of Office 365 and Dynamics 365 to manage and build their sales pipeline in a few easy steps. I use the Outlook Web Application (OWA) for my O365 email access. I also have CRM open in another tab and usually toggle between CRM and email. There are several email conversations with existing customers or partners that I would like to quickly track as a Lead in CRM. I might not know the timeline or even the budget, but since it is an existing customer/partner, I know this would be a good lead. The disadvantage of not tracking these potential opportunities is that after a while you tend to forget to follow up! In this article, we focus on leveraging the D365 App for Outlook to convert emails into Leads, which then feed into our sales pipeline.

Step 1 – Hit that “D” hard

D365 now has a cool new logo (not a logo, maybe an icon?). Anyway, once you install the D365 App for Outlook, you will see this logo next to any email you have received. Below is an email I received this morning from our partner about a potential opportunity.

Step 2 – Track the Email and Create a Lead!

Once you ‘hit the D’, you get to this window, where you can ‘Track’ that email. I already have Andy Neal as a contact in my system, so the app pulls all that info right into my email window! Once you track the email, you will get an option to set the regarding record. On this screen, select New and then select Lead. Finally, enter the details for your Lead and close the window, or open that Lead right from your email!

Step 3 – Just do it

Yes, this step is the same as in my previous article. Get in the habit of doing this and you will build a good lead pipeline that you can work through daily to increase your conversion rates. Remember – ‘Sales cures all.’ Let’s take care of that sales pipeline! You can always email me at AShah@CloudFronts.com to discuss your sales processes and technology adoption.
In the coming articles, I will continue to focus on efficient ways to build and manage your sales pipeline and how this ties into one of the most important KPIs for running your professional services business.


Configuring Azure AD B2C: Sign up and sign in for consumers in your applications on Azure

Azure Active Directory B2C is a cloud identity management solution for your consumer-facing web and mobile applications. It is a highly available global service that scales to hundreds of millions of consumer identities. Built on an enterprise-grade secure platform, Azure Active Directory B2C keeps your applications, your business, and your consumers protected. It offers developers a better way to integrate consumer identity management into their applications with the help of a secure, standards-based platform and a rich set of extensible policies. When you use Azure Active Directory B2C, your consumers can sign up for your applications using their existing social accounts (Facebook, Google, Amazon, LinkedIn) or by creating new credentials with a username and password, referred to as “local accounts.”

Get started

To build an application that accepts consumer sign up and sign in, you first need to register the application with an Azure AD B2C tenant.

Step 1: Sign into your Azure subscription and get access to Azure AD B2C.

Step 2: Create an Azure AD B2C tenant. Use the following steps to create a new Azure AD B2C tenant. Currently, B2C features can’t be turned on in existing tenants. Sign in to the Azure portal as the Administrator. Click New > App Services > Active Directory > Directory > Custom Create. Choose the Name, Domain Name and Country or Region for your tenant. B2C directories are not yet available in every country/region, so select a region or country where B2C is available. Check the option that says “This is a B2C directory” and complete the wizard. Your tenant is now created and will appear in the Active Directory extension. You are also made a Global Administrator of the tenant, and you can add other Global Administrators as required.

Step 3: Navigate to the B2C features blade on the Azure portal. Navigate to the Active Directory extension on the navigation bar on the left side. Find your tenant under the Directory tab and click it.
Click the Configure tab, then click the Manage B2C settings link in the B2C administration section. The Azure portal with the B2C features blade will open in a new browser tab or window. Note: it can take 2-3 minutes for your tenant to become accessible on the Azure portal; retrying these steps after some time will fix this.

Easy access to the B2C features blade on the Azure portal: pin this blade to your Startboard for easy access. Sign into the Azure portal as the Global Administrator of your B2C tenant. If you are already signed into a different tenant, switch tenants (on the top-right corner). Click Browse on the left-hand navigation, then click Azure AD B2C to access the B2C features blade.

How to add an application in Azure AD B2C: after adding the application, share the application ID with your development team for the coding that redirects to the sign-up and sign-in pages. Here, ‘renaissancesvcb2c.onmicrosoft.com’ is your tenant ID, and the ‘https://www.contoso.com’ URL will be required when configuring identity providers for sign up and sign in. After you configure your tenant ID and URL with an identity provider, it will give you a client ID and secret. Add the identity provider, use that ID and key in Azure AD, and try to sign up and sign in. Next step: add sign-up policies as per your requirement. Adding sign-in policies is easier than sign-up policies.
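For context on what the policy configuration ultimately drives: the sign-up/sign-in flow is exercised through a standard OAuth 2.0 / OpenID Connect authorize request that names the policy. A rough sketch of composing that URL is below; the tenant, policy name, client ID and redirect URI are placeholders, the endpoint shape here is the classic `login.microsoftonline.com` host (newer tenants typically use a `*.b2clogin.com` host), and a real application would usually let an authentication library build this.

```python
from urllib.parse import urlencode

def b2c_authorize_url(tenant, policy, client_id, redirect_uri):
    # Compose an Azure AD B2C authorize URL that runs a given policy.
    # Endpoint shape assumed from the classic login.microsoftonline.com host.
    base = "https://login.microsoftonline.com/{}/oauth2/v2.0/authorize".format(tenant)
    params = {
        "p": policy,                # the B2C sign-up/sign-in policy to run
        "client_id": client_id,    # application ID from the B2C blade
        "response_type": "id_token",
        "redirect_uri": redirect_uri,
        "scope": "openid",
        "nonce": "defaultNonce",
    }
    return base + "?" + urlencode(params)

# Hypothetical values for illustration only.
url = b2c_authorize_url("renaissancesvcb2c.onmicrosoft.com",
                        "B2C_1_signup_signin",
                        "00000000-0000-0000-0000-000000000000",
                        "https://www.contoso.com")
```

Redirecting a consumer's browser to this URL is what presents the sign-up/sign-in page defined by the policy.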


Introduction to Azure Event Hubs

Posted On August 8, 2016

Overview

Microsoft Azure Event Hubs is a managed platform service that can ingest large amounts of data for various scenarios. It is a highly scalable, low-latency data ingestion system. Data is ingested in the form of events: event publishers submit data to the Event Hub, and event consumers consume the data at their own pace. Some scenarios where Event Hubs are applicable are application instrumentation, user experience telemetry, and the Internet of Things (IoT). Event Hubs reside in a Service Bus namespace and use AMQP and HTTP as their primary API interfaces. The diagram below gives a high-level overview of where Event Hubs lie.

Partitions in Event Hubs

Partitions are ordered sequences of events that reside in the Event Hub. Newer events are added to the end of a partition as they arrive. Partitions retain data for a configured period of time; this setting is common across all partitions in the Event Hub. Each partition is populated at its own pace and not necessarily sequentially, so data in partitions grows independently. The number of partitions is specified at creation of the Event Hub and must be between 2 and 32; the default is 4. The number of partitions you choose relates to the number of concurrent consuming applications you expect to have. The partition count cannot be changed once the Event Hub is created.

Event Publishers

So who are event publishers? The entities that publish data to the Event Hub are the event publishers. They can publish data using either HTTPS or AMQP 1.0, and they use a Shared Access Signature (SAS) token to authenticate themselves to the Event Hub.

Common Tasks for a Publisher

Acquire a SAS token: SAS is the authentication mechanism for Event Hubs. Service Bus provides SAS policies at the namespace level and at the Event Hub level; it can regenerate the key and authenticate the sender.
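The SAS token itself is a signed string built from the URL-encoded resource URI and an expiry timestamp, signed with HMAC-SHA256 under the policy key. A minimal stdlib sketch of generating one is below; the namespace, hub and key values are placeholders, and a real application would normally let an Azure SDK handle this.

```python
import base64
import hashlib
import hmac
import time
from urllib.parse import quote_plus

def generate_sas_token(resource_uri, key_name, key, ttl_seconds=3600):
    # Sign "encoded-uri newline expiry" with HMAC-SHA256 under the policy key,
    # then assemble the SharedAccessSignature string.
    expiry = str(int(time.time()) + ttl_seconds)
    encoded_uri = quote_plus(resource_uri)
    to_sign = (encoded_uri + "\n" + expiry).encode("utf-8")
    signature = base64.b64encode(
        hmac.new(key.encode("utf-8"), to_sign, hashlib.sha256).digest()
    ).decode("utf-8")
    return "SharedAccessSignature sr={}&sig={}&se={}&skn={}".format(
        encoded_uri, quote_plus(signature), expiry, key_name)

# Placeholder namespace, hub and key for illustration only.
token = generate_sas_token("https://cfs-ns.servicebus.windows.net/myeventhub",
                           "RootManageSharedAccessKey", "not-a-real-key")
```

The resulting token goes into the `Authorization` header of HTTPS sends (or the AMQP link properties) when publishing events.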
Publishing an Event

Service Bus provides an EventHubClient class for publishing events to an Event Hub from .NET clients. Events can be published either individually or batched. A single publication, whether a batch or an individual event, is limited to 256 KB; publishing events larger than this results in an error.

Partition Key

A partition key is a sender-supplied value used to map incoming messages to specific partitions for data organization purposes. It is passed to the Event Hub and processed through a hashing function, which produces the partition assignment. Partition keys are important for organizing data for downstream processing. The diagram below explains how partition keys work.

Event Consumers

An entity that reads event data from the Event Hub is an event consumer. All event consumers read from the partitions through a consumer group. Each partition should only have one active reader at a time.

Consumer Groups

A consumer group is a view (state, position, or offset) of the entire Event Hub. Consumer groups let consuming applications each have their own separate view of the Event Hub. There is always a default consumer group, and you can create up to 20 consumer groups in an Event Hub.

Stream Offsets & Checkpointing

An offset is the position of an event within a partition. It is a client-side marker specifying the point from which processing should resume; consumers should store their own offsets. Checkpointing is the process by which readers mark their position within a partition.

Common Consumer Tasks

All consumers connect to the Event Hub via AMQP 1.0, a session- and state-aware bidirectional communication channel. Because this is a partitioned consumer model, only one consumer can be active on a partition at a time within a consumer group. The following data is read from the Event Hub: offset, sequence number, body, user properties, and system properties. As mentioned above, it is the consumer's responsibility to maintain the offset. So now you know about Event Hubs!
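The offset and checkpointing behaviour described above can be sketched with a toy model: the "partition" is just an ordered list of events, and the reader persists its own offset so that a restart resumes where it left off. This is purely illustrative; it is not the Event Hubs API, and the checkpoint store stands in for whatever external storage (e.g. blob storage) a real consumer would use.

```python
# Toy partition: an ordered, append-only sequence of events.
partition = ["event-0", "event-1", "event-2", "event-3", "event-4"]

# Stand-in for an external checkpoint store; Event Hubs itself keeps
# no per-reader position, so the consumer owns this state.
checkpoint_store = {"partition-0": 0}

def read_batch(partition, store, key, batch_size):
    # Resume from the last checkpointed offset, read a batch,
    # then checkpoint the new position.
    offset = store[key]
    batch = partition[offset:offset + batch_size]
    store[key] = offset + len(batch)
    return batch

first = read_batch(partition, checkpoint_store, "partition-0", 2)
second = read_batch(partition, checkpoint_store, "partition-0", 2)
```

After the two reads, the store holds offset 4, so a restarted reader would continue from `event-4` rather than reprocessing the stream.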
Summary

Azure Event Hubs provides a highly scalable, low-latency telemetry ingestion service that can be used by common applications. In the next part of this blog, I’ll take a technical look at Event Hubs, covering how Dynamics CRM publishes data to an Event Hub and how that data becomes available for applications to consume. Watch this space for the upcoming blog. Hope this overview was helpful.


Data Movement using Azure Data Factory

Prerequisite: Azure Subscription, SQL Server Management Studio (SSMS), Azure Explorer

What is Azure Data Factory? Data Factory is a cloud-based data integration service that orchestrates and automates the movement and transformation of data. Data Factory works across on-premises and cloud data sources and SaaS to ingest, prepare, transform, analyze, and publish your data. You can use Data Factory anytime you need to collect data of different shapes and sizes, transform it, and publish it to extract deep insights, all on a reliable schedule.

Key Concepts in Azure Data Factory: Dataset – identifies data structures within different data stores, including tables, files, folders, and documents. Linked Service – defines the information needed for Data Factory to connect to external resources. Pipeline – used to group activities into a unit that together performs a task. Activity – defines the actions to perform on your data. Read more about Azure Data Factory here.

In the example below, we demonstrate a copy activity that moves data from a CSV file stored in Azure Blob Storage to an Azure SQL Database using the Azure Data Factory Editor.

Steps for Data Movement using Azure Data Factory:

Step 1: Create a storage account and a container in Azure. Place the file containing the data into the container using Azure Explorer or a similar tool.

Step 2: The image below shows the CSV file content and the same file placed in the Azure container using Azure Explorer.

Step 3: Create an Azure SQL Database to store the output data.

Step 4: By connecting SSMS to the Azure SQL Database, we can create the output table in the Azure SQL Database.

Step 5: Now go to the new Azure portal, i.e. portal.azure.com, and create a new Data Factory as shown.

Step 6: We need to create three things to start data movement: linked services, datasets and a pipeline.
You can start creating them by opening the Azure Data Factory and clicking on “Author and deploy”.

Step 7: First create the linked service for Azure SQL Database and then for Azure Blob Storage. Find the JSON code for the linked services below.

For Azure SQL Database:

```json
{
  "name": "AzureSqlLinkedService",
  "properties": {
    "description": "",
    "hubName": "adfcf_hub",
    "type": "AzureSqlDatabase",
    "typeProperties": {
      "connectionString": "Data Source=tcp:qbozi5org6.database.windows.net,1433;Initial Catalog=adfcfs;Integrated Security=False;User ID=cfadmin@qbozi5org6;Password=**********;Connect Timeout=30;Encrypt=True"
    }
  }
}
```

For Azure Blob Storage:

```json
{
  "name": "StorageLinkedService",
  "properties": {
    "description": "",
    "hubName": "adfcf_hub",
    "type": "AzureStorage",
    "typeProperties": {
      "connectionString": "DefaultEndpointsProtocol=https;AccountName=adfcfsstorage;AccountKey=**********"
    }
  }
}
```

Step 8: Now create datasets for the source as well as the sink.

For Azure SQL Database:

```json
{
  "name": "OpportunitySQLTable",
  "properties": {
    "structure": [
      { "name": "OpportunityName", "type": "String" },
      { "name": "Status", "type": "String" },
      { "name": "EstimatedRevenue", "type": "String" },
      { "name": "ContactPerson", "type": "String" }
    ],
    "published": false,
    "type": "AzureSqlTable",
    "linkedServiceName": "AzureSqlLinkedService",
    "typeProperties": { "tableName": "Opportunity" },
    "availability": { "frequency": "Hour", "interval": 1 }
  }
}
```

For Azure Blob Storage:

```json
{
  "name": "OpportunityTableFromBlob",
  "properties": {
    "structure": [
      { "name": "OpportunityName", "type": "String" },
      { "name": "Status", "type": "String" },
      { "name": "EstimatedRevenue", "type": "String" },
      { "name": "ContactPerson", "type": "String" }
    ],
    "published": false,
    "type": "AzureBlob",
    "linkedServiceName": "StorageLinkedService",
    "typeProperties": {
      "fileName": "Opportunity.csv",
      "folderPath": "adfcontainer/",
      "format": { "type": "TextFormat", "columnDelimiter": "," }
    },
    "availability": { "frequency": "Hour", "interval": 1 },
    "external": true,
    "policy": {}
  }
}
```

Step 9: Create a pipeline. Find the JSON code below.

```json
{
  "name": "ADFDataCopyPipeline",
  "properties": {
    "description": "Copy data from a blob to Azure SQL table",
    "activities": [
      {
        "type": "Copy",
        "typeProperties": {
          "source": { "type": "BlobSource" },
          "sink": {
            "type": "SqlSink",
            "writeBatchSize": 10000,
            "writeBatchTimeout": "60.00:00:00"
          }
        },
        "inputs": [ { "name": "OpportunityTableFromBlob" } ],
        "outputs": [ { "name": "OpportunitySQLTable" } ],
        "policy": {
          "timeout": "01:00:00",
          "concurrency": 1,
          "executionPriorityOrder": "NewestFirst"
        },
        "scheduler": { "frequency": "Hour", "interval": 1 },
        "name": "CopyFromBlobToSQL",
        "description": "Push Regional Effectiveness Campaign data to Azure SQL database"
      }
    ],
    "start": "2015-11-17T08:00:00Z",
    "end": "2015-11-17T09:00:00Z",
    "isPaused": false,
    "pipelineMode": "Scheduled"
  }
}
```

Step 10: Now go back to your Data Factory editor, where you can see the status of the different linked services, datasets and pipeline you created.

Step 11: Click on “Diagram” and check the status of the slices scheduled for data movement.

Step 12: Once the slices are in Ready status, you can go back to the Azure SQL Database and check that the data has been copied.


Azure setup using Office 365

In this blog we walk through how to set up Azure using Office 365.

Pre-requisite: an Office 365 administrator account.

Steps
1. Log in to the Office 365 portal: navigate to https://portal.office.com
2. Click on the Admin button.
3. Click on Azure AD to set up Azure. This will link your Azure AD to your organization account. Note: don’t use the admin account to set up Azure AD; instead, you can use a client account. Once Azure AD is set up, the account administrator cannot be changed.
4. Fill in the required details to set up a free Azure trial account. Note: a credit card is required for Azure sign-up.
5. After the sign-up process is completed, navigate to https://manage.windowsazure.com to access Windows Azure.


Creation of ACS and SAS in Azure

ACS is an Azure service that provides an easy way to authenticate users accessing web applications and services without having to add complex authentication logic to your code, while SAS is used to access resources in a storage account or namespace using its primary and secondary keys.

Assumptions: your Azure account should already be added in PowerShell with the respective user’s credentials. Note: for adding an account in Microsoft Azure PowerShell, refer to the following link: https://www.cloudfronts.in/azure-console-login-logout-using-azure-powershell/

Steps in Microsoft Azure PowerShell for ACS

Step 1: Run the ACS command in PowerShell. An ACS namespace can be created using the following Azure PowerShell command, which requires a Service Bus namespace name, location, and messaging type:

New-AzureSBNamespace GravityDocument -Location "Southeast Asia" -CreateACSNamespace $true -NamespaceType Messaging

Step 2: ACS information on the Azure portal. The ACS key information can be seen in your Microsoft Azure account under the corresponding Service Bus namespace provided in the command above. Once the namespace is created, the corresponding connection information is available at the bottom under Connection Information.

Steps in Microsoft Azure for SAS

For the SAS key, we have created a queue inside the namespace.

Step 1: Creation of the queue. A queue can now be created inside the specified namespace; follow the screenshots below and specify the required details, i.e. the queue name under the specified namespace.

Step 2: Key with permissions. Now that the queue is created, a SAS key can be generated with different permissions such as Manage, Listen and Send. Under the Configure option, under Shared Access Policies, specify the name and the permission to be given for that particular queue. The SAS key for that queue can then be obtained from the queue’s Connection Information.

Conclusion: thus, we can create ACS and SAS credentials as per requirements using Microsoft Azure PowerShell and the Azure portal.
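The Connection Information blade exposes the namespace endpoint together with the policy name and key as a single semicolon-delimited connection string. As a small sketch of pulling the pieces out of one (the endpoint and key below are made-up placeholder values, not real credentials):

```python
def parse_connection_string(conn_str):
    # Split an Azure Service Bus connection string into its key=value parts.
    # Split each segment on the FIRST '=' only, since base64 keys may end in '='.
    parts = {}
    for segment in conn_str.split(";"):
        if not segment:
            continue
        key, _, value = segment.partition("=")
        parts[key] = value
    return parts

# Placeholder connection string in the shape shown on the portal.
sample = ("Endpoint=sb://gravitydocument.servicebus.windows.net/;"
          "SharedAccessKeyName=RootManageSharedAccessKey;"
          "SharedAccessKey=abc123fakekey=")
info = parse_connection_string(sample)
```

Splitting on the first `=` per segment is the important detail: the shared access key itself is base64 and can legitimately contain trailing `=` characters.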


Azure Console Login & Logout using Azure PowerShell

A user can add an account on Azure to check or get its subscription details, as well as remove the account. Let’s see the steps to add an account and get subscription details; these also let the user select a particular subscription as required.

Step 1: Run Microsoft Azure PowerShell as Administrator.

Step 2: Add your Microsoft Azure account. To log in to your account, run the command Add-AzureAccount. A popup then appears asking for the username. Note: log in with your Live ID.

Step 3: Enter credentials. Once Step 2 is done, you are asked for the user’s credentials; enter the username and password and press Enter. Once the details are entered, all the subscriptions with their IDs and tenants related to that particular user will be listed as shown in the figure below.

Step 4: Get subscription details. If you want to see the full details of the subscriptions present in that account, run the command Get-AzureSubscription. This command lists every subscription with its corresponding details like Subscription Id, Subscription Name, Account, Storage Account etc. as shown in the figure below.

Step 5: Select a particular subscription. To select a particular subscription, run Select-AzureSubscription (Subscription Name) and then run Get-AzureSubscription -Current; this returns the details of the currently selected subscription.

Step 6: Remove your Microsoft Azure account. To log out, run the command Remove-AzureAccount; PowerShell then asks for the ID and a confirmation.

Conclusion: thus, a user can log in and log out successfully with the help of Microsoft Azure PowerShell, and can also set and get the subscription details for that particular account.


Power BI with Azure SQL Data Warehouse

Prerequisite: Power BI Desktop Tool, Power BI Online Service, SSMS and SSDT

Connecting the Power BI Desktop tool with Azure SQL Data Warehouse: with the Azure SQL Data Warehouse preview released in June 2015, we can connect Power BI Online or the Desktop tool to it and create stunning reports/dashboards to visualize and analyze our data. Before connecting with Power BI, we first move a few records into Azure SQL Data Warehouse from SQL Server in an Azure VM using SSIS, as shown below. We can then connect to the Azure SQL Data Warehouse from SQL Server and query the table to view the inserted records. Once the data is loaded in Azure SQL Data Warehouse, we can start creating reports and later publish them to Power BI Online. Find the steps to connect with the Power BI Desktop tool below:

Step 1: Open the Desktop tool and click ‘Get Data’ on the ribbon. Select ‘Microsoft Azure SQL Data Warehouse’ and click ‘Connect’.

Step 2: Enter the server name, database and then credentials to connect to your Azure SQL Data Warehouse.

Step 3: Once connected, you can select the required tables; in our case it is ‘Projects’. Then click Load.

Step 4: Design the report, then save and publish it to Power BI Online.

Step 5: You can pin visualizations to a dashboard and also schedule refresh without the need for the Power BI Personal Gateway.

Direct connectivity to Power BI Online from Azure SQL Data Warehouse: we can also load tables directly into a Power BI dataset using the ‘Open in Power BI’ option available in Microsoft Azure, as shown below. Once you hit ‘Open in Power BI’, you are taken to Power BI Online with details like server name and database name already filled in by default; you then just need to enter the database password and you are good to go. You can create reports from the imported dataset and pin visuals to a dashboard, just as with reports published from the Power BI Desktop tool. Find the screen capture below. Since the dataset connecting to Azure SQL Data Warehouse is imported directly into Power BI, it is automatically refreshed at regular intervals without the need to schedule a refresh, as shown in the image below.


Developing Integration Solutions using Microsoft Azure BizTalk Services

Part 2 – Integrating Microsoft Dynamics CRM Online with a Microsoft Azure Service Bus Queue. You can check part 1 here.

Scope: to demonstrate the integration through message flow from Microsoft Dynamics CRM Online to an Azure Service Bus Queue.

Pre-requisites: Source: Microsoft Dynamics CRM Online. Target: Microsoft Azure Service Bus Queue. SDK for Dynamics CRM Online (Plugin Registration Tool). Service Bus Explorer to view the message contents received from CRM in the Azure SB Queue. Visual Studio to create a custom WCF service to push the messages from CRM to the SB Queue (alternate method).

Background: in the earlier blog we saw the steps to create Microsoft Azure BizTalk Services as well as to develop and deploy BizTalk bridges on Azure. In this blog we will see the steps to create a Service Endpoint (Azure-aware plugin) that pushes messages in JSON (the default) format to Azure Service Bus whenever a new Account is created in CRM. We can view the contents of this message in the Azure SB Queue using the Service Bus Explorer tool (URL for the tool: https://code.msdn.microsoft.com/windowsazure/service-bus-explorer-f2abca5a). Alternatively, we can create a custom WCF web service that pushes the messages whenever a new record is created in CRM (entity: Account).

STEP 01: Creating the Azure-aware plugin in CRM

Download the Microsoft Dynamics CRM Software Development Kit (SDK) for CRM Online and on-premises CRM 2015 from http://www.microsoft.com/en-us/download/details.aspx?id=44567. After downloading, extract it, go to \MicrosoftDynamicsCRM2015SDK\SDK\Tools\PluginRegistration, and launch PluginRegistration.exe. Select Create New Connection and enter the details for your CRM, such as Deployment Type, Online Region, User Name and Password, and click Login. You can create a CRM trial account if needed. In the next window, Service Endpoint Registration, provide the details for the endpoint. Please note that in the screen above, the path is taken from the Service Endpoint URL in Azure.
For example, given the Service Endpoint URL shown above, the path is TwoWayService/Demo. The Contract can be selected from the dropdown: Oneway, TwoWay, Queue, REST, Topic or PersistentQueue. After entering the details, click Save & Configure ACS (Access Control Service). We will need the following information on this screen: Management Key, Certificate File, and Issuer Name.

Management Key: this key is obtained from the Azure portal. Log in to the Azure portal and create a Service Bus namespace and a queue; here the Service Bus namespace is btscfsnamespace and btscfsqueue is a queue in it. The Management Key is the Default Key found in the Connection Information for the Service Bus namespace in the Azure portal. After this, you need to register the steps in the Plugin Registration Tool for the service endpoint you just created.

Certificate File: this certificate file is obtained from CRM under Customizations, under Developer Resources.

Issuer Name: this is found, as in the screen above, in CRM under Windows Azure Service Bus Issuer Certificate (crm.dynamics.com).

Select Save & Verify Authentication in the Service Endpoint Registration window and close the window after the verification test completes successfully.

Step 02: Registering the Step in the Plugin Registration Tool

In the Message field, mention the type of action, such as Create, Update or Delete. Then specify the entity in CRM; in this case the entity is Account. The Execution Mode will be Asynchronous. Now that the plugin registration and step registration are completed, we can log in to CRM and create a new Account record. When a new Account is created, the message is pushed by the plugin we created earlier to the Azure Service Bus Queue. You can view the message using the Service Bus Explorer tool. The message remains in the queue as per the time-to-live settings in the Azure portal; after that, if there is no further processing, the message is moved to the dead-letter queue. So we have pushed messages from the Account entity in CRM to an Azure Service Bus Queue.
Azure Service Bus queues have many features: guaranteed FIFO ordering; transaction support; automatic dead-lettering; configurable queue TTL (maximum unlimited); poison message support; message auto-forwarding; WCF integration support; message sessions; duplicate detection; maximum queue size of 1 GB to 80 GB; maximum message size of 256 KB; default message TTL of 7 days; a maximum of 10,000 queues per service namespace (can be increased); an unlimited number of concurrent REST-based clients; maximum throughput of up to 2,000 messages/second; and average latency of 20-25 ms.

The Azure Service Bus architecture combined with a BizTalk bridge solution deployed in Azure can provide integration solutions with scalability and monitoring capabilities at an affordable cost. In the steps above, there was no coding involved thanks to the SDK Plugin Registration Tool. Alternatively, we can write a custom WCF service that pushes messages from CRM to the Azure Service Bus Queue whenever a new Account is created; from there, these messages can be picked up by a BizTalk Services bridge that listens for incoming messages in the queue and processes them to an external endpoint or web service, which can write them into another application or another CRM. We can track the status of this message processing with the BizTalk Services Tracking option.

Creating a WCF service to push messages from CRM to the Azure Service Bus Queue: alternatively, we can write custom code that integrates predefined entities and fields from CRM with other applications; here we need schemas for the source and target. To achieve this, we need to host the WCF code and register the assembly using the same Plugin Registration Tool we used earlier. Describing the code for this is beyond the scope of this blog.
Please note that the messages we send to the SB Queue through the WCF service are by default XML with UTF-8 encoding, and this aspect needs to be handled while creating the processing steps in the solutions that listen to these messages. In the next article we will have more insight into the message flow in the queues and the processing inside the Azure BizTalk bridges.


Developing Integration Solutions using Microsoft Azure BizTalk Services

Part 1 – Creating Microsoft Azure BizTalk Services and Deploying a Bridge.

Scope: creating Microsoft Azure BizTalk Services on the Azure portal; developing and deploying a BizTalk bridge on Azure.

Pre-requisites: an Azure subscription to create the BizTalk Service; Visual Studio for developing the BizTalk bridge solution; the Windows Azure BizTalk Services SDK (including the Microsoft BizTalk Adapter Pack and the Microsoft WCF LOB Adapter SDK); .NET Framework 3.5.1 features enabled; .NET Framework 4.5 installed.

Background: Microsoft Azure is a cloud computing platform and infrastructure, created by Microsoft, for building, deploying and managing applications and services through a global network of Microsoft-managed and Microsoft partner hosted datacenters. The cloud services are offered as PaaS (Platform as a Service) and IaaS (Infrastructure as a Service).

Step 1: Creating the BizTalk Service

Launch the Windows Azure portal at https://manage.windowsazure.com (you can create a trial subscription). Go to the NEW option at the bottom of the page and select APP SERVICES → BIZTALK SERVICE → CUSTOM CREATE. The BizTalk Service creation web form allows the creation of the storage account and tracking databases. After successfully creating the BizTalk Service you get the BizTalk URL. On clicking the Connection Information button at the bottom of the page, you get the access connection information; copy this to a notepad to be used during deployment of the BizTalk bridge solution on Azure: NAMESPACE, DEFAULT OWNER and DEFAULT KEY.

Step 2: Developing the BizTalk Bridge Solution

Launch Visual Studio 2012 and open a new project. Select the BizTalk Service template under the BizTalk Services folder in Visual C#. Please note that the BizTalk Services templates are visible in Visual Studio 2012 only after you install the Windows Azure BizTalk Services SDK.
(BizTalk Service templates are not available in Visual Studio 2013 or 2015, even after installing the SDK.) In the Visual Studio solution you need to specify four components: sources; bridges (XML one-way, XML request-reply, or pass-through); destinations; and the BizTalk Service URL. Sources can be FTP / SFTP, or a Service Bus queue or subscription. Destinations have more options, such as FTP, FTPS, Service Bus queue, service endpoint, blob storage etc. Bridges can be XML one-way, two-way (i.e. request-reply), or simple pass-through. Right-click anywhere in the empty space in the solution and select Properties, then enter the BizTalk Service URL (example: https://cfsbtdemo.biztalk.windows.net); we got this URL when we created the BizTalk Service in the Azure portal. After placing the source, bridge and target blocks, connect them using the connector under Bridge in the toolbox items. For complex solutions, business logic and schema mapping between source and target entities can be defined inside the bridges; custom code can be written here.

Step 3: Deployment

Save and build the project, then right-click and select Deploy. Details like the ACS namespace, Issuer Name (default: owner) and Shared Secret need to be entered. The ACS namespace is the namespace we got earlier from the Azure portal by clicking Connection Information for the BizTalk Service; the Shared Secret is the “Default Key”. Check the status of the deployed components in the Visual Studio output window. Please note: the deployment name needs to be registered/created on https://biztalksvc12.portal.biztalk.windows.net/default.aspx before deploying your bridge (you may need Silverlight installed to be able to launch this endpoint URL). Provide the BizTalk Service name as in the Azure portal, the ACS issuer name (default is owner) and the ACS issuer secret, which is the Shared Secret key from the BizTalk Service connection information in the Azure portal. Once you deploy your bridge, it appears under BRIDGES in the Microsoft Azure BizTalk Services (MABS) portal.
The rest of the deployed artifacts appear under RESOURCES. If you are using the bridge solution to route messages from an Azure Service Bus queue to destinations like a web service endpoint, you can track them under Tracking Details. In the next article we will explore the integration between Microsoft Dynamics CRM Online and an Azure Service Bus queue; an Azure Service Bus queue is one of the sources that Azure BizTalk Services can listen to for integration requirements. Please note that Azure Queues and Service Bus queues are two different types of queues offered by Microsoft Azure.

