Category Archives: Azure and Office 365
How to Send Email Notifications for Failed Pipeline Runs: Part 2
Introduction: Azure Data Factory (ADF) has built-in monitoring and auditing for pipeline activity. Alerts can be fired on both success and failure of a pipeline, depending on how you configure them. We already created the target criteria in the previous blog (Part 1), so check that out before continuing. In this part we will configure an email notification for failed pipeline runs.

Step 1: Under Configure Email/SMS/Push/Voice notification, click Configure Notification to set up an action group. An action group defines a set of notification preferences and actions used by Azure alerts.

Step 2: Select Create new. Give the action group a name and a short name, then click Add notification.

Step 3: Enter the action name and check the Email option.

Step 4: Add the email address and click Add notification.

Step 5: You will see that your notification has been added; click Add action group.

Step 6: Once your target criteria and notifications are added and Enable rule upon creation is enabled, click Create Rule.

Step 7: A new email alert has now been created for pipeline failures. The same setup can also be scripted, as sketched below.
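For reference, the same action group and alert rule can be created with the Az PowerShell module (Az.Monitor and Az.DataFactory) instead of the portal. This is a minimal sketch under assumptions: the resource names are hypothetical, the failure metric is assumed to be PipelineFailedRuns, and parameter names should be verified against your module version.

# Assumed names: myRG (resource group), myADF (data factory), admin@contoso.com (receiver)
$receiver = New-AzActionGroupReceiver -Name "email-admin" -EmailReceiver -EmailAddress "admin@contoso.com"
$actionGroup = Set-AzActionGroup -Name "adf-alerts" -ResourceGroupName "myRG" `
    -ShortName "adfalerts" -Receiver $receiver

# Alert when any pipeline run fails (PipelineFailedRuns > 0 over a 5-minute window)
$adfId = (Get-AzDataFactoryV2 -ResourceGroupName "myRG" -Name "myADF").DataFactoryId
$criteria = New-AzMetricAlertRuleV2Criteria -MetricName "PipelineFailedRuns" `
    -TimeAggregation Total -Operator GreaterThan -Threshold 0
Add-AzMetricAlertRuleV2 -Name "adf-failure-alert" -ResourceGroupName "myRG" `
    -WindowSize "00:05:00" -Frequency "00:05:00" -TargetResourceId $adfId `
    -Condition $criteria -ActionGroupId $actionGroup.Id -Severity 3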
See how to restore a deleted channel, check audit logs, and apply a retention policy in MS Teams
As an administrator, you might suddenly find that one of the channels in Microsoft Teams is no longer visible. There is a chance it was deleted. Can you recover a deleted channel? Yes: as a team owner, you can restore any deleted channel within 21 days of its deletion. In this article, we are going to see how to restore deleted channels, search the audit logs, and apply other preventive measures.

Restoring a deleted channel in MS Teams: Go to Microsoft Teams and click the team whose channel was deleted. Click the three dots beside it, then go to Manage Team. Open the Channels tab and expand Deleted items; all the deleted channels are listed there. Click Restore.

By default, all members of a team have the right to delete channels in Microsoft Teams. After an incident like this, you might want to remove the delete permission from team members. Go to the team whose permissions you want to manage, click the three dots beside it, and go to Manage Team (as we did while restoring the channel). Open the Settings tab and expand Member permissions. Uncheck "Allow members to delete and restore channels". You can manage other permissions as well, per your organization's requirements.

If a channel was deleted, you might want to know who deleted it, to ensure it doesn't happen again. You can search the audit logs from the Security & Compliance Center. Log in to the Security & Compliance Center (protection.office.com). In the left navigation, go to Search > Audit log search. Select the activities you want to search for; in this case we need to find who deleted the channel, so scroll down to the Microsoft Teams activities. Provide the start date and end date (you can leave the Users field blank) and click Search. The deleted-channel activities appear on the right-hand side. The same search can also be run from PowerShell, as sketched at the end of this article.

You can also set up a Teams retention policy for chat and channel messages and decide whether to retain data, delete it, or retain it for a specific period. To set up a retention policy: go to the Office 365 Security & Compliance Center > Information Governance > Retention > Create. Name your policy and add a description. You can configure the policy to retain data or delete data. Retain data: this ensures your data is kept for a specified period and remains available for eDiscovery even if the user deletes it; you can define whether to delete the data or do nothing after the retention period is over. Delete data: the policy deletes data after the defined time. Choose Microsoft Teams as the location, select whether to apply the retention policy to Teams chats and channel messages, then review your settings and create the policy.

In this article, we saw how to restore a deleted channel, manage member permissions, check the audit logs, and apply a retention policy, all of which will help your organization manage MS Teams more efficiently. Thanks!
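As a quick reference for the audit-log step above, the same search can be run from PowerShell with the Exchange Online Management module. A minimal sketch, assuming the deleted-channel operation is logged under the name ChannelDeleted:

# Requires the ExchangeOnlineManagement module; ChannelDeleted is the assumed operation name
Connect-ExchangeOnline
Search-UnifiedAuditLog -StartDate (Get-Date).AddDays(-21) -EndDate (Get-Date) `
    -RecordType MicrosoftTeams -Operations ChannelDeleted -ResultSize 100 |
    Select-Object CreationDate, UserIds, AuditData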
How to get an email notification when an Azure Data Factory pipeline fails
There is no built-in "email activity" in Azure Data Factory, but you may want to send an email notification when one of your activities fails. In this blog I am going to explain how to send an email notification using an ADF Web activity and an Azure Logic App.

Sending an email with Logic Apps
Logic Apps allows you to easily create a workflow in the cloud without having to write much code. First, log in to portal.azure.com. Choose to create a new resource and search for Logic App. Click Create; you will be asked to specify some details for the new Logic App. Click Review + create to finalize the creation. After the app is deployed, you can find it in the resource menu. Click on the app, then click the Logic app designer link.

In this tip, we use the HTTP request trigger ("When a HTTP request is received"), since we're going to use the Web activity in ADF to start the Logic App. From ADF, we're going to pass along some parameters in the HTTP request, which we'll use in the email later on. This is done by sending JSON in the body of the request, using the following schema:

{
    "properties": {
        "DataFactoryName": { "type": "string" },
        "EmailTo": { "type": "string" },
        "ErrorMessage": { "type": "string" },
        "PipelineName": { "type": "string" },
        "Subject": { "type": "string" }
    },
    "type": "object"
}

We're sending the following information:
The name of the data factory. In a large environment with multiple instances of ADF, this tells us which ADF has a pipeline with an error.
The email address of the receiver.
An error message.
The name of the pipeline where the issue occurred.
The subject of the email.

In the editor, click New step to add the action that sends the email. When you search for "mail", you will see there are many different actions; click Office 365 Outlook and select Send an email (V2). Once you're logged in, you can configure the action, using dynamic content to populate some of the fields. Once you click Save, an HTTP POST URL is generated; copy this URL for use in ADF.

Triggering the Logic App from ADF
Suppose you already have a pipeline with some activities in ADF. Add a Web activity to the canvas and connect another activity to this new activity using the arrow. When the connection has been made, right-click on the connection and change it to a Failure precedence constraint; this turns the connector red. In the Web activity, use the HTTP POST URL that we copied from the Azure Logic App. We also need to add a header setting Content-Type to application/json. In the body, we enter the following JSON (following the schema above):

{
    "DataFactoryName": "@{pipeline().DataFactory}",
    "PipelineName": "@{pipeline().Pipeline}",
    "Subject": "An error has occurred!",
    "ErrorMessage": "The ADF pipeline has crashed! Please check the logs.",
    "EmailTo": "myemail@outlook.com"
}

We're using system parameters to retrieve the name of the data factory and the name of the pipeline. All the other fields in the settings pane can be left as-is. Now we can run the pipeline and wait to see if any emails come in. You can also test the Logic App on its own before wiring it into ADF, as sketched below. I hope this helps.
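To verify the Logic App independently of ADF, you can post the same JSON payload to the generated HTTP POST URL from PowerShell. A minimal sketch; the URL and email address are placeholders:

# Placeholder: replace $url with the HTTP POST URL generated when you saved the Logic App
$url = "https://<your-logic-app-url>"
$body = @{
    DataFactoryName = "MyDataFactory"
    PipelineName    = "MyPipeline"
    Subject         = "An error has occurred!"
    ErrorMessage    = "Test message from PowerShell."
    EmailTo         = "myemail@outlook.com"
} | ConvertTo-Json
Invoke-RestMethod -Method Post -Uri $url -ContentType "application/json" -Body $body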
Auto scale the Power BI Embedded capacity using Job Scheduler in Azure
Power BI Embedded is a Microsoft Azure service that lets ISVs and developers embed visuals, reports, and even dashboards into their applications. Power BI Embedded is a PaaS analytics solution that provides Azure-based capacity and is billed hourly; there is no annual commitment for the service. Because it charges by the hour, and Azure offers no direct auto-scaling feature for it, we can use the provided API to scale the capacity ourselves. In this blog we are going to see how to scale the Power BI Embedded capacity using a PowerShell script.

Before we start, let's quickly list the prerequisites:
An Azure account. If you are implementing the PowerShell script for your organisation, you must have the co-administrator role assigned; keep in mind that with only the contributor role you will not be able to create an Automation account. (We'll cover the Automation account in the later part of this blog.)
A Power BI Embedded subscription.
An Automation account.

I'm assuming you already have an Azure account along with a subscription for Power BI Embedded.

Steps:
1. Create an Automation account. An Automation account is used to manage Azure resources across all the subscriptions for a given tenant. To create one, click Create a resource in the Azure portal and search for Automation account, or type Automation Account in the search box.
2. Click Create Automation Account and fill in the details. If you have multiple subscriptions, make sure to select the proper one from the drop-down, and make sure Create Azure Run As account is set to Yes (if you are a co-administrator or administrator, it is selected by default). Once created, the account will show up under Automation accounts.
3. Open the Automation account, go to Connections, and add the connections and types shown below (click Add a connection and enter the name and type).
4. For AzureClassicRunAsConnection, set CertificateAssetName to AzureRunAsCertificate.
5. Add the Power BI Embedded subscription to your resource group.
6. Once the Automation account is ready, go to Runbooks under Process Automation. A runbook is useful for routine procedures and operations; you could also use an Azure Function app instead of a runbook.
7. Click Create a runbook and fill in the details.
8. Once the runbook is open, make sure to import the AzureRM.PowerBIEmbedded module: go to Modules under Shared Resources, click Browse gallery, and search for AzureRM.PowerBIEmbedded.
9. Use the PowerShell script below, which can also be found on the Power BI discussion site.
$resourceGroupName = "<your resource group>"
$instanceName = "<Power BI Embedded instance name>"
$azureProfilePath = ""
$azureRunAsConnectionName = "AzureRunAsConnection" # "PowerBIAutoscale"

# Schedule configuration: day-of-week ranges (0 = Sunday) with start/stop times and target SKU
$configStr = "
[
  { Name: ""Weekday Heavy Load Hours"", WeekDays:[1,2,3,4,5], StartTime: ""06:45:00"", StopTime: ""23:45:00"", Sku: ""A4"" },
  { Name: ""Early AM Hours"", WeekDays:[0,1,2,3,4,5,6], StartTime: ""00:00:00"", StopTime: ""04:44:00"", Sku: ""A1"" },
  { Name: ""Model Refresh"", WeekDays:[0,1,2,3,4,5,6], StartTime: ""04:45:00"", StopTime: ""06:45:00"", Sku: ""A3"" },
  { Name: ""Weekend Operational Hours"", WeekDays:[6,0], StartTime: ""06:45:00"", StopTime: ""18:00:00"", Sku: ""A3"" }
]
"

$VerbosePreference = "Continue"
$ErrorActionPreference = "Stop"

Import-Module "AzureRM.PowerBIEmbedded"

Write-Verbose "Logging in to Azure..."
# Load the profile from a local file
if (-not [string]::IsNullOrEmpty($azureProfilePath)) {
    Import-AzureRmContext -Path $azureProfilePath | Out-Null
}
# Load the profile from the Azure Automation Run As connection
elseif (-not [string]::IsNullOrEmpty($azureRunAsConnectionName)) {
    $runAsConnectionProfile = Get-AutomationConnection -Name $azureRunAsConnectionName
    Add-AzureRmAccount -ServicePrincipal -TenantId $runAsConnectionProfile.TenantId `
        -ApplicationId $runAsConnectionProfile.ApplicationId `
        -CertificateThumbprint $runAsConnectionProfile.CertificateThumbprint | Out-Null
}
# Interactive login
else {
    Add-AzureRmAccount | Out-Null
}

$startTime = Get-Date
Write-Verbose "Current Local Time: $($startTime)"
$startTime = [System.TimeZoneInfo]::ConvertTimeBySystemTimeZoneId($startTime, [System.TimeZoneInfo]::Local.Id, 'Eastern Standard Time')
Write-Verbose "Current Time EST: $($startTime)"
$scheduleTimeMidnight = ($startTime).Date
Write-Verbose "Schedule Time Base (Midnight): $($scheduleTimeMidnight)"
$currentDayOfWeek = [Int]($scheduleTimeMidnight).DayOfWeek
Write-Verbose "DOW: $($currentDayOfWeek)"

$stateConfig = $configStr | ConvertFrom-Json

# Convert the HH:mm:ss strings into DateTime values relative to today's midnight
Write-Verbose "Writing Config Objects..."
foreach ($x in $stateConfig) {
    Write-Verbose "Name: $($x.Name)"
    Write-Verbose "Weekdays: $($x.WeekDays -join ',')"
    $x.StartTime = ($scheduleTimeMidnight).AddHours([int]$x.StartTime.Split("{:}")[0]).AddMinutes([int]$x.StartTime.Split("{:}")[1]).AddSeconds([int]$x.StartTime.Split("{:}")[2])
    Write-Verbose "Start Time: $($x.StartTime)"
    $x.StopTime = ($scheduleTimeMidnight).AddHours([int]$x.StopTime.Split("{:}")[0]).AddMinutes([int]$x.StopTime.Split("{:}")[1]).AddSeconds([int]$x.StopTime.Split("{:}")[2])
    Write-Verbose "End Time: $($x.StopTime)"
}

Write-Verbose "Getting current status..."
# Get the capacity status
$pbiService = Get-AzureRmPowerBIEmbeddedCapacity -ResourceGroupName $resourceGroupName
switch ($pbiService.State) {
    "Scaling" {
        Write-Verbose "Service scaling operation in progress... Aborting."
        exit
    }
    "Succeeded" { Write-Verbose "Current Status: Running" }
    Default { Write-Verbose "Current Status: $($pbiService.State)" }
}
Write-Verbose "Current Capacity: $($pbiService.Sku)"

# Find a match in the config
$dayObjects = $stateConfig | Where-Object { $_.WeekDays -contains $currentDayOfWeek }

if ($dayObjects -ne $null) {
    # Several objects can cover the same time frame; if there's more than one, pick the first
    $matchingObject = $dayObjects | Where-Object { ($startTime -ge $_.StartTime) -and ($startTime -lt $_.StopTime) } | Select-Object -First 1
    if ($matchingObject -ne $null) {
        Write-Verbose "Current Config Object"
        Write-Verbose $matchingObject.Name
        Write-Verbose "Weekdays: $($matchingObject.WeekDays -join ',')"
        Write-Verbose "SKU: $($matchingObject.Sku)"
        Write-Verbose "Start Time: $($matchingObject.StartTime)"
        Write-Verbose "End Time: $($matchingObject.StopTime)"
        # If paused, resume
        if ($pbiService.State -eq "Paused") {
            Write-Verbose "The service is Paused. Resuming the instance"
            $pbiService = Resume-AzureRmPowerBIEmbeddedCapacity -Name $instanceName -ResourceGroupName $resourceGroupName -PassThru -Verbose
        }
        # Change the SKU if needed
        if ($pbiService.Sku -ne $matchingObject.Sku) {
            Write-Verbose "Updating Capacity Tier from $($pbiService.Sku) to $($matchingObject.Sku)"
            Update-AzureRmPowerBIEmbeddedCapacity -Name $instanceName -ResourceGroupName $resourceGroupName -Sku $matchingObject.Sku
        }
    }
    else {
        # No interval matched the current time: fall back to A2
        Write-Verbose "No Interval Found. Checking current capacity tier."
        if ($pbiService.Sku -ne "A2") {
            Write-Verbose "Updating Capacity Tier from $($pbiService.Sku) to A2"
            Update-AzureRmPowerBIEmbeddedCapacity -Name $instanceName -ResourceGroupName $resourceGroupName -Sku "A2"
        }
    }
}
else {
    # No config entry for today: fall back to A2
    Write-Verbose "No Interval Found. Checking current capacity tier."
    if ($pbiService.Sku -ne "A2") {
        Write-Verbose "Updating Capacity Tier from $($pbiService.Sku) to A2"
        Update-AzureRmPowerBIEmbeddedCapacity -Name $instanceName -ResourceGroupName $resourceGroupName -Sku "A2"
    }
}
Write-Verbose "Done!"

10. The script above does not include pausing the capacity; you can add that to the script yourself.
11. Once you are done with the script, click Save and then Publish.
12. Create the schedule under Shared Resources.
How to map an Azure File Share as a Network Drive?
In this article, we will see how to map a network drive on your system to a file share stored in the cloud using Azure File Share. Azure File Share is useful when you want to share files across multiple systems while all the data stays in the cloud.

Considerations:
You must have an Azure subscription to use Azure File Share; if you are trying it for the first time, you can take a one-month trial.
You must be a Global Administrator in Azure to perform these steps.
You should have a resource group created in Azure.

Azure Files offers fully managed file shares in the cloud over the Server Message Block (SMB) protocol. The advantage of using a file share is that you can replace on-premises file shares with Azure file shares, sharing a file system across multiple systems without having to deal with any kind of hardware.

Step 1: Creating a storage account in Azure. You can use a storage account that already exists, or create a new one. To create a new one: go to portal.azure.com. In the search bar, search for Storage accounts. Click Add and provide the required details: choose the subscription; select the resource group (create a new one if you haven't yet); under Instance details, provide a storage account name (this must be unique); and select your location. You can leave the other fields at their default values. Choose Review + Create to review the storage account settings and create the account.

Step 2: After the storage account is created, create a file share. Go to the storage account you created and select File shares. Click + File share to get started. Name the file share, set a quota in GiB to limit the total size of the files on the share, and click Create.

Step 3: We created a file share in Step 2; now we will map it as a network drive on the system. To map an Azure file share as a network drive, you need the following information:
The URL of your file share: open the file share > Properties.
The storage account name and storage account key to allow access: go to the storage account you created > Access keys.
Open Windows Explorer > This PC > Computer > Map network drive. Select the drive letter (I'm choosing Z:) and provide the file share URL from the first point. Once you click Finish, a pop-up opens asking for the credentials: use the storage account name as the username and the storage account key as the password. You will then find a network drive on the system mapped to the Azure file share we created. (A command-line alternative is sketched at the end of this article.)

You can now save any file to the network drive, and it will appear in the Azure file share and on all connected devices. You can also use Microsoft Azure Storage Explorer to upload, download, and manage Azure blobs, file shares, and other storage entities; you can download it from https://azure.microsoft.com/en-in/features/storage-explorer/

This is how you can utilize Azure File Share by mapping it to a network drive on your system, giving you shared access to a file system across multiple machines with no hardware to manage. You can go through the three steps above to create the same without any difficulty.
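For reference, the same mapping can be done from a PowerShell prompt with the built-in net use command. A minimal sketch, assuming hypothetical storage account and share names (mystorageacct, myshare); the password is one of the storage account access keys:

# Hypothetical names; the /user value is the storage account name prefixed with Azure\
net use Z: \\mystorageacct.file.core.windows.net\myshare <storage-account-key> /user:Azure\mystorageacct /persistent:yes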
How to configure SAML authentication in Azure AD for Zoho People?
After signing up for a Zoho People subscription, you can configure SAML authentication for Zoho People by adding it as a non-gallery application in Azure AD. When adding gallery or non-gallery apps to Azure AD, one of the SSO options available is SAML-based SSO; with SAML, Azure AD authenticates the application using the user's Azure AD account. In this article, we will see how to register Zoho People as a non-gallery application in Azure AD and how to configure SAML authentication for SSO in three steps. To configure SAML SSO for a non-gallery application, you need an Azure AD Premium subscription.

Step 1: Registering the Zoho People application.
Register Zoho People as a non-gallery application in Azure AD. Go to Azure AD > Enterprise applications > click + New application. Under Add an application, select Non-gallery application. Provide a name for the application; here I am using Zoho People - CFT (CFT being the org name). Click Add. After the application is added, it shows up in the Enterprise applications list. Click Single sign-on; among the SSO methods offered, choose SAML.

Step 2: Configuring SAML in Zoho Accounts.
Before configuring SAML in Azure AD, you need to configure SAML in Zoho Accounts. Sign in to the Zoho People account as an administrator and go to My Account; you will be redirected to the Zoho Accounts page. Click SAML Authentication under Settings and then click Setup Now. Provide the required details, all of which come from Azure AD: go to the application you registered in Step 1 and click Single sign-on.
Sign-in URL and Logout URL: you can set both to the Sign-in URL, or leave the Logout URL blank. (Note: I tried the Logout URL from the SAML configuration in Azure AD, but it gave an error while logging out of Zoho People, hence I am leaving the Logout URL blank.)
Public key: download the certificate (Base64) and upload it.
Zoho Service: select People.
Algorithm: RSA (selected by default).
Change password URL: the same as the Sign-in URL.
Click Configure; it will ask you to verify yourself (enter your password), then click Configure again. Once SAML is configured in the Zoho account, you will see the Download Metadata tab. Download the metadata, which will need to be uploaded to Azure AD.

Step 3: Configuring SAML in Azure AD.
SAML authentication is now configured on the Zoho account. Next, upload the metadata downloaded in Step 2. Go to Azure AD > Enterprise applications > Zoho People - CFT > Single sign-on. Click Upload metadata file and upload the metadata file from Step 2. Once uploaded, the Basic SAML Configuration opens, and you will see that the Identifier (Entity ID) and Reply URL are populated automatically. Change the Identifier (Entity ID) from zoho.com to zoho.in, because the URL for logging in to Zoho People here is zoho.in; alternatively, instead of changing zoho.com, you can add zoho.in and make it the default. Click Save. Once the SAML configuration is done, go to Users and groups and add users. After users are assigned, the application appears on their Access Panel. Test SSO with the application.

Accessing the Zoho application:
Directly through the Zoho People URL: https://people.zoho.in/
From the Access Panel (myapps.microsoft.com), users can find the Zoho People - CFT application.
From office.com, users can find Zoho People - CFT under All apps.
This article should help you configure SAML authentication for single sign-on. You can configure SAML authentication for other applications in the same way, and end users will be able to leverage SSO for those third-party applications.
RSAT (Regression Suite Automation Tool) implementation and configuration for Finance and Operations
Purpose
The Regression suite automation tool (RSAT) significantly reduces the time and cost of user acceptance testing. It enables functional power users to record business tasks using the Finance and Operations Task recorder and convert those recordings into a suite of automated tests without the need to write source code. Test libraries are stored and distributed in Lifecycle Services (LCS) using Business Process Modeler (BPM) libraries. These libraries are also fully integrated with Azure DevOps Services for test execution, reporting, and investigation. Test parameters are decoupled from test steps and stored in Microsoft Excel files.

Prerequisites
A Dynamics 365 for Finance and Operations test environment (demo or tier-2 UAT).
Excel.
Azure DevOps: you will need an Azure DevOps Test Manager or Test Plans license. For example, if you have a Visual Studio Enterprise subscription, you already have a license for Test Plans. Pricing: https://azure.microsoft.com/en-us/pricing/details/devops/azure-devops-services/ For a demo environment, you don't need to buy any license.
Authentication certificate: to enable secure authentication, RSAT requires a certificate to be installed on the RSAT client computer. The RSAT settings dialog box allows you to automatically create and install the authentication certificate.

Installation
Download Regression Suite Automation Tool.msi to your machine. RSAT requires Selenium and web browser driver libraries; RSAT will prompt you if needed libraries are missing and will automatically install them for you.

Configuring RSAT
Open the RSAT application and select the Settings button in the upper right. The next steps will help you find the required field inputs.
Go to the LCS project settings for your project, then to Visual Studio Team Services. Here you need to mention the Azure DevOps project in the Azure DevOps site URL field. To do that, go to https://www.visualstudio.com, open Azure DevOps, and create a new organization if there isn't an existing one. Then create a new project.
Next, set up a personal access token by clicking Account info > Security. Once you create the token, save it, as you will not be able to access it again when you want to use it.
Once that is done, go back to the main page and create a new test plan. Name it RSAT-TT (or any name you like). Right-click RSAT-TT and create a new suite; you can name it 'Demo'. The Azure DevOps setup is done.
In the Azure DevOps site URL field, mention the organization name you set up in Azure DevOps, and in the Personal access token field, paste the token you saved earlier. Click Continue to select the project, then continue and Save.
Now you need to deploy it to the environment. Open the Regression Suite Automation Tool and go to Settings: copy the Azure DevOps URL field value from LCS, and the Access token should be the security token you copied. Click Test connection so that the Project name and Test plan fields populate.
Now run the VM. You will find the Hostname and SOAP Hostname by going to IIS and right-clicking AOSService > Edit bindings. Copy both values and paste them into the Hostname and SOAP Hostname fields. The Admin username should be the username you use to log in to your environment.
To generate the Thumbprint, click New, save the certificate at any location, and then copy the generated certificate to the VM. Open the copied certificate and install it on the local machine in both the Personal and Trusted Root Certification Authorities stores (a scripted alternative is sketched at the end of this article). Now open the wif file in Notepad (in admin mode) from the given location on the VM. In the wif file, check whether an authority with CN name=127.0.0.1 exists. If not, copy an existing authority block, paste it below that block, and modify it as follows:

<authority name="CN=127.0.0.1">
    <keys>
        <add thumbprint="F46D2F16C0FA0EEB5FD414AEC43962AF939BD89A" />
    </keys>
    <validIssuers>
        <add name="127.0.0.1" />
    </validIssuers>
</authority>

(Note: add the thumbprint of the installed certificate in the wif file as shown.) The final steps: copy the thumbprint from RSAT settings (generated when you clicked New) and paste it into the wif file on your VM. Then fill in the company name and working directory, set the default browser to Internet Explorer, save, and click OK.

Next, go to LCS, open Business Process Modeler, and create a new library. Name it RSAT, go to Edit, and rename the process as required; you may add a child node to it by clicking Add process. Now go to Finance and Operations, open the Task recorder, create a recording by clicking Create recording, perform the operation, and click Stop. Name it as per your need, then choose Save to Lifecycle Services (or Save to this PC) and click OK. Go back to the project library in LCS, click the Requirements tab, and check that it is syncing; then sync the test cases and run VSTS sync. Next, go to Azure DevOps > Test cases, click Add existing, run the query, and click Add test case. Now go back to the Regression Suite Automation Tool, load the tests, and download the test cases. Select a test, click New, and generate the test execution parameter files. Click the Edit option to edit the values in Excel (the location of this option differs slightly between older and newer versions). Edit the metadata for the test in the Excel file, then save and close. Now run the test; after this step, the test session is driven by Selenium, and the browser will perform the steps from the test case. Once the run completes successfully, click Upload (note the result as Passed).
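As an aside, the certificate-installation step above can also be scripted. A minimal PowerShell sketch, assuming the RSAT certificate was saved as a .cer file at a hypothetical path (for a .pfx file, use Import-PfxCertificate with a password instead):

# Hypothetical path; installs the certificate into the LocalMachine Personal and Trusted Root stores
Import-Certificate -FilePath "C:\Temp\RSATCertificate.cer" -CertStoreLocation Cert:\LocalMachine\My
Import-Certificate -FilePath "C:\Temp\RSATCertificate.cer" -CertStoreLocation Cert:\LocalMachine\Root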
How to create an Azure SQL database from the Azure Portal
Hello friends, in this blog we will learn how to create an Azure SQL database from the Azure portal.

Steps:
Go to portal.azure.com and create a new SQL Database resource.
Create a new database, click New server, and fill in the necessary details; the login ID and password will be the ones you use to authenticate against the database.
Now the database is ready! You can click on the Basic pricing tier to change the pricing of the database. (The same resources can also be created from PowerShell, as sketched at the end of this post.)

Pricing overview of Azure SQL:

                          BASIC      STANDARD    PREMIUM
Max data size             2 GB       250 GB      500 GB
Max DTU                   5          50          4000
Price (INR) per month     329.89     4,958.54    1,057,537.88

Hope this helps!
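For reference, the same server and database can be created with the Az PowerShell module. A minimal sketch, assuming hypothetical resource group, server, and database names and an authenticated session (Connect-AzAccount):

# Hypothetical names; requires the Az.Sql module
$cred = Get-Credential   # the server admin login and password
New-AzSqlServer -ResourceGroupName "myRG" -ServerName "my-sql-server-01" `
    -Location "Central India" -SqlAdministratorCredentials $cred
New-AzSqlDatabase -ResourceGroupName "myRG" -ServerName "my-sql-server-01" `
    -DatabaseName "MyDatabase" -Edition "Basic"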
How to create a Dynamics 365 Finance & Operations (formerly Dynamics AX) connection in SSIS
Introduction: Creating a connection is one of the first steps in any data integration. When integrating with Dynamics AX, or more specifically Dynamics 365 for Finance and Operations, create a connection using the following steps.

Step 1: Right-click Connection Managers and click New Connection Manager.
Step 2: Configure your AX application to get the required details:
Step 2.1: Go to portal.azure.com, log in with your credentials, and go to App registrations.
Step 2.2: Enter a name for the application, the appropriate supported account type, and a redirect URI with an address similar to the service URL you will enter, then click Register.
Step 2.3: Note down the details needed for creating the connection (such as the application/client ID).
Step 2.4: Click New client secret.
Step 2.5: Add a description, select Never in the Expires section, and click Add.
Step 2.6: The client secret is shown only once, so be sure to copy it and store it in a file for future reference.
Step 2.7: In the Data Scope section, select the value required by your AX deployment, click Register, and click Open in Browser; you will be redirected to your Finance and Operations page. Once authorized, you can test the connection by clicking the Test Connection button.

Conclusion: You can now move on with developing your control and data flows for integration with Finance and Operations as a source or destination.
Move database from sandbox to development in D365 Finance and Operations
Hello, in this blog I am going to demonstrate how to move a database from a sandbox to a development environment. In some cases you may need to debug code against production data. For this, first move the database from production to the sandbox with a database refresh in LCS, and then move the database from the sandbox to development as follows.

Steps to move the database from sandbox to dev:
1. Log in to LCS and click the sandbox environment's Full details. On the Maintain tab, click Move database. To export the sandbox database, click Export database.
2. After the export command completes successfully, you will find the .bacpac file under Database backup in the asset library. Download the .bacpac file to the development VM.
3. Open SSMS on the development server. Before importing the database, you must rename the existing AxDB with the following script:

USE master;
GO
-- Rename the existing AxDB out of the way (AxDB_Original is an example name)
ALTER DATABASE AxDB SET SINGLE_USER WITH ROLLBACK IMMEDIATE;
GO
ALTER DATABASE AxDB MODIFY NAME = AxDB_Original;
GO
ALTER DATABASE AxDB_Original SET MULTI_USER;
GO

4. Right-click Databases and select Import Data-tier Application, then click Next.
5. Change the new database name to AxDB and click Next.
6. Click Next and browse to the folder where the .bacpac was downloaded.
7. Click Finish to import the database; you can watch the steps execute. (A command-line alternative is sketched at the end of this post.)
8. Once the import is done, open Visual Studio and do a full database synchronization.

I hope this blog helps you.
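For reference, the import in step 7 can also be run from a PowerShell prompt with SqlPackage instead of the SSMS wizard. A minimal sketch, with a hypothetical file path, assuming SqlPackage.exe is on the PATH (it ships with SSMS/DacFx):

# Hypothetical path to the downloaded .bacpac; imports it as a new AxDB on the local instance
SqlPackage.exe /a:Import /sf:"C:\Temp\AxDB_backup.bacpac" /tsn:localhost /tdn:AxDB /p:CommandTimeout=1200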