
Category Archives: Blog

Agile project implementation methodology at CloudFronts

We have been hearing a lot about the Agile methodology for project implementation, but we also need to see whether it is the right choice for the Dynamics world. In short, Agile says: "Do not wait until the end; instead, see the deliverable in pieces." That is justified too: clients get to see their product take shape at regular intervals. Each delivery sprint can run in a controlled manner through an end-to-end delivery cycle, just as a complete project does, excluding the deployment phase, which happens at the end of the last sprint of the project.

We at CloudFronts practice Agile for all our project implementations. In fact, our PSM solution is being redesigned to suit the needs of Agile project implementation. This helps project managers keep their data intact in CRM instead of scattering it across multiple Excel files. From project creation to resource allocation, Gantt charts, and timesheets, we keep all of it inside CRM so that the PM stays on top of the project and everything is in a single place.

Agile generally demands clear requirements for the project being implemented. Keeping this in mind, the project moves through sprints. The following image illustrates where sprints come into the picture.

We follow the standard practices dictated by Agile:
- Daily stand-up meetings
- Iteration planning
- Unit testing
- Release planning
- Burndown/team-based estimation
- Coding standards
- Continuous integration
- Automated builds

Project planning is done in MS Project using a sprinted approach. Although MS Project is not the ideal tool for Agile planning, its flexibility allows us to use it this way.

We maintain the following documents for each sprint and the sprints ahead:
- Daily Agile task allocation sheet: an Excel sheet in which we allocate tasks to the team during the daily stand-up meeting and mark the tasks that remain incomplete.
- Sprint document: this contains four sections: Achievable, Backlog, Completed, and Other Remarks. All incomplete tasks from the previous sprint are moved to the next sprint's Backlog section and become part of the current sprint.

There are multiple benefits to following Agile:
- The development process gets streamlined and simplified.
- A higher rate of customer satisfaction.
- Reduced risk.
- Improved project visibility.
- Project success rates go up by as much as 70%.
- Reduced development cost.

So, this is what CloudFronts follows as its Agile practice. We hope you found this article useful. We will continue to publish more articles as we implement an Agile approach that works for Dynamics projects!


Custom Auto Number for Cases

Posted On December 1, 2015 by Admin

Currently, in Microsoft Dynamics CRM, a customer service representative creates a case to track a customer request, question, or problem. All actions and communications can be tracked on the case entity. The ID field on the Case entity is generated automatically, and the default format looks like CAS-00034-Z7M9F7. This kind of Case ID is not meaningful for many users, and there might be a need to create an ID that includes customer information, so that a service representative can identify the case just by looking at the Case ID. In today's blog, we will show you how to generate a customized Case ID in your own format, as per the business requirement.

Whenever a case is created, the Case ID field is generated automatically and a lock is acquired on that field; the user cannot change the generated ID. To achieve a custom-generated Case ID, we create a plug-in and add customizations in CRM. Here we consider an example where the Account entity is used. The generated Case ID looks like "3-003", where '3' is the unique account number and '003' indicates the 3rd case for that account. When a 4th case is created on the same account, the counter increases and the generated Case ID becomes "3-004". Cases for different accounts can therefore be tracked just by looking at the Case ID.

To get a custom auto number on the Case entity, follow the steps below.

1) Plug-in: Create a synchronous PRE CREATE plug-in on the Case entity. The code for case number generation:

public class CaseAutoNumberPlugin : IPlugin
{
    private IPluginExecutionContext context;
    private IOrganizationService service;

    public void Execute(IServiceProvider serviceProvider)
    {
        if (serviceProvider != null)
        {
            this.context = (IPluginExecutionContext)serviceProvider.GetService(typeof(IPluginExecutionContext));

            // Obtain the organization service reference, which is needed for web service calls.
            IOrganizationServiceFactory serviceFactory = (IOrganizationServiceFactory)serviceProvider.GetService(typeof(IOrganizationServiceFactory));
            this.service = serviceFactory.CreateOrganizationService(this.context.UserId);
        }

        // Guard against re-entrant executions of the plug-in.
        if (this.context.Depth > 2)
        {
            return;
        }

        if (this.context.InputParameters.Contains("Target") && this.context.InputParameters["Target"] is Entity)
        {
            // Obtain the target entity from the input parameters.
            Entity caseEntity = (Entity)this.context.InputParameters["Target"];
            this.CaseNumber_Generation(caseEntity);
        }
    }

    // Case number generation.
    private void CaseNumber_Generation(Entity caseEntity)
    {
        string caseNumber = string.Empty;

        if (caseEntity.LogicalName != "incident" || !caseEntity.Attributes.Contains("customerid"))
        {
            return;
        }

        EntityReference customerId = (EntityReference)caseEntity.Attributes["customerid"];

        // Retrieve the related active account with its account number and case counter.
        QueryExpression customerQuery = new QueryExpression("account");
        customerQuery.ColumnSet = new ColumnSet("new_casecounter", "statecode", "accountnumber");
        customerQuery.Criteria.AddCondition(new ConditionExpression("statecode", ConditionOperator.Equal, 0));
        customerQuery.Criteria.AddCondition(new ConditionExpression("new_casecounter", ConditionOperator.NotNull));
        customerQuery.Criteria.AddCondition(new ConditionExpression("accountid", ConditionOperator.Equal, customerId.Id));
        EntityCollection customerResult = this.service.RetrieveMultiple(customerQuery);

        if (customerResult.Entities.Count == 0)
        {
            return;
        }

        Entity account = customerResult.Entities[0];
        string accountNumber = account.GetAttributeValue<string>("accountnumber");
        string counterValue = account.GetAttributeValue<string>("new_casecounter");

        int caseCounter;
        if (!int.TryParse(counterValue, out caseCounter))
        {
            return;
        }

        caseCounter = caseCounter + 1;

        // Build the case number, e.g. "3-003".
        if (caseCounter >= 1 && caseCounter < 10)
        {
            caseNumber = string.Format("{0}-00{1}", accountNumber, caseCounter.ToString(CultureInfo.InvariantCulture));
        }
        else if (caseCounter >= 10 && caseCounter < 100)
        {
            caseNumber = string.Format("{0}-0{1}", accountNumber, caseCounter.ToString(CultureInfo.InvariantCulture));
        }
        else
        {
            caseNumber = string.Format("{0}-{1}", accountNumber, caseCounter.ToString(CultureInfo.InvariantCulture));
        }

        // This is a pre-create plug-in, so setting the attribute on the target is enough.
        caseEntity["ticketnumber"] = caseNumber;

        // Update the case counter on the account so the next case gets the next number.
        Entity accountRecord = new Entity("account");
        accountRecord.Id = account.Id;
        accountRecord["new_casecounter"] = caseCounter.ToString(CultureInfo.InvariantCulture);
        this.Update(accountRecord);
    }

    // Method for updating any entity.
    private void Update(Entity record)
    {
        this.service.Update(record);
    }
}

2) Customizations in CRM
- Add a counter field to the entity whose values you will use in the Case ID.
- Initialize the counter field to 0.
- Hide the counter field.

Example
Consider an example where the case number is generated through the plug-in. The requirement is to create a case number with respect to the selected account; whenever a new case is created for that account, the counter increases and the Case ID is generated.
- Create an account with the name "Anay Industries". Account Number is the unique identifier of the account created. Case Counter is set to 0 by default; it is a hidden field, and whenever a new case is created the counter increases by 1.
- A case "Auto number generation" is created with the customer "Anay Industries". Anay Industries has Account Number 3 and Case Counter 2, so when this case is created the account number is picked up and the counter increases by 1.
- When the record is created, the Case ID "3-003" is generated, where '3' is the account number and '003' is the case counter.

Conclusion
Although our example was simple, you can generate more complex case numbers as well by applying different logic. We hope this has given you useful information on generating an automated custom case number.
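A quick note on the padding logic in the plug-in above: the two string.Format branches can be collapsed into a single composite format string. This is a minimal sketch, not part of the original post, and it assumes the same accountNumber and caseCounter variables used in the plug-in:

// "D3" pads the counter with leading zeros to three digits, e.g. 3 -> "003" and 42 -> "042".
string caseNumber = string.Format(CultureInfo.InvariantCulture, "{0}-{1:D3}", accountNumber, caseCounter);

With this approach, counters of 100 or more also format correctly without any additional branches.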


Data Movement using Azure Data Factory

Prerequisites: an Azure subscription, SQL Server Management Studio (SSMS), and Azure Explorer.

What is Azure Data Factory?
Data Factory is a cloud-based data integration service that orchestrates and automates the movement and transformation of data. Data Factory works across on-premises and cloud data sources and SaaS applications to ingest, prepare, transform, analyze, and publish your data. You can use Data Factory any time you need to collect data of different shapes and sizes, transform it, and publish it to extract deep insights, all on a reliable schedule.

Key concepts in Azure Data Factory:
- Dataset: identifies data structures within different data stores, including tables, files, folders, and documents.
- Linked Service: defines the information Data Factory needs to connect to external resources.
- Pipeline: groups activities into a unit that together performs a task.
- Activity: defines the actions to perform on your data.
Read more about Azure Data Factory here.

In the example below, we demonstrate a copy data activity from a CSV file stored in Azure Blob Storage to an Azure SQL Database, using the Azure Data Factory editor.

Steps for data movement using Azure Data Factory:

Step 1: Create a storage account and a container in Azure. Place the file containing the data into the container using Azure Explorer or a similar tool.

Step 2: The image below shows the CSV file content and the same file placed in the Azure container using Azure Explorer.

Step 3: Create an Azure SQL Database to store the output data.

Step 4: By connecting SSMS to the Azure SQL Database, we can create the output table in the Azure SQL Database.

Step 5: Now go to the new Azure portal, i.e. portal.azure.com, and create a new Data Factory as shown.

Step 6: We need to create three things to start the data movement: linked services, datasets, and a pipeline. You can start creating them by opening the Azure Data Factory and clicking "Author and deploy".

Step 7: First create the linked service for the Azure SQL Database, and then for Azure Blob Storage. The JSON code for the Azure SQL linked service is given below:

{
  "name": "AzureSqlLinkedService",
  "properties": {
    "description": "",
    "hubName": "adfcf_hub",
    "type": "AzureSqlDatabase",
    "typeProperties": {
      "connectionString": "Data Source=tcp:qbozi5org6.database.windows.net,1433;Initial Catalog=adfcfs;Integrated Security=False;User ID=cfadmin@qbozi5org6;Password=**********;Connect Timeout=30;Encrypt=True"
    }
  }
}

For Azure Blob Storage:

{
  "name": "StorageLinkedService",
  "properties": {
    "description": "",
    "hubName": "adfcf_hub",
    "type": "AzureStorage",
    "typeProperties": {
      "connectionString": "DefaultEndpointsProtocol=https;AccountName=adfcfsstorage;AccountKey=**********"
    }
  }
}

Step 8: Now create the datasets for the source as well as the sink.

For the Azure SQL Database:

{
  "name": "OpportunitySQLTable",
  "properties": {
    "structure": [
      { "name": "OpportunityName", "type": "String" },
      { "name": "Status", "type": "String" },
      { "name": "EstimatedRevenue", "type": "String" },
      { "name": "ContactPerson", "type": "String" }
    ],
    "published": false,
    "type": "AzureSqlTable",
    "linkedServiceName": "AzureSqlLinkedService",
    "typeProperties": { "tableName": "Opportunity" },
    "availability": { "frequency": "Hour", "interval": 1 }
  }
}

For Azure Blob Storage:

{
  "name": "OpportunityTableFromBlob",
  "properties": {
    "structure": [
      { "name": "OpportunityName", "type": "String" },
      { "name": "Status", "type": "String" },
      { "name": "EstimatedRevenue", "type": "String" },
      { "name": "ContactPerson", "type": "String" }
    ],
    "published": false,
    "type": "AzureBlob",
    "linkedServiceName": "StorageLinkedService",
    "typeProperties": {
      "fileName": "Opportunity.csv",
      "folderPath": "adfcontainer/",
      "format": { "type": "TextFormat", "columnDelimiter": "," }
    },
    "availability": { "frequency": "Hour", "interval": 1 },
    "external": true,
    "policy": {}
  }
}

Step 9: Create a pipeline. The JSON code is given below:

{
  "name": "ADFDataCopyPipeline",
  "properties": {
    "description": "Copy data from a blob to Azure SQL table",
    "activities": [
      {
        "type": "Copy",
        "typeProperties": {
          "source": { "type": "BlobSource" },
          "sink": {
            "type": "SqlSink",
            "writeBatchSize": 10000,
            "writeBatchTimeout": "60.00:00:00"
          }
        },
        "inputs": [ { "name": "OpportunityTableFromBlob" } ],
        "outputs": [ { "name": "OpportunitySQLTable" } ],
        "policy": {
          "timeout": "01:00:00",
          "concurrency": 1,
          "executionPriorityOrder": "NewestFirst"
        },
        "scheduler": { "frequency": "Hour", "interval": 1 },
        "name": "CopyFromBlobToSQL",
        "description": "Push Regional Effectiveness Campaign data to Azure SQL database"
      }
    ],
    "start": "2015-11-17T08:00:00Z",
    "end": "2015-11-17T09:00:00Z",
    "isPaused": false,
    "pipelineMode": "Scheduled"
  }
}

Step 10: Now go back to your Data Factory editor, where you can see the status of the different linked services, datasets, and the pipeline you created.

Step 11: Click on "Diagram" and check the status of the slices scheduled for data movement.

Step 12: Once the slices are in the Ready status, you can go back to the Azure SQL Database and check whether the data has been copied.
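As a follow-up to Step 12, the copied rows can also be verified from code instead of SSMS. The following is a minimal, hedged C# sketch that is not part of the original walkthrough; it simply counts the rows in the Opportunity output table, and the server, database, and credential values are placeholders you would replace with the ones from your own Azure SQL linked service:

using System;
using System.Data.SqlClient;

class VerifyCopiedData
{
    static void Main()
    {
        // Placeholder connection string; reuse the values from your Azure SQL linked service.
        string connectionString =
            "Data Source=tcp:<yourserver>.database.windows.net,1433;" +
            "Initial Catalog=<yourdatabase>;User ID=<youruser>;Password=<yourpassword>;Encrypt=True;";

        using (SqlConnection connection = new SqlConnection(connectionString))
        using (SqlCommand command = new SqlCommand("SELECT COUNT(*) FROM Opportunity", connection))
        {
            connection.Open();

            // A non-zero count indicates that the pipeline slice has copied data from the blob.
            int rowCount = (int)command.ExecuteScalar();
            Console.WriteLine("Rows copied into the Opportunity table: {0}", rowCount);
        }
    }
}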


Date/Time fields in Microsoft Dynamics CRM 2015 Update 1

Posted On November 16, 2015 by Admin

The Date and Time data type is used in many scenarios, such as Project Start Date, Project End Date, Date of Birth, and anniversaries. Before the update, CRM stored all date and time values with the user's local time zone information; only the User Local format was available. The system converted the user's local time zone to Coordinated Universal Time (UTC) for backend storage and then converted the date and time back to the local time zone for display on forms.

When the Date and Time data type is selected, you can now choose between different behaviours in CRM 2015 Update 1. The need for this update is that the date-only format sometimes caused confusion, particularly for birthdays and anniversaries: users would sometimes see a different day displayed depending on their local time zone.

The behaviours available in CRM 2015 Update 1 are as follows:
- User Local
- Date Only: always sets the time to 12:00 AM
- Time-Zone Independent: saves the value on the backend without converting the time zone

These date/time behaviours are not present in on-premises installations.

1. User Local behaviour
The field values are displayed in the current user's local time. The formats available for the User Local behaviour are Date Only and Date and Time. Here the Project Start Date field is set as User Local.

You can change a custom entity field's behaviour from User Local to Date Only or to Time-Zone Independent. Before changing the behaviour of a date and time field, review all the dependencies to ensure that there are no issues as a result of changing the behaviour.

2. Date Only behaviour
The field values are displayed without time zone conversion. The time portion of the value is always 12:00 AM, and the date portion is stored and retrieved as specified in the UI and web services. The Date Only behaviour can't be changed to other behaviour types once it is set. The only format available for the Date Only behaviour is Date Only. Here the "Date of Birth" field is set to Date Only.

3. Time-Zone Independent behaviour
The Time-Zone Independent behaviour can't be changed to other behaviour types once it is set. The concept of a time zone isn't applicable to this behaviour. The field values are displayed without time zone conversion, and the date and time values are stored and retrieved as specified in the UI and web services. The formats available for the Time-Zone Independent behaviour are Date Only and Date and Time. Here the "Project End Date" field is set to the Time-Zone Independent behaviour.

Effect of a behaviour change on existing records

1. When the Date of Birth field behaviour is changed to Date Only:
The Date of Birth shown here is 9/15/2015, as entered by the user. While the field still has the User Local behaviour, retrieving it returns 9/14/2015 06:30:00 PM; the visible and retrieved dates differ because the UTC date/time is stored in the database while the UI displays the local time.
After the change, retrieve the value of the Date of Birth field and you will get 9/14/2015; the stored date/time value is 9/14/2015 12:00:00 AM. When the record was originally created, the date entered was 9/15/2015, and you might expect it to show as 9/15/2015. However, since the change is not applied to existing records of this field, they are still stored in the original UTC format along with the time; the UTC date/time is picked up and the time part is simply set to 12:00:00 AM. So check the dependent fields and existing data before changing the behaviour.

2. When the Date of Birth field behaviour is changed to Time-Zone Independent:
Here the visible date and time is 11/16/2015 8:00 AM. While the behaviour is still User Local, retrieving the field returns 11/16/2015 2:30 AM; again, the visible and retrieved values differ because UTC is stored in the database while the UI shows local time.
After the change, retrieve the value of the Date of Birth field and you will get 11/16/2015 2:30 AM. When the record was originally created, the date entered was 11/16/2015 8:00 AM, and you might expect it to show as 11/16/2015 8:00 AM. When the field behaviour is changed to Time-Zone Independent, however, the value stored in the database (UTC) is picked up as-is.
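As an addendum not covered in the original post, the behaviour can also be set when a field is created through the SDK. The following is a hedged C# sketch using the late-bound metadata messages from the CRM 2015 Update 1 SDK; the schema name new_dateofbirth and the target entity are placeholder assumptions:

using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Sdk.Messages;
using Microsoft.Xrm.Sdk.Metadata;

public static class DateOnlyFieldSample
{
    // 'service' is assumed to be an already authenticated IOrganizationService.
    public static void CreateDateOfBirthAttribute(IOrganizationService service)
    {
        CreateAttributeRequest request = new CreateAttributeRequest
        {
            EntityName = "contact",
            Attribute = new DateTimeAttributeMetadata
            {
                SchemaName = "new_dateofbirth",              // placeholder schema name
                DisplayName = new Label("Date of Birth", 1033),
                RequiredLevel = new AttributeRequiredLevelManagedProperty(AttributeRequiredLevel.None),
                Format = DateTimeFormat.DateOnly,            // show only the date part on forms
                DateTimeBehavior = DateTimeBehavior.DateOnly // store the date without time zone conversion
            }
        };

        service.Execute(request);
    }
}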


Enhanced Business Process Flow

Posted On November 5, 2015 by Admin

Business Process Flows were introduced in Microsoft Dynamics CRM 2013. A Business Process Flow in CRM guides users through each step in a defined process so they can clearly see which steps have been completed and what needs to happen next. Each stage in a Business Process Flow can be configured to include the fields that the business would like to have completed for that stage. This list can include any field available on that entity, and these fields can also appear elsewhere on the form. However, in CRM 2013 Business Process Flows were subject to several limitations:
- Strictly linear process: business processes could only run in a linear manner; no branching was allowed.
- An entity could not be revisited more than once within a single business process flow.
- No programmability support.

Enhanced Business Process Flows in CRM 2015
The key improvement to Business Process Flow functionality is the ability to apply conditional logic within a defined process, called branches. Steps and stages can be configured easily where branching rules are defined.
- Selection of entity relationships can be optional.
- Branching supports single-entity and cross-entity processes, and also multiple entity loops.
- An entity used in a Business Process Flow can be revisited multiple times.
- Programmability through the client API: you can programmatically update the process state and hook on to process events.

Let's implement these improvements in an example. Consider a scenario where EazyApp is a fictitious company that sells software as a product. The requests EazyApp receives daily are classified as leads. A lead is qualified if the prospect is interested in evaluating a trial of the software and the budget amount is greater than or equal to $5,000; otherwise, the lead is disqualified. Once a lead is interested in a trial and the budget amount is greater than or equal to $5,000, the lead is qualified and an opportunity is created. Once the trial is completed, a new question is asked: is a quote required? If not, we close the opportunity; if yes, we create and deliver the quote. After that, we offer maintenance for the product: if the customer is interested in maintenance, we update the quote; otherwise we close the opportunity. The following diagram shows a business process flow with branches.

Before designing a Business Process Flow with branches for EazyApp, take note of the following:
- A process can span a maximum of 5 unique entities.
- You can use a maximum of 30 stages per process and a maximum of 30 steps per stage.
- Each branch can be no more than 5 levels deep.
- A branching rule must be based on the steps in the stage that immediately precedes it.
- You can combine multiple conditions in a rule using the AND operator or the OR operator, but not both.
- An entity used in the process can be revisited multiple times (multiple closed entity loops).
- You can go back to the previous stage regardless of entity type. For example, if the active stage is Deliver Quote on a quote record, you can move the active stage back to the Propose stage on an opportunity record.
- Only one process can be active per record.
- Stages can be reordered using the Move Up or Move Down arrows within a branch, but stages can't be moved from one branch to another.

Let's look at the example of a business process flow with branches for EazyApp selling software as a product.
First, we'll create a new process named EazyApp Business Process Flow. Go to Settings > Processes, specify the category as Business Process Flow, and choose Lead as the primary entity. Add the first stage to the process, called Qualify, and add the steps Purchase Timeframe and Is Interested in Trial.

After the common Qualify stage, we split the process into two separate branches by using the If-Else clause.

Notes:
- To add the first branch for a stage, choose Add branch below the stage and specify the If condition.
- To add the second branch for the same stage, choose Add branch again below the same stage; the Else clause will be displayed. You can choose Else to convert it to Else-If if you have more than two branches from the same stage, or if you want to enter a branch only when certain conditions are satisfied.
- Choose the green + (plus) button under the branching rule to add another condition to the rule.
- Choose the + Insert stage button to insert a stage at the beginning of the branch.

If Is Interested in Trial = Yes and Budget Amount is greater than or equal to $5,000, the process branches out to the Trial stage; otherwise, it jumps to the Close stage in the second branch, as shown below. As you can see above, the If clause checks the condition: we specify the field criteria and save the condition. Once it is saved, you can insert a stage on that basis, as shown below. You can also add an Else condition to route the process in another direction when the criteria are not fulfilled, and you can combine multiple conditions using AND/OR, as shown above.

Here, you can also set a relationship with another entity. If no relationship exists, you can set it to None. Likewise, you can create the complete Business Process Flow, and it can be used as shown below.

As you can see below, it currently shows only two stages, Qualify and Close, and the step Is Interested in Trial shows the value No. If you select Is Interested in Trial as Yes and a budget amount of $5,000 or more, the flow changes to include additional stages, as shown below.

Similarly, after qualifying the lead, the process moves to another stage where, if you select Quote Required as Yes, additional stages such as Deliver Quote and Offer Maintenance appear, as shown below. This is how you can include a Business Process Flow with branches in your implementation.


Filters available in AX 2012

Posted On November 2, 2015 by Admin

Filters play a very important role in retrieving data quickly and easily. AX is designed in such a way that filters can be used on all forms. The user can filter the data by providing syntax in the filter field available on the form, or by pressing Ctrl+G. When the user presses Ctrl+G, a new row is created below the column headers of the grid. The following filtering and query options are available when you use embedded filters or queries.

- Value: equal to the value entered. Type the value to find. Example: Alex finds "Alex".
- !Value: not equal to the value entered. Type an exclamation mark in front of the value to exclude. Example: !Alex finds all values except "Alex".
- From-value..To-value: between the two values entered, separated by two periods. Type the From value, then two periods, and then the To value. Example: 10..30 finds all values from 10 to 30.
- ..Value: less than or equal to the value entered. Type the two periods and then the value. Example: ..50 finds any number less than or equal to 50.
- Value..: greater than or equal to the value entered. Type the value and then the two periods. Example: 50.. finds any number greater than or equal to 50.
- >Value: greater than the value entered. Type a greater than sign (>) and then the value. Example: >20 finds any number greater than 20.
- <Value: less than the value entered. Type a less than sign (<) and then the value. Example: <50 finds any number less than 50.
- Value*: starting with the value entered. Type the starting value and then an asterisk. Example: S* finds any string that starts with S.
- *Value: ending with the value entered. Type an asterisk and then the ending value. Example: *ltd finds any string that ends with ltd.
- *Value*: containing the value entered. Type an asterisk, then a value, and then another asterisk. Example: *pvt* finds any string that contains pvt.
- ?: one or more unknown characters. Type a question mark at the position of each unknown character in the value. Example: Sa??abh finds "Saurabh".
- Value,Value: matching the values entered, separated by commas. Type all your criteria separated by commas. Example: 30,80 finds exactly "30" and "80".
- T: today's date. Type "T" and press Tab/Enter, and it will bring up today's date.

The syntax above is very useful in day-to-day activities in AX. Users can also combine criteria by using the "&" syntax.


Connect to an external database using X++ code in AX 2012

Posted On November 2, 2015 by Admin

Below are the steps to be performed.

To connect to an external database using an ODBC connection, first create a DSN. To create a DSN, refer to the link https://support.microsoft.com/en-us/kb/300596

Then write the code below. Note: the database login credentials are given in the DSN. You can use Windows credentials or provide SQL credentials, if required.

The following lines of code initialise the connection to the database:

LoginProperty loginProperty;
OdbcConnection odbcConnection;
Statement statement1;
ResultSet resultSet1;
SqlStatementExecutePermission perm1;
str sqlStmt1;

loginProperty = new LoginProperty();
loginProperty.setDSN(/* your DSN name here */);
loginProperty.setDatabase(/* your database name here */);
odbcConnection = new OdbcConnection(loginProperty);

The following lines execute the query and store the data in a ResultSet:

sqlStmt1 = strFmt("select * from ABC"); // write your query here
statement1 = odbcConnection.createStatement();
perm1 = new SqlStatementExecutePermission(sqlStmt1);
perm1.assert();
resultSet1 = statement1.executeQuery(sqlStmt1);

We can also execute stored procedures in the external database. Write the code below to do so:

odbcConnection = new OdbcConnection(loginProperty);
sqlStmt1 = strFmt("exec [Stored Procedure name]");
perm1 = new SqlStatementExecutePermission(sqlStmt1);
perm1.assert();
statement1 = odbcConnection.createStatement();
statement1.executeUpdate(sqlStmt1);
CodeAccessPermission::revertAssert();

If you execute more than one stored procedure in the same code, you may get the error below:

"SQL error description: [Microsoft][ODBC SQL Server Driver]Connection is busy with results for another hstmt"

To eliminate this error, write the statement below after every stored procedure execution:

statement1.close();

If there are multiple external databases on different servers, you can create a separate DSN to connect to each server. This connection information can be stored in a master table containing the DSN name and the database name. The X++ code can then use the connection details from the master table, as shown below:

loginProperty = new LoginProperty();
loginProperty.setDSN(masterTable.dsnName);
loginProperty.setDatabase(masterTable.databaseName);


Azure setup using Office 365

In this blog we walk through how to set up Azure using Office 365.

Prerequisite: an Office 365 administrator account.

Steps:
1. Log in to the Office 365 portal. Navigate to https://portal.office.com
2. Click on the Admin button.
3. Click on Azure AD to set up Azure. This will link your Azure AD to the organization account. Note: don't use the admin account to set up Azure AD; instead, you can use a client account. Once Azure AD is set up, the account administrator cannot be changed.
4. Fill in the required details to set up a free Azure trial account. Note: a credit card is required for Azure sign-up.
5. After the sign-up process is complete, navigate to https://manage.windowsazure.com to access Windows Azure.


Power BI updates

Posted On October 26, 2015 by Admin

The purpose of this blog is to showcase the latest Power BI updates.

Prerequisites: the Power BI Desktop tool, Power BI Personal Gateway, and a Power BI online service account.

Purpose of the setup: to learn about the latest Power BI updates from October. Here we will look at the following updates:
- Read-only members in Power BI groups
- Semi-select support for DAX formulas in Data view

Read-only members in Power BI groups
Power BI has introduced a very good feature in the Power BI online service: the user can create a group in the Power BI online account with multiple users, and these users can be assigned the role of Admin or Member. Let's have a look.

With Power BI groups, you and your colleagues can come together to collaborate, communicate, and connect with your data. You can create a group either in Power BI or in Office 365, and then invite co-workers into the group workspace, where they can collaborate on the organization's shared dashboards, reports, and datasets. Up until now, all members had all the rights. Many of us needed read-only membership for users we want to keep up to date but who should not have edit permissions on our dashboards and reports. This update does exactly that; let's see how.

To begin, we create our group by clicking the "+" icon next to Group Workspaces. Now let's add a few users to the group. Power BI groups have two roles: Admin and Member. Here we will set one user to Admin and the remaining users to Member, and click Save to create the group.

Now let's look at the difference when an Admin and a Member log in to the Power BI online account and use the created group. As you can see, when a person with Admin rights logs in, that user has options such as Explore, Schedule Refresh, Rename, and Delete available. A person with Member rights can only view the reports and dashboards; members of the group do not have dataset access and thus cannot perform any edit operation. We now also have the option to quickly switch a user from Member to Admin and vice versa, which is very helpful.

Semi-select support for DAX formulas in Data view
Power BI has introduced another very good feature in the Power BI Desktop tool: the user can create a new column with a DAX function by directly clicking on a column. Let's have a look. First we open the Data view screen. Before this update, the user had to manually type the table and column names to write a DAX formula. With this update, the user can simply click on a column name and build the DAX formula as required.


Creation of ACS and SAS in Azure

ACS (Access Control Service) is an Azure service that provides an easy way to authenticate users accessing web applications and services without having to add complex authentication logic to your code. SAS (Shared Access Signature) is used to access resources such as a Service Bus entity or a storage account; each shared access policy includes both a primary and a secondary key.

Assumptions
The Azure account should already be added in PowerShell with the respective user's credentials. Note: for adding an account in Microsoft Azure PowerShell, refer to the following link: https://www.cloudfronts.in/azure-console-login-logout-using-azure-powershell/

Steps in Microsoft Azure PowerShell for ACS

Step 1: Run the ACS command in PowerShell. An ACS namespace can be created using the following Azure PowerShell command:

New-AzureSBNamespace GravityDocument -Location "Southeast Asia" -CreateACSNamespace $true -NamespaceType Messaging

The command requires the Service Bus namespace name, the location, and the namespace (messaging) type.

Step 2: ACS information on the Azure portal. The ACS key information can be seen in the Microsoft Azure account under the corresponding Service Bus namespace provided in the command above. Once the namespace is created, the corresponding connection details are available at the bottom under Connection Information.

Steps in Microsoft Azure for SAS
For the SAS key, we create a queue inside the namespace.

Step 1: Creation of the queue. A queue can now be created inside the specified namespace; follow the screenshots below and specify the required details, i.e. the queue name under the specified namespace.

Step 2: Key with permissions. Now that the queue is created, a SAS key can be generated with different permissions, such as Manage, Listen, and Send. Under the Configure option, under Shared Access Policies, specify the policy name and the permissions to be given for that particular queue. The SAS key for the queue can then be obtained from the queue's Connection Information.

Conclusion
Thus, we can create ACS and SAS credentials as per our requirements using Microsoft Azure PowerShell and the Azure portal.
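As an addendum not covered in the original post, a SAS token for the queue can also be generated in code rather than copied from the portal. The following is a hedged C# sketch that builds a Service Bus SAS token from a shared access policy name and key using the documented HMAC-SHA256 signing scheme; the namespace, queue, policy name, and key values are placeholders:

using System;
using System.Globalization;
using System.Net;
using System.Security.Cryptography;
using System.Text;

class SasTokenSample
{
    // Builds a "SharedAccessSignature sr=...&sig=...&se=...&skn=..." token for a Service Bus resource.
    static string CreateSasToken(string resourceUri, string keyName, string key, TimeSpan ttl)
    {
        // Expiry is expressed as seconds since the Unix epoch.
        long expiry = Convert.ToInt64(
            (DateTime.UtcNow.Add(ttl) - new DateTime(1970, 1, 1, 0, 0, 0, DateTimeKind.Utc)).TotalSeconds);

        // The string to sign is the URL-encoded resource URI plus the expiry, separated by a newline.
        string stringToSign = WebUtility.UrlEncode(resourceUri) + "\n" + expiry.ToString(CultureInfo.InvariantCulture);

        using (HMACSHA256 hmac = new HMACSHA256(Encoding.UTF8.GetBytes(key)))
        {
            string signature = Convert.ToBase64String(hmac.ComputeHash(Encoding.UTF8.GetBytes(stringToSign)));
            return string.Format(CultureInfo.InvariantCulture,
                "SharedAccessSignature sr={0}&sig={1}&se={2}&skn={3}",
                WebUtility.UrlEncode(resourceUri), WebUtility.UrlEncode(signature), expiry, keyName);
        }
    }

    static void Main()
    {
        // Placeholder values; use your own namespace, queue, policy name, and key.
        string token = CreateSasToken(
            "https://gravitydocument.servicebus.windows.net/myqueue",
            "SendPolicy",
            "<shared-access-key>",
            TimeSpan.FromHours(1));

        Console.WriteLine(token);
    }
}

The resulting token can be passed in the Authorization header of requests to the queue, with the permissions determined by the shared access policy the key belongs to.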

