
Reduce Storage Usage for Business Central using Data Administration

Introduction

By default, Business Central comes with 80 GB of storage capacity across three sandbox environments and one production environment, with an additional 3 GB per Premium license, 2 GB per Essentials license, and 1 GB per Device license. Depending on your business volume, these limits may run out if the data is not managed properly. Business Central now comes with a one-stop view where you can manage (compress or delete) entries to reduce storage usage: "Data Administration."

Pre-requisites

Business Central Cloud/On-Premises

References

Manage Storage by Deleting Documents or Compressing Data – Business Central | Microsoft Learn

Configuration

In Business Central, we've had the option to view capacity usage from the Admin Center for a while now. Recently, Microsoft also added a one-stop view to check and manage capacity usage from within Business Central itself: Data Administration. It can be found directly from the global search. The first time we open this view it is empty; the data is loaded after we click Refresh. You can also configure it so that the data is refreshed automatically in the background at a set interval.

Here, we get the Data Cleanup options, where we can delete data that isn't required anymore. Each of these options opens a similar processing report where you can set filters that determine which records are deleted. The "Delete Detached Media" action opens another page, which I've discussed in depth in another blog.

The second action group holds actions that compress ledger entries, which can drastically reduce the storage space used. It is important to note that you can only compress entries yourself if they are older than five years, belong to closed fiscal years, and are themselves closed (Open is set to false). You can configure the compression so that there is one entry per day, week, month, quarter, or year, or one entry for the whole period defined for compression. You can also delete empty registers from here.

If these individual actions seem overwhelming, Microsoft also provides a Data Administration wizard, which simplifies the process and lets you manage capacity step by step.

Conclusion

Thus, we saw how we can use the standard data administration tools to manage the capacity of a Business Central environment, which can help the system run much more efficiently in terms of both performance and cost. We hope you found this article useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com
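As an addendum for developers: the compression that Data Administration performs can also be launched from AL. Below is a minimal sketch, assuming the standard "Date Compress General Ledger" report (object names and IDs may vary across versions); the page extension target, object ID, and action name are illustrative, and the same eligibility rules (closed entries in closed fiscal years) still apply.

```al
// A hedged sketch: expose the standard date-compression report,
// which the Data Administration page wraps, as a page action.
pageextension 50100 "Data Admin Shortcut" extends "Chart of Accounts"
{
    actions
    {
        addlast(Processing)
        {
            action(RunDateCompression)
            {
                ApplicationArea = All;
                Caption = 'Compress G/L Entries';
                ToolTip = 'Opens the request page of the standard date compression report.';

                trigger OnAction()
                begin
                    // Opens the request page, where the period, retention
                    // fields, and compression granularity are chosen.
                    Report.Run(Report::"Date Compress General Ledger");
                end;
            }
        }
    }
}
```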


How to create: Azure Blob Storage, Container and Blob

Posted On May 14, 2024 by Bhavika Shetty

Microsoft Azure provides a cloud-based storage service called Azure Blob Storage. It is made up of blobs, which are files kept as individual units and arranged, like files in folders, inside containers. Typical uses of Azure Blob Storage include serving images or documents directly to a browser, storing files for distributed access, streaming video and audio, and storing data for backup, restore, and archiving.

Steps to Create Azure Blob Storage

STEP 1: Access the Azure Portal. Before proceeding, please confirm that you have a subscription. If you created a free account for the first time, you will already have a one-month free trial. Note: To learn more about how to obtain a free Azure account, click on "Azure free account" to create a Free Trial account.

STEP 2: Setting up the Storage Account is the first and most important step in creating Blob Storage. Go to the Azure portal and select "Storage Accounts" to start the creation process.

STEP 3: After clicking on Storage Accounts, the following screen will appear; click on "+ New" to proceed further.

STEP 4: After selecting New, you'll be prompted to provide the required information on the next page. After you have entered all the information, click "Create."

STEP 5: As seen in the sample below, an Azure Storage Account offers four different kinds of redundancy storage. For this demo, Geo-redundant Storage (GRS) will be used.

STEP 6: The next screen displays the deployment status after you click the "Create" button. Once deployment is finished, select "Go to resource."

Steps to Create a Container

STEP 1: To create a new container, click on "+ Container".

STEP 2: After selecting Add Container, a form requesting the container's name (which must be unique) and access level will appear. We have chosen Blob public-level access for the demo. Select "Create" to continue.

STEP 3: As a result, the blob storage has been created successfully, with a container named "demo".

Steps to Create a Blob

STEP 1: Click on the container "demo".

STEP 2: Under Overview, a blob can be uploaded. The connection string can then be found by selecting the Storage Account and clicking on "Access keys." These connection strings are used to communicate with the Storage Account.

Conclusion

Azure Blob Storage features integration with other Azure services, built-in security safeguards, and accessibility through a variety of tools and APIs. With multiple storage tiers, it provides a durable and scalable storage solution that satisfies the demands of diverse applications in terms of both cost and performance, making it a solid and affordable option for companies looking to store and manage unstructured data in the cloud. We hope you found this article useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com
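As an addendum: once the storage account and container exist, they can also be reached from Business Central AL. Below is a minimal sketch, assuming the "Azure Blob Services API" (ABS) module in the System Application of recent Business Central versions; the account name 'mystorageaccount', the container 'demo', the blob name, and the placeholder access key are illustrative values you would replace with the ones from the Access keys blade.

```al
codeunit 50101 "Upload Demo Blob"
{
    procedure UploadHelloBlob()
    var
        ABSBlobClient: Codeunit "ABS Blob Client";
        StorageServiceAuthorization: Codeunit "Storage Service Authorization";
        Authorization: Interface "Storage Service Authorization";
    begin
        // Build a Shared Key authorization from the key shown under "Access keys".
        Authorization := StorageServiceAuthorization.CreateSharedKey('<storage-account-access-key>');

        // Point the client at the storage account and the "demo" container created above.
        ABSBlobClient.Initialize('mystorageaccount', 'demo', Authorization);

        // Upload a simple text file as a block blob.
        ABSBlobClient.PutBlobBlockBlobText('hello.txt', 'Hello from Business Central!');
    end;
}
```

Shared Key authorization keeps the sketch short; for production use, a SAS token scoped to just this container is usually the safer choice.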


How to Set Up Alternative Units of Measure in Business Central

Introduction: Many businesses support buying and selling items in different units of measure. How do we configure this in Business Central without showing the inventory in decimals? Let's take an example: we will always buy the goods in DOZEN, whereas we can sell the items in PCS or CARTONS.

Steps to achieve the goal:

1. On the Item Card, set the Base Unit of Measure to PCS.

2. Once the base unit of measure is set to PCS, go to the Item Units of Measure page by clicking Related > Unit of Measure.

3. By default, the PCS line is set to 1. Add a new line, enter DOZEN, and enter 6 as the quantity per unit of measure, which means 1 DOZEN holds 6 PCS in this setup.

4. On the next line, add CARTONS and enter 72 as the quantity per unit of measure, which means 1 CARTON holds 72 PCS (see the AL sketch at the end of this post).

5. Once the above lines have been set, let's post an item journal where we increase the inventory in DOZEN and sell the items in PCS and CARTONS. Ideally, the inventory will be shown without any decimal value.

6. Below is an explanation of how the system converts to PCS when you buy and sell in different units.

7. Purchase 100 DOZEN, which is 600 PCS. This is converted by Business Central itself, as we have defined 1 DOZEN as 6 PCS.

8. Sell 7 CARTONS, which is 504 PCS. This is calculated by Business Central itself as per the Item Unit of Measure configuration, where 1 CARTON is 72 PCS.

9. Once the above transactions are posted, the inventory is a whole value without any decimals.

Conclusion: Thus, we saw how we can use alternative units of measure in Business Central. We hope you found this article useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com
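Here is the AL sketch referenced above: a minimal, hedged reconstruction of the same configuration, assuming a hypothetical item ITEM001 whose Base Unit of Measure is PCS, and assuming the DOZEN and CARTON codes already exist in the Unit of Measure table.

```al
codeunit 50102 "Setup Alternative UoM"
{
    procedure AddAlternativeUnits()
    var
        ItemUoM: Record "Item Unit of Measure";
    begin
        // 1 DOZEN = 6 PCS in this setup, so buying 100 DOZEN posts as 600 PCS.
        ItemUoM.Init();
        ItemUoM.Validate("Item No.", 'ITEM001');
        ItemUoM.Validate(Code, 'DOZEN');
        ItemUoM.Validate("Qty. per Unit of Measure", 6);
        ItemUoM.Insert(true);

        // 1 CARTON = 72 PCS, so selling 7 CARTONS consumes 504 PCS.
        ItemUoM.Init();
        ItemUoM.Validate("Item No.", 'ITEM001');
        ItemUoM.Validate(Code, 'CARTON');
        ItemUoM.Validate("Qty. per Unit of Measure", 72);
        ItemUoM.Insert(true);
    end;
}
```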


Advanced Warehouse Management – Loads and Location Directives in Microsoft D365 F&O – Part 3

Hello everyone! In this blog series we are learning about Advanced Warehouse Management in D365. In this post, we will cover the basic setups required for the Advanced Warehouse Management process; these setups may vary depending on the business scenario. Continuing from https://www.cloudfronts.in/blog/advance-warehouse-management-item-creation-part-1/ and https://www.cloudfronts.in/blog/advance-warehouse-management-warehouses-and-locations-in-microsoft-d365-fo-part-2/, this is part 3 of the series.

The following are the setups we need to configure:

Load: Loads are useful when we group multiple shipments, so you can consider a load as an object that will be used to transport the material.

Path: Warehouse Management > Setup > Load > Load Templates

I have created two containers here as loads: one is a 20 ft container, and the other is a 40 ft container.

Location Directives: Location directives play a significant role in inventory movement in advanced warehouses. They are the set of rules that define pick and put, counting, license plate building, status changes, quality checks, and so on, for an individual warehouse or a group of warehouses.

For my current scenario, I will create a location directive for a Purchase Order transaction; in further blogs I will cover other transactions as well.

Select the Work Order type as Purchase Order, and select the work type as "Put." For the receipt location, I have specified the default receipt location from the warehouse master. To set the default receipt location, go to Warehouse Management > Setup > Warehouse > Warehouses and select the default receipt location from the drop-down.

Select the warehouse for which this "Put" rule is going to apply. You can also group warehouses so that they share the same rule. In the lines, I have specified the from and to quantities.

For the location directive action, I am using "Only fixed locations for the product." With this, the system will ask for the location during the put-away operation on the Purchase Order.

Now the loads and location directives are ready to use in the Advanced Warehouse process. That's it for this blog! How to use these loads and location directives in an actual transaction will be discussed later in the series. Keep learning!

Next in the blog series:

How to create Work Classes and Work Templates in Advanced Warehouse Management in D365.

How to set up a Worker in Advanced Warehouse Management in D365.

We hope you found this article useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com


Item Availability Overview – A quick glance at the Item’s Inventory levels

While going through some sales documents, I noticed that the page that appears when I click on "Show Details" in the low-inventory notification has been updated! When we click on "Show Details" now, we're taken to a page named "Item Availability Check". Furthermore, it includes options to directly create a Purchase Order or a Purchase Invoice from this page.

If a vendor is specified in the "Vendor No." field of the Item Card, the Purchase Order/Invoice is automatically created for that vendor. Where multiple vendors are set up in the Item Vendor Catalog instead of the Vendor No., all the vendors are displayed, and the one selected by the user is used to create the Purchase Order/Invoice. In both cases, the purchase line will reflect the shortfall as the quantity.

If the item has any substitutes available, the "Substitute Exists" field indicates this, and clicking on it opens the Item Substitutions page. Further, if you click on "All Locations", the "Item Availability by Location" page opens.

That's all! Just wanted to share something new I learned recently. We hope you found this article useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com


How to Cancel a Transfer Order Shipment in D365 Finance and Operations?

In D365 Finance and Operations, managing transfer orders is a crucial aspect of maintaining efficient inventory operations. However, there may be instances where you need to cancel a transfer order shipment for various reasons, such as changes in demand, inventory discrepancies, or operational adjustments. In such cases, it is essential to understand the process of cancelling the transfer order shipment to ensure accurate inventory management and smooth operations.

Here are the steps to cancel a transfer order shipment in D365 Finance and Operations:

Go to Inventory Management > Outbound Orders > Transfer Order. Here, I have already created a Transfer Order in the Created state, transferring items from warehouse W1 to warehouse W2.

The next step is to ship the Transfer Order. In the screenshot below you can see that the Transfer Order has been shipped, and as a result the corresponding transactions are posted.

Now, to cancel the Transfer Order shipment: on the Transfer Order tab, click on Transfer Order History, which is under the View action group. Here you can see the Transfer Order shipment; select it and then click the Cancel button at the top of the screen. The system will automatically reverse the transaction. You will see that a reversing entry is posted with a negative quantity, and there is a check mark in the Cancelled Shipment column.

Now, if I go to Transactions, I can see that the entry has been reversed and the Transfer Order status has reverted to Created.

Note: A Transfer Order that has already been received cannot be reversed by this process; only transfer orders in the Shipped state can be cancelled or reversed. Depending on your system configuration, you may need to manually adjust the inventory.

That's it for this blog! Hope that helps, thank you for reading!


Azure Blob Storage: Features, Benefits, And Usage

Posted On April 16, 2024 by Bhavika Shetty

Azure Blob Storage Overview

These days, data is viewed as the most crucial aspect of doing business, because it is used to draw insights, make business decisions, and plan future business strategy by understanding the behavior of a targeted audience. In most organizations this data comes from different sources; it can be expensive to store and a challenge to manage, as most of it is unstructured. To tackle this, organizations can consider a blob storage account, as it provides comprehensive support for unstructured data workloads on a single modern platform.

By using blob storage, a company can store massive amounts of data inexpensively and make the most of what it has: the service is scalable, durable, secure, and capable of handling demanding workloads, which means it can meet any capacity requirement. It allows us to protect and manage data with ease, so a company can store binary data, application data, videos, audio files, and more, knowing the data is well secured. Blob storage is built from the ground up to support the scale, security, and availability requirements of mobile, web, and cloud application developers, using the most popular developer frameworks.

Types of Blobs: Azure Blob Storage supports three types of blobs, which can be chosen based on requirements: block blobs (for text and binary data), append blobs (optimized for append operations such as logging), and page blobs (for random-access files such as virtual disks).

Access Tiers: Blob data can be stored in different access tiers, such as hot, cool, and archive, to balance storage cost against access cost and latency.

Benefits of Blob Storage: Scalability, durability, security, and cost-effectiveness, along with integration into the wider Azure ecosystem.

Components of Azure Blob Storage: The minimal technical components needed to use Azure Blob Storage are a storage account, a container, and the blob itself.

Integration options with other systems: Azure Blob Storage can be connected to other Azure services and external systems; you can utilize it as a scalable and adaptable storage solution and investigate integration options based on your unique requirements and use cases.

Conclusion: Azure Blob Storage is a powerful and flexible cloud storage option that offers several benefits. With multiple storage tiers, it provides a durable and scalable storage solution that satisfies the demands of diverse applications in terms of both cost and performance. In addition, it features integration with other Azure services, built-in security safeguards, and accessibility through various tools and APIs, making it a solid and affordable solution for companies looking to store and manage unstructured data in the cloud.

In the next blog, we will explore Azure Blob Storage in detail, covering how to create the Azure Blob Storage components: storage account, container, and blob.


Actionable Error Messages in Business Central

Introduction

Error handling is an important concept in every technical field. It helps programs deal with unexpected problems and mistakes smoothly, makes sure software works reliably and doesn't crash unexpectedly, and helps developers find and fix issues quickly, making the software better for users. Plus, it gives users clear messages when something goes wrong, making their experience smoother. It shows that the team has considered the scenario and has measures in place for it, indicating a well-designed solution. Microsoft has an excellent document that lists the things to keep in mind for writing resilient code.

In Business Central, we have try functions to handle errors and the error function to show those errors to users. In this blog, we'll learn how to enhance error messages so that users can resolve errors themselves, or at the very least so that we can point them towards where the error is.

Pre-requisites

Business Central OnPrem/Cloud

References

Actionable errors, Try Methods for Error Handling, Robust Coding Practices, ErrorInfo – Business Central Docs

Explanation

Before we get to the code, let's set a little context. For error handling, Microsoft defines two categories in Business Central. ErrorInfo is a data type used for error handling and reporting: it holds information about errors that occur during the execution of code, and it has additional properties and actions that can be used to define its behavior towards the end user. The most useful ones are the Message, DetailedMessage, and Title properties, the RecordId, PageNo, and SystemId properties, and the AddAction and AddNavigationAction methods.

The "AddAction" method takes a caption, a codeunit, and a method name as input. To pass input into the handler, we add an ErrorInfo object as a parameter to the method, and if we want to specify details of the record where the error is happening, or where the fix is to happen, we can set the record-related properties mentioned above. The "AddNavigationAction" method only takes a caption as input, so to tell the action which page and which record to open, we set the PageNo and SystemId properties. If you are passing the Page No. and System Id to the procedure that handles the error, the same can be accessed there as well.

Code

Here, I've taken a sample scenario where the value of one field depends on the value of another field on the Sales Order. Basically, if the "Type" field is blank, then "Some Important Field" must be blank as well, and if the "Type" field is filled, the field must not be blank. I've set it up so that these validations are triggered when the Sales Order is posted. The same pattern applies to the "Not Blank" scenario, so I'm not writing it out here.

So, if I try the second scenario, where Type is blank and the field has some value, we get an error message. If I click on "Copy Details", I can see the detailed message that I added for this ErrorInfo. If I click on the "Make Mandatory Field Blank" action, then "Some Important Field" is cleared. The code behind the action, along with the validation that raises the error, is shown in the sketch below. I've used messages to confirm that the values I passed at the origin of the error flow into the procedure.

Now, some of you might be wondering: if this was an error message where one field depended on another, shouldn't it have been a validation? And yes, that is correct. In that variant, I've used both "AddAction" and "AddNavigationAction" on the ErrorInfo, with all the record parameters pointing to the Customer, which opens the Customer Card for the specified Customer.
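Since the original code screenshots are not reproduced here, below is a minimal sketch of the pattern the post describes. The fields "Some Type" and "Some Important Field" are hypothetical stand-ins for the custom fields in the scenario (assumed here to be text fields), and the object names and IDs are illustrative.

```al
codeunit 50103 "Sales Error Actions"
{
    // Validation that raises the actionable error; called, for example, on posting.
    procedure CheckImportantField(SalesHeader: Record "Sales Header")
    var
        ErrInfo: ErrorInfo;
    begin
        // "Some Type" and "Some Important Field" are hypothetical custom fields.
        if (SalesHeader."Some Type" = '') and (SalesHeader."Some Important Field" <> '') then begin
            ErrInfo.Message := 'Some Important Field must be blank when Type is blank.';
            ErrInfo.DetailedMessage := 'Some Important Field depends on Type. Clear the field or fill in Type first.';
            // Record details that flow into the action handler and the navigation action.
            ErrInfo.RecordId := SalesHeader.RecordId;
            ErrInfo.SystemId := SalesHeader.SystemId;
            ErrInfo.PageNo := Page::"Sales Order";
            ErrInfo.AddAction('Make Mandatory Field Blank', Codeunit::"Sales Error Actions", 'MakeMandatoryFieldBlank');
            ErrInfo.AddNavigationAction('Open Sales Order'); // uses PageNo and SystemId set above
            Error(ErrInfo);
        end;
    end;

    // Target of AddAction: a global procedure that receives the ErrorInfo.
    procedure MakeMandatoryFieldBlank(ErrInfo: ErrorInfo)
    var
        SalesHeader: Record "Sales Header";
    begin
        // The SystemId set when the error was raised is available here.
        SalesHeader.GetBySystemId(ErrInfo.SystemId);
        SalesHeader."Some Important Field" := '';
        SalesHeader.Modify(true);
    end;
}
```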
Conclusion

You can refer to the "Actionable Errors" documentation for best practices and patterns on which type of actionable error to use where. Thus, we learned how to utilize actions within error messages in Business Central to assist users in resolving errors more effectively. We hope you found this article useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com.


What is “Database Wait Statistics” in Business Central?

Introduction: A "wait" typically refers to the amount of time during which a database session waits for an event to complete before it can proceed with execution. Waits can arise for many reasons in a database system, and understanding them is important for tuning and optimizing performance.

Explanation: Waits in SQL are broadly grouped into three categories:

Resource Waits: These happen when a worker needs access to a resource, such as data or system resources, but it's not available because another worker is using it. Examples include waiting for locks, system latches, or for data to be read from the network or disk.

Queue Waits: These occur when a worker is waiting for a task to be assigned to it, like waiting in line for a job to do. This commonly occurs with system tasks like deadlock detection or cleaning up deleted records. Even if there's no immediate task, workers might still check periodically.

External Waits: These occur when a worker is waiting for something outside the SQL Server environment to finish, like a call to an external procedure or a query to a linked server. It's important to note that a worker in an external wait isn't necessarily idle; it might be actively running external code.

In the context of Business Central, we see the following wait types:

Buffer IO: Occurs when a session is waiting for data to be read from or written to the buffer cache, an area of memory used to cache data pages from disk.

Buffer Latch: Occurs when a session is waiting to acquire a latch on a buffer in memory. Latches protect access to in-memory data structures, and buffer latch waits can occur when multiple sessions contend for the same buffer.

Compilation: Occurs when a session is waiting for a SQL query or stored procedure to be compiled and optimized by the database engine.

CPU: Occurs when a session is waiting for CPU resources to become available for query processing.

Idle: Occurs when a session is not actively performing any work and is waiting for something to do.

Latch: As mentioned earlier, occurs when a session is waiting to acquire a latch on a data structure in memory.

Lock: Occurs when a session is waiting for a lock on a resource that is held by another session.

Memory: Occurs when a session is waiting for memory resources to become available, including waits for memory allocations, deallocations, or other memory-related operations.

Network IO: Occurs when a session is waiting for data to be sent or received over a network connection.

Other: Waits that don't fit into the other specific categories listed.

Other Disk IO: Similar to Buffer IO waits, but encompassing disk-related operations beyond buffer reads and writes.

Parallelism: Occurs when a session is waiting for other parallel threads to complete their tasks.

Preemptive: Occurs when a session is waiting for an external operation to complete, such as an operating system call.

Service Broker: Occurs when a session is waiting for a message to be sent or received via the Service Broker feature in SQL Server.

SQL CLR: Occurs when a session is waiting for a Common Language Runtime (CLR) operation to complete.

Tran Log IO: Occurs when a session is waiting for data to be read from or written to the transaction log.

Transaction: Occurs when a session is waiting for a transaction to complete.

User Wait: General-purpose waits that occur when a session is waiting for some user-defined event.

Worker Thread: Occurs when a session is waiting for a worker thread to become available for query processing.

Conclusion: Thus, we saw how we can use the "Database Wait Statistics" page in Business Central to identify performance bottlenecks in the system. We hope you found this article useful, and if you would like to discuss anything, you can reach out to us at transform@cloudfronts.com.


Use Database Access Intent List to Boost Performance in Business Central

Introduction

For any business application, database replication is a necessity for the application to be highly available, fault tolerant, and performant without data throughput issues. Business Central follows suit, utilizing a replication technique known as "Read Scale-Out" (a leader/follower replication architecture). Business operations (codeunits, pages, POST/PUT/DELETE API calls) that create data in the system are relatively quick compared to analytical operations (reports, queries, GET API calls) that read large amounts of data from many tables at once. Performing both business and analytical operations on the same database can therefore cause performance issues, as tables can be locked by an analytical operation while a business operation tries to access or modify that data.

A solution for this is using multiple copies of the database in a leader/follower architecture. All write transactions are directed to the leader database, which then forwards them to the follower databases. Read transactions can be served by either the leader or a follower database. Please note that this only happens for production environments; sandbox environments only have the primary database.

Side Note

If you're wondering what happens when a user reads from a follower database before the leader has sent the updated information there (this is called a stale replica): it is an accepted risk of this architecture. According to the CAP theorem, only two of the three properties Consistency, Availability, and Partition Tolerance can be guaranteed. Partition tolerance has to be kept, as network failures are inevitable, so most systems choose between Consistency and Availability. Most RDBMS systems choose Consistency over Availability (as does Business Central), and most NoSQL databases choose Availability over Consistency.

Pre-requisites

Business Central Cloud/OnPrem

Explanation

Setting the property DataAccessIntent to ReadOnly doesn't guarantee that all the operations a particular object performs are routed via the replica database. For example, consider a processing report that updates a field on the Item table based on calculations done using a Query object. Because the processing report intends to update the Item table, the operation is directed to the primary database from the start, and when the Query is executed to generate the necessary value, the database used is still the primary database. To summarize, the database is not switched in the middle of a transaction.

For API pages where we will only fetch data from Business Central, we have to set the API page's Editable property to false; only then can we set DataAccessIntent to ReadOnly. This property is not available for any other page type. For reports, we can set the DataAccessIntent property directly, and if a processing report tries to make any modifications to the data, we end up with a run-time error. For queries, we can set the DataAccessIntent property directly as well, under the same conditions as the report object; in effect, though, the only time queries benefit from the replica database is when they are used directly as APIs. Almost all OData GET requests are directed to the replica database by default in Business Central on Cloud.
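As a minimal sketch, here is how the property looks on a query and on a read-only API page; the object names, IDs, and API publisher/group/version values are illustrative.

```al
query 50104 "Customer Read Query"
{
    DataAccessIntent = ReadOnly; // direct reads to the replica database

    elements
    {
        dataitem(Customer; Customer)
        {
            column(No; "No.") { }
            column(Name; Name) { }
        }
    }
}

page 50104 "Customer Read API"
{
    PageType = API;
    APIPublisher = 'demo';
    APIGroup = 'readonly';
    APIVersion = 'v1.0';
    EntityName = 'customer';
    EntitySetName = 'customers';
    SourceTable = Customer;
    DelayedInsert = true;
    Editable = false;            // must be false before ReadOnly is allowed on an API page
    DataAccessIntent = ReadOnly; // GET requests against this page use the replica

    layout
    {
        area(Content)
        {
            repeater(Records)
            {
                field(no; Rec."No.") { }
                field(name; Rec.Name) { }
            }
        }
    }
}
```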
In the on-premises environment, there is a server setting, "ODataReadonlyGetEnabled", that controls this behaviour. Further, there is a list page in Business Central, "Data Access Intent List", which can be used to modify the data access intent of any page, query, or report object. The Default value indicates that the object should use the pre-defined value defined in AL, and the same rules as above apply when we update values on the Data Access Intent List page.

Conclusion

Thus, we saw how the Business Central architecture uses the read scale-out method to balance consistency and availability, and how we can leverage it to boost our application's performance. Happy coding!

