BIDA0057 – Business Central Beginning of Time GL Balances in PowerBI


Please watch the video in 1080p because it is recorded in High Resolution.
Please note the script of the presentation is below so you can read along with the demonstration if you like.


Hello and welcome to our latest blog post.

I am Sofie, your BIDA AI Assistant.

I will be reading this blog post for you today on behalf of Mihai.

I am really pleased to be able to bring you some news that I think will really interest you.

If you are a business person using Business Central On Premise?

If you would like to get your Business Central data into world class Power BI dashboards so that you can improve the profit of your business?

Then you will want to watch this video.

We are using Business Central twenty twenty three in our demo.

This video is a follow up on the blog post BIDA Fifty Three and BIDA Fifty Four.

You can review those two blog posts on the buttons below if you would like.

BIDA0053 Blog Post

BIDA0054 Blog Post

Download Dashboard

BC Direct Query DB

BC Demo Database

BC023_DM03_01

So, as ever, on with the demo!

Demonstration

Here we are on one of our development machines.

In this demo we want to show you the end result of creating a Power BI dashboard from Business Central On Premise data.

You can now create any dashboard you want from the source data in your Business Central System.

If you can query the production database you can do all this for free.

The only thing extra you need to pay for are your Power BI licenses.

If your production Business Central system cannot tolerate the query load generated by these Power BI dashboards, then you will have to create a replica database on premise or in the cloud.

Of course, that will require a server to run the database on, and a SQL Server Standard Edition license.

You will not need the Enterprise Edition license for the replica as it is just a query database.

Here we are inside the GL Beginning of Time Balances Dashboard.

You can see I am inside the web page of the Power BI Service.

I am not browsing a local server in our office.

Now, I will just set the report to full screen.

You can see that the default is for 5 years of GL Account Balances.

Where there are no transactions, as is the case in the demo database, you will see dashes representing a zero balance.

You can see from the dashboard that the first year there are transactions is twenty twenty three.

Now I will drill down on twenty twenty three.

When I drill down on twenty twenty three you can see that the first month there are GL transactions is December twenty twenty three.

So the beginning of time balance in December twenty twenty three is zero for all accounts.

Some accounts have transactions for twenty twenty three.

Now I will drill up from looking at twenty twenty three.

Now I will drill down on twenty twenty four.

Now I will go to the line for account one three nine zero vehicles total.

You can see that the beginning balance is forty nine thousand three hundred and forty dollars.

There are transactions to the amount of twenty nine thousand dollars.

And you can see the beginning of time value for February is seventy eight thousand three hundred and forty dollars.

Then in February there are transactions to the amount of negative one thousand dollars and you can see that the beginning of time amount for March is seventy seven thousand three hundred and forty dollars.

If you would like to pause the video you can inspect the page more closely.

You can see that for each GL Account there is a beginning of time balance for each month.

There are the transactions for each month.

And the beginning of time balance for the next month is the beginning of time balance for the current month plus the transactions for the current month.
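
For our technical readers, the following is a minimal sketch of how such a running, beginning of time balance can be computed in SQL. The table and column names are assumptions for illustration only; they are not the names used in our dashboard.

-- Sketch only. Assumed table: monthly_gl_movements (gl_account_no, month_start_date, net_amount).
-- The beginning of time balance for a month is the sum of all movements in earlier months.
SELECT
    gl_account_no,
    month_start_date,
    COALESCE(SUM(net_amount) OVER (
        PARTITION BY gl_account_no
        ORDER BY month_start_date
        ROWS BETWEEN UNBOUNDED PRECEDING AND 1 PRECEDING
    ), 0) AS beginning_of_time_balance,
    net_amount AS movement_in_month
FROM monthly_gl_movements;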

Everyone who works in accounting will be familiar with this type of dashboard.

It is one of the most common dashboards asked for in data warehousing projects for finance.

People who are familiar with Business Central will know that GL Account Balances are flow fields and they are calculated at run time on the reports inside Business Central.

So inside Business Central there is no report that goes back across all accounts and calculates the beginning of time balances for every account for every month across the last five years.

Or at least we don’t know of one.

We have made the font quite small because accounting people like to pack numbers on to the page and we wanted to give a realistic demonstration for our accounting colleagues.

Now I am going to show you that you can drill on all columns at once.

So first I will drill back up to the year level.

Now I will click on the button with the help text expand all down one level in the hierarchy.

You can see that the first month is December twenty twenty one.

Now I will scroll to the right of the dashboard and you will see the twenty twenty three numbers come in to view.

Now I will scroll down and you can see all the other GL Account codes are available.

Now I will click on the plus sign next to GL Account code one three nine zero Vehicles total to drill down.

You can see the transaction types are not defined for the zero transaction type.

You can see the transaction type one is set to Sale order or sale invoice.

This text can be set by your programming staff when this dashboard is implemented.

Now I will click on the minus sign and close the detailed rows.

Now I will leave the full page mode.

This dashboard was developed on one of our development machines and then published.

The technical details of how this dashboard is created are available to your programming staff.

Your programming staff can download everything they need to install this test dashboard at your company to learn from it.

Everything you see here is free.

If you would like us to install this demonstration at your company so you can play around with it yourself we would be only too happy to do so.

We want as many people as possible to know that this is now possible.

If you would like us to create some dashboards for your company?

We will create a number of dashboards for free for qualified companies, on the agreement that we can resell the dashboards we create.

So that’s a very good deal for those who take it up.

In Summary

Today’s summary is very simple.

What we have shown you is that it’s possible to develop Power BI Dashboards that retrieve data from your on premise Business Central database directly.

This uses the Microsoft Gateway product to link the Power BI Service in the cloud to your Business Central on premise database.

This means you have access to all the data in Business Central directly from Power BI running in the cloud.

Your programming staff can write any dashboard you want.

And we at BIDA can also write any dashboard you want.

This is a huge breakthrough for Power BI because until now it has not been feasible to report on all the Business Central data in Power BI.

It has only really been feasible to create Power BI Dashboards using the O Data interface.

And that interface is quite limited.

This new way brings the full power of Power BI right to your Business Central On Premise Data.

We are sure you are interested in that.

And with that?

I would like to say thank you very much for watching our video today.

I really appreciate your time and attention.

We hope you found this blog post interesting and informative.

We would love to hear your comments so please feel free to use the comments section below.

I wish you a great day.

Thank you.

Sofie.

Your BIDA AI Assistant.

We will be emailing our subscribers useful and
valuable information regarding business intelligence.
Please subscribe here to be on our general emailing list.

BIDA0056 – Business Central Dimensional Queries Sending Results To Excel


Please watch the video in 1080p because it is recorded in High Resolution.
Please note the script of the presentation is below so you can read along with the demonstration if you like.


Hello and welcome to our latest blog post.

I am Sofi, your BIDA AI Assistant.

I will be reading this blog post for you today, on behalf of Mihai.

I am really pleased to be able to bring you some news that I think will really interest you.

If you are a business person using Business Central On Premise?

If you are frustrated that your I T department is telling you it takes days, or weeks, to get answers to questions that are urgent for you?

Then you will want to watch this video.

We are using Business Central twenty twenty three in our demo.

This video is a follow up on the blog post BIDA Fifty One.

You can review that blog post on the button below.

BIDA0051 Blog Post

So, as ever, on with the demo!

Demonstration

In this demo we want to show you the end result of creating an Excel report from Business Central on premise.

Of course, this is a demo.

You can create any report you want from Business Central on premise now.

Please note that this is not a freebie. The Meta five license does have to be paid for.

So, here we are on our demo machine.

We are inside an Excel report that is pretty simple.

Please note that you can download this Excel workbook using the button below.

Download Report

Anyone who uses Business Central will recognize the data in this report.

We are presenting sales lines for Sell To, Bill To, and Ship To.

This is a very simple report to make it easy to understand the demonstration.

So, I will click on the global slicers for Bill to countries for Great Britain and Germany.

Please note that the N A currency is actually the default currency for the Business Central implementation.

This is a known problem that we have not solved yet just because we wanted to get the demo done more quickly.

When we have clicked on Great Britain and Germany we can see the customers for sell to and bill to.

We can drill through to the detailed reports using the arrows to the top right, or just by clicking on the tabs at the bottom of the Excel workbook.

So, now I will drill through to the, sell to, customers.

You can see sales lines for such customers as the Guildford Water Department and the Auto House in Mielberg.

As I said, this is a very simple report, and it is intended to be this way.

You can see that we have put such fields as the item number, the item descriptions, and the numeric details like unit price, V A T, total amount, and the number of items sold.

We can drill back up to the global slicers using the arrow at the top left of the report.

Now, I will click on that arrow.

I will also click on the Bill To, and Ship To, work sheets just so that you can see them.

Now, I will open up the power pivot model.

Now, I will just click through the tables so that you can see that there is data in the tables.

This data should be very familiar to you.

Now, I will open the diagram view.

You can see that we have the sales invoice lines in the center of the diagram.

You can see that there are detailed keys in the mini fact table.

You can see these detailed keys are linked to the primary keys of the mini dimension tables.

This data came from Business Central, and was put into this Excel Power Pivot Model.

We have added default rows automatically to the mini dimension tables, so it is not possible to lose rows when data is not correct in Business Central.

When a join fails you get the Not Applicable value.

You do not lose the row.
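
For the technically curious, the following is a minimal sketch of the default row idea. The table and column names are assumptions for illustration only.

-- Reserve a default row in the mini dimension for data that cannot be matched.
INSERT INTO dbo.customer_mini_dim (customer_key, customer_no, customer_name)
VALUES (0, 'N/A', 'Not Applicable');

-- During the load, a failed lookup falls back to the default key of zero,
-- so the sales line row is kept rather than lost.
SELECT
    sl.document_no,
    COALESCE(cd.customer_key, 0) AS customer_key
FROM dbo.sales_lines_stage AS sl
LEFT JOIN dbo.customer_mini_dim AS cd
    ON cd.customer_no = sl.sell_to_customer_no;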

As I said, this is a very simple report, so that it is easy for you to understand.

It could also contain orders, shipments and returns.

You can put any data you want into these excel reports from your Business Central system.

Now, I will close the power pivot model.

Now, I will close the spreadsheet.

Now, I will delete the spreadsheet from the folder.

Now, we will go to the Meta5 desktop and open the Applications File Drawer, to get to the capsule that creates this report.

Now, I will copy the capsule onto the Capsule Service Prompt.

Meta5 will ask me when I want to run the Capsule.

I can run it now or I can run it repeatedly under a specified alias.

You can see we have created default start times, every half hour from midnight to nine A M.

If we select one of those this capsule will run every day at that time.

In this demonstration, that would include weekends.

Please note, when this capsule runs, it will actually read data from a demonstration Business Central database, on another virtual machine.

Now, I am going to start the capsule.

Now, I will go to the capsule server and show you that it is running.

Now, I will show you the task manager.

In the task manager you will see mostly Excel processing.

Meta5 fetches the data from the Business Central demonstration database.

It then sends it to Excel regions.

It then loads the data into the Excel Power Pivot Model.

This transfer of data to the Excel Regions, and then load into the Power Pivot Model, takes the greatest amount of elapsed time.

The temporary workbooks used in the Excel workbook are then deleted.

Finally, the finished workbook is sent to the output folder, which is actually a one drive folder.

We will not pause the video, or alter the speed in any way.

We would like you to see how long this takes.

This is so much faster than trying to get data into Excel any other way.

These capsules can be run using this Meta five scheduler.

They can also be run using a Windows command.

The Windows commands can be run on demand, or they can be put into the Windows task scheduler.

Now, the job is complete, and it disappears from the queue.

Now I will go back to the reports folder.

You can see the new report is there.

Now, we will just log off the Meta five desktop.

Just as an extra point.

The report start date and end date for sales lines are stored in a parameter database.

Your I T people can easily set these up, and make sure the dates are moved forward each day automatically.
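
As a sketch of what such a parameter database might contain, here is one possibility. The table and parameter names are assumptions only; your I T people will choose their own.

-- Sketch only. A small parameter table holding the report date range.
CREATE TABLE dbo.report_parameters
(
    parameter_name  varchar(100) NOT NULL PRIMARY KEY,
    parameter_value date         NOT NULL
);

INSERT INTO dbo.report_parameters (parameter_name, parameter_value)
VALUES ('sales_lines_start_date', '2023-01-01'),
       ('sales_lines_end_date',   '2023-12-31');

-- A small daily job can roll the end date forward automatically.
UPDATE dbo.report_parameters
SET parameter_value = CAST(GETDATE() AS date)
WHERE parameter_name = 'sales_lines_end_date';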

In Summary

I would like to just summarize what we have shown you.

It is now possible for you to buy, or have your I T people create, a capsule that generates an Excel report.

That Excel report can contain any data from Business Central.

It can also contain any data from just about anywhere that the server for the capsule has access to.

This can include data from web pages and other sources.

It is not limited to data that is stored in databases.

Meta five can then put that data into Excel automatically.

You, as a business user, get the finished report.

You can get it on your one drive.

You can run it on a schedule.

You can run it whenever you like, using a windows command.

And the best thing about all this is that the SQL that is used to get data can be as complex as required.

The only limitations on this solution are the following.

One. The processing power needed on your Business Central database.

Two. Excel, and the limitations inherent in Excel.

If you want to be able to get Business Central data into Excel reports?

This is a very good way to do that.

And it will be cheaper than any other way you can solve this problem.

Of course, we are also doing this in Power B I, but Power B I uses a different approach to Excel.

If you are interested in trying out Meta five at your company?

We have a training machine available on Azure, and a week's free training, for qualified prospective customers.

And with that?

I would like to say thank you very much for watching our video today.

I really appreciate your time and attention.

We hope you found this blog post interesting, and informative.

We would love to hear your comments, so please feel free to use the comments section below.

I wish you a great day.

Thank you.

Sofi.

Your BIDA AI Assistant.

We will be emailing our subscribers useful and
valuable information regarding business intelligence.
Please subscribe here to be on our general emailing list.

BIDA0055 – Business Central Tables And Columns Rename Script


Please note the script of the presentation is below so you can read along with the video if you like.


Please note. This is a re-release of BIDA0042 – Business Central Tables And Columns Rename Script so that the post will be sent to the Business Central Community.

Hello and welcome to our latest blog post.

I am really pleased you have come back to see our latest news!

I am pleased to bring you some news that I hope will interest you.

I am Mihai Neacsu and I am the business development manager here at BIDA.

Today we have some good news for all those people who use Business Central on premise.

If you have Business Central on Premise and find it hard to write SQL against the underlying database?

We have some good news for you.

As you know, Business Central contains GUIDs in table names.

It also has field names that require brackets around them in almost all cases.

This makes writing SQL against the underlying database more difficult than it needs to be.

Today we are pleased to announce the release of an SQL Script that renames all tables and columns in the Business Central 2023 Release Wave 2 demonstration database.

You can run this script, start to finish, against the Demonstration 23 database of Business Central.

Just so you know, we used the Cronus UK database to create this script.

But it is easily edited to run against any other version of Business Central 2023.

We have tested it against that database and it works without error.
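
To give you a feel for the kind of statements the script contains, here is a small illustrative sketch. The object names below only imitate the Business Central naming pattern; they are not lines copied from the script itself.

-- Sketch only. The GUID is a placeholder for the app GUID that Business Central
-- appends to table names in your database.
EXEC sp_rename
    'dbo.[CRONUS UK Ltd_$G_L Entry$00000000-0000-0000-0000-000000000000]',
    'cronus_uk_gl_entry';

-- Columns with spaces and punctuation can be renamed in a similar way.
EXEC sp_rename
    'dbo.cronus_uk_gl_entry.[G_L Account No_]',
    'gl_account_no',
    'COLUMN';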

You can download this script from the button below and try it out for yourself.

Download Script

Of course, if you are on a different level of Business Central, or you have customized your Business Central, you have two options.

1. You can alter the script by hand.
2. You can send us your SQL Server dictionary tables and we can quickly run the process on a server here.

To make it more interesting for you?

We will do this script generation work for free for the first 20 companies who contact us and ask us to do that.

After the first 20?

We might ask to be paid a small fee just for our time.

We have implemented this script as part of developing our data warehousing product for Business Central on Premise.

This table and column renaming will make it a lot easier for us to build our data warehouse.

It will also make it a lot easier for us to develop SQL Direct Query Dashboards against Business Central.

We hope you will download and try out this script and consider using it in your company.

And with that?

I would like to say thank you very much for reading or listening to our blog post today.

I really appreciate your time and attention.

We hope you found this blog post interesting and informative.

We would love to hear your comments, so please feel free to use the comments section below.

I wish you a great day.

Thank you.

Mihai Neacsu

BIDA Business Development Manager.

We will be emailing our subscribers useful and
valuable information regarding business intelligence.
Please subscribe here to be on our general emailing list.

BIDA0054 – BC Direct Query Using PowerBI Cloud Demo


Please watch the video in 1080p because it is recorded in High Resolution.
Please note the script of the presentation is below so you can read along with the demonstration if you like.


Hello and welcome to our latest blog post.

I am Sofi, your BIDA AI Assistant.

I will be reading this blog post for you today, on behalf of Mihai.

I am really pleased to be able to bring you some news that I think will really interest you.

If you are a business person using Business Central On Premise?

We have created a new way to easily query any data in your Business Central database, using the Power B I Service in the cloud.

This video is designed for technical people.

However, as a business person, you might also want to watch this video, just to see what is now possible.

This video demonstration is a follow up on our previous blog post.

If you are a technical Power B I developer?

You can read the previous blog post on the button below.

BIDA0053 Blog Post

And now?

On with the demonstration.

Demonstration

Here we are on the desktop of one of our Azure Virtual Machines.

This is in Europe West.

The first thing I want to show you is the Direct Query Database for Business Central.

We will just list the views so that you can see there are over three thousand, five hundred views.
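
If you restore the direct query database yourself, a simple way to list and count the views is a query like this sketch.

-- Count the views in the current database.
SELECT COUNT(*) AS view_count
FROM sys.views;

-- List them by name.
SELECT s.name AS schema_name, v.name AS view_name
FROM sys.views AS v
JOIN sys.schemas AS s ON s.schema_id = v.schema_id
ORDER BY v.name;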

These views expose the Business Central tables with improved column names.

They also expose the Business Central tables using a pseudo dimensional data model.

The pseudo dimensional data model reduces the chances of incorrect answers being generated by queries.

This pseudo dimensional data model is work in progress.

Anyone is welcome to create new views and send us a copy.

All these views are free and published for anyone to use.

The obvious disclaimer being that you accept all liability in using these views.

What I want you to understand is that the Direct Query database sits on the same machine as the production database, or a production replica database, if you want to do that.

So, you could turn on replication from your business central server, and you could replicate all that data to another machine on premise, or to a virtual machine in Azure.

We are proposing that the Direct Query database sit on the same machine as the production database for better performance.

Now I want to show you the data mart data model data base.

This database only has thirty one tables in it.

These tables were created just using select statements from the views.

When we open up the general ledger entry table, you can see that the referential integrity constraints have been added to the table.

When we open up any of the dimension tables, you can see that the primary key constraint has been added.

As a technical SQL Server person, you will know that it is very easy to create a small test database like this.

There is also test data in this test database.

The main thing to point out is that this test database can be created very quickly and very simply.

It can contain just the tables and columns that are required for the Power B I dashboard that is being developed.

So.

With that background.

Let us move over to Power B I desktop on this machine.

Here we are in Power B I on the desktop of the Azure machine.

We have done the demonstration all on just one Azure machine for simplicity.

Obviously, the usual case would be that the Power B I Desktop being used is not on the same machine as the Business Central database.

In this demonstration we are going to show you how we can develop a Power B I report based on the data mart data model data base.

In this case that database has thirty one tables.

So it creates a quite small and manageable Power B I Semantic Model.

And then we will repoint that report to the Direct Query database which has over three thousand, five hundred views in it.

Also, there are no joins defined between the views.

Three thousand, five hundred views, with no joins defined, is completely unusable as a source to create a Power B I Semantic Model.

Now I will click on the SQL Server button to connect the report to SQL Server.

I will enter the I P address of the machine.

We will use direct query.

I will click on ok.

In the next step we have to provide credentials.

Just for the sake of a simple demonstration I will use Raul Roman's credentials.

I will click on connect.

We will ignore the message about encryption.

We will be connected to the SQL Server through the Power B I Navigator.

Now we will select the data mart data model database.

When we open up the data mart data model database, we see the general ledger entry table.

I will click on the general ledger entry table.

I will click on select related tables.

You can see all the related tables are selected.

This is because the referential integrity constraints are defined in the test data mart.

Now I will click on load.

You will see that Power B I now reads the information schema of the database and creates the Power B I Semantic model based on the test data mart.

It will just take a little while, and we think it’s worth you seeing exactly what it does.

Now that Power B I is connected to the database you can see all the tables are presented on the right-hand side of the Power B I palette.

Just to prove what has happened we will go into the data model.

I will just scroll around the data model a little bit, and you can see the joins were all defined from the database.

You can download this database and put it on a local machine and try this for yourself.

The two databases you need are on the previous blog post linked above.

Now I will create a visualization using the table icon in Power B I.

I will use the posting date for the report.

The first field is going to be the year number.

The second field is going to be month in year.

The third field is going to be the month short description.

Now we will select some fields from the general ledger entry fact table.

We will select credit amount and debit amount.

As you can see the visualization is updated as I add the fields.

Now I will increase the size of the fonts being used just to make it a little easier for you to read.

Ok.

Now we have our very simple report.

Power B I created the semantic model from the test database.

It is getting the data from the test database with just thirty one tables in it.

Now I will save the report.

I will call it report eleven, just to differentiate it from the first test report I created.

Now I will publish the report to the Power B I Service in the cloud.

I will put it into a demonstration workspace.

The publish operation was successful.

Now we will go to the Power B I service in the web browser.

I will refresh the page.

I will show you the gateway for the semantic model that was just created.

I have previously created the gateway and pointed the gateway to the underlying database.

This was done when we created version one of the report in testing.

As you can see.

We have a gateway connection to the semantic model.

It is pointing to the data mart data model database.

This is because it already existed from report one.

When you create this for the very first time you will have to create the gateway.

If you do not know how to do that?

Your technical support staff will know.

Now I will go back to the workspace.

Now I will open the report in Power B I Cloud.

You can see the report is the same as it was on the desktop.

Now we will edit the report.

So, I will click on edit.

Now I will select a field from the general product posting group.

I will get the description for the posting group and put it onto the report.

I will put it between the month name and the credit amount.

Power B I creates the SQL necessary to refresh the report.

It sends it to the gateway.

The gateway sends it to SQL Server.

And then the results are returned.

So, this is a demonstration of a report in the Power B I Service in the cloud, accessing data in SQL Server on a virtual machine, in Europe West.

Now I will save this report.

Now I will go back to Power B I desktop.

Now I will create a second report using, save as, for the report that we just created.

I will call this report number twelve.

So, now we have the exact same report, it is just a copy, and it is called report twelve.

Now, I am going to change the data source for this second report to the direct query database, that has more than three thousand, five hundred views in it.

Also, there are no referential integrity constraints defined in the direct query database.

Now I will click on the transform data icon in the queries tab.

Now I will click on data source settings.

You can see the source database is the data mart data model data base.

Now I will click on change source.

Now I will change the database name to be the direct query database.

Now I will click on advanced options.

Now I will click on, include relationship column, to turn the selection off.

Now I will click on OK.

Now I will click on close.

I will get a message to say that there are changes pending.

Now I will click on apply changes.

Now Power B I will connect to the new direct query database.

It will validate that the tables and columns exist in this database.

So, clearly, the data mart data model database must be a subset of what is in the Direct Query database.

You can not create new fields or tables in the data mart test database, and then repoint that data model back to the Direct Query database.

As you will see, Power B I takes a little while to validate everything.

Now I will save this report.

Now I will publish this report to the Power B I Cloud Service.

I will publish the report to the same workspace as the prior report.

Now I will go back to the web browser.

I will close the previous report.

I will refresh the page.

Now you can see that the report twelve report, and its semantic model, have been created.

Now I will go to the settings for the semantic model for the twelve report.

I will go to the gateway.

You can see that this new report, the number twelve report, is pointing to the direct query database.

As I mentioned, this happens automatically for the second and subsequent reports you create.

For the very first report you create like this, you will have to create the gateway entry.

If you do not know how to do this?

You should contact your technical support people.

These gateway entries only have to be set up once per database.

Now I will go back to the workspace.

I will open the twelve report.

Now I will click on edit to edit the report.

Now I will add the general product posting group short description again to the report.

I will put it between the month name short description and the credit amount.

So, here we go.

And voilà!

Power B I generates the SQL and sends it to the direct query database, with all the joins defined correctly.

This is because the joins have been inherited from the data mart data model database.

And there you have it.

How to develop Power B I Reports using a data mart data model test database.

And then how to deploy them to a direct query database.

In Summary

I would like to summarize what I have shown you today.

You have seen how to create a Power B I Semantic model from scratch.

This was by using a data mart data model test database specifically created to support a specific report.

These test databases are very quick and easy to create.

Much of the work can be made automatic.

And after a while you will have a library of tables in different databases that you can copy.

You saw me create a very simple report, send it to the cloud, set the gateway, and make an update to the report.

This proved that the report in the cloud was able to talk directly to the database via the gateway.

Then, you saw me make a copy of this report.

You saw me change the connection of that Semantic Model, to point to the direct query database.

You saw Power B I validate the semantic model against the new database.

You saw me save this report as report twelve, and publish this new report to the test workspace.

Then you saw me change the report the same way as I changed the eleven report.

And you saw that Power B I Cloud updated the report properly.

On our prior post you can get the SQL trace for our testing.

This shows you the process inside SQL Server, for all this reading and semantic model updating that Power B I performed, during our testing.

It is actually very interesting.

All in all?

What you have seen is a new, and better, way of using Power B I to directly query Business Central on premise databases.

And given that the vast majority of Business Central implementations are on premise?

This will be of interest to all those customers.

You can get the full power of Power B I Service in the Cloud, to directly access your Business Central production database, or replica.

Everything you have seen here is free, the exception being the Power B I licenses if you do not already have them.

BIDA is providing all these views for free as part of our commitment to adding value to the Business Central community.

And with that?

I would like to say thank you very much for watching our video today.

I really appreciate your time and attention.

We hope you found this blog post interesting, and informative.

We would love to hear your comments, so please feel free to use the comments section below.

I wish you a great day.

Thank you.

Sofi.

Your BIDA AI Assistant.

We will be emailing our subscribers useful and
valuable information regarding business intelligence.
Please subscribe here to be on our general emailing list.

BIDA0053 – BC Direct Query Using PowerBI Cloud


Please watch the video in 1080p because it is recorded in High Resolution.
Please note the script of the presentation is below so you can read along with the demonstration if you like.


Hello and welcome to our latest blog post.

I am Sofi, your BIDA AI Assistant.

I will be reading this blog post for you today, on behalf of Mihai.

I am really pleased to be able to bring you some news that I think will really interest you.

If you are a business person using Business Central?

The really simple message for you is this.

We have invented a new and better way for you to get access to all your Business Central On Premise data, through Power B I, in the cloud.

This new and better way also allows you to integrate any other data that Power B I can read, which is pretty much any data.

How we did this is a bit technical.

So, if you want faster, cheaper, easier access to all your data in Power B I to improve your company's profits?

All you really need to do is to pass this blog post along to your Business Central Technical Support people.

You can ask them if they can take a look at this and use it in your company.

By the way, this is all free.

The only bit you have to pay for is your Power B I Cloud Licenses if you don’t have them already.

Many Business Central On Premise customers want to use Power B I Cloud but it’s kind of hard to use.

We have solved the problem of, “it is kind of hard to use”.

Now, moving right along.

Details of Our Solution

This video is intended for more technical people who work with Business Central.

This is for those technical people whose business community want to use Power B I to access Business Central On Premise.

So.

Please allow me to start with a statement of what the problems are with using Power B I in the Cloud to access Business Central On Premise data.

The presumption is that the business community see value in having Power B I Cloud over having Power B I for Report Services.

The presumption is that the business users want to use Power B I as one of the tools for Business Central Reporting, and very likely, for reporting on other data not held in Business Central.

There are a number of issues with querying Business Central On Premise data.

Issue One.

The Business Central data model is very complicated.

To be able to answer any question you have to have access to all the data.

But there are over one thousand, eight hundred tables in Business Central. And that number is growing rapidly.

There are many cases where joins are complex.

By this I mean that the same field can be used to join to many different tables depending on the value of another field.

It is simply not possible to give anyone a copy of Power B I, the Business Central Data Model, and say, off you go, you have access to all the tables, go and write whatever report you want.

There are too many tables to manage in Power B I.

The joins between the tables are too complex.

This is why Business Central Programmers have to write code to produce even simple reports out of Business Central.

To even get the simplest of new reports out of Business Central, you need a programmer, not a Power B I developer.

Programmer time is expensive, not to mention you have to wait in the queue to get their time.

Issue Two.

Even if you could deal with issue number one, and somehow find a way to simplify the joins, there are just too many tables to deal with using Power B I.

Even with the O Data interface, it quickly becomes obvious that there will be thousands of O Data sources needed to query Business Central in the Cloud, or via the O Data Interface for On Premise.

Because there are one thousand eight hundred tables, the vast majority of which are not related to each other, the number of tables presented to Power B I is overwhelming.

Not even the best Power B I developers, even with the best intentions and efforts, are going to be able to deal with that number of tables.

For a new report?

Until now, all roads have led back to the Business Central programming staff, their high expense, and their work backlog queue.

We have promised that we have solved these problems.

So, please allow me to explain.

Solving Issue One.

We have previously released a blog post talking about the idea of putting pseudo dimensional models over the top of the Business Central database.

We have been doing further work on this and we are finding that it works fine.

Sure, it’s going to consume CPU to run queries over such views, but it will be much easier to get the correct answer.

Of course, it will take a lot of time to build up the suite of views for direct query against Business Central.

Also, we have added the suffix zero one to the end of the names of these views.

This is so that each company is able to create their own views for tables to meet their own needs for custom developed reports.

So, a company could add, for example, five one, five two, five three, as the suffixes for their own custom developed views.

Anything that SQL Server supports can be done in the view.

All that has to be presented out to Power B I is the view itself.
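
As a flavour of the idea, here is a heavily simplified sketch of such a view. The source table and column names below are placeholders for the names in your Business Central database; the real views, including how the big integer surrogate keys such as pk_vm_gl_account are derived, are in the download below.

CREATE VIEW dbo.vm_gl_account_01
AS
SELECT
    ga.[No_]  AS gl_account_no,
    ga.[Name] AS gl_account_name
FROM dbo.[CRONUS UK Ltd_$G_L Account$00000000-0000-0000-0000-000000000000] AS ga;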

You can download the latest version of the dimensional model views on the button below.

BC Direct Query DB

BC Demo Database

BC023_DM03_01

You can review the prior blog post on the button below.

BIDA0050 Blog Post

By creating these pseudo dimensional views we make it an order of magnitude easier to perform the queries against the business central data.

But we still have the problem that there are just too many views to sensibly manage.

Also, there is the serious problem that you can not define referential integrity constraints between views.

This means the joins in the Power B I Models would have to be added manually.

Solving Issue Two.

So how do we solve issue two?

What we have tested, and found to work, is this.

We created what we have currently termed a “data mart data model data base”.

The “data mart data model data base” contains a set of tables that are commonly used together.

For the sake of demonstration purposes, we have chosen the General Ledger Entry table, and all the tables associated with it.

Now, it is important to note that there can be many “data mart data model data bases”, and their function is to make the Power B I Models simple to use.

So, you only need to put into one of these databases the tables that you need for a specific Power B I report.

If you are only going to use data from 10 tables, then you can create a separate “data mart data model data base” with just those 10 tables in it if you want to.

Next.

Once you have created the tables in the “data mart data model data base”, you can remove columns that are empty or never needed by your company.

You can remove the system only columns.

Or you can simply not create them in the tables in the first place.
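
As a sketch of how such a table can be created, the following selects just the needed columns from the matching direct query view into the data mart database. The database names, and the column names other than pk_vm_gl_account, are assumptions for illustration.

SELECT
    pk_vm_gl_account,
    gl_account_no,
    gl_account_name
INTO bc_data_mart.dbo.vm_gl_account_01
FROM bc_direct_query.dbo.vm_gl_account_01;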

Then, with the drop and create tables commands, you must alter the primary key of each dimension table to be big integer and not null.

You must alter the columns for all the lookup keys in the fact table to also be big integer and not null.

This is so that the referential integrity constraints will be able to be created.

All the data types for the fields that will be joined have to be the same, and they all have to be not null.
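
One possible form of those alterations is the following, using the same example names as the constraint examples below.

ALTER TABLE [dbo].[vm_gl_account_01] ALTER COLUMN [pk_vm_gl_account] bigint NOT NULL;
ALTER TABLE [dbo].[vf_g_l_entry_01] ALTER COLUMN [dk_vm_gl_account] bigint NOT NULL;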

Once you have the columns you want in each table you can add the primary key constraints to all the dimension tables.

The following is an example.

Please see the blog post for the typed example.

alter table dbo.vm_vendor_01 add constraint pk_vm_vendor primary key ( pk_vm_vendor );

Once you have added all the primary key constraints you can add the referential integrity constraints.

The following is an example.

Please see the blog post for the typed example.

ALTER TABLE [dbo].[vf_g_l_entry_01] WITH CHECK ADD CONSTRAINT [vf_g_l_entry_01_c04] FOREIGN KEY([dk_vm_gl_account]) REFERENCES [dbo].[vm_gl_account_01] ([pk_vm_gl_account]);

We have done this work in a demonstration database for you.

You can click on the button below and download a SQL server backup of the sample database.

This backup was created in SQL Server two thousand and nineteen.

(Update 2024-09-25: The database was updated to change the collation sequence to be the same as the Business Central demo databases. It was accidentally created with the default collation sequence of the host server it was created on.)

BC023_DM03_01

We will create a video demonstration of all this soon.

If you restore the backup database, you will see that it is just a small database with some test data in it.

You will see that the names of the tables are the same as the names of the views in the direct query database.

More importantly, if you check the tables, you will see the primary keys are defined.

If you check the general ledger entries table, you will see that the referential integrity constraints have been defined.

What this means is that when you develop your dashboard, you can use this database as the test data source for development.

You can put indexes on tables as you like.

You can even try writing your own Power B I report, on this sample database, just to practice how this works.

Now comes the magic part.

You create your dashboard on your desktop using Power B I desktop.

When you connect to a data source you connect to this test database where ever it may be.

It has to be visible via a fixed I P address, or server name, from your desktop.

You say you want direct query to the database, when you create the connection.

This database may be on the production business central machine. It may be on a development server.

But you create the Power B I Semantic Model for this dataset using this test database.

You then create the dashboard using this test database, and any other test database that you need.

You test the report and make sure it appears to be working properly, as best as is possible, with the test data you have.

Then, before you publish the report to Power B I Cloud, you repoint the data source to the I P address of your on premise server and the direct query database.

This database must be visible from your desktop using a fixed IP address or server name.

When changing the connection, it is important to go to “advanced options” and turn off select related tables.

You just want to repoint the desktop report, from the test database where there are just those tables you want in the Power B I Semantic Model, to the direct query database where those tables will be presented as views.

Power B I can not tell the difference between a table and a view in this repointing process.

And because you have told it not to bring in related tables, it will not look for the referential integrity constraints, which will not be there anyway.

You then publish the report to Power B I Cloud.

It will give you a message saying that it does not know where to connect the semantic model to.

This is because you have used the I P address of the business central database inside your organization.

On your Business Central On Premise server, or a server with linked tables, or a replica, however you have decided to do this, you will run the Power B I Gateway.

You can run just one Gateway per machine.

So, your published Semantic Model, or models, for the report will now be in the cloud in Power B I.

But they will not be able to see your on premise business central database.

You go into the semantic model, or models, for your Power B I Dashboard, and you repoint the source of the data from the test database to the direct query database, which will be defined in your gateway.

And then?

Voilà!

When you refresh your report it will send the generated SQL into the gateway, and query your direct query database, which will have at least one thousand eight hundred views in it. And possibly many more.

So, what we have figured out is that you can create a small “data mart data model data base”, to create the Power B I Semantic model, that contains just the tables you need.

Then you can repoint that semantic model to your on premise Business Central direct query database, which contains all the possible views for the whole company.

In this way any single dashboard will contain just the semantic models it needs, while all data in Business Central is visible to any Power B I report developer, who is authorized to access that data.

You would implement your security however it is you want to implement your security.

This idea of creating these small test databases, to serve as the input to the creation of the Power B I Semantic Models, means that you can limit the number of tables and number of columns in any given semantic model, to be viable and easy to use for developers.

And then, by repointing the semantic model to the direct query database, the report will have access to all the production data. You can answer any question that the data can support.

Another factor to take into consideration is this.

Because the views in the direct query database can contain common table expressions, these views can be as complex as you like.

You can literally create very sophisticated, and complex views, and present them out as simple views.

You can also put time constraints on some views to say, show last 90 days, last 180 days, last 365 days and perhaps last 2 years.

In this way, a dashboard that only needs the last 90 days of data does not need to have “drill down” to do that. The view for the fact table data could have that constraint in it. You just add another suffix number to the fact table view.
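
The following is a sketch of such a time constrained view, using a custom suffix. The posting date column name is an assumption for illustration.

CREATE VIEW dbo.vf_g_l_entry_51
AS
SELECT *
FROM dbo.vf_g_l_entry_01
WHERE posting_date >= DATEADD(day, -90, CAST(GETDATE() AS date));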

Just for completeness.

The professional level of Power B I has a limit of 1 million rows being returned for any single query.

If you need to return more than one million rows, you have to buy the higher level licensing for Power B I.

Also, because of the way that the SQL is generated by Power B I, we would encourage you to create views that only display the columns needed in each report.

And if a report is only needed at summary levels, such as weekly, monthly, quarterly and similar, it is better to do the summarization in the database in another view, and not let Power B I do the summarization from the full fact table itself.

This is because it will consume far more temporary space if this is allowed to happen.
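
The following is a sketch of such a summary view, returning monthly totals so that Power B I does not aggregate the full fact table itself. The column names are assumptions for illustration.

CREATE VIEW dbo.vf_g_l_entry_52
AS
SELECT
    dk_vm_gl_account,
    DATEFROMPARTS(YEAR(posting_date), MONTH(posting_date), 1) AS month_start_date,
    SUM(debit_amount)  AS debit_amount,
    SUM(credit_amount) AS credit_amount
FROM dbo.vf_g_l_entry_01
GROUP BY
    dk_vm_gl_account,
    DATEFROMPARTS(YEAR(posting_date), MONTH(posting_date), 1);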

To show you what the SQL looks like for all the work that goes into creating such a report?

We have included a trace of our testing.

If you are familiar with Power B I, and how it creates its semantic models, and how it performs direct query, the trace will make complete sense to you.

When we read the trace, we were very pleased with how easy it was to read and understand.

Please click on the button below, if you want to read the trace created, when we set this up.

Gateway Trace

Simply put.

The Power B I Semantic model is created from the “data mart data model data base”.

Then when the switch is made to the direct query database, and the referenced tables option is turned off, it revalidates the model against the direct query database.

If the revalidation works you can then publish your report.

Just for extra information.

From our testing it appears that the semantic model knows the name of the SQL Server database at publishing time.

We were not able to alter the database name inside Power B I cloud services.

This is why we recommend changing the name of the database prior to publishing.

Once published, you can simply repoint the semantic model to get its data from the On Premise gateway using the correct database name.

Your database gateway support people will know how to set that up.

In Summary

Power B I Cloud offers a very wide range of features to help business managers improve the profit of their businesses.

Power B I Cloud is far in advance of Excel and Power B I on premise.

Excel is great and is better than Power B I for many uses.

Power B I On Premise is also great and very useful.

But it lacks many of the features of Power B I Cloud.

So, if you have Business Central On Premise, and you want to use Power B I Cloud?

There was no really good solution to being able to take advantage of Power B I Cloud, at least that we are aware of.

By inventing the following two ideas?

We have made it viable to use Power B I Cloud with Business Central On Premise.

Those two ideas being.

One.

Placing pseudo dimensional models over the top of the direct query database.

Two.

Creating separate dedicated “data mart data model data bases”, which will be used to create the Power B I Semantic Model, for each report that needs to use that set of tables.

Just as further information.

Although we have used Business Central in this example, it should be noted that what we describe works for any SQL Server production system.

This is a fast, simple, cheap and easy way to get started using Power B I against SQL Server production databases, or replicas.

Lastly, we really must add some caveats.

Obviously, direct query against a production database, or replica, has its limitations.

This is not like having a proper staging area.

This is not like having a proper data warehouse.

But this idea is very cheap and easy to get started with.

For thousands and thousands of small companies?

This will be enough.

If you use this idea and you come to understand why separate staging areas and data warehouses offer greater value?

Then you will have more arguments available to put into your business case, to ask for the budget for your data warehouse project.

We see this idea of using Power B I, without any other software purchase necessary, as a good way to get started on your journey of learning how to use data to improve the profitability of your company.

And with that?

I would like to say thank you very much for listening to our video today.

I really appreciate your time and attention.

We hope you found this blog post interesting and informative.

We would love to hear your comments, so please feel free to use the comments section below.

I wish you a great day.

Thank you.

Sofi.

Your BIDA AI Assistant.

We will be emailing our subscribers useful and
valuable information regarding business intelligence.
Please subscribe here to be on our general emailing list.

BIDA0052 – First Post To Business Central Forums


Hello and welcome! I am Mihai Neacsu and I am the Business Development Manager here at BIDA.

We are very pleased you are dropping by to see our latest blog post.

This blog post is intended to be our first post that makes it to the Business Central Forums for partners.

So this is a test.

BIDA BC Blog

If the test works? We would like to give you a valuable freebie for finding us!

That freebie is a renaming script for Business Central 2023 Wave 2.

You can listen to the video below and click on the button below to download the whole script.

It will give you easier to use and remember names for all the tables and fields in Business Central. We hope you like it.

And with that?

We are going to post this blog post and see if we have our very first blog post make it to the Business Central Blogs.

Please note the script of the presentation is below so you can read along with the video if you like.


(Note. We have swapped my photo for the cartoon character. I hope you like it!)

Hello and welcome to our latest blog post.

I am really pleased you have come back to see our latest news!

I am pleased to bring you some news that I hope will interest you.

I am Mihai Neacsu and I am the business development manager here at BIDA.

Today we have some good news for all those people who use Business Central on premise.

If you have Business Central on Premise and find it hard to write SQL against the underlying database?

We have some good news for you.

As you know, Business Central contains GUIDs in table names.

It also has field names that require brackets around them in almost all cases.

This makes writing SQL against the underlying database more difficult than it needs to be.

Today we are pleased to announce the release of an SQL Script that renames all tables and columns in the Business Central 2023 Release Wave 2 demonstration database.

You can run this script, start to finish, against the Demonstration 23 database of Business Central.

Just so you know, we used the Cronus UK database to create this script.

But it is easily edited to run against any other version of Business Central 2023.

We have tested it against that database and it works without error.

You can download this script from the button below and try it out for yourself.

Download Script

Of course, if you are on a different level of Business Central, or you have customized your Business Central, you have two options.

1. You can alter the script by hand.
2. You can send us your SQL Server dictionary tables and we can quickly run the process on a server here.

To make it more interesting for you?

We will do this script generation work for free for the first 20 companies who contact us and ask us to do that.

After the first 20?

We might ask to be paid a small fee just for our time.

We have implemented this script as part of developing our data warehousing product for Business Central on Premise.

This table and column renaming will make it a lot easier for us to build our data warehouse.

It will also make it a lot easier for us to develop SQL Direct Query Dashboards against Business Central.

We hope you will download and try out this script and consider using it in your company.

And with that?

I would like to say thank you very much for reading or listening to our blog post today.

I really appreciate your time and attention.

We hope you found this blog post interesting and informative.

We would love to hear your comments, so please feel free to use the comments section below.

I wish you a great day.

Thank you.

Mihai Neacsu

BIDA Business Development Manager.

We will be emailing our subscribers useful and
valuable information regarding business intelligence.
Please subscribe here to be on our general emailing list.

BIDA0051 – Business Central Dimensional Queries Sending Results To Excel

0

Please watch the video in 1080p because it is recorded in High Resolution.
Please note the script of the presentation is below so you can read along with the demonstration if you like.


(Note. We have swapped my photo for the cartoon character. I hope you like it!)
Hello and welcome to our latest blog post.

I am Sofi, your BIDA AI Assistant.

I will be reading this blog post for you today, on behalf of Mihai.

I am really pleased to be able to bring you some news that I think will really interest you.

This video is intended for all business people who work at a company where you are frustrated that it is so hard to get answers from your operational systems.

If you are a business person, and you are frustrated that your I T department is telling you it takes days, or weeks, to get answers to questions that are urgent for you?

Then you will want to watch this video.

We are using Business Central twenty twenty three as an example, but what we will present today applies to the entire world of getting questions answered from operational systems.

So, as ever, on with the demo!

Download Materials

Please note, because of their size, the demonstration databases are stored on our public one drive. Anyone can download the two databases, but Microsoft will require you to be logged into an Outlook account or an Office account for the link to work.

Please note the direct query database has been updated on 2024-07-07. We will update this note each time we update the direct query database.

BC Direct Query DB

BC Demo Database

Demonstration

Here we are on our main BIDA Azure Meta five evaluation machine.

You can get access to this machine by contacting Mihai, to book the machine, and get the login details.

I want you to know that you can go and try this out for yourself today.

On the button below you can download all the materials you will see in this demonstration.

So what are you looking at?

I will just go up to the top so you can see the IP address of this machine.

I will go to a command prompt and show you the IP configuration command.

You can see it’s a machine in the cloud.

I will also go to SQL server and show you the databases.

In this first window you can see that in the BIDA hosting services master database, we have one thousand seven hundred and ten SQL files loaded into the database.

Now I will click on explore object details.

You can see there are three thousand five hundred and seventy five views, available in the business central direct query database.

This is the number at the time of this video recording.

It will be larger if you review this machine for yourself.

Now I will go to the Meta five desktop.

This is a very clean desktop because it is the evaluation machine.

In the top left corner you can see the applications file drawer.

Now I will open up the applications file drawer.

We currently have applications only for Business Central.

Now I will open up Business Central.

We see there are a number of file drawers available to us.

As a business person you are only interested in what is in the first file drawer, the dashboards.

Now I will open the dashboards file drawer.

You can see there are currently fourteen folders.

These are fourteen reasonable subsections of Business Central.

I will open up twelve, sales.

You can see the capsule B, C, twelve, zero, zero, three, sales lines.

This is the capsule we want to show you today.

Now, I will go into the sales lines capsule.

To make it possible to sell capsules over the web, and have them get their parameters from the customer’s master database, there is some technical stuff that needs to be done.

You can completely ignore this.

The capsule itself is embedded inside some wrappers, so that it can be reused by many customers.

So, I will just drill down to the capsule because that is what you are interested in.

Here we are inside the actual capsule that is of interest to you.

What can you see?

You can see lots of squares joined by arrows.

So, this capsule creates an Excel workbook, containing sales lines from the Business Central twenty twenty three, United Kingdom demonstration database.

The capsule also sends all of the associated dimensions into the workbook.

You can see there are fourteen capsules that are called get SQL.

We will go into the first one and show you how it works.

Now I will open the first get SQL icon.

You can see the variables at A V, at A W, at A A, and at A C.

These are set when the capsule is installed at your site.

You can see B, C, twelve, nine, nine, nine, V M, order date SQL.

This is the name of a file of SQL that has been loaded into the master database.

Now I will open the options for this SQL name.

Now I will press show constrained choices.

You can see that there are many files of SQL that the query icon knows about.

Of course, this list will get much longer as we develop more applications.

We can select any piece of SQL to be read by Meta five.

Now I will close the options window.

Now I will press show data.

You can see that it has returned some SQL that is going to be used to read the order date dimension table.

This SQL can reside anywhere.

It can be inside your company, or on a machine in Azure.

It is stored in a SQL Server database.
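
As a sketch of the idea, assuming a hypothetical table called sql_files with sql_name and sql_text columns, fetching a piece of SQL is just a lookup like the one below. The actual table and column names in the master database may differ.

-- Illustrative only: look up the stored SQL text by its name.
SELECT sql_text
FROM   dbo.sql_files
WHERE  sql_name = 'BC12999_VM_order_date_SQL';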

Now I will close this query icon.

Now I will open up the SQL zero, zero, one, file.

You can see that there is SQL in the file.

This is the SQL that has been retrieved from the master database.

Because this is our evaluation machine we have copied the master database to this machine.

You can go to SQL Server Management Studio and look at the database if you want to.

Now I will close the SQL file.

Now I will open the SQL one, zero, one, file.

In this file you can see that the SQL is a lot more complicated.

I will scroll down just so you can see the SQL in the file.

This SQL is hand written.

When it is loaded into the database all comments and new lines are removed.

Now I will close the SQL File.

Now I will open the icon, V M order date.

You can see that the SQL from the SQL file is inside the icon.

Now I will click on show controls.

You can see that the SQL entry icon is reading its SQL from the SQL zero, zero, one file.

That is how this works.

You get the SQL from the master database and put it in a file.

Then you refer to that file in a SQL Entry icon, which can run the SQL against the database.

Now I will close SQL Controls.

Now I will press run on the V M order date SQL Entry icon.

You can see that it has returned the data from the select statement, including the heading line.

So.

Every SQL Entry icon is followed by a remove header icon, to remove the column names.

Now I will close the V M order date SQL Entry icon.

Now I will scroll down just a little bit.

Now you can see the collect data spreadsheet.

Simply put, all the SQL retrieved from the master database is executed by the SQL Entry icons.

The heading row is removed, and all the data from each query is sent into the collect data spreadsheet.

From there the data is sent into the sales invoice line excel spreadsheet.

It is sent into Excel regions.

You can download this spreadsheet to see exactly what we have done.

For each query there is a worksheet inside the workbook.

The worksheet has a heading region and a data region.

We only send data into the data region, because the headings need to be fixed, from the time of development.

The next step, the I Excel icon, takes the data from the data regions, and loads them into the power pivot model.

This overwrites the data in the power pivot model.

It will also update all dashboards, reports, and pivot tables based on the data in the power pivot model.

Then there is an icon called delete multiple sheets.

What this does is remove the worksheets that were used to send the data into the power pivot model.

This means that the only data inside the spreadsheet, is in the power pivot model.

This data is highly compressed.

The delete multiple sheets capsule sends the final workbook to the P C Directory.

This can be any folder that the server has access to, including OneDrive folders.

Lastly, if this capsule has a signal file defined for it, it will delete the signal file so that the signal file has to be replaced before this capsule will run again.

Now, I will run the capsule so that you can see it running.

Since the capsule has so many icons in it, you can not see the icons at the top of the capsule.

I will also turn on the task manager so you can see what happens.

We will not pause the video, so that you can see how long this takes on this evaluation machine.

So, now I will click on the run button.

We can watch the capsule execute together.

We will just wait for the capsule to run.

Of course, any capsule can be scheduled to run in batch, via the Meta five or BIDA scheduler.

Remember, you can have your I T colleagues write any SQL you want.

You can send any amount of data into the spreadsheet, so long as Excel can handle it.

When the volume of data is large, it’s best to have many collect data icons, and to send the data into Excel in multiple steps.

Now I will go to the directory the finished workbook has been written to.

I will open it up.

You can see there is only the reporting worksheet in the workbook.

Now I will open the power pivot model.

Now I will click on the diagram view.

Now you can see the power pivot model inside the Excel workbook.

Now I will just click on data view again.

Now I will just go through the tables along the bottom of the workbook.

You can see that there is a table for each query that was fetched from the master database.

Now I will close the power pivot model.

Now I will close the spreadsheet.

Now I will go back to the capsule.

In Summary

Now, please allow me to summarize what we have shown you.

What we have shown you, is that you can ask your I T colleagues to write SQL.

You can store that SQL in your master database.

You can read that SQL from the master database using Meta five.

You can resolve parameters inside Meta five.

Then you can send that SQL to any ODBC data source.

In this case we are using Business Central twenty twenty three, as an example.

But it can be any ODBC data source, including those supported by C Data.

The data that is returned is collected in the collect data spreadsheet.

It is then sent into Excel.

It is then loaded into the power pivot model.

Then the worksheets used for the transfer are deleted.

So, in the end you get a spreadsheet, with the data you want, in your power pivot model.

From there?

You are an Excel expert.

You know what Excel can do once your data is in your power pivot models.

You have seen the future of direct query against operational systems.

You can have this today, for your company, for the very modest cost of one Meta five desktop.

The Meta five desktop can be paid for monthly, annually, or outright purchased.

We are not publishing the current prices, because we have an agreement for some steep discounts from Meta five, which they do not want published.

It is sufficient to say that the price of a Meta five desktop, on a daily basis, is negligible compared to being able to easily get any data you want into your Excel workbooks.

As I mentioned.

This is our Meta five evaluation machine on Azure.

You can book time on this machine with Mihai, and review everything we have shown you here, to make sure that it works just like in the demo.

There are no “tricks” to this.

The largest piece of work is just setting up the regions and tables in Excel, because Excel needs to be told these things.

And with that?

I would like to say thank you very much for watching our video today.

I really appreciate your time and attention.

We hope you found this blog post interesting and informative.

We would love to hear your comments, so please feel free to use the comments section below.

I wish you a great day.

Thank you.

Sofi.

Your BIDA AI Assistant.

We will be emailing our subscribers useful and
valuable information regarding business intelligence.
Please subscribe here to be on our general emailing list.

BIDA0050 – Dimensional Models Over Business Central


Please watch the video in 1080p because it is recorded in High Resolution.
Please note the script of the presentation is below so you can read along with the demonstration if you like.


(Note. We have swapped my photo for the cartoon character. I hope you like it!)

Hello and welcome to our latest blog post.

I am Sofi, your BIDA AI Assistant.

I will be reading this blog post for you today, on behalf of Mihai.

I am really pleased to be able to bring you some news that I think will really interest you.

This video is intended for all business people, who work at a company, where you are frustrated that it is so hard to get answers from your operational systems.

If you are a business person, and you are frustrated that your I T department is telling you it takes days, or weeks, to get answers to questions that are urgent for you?

Then you will want to watch this video.

We are using Business Central twenty twenty three as an example, but what we will present today applies to the entire world of getting questions answered from operational systems.

So, as ever, on with the demo!

Please note, because of their size, the demonstration databases are stored on our public OneDrive. Anyone can download the two databases, but Microsoft will require you to be logged into an Outlook account or an Office account for the link to work.

BC Direct Query DB

BC Demo Database

Demonstration

Here we are on our main BIDA development machine.

We are inside the Business Central twenty twenty three Direct Query Database.

You can see that there are three thousand and sixty five views in this Business Central Direct Query database.

That is a lot of views.

This is how many views are needed over the top of Business Central to get even the first few fact tables to work.

Just so you know.

Because this is our main development machine we are hiding many database names.

Now.

This demonstration is a little bit technical, so I would first like to tell you about the two problems that we have solved.

Problem number one.

Today everyone is talking about artificial intelligence.

The number one problem with AI in the enterprise is that the data inside the enterprise is housed in large operational systems.

These are usually E R Ps, but telcos have billing systems, banks have banking systems, and so on.

In 2024 we live in an era where practically all actions inside a company are tracked by putting transactions into computer systems.

Those systems can be generally referred to as Large Operational Systems.

The data models inside these databases are so complicated, that trying to train an AI using these data models has proven extremely dangerous, in that bad data fed to the AI provides bad results.

The first problem we have solved is the problem of making sure the data being sent into the AI, from the large operational system, is correct and good data.

Or, at least, we have vastly reduced the likelihood of errors.

Everyone who wants to do AI inside the enterprise will want to watch this video.

Problem number two.

In the eighties, relational databases were touted as the next big thing.

Relational databases, we were told, would allow business people to ask their own questions of their databases.

“All business people will learn SQL. They will answer their own business questions”, was the pitch from IBM and Oracle in particular.

Just like COBOL before it, this didn’t happen.

As the eighties went on, and then the nineties went on, operational systems exploded in processing complexity and in the number of tables and columns they contained.

In the late eighties, the era of the data warehouse was born to put all this data back together in some consistent form.

This was to make it possible to ask questions from all these operational systems, that were costing tens of millions of dollars to implement.

These systems have data models that even experts in the systems struggle with some days.

It is simply not possible for a business person to ask questions of large operational systems, and hope to get a correct answer.

Dimensional models for data warehouses made life that much easier.

Business people think in dimensions; it is natural.

So dimensional models became the standard way business people interacted with their data.

Now, the first thing I am going to show you are some dimension tables over the top of Business Central.

Please note, when I say tables we include views so that we do not have to make that distinction over and over again.

Please, remember, this could be any large operational system.

Now, on with the demonstration.

I am going to filter the views displayed to those that begin with the letters V M.

These are dimension tables.

From the views list at the bottom of the video you can see there are one thousand one hundred and sixty four views that begin with V M.

That is a lot of views.

These are all the tables that can be interpreted as dimensions in Business Central.
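
If your I T colleagues want to check this count for themselves on the downloaded database, a standard dictionary query is enough. This sketch assumes the dimension view names all start with the letters V M.

-- Count the dimension views in the direct query database.
SELECT COUNT(*) AS vm_view_count
FROM   INFORMATION_SCHEMA.VIEWS
WHERE  TABLE_NAME LIKE 'VM%';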

We will be the first to admit that because there are well over one thousand one hundred views, there are likely to be some mistakes.

We have put a dimension table view over the top of every table in Business Central, that we think might be queried as a dimension table.

We have a process that makes it quite easy to add more views.

We will go down to the item table because everyone understands companies sell items.

We will view the items.

Here we are with the items in the Business Central database in front of us.

You can see that there is something called a primary key for the item table.

It is a sequential number.

This number goes into your Excel workbook for the power pivot join to fact tables like sales transactions.

You can see the item number.

You can see the demonstration database is selling bicycles and bicycle parts.

I will scroll to the right so that those of you who do not know Business Central, can see how much information is stored about items.

Now we will go to item categories.

Now I will open item categories.

You can see there are some categories for office supplies.

Now, since this is a presentation for business people, I do not want to get too technical.

On the first row of the item categories you can see there is a default row.

It has a key of zero and the data in the columns are defaulted.

This is one of the design features of dimensional models.

They have a default row for when a join is not found.

So, if an item does not have a category, and many of the bicycles do not, you do not lose the sales row when you are trying to look at the categories.

Also, you are not stuck with a null.

You get a specific value of not applicable which is much easier to understand and work with.

This means you are told bicycles have quote, not applicable, end quote, as their item category.

This zero row is very important.

This is how we make sure no data is accidentally lost, and no extra rows are accidentally generated, by queries.
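
For the technically curious, here is a minimal sketch of how a dimension view can carry its zero row. The names and the key generation are illustrative only, not the actual BIDA view definitions.

-- The zero row is generated first, then the real categories get sequential keys.
-- ROW_NUMBER is a simplification here; the real views may generate keys differently.
SELECT 0                  AS dk_item_category,
       'N/A'              AS item_category_code,
       'Not Applicable'   AS item_category_description
UNION ALL
SELECT ROW_NUMBER() OVER (ORDER BY ic.[Code]) AS dk_item_category,
       ic.[Code]          AS item_category_code,
       ic.[Description]   AS item_category_description
FROM   dbo.item_category AS ic;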

Because you will be able to download the database, and give it to your I T colleagues, there is no need to explain this to your I T people.

They already know about these things.

Now, I want to take you to some other tables that you will understand immediately.

The next table is the day table.

Now, I will open the day table.

It is not in order.

Now I will scroll to the right to show you the columns in the day table.

Now I will go to the month table.

Now I will open the month table.

You can see that the month table only has rows at each month level and the day table has rows for each day.

So, when you want to query data at the month level, you go to the month table and when you want to query data at the day level you go to the day table.
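
As a small illustration, with hypothetical view names, a month level question joins the fact rows to the month table and lets SQL do the summing.

-- Illustrative month grain query; vf_gl_entry and vm_month are assumed names.
SELECT m.year_number,
       m.month_number,
       SUM(f.amount) AS month_amount
FROM   dbo.vf_gl_entry AS f
JOIN   dbo.vm_month    AS m ON m.dk_month = f.dk_posting_month
GROUP  BY m.year_number, m.month_number
ORDER  BY m.year_number, m.month_number;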

We have added tables to this query database to extend the functionality of the data model.

For example, we have a table for age bands.

If your large operational system records the birth date of your customers, it is possible to link them to the age band table for analysis by age bands.
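
A sketch of that kind of link, with made up table and column names, might look like this.

-- Approximate age in years is enough to place a customer into a band for analysis.
SELECT c.customer_name,
       ab.age_band_description
FROM   dbo.vm_customer AS c
JOIN   dbo.vm_age_band AS ab
       ON DATEDIFF(YEAR, c.birth_date, GETDATE()) BETWEEN ab.age_from AND ab.age_to;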

There are many such tables in this model.

Now I will change to the views of fact tables.

As you can see, we only have eight fact tables defined at the moment.

We wanted to get this first video out to the public as fast as possible, because this is so revolutionary.

I want to show you the sales invoice line table.

Of course, companies sell items and the sales are recorded on the sales invoice lines.

Now I will select the sales invoice lines table and I will talk about that.

On the video now you can see lots of numbers.

These are the generated integer keys, that are going to link the sales invoice line, with all the dimension tables that have descriptive information in them.

For example, you see sell to customer in the middle of the screen.

You can see the keys are numbers like 7, 8, 18, and similar.

These keys go to the customer table from which you can get the details about the customer.
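
Just as an illustration, and with assumed view and column names, a query that follows the sell to customer key into the customer dimension looks like this.

-- The integer key dk_sell_to_customer joins straight to the customer dimension.
SELECT c.customer_name,
       SUM(f.line_amount) AS total_sales
FROM   dbo.vf_sales_invoice_line AS f
JOIN   dbo.vm_customer           AS c ON c.dk_customer = f.dk_sell_to_customer
GROUP  BY c.customer_name
ORDER  BY total_sales DESC;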

You don’t have to worry about how that is done; it’s done for you by us or your I T colleagues.

You can see that there are a lot of these keys.

This is a template that can be used to select just what you want.

You will understand that if you have this many dimensions to join to, that you can slice and dice your item sales lines in any way you please.

You can answer any questions that your database is able to answer.

Now, I will scroll to the right.

You will see fields like header bill to name, header bill to contact and similar.

What this view does is expose the sales header on the first half of the row, and the sales line on the second half of the row.

So now we will scroll further to the right.

Now we can see fields like line number, line location code, line description.

These are the sales invoice lines.

They are individual items that are being sold.

To the right of the screen you can see line quantity and line unit price.

What you are seeing is what is called a fact table.

One type of fact table records all the details about a transaction.

This is one of those transaction fact tables.

It is recording every scrap of data we can about sales transactions, and it is joining that data to all the dimension tables we can properly join it to.

So, essentially, what you are seeing is a set of views where you, or anyone, can query the sales transactions, and present data to any tool, and you can know you are not losing any data, or accidentally getting any extra rows.

For training an AI, this means you will have far less chance of sending the AI false information.

For querying, this means you will not be able to accidentally lose rows or add rows to your results.

So, let us take a look at the actual code behind this sales line.

This is just to show it to you, we do not expect business people to understand this code.

Your I T colleagues can download this code and see it for themselves.

You can see on the screen that there is a word called coalesce, and this is used for all these keys.

Where you see the zero, this simply means that if nothing was found in the database for this key, then default to zero.

Then, every dimension table has a zero row, that will join to this row, to tell you that nothing was found for the row.

So, when an item is sold like a bicycle, and there is no item category for it?

Rather than lose the sales row, you will get the sales row with a Not Applicable value for the item category.
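
For your I T colleagues, the pattern inside the view is roughly the one below. The names are illustrative only; the real view is much longer.

-- A missing item category produces a NULL from the left join, which coalesce
-- turns into the zero key, so the sales row joins to the Not Applicable row.
SELECT COALESCE(ic.dk_item_category, 0) AS dk_item_category,
       sil.[Description]                AS line_description,
       sil.[Quantity]                   AS line_quantity,
       sil.[Unit Price]                 AS line_unit_price
FROM   dbo.sales_invoice_line  AS sil
LEFT JOIN dbo.vm_item_category AS ic
       ON ic.item_category_code = sil.[Item Category Code];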

Now, I will scroll down in this view.

You will see how long it is.

Now we are at the point where your I T colleagues will be more interested in this segment of the video.

But we just want to show this section for completeness.

You can see that we are selecting the data from the direct query database, and we are querying the sales invoice line.

You can see that we are joining the sales invoice line to the sales invoice header.

You can see we are doing a very long list of left joins.

You don’t need to worry about these joins.

Your I T colleagues will understand.

All you will want to know is that someone, like us, has prepared these joins for you to use.

I will scroll down to the bottom so you can see how many joins are here.

This view is 491 lines of code.

So, it is quite substantial.

Of course, for performance reasons you might not want to do all the joins for every query.

So, of course, your I T colleagues could create SQL that just reads the data you need to read, to save processing time.

Now, I will scroll back to the top of the view.

Now I want to show you how such a data model can be queried and the results sent into Excel Power Pivot Models.

We are on our Meta five development desktop.

We have a very simple capsule in front of us.

The simplest way to take advantage of what we have now created is this.

Someone who knows the large operational system can write efficient SQL and store that SQL in a database.

The Meta five capsule can read that SQL, set any variables like begin and end dates for the report, and send it to Business Central.

Business Central can then return the dimension tables and the fact tables that are mentioned in the SQL queries.

These data streams are sent into the collect data spreadsheet.

Then, when all data has been collected, Excel is started inside Meta five and all data is sent into the correct worksheet inside Excel.

From the worksheet Meta five sends commands to Excel to refresh the power pivot model.

Then Meta five can send a command to Excel to delete the worksheets the data was sent into.

What you end up with is an Excel workbook containing a power pivot model with the data you want.

From that position you can do anything you like because you are an expert in Excel.

There are many reasons you want to use Meta five to put your data into your power pivot model.

However, we won’t go into them in this video.

You can just trust me that you can have Meta five run batches of reports, each night, and it can send out the updated Excel spreadsheets to whoever needs them.

It is also possible to use the Meta five query tool to query these dimensional models.

We have not yet created the dictionary needed to support that.

Also, if you are querying your production large operational system?

Your I T colleagues will want to review any SQL that is being sent into it, to make sure that no harm will be done.

In Summary

Now I would like to explain what you have seen because it’s pretty amazing.

As far as we know, no one has ever tried this before.

This is because the idea itself is simply so out of this world.

People have been doing direct query for a long time.

But no one has created a layer of views that allows the direct query to operate as a pseudo dimensional model.

At least, we have never heard of such a thing.

The original question was.

Is it possible to put a dimensional model over the top of Business Central, and make it work?

The answer is now proven to be yes.

This also means that it is possible to put a dimensional model over the top of any large operational system, and make it work.

That dimensional model can sit over the top of the production database, or it can sit over the top of a replica database.

So, all that you need to get started using this idea today, is a set of views and a Meta five workstation.

Then you can send the queries into your large operational system, and send the results into Excel power pivot models.

With Meta five, and this idea, you can get data out of your large operational system and into your Excel workbook, where you can analyze it in any way you want.

If you are using Business Central twenty twenty three?

You can get started today from what we have provided on this blog post.

Your I T colleagues can download the demonstration databases and get started today.

With this idea you will be able to get questions answered in minutes, or hours, rather than days or weeks.

If your database server is fast enough?

You can even use the Meta five query tool directly and not even need SQL to be written by your I T colleagues.

Then you can get answers to your questions in minutes, without having to ask anyone else for help.

Of course, we are going to continue creating views for the fact tables in Business Central.

This is time consuming work.

And it means that all Business Central users will be able to use those views.

What you have seen today is the new “entry point” to being able to get questions answered from your large operational systems more easily.

We want to make sure you understand this is not as good as having a real data warehouse.

We want to make sure you understand this is a short cut that allows you to get started very cheaply.

If you run these views on your production business central server, the only outlay for you is a meta five workstation.

If your I T colleagues say that you must run these queries on another server because of performance issues?

Then your outlay is for another server, and another SQL Server Standard Edition license.

Then your IT colleagues can turn on replication from your production Business Central, and send the production updates into your replica database.

Today you can buy a 16 core machine, with 164 gigabytes of memory, 10 terabytes of solid state disk, and the SQL Server Standard Edition license, for approximately one hundred thousand euros.

With a three year commitment, you can rent such a machine on Azure for around three thousand euros per month.

That is a very cheap entry point, to be able to answer any question you can think of, from your Business Central database.

One last point we want to make is this.

You might want to create a staging area rather than just use a replica of Business Central.

If you want to do that?

We will create staging areas, for free, for the first ten qualified customers.

After the first ten qualified customers we will charge one thousand euros to create your staging area for you.

We will give you all the software you need to run your staging area.

If you then wish to sign up for technical support?

We will charge very reasonable fees.

We have done another post on why you might want a staging area rather than a simple replica production database.

We can share that with you if you are interested.

In finishing.

Ladies and gentlemen, we have come to the end of our very big announcement.

We are more than a little excited about this.

This is a big idea whose time has come.

We think a lot of people are going to adopt this idea.

If you do adopt this idea?

We would please just ask that you give us proper credit when you mention it to other people.

We at BIDA invented the idea, so please respect our efforts.

We thank you in advance for doing so.

When we started BIDA we had the idea that BIDA was going to be a well known brand world wide like Nike, or Google.

Nike has the slogan, just do it.

You google things on the internet.

We set out with the vision that BIDA will come to mean, you ask your data a question.

“BIDA your data today” is the idea we had when we founded our company.

With this idea of pseudo dimensional models over large operational systems?

You can BIDA your data today.

You can ask your large operational system any questions you want, and you can get answers back in minutes or hours rather than days or weeks.

To BIDA your data is to ask it a question, and get the answer you need to the “just thought of question”.

We are very pleased, and proud, to bring you this idea.

You can download the demonstration databases from the buttons on the blog post.

If you do not have the Business Central twenty twenty three demonstration database?

We have also included the download for it on a separate button.

Your I T colleagues can restore the Business Central twenty twenty three demonstration database, and the direct query database, on a development SQL Server for you.

Then you can review it to understand how this all works.

Your I T colleagues will understand what has been done.

They can also explain it to you.

And with that?

I would like to say thank you very much for watching our video today.

I really appreciate your time and attention.

We hope you found this blog post interesting and informative.

We would love to hear your comments, so please feel free to use the comments section below.

I wish you a great day.

Thank you.

Sofi.

Your BIDA AI Assistant.

We will be emailing our subscribers useful and
valuable information regarding business intelligence.
Please subscribe here to be on our general emailing list.

BIDA0049 – Business Central Balance Sheet in Excel


Please watch the video in 1080p because it is recorded in High Resolution.
Please note the script of the presentation is below so you can read along with the demonstration if you like.


(Note. We have swapped my photo for the cartoon character. I hope you like it!)

Hello and welcome to our latest blog post.

I am Sofi, your BIDA AI Assistant.

I will be reading this blog post for you today, on behalf of Mihai.

I am really pleased to be able to bring you some news that I think will really interest you.

This video is intended for companies who have Business Central 2023 installed on premise.

This video is a follow on from the previous videos, that were talking about being able to query Business Central using Meta five and sending the results into Excel.

So, as ever, on with the demo!

Demonstration

Here we are in the Excel version of the Balance Sheet generated out of Business Central.

You are welcome to download this workbook, and all the related materials, from the button below.

Download Materials

What we want to show you is that we have the categories and lines in the Balance Sheet, that are defined inside the Business Central demo system.

Now I will show you that this data is in the pivot table inside this workbook.

Since you can download this workbook and see the results for yourself, we will make this demo very brief in terms of showing you the power pivot model.

Now I will go to power pivot in the menu bar.

Now I will click on manage the data model.

Now I will click on the month table.

You can see the months.

The first day of the month is cast to Excel’s internal date number, which is not an issue.

Now I will click on the account schedule name table.

This is in the workbook redundantly just for demonstration purposes.

Now I will click on the account schedule lines.

This is to show you that the account schedule lines are available in their own dimension table.

Lastly, I will click on the fact table in the power pivot model for the account schedule lines.

When I scroll to the right you will see the various amount columns.

When we scroll to the bottom we will see these amount columns contain values.

Now I will filter on Balance Sheet and scroll to the bottom.

You can see that there are values in the amount and credit amount columns.

These amounts are coming from the underlying demonstration database for Business Central.

Now I will click on diagram view.

You can see from the diagram view that you have the four tables mentioned when I described the data that was in each table.

The important thing for you to understand is this.

This data has come directly from Business Central using only Meta five and SQL.

The SQL was written by hand.

Meta five then sent the SQL to Business Central and put the results into this workbook.

What we want you to understand is this.

Any question you have that you would like to get answered from Business Central?

You can now get that data into the Excel Power Pivot model to answer any business question you have.

You do not need to have the report written in Business Central.

All you need is someone who knows the Business Central database, who knows Meta five, and who knows Excel.

They could be the same person, and they could be different people.

Now I will close the power pivot model.

The demonstration has two different worksheets in it.

We will just show you that we can click on the drill for 2024 and we will get the detailed rows.

Now I will click on the plus sign next to 2024.

You can see the totals for operating equipment and operating equipment depreciation to date.

The numbers are not exactly as you get in the demonstration database as we have not included all the rows in this demonstration.

We did take some short cuts in the SQL as we just wanted to demonstrate what was possible.

Now I would like to show you how this workbook was created.

We will go across to our Meta five development machine.

Here we are on the desktop of our Meta five development machine.

You can see a capsule on the video.

This is the sort of capsule that one of your IT people can create, or we can create, for you.

Across the top of the capsule you can see 4 query icons with the name “Get SQL”.

These icons will retrieve SQL that is stored in your master database.

Your master database can be in the cloud and supported by us if you wish.

I will open the first get SQL Icon and show you what is inside it.

You can see variables like at A V, at A W, at A A, and at A C.

These variables allow us to write a capsule on our development server and deploy it, without any changes, on your Meta five server.

Basically, these variables mean that we can create capsules on our development server and sell them to any customers who have the system that the capsule is intended to be used on.

For example, we can create capsules for Business Central on premise users, and that capsule can be used by any Business Central installed account.

You can see the name, B C 3 9 9 9, read V M month, zero one.

This is a piece of SQL that lives in the master database.

Anyone who uses this capsule would use that piece of SQL.

In this way we can write SQL once, store it in the cloud, and all of our customers can use that piece of SQL.

Now I will click on show data.

It is a little hard to read the SQL returned.

But you can see that a select statement was returned.

Now I will close the query icon.

Now I will open the SQL one text document.

You can see that this is the select for the V M month table.

This SQL is in the download available from the button on the blog post.

This SQL will return the data that will eventually go into the power pivot model.

Now I will open up the text file, SQL one zero one.

You can see this is simply a select statement from a view to retrieve the G L account schedule lines.

You can then see icons called Run SQL, for each of the four text files.

These icons send the SQL retrieved from the master database to Business Central, and return the data into the spreadsheets next to them.

These four spreadsheets are redundant; they are included only for demonstration purposes.

I will now open the fact table spreadsheet that contains the account schedule lines.

You will see that the data has been returned by running the SQL that was in the master database.

I will just scroll around the spreadsheet and you will see the data that has been returned.

Now I will open the collect data spreadsheet.

I will scroll to the right and you will see there are various sets of data returned to the collect data spreadsheet.

All the data that is needed in the Excel workbook is sent into this collect data spreadsheet.

Of course, you could do much more manipulation of the data inside Meta five if you wanted to.

Eventually the data is sent into regions inside the Excel workbook.

We wanted to show you this portion of the demonstration for these reasons.

Any SQL that any SQL developer can write, can be stored in the master database.

That SQL can be read, parameters can be resolved at run time, and it can be sent to Business Central.
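
Meta five does the variable resolution itself inside the capsule, but the same idea, expressed as a T-SQL sketch with hypothetical names and placeholder tokens, looks like this.

-- Fetch the stored SQL, swap the placeholder tokens for real dates, then run it.
DECLARE @sql nvarchar(max);
SELECT @sql = sql_text
FROM   dbo.sql_files
WHERE  sql_name = 'BC12003_sales_lines';
SET @sql = REPLACE(@sql, '{BEGIN_DATE}', '2024-01-01');
SET @sql = REPLACE(@sql, '{END_DATE}',   '2024-12-31');
EXEC sys.sp_executesql @sql;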

To create a power pivot model in Excel, the SQL developer needs to write the SQL statements that will return the data needed in the Excel Power Pivot Model.

Because this SQL can be very complex, as we will show you, you can get any answer that Business Central can provide, and Excel can present.

The limitations on your questions are SQL, Excel, and your imagination.

To make it even easier to get the answers you need from Business Central, we will soon be publishing dimensional models over the top of Business Central.

We have done a lot of work in that direction, and we just have to publish the demonstrations we have developed.

Now I will take you to the view that creates the lines for the fact table, that are then put into the power pivot model.

As a business person, you will not be able to read this SQL.

It is very complex.

However, excellent SQL developers only have to write this piece of code once.

Then it can be read from our cloud master database and used by any customer.

The SQL is in the materials that you can download from the blog post.

You can give it to your I T support to show them how complex the SQL needed to answer business questions can be.

Now I will just scroll down this SQL and then scroll back up.

In this way you can see that this is nine hundred lines of SQL.

In Summary

Now, I would like to summarize what we have shown you today.

We all know that generating the Balance Sheet, from the definitions created inside Business Central, is an important function delivered by Business Central.

We all know that the actual Balance Sheet generation requires code inside Business Central.

We all know that the code to generate the Balance Sheet is complex.

What we have shown you today is that it’s possible for a good SQL developer, who knows Business Central very well, to produce the Balance Sheet using Meta five and Excel.

The purpose is not to produce the Balance Sheet.

The purpose is to show you that a report that can be produced, can be as complex as the Balance Sheet.

This demonstration proves that you can get any question you want answered from Business Central, using just Meta five and Excel.

The role of Meta five is as follows.

One. To read pre written SQL statements from the master database.

Two. To perform variable resolution at run time to apply constraints to this execution of the SQL.

Three. To send the SQL to Business Central.

Four. To send the answers from Business Central into Excel.

Five. To re-load the power pivot models in Excel from the new data.

Six. To send the finished Excel workbook to the people who wish to have the workbook.

One of the most pressing problems in Business Central installed accounts, is getting the data from Business Central into Excel, so that business questions can be answered.

If you are in the position in your company that you have questions you want answered from Business Central, but your I T staff are telling you that it is very expensive, or impossible?

Then you might like to try out Meta five, and see for yourself how easy it could be.

We have an evaluation machine that has the Business Central twenty twenty three database installed.

You can test that out for yourself.

If you want Meta five on your premises to try this out?

We can arrange for a thirty day trial copy that can be installed on a VM on your premises.

We believe that once you play around with Meta five for a few weeks, you are going to want to have your own copy on your PC.

If you would like to get access to our evaluation machine to try Meta five for yourself?

Then please contact Mihai to book your time on our evaluation machine.

And with that?

I would like to say thank you very much for watching our video today.

I really appreciate your time and attention.

We hope you found this blog post interesting and informative.

We would love to hear your comments, so please feel free to use the comments section below.

I wish you a great day.

Thank you.

Sofi.

Your BIDA AI Assistant.

We will be emailing our subscribers useful and
valuable information regarding business intelligence.
Please subscribe here to be on our general emailing list.

BIDA0048 – Business Central Beginning of Time GL Balances in Excel


Please watch the video in 1080p because it is recorded in High Resolution.
Please note the script of the presentation is below so you can read along with the demonstration if you like.

(Note. We have swapped my photo for the cartoon character. I hope you like it!)

Hello and welcome to our latest blog post.

I am really pleased you have come back to see our latest video!

I am Sofi, your BIDA AI Assistant.

I will be reading this blog post for you today, on behalf of Mihai.

I am really pleased to be able to bring you some news that I think will really interest you.

This video is intended for companies who have Business Central 2023 installed on premise.

This video is an update on the BIDA forty six blog post.

We have updated the Excel workbook to make the demonstration even better.

So, as ever, on with the demo!

Demonstration

Here we are in the BIDA General Ledger Beginning of Time Balances Demonstration workbook.

You can download this workbook from the button below.

Download Materials

Firstly, you can see that we have allowed for 5 years of beginning of time balances.

The data being used is the version 23 demonstration database, which only has two years of data in it.

The headings and GL account name panes are frozen.

We will scroll to the right, so we are starting in 2023.

We will scroll down to 1999, fixed assets total.

This is because this line gives us a nice reporting line.

Now I will click on the plus sign next to the GL Account code.

Now I will click on the plus sign next to 2024.

You can see that there is no beginning of time amount for 2023 because it is zero.

An empty cell means zero.

In December 2023, the transaction amounts totaled one million, three hundred and twenty one thousand and two hundred and seventy eight dollars.

You can see this amount became the beginning of time amount for January 2024.

Then the transaction amount for document type zero, in January 2024, was thirty six thousand and forty five dollars.

This is added to the beginning of time amount to get the amount of one million, three hundred and fifty seven thousand and three hundred and twenty three dollars, for the beginning of time value for February.

We are pretty certain we do not need to explain to you how beginning of time balances work.

What we would like to explain to you, is that you can calculate such a complex thing as beginning of time balances going back 5 years, using a SQL query.

You can then send the data for that query into Excel at a summary level.

Once you have the data in Excel you can present it in any way you please.
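
As a rough sketch of the idea, assuming a view that already holds one row per GL account per month, a windowed sum gives the beginning of time balance. The view and column names here are illustrative, not the ones in the download.

-- Everything posted before the current month, per account, is the opening balance.
SELECT gl_account_no,
       posting_year,
       posting_month,
       COALESCE(SUM(month_amount) OVER (PARTITION BY gl_account_no
                                        ORDER BY posting_year, posting_month
                                        ROWS BETWEEN UNBOUNDED PRECEDING AND 1 PRECEDING), 0)
           AS beginning_of_time_balance,
       month_amount
FROM   dbo.vf_gl_monthly_amounts
ORDER  BY gl_account_no, posting_year, posting_month;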

Now we will scroll down in the workbook, just to show you all the other GL accounts from the demonstration database are in the report.

Now we will click on the minus sign next to 2024.

Obviously, from this demonstration, you can see that you can select any combination of GL account codes from Business Central, and you can report on them in any way you like in Excel.

If you work in the accounting area of Business Central, you have probably not seen Beginning of Time Balances calculated like this without considerable effort from yourself or your IT department.

Such complex calculations as Beginning of Time Balances can now be done quite easily, if you use Meta five to send the query into Business Central and to send the results to Excel.

You can run this query right on top of your production Business Central database.

Or, you could ask your IT department to create a replica database, so that running this query does not interfere with production performance.

In either case, this demonstration shows that you can get started with much more complex queries than you might have thought possible.

Once you are doing more and more complex queries on your replica production database, and you have come to understand the value of creating a data warehouse?

Then you might be able to more easily get a budget approved to implement our data warehouse product, that is still under development at this time.

For small companies who use Business Central, it is very possible that quite a wide array of reports can be written without even needing to create a replica copy of your Business Central database.

Please note.

We have included in our blog post, all the buttons needed to download all components of this report.

You are welcome to ask your IT colleagues to review the SQL that was used to create this report.

In Summary

Just to keep you interested?

We have another very interesting demonstration in development.

This demonstration shows you that it is possible to create the balance sheet from Business Central in Excel using this sort of idea.

So, if you want to be notified when that demonstration is available?

Please feel free to join our email list.

Or please grant us permission to put you on our email list.

With that?

I would like to say thank you very much for watching our video today.

I really appreciate your time and attention.

We hope you found this blog post interesting and informative.

We would love to hear your comments, so please feel free to use the comments section below.

I wish you a great day.

Thank you.

Sofi.

Your BIDA AI Assistant.

Please note. If you would like to ask the IT staff who support you to read a more detailed blog post? You can give them the link on the button below.

BB0010 Blog Post

We will be emailing our subscribers useful and
valuable information regarding business intelligence.
Please subscribe here to be on our general emailing list.