BIDA0047 – BC 2023 Excel Dictionary Demo

Please watch the video in 1080p because it is recorded in High Resolution.
Please note the script of the presentation is below so you can read along with the demonstration if you like.


(Note. We have swapped my photo for the cartoon character. I hope you like it!)

Hello and welcome to our latest blog post.

I am really pleased you have come back to see our latest video!

I am Sofi, your BIDA AI Assistant.

I will be reading this blog post for you today, on behalf of Mihai.

I am really pleased to be able to bring you some news that I think will really interest you.

This video is intended for companies who have Business Central 2023 installed on premise.

This video is a follow up to the prior video demonstration of the Navision 2018 Excel Dictionary database.

The obvious question that arises from prior demonstrations is this.

“If we have access to so many tables and fields in Business Central, how do we make it easier to find those fields because we can’t possibly remember them all.”

And that is where today’s video comes in.

If you are wondering how you might more easily search all the fields available in Business Central, then you should watch today’s video.

So, as ever, on with the demo!

Download Dictionary

Demonstration

Here we are in a simple Excel spreadsheet.

This spreadsheet has two sides to it, the left and the right.

You can download this spreadsheet using the button on the blog post.

On the left-hand side is a table of categories.

These are the same categories you saw in the prior demonstration.

They are drillable, so I can click on a drill and it will open up to show the tables.

So, let’s drill on Financial Management.

Please note that the very first time you click, Excel will load the Power Query model, so it will take a second or two to respond.

Now you can see all the tables that are listed in the Financial Management Category.

We will scroll down to the GL Account table.

As you would expect, when we click on the GL Account table, we will get all the fields in the GL Account.

As an Excel user the rest would be obvious to you.

Since we would love you to download the workbook and play around with it yourself, we won’t go into too much detail of this simple drill down.

All the categories, all the tables, all the columns are there.

If we ever get good help text and descriptions?

Then we will include them in this workbook.

Now we will scroll back up to the top of the workbook.

We will close the Financial Management Category.

Please note at the bottom of the categories list we have two new categories.

They are:

One. 98, System Tables.

And two. 99, Not in Category.

In the prior Navision dictionary we excluded system tables.

We now think that it’s useful to have them in the dictionary.

For the Not in Category Tables, we have not yet gone through and moved these tables into categories.

If you think you have a good suggestion for which categories these tables should go in?

Please feel free to use the comments below or contact us and let us know your proposals.

Over time we expect to be able to move all the tables into a suitable category.

We know this is a matter of opinion and some people will have differing opinions.

Now we will show you something we think is really pretty cool.

Excel drills and searches are implemented in this workbook.

So, next to Table there is a drill.

I will click on this drill.

Now we can see there is a search option for all the tables in the dictionary.

I will enter GL and it will search for all the tables with GL in the name.

I will click off “Select All Search Results” to have no tables selected.

Then I will scroll down and select the GL Account table again.

When we click on the GL Account and click OK, we will be returned to the report.

The GL Account table is now selected but it may not be visible in the table.

You might have to click on Financial Management again to have GL Account visible.

Once you do that then you can click on GL Account to make the fields in the table visible.

Please note we have also numbered the columns in the order in which they appear in the tables.

Just like any other Excel report you might create, the dictionary is fully searchable in this table.

Now, we will clear those selections.

Now I would like to draw your attention to the right-hand side of the report.

You can see there is a bubble saying, “search a table”.

This provides a second way of searching for a table that you might like more.

You can see GL Account is selected.

In this search bar you can delete the existing text and type in your own search term if you wish.

We recommend you download the spreadsheet and give it a try because it’s quite a new idea.

We have not seen this idea before.

What I will do is go to the search text and enter just GL.

I will again select GL Account.

Now it will show me what category GL Account is in, and it will show me all the fields in GL Account.

In this way you have a second way to search for tables using this filter and drop down.

We thought that in this first release of this workbook we would give you both options.

Please feel free to let us know which one you prefer in the comments section at the bottom of this blog post.

Of course, the value of this spreadsheet is this.

As you are creating queries on the Business Central database using SQL, you can refer to this spreadsheet.

This will help you to easily find the tables and fields that you are looking for.

Then, when you search the spreadsheet and know exactly what fields in what tables you want, you can more easily locate them in the SQL Server data dictionary.

Also, the actual SQL names are the same as in this dictionary, only with underscores where the spaces are.
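As a small illustration of that convention, here is a Python sketch. The example names below are ours, and the rule shown is only the spaces-to-underscores translation described above.

```python
# The SQL name is the dictionary name with underscores where the spaces
# are. The example names are illustrative, not from a real schema.
def dictionary_name_to_sql(name: str) -> str:
    return name.replace(" ", "_")

print(dictionary_name_to_sql("GL Account"))    # GL_Account
print(dictionary_name_to_sql("Posting Date"))  # Posting_Date
```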

In our own development work, we have this spreadsheet on our left screen as we write SQL.

You might want to try that out for yourself.

If you would like to try this out for yourself on your PC?

Then please contact Mihai to book a time on the evaluation machine.

You can also download this spreadsheet from the button on the blog post.

You can try for yourself how easy it is to query data directly from Business Central using SQL while also having this dictionary available to you.

Further, if you would like one of these spreadsheets for your company to help you query your Business Central system?

We will prepare this spreadsheet for qualified Business Central installed companies.

So please feel free to contact me to see if you meet our qualification criteria.

This report takes a little while to create.

So, we will only create a limited number of these reports as “freebies”.

In Summary

So, to summarize what I have shown you today?

In our prior video we showed you that it is now possible to perform direct query to all data in your Business Central system using the Meta5 query tool and SQL.

However, because there are so many tables and so many columns it’s really handy to have a searchable dictionary right there on your left screen.

We created this report to be a searchable dictionary for any customized Business Central system.

It provides two ways to query what tables and columns you have available.

On the left-hand side, it allows you to drill down or to use the Excel search and drill functions.

On the right-hand side, it allows you to search for tables containing the search string.

We are finding this very useful in our own development work.

We think you will find it useful too, even if you don’t have Meta5.

With that?

I would like to say thank you very much for watching our video today.

I really appreciate your time and attention.

We hope you found this blog post interesting and informative.

We would love to hear your comments, so please feel free to use the comments section below.

I wish you a great day.

Thank you.

Sofi.

Your BIDA AI Assistant.

Download Dictionary

We will be emailing our subscribers useful and
valuable information regarding business intelligence.
Please subscribe here to be on our general emailing list.

BIDA0046 – Business Central Beginning of Time GL Balances in Excel

Please watch the video in 1080p because it is recorded in High Resolution.
Please note the script of the presentation is below so you can read along with the demonstration if you like.


Hello and welcome to our latest blog post.

I am really pleased you have come back to see our latest video!

I am Sofi, your BIDA AI Assistant.

I will be reading this blog post for you today, on behalf of Mihai.

In this blog post we would like to show you how it is possible to create Excel reports, from Business Central on premise, using direct query.

This demonstration is intended to be viewed by business analysts, who are responsible for asking questions and coming up with new business proposals, based on the answers to those questions.

For this reason, the demonstration will be less technical.

If you would like to ask your IT staff who support you, to read a more detailed demonstration, you can give them the link on the button below.

BB0010 Blog Post

Demonstration

And now, on with the demonstration.

Here we are on our Meta5 evaluation machine on Azure.

You can get access to this machine by contacting us and booking time on the machine.

We would like to show you a number of things.

Firstly, we will show you the Excel workbook, with the finished data in the workbook.

This is so that you know what the target data set is.

Then you can more easily understand the other things we will show you.

Here we are inside the Meta5 capsule, which can be scheduled to run overnight.

At the bottom right hand corner, we can see the Excel workbook.

So, we will open the Excel workbook.

Now I will show you each of the worksheets.

In the month worksheet, you can see we have a primary key, first day of month, month name and month description, as well as many more fields.

In the general ledger account worksheet, we have a copy of the business central general ledger accounts.

In the general ledger document types worksheet, we have manually defined document types.

In the general ledger beginning of time account balances worksheet, we have the data from the general ledger entries.

Now, we will go into the power pivot model, to show you that the data is also in the power pivot model.

We will just click on each table, to show you that the data is in the power pivot model.

We will also show you the power pivot model, so that you can see that the fact table is joined to the dimension tables.

You will notice that the keys on the fact table are at the end of the table, rather than at the front of the table.

This is because of how Meta five joins data.

This makes no difference to the performance of the power pivot model.

What we would like you to understand, from this portion of the demonstration, is this.

As long as it is possible for your business central system to support the query, you can get any data you ask for, into Excel.

The best way to deal with such data is in Excel Tables, or Power Pivot models.

It is important for us to mention that this is not as good as having your own data warehouse.

However, if you do not have a data warehouse, this is the best possible way to get data into Excel, or Power BI, for complex questions that cannot be answered in other ways.

Next.

Now we will go back to the VM that this demonstration is running on.

We will show you that there is a Business Central demonstration database, for version 23, installed on this machine.

You can see the tables inside Business Central.

Don’t be too concerned about the strange names.

We will scroll down to the General Ledger Entry table.

We will select the top 1000 rows.

You can see the names of the columns and the data returned.

You can understand this is a demonstration copy, of what you will have installed, at your company.

Next.

We would like to show you that we have a direct query database, where the names of the tables are much easier to understand.

We will scroll down to the General Ledger Entries table.

This is so you can see the difference between the Business Central Table, and the one in this query database.

We will select the top 1000 rows of this query General Ledger Entries table, so that you can see it is the same data.

The renaming is just to make it faster, and easier, for technical people to write the SQL, to answer your questions.

We will just show you the view that allows this simpler querying of the database.

You can see on the left-hand side, the Business Central names.

You can see on the right-hand side, the easier-to-read names.

In the view’s FROM clause, you can see the real name of the Business Central table.
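To give a feel for what such a renaming view looks like, here is a minimal sketch using SQLite from Python in place of SQL Server. The table and column names are invented for the example and are not the real Business Central names.

```python
import sqlite3

con = sqlite3.connect(":memory:")

# A stand-in for the raw operational table with harder-to-read names.
con.execute('CREATE TABLE "G_L Entry" ("Entry No_" INTEGER, "G_L Account No_" TEXT, "Amount" REAL)')
con.execute('''INSERT INTO "G_L Entry" VALUES (1, '1000', 250.0)''')

# The view aliases each column to an easier-to-read name; its FROM
# clause names the real underlying table.
con.execute('''
CREATE VIEW General_Ledger_Entries AS
SELECT "Entry No_"       AS Entry_Number,
       "G_L Account No_" AS GL_Account_Number,
       "Amount"          AS Amount
FROM "G_L Entry"
''')

row = con.execute(
    "SELECT Entry_Number, GL_Account_Number, Amount FROM General_Ledger_Entries"
).fetchone()
print(row)  # (1, '1000', 250.0)
```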

Next.

We would like to show you that it is possible to add extra data, into the hosting master database.

This extra data can be joined to your Business Central data, or any other data, that you have access to.

The most common use of this extra data is to hold descriptive information for days, weeks, months, quarters, and years.

In the demonstration we will now show you the view in the hosting master database for days.

We will show you the view in the hosting master database for weeks.

We will show you the view in the hosting master database for months.

We will show you the view in the hosting master database for quarters.

We will show you the view in the hosting master database for years.

Your IT people can put any data you wish, into this database, to help produce better reports.

You can have other common tables put into this database, such as gender and salary ranges.

Because operational systems are often missing descriptive data, we have made it possible to store such descriptive data, in this hosting master database, and then use it in reports.

You can also use spreadsheets to load descriptive data into reports that you write, or ask your IT staff to write.
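As a sketch of what a month-level descriptive table can hold, here is a small Python example. The exact columns in the hosting master database are not shown in the demonstration, so these fields are assumptions.

```python
# Build illustrative month-dimension rows: a month key, the first day
# of the month, and a month name, as described in the demonstration.
from datetime import date
import calendar

def build_month_rows(year: int):
    rows = []
    for month in range(1, 13):
        rows.append({
            "month_key": year * 100 + month,           # e.g. 202401
            "first_day_of_month": date(year, month, 1),
            "month_name": calendar.month_name[month],  # e.g. "January"
        })
    return rows

rows = build_month_rows(2024)
print(rows[0]["month_key"], rows[0]["month_name"])  # 202401 January
```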

Next.

As you know, one of the reasons that companies create data warehouses, is to allow business analysts like you, to answer your own questions.

Operational systems have databases that are too complex, for most business analysts to understand and query.

There is a significant chance that you, as a business analyst, will get the wrong answer if you write your own SQL against an operational system database.

So, what you are about to see is a compromise, that allows you to get your questions answered, without your company having to invest, to build a full data warehouse.

You have seen the example workbook earlier in the demonstration.

We will briefly show you the query, that produces the resultant data.

We will scroll down this SQL.

You are welcome to stop the video and review the SQL.

You can also ask your technical support, to download it from our website, if you really want to read it.

You don’t need to worry about this SQL.

All you really need to know about the SQL code is that you, and your peers, cannot write this code.

This code needs to be written by IT people, who have profound knowledge and experience, of Business Central.

However, what you can do is ask for this code to be written.

This code can generally be written in a day or two, depending on how complicated it is.

You have a choice to ask your own IT department, to write such code, or ask a third party, such as BIDA, to write this code for you.

The important part is that the code can usually be written quickly.

You can get your answers to your questions quickly.

If the report answers your questions, and you want the report to be completed and put into a regular schedule?

Then you can work with your IT department, or a third party, to improve and finalize the report.

Once you are satisfied with the report, you can have it added to the daily schedule.

Next.

We would like to go back and show you the capsule, that is used to create the report.

This is just to show you that it is not that difficult, to create such a capsule.

It is more difficult to write the SQL query, against the Business Central database.

The first line is fetching the SQL from the hosting master database.

SQL file one is for the query against the GL Entries.

SQL file two is to get the GL Account table itself.

The second line runs the SQL one file to fetch the GL Entries, producing the temporary beginning of time balances data stream, and sending it into a spreadsheet.

The third line fetches the month level keys, so that the posting date can be linked to these keys.

The join icons put the integer keys from the lookup tables into the data stream, for the beginning of time balances table.

The fourth line fetches the general ledger document type keys, to be able to join the document types, to the beginning of time balances data stream.

The fifth line runs the SQL two file which fetches the GL Account table, and puts it into the general ledger account, as a dimension table.

You don’t need to worry about the column select, remove header, and compress, on the fifth line.

You can just take it to be a small amount of data manipulation.

On the sixth line, you can see we fetch the month data from the host master database.

And on the seventh and last line, you can see we fetch the general ledger document type table, from the host master database.

You can see all this data is collected, into the collect data spreadsheet.

It is then sent into the Excel regions in the workbook.

Now we will just run the capsule as part of the demonstration so you can see it run.

This sort of report would be run overnight.

You will simply have the resulting spreadsheet in your OneDrive each morning.

With Meta5, your IT department can create Excel reports from the underlying Business Central database, much more quickly than any other approach.

More importantly, third party vendors will be able to create complex analytical dashboards for Business Central, and sell them on the open market.

Given this, it is in the interest of all Business Central on premise customers to ask for, and encourage, third party vendors to compete, to develop dashboards similar to the one we have demonstrated today.

For our part, we are willing to make this limited time offer.

We will write dashboards for installed Business Central accounts against the demonstration database, for no charge.

This is as long as we are able to take that intellectual property, and resell it to other Business Central accounts.

We can create them on this Azure VM.

You or your technical people can review and evaluate the report, as it is being developed.

In this way early adopters can get free applications in a win-win situation.

If we are allowed access to your Business Central database, then that will help.

However, we can write most reports against the demonstration database.

Then you can test the query against your Business Central database.

This concludes our video demonstration today.

In Summary

In this demonstration video we have shown you that it is possible to write a complex Excel report against your on premise Business Central database faster, at a lower cost, than has ever been possible before.

Even a report as complex as a beginning of time, monthly level, general ledger report, can have the underlying queries written in a day or two.

Then the development of the Excel dashboard can be done.

Further, third parties can develop these reports and sell them into the marketplace at a very low cost.

All this is to say, Meta5 changes the game for getting an answer to the just-thought-of question, for Business Central installed accounts.

You can get answers to questions that you have never been able to answer before, in hours, or a day or two, using Meta5.

With that?

I would like to say thank you very much for watching our video today.

I really appreciate your time and attention.

We hope you found this blog post interesting and informative.

We would love to hear your comments, so please feel free to use the comments section below.

I wish you a great day.

Thank you.

Sofi.

Your BIDA AI Assistant.


BIDA0045 – Meta5 And Complex CTE Queries


Hello and welcome to our latest blog post.

I am really pleased you have come back to see our latest news!

This blog post will be very short so we have not created an audio.

Please note that this blog post is particularly technical and is addressed to technical people.

We did think about putting it only on BIDA Brains but we felt we would put it here as well.

For business people this post means:

You can ask your operational systems any question you like, as long as the system can handle the processing of the query.

Now that we have published everything needed to create “Meta5 as a Black Box” we just needed to test one last thing to make sure all was ok.

That test was to make sure Common Table Expressions worked from Meta5 to SQL Server.

Of course, we expected it to work, but we have to make sure by testing anyway.

So for all you technical people out there?

We have tested the CTE required to produce the Business Central Chart of Accounts report.

You can download the code from the button below.

Please note we formatted the SQL using Notepad++.

You might want to use Notepad++ to read it to get the spacing correct.

Download CTE SQL

The way we did this was as follows.

We turned on SQL Tracing and then ran the Chart of Accounts report.

We captured the SQL sent into SQL Server.

We then translated it to use the Direct Query table and column names we have defined.

We created the query using both outer apply and left joins.

We got the same result with both queries.

However, we are only using the demo database so there could still be errors in it.

Please note. In the Business Central report, the report itself creates the summaries for the parent GL Accounts.

So our CTE query includes the work to create the summaries for the parent GL Accounts.
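To show the shape of such a rollup, here is a much-simplified recursive CTE run against SQLite from Python. The schema and numbers are invented for the example; the downloadable query is considerably more complex.

```python
# Parent GL accounts get their balances summed from their children via
# a recursive CTE. This is not the downloadable query; the schema here
# is invented for illustration.
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE gl_account (no TEXT PRIMARY KEY, parent_no TEXT);
CREATE TABLE gl_entry   (account_no TEXT, amount REAL);
INSERT INTO gl_account VALUES ('1000', NULL), ('1100', '1000'), ('1200', '1000');
INSERT INTO gl_entry   VALUES ('1100', 40.0), ('1100', 10.0), ('1200', 50.0);
""")

# For each account, walk down to every descendant, then sum the posted
# amounts so parent accounts show the totals of their children.
sql = """
WITH RECURSIVE tree(root_no, no) AS (
    SELECT no, no FROM gl_account
    UNION ALL
    SELECT tree.root_no, a.no
    FROM gl_account a JOIN tree ON a.parent_no = tree.no
)
SELECT tree.root_no AS account_no, SUM(e.amount) AS balance
FROM tree JOIN gl_entry e ON e.account_no = tree.no
GROUP BY tree.root_no
ORDER BY tree.root_no
"""
balances = dict(con.execute(sql).fetchall())
print(balances)  # {'1000': 100.0, '1100': 50.0, '1200': 50.0}
```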

What does this mean for technical people like you?

This means that you can use “Meta5 as a Black Box” to send any CTE query into any database that supports it and get the result back.

Basically, because CTEs can be so complex, you can get virtually any answer from any database that supports CTEs.

The obvious constraint is that running the query does not interfere with other processing the database is required to be doing.

This means “Meta5 as a Black Box” + CTEs + Excel / PowerBI can produce a very large array of reports from operational systems without the need to create the data warehouse first.

Sure, this is not a replacement for the data warehouse proper.

This is a way to create Excel / PowerBI reports by querying operational data where the interference on the operational processing is tolerable.

This is interesting to technical people because it means you can write the report one time and then sell it many times to many users of the operational system you are an expert in.

We believe this “Direct Query” using “Meta5 as a Black Box” sending CTEs into databases and then sending the resultant data into Excel / PowerBI is going to be a very large market.

No other product provides such seamless integration between Excel / PowerBI and complex CTE derived data as Meta5 does.

But don’t take my word for it.

Read the CTE from the button above.

And if you want to see it working for yourself?

We can give you access to our Meta5 Evaluation server and you can run the query for yourself.

Ok?

With that I would like to say thank you very much for reading our blog post today.

I really appreciate your time and attention.

Best Regards.

Mihai Neacsu.

Business Development Manager.

BIDA.


BIDA0044 – Meta5 As A Black Box Demo

Please watch the video in 1080p because it is recorded in High Resolution.
Please note the script of the presentation is below so you can read along with the demonstration if you like.


Hello and welcome to our latest blog post.

I am really pleased you have come back to see our latest news!

I am pleased to bring you some news that I hope will interest you.

I am Sofi, your BIDA AI Assistant.

I will be reading this blog post for you today, on behalf of Mihai.

Today, we have some really big news, for former Metaphor employees and customers.

We have been very busy on a number of fronts.

One of those fronts is this.

We have been trying to find a way to create “Meta5 as a Black Box”.

I am very pleased to report that we have been successful.

Today, we are releasing our very first customer demonstration of Meta5, as a Black Box.

So, if you are a former Metaphor employee, or customer?

We think you will really enjoy this demonstration.

So please watch our new demonstration.

If you are new to Metaphor, but you are knowledgeable in business intelligence?

We think you will also really enjoy this demonstration.

So we would also ask you, please watch our demonstration.

This demonstration shows how anyone, who has profound knowledge of a large operational system, could leverage that knowledge by selling Excel Dashboards, created from the data in that operational system.

What we are going to take you through today is how Meta5, as a black box, works.

So.

On with the demonstration!

Demonstration

Here we are on our development machine.

The first thing we want to show you is that we have a master database that contains parameters for Meta5.

We mentioned this in a prior post.

On the video, you can see the master database.

You can see that inside the master database, there are just three tables.

You can see there is a SQL statements table.

You can also see there are two variables tables for Meta5.

These are the only three tables you need, in the master database, to be able to run Meta5 as a black box.

You will notice that we give them a schema name, so that this works on other databases, such as Oracle and DB2.

The first Meta5 variables table is for the site.

The second Meta5 variables table is for the individual capsules.

I will just take you through what is in each of the variables tables.

You can see that we have a simple select star from the master table variables one table.

You can see that what is in it is the name of the spreadsheet that it came from.

You can see it has a version number.

You can see it has a row number.

You can see it has a current flag.

So, if you wanted to keep a record of all the values a variable has had, you can just set the current flag to 0 for the old records.
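The current-flag idea can be sketched as follows; the table and column names here are illustrative, not the real Meta5 master database schema.

```python
# When a variable gets a new value, the old row is kept with current = 0
# and a new row is inserted with current = 1, preserving history.
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("""
CREATE TABLE capsule_variables (
    variable_name TEXT, variable_value TEXT,
    version INTEGER, current INTEGER
)
""")
con.execute("INSERT INTO capsule_variables VALUES ('@FB', 'run_old.yes', 1, 1)")

def set_variable(name: str, value: str) -> None:
    # Retire the current row, then insert the new value as current.
    cur = con.execute(
        "SELECT COALESCE(MAX(version), 0) FROM capsule_variables WHERE variable_name = ?",
        (name,),
    )
    next_version = cur.fetchone()[0] + 1
    con.execute(
        "UPDATE capsule_variables SET current = 0 WHERE variable_name = ? AND current = 1",
        (name,),
    )
    con.execute("INSERT INTO capsule_variables VALUES (?, ?, ?, 1)", (name, value, next_version))

set_variable("@FB", "run.yes")
history = con.execute(
    "SELECT variable_value, version, current FROM capsule_variables "
    "WHERE variable_name = '@FB' ORDER BY version"
).fetchall()
print(history)  # [('run_old.yes', 1, 0), ('run.yes', 2, 1)]
```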

You can see there is a group variable.

This is to allow sets of 208 variables.

You can have many groups of variables, but you must set all 208 variables in each group.

A single capsule can only retrieve variables at the group level.

Then you can see the @variable name used by Meta5.

This is what Meta5 knows the variable name as.

Then you can see the actual value of the @variable.

Then the rest of the fields are documentation.

We allow for giving a variable a name, a short description, and a long description.

This is used to make it easier for developers, to know what variables they should be using.

So, these variables will contain these values, for capsules that go to this group, in the master table.

You can see that there are dots in each variable, that is not yet allocated.

That dot will be sent into the capsule.

You can see things like ODBC connections.

You can see things like query folders.

If you scroll down you can see things like ODBC drivers.

You can see reporting folders.

You can see a series of dates, such as three months prior and four months prior.

You can see the signal file directory.

You can see the signal file name, which we will get to.

You can see the delete signal file variable.

You can see the sleep seconds variable.

You can see the capsule iterations variable.

And 60 seconds times 240 iterations is 4 hours.
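Putting the signal file, the sleep seconds, and the capsule iterations together, the waiting behaviour can be sketched like this; the function and file names are our own illustration, not Meta5 code.

```python
# Poll for a signal file, sleeping between attempts, for a fixed number
# of iterations. Optionally delete the file once it is found, so the
# signal only fires once.
import os
import tempfile
import time

def wait_for_signal(path: str, sleep_seconds: float, iterations: int,
                    delete_signal_file: bool = True) -> bool:
    for _ in range(iterations):
        if os.path.exists(path):
            if delete_signal_file:
                os.remove(path)  # consume the signal
            return True
        time.sleep(sleep_seconds)
    return False

# 60-second sleeps x 240 iterations = 14400 seconds, i.e. a 4-hour window.
assert 60 * 240 == 4 * 60 * 60

# Tiny demonstration with a zero-second sleep.
signal_path = os.path.join(tempfile.mkdtemp(), "run.yes")
open(signal_path, "w").close()
found = wait_for_signal(signal_path, sleep_seconds=0, iterations=3)
print(found, os.path.exists(signal_path))  # True False
```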

Lastly, you can see some system variables down at the bottom.

Anyone who has used Meta5 before will understand these variables.

For people who are new to Meta5, this is simply the set of site variables required to run Meta5 as a black box.

Now inside each capsule, you can also have variables.

The most common variable, inside a capsule, is the signal file.

The signal file is tied to the capsule name.

If you wish to delete the signal file, you also have to set the delete signal file variable to yes.

So, in this second variables table, it makes sense, that you have to set variables, for individual capsules.

The capsule we are going to test, is the all capsules template, and you can see it has a signal file defined.

You can see the delete signal file variable is set to yes.

And these two values will go to FB and FW.

Please note, we haven’t updated the descriptions and the timestamps, on this testing machine.

The real dashboard we’re going to show you, is the head office sales dashboard.

But we are going to show you the all capsules template first.

So with that, we will move over to the Meta5 desktop.

Here we are on our Meta5 desktop.

I have zoomed in to the upper left hand corner.

This is because this is a development machine, and there are objects on the desktop we do not want to publish.

So, if you are going to use Meta5 as a black box, this is how you would do it.

What we will do is take you through the example first.

You will have an applications file drawer.

The application we are working with is Business Central.

You will have a file drawer for reference capsules.

These are analogous to shareable program objects.

You will have a file drawer for template capsules.

You will also have file drawers for dashboards, data models, and metadata.

These are virtual filing cabinets, just like the real filing cabinets you would find in an office.

I am going to take you into the template capsules.

I am going to show you the 0001 all capsules template.

Just to show you that this capsule works, we will show you the run file, and then show you the run file being deleted when the capsule runs.

To schedule a capsule, all you need to do, is to copy it from wherever it is, and drop it on the capsule service prompt.

In the example we will say run now.

It will immediately run.

And you can see the run yes signal file is gone.

So the capsule executed right away.

Now I will put the run yes file back.

Now I will show you, that you can copy a capsule onto the capsule service prompt, and then submit it under a specified alias.

An alias can be any of the items you see on the screen.

At installation time, you decide what aliases you would like to define, for your desktop.

These aliases can be anything, however you don’t want to create too many of them.

So we are going to set this capsule to run at midnight.

Now we can go into capsule service prompt, we can click on show queue, and we can see the capsule that is scheduled for midnight.

Now we will click on show aliases.

You can see that all the aliases are set to periodic.

This means the capsules will be submitted at that time every day.

You can see there are variables for year, month, date, day of week, hour, minute, and second.

Capsules can also be set to run once on a schedule.

However, as I said, you do not want to create too many aliases, even though so many options are available.

So what we have shown you, is you can create any capsule, drop it on the capsule service prompt, and run it on a schedule, either periodic or one time.

Capsules can also be started based on a Windows command.

So you can just create a Windows command to start a capsule, and then run it any time you like.

To run a capsule instantly, without going into the Meta 5 desktop, you can just run a Windows command.

Now I want to show you how this works.

The first icon that you need to be concerned about, gives the capsule its name.

So we will open this icon, and you can see the GA variable has the capsule name in it.

Meta 5 has a concept called the reference capsule.

This is basically a reusable programmable object.

Reference capsules can be invoked from other capsules, so they do not need to be written multiple times.

In this example the first reference capsule sets the capsule controls.

I will show you the controls.

You can see the values of at A, at B, and at C.

I will remove the first three values.

I will close and open the user input controls, so that you can see the three values are truly gone.

Then we will run the first two icons in the capsule.

I will open the controls again.

You can see the values of the first three at variables are back.

This shows you, that this reference capsule, retrieves all 208 variables stored in the first variables table.

You don’t really need to know how this works.

You can just trust that it does work.

We then need to set the capsule name.

If we go to variable GA, we will see that the capsule name defaults, to just capsule name.

We need to set the actual capsule name, so that we can retrieve the capsule variables.

I will demonstrate how the capsule name is set.

Then we can fetch the individual variables, that the capsule needs, from the master database.

This is done by running the second reference capsule, into the SQL target.

When we open the SQL target, you can see the two variables, for this capsule.

In this case, it is the signal file name, and the variable to say delete the signal file, after execution.

I will show you that FB and FW, are not yet changed.

The last two icons, must be executed to apply the changes, in the SQL file.

And now you can see that they have been updated.

This line of icons sets all the variables for the capsule.

Now, we can iterate up to 240 times.

That is, if the signal file is not yet in the signal file folder, the capsule will wait for it to arrive.

If the signal file has not turned up in 240 waits, the capsule will terminate, and give up its slot in the queue.

If you have three capsules waiting, and a 4th capsule starts?

Then it must wait, for one of the prior three capsules to end, to get a slot, in which to execute.

The first thing that happens in the first wrapper, is to look for the signal file.

If it decides to run, it will run wrapper 2.

If it does not run, the capsule will pause for sleep seconds.
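The wait logic just described can be sketched as a short polling loop. This is a minimal illustration in Python, assuming a signal file path and timing values; it is not Meta5's actual implementation.

```python
import os
import time

def wait_for_signal_file(path, max_waits=240, sleep_seconds=60):
    """Poll for a signal file; give up after max_waits attempts.

    Mirrors the wrapper logic described above: if the signal file
    is present, the capsule runs; otherwise it sleeps and checks
    again. After max_waits failed checks it terminates and frees
    its slot in the queue.
    """
    for _ in range(max_waits):
        if os.path.exists(path):
            return True   # signal file found: run wrapper 2
        time.sleep(sleep_seconds)
    return False          # give up the slot in the queue
```

In the wrapper, a True result corresponds to running wrapper 2, and a False result corresponds to terminating and giving up the slot in the queue.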

Now we will go into wrapper 2.

Inside wrapper 2 you can see the capsule called, your actual capsule.

This is where you put the actual capsule, you want to build and sell, or perhaps even use for your company.

When your actual capsule is finished, the end capsule will terminate the capsule.

Of course, your actual capsule can contain anything.

At the very end of your actual capsule, you must put the final two icons to delete the signal file.

These two icons must be there for all capsules.

If the capsule variable for delete signal file is set to no, then the icons to delete the signal file simply will not run.

So, if you want to run a capsule, say at midnight, regardless of the signal file?

Then you create the signal file at capsule installation time, and you leave the delete signal file setting at its default of no.

Then the capsule will run at midnight, every night, because the signal file will always be there.

Now I will close the sample capsule, and go back to the applications folder.

I will show you a real world example, of what a sellable capsule might look like.

Remember, as a former Meta5 employee or customer, you could build these sorts of applications and sell them in the marketplace.

Of course, we would prefer you not to do Business Central.

Here is our Business Central file drawer again.

The first folder is for Business Central dashboards.

We have arranged our dashboards into 14 groups.

These are groups that are well understood by Business Central installed accounts.

They are in line with the processing areas of Business Central.

We have done our first dashboard in sales.

It is called the Head Office Sales Dashboard.

This is against our data warehouse models.

So it would run every night after the data warehouse was updated.

I will open the Head Office Sales Dashboard capsule.

The first portions of the capsule look the same as the template.

When we open up the SQL file, we will see that variable GA, is set to this capsule name.

Now I will run the icons on the second row of the capsule.

We can go to show controls.

We can see the signal file name, is set to the signal file, for this capsule.

So you can see the second line of icons, retrieves the correct variables for this capsule.

You can see wrapper 1 looks the same.

What I will do now, is take you into the capsule, and show you the actual capsule, that will be run.

Now, we are in a realistic capsule, that you can write and sell.

This capsule will make sense to most people.

You have the periods.

You have the Outlets.

You have the product categories.

You have the currencies.

Everyone who has done BI, will be familiar with these sorts of dimensions.

You can see the capsule for Plan versus Actual by month.

The capsule called this month calculations, contains a lot of complex calculations, to perform variance analysis by a variety of months.

You will see some of these as we go through the demonstration.

The way in which capsules using Excel work is this.

You query all your source data, perform any calculations needed prior to collection, and then send all data into the collect data spreadsheet.

The limitation is about 100,000 rows.

Above 100,000 rows, the data transfers tend to be a little slow.

They do work, they are just a little slow.

You may be wondering, why we have an Excel workbook, at the top left of the capsule.

This is so that it will over-write the Excel workbook target, so that there is just one row of data in each Excel table, in the workbook.

The data in the collect data spreadsheet is then sent into the Excel Spreadsheet.

There is one worksheet per data set sent in.

So in this example, there will be 6 worksheets, with one data region in each worksheet.

The data from collect data, will be sent into Excel, one region at a time.

However, the Excel workbook will only be opened once by Meta 5, and all 6 datasets will be sent into the workbook, in one session of Excel.

You do not need to worry too much about the bottom right section of the capsule.

The piece we will highlight is the delete multiple sheets.

Meta5 sends data into Excel data regions.

The data in these data regions is then sent into the power pivot model.

Then all reports are refreshed.

Then you have duplicate data inside the workbook.

The delete multiple sheets capsule, allows the developer, to delete the worksheets, containing the data regions.

In this way no duplicate data is required in the spreadsheet, that is sent to the business user.

The business user gets a spreadsheet with only his, or her, data in it, and only one copy of that data.

Lastly, the spreadsheet is sent to a PC Directory.

It could be sent to email or sent by power shell command.

The important part of the PC directory icon, is the target directory name.

This is defined by capsule level variables.

The actual workbook output can be sent to any PC directory, visible to the desktop.

This is very likely a one drive folder.

Also, what is important here, are the options for the arrow, from the spreadsheet to the PC Directory.

When I option the arrow, you can see that the region name contains variables.

This means that you can provide a base name for the report.

You can provide a secondary name for the report, for example the name of the person it is intended for.

And you can even name the file extension, for the report, if you are going to use PDFs, as well as Excel.

What is in region name, is the actual name of the object, when it is sent to the PC Directory.

So, for example.

If you were producing a dashboard, that was going to be sent to 20 region managers?

You could include the region manager name, in the name of the excel workbook itself.

And you could include the region manager name, in the directory structure.

The 20 reports, which include the region manager name, could be sent to 20 different one drive directories.

The region managers could be replicating the reports from the different drives.

Therefore, the reports would find their way to the device for the region manager, for which they were intended.
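As a rough illustration of the naming scheme described above, here is a Python sketch. The function names, the separator, and the directory layout are hypothetical, not Meta5's actual region-name syntax.

```python
# Illustrative sketch: assemble a per-recipient report name from a
# base name, a secondary name (the recipient), and a file extension,
# then place it in the recipient's own directory.
def report_file_name(base_name, manager, extension="xlsx"):
    """Combine a base report name with a per-recipient name."""
    return f"{base_name} - {manager}.{extension}"

def report_target_path(root, manager, file_name):
    """Place each manager's report in their own directory."""
    return f"{root}/{manager}/{file_name}"

name = report_file_name("Head Office Sales Dashboard", "North Region")
path = report_target_path("OneDrive/Reports", "North Region", name)
```

Run for 20 region managers, this yields 20 distinctly named workbooks, each landing in its own directory for replication to the intended manager's device.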

Those are the important design features, to take away from this example capsule.

What I want to show you now, is that we can actually run some queries, and show you some data.

However, we have removed the customer production data that we have tested on, and you can only see data that we can allow to be public.

So, we can open up the period table, and you can see the period data come back from the query.

Notice again, this report is on the data warehouse we are building, not on Business Central directly.

We cannot show you products or outlets because they contain customer data.

What we want to show you now, are the sorts of queries you can write in the query tool.

We want to show you how sophisticated the generated SQL can be, without having to resort to SQL, to write the queries.

One feature of Meta 5, is that business users themselves, can write their own queries, and not need IT staff to write their queries for them.

Of course, the same effect that you will see here, can be achieved by writing SQL.

It’s just that business users can not write SQL, to the level required, to replace these queries with SQL.

This capsule was developed, before we had developed the techniques of writing queries in SQL, and sending them to the database using Meta5.

When the goal is to sell the capsule many times over, it is faster and easier to maintain the capsule, if the SQL is in a database, as opposed to in a query tool.

For those of you who really want to see the SQL generated, and read a detailed blog post on what is happening in this capsule?

We have published a blog post, B B 6, on our technical BIDA Brains site.

You can go to that blog post on the button provided.

BB0006 Blog Post

Further, if you want to read how a SQL programmer can write SQL, that performs the same function, as what you are seeing in this demonstration?

You can go to the blog post, B B 7, on the button provided on this blog post.

BB0007 Blog Post

We were able to replicate the functionality, of these query icons, in one SQL statement, that is 280 lines long.

When you read that SQL, you will see why business users cannot write it.

Just for your knowledge this report has a report date of August 15th 2017.

We were using old production data, from the data warehouse, in the development of the dashboards.

The first capsule retrieves plan versus actual amounts, over the last 12 months, from the beginning of last month.

So, what we will do now, is just scroll through this capsule, on video, showing you all the queries.

We will show you some SQL.

And for details, please refer to the detailed blog post, B B 6, on the button on this blog post.

Now, I will open up the capsule called this month calcs.

The first capsule retrieves quarter to date and year to date.

The first query gets sales year to date.

The second query gets sales quarter to date.

Now I will open sales last periods.

There are four queries.

They are as follows.

Sales This Month Last Year.

Sales Last Month This Year.

Sales Last Month To Date.

Meaning to the reporting date.

Sales This Month Last Year To Date.

As I said, you can read a detailed blog post on BIDA Brains, that contains all the SQL generated by this capsule.

In the query, Sales Last Month To Date, you see there are so many things in the query window, that they are overlapping each other.

This does not matter, it will still write the SQL properly.

One of the problems in the B I world, is that the SQL has become so complicated, that it gets harder and harder, to present a visual representation of the query.

Even so, the Meta 5 query tool will still work, even in the face of too many items to put on the palette.

In planned sales this month, you can see the fact table is, the retail outlet sales plan.

One thing to note, is that plans are held at monthly level.

We have to pro-rate the planned sales, across the number of days, that have passed in the month so far.

This capsule simply divides the plan, by the number of days, that have passed in the month.
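A minimal sketch of one common pro-rating formula follows, assuming the plan is spread evenly across the days of the month; the capsule's exact calculation may differ.

```python
def prorated_plan(monthly_plan, days_passed, days_in_month):
    """Pro-rate a monthly plan across the days elapsed so far.

    Scales the monthly plan by the fraction of the month that has
    passed, so the plan-to-date figure is comparable with the
    sales-to-date figure. A sketch only, not the capsule's SQL.
    """
    return monthly_plan * days_passed / days_in_month

# A reporting date of August 15th means 15 of 31 days have passed.
plan_to_date = prorated_plan(310_000.0, 15, 31)  # 150000.0
```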

If we capture the SQL, we can see the reporting date is August 15th 2017, using the American formatting.

In summary.

We capture the periods, the outlets, the product categories and the currencies.

Then we capture a whole series of plans and sales measures, so that we can compare plans versus sales, across periods.

We send all these details out to a spreadsheet called collect data.

Then we send all that data into the Excel power pivot models.

Then we refresh all the charts in the Excel spreadsheet.

Then we send it to the business user however it is that they would like to receive it.

Usually this will be by one drive.

These are the sort of dashboards that business managers really want.

We are using the Stephen Few bullet charts, to make the dashboards more attractive, and easier to understand.

So, that is what we wanted to show you.

A real world head office sales dashboard.

I wish I could show you the dashboard itself.

But it contains customer data.

I think you would all know, that the hardest part of developing dashboards for sale, is having access to production quality data.

If this capsule were taking data from Business Central directly, you could sell this capsule many times over.

If many customers had the same data warehouse data model, as we are planning to do, you could also sell this capsule many times over.

Now I will just take this capsule, and copy it, onto the capsule service prompt.

We can then schedule to run it at, say, three o’clock in the morning, every morning.

I will show you the queue, to show you that the capsule is scheduled to run, at three o’clock in the morning.

As we have mentioned, you can also run a capsule immediately from a windows command line.

So if there were some emergency data updates to the data warehouse during the day, and someone wanted this dashboard updated with the new data?

They could simply go to the command that schedules this capsule, and double click on it.

The capsule would then run.

We believe that a double click on a command file, is about the easiest way to request, the immediate running of a capsule.

Most importantly in this demonstration.

We have shown you that you can sell this capsule many times over.

To install a new capsule all you have to do is:

One. Copy the downloaded capsule file into the correct file drawer.

Two. Set the correct parameters for the capsule.

This is presuming the customer has already installed at least one capsule, and gone through the site setup process.

As we said at the beginning.

The barrier to entry is profound knowledge, of the operational system, you are reporting on.

By allowing the SQL to be written by IT professionals, who know these operational systems very well, you reduce the chances of invalid data being presented in the reports.

Summary

What have we shown you?

We have shown you that it is possible to build a capsule, where we retrieve all the variables for the capsule, from a control database.

We then run the capsule, to produce a dashboard, that is of value to the business users.

We deliver the dashboard, via such mechanisms as one drive, or others.

For business central, we will build dashboards across many areas of business central.

We will do this for both business central direct query, as well as using the data warehouse we are building.

And with that?
I would like to say thank you very much for watching our blog post today.

I really appreciate your time and attention.

We hope you found this blog post interesting and informative.

We would love to hear your comments, so please feel free to use the comments section below.

I wish you a great day.

Thank you.

Sofi.

Your BIDA AI Assistant.

We will be emailing our subscribers useful and
valuable information regarding business intelligence.
Please subscribe here to be on our general emailing list.

BIDA0043 – Business Central Staging Area Evaluation Database Available


Please note the script of the presentation is below so you can read along with the video if you like.


(Note. We have swapped my photo for the cartoon character. I hope you like it!)

Hello and welcome to our latest blog post.

I am really pleased you have come back to see our latest news!

I am pleased to bring you some news that I hope will interest you.

I am Mihai Neacsu and I am the business development manager here at BIDA.

Today, again, we have some good news for all those people who use Business Central on premise.

Today we are making available an Evaluation Database for our Business Central Staging Area.

This Evaluation Database will be for qualified prospective customers.

So, if you are wondering if having a Business Central Staging Area is a good idea for your company?

Please listen on or read on.

The Argument For Staging Areas

Way back in 2019 we released an initial white paper on how to build Staging Areas for Navision.

You can get the white paper for your IT colleagues on the button below.

White Paper

To deliver a data warehousing solution you have to have a Staging Area.

This is as true of Business Central as it is with any other large operational system.

We have just released the tables and columns renaming script for Business Central 2023 Wave 2.

Today, your IT colleagues can write queries against the underlying Business Central database to answer your business questions.

With our renaming script your company can save time and money in development and support of such queries and reports.

You can also have your IT colleagues replicate your Business Central database on to another server and query the data in the replica.

Querying a copy of production, with easier to remember names for tables and columns, will be sufficient for tens of thousands of Business Central installed accounts.

But many tens of thousands of Business Central installed accounts will want more than just a replica of their operational system.

Tens of thousands of Business Central installed accounts will want a Staging Area.

So, as an installed Business Central Business User you should have the question on your mind:

“Why would I want a Staging Area for my company?”

So please allow me to explain the six main benefits a Staging Area has over a replica of your Business Central operational database.


Benefit Number One – Retain Deleted Data

A staging area allows you to retain historical data that has been deleted from Business Central.

Over time, some tables in Business Central become very large, such as the GL Entries.

They impact operational system performance for inserts, updates, backups and general reporting as well.

This old data also adds to the cost of running Business Central.

By retaining old data in the Staging Area you can delete data from the production system knowing you have an online copy of this old data.

This will make your Business Central system run faster.

It will save you time and money over a simple replica.

Our staging areas carry a flag on every row to say whether it has been deleted from the operational system.
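As a rough sketch of the idea, here is how rows deleted from the operational system could be flagged rather than removed. The rows are keyed by primary key in Python dictionaries, and the column name is illustrative; a real staging load would do this in SQL.

```python
def flag_deleted_rows(staging_rows, operational_keys):
    """Mark staged rows whose key is gone from the operational system.

    staging_rows: dict mapping primary key -> row dict, where each
    row carries an is_deleted flag as described above. Rows are
    retained, never removed, so history survives deletion in
    Business Central. Illustrative only.
    """
    for key, row in staging_rows.items():
        if key not in operational_keys:
            row["is_deleted"] = True
    return staging_rows

staging = {
    1001: {"amount": 250.0, "is_deleted": False},
    1002: {"amount": 99.0, "is_deleted": False},
}
# Key 1002 has been deleted from the operational system.
flag_deleted_rows(staging, {1001})
```

After the load, row 1002 is still present in the staging area, just marked as deleted, so it can be excluded from, or included in, any reporting as required.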


Benefit Number Two – Integrating Other Data

You don’t only have data in your Business Central system.

You have data from many other systems internally in your company.

You also have data coming to you from outside your company.

A staging area allows you to put non Business Central data into the same database and easily combine it with your Business Central data.

A very good example of this is competitive information.

How much are your competitors selling their similar products for?

There is a very wide range of data available to business analysts in any company.

Putting it all into one place like a staging area is one step better than trying to combine that data some other way, such as cutting and pasting in Excel.


Benefit Number Three – Data Validation

You can validate data and mark invalid data as invalid.

Our staging areas carry a flag to indicate the validity of every row of data.

It defaults to valid and then your IT colleagues can write routines against the data to search for invalid data and mark that data as invalid.

Then those rows can be omitted from any reporting or analysis that is being done.

These validity checks can also access data from outside Business Central that is also in the staging area.

This means that you have the chance to mark data as invalid using a wide range of data and complex routines.

One of the biggest problems when reporting from operational systems, and copies of them, is that invalid data creeps into the reporting and analysis being done.

Being able to mark that data as invalid and exclude it from reporting is a very important feature to have.


Benefit Number Four – Fraud Prevention

We would all like to believe that every person working for our company is an honest person.

Sadly, this is not the case.

In many companies employees have committed fraud using the computer systems they have access to.

Such cases of fraud have become much more prevalent as computer systems became more complex and electronic business interactions became more common.

Today it is easier than ever for someone who wishes to defraud their company to do so through complex computer systems.

One of the most common ways of defrauding a company is for people to enter documents and then later delete them to try and “cover their tracks”.

This is similar to what Nick Leeson did in the very famous case of Barings Bank.

Because a staging area retains deleted rows this mechanism of fraud is reduced.

In fact, when a company wide announcement is made that all data entered into computer systems is retained in a separate staging area, this is often enough to discourage would-be fraudsters.

This is because such an announcement makes it clear that they are very likely to be caught.

One of the most powerful deterrents in discouraging people from committing fraud is to explain to them that they are very likely to be caught.


Benefit Number Five – Fraud Detection

Closely related to Benefit Number Four is the detection of fraud if an employee decides to try his/her luck.

Algorithms can be set to run on the staging area every day, looking for possible fraudulent transactions.

These possible fraudulent transactions can then be brought to the notice of the appropriate employees to investigate.

These algorithms generally have to store their results in the staging area as well.

They also do better with more history.


Benefit Number Six – Getting Ready For A Data Warehouse

As discussed above, a staging area provides a number of benefits over and above operational systems replicas.

They can be very beneficial.

This has also been seen in a similar concept called a “Data Lake”.

However, no matter how much benefit can be derived from a staging area, it cannot be compared to a proper data warehouse.

A proper data warehouse will contribute much more profit to your company than just a Data Lake or Staging Area.

A staging area is a pre-requisite for a data warehouse.

One of the best reasons you might want to implement a staging area over just a replica of Business Central is to get ready for implementing a proper data warehouse.

There is no better return on investment a company can make than to implement data warehousing properly.

Summary

In summary.

We have released a script to rename Business Central tables and columns to make it cheaper and easier for your IT colleagues to write direct queries against your Business Central database.

You may use that script on your current production database or a replica.

It’s free and it will save your company money.

However, you might want to give consideration to taking the next step up to a staging area.

We have listed the six main benefits of upgrading to a staging area.

For the very minor extra investment we believe these benefits are well worth it.

If you would like to have your IT colleagues review how our staging areas are implemented?

We are willing to make an Evaluation Database, and all the code, available to your IT department for them to evaluate.

There is so much there it’s just easier to give it to your IT Colleagues than to explain it.

Once your IT department gives you the green light that they can support such a staging area?

Then you and your colleagues can make your decision as to whether you wish to gain the six listed benefits above for the small investment required.

If you would like to contact me?

Please just use the “Contact Us” button below.

And with that?

I would like to say thank you very much for reading or listening to our blog post today.

I really appreciate your time and attention.

We hope you found this blog post interesting and informative.

We would love to hear your comments, so please feel free to use the comments section below.

I wish you a great day.

Thank you.

Mihai Neacsu

BIDA Business Development Manager.


BIDA0042 – Business Central Tables And Columns Rename Script


Please note the script of the presentation is below so you can read along with the video if you like.


(Note. We have swapped my photo for the cartoon character. I hope you like it!)

Hello and welcome to our latest blog post.

I am really pleased you have come back to see our latest news!

I am pleased to bring you some news that I hope will interest you.

I am Mihai Neacsu and I am the business development manager here at BIDA.

Today we have some good news for all those people who use Business Central on premise.

If you have Business Central on Premise and find it hard to write SQL against the underlying database?

We have some good news for you.

As you know, Business Central contains GUIDs in table names.

It also has field names that require brackets around the field names in almost all cases.

This makes writing SQL against the underlying database more difficult than it needs to be.

Today we are pleased to announce the release of an SQL Script that renames all tables and columns in the Business Central 2023 Release Wave 2 demonstration database.

You can run this script, start to finish, against the Demonstration 23 database of Business Central.

Just so you know, we used the Cronus UK database to create this script.

But it is easily edited to run against any other version of Business Central 2023.

We have tested it against that database and it works without error.

You can download this script from the button below and try it out for yourself.

Download Script

Of course, if you are on a different level of Business Central, or you have customized your Business Central, you have two options.

1. You can alter the script by hand.
2. You can send us your SQL Server dictionary tables and we can quickly run the process on a server here.
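As a rough illustration of what such a renaming process does, here is a Python sketch that strips the application GUID suffix from a physical table name and generates an `sp_rename` statement. The sample table name is illustrative, and a real script would read names from the SQL Server dictionary views; this is not BIDA's actual generator.

```python
import re

# Business Central on-premise appends an application GUID to physical
# table names, which is part of what makes hand-written SQL awkward.
GUID_SUFFIX = re.compile(
    r"\$[0-9a-f]{8}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{12}$"
)

def rename_statement(table_name):
    """Return an sp_rename statement, or None if there is no GUID suffix."""
    clean = GUID_SUFFIX.sub("", table_name)
    if clean == table_name:
        return None
    return f"EXEC sp_rename '[{table_name}]', '{clean}';"

stmt = rename_statement(
    "CRONUS UK Ltd_$G_L Entry$437dbf0e-84ff-417a-965d-ed2bb9650972"
)
```

Running this over every table name in the dictionary views would yield one rename statement per GUID-suffixed table, which is the general shape of the work the downloadable script has already done for you.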

To make it more interesting for you?

We will do this script generation work for free for the first 20 companies who contact us and ask us to do that.

After the first 20?

We might ask to be paid a small fee just for our time.

We have implemented this script as part of developing our data warehousing product for Business Central on Premise.

This table and column renaming will make it a lot easier for us to build our data warehouse.

It will also make it a lot easier for us to develop SQL Direct Query Dashboards against Business Central.

We hope you will download and try out this script and consider using it in your company.

And with that?

I would like to say thank you very much for reading or listening to our blog post today.

I really appreciate your time and attention.

We hope you found this blog post interesting and informative.

We would love to hear your comments, so please feel free to use the comments section below.

I wish you a great day.

Thank you.

Mihai Neacsu

BIDA Business Development Manager.


BIDA0041 – Meta5 As A Black Box


Please note the script of the presentation is below so you can read along with the video if you like.


(Note. We have swapped my photo for the cartoon character. I hope you like it!)

Hello and welcome to our latest blog post.

I am really pleased you have come back to see our latest news!

I am pleased to bring you some news that I hope will interest you.

I am Mihai Neacsu and I am the business development manager here at BIDA.

Today we have some really big news for former Metaphor employees and customers.

We have been very busy on a number of fronts.

One of those fronts is this.

We have been trying to find a way to create “Meta5 as a black box”.

I am very pleased to report that we have been successful.

We are, literally, putting the finishing touches on how this works.

We wanted to let the former Metaphor community know about this sooner, rather than later.

So, if you are a former Metaphor employee, or customer?

Please read on or listen on.

Introducing Meta5 As A Black Box

Those people who have used Metaphor back in the 1990s know and understand that it empowered business users to make decisions to improve the profitability of their company.

People who used it know how good it is and the difference it can make.

But Metaphor kind of got lost in the world of IBM.

We would like to “Make Meta5 Great Again”, if we can use that term in the US where so many people are so polarized about politics.

One of the areas where we have been investigating is the idea of making Meta5 available to the mass market.

Today, Meta5 has added many features that make it a great complementary tool to Excel.

So our investigation work has centered around making Meta5 a mass market product of great value to companies who like to do their reporting in Excel.

So please allow me to explain what is now possible.

Please let me apologize for the formatting of the ordered list on blog post.

It seems WordPress does not allow good spacing inside an ordered list.

These are the top 10 features of what is now possible:

  1. The SQL that can be sent to the data source can be up to 64K per statement.
  2. The SQL can be hand written and as complex as the database can support.
    This is very important because with CTEs supported in most databases today it is possible to create very complex statements that ordinary users cannot write for themselves.
    These are SQL statements so complex that Excel itself cannot send them into databases.
  3. The SQL statements can contain placeholder parameters which can be set to real parameters before the SQL is sent to the data source.
    This is very important. For example, if a retailer has 20 region managers and each region manager wants his/her own suite of reports containing just his/her data?
    There would be one set of SQL.
    Each SQL statement would contain a variable for “region-manager” and then that can be set at run time.
    Meta5 will deliver, say, 10 reports to each region manager, for a total of 200 reports.
    Most likely these will be delivered by one drive.
  4. PowerShell commands can be executed by Meta5.
    This means that pretty much any function that needs to be executed on Azure, or involve Azure, can be executed from Meta5 or the BIDA scheduler.
    For example reports can be sent to one drive using PowerShell without the server itself being logged into one drive.
  5. Meta5 Capsules can be entirely self contained.
    This is very important.
    Each Meta5 capsule has 208 variables that can be set.
    Usually these are set by a site and vary from customer to customer.
    Historically, capsules were never moved from customer to customer.
    Historically there was never an effort to create Meta5 capsules and sell them on the market.
    We have changed that.
    Now we have the ability to set a “master parameters” database at the desktop level.
    All 208 parameters are retrieved from the master parameters database.
    This means that one capsule can be written that can run on many companies data because the variables will be retrieved from each companies own master variables database.
    If a company runs out of variables at 208 they can simply create a second set of 208 variables and start using them.
    But any single capsule is restricted to 208 variables.
  6. Each capsule can have it’s own variables that over-ride the company global variables.
    Most capsules have their own internal variables that they need to use that over-ride the global company variables.
    Or they may over-ride reserved variables.
    For example there is a now a variable called capsule-name.
    This is a reserved global variable that is set by each individual capsule.
    The result of 5/6 is that we can write one capsule for an ERP like Business Central.
    We can then put it into the business central store and sell it to ANY installed BC account.
    Other members of the former Metaphor community might wish to do the same.
  7. Meta5 can schedule capsules on at timer.
    Meta5 has a feature to schedule capsules to run at a certain time.
    Meta5 also has a feature to start a capsule based on a windows command.
    So a user can run a capsule simply by double clicking the command to run it.
  8. A capsule can wait for a “signal file” from the data source to tell it that it has finished processing.
    Say an ERP is running late in processing and the capsule is scheduled for, say, 4am, but the ERP has not completed processing.
    What is available is a feature for the ERP to ftp a signal file to somewhere the Meta5 server can see it when data is ready.
    The Meta5 capsule can wait until that file has arrived before running.
    This use of signal files is standard across most ERP implementations to let downstream systems know they are finished processing certain data.
  9. The signal file deletion is optional.
    If the customer wants a capsule to run on it’s schedule every day there is a feature to turn off the deletion off the signal file.
    The signal file would be created at capsule installation time and never deleted.
  10. Removal of temporary workbooks from Excel.
    Having retrieved data from any ODBC source, including any CDATA ODBC source, we can send that into Excel workbooks.
    We can then load it from the excel workbooks into the power pivot models.
    We can then refresh all reports/dashboards with the new data in the power pivot model.
    We can then delete the workbooks where the data was sent into Excel so they do not need to be “hidden” from the user.
    The user then gets a workbook with his/her data ONLY in the power pivot model and no redundant worksheets in the workbook.
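The placeholder-parameter idea in point 3 above can be sketched in a few lines. This is a minimal illustration only, not Meta5's actual variable mechanism; the template SQL, the `@region-manager@` placeholder syntax, and the manager names are all invented for the example.

```python
# Sketch of placeholder-parameter substitution (point 3 above).
# The placeholder syntax and names are invented for illustration;
# Meta5's real variables are configured inside the capsule.

SQL_TEMPLATE = """
WITH region_sales AS (
    SELECT region_manager, SUM(amount) AS total_sales
    FROM sales
    GROUP BY region_manager
)
SELECT * FROM region_sales
WHERE region_manager = '@region-manager@'
"""

REGION_MANAGERS = ["Alice", "Bob", "Carol"]  # 20 in the scenario above


def render_sql(template: str, manager: str) -> str:
    """Substitute the placeholder with a real value before sending the SQL."""
    return template.replace("@region-manager@", manager)


# One set of SQL, one rendered statement per manager.
statements = {m: render_sql(SQL_TEMPLATE, m) for m in REGION_MANAGERS}
for manager, sql in statements.items():
    print(manager, "statement prepared")
```

The point is that a single complex statement (CTEs and all) is authored once, and only the variable value changes per run, which is how one capsule can produce a full suite of reports per region manager.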

Given all of the above?

We, or you, can write a capsule once, and then sell that one capsule many times over to many customers.

When you put all this together it means that we now know how to create much more valuable dashboards in Excel, sourcing data from common operational systems.

The “barrier to entry” is deep knowledge of the operational system that you are writing the dashboards against.

But this will work for ANY system, and that includes ANY large system that is widely used.

All we (or you) have to do is develop dashboards better than the vendor has delivered, and better than other people doing the same have delivered.

Of course, if you are a former Metaphor employee or customer, then you might very well have an interest in having your company investigate this whole idea of “Meta5 as a black box”.

There are a number of angles to investigate this from.

Firstly, you could use Meta5 inside your company to generate Excel dashboards that you have not been able to get working any other way…yet.

Secondly, we are sure there are other Meta5 literate people out there who would like to create capsules and sell them to customers on the open market.

We will be doing Business Central and the InfoSys Finacle banking system.

So we would prefer no competition in those spaces for a while.

But there are a lot of large common operational systems (as well as installed data warehouses) where “Meta5 as a black box” would be a very sellable item.

At least we hope so!

I hope that this blog post has been interesting to the former Metaphor community.

Please pass this blog post along to those you know who used to be a part of the Metaphor community.

The idea of “Meta5 as a black box” means that Meta5 could be widely used again to the great benefit of Meta5 customers.

We hope all those people who were once part of the Meta5 family will have an interest in promoting this idea.

If you are interested and would like to be notified when we bring out the demo?

You can contact me using the “Contact Us” button below.

And with that?

I would like to say thank you very much for reading or listening to our blog post today.

I really appreciate your time and attention.

We hope you found this blog post interesting and informative.

We would love to hear your comments, so please feel free to use the comments section below.

I wish you a great day.

Thank you.

Mihai Neacsu

BIDA Business Development Manager.

We will be emailing our subscribers useful and
valuable information regarding business intelligence.
Please subscribe here to be on our general emailing list.

BIDA0040 – InfoSys Finacle Version 7 Column Table Renames Available


Please note the script of the presentation is below so you can read along with the video if you like.


(Note. We have swapped my photo for the cartoon character. I hope you like it!)

Hello and welcome to our latest blog post.

I am really pleased you have come back to see our latest news!

I am pleased to bring you some news that I hope will interest you.

I am Mihai Neacsu and I am the business development manager here at BIDA.

We have some really big news for our readers.

We have been very busy on a number of fronts.

And one of those fronts is this.

We are building an end to end data warehouse for a client based on the Finacle Version 7 Core Banking System by Infosys / EdgeVerve.

This has been a major undertaking and will go alongside our end to end data warehouse solution for Business Central.

To the best of our knowledge, this is the largest product oriented data warehouse development ever undertaken, measured by the number of fields being put into the data warehouse.

Today we are making a significant announcement for the installed users of Finacle Version 7.

If you have Finacle Version 7 installed?

This is very good news for you and your company.

So if you have Finacle Version 7 installed?

Please read on or listen on.

Introducing Renaming of Finacle Tables and Columns

I think it’s safe to say that no one will criticize me for saying that the naming standards of fields in Finacle Version 7 could stand some improvement.

It’s quite an old system that was created in the days of the 30 character name limit in Oracle.

But who would ever be crazy enough to attempt to provide better alias names for all the columns in Finacle?

Well?

That would be us.

We have now completed a process where we have provided aliases for all 56,000 fields in Finacle Version 7.

Of course, now that we have spent months and months doing this renaming of Finacle columns we have the capability to give any Finacle Version 7 installed account a create view script that will alias all your tables and columns.

This create view script will provide you with a significant cost reduction in development, maintenance and support of all internally written reports / queries against your Finacle operational system data models.

This is because the column names are more intuitive, easier to read, easier to remember, and easier to use.

By simply installing this suite of create view statements you can start to reduce the cost of all reporting development done directly against your Finacle Version 7 system.

But wait…there’s more.

One of the things we did was scan all the Finacle PDF documentation available for the data models and put it into our customer’s dictionary.

The scanning was not perfect, but it’s much better than reading PDFs.

This means that we can give you Excel based, searchable, documentation for Finacle Version 7.

I can assure you this is much better than reading the PDF manuals.

If you will excuse the obvious marketing line…

But wait…there’s more!

We are offering this create view script, plus the documentation that we have created ourselves from the PDF documentation, to any proven installed Finacle Version 7 account…

FOR FREE!!!

Yes, you heard that correctly.

We will give you the benefit of the hundreds of hours of effort of our team in renaming 56,000 fields inside Finacle Version 7, for free.

In turn, this create view script will save you hundreds, if not thousands, of hours of developer time over the coming years.

Of course, we have done all this work under a Non Disclosure Agreement.

A part of our purpose is to make life easier and better for Finacle installed accounts.

We are working with InfoSys and we are co-operating with InfoSys.

So to take advantage of this great offer?

All we need from you is a scanned copy of the Non Disclosure Agreement your bank signed with Infosys for Finacle Documentation, and a copy of your personal Non Disclosure Agreement that binds you to the Infosys Non Disclosure Agreement.

If that is included in your employment contract then an email to that effect from HR would be fine too.

Once we have confirmed that you are covered by a legally binding Non Disclosure Agreement we can send you the Oracle script and the associated documentation.

Below is a picture showing an example of the documentation we have generated based on our work.

You can click on the image to see it at full size, which is more readable.

This is the loan account master table.

Everyone who has Finacle installed knows this table.

The columns are self explanatory for the most part.

You can see Column Name and New Column Name.

You can see that the Finacle Column Name has an alias that is more readable and more understandable.

And yes, we really do mean we renamed all the fields that did not have good names in our opinion. All 55,000 of them. About 1,000 fields were left without a new name.

On the left hand side of the report you see two columns that I would like to explain.

The Stage Table column is the name of the table in our data warehouse staging area.

This name cannot be changed.

The next column is of interest to you.

This is the Stage View name.

If you want to create new view names for your Finacle renaming process, you can fill in new names in this field of an Excel input workbook, and each will become the name of the view created over the corresponding Finacle table.

So you are not limited to the default view names we provide in our script.

You have the option of creating a new set of view names to make your views that much easier for all your developers to use.

Below we have provided you with a couple of examples of the create view scripts just so you can see we are serious about this free offer.

You can see that the default is to simply prefix the table in question with a “v” for “view”.

You can create any view name alias you want.
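To make the mechanics concrete, here is a rough sketch, in Python, of how a create view script of this shape can be generated from a rename mapping. The generator itself is our illustration, not the actual dictionary tooling; the schema, table, and column aliases are taken from the first example script further down the post.

```python
# Sketch: generating a column-aliasing CREATE VIEW statement from a
# rename mapping. Illustrative only; the real scripts are generated
# from the BIDA dictionary. Sample values mirror the examples below.

def make_create_view(schema, table, renames, view_name=None):
    """Build a CREATE VIEW statement aliasing every column.

    If no view name is supplied, default to prefixing the table with
    'v', as described above. Columns whose new name is empty keep
    their original name.
    """
    view = view_name or ("v" + table)
    cols = "\n,".join(f"{old} {new or old}" for old, new in renames.items())
    return (f"create view {schema}.{view} as select\n"
            f"{cols}\nfrom {schema}.{table}\n;")


renames = {
    "sol_id": "pk_service_outlet_num",
    "bank_id": "pk_bank_unique_num",
    "tran_date": "transaction_date",
}
ddl = make_create_view("tbaadm", "acct_entry_table", renames)
print(ddl)
```

Passing a `view_name` argument corresponds to filling in the Stage View column of the input workbook: the same mapping produces a view under whatever name suits your developers.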

We sincerely hope that you will take advantage of this free offer.

The guys who did the whole renaming process would love to see their work put to good use!

You can contact me using the “Contact Us” button below.

And with that?

I would like to say thank you very much for reading or listening to our blog post today.

I really appreciate your time and attention.

We hope you found this blog post interesting and informative.

We would love to hear your comments, so please feel free to use the comments section below.

I wish you a great day.

Thank you.

Mihai Neacsu

BIDA Business Development Manager.

Example Create View Scripts

Please note that WordPress removes the multiple blanks between the column name and the alias for the view.

The real code is a little more readable.

Of course it is generated from our dictionary.

These examples make it clear that we have done our best to give better names to every Finacle column.

Of course, if you do not like our names you can always start with our renaming and update the aliases in any way you like!


create view tbaadm.vacct_entry_table as select
sol_id pk_service_outlet_num
,bank_id pk_bank_unique_num
,bill_id pk_bill_unique_num
,tran_id pk_transaction_num
,part_tran_srl_num pk_part_transaction_serial_unique_num
,entity_cre_flg row_verified_flag
,del_flg row_deleted_flag
,ref_code bill_reference_code
,tran_date transaction_date
,bill_func bill_function_code
,dtd_ind mass_bill_consolidation_flag
,tran_type bill_transaction_type_code
,tran_sub_type bill_transaction_sub_type_code
,part_tran_type part_transaction_type_code
,acid general_bank_acct_num
,tran_amt bill_transaction_amt
,value_date transaction_value_date
,rpt_code bill_report_code
,ref_num part_transaction_reference_unique_num
,instrmnt_type bill_transaction_instrument_type_code
,instrmnt_date bill_transaction_instrument_date
,instrmnt_num bill_transaction_instrument_unique_num
,instrmnt_alpha instrument_alphabetical_identification_type_code
,tran_particular bill_transaction_comments_ldesc_01
,tran_rmks bill_transaction_comments_ldesc_02
,prnt_advc_ind transaction_print_advice_code
,restrict_modify_ind transaction_values_modification_restriction_code
,voucher_prnt_flg bill_transaction_voucher_print_flag
,module_id banking_system_module_code
,br_code branch_num
,lchg_user_id last_change_userid
,lchg_time last_change_tstamp
,rcre_user_id create_userid
,rcre_time create_tstamp
,consolidation_ind bill_transaction_consolidation_code
,ptran_ref_ind bill_transaction_partial_reference_code
,ref_tran_id reference_transaction_num
,ref_tran_date reference_transaction_date
,ref_part_tran_srl_num reference_partial_transaction_serial_unique_num
,ref_oclg_zone_code reference_outward_clearing_zone_code
,ref_oclg_zone_date reference_outward_clearing_zone_date
,ref_oclg_set_num reference_outward_clearing_set_unique_num
,ref_oclg_part_tran_srl_num reference_outward_clearing_serial_unique_num
,navigation_flg originating_responding_navigation_code
,bank_code bank_num
,ts_cnt row_modified_counter
,ref_oclg_sol_id reference_outward_clearing_service_outlet_num
,gl_date bill_transaction_general_ledger_date
from tbaadm.acct_entry_table
;


create view tbaadm.vcollection_gen_maint_tbl as select
sol_id pk_service_outlet_num
,bank_id pk_bank_unique_num
,collection_id pk_collection_serial_unique_num
,collection_code collection_code
,collection_crncy collection_iso_currency_code
,collection_b2k_id collection_b2k_event_num
,cif_id customer_unique_num
,oper_acid operative_bank_acct_num
,oper_charge_acid operative_charge_bank_acct_num
,collection_amt collection_amt
,collection_cntry_code collection_country_code
,grace_perd collection_grace_period_days_num
,lodg_date collection_lodgement_date
,due_date collection_due_date
,delink_date collection_delink_date
,corr_coll_bank_code correspondent_collection_bank_num
,corr_coll_br_code correspondent_collection_branch_num
,corr_bank_name correspondent_bank_name
,corr_bank_addr1 correspondent_bank_swift_address_line_01
,corr_bank_addr2 correspondent_bank_swift_address_line_02
,corr_bank_addr3 correspondent_bank_swift_address_line_03
,corr_bank_cntry_code correspondent_bank_country_code
,collection_status collection_status_code
,tran_sol_id transaction_service_outlet_num
,current_func_code current_bill_function_code
,collection_liab collection_liability_amt
,collection_liab_crncy collection_liability_iso_currency_code
,cls_flg collection_closure_flag
,cls_date collection_closure_date
,free_code_1 user_defined_code_01
,free_code_2 user_defined_code_02
,free_code_3 user_defined_code_03
,cont_cr_liab_acid contingent_credit_liability_bank_acct_num
,cont_dr_liab_acid contingent_debit_liability_bank_acct_num
,cont_rate_code contingent_rate_code
,cont_rate contingent_rate_percentage
,cont_crncy contingent_transactions_exchange_iso_currency_code
,cont_liab_amt contingent_liability_amt
,reminder_num_sent reminders_sent_num
,next_reminder_date next_reminder_scheduled_date
,last_reminder_sent_date last_reminder_sent_date
,date_of_remit commission_remitance_date
,bef_rpt_srl_num bef_report_serial_unique_num
,bef_rpt_date bef_report_date
,remit_amt_in_hc home_currency_remittance_amt
,entity_cre_flg row_verified_flag
,del_flg row_deleted_flag
,rcre_user_id create_userid
,rcre_time create_tstamp
,lchg_user_id last_change_userid
,lchg_time last_change_tstamp
,ts_cnt row_modified_counter
,party_name remittance_lodged_party_name
,party_addr1 remittance_lodged_party_address_line_01
,party_addr2 remittance_lodged_party_address_line_02
,party_addr3 remittance_lodged_party_address_line_03
,party_city_code remittance_lodged_party_address_city_code
,party_state_code remittance_lodged_party_address_state_code
,party_cntry_code remittance_lodged_party_country_code
,party_pin_code remittance_lodged_party_address_pin_code
,purpose_of_rem purpose_of_remittance_code
,passport_det party_passport_details_ldesc
,tc_amt travelers_checks_issued_amt
,crncy_amt issued_currency_amt
,cntry_of_visit visit_country_code
,corr_bank_city_code correspondent_bank_address_city_code
,corr_bank_state_code correspondent_address_state_code
,corr_bank_pin_code correspondent_address_pin_code
,nostro_acct_num nostro_bank_acct_num
,other_cif_id other_party_customer_unique_num
,other_party_name other_party_name
,other_party_addr_1 other_party_address_line_01
,other_party_addr_2 other_party_address_line_02
,other_party_addr_3 other_party_address_line_03
,other_party_city_code other_party_address_city_code
,other_party_state_code other_party_address_state_code
,other_party_cntry_code other_party_country_code
,other_party_pin_code other_party_address_pin_code
,ps_ind purchase_sale_flag
,transit_perd collection_transit_period
,encash_cert_det encashment_certificate_ldesc
,cdf_details currency_declaration_form_ldesc
,dclrd_eefc_amt declared_export_earnings_foreign_currency_amt
,eefc_acid export_earnings_foreign_currency_bank_acct_num
,eefc_amt export_earnings_foreign_currency_amt
,cum_eefc_amt cumulative_export_earnings_foreign_currency_amt
,ref_collection_id reference_remittance_collection_serial_unique_num
,bp_liab remittance_purchase_liability_amt
,bp_liab_crncy remittance_purchase_liability_iso_currency_code
,collection_liab_hc_eq remittance_purchase_liability_home_currency_amt
,bp_limit_b2kid remittance_purchase_limit_b2k_event_num
,bp_dlnk_limit_b2kid remittance_purchase_delink_limit_b2k_event_num
,invt_class inventory_location_class_code
,fc_int_conv_rate bill_liability_to_interest_conversion_rate_percentage
,fc_int_conv_rate_code bill_liability_to_interest_exchange_rate_code
,int_on_liab_flg calculate_interest_on_liability_code
,int_calc_crncy_code interest_calculation_iso_currency_code
,rear_end_int_flg calculate_interest_after_purchase_flag
,int_peg_flg peg_calculated_interest_flag
,peg_frequency_in_months peg_frequency_interest_months_num
,fb_ref_num foreign_bank_serial_unique_num
,ref_date foreign_bank_serial_unique_num_assigned_date
,peg_frequency_in_days peg_frequency_interest_days_num
,paysys_id payment_system_unique_num
,benef_coll_bank_code beneficiary_collection_bank_num
,benef_coll_br_code beneficiary_collection_br_code
,benef_bank_name beneficiary_bank_name
,benef_bank_addr1 beneficiary_bank_address_line_01
,benef_bank_addr2 beneficiary_bank_address_line_02
,benef_bank_addr3 beneficiary_bank_address_line_03
,benef_bank_cntry_code beneficiary_bank_country_code
,benef_bank_city_code beneficiary_bank_address_city_code
,benef_bank_state_code beneficiary_bank_address_state_code
,benef_bank_pin_code beneficiary_bank_address_pin_code
,dd_print_count demand_draft_printed_num
,last_dd_printed_date last_demand_draft_printed_date
,against_import_export inward_remittance_advance_against_export_flag
,intermediary_trade intermediary_trade_country_flag
,rbi_permit_det rbi_permit_comments_ldesc
,drawee_centre_code drawee_centre_code
,drawee_code drawee_code
,notl_conv_rate notional_conversion_rate_percentage
,notl_conv_rate_code notional_conversion_rate_code
,fircr_print_cnt foreign_inward_remittance_certificate_printed_num
,first_fircr_print_date first_foreign_inward_remittance_certificate_print_date
,last_fircr_print_date last_foreign_inward_remittance_certificate_print_date
,free_field user_defined_ldesc_01
,acct_num general_maintenance_bank_acct_num
,corr_coll_bic correspondent_collection_bank_identifer_code
,benef_coll_bic beneficiary_collection_bank_identifer_code
from tbaadm.collection_gen_maint_tbl
;


HD0015 – How Do I Add A New Data Disk Pod



Hello and welcome back to our latest How Do I… blog post.

We are happy that you’ve come back to see our latest blog post.

In this blog post we are going to show you how to add a new data disk pod.

Let us presume you want to add a new 10GB disk pod.

1. Go to the directory: C:\Program Files (x86)\Meta5\exe\

Enter the command: fsinit "C:\ProgramData\Meta5\Pods" 10000 data01

The fsinit utility will create a subdirectory called “data01” in the above named directory and put the pod data files in that directory.

You should see something like this on your PC.

 

2. With Meta5 running on your desktop click on the Meta5 icon in the tray and select “Meta5 Starter Edition Configuration”

You should see this window.

3. In the text box “Server Drive Paths” the value set is "C:\ProgramData\Meta5\Pods\System".

We need to add the new pod to that list:

C:\ProgramData\Meta5\Pods\System,C:\ProgramData\Meta5\Pods\data01

If you have an existing pod that you want to add, the process is the same at this step: add the path of the pod directory to this text box (comma separated). In that case there is nothing else you need to do, as the pod should come online when you start Meta5. For a new pod, more work is required.

4. Start Meta5 normally.

Then start a new command prompt and go to the directory: C:\Program Files (x86)\Meta5\exe\

Run the following command:

m5cmd fscnsl

This command will open a File Service Console window.

Log in as “system” with a password of “original” (it might prompt you to set the password the first time, then log in).

Then issue the following commands.

Each pod has a unique “number” assigned to it with the “/#” switch.

The demo has “10” but you can use any legal number between 2 and (I think) 16.

The commands you type are in red and bold so that you can distinguish them from the responses you should expect.

Meta5 File Service Console
Version 4.3.5
Licensed Materials - Property of Meta5, Inc.
Copyright (C) 2000-2023 by Meta5, Inc. - All Rights Reserved


Username: system
Password: ********
Connecting to 'system' console...
system>det
Slot A1: Disk module "System, #1" is online
Size 11065MB: 10936 file descriptors (0%), 249257 data blocks (2%)
Slot A2: Disk module "data1, #10" is in undefined state
Slot A3: blank
Slot A4: blank
system>erase a2 /#:10
ERASING drive A2.
Drive number: 10.
Drive name: unchanged or set to default if new drive.
File descriptors share: unchanged or set to default if new drive.
Are you sure?
(Y/N)? y

[09:22:24] Slot A2: Erasing.
Disk space to be allocated to file descriptors: 25%

[09:22:24] Slot A2: remote operation is in progress.
system>
[09:22:31] Slot A2: Offline.
online all

[09:22:37] Slot A2: Online.

[09:22:37] Slot A1: Already online.

[09:22:37] Slot A2: done.
system>det
Slot A1: Disk module "System, #1" is online
Size 11065MB: 10936 file descriptors (0%), 249259 data blocks (2%)
Slot A2: Disk module "data1, #10" is online
Size 10245MB: 4 file descriptors (0%), 11 data blocks (0%)
Slot A3: blank
Slot A4: blank
system>exit

5. You should now have a system desktop that looks like this.

We hope you found this How do I Presentation useful and informative.

With that I’d like to say thank you very much for reading this.

We really appreciate your time and attention.

I wish you a nice day.

Mihai.

HD0014 – How Do I Run PowerShell Commands In Meta5


Please watch the video in 1080p because it is recorded in High Resolution.
Please note the script of the presentation is below so you can read along with the demonstration if you like.


Hello and welcome back to our latest How Do I, blog post.

We are happy that you’ve come back to see our latest blog post.

My name is Sofi, and I’m your BIDA AI assistant.

I will be your host for the demonstration today.

In today’s demonstration, we are going to show you how to run Power Shell commands in Meta 5.

This is very useful because you can do such things as send e-mails, send reports to One drive, and download things from One Drive.

You can do all sorts of useful things once you can run a Power Shell command from inside Meta 5.

Now on with our demonstration!

Here we are on our Windows desktop running Meta 5 in the background.

We have a file directory in front of you, which is the BIDA hosting schedule directory.

This is where we put our Power Shell commands.

I will take you through each of the relevant files.

The first file you need, is a file to create the secure string, which will be your password.

You can just run this command, enter your e-mail address, and it will put the encrypted password into this file for you.

To show you what that looks like, I will just click on the encrypted password file.

It is just a series of numbers and letters.

It is the password to the BIDA common e-mail address encrypted.

The next thing you need is a bat file to run a PowerShell command.

This is a very simple file.

You can see the power shell exe command.

But notice the No Profile and execution policy bypass parameters.

These parameters are required with Meta 5 to bypass script security for Power Shell commands.

This is because Meta 5 runs under a different profile than the Windows desktop.

So you need to remember to put in the no profile and execution policy bypass parameters.

Then you have the command parameter which is just a positional parameter.

That positional parameter is the command that you want to run.

In our example, we want to run a command to send an e-mail to say that the ETL batch has finished.

That is a very common thing that you want to do in data warehousing.

You can see the first line is the name of the e-mail address, that is going to send the e-mail.

The password you can see is, get content from the login password text file, and convert it to a secure string.

So we are getting that password, we are running it through convert to secure string, and we are assigning that to the password variable.

And then we are creating credentials of a new object, which is Automation ps credential.

The argument list is the administration name, the e-mail, the user we want to log in with, and the password for that user.

From there we just send the mail message from BIDA common to Peter’s business e-mail.

The subject is the BIDA weekday batch has completed.

The body of which is also the BIDA weekday batch has completed.

We are using the S M T P server of Office 3 6 5.

We are using the credentials of credential and we are saying use S S L and use port 5 8 7.

Now one note to mention.

This e-mail address must be authorized to send S M T P emails, and the default is to not be authorized.

You can just do a Google search and ask how do I authorize Office 3 6 5 e-mail address to use S M T P e-mail.

Now, there may be other ways of sending emails.

We happen to be using S M T P, so those are the commands that you need.

The text will be on the blog post, and you can see the commands here in the directory.

Now we are going to go over to the Meta 5 desktop.

We are going to open this capsule, run bat commands with results.

We have an icon, run bat file, and the icon is the I application icon.

So when you’re looking for it in new icons, you look for Integrator Icons and the I application icon in the Integrator Icons folder.

We have given it the name run bat file and this can be any PC application.

We say, execution information, Global application.

We have the Execution file parameter.

In here we see the path name.

I will zoom in on it for you.

It is BIDA, beta hosting, C 12, schedule.

You can see it in the video.

The file is run power shell command dot bat.

The exact command that you saw.

Then the command line parameter is the same folder.

And it runs the command run M O 1, send E T L, e-mail, P S 1.

The working directory is also entered.

Now I am just going to run the command.

It is going to execute that Power Shell command.

All we need to do is hit run and you will see it just takes a second.

Now it says complete, please continue.

Because I don’t want to show you Peter’s inbox, you can just trust me that the e-mail actually arrived.

OK!

That completes our demonstration today.

Today we showed you a very simple way of how to run Power Shell commands in Meta 5.

Thus, making the full power of PowerShell available to you, from inside the Meta 5 desktop, and from inside Meta 5 Capsules.

We hope you found this How do I Presentation useful and informative.

With that I’d like to say thank you very much for listening.

We really appreciate your time and attention.

I wish you a nice day.

Sofi. Your BIDA AI Assistant.

The commands that you saw in the video are as follows:

run-this-as-a-powershell-command.ps1
Read-Host -Prompt "Enter your email password" -AsSecureString | ConvertFrom-SecureString | Out-File "M:\A_BIDA_HOSTING\BIDAHS_C012_BC\S01-Schedule\bida-common-login-password.txt"

RunPowerShellCommand.bat
PowerShell.exe -noprofile -executionpolicy bypass -Command %1

run-m01-send-end-etl-email.ps1
$AdminName = "bida.common@businessintelligenceda.onmicrosoft.com"
$Pass = Get-Content "M:\A_BIDA_HOSTING\BIDAHS_C012_BC\S01-Schedule\bida-common-login-password.txt" | ConvertTo-SecureString
$cred = new-object -typename System.Management.Automation.PSCredential -argumentlist $AdminName, $Pass
Send-MailMessage -From bida.common@businessintelligenceda.onmicrosoft.com -To peter@bida.ro -Subject "BIDAWeekDayBatch001 - Batch Completed" -Body "BIDAWeekDayBatch001 - Batch Completed" -SmtpServer smtp.office365.com -Credential $cred -UseSsl -Port 587
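For readers who script outside PowerShell, here is a rough Python equivalent of the Send-MailMessage step above, using only the standard library. It is a sketch, not part of the Meta5 setup; the addresses, server, and port mirror the script above, and the actual send is left commented out since it needs the real mailbox password.

```python
# Rough Python equivalent of the Send-MailMessage call above.
# A sketch only; addresses and server mirror the PowerShell script.
import smtplib
from email.message import EmailMessage


def build_batch_email(sender, recipient, batch_name):
    """Build the completion notice; subject and body match the .ps1."""
    msg = EmailMessage()
    msg["From"] = sender
    msg["To"] = recipient
    msg["Subject"] = f"{batch_name} - Batch Completed"
    msg.set_content(f"{batch_name} - Batch Completed")
    return msg


def send_via_office365(msg, password):
    """STARTTLS on port 587, matching -UseSsl -Port 587 above."""
    with smtplib.SMTP("smtp.office365.com", 587) as smtp:
        smtp.starttls()
        smtp.login(msg["From"], password)
        smtp.send_message(msg)


msg = build_batch_email(
    "bida.common@businessintelligenceda.onmicrosoft.com",
    "peter@bida.ro",
    "BIDAWeekDayBatch001",
)
# send_via_office365(msg, password)  # needs the real mailbox password
```

As with the PowerShell version, the sending mailbox must be authorized for SMTP submission in Office 365 before this will work.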