Convert JSON files to Word documents

A while back I built the first version of ndocx – a utility that you can call in your terminal to populate standard Word templates. It can be used to generate documents based on simple JSON files; you can think of it as a Word-document visualizer for your JSON files. The ndocx utility doesn’t need Word or any other application to be installed on the machine, which makes it a good fit for services, and you don’t need any license to run it. It’s a real open-source alternative that you can use as you wish.

ndocx is in fact the command-line interface of Novo Docx, a library built on the latest .NET. You can also find an Azure Functions app that exposes the same functionality as a REST service – a good alternative if you don’t want to deal with document manipulation yourself.

If you have a JSON file and a Word template, using ndocx is as easy as the following command.

ndocx populate mytemplate.docx -params params.json -output output.docx

Take the following Word template as an example.

A very simple Word template

It has just one placeholder called a. In order to populate that placeholder, you would need a JSON file like the following.

{
  "a": "simple test"
}

In other words, for each placeholder in your Word template you need an attribute in your JSON file with the exact same name.

If you have a repeating section in your Word template, you will need an array for that repeating section, and inside the array each placeholder (that you want to populate) needs a matching attribute. For example, the following JSON has a simple attribute called “a” and an array called “repeat1”.

{
  "a": "simpla test",
  "repeat1": [
    {"a": "Col 1 - row 1", "b": "col 2 - row 1"},
    {"a": "Col 1 - row 2", "b": "col 2 - row 2"}]
}

The given JSON file can match the following Word document, with a simple placeholder called “a” and a repeating section called “repeat1”.

A simple template with a repeating section

Just keep in mind that currently only the “Plain text content control” and the “Repeating section content control” are supported, which should be enough for the majority of use cases.

You can download the latest release of ndocx from its GitHub repository, Novo Docx:

https://github.com/rezanid/novodocx/releases

Power Platform Tidbits #7 – Optimize for server-side execution (folding) when adding columns

Power Query can take some of the relational transformations and ask the data source to take care of them. This means that transformations – like grouping, filtering, adding columns and more – can run at the data source, where they are more efficient. It also avoids downloading huge amounts of data and accelerates execution. In Power BI terms, this is called folding.

When writing a Power Query query, you need to make sure every step that can be passed to the data source is passed to the data source. If you pay attention to the Applied steps pane, a green vertical line displayed beside a step means that the step is passed to the data source. The following picture shows that the first steps, up to and including “Kept top rows”, are passed to the data source.

Please note that once you add a step that cannot be passed to the data source, any step that comes after it will not be passed to the data source either. As a general rule of thumb, you should apply all foldable steps at the beginning when you can.
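To make this concrete, here is a minimal sketch in M (the server, database, table and column names are made up) where the first steps can fold into a single query sent to the source, while the index column is added last because that kind of step typically breaks folding:

let
    // Connecting to a relational (SQL) source; this is where folding shines
    Source = Sql.Database("myserver", "mydb"),
    Sales = Source{[Schema="dbo", Item="Sales"]}[Data],
    // Foldable steps first: the engine can translate these into the source query
    FilteredRows = Table.SelectRows(Sales, each [Amount] > 1000),
    SelectedColumns = Table.SelectColumns(FilteredRows, {"Id", "Amount", "Country"}),
    // A step like this typically cannot fold, so keep it after the foldable ones
    AddedIndex = Table.AddIndexColumn(SelectedColumns, "RowNumber", 1, 1)
in
    AddedIndex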

You should also pay attention when you are mixing multiple functions in a single step. You might notice that when you separate them, a part of the step can fold to the data source. Sometimes this can be tricky to spot, especially when the UI mixes them behind the scenes. Let’s take an example to make it clear.

For example, when you add a new column and define its data type through the UI, Power BI will combine two functions, Table.AddColumn and Table.TransformColumnTypes, even though it’s just one column and Table.AddColumn is capable of defining the column type on its own. See for yourself.

Adding a new column through the UI combines Table.AddColumn with Table.TransformColumnTypes.

Now the issue is that even though Table.AddColumn is a foldable function, Table.TransformColumnTypes isn’t. To fix this, you either need to separate them into different steps or simply rely on Table.AddColumn to both add the column and define its type. In the above example we must change:

Table.TransformColumnTypes(Table.AddColumn(#"Selected columns", "Custom", each 1000), {{"Custom", Int64.Type}})

To:

Table.AddColumn(#"Selected columns", "Custom", each 1000, Int64.Type)

And voila:

Adding the column and defining its type in the same Table.AddColumn call makes the step foldable.

NovoDocx – An open source service that generates documents using standard Word templates

Lately, as part of a solution I have been building for a client, we had to deal with several document generation flows. I’m talking about really complicated document templates. For example, one template is composed of several tables and sections that compare the different credits and mortgages a client can take, their financial impact, benefits, downsides and more. This is nothing like the simple mortgage documents that your local bank shares with you. I am happy for the clients who will receive such well-thought-out documents, but for us poor developers this was not an easy task.

There are two out-of-the-box options when you need to populate Word documents.

  • The Word templates feature, which has existed for ages in Dynamics CRM and is now inherited by Power Platform. This is normally the easiest option, but it has many limitations and only works in harmony with rows in a table and its directly related tables.
  • The Word Online connector, which is a premium connector and more capable, with no dependency on any table. The main issue with this one shows up when you have flows in your solution that will be deployed to different environments, or any time the location of your document needs to change. In that case, instead of the names of the placeholders you would need to use random numbers, because this connector cannot figure out the schema of the document or lock it after you build one.

For the second option there are some workarounds that I will be writing about in the future, but what if the logic is so complex that using flows or a Power App does not make sense or is not enough?

In search of an open-source library

At this point I started looking for an open-source library, preferably in .NET, to get the job done. It should exist in this day and age, right? The answer is no, it doesn’t. There are only some paid libraries and services that are quite expensive for what they do, and working with them is not as easy as one might think. After digging deeper into GitHub, I was able to find some libraries that either rely on OpenOffice or a headless browser, or whose core is a closed-source black box. Honestly, the fact that there was no open-source library, let alone one in .NET, was annoying enough that I decided to start one. I decided to start simple and make it as easy as possible for developers to use, with no knowledge of the underlying format required (😎 ehem, which I know pretty well, but not at Eric White level ☺).

Novo Docx

NovoDocx’s first readme file pushed to Git

Today I share with you the first iteration of Novo Docx, my take on a simple-to-use and simple-to-host library and service that you can use in your own projects. The current features are enough to handle complex documents, and it is quite fast. I have tried to capture many edge cases too. I will be focusing on enriching the functionality and providing more hosting options. The source code is hosted on GitHub and includes a simple library and an Azure Functions app that you can use however you like.

Power Platform Tidbits #6 – Plugins in single-project vs multi-project

Plugins have been, and still are, an integral extension point in Dynamics and now Dataverse. They allow you to put logic close to the data – logic that can be invoked at different stages of the execution of an action (aka message).

When the number of plugins grows, you might be tempted to separate them into different projects and even share common logic through a project that is referenced by the other projects. This is the most common way of organizing code in almost any kind of application development. However, when it comes to plugin development for Dataverse, this is not the recommended approach, and for good reasons.

Benefits of single-project plugins

In fact, Microsoft recommends that you consolidate all your plugins into one assembly until the size of the resulting assembly grows larger than 16 MB. So far it seems that single-project is the only way to go, but there is a caveat here!

Benefits of multi-project plugins

When you are on a big project with multiple developers assigned to plugin development at the same time, the complexity of deployment and testing increases with the number of developers. Imagine each developer working on a new plugin (presumably each in a different Git branch) while the test environment is shared. Every time a new version of the assembly is deployed, it might contain only one of the new plugins.

One solution is to give every developer an isolated environment, but that is not always possible due to the number of integrations required and the many other complications that come with provisioning a new environment.

Another solution is to put in place better discipline around your branching and deployment strategy. For example, you can ask developers to communicate with each other before a deployment and cherry-pick the other plugins into their own branch to make sure all the latest plugins are included in the deployment. This works, but it is prone to human error (what isn’t 😏). So let’s summarize. Whenever you need to deploy a new plugin for testing:

  • Try to manage requirements in such a way that you reduce the number of plugins in development at any given time.
  • Put each plugin in a dedicated branch.
  • Communicate before each deployment.
  • Cherry-pick from other branches anything that needs to be included in the deployment (see the example after this list).
  • Implement unit testing in your plugins to reduce the number of deployments required.
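For example, before deploying, a developer could pull a colleague’s plugin changes into their own branch like this (the branch name and commit reference are placeholders):

git checkout feature/lead-qualification
git cherry-pick <commit-sha-of-the-other-plugin>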

This last one (i.e. unit testing) can substantially reduce the number of deployments, increase quality and save precious time. The first one is important too.
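As a minimal sketch of what that could look like (the class and test below are made up, and I’m assuming the plugin’s core logic is extracted into a plain class so it can run without a Dataverse connection; the test uses xUnit):

using Xunit;

// Core logic pulled out of the plugin's Execute method so it can be tested in isolation.
public static class LeadScoringLogic
{
    public static int CalculateScore(decimal budget, bool hasExistingAccount) =>
        (budget > 100000m ? 50 : 10) + (hasExistingAccount ? 25 : 0);
}

public class LeadScoringLogicTests
{
    [Fact]
    public void HighBudgetWithExistingAccount_GetsTopScore() =>
        Assert.Equal(75, LeadScoringLogic.CalculateScore(150000m, hasExistingAccount: true));
}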

Is there a better way?

There isn’t, or is there? 😑 Ok, I will tell you. If the above does not suit you for some reason, there is another solution that might help. In other words, if you tried the above and you are still facing issues, or if you are avoiding the single-project approach because of the issues you have faced, there might be a solution for you.

Visual Studio has a feature called linked files. This feature is very easy to use and allows you to add links to files in your project without physically copying the files into it. I have always used this approach in SSIS development to target multiple versions of SSIS without duplicating the project. But how can we benefit from it in our plugins?

Imagine you are developing several plugins and each developer is working on one plugin (or more). You still put all your plugins in a single project and follow the same branching strategy, but when a developer needs to deploy a new plugin, he/she can create a new project, add all the required files as links into that project, then deploy and test as needed. You can clean up and remove that project later, before merging back to the main branch. The following demonstrates a situation where a developer is working on a LeadQualification plugin in a fancy project.

Fancy.Plugins:
- PluginBase.cs <──────────────╮
- LeadQualificationPlugin.cs <─│─╮
- NamingPlugin.cs              │ │
- ConnectionPlugin.cs          │ │
- ...                          │ │
                               │ │
Fancy.Lead:                    │ │
+ PluginBase.cs ───────────────╯ │
+ LeadQualificationPlugin.cs ────╯

As you may have noticed, the Fancy.Lead project is just an empty project with two links into the main project, but when you compile, it will give you Fancy.Lead.dll in addition to Fancy.Plugins.dll. That means you can deploy and test this smaller assembly TEMPORARILY until you are ready to merge back to the main branch.

To create those links, simply hold down Ctrl + Shift while you drag and drop the file from the Fancy.Plugins project to the Fancy.Lead project. If you have ever made shortcuts to files in Windows, this is pretty similar.
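Under the hood, a linked file is just an entry in the project file that points outside the project folder. As a rough sketch (the paths below are illustrative and depend on your folder layout), Fancy.Lead.csproj would contain something like:

<ItemGroup>
  <!-- Linked files: compiled into Fancy.Lead.dll but physically living in Fancy.Plugins -->
  <Compile Include="..\Fancy.Plugins\PluginBase.cs" Link="PluginBase.cs" />
  <Compile Include="..\Fancy.Plugins\LeadQualificationPlugin.cs" Link="LeadQualificationPlugin.cs" />
</ItemGroup>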

Build dialogs using model-driven pages to run flows all with low-code

Low-code is all the rage these days. It makes for more maintainable solutions that are built faster, especially when it comes to extending existing platforms like Power Platform / Dynamics. It makes our work more decoupled from Microsoft’s, and we can each focus on building better solutions without treading on each other’s toes. 😊

One of the main extension points in Power Platform is adding custom buttons to the Command bar. If you are not fully familiar with the command bar, be sure to read the following article in the official docs to learn what it is and where it is before reading further.

Command designer overview – Power Apps | Microsoft Docs

The previous experience was called the Ribbon, which is still around. If you would like to know more about this evolution, please read the following page from the official docs.

Command bar or ribbon presentation (model-driven apps) – Power Apps | Microsoft Docs

What are we building?

  • Reusable JavaScript library to open custom pages as dialog boxes
  • Custom button in command bar
  • Custom page in model-driven app to act as a dialog box
  • A cloud flow

In case you are wondering whether it is good practice to display custom pages in dialog boxes: this whole approach is recommended by Microsoft. The only caveat right now is that Microsoft has not (yet?) provided a low-code way to open custom pages. Instead, they have given us a list of JavaScript samples that can be used as starting points. I especially recommend you read the first article to get a good idea of where we are headed.

A cloud flow that can be triggered by apps

First things first, right? If we are building a dialog box to run cloud flows, we first need a cloud flow. Anything will do, just remember it should be a flow that can be triggered by Power Apps, and I strongly recommend responding back with the result in your flow.

You’ll notice in the following screenshot that I have a flow that uses the PowerApps (V2) trigger, and at the end it returns two parameters as a result to any app that calls the flow.

Adding a Respond to a Power App or Flow action at the end lets the caller (app or flow) know whether your flow has succeeded, and the caller then has a chance to react accordingly.

A custom page in your model-driven app

The next step is to build a custom page that can be used as a dialog box. The goal of this custom page (which is a kind of canvas app with some extra powers from model-driven apps, by the way) is to help the end user run a flow (or several flows), wait for it to complete while displaying a spinner, and once the flow completes, let the user know if something went wrong or simply disappear and refresh the data in the form.

To add a custom page to your model-driven app, you need to open the app in the new modern editor and click on the New Page button.

I suggest you build your custom page in a responsive way, so you can display it in any kind of dialog, no matter the size. To make a responsive page, I take the following approach.

As you can see in the image, I have one screen that contains a vertical container, an image called BusySpinner, and a rectangle (BusyOverlay) that fills the entire page to hide the content while the flow is running.

In the VerticalContainer, I have two horizontal containers: the first one contains the only input parameter (language) I need to ask the user for, and the last one contains the OkButton. In your case you might need several horizontal containers, one per parameter.

The visual layout of the page is similar to the following diagram. If it’s not clear enough, let me know in the comments and I will write all the properties you might need in a table here.

You will need two variables to store the state of the page, IsBusy and IsValid.

  • IsBusy – this variable will be true while the cloud flow is executing and false when it is not.
  • IsValid – this variable will be false by default and will become true when all the required parameters are given by the end user and they are valid.

In your app’s OnStart, you need the following commands to initialize the variables. I use this event to also initialize a simple collection that I will use to fill a combo box for my language parameter.

Set(IsValid, Not(IsBlank(Param("recordId"))));
Set(IsBusy, false);
If(Not(IsValid), Notify("Opportunity is not selected!"));
ClearCollect(LanguageCodes, "EN", "FR", "DE", "PT");

Next, you’ll need to add the cloud flow to your page. Once you do that, you can trigger it in the OnSelect event of the OkButton.

Set(IsBusy, true);
Set(FlowResult,'Generatemandatedocument'.Run(Param("recordId"), LanguageOptions.Selected.Value));
If(FlowResult.succeeded, 
    Notify(FlowResult.message, NotificationType.Success);Back(), 
    Notify(FlowResult.message, NotificationType.Error));
Set(IsBusy, false);

As you may have noticed in the above code, I set the IsBusy variable to true, then I start the flow and put its result in a variable called FlowResult. This allows me to check the result of the flow in the lines that follow: if the result is successful, I simply close the page (dialog); otherwise, I inform the user. At the end I set IsBusy back to false.

To display the spinner and fade the content behind it while the flow is running, you simply need to put IsBusy in the Visible property of both BusySpinner and BusyOverlay.

You can also use the following expression in the DisplayMode property of the OkButton. This will disable the button when the flow is ongoing or when the input parameters are not valid.

If(IsValid And Not(IsBusy), DisplayMode.Edit, DisplayMode.Disabled)

Reusable JavaScript library to show custom pages in dialogs

Save the following script in a .js file (e.g. dialogs.js), then open your solution in https://make.powerapps.com, click on New, and from the drop-down menu open More and click on Web resource.

/* This function uses Navigation API to display a centered dialog. */
/* Simple parameters are used instead of object to make the function compatible with command bar parameters. */
function displayDialog(rowId, tableName, pageName, title, contentWidth, contentHeight, primaryControl) {
  const parseSize = (str, defaultStr) => (str || defaultStr || "500px").match(/([\d\.]*)(.*)/).slice(1, 3);
  let [w, wUnit] = parseSize(contentWidth);
  let [h, hUnit] = parseSize(contentHeight);
  var pageInput = {
    pageType: "custom",
    name: pageName,
    entityName: tableName,
    recordId: rowId.replace(/[{}]/g, "")
  };
  var navigationOptions = {
    target: 2,
    position: 1,
    width: {value: parseFloat(w), unit: wUnit},
    height: {value: parseFloat(h), unit: hUnit},
    title: title
  };
  Xrm.Navigation.navigateTo(pageInput, navigationOptions)
    .then(
      function () {
        if (primaryControl.getFormContext) { primaryControl.getFormContext().data.refresh(); }
        else { primaryControl.data.refresh(); }
      }
    ).catch(
      function (error) {
        /* TODO: Add exception handling here. */
      }
    );
}
The script is also available as a gist: https://gist.github.com/rezanid/33d22ae5381d1d159737a9645d7be098

The above code is based on Microsoft’s Centered Dialog sample; I have only made it a bit more production-ready.

  • The whole script is a function that can be easily called from the new Command bar designer (still in preview).
  • Width and height have default values to be used in case no value is given by the custom button.
  • Curly braces are removed from rowId to make it Power App friendly.
  • When the dialog box closes, the content is refreshed.

Pay attention to how the formContext object is retrieved in the then callback: via primaryControl.getFormContext() when it is available, otherwise from primaryControl directly. It is done that way so you can use the same script from any flavor of the command bar.

I recommend you improve the script to cover some edge cases and, more importantly, add exception handling to notify the end user properly if “something went wrong“. ☠
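For example (just one possible approach, sketched here), the empty catch block could surface the error through the client API’s standard error dialog instead of swallowing it:

function handleDialogError(error) {
  /* Show the platform's standard error dialog so the user isn't left guessing. */
  Xrm.Navigation.openErrorDialog({
    message: (error && error.message) || "Something went wrong while opening the dialog."
  });
}

You would then pass handleDialogError to the catch call in displayDialog instead of the empty function.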

Adding a custom button to command bar

The new command bar editor can be found in the modern app editor (which is in preview at the time of this writing). You need to select an app in your solution, click on the three dots to open its context menu, and select the Edit in preview item, just like in the following screenshot.

Once in the new app editor, scroll down to the table whose command bar should receive this fancy new button. In the following screenshot, you’ll notice I’m editing the command bar of the Opportunity table.

Power Platform will ask you which command bar you are interested in. I am going to choose Main form, but in case you are wondering what these options are, you should read the first link I shared at the beginning of my post 😉.

Once in the command bar editor, add a new button with the following properties.

  • Label: Run my flow (can be anything you want).
  • Action: Run JavaScript (in the future I hope we will be able to use Power Fx instead).
  • Library: the name you gave to your web resource; I used fancy_dialogs, for example. Note: if you can’t find your library in the list, click the pen icon just beside the drop-down menu.
  • Function name: displayDialog
  • Parameter 1: FirstPrimaryItemId
  • Parameter 2: String | opportunity (this should be the logical name of your table)
  • Parameter 3: String | fancy_flowlauncher (this should be the name of your custom page)
  • Parameter 4: String | Run flow (this will be the title of your dialog)
  • Parameter 5: String | 500px
  • Parameter 6: String | 500px
  • Parameter 7: PrimaryControl

If you have followed all the steps correctly, the end result should be similar to the following.