
Friday, September 22, 2017

The role of XLIFF files in the LCS Translation service

In the July 2017 release for LCS, support for XLIFF Translation Memory (TM) files was added. XLIFF (XML Localisation Interchange File Format) is an XML format used to standardize how translations are passed between different tools (Wikipedia).

Such a file holds a lot of information: tags for the source and target languages, the source and target texts, an indication of whether the target text is the result of machine translation, an indication of whether you actually want a given text translated (using {locked} in the comment of a label sets this flag to No), the workflow status of the translation review, and much more.

If you have existing translations and you need to add new translations to a project, you will want to tell the translation service about the existing ones to avoid getting them overwritten. You do that by requesting an XLIFF file as the very first step:


And here you upload a source and target label file:

In relation to tips 1, 2 and 3, the tool doesn't seem to be picky about the structure of the label files. I have uploaded files with a different number of labels (because new labels had only been added to the source label file), and I have used files where a few labels were not in exactly the same order. If it doesn't work out, you must prep the target file first.

As a result, the browser returns the XLIFF file almost immediately. Make sure your browser doesn't block downloads; otherwise nothing happens.


This is a small sample from a file:
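(A hand-written XLIFF 1.2 fragment along these lines; the label ids and texts are invented, and the exact attributes in the LCS file may differ.)

<?xml version="1.0" encoding="utf-8"?>
<xliff version="1.2" xmlns="urn:oasis:names:tc:xliff:document:1.2">
  <file source-language="en-US" target-language="da-DK" datatype="plaintext" original="MyLabels">
    <body>
      <trans-unit id="CustGroup" translate="yes">
        <source>Customer group</source>
        <target state="translated" state-qualifier="mt-suggestion">Kundegruppe</target>
      </trans-unit>
      <trans-unit id="ProductName" translate="no">
        <source>Contoso</source>
        <target state="final">Contoso</target>
      </trans-unit>
    </body>
  </file>
</xliff>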


You can also open it in Excel:


At this point you can actually start working with the translation, for example by inserting manual translations that you want included.

You can also use the "Multilingual app toolkit" from Microsoft to work with the file. The program is a bit outdated: the latest supported Windows version is Windows 8.1, and the integration with the Microsoft translation services is broken. But as an editor the program works fine, and it has a nice UI for all the important tags of the XML file.

It looks like this:



Now that you have the base for the translation, you can start the actual translation.



This takes you to the file upload dialog, where you upload zip files with your labels and with your XLIFF file:




When you click Submit, the job enters the Processing status.

There are some manual steps involved in the processing, and you'll get an e-mail from the LCS team stating that this can take up to 5 business days. The manual involvement is related to the Microsoft Translator Hub being fed your XLIFF files in order to train it on your existing translations. This should mean that it is able to pick up on how you have translated existing terms and reuse that translation style for similar terms. For example (one that didn't go well in AX 2012), the word "Visa" can be translated into two different Danish words: either the required travel document (da: visum) or the credit card (da: Visa). When the Translator Hub knows what you have translated "Visa" into in other places in your label file, it has a better chance of determining which kind of visa you refer to.

To avoid kicking off the manual work just for the sake of this blog post, here is the result from another translation task.


The second file is a new label file which you can check into VSTS. You can do that directly or use this process if you might have other changes to the label file: Merge externally modified label files

The first file is a new XLIFF file including all the new translations. You can send this file out for manual review, and the receivers can essentially open it with a tool of their choice. Excel or the Multilingual App Toolkit would be good choices. You can actually also find a couple of online tools where you upload the file, edit the translations and download an updated file.

If the review causes changes to the XLIFF file, you can click the Regenerate button, upload the modified XLIFF file and almost immediately get a new label file based on that.

If you are 100% sure that only your source label file will change during development, you can keep this new XLIFF file and recycle it for your next translation. If both source and target could change, you should build a new XLIFF file.

Thursday, September 14, 2017

Sample loop through metadata of model

using System;
using Microsoft.Dynamics.AX.Metadata.Management;
using Microsoft.Dynamics.AX.Metadata.Modeling;
using Microsoft.Dynamics.AX.Metadata.Storage;
using Microsoft.Dynamics.AX.Metadata.Providers;
using Microsoft.Dynamics.AX.Metadata.MetaModel;

namespace MetaDataExample
{
    class Program
    {
        static void Main(string[] args)
        {
            // Create a metadata provider reading directly from the packages folder on disk
            string packagesLocalDirectory = @"J:\AosService\PackagesLocalDirectory";
            IMetadataProvider diskMetadataProvider = new MetadataProviderFactory().CreateDiskProvider(packagesLocalDirectory);

            // List all tables in the model, read each one and print its name
            foreach (var tableName in diskMetadataProvider.Tables.ListObjects("MyModelName"))
            {
                AxTable table = diskMetadataProvider.Tables.Read(tableName);
                Console.WriteLine(table.Name);
            }

            Console.ReadKey();
        }
    }
}
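Note that to compile this you need references to the Microsoft.Dynamics.AX.Metadata assemblies. You can find the dlls in the Bin folder under the packages folder, J:\AosService\PackagesLocalDirectory\Bin\ in this example.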

Wednesday, September 6, 2017

Important caveat with the "Chain of command" (CoC) feature

If you use CoC with the July 2017 application release, you need to compile and deploy any standard models you use CoC with. The reason is that the standard models were compiled with Update 8, which didn't have CoC.

This is a temporary inconvenience and will not be necessary after the "Autumn" release.

So, if you write ISV/VAR solutions, you might want to postpone usage of CoC for anything you release before the "Autumn" release.

Thursday, August 17, 2017

Package problem with LCS Code Upgrade to July 2017 release

If you get an error like "There is a problem with package 'ContactPerson'. Either the source version being upgraded from is incorrectly identified, or the target version has a package with the same name as a custom package." and you don't have a custom package of the same name, the solution could be the following.


In your existing source (Main branch) you should remove the dependencies on the failing models from your model definitions. Check in the change and start the upgrade tool again.

When the upgrade tool is finished, roll back your change in Main to keep that branch working.

Wednesday, August 9, 2017

Database log changes in Update 8

Update 8 changes how the database log records changes to data. Before Update 8 this was handled through triggers in the kernel plus some application code.

In Update 8 it happens via triggers created in the SQL database for each table you enable logging on.

You can still find the old application code in Update 8, but it is not triggered anymore.

There is a bit more information in this community thread: https://community.dynamics.com/ax/f/33/p/246718/684546#684546

Friday, July 14, 2017

Clear the AOD cache, including the SysExtension cache, from the UI

There is really no place in the UI of "Microsoft Dynamics 365 for Finance and Operations, Enterprise Edition" where you can find a button to clear the AOD cache.

If you for example introduce a new class to an extension based on SysExtension, this is a problem, because it won't be picked up until the cache has been cleared.

The earlier workaround doesn't seem to work in the "July 2017" update.

But the class to clear the cache is runnable, so you can build a URL to call it:

https://[your AOS].cloudax.dynamics.com/?mi=SysClassRunner&cls=SysFlushAOD

Or just make the URL for the menu item:
https://[your AOS].cloudax.dynamics.com/?mi=action:SysFlushAOD

Big thank you to Volker Deuss for giving me this idea.

Wednesday, July 12, 2017

New "Retail" model in "July 2017" creating backwards compatibility issues

The "July 2017" release has a new model called "Retail" and elements have been moved from other models to this model.

Even if you have made your own solutions exclusively with extensions, this might break your build because you might now need to add a reference to the "Retail" model.

But when you add that reference, you can't build your solution on the "1611" release anymore, because the build will complain about not knowing the "Retail" model listed in the Descriptor file.

So, even if your code works perfectly on both "1611" and "July 2017", you could have a need to branch out in VSTS to have two different Descriptor files.

Thursday, June 1, 2017

Merge externally modified label files

Sometimes you may need to have your label files reviewed or changed by an external party who doesn't have access to Visual Studio or your VSTS project.
This is a description of how a developer can merge the external changes back into the version control.

The tool used

You can of course use all sorts of different software to help you compare and merge the files. In this description we use the diff and merge tool that already comes with Visual Studio. This is the tool you'll see open when you have conflicts between local files and files in the source control repository.

These are the steps

The diff and merge tool doesn't have a menu entry in Visual Studio for opening it. So it must be opened through a command prompt and given parameters about the files you want to work with.

Open Visual Studio

Open Visual Studio as you'd normally do. Remember to open it with "Run as administrator".
Make sure that Visual Studio doesn't have a label dialog open.

Get latest label files

Ensure that your system is updated with the latest label files from the version control.

Open the command prompt

1. Click the Windows button:



2. Click the arrow:


3. Scroll to the right until you find the "Developer Command Prompt for VS2015":


4. Right-click it and choose "Run as administrator":


5. This opens the command prompt:



Start the merge tool

The merge tool needs to know of four file names:
  • Source file (in our case the current label file)
  • Target file (in our case this is the external file you have to merge in)
  • Base file (in our case also the current label file)
  • Result file (in our case also the current label file – as this is where we want to store the merge)
The following is an example with labels in a package called "EGDiagnosticsPlatform" and a model called "EG Diagnostics Platform". The external label file has been placed in H:\External.

Here are my changes, shown with another compare tool (Code Compare):


The command prompt looks like this:
vsdiffmerge "J:\AosService\PackagesLocalDirectory\EGDiagnosticsPlatform\EG Diagnostics Platform\AxLabelFile\LabelResources\en-US\EGDiagnosticsPlatform.en-US.label.txt" "H:\External\EGDiagnosticsPlatform.en-US.label.txt" "J:\AosService\PackagesLocalDirectory\EGDiagnosticsPlatform\EG Diagnostics Platform\AxLabelFile\LabelResources\en-US\EGDiagnosticsPlatform.en-US.label.txt" "J:\AosService\PackagesLocalDirectory\EGDiagnosticsPlatform\EG Diagnostics Platform\AxLabelFile\LabelResources\en-US\EGDiagnosticsPlatform.en-US.label.txt" /m

Here it is in the command prompt:




And this is the result:


Note that this way of assigning source and target makes the tool suggest removing the new labels made in-house while the file was out for review. This is something you must fix manually.

Not what we want:


Fixed:


Version control

Note how running the tool immediately flagged the label file as a pending change (because we chose it as the Result file):


PowerShell

If you don't want to figure out how to fill in the prompt each time you do this merge, you could use a PowerShell script like this:
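(A minimal sketch of such a script, assuming vsDiffMerge.exe from VS2015 in its default location.)

$internalFile = Read-Host "Location of the internal label file"
$externalFile = Read-Host "Location of the external label file"

# Source, base and result are all the internal file; target is the external file
$vsDiffMerge = Join-Path $env:VS140COMNTOOLS "..\IDE\vsDiffMerge.exe"
& $vsDiffMerge $internalFile $externalFile $internalFile $internalFile /m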


Run PowerShell as Administrator


Go to the location of the script and run it



Enter the location of your internal file and of the external file:

Version with parameters

If you need to merge the same file repeatedly, you might want another version of the script. If I knew PowerShell better, I'm sure these could be combined, so the script only asks for parameters when they are not given.

The base function
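A sketch of what it could look like (the function and parameter names are my own):

function Merge-LabelFile
{
    param
    (
        [Parameter(Mandatory = $true)] [string] $InternalFile,
        [Parameter(Mandatory = $true)] [string] $ExternalFile
    )

    # Source, base and result are all the internal file; target is the external file
    $vsDiffMerge = Join-Path $env:VS140COMNTOOLS "..\IDE\vsDiffMerge.exe"
    & $vsDiffMerge $InternalFile $ExternalFile $InternalFile $InternalFile /m
}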


The call of the function
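For example, with the files from the example above:

Merge-LabelFile `
    -InternalFile "J:\AosService\PackagesLocalDirectory\EGDiagnosticsPlatform\EG Diagnostics Platform\AxLabelFile\LabelResources\en-US\EGDiagnosticsPlatform.en-US.label.txt" `
    -ExternalFile "H:\External\EGDiagnosticsPlatform.en-US.label.txt"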


Running it

Saturday, May 27, 2017

First book about Dynamics 365 for Operations development is out

From the publisher's homepage:

"Dynamics 365 for Operations is the ERP element of Microsoft’s new Dynamics 365 Enterprise Edition. Operations delivers the infrastructure to allow businesses to achieve growth and make better decisions using scalable and contemporary ERP system tools.

This book provides a collection of “recipes” to instruct you on how to create—and extend—a real-world solution using Operations. All key aspects of the new release are covered, and insights into the development language, structure, and tools are discussed in detail.

New concepts and patterns that are pivotal to elegant solution designs are introduced and explained, and readers will learn how to extend various aspects of the system to enhance both the usability and capabilities of Operations. Together, this gives the reader important context regarding the new concepts and the confidence to reuse in their own solution designs.

This “cookbook” provides the ingredients and methods needed to maximize the efficiency of your business management using the latest in ERP software—Dynamics 365 for Operations."

Buy it from the publisher or from Amazon.

Friday, May 19, 2017

Use tags to find related bits and pieces in your code base

A lot of elements in Dynamics 365 for Operations (D365FO), if not all, have a property called "Tags".

With this property you can link all sorts of otherwise unrelated elements. You could for example use it as part of a bigger change to your code base, marking all the bits you need to modify.

Adding a tag is as simple as just using it:


For Visual Studio to find the places where a tag is used, you need to do a build with the "Build cross reference data" option checked:


And here is where you search for tags:




And here is the result:


This ought to work for code too. The editor knows about the tags and has IntelliSense for them:


But the "Find tag references" tool doesn't pick them up. I wonder if that is a bug or not.

And in my opinion you should try not to go overboard, tagging away all day long.

Monday, May 15, 2017

FormRun.wait, Box and ChangeCompany - a poor cocktail

I have had some code like this (simplified for the purpose of the example):
...
changeCompany('xyz')
{
    formRun = new FormRun(...);
    formRun.wait();
}

Box::YesNo(...)
If you run this code and manually change company while the form is open and "waiting", the call to Box will make the system freeze up. I have reported this as a bug, but it has not been accepted for fixing.

The workaround, or you could argue the proper solution, is to add an event handler for formRun.lifecycleHelper().StageChanging and run the Box code when the stage of the form changes to "Closing". That way Box is called before the form is entirely closed and the company account is changed, and then Box runs all right.

Friday, April 7, 2017

Use Microsoft Flow to start up your Azure VMs

Having Azure VMs running when no one uses them is the cloud equivalent of setting your money on fire.

With the ARM deployments we can now do for D365fO, we can pretty easily make the machines shut down automatically on a schedule:


But what about startup? If you start a machine manually, it takes a while before you can connect to it and use it. So it is preferable to start machines before the workday starts.

If you control startup with scheduled RunBook scripts, you risk starting machines that will not be used during the day. Developers might be on other projects, out sick or on vacation.

So I have been looking into using Microsoft Flow to start machines. The idea is that developers open their Flow app before leaving home in the morning to start the machines they want to work on that day. Here is how that could look in the Flow app:

    
To get there, the first step is to create an Azure Automation account and create the RunBook for starting a VM. You can find the RunBook as a template in the Azure Portal. The process is very well described in this article on the Axology blog: https://axology.wordpress.com/2016/12/09/automated-startup-and-shutdown-of-azure-vms/
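The core of such a runbook boils down to something like this sketch (the gallery template is more elaborate; the cmdlets are the 2017-era AzureRM ones, and the parameter names are my own):

param
(
    [Parameter(Mandatory = $true)] [string] $ResourceGroupName,
    [Parameter(Mandatory = $true)] [string] $VMName
)

# Log in with the Automation account's Run As service principal
$connection = Get-AutomationConnection -Name "AzureRunAsConnection"
Add-AzureRmAccount -ServicePrincipal -TenantId $connection.TenantId `
    -ApplicationId $connection.ApplicationId `
    -CertificateThumbprint $connection.CertificateThumbprint

# Start the VM
Start-AzureRmVM -ResourceGroupName $ResourceGroupName -Name $VMName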

The Flow app is very simple to set up. Here are the steps for creating the app button (sorry, some of this is in Danish).

First add the manual trigger:


Then add the action, which in this case is an "Azure Automation - Create job" action:



Finally I get a notification when the job is done. Not when the machine is actually running, but when the job kicking off the startup is done:


That is it for a really simple flow. You can probably think of a lot of bells and whistles to add.

I would like to try it out with one of the physical buttons that work with Flow, like Flic, as the trigger:


A button is ordered...

Language issue in AX 2012 with Views built on security tables

Some of the security related system/kernel tables in AX 2012, like for example SecurityRole, are backed by Views in the model store.

If you look at the Views in the model store, you can see that some fields hold a label id rather than a text. For SecurityRole, the fields Name and Description hold label ids.

Somewhere in the AX kernel these label ids are converted to actual texts according to the language of the user. So inside AX we don't see these labels, but we see the language specific texts.

This is also the case if you create a View where these tables are part of the metadata.

However, if you change the language for your user, these Views don't seem to be updated according to the new language. They keep returning data in the previous language, even after restarting the client and the AOS. The only way I have found to make them use the new language is to manually synchronize the View from the AOT.

Thursday, April 6, 2017

Resolve dll references for a project generated from a template

When you create a new project from a template, for example a new Best Practice Rules project, you might get a bunch of references that aren't resolved:



To resolve these you can point each reference to the correct dll individually, or you can take the following faster approach.

In Solution Explorer, right-click your project node and click Properties. The Project Designer appears:

Select the Reference Paths page. In the Folder field, type the path of the folder that contains the items you want to reference, and then click the Add Folder button:

The path for most D365fO dlls is J:\AosService\PackagesLocalDirectory\Bin\

Click Save to save the setting and then close the dialog:


And ... voilà:

Monday, April 3, 2017

A gentle introduction to the Immediate Window

Ivan Kashperuk wrote a blog post a while ago about using the Immediate Window when debugging. I wanted to do a more gentle introduction to it, hopefully making you want to go read Ivan's much more thorough post.

The Immediate Window is one of the windows that are open while you debug. The idea is that you can have it evaluate simple X++ expressions, using the variables of the program you are debugging.

Now in Dynamics 365 for Operations, things like table ids or class ids are not something you can see in the AOT, so debugging a piece of code that uses these ids can be difficult. You don't know which table or class names the ids correspond to. This is where I currently use the Immediate Window the most, calling something like this (where _tableId is a variable in my program):



There are some technicalities you have to bear in mind when using this. You have to use .NET syntax and .NET types for the expression, it is case sensitive and it is not able to deal with all statements from X++ (like for example a select statement).

You can to some extent use the same expressions from the Immediate Window as conditional breakpoints. There are some caveats though. If you call stuff from the AOS kernel, some pieces of the kernel are still written in unmanaged code, and they will fail with different kinds of errors. Only the parts of the kernel written in managed code will work.

If you don't follow Ivan's blog already, you should. It is one of the best blogs for developer related information. You can find it at this URL: http://kashperuk.blogspot.com

Thursday, February 16, 2017

AXBUILD fails with a "System.IO.IOException: The file exists" error

Here is a problem with AXBUILD that today added more grayness to my limited amount of hair:


The reason: when AXBUILD kicks off the compilation threads, it needs some temporary files, and it uses a .NET function to get their names. It turns out that if the user this runs as has more than 65535 files named tmpXXXX.tmp in the Temp folder, the function simply fails. The naming apparently runs from tmp0000.tmp to tmpFFFF.tmp.
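The .NET function in question is presumably System.IO.Path.GetTempFileName(), which is documented to fail with exactly this exception when the pool of names is exhausted:

// Creates (and leaves behind) a file named tmpXXXX.tmp in the Temp folder.
// XXXX is hexadecimal, so there are at most 65536 possible names; when they
// are all taken, the call throws System.IO.IOException: The file exists.
string tempFile = System.IO.Path.GetTempFileName();
System.Console.WriteLine(tempFile);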

So the solution was to clean up this folder for our build user, and now all our builds are running again.

Friday, January 6, 2017

Authoring Best Practice checks that use XML based input

In AX7 you can pretty easily write and deploy your own Best Practice checks. The process of doing so is described in this article on the wiki.

Some of the standard Best Practice checks use XML files as input. These XML files are found under your local packages folder, in ..\PackagesLocalDirectory\Bin\BPExtensions\RuleXml

The following is an example of how you can write your own Best Practice check that uses an XML file as input. The check itself is used to check the naming of table extensions. We want to check that a certain prefix is part of the name, and that it is put in the correct place in the overall name. The prefix to look for is specified in the XML file.

The XML file looks like this:
The file is stored in ..\PackagesLocalDirectory\Bin\BPExtensions\RuleXml and the name is MVPProductPrefix.xml. The name is important, as a reference is made to it from the code.
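Since the schema only has to match the data contract class you write for it (see below), the file can be as simple as something like this (the element names here are just an example):

<?xml version="1.0" encoding="utf-8"?>
<MVPProductPrefix>
  <Prefix>MVP</Prefix>
</MVPProductPrefix>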


Your Best Practice class needs to extend either DetectorWithXMLDataFile or MetadataDetectorWithXMLDataFile depending on the type of your check. In this case we extend MetadataDetectorWithXMLDataFile since we will be looking at table extension names.

Included in your solution you need to add a data contract class for the XML file. This class must extend from RulePatternsBase.
The framework and the base classes will then do the rest of the work: loading the XML file, de-serializing it and caching it.

Here is the code:

And here is the code for the diagnostic item:

I'd like to thank Joris de Gruyter for helpful hints on how to solve this. And .NET Reflector has been a great tool for looking into the standard Best Practice dlls to figure out how they work (Microsoft MVPs get a free license for .NET Reflector from Redgate).