Wednesday, 9 December 2009

Batch-renaming a primary key.

As a programmer who has experienced the "terror" of batch-renaming primary keys in Dynamics AX's predecessor XAL/C5, I was pleasantly surprised to find how easy it is in Dynamics AX 2009.

Though not documented, a method called renamePrimaryKey exists on the kernel class xRecord, which every table apparently inherits from.

If you call this method after changing the primary key field on a table buffer, the change is cascaded to all related tables. :0)

I was given the task of renaming items in the InventTable.
This was in an environment that was about to go live - so THERE WERE NO TRANSACTIONS OR STOCK LEVELS. Had there been, we would have advised the customer to live with the existing item numbers, as the renaming process could potentially have taken a very long time.

Nearly 75% of the item numbers were 4 digits long, the rest 5.
The customer wanted only 5-digit item numbers.

How to do this in one go?
A simple job is enough:


static void Job7(Args _args)
{
    InventTable inventTable;
    ;

    ttsbegin;

    // Select every item whose ItemId is exactly four characters long
    // ("?" matches a single character in an X++ like expression).
    while select forupdate inventTable
        where inventTable.ItemId like "????"
    {
        // Prefix a zero and let the kernel cascade the rename to all related tables.
        inventTable.ItemId = "0" + inventTable.ItemId;
        inventTable.renamePrimaryKey();
    }

    ttscommit;
}

As always, USE AT YOUR OWN RISK.

DYNAMICS AX Import Export Tool - digging a bit deeper.

Normally, when I've used the export/import tool in AX, it has been to dump data from a small database and import it into another environment, or to install some sort of demonstration database in an environment.

Up until now, whenever I've been given the task of importing data into a customer's environment, I've always programmed some sort of class to handle the import.

A couple of weeks ago, a colleague of mine showed me that the import/export tool can be used for that same task - that is, importing data from e.g. a .csv file that your customer has prepared in Excel.

I was thrilled to discover that I don't actually always need to write a class to import data into a specific table (or a set of tables).

The procedure is:

1) Create your own definition group - make sure it is empty by removing all checkmarks on the Set up and Include table groups tab pages, and choose the type User defined.
Save the definition group.

2) Now click the Table setup button. Choose the table that you want to import data into. Choose the import status Delete and import (if you want to clear the table of data before importing) or just Import. Specify the name of the file containing the data.

3) On the General tab you can specify the field delimiter (in my case ";" for a .csv file).

4) On the Conversion tab you can actually write X++ code that will be compiled and executed for each line read from the file. In the method you get a table buffer of the table chosen in step 2 and a container holding all the fields read from the file.
This means you can perform custom transformations, and even create records in related tables if you want, while importing data into the chosen table (see the sketch after this list).
You can let the compiler parse the code to verify the syntax before saving the definition.

5) Using the last tab page on the form, you can have the first record read from the file and shown to you, with the values read mapped to the fields chosen in step 6.

6) Using the Field setup button you can map the fields/values read from the file to the fields of the table chosen in step 2.
Even at the field level you can write conversion code similar to the code mentioned in step 4.
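
To give an impression of the kind of conversion code step 4 allows, here is a minimal sketch, wrapped in a job so it can be run on its own. The job name and the variable names (inventTable for the table buffer, line for the container of file fields) are placeholders of my own choosing - check the conversion editor to see which names are actually available for the table you chose in step 2. The sketch pads a four-digit item number to five digits before assigning the fields:

static void ConversionSketch(Args _args)
{
    InventTable inventTable;
    container   line = ["4711", "My item name"];   // pretend this came from one line of the .csv file
    str         itemId;
    ;

    itemId = conPeek(line, 1);

    // Pad four-digit item numbers to five digits before assigning the field.
    if (strLen(itemId) == 4)
    {
        itemId = "0" + itemId;
    }

    inventTable.ItemId   = itemId;
    inventTable.ItemName = conPeek(line, 2);

    info(strFmt("Item %1 - %2", inventTable.ItemId, inventTable.ItemName));
}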

All this works very nicely for importing data.

I have now used this for imports in the supply chain modules - inventory locations, WMS locations, planning data and item coverage data - preparing the go-live at a customer site.
This was done in a test environment.

When the customer had verified the test data, we wanted to move the import templates created with the steps above to the live production environment, to perform the "real" import.

How is that done?

Moving the import definition between environments (test and production) is done by exporting the data in these tables:

SysExpImpGroup (which contains the definition group)
SysExpImpTable (which contains the tables of the def. group)
SysExpImpField (which contains the fields of the tables of the def. group)
and possibly
SysExpImpQuery (which contains any export criteria, if used).

There is one little problem with the above procedure for moving the SysExpImp tables.

When you import the exported data, the code you wrote on the Conversion tab pages gets messed up: the import process apparently strips all newlines from the code.

After the import you'll have to go through the conversion code and insert a line break after each semicolon. Otherwise the import will not function correctly!!!
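
If you would rather not do that by hand, the repair itself is just a string transformation: re-inserting a newline after every semicolon. Below is a minimal sketch of that transformation on a plain string. The job name is made up, and which field on the SysExpImp tables actually holds the conversion source is not shown here - look that up in the AOT and run the fix over the relevant records yourself:

static void FixStrippedConversionCode(Args _args)
{
    // A sample piece of conversion source with its newlines stripped.
    str source = "itemId = conPeek(line, 1); inventTable.ItemId = itemId;";
    str fixed;
    ;

    // Put a line break back in after every semicolon.
    fixed = strReplace(source, ";", ";" + "\n");

    info(fixed);
}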

Wednesday, 2 December 2009

Weird experience with tablemaps and layers

Today we experienced something weird in AX2009.

We have installed an application module in AX in the VAR and VAP layers, and we are making customer-specific modifications in the CUS layer.
The module uses a table map so that code can be implemented once but used on several tables.

However, we got a run-time error when we ran a form that called the code on the table map. The kernel complained about a field having id 0.

We searched high and low but couldn't find any reason for the run-time error. We then tried making a CUS-layer version of all the mappings on the table map and, voila, no more run-time error.

Weird. :0S