Author: tw

  • Multilingual Records using Label editor

    Situation

As I needed multilingual data on a custom table, I tried to find out whether Microsoft applies a best practice to this requirement. It turned out that multilingual data is handled differently in every instance I identified.

The application itself (forms, buttons, instructions, help etc.) is truly multilingual and is handled in a uniform way throughout AX. Record data, on the other hand, mostly cannot be saved in different languages. In the most important places Microsoft has recognised this and has added the possibility of entering multilingual data.

    Below are all the instances of multilingual texts I have identified:

    Label files (application text):

This is still pretty straightforward: translations of application text are handled in one location (the label files), and there is one editor for editing them.

• X++: The label code is in quotation marks (e.g. "@SYS1234") and the texts can be edited in the label editor using the context menu "Lookup Label/Text".
• AOT object properties: The field shows the text in the standard language; when the field is selected, the label code and "…" are displayed. A click on the three dots takes you to the label editor as above.
       
    • The Rename Item Dimension dialogue is the exception to the rule. The dialogue bypasses the label editor, allowing the user to select a language and then edit multiple labels at a time.

    Separate tables (multilingual data):

Data is always handled in separate tables and never using label files. I guess this is because you do not want the label files to grow indefinitely as more and more data is added. I wonder, however, whether it could have been an option to use labels for a relatively limited table such as the unit table.

    • Item text: Translations of the item text are saved in the InventTxt table. The item text is NOT saved on the InventTable; the default value that is displayed corresponds to the company language. If you change the company language, or if you share the table over multiple companies with different languages, the default text might be missing. The foreign key linking the InventTxt table to the InventTable is the ItemId.
    • Unit text: Translations of the unit text are saved in the UnitTxt table; the default text, however, is saved on the Units table. The foreign key linking the UnitTxt table to the Units table is the UnitId.
      Even though there is essentially no difference in the table definitions of UnitTxt and InventTxt, the user experience is not the same.
    • Workflow instructions: The multilingual instructions of workflows are again implemented in a different way: the WorkFlowMessageTxt table is a bit more generic than the previous examples, using the TableId and RecordId to link to the source table. In addition, the multilanguage entry form allows the use of variables in the text. Just as with the item text, the message text is not saved on the base table at all. The system first tries to fetch the text in the user's language and, if that is not found, in the company language. The problem of the item text remains: if you change the company language or you share the table over multiple companies with different languages, the default text might be missing.
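To illustrate the fallback behaviour described above, a lookup of the item text could be sketched in X++ roughly as follows (a sketch only; the fields ItemId, LanguageId and Txt are taken from the InventTxt table, and CompanyInfo::find().LanguageId is assumed to return the company language):

```
// Sketch: fetch the item text in a given user language, falling back to the
// company language and finally to any available translation.
static str itemTxtWithFallback(ItemId _itemId, LanguageId _userLanguage)
{
    InventTxt inventTxt;
    ;
    select firstonly Txt from inventTxt
        where inventTxt.ItemId     == _itemId
           && inventTxt.LanguageId == _userLanguage;
    if (!inventTxt.Txt)
        select firstonly Txt from inventTxt
            where inventTxt.ItemId     == _itemId
               && inventTxt.LanguageId == CompanyInfo::find().LanguageId;
    if (!inventTxt.Txt)
        select firstonly Txt from inventTxt
            where inventTxt.ItemId == _itemId;
    return inventTxt.Txt;
}
```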
       

     

    Suggestion for Microsoft

Just as Microsoft has added the UtcDateTime type that supplies clever editing of date and time, Microsoft should add a new datatype for multilingual strings ("MLString") that supplies the interface to edit, save and display the translations.
    I would like to see the following properties:

    • Default value when used in forms and reports: try to find a language-specific value in the following order: 1. text in the user's language, 2. text in a variant of the user's language (e.g. en-us instead of en-au), 3. text in the company language, 4. text in a variant of the company language (e.g. de instead of de-ch), 5. any language.
    • Possibility of adding a second data field or a static text to define a specific language. Optional: If this language is not found, then follow the rules above.
    • One consistent way of editing the translations. Suggestion: use the ItemName type of editing.
    • Transparent management of the saved translations; Suggestion: A table contains all translations with the key TranslationId (preferably Int64 instead of a string) and the LanguageId. Maybe the TableId of the referencing table could be added to allow horizontal partitioning if the table grew too large. All that needs to be saved in the referring table is the TranslationId.
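A minimal X++ sketch of how such an MLString could be resolved for display (translationTxt, translationTxtAny and languageBase are hypothetical helpers, not standard AX code):

```
// Hypothetical resolution of an MLString value following the proposed order.
// translationTxt(id, lang) would read the central translation table,
// languageBase(lang) would map to a related variant (e.g. "en-au" -> "en-us").
static str mlStringDisplay(Int64 _translationId, LanguageId _userLang, LanguageId _companyLang)
{
    str txt;
    ;
    txt = translationTxt(_translationId, _userLang);                      // 1. user's language
    if (!txt)
        txt = translationTxt(_translationId, languageBase(_userLang));    // 2. variant of user's language
    if (!txt)
        txt = translationTxt(_translationId, _companyLang);               // 3. company language
    if (!txt)
        txt = translationTxt(_translationId, languageBase(_companyLang)); // 4. variant of company language
    if (!txt)
        txt = translationTxtAny(_translationId);                          // 5. any language
    return txt;
}
```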

     

UtcDateTime:

  • AX 2009 SP1 Released

Get it from CustomerSource or PartnerSource:
    https://mbs.microsoft.com/customersource/downloads/servicepacks/dynamicsax2009sp1.htm

    Some of the Changes:

    • Support for Windows Essential Business Server added
    • Support for Microsoft SQL Server 2008 added
    • Electronic signatures
    • Time zone updates

    see What’s_New-Microsoft_Dynamics_AX_2009_SP1.PDF

  • Multi-company Batch Processing

It is rather annoying that you have to run a special client to do batch processing and that you have to ensure that this client is always running. Even more annoying, however, would be having to run a client for every single company, which is what the standard system seems to require. Luckily there is a no-modification-needed approach to avoid running multiple batch clients for multiple companies:
    • Create a new Virtual company e.g. BAT (Administration>>Setup>>Virtual company accounts)
    • On the Company accounts tab, add all companies to the Virtual company.
    • On the Table collections tab, pick the Batch table collection, which is available as a standard table collection.

    That’s it. Easy, isn’t it?

  • Data import options

There are multiple options to import data. My favourite, back when I still "owned" an installation and could take full responsibility, was to import directly into the database, updating the record-id stored in SYSTEMSEQUENCES. Now that I am a consultant and cannot take unreasonable risks, I had to find a new way of efficiently importing data. The standard options are:
    • Excel:
      Populating the Excel template is great: you have predefined drop-down lists for enum values, and you can use all the tricks available in Excel to calculate fields in order to clean your data. However, Excel has two significant disadvantages: importing from Excel is sloooooooow, and tables that are linked via the record-id (e.g. Address, Notes) cannot be imported.
    • CSV:
      CSV import, on the other hand, is the exact opposite: importing e.g. thousands of customers takes only a few seconds, while the Excel import takes hours. The record-id field is available for auxiliary tables such as Address and Notes, and the relationship will remain even though the record-ids might be re-mapped during import. But CSV, too, has significant disadvantages: enum values are just 0, 1, 2, … and you first have to find the mapping of these values. Editing CSV tables is not easy because Excel does not write the quotation marks when saving. Line breaks, as they might appear in the address field, cannot be imported.
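To find the numeric mapping of enum values without guessing, a small job using the DictEnum class can print it (a sketch; SalesStatus serves only as an example enum):

```
// Print the numeric value and symbol of every element of an enum,
// e.g. to prepare the LIST columns of a CSV import file.
static void printEnumMapping(Args _args)
{
    DictEnum dictEnum = new DictEnum(enumnum(SalesStatus));
    int i;
    ;
    for (i = 0; i < dictEnum.values(); i++)
    {
        info(strfmt("%1 = %2", dictEnum.index2Value(i), dictEnum.index2Symbol(i)));
    }
}
```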

    My idea was to combine the two methodologies and get a new variant with all the advantages – the remaining disadvantage is that it takes longer to set up:

    1. Create both an Excel and a CSV template (export both with a couple of records so that you have examples to work with)
    2. In the Excel template create a new worksheet for each table, in which you will prepare the CSV data file.
    3. Copy the sample CSV data rows into the new worksheets. Use the menu Data>>Text to Columns to convert the CSV lines (use the delimiter comma and the text qualifier {none}).
    4. Add another line and mark each column with one of the following data types: STRING, LIST, NUM, DATE.
    5. The first five and the last columns contain system fields:
      • The first column always contains "RECORD"
      • The second column always contains the Table-Id. Copy the table Id you found in the exported sample record.
      • The third column contains the record-id, which must be unique and can be simply calculated as 1, 2, 3, 4, … by using something like =C4+1 for cell C5.
      • The fourth and fifth columns are unused and contain 0
      • The last column contains the record version, which can be set to 1 for all records.
    6. All other fields should contain the following formula =IF(F$1="DATE",TEXT('SourceWS'!A7,"yyyy/mm/dd"),IF(F$1="NUM",'SourceWS'!A7,IF(F$1="LIST",VALUE(LEFT('SourceWS'!A7,3)),"""" & 'SourceWS'!A7 & """"))), where SourceWS is the original Excel template worksheet which you have populated and which provides you with all the advantages described above. Once you have linked the first field you can copy the formula throughout the CSV worksheet. Note that the source data should not contain any line breaks; each line break should be replaced by some tag, e.g. "-newline-".
    7. Now for creating the CSV: copy-paste all the relevant data into a text editor. You now have tab-separated values.
    8. Replace the tabs with commas. Note: Notepad is frighteningly inefficient at this simple task. Use a more sophisticated text editor such as TextPad and all tabs will be replaced within seconds.
    9. Save the generated CSV as a dat file.
    10. Import
    11. Write a job to replace the -newline- tag:
      static void AddNewLine(Args _args)
      {
          //only modify the following constants
          #define.runCustTable(false)
          #define.runVendTable(false)
          #define.runInventTable(false)
          #define.runAddress(True)
          #define.pattern("-newline-")
          //only modify the constants above
          CustTable custTable;
          VendTable vendTable;
          Address address;
          InventTable inventTable;
          int64 oldCount;
          int i;
          ;
          ttsbegin;
          if (#runCustTable)
          {
            while select forupdate custTable where custTable.Address like "*" + #pattern + "*"
                                                || custTable.Street like "*" + #pattern + "*"
              {
                  custTable.Address = strreplace(custTable.Address,#pattern,"\n");
                  custTable.Street = strreplace(custTable.Street,#pattern,"\n");
                  custTable.update();
                  i++;
              }
              info(strfmt("%1 CustTable records updated",i));
              i=0;
          }
          if (#runVendTable)
          {
            while select forupdate vendTable where vendTable.Address like "*" + #pattern + "*"
                                                  || vendTable.Street like "*" + #pattern + "*"
              {
                  vendTable.Address = strreplace(vendTable.Address,#pattern,"\n");
                  vendTable.Street = strreplace(vendTable.Street,#pattern,"\n");
                  vendTable.update();
                  i++;
              }
              info(strfmt("%1 VendTable records updated",i));
              i=0;
          }
          if (#runAddress)
          {
              select count(recid) from address;
              oldCount = address.RecId;
              delete_from address where !address.AddrRecId;
              select count(recid) from address;
              if (oldCount != address.RecId)
                info(strfmt("%1 Address records deleted", oldCount - address.RecId));
            while select forupdate address where address.Address like "*" + #pattern + "*"
                                                  || address.Street like "*" + #pattern + "*"
              {
                  address.Address = strreplace(address.Address,#pattern,"\n");
                  address.Street = strreplace(address.Street,#pattern,"\n");
                  address.update();
                  i++;
              }
              info(strfmt("%1 Address records updated",i));
              i=0;
          }
          if (#runInventTable)
          {
            while select forupdate inventTable where inventTable.ItemName like "*" + #pattern + "*"
              {
                  inventTable.ItemName = strreplace(inventTable.ItemName,#pattern,"\n");
                  inventTable.update();
                  i++;
              }
              info(strfmt("%1 InventTable records updated",i));
              i=0;
          }
          ttscommit;
          info("complete");
      }
  • Un-mark transactions marked for settlement

    Scenario:

    • If you go to Vendors>>Functions>>Open transactions editing you can identify transactions that have been marked for settlement by the red hand in the "Is marked" column. Un-marking these transactions works only by finding the offset transaction and deleting the link from there.

    Issue:

    • I came across transactions that were identified as marked, but I did not know where they were marked. In one case the marking was created because an update conflict occurred in the payment journal, which broke the processing of the payment: the marked offset transaction was lost, but the transaction to be settled remained marked. In another case I think someone started to create a credit adjustment note but deleted it before completing; the marking, however, remained.

    Investigation:

    • Open the open transactions editing form (Vendors>>Functions>>Open transactions editing or Customers>>Functions>>Open transactions editing) and find the problematic transaction.
    • Go to Inquiries>>Specification and research the offset transactions. Try to un-mark the transaction with traditional means (i.e. Function>>Open transactions editing on the sales order etc.)
    • If you find a problem and want to continue with removing the marking, right-click on the record in "Open transactions editing", select "Record info">>"Show all fields" and make a note of the Record-Id.
    • The next step must be done in the AOT using the table browser to directly access the data of certain tables. This is dangerous and definitely not good practice. However I do not know better and would appreciate if someone could show me how to do it better.
    • The table SpecTrans contains all transactions marked for settlement. Find the record-id you previously noted in the column RefRecId.
    • Delete the found SpecTrans record to remove the marking.
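For completeness, the manual deletion could also be done with a small job (a sketch; the record-id 123456 is a placeholder for the value you noted, and the same warnings about direct data manipulation apply):

```
// DANGEROUS: removes a settlement marking by deleting the SpecTrans record
// that references the noted Record-Id. Replace 123456 with your value and
// consider restricting the delete further (e.g. by RefTableId).
static void removeSettlementMark(Args _args)
{
    SpecTrans specTrans;
    ;
    ttsbegin;
    delete_from specTrans
        where specTrans.RefRecId == 123456;
    ttscommit;
}
```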

    Better solutions?

If you have a cleaner and better solution, please tell me how you do it.

    Thank you

    Thomas

  • Show drop-down list when entering a non-existent item

    Goal:
    • A user can get a drop-down list just by typing a few letters and pressing Enter (instead of entering a * to trigger the drop-down list or using the mouse).

    Solution:

    • The following static method can be called from the overridden task method of any form. The current implementation only supports ItemId and RouteOpr fields. (It would be great if this code were generalized by someone.)
    • NOTE: relies on field name

    //bw start
    //Changed on 16 Jul 2007 by TW
    /* Description:
    display dropdown list if enter clicked while an itemid is not valid
    */
    static public int task(Args args, int _taskId)
    {
        #MACROLIB.Task //import task macro
        FormControl fc;
        FormStringControl fsc;

        switch(_taskId)
        {
            case #taskEnter: //enter key pressed
                fc = args.caller().selectedControl();
                if (SysFormRun::controlType(classidget(fc)) == FormControlType::String)
                {
                    fsc = fc;

                    if (fsc.name() like "*" + fieldid2name(tablenum(InventTable),fieldnum(InventTable,ItemId)) + "*")
                    { //ItemId fields
                        if (!InventTable::exist(fsc.text()))
                        {
                            fsc.lookup();
                            return 1;
                        }
                    }
                    else if (fsc.name() like "*" + fieldid2name(tablenum(Route),fieldnum(Route,OprId)) + "*")
                    { //RouteOpr Fields
                        if (!RouteOprTable::exist(fsc.text()))
                        {
                            fsc.lookup();
                            return 1;
                        }
                    }
                }
                break;
        }
        return 0;
    }
    //bw end

  • Automate “Edit Dimensions” to assign Batchnumbers (FIFO)

    Situation:
    • All finished products and sub-assemblies are batch controlled, but the assignment of the batches is not always known in advance, and in other cases mixed batches are used, which is also not handled automatically. More technically: blank receipts are not allowed, but blank issuing is.
    • For our purposes FIFO is good enough for assigning the batches to transactions after the fact.

    Solution:

    • The following methods identify existing batches and transactions without an assigned batch number.
    • The positive and negative quantities are matched and, if necessary, some transactions are split.
    • (The following static methods are part of the BWBatches class)

    //bw start
    //Changed on 30 May 2007 by TW
    /* Description:
    This method will fix the batch number assignment of the item identified by the parameter.
    The method itself only identifies the positive and negative quantities.
    The auxiliary method fixBatchAssignmentItemMatch is used to match positive and negative quantities,
    and the auxiliary method fixBatchAssignmentItemEdit is used to edit the transactions.
    */
    public static void fixBatchAssignmentItem(ItemId _itemId)
    {
        InventBatch inventBatch;
        InventOnhand inventOnhand;
        InventDim   inventDim;
        InventTable item = InventTable::find(_itemId);
        Qty         remaining;
        array       positives = new array(Types::Container); //positives container: inventDimId, qty, prodDate
        Int         positivesI = 0;
        array       negatives = new array(Types::Container); //negatives container: inventDimId, qty, prodDate
        Int         negativesI = 0;
        array       matches; //matches container: pos inventDimId, neg inventDimId, qty, pos prodDate
        Int         i;
        boolean     hasInventDimIdBlank = false;
        ;

        setprefix(item.ItemId);
    //----------------------find stock levels for all batch numbers------------------------
    //---------------add non-zero stock levels to positive or negative lists---------------
        while select inventBatch order by prodDate asc where inventBatch.itemId == item.ItemId
        {
            inventOnhand = InventOnhand::newInventBatch(inventBatch);
            remaining = inventOnhand.physicalInvent();
            if (remaining)
            {
                inventDim = null;
                inventDim.inventBatchId = inventBatch.inventBatchId;
                inventDim = InventDim::findOrCreate(inventDim);

                if (remaining<0) //add negative stock levels to negatives array
                {
                    negativesI++;
                    negatives.value(negativesI,[inventDim.inventDimId,remaining, inventBatch.prodDate]);
                }
                else if (remaining >0) //add positive stock levels to positives array
                {
                    positivesI++;
                    positives.value(positivesI,[inventDim.inventDimId,remaining, inventBatch.prodDate]);
                }
            }
        }
    //add empty dimension last
    inventDim = InventDim::findOrCreateBlank(false);
        inventOnhand = InventOnhand::newItemDim(item.ItemId,inventDim,InventDimParm::activeDimFlag(item.DimGroupId));
        remaining = inventOnhand.physicalInvent();
        if (remaining<0) //add negative stock levels to negatives array
        {
            hasInventDimIdBlank=true;
            negativesI++;
            negatives.value(negativesI,[inventDim.inventDimId,remaining,1\1\1900]);
        }
        else if (remaining >0) //add positive stock levels to positives array
        {
            hasInventDimIdBlank=true;
            positivesI++;
            positives.value(positivesI,[inventDim.inventDimId,remaining,1\1\1900]);
        }

    //----------------------match positives and negatives----------------------------------
    //--------------------------------------------------------------------------------------
    if (negativesI || hasInventDimIdBlank) //only negatives and unassigned batches are a problem
        {
            matches = BWBatches::fixBatchAssignmentItemMatch(positives,negatives,item);

        //----------------------edit dimensions of transactions--------------------------------
        //--------------------------------------------------------------------------------------
            BWBatches::fixBatchAssignmentItemEdit(matches,item);
        }
    }
    //bw end

     

    //bw start
    //Changed on 22 May 2007 by TW
    /* Description:
    match positive and negative stocklevels of an item
    _positives, _negatives: array of containers with three elements [inventDimId, qty, prodDate]
    return value: array of containers with 4 elements [pos inventDimId, neg inventDimId, qty, pos prodDate]
    */
    client server private static array fixBatchAssignmentItemMatch(array _positives, array _negatives, InventTable _item)
    {
        int         i,j;
        Container   posC;
        Qty         posQty;
        Container   negC;
        Qty         negQty;
        Date        batchDate,batchDate2;
        array       matches = new array(Types::Container); //container elements [pos inventDimId, neg inventDimId, qty, pos prodDate]
        int         matchesI;
        ;

        for (i=1; i<=_positives.lastIndex(); i++) //run through all positive stock levels
        {
            posC = _positives.value(i); //_positives container elements [inventDimId, qty, prodDate]
            posQty = conpeek(posC,2);
            batchDate = conpeek(posC,3);
            if (posQty)
            {
                for (j=1; j<=_negatives.lastIndex(); j++) //try to match negative stock levels with positive stock levels
                {
                    negC = _negatives.value(j); //_negatives container elements [inventDimId, qty, prodDate]
                    negQty = conpeek(negC,2);
                    batchDate2 = conpeek(negC,3);
                    if (negQty)
                    {
                        if (posQty >= -negQty) //more than enough available -> use all needed (-negQty)
                        {
                            matchesI++;
                            matches.value(matchesI,[conpeek(posC,1),conpeek(negC,1),-negQty, batchDate]);
                            posQty += negQty; //reduce available Qty by the matched amount
                            negQty = 0;       //needed Qty is now fully matched
                        }
                        else //if (posQty < -negQty) //not enough available -> use all available (posQty)
                        {
                            matchesI++;
                            matches.value(matchesI,[conpeek(posC,1),conpeek(negC,1),posQty, batchDate]);
                            negQty += posQty; //reduce needed Qty
                            posQty -= posQty; //reduce available Qty (=0)
                        }
                        _negatives.value(j,[conpeek(negC,1),negQty,batchDate2]); //update needed value
                    }
                }
                _positives.value(i,[conpeek(posC,1),posQty,batchDate]); //update available value
            }
        }
        //finally check if any negatives remain unmatched
        for (j=1; j<=_negatives.lastIndex(); j++)
        {
            negC = _negatives.value(j);
            negQty = conpeek(negC,2);
            if (negQty)
                error(strfmt("Can not completely resolve stocklevels of %1. Please fix manually.",_item.ItemId),"",SysInfoAction_TableField::newBuffer(_item));
        }
        return matches;
    }
    //bw end

     

    //bw start
    //Changed on 22 May 2007 by TW
    /* Description:
    edit dimensions of transactions
    _matches: array of containers with 4 elements [pos inventDimId, neg inventDimId, qty, pos prodDate]
    */
    private static void fixBatchAssignmentItemEdit(array _matches, InventTable _item)
    {
        #OCCRetryCount
        int     i;
        Qty     remaining;
        Date    batchDate;
        InventDim   inventDimPos;
        InventDim   inventDimNeg;
        InventTrans inventTrans;
        ;

        try
        {
            ttsbegin;
            for (i=1;i<=_matches.lastIndex();i++)
            {
                //_matches container elements [pos inventDimId, neg inventDimId, qty, pos prodDate]
                inventDimPos = InventDim::find(conpeek(_matches.value(i),1));
                inventDimNeg = InventDim::find(conpeek(_matches.value(i),2));
                remaining = conpeek(_matches.value(i),3);
                batchDate = conpeek(_matches.value(i),4);

                remaining = -remaining; //running from the negative perspective

                //fix negative (i.e. issues with unassigned or wrongly assigned batch numbers)
                //there might be many transactions with the same BatchId
                if (inventDimNeg.inventBatchId <= inventDimPos.inventBatchId)
                {  
                    //the issued batch (or blank batch) will be fulfilled by a newer batch
                    //run through all transactions ordered by date DESCENDING
                    while select forupdate inventTrans order by DatePhysical desc
                                            where inventTrans.ItemId == _item.ItemId
                                                && inventTrans.inventDimId == inventDimNeg.inventDimId
                    {
                        //execute changes:
                        remaining = BWBatches::auxBatchAssignment(inventTrans,remaining,inventDimPos,inventDimNeg);
                        if (!remaining)
                            break;
                    }
                }
                else
                {
                    //the issued batch will be fulfilled by an older batch
                    //run through all transactions ordered by date ASCENDING
                    while select forupdate inventTrans order by DatePhysical asc
                                            where inventTrans.ItemId == _item.ItemId
                                                && inventTrans.inventDimId == inventDimNeg.inventDimId
                                                && inventTrans.DatePhysical >= batchDate
                    {
                        //execute changes:
                        remaining = BWBatches::auxBatchAssignment(inventTrans,remaining,inventDimPos,inventDimNeg);
                        if (!remaining)
                            break;
                    }
                }
                if (remaining)
                    error(strfmt("Can not completely resolve stocklevels of %1. Please fix manually.",_item.ItemId),"",SysInfoAction_TableField::newBuffer(_item));
            }
            ttscommit;
        }
        catch (Exception::Deadlock)
        {
            if (xSession::currentRetryCount() >= #RetryNum)
                throw Exception::UpdateConflictNotRecovered;
            else
                retry;
        }
    }
    //bw end

     

    //bw start
    //Changed on 30 May 2007 by TW
    /* Description:
    Auxiliary method that assigns a new batch number to a transaction and splits the transaction if necessary.
    */
    private static qty auxBatchAssignment(InventTrans _inventTrans, Qty _remaining, InventDim _inventDimPos, InventDim _inventDimNeg)
    {
        InventTrans inventTrans = _inventTrans;
        Qty remaining = _remaining;
        InventDim inventDimPos=_inventDimPos;
        InventDim inventDimNeg=_inventDimNeg;
        InventTransSplit    inventTransSplit;
        ;
        if (remaining>0 || inventTrans.Qty>0)
        {
            error(strfmt("auxBatchAssignment assertion: remaining (%1) and inventTrans.Qty (%2) are expected to be less than zero.",remaining,inventTrans.Qty));
            return 0;
        }
       
        if (-inventTrans.Qty <= -remaining)
        {//The transaction's qty is smaller than (or equal to) the required quantity
         // -> use all of it (i.e. change the dimension of the entire transaction)
            inventTrans.inventDimId = inventDimPos.inventDimId; //assign the new batch number
            inventTrans.update();
            BWBatches::auxBatchAssignmentRef(inventTrans); //change associated sales and production lines
            info(strfmt("%3: %2 %4 -> %2 %5",inventTrans.InventTransId, inventTrans.Qty, inventTrans.DatePhysical,
                            inventDimNeg.inventBatchId?inventDimNeg.inventBatchId:inventDimNeg.inventDimId,inventDimPos.inventBatchId?inventDimPos.inventBatchId:inventDimPos.inventDimId));
            remaining -= inventTrans.Qty;
        }
        else if (-inventTrans.Qty > -remaining)
        {//The transaction's qty is larger than the required quantity
         // -> use only part of the transaction (i.e. it is necessary to split the transaction!)
            inventTransSplit = InventTransSplit::newInventTransSplit(inventTrans.RecId);
            warning(strfmt("Splitting %1: %2 %5 -> %3 + %4 %5",inventTrans.DatePhysical, inventTrans.Qty, remaining, inventTrans.Qty - remaining, (inventDimNeg.inventBatchId?inventDimNeg.inventBatchId:inventDimNeg.inventDimId)));
            inventTransSplit.parmSplitQty(inventTrans.Qty - remaining); //inventTrans will contain the desired qty
            try
            {
                inventTransSplit.run();
            }
            catch (Exception::Error)
            {
                error(strfmt("Can not split transaction. Please fix stocklevels of %1 manually.",inventTrans.ItemId),"",SysInfoAction_TableField::newBuffer(InventTable::find(inventTrans.ItemId)));
                return 0;
            }
            inventTrans = InventTrans::findRecId(inventTrans.RecId,true); //reload the record (for update)
            inventTrans.inventDimId = inventDimPos.inventDimId; //assign the new batch number
            inventTrans.update();
            BWBatches::auxBatchAssignmentRef(inventTrans); //change associated sales and production lines
            info(strfmt("%3: %2 %4 -> %2 %5",inventTrans.InventTransId, inventTrans.Qty, inventTrans.DatePhysical,
                            inventDimNeg.inventBatchId?inventDimNeg.inventBatchId:inventDimNeg.inventDimId,inventDimPos.inventBatchId?inventDimPos.inventBatchId:inventDimPos.inventDimId));
            remaining -= inventTrans.Qty;
        }
        return remaining;
    }
    //bw end

     

    //bw start
    //Changed on 30 May 2007 by TW
    /* Description:
    The auxiliary method replicates the inventTrans changes to sales lines and production lines.
    */
    public static void auxBatchAssignmentRef(InventTrans _inventTrans)
    {
        SalesLine salesline;
        ProdBom   prodBOM;
        ;

        select forupdate firstonly salesline where salesline.InventTransId == _inventTrans.InventTransId;
        if (salesline)
        {
            if (salesline.QtyOrdered == _inventTrans.Qty && salesline.InventDimId != _inventTrans.inventDimId)
            {
                salesline.InventDimId = _inventTrans.inventDimId;
                salesline.update();
            }
        }
        else
        {
            select forupdate firstonly prodBOM where prodBOM.InventTransId == _inventTrans.InventTransId;
            if (prodBOM)
            {
                if (prodBOM.QtyInventCalc == _inventTrans.Qty && prodBOM.InventDimId != _inventTrans.inventDimId)
                {
                    prodBOM.InventDimId = _inventTrans.inventDimId;
                    prodBOM.update();
                }
            }
        }
    }
    //bw end


  • Assets: multiple depreciation rates with identical posting profile

    Issue:
    • Changing legislation leads to many different depreciation rates depending on the time an asset was acquired.
    • Dynamics AX requires one value model per depreciation rate, i.e. we need around 30 value models.
    • Each value model requires a specific posting profile, and each posting profile has at least 14 entries, leading to over 400(!) entries in total.
    • Every new asset requires setting up a new posting profile (if its depreciation rate differs yet again).

    Solution:

    • Allow the system to fall back to a default posting profile attached to the value model "Std".
    • Design the modification so that multiple posting profiles are still possible if needed, i.e. first try the standard functionality and fall back to the default posting profile only if that fails.
    • The following methods must be modified:
      • Table AssetLedgerAccounts
        • find
        • findAssetLedgerAccount
        • findAssetLedgerOffsetAccount
        • assetLedgerAccount
        • assetLedgerOffsetAccount
        • exist
      • Class AssetReclassification
        • findNewAssetLedgerAccount
        • findOldAssetLedgerAccount
      • Table AssetDisposalParameters
        • find
        • findPostingAccount
        • exist
      • Class AssetPostDisposal
        • post
    • The modifications all follow the same pattern: Find the query(ies) in the method and repeat them with BookId == "Std".
      General pseudo code structure:
      found = select1 where BookId == _bookId;
      if (!found)
          found = select2 where BookId == _bookId;
      //new:
      if (!found)
          found = select1 where BookId == "Std";
      if (!found)
          found = select2 where BookId == "Std";
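
    Stripped of the X++ select syntax, the fallback is simply the same lookups tried twice: once with the requested BookId, and only if all of them fail, once more with "Std". A minimal sketch in Python (the lookup functions are hypothetical stand-ins for the select statements):

```python
STD_BOOK = "Std"  # default value model used as the fallback

def find_with_fallback(book_id, lookups):
    """Try every lookup with the requested BookId first; only if all
    of them return nothing, repeat the whole sequence with "Std".
    `lookups` is a list of functions standing in for the select
    statements; each returns a record (any truthy object) or None."""
    for candidate in (book_id, STD_BOOK):
        for lookup in lookups:
            record = lookup(candidate)
            if record is not None:
                return record
    return None
```

    In the actual X++ methods the second pass is a copy of the original selects with the BookId condition replaced by the literal "Std".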

    public static boolean exist( //table AssetLedgerAccounts
        AssetBookId         _bookId,
        AssetPostingProfile _postingProfile,
        AssetTransType      _transType,
        AssetTableGroupAll  _accountCode,
        AccountNum          _accountRelation)
    {
        boolean found;
        ;

        found = (select firstonly
                    RecId
                 from
                    assetLedgerAccounts
                 where
                    assetLedgerAccounts.BookId == _bookId &&
                    assetLedgerAccounts.PostingProfile == _postingProfile &&
                    assetLedgerAccounts.TransType == _transType &&
                    assetLedgerAccounts.AccountCode == _accountCode &&
                    assetLedgerAccounts.AccountRelation == _accountRelation).RecId != 0;

    //bw start
    //Changed on 21 May 2007 by TW
    /* Description:
    Default to the Value Model "Std" if no specific BookId found.
    */

        if (!found)
        {
            found = (select firstonly
                        RecId
                     from
                        assetLedgerAccounts
                     where
                        assetLedgerAccounts.BookId == "Std" &&
                        assetLedgerAccounts.PostingProfile == _postingProfile &&
                        assetLedgerAccounts.TransType == _transType &&
                        assetLedgerAccounts.AccountCode == _accountCode &&
                        assetLedgerAccounts.AccountRelation == _accountRelation).RecId != 0;
        }
    //bw end

        return found;
    }

    • AssetPostDisposal is the only exception to this simple pattern. As it contains a while select loop, the situation needs to be handled slightly differently:

    void post() //class AssetPostDisposal
    {
        AssetAmount                 assetAmount;
        AssetDisposalParameters     assetDisposalParameters;
        AssetDisposalParameters     l_assetDisposalParameters;
        AssetPostType               assetPostType;
        AssetSoldScrap              soldScrap;
        CurrencyCode                companyCurrency = CompanyInfo::standardCurrency();
        LedgerVoucherTransObject    ledgerVoucherTransObject;
        boolean                     useStd = false; //bw
        ;

        ttsbegin;

        […]

        select assetDisposalParameters
              where assetDisposalParameters.PostValue       == AssetPostValue::NBV  &&
                    assetDisposalParameters.SoldScrap       == soldScrap            &&
                    assetDisposalParameters.PostingProfile  == assetTrans.PostingProfile   &&
                    assetDisposalParameters.BookId          == assetTrans.BookId    &&
                    (assetDisposalParameters.ValueType      == assetPostType ||
                     assetDisposalParameters.ValueType      == AssetPostType::All)  &&
                    ((assetDisposalParameters.AssetCode     == TableGroupAll::Table   &&
                      assetDisposalParameters.AssetRelation == assetTrans.AssetId) ||
                     (assetDisposalParameters.AssetCode     == TableGroupAll::GroupId &&
                      assetDisposalParameters.AssetRelation == AssetTable::find(assetTrans.AssetId).AssetGroup) ||
                      assetDisposalParameters.AssetCode     == TableGroupAll::All);
    //bw start
    //Changed on 21 May 2007 by TW
    /* Description:
    Default to the Value Model "Std" if no specific BookId found.
    */
        if (!assetDisposalParameters)
        {
            useStd = true;
            select assetDisposalParameters
              where assetDisposalParameters.PostValue       == AssetPostValue::NBV  &&
                    assetDisposalParameters.SoldScrap       == soldScrap            &&
                    assetDisposalParameters.PostingProfile  == assetTrans.PostingProfile   &&
                    assetDisposalParameters.BookId          == "Std"    &&
                    (assetDisposalParameters.ValueType      == assetPostType ||
                     assetDisposalParameters.ValueType      == AssetPostType::All)  &&
                    ((assetDisposalParameters.AssetCode     == TableGroupAll::Table   &&
                      assetDisposalParameters.AssetRelation == assetTrans.AssetId) ||
                     (assetDisposalParameters.AssetCode     == TableGroupAll::GroupId &&
                      assetDisposalParameters.AssetRelation == AssetTable::find(assetTrans.AssetId).AssetGroup) ||
                      assetDisposalParameters.AssetCode     == TableGroupAll::All);
        }
    //bw end
        if (!assetDisposalParameters.RecId)
        {
            throw error(strfmt("@SYS24602", "@SYS67345" + '/' + "@SYS67538", "@SYS67500"));
        }

        while select assetDisposalParameters
            group by PostValue
            where assetDisposalParameters.SoldScrap       == soldScrap                    &&
                  assetDisposalParameters.PostingProfile  == assetTrans.PostingProfile    &&
    //bw start
    //Changed on 21 May 2007 by TW
    /* Description:
    If the fallback was needed above, use "Std"; otherwise use the transaction's BookId.
    */
                  assetDisposalParameters.BookId          == (useStd?"Std":assetTrans.BookId)            &&
    //bw end
                  (assetDisposalParameters.ValueType      == assetPostType                ||
                   assetDisposalParameters.ValueType      == AssetPostType::All)
        {
            l_assetDisposalParameters = AssetDisposalParameters::findPostingAccount(assetTrans.AssetId,
                                                                                    assetDisposalParameters.SoldScrap,
                                                                                    assetDisposalParameters.PostingProfile,
                                                                                    assetDisposalParameters.BookId,
                                                                                    assetDisposalParameters.ValueType,
                                                                                    assetDisposalParameters.PostValue);

            assetAmount = this.postValue(l_assetDisposalParameters.PostValue);
            if (assetAmount)
            {
                ledgerVoucherTransObject = LedgerVoucherTransObject::newCreateTrans(
                                                                        ledgerVoucher.findLedgerVoucherObject(),
                                                                        LedgerPostingType::FixedAssetsDebit,
                                                                        l_assetDisposalParameters.Account,
                                                                        assetTrans.Dimension,
                                                                        companyCurrency, //assetTrans.currencyCode,
                                                                        -assetAmount,
                                                                        assetTrans.TableId,
                                                                        assetTrans.RecId,
                                                                        0);
                ledgerVoucherTransObject.parmOperationsTax(LedgerVoucher::operationsTax(AssetBookTable::find(assetTrans.BookId).CurrentOperationsTax));
                ledgerVoucherTransObject.parmTransTxt(assetTrans.Txt);
                ledgerVoucher.addTrans(ledgerVoucherTransObject);

                ledgerVoucherTransObject = LedgerVoucherTransObject::newCreateTrans(
                                                                        ledgerVoucher.findLedgerVoucherObject(),
                                                                        LedgerPostingType::FixedAssetsDebit,
                                                                        l_assetDisposalParameters.OffsetAccount,
                                                                        assetTrans.Dimension,
                                                                        companyCurrency, //assetTrans.currencyCode,
                                                                        assetAmount,
                                                                        assetTrans.TableId,
                                                                        assetTrans.RecId,
                                                                        0);
                ledgerVoucherTransObject.parmOperationsTax(LedgerVoucher::operationsTax(AssetBookTable::find(assetTrans.BookId).CurrentOperationsTax));
                ledgerVoucherTransObject.parmTransTxt(assetTrans.Txt);
                ledgerVoucher.addTrans(ledgerVoucherTransObject);
            }
        }
        ttscommit;
    }

  • Identify production quantity including route scrap before posting

    Situation:

    • Allow back-flushing of 92 kg of an ingredient even though only 90 kg are available; no negative stock is allowed.
    • I did this by modifying the production order before reporting it as finished.
    • For this I needed a way to find out how much the system will back-flush (e.g. 92 kg). The difficulty is that we use phantom BOMs and routes, which makes the calculation rather complicated.
    • NOTE: Activating the production parameter "Physical reduction" (Journals tab) also automatically reduces the quantity so that only available stock is depleted. But we wanted more control over this process.

    Solution:

    • Access the functionality used by Dynamics AX for this calculation.
    • Note: The production must be started for the phantoms to be exploded.

    //bw start
    //Changed on 22 May 2007 by TW
    /* Description:
    Get the expected BOM line qty to be used
    Production must be started to take Phantom BOM/Routes into account
    */
    public static Qty prodItemQty(ProdBOM _prodBom, Qty _seriesSize = 0)
    {
        ProdTable           prodTable = ProdTable::find(_prodBom.ProdId);
        BOMCalcData         bomCalcData;
        BOMCalcConsumption  itemCalcLine;
        UnitQty             bomProposal;
        InventQty           inventProposal;
        Qty                 seriesSize = _seriesSize;
        ;
        if (!seriesSize)
        {
            if (prodTable.ProdStatus == ProdStatus::StartedUp)
                seriesSize = prodTable.QtyStUp - (prodTable.reportedFinishedGood() + prodTable.reportedFinishedError());
            else
                seriesSize = prodTable.QtySched;
        }
        bomCalcData = BOMCalcData::newProdTable(seriesSize, prodTable);
        itemCalcLine = BOMCalcConsumption::construct(_prodBom.Formula,bomCalcData);
        bomProposal = itemCalcLine.calcConsumption(_prodBom,
                                                    ProdRoute::accError(_prodBOM.ProdId,_prodBOM.OprNum),
                                                    NoYes::Yes);
        inventProposal = UnitConvert::qty(
                    bomProposal,
                    _prodBOM.UnitId,
                    InventTableModule::find(_prodBOM.ItemId,ModuleInventPurchSales::Invent).UnitId,
                    _prodBOM.ItemId);

        return inventProposal;
    }
    //bw end

  • Abstract of my Ph.D. Thesis 2004: Estimating and Influencing the cost of ERP-Systems in Swiss SMB

    Small and medium-sized enterprises make substantial investments in ERP-projects. Still, many do not know the costs associated with such projects and how they can influence them. This thesis addresses the needs of such businesses by offering a simple cost estimate together with various recommendations for influencing the cost and success of ERP-projects. It is based on the analysis of about 40 projects, including 5 case studies.

    Cost estimate

    The cost drivers of ERP-systems are mainly the number of users and the complexity of the project. Many other aspects can influence the cost and may be a major issue in individual projects, but they did not prove to be relevant in the general analysis.
    Cost = a + b*Users + c*Complexity + e
    The formula was substantiated for small and medium-sized manufacturing companies in Switzerland using the data of 40 projects. It showed that the complexity of a project can be approximated by the team size. The accuracy of the estimates is limited by many factors which cannot be accounted for at an early stage of the project, but it is similar to the accuracy of the observed budgets, which were based on much more thorough analyses.
    ExpectedTotalCosts[CHF] = 160’000 + 9’500*Users + 56’000*Teamsize
    ExpectedInvestmentCosts[CHF] = 160’000 + 5’000*Users + 34’000*Teamsize
    Total project costs can be estimated better than investments alone, because the vendor decides, according to its pricing policy, whether to ask for more up front (leading to higher investment costs) or to charge more for maintenance later. It is therefore strongly suggested that maintenance costs get the necessary attention during the evaluation.
    In practice many use the much simpler estimate, which depends only on the number of users, with the complexity often partly included by adjusting the cost per user. Observed rates ranged between Fr. 6’000 and 30’000 per user. The investment cost per user of the developed formula is bounded by the 95%-confidence interval between Fr. 10’000 and 15’000.
    ExpectedTotalCosts[CHF] = 20’000*Users
    ExpectedInvestment[CHF] = 12’000*Users
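
    As a quick sanity check, the regression estimates and the per-user rule of thumb can be put side by side; a small Python sketch using the coefficients above (team size standing in for complexity):

```python
def expected_total_cost_chf(users, team_size):
    """Regression estimate for total project costs in CHF."""
    return 160_000 + 9_500 * users + 56_000 * team_size

def expected_investment_chf(users, team_size):
    """Regression estimate for the investment costs alone in CHF."""
    return 160_000 + 5_000 * users + 34_000 * team_size

def simple_total_cost_chf(users):
    """Widely used rule of thumb based on the number of users only."""
    return 20_000 * users
```

    For a project with 50 users and a team of 5, the regression gives CHF 915’000 in total costs, versus CHF 1’000’000 from the simple per-user rule.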

    Recommendations for ERP-customers

    ERP-customers who want to influence cost must be careful not to endanger the benefits of the system or the success of the project. Factors which both reduce costs and improve project success therefore have first priority. Avoiding adaptations, investing in the evaluation, and raising the competence and the available capacity of the team members are some of the most important ones.
    Service costs can be influenced best. They, on the other hand, depend largely on how much effort is being put into actual adaptations of standard ERP-solutions. Adaptations can be avoided by rethinking internal processes, by choosing the right ERP-System, and by refraining from implementing too many specific wishes. It can also be shown that avoiding adaptations doesn’t reduce the benefits of an ERP-project but even improves project success.
    It cannot be expected of an average small or medium-sized business that its project managers or team members know who’s who in the ERP-market. Therefore it is strongly suggested that businesses invest adequately in evaluation by engaging independent consultants, to ensure that an optimal ERP-system gets chosen. A lot depends on the choice of the system, still only 26 % of the observed businesses called upon external help for their evaluations and the ones who did, spent only 1 % of the investment amount for this purpose.
    Competence and capacity of the project manager and the team members have a strong impact on the success of a project. Therefore it is important that the most able staff members get released from a substantial part of their daily duties. Depending on the number of competent employees with free capacity, more tasks can be handled in-house, which further reduces external costs.

    Recommendations for ERP-developers

    These recommendations mainly address small, local developers which operate in the SME-market and who are often confronted with pressure from mighty international competitors such as Microsoft, SAP and other large-scale enterprises. A growth strategy is less applicable for the smaller developer since this means entering into direct competition with the big ones of the industry.
    Proximity to customers is the key to success for small and local developers. Instead of imitating large competitors, strengthening the proximity to customers by focussing on a few branches of trade and regions, is the name of the game.
    To be successful, developers should support the recommendations for ERP-customers identified above. One way to do this is to reduce software licence fees, leaving hardware and service costs as the main investments. This makes the offer more attractive because it carries lower risks, and it makes it clearer to the customer that adaptations are the main source of higher investment costs.
    Training costs represent an important part of total service costs. For this reason developers should focus on improving the ergonomics of a system rather than adding more functionality. Intuitive software helps users learn faster and leads to higher motivation. More functionality, on the other hand, will hardly be noticed by the majority of users: most up-to-date systems already satisfy their functional needs, and therefore many of the hyped functions hardly get used.