Monday, October 20, 2014

Partner Connections 2014 Kicked Off a Great Summit

Monday, October 13, 2014

Summit is now behind us, but I wanted to take some time to recount some of the events as I saw them happen. First, I want to congratulate the Dynamic Partner Connections team for an excellent event hosted at the Renaissance Hotel. The location could not be any more upscale and was clearly a great way to get things started.

As I was strolling along the corridors of the hotel with MVPs Joris de Gruyter and John Lowther, we ran into some of the awesome Dynamic Communities folks.

From left to right: Michelle Spitzer, Mark Rhodes, Joris de Gruyter, John Lowther, and Liz Hallen
Most of the topics at the Partner Connections day revolved around leadership and various NAV and SL topics - since the Summit itself tends to be dominated by AX, GP, and CRM, I guess this was a good thing.

Midday came along and it was time for Q&A with Hal Howard, former Microsoft executive and new Executive Director of Dynamic Communities. Many questions coming from the partner community revolved around Microsoft's dismissal of the certification requirements for Dynamics GP, Dynamics SL, and Dynamics NAV and whether Microsoft was trying to send a stronger message with such a move. For example, a question came in about Microsoft "killing the products" via a sale to a competitor, or simply a code rollup into one product - a reminder of the Project Green days.

Andy Hafer and Hal Howard field questions from the audience

After the Q&A session, it was time to welcome Hal to the community and snap a quick pic.

John Lowther, Hal Howard, Mariano Gomez, Joris de Gruyter

After lunch, I decided to stop by the Expo hall to see how things were progressing, and I have to say that, no matter which venue is chosen, the amount of work that goes into conditioning a massive room with booths, banners, and lighting never ceases to amaze me.

Expo Hall

The same could be said for the Meal hall which, this time around, had to be able to accommodate 3,500 people at any given time.

Meal Hall  - Andy receiving final updates
When evening came, it was time for the partner reception. I had a chance to catch up with a number of partners and ISVs in attendance and really enjoyed the atmosphere of camaraderie and pleasantries around good food and drinks.

MVPs John Lowther and Leslie Vail

Team Altec

Frank Vukovits (Dynamic Communities) and Jon Rivers (Data Masons)

After Partner Connections and the partner reception, it was time to gracefully go back to my hotel room and catch up on some work.

Until next post!

Mariano Gomez, MVP
Intelligent Partnerships, LLC

Sunday, October 12, 2014

GPUG Summit 2014 St. Louis - The Journey Begins

My journey to St. Louis, MO begins today. As is customary, I will be writing articles on a daily basis, if possible, to narrate the events of the day. I have a busy schedule, with presentations on Wednesday, Thursday, and Friday (I just took on a new session on Friday at 2:15 PM called "Testing, Testing, & More Testing" related to upgrades). Please follow me on Twitter @dgpblogster, or The Dynamics GP Blogster on Facebook, and of course right here on this blog for more info on the event and those little behind-the-scenes moments. This year, the Summit is expecting a record-breaking 900 attendees and is growing to be THE EVENT to attend after Convergence. If you have not signed up to be a GPUG member, what are you waiting for?
Until next post!

Mariano Gomez, MVP
Intelligent Partnerships, LLC

Wednesday, September 24, 2014

Microsoft Dynamics GP 2015 Developer's Preview: Working with Sample URIs - Part 2

In the part 1 video, I explained how to mount the Microsoft Dynamics GP 2015 Developer's Preview virtual hard disk using Hyper-V. My intent was to provide a part two showing how to mount the VHD file on Windows Azure, but realized it would take more time than I wanted to invest in really getting the point across on many of the aspects around the new service architecture components, so I have decided to forgo the Azure portion until some other day.

Today, I will focus on some of the sample service requests provided on the Developer's preview image, which can be found in the Example Service Requests.txt file available on the desktop of the image.

Before that, however, I want to touch on REpresentational State Transfer (REST) services. REST, a term first coined by Roy Fielding (a principal author of the HTTP specification) in his doctoral dissertation, is an architectural style that treats networked application states and functionality as resources, which share a uniform interface. This architectural style differs in many ways from the Remote Procedure Call (RPC) architecture, where services reside on the network and are invoked using request parameters and control data contained within messages.

Some of the basic principles governing REST services are:

  • Actors interact with resources, and resources are anything that can be named and represented. Each resource can be addressed via a unique Uniform Resource Identifier (URI).
  • Interaction with resources (located through their unique URIs) is accomplished using a uniform interface of the HTTP standard verbs (GET, POST, PUT, and DELETE). Also important in the interaction is the declaration of the resource's media type, which is designated using the HTTP Content-Type header. (XHTML, XML, JPG, PNG, and JSON are some well-known media types.)
  • Resources are self-descriptive. All the information necessary to process a request on a resource is contained inside the request itself (which allows services to be stateless).
  • Resources contain links to other resources (hyper-media).
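To make the uniform-interface and media-type principles above concrete, here is a small, self-contained Python sketch. Nothing in it is specific to Dynamics GP; the mappings simply restate the bullet points, and the helper function is mine:

```python
# The uniform interface: the same HTTP verbs apply to every resource;
# only the URI changes. The Content-Type header declares the media type.
VERBS = {
    "GET": "read a resource",
    "POST": "create a resource",
    "PUT": "replace a resource",
    "DELETE": "remove a resource",
}

# A few well-known media types, keyed by URI extension.
MEDIA_TYPES = {
    ".json": "application/json",
    ".xml": "application/xml",
    ".png": "image/png",
}

def content_type_for(uri: str) -> str:
    """Pick a Content-Type header value from a URI's extension.

    Assumes a JSON default when no extension is present.
    """
    for ext, media in MEDIA_TYPES.items():
        if uri.endswith(ext):
            return media
    return "application/json"

print(content_type_for("/Items(2GPROC).xml"))  # application/xml
```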

While REST is defined by its author using strict architectural principles, the term is often used loosely to describe any simple URI-based request to a specific domain over HTTP without the use of an additional messaging layer such as Simple Object Access Protocol (SOAP). Implementations adhering to the strict principles of REST are often referred to as being “RESTful,” while those which follow a loose adherence are called “REST-Like”. Microsoft Dynamics GP Services can be considered REST-like (See Chapter 1: Microsoft Dynamics GP Service, page 3 of the Microsoft Dynamics GP Service Based Architecture Preview documentation).

A quick sample

As an example, imagine you need to build a service that interacts with the Microsoft Dynamics GP item master list: basically, a service that could produce the list of items and/or information about a specific item in the list, from a specific company database - in this case Fabrikam - and to be more precise, that company database resides within a specific tenant. Technically speaking, this service could also add or retrieve data for an item to and from the item master in Fabrikam, on the current tenant.

When building a REST-like service, you must answer 3 basic questions:

  • What resources you are trying to define or expose
  • How are you going to represent the resources (URIs)
  • What actions are you going to support for each URI (HTTP verbs).

    For our example, the resources will be defined by the hierarchy Tenants(Name:tenant_name)/Companies(company_name)/Items(item_number). The URIs really depend on where the service is going to be hosted; for example, they could take the form of a service base address followed by the above hierarchy. The next thing in line is to understand what HTTP verbs or actions are supported with each URI.

    Next, we need to determine the URIs for each resource. Right now we only need to determine the relative URIs since the absolute URI will be determined by where we host the service. The item master will be the root URI of the service (/). Using this syntax, /Items() will return all of the items contained in the item master; /Items({ItemNumber}) will be the URI for each item within the item master.
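The relative URIs just described can be sketched in code. The following Python helper is purely illustrative - the function name and the tenant name are placeholders of mine; only the URI shapes come from the hierarchy in the text:

```python
def item_uri(tenant: str, company: str, item_number: str = None) -> str:
    """Build a relative URI for a company's item master on a tenant.

    With no item number, the URI addresses the whole item list
    (/Items()); with an item number, it addresses a single item.
    """
    base = f"Tenants(Name:{tenant})/Companies({company})"
    if item_number is None:
        return f"{base}/Items()"
    return f"{base}/Items({item_number})"

# "DefaultTenant" is a hypothetical tenant name for illustration.
print(item_uri("DefaultTenant", "Fabrikam"))
print(item_uri("DefaultTenant", "Fabrikam", "2GPROC"))
```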

    Under the current Developer's preview implementation, if you wanted to retrieve information about an item (HTTP GET), you would then use the following URI notation from your browser:


    By copying and pasting the above URL into the browser, the service call will generate a JavaScript Object Notation file (.json), as shown below:

    {
      "Status": {
        "CorrelationId": "d3056b1bb9d84775ad269abfa09cfa77",
        "Code": 200
      },
      "Payload": {
        "Trace": [],
        "ItemNumber": "2GPROC",
        "ItemDescription": "2 Ghz Processor",
        "NoteIndex": 333.0,
        "ItemShortName": "",
        "ItemType": "SalesInventory",
        "ItemGenericDescription": "",
        "StandardCost": 0.0,
        "CurrentCost": 250.0,
        "ItemShippingWeight": 0.0,
        "DecimalPlacesQTYS": "NotUsed",
        "DecimalPlacesCurrency": "One",
        "ItemTaxScheduleID": "",
        "TaxOptions": "Nontaxable",
        "IVIVIndex": 18,
        "IVIVOffsetIndex": 18,
        "IVCOGSIndex": 137,
        "IVSalesIndex": 112,
        "IVSalesDiscountsIndex": 128,
        "IVSalesReturnsIndex": 134,
        "IVInUseIndex": 0,
        "IVInServiceIndex": 141,
        "IVDamagedIndex": 141,
        "IVVariancesIndex": 783,
        "DropShipIndex": 445,
        "PurchasePriceVarianceIndex": 446,
        "UnrealizedPurchasePriceVarianceIndex": 446,
        "InventoryReturnsIndex": 450,
        "AssemblyVarianceIndex": 0,
        "ItemClassCode": "RM-ACT",
        "ItemTrackingOption": 1,
        "LotType": "",
        "KeepPeriodHistory": true,
        "KeepTrxHistory": true,
        "KeepCalendarHistory": true,
        "KeepDistributionHistory": true,
        "AllowBackOrders": true,
        "ValuationMethod": "FIFOPerpetual",
        "UOfMSchedule": "EACH",
        "AlternateItem1": "",
        "AlternateItem2": "",
        "MasterRecordType": 1,
        "ModifiedDate": "2017-05-21T00:00:00",
        "CreatedDate": "2017-05-21T00:00:00",
        "WarrantyDays": 0,
        "PriceLevel": "",
        "LocationCode": "",
        "PurchInflationIndex": 0,
        "PurchMonetaryCorrectionIndex": 0,
        "InventoryInflationIndex": 0,
        "InventoryMonetaryCorrectionIndex": 0,
        "COGSInflationIndex": 0,
        "COGSMonetaryCorrectionIndex": 0,
        "ItemCode": "",
        "TaxCommodityCode": "",
        "PriceGroup": "BUY",
        "PriceMethod": "CurrencyAmount",
        "PurchasingUOfM": "",
        "SellingUOfM": "",
        "KitCOGSAccountSource": "FromComponentItem",
        "LastGeneratedSerialNumber": "",
        "ABCCode": "B",
        "RevalueInventory": true,
        "TolerancePercentage": 0.0,
        "PurchaseItemTaxScheduleID": "",
        "PurchaseTaxOptions": "NonTaxable",
        "ItemPlanningType": "Normal",
        "StatisticalValuePercentage": 0.0,
        "CountryOrigin": "",
        "Inactive": false,
        "MinShelfLife1": 0,
        "MinShelfLife2": 0,
        "IncludeinDemandPlanning": false,
        "LotExpireWarning": true,
        "LotExpireWarningDays": 0,
        "LastGeneratedLotNumber": "",
        "LotSplitQuantity": 0.0,
        "UseQtyOverageTolerance": false,
        "UseQtyShortageTolerance": false,
        "QtyOverageTolerancePercentage": 0.0,
        "QtyShortageTolerancePercentage": 0.0,
        "IVSTDCostRevaluationIndex": 0,
        "UserCategoryValues1": "",
        "UserCategoryValues2": "",
        "UserCategoryValues3": "",
        "UserCategoryValues4": "",
        "UserCategoryValues5": "",
        "UserCategoryValues6": ""
      }
    }

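A client consuming this response would typically check the Status block before reading the Payload. Here is a minimal Python sketch of that pattern, using an abbreviated copy of the response above; the read_item helper is mine, not part of the GP Service:

```python
import json

# Abbreviated from the sample response above: a Status envelope with a
# correlation id and an HTTP-style code, plus the item Payload.
sample = """
{
  "Status": {
    "CorrelationId": "d3056b1bb9d84775ad269abfa09cfa77",
    "Code": 200
  },
  "Payload": {
    "ItemNumber": "2GPROC",
    "ItemDescription": "2 Ghz Processor",
    "CurrentCost": 250.0
  }
}
"""

def read_item(body: str) -> dict:
    """Return the item payload if the service reported success."""
    doc = json.loads(body)
    if doc["Status"]["Code"] != 200:
        raise RuntimeError("service call failed: %s" % doc["Status"])
    return doc["Payload"]

item = read_item(sample)
print(item["ItemNumber"], item["CurrentCost"])  # 2GPROC 250.0
```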
    You can also retrieve an XML payload by specifying the extension in the URI, as follows:


    Here are other examples of URI notations to perform various service calls to retrieve data from Microsoft Dynamics GP, as provided in the Developer's preview:

    Checking the status of the GP Service.

    Obtaining help on supported HTTP verbs.

    Retrieve information on customer AARONFIT0001 (Aaron Fitz Electrical).

    Retrieve information on customer COMPUTER0001(Computer World).

    Retrieve information on item number 100XLG.

    Retrieve information on site 101G.

    Retrieve information on site 104G.

    Retrieve information on all companies under the current tenant.

    Retrieve information about Fabrikam, Inc. under the current tenant.

    I want to mention that there are 2 HTML files provided with the preview, which contain JavaScript sample code showing how to access the Dynamics GP Service. These can be found under the Samples folder. The scripts show how to use the HTTP POST, HTTP PATCH, and HTTP DELETE actions to create, update, and delete a record in Microsoft Dynamics GP, respectively.
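For readers who prefer Python over the JavaScript samples, here is a hedged sketch of how such requests could be prepared with the standard library. The host name and URIs are placeholders of mine, and the requests are only built, never sent:

```python
import json
import urllib.request

BASE = "http://gpservice.example.com/api"  # hypothetical service root

def build_request(method: str, relative_uri: str, body: dict = None):
    """Prepare (but do not send) a JSON request against the service."""
    data = json.dumps(body).encode("utf-8") if body is not None else None
    headers = {"Content-Type": "application/json"} if body is not None else {}
    return urllib.request.Request(
        f"{BASE}/{relative_uri}", data=data, method=method, headers=headers
    )

# Create, update, and delete a hypothetical item record.
create = build_request("POST", "Tenants(Name:T)/Companies(Fabrikam)/Items()",
                       {"ItemNumber": "TESTITEM"})
update = build_request("PATCH", "Tenants(Name:T)/Companies(Fabrikam)/Items(TESTITEM)",
                       {"ItemDescription": "Updated"})
delete = build_request("DELETE", "Tenants(Name:T)/Companies(Fabrikam)/Items(TESTITEM)")

print(create.get_method(), update.get_method(), delete.get_method())
```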

    There's also a .NET sample application that shows how to consume a GP Service. This sample can be loaded with Visual Studio in the Developer's Preview image.

    In my next article, I will show how to build a Microsoft Dexterity-based service that can be consumed by other applications.

    Until next post!

    Mariano Gomez, MVP
    Intelligent Partnerships, LLC

    Wednesday, September 10, 2014

    Open letter to Jeff Edwards, Microsoft Dynamics Partner Strategy

    Mr. Edwards,

    I read your response to MVP Mark Polino's article "No upside to Microsoft's decision to kill Dynamics GP exams" on MSDynamicsWorld, and I really hope you get to my article "Why the end of Microsoft Dynamics GP exam certifications is bad news for customers" at some point for a perspective along the same lines.

    Nonetheless, my objective here today is to offer some direct comments and frankly to keep an open line of communication (there's probably nothing to debate) on the subject of the retirement of ERP certifications - primarily Dynamics GP which is my area of expertise.

    Counter-argument 1:
    On the shift to the Cloud - We are not saying it is Azure/O365 in place of GP, we are saying GP AND Azure/O365. The integrated solution provides true value to customers and differentiation for our partners. Issues like use tax, approval workflow and cash flow projections are absolutely critical, and they are improved by the use of GP working together with O365.

    I happen to be, probably, one of the most tech savvy MVPs in the Dynamics GP pool of MVPs, with over 15 deployments on Azure along with Office 365 and even Azure Virtual Network to On-Premise network integrations. In fact, I have written a number of articles featured on this site on the subject of cloud and ERP deployments and continue to be one of its biggest proponents.

    The real benefit of ERP cloud deployments - at least for my clients - is time-to-execution. After all, being able to complete an implementation project on average 4 to 6 weeks earlier than your typical on-premises deployment has its merits and allows companies to quickly realize an ROI, not having to worry about the infrastructure on which their solution will be deployed. When you really get past this fact, the second biggest driver to a cloud deployment is the expertise of the individuals that allowed the client to realize their solution much quicker: BUSINESS APPLICATIONS SPECIALISTS. Let's not kid ourselves, ERP systems are unlike any other type of technology in the market. You simply cannot take a Windows Server guy and make him/her a manufacturing specialist or a project accounting or tax specialist. It just doesn't work! Despite all the arguments stating "certifications don't make experts", a certification, current or otherwise, is still a vehicle used by customers to understand NOT THE LEVEL OF EXPERTISE, but the LEVEL OF UNDERSTANDING of any given individual on a particular business application subject. For example, if my client perceives my certifications to be strictly technical, then they have a right to question my functional abilities and vice versa.

    Counter-argument 2:
    On your statement that the Dynamics team does not own the expense of their exams. We certainly wish this was the case, but it was not. We had complete budgetary and P&L responsibility for all the exams we created. As stated, we looked at our budget and the current skills and needs in the channel and decided more training, available online, covering more of the integrated Microsoft solution, was a better use of our budget and would have a more positive impact on both our Partners' business and customer success.

    Respectfully, I sense a degree of contradiction in this argument. I fail to understand how rescinding the certifications directly improves building "current skills and needs in the channel", and furthermore how it cannot have an effect on your own bottom line (more on that later). If Microsoft's goal now is skill-building and targeting specific needs, the better alternative, in my humble view, would have been to work with said channel to understand how the certifications needed to change or improve in response to your own goals, all within the budget you had. There are various entities who would have gladly worked with Microsoft - for free, even - to ensure the certifications were adequate and sufficient, namely Dynamic Partner Connections, the GP User Group (GPUG), and the always willing Microsoft Dynamics GP MVP group. Strangely enough, the collaboration model I described seems to exist and work well for other Microsoft divisions. For example, the SQL Server team is one that works closely with special interest groups and MVPs to ensure the certification is a benefit to the community of SQL Server professionals.

    How are you going to ensure that more online training is being assimilated by the channel when your stated intention is to have a "more positive impact on both our Partners' business and customer success"? After all, you cannot control what you can't measure, correct? The bottom line is, Microsoft has the mechanisms in place to ensure the assimilation of other technologies - to use your example, Azure and Office 365. If your goal is to ensure partners are driving customers to the cloud and cloud-based solutions, that's what the partner channel competency program is for. Ironically, partner competency in a specific technology vertical is intrinsically tied to individuals within the organization attaining certifications in those very areas.

    Polino is also correct in stating that we do pay for these exams, which should offset to some degree the cost of producing them. If cost was a concern for Microsoft, why not raise the retail price of the exam as opposed to simply doing away with them? After all, those of us who really value the Microsoft Certified Professional program and achieving a certification to prove to our clients that we've put in the sweat would have gladly paid for them. As an anecdote, we had been offering vouchers to our consultants for upgrading their certifications to the most current product release before news of the cancellation hit the streets. I would also venture to say that most responsible partners offer some form of incentive to their consulting and delivery teams, monetary or otherwise, to maintain existing and attain new certifications that can only benefit the partner organization - what's that word again? Competency!

    Counter-argument 3:
    With the expansion of the solution, I would argue it is not easier to become a partner.... We did try to make it cheaper by dropping the cost of unlimited online training from $6,000 to $1,000. As far as a flood of new, untrained entrants, we instituted a requirement for a business plan and proof of investment for any partner signing up. This must be approved by the US Partner Director. New partners coming into the eco-system have dropped by 70% over the last two years, as was our intent. The new partners that do get in offer unique value and are committed to training their people to deliver value to customers

    I find it very interesting that you mention a "business plan and proof of investment" as a mechanism to vet new entrants. In one of my management classes in the MIS/Technology Management program I graduated from, I learned that business plans and funding are only the starting point for any business and that most companies fail where it matters most: sales and execution.

    As I am sure you are aware, there are partner organizations that can sell and there are partner organizations that can execute or deliver. Rarely do you find the one organization that is very good at both. Mea culpa!

    If there is a silver lining here, it is that you have now opened up the floodgates to the return of the boutique consulting firm. After all, large partners can now focus on selling, selling, selling (which has got to be at the top of the list of drivers behind this move) without the added pressure of maintaining a pool of certified individuals just to keep up with some SPA requirement. Fewer partners, less administration, more revenue, more to the bottom line... I get it!

    The flip side of that coin is that larger partner firms tend to outsource delivery to boutique firms specializing in implementation. Case in point: 80% of my organization's business derives from professional services delivered on behalf of these larger firms, so I may not fit the bill of a traditional revenue partner on your books. The bottom line of this already lengthy explanation is that Microsoft and its larger ERP partner organizations need SOMEONE to deliver these implementations so customers can smile, and Microsoft and, conceivably, the selling partner can collect those coveted and profitable maintenance plan renewals and margins, respectively.

    Then, why not give us small guys a chance to continue differentiating ourselves in the ecosystem? After all, I would want to believe that your large (selling) partners also have a vested interest in seeing their entrusted customers' projects being done by individuals who have at least gone through training and completed a product certification. I will say it again, I'M WILLING TO PAY MORE if that's what it takes, but consider bringing back those certifications for the greater good of the community of customers and partners.

    I promise I won't hold my breath on seeing any of my humble views being entertained at any level within Microsoft, but hope you at least get a chance to read them.


    Mariano Gomez, MVP
    Intelligent Partnerships, LLC

    Friday, August 29, 2014

    Microsoft Dynamics GP 2015 Developer's Preview: Loading the VHD image - Part 1

    Now that you are beyond the initial excitement of the preview release announcement and have downloaded the RAR files with the links provided by Kevin Racer, Sr. Program Manager Lead with the Microsoft Dynamics GP team (See Microsoft announces Developer Preview for Dynamics GP 2015 for links to the rar files), it's time to get the VHD image loaded.

    Note: you can use WinRAR or WinZip to extract the virtual hard drive image from the rar files downloaded from PartnerSource. The extracted file is 29.1GB.

    Part 1 will focus on the traditional Hyper-V method of loading the file. Click here for direct access to the video on YouTube.

    Until next post!

    Mariano Gomez, MVP
    Intelligent Partnerships, LLC

    Microsoft Dynamics GP 2015 Developers' Preview is now available

    These are exciting times indeed! Microsoft Dynamics GP 2015 Developer's Preview is now available for partners to download, as featured over at Developing for Dynamics GP by Kevin Racer, Sr. Program Manager Lead with the Microsoft Dynamics GP team (See Microsoft announces Developer Preview for Dynamics GP 2015).

    The Developer's Preview features the new Service Based Architecture (SBA) components that will enable developers from all walks of life to write and integrate applications on any platform to Microsoft Dynamics GP via REST services.

    NOTE: as of the time of this article, hypermedia is still not part of the current design.

    You can always find more information about RESTful services online, but here's a primer on RESTful with WCF on MSDN.

    The RESTful approach facilitates integration across the board as it takes advantage of some fundamental principles, like HTTP as the transport protocol, URIs to identify resources, and verbs that translate directly into actions.

    Now what makes this even more interesting is, all the business logic to consume and expose services can be written in sanScript. Let me repeat... all the business logic to consume and expose services can be written in sanScript - Microsoft Dexterity's development language. Dexterity has been considerably enhanced and extended for .NET interop. There's no more need to expose a .NET assembly to COM. And through the now familiar Dictionary Assembly Generator (DAG), you can generate the .NET assemblies for your Dexterity based services. This truly allows partners and ISVs to take significant advantage of their existing code base without much effort to add this new functionality.

    Dexterity continues to evolve to deliver powerful functionality
    (C) Copyright Microsoft Corporation, 2014

    .NET interop opens up the door for Dexterity developers to create powerful applications that expose and consume services, along with a host of other options previously available only to Visual Studio developers. Alice Newsam discusses more on .NET Interop in her Dynamics GP Developer Insight article, over at Developing for Dynamics GP. The application integration options have now scaled beyond the traditional Web Services, eConnect, and Integration Manager options for CRUD operations and Dexterity Triggers and Visual Studio Tools for UI integration.

    Partners can download the Roshal Archive format (RAR) files containing the virtual hard drive (VHD) image from PartnerSource using the links provided by Kevin in his article (See Microsoft announces Developer Preview for Dynamics GP 2015).

    It's always good to point out that this is still a preview version, so you are encouraged not to release any product or deliver any service to a customer with these tools, and rather use them for internal education and readiness.


    In my next article I will discuss how to load the VHD image.

    Until next post!

    Mariano Gomez, MVP
    Intelligent Partnerships, LLC

    Tuesday, August 26, 2014

    Customizing Integration Manager Logs - Part 2

    In my previous post I talked about all the out-of-the-box options for setting up Integration Manager ("IM") logs and frankly, the Trace level log is good enough for most IM users. However, when "good" is not good enough, it's necessary to resort to some of the objects and functions available as part of IM's scripting library.

    Errors Collection object, Error object, and functions

    Integration Manager provides the Errors Collection object which is nothing more than a collection or list of all the errors generated during an integration. The Errors Collection must be explicitly retrieved in order to work with the properties within the collection. To navigate the collection we need the Error object to get information about the specific error within the Errors Collection, for example, time of the error, the specific error text, and the type of severity (error or warning).

    IM also provides a number of functions that allow a developer to write into the log file directly. These functions are: LogDetail, LogDocDetail, LogWarning, and LogDocWarning. Each of these functions is discussed in greater detail in Part 5 - Using VBScript, Chapter 22 - Functions of the Integration Manager User's Guide. The following example puts all these together:

    After Document script
    ' Created by Mariano Gomez, MVP
    ' This code is licensed under the Creative Commons 
    ' Attribution-NonCommercial-ShareAlike 3.0 Generic license.
    Const SEVERITY_MEDIUM = 1000
    Const SEVERITY_CRITICAL = 2000
    Dim imErrors ' reference the Errors Collection
    Dim imError ' reference a specific error within the collection
    Dim i ' loop index into the collection
    Set imErrors = GetVariable("Errors")
    If imErrors.Count > 0 Then
     For i = 1 to imErrors.Count
      Set imError = imErrors.Item(i) ' get the error represented by the index
      'Check the severity level of the error
      If imError.Severity = GetVariable("SeverityWarning") Then
       'We have hit a warning
       LogDocWarning imError.MessageText, "", SEVERITY_MEDIUM, "Customer Name", SourceFields("somequery.CustomerName")
      Else
       'We hit a major issue, so now we really want to log all details
       LogDocDetail imError.MessageText, "", SEVERITY_CRITICAL, "Customer Name", SourceFields("somequery.CustomerName")
      End If
     Next 'Continue if there's more than 1 error
    End If

    Note that you can add event logs from anywhere where scripting is allowed in IM. The above sample code is just a small example of how you could customize the logs further, with information that's meaningful to you and your users.

    Hope you found this information useful.

    Until next post!

    Mariano Gomez, MVP
    Intelligent Partnerships, LLC

    Customizing Integration Manager Logs - Part 1

    Just recently, I took on a question on the Microsoft Dynamics GP Partner Online Technical Forum where the original poster asked if it was possible to customize the logs produced by Integration Manager ("IM").

    Before we get into the customization aspects of the log, let's start by remembering that IM already offers 4 levels of log customization out of the box: a Summary log, a Document level log, a Trace level log and, if you consider no log an option, then None. In the case of a Document level log, information about every integrated record is logged, including the document number (keep in mind that document here refers to the entire envelope of data regardless of whether it's an actual document or a master record). The Trace level log, in addition to the Document level information, examines and outputs all the steps performed by IM to get the document into Microsoft Dynamics GP, including any errors or warnings that Microsoft Dynamics GP may send back to IM.
    IM Properties window - Logs tab

    At this point, the above options are out of the box and do exactly what Microsoft developers intended. However, what if you really want to extend the capabilities of the log to provide some additional information that is not currently covered by the Document or Trace level logs? It is possible to provide this extra piece of information if you are familiar with IM event scripts.

    IM Properties window - Scripts tab
    In particular, the Document Warning, Document Error, and Integration Error event scripts allow you to use the VBScript scripting editor to extend the information you may include in the log file, regardless of log trace level. As the events suggest, the Document Warning event will fire when IM receives a warning from Microsoft Dynamics GP in response to an attempt to integrate a document, e.g., a missing distribution account when integrating a journal entry. While the journal will still integrate, further editing work will be required in Microsoft Dynamics GP to add the missing account and balance the journal transaction before posting is possible.

    In contrast, the Document Error event fires when Microsoft Dynamics GP cannot accept the master record or transaction being imported due to inconsistencies with the data. For example, a SOP invoice is submitted with a missing required field, causing the document to be rejected by Microsoft Dynamics GP. In this case, the response is captured as a document error by IM, causing the Document Error event to fire.

    Finally, the Integration Error event fires each time an error occurs for the integration process as a whole.

    This is not to say you can't add additional information to the log at any other point or event within Integration Manager, but you will want to remember that most times when you are dealing with logs, you want to mostly target exceptions within the integration process and not necessarily every single event.

    Tomorrow, I will focus on the scripting options available to enhance/customize the logs. Also remember that this and many other topics will be covered during my GPUG Summit 2014 session on Integration Manager. Please register and attend the session.

    Until next post!

    Mariano Gomez, MVP
    Intelligent Partnerships, LLC

    Thursday, August 21, 2014

    Mariano Gomez does the ALS #IceBucketChallenge

    Thanks to my good friend David Musgrave over at Developing for Dynamics GP for nominating me to the ALS #IceBucketChallenge. David was originally nominated by MVP Jivtesh Singh. See his blog post and video here:

    David completed his challenge (and donation yesterday) and posted this article on his blog as proof of his accomplishment. You can see his challenge video below:

    Of course, supporting the cause and accepting the challenge is what this is all about so here is my poor attempt at self-filming along with drenching myself - and no, I don't have a pool and no, it's not 62 degrees, but the water was ice chilling!

    Since I forgot all about nominating anyone in the video, I take these few extra lines to nominate my kids Laura, Angie, and Miguel Gomez and the Reporting Central team headed by Gianmarco Salzano and Shane Hall. You have 24 hours to complete the challenge.

    Until next post!

    Mariano Gomez, MVP
    Intelligent Partnerships, LLC

    #GPUG Summit 2014 St. Louis Schedule

    I want to begin drawing some attention to my presentation schedule at the upcoming GPUG Summit 2014 in St. Louis, MO where I will be once more delivering some cool and thoughtful sessions around some very relevant topics.

    TOT02 - Mariano's Toolbox: Integration Manager, Please! (Session Level: Intermediate) - Room 231 - Oct 15, 11:00 AM
    STR04 - Mariano's Toolbox: Web Client Deployment for You! (Session Level: Intermediate) - Room 240 - Oct 15, 4:30 PM
    ITP06 - Mariano's Toolbox: Why the Support Debugging Tool is a Customer Favorite! (Session Level: Intermediate) - Room 229 - Oct 16, 9:30 AM
    UPG07 - Mariano's Toolbox: Upgrading to Microsoft Dynamics GP 2013 for Dummies (Session Level: Intermediate) - Room 242 - Oct 16, 2:00 PM

    To make your participation more enticing, all my sessions are eligible for CPE credits, so please visit the Registration page and sign up. You can check out the full sessions schedule here.

    If you want something to do before the event, there are pre-conference training classes available on October 13 and 14 and offered by the GPUG Academy.

    Finally, this year I have been nominated to the GPUG All Stars and would appreciate your vote. Please help me attain this important achievement.

    Until next post!

    Mariano Gomez, MVP
    Intelligent Partnerships, LLC