Wednesday, September 24, 2014

Microsoft Dynamics GP 2015 Developer's Preview: Working with Sample URIs - Part 2

In the part 1 video, I explained how to mount the Microsoft Dynamics GP 2015 Developer's Preview virtual hard disk using Hyper-V. My intent was to follow up with a part two showing how to mount the VHD file on Windows Azure, but I realized it would take more time than I wanted to invest to really get the point across on the many aspects of the new service architecture components, so I have decided to forgo the Azure portion for another day.

Today, I will focus on some of the sample service requests provided on the Developer's Preview image, which can be found in the Example Service Requests.txt file available on the desktop of the image.

Before diving in, however, I want to touch on Representational State Transfer (REST) services. REST, a term first coined by Roy Fielding (a principal author of the HTTP specification) in his doctoral dissertation, is an architectural style that treats networked application state and functionality as resources sharing a uniform interface. This architectural style differs in many ways from the Remote Procedure Call (RPC) architecture, where services reside on the network and are invoked using request parameters and control data contained within messages.

Some of the basic principles governing REST services are:

  • Actors interact with resources, and resources are anything that can be named and represented. Each resource can be addressed via a unique Uniform Resource Identifier (URI).
  • Interaction with resources (located through their unique URIs) is accomplished using a uniform interface of the HTTP standard verbs (GET, POST, PUT, and DELETE). Also important in the interaction is the declaration of the resource's media type, which is designated using the HTTP Content-Type header. (XHTML, XML, JPG, PNG, and JSON are some well-known media types.)
  • Resources are self-descriptive. All the information necessary to process a request on a resource is contained inside the request itself (which allows services to be stateless).
  • Resources contain links to other resources (hyper-media).

While REST is defined by its author using strict architectural principles, the term is often used loosely to describe any simple URI-based request to a specific domain over HTTP without the use of an additional messaging layer such as Simple Object Access Protocol (SOAP). Implementations adhering to the strict principles of REST are often referred to as being "RESTful," while those that adhere only loosely are called "REST-like." Microsoft Dynamics GP Services can be considered REST-like (see Chapter 1: Microsoft Dynamics GP Service, page 3 of the Microsoft Dynamics GP Service Based Architecture Preview documentation).
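
To picture the uniform interface before getting into GP specifics, here is a minimal, hypothetical JavaScript sketch: one resource URI addressed with the standard verbs, with the media type declared in the headers. The /orders/42 resource is invented purely for illustration and is not part of GP Services; the sketch assumes an environment with the fetch API (a modern browser console or Node 18+).

  // One resource, one URI, standard verbs; the resource itself is hypothetical.
  const orderUri = "http://example.com/orders/42";

  // Retrieve a representation of the resource (GET), asking for JSON.
  const order = await fetch(orderUri, { headers: { "Accept": "application/json" } })
    .then(response => response.json());
  console.log(order);

  // Replace the resource with a new representation (PUT), declaring its media type.
  await fetch(orderUri, {
    method: "PUT",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ status: "shipped" })
  });

  // Remove the resource (DELETE); no body or media type is needed.
  await fetch(orderUri, { method: "DELETE" });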

A quick sample

As an example, imagine you need to build a service that interacts with the Microsoft Dynamics GP item master: a service that can produce the list of items, or information about a specific item in the list, from a specific company database - in this case Fabrikam - which in turn resides within a specific tenant. Technically speaking, this service could also add data for an item to, or retrieve it from, the item master in Fabrikam on the current tenant.

When building a REST-like service, you must answer 3 basic questions:


  • What resources are you trying to define or expose?
  • How are you going to represent the resources (URIs)?
  • What actions are you going to support for each URI (HTTP verbs)?


    For our example, the resources are defined by the hierarchy Tenants(Name:tenant_name)/Companies(company_name)/Items(item_number). The absolute URIs depend on where the service is hosted; they could take the form http://somedomain.com:port_number/gpservice/ followed by the above hierarchy. What remains is to work out the URI for each resource and the HTTP verbs, or actions, supported against it.

    Next, we need to determine the URIs for each resource. Right now we only need the relative URIs, since the absolute URI is determined by where we host the service. The item master will be the root URI of the service (/). Using this syntax, /Items() will return all of the items contained in the item master, while /Items({ItemNumber}) will be the URI for each individual item within the item master.
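
    Putting those pieces together, an absolute URI is simply the host-specific base followed by the relative hierarchy. Here is a small JavaScript sketch of that composition; the base URL below is the preview default used in the examples that follow, so adjust it to wherever the service is actually hosted:

    // Compose an item URI from the relative hierarchy described above.
    const baseUrl = "http://localhost:8084/GPService";

    function itemUri(tenantName, companyName, itemNumber) {
      // encodeURI turns the space in "Fabrikam, Inc." into %20 while leaving
      // the comma and the parentheses alone, matching the sample URIs.
      return encodeURI(baseUrl
        + "/Tenants(Name=" + tenantName + ")"
        + "/Companies(" + companyName + ")"
        + "/Items(" + itemNumber + ")");
    }

    console.log(itemUri("DefaultTenant", "Fabrikam, Inc.", "2GPROC"));
    // -> http://localhost:8084/GPService/Tenants(Name=DefaultTenant)/Companies(Fabrikam,%20Inc.)/Items(2GPROC)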

    Under the current Developer's Preview implementation, if you want to retrieve information about an item (HTTP GET), you can use the following URI notation from your browser:

    http://localhost:8084/GPService/Tenants(Name=DefaultTenant)/Companies(Fabrikam,%20Inc.)/Items(2GPROC)

    Copying and pasting the above URL into the browser causes the service call to return a JavaScript Object Notation (.json) file, as shown below:

    Items(2GPROC).json
    {
      "Status": {
        "CorrelationId": "d3056b1bb9d84775ad269abfa09cfa77",
        "Code": 200
      },
      "Payload": {
        "Trace": [],
        "ItemNumber": "2GPROC",
        "ItemDescription": "2 Ghz Processor",
        "NoteIndex": 333.0,
        "ItemShortName": "",
        "ItemType": "SalesInventory",
        "ItemGenericDescription": "",
        "StandardCost": 0.0,
        "CurrentCost": 250.0,
        "ItemShippingWeight": 0.0,
        "DecimalPlacesQTYS": "NotUsed",
        "DecimalPlacesCurrency": "One",
        "ItemTaxScheduleID": "",
        "TaxOptions": "Nontaxable",
        "IVIVIndex": 18,
        "IVIVOffsetIndex": 18,
        "IVCOGSIndex": 137,
        "IVSalesIndex": 112,
        "IVSalesDiscountsIndex": 128,
        "IVSalesReturnsIndex": 134,
        "IVInUseIndex": 0,
        "IVInServiceIndex": 141,
        "IVDamagedIndex": 141,
        "IVVariancesIndex": 783,
        "DropShipIndex": 445,
        "PurchasePriceVarianceIndex": 446,
        "UnrealizedPurchasePriceVarianceIndex": 446,
        "InventoryReturnsIndex": 450,
        "AssemblyVarianceIndex": 0,
        "ItemClassCode": "RM-ACT",
        "ItemTrackingOption": 1,
        "LotType": "",
        "KeepPeriodHistory": true,
        "KeepTrxHistory": true,
        "KeepCalendarHistory": true,
        "KeepDistributionHistory": true,
        "AllowBackOrders": true,
        "ValuationMethod": "FIFOPerpetual",
        "UOfMSchedule": "EACH",
        "AlternateItem1": "",
        "AlternateItem2": "",
        "MasterRecordType": 1,
        "ModifiedDate": "2017-05-21T00:00:00",
        "CreatedDate": "2017-05-21T00:00:00",
        "WarrantyDays": 0,
        "PriceLevel": "",
        "LocationCode": "",
        "PurchInflationIndex": 0,
        "PurchMonetaryCorrectionIndex": 0,
        "InventoryInflationIndex": 0,
        "InventoryMonetaryCorrectionIndex": 0,
        "COGSInflationIndex": 0,
        "COGSMonetaryCorrectionIndex": 0,
        "ItemCode": "",
        "TaxCommodityCode": "",
        "PriceGroup": "BUY",
        "PriceMethod": "CurrencyAmount",
        "PurchasingUOfM": "",
        "SellingUOfM": "",
        "KitCOGSAccountSource": "FromComponentItem",
        "LastGeneratedSerialNumber": "",
        "ABCCode": "B",
        "RevalueInventory": true,
        "TolerancePercentage": 0.0,
        "PurchaseItemTaxScheduleID": "",
        "PurchaseTaxOptions": "NonTaxable",
        "ItemPlanningType": "Normal",
        "StatisticalValuePercentage": 0.0,
        "CountryOrigin": "",
        "Inactive": false,
        "MinShelfLife1": 0,
        "MinShelfLife2": 0,
        "IncludeinDemandPlanning": false,
        "LotExpireWarning": true,
        "LotExpireWarningDays": 0,
        "LastGeneratedLotNumber": "",
        "LotSplitQuantity": 0.0,
        "UseQtyOverageTolerance": false,
        "UseQtyShortageTolerance": false,
        "QtyOverageTolerancePercentage": 0.0,
        "QtyShortageTolerancePercentage": 0.0,
        "IVSTDCostRevaluationIndex": 0,
        "UserCategoryValues1": "",
        "UserCategoryValues2": "",
        "UserCategoryValues3": "",
        "UserCategoryValues4": "",
        "UserCategoryValues5": "",
        "UserCategoryValues6": ""
      }
    }
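
    If you would rather consume the response from script than from the browser, the call is just an HTTP GET. Below is a minimal JavaScript sketch using the fetch API; it assumes the script runs somewhere that can reach the endpoint and that authentication is handled for you, as it is when you paste the URL into the browser on the preview image.

    // Issue the GET above and unwrap the Status/Payload envelope shown in the listing.
    const itemUrl = "http://localhost:8084/GPService/Tenants(Name=DefaultTenant)"
                  + "/Companies(Fabrikam,%20Inc.)/Items(2GPROC)";

    const response = await fetch(itemUrl, { headers: { "Accept": "application/json" } });
    const result = await response.json();

    if (result.Status && result.Status.Code === 200) {
      const item = result.Payload;
      console.log(item.ItemNumber, item.ItemDescription, item.CurrentCost);
      // e.g. 2GPROC 2 Ghz Processor 250
    } else {
      console.error("Service call failed:", result.Status);
    }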
    

    You can also retrieve an XML payload by specifying the extension in the URI, as follows:

    http://localhost:8084/GPService/Tenants(Name=DefaultTenant)/Companies(Fabrikam,%20Inc.)/Items(2GPROC).xml

    Here are other examples of URI notations for various service calls that retrieve data from Microsoft Dynamics GP, as provided in the Developer's Preview (a small script that exercises a few of them follows the list):

    Check the status of the GP Service.
    http://localhost:8084/GPService/Tenants(Name=DefaultTenant)/Companies(Fabrikam,%20Inc.)/Utility/Ping

    Obtain help on supported HTTP verbs.
    http://localhost:8084/GPService/Tenants(Name=DefaultTenant)/Companies(Fabrikam,%20Inc.)/Utility/Help

    Retrieve information on customer AARONFIT0001 (Aaron Fitz Electrical).
    http://localhost:8084/GPService/Tenants(Name=DefaultTenant)/Companies(Fabrikam,%20Inc.)/Customers(AARONFIT0001)

    Retrieve information on customer COMPUTER0001(Computer World).
    http://localhost:8084/GPService/Tenants(Name=DefaultTenant)/Companies(Fabrikam,%20Inc.)/Customers(COMPUTER0001)

    Retrieve information on item number 100XLG.
    http://localhost:8084/GPService/Tenants(Name=DefaultTenant)/Companies(Fabrikam,%20Inc.)/Items(100XLG)

    Retrieve information on site 101G.
    http://localhost:8084/GPService/Tenants(Name=DefaultTenant)/Companies(Fabrikam,%20Inc.)/Sites(101G)

    Retrieve information on site 104G.
    http://localhost:8084/GPService/Tenants(Name=DefaultTenant)/Companies(Fabrikam,%20Inc.)/Sites(104G)

    Retrieve information on all companies under the current tenant.
    http://localhost:8084/GPService/Tenants(Name=DefaultTenant)/Companies(Fabrikam,%20Inc.)/Companies()

    Retrieve information about Fabrikam, Inc. under the current tenant.
    http://localhost:8084/GPService/Tenants(Name=DefaultTenant)/Companies(Fabrikam,%20Inc.)/Companies(TWO)
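
    As a quick smoke test, you can also exercise several of these from script. The sketch below simply walks a handful of the URIs above and reports the HTTP status code of each call, under the same assumptions about reachability and authentication as before:

    // Hit a few of the sample URIs and report each HTTP status code.
    const companyBase = "http://localhost:8084/GPService/Tenants(Name=DefaultTenant)/Companies(Fabrikam,%20Inc.)";

    const sampleUris = [
      companyBase + "/Utility/Ping",            // service status
      companyBase + "/Customers(AARONFIT0001)", // customer record
      companyBase + "/Items(100XLG)",           // item record
      companyBase + "/Sites(101G)"              // site record
    ];

    for (const uri of sampleUris) {
      const reply = await fetch(uri);
      console.log(reply.status, uri);
    }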

    I want to mention that there are 2 HTML files provided with the preview, which contain JavaScript sample code showing how to access the Dynamics GP Service. These can be found under the Samples folder. The scripts show how to use the HTTP POST, HTTP PATCH, and HTTP DELETE actions to create a new record, and to update and delete an existing record, in Microsoft Dynamics GP, respectively.
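
    The shipped HTML samples are the authoritative reference for those verbs. Purely to illustrate the shape of such calls - the item number and field names below are borrowed from the payload shown earlier, and the exact URIs and bodies the preview service expects may differ - a create, update, and delete round trip could look like this:

    // Illustrative only; see the HTML files in the Samples folder for the exact
    // payloads and URIs the preview service expects.
    const itemsBase = "http://localhost:8084/GPService/Tenants(Name=DefaultTenant)"
                    + "/Companies(Fabrikam,%20Inc.)/Items";

    // Create a new item (HTTP POST).
    await fetch(itemsBase + "()", {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ ItemNumber: "DEMOITEM1", ItemDescription: "Demo item", ItemType: "SalesInventory" })
    });

    // Update an existing item (HTTP PATCH) - send only the fields being changed.
    await fetch(itemsBase + "(DEMOITEM1)", {
      method: "PATCH",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ ItemDescription: "Demo item, renamed" })
    });

    // Delete the item (HTTP DELETE).
    await fetch(itemsBase + "(DEMOITEM1)", { method: "DELETE" });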

    There's also a .NET sample application that shows how to consume a GP Service. This sample can be loaded with Visual Studio on the Developer's Preview image.

    While this is all good, in my next article I will show how to build a Microsoft Dexterity-based service that can be consumed by other applications.

    Until next post!

    MG.-
    Mariano Gomez, MVP
    Intelligent Partnerships, LLC
    http://www.IntelligentPartnerships.com

    Wednesday, September 10, 2014

    Open letter to Jeff Edwards, Microsoft Dynamics Partner Strategy

    Mr. Edwards,

    I read your lines in response to MVP Mark Polino's article No upside to Microsoft's decision to kill Dynamics GP exams on MSDynamicsWorld, and I really hope you get to my article Why the end of Microsoft Dynamics GP exam certifications is bad news for customers at some point for a similar perspective on the issue.

    Nonetheless, my objective here today is to offer some direct comments and, frankly, to keep an open line of communication (there's probably nothing to debate) on the subject of the retirement of ERP certifications - primarily Dynamics GP, which is my area of expertise.

    Counter-argument 1:
    On the shift to the Cloud - We are not saying it is Azure/O365 in place of GP, we are saying GP AND Azure/O365. The integrated solution provides true value to customers and differentiation for our partners. Issues like use tax, approval workflow and cash flow projections are absolutely critical, and they are improved by the use of GP working together with O365.

    I happen to be, probably, one of the most tech-savvy MVPs in the Dynamics GP pool, with over 15 deployments on Azure along with Office 365, and even Azure Virtual Network to on-premises network integrations. In fact, I have written a number of articles featured on this site on the subject of cloud and ERP deployments, and I continue to be one of the biggest proponents of this deployment model.

    The real benefit of ERP cloud deployments - at least for my clients - is time-to-execution. After all, being able to complete an implementation project on average 4 to 6 weeks earlier than a typical on-premises deployment has its merits and allows companies to quickly realize an ROI, without having to worry about the infrastructure on which their solution will be deployed. Once you get past this fact, the second biggest driver of a cloud deployment is the expertise of the individuals who allow the client to realize their solution that much faster: BUSINESS APPLICATIONS SPECIALISTS. Let's not kid ourselves, ERP systems are unlike any other type of technology in the market. You simply cannot take a Windows Server guy and make him/her a manufacturing specialist or a project accounting or tax specialist. It just doesn't work! Despite all the arguments stating "certifications don't make experts," a certification, current or otherwise, is still a vehicle used by customers to understand NOT THE LEVEL OF EXPERTISE, but the LEVEL OF UNDERSTANDING of any given individual on a particular business application subject. For example, if my client perceives my certifications to be strictly technical, then they have a right to question my functional abilities, and vice versa.

    Counter-argument 2:
    On your statement that the Dynamics team does not own the expense of their exams. We certainly wish this was the case, but it was not. We had complete budgetary and P&L responsibility for all the exams we created. As stated, we looked at our budget and the current skills and needs in the channel and decided more training, available online, covering more of the integrated Microsoft solution, was a better use of our budget and would have a more positive impact on both our Partners' business and customer success.

    Respectfully, I sense a degree of contradiction in this argument. I fail to understand how rescinding the certifications directly improves building "current skills and needs in the channel," and furthermore how it can avoid having an effect on your own bottom line (more on that later). If Microsoft's goal now is skill-building and targeting of specific needs, the better alternative, in my humble view, would have been to work with said channel to understand how the certifications needed to change or improve in response to your own goals, all within the budget you had. There are various entities who would have gladly worked with Microsoft - for free, even - to ensure the certifications were adequate and sufficient, namely Dynamic Partner Connections, the GP User Group (GPUG), and the always willing Microsoft Dynamics GP MVP group. Strangely enough, the collaboration model I describe seems to exist and work well for other Microsoft divisions. For example, the SQL Server team works closely with special interest groups and MVPs to ensure the certifications are a benefit to the community of SQL Server professionals.

    How are you going to ensure that more online training is being assimilated by the channel when your stated intention is to have a "more positive impact on both our Partner's business and customer success"? After all, you cannot control what you can't measure, correct? The bottom line is, Microsoft has the mechanisms in place to ensure the assimilation of other technologies - to use your example, Azure and Office 365. If your goal is to ensure partners are driving customers to the cloud and cloud-based solutions, that's what the partner channel competency program is for. Ironically, partner competency in a specific technology vertical is intrinsically tied to individuals within the organization attaining certifications in those very areas.

    Polino is also correct in stating that we do pay for these exams, which should offset to some degree the cost of producing them. If cost was a concern for Microsoft, why not raise the retail price of the exams as opposed to simply doing away with them? After all, those of us who really value the Microsoft Certified Professional program, and achieving a certification to prove to our clients that we've put in the sweat, would have gladly paid for them. As an anecdote, we had been offering vouchers to our consultants for upgrading their certifications to the most current product release before news of the cancellation hit the streets. I would also venture to say that most responsible partners offer some form of incentive to their consulting and delivery teams, monetary or otherwise, to maintain existing and attain new certifications that can only benefit the partner organization - what's that word again? Competency!

    Counter-argument 3:
    With the expansion of the solution, I would argue it is not easier to become a partner.... We did try to make it cheaper by dropping the cost of unlimited online training from $6,000 to $1,000. As far as a flood of new, untrained entrants, we instituted a requirement for a business plan and proof of investment for any partner signing up. This must be approved by the US Partner Director. New partners coming into the eco-system have dropped by 70% over the last two years, as was our intent. The new partners that do get in offer unique value and are committed to training their people to deliver value to customers

    I find it very interesting that you mention a "business plan and proof of investment" as a mechanism to vet new entrants. In one of my management classes in the MIS/Technology Management program I graduated from, I learned that business plans and funding are only the starting point for any business and that most companies fail where it matters most: sales and execution.

    As I am sure you are aware, there are partner organizations that can sell and there are partner organizations that can execute or deliver. Rarely do you find the one organization that is very good at both. Mea culpa!

    If there is a silver lining here, it is that you have now opened up the floodgates to the return of the boutique consulting firm. After all, large partners can now focus on selling, selling, selling (which has got to be at the top of the list of drivers behind this move) without the added pressure of maintaining a pool of certified individuals just to keep up with some SPA requirement. Fewer partners, less administration, more revenue, more to the bottom line... I get it!

    The flip side of that coin is that larger partner firms tend to outsource the delivery to boutique firms specializing in implementation. Case in point: 80% of my organization's business derives from professional services delivered on behalf of these larger firms, so I may not fit the bill of a traditional revenue partner on your books. The bottom line of this already lengthy explanation is that Microsoft and its larger ERP partner organizations need SOMEONE to deliver these implementations so customers can smile, and so Microsoft and, conceivably, the selling partner can collect those coveted and profitable maintenance plan renewals and margins, respectively.

    Then, why not give us small guys a chance to continue differentiating ourselves in the ecosystem? After all, I would like to believe that your large (selling) partners also have a vested interest in seeing their entrusted customers' projects done by individuals who have at least gone through training and completed a product certification. I will say it again: I'M WILLING TO PAY MORE if that's what it takes, but consider bringing back those certifications for the greater good of the community of customers and partners.

    I promise I won't hold my breath waiting to see any of my humble views entertained at any level within Microsoft, but I hope you at least get a chance to read them.

    Sincerely,

    MG.-
    Mariano Gomez, MVP
    Intelligent Partnerships, LLC
    http://www.IntelligentPartnerships.com