Wednesday, September 10, 2014

Open letter to Jeff Edwards, Microsoft Dynamics Partner Strategy

Mr. Edwards,

I read your response to MVP Mark Polino's article No upside to Microsoft's decision to kill Dynamics GP exams on MSDynamicsWorld, and I really hope you get to my article Why the end of Microsoft Dynamics GP exam certifications is bad news for customers at some point for a like-minded perspective on the issue.

Nonetheless, my objective here today is to offer some direct comments and, frankly, to keep an open line of communication (there's probably nothing to debate) on the subject of the retirement of ERP certifications - primarily Dynamics GP, which is my area of expertise.

Counter-argument 1:
On the shift to the Cloud - We are not saying it is Azure/O365 in place of GP, we are saying GP AND Azure/O365. The integrated solution provides true value to customers and differentiation for our partners. Issues like use tax, approval workflow and cash flow projections are absolutely critical, and they are improved by the use of GP working together with O365.

I happen to be, probably, one of the most tech-savvy MVPs in the Dynamics GP pool of MVPs, with over 15 deployments on Azure along with Office 365 and even Azure Virtual Network to on-premises network integrations. In fact, I have written a number of articles featured on this site on the subject of cloud and ERP deployments and continue to be one of its biggest proponents.

The real benefit of ERP cloud deployments - at least for my clients - is time-to-execution. After all, being able to complete an implementation project on average 4 to 6 weeks earlier than your typical on-premises deployment has its merits and allows companies to quickly realize an ROI, without having to worry about the infrastructure on which their solution will be deployed. Once you get past this fact, the second biggest driver of a cloud deployment is the expertise of the individuals who allowed the client to realize their solution much quicker: BUSINESS APPLICATIONS SPECIALISTS. Let's not kid ourselves, ERP systems are unlike any other type of technology in the market. You simply cannot take a Windows Server guy and make him/her a manufacturing specialist or a project accounting or tax specialist. It just doesn't work! Despite all the arguments stating "certifications don't make experts", a certification, current or otherwise, is still a vehicle used by customers to understand NOT THE LEVEL OF EXPERTISE, but the LEVEL OF UNDERSTANDING of any given individual on a particular business application subject. For example, if my client perceives my certifications to be strictly technical, then they have a right to question my functional abilities, and vice versa.

Counter-argument 2:
On your statement that the Dynamics team does not own the expense of their exams. We certainly wish this was the case, but it was not. We had complete budgetary and P&L responsibility for all the exams we created. As stated, we looked at our budget and the current skills and needs in the channel and decided more training, available online, covering more of the integrated Microsoft solution, was a better use of our budget and would have a more positive impact on both our Partners' business and customer success.

Respectfully, I sense a degree of contradiction in this argument. I fail to understand how rescinding the certifications directly improves building "current skills and needs in the channel", and furthermore how it cannot affect your own bottom line (more on that later). If Microsoft's goal now is skill-building and targeting of specific needs, the better alternative, in my humble view, would have been to work with said channel to understand how the certifications needed to change or improve in response to your own goals, all within the budget you had. There are various entities that would have gladly worked with Microsoft - for free, even - to ensure the certifications were adequate and sufficient, namely Dynamic Partner Connections, the GP User Group (GPUG), and the always willing Microsoft Dynamics GP MVP group. Strangely enough, the collaboration model I described seems to exist and work well for other Microsoft divisions. For example, the SQL Server team works closely with special interest groups and MVPs to ensure its certifications are a benefit to the community of SQL Server professionals.

How are you going to ensure that more online training is being assimilated by the channel when your stated intention is to have a "more positive impact on both our Partners' business and customer success"? After all, you cannot control what you can't measure, correct? The bottom line is, Microsoft has the mechanisms in place to ensure the assimilation of other technologies - to use your example, Azure and Office 365. If your goal is to ensure partners are driving customers to the cloud and cloud-based solutions, that's what the partner channel competency program is for. Ironically, partner competency in a specific technology vertical is intrinsically tied to individuals within the organization attaining certifications in those very areas.

Polino is also correct in stating that we do pay for these exams, which should offset to some degree the cost of producing them. If cost was a concern for Microsoft, why not raise the retail price of the exam instead of simply doing away with it? After all, those of us who really value the Microsoft Certified Professional program, and achieving a certification to prove to our clients that we've put in the sweat, would have gladly paid for them. As an anecdote, we had been offering vouchers to our consultants for upgrading their certifications to the most current product release before news of the cancellation hit the streets. I would also venture to say that most responsible partners offer some form of incentive to their consulting and delivery teams, monetary or otherwise, to maintain existing and attain new certifications that can only benefit the partner organization - what's that word again? Competency!

Counter-argument 3:
With the expansion of the solution, I would argue it is not easier to become a partner.... We did try to make it cheaper by dropping the cost of unlimited online training from $6,000 to $1,000. As far as a flood of new, untrained entrants, we instituted a requirement for a business plan and proof of investment for any partner signing up. This must be approved by the US Partner Director. New partners coming into the eco-system have dropped by 70% over the last two years, as was our intent. The new partners that do get in offer unique value and are committed to training their people to deliver value to customers

I find it very interesting that you mention a "business plan and proof of investment" as a mechanism to vet new entrants. In one of my management classes in the MIS/Technology Management program I graduated from, I learned that business plans and funding are only the starting point for any business and that most companies fail where it matters most: sales and execution.

As I am sure you are aware, there are partner organizations that can sell and there are partner organizations that can execute or deliver. Rarely do you find the one organization that is very good at both. Mea culpa!

If there is a silver lining here, it is that you have now opened the floodgates to the return of the boutique consulting firm. After all, large partners can now focus on selling, selling, selling (which has got to be at the top of the list of drivers behind this move) without the added pressure of maintaining a pool of certified individuals just to keep up with some SPA requirement. Fewer partners, less administration, more revenue, more to the bottom line... I get it!

The flipside of that coin is that larger partner firms tend to outsource delivery to boutique firms that specialize in implementations. Case in point, 80% of my organization's business derives from professional services delivered on behalf of these larger firms, so I may not fit the bill of a traditional revenue partner on your books. The bottom line of this already lengthy explanation is, Microsoft and its larger ERP partner organizations need SOMEONE to deliver these implementations so customers can smile, and so Microsoft and, conceivably, the selling partner can collect those coveted and profitable maintenance plan renewals and margins, respectively.

Then why not give us small guys a chance to continue differentiating ourselves in the ecosystem? After all, I would like to believe that your large (selling) partners also have a vested interest in seeing their entrusted customers' projects done by individuals who have at least gone through training and completed a product certification. I will say it again, I'M WILLING TO PAY MORE if that's what it takes, but consider bringing back those certifications for the greater good of the community of customers and partners.

I promise I won't hold my breath on seeing any of my humble views being entertained at any level within Microsoft, but hope you at least get a chance to read them.

Sincerely,

MG.-
Mariano Gomez, MVP
Intelligent Partnerships, LLC
http://www.IntelligentPartnerships.com

Friday, August 29, 2014

Microsoft Dynamics GP 2015 Developer's Preview: Loading the VHD image - Part 1

Now that you are beyond the initial excitement of the preview release announcement and have downloaded the RAR files with the links provided by Kevin Racer, Sr. Program Manager Lead with the Microsoft Dynamics GP team (See Microsoft announces Developer Preview for Dynamics GP 2015 for links to the rar files), it's time to get the VHD image loaded.

Note: you can use WinRAR or WinZip to extract the virtual hard drive image from the rar files downloaded from PartnerSource. The extracted file is 29.1GB.

Part 1 will focus on the traditional Hyper-V method of loading the file. Click here for direct access to the video on YouTube.


Until next post!

MG.-
Mariano Gomez, MVP
Intelligent Partnerships, LLC
http://www.IntelligentPartnerships.com

Microsoft Dynamics GP 2015 Developer's Preview is now available

These are exciting times indeed! Microsoft Dynamics GP 2015 Developer's Preview is now available for partners to download, as featured over at Developing for Dynamics GP by Kevin Racer, Sr. Program Manager Lead with the Microsoft Dynamics GP team (See Microsoft announces Developer Preview for Dynamics GP 2015).

The Developer's Preview features the new Service Based Architecture (SBA) components that will enable developers from all walks of life to write applications on any platform and integrate them with Microsoft Dynamics GP via REST services.

NOTE: as of the time of this article, hypermedia is still not part of the current design.

You can always find more information about RESTful services online, but here's a primer on RESTful with WCF on MSDN.

The RESTful approach facilitates integration across the board, as it takes advantage of some fundamental principles: HTTP as the transport protocol, URIs to identify resources, and HTTP verbs that translate directly into actions.
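To make those principles concrete, here is a minimal, self-contained Python sketch of how a RESTful client maps CRUD actions onto HTTP verbs and resource URIs. The host, path, and resource names below are hypothetical placeholders for illustration only - they are not the actual GP 2015 SBA routes.

```python
# Resources are identified by URIs; the HTTP verb conveys the action.
# "gpserver" and "/api/customers" are hypothetical, not real SBA endpoints.

BASE = "https://gpserver/api"  # hypothetical service host

# CRUD operation -> HTTP verb, per standard REST conventions
VERB_FOR_ACTION = {
    "create": "POST",
    "read": "GET",
    "update": "PATCH",
    "delete": "DELETE",
}

def build_request(action, resource, key=None):
    """Return the (verb, URI) pair a RESTful client would use."""
    uri = f"{BASE}/{resource}"
    if key is not None:
        uri += f"/{key}"  # the URI pinpoints the specific resource
    return VERB_FOR_ACTION[action], uri

# Reading one customer record:
print(build_request("read", "customers", "AARONFIT0001"))
# -> ('GET', 'https://gpserver/api/customers/AARONFIT0001')

# Creating a new customer:
print(build_request("create", "customers"))
# -> ('POST', 'https://gpserver/api/customers')
```

Because the verb carries the action and the URI carries the identity, any platform that speaks HTTP can integrate, which is exactly the point of the SBA design.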

Now, what makes this even more interesting is that all the business logic to consume and expose services can be written in sanScript. Let me repeat... all the business logic to consume and expose services can be written in sanScript - Microsoft Dexterity's development language. Dexterity has been considerably enhanced and extended for .NET interop, so there's no more need to expose a .NET assembly to COM. And through the now familiar Dictionary Assembly Generator (DAG), you can generate the .NET assemblies for your Dexterity-based services. This truly allows partners and ISVs to take significant advantage of their existing code base, adding this new functionality without much effort.

Dexterity continues to evolve to deliver powerful functionality
(C) Copyright Microsoft Corporation, 2014

.NET interop opens up the door for Dexterity developers to create powerful applications that expose and consume services, along with a host of other options previously available only to Visual Studio developers. Alice Newsam discusses more on .NET Interop in her Dynamics GP Developer Insight article, over at Developing for Dynamics GP. The application integration options have now scaled beyond the traditional Web Services, eConnect, and Integration Manager options for CRUD operations and Dexterity Triggers and Visual Studio Tools for UI integration.

Partners can download the Roshal Archive format (RAR) files containing the virtual hard drive (VHD) image from PartnerSource using the links provided by Kevin in his article (See Microsoft announces Developer Preview for Dynamics GP 2015).

It's always good to point out that this is still a preview version, so you are encouraged not to release any product or deliver any service to a customer with these tools, and instead to use them for internal education and readiness.

Downloads
SBA_Preview_Readme.txt

In my next article I will discuss how to load the VHD image.

Until next post!

MG.-
Mariano Gomez, MVP
Intelligent Partnerships, LLC
http://www.IntelligentPartnerships.com

Tuesday, August 26, 2014

Customizing Integration Manager Logs - Part 2

In my previous post I talked about all the out-of-the-box options for setting up Integration Manager ("IM") logs and, frankly, the Trace level log is good enough for most IM users. However, when "good" is not good enough, it's necessary to resort to some of the objects and functions available as part of IM's scripting library.

Errors Collection object, Error object, and functions

Integration Manager provides the Errors Collection object, which is nothing more than a collection, or list, of all the errors generated during an integration. The Errors Collection must be explicitly retrieved in order to work with the properties within the collection. To navigate the collection we need the Error object, which provides information about a specific error within the Errors Collection, such as the time of the error, the specific error text, and the severity type (error or warning).

IM also provides a number of functions that allow a developer to write into the log file directly. These functions are: LogDetail, LogDocDetail, LogWarning, and LogDocWarning. Each of these functions is discussed in greater detail in Part 5 - Using VBScript, Chapter 22 - Functions of the Integration Manager User's Guide. The following example puts all these together:

After Document script
'
' Created by Mariano Gomez, MVP
' This code is licensed under the Creative Commons
' Attribution-NonCommercial-ShareAlike 3.0 Generic license.
Const SEVERITY_MEDIUM = 1000
Const SEVERITY_CRITICAL = 2000

Dim imErrors ' reference to the Errors Collection
Dim imError  ' reference to a specific error within the collection
Dim i

Set imErrors = GetVariable("Errors")
If imErrors.Count > 0 Then
 For i = 1 To imErrors.Count
  Set imError = imErrors.Item(i) ' get the error represented by the index

  ' Check the severity level of the error
  If imError.Severity = GetVariable("SeverityWarning") Then
   ' We have hit a warning
   LogDocWarning imError.MessageText, "", SEVERITY_MEDIUM, "Customer Name", SourceFields("somequery.CustomerName")
  Else
   ' We hit a major issue, so now we really want to log all details
   LogDocDetail imError.MessageText, "", SEVERITY_CRITICAL, "Customer Name", SourceFields("somequery.CustomerName")
  End If
 Next ' continue if there's more than one error
End If
  

Note that you can add log entries anywhere scripting is allowed in IM. The above sample code is just a small example of how you could customize the logs further, with information that's meaningful to you and your users.

Hope you found this information useful.

Until next post!

MG.-
Mariano Gomez, MVP
Intelligent Partnerships, LLC
http://www.IntelligentPartnerships.com

Customizing Integration Manager Logs - Part 1

Just recently, I took on a question on the Microsoft Dynamics GP Partner Online Technical Forum where the original poster asked if it was possible to customize the logs produced by Integration Manager ("IM").

Before we get into the customization aspects of the log, let's start by remembering that IM already offers 4 levels of log customization out of the box: a Summary log, a Document level log, a Trace level log and, if you consider no log an option, then None. In the case of a Document level log, information about every integrated record is logged, including the document number (keep in mind that document here refers to the entire envelope of data regardless of whether it's an actual document or a master record). The Trace level log, in addition to the Document level information, examines and outputs all the steps performed by IM to get the document into Microsoft Dynamics GP, including any errors or warnings that Microsoft Dynamics GP may send back to IM.

IM Properties window - Logs tab

At this point, the above options are out of the box and do exactly what Microsoft developers intended. However, what if you really want to extend the capabilities of the log to provide some additional information that is not currently covered by the Document or Trace level logs? It is possible to provide this extra piece of information if you are familiar with IM event scripts.

IM Properties window - Scripts tab
In particular, the Document Warning, Document Error, and Integration Error event scripts allow you to use the VBScript scripting editor to extend the information you include in the log file, regardless of log trace level. As the names suggest, the Document Warning event fires when IM receives a warning from Microsoft Dynamics GP in response to an attempt to integrate a document, e.g., a missing distribution account when integrating a journal entry. While the journal will still integrate, further editing work will be required in Microsoft Dynamics GP to add the missing account and balance the journal transaction before posting is possible.

In contrast, the Document Error event fires when Microsoft Dynamics GP cannot accept the master record or transaction being imported due to inconsistencies with the data. For example, a SOP invoice submitted with a missing required field will be rejected by Microsoft Dynamics GP. This response is captured as a document error by IM, causing the Document Error event to fire.

Finally, the Integration Error event fires each time an error occurs for the integration process as a whole.

This is not to say you can't add information to the log at any other point or event within Integration Manager, but remember that when dealing with logs, most of the time you want to target exceptions within the integration process, not every single event.

Tomorrow, I will focus on the scripting options available to enhance/customize the logs. Also remember that this and many other topics will be covered during my GPUG Summit 2014 session on Integration Manager. Please register and attend the session.

Until next post!

MG.-
Mariano Gomez, MVP
Intelligent Partnerships, LLC
http://www.IntelligentPartnerships.com

Thursday, August 21, 2014

Mariano Gomez does the ALS #IceBucketChallenge

Thanks to my good friend David Musgrave over at Developing for Dynamics GP for nominating me to the ALS #IceBucketChallenge. David was originally nominated by MVP Jivtesh Singh. See his blog post and video here:


David completed his challenge (and donation) yesterday and posted this article on his blog as proof of his accomplishment. You can see his challenge video below:



Of course, supporting the cause and accepting the challenge is what this is all about, so here is my poor attempt at self-filming while drenching myself - and no, I don't have a pool, and no, it's not 62 degrees, but the water was ice cold!


Since I forgot all about nominating anyone in the video, I take these few extra lines to nominate my kids Laura, Angie, and Miguel Gomez and the Reporting Central team headed by Gianmarco Salzano and Shane Hall. You have 24 hours to complete the challenge.

Until next post!

MG.-
Mariano Gomez, MVP
Intelligent Partnerships, LLC
http://www.IntelligentPartnerships.com

#GPUG Summit 2014 St. Louis Schedule


I want to begin drawing some attention to my presentation schedule at the upcoming GPUG Summit 2014 in St. Louis, MO where I will be once more delivering some cool and thoughtful sessions around some very relevant topics.

Code - Session - Room - Date and Time
TOT02 - Mariano's Toolbox: Integration Manager, Please! (Session Level: Intermediate) - Room 231 - Oct 15, 11:00 AM
STR04 - Mariano's Toolbox: Web Client Deployment for You! (Session Level: Intermediate) - Room 240 - Oct 15, 4:30 PM
ITP06 - Mariano's Toolbox: Why the Support Debugging Tool is a Customer Favorite! (Session Level: Intermediate) - Room 229 - Oct 16, 9:30 AM
UPG07 - Mariano's Toolbox: Upgrading to Microsoft Dynamics GP 2013 for Dummies (Session Level: Intermediate) - Room 242 - Oct 16, 2:00 PM

To make your participation more enticing, all my sessions are eligible for CPE credits, so please visit the Registration page and sign up. You can check out the full sessions schedule here.

If you want something to do before the event, there are pre-conference training classes available on October 13 and 14 and offered by the GPUG Academy.

Finally, this year I have been nominated to the GPUG All Stars and would appreciate your vote. Please help me attain this important achievement.

Until next post!

MG.-
Mariano Gomez, MVP
Intelligent Partnerships, LLC
http://www.intelligentpartnerships.com/

Tuesday, August 19, 2014

Microsoft Dynamics GP 2013 R2 Installation: Utilities changes

When running Microsoft Dynamics GP Utilities for a brand new installation of Microsoft Dynamics GP 2013 R2, you may have noticed a new window in the setup wizard. This window is the Web Client SQL Server Login window, which allows you to specify a common SQL account that will be created during the system database setup process.

Web Client SQL Server Login window

In GP 2013 R2, users accessing the web client exclusively no longer require individual SQL accounts to access the underlying data in the system and company databases they have been assigned to. However, their Active Directory credentials must be associated with their Microsoft Dynamics GP account (stored in the Users Master table, SY01400). Access to data is afforded via a common SQL Server login once the Active Directory credentials and the Microsoft Dynamics GP user credentials have been validated. This provides users with a single sign-on experience from the web client.


User Setup

The above is described in more detail by Jason Lech, Escalation Engineer at Microsoft, in his article on Identity Management.

Until next post!

MG.-
Mariano Gomez, MVP
Intelligent Partnerships, LLC
http://www.IntelligentPartnerships.com

Thursday, August 14, 2014

#reIMAGINE2014 Keynote Speaker: Doug Burgum

Doug Burgum in his days

Yes, THE Doug Burgum will be the keynote speaker at Microsoft's #reIMAGINE2014 conference in Fargo, ND, hosted by Dynamics Partner Connections. If this is not enough reason to go to Fargo in November, then I don't know what is.

Of course, Doug has changed a bit from his days at Great Plains Software and Microsoft and this is probably the man you will see on stage at the Historic Fargo Theater.

Doug Burgum today

So sharpen up on your Microsoft Dynamics GP history and join us there. You can register by visiting the reIMAGINE 2014 site. Not only will the keynote be off the charts, but you will get to learn a ton.

Still need a reason to attend? Then have a look at the series of posts by Pam Misialek on the Inside Microsoft Dynamics GP Blog:

Until next post!

MG.-
Mariano Gomez, MVP
Intelligent Partnerships, LLC
http://www.IntelligentPartnerships.com

Wednesday, August 6, 2014

Deploying Business Analyzer Companion App Services on Windows Azure Service Bus

Business Analyzer App for Windows 8.1

If you are not using Business Analyzer today, shame on you. Sometime last year, I wrote a small article on how to deploy the Business Analyzer Windows 8 application on a personal laptop, which is typically how most of us road warriors need it for demo purposes. However, here at Intelligent Partnerships, we've changed our approach a bit: all our infrastructure and demo environments now reside on Windows Azure, which gives all of our people a single, consistent environment across the board, while allowing our accounting staff to work from anywhere in the world.


Today, I want to talk about another one of those cool features in Windows Azure: The Azure Service Bus and how you can leverage it to run the Business Analyzer app for Windows 8.1.

I really don't like to oversimplify anything, but think of Azure Service Bus as a cloud-based message queuing system, akin to MSMQ, but running on a much more robust platform. The advantage is that applications are no longer bound by myriad layers (like firewalls, complex authentication, etc.) to communicate with each other; they can reside anywhere, on any device that can connect to the Internet.
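Since the rest of this post is a UI walkthrough, here is a quick standard-library Python analogy of that message-queuing idea: the producer and consumer never reference each other's location, only the queue, just as two applications relayed through the Azure Service Bus only need to know the namespace. This is purely an illustration of the pattern; it does not use the Azure SDK, and the names are my own.

```python
import queue
import threading

# Stand-in for a Service Bus queue: the only thing both sides know about.
bus = queue.Queue()

def producer():
    """Publishes messages; knows nothing about who consumes them."""
    for doc in ["invoice-001", "invoice-002", "invoice-003"]:
        bus.put(doc)      # publish a message to the queue
    bus.put(None)         # sentinel: no more messages

received = []

def consumer():
    """Receives messages; knows nothing about who produced them."""
    while True:
        msg = bus.get()   # blocks until a message arrives
        if msg is None:
            break
        received.append(msg)

t1 = threading.Thread(target=producer)
t2 = threading.Thread(target=consumer)
t1.start(); t2.start()
t1.join(); t2.join()

print(received)  # ['invoice-001', 'invoice-002', 'invoice-003']
```

The design payoff is the same one described above: because neither side opens a connection directly to the other, no firewall holes or public endpoints are needed on either machine.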

So let's get started...

1. To set up a Service Bus, sign into Azure's management portal, then click the Service Bus option on the left navigation bar.

Service Bus service
2. Click the Create button to add a namespace for your service bus. The namespace identifies the service in Windows Azure and is assigned an address. You can then choose the region where your service will be hosted.

Service bus namespace

By clicking the Ok button, Windows Azure proceeds to activate that namespace if it's available. If it's not available, you will be prompted to enter a new namespace, before you can continue.

3. You can then proceed to install Business Analyzer Companion App Services on one of your virtual machines (preferably not your SQL Server). The installation is straightforward; all you need to do prior to running the setup executable (setup.exe) is deploy the SQL Server Reporting Services reports for Business Analyzer. Those are provided in the Companion App Services SSRS zip file (MDGP2013_CompanionAppServices_SSRS.zip).

4. Once the Companion App Services application is installed, you can proceed to launch the configuration app, which can typically be found in the C:\Program Files\Microsoft Dynamics\GP Companion App Services\ folder.

Welcome screen
The Welcome screen displays the Companion App Service current connection information, which should later on be replaced by the settings you establish for this instance.

5. After the Welcome screen, you are presented with the Windows Azure configuration screen.

Windows Azure Service Bus Configuration window
In this window you will enter information about the Service Bus you previously configured using the Azure management portal. For the most part, you specify the namespace, issuer, and issuer key, which can be found in the namespace's connection information in the Azure management portal.


Service Bus connection information
Note the default issuer and default key must match the issuer name and issuer key, respectively, in the Azure configuration window. Click Next to continue - you will experience a short delay while the wizard validates the Azure service bus information you provided.

6. On the Host Configuration screen you can enter the host name and port of the machine that's going to be running the service - typically, just accept the default port. The beauty here is, since we already configured access via the Azure service bus, it's not necessary to expose the public name (server.cloudapp.net) of the host and create an end-point for the port, thus exposing our servers to the world - this is why we created the service bus to begin with!



7. Next, select the Companion Apps that will be used with the service. You will want to mark both of them here.

Select Applications

8. You will then want to choose the Data Connections to use with Companion App Services. In this case you have a choice of either Excel Reports or SQL Server Reporting Services reports.

Data Connections
9. Next, you must identify where to find the Excel reports, using a UNC path to specify the folder. My reports happen to be at \\servername\gpfiles\excel reports\reports.

Shared Excel Reports folder

10. You are now asked to enter the address of the Report Server and to specify a folder where the reports can be found, in case you have multiple instances of GP deployed.


If you have deployed SharePoint Integrated mode, mark the checkbox and specify the library address for the reports.

11. Once the report server information is validated, you will receive a confirmation page with the address to be used when configuring the Business Analyzer application.

Configuration Complete window
The address we are particularly interested in is the Service Bus address, which in this case is simply ip-contoso. (with the period at the end).

12. Open the Business Analyzer application, then go to the Configuration window from the Charms bar on the right. You can then enter the ip-contoso. address in the service configuration.

Business Analyzer Configuration

Sweet! Now, we have configured BA without really exposing the public address of our Azure VM, which should make system administrators very happy.

MG.-
Mariano Gomez, MVP
Intelligent Partnerships, LLC
http://www.IntelligentPartnerships.com