Sunday, November 4, 2018

Analyzing expense receipts with Azure Cognitive Services and Microsoft Flow

Recently, Business Applications MVP Steve Endow and I delivered a session at the User Group Summits in Phoenix, specifically at the GPUG Summit, titled "Microsoft Dynamics GP and Azure Services". In this session we detailed a number of Azure services (among the hundreds available) that could potentially be used with Microsoft Dynamics GP.

Since I have also been working my way through a Microsoft-sanctioned PowerApps, Flow, and CDS self-paced online training class offered by edX (click here for more info on the course) and presented by Business Applications MVP Shane Young, I asked myself: what could I do with Microsoft Flow and Azure services that would benefit Microsoft Dynamics GP users?

Playing with some Azure services, I came across Azure Cognitive Services, which offers the capability of reading images and analyzing them for text via its Computer Vision service. As it turns out, this service offers an optical character recognition (OCR) feature capable of returning either the full text or a JSON-formatted document containing the text. The idea here is to use Microsoft Flow to detect a newly created receipt in a OneDrive folder, transfer the file to Cognitive Services' Computer Vision for analysis, and get back the parsed text from the OCR service.
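Under the hood, the Flow connector used later in this post wraps the Computer Vision OCR REST API. Purely for context, here is a minimal sketch of the same call made directly, assuming the v2.0 OCR endpoint and the Python requests library; the endpoint, key, and file name below are placeholders, not values from this walkthrough.

# Minimal sketch: submit a receipt image directly to the Computer Vision OCR
# endpoint and print the JSON result. Endpoint, key, and file name are placeholders.
import requests

endpoint = "https://<your-region>.api.cognitive.microsoft.com"   # service endpoint from the Overview blade
subscription_key = "<your-access-key>"                           # one of the two access keys
ocr_url = endpoint + "/vision/v2.0/ocr"

with open("receipt.jpg", "rb") as image_file:
    image_data = image_file.read()

headers = {
    "Ocp-Apim-Subscription-Key": subscription_key,
    "Content-Type": "application/octet-stream",
}
params = {"language": "unk", "detectOrientation": "true"}

response = requests.post(ocr_url, headers=headers, params=params, data=image_data)
response.raise_for_status()     # a 200 status means the image was processed successfully
print(response.json())          # the payload is nested as regions -> lines -> words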

Let's see how it's done!

Provision the Computer Vision service

The first thing is to ensure Computer Vision has been enabled as a service on your Azure tenant. To do this, visit the Azure Portal, click on Create a Resource, then select Computer Vision from the AI + Machine Learning category within the Azure Marketplace.

Computer Vision

Fill in some basic information like the resource name, location, pricing tier (there's an F0 free tier!), and a resource group. Click the Create button when done. This will provision your Computer Vision resource.
 
Copy the service endpoint address and access keys

Once the service is provisioned, click on All Resources, select the Computer Vision resource you just created, then click on Overview.

Grab the service endpoint address and the access keys. You can obtain the access keys by clicking on Show access keys... (two access keys are provided).

Computer Vision service endpoint info

This is, by far, one of the easiest services to provision, and it requires no extra configuration beyond establishing some access control to limit who can use the service, but that's not a topic for this discussion.

Set up a new Microsoft Flow flow

Over in Microsoft Flow, I started by setting up a blank flow and selected OneDrive's "When a file is created" trigger, as this provides a simple trigger point for when an expense receipt file is added to a Receipts folder I had previously created. You will then be prompted to set up the connection information for OneDrive.

Blank flow with "When a file is created" trigger

NOTE: I selected my personal OneDrive for this, but this can also be done with a folder on your OneDrive for Business environment. In this case, you will want to authenticate with your Office 365 credentials.
Receipts folder


Submit file to Computer Vision service

As it also turns out, Microsoft Flow has a connector to the Azure Computer Vision API, which exposes two actions: OCR to JSON and OCR to Text. Add a New Step and type Computer Vision in the search bar. Select Computer Vision API, then choose the OCR to Text action.

Computer Vision API connector - OCR to Text action

Once again, you will be prompted for the connection information to your Computer Vision service on Azure. Enter the user account, the access key, and the service endpoint gathered in step two above.

Computer Vision API - credentials entry
Once the credentials are entered, you can decide what to submit to Computer Vision. In this case, we want to send the File Content, which we can select from the Dynamic content fields.

File Content from Dynamic content fields

Configure Email step with Results

Upon completion, we want to send the resulting OCR text from the analyzed image via email, so we will add another step to the flow. This time, we will use the Office 365 Outlook connector and choose the Send an email action for our next step.
Office 365 Outlook connector - Send Email action

We can then set up the properties for the Send an email step. I have chosen to send the email to myself and compose a subject line using the File name from the OneDrive directory. As the body, I am including the Detected Text, which can be selected from the OCR to Text category under Dynamic content. I have included both the original file name and content as part of the attachments.



Finally, I have given this flow a name and saved it.

Testing the Flow

I have dropped two receipt files into my OneDrive Receipts folder. These two receipts present varying degrees of quality and recognizable text. I was particularly interested in the second receipt (right), as it was badly folded and cracked, so I was curious to see how it would be analyzed.

Receipts
For the second receipt, the OCR service returned the JSON payload with a status of 200, indicating it was successful in processing the image and delivering a response.

JSON payload for second receipt
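If you are curious about the payload's shape, the OCR result nests regions, which contain lines, which in turn contain words. Here is a small sketch that flattens it into the kind of text the OCR to Text action produces; it assumes the dict returned by response.json() in the earlier example.

# Sketch: flatten the OCR JSON payload (regions -> lines -> words) into plain
# text, one detected line per output line.
def ocr_json_to_text(payload):
    detected_lines = []
    for region in payload.get("regions", []):
        for line in region.get("lines", []):
            detected_lines.append(" ".join(word["text"] for word in line.get("words", [])))
    return "\n".join(detected_lines)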

The actual email I received looked like this and contained the following text:

Receipt analysis

Now, remember that my goal isn't to judge the accuracy of the OCR result delivered by Computer Vision, but rather to show how easy it is to build these kinds of solutions with Microsoft Flow and existing Azure services. Something like this would take an inordinate amount of time to build using traditional development tools and services.

Conceivably, I could create a simple PowerApps application that uses the Camera control to take a picture of the receipt and save it to my OneDrive folder. At that point, the receipt would be picked up by the Flow and analyzed by Computer Vision as we have established here. Why would this be important? If, for example, you wanted to parse the JSON payload and submit the result to Microsoft Dynamics GP or Dynamics 365 as an accounts payable voucher, this would be a useful starting point.
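As a taste of what that parsing step might look like, here is a purely hypothetical sketch that scans the detected text for currency-looking amounts and treats the largest one as the candidate voucher total. The regular expression and the heuristic are illustrative assumptions, not part of the Flow built above.

# Hypothetical sketch: guess the receipt total from the detected text before
# handing it off to an AP voucher integration. The regex and the "largest
# amount wins" heuristic are illustrative only.
import re

def guess_receipt_total(detected_text):
    amounts = [float(match.replace(",", ""))
               for match in re.findall(r"\d{1,3}(?:,\d{3})*\.\d{2}", detected_text)]
    return max(amounts) if amounts else None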

Until next post,

MG.-
Mariano Gomez, MVP

Friday, October 19, 2018

Installing Microsoft Azure Integration Runtime

The Integration Runtime (IR) is a customer-managed data integration infrastructure used by Azure Data Factory to provide data integration capabilities across different network environments. It was formerly known as the Data Management Gateway.

It is used by Business Central and Dynamics GP to provide one-way data synchronization, via Azure Data Factory, between the on-premises SQL Server hosting your company databases and your Business Central tenant's data store in the cloud.

The self-hosted IR is capable of:
  • Running copy activities between a cloud data store and a data store in a private network.
  • Dispatching the following transform activities against compute resources on-premises or in an Azure Virtual Network: HDInsight Hive activity (BYOC), HDInsight Pig activity (BYOC), HDInsight MapReduce activity (BYOC), HDInsight Spark activity (BYOC), HDInsight Streaming activity (BYOC), Machine Learning Batch Execution activity, Machine Learning Update Resource activity, Stored Procedure activity, Data Lake Analytics U-SQL activity, .NET custom activity, Lookup activity, and Get Metadata activity.
To install, go to the Assisted Setup page and download the installation file. Copy the authentication key as you will need it for the configuration step.

NOTE: Be sure to scroll horizontally to the end of the line while selecting the key text.


Launch the installation file to begin the setup process. You must first select the language you will use with the Integration Runtime.


Accept (or not) the end-user license agreement to continue with the installation process.


Select the installation folder to host the application files.


Click the install button to begin the installation process. You may be prompted to run the application with elevated permissions to bypass User Account Control settings on your machine.


Once the installation has been completed, click Finish to proceed with IR's configuration.


Paste the Authentication Key from step 1. If you need to adjust proxy settings for your organization, be sure to click on the Change link to set up your proxy info.


The following screen allows you to enter the name of the self-hosted node and displays a list of nodes as configured by Business Central on the Azure Data Factory service. You may also enable remote access from your intranet to the integration node. Click Finish when done.


If you choose to enable remote access for management purposes, you must enter a TCP/IP port that is open on your firewall. By default, this port is 8060. Should you want to enable secure access while remoting into the node, be sure to select a certificate. You can also enable remote access without a certificate.


Finally, click on Finish to complete the configuration settings and finalize the node registration process.


For more information on Azure Data Factory, please visit:
https://docs.microsoft.com/en-us/azure/data-factory/

For more information on Azure Integration Runtime, please visit:
https://docs.microsoft.com/en-us/azure/data-factory/concepts-integration-runtime

Until next post,

MG.-
Mariano Gomez, MVP

Monday, October 8, 2018

Provisioning a Microsoft Dynamics 365 Business Central demo tenant

With the release of Microsoft Dynamics GP 2018 R2, I had a first look at the update process, then quickly ran into a situation where the option to set up intelligent cloud insights wasn't showing in my already provisioned Microsoft Dynamics 365 Business Central tenant - See, "Set up Intelligent Cloud" option not showing for Microsoft Dynamics 365 Business Central tenants.

I also noted that this option was available with the October '18 Release and that the current tenants would be undergoing upgrades within the next 30 days. In the meantime, if you want to familiarize yourself with the process of deploying intelligent insights, you will need to provision a demo tenant, built on the October '18 release.

Here's how you do this:

1. Launch your browser and enter https://demos.microsoft.com in the address bar. This will take you to the Microsoft demo site. The site can be accessed either with your Microsoft account or your Work account (organizational or Azure AD authentication).

Microsoft Demos (https://demos.microsoft.com)
NOTE: If using your work account, you must be a Microsoft partner or MVP to access the site.

2. Once on the Demos site main page, click on My Environments on the upper menu bar.
Main Demos page
3. Once on the Environments page, click on Create a Tenant.

Environments page

4. On the Tenant page, locate the Microsoft 365 Enterprise with Users and No Content section, then click on Create Tenant.

Tenant page

NOTE: You can provision a tenant for a period of 90 days, which should give you plenty of time to get started.

5. Once the environment has been provisioned, please take note of the admin user and password you have been assigned as this will be needed to complete the additional steps.


6. Launch your browser and open an InPrivate session (Edge) or Incognito session (Chrome). In the address bar, type https://home.dynamics.com.

home.dynamics.com
Enter the admin credentials assigned to you in the previous step. The environment will then be prepared for you:



7. Once the environment has been prepared, proceed to click on Get more apps.

Get more apps

8. Type "Business Centr.." in the search bar to narrow down the list of apps to Business Central, then select Microsoft Dynamics 365 Business Central (Trial), to initiate the free 30-day trial.

Business Central app
9. Now, all you need to do is agree to the terms of use and privacy terms, and you are on your way to provisioning a Business Central tenant.

Terms of Use and Privacy Terms agreement
You will have 30 days to work with this tenant before it expires. I assume if you need more time, you can delete the demo environment and start all over again. However, I find it more beneficial to simply have your own Business Central tenant as existing tenants should see the October '18 release applied to them anytime now.

Hope you find this useful.

Until next post,

MG.-
Mariano Gomez, MVP

Wednesday, October 3, 2018

"Set up Intelligent Cloud" option not showing for Microsoft Dynamics 365 Business Central tenants

I was (still am!) seriously excited about the release of Microsoft Dynamics GP 2018 R2 - I mean, how can you not? This Intelligent Cloud thing is just superb! The simple premise of being able to sync your data to Microsoft Dynamics 365 Business Central and use the power of the cloud to get actionable insights is just too good to pass up on.

See my article, Microsoft Dynamics 365 Business Central: the perfect complement for Microsoft Dynamics GP users.

In fact, I already had an Office 365 Business Premium tenant provisioned with an instance of Dynamics 365 Business Central for IW. This is my playground, where I learn about the new stuff, where I try to build up my AL development skills, etc., (albeit at a turtle's pace 😊).

After updating my GP 2018 RTM instance to GP 2018 R2, I proceeded to set up Intelligent Cloud Insights which, in a nutshell, involves setting up an integration between my on-premises Microsoft Dynamics GP application databases and Microsoft Dynamics 365 Business Central using an Azure Data Factory - a hybrid data integration (ETL) cloud service - and an integration runtime that allows you to control which companies within a Business Central tenant get synced to which companies in Microsoft Dynamics GP, and when.

The steps are fairly clear in the Microsoft Docs article, Connect to the intelligent cloud with Dynamics 365 Business Central. However, when I went to the Assisted Setup page to locate the Set up Intelligent Cloud option, I could not find that particular entry.

Business Central Assisted Setup guide

Upon exchanging a few emails with the Microsoft Dynamics GP Product Management team, it was noted that this feature - Set up Intelligent Cloud - is part of the Dynamics 365 October '18 release and is not in the currently deployed Business Central tenants. The October '18 release should be out any time now and once that update is applied, you should gain the ability to enable intelligent insights for your Microsoft Dynamics GP 2018 R2 on-premises instances.

In the meantime, you can head over to Microsoft Demos to provision a Business Central demo environment with the October '18 release features that you can use to test and familiarize yourself with intelligent insights.

Until next post,

MG.-
Mariano Gomez, MVP

Updating to Microsoft Dynamics GP 2018 R2 - First look

Yesterday marked the release of Microsoft Dynamics GP 2018 R2 to the English-speaking market - other languages are soon to follow. After a few hiccups in the download process (make sure you clear your browser cache before attempting to download), I was able to get my hands on the 1.81 GB zip file containing the actual installation media (1.92 GB unzipped).

Since this is not a major release, I went straight for the Updates folder on the image and executed the MicrosoftDynamicsGP18-KB4458410-ENU.msp file. The msp file was also published separately on the Service Pack, Hotfix, and Compliance Update Patch Releases for Microsoft Dynamics GP 2018 page on PartnerSource and CustomerSource; it probably offers a better download experience and a smaller footprint if all you need is to update an existing Microsoft Dynamics GP 2018 instance.

Applying the msp is a fairly straightforward process. However, if you need instructions for a full install, take a look at my article Microsoft Dynamics GP 2018 installation - First Look, which shows the full installation process. Apparently, there were some updates to the Dexterity Shared Components, so you will see the bootstrap setup in action, working to update those.

The system database update went extremely fast on both my server and client environments and I couldn't be happier with this initial experience. Next up was the Fabrikam database.


NOTE: For this update, I have two environments: a Windows Server 2016 box running SQL Server 2014 on a Dell PowerEdge 1950 with 32 GB of RAM and dual 2.24 GHz Xeon processors, and a Windows 10 machine running SQL Server 2014 on a Dell XPS L701X with 16 GB of RAM and SSD drives, both with Fabrikam and all Mekorma products and test data, totaling close to 550 MB each. And while for many, Fabrikam is not a good indicator of how upgrades will turn out, I've learned over the years that it is a very good predictor.

The Fabrikam update completed in or around 15 minutes on both my server and my laptop machine. I didn't find this to be out of the norm considering I was dealing with a 550MB database, in both cases.

Upon launching Dynamics GP, things will seem very normal until you take a closer look at the homepage, where you will notice a new link in the homepage header for Intelligent Cloud Insights.

Dynamics GP Homepage
Clicking on this link will bring up the new Business Central Intelligent Cloud for On-Premises Customers page, which can be accessed directly via your browser here, and where you can begin setting up the Intelligent Cloud by clicking on the TRY IT OUT button.

Intelligent Cloud Insights page

My fellow Microsoft Business Applications MVP Jen Kuntz has a comprehensive write-up on how to configure the Intelligent Cloud here, so I will not duplicate her work.

NOTE: I don't know if this will become a recurring problem, but I ran into some page scripting issues; clicking Yes to continue running the scripts on the page seems to work every time, although it is annoying. My friend Jen Kuntz seems to have run into this as well.

Intelligent Cloud Insights page scripting error
It's worth noting that you can control whether the standard homepage or the Intelligent Cloud Insights page is displayed when you switch between options on the Navigation Bar, via the homepage customization link, Customize this page...

Customize Home Page window
One thing to note...

After launching Dynamics GP and logging into the system, messages did not get re-added to the Alert Messages table (SY01700). This was actually a time saver and improved the overall experience -- oh, the small things in life 😊.

Until next post,

MG.-
Mariano Gomez, MVP

Thursday, August 16, 2018

Calling SQL Server stored procedures from Microsoft Dexterity - revisited

Back in January of 2009, I wrote an article showing a method of calling a SQL Server stored procedure from Dexterity by calling a stored procedure prototype script (sproc) in sanScript.

See, Calling SQL Server stored procedures from Microsoft Dexterity (January 26, 2009).

This sproc tells the runtime engine (Dynamics.exe) that it must connect to either the system or the company database to run the stored procedure of the same name. I particularly like this method because it basically shields the developer from dealing with connections, etc. Also, the procedure executes under the user's security context.

However, there are times when it is necessary to go the extra mile, especially if you have to call procedures you do not want to prototype into your code as a sproc - case in point, calling third-party stored procedures or eConnect stored procedures.

Enter the use of Dexterity SQL library functions and pass-through SQL. The Dexterity SQL library functions are well documented in the help file. If you are familiar with ADO.NET or standard ADO, the steps are pretty similar:

1. Create a connection and set the database context for the stored procedure (akin to a connection string).
2. Create your prepared SQL statement to run on the connection, including parameters.
3. Execute the statement and fetch the recordset.


Here's a simple example (for simplicity's sake, only basic error handling is implemented):

1) Write your stored procedure and grant execute access to the SQL Server DYNGRP role. This will make your stored procedure accessible from your Dexterity application, avoiding SQL Server permission issues. Our stored procedure accepts a service call number and a service type as parameters, then returns the total amount to be billed on the service call.

dbo.uspGetServiceCallTotal

IF OBJECT_ID ( 'dbo.uspGetServiceCallTotal', 'P' ) IS NOT NULL
  DROP PROCEDURE dbo.uspGetServiceCallTotal
GO

CREATE PROCEDURE dbo.uspGetServiceCallTotal
  @IN_callnbr char(20) = NULL,
  @IN_service_type smallint,
  @INOUT_serviceTotal numeric(19,5) output
AS

SELECT @INOUT_serviceTotal = TOTAL FROM dbo.SVC00200
WHERE CALLNBR = @IN_callnbr and SRVTYPE = @IN_service_type
GO

GRANT EXECUTE ON dbo.uspGetServiceCallTotal TO DYNGRP
GO

2) The following is a sample Dexterity calling script that retrieves the output parameter result as a data set and issues a simple warning with the returned value:

in string CallNumber;
in integer CallType;

local long sqlConn;
local long sqlStatus;
local text sqlStmt;
local string paramlist = "%1, %2, %3 output;";
local string outparam1  = "@outparam";
local string outparam1Type = "numeric(19,5)";
local currency lServiceTotal;
 
sqlStatus = SQL_Connect(sqlConn);
if sqlStatus = OKAY then
     clear sqlStmt;
 
     {setup the output parameter declaration line}
     sqlStmt = sqlStmt + "declare " + outparam1 + CH_SPACE + outparam1Type + CH_SEMICOLON + CRLF;
 
     {setup parameter list to pass into proc}
     substitute paramlist, SQL_FormatStrings(CallNumber), str(CallType), outparam1 + CRLF;  
     sqlStmt = sqlStmt + "exec dbo.uspGetServiceCallTotal" + CH_SPACE + paramlist + CRLF;
     sqlStmt = sqlStmt + "select " + outparam1;
     
     sqlStatus = SQL_Execute(sqlConn, sqlStmt);
     if sqlStatus = OKAY then
           sqlStatus = SQL_FetchNext(sqlConn);
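           {SQL_FetchNext returns a status of 31 when there are no more rows to retrieve}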
           
           if sqlStatus <> 31 then
                sqlStatus = SQL_GetData(sqlConn, 1, lServiceTotal);
                warning str(lServiceTotal);
           end if;
     end if;
end if;


One of the things the runtime engine will do is run this procedure within the context of the current Microsoft Dynamics GP user connection to SQL Server; that is, another connection will not be created unless it's absolutely necessary. The technique shown above also declares the parameter list as a string with substitution patterns to reduce the chances of SQL injection. If your SQL procedure returns a larger data set, you will need a while...do loop to advance through the records in the recordset.

Hope you find this useful.

Until next post!

MG.-
Mariano Gomez, MVP

Friday, July 27, 2018

Missing Dexterity Shared Components cause Word Templates processing to hang

Working recently on an issue, I ran into a situation where I would get the following template processing error when attempting to print a SOP invoice - if you look at the bottom status bar, next to the user ID, you will see a Template Processing status accompanied by a yellow warning triangle.


Upon inspecting the Template Processing status, I ran into the following Exception Detail message:

The following error occurred while processing this report:
One or more templates could not be processed. View the exception log for more details.


Since the error was referencing a template, I figured I would go back and check to make sure I could edit the template, which was based on the original SOP Blank Invoice Form Template. Upon attempting to modify the template, I received an error:

File Not Found:"C:\Users\\AppData\Local\Temp\SOP Blank Invoice Form Template.docx".




After attempting to print the template, and after all the subsequent tell-tale errors, I attempted to exit Dynamics GP and got the following warning:

Processes are currently being run that cannot be deleted. These processes must be run to completion before you exit.



At this point, I went through the checklist:

  • Microsoft Dynamics GP Add-In for Microsoft Word is installed - Check!
  • Microsoft Office 32 bit is installed - Check!
  • Open XML SDK 2.0 for Microsoft Office is installed - Check!

One thing I noticed was that, when I opened Word, the Microsoft Dynamics GP fields object would not appear on the Microsoft Word ribbon. This was a sign that something was not right, considering I had the GP Add-In for Microsoft Word installed.

I ran a repair on Microsoft Dynamics GP and this did not solve the problem either. Upon further inspection, I noticed I was missing the Dexterity Shared Components 16.0 (64-bit). In my system, I run several versions of Microsoft Dynamics GP and I had Dexterity Shared Components for all but the Microsoft Dynamics GP 2016 version I was trying to print the invoice from. 

Once the Dexterity Shared Components were installed, I gave it another try and everything worked as expected.

Until next post,

MG.-
Mariano Gomez, MVP