Thursday, July 11, 2019

Developing PowerApps and Flow apps with Azure Cognitive Services' Computer Vision

This is a session I presented at the recent Azure Virtual Day Camp, hosted by Dynamic Communities' D365UG, on June 26, 2019.

In this presentation I show how to leverage Azure Cognitive Services' Computer Vision service with PowerApps and Flow to build a badge scanner. In addition, I show how to use some simple but powerful PowerApps and Flow functions to do string and image manipulation.

 

NOTE: During the live presentation I had a few audio irregularities. I did my best to edit the video to remove the moderator's shouts checking whether I was still around :) However, I don't think they take away from the ease of understanding the session.

Until next post!

MG.-
Mariano Gomez, MVP

Wednesday, July 10, 2019

#PowerApps - A look at the new JSON() serializer function

In this video I take a look at the newly released JSON() serialization function, added to PowerApps in June, and how it can be used to serialize an image. You can read the full details in the PowerApps Blog article, JSON for canvas apps.
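As a quick illustration of the syntax (the control and variable names here are my own, not from the video), serializing the photo captured by a camera control from a button's OnSelect formula might look like this:

// Serialize the captured photo, embedding the binary content as a base64 data URI
Set(varPhotoJson, JSON(Camera1.Photo, JSONFormat.IncludeBinaryData))

The IncludeBinaryData format option is what allows the image bytes to be encoded into the resulting JSON text; note that JSON() with binary data is limited to behavior formulas such as OnSelect.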



Also, take a look at the video created by PowerApps MVP April Dunnam on the subject, along with her intro to AI Builder.

Until next post!

MG.-
Mariano Gomez, MVP

Saturday, June 29, 2019

#PowerApps #PowerBI: Cash flow in Power BI and PowerApps - THR1014

At this past Microsoft Business Applications Summit 2019 in Atlanta, I teamed up with fellow Microsoft MVP Belinda Allen to deliver a 20-minute theater session showing how to build a cash flow report in Power BI, use Microsoft Flow to get notifications based on specific conditions in the report, and embed the cash change calendar as a tile in a PowerApps application.

We had a lot of fun putting this presentation together, so I hope you enjoy it.



Until next post!

MG.-
Mariano Gomez, MVP

Sunday, June 9, 2019

#PowerApps: Componentizing Google Maps

Hi PowerAppers!

PowerApps Canvas Apps Components remain one of those features that seem to have endless possibilities. I have, for quite some time now, been working with the Google Maps API in many of my applications to return a static image pointing to a specific location on the map.

As I moved from application to application, I noticed I always ended up performing the same operations: formatting a label with a string representing the maps API Url, substituting specific elements within that string with address information or latitude and longitude coordinates, adding the API key, and adjusting the size of the map image returned by the maps API. Frankly, it was a time-consuming process that added no value to my projects.

This is a sample Google maps API Url string with some of the classic substitution patterns:

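A representative example (the exact parameters and their order here are illustrative, not exhaustive; every value in curly braces is a placeholder that gets substituted at runtime):

https://maps.googleapis.com/maps/api/staticmap?center={Location}&zoom={Zoom}&size={Width}x{Height}&markers=color:{MarkerColor}%7C{Location}&key={ApiKey}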

In order to solve this problem, I decided to farm this out to Canvas Apps Components. Since my Url string had many placeholders for things like location, zoom values, image size, API key, and even the color of the marker, I figured all those elements could become custom input properties to the component, as shown below:

Component Custom Properties

We could then proceed to add an image control from the media gallery, and set the Image property to our formatted Url with the custom properties as placeholders, as shown below:

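As a rough sketch of that Image formula, assuming the component is named GoogleMap and exposes custom input properties along the lines of MapLocation, MapZoom, MarkerColor, and ApiKey (only MapImgWidth and MapImgHeight are named in this post, so treat the other property names as placeholders to rename to your own):

// Image property of the image control inside the component
"https://maps.googleapis.com/maps/api/staticmap?center=" & EncodeUrl(GoogleMap.MapLocation) &
    "&zoom=" & GoogleMap.MapZoom &
    "&size=" & GoogleMap.MapImgWidth & "x" & GoogleMap.MapImgHeight &
    "&markers=color:" & GoogleMap.MarkerColor & "%7C" & EncodeUrl(GoogleMap.MapLocation) &
    "&key=" & GoogleMap.ApiKey

EncodeUrl() takes care of escaping spaces and commas in the address, and the & operator coerces the numeric zoom and size properties to text automatically.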
Finally, we need the component size to grow along with the map size. Here's an idea: you may want to introduce a buffer between the map size and the component size, which would visually act as a border. For this, we will set the component Width and Height properties to the GoogleMap.MapImgWidth and GoogleMap.MapImgHeight custom property values, and we are done.

To test this all, I created a simple test harness screen:


Now that the map is an actual component, it's easy to move it across PowerApps apps using the new import from cloud applications feature.

You can download the component from the PowerApps Community Apps Gallery, here.

Related Articles:

#PowerApps: Numeric Up/Down control with persisted button press event using components

Until next post!

MG.-
Mariano Gomez, MVP

Wednesday, May 29, 2019

#MSDYN365BC: Building a Development Environment for Microsoft Dynamics GP ISVs - Selecting a Source Control Provider

So far, I have covered the process of setting up the Microsoft Dynamics 365 BC containerized application components, along with the VS Code IDE and AL language extensions. We also built the "Hello World" extension on BC's Customer List page and deployed it to our container by following some simple debugging steps. You can read more about it in the following articles:

#MSDYN365BC: Building a Development Environment for Microsoft Dynamics GP ISVs Part 1/3
#MSDYN365BC: Building a Development Environment for Microsoft Dynamics GP ISVs Part 2/3
#MSDYN365BC: Building a Development Environment for Microsoft Dynamics GP ISVs Part 3/3
#MSDYN365BC: Building a Development Environment for Microsoft Dynamics GP ISVs - Installing Visual Studio Code

The purpose of today's article is to show you how to set up a source code control provider to host your projects in a multi-developer environment. If you are a Microsoft Dexterity developer, you are probably familiar with using source code control repositories like Visual SourceSafe, Team Foundation Server, or even Azure DevOps Repos. I wrote an entire series of articles on the subject, which you can review here:

#DevOps Series: Microsoft Dexterity Source Code Control with Visual Studio Team Services



Downloading and Installing Git


To begin, we will need Git. Git is the distributed version control system that VS Code's built-in source control integration relies on to manage code in both Azure DevOps and GitHub, depending on your source code control repository preferences. You can begin the download process by going directly to https://www.git-scm.com/download/win


1. Launch the Git installer and accept the license agreement.

2. In the Select Components window, choose to add a Git icon to your desktop. It just makes life a lot easier. Note that you can also integrate Git with File Explorer, so you can open configuration files and Git Bash directly.


3. In Choosing the default editor used by Git, make sure to select VS Code, as this is the primary tool used for AL development anyway. This way, you will have one place to develop your code and handle any source code repository administrative tasks.


4. For the PATH environment variable, you will want to make sure you can run Git from the Windows command line (DOS prompt) and from any third-party software, including VS Code.


5. Accept the default here and move on. Of course, if you have a domain environment and want to issue and validate certificates against your Windows Certificate Store, then make sure to choose the Windows Secure Channel library option.


6. I honestly don't have a preference here, but if I had to guess, the Checkout Windows-style, commit Unix-style line endings option is probably more suitable for code being checked into your repository.


7. For the terminal emulator setting, we will use Windows' default console window.


8. Here, I only accepted the defaults.
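With the installer done, it's a good idea to confirm Git is reachable from a regular command prompt or PowerShell window and to set your identity, since every commit you make is stamped with it. A minimal check, substituting your own name and email address:

git --version
git config --global user.name "Your Name"
git config --global user.email "you@example.com"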



Connecting to Azure Repos

There are tons of resources on the net explaining how to point VS Code to an Azure DevOps or GitHub repository, depending on your organization's preference. This particular article will explain how to integrate with Azure Repos.

1. Open VS Code and click on the Extensions button. Type "Azure Repos" to locate and install that particular extension.

Azure Repos extension
2. For the final part of this article, you can see the complete process in the following video I prepared showing how to create an Azure DevOps project, synchronize the repository with VS Code, and add your first project files to the repository.
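If you prefer to follow along from the command line rather than the video, the basic round trip against an Azure Repos repository looks roughly like this; the organization, project, and repository names are placeholders, and newer repositories may use main instead of master as the default branch:

git clone https://dev.azure.com/{organization}/{project}/_git/{repository}
cd {repository}
# copy or create your AL project files in this folder, then stage, commit, and push them
git add .
git commit -m "Initial AL project files"
git push origin master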


Until next post!

MG.-
Mariano Gomez, MVP

Tuesday, April 30, 2019

#MSDYN365BC: Building a Development Environment for Microsoft Dynamics GP ISVs - Installing Visual Studio Code

In the previous three articles of the series, I talked about the rationale for selecting a container-based environment for development purposes; we also installed Docker and downloaded and installed the Microsoft Dynamics 365 Business Central containers for Docker. This set us on a path to installing the development IDE and selecting a source code control provider to host our AL solutions.

See:

#MSDYN365BC: Building a Development Environment for Microsoft Dynamics GP ISVs Part 1/3
#MSDYN365BC: Building a Development Environment for Microsoft Dynamics GP ISVs Part 2/3
#MSDYN365BC: Building a Development Environment for Microsoft Dynamics GP ISVs Part 3/3

This article in particular will focus on the installation of Visual Studio Code (VS Code) and the AL language extensions for the environment.


Installing VS Code

VS Code is to BC developers what Dexterity is to Dynamics GP developers. VS Code provides the IDE required to host the AL language extension used to develop solutions that integrate with BC. Although SanScript is built into the Dexterity IDE, the analogy still holds.

1. To get started, download VS Code from https://code.visualstudio.com



2. Click on the Download button on the upper right corner of the menu bar. You will then select the Windows 7, 8, 10 option to download the actual installer.


3. Once you have downloaded the installer, run the executable. This will launch the wizard, which will walk you through a set of guided prompts to complete the installation.


4. Acknowledge the license agreement and click on Next to continue. You will then be asked to enter a new installation folder or accept the default - personally, I find that the defaults work best.


5. The installation process will then lay down all the files and register all appropriate components so you can begin using the application.


6. Once the installation is complete, click on Finish. This should launch VS Code if you left the launch option checked.


Installing AL language components and utilities

One of the aspects I like about VS Code is the extensions concept. Extensions are simply plug-ins that augment the VS Code environment. One such extension is the AL Language extension, created by Microsoft.

1. Click on the Extensions button on the left Activity Bar (square button). Type "AL" in the search bar to proceed. This should surface the "AL Language" extension by Microsoft. Click on Install to add this extension to VS Code.


2. Install PowerShell by Microsoft. Following a similar process, click on the Extensions button and type PowerShell. If you prefer working in the PowerShell ISE environment or from the PowerShell command prompt, that's entirely up to you, but know that there's a PowerShell extension for VS Code, which brings the entire language into the VS Code IDE.


3. Install GitLens by Eric Amodio. Following a similar process, click on the Extensions button and type GitLens. With GitLens you can visualize code authorship at a glance via Git blame annotations and code lens, seamlessly navigate and explore Git repositories, gain insights via powerful comparison commands, and much more.


4. Install Insert GUID by Heath Stewart. Insert GUID is a simple command extension for Visual Studio Code to insert globally unique identifiers (GUIDs) into the Code text editor in a variety of formats.


5. Install Docker Explorer by Jun Han. With Docker Explorer you can manage Docker containers, Docker images, Docker Hub and Azure Container Registry right from VS Code.



"Hello World"

The "Hello World" project serves to test the entire installation up to this point and is the first foray into the world of AL extensions.

1. Press Ctrl+Shift+P on your keyboard to open the VS Code Command Palette (Alternatively, you can choose View | Command Palette from the menu). Type AL:Go! to locate the option to create an AL project.



2. Enter a local folder where you would like to store the project. In this case, I simply removed the last portion of the folder name and replaced it with HelloWorld.


3. You will immediately be prompted to select the server type you will be running this project against. Since we've deployed the local containers, it's safe to say we can choose Your Own Server from the drop-down.


The above operation results in the creation of a launch.json file that is added to the project.


4. Proceed to replace the server name, which defaults to localhost, with the name assigned to your BC container, in this case http://demo-bc. Also change the server instance from the default, BC130, to NAV.
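For reference, after those two changes the relevant portion of launch.json looks roughly like the sketch below; your generated file may differ slightly depending on the AL extension version, and the startupObjectId of 22 simply points at the Customer List page.

{
    "version": "0.2.0",
    "configurations": [
        {
            "type": "al",
            "request": "launch",
            "name": "Your own server",
            "server": "http://demo-bc",
            "serverInstance": "NAV",
            "authentication": "UserPassword",
            "startupObjectId": 22,
            "startupObjectType": "Page"
        }
    ]
}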

Press Ctrl+Shift+P and type AL:Download Symbols to retrieve the symbol packages needed to compile and debug against the container. More information on AL symbol packages here.


5. Press Ctrl+Shift+B on your keyboard to compile the project and create the publishing package for our "Hello World" extension.


This extension simply adds a trigger to the OnOpenPage() event of the Customer List page that displays the message "App published: Hello world". The page is loaded by default as specified in the launch.json file.
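For those following along without the screenshots, the template code that AL:Go! scaffolds is along these lines; the object number 50100 and the extension name are just defaults and can be anything within your allowed object range:

pageextension 50100 CustomerListExt extends "Customer List"
{
    trigger OnOpenPage();
    begin
        // Runs every time the Customer List page is opened in the client
        Message('App published: Hello world');
    end;
}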


6. Press F5 on your keyboard to launch the application in debugging mode. This should launch BC and present the message above.


Once the message has been cleared, the application will continue to load the Customer List.




In the next article, I will talk about connecting to a source code repository and what else we need to get our environment fully ready. I will also cover some techniques for working with AL files and folders that should feel familiar to Microsoft Dynamics GP developers, and how we can leverage our Dexterity knowledge to administer our projects.

Until next post!

MG.-
Mariano Gomez, MVP

Thursday, April 18, 2019

#MSDYN365BC: Building a Development Environment for Microsoft Dynamics GP ISVs Part 3/3

In Part 2 of this series, we covered the full installation of Docker Desktop, used to run the Dynamics 365 Business Central containers. We also saw how to use PowerShell to enable both the Hyper-V and Containers features on Windows 10.

This article will focus on the installation and troubleshooting of the Dynamics 365 Business Central containers and will provide step-by-step instructions on how to accomplish this. Remember, there are quite a few resources out there, so here they are:

Get started with the Container Sandbox Development Environment
Running a Container-Based Development Environment

But the goal of this series is to help Microsoft Dynamics GP ISVs draw similarities and contrasts with their multi-developer Microsoft Dexterity development environments.


Now that Docker has been installed, we can proceed to lay down the BC containers. This will create a fully virtualized environment with all the BC components needed for development purposes. This equates to having a full environment with Microsoft Dynamics GP, Web Client, IIS, and SQL Server in place for developers to code against.


Business Central Containers Installation and Troubleshooting

1. To begin the installation, we must install the NavContainerHelper PowerShell module from the PowerShell Gallery, which contains a number of PowerShell functions that help you run and interact with the BC containers.

See NavContainerHelper from Freddy Kristiansen for additional information.

Install-Module NavContainerHelper -force
In the process of installing the NavContainerHelper module, you will be asked to add the latest NuGet provider to be able to retrieve published packages. After installing the NuGet provider, I went to import the NavContainerHelper module and ran into the following error, indicating that running scripts was disabled on the system I was installing on.


By running the Get-ExecutionPolicy command, I was able to identify that all PowerShell execution policy scopes on my machine were set to Undefined, which means the effective policy falls back to the default and prevents scripts from being executed.

Get-ExecutionPolicy
Since I was installing this on my local machine, I simply wanted to bypass any restrictions within the current user scope.

Set-ExecutionPolicy
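For reference, the two commands behind those screenshots are along these lines; Bypass matches what I did for my own profile, though RemoteSigned is a reasonable, slightly stricter alternative:

# Show the effective execution policy for every scope
Get-ExecutionPolicy -List

# Lift the restriction for the current user only, leaving the machine-wide policy untouched
Set-ExecutionPolicy -ExecutionPolicy Bypass -Scope CurrentUser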
2. With the installation of the NuGet provider and the changes to the script execution policies in place, it was time to call Import-Module to add the NavContainerHelper module.


Importing the module is a quick step.
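The command itself is a one-liner, and listing the module's exported commands is an easy way to confirm it loaded:

Import-Module NavContainerHelper
Get-Command -Module NavContainerHelper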

3. Finally, it's time to create the BC containers. This is done by calling the New-NavContainer function (from the NavContainerHelper module). You will be prompted to create a user name and password to access the container and BC once installed. Here's the full call:

New-NavContainer -accept_eula -containerName "Demo-bc" -accept_outdated -imageName "microsoft/bcsandbox:us" -auth NavUserPassword -includeCSide -UpdateHosts -doNotExportObjectsToText


New-NavContainer
4. The container files are downloaded onto disk and are extracted.



5. Once all the files are extracted, the container is initialized by Docker. If all goes well, you should see a message letting you know that the container was successfully created.

Container created successfully
If you close the PowerShell window, you will notice a new set of icons on your desktop that will allow you to load BC running on the container, as follows:


  • Demo-bc Web Client: shortcut to the BC web client application
  • Demo-bc Command Prompt: access to the container command prompt
  • Demo-bc PowerShell: access to the PowerShell prompt running on the container
  • Demo-bc Windows Client: launches the Microsoft Dynamics NAV on-premises client
  • Demo-bc WinClient Debugger*
  • Demo-bc CSIDE: launches the CSIDE development environment for BC.


Desktop after a successful BC container deployment
Double-click on the Demo-bc Web Client icon to test the container deployment.
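If you also want to verify things from the command line, a couple of standard Docker commands will confirm the container is running and let you review its startup output; adjust the name if Docker reports a different casing for your container:

docker ps            # the Demo-bc container should be listed with an Up status
docker logs Demo-bc  # review the container's startup output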

With the installation of Docker and the BC containers, we have completed all the supporting environment setup. Be sure to play around with the new options, in particular with both the BC web client and Windows client components. It is important that you begin to gain an understanding of the functional aspects of the application before you embark on developing for this platform - no different from what you already did for Dynamics GP.

We are not quite done here, but since I am supposed to be a rational human being and respect the number of parts I chose for this series, I will close out this topic with a new series showing how to add Visual Studio Code and how to select and connect to a source control repository, so bear with me.

Until next post!

MG.-
Mariano Gomez, MVP