Tuesday, April 30, 2019

#MSDYN365BC: Building a Development Environment for Microsoft Dynamics GP ISVs - Installing Visual Studio Code

In the previous three articles of the series, I talked about the rationale for selecting a container-based environment for development purposes, installed Docker, and downloaded and deployed the Microsoft Dynamics 365 Business Central containers for Docker. That set us on a path to installing the development IDE and selecting a source code control provider to host our AL solutions.

See:

#MSDYN365BC: Building a Development Environment for Microsoft Dynamics GP ISVs Part 1/3
#MSDYN365BC: Building a Development Environment for Microsoft Dynamics GP ISVs Part 2/3
#MSDYN365BC: Building a Development Environment for Microsoft Dynamics GP ISVs Part 3/3

This article in particular will focus on the installation of Visual Studio Code (VS Code) and the AL language extension for the environment.


Installing VS Code

VS Code is to BC developers what Dexterity is to Dynamics GP developers: it provides the IDE required to incorporate the AL language extension used to develop BC integrating solutions. Although SanScript is integrated into the Dex IDE, the analogy still holds.

1. To get started, download VS Code from https://code.visualstudio.com



2. Click on the Download button in the upper right corner of the menu bar, then select the Windows 7, 8, 10 option to download the actual installer.


3. Once you have downloaded the installer, run the executable. This initiates a wizard that walks you through a set of guided prompts to complete the installation.


4. Acknowledge the license agreement and click on Next to continue. You will then be asked to enter a new installation folder or accept the default - personally, I find that the defaults work best.


5. The installation process will then lay down all the files and register all appropriate components so you can begin using the application.


6. Once the installation is complete, click on Finish. This should launch VS Code if you left the checkbox selected.


Installing AL language components and utilities

One of the aspects I like about VS Code is the extensions concept. Extensions are simply plug-ins that augment the VS Code environment. One such extension is the AL Language extension, created by Microsoft.

1. Click on the Extensions button on the left Activity Bar (square button). Type "AL" in the search bar to proceed. This should surface the "AL Language" extension by Microsoft. Click on Install to add this extension to VS Code.


2. Install PowerShell by Microsoft. Following a similar process, click on the Extensions button and type PowerShell. If you prefer working in the PowerShell ISE environment or from the PowerShell command prompt, that's entirely up to you, but know there's a PowerShell extension for VS Code, which brings the entire language into the VS Code IDE.


3. Install GitLens by Eric Amodio. Following a similar process, click on the Extensions button and type GitLens. With GitLens you can visualize code authorship at a glance via Git blame annotations and code lens, seamlessly navigate and explore Git repositories, gain insights via powerful comparison commands, and much more.


4. Install Insert GUID by Heath Stewart. Insert GUID is a simple command extension for Visual Studio Code to insert globally unique identifiers (GUIDs) into the Code text editor in a variety of formats.


5. Install Docker Explorer by Jun Han. With Docker Explorer you can manage Docker containers, Docker images, Docker Hub and Azure Container Registry right from VS Code.
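
If you prefer to script your setup, the code command-line interface that ships with VS Code can install all five extensions in one pass. Here is a minimal sketch; the extension identifiers are the marketplace IDs as I understand them at the time of writing, so verify them in the Extensions view if an install fails:

code --install-extension ms-dynamics-smb.al             # AL Language
code --install-extension ms-vscode.powershell           # PowerShell
code --install-extension eamodio.gitlens                # GitLens
code --install-extension heaths.vscode-guid             # Insert GUID
code --install-extension formulahendry.docker-explorer  # Docker Explorer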



"Hello World"

The "Hello World" project serves to test the entire installation up to this point and is the first foray into the world of AL extensions.

1. Press Ctrl+Shift+P on your keyboard to open the VS Code Command Palette (Alternatively, you can choose View | Command Palette from the menu). Type AL:Go! to locate the option to create an AL project.



2. Enter a local folder where you would like to store the project. In this case, I simply removed the last portion of the folder name and replaced it with HelloWorld.


3. You will immediately be prompted to select the server type you will be running this project against. Since we've deployed the local containers, it's safe to say we can choose Your Own Server from the drop-down.


The above operation results in the creation of a launch.json file that is added to the project.


4. Proceed to replace the server name, which defaults to localhost, with the name assigned to your BC container, in this case http://demo-bc. Also change the instance from the default, BC130, to NAV.
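
For reference, after those edits the relevant configuration in launch.json should look roughly like this; this is a sketch based on the default template the AL extension generates, so property names may vary slightly between extension versions:

{
    "version": "0.2.0",
    "configurations": [
        {
            "type": "al",
            "request": "launch",
            "name": "Your own server",
            "server": "http://demo-bc",
            "serverInstance": "NAV",
            "authentication": "UserPassword",
            "startupObjectId": 22
        }
    ]
}

Note that startupObjectId 22 is the Customer List page, which is why that page loads by default when the project is published.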

Press Ctrl+Shift+P and type AL:Download Symbols to retrieve the symbol packages the compiler and debugger require. More information on AL symbol packages here.


5. Press Ctrl+Shift+B on your keyboard to compile the project and create the publishing package for our "Hello World" extension.


This extension simply adds a trigger to the OnOpenPage() event of the Customer List page that displays the message "App published: Hello world". The page is loaded by default as specified in the launch.json file.
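
The generated HelloWorld.al file looks roughly like this; a sketch of the default AL:Go! template, so the object number may differ in your environment:

pageextension 50100 CustomerListExt extends "Customer List"
{
    trigger OnOpenPage();
    begin
        // Fires every time the Customer List page is opened.
        Message('App published: Hello world');
    end;
}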


6. Press F5 on your keyboard to launch the application in debugging mode. This should launch BC and present the message above.


Once the message has been cleared, the application will continue to load the Customer List.




In the next article, I will talk about connecting to a source code repository and what else we need in order to get our environment fully ready. I will also cover some techniques for working with AL files and folders that should feel familiar to Microsoft Dynamics GP developers, and how we can leverage our Dexterity knowledge to administer our projects.

Until next post!

MG.-
Mariano Gomez, MVP

Thursday, April 18, 2019

#MSDYN365BC: Building a Development Environment for Microsoft Dynamics GP ISVs Part 3/3

In Part 2 of this series, we covered the full installation of Docker Desktop, used to run the Dynamics 365 Business Central containers. We also saw how to use PowerShell to enable both the Hyper-V and Containers features on Windows 10.

This article will focus on the installation and troubleshooting of the Dynamics 365 Business Central containers and will provide step-by-step instructions on how to accomplish this. Remember, there are quite a few resources out there, so here they are:

Get started with the Container Sandbox Development Environment
Running a Container-Based Development Environment

But the goal of this series is to help Microsoft Dynamics GP ISVs draw similarities and contrasts with their multi-developer Microsoft Dexterity development environments.


Now that Docker has been installed, we can effectively proceed to lay down the BC containers. This will create a fully virtualized environment with all the BC components needed for development purposes. This equates to having a full environment with Microsoft Dynamics GP, Web Client, IIS, and SQL Server in place for developers to code against.


Business Central Containers Installation and Troubleshooting

1. To begin the installation, we must install the NavContainerHelper PowerShell module from the PowerShell Gallery, which contains a number of PowerShell functions that help you run and interact with the BC containers.

See NavContainerHelper from Freddy Kristiansen for additional information.

Install-Module NavContainerHelper -force
In the process of installing the NavContainerHelper module, you will be asked to add the latest NuGet provider so that published packages can be retrieved. After installing the NuGet provider, I went to import the NavContainerHelper module and ran into the following error, advising me that running scripts was disabled on the system I was attempting to install on.


By running the Get-ExecutionPolicy command, I was able to identify that all PowerShell execution policies on my machine were set to Undefined, which in turn prevents unsigned scripts from being executed.

Get-ExecutionPolicy -List
Since I was installing this on my local machine, I simply wanted to bypass any restrictions within the current user scope.

Set-ExecutionPolicy -ExecutionPolicy Bypass -Scope CurrentUser
2. With the installation of the NuGet provider and the changes to the script execution policies in place, it was time to call Import-Module to add the NavContainerHelper module.
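
The import itself is a one-liner:

Import-Module NavContainerHelper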


Importing the module is a quick step.

3. Finally, it's time to create the BC containers. This is done by calling the New-NavContainer function from the NavContainerHelper module. You will be prompted to create a user name and password to access the container and BC once installed. Worth noting: -includeCSide exports shortcuts to the classic development environment, and -UpdateHosts adds an entry for the container to your hosts file. Here's the full call:

New-NavContainer -accept_eula -containerName "Demo-bc" -accept_outdated -imageName "microsoft/bcsandbox:us" -auth NavUserPassword -includeCSide -UpdateHosts -doNotExportObjectsToText


4. The container files are downloaded to disk and extracted.



5. Once all the files are extracted, the container is initialized by Docker. If all goes well, you should see a message letting you know that the container was successfully created.

Container created successfully
If you close the PowerShell window, you will notice a new set of icons on your desktop that will allow you to load BC running on the container, as follows:


  • Demo-bc Web Client: shortcut to the BC web client application
  • Demo-bc Command Prompt: access to the container command prompt
  • Demo-bc PowerShell: access to the PowerShell prompt running on the container
  • Demo-bc Windows Client: launches the Microsoft Dynamics NAV on-premises client
  • Demo-bc WinClient Debugger*
  • Demo-bc CSIDE: launches the CSIDE development environment for BC.


Desktop after a successful BC container deployment
Double-click on the Demo-bc Web Client icon to test the container deployment.
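
You can also verify the deployment from PowerShell. A quick sketch, assuming the container name used above; Get-NavContainerIpAddress is part of the NavContainerHelper module:

docker ps                                            # the Demo-bc container should show a status of Up
Get-NavContainerIpAddress -containerName "Demo-bc"   # returns the IP address the web client resolves to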

With the installation of Docker and the BC containers, we have completed all the supporting environment setup. Be sure to play around with the new options, in particular with both the BC web client and Windows client components. It is important that you begin to gain an understanding of the functional aspects of the application before you embark on developing for this platform - nothing different from what you already did for Dynamics GP.

We are not quite done here, but since I am supposed to be a rational human being who respects the number of parts I chose for this series, I will start a new series showing how to add Visual Studio Code and how to select and connect to a source control repository to close out this topic, so bear with me.

Until next post!

MG.-
Mariano Gomez, MVP

Friday, April 12, 2019

#MSDYN365BC: Building a Development Environment for Microsoft Dynamics GP ISVs Part 2/3

In Part 1 of this series, I outlined the principles and detailed the reasoning behind why we chose to build our Microsoft Dynamics 365 Business Central development environment using Windows Docker containers.

In the Dynamics GP world, we are not quite used to containers, so let me start with the definition, straight from the horse's mouth (so to speak). According to the wizards over at Docker, "A container is a standard unit of software that packages up code and all its dependencies so the application runs quickly and reliably from one computing environment to another. A Docker container image is a lightweight, standalone, executable package of software that includes everything needed to run an application: code, runtime, system tools, system libraries and settings".

The first thing to highlight from the definition is "standard unit of software". In fact, that's key to this whole thing! Standardization ensures that every developer in the organization is building and testing code against the same reliable environment. In the Dynamics GP world, although we have the ability to build stable, reliable development environments, consistency is not always something we can achieve easily, unless we are using desktop virtualization, which intrinsically poses its own challenges.

But this article is about installing Docker. So let's get to it.

Installing Docker


Windows 10 Anniversary Update (version 1607) saw the introduction of Windows containers, a feature that allows you to install and deploy Docker and other container virtualization technologies. Follow these steps to complete a successful installation of Docker.

NOTE: from now on, most of the work will be done in PowerShell.

Enable Windows Containers feature

1. Open Windows PowerShell (not PowerShell ISE) with elevated permissions. Click on Start and type "PowerShell", then choose "Run as Administrator" to continue.



2. You must first enable Hyper-V. In PowerShell, type the following command:

Enable-WindowsOptionalFeature -Online -FeatureName Microsoft-Hyper-V -All

Enable Hyper-V via PowerShell

NOTE: If you previously installed Hyper-V, you must first uninstall it and then reinstall it using PowerShell. This means you need to back up all your Hyper-V images prior to running this command.


It is recommended to reboot your machine after this operation to allow all components to be properly registered.

3. Upon rebooting your machine, open PowerShell once more, with elevated permissions, and type the following command:

Enable-WindowsOptionalFeature -Online -FeatureName Containers -All

Enable Windows Containers via PowerShell
NOTE: If you previously installed Windows Containers, you must first uninstall the feature and then reinstall it using PowerShell. This means you need to back up all your container images prior to running this command.

It is recommended to reboot your machine after this operation to allow all components to be properly registered.

Download and Install Docker

1. To complete the Docker installation, you must first go to https://www.docker.com and choose Products | Docker Desktop from the menu.

Products | Docker Desktop

If you don't have one already, you must create a Docker account with some basic info (user name, password, and email) in order to download Docker. Next, confirm your email address by clicking the link sent to the inbox associated with the Docker account you created. You can then log into Docker Hub and download the Docker Desktop for Windows engine. By default, the installer lands in your Downloads folder, unless your browser has been configured differently.

2. Once you've gone through the account validation and download process, proceed to run the Docker installer (Docker Desktop Installer.exe). Upon launching the installer, the process begins by downloading a number of installation packages.


3. On the configuration screen, you will be prompted to select whether you want to run Windows containers or Linux containers. The choice here should be obvious, but you have the ability to change this after the fact.


4. Upon clicking OK, the installer begins to unpack all files accordingly.


5. If everything goes as expected, you will be asked to sign out and sign back into Windows.


6. After signing into Windows, the service will initiate and you will be presented with a window to enter your Docker account information. This, according to Docker, is to track application usage.


7. I don't know if this is a bug in the installer, but even after selecting to run Windows containers in step 3, I had to manually right-click on the Docker taskbar item and select to switch to Windows containers.
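
If you prefer to script that switch, Docker Desktop ships a small CLI that toggles the engine. A sketch, assuming the default installation path; -SwitchDaemon flips between the Linux and Windows engines:

& "$Env:ProgramFiles\Docker\Docker\DockerCli.exe" -SwitchDaemon   # toggle Linux <-> Windows containers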


It is always good to test Docker to ensure everything is functioning as expected. For this, we can turn to PowerShell once more and execute either of the following two commands:

docker --version
docker info

Docker version and information commands
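
For a deeper smoke test, you can pull and run a disposable container. A sketch, assuming internet access; the hello-world image publishes a Windows variant, so it runs under the Windows engine:

docker run --rm hello-world   # pulls the image, runs it, prints a confirmation, and removes the container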
These steps conclude the installation of Docker. In the next installment, we will deploy the actual Microsoft Dynamics 365 Business Central containers and prepare you for what's next.

Hope you find this useful.

Until next post!

MG.-
Mariano Gomez, MVP

Wednesday, April 10, 2019

#MSDYN365BC: Building a Development Environment for Microsoft Dynamics GP ISVs Part 1/3

This is my first foray into the world of Microsoft Dynamics 365 Business Central (BC) development and this series of articles is meant to help Microsoft Dynamics GP ISVs understand the process of building a BC development environment, identify similarities with a Dynamics GP development environment, and fully utilize your accumulated experience. Yes, there's tons of literature out there, but none have the perspective of a GP ISV, so there's that 😋

It is worth noting that I am a 20+ year Microsoft Dexterity developer and, as we say in the Dex world, "we do things a little different around here", but I am very excited to be initiating this new chapter in my career.

As you all know, at this point in my life I manage the Software Engineering team at Mekorma, and as a longtime Microsoft Dexterity developer there were a few things I knew I wanted out of this new development environment:

Isolation

From an engineering perspective, this means that each developer needs the ability to author and unit test code locally, while ensuring changes are managed centrally in our Azure DevOps source code repository. This is how we've always done it in the GP world and I did not want my engineering and development team to have to learn new paradigms or think differently about the actual process.

I know, I know... a lot of you prefer to have development images in the cloud and have developers connect to those images and develop from there. This is a personal preference and you need to evaluate what works for your development team. In our particular case, we don't want to be reliant on internet connectivity to have a developer do their work. Some of the best pieces of code have been created when folks are sitting at the beach sipping pina coladas, or while in the mountains in a cabin, so there's that.
 
Ease of Deployment

One of the things I truly dislike about the process of building Microsoft Dynamics GP integrating applications, after all these years, is the need to have several versions of Dynamics GP and Dexterity installed on each developer's machine, depending on the release of GP being targeted. If your company is anything like ours, as of this writing, we support anything from GP 2013 R2 to GP 2018 R2 and everything in between. That's a lot of software!

Having all these instances of GP involves a lot of application installation, service packs, etc., not to mention SQL Server and a variety of versions and builds of your own product, which quickly adds up in terms of time and productivity.

NOTE: We have alleviated a lot of these headaches by maintaining a single code base source code repository of our products for all versions of Dynamics GP, but that still does not mitigate the effort of installing all GP versions.

For BC, we wanted something self-contained, much simpler to maintain, that could easily be folded and recreated if needed, without burdening the developer with long winded software installations.

Resilience

Paramount to the development environment is the ability to add features to different versions of BC without having to do any sophisticated branch management. With Dexterity, you have to branch the whole project, not just specific components, in order to move to the next build. This is an issue because, over time, there are too many branches to manage. The idea of only branching the software components to be enhanced sounded very appealing, making the development environment and process resilient in the long run.

Given all these requirements, we opted to deploy Business Central Docker images, as this would provide the best of all worlds. We also reserve the online sandbox for our Sales and Support teams to test and learn new product features, while allowing us to continue developing and testing without interruptions.

The first task at hand, then, is to install Docker and download the BC container images. To keep each topic separate, please read Part 2 in this series.

Until next post,

MG.-
Mariano Gomez, MVP

Thursday, April 4, 2019

#MSDYNGP: "Database must be compatibility level 130 or higher for replication to function" when setting up #MSDYN365BC Intelligent Cloud sync

Recently, I've been honing my Microsoft Dynamics 365 Business Central (BC) skills, without leaving my beloved Microsoft Dynamics GP behind. One of the things I have been working on is making sure customers understand the BI insights gained via data replication between the two systems. As a result, I work through the replication configuration a few times a month.

Yesterday, I removed a previous Fabrikam company created via replication from BC and attempted a new replication. If you are not familiar with configuring the data replication process between GP and BC, I will be creating a video on this soon, so please stay tuned.

NOTE: The integration runtime service has also been updated, so you will probably need to download a new version.


After setting up the Integration Runtime Service and clicking Next to establish the connection between Intelligent Cloud and my on-premises GP, I received the following error:

"SQL database must be at compatibility level 130 or higher"

Knowing what the error meant, I realized my on-premises database server was SQL Server 2014, which happens to be the minimum database server requirement for Microsoft Dynamics GP 2018 R2. I couldn't change the system database compatibility level to 130, as that requires an upgrade to SQL Server 2016.
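
If you want to see where your databases stand, you can query sys.databases. A minimal sketch using the SqlServer PowerShell module's Invoke-Sqlcmd, assuming a default local instance; swap in your own server name:

# Requires the SqlServer module: Install-Module SqlServer
Invoke-Sqlcmd -ServerInstance "localhost" -Query "SELECT name, compatibility_level FROM sys.databases"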

The caveat, however, is that this replication was working at compatibility level 120 prior to my attempt at a new sync last night.

In doing some research and bouncing around a few emails, I was directed to the following article on the Community website:

Troubleshooting the Intelligent Cloud

The article seems to indicate that compatibility level 130 has been a requirement since the January 2019 release, but it also seems to suggest that this applies only to the NAV / BC replication process, not GP. In fact, as I mentioned before, just a couple of weeks ago I was able to create the replication with compatibility level 120.

As it so happened, my attempt to replicate Fabrikam occurred on April 2, 2019, which coincided with the April '19 release launch. As it turned out, this particular BC release introduced Intelligent Cloud synchronization for GP historical data. Since this version of the sync uses JSON to track changes between the previous sync and the current one being executed, it requires databases to be at compatibility level 130 at the very least. This requirement wasn't completely documented in the April '19 release notes, but release notes aren't always 100% complete at the time of posting either.

With that said, customers need to be aware that historical data replication will require Microsoft SQL Server 2016 at the very least. These changes will be documented in the April '19 release notes and an entry will be added to the GP 2018 system requirements page.

Hope you find this information useful.

Until next post,

MG.-
Mariano Gomez, MVP