Wednesday, August 16, 2017

#DevOps Series: Building Dexterity Applications with Visual Studio Team Services Part 2/3

I am so amped-up after my return from the Microsoft Dynamics GP Technical Conference 2017 in Fargo, ND, where I had a chance to catch up with my friends in the partner and ISV community (more on that in a later post). This year, I had a chance to introduce the topics I have been discussing here in my DevOps series and now that I am back, I want to continue writing about the subject as it gets more and more exciting.

In Part 1 of this specific chapter within the series, I talked about building the actual Build-Engine project. If you remember, I specifically said that the build templates provided by Visual Studio Team Services (VSTS) do not fit the bill for Dexterity projects. Dex projects tend to be a bit more cumbersome, since we need the entire IDE around to compile, extract, and chunk our products. So it's best if we isolate these components into an altogether separate project (from that of our actual Dex product), for clarity's sake and to maintain our own sanity.


Creating a Build Definition for your Build-Engine Project

Now that we have the Build-Engine project in place, we can proceed to set up a Build Definition. The Build Definition is going to encompass all of the steps required to do things like:

1. Download the resources from our Build-Engine project (Dex IDEs, clean dictionaries, PowerShell scripts, macro templates, etc.)

2. Set up any folders needed to support the build process and temporarily store files, etc.

3. Pull the source code from our Dexterity project repository

4. Set up all environment variables

5. Extract dictionaries and create chunk files both with source code (unused blocks) and without source code (total compression).

6. Copy the chunks without source code into an artifact folder

7. Copy chunks and source dictionaries for debugging into an artifact folder

8. Publish the artifact folder

To create a new build for our Build-Engine project:

1. Click on Build & Release, then click the New button.


2. Select an empty template. Dexterity projects clearly do not conform to any of the existing, pre-defined molds.


3. Click the Apply button to continue.

4. You can now enter the name of your Build-Engine and select from one of four agent queues: Default, Hosted, Hosted Linux Preview, or Hosted VS2017. For all intents and purposes, hosted agent pools run in the cloud, while the Default queue is for private agents you host yourself. For more information on Hosted Agents, click here. These options define the Build process itself.


The best-suited option for our Dexterity Build-Engine is Hosted.

5. On the left pane, we can now click on the first task, Get Sources, to identify where the resources for our Build-Engine will come from. In this case, they will come from our Build-Engine project itself, which contains the Dexterity IDEs for versions 12 (GP 2013), 14 (GP 2015), and 16 (GP 2016). All other options can be left at their defaults.



This completes the first step (Download Resources for our Build Engine) for today. You can click on Save & Queue to test that all files download properly for the build agent pool.

video


NOTE: My agent failed in the video as I ran out of allocated build minutes for the month. You will need to assess the length of your build process and ensure you plan accordingly. For more information on Team Services pricing, click here.

I strongly encourage you to read MVP David Musgrave's series on Building a Dexterity Development environment, because all principles used in that series are still applicable in our cloud Build-Engine.

Until next post!

MG.-
Mariano Gomez, MVP

Tuesday, August 1, 2017

#DevOps Series: Building Dexterity Applications with Visual Studio Team Services Part 1/3

Resuming my series, I wanted to touch base on the process of building and releasing Dexterity applications with Visual Studio Team Services. My friend and fellow MVP, David Musgrave, explained how to set up your Dexterity development environment in his Dexterity Development series, and the first article in this series directly addressed the source code control process. Although David showed some really clever methods to build and package your Dexterity chunk application, that process still has some downsides to it. Primarily, the process is dependent on a person and a physical machine dedicated to executing the process.

Setting up a Build Engine

When creating a self-contained cloud-based VSTS Build Engine for your Dexterity projects, there are a few considerations:

1. The actual Dexterity IDE. If you are building a Dexterity chunk, you need to at least have a copy of the Dexterity IDE, because you will want to compile your project dictionary prior to chunking, and the chunking process itself still relies on Dexterity Utilities to be able to produce a chunk file.

In addition, you will need as many IDEs as versions of Dynamics GP you are supporting with your product. David describes this well in Part 2 of his series.

2. You need clean (base) dictionaries for each Dynamics GP version you will be supporting. This is particularly important, as you will want to pull the source code from the repository into clean dictionaries to compile, obtain your extracted dictionaries, and finally complete the auto-chunk process.

3. Since your build process will ultimately be automated, you need macros to inject constants and compile the dictionary, based on the version of Dynamics GP your product will be supporting. You will also need macros to extract and auto-chunk your product.

NOTE: You can inject constants into your development dictionary by having a Dexterity constants resource file. Your macro will need to have steps in place to import the constants resource file.

4. You will also need scripts to drive a lot of the processes above; i.e., if you are launching Dexterity with a dictionary and a macro as parameters to execute an automated task, this needs to be done by some task or step that supports this process. Anything scripting-related is preferably done with PowerShell, since it offers far greater automation capabilities than standard DOS batch files.
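To illustrate the "launch Dexterity with a dictionary and a macro" pattern outside of PowerShell, here is a minimal Python sketch. It mirrors the Start-Process call used in the script later in this post; all file paths shown are hypothetical examples, not part of the actual Build Engine.

```python
# Sketch: drive Dexterity from a script by passing a dictionary and a macro
# on the command line, then wait for the macro to finish playing.
# All paths below are made-up examples.
import subprocess

def build_dex_command(dex_exe: str, dictionary: str, macro: str) -> list:
    """Assemble the command line: Dex.exe <dictionary> <macro>."""
    return [dex_exe, dictionary, macro]

def run_dex_macro(dex_exe: str, dictionary: str, macro: str) -> int:
    # Blocks until Dexterity exits, like -Wait on Start-Process in PowerShell.
    return subprocess.run(build_dex_command(dex_exe, dictionary, macro)).returncode

if __name__ == "__main__":
    cmd = build_dex_command(r".\Dex16\Dex.exe",
                            r".\z_MEP\2016\DynMEP.dic",
                            r".\z_MEP\2016\Install_Code.mac")
    print(cmd)
```

The key design point is the same in either language: the macro file is the real "program", and the script's only job is to hand Dexterity the right dictionary/macro pair and wait for it to finish.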

Now that you understand all the challenges, you can quickly guess that the best way to achieve this in Visual Studio Team Services is by setting up a Build Engine project with all the artifacts needed to automate the process.

The following shows the folder structure of a Build Engine project in VSTS:

Build Engine project structure
An explanation is as follows:

1. The DEX12, DEX14, and DEX16 folders contain full copies of the Dexterity IDE files, minus things like samples, help files, and whatnot. These are not needed, as they will never be accessed.

2. The Logs folder will store any logs produced by the macros being executed.

3. The Macros folder includes all macros that we will need to compile and extract our code into chunk files.

If your macros refer to files in any of the folders in your Build Engine, these paths need to be relative to the root of your project structure. In addition, it's easier to inject variables for things like the log paths, the dictionaries, version numbers, etc. It's considered best practice not to hard-code any of these elements in your macro files, to allow for reusability and customization. The following is an example of a chunking macro file:



# DEXVERSION=%DexMajorVersion%
Logging file '%LogPath%vDexUtilsMEP.log'
# ================================================================================
  MenuSelect title File entry 'Open Source Dictionary...' 
# ================================================================================
  FileOpen file '%ModulePath%/DYNMEP.dic' type 0 
# ================================================================================
ActivateWindow dictionary 'default'  form 'Main Menu' window 'DexUtils Toolbar' 
  MoveTo field 'Toolbar Utilities Button' item 0 
  ClickHit field 'Toolbar Utilities Button' item 6  # 'Extract' 
NewActiveWin dictionary 'default'  form Extractor window Extractor 
ActivateWindow dictionary 'default'  form Extractor window Extractor 
  ClickHit field '(L) Extract Button' 
# ================================================================================
  FileSave file '%ChunkDestination%/%VersionNumber%/MEP7156E.dic' 
# ================================================================================
NewActiveWin dictionary 'default'  form Extractor window Extractor 
NewActiveWin dictionary 'default'  form Extractor window Extractor 
  MenuSelect title File entry 'Close Source Dictionary' 
  MenuSelect title File entry 'Open Editable Dictionary...' 
# ================================================================================
  FileOpen file '%ChunkDestination%/%VersionNumber%/MEP7156E.dic' type 0 
# ================================================================================
ActivateWindow dictionary 'default'  form 'Main Menu' window 'DexUtils Toolbar' 
  ClickHit field 'Toolbar Utilities Button' item 9  # 'Auto-Chunk' 
NewActiveWin dictionary 'default'  form 'Auto Chunk' window 'Auto Chunk' 
ActivateWindow dictionary 'default'  form 'Auto Chunk' window 'Auto Chunk' 
  ClickHit field 'Lookup Button 3' 
# ================================================================================
  FileSave file '%ChunkDestination%/%VersionNumber%/MEP7156.cnk'
# ==================================================================================
  MoveTo field 'Dictionary Name' 
  TypeTo field 'Dictionary Name' , 'MEP7156.DIC'
  MoveTo field 'Dictionary Name' 
  MoveTo field '(L) Major Version' 
  TypeTo field '(L) Major Version' , '%DexMajorVersion%'
  MoveTo field '(L) Build Number' 
  MoveTo field '(L) Minor Version' 
  MoveTo field '(L) Build Number' 
  TypeTo field '(L) Build Number' , '%DexBuildVersion%'
  MoveTo field '(L) Major Version' 
  MoveTo field '(L) Build Number'  




You may be asking how these variables will be updated. Very simple: when we set up the Build process itself, we will create environment variables that can be injected into these macro files. You may also be asking how these variables got there to begin with. Follow the steps outlined in Part 4 of the Dexterity Development Environment series to create your build macro; once you have recorded the macro, you can edit it to set up the placeholders for environment variables.
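To make the injection idea concrete, here is a minimal sketch in Python (purely illustrative; the real build does this with the PowerShell -replace chain shown further down) of swapping the %Token% placeholders in a macro line for build-time values. The token name mirrors the macro above; the path value is made up.

```python
# Replace %Token% placeholders in a macro template with build-time values.
# The token names mirror those in the macro above; the values are examples.
def inject(template: str, values: dict) -> str:
    for token, value in values.items():
        template = template.replace(f"%{token}%", value)
    return template

line = "Logging file '%LogPath%vDexUtilsMEP.log'"
print(inject(line, {"LogPath": "C:\\Build\\Logs\\"}))
# Logging file 'C:\Build\Logs\vDexUtilsMEP.log'
```

In the actual Build Definition, the values dictionary would be populated from the build's environment variables rather than hard-coded.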

4. The Resources folder contains things like constants resource files, clean dictionaries, and a Dex.ini file for the Dynamics GP versions that will be supported by your integrating Dexterity application.

5. The Scripts folder contains all the PowerShell scripts required to automate some of the tasks. One such task is setting up the dictionaries for the different products you will be building, as shown in this PowerShell script:

Param(
   [int] $VersionNumber,
   [string] $BuildNumber = "000",   # string, to preserve leading zeros
   [string] $SubBuildNumber = "0",
   [string] $SingleModule = $null
)

$dictionary=''
$DexMajorVersion = 0
$localFolder = Get-Location #"$([System.IO.path]::GetPathRoot($(Get-Location)))" 

enum BuildType { 
    Source = 0
    Build = 1
}


switch ($VersionNumber) {
    2013 { 
   $dictionary = "$localFolder\resources\Dyn2013.dic"
   $DexMajorVersion = 12
  }
    2015 { 
   $dictionary = "$localFolder\resources\Dyn2015.dic"
   $DexMajorVersion = 14

  }
    2016 { 
   $dictionary = "$localFolder\resources\Dyn2016.dic"
   $DexMajorVersion = 16
  }

    default { 
   Write-Output "Invalid Version Number" 
   } 
}

$DexterityEXE = ".\Dex$DexMajorVersion\Dex.EXE"

$offset = "z_"  # Set blank for live mode.

if ($dictionary -ne ''){
    # Ensure the Destination folders exist.
    [System.Enum]::GetValues([BuildType]) | foreach { 
        $_compileModeName = $_
        mkdir "$localFolder\$($offset)$($_compileModeName)" -ErrorAction SilentlyContinue | Out-Null
        mkdir "$localFolder\$($offset)$($_compileModeName)\$VersionNumber" -ErrorAction SilentlyContinue | Out-Null
    }


    $fileSet = @()
    $modules = @('MICR', 'MICRJ', 'MMM', 'MEP', 'VPS')

    if ($modules -contains  $SingleModule)    
    { $modules = @($SingleModule.ToUpper()); Write-Host $modules -ForegroundColor Green }
    else { Write-Host "All Modules" -ForegroundColor Green }

    foreach($mod in $modules) {
        $workFolder = "$localFolder\$($offset)$($mod)\$($VersionNumber)"
        $dataFolder = "$workFolder\Data"
        mkdir $workFolder -ErrorAction SilentlyContinue | Out-Null
        mkdir $dataFolder -ErrorAction SilentlyContinue | Out-Null

        ## Copy the Dynamics Dictionary.
        $destination_file = "$workFolder\Dyn$($mod).dic"
     copy $($dictionary) $($destination_file)
        Set-ItemProperty $destination_file IsReadOnly -value $false
        Write-Host "$destination_file created." -Backgroundcolor Green -ForegroundColor Black

        # Copy Install_Code macro.
     $original_file = "$localFolder\Macros\Install_Code.mac"
     $destination_file =  "$workFolder\Install_Code.mac"
        $fileSet += , @($original_file, $destination_file, $mod, $_compileModeName)


        [System.Enum]::GetValues([BuildType]) | foreach { 
            $_compileModeName = $_
            $_compileMode = [int]$_

      ## Copy Constants.constant file to module-specific version and replace variables.
         $original_file = "$localFolder\resources\Constants.constants"
         $destination_file =  "$workFolder\Const_$_compileModeName.constants"
            $fileSet += , @($original_file, $destination_file, $mod, $_compileModeName)


            ## Copy the Macro - Twice: Once for Build and once for Source.
         $original_file = "$localFolder\Macros\Build_$($mod)_Source.mac"
         $destination_file =  "$workFolder\$($mod)_$($_compileModeName).mac"
            $fileSet += , @($original_file, $destination_file, $mod, $_compileModeName)

            ## Copy the Install_Constants.mac - Twice: Once for Build and once for Source.
         $original_file = "$localFolder\Macros\Install_Constants.mac"
         $destination_file =  "$workFolder\Install_Constants_$_compileModeName.mac"
            $fileSet += , @($original_file, $destination_file, $mod, $_compileModeName)
        }
        

  ## Copy Dex.INI file to module-specific version and replace variables.
  $original_file = "$localFolder\resources\Dex.ini"
  $destination_file = "$dataFolder\Dex.ini"
        $fileSet += , @($original_file, $destination_file, $mod, $_compileModeName)

        ## After this point, $MOD for MMM is MPP.
        if ($mod -eq 'MMM') { $mod = 'MPP' }
    }

    foreach($item in $fileSet)
    {
        $originating = $item[0]
        $destination = $item[1]
        $mod = $item[2]
        $_compileModeName = $item[3]
        $_compileMode = [int]$item[3]

        $workFolder = "$localFolder\$($offset)$($mod)\$($VersionNumber)"
        $chunkDestinationFolder = "$localFolder\$($offset)$($_compileModeName)"

        if ($_compileModeName -eq [BuildType]::Build.ToString())
        {
            $ChunkTypeComment = ""
            $_subBuildMessage = ""
        }
        else 
        {
            $ChunkTypeComment = "# -- Source -- " 
            $_subBuildMessage = "Build $($BuildNumber).$($SubBuildNumber) ($($BuildNumber).$($SubBuildNumber).$(Get-Date -format yyyyMMdd.hhmmss))"
        }


     (Get-Content $originating) | Foreach-Object {
      $_ -replace '%CompileMode%', "$_compileMode" `
         -replace '%CompileModeName%', "$_compileModeName" `
         -replace '%ChunkDestination%', "$chunkDestinationFolder" `
         -replace '%ChunkTypeConstant%', "$_compileMode" `
         -replace '%DexMajorVersion%', "$DexMajorVersion" `
         -replace '%DexBuildVersion%', "$BuildNumber" `
         -replace '%DexSubBuildMessage%', "$_subBuildMessage" `
         -replace '%ModulePath%', "$workFolder" `
               -replace '%Module%', "$mod" `
         -replace '%GenericPath%', "$localFolder\Generic\" `
         -replace '%TempPath%', "$localFolder\Temp\" `
         -replace '%LogPath%', "$localFolder\Logs\" `
         -replace '%OriginatingPath%', "$localFolder\Resources\" `
               -replace '%ChunkTypeComment%', "$ChunkTypeComment" `
         -replace '%VersionNumber%', "$VersionNumber"
        } | Set-Content $destination

        Write-Host "$($destination) created." -Backgroundcolor DarkRed -ForegroundColor Yellow
    }

    
    $DexterityEXE = ".\Dex$DexMajorVersion\Dex.EXE"
    $DexUtilsEXE = ".\Dex$DexMajorVersion\DexUtils.EXE"

    foreach($mod in $modules) {
        $workFolder = "$localFolder\z_$mod\$($VersionNumber)"
        $dictionary = "$workFolder\Dyn$($mod).dic"
        $LoadMacro =  "$workFolder\Install_Code.mac"

        Start-Process -FilePath $DexterityEXE -ArgumentList @($dictionary, $LoadMacro) -PassThru -Wait -Verbose
##        $process = (Start-Process -FilePath $DexterityEXE -ArgumentList @($dictionary, $LoadMacro) -PassThru -Wait)
##        Write-Host "$($mod) code loaded. Status: " $process.ExitCode
    }
}

6. The Source folder will contain the code retrieved from the Visual Studio Team Services repository. You will need a script to do this as well. The script will leverage the VSTS API to retrieve the code from the repo.

Param(
    [string]$SourceCodeFolder = "$/MICR/Base/2/2015B160", # Ex. 2015
    [string]$VSTSUser = "youraccount@somedomain.com",
    [string]$VSTSUserPAToken = "abcdefghijklmnopqrstuvwxyz0123456789",
    [switch]$TestStructure
)

$localFolder = Get-Location #"$([System.IO.path]::GetPathRoot($(Get-Location)))" 
$workFolder = "$localFolder\Generic\"

$scopePath_Escaped = [uri]::EscapeDataString($SourceCodeFolder) # Need to have this in 'escaped' form.

Write-Host "Pulling code from $($SourceCodeFolder) into $($workFolder)" -BackgroundColor DarkGreen

$recursion = 'Full' # OneLevel or Full
#$recursion = 'OneLevel' # or Full
 
# Base64-encodes the Personal Access Token (PAT) appropriately
$base64AuthInfo = [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes(("{0}:{1}" -f $VSTSUser,$VSTSUserPAToken)))
 
# Construct the REST URL to obtain the MetaData for the folders / files.
$uri = "https://yourcompany.visualstudio.com/DefaultCollection/_apis/tfvc/items?scopePath=$($scopePath_Escaped)&recursionLevel=$($recursion)&api-version=2.2"

# Invoke the REST call and capture the results
$result = $null
$result = Invoke-RestMethod -Uri $uri -Method Get -ContentType "application/json" -Headers @{Authorization=("Basic {0}" -f $base64AuthInfo)}

 
# This call returns the METADATA for the folder and files. No File contents are included.
if ($result.count -eq 0)
{
     throw "Unable to locate code at $($SourceCodeFolder)"
}

$scopePathLength = $SourceCodeFolder.Length + 1 # +1 to eliminate an odd prefixed '\' character.

# =======================
# Create folder structure.
# =======================
for($index=0; $index -lt $result.count; $index = $index + 1)
{
    if ($result.value[$index].isFolder -eq $true)
    {
        #Strip the VSTS path off of the folder, and prefix with the local folder.
        $_subPath = $result.value[$index].path
        if ($_subPath.Length -ge $scopePathLength)
        {
            #Strip the VSTS path off of the folder
            $_subPath = $result.value[$index].path.Substring($scopePathLength)

            #prefix with the local folder.
            #MGB: replace the forward slashes in the remaining VSTS path with backslashes
            $_subPath = "$($workFolder)$($_subPath)" -replace "/", "\"

            if ((Test-Path $_subPath) -ne $true)
            {
                New-Item -Force -ItemType directory -Path $_subPath | Out-Null
                Write-Host $_subPath -BackgroundColor red
            }
            else
            {
                Write-Host "$($_subPath)`t$($result.value[$index].path)" -BackgroundColor Green -ForegroundColor Black
            }
        }
    }
}

# ==============
# Retrieve Files 
# -TestStructure flag will show all folders/files that will be retrieved.
# ==============
for($index=0; $index -lt $result.count; $index = $index + 1)
{
    if ($result.value[$index].isFolder -ne $true)
    {
        #Strip the VSTS path off of the folder, and prefix with the local folder.
        $_subPath = $result.value[$index].path
        if ($_subPath.Length -ge $scopePathLength)
        {
            #Strip the VSTS path off of the folder
            $_subPath = $result.value[$index].path.Substring($scopePathLength)

            #prefix with the local folder.
            #MGB: replace the forward slashes in the remaining VSTS path with backslashes
            $_subPath = "$($workFolder)$($_subPath)" -replace "/", "\"

            ## Retrieve the file text.
            if ($TestStructure -eq $true)
            {
                $fileresult = $result.value[$index].url
            }
            else
            {
                $fileresult = Invoke-RestMethod -Uri $result.value[$index].url -Method Get -ContentType "application/json" -Headers @{Authorization=("Basic {0}" -f $base64AuthInfo)}
            }

            New-Item -Force -ItemType file -Path $_subPath -Value $fileresult | Out-Null
            Write-Host $_subPath -BackgroundColor Green
        }
    }
}
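For comparison, the two building blocks of the script above — the Basic authentication header built from the Base64-encoded "user:PAT" pair, and the items URL with the escaped scope path — look like this in Python. The account name, user, and token below are placeholders, not real credentials.

```python
# Sketch of the VSTS TFVC items call's auth header and URL in Python.
# Account, user, and PAT values are placeholders.
import base64
from urllib.parse import quote

def basic_auth_header(user: str, pat: str) -> str:
    # Base64-encode "user:PAT", as [Convert]::ToBase64String does in the script.
    token = base64.b64encode(f"{user}:{pat}".encode("ascii")).decode("ascii")
    return f"Basic {token}"

def items_uri(account: str, scope_path: str, recursion: str = "Full") -> str:
    # safe='' escapes '$' and '/', matching [uri]::EscapeDataString behavior.
    return (f"https://{account}.visualstudio.com/DefaultCollection/_apis/tfvc/items"
            f"?scopePath={quote(scope_path, safe='')}"
            f"&recursionLevel={recursion}&api-version=2.2")

print(basic_auth_header("user", "token"))   # Basic dXNlcjp0b2tlbg==
print(items_uri("yourcompany", "$/MICR/Base"))
```

The first call returns only metadata (paths, isFolder flags, per-item URLs); a second round of requests against each item's URL is what actually pulls the file contents down, exactly as the script does.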

You can use the Visual Studio IDE to set up a project on VSTS and add the folders and files needed for your Build Engine project.

In summary, setting up a Build Engine project involves thinking about all the elements required to produce your chunk file: IDE, macros, and scripts that will drive the process. The complexity of each script will depend on the number of points you want to automate, variables and constants you want to inject, and certainly the number of chunks you need to produce for each version of Dynamics GP.

Tomorrow, I will explain the details involved in setting up the actual Build process with these parts.

Until next post!

MG.-
Mariano Gomez, MVP
IntellPartners, LLC
http://www.IntellPartners.com/

Wednesday, July 19, 2017

#DevOps Series: Upgrading Microsoft Dexterity VSS and TFS repositories to Visual Studio Team Services - Part 2/2

Continuing with our #DevOps series, today we will address the upgrade process from Team Foundation Server (TFS) to Visual Studio Team Services (VSTS). Yesterday, we addressed the upgrade from Visual SourceSafe (VSS) to VSTS - see #DevOps Series: Upgrading Microsoft Dexterity VSS and TFS repositories to Visual Studio Team Services - Part 1/2, - and saw all the important steps needed to ensure your Microsoft Dexterity repository is migrated properly. Likewise, you must observe a series of steps prior to moving your TFS repository to VSTS.



Background

One of the main questions I usually field around this topic is, "Why would I want to move from TFS to VSTS?" The truth is, there are a number of reasons that you may want to consider: a) less server administration, b) a cloud solution gives you immediate access to the latest and greatest features, c) improved developers' connectivity - personally, I love the ability of being anywhere in the world and having access to our repository, without having to establish a VPN connection to some server; and d) if you want to keep the finance people happy, just tell them that you are moving from a CapEx model (servers and hardware that needs to be depreciated, with iffy tax deductions) to an OpEx model (subscriptions that are fully tax deductible).

Once you see the benefits, it will be easier to adopt VSTS. The next question is usually, "What are the differences between the two?" For a primer on this, and to keep the article tight, take a look at the following whitepaper:

Fundamental differences between TFS and Team Services


Migrating Microsoft Dexterity repositories from Team Foundation Server to Visual Studio Team Services

As with VSS, there are a few acceptable methods to migrate from TFS to VSTS, as follows:

1) Manually. You can copy the most important and, perhaps, the latest projects you are working on. When you are done, you can simply mark the TFS projects as read-only. Under this scenario, the assumption is you will be leaving behind your old TFS server to maintain the history of all your old projects; but if you are getting rid of the server (which is the main reason to move to begin with), you may want to consider a different method.

2) Use the Database Migration tool. As you all know, I am a fan of tools that allow you to automate the process and minimize any risk associated with moving data from one place to another, especially over the internet. The Migration Guide is available here. However, this particular Microsoft tool was very young when we first looked at it, and as of the writing of this article, it was still in preview mode.

3) Use a third-party tool. Frankly, when we did our migration here at Mekorma, we tested a number of tools, but settled on the OpsHub Visual Studio Online Migration Utility, available for free from the Visual Studio Team Services gallery.

Visual Studio Online Migration Utility

The OpsHub Visual Studio Online Migration Utility Free Version helps developers migrate the most commonly requested data from an on-premises Team Foundation Server to their Visual Studio Team Services account. It enables basic migration of the history of version control changesets, work items, and test cases.

The free utility is limited to migrating projects with fewer than 2,500 revisions of work items and fewer than 2,500 revisions of source control. It offers limited support through the community-supported Q&A forum, with no additional support included.

But rather than me trying to describe all the steps, I thought it would be best to embed the demonstration video here:


Tomorrow, I will show you how to leverage Visual Studio Team Services' Build process to extract and chunk your Dexterity applications.

Until next post!

MG.-
Mariano Gomez, MVP

Monday, July 17, 2017

#DevOps Series: Upgrading Microsoft Dexterity VSS and TFS repositories to Visual Studio Team Services - Part 1/2

Yesterday, we talked about #DevOps Series: Microsoft Dexterity source code control with Visual Studio Team Services. The article mainly focused on setting up your Team Services project repository for the first time, taking an existing development dictionary, prepping it, and checking the resources into the repository. But what if you already have a Visual SourceSafe (VSS) or Team Foundation Server (TFS) repository in place and are just looking to move to VSTS?



Migrating Microsoft Dexterity repositories from Visual SourceSafe to Visual Studio Team Services

There are two acceptable methods to migrate your Microsoft Dexterity VSS repository to VSTS: you can use the VSS Upgrade Wizard, or you can use the VSSUpgrade command-prompt tool. Now, I am a big fan of command-prompt tools, but this is one case where I would suggest you ditch them for the Wizard.

If you would like more information on the VSSUpgrade command-prompt tool, please click here.

Using the VSS Upgrade Wizard 

This is, by far, the method I most recommend. The wizard provides step-by-step instructions, which makes the process of moving to VSTS a no-brainer. There are a few things you will need to do beforehand.

Preparing for the Upgrade

1.- First, if you are on a version prior to Visual SourceSafe 6.0, you will need to upgrade to Visual SourceSafe 6.0 before you can attempt the migration. You can download Visual SourceSafe 6.0 here, but please note that this IS NOT an official Microsoft download site; hence, exercise due care when opening any files from an unknown location. Also note that Microsoft support for VSS ended in 2012 - that's right! You are on your own here.

2.- Next, you will need to have a SQL Server available to use as temporary storage for the upgrade process. Since you are already running Microsoft Dynamics GP on some SQL Server, you could probably create a separate instance where you can perform the upgrade. I would not recommend using your production instance to do so.

NOTE: Although SQL Server Express Edition is probably fine for the upgrade, I recommend you use at the very least SQL Server Standard Edition to prevent any migration issues due to the database size limitations imposed by SQL Server Express Edition (10 GB per database as of SQL Server 2008 R2, 4 GB before that). If your repositories tend to be very large from years and years of coding (in our case, 20 years!), you are probably better off with the Standard Edition of SQL Server.

3.- You will then need to check in all your Microsoft Dexterity project resources into your VSS repository and remove access to all repositories for all developers except the (main) administrator.

4.- You will need to have already provisioned a Team Services account. Refer to the previous article in this series for a primer on this process. We found this out the hard way: make sure you create all project shells for your VSS projects before you conduct the upgrade, as the Upgrade tool will need this done in advance.

5.- Make a copy of your VSS database and work from the copy. Restore it onto the instance of SQL Server you created in step 2. Makes sense? Ok, let's move on. As usual, you will not want to expose yourself to some sort of data corruption, so please do not work with your original VSS databases in case something goes wrong. See How To Back Up a Visual SourceSafe Database for additional information on this process.

6.- Download and install the Visual SourceSafe Upgrade Tool for Team Foundation Server (and Visual Studio Team Services). You can get the tool here. You must install the tool on the same machine where you made the copy of your repository database.

7.- Run the VSS Analyze utility to ensure there are no inconsistencies with your VSS database that would prevent the upgrade from being successful. If Analyze produces any errors, you will need to repair the database prior to beginning the upgrade.

8.- For additional preparation steps, please refer to the following MSDN article, Prepare to upgrade from Visual SourceSafe.

Using the Wizard

1. Launch the tool downloaded in Step 6 above. Go to Start and run the VSS Upgrade Wizard.

2. On the Visual SourceSafe Repository page, specify the repository, and if necessary, the Admin password.

Visual SourceSafe Repository page
3. To display the projects in your VSS repository, choose the List Available Projects link. Select the projects you want to upgrade.

List Available Projects
4. Select the check box at the bottom of the page to confirm you have run Analyze. See Step 7 above. Choose Next to proceed.

5. On the Team Project page, choose Browse, and then use the Select a Team Project for Migration dialog box to specify the team project into which you want to port the upgraded data. My absolute recommendation here is to select a new team project that you have not been using.

Select a Team Project for Migration page

Choose Next.

6. On the Options page, select whether you want to upgrade the Full history or just the Tip, which omits historical data. When we did this migration, we truncated the data we did not want to upgrade; that would be done as an optional step after copying the repository database in step 5 above.

Options page
7. On the Options page, specify the name of the SQL Server instance you want the wizard to use for temporary storage.

Options page
Choose Next to continue.

8. Review all settings and choose Next. There will be a checksum to ensure the upgrade can proceed. Choose Upgrade to continue.

9. Once the upgrade is finished, you should be able to navigate to your Visual Studio Team Services account page and verify that all projects have been migrated successfully. If you come across any issues, make sure you print the Migration Report and follow the information provided here to complete any additional remediation steps.



Tomorrow, I will walk through the steps to upgrade from TFS to VSTS. Have you completed a migration from VSS to VSTS? I would like to hear your take on it and what "lessons learned" came from executing the migration.

Until next post!

MG.-
Mariano Gomez, MVP

#DevOps Series: Microsoft Dexterity Source Code Control with Visual Studio Team Services

Now that my good friend and fellow MVP, David Musgrave, has completed his #MSDexterity Development Environment Series, it's only fitting that I follow it up with a new series on DevOps with Visual Studio Team Services (VSTS) and Microsoft Dexterity.



Background

David's series is important because it provides an overview of the processes involved in building a Microsoft Dexterity environment and touches upon certain aspects of code packaging and deployment that are fundamental to DevOps. Keep in mind that no one method is right or wrong; rather, look at it as having a wide array of options at your disposal.

DevOps (Development and Operations) goes beyond the software development life cycle (SDLC) or building a development environment, to incorporate a delivery process that emphasizes communication and collaboration between the product management, software development, and operations teams. Primarily, it seeks to automate the process of software integration, testing, deployment, and infrastructure changes by establishing a culture and environment where building, testing, and releasing software can happen rapidly, frequently, and more reliably. We've always heard the clichéd term "people, processes, and technology" thrown around, but DevOps really focuses on bringing the three together.

Since people and processes are highly dependent on each organization, I cannot really cover those here and, frankly, there are tons of resources online describing and expanding on the human and process aspects of DevOps. My goal with this series will be to see how we can leverage VSTS as the technology component for a Microsoft Dexterity development operation - mainly, Independent Software Vendors (ISVs), consulting partners, and last, but not least, customers who do have a Microsoft Dynamics GP development team. I suspect some of the principles outlined here will also apply to non-Dexterity development operations, but you need to be the judge as to what does and doesn't apply to your specific case.

I will also assume that most of you reading this will already have a Microsoft Dexterity development environment that relies on either Visual SourceSafe (VSS) or Team Foundation Server (TFS) as your source code control repository and that you are familiar with the concepts of code check-in and check-out.



Implementing Visual Studio Team Services in a Microsoft Dexterity Development Environment

I often get asked, "How do we get started with Visual Studio Team Services?". To put things in perspective, most Dexterity development teams asking this question are primarily concerned with compatibility between Microsoft Dexterity and Visual Studio Team Services. After all, their current source code control procedures with VSS and TFS have gotten them to this point with little to no issues. The first thing to know is that Microsoft Dexterity communicates with VSTS through the same TFS source control providers defined in the Source Control options. That's right! If you already connect to TFS, you can certainly communicate with VSTS.

Something else to keep in mind is that Microsoft Dexterity versions 12, 14, and 16 do allow you to use the TFS providers directly, that is, without having to install the Dexterity Source Code Control Services (DSCCS) component. Another reminder: you will need to have the corresponding version of Visual Studio installed on the same machine as Microsoft Dexterity to work with the TFS provider of your choice. The following table describes each TFS provider and the corresponding version of Visual Studio needed to support that particular provider.

TFS Provider                        Visual Studio Version
Team Foundation Server 2010         Visual Studio 2010
Team Foundation Server 2012         Visual Studio 2012
Team Foundation Server 2013/2015    Visual Studio 2013 or Visual Studio 2015


Provisioning Visual Studio Team Services

The first thing you will need is to provision Visual Studio Team Services. For this, you will need to go to the Visual Studio website. Once you are on the site, sign in with your organizational account (either Office 365 or Azure AD account).


When you click Sign In, you will be challenged for your credentials and taken to the Get Started page, where you can set up your VSTS account. Click on the Create new account button to continue.

Get Started page
On the Account Creation page, you will set up a project space to host all your Microsoft Dexterity projects. You will use this address to administer all your development, QA, and operations users and teams, as well as your Microsoft Dexterity projects as a whole.

Account Creation page

You have a choice to manage your code using Git or Team Foundation Version Control (TFVC). Git gives you a distributed source code control model, where each developer has a full copy of the source repository on their dev machine. TFVC is a centralized model, where team members typically have only one version of each file on their dev machines, historical data is maintained only on the server, and branches are path-based and created on the server. For more information, take a look at this comparison between Git and TFVC.

After clicking Continue, VSTS will provision your project workspace and your account. Upon completion, a default first project is created for you, where you can describe what it's about and set up continuous integration for your project (more on that later).

MyFirstProject landing page


Setting up a project using Visual Studio Team Services

The goal of this section will be to show you how to setup your first project using VSTS. The next article in my series will explain how to migrate your VSS and TFS projects to VSTS.

Since no one names their projects "MyFirstProject", it will be necessary to rename this to whatever name you choose. For all intents and purposes, let's say this will be our Microsoft Dexterity Quotes Enhancements project. To rename the project, click on the gear button and select Overview.

Overview option

You can proceed to change the name of the project under the Profile section, then click the Save button to proceed. Team Services will then present you with a big warning; make sure you read through it before you continue. However, since this is our first project and we have nothing but an empty shell at this point, it is safe to continue with this process.

Project profile page

Rename Project warning

Once the renaming is successful, the profile page will automatically refresh, showing the new project name and the project team name accordingly.


Setting up the Team Services Project Server in the Visual Studio IDE

Since my team normally develops in Dexterity 12 (GP 2013), 14 (GP 2015), and 16 (GP 2016), as our products are supported on three different versions of Dynamics GP, it is necessary for us to have a version of Visual Studio installed that works with all versions of Dexterity simultaneously. We have settled on Visual Studio 2012 with Update 5, which allows us to use the Team Foundation Server 2012 provider in all versions of Dexterity mentioned above, although we have side-by-side installations of Visual Studio 2012 and Visual Studio 2015.

Open Visual Studio 2012 and choose Connect to Team Foundation Server on the start page.

Visual Studio 2012 Start page

You will be challenged by Visual Studio to enter your credentials. Enter the credentials you used to provision your Team Services account.


Visual Studio will then take you to Team Explorer once the sign-in operation has completed. Proceed to click on Source Control Explorer to view the new project we've set up.

Visual Studio Source Control Explorer
In order to make this fully functional, we need to map our cloud repository to a local path. First, I suggest you set up the local folder using File Explorer. In this case, I will create a folder called MyProjects and, under it, a new folder called Quotes. To map the cloud repository, select the root node of the repository, then click the Not Mapped hyperlink above the details pane in Visual Studio. This will open the Map window.

Map window
Point the folder location to the root folder, MyProjects on your local machine. Click the Map button to complete the operation. You will then be asked if you want to download the content of the repo onto your local machine. Since this is a brand new project, there will be nothing in the Quotes project folder (except for the Visual Studio Team Services build templates), so proceed to click Yes to continue. This will now setup the initial project folder.


Setting up Microsoft Dexterity Source Control

When building integrating Dynamics GP applications, we typically start out from an empty Dynamics.dic development dictionary, but you may already have an existing project dictionary you would like to begin working with. In either case, the following steps will work.

In the case of our example, we will copy the Quotes_dev.dic development dictionary to the MyProjects > Quotes folder. For more information on setting up your development environment, please refer to #Dexterity Development Environments – Part 2: Setting up Projects and follow it through step 10.

Quotes_dev.dic development dictionary

Next, we can launch Dexterity with the dictionary and the Dex.ini file by using our Quotes.exe shortcut. Note that our dictionary already has existing resources we have developed for this project.

Quote_dev.dic inside the Dexterity IDE

To set up the Source Control information, we need to go to Edit > Options in the Dexterity IDE. At first, the provider option will be set to [Disabled]. We can then choose the TFS provider that matches our Visual Studio installation, in this case Team Foundation Server 2012.

Selecting the Source Control Provider in Options window
The root directory will correspond to the folder in the hierarchy immediately above the project folder, in this case the MyProjects folder created above.

Root directory setup
Since Dexterity uses the TFS providers supplied with Visual Studio, it needs the credentials you use to access the repository. Enter that information in the User Name field. NOTE: the password field remains disabled.

User Name setup
Click on the ellipsis button next to Project Name to select the project (do not type it in). If you can select the project, then it means that Dexterity is communicating with your source control repository - and that's always a good thing :-)

Projects window

Finally, point to the Original Dictionary for which your code was written. As a personal note, I keep a folder on my machine called Dictionaries, which contains subfolders hosting a clean dictionary for each version of Dynamics GP I develop for. This prevents me from accidentally removing, say, the Dynamics GP installation and not having a dictionary available, which would cause my Dexterity source code control to fail.

Original Dictionary


Synchronizing your dev dictionary with Visual Studio Team Services

To synchronize your development dictionary to the VSTS repository, you must first update your source code control state. This will allow Dexterity to determine which resources are new.

NOTE: New resources are defined by a resource ID of 22,000 and above. Alternate resources will also appear as new to Dexterity source control.
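If it helps to see the rule written down, here is a tiny sketch (Python, purely for illustration; the function name is mine, and the 22,000 threshold comes from the note above; alternate resources are a special case the ID check alone does not capture):

```python
def is_new_resource(resource_id: int) -> bool:
    """Dexterity source control treats resources with IDs of 22,000 and
    above as new; IDs below that range belong to the core dictionary."""
    return resource_id >= 22000

# Core resources keep their original IDs; integrating code starts at 22,000
print([rid for rid in (120, 21999, 22000, 22764) if is_new_resource(rid)])
# [22000, 22764]
```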

Update Source Code Control state

Once you update the source code control state, proceed to check in the resources into the repository. Choose Explorer > Source Control > Check In...

Source code check in

If you are going to check in all resources, click the Insert All button. Be sure to enter a comment that describes, as best you can, what is being done, e.g., "Initial check in of Quotes project resources".

Check in process

Both Visual Studio Source Control Explorer and Team Services should reflect the checked-in resources. You can also look at your hard drive, under the Quotes folder, to see the new folders created for Base, Forms, Reports, Scripts, and Tables resources.

Resources in Repository

My next article will discuss how to migrate from VSS and TFS to VSTS.

Until next post!

MG.-
Mariano Gomez, MVP

Friday, July 14, 2017

Microsoft Dexterity: Customizing Safe Pay file name

Hello all!

It's been a while since I've worked on my blog as there's quite a bit going on these days: numerous projects at Mekorma, preparing to move to my new place, upcoming trips to Colombia and Fargo, ND, etc., so not much time to dedicate to writing.

Anyway, today I wanted to dive into some development. Very often I see on forums a request to be able to customize the Safe Pay file name, and I wanted to address this with some Dexterity code, but first - you guessed it! - some background.



Background

Safe Pay is part of Microsoft Dynamics GP's Electronic Banking suite of modules, comprising Electronic Reconcile, EFT for Payables and Receivables, and, well, Safe Pay. The goal of Safe Pay is to allow you to create a file that is sent to your bank to confirm the authenticity of checks and EFTs issued to payees.

Trivia: Who was the original author of the Electronic Banking suite?

The Safe Pay Configurator provides an intuitive (in most cases) user interface that allows you to define the specific file format(s) accepted by your bank(s).

Safe Pay Configurator window
Upon configuring the file format, you will then use the Safe Pay Bank Link Maintenance window to group all the checkbooks associated with the same bank into an Upload ID. As part of this setup, you can define a single file name to be generated for this bank. Naturally, most setups will include some company identifier (in a multi-company environment) and the checkbook, to at least make the file unique and distinguishable enough.

Safe Pay Bank Link Maintenance window

Finally, the Safe Pay file itself can be generated only after the payment batch has been posted. For this, you must use the Safe Pay - Transaction Upload window. In this window, you will choose the Bank Upload ID defined above, and a cutoff date. Safe Pay will recall the (posted) payment transactions between the last upload date and the cutoff date you entered. All checks and EFTs available for processing within that date range will show up for processing.

Safe Pay - Transaction Upload window


The Problem and The Solution

Most organizations would like to have at least a date stamp included as part of the file name itself. For the above example, our desired file name would be something like MMM29-FIRSTBANK-YYYYMMDD.txt, where YYYYMMDD represents the current year, month, and day. As you can tell from the Safe Pay Bank Link Maintenance window, it is not possible to achieve this. There isn't even an option to allow you to select other descriptors for the file name. In order to do this, we will use some Dexterity sanScript to add this feature.

NOTE: This article assumes you have a basic understanding of the principles of integrating Microsoft Dynamics GP applications and that you are familiar with the build process (extract and auto-chunk) of a development dictionary. If you need help setting up your development environment, take a look at the #Dexterity Development Environments series on the Winthrop Development Consultants blog by my good friend and fellow MVP, David Musgrave.

In order to change the Safe Pay file name, we will begin by setting a cross-dictionary trigger against the ME_Configurator_Write_Entire_Format global procedure in the Safe Pay dictionary in our Startup script. You can find this out by capturing a script log by using the Microsoft Dynamics GP Script Debugger tool or taking advantage of the GP Power Tools manual or automatic logging capabilities.

global procedure: Startup

{ Script: Startup
  Created by Mariano Gomez, MVP
  This code is licensed under the Creative Commons 
  Attribution-NonCommercial-ShareAlike 4.0 International (CC BY-NC-SA 4.0) license.
}

local integer lresult;

pragma(disable warning LiteralStringUsed);
lresult = Trigger_RegisterProcedureByName(1235, "ME_Configurator_Write_Entire_Format", TRIGGER_AFTER_ORIGINAL, script Change_Safepay_File_Name);

if (lresult <> SY_NOERR) then
 warning "Trigger registration error for procedure ME_Configurator_Write_Entire_Format in dictionary SafePay (1235): script Change_Safepay_File_Name";
end if; 

pragma(enable warning LiteralStringUsed);

When ME_Configurator_Write_Entire_Format procedure executes, our script processing procedure, Change_Safepay_File_Name, will fire in response, after the original Safe Pay script is executed. This script makes extensive use of standard Microsoft Dynamics GP global functions, which demonstrates how your integrating code can become really compact if you do some upfront work to understand what's available to you in the core Dynamics.dic dictionary file.

global procedure: Change_Safepay_File_Name

{ Script: Change_Safepay_File_Name
  Created by Mariano Gomez, MVP
  This code is licensed under the Creative Commons
  Attribution-NonCommercial-ShareAlike 4.0 International (CC BY-NC-SA 4.0) license.
}
in string FormatId;
in string SafepayId;
in string UploadId;
in string FilePathName;
out boolean WroteAll;


local boolean  fExists;
local string  sPath, sFileName, sGeneric;
local string  sTimeStamp;
local integer  nStatus;


sGeneric = Path_MakeGeneric(FilePathName);

{ @@@. Verify the generated Safe Pay file exists and can be opened. }
fExists = File_Probe(sGeneric);

if (fExists) then
 sPath = Path_ParsePathFromPath(FilePathName);
 sFileName = Path_ParseFileFromPath(FilePathName);

 { @@@. Get the current time stamp as a string in YYYYMMDD format. We retrieve the time from the SQL Server. 
   We also take advantage of the RW_Pad Report Writer function to add the leading zero to single-digit months and days
 }
 sTimeStamp = str(year(sysdate(CURRENT_SERVER))) 
   + RW_Pad(str(month(sysdate(CURRENT_SERVER))), LEADING, ZERO_STR, 2) 
   + RW_Pad(str(day(sysdate(CURRENT_SERVER))), LEADING, ZERO_STR, 2);
 
 { @@@. Call the Microsoft Dynamics GP global function FileName_AppendToName to append the string time stamp }
 sFileName = FileName_AppendToName(sFileName, CH_UNDERSCORE + sTimeStamp);
 
 { @@@. Call the Microsoft Dynamics GP global function File_Rename to rename the Safe Pay file with the new name }
 nStatus = File_Rename(FilePathName, sPath + sFileName, true); 
 
end if;

Note that our script processing procedure uses the same parameter list as the ME_Configurator_Write_Entire_Format Safe Pay procedure. This allows us access to the file path and name as configured in Safe Pay.
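If you want to reason about the naming logic outside of Dexterity, here is a rough Python equivalent of what the script does. This is a sketch only: the function name and paths are mine, and the sanScript helpers FileName_AppendToName and RW_Pad are approximated with pathlib and strftime (which zero-pads month and day for us):

```python
from datetime import date
from pathlib import PureWindowsPath

def append_timestamp(file_path: str, today: date) -> str:
    """Insert _YYYYMMDD before the file extension, mirroring the
    FileName_AppendToName call plus the RW_Pad zero-padding in the script."""
    p = PureWindowsPath(file_path)
    stamp = today.strftime("%Y%m%d")  # strftime zero-pads month and day
    return str(p.with_name(p.stem + "_" + stamp + p.suffix))

# Example: a file name as it might be configured in Bank Link Maintenance
print(append_timestamp(r"C:\SafePay\MMM29-FIRSTBANK.txt", date(2017, 7, 14)))
# C:\SafePay\MMM29-FIRSTBANK_20170714.txt
```

The same idea carries over directly: parse the path and file name apart, build the stamp from the current date, splice it in before the extension, and rename the file.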

You can download the extracted dictionary and chunk file here. Feel free to customize the code as you see fit.

Until next post!

MG.-
Mariano Gomez, MVP

Friday, April 14, 2017

#FabrikamDay: blast from the past

Well, you thought #FabrikamDay was over? Not really, although this will be my last post! You see, there was a time when Microsoft Dynamics GP had its own TV commercial, themed "More and Less". Rumor had it that Pam Misialek wanted a narrator with an everyday American accent, but Errol Schoenfish had his way with getting someone to put on a British accent.

More and Less advert


Other interesting facts I could research: the director of photography for this advert is Tristan Whitman under Wyldflower Filmwerks, Inc. This ad was released to cable and satellite channels around the March of 2009 time frame, just in time for Microsoft Dynamics Convergence 2009 (I started attending Convergence, uninterrupted, the following year). By then, the Smart Tag functionality was available with Microsoft Dynamics GP 10.0, but enhanced further with GP 2010. You can appreciate the Smart Tag flashes in the video.

Tristan Whitman at the camera

Until next post,

MG.-
Mariano Gomez, MVP

Thursday, April 13, 2017

#FabrikamDay: the holiday that wasn't

Yesterday was, unofficially, a holiday. That's right! The date did not show up on any local or international calendar, and it wasn't part of any federal or state registry. Frankly, the world did another spin on its axis without skipping a beat and life for billions of people on this planet went on without a hitch - OK, I'm not trivializing what's going on around some problem spots in the world, but I digress. Not so for a few hundred in the Microsoft Dynamics GP community of customers, partners, and independent software vendors (ISVs). We celebrated our own 'holiday' of sorts: #FabrikamDay.

Breaking it all down

To understand what Fabrikam Day is, you first have to ask yourself, "What is Fabrikam?". Fabrikam (and more precisely, Fabrikam, Inc. or Fabrikam, Ltd, formerly The World Online) is the sample company used for demos, development, and testing in Microsoft Dynamics GP. The sample company provides sample data for all core and extended modules, including Field Services, Manufacturing, Project Accounting, and Human Resources/Payroll.

NOTE: Other Microsoft products use the sample company name Fabrikam for demo purposes. Click to obtain a list of fictional Microsoft companies.

For the longest time now, the sample company transaction data has been based on a fictitious date of April 12, 2017. Each time you log into Microsoft Dynamics GP and select the sample company, by default you get the following warning dialog:



In the early 2000s, a first change was made to convert the sample company data to a fictitious date of April 12, 2007. When 2007 came along, the date was moved forward once more, 10 years ahead. If you are a veteran in the channel, I'm sure many of you thought, like I did, that you would probably be doing something different when 2017 came around. At the time, it seemed so far removed from anyone's imagination. You can read more details and technicalities behind the sample company date in this very detailed article by my friend and fellow MVP, David Musgrave.

#FabrikamDay is today or is it? 

Where did the idea of a #FabrikamDay come from?

Back at GPUG Amplify 2016 in Anaheim, CA, a group of us (Pam Misialek, Bob McAdam, Chris Dobkins, Amber Bell) started toying around with the idea of putting together an event around April 12, 2017 to commemorate the advent of the sample date. Back then, the original intent was to have a one-day costume party celebration close to or around one of the major events this year. Given the logistical challenges (and possible cost), the idea never took off.

Fast forward to the beginning of March 2017, when Amber Bell with Training Dynamo released the first blog post titled Fabrikam Day, in which she describes the sample company date and how Fabrikam has helped her over the years to train her customers.

In mid-March, GPUG Amplify featured a number of "Fabrikam-themed" badges for attendees' name tags, hoping to draw attention to the upcoming date.

GPUG Amplify badges for name tag
By the latter part of March 2017, Chris Dobkins and Melissa Sandrovich over at Njevity picked up where Amplify left off, determined not to let the date go by without doing something. Chris and Melissa posted their first #FabrikamDay picture, requesting the community to follow up with pictures of their own.

Chris Dobkins and Melissa Sandrovich at Coors Field in Denver, Colorado
Ever since that first picture by Chris and Melissa, the community began responding, slowly but surely, to the request. According to Hashtracking.com, we had just about 479 original tweets and 924 retweets that included the #FabrikamDay hashtag, with about 1.7M impressions, reaching over 300 thousand people.


It was now April 12, 2017 (yesterday). The day started with a barrage of tweets alluding to #FabrikamDay and ended with a 4:00 PM EDT online meeting hosted by Pam Misialek, Product Marketing Manager at Microsoft, who did a quick run down memory lane showing some of the nostalgia and accomplishments over the years. At 4:12 PM EDT we did a "ball dropping" of sorts, with a countdown to welcome #FabrikamDay. Those of us in attendance listened to stories, saw old-time videos, and played a little trivia game to gauge our knowledge of Microsoft Dynamics GP history.

To say I have attended a few events over the years would be quite an understatement, but #FabrikamDay reminded me of the importance of community and the heart and passion to improve the lives of customers and partners. I hope to continue working for this community and do my part to make it stronger. You do the same!

Now, some posts from around the globe:

David Musgrave (Winthrop Development Consultants) 
#FabrikamDay How to quickly update the sample data to 2027
#FabrikamDay is today or is it?

Amber Bell (Training Dynamo)
- Fabrikam Day

Chris Dobkins (Njevitytogo)
Attention Dynamics GP Customers and Partners: We Need Your Help to celebrate Fabrikam Day!

Mark Polino (DynamicAccounting)
- Happy #FabrikamDay

Rhonda Sutliff (Rockton Software)
- Happy Fabrikam Day!

Until next post,

MG.-
Mariano Gomez, MVP