Tuesday, September 26, 2017

Microsoft Dynamics GP Security and Audit Field Manual

My friends, MVP Mark Polino (@mpolino) and Andy Snook (@snookgofast), both members of the Fastpath team, have just released a comprehensive security book titled Microsoft Dynamics GP Security and Audit Field Manual.

The book can be found in printed and Kindle formats on Amazon.com, and I encourage you to get a copy, read up, and put it into practice, as this book goes beyond the boring task of assigning security to windows and reports just to prevent someone from accessing some area of the application, and into the realms of compliance, separation of duties, and audit controls.


Finally, I want to take the opportunity to thank both Mark and Andy for extending me an invitation to write the foreword to their book. 

Until next post!

MG.-
Mariano Gomez, MVP

#DevOps Series: Building Dexterity Applications with Visual Studio Team Services - Summary





My DevOps series has concluded, although I believe this will not be the last time I write about this subject. DevOps is here to stay, and the tools and technologies to support development teams only keep getting better.

The following is a list of the topics I covered in the series, and I encourage you to add your comments to the comment section of the posts that caught your attention. Let me know what you are doing today and how you plan to incorporate DevOps into your development operations.

July 17 - #DevOps Series: Microsoft Dexterity Source Code Control with Visual Studio Team Services

July 17 - #DevOps Series: Upgrading Microsoft Dexterity VSS and TFS repositories to Visual Studio Team Services - Part 1/2

July 19 - #DevOps Series: Upgrading Microsoft Dexterity VSS and TFS repositories to Visual Studio Team Services - Part 2/2

Aug 01 - #DevOps Series: Building Dexterity Applications with Visual Studio Team Services Part 1/3

Aug 16 - #DevOps Series: Building Dexterity Applications with Visual Studio Team Services Part 2/3

Sep 25 - #DevOps Series: Building Dexterity Applications with Visual Studio Team Services Part 3/3


I also prepared this video, which I originally intended to add to the previous article, but I am really glad I saved it for the summary post. Please be sure to check it out (best viewed in full screen mode).



The Helper.ps1 script, containing the library of functions used by the PowerShell scripts found in the previous article, can be downloaded from my OneDrive public share, here. These scripts are constantly being updated as we evolve our Build-Engine, so check back for additional updates.

Until next post!

MG.-
Mariano Gomez, MVP

Monday, September 25, 2017

#DevOps Series: Building Dexterity Applications with Visual Studio Team Services Part 3/3

In Part 2 of this series, I covered how to set up a Build Definition for our Build-Engine project. I also began showing the steps required by the Build-Engine definition to take your development project from the dictionary to an actual set of extracted dictionaries and chunk files that can be delivered to your QA team.

NOTE: The same process can be used to take your dictionaries from QA to release on your download site.

The first step, as shown before, is to determine the source for the Build-Engine process. We said that we would use the Build-Engine project itself as the source for the Build process, since it contains all our Dexterity (and Dexterity Utilities) files, PowerShell scripts, and macros to make it all happen.


1. Following the selection of the source, our first task is to create the necessary folders to host the various files. This task uses an inline PowerShell script to do so:

CreateFolders (inline PS script)
$folders = @("Build", "Source", "Logs", "Generic", "Temp")  # Create these folders

foreach($item in $folders)
{
    mkdir "$(Get-Location)\$($item)\" -ErrorAction SilentlyContinue | Out-NULL
}

The task creates the following folders:

Build: stores chunk files with no source code

Source: stores the extracted dictionaries and chunk files

Logs: stores all the log files generated by Dexterity Utilities in the process of extracting and chunking dictionaries

Generic: stores the downloaded Dexterity project repository files

Temp: stores any additional component needed throughout the Build process

2. Once we have set up the needed folders, we can proceed to retrieve our Dexterity project from the VSTS repository. For this purpose, we set up a task that runs our Get_VSTS.ps1 PowerShell script.

Get_VSTS.ps1
Param(
    [string]$SingleModule = "",
    [int]$BuildNumber,
    [string]$VSTSUser,
    [string]$VSTSUserPAToken, 
    [switch]$TestStructure
)

. "$(Get-Location)\Scripts\Helper.ps1"

$modules = Get-ModuleData -Module $SingleModule
if ($modules.Status -ne 0) { 
    Write-Host "Invalid Module : $($SingleModule)" -ForegroundColor Red
    exit 
}

$sourceModule = $modules.SourceFolder
Write-Host "Pulling Module : $($modules.Selected)" -ForegroundColor Green 

# ==============================
# Retrieve source files to pull.
# ==============================
$baseWebFolder = "$/MICR/Base/2/2015B$($BuildNumber)/"
$SourceCodeFolder = "$($baseWebFolder)/$($sourceModule)" # Where to pull from.
$genericFolder = "$(Get-Location)\Generic\" # Where to push files to.

$scopePath_Escaped = [uri]::EscapeDataString($SourceCodeFolder) # Need to have this in 'escaped' form.

Write-Host "`tfrom $($SourceCodeFolder)`n`tinto $($genericFolder)`n"

$recursion = 'Full' # OneLevel or Full
#$recursion = 'OneLevel' # or Full
 
# Base64-encodes the Personal Access Token (PAT) appropriately
$base64AuthInfo = [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes(("{0}:{1}" -f $VSTSUser, $VSTSUserPAToken)))
 
# Construct the REST URL to obtain the MetaData for the folders / files.
$uri = "https://somedomain.visualstudio.com/DefaultCollection/_apis/tfvc/items?scopePath=$($scopePath_Escaped)&recursionLevel=$($recursion)&api-version=2.2"

# Invoke the REST call and capture the results
$result = $null
$result = Invoke-RestMethod -Uri $uri -Method Get -ContentType "application/json" -Headers @{Authorization=("Basic $($base64AuthInfo)")}

# This call returns the METADATA for the folder and files. No File contents are included.
if ($result.count -eq 0)
{
     throw "Unable to locate code at $($SourceCodeFolder)"
}
$result

# ==============================================
# Create folder structure and sort file objects.
# ==============================================
$script:startTime = Get-Date
$sortedFiles = New-Object 'System.Collections.Generic.SortedDictionary[string, string]'

$_removeLength = $baseWebFolder.Length
for($index=0; $index -lt $result.count; $index++)
{
#    $_path = $result.value[$index].path.substring($_removeLength)
    $_path = "$($genericFolder)$($result.value[$index].path.substring($_removeLength))" -replace "/", "\"
    if ($result.value[$index].isFolder -eq $true) 
    { 
        Write-Host "`t$($_path)" # -BackgroundColor Blue -ForegroundColor Yellow
        New-Item -Force -ItemType directory -Path $_path  | Out-Null
    }
    else
    {
        $sortedFiles[$_path] = $result.value[$index].url
    }
}

# =======================================================
# Create a runspace pool where $maxConcurrentJobs is the 
# maximum number of runspaces allowed to run concurrently    
# =======================================================
$script:maxConcurrentJobs = 10
$script:asyncObj = $null
$Runspace = [runspacefactory]::CreateRunspacePool(1,$script:maxConcurrentJobs)

# Open the runspace pool (very important)
$Runspace.Open()

#$script:Authorization = @{Authorization=("Basic {0}" -f $base64AuthInfo)}
$script:Authorization = @{Authorization=("Basic $($base64AuthInfo)")}
$SortedFiles.GetEnumerator() | foreach {
    # Create a new PowerShell instance and tell it to execute in our runspace pool
    $ps = [powershell]::Create()
    $ps.RunspacePool = $Runspace

    # Base command to 'BeginInvoke'
    # Invoke-RestMethod -Uri $using:remote -Method Get -ContentType "application/json" -Headers @{Authorization=("Basic {0}" -f $using:base64AuthInfo)} -OutFile $using:local 

    [void]$ps.AddCommand("Invoke-RestMethod")
    [void]$ps.AddParameter("OutFile",$_.Key)
    [void]$ps.AddParameter("Uri",$_.value)
    [void]$ps.AddParameter("Method","Get")
    [void]$ps.AddParameter("ContentType", "application/json")
    [void]$ps.AddParameter("Headers", $script:Authorization)

    # Begin execution asynchronously (returns immediately)
    $script:asyncObj = $ps.BeginInvoke() 
}

# ==========================================
## Run the parallel processes to completion.
# ==========================================
if ($script:asyncObj -ne $null) {
    Write-Host "Pulling $($SortedFiles.Count) code files..."
    while ($script:asyncObj.IsCompleted -eq $false) {}
    Write-Host "`tTime elapsed to pull code: $((Get-Date) - $($script:startTime))"
}

# ================================================
## Change MPP to MMM. Simplifies later processing.
# ================================================
Push-Location $($genericFolder) #Generic 
if (Test-Path "$($genericFolder)\MPP" -PathType Any)
{
    Remove-Item -Path "MMM" -Recurse -ErrorAction SilentlyContinue | Out-Null ## Remove any previous MMM code.
    Rename-Item -path "MPP" -NewName "MMM" 
}
Pop-Location ## Back to where the code was.

This script accepts 5 parameters: the module code (we currently support 5 products), which is validated to prevent an empty parameter from being passed (passing in "All" builds all products); the build number; the repository user and personal access token; and a switch to test the folder structure once it's created. These parameters are passed in by the actual Build definition step.
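
For reference, a Build step invocation of this script would look roughly like the following; the values are placeholders, and I am assuming Get_VSTS.ps1 sits in the Scripts folder alongside Helper.ps1:

# Hypothetical invocation (placeholder values); assumes the script lives in the Scripts folder
.\Scripts\Get_VSTS.ps1 -SingleModule "MICR" -BuildNumber 160 `
    -VSTSUser "youraccount@somedomain.com" -VSTSUserPAToken "<your-PAT>"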

To retrieve the source files, we construct the service URI and also determine where the files will be deposited once retrieved, which is done by setting up a relative path to the Generic folder we created in step 1.

Once we connect to the service, we begin retrieving the files by using a for() control structure. There are some other steps that are only relevant to the environment for which this Build process has been designed.

3. Upon retrieving the files from the Dexterity project repository, we are now ready to set up the module environment variables and compile the dictionaries, in preparation for the extraction and chunking process.

SetupModuleEnvironment.ps1
Param(
   [int] $VersionNumber,
   [int] $BuildNumber = "000",
   [int] $SubBuildNumber = "0",
   [string] $SingleModule = $null
)

. "$(Get-Location)\Scripts\Helper.ps1"

# Create the folder structures
$folders = Create-FoldersCommands -Version $VersionNumber -Module $SingleModule
Write-Host "Creating Folders:"
foreach ($item in $folders) {
    Write-Host "`t$($item.Folder)"
    mkdir $item.Folder -ErrorAction SilentlyContinue | Out-Null
}


# Copy the files, with replacement of text in text files.
# Ensure the files are saved as 'ASCII'.
$files = Copy-FilesCommands -Version $VersionNumber -Module $SingleModule -BuildNumber $BuildNumber -SubBuildNumber $SubBuildNumber
Write-Host "Copying Files:"
foreach ($item in $files) {
    Write-Host "`tFrom`t$($item.From)"
    Write-Host "`tTo`t`t$($item.To)"
 copy $($item.From) $($item.To)
    Set-ItemProperty $item.To IsReadOnly -value $false

    if ($item.Replacements -ne $null){
        $item.Replacements.psobject.properties | 
        foreach { 
            $_name = "%$($_.name)%"  # The name of the parameter is the text to be replaced, surrounded by '%'
            $_value = "$($_.value)"

            (Get-Content $item.To) -replace $_name,$_value | Set-Content $item.To -Encoding Ascii 
        }
    }
}

Of particular importance is the fact that we use a PowerShell helper script (Helper.ps1), which contains a number of functions that capitalize on the parameters passed here. The general idea, nonetheless, is twofold: make a number of replacements within the macros that assign product information and build numbers, taking into account the version of Microsoft Dynamics GP for which we will be creating the chunks; and create the shortcuts for Dexterity and Dexterity Utilities to compile and extract the dictionaries, using the proper dictionaries and macros that will run when the Dex platform executables are launched.

In the closing post, summarizing all the articles within this series, I will attach a copy of the Helper.ps1 script.
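
To give a rough idea of what one of these helpers might look like, here is a minimal sketch of a Get-ModuleData-style function. The module list and implementation are illustrative assumptions; only the returned properties (Status, Selected, SourceFolder) mirror what Get_VSTS.ps1 consumes:

function Get-ModuleData {
    Param([string]$Module)
    # Hypothetical module list; the real Helper.ps1 drives this from our product catalog
    $validModules = @('All', 'MICR', 'MICRJ', 'MMM', 'MEP', 'VPS')
    if ($validModules -contains $Module) {
        return [pscustomobject]@{ Status = 0; Selected = $Module; SourceFolder = $Module }
    }
    # A non-zero Status signals an invalid (or empty) module to the caller
    return [pscustomobject]@{ Status = 1; Selected = $Module; SourceFolder = '' }
}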

4. Upon making these replacements and compiling the dictionaries, we can then proceed to extract and chunk our dictionaries.

ChunkDictionaries.ps1
Param(
   [int] $VersionNumber,
   [string] $SingleModule = $null
)
. "$(Get-Location)\Scripts\Helper.ps1"

$EXEx = Create-ExecutableCommands -Version $VersionNumber -Module $SingleModule
Write-Host "Building..."
foreach ($item in $EXEx) {
    Write-Host "$($item.Version)`t$($item.Module)`t$($item.Message)`t" -NoNewline
    Write-Host "`n`t$($item.Executable) : $($item.Timeout) seconds Max.`n`t$($item.Dictionary)`n`t$($item.Macro)`t"

<##>
    # keep track of timeout event
    $timeouted = $null # reset any previously set timeout
    $proc = Start-Process -filePath $item.Executable -ArgumentList @($item.Dictionary, $item.Macro)  -PassThru
    # wait up to x seconds for normal termination
    $proc | Wait-Process -Timeout $item.Timeout -ea 0 -ev timeouted
<##>
    $msg = "Finished."

    if ($timeouted)
    {
        # terminate the process
        $msg = "Time Out!!"
        $proc | Stop-Process
    }
    elseif ($proc.ExitCode -ne 0)
    {
        # update internal error counter
        $msg = "Error: $($proc.ExitCode)."
    }
    
    Write-Host "`t$($msg)"
}

Once again, this script takes advantage of the PowerShell helper script library to extract the source code from the development dictionaries and auto-chunk the extracted dictionaries. Note that this script takes in the version of GP to determine the proper version of Dexterity and Dexterity Utilities to launch. This process is completed twice: once for chunks with source code (the Remove Unused Blocks option in Dexterity Utilities' Auto-Chunk window) and once for chunks without source code (Total Compression). The source chunks are moved to the Source folder on the Build agent, and the object chunks are moved to the Build folder on the Build agent.
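
For context, each object returned by Create-ExecutableCommands describes one Dexterity Utilities run. A sketch of the shape, with made-up values, would be:

# Hypothetical shape of one command object returned by Create-ExecutableCommands (made-up values)
[pscustomobject]@{
    Version    = 2016                        # GP version; picks the matching Dex IDE folder
    Module     = 'MEP'
    Message    = 'Extract and auto-chunk'
    Executable = '.\DEX16\DexUtils.exe'
    Dictionary = '.\z_MEP\2016\DynMEP.dic'
    Macro      = '.\z_MEP\2016\MEP_Build.mac'
    Timeout    = 600                         # seconds to wait before killing the run
}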

NOTE: The Source and Build folders are created by the CreateFolders inline PowerShell script in task 1 above.

5. Upon finalizing the extraction and chunking process, we move the chunk files with no source code (Total Compression chunks) to the Build sub-folder in the artifacts directory. The artifacts directory is where all resulting files are stored once the process itself is complete.

Copy Build Artifacts step
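
If you prefer script to screenshots, the copy boils down to something like the following sketch; Build.ArtifactStagingDirectory is the predefined VSTS variable pointing at the artifacts directory, and the Build folder comes from our CreateFolders step:

# Roughly what the Copy Files task does for the object chunks (sketch)
$staging = "$env:BUILD_ARTIFACTSTAGINGDIRECTORY\Build"
New-Item -ItemType Directory -Force -Path $staging | Out-Null
Copy-Item -Path ".\Build\*" -Destination $staging -Recurse -Force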

6. Then we move the chunks and extracted dictionaries with source code to the Source sub-folder in the artifacts directory.

Copy Source Artifacts
The following Microsoft Docs article talks about Artifacts in Release Management in more detail.

7. Finally, since the Build Agent is volatile, you will need to move the artifacts off the agent and onto a permanent storage location, whether that's on the VSTS servers or a local folder. This is accomplished by publishing the artifacts.
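
Incidentally, if you ever need to publish from a script step instead of the built-in task, VSTS supports a logging command that achieves the same result; the artifact name below is arbitrary:

# Script-step equivalent of the Publish Build Artifacts task (sketch)
Write-Host "##vso[artifact.upload artifactname=Chunks]$env:BUILD_ARTIFACTSTAGINGDIRECTORY"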

Publish Build Artifacts

My final article in this series will summarize the series and provide links to all previous articles, along with providing a link to the Helper.ps1 PowerShell library.

Until next post!

MG.-
Mariano Gomez, MVP

Wednesday, August 16, 2017

#DevOps Series: Building Dexterity Applications with Visual Studio Team Services Part 2/3

I am so amped-up after my return from the Microsoft Dynamics GP Technical Conference 2017 in Fargo, ND, where I had a chance to catch up with my friends in the partner and ISV community (more on that in a later post). This year, I had a chance to introduce the topics I have been discussing here in my DevOps series and now that I am back, I want to continue writing about the subject as it gets more and more exciting.

In Part 1 of this specific chapter within the series, I talked about building the actual Build-Engine project. If you remember, I specifically said that the build templates provided by Visual Studio Team Services (VSTS) do not fit the bill for Dexterity projects. Dex projects tend to be a bit more cumbersome, since we need to have the entire IDE around to compile, extract, and chunk our products. So, it's best if we can isolate these components into an altogether separate project (from that of our actual Dex product) for clarity's sake and to maintain our own sanity.


Creating a Build Definition for your Build-Engine Project

Now that we have the Build-Engine project in place, we can proceed to setup a Build Definition. The Build Definition is going to encompass all of the steps required to do things like:

1. Download the resources from our Build-Engine project (Dex IDEs, clean dictionaries, PowerShell scripts, macro templates, etc.)

2. Setup any folders needed to support the build process and temporarily store files, etc.

3. Pull the source code from our Dexterity project repository

4. Setup all environment variables

5. Extract dictionaries and create chunk files with source code (remove unused blocks) and without source code (total compression).

6. Copy the chunks without source code into an artifact folder

7. Copy chunks and source dictionaries for debugging into an artifact folder

8. Publish the artifact folder

To create a new build for our Build-Engine project:

1. Click on Build & Release, then click the New button.


2. Select an empty template. Dexterity projects clearly do not conform to any of the existing, pre-defined molds.


3. Click the Apply button to continue.

4. You can now enter the name of your Build-Engine and select from a list of 4 agent queues: Default, Hosted, Hosted Linux Preview, or Hosted VS2017. For all intents and purposes, the hosted agent pools run in the cloud, while the Default queue uses agents you run yourself. For more information on Hosted Agents, click here. These options define where the Build process itself will run.


The best-suited option for our Dexterity Build-Engine is Hosted.

5. On the left pane, we can now click on the first task, Get Sources, to identify where the resources for our Build-Engine will come from. In this case, they will come from our Build-Engine project itself, which contains the Dexterity IDEs for versions 12 (GP 2013), 14 (GP 2015), and 16 (GP 2016). All other options are defaulted and really not required to be changed.



This completes the first step for today (downloading the resources for our Build-Engine). You can click Save & Queue to test that all files download properly for the build agent pool.

video


NOTE: My agent failed in the video as I ran out of allocated build minutes for the month. You will need to assess the length of your build process and ensure you plan accordingly. For more information on Team Services pricing, click here.

I strongly encourage you to read MVP David Musgrave's series on Building a Dexterity Development environment, because all principles used in that series are still applicable in our cloud Build-Engine.

Until next post!

MG.-
Mariano Gomez, MVP

Tuesday, August 1, 2017

#DevOps Series: Building Dexterity Applications with Visual Studio Team Services Part 1/3

Resuming my series, I wanted to touch on the process of building and releasing Dexterity applications with Visual Studio Team Services. My friend and fellow MVP, David Musgrave, explained how to set up your Dexterity development environment in his Dexterity Development series, and the first article in this series directly addressed the source code control process. Although David showed some really clever methods to build and package your Dexterity chunk application, that process still has some downsides to it. Primarily, the process is dependent on a person and a physical machine dedicated to executing the process.

Setting up a Build Engine

When creating a self-contained cloud-based VSTS Build Engine for your Dexterity projects, there are a few considerations:

1. The actual Dexterity IDE. If you are building a Dexterity chunk, you need to at least have a copy of the Dexterity IDE, because you will want to compile your project dictionary prior to chunking, and the chunking process itself still relies on Dexterity Utilities to be able to produce a chunk file.

In addition, you will need as many IDEs as versions of Dynamics GP you are supporting with your product. David describes this well in Part 2 of his series.

2. You need clean (base) dictionaries for each Dynamics GP version you will be supporting. This is particularly important, as you will want to pull the source code from the repository into clean dictionaries to compile them, obtain your extracted dictionaries, and finally complete the auto-chunk process.

3. Since your build process will ultimately be automated, you need macros to inject constants and compile the dictionary, based on the version of Dynamics GP your product will be supporting. You will also need macros to extract and auto-chunk your product.

NOTE: You can inject constants into your development dictionary by having a Dexterity constants resource file. Your macro will need to have steps in place to import the constants resource file.

4. You will also need scripts to drive a lot of the processes above; i.e., if you are launching Dexterity with a dictionary and a macro as parameters to execute an automated task, this needs to be done by some task or step that supports this process. Anything scripting-related is preferably done with PowerShell, since it offers far greater levels of automation than standard DOS batch files.
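
At its core, that automation is just a couple of lines of PowerShell. Here is a minimal sketch, with illustrative paths, of launching Dexterity against a dictionary with a macro and waiting for it to finish:

# Minimal sketch: launch Dexterity with a dictionary and a macro, wait up to 10 minutes
# (paths are illustrative; DEX16 would be the IDE folder for GP 2016)
$proc = Start-Process -FilePath ".\DEX16\Dex.exe" `
    -ArgumentList @(".\z_MEP\2016\DynMEP.dic", ".\z_MEP\2016\Install_Code.mac") -PassThru
$proc | Wait-Process -Timeout 600 -ErrorAction SilentlyContinue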

Now that you understand all the challenges, you can quickly guess that the best way to achieve this in Visual Studio Team Services is by setting up a Build Engine project with all the artifacts needed to automate the process.

The following shows the folder structure of a Build Engine project in VSTS:

Build Engine project structure
An explanation is as follows:

1. The DEX12, DEX14, and DEX16 folders contain full copies of the Dexterity IDE files, less things like samples, help files, and whatnot, which are not needed as they will never be accessed.

2. The Logs folder will store any log produced by the macros being executed

3. The Macros folder includes all macros that we will need to compile and extract our code into chunk files.

If your macros refer to files in any of the folders in your Build Engine, these references need to be relative to the root of your project structure. In addition, it's easier to inject variables for things like the log paths, the dictionaries, version numbers, etc. It's considered best practice to not hard-code any of these elements in your macro files, to allow for reusability and customization. The following is an example of a chunking macro file:



# DEXVERSION=%DexMajorVersion%
Logging file '%LogPath%vDexUtilsMEP.log'
# ================================================================================
  MenuSelect title File entry 'Open Source Dictionary...' 
# ================================================================================
  FileOpen file '%ModulePath%/DYNMEP.dic' type 0 
# ================================================================================
ActivateWindow dictionary 'default'  form 'Main Menu' window 'DexUtils Toolbar' 
  MoveTo field 'Toolbar Utilities Button' item 0 
  ClickHit field 'Toolbar Utilities Button' item 6  # 'Extract' 
NewActiveWin dictionary 'default'  form Extractor window Extractor 
ActivateWindow dictionary 'default'  form Extractor window Extractor 
  ClickHit field '(L) Extract Button' 
# ================================================================================
  FileSave file '%ChunkDestination%/%VersionNumber%/MEP7156E.dic' 
# ================================================================================
NewActiveWin dictionary 'default'  form Extractor window Extractor 
NewActiveWin dictionary 'default'  form Extractor window Extractor 
  MenuSelect title File entry 'Close Source Dictionary' 
  MenuSelect title File entry 'Open Editable Dictionary...' 
# ================================================================================
  FileOpen file '%ChunkDestination%/%VersionNumber%/MEP7156E.dic' type 0 
# ================================================================================
ActivateWindow dictionary 'default'  form 'Main Menu' window 'DexUtils Toolbar' 
  ClickHit field 'Toolbar Utilities Button' item 9  # 'Auto-Chunk' 
NewActiveWin dictionary 'default'  form 'Auto Chunk' window 'Auto Chunk' 
ActivateWindow dictionary 'default'  form 'Auto Chunk' window 'Auto Chunk' 
  ClickHit field 'Lookup Button 3' 
# ================================================================================
  FileSave file '%ChunkDestination%/%VersionNumber%/MEP7156.cnk'
# ==================================================================================
  MoveTo field 'Dictionary Name' 
  TypeTo field 'Dictionary Name' , 'MEP7156.DIC'
  MoveTo field 'Dictionary Name' 
  MoveTo field '(L) Major Version' 
  TypeTo field '(L) Major Version' , '%DexMajorVersion%'
  MoveTo field '(L) Build Number' 
  MoveTo field '(L) Minor Version' 
  MoveTo field '(L) Build Number' 
  TypeTo field '(L) Build Number' , '%DexBuildVersion%'
  MoveTo field '(L) Major Version' 
  MoveTo field '(L) Build Number'  




You may be asking how these variables will be updated. Very simple: when we set up the Build process itself, we will create environment variables that can be injected into these macro files. You may also be asking how these variables got there to begin with. Follow the steps outlined in Part 4 of the Dexterity Development Environment series to create your build macro; once you have recorded the macro, you can edit it to set up the placeholders for environment variables.
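
Conceptually, the injection is just a chain of string replacements over the macro template; a minimal sketch with illustrative file names:

# Sketch: replace %placeholders% in a macro template and save an ASCII working copy
$localFolder = Get-Location
(Get-Content "$localFolder\Macros\Build_MEP_Source.mac") `
    -replace '%DexMajorVersion%', '16' `
    -replace '%LogPath%', "$localFolder\Logs\" |
    Set-Content "$localFolder\z_MEP\2016\MEP_Build.mac" -Encoding Ascii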

4. The Resources folder contains things like constants resource files, clean dictionaries, and a Dex.ini file for the Dynamics GP versions that will be supported by your integrating Dexterity application.

5. The Scripts folder contains all the PowerShell scripts required to automate some of the tasks. One such task is setting up the dictionaries for the different products you will be building, as shown in this PowerShell script:

Param(
   [int] $VersionNumber,
   [int] $BuildNumber = "000",
   [int] $SubBuildNumber = "0",
   [string] $SingleModule = $null
)

$dictionary=''
$DexMajorVersion = 0
$localFolder = Get-Location #"$([System.IO.path]::GetPathRoot($(Get-Location)))" 

enum BuildType { 
    Source = 0
    Build = 1
}


switch ($VersionNumber) {
    2013 { 
   $dictionary = "$localFolder\resources\Dyn2013.dic"
   $DexMajorVersion = 12
  }
    2015 { 
   $dictionary = "$localFolder\resources\Dyn2015.dic"
   $DexMajorVersion = 14

  }
    2016 { 
   $dictionary = "$localFolder\resources\Dyn2016.dic"
   $DexMajorVersion = 16
  }

    default { 
   Write-Output "Invalid Version Number" 
   } 
}

$DexterityEXE = ".\Dex$DexMajorVersion\Dex.EXE"

$offset = "z_"  # Set blank for live mode.

if ($dictionary -ne ''){
    # Ensure the Destination folders exist.
    [System.Enum]::GetValues([BuildType]) | foreach { 
        $_compileModeName = $_
        mkdir "$localFolder\$($offset)$($_compileModeName)" -ErrorAction SilentlyContinue | Out-Null
        mkdir "$localFolder\$($offset)$($_compileModeName)\$VersionNumber" -ErrorAction SilentlyContinue | Out-Null
    }


    $fileSet = @()
    $modules = @('MICR', 'MICRJ', 'MMM', 'MEP', 'VPS')

    if ($modules -contains  $SingleModule)    
    { $modules = @($SingleModule.ToUpper()); Write-Host $modules -ForegroundColor Green }
    else { Write-Host "All Modules" -ForegroundColor Green }

    foreach($mod in $modules) {
        $workFolder = "$localFolder\$($offset)$($mod)\$($VersionNumber)"
        $dataFolder = "$workFolder\Data"
        mkdir $workFolder -ErrorAction SilentlyContinue | Out-Null
        mkdir $dataFolder -ErrorAction SilentlyContinue | Out-Null

        ## Copy the Dynamics Dictionary.
        $destination_file = "$workFolder\Dyn$($mod).dic"
     copy $($dictionary) $($destination_file)
        Set-ItemProperty $destination_file IsReadOnly -value $false
        Write-Host "$destination_file created." -Backgroundcolor Green -ForegroundColor Black

        # Copy Install_Code macro.
     $original_file = "$localFolder\Macros\Install_Code.mac"
     $destination_file =  "$workFolder\Install_Code.mac"
        $fileSet += , @($original_file, $destination_file, $mod, $_compileModeName)


        [System.Enum]::GetValues([BuildType]) | foreach { 
            $_compileModeName = $_
            $_compileMode = [int]$_

      ## Copy Constants.constant file to module-specific version and replace variables.
         $original_file = "$localFolder\resources\Constants.constants"
         $destination_file =  "$workFolder\Const_$_compileModeName.constants"
            $fileSet += , @($original_file, $destination_file, $mod, $_compileModeName)


            ## Copy the Macro - Twice: Once for Build and once for Source.
         $original_file = "$localFolder\Macros\Build_$($mod)_Source.mac"
         $destination_file =  "$workFolder\$($mod)_$($_compileModeName).mac"
            $fileSet += , @($original_file, $destination_file, $mod, $_compileModeName)

            ## Copy the Install_Constants.mac - Twice: Once for Build and once for Source.
         $original_file = "$localFolder\Macros\Install_Constants.mac"
         $destination_file =  "$workFolder\Install_Constants_$_compileModeName.mac"
            $fileSet += , @($original_file, $destination_file, $mod, $_compileModeName)
        }
        

  ## Copy Dex.INI file to module-specific version and replace variables.
  $original_file = "$localFolder\resources\Dex.ini"
  $destination_file = "$dataFolder\Dex.ini"
        $fileSet += , @($original_file, $destination_file, $mod, $_compileModeName)

        ## After this point, $MOD for MMM is MPP.
        if ($mod -eq 'MMM') { $mod = 'MPP' }
    }

    foreach($item in $fileSet)
    {
        $originating = $item[0]
        $destination = $item[1]
        $mod = $item[2]
        $_compileModeName = $item[3]
        $_compileMode = [int]$item[3]

        $workFolder = "$localFolder\$($offset)$($mod)\$($VersionNumber)"
        $chunkDestinationFolder = "$localFolder\$($offset)$($_compileModeName)"

        if ($_compileModeName -eq [BuildType]::Build.ToString())
        {
            $ChunkTypeComment = ""
            $_subBuildMessage = ""
        }
        else 
        {
            $ChunkTypeComment = "# -- Source -- " 
            $_subBuildMessage = "Build $($BuildNumber).$($SubBuildNumber) ($($BuildNumber).$($SubBuildNumber).$(Get-Date -format yyyyMMdd.hhmmss))"
        }


     (Get-Content $originating) | Foreach-Object {
      $_ -replace '%CompileMode%', "$_compileMode" `
         -replace '%CompileModeName%', "$_compileModeName" `
         -replace '%ChunkDestination%', "$chunkDestinationFolder" `
         -replace '%ChunkTypeConstant%', "$_compileMode" `
         -replace '%DexMajorVersion%', "$DexMajorVersion" `
         -replace '%DexBuildVersion%', "$BuildNumber" `
         -replace '%DexSubBuildMessage%', "$_subBuildMessage" `
         -replace '%ModulePath%', "$workFolder" `
               -replace '%Module%', "$mod" `
         -replace '%GenericPath%', "$localFolder\Generic\" `
         -replace '%TempPath%', "$localFolder\Temp\" `
         -replace '%LogPath%', "$localFolder\Logs\" `
         -replace '%OriginatingPath%', "$localFolder\Resources\" `
               -replace '%ChunkTypeComment%', "$ChunkTypeComment" `
         -replace '%VersionNumber%', "$VersionNumber"
        } | Set-Content $destination

        Write-Host "$($destination) created." -Backgroundcolor DarkRed -ForegroundColor Yellow
    }

    
    $DexterityEXE = ".\Dex$DexMajorVersion\Dex.EXE"
    $DexUtilsEXE = ".\Dex$DexMajorVersion\DexUtils.EXE"

    foreach($mod in $modules) {
        $workFolder = "$localFolder\z_$mod\$($VersionNumber)"
        $dictionary = "$workFolder\Dyn$($mod).dic"
        $LoadMacro =  "$workFolder\Install_Code.mac"

        Start-Process -FilePath $DexterityEXE -ArgumentList @($dictionary, $LoadMacro) -PassThru -Wait -Verbose
##        $process = (Start-Process -FilePath $DexterityEXE -ArgumentList @($dictionary, $LoadMacro) -PassThru -Wait)
##        Write-Host "$($mod) code loaded. Status: " $process.ExitCode
    }
}

6. The Source folder will contain the code retrieved from the Visual Studio Team Services repository. You will need a script for this as well; the script will leverage the VSTS REST API to retrieve the code from the repo.

Param(
    [string]$SourceCodeFolder = "$/MICR/Base/2/2015B160", # Ex. 2015
    [string]$VSTSUser = "youraccount@somedomain.com",
    [string]$VSTSUserPAToken = "abcdefghijklmnopqrstuvwxyz0123456789",
    [switch]$TestStructure
)

$localFolder = Get-Location #"$([System.IO.path]::GetPathRoot($(Get-Location)))" 
$workFolder = "$localFolder\Generic\"

$scopePath_Escaped = [uri]::EscapeDataString($SourceCodeFolder) # Need to have this in 'escaped' form.

Write-Host "Pulling code from $($SourceCodeFolder) into $($workFolder)" -BackgroundColor DarkGreen

$recursion = 'Full' # OneLevel or Full
#$recursion = 'OneLevel' # or Full
 
# Base64-encodes the Personal Access Token (PAT) appropriately
$base64AuthInfo = [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes(("{0}:{1}" -f $VSTSUser,$VSTSUserPAToken)))
 
# Construct the REST URL to obtain the MetaData for the folders / files.
$uri = "https://yourcompany.visualstudio.com/DefaultCollection/_apis/tfvc/items?scopePath=$($scopePath_Escaped)&recursionLevel=$($recursion)&api-version=2.2"

# Invoke the REST call and capture the results
$result = $null
$result = Invoke-RestMethod -Uri $uri -Method Get -ContentType "application/json" -Headers @{Authorization=("Basic {0}" -f $base64AuthInfo)}

 
# This call returns the METADATA for the folder and files. No File contents are included.
if ($result.count -eq 0)
{
     throw "Unable to locate code at $($SourceCodeFolder)"
}

$scopePathLength = $SourceCodeFolder.Length + 1 # +1 to eliminate an odd prefixed '\' character.

# =======================
# Create folder structure.
# =======================
for($index=0; $index -lt $result.count; $index = $index + 1)
{
    if ($result.value[$index].isFolder -eq $true)
    {
        #Strip the VSTS path off of the folder, and prefix with the local folder.
        $_subPath = $result.value[$index].path
        if ($_subPath.Length -ge $scopePathLength)
        {
            #Strip the VSTS path off of the folder
            $_subPath = $result.value[$index].path.Substring($scopePathLength)

            #prefix with the local folder.
            #MGB: replace the forward slashes in the remaining VSTS path with backslashes
            $_subPath = "$($workFolder)$($_subPath)" -replace "/", "\"

            if ((Test-Path $_subPath) -ne $true)
            {
                New-Item -Force -ItemType directory -Path $_subPath | Out-Null
                Write-Host $_subPath -BackgroundColor red
            }
            else
            {
                Write-Host "$($_subPath)`t$($result.value[$index].path)" -BackgroundColor Green -ForegroundColor Black
            }
        }
    }
}

# ==============
# Retrieve Files 
# -TestStructure flag will show all folders/files that will be retrieved.
# ==============
for($index=0; $index -lt $result.count; $index = $index + 1)
{
    if ($result.value[$index].isFolder -ne $true)
    {
        #Strip the VSTS path off of the folder, and prefix with the local folder.
        $_subPath = $result.value[$index].path
        if ($_subPath.Length -ge $scopePathLength)
        {
            #Strip the VSTS path off of the folder
            $_subPath = $result.value[$index].path.Substring($scopePathLength)

            #prefix with the local folder.
            #MGB: replace the forward slashes in the remaining VSTS path with backslashes
            $_subPath = "$($workFolder)$($_subPath)" -replace "/", "\"

            ## Retrieve the file text.
            if ($TestStructure -eq $true)
            {
                $fileresult = $result.value[$index].url
            }
            else
            {
                $fileresult = Invoke-RestMethod -Uri $result.value[$index].url -Method Get -ContentType "application/json" -Headers @{Authorization=("Basic {0}" -f $base64AuthInfo)}
            }

            New-Item -Force -ItemType file -Path $_subPath -Value $fileresult | Out-Null
            Write-Host $_subPath -BackgroundColor Green
        }
    }
}

You can use the Visual Studio IDE to setup a project on VSTS and add the folders and files needed for your Build Engine project.

In summary, setting up a Build Engine project involves thinking about all the elements required to produce your chunk file: IDE, macros, and scripts that will drive the process. The complexity of each script will depend on the number of points you want to automate, variables and constants you want to inject, and certainly the number of chunks you need to produce for each version of Dynamics GP.

Tomorrow, I will explain the details involved in setting up the actual Build process and its parts.

Until next post!

MG.-
Mariano Gomez, MVP
IntellPartners, LLC
http://www.IntellPartners.com/

Wednesday, July 19, 2017

#DevOps Series: Upgrading Microsoft Dexterity VSS and TFS repositories to Visual Studio Team Services - Part 2/2

Continuing with our #DevOps series, today we will address the upgrade process from Team Foundation Server (TFS) to Visual Studio Team Services (VSTS). Yesterday, we addressed the upgrade from Visual SourceSafe (VSS) to VSTS - see #DevOps Series: Upgrading Microsoft Dexterity VSS and TFS repositories to Visual Studio Team Services - Part 1/2 - and saw all the important steps needed to ensure your Microsoft Dexterity repository is migrated properly. Likewise, you must observe a series of steps prior to moving your TFS repository to VSTS.



Background

One of the main questions I usually field around this topic is, "Why would I want to move from TFS to VSTS?" The truth is, there are a number of reasons you may want to consider: a) less server administration; b) a cloud solution gives you immediate access to the latest and greatest features; c) improved developer connectivity - personally, I love being able to be anywhere in the world and still have access to our repository, without having to establish a VPN connection to some server; and d) if you want to keep the finance people happy, just tell them that you are moving from a CapEx model (servers and hardware that need to be depreciated, with iffy tax deductions) to an OpEx model (subscriptions that are fully tax deductible).

Once you see the benefits, it will be easier to adopt VSTS. The next question is usually, "What are the differences between the two?" For a primer on this, and to keep the article tight, take a look at the following whitepaper:

Fundamental differences between TFS and Team Services


Migrating Microsoft Dexterity repositories from Team Foundation Server to Visual Studio Team Services

As with VSS, there are a few acceptable methods to migrate from TFS to VSTS, as follows:

1) Manually. You can copy the most important and, perhaps, the latest projects you are working on. When you are done, you can simply mark the TFS projects as read-only. Under this scenario, the assumption is that you will be leaving behind your old TFS server to maintain the history of all your old projects; but if you are getting rid of the server (which is the main reason to move to begin with), you may want to consider a different method.

2) Use the Database Migration tool. As you all know, I am a fan of tools that allow you to automate the process and minimize any risk associated with moving data from one place to another, especially over the internet. The Migration Guide is available here. However, this particular Microsoft tool was very young when we first looked at it and, as of the writing of this article, it was still in preview.

3) Use a third-party tool. Frankly, when we did our migration here at Mekorma, we tested a number of tools, but settled on the OpsHub Visual Studio Online Migration Utility, available for free from the Visual Studio Team Services gallery.

Visual Studio Online Migration Utility

OpsHub Visual Studio Online Migration Utility Free Version helps developers migrate the most commonly requested data from an on-premises Team Foundation Server to their Visual Studio Team Services account. It enables basic migration of the history of version control changesets, work items, and test cases.

The Free Utility is limited to migrating projects with fewer than 2500 revisions of work items and fewer than 2500 revisions of source control. The Free Utility offers very limited support through the community-supported Q&A forum, with no additional support included.

But rather than me trying to describe all the steps, I thought it would be best to embed the demonstration video here:


Tomorrow, I will show you how to leverage Visual Studio Team Services' Build process to extract and chunk your Dexterity applications.

Until next post!

MG.-
Mariano Gomez, MVP

Monday, July 17, 2017

#DevOps Series: Upgrading Microsoft Dexterity VSS and TFS repositories to Visual Studio Team Services - Part 1/2

Yesterday, we talked about #DevOps Series: Microsoft Dexterity source code control with Visual Studio Team Services. The article mainly focused on setting up your Team Services project repository for the first time, taking an existing development dictionary, prepping it, and checking the resources into the repository. But what if you already have a Visual SourceSafe (VSS) or Team Foundation Server (TFS) repository in place and you are just looking to move to VSTS?



Migrating Microsoft Dexterity repositories from Visual SourceSafe to Visual Studio Team Services

There are two acceptable methods to migrate your Microsoft Dexterity projects' VSS repository to VSTS: you can use the VSS Upgrade Wizard, or you can use the VSSUpgrade command-prompt tool. Now, I am a big fan of command-prompt tools, but this is one case where I would suggest you ditch them for the Wizard.

If you would like more information on the VSSUpgrade command-prompt tool, please click here.

Using the VSS Upgrade Wizard 

This is, by far, the method I recommend. The wizard provides step-by-step instructions, which makes the process of moving to VSTS a no-brainer. There are a few things you will need to do beforehand, though.

Preparing for the Upgrade

1.- First, if you are on a version prior to Visual SourceSafe 6.0, you will need to upgrade Visual SourceSafe to version 6, before you can attempt the upgrade. You can download Visual SourceSafe 6.0 here, but please note that this IS NOT an official Microsoft download site, hence, exercise due care when opening any files from an unknown location. Also note that Microsoft support for VSS ended in 2012 - that's right! You are on your own here.

2.- Next, you will need a SQL Server available to use as temporary storage for the upgrade process. Since you are already running Microsoft Dynamics GP on some SQL Server, you could probably create a separate instance where you can perform the upgrade. I wouldn't recommend using your production instance to do so.

NOTE: Although SQL Server Express Edition is probably fine for the upgrade, I do recommend you use at the very least SQL Server Standard Edition to prevent any migration issues due to database size limitations imposed by SQL Server Express Edition. If your repositories tend to be very large from years and years of coding (in our case 20 years!) you are probably better off with the Standard Edition of SQL Server.

3.- You will then need to check in all your Microsoft Dexterity project resources into your VSS repository and remove access to all repositories for all developers, but the (main) administrator.

4.- You will need to have already provisioned a Team Services account. Refer to the previous article in this series for a primer on this process. We found this out the hard way: make sure you create all project shells for your VSS projects before you conduct the upgrade, as the Upgrade tool will need this done in advance.

5.- Make a copy of your VSS database and work from the copy. Restore it onto the instance of SQL Server you created in step 2. Makes sense? Ok, let's move on. As usual, you will not want to expose yourself to some sort of data corruption, so please do not work with your original VSS databases in case something goes wrong. See How To Back Up a Visual SourceSafe Database for additional information on this process.

6.- Download and install the Visual SourceSafe Upgrade Tool for Team Foundation Server (and Visual Studio Team Services). You can get the tool here. You must install the tool on the same machine where you made the copy of your repository database.

7.- Run the VSS Analyze utility to ensure there are no inconsistencies with your VSS database that would prevent the upgrade from being successful. If Analyze produces any errors, you will need to repair the database prior to beginning the upgrade.

8.- For additional preparation steps, please refer to the following MSDN article, Prepare to upgrade from Visual SourceSafe.

Using the Wizard

1. Launch the tool downloaded in Step 6 above. Go to Start and run the VSS Upgrade Wizard.

2. On the Visual SourceSafe Repository page, specify the repository, and if necessary, the Admin password.

Visual SourceSafe Repository page
3. To display the projects in your VSS repository, choose the List Available Projects link. Select the projects you want to upgrade.

List Available Projects
4. Select the check box at the bottom of the page to confirm you have run Analyze. See Step 7 above. Choose Next to proceed.

5. On the Team Project page, choose Browse, and then use the Select a Team Project for Migration dialog box to specify the team project into which you want to port the upgraded data. My absolute recommendation here is to select a new team project that you have not been using.

Select a Team Project for Migration page

Choose Next.

6. On the Options page, select whether you want to upgrade the Full history or just the Tip to omit historical data. When we did this migration, we truncated the data we didn't want to upgrade; that would be done as an optional addition to step 5 above, after copying the repository database.

Options page
7. On the Options page, specify the name of the SQL Server instance you want the wizard to use for temporary storage.

Options page
Choose Next to continue.

8. Review all settings and choose Next. There will be a checksum to ensure the upgrade can proceed. Choose Upgrade to continue.

9. Once the upgrade is finished, you should be able to navigate to your Visual Studio Team Services account page and verify that all projects have been migrated successfully. If you come across any issues, make sure you print the Migration Report and follow the information provided here to complete the additional steps needed to fix them.



Tomorrow, I will walk through the steps to upgrade from TFS to VSTS. Have you completed a migration from VSS to VSTS? I would like to hear your take on it and what "lessons learned" came from executing the migration.

Until next post!

MG.-
Mariano Gomez, MVP