I have spent way too much time trying to set up code package deployment when using TeamCity together with Octopus Deploy. Read this to avoid following my path to failure :-). Well, not complete failure, I did learn some new things about both PowerShell and Octopus…
Episerver has created a PowerShell module to simplify code package deployments.
To install the EpiCloud module in an Octopus step I just ran:
Install-Module EpiCloud -Scope CurrentUser -Force
Copy/pasted from this blog post. When uploading a package, the EpiCloud module depends on the Azure.Storage module (version >= 4.4.1). Our Octopus server had an older version of that module, so I just installed Azure.Storage in the same way as above.
DO NOT DO THIS IF YOU ARE USING OCTOPUS DEPLOY!
Azure.Storage has a dependency on AzureRm.Profile, which means that AzureRm.Profile was also updated, and other Octopus steps indirectly use that module. I haven’t investigated all the details, but the result was that I broke deployments for other projects on the Octopus server! Colleagues had to work extra. Clients got frustrated. I’m sorry 🙁
I guess I wouldn’t have run into this issue with Azure DevOps, since the server isn’t shared between builds/deployments(?).
I really wanted to be able to do code package deployments, so I continued on a separate Octopus “worker” (an additional server that might mean additional license cost for Octopus). On this worker I could do whatever I wanted without affecting others. Not a solution I recommend, but for now I just wanted to get the deployment working.
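In hindsight, a safer option might have been to isolate the modules per deployment step instead of installing them machine- or user-wide. A minimal sketch, assuming PowerShellGet is available on the server (folder names are illustrative):

```powershell
# Sketch: download the modules into a step-local folder instead of a
# shared scope, so other Octopus projects keep their module versions.
$modulePath = Join-Path $PWD 'Modules'
New-Item -ItemType Directory -Path $modulePath -Force | Out-Null

# Save-Module also downloads dependencies (e.g. Azure.Storage),
# without touching anything that is already installed globally
Save-Module -Name EpiCloud -Path $modulePath

# Make the local folder win when resolving modules, in this step only
$env:PSModulePath = "$modulePath;$env:PSModulePath"
Import-Module EpiCloud
```

Since `$env:PSModulePath` is only changed for the current process, the globally installed AzureRm.Profile that other steps rely on should be left untouched.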
I continued with setting up the deployment. First thing I did was to “refactor” the config transformation files for the Preproduction and Production environments. Before, the transformation files relied on the “transformation chain” that is applied when doing regular deployments to DXC-S. When doing code package deployments, transformations are done in a more regular fashion, but you need to make sure your transformation files work both when the transformations are chained and when they are not.
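One thing that helps here, as far as I understand XDT, is sticking to idempotent transforms: an attribute overwrite gives the same result whether it runs once or as part of a chain, while an insert does not. A sketch (the key name is just an illustration):

```xml
<!-- Web.Production.config: written to behave the same standalone
     and in a transformation chain -->
<configuration xmlns:xdt="http://schemas.microsoft.com/XML-Document-Transform">
  <appSettings>
    <!-- Safe: SetAttributes overwrites the value no matter how many
         earlier transforms have already touched it -->
    <add key="episerver:Environment" value="Production"
         xdt:Transform="SetAttributes" xdt:Locator="Match(key)" />
    <!-- Risky in a chain: xdt:Transform="Insert" would add a
         duplicate element if an earlier transform already inserted it -->
  </appSettings>
</configuration>
```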
This took quite some time (and I’m not sure I got it right).
Next thing was to rename the NuGet file. The uploaded NuGet package needs to follow a naming convention. Not a big deal, I just renamed it in the nuspec file (that OctoPack is using).
The content of the uploaded NuGet package must follow a certain structure, and it is not the standard structure of a web deploy package. I could make this structure change in the nuspec file, but then the package would only work for code package deployments, and not for the other environments… So, I decided to use PowerShell to restructure the content of the NuGet package created by TeamCity. That means renaming the NuGet file to have a .zip extension, creating a folder called wwwroot, unzipping the contents to that folder, zipping that folder, and finally removing the .zip extension from the new file. Phew!
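The repackaging steps above could be sketched roughly like this, using the standard `Expand-Archive`/`Compress-Archive` cmdlets (the function name and package names are my own illustrations, not anything from EpiCloud):

```powershell
# Sketch: restructure a TeamCity-built package so all content sits
# under a wwwroot folder, as the code package convention requires.
function Convert-ToCodePackage {
    param(
        [string]$PackagePath,   # path to the TeamCity-built .nupkg
        [string]$OutputPath     # path of the restructured package
    )
    $work = Join-Path ([IO.Path]::GetTempPath()) ([Guid]::NewGuid())
    $zip  = Join-Path $work 'package.zip'
    New-Item -ItemType Directory -Path $work | Out-Null

    # Expand-Archive insists on a .zip extension, hence the copy/rename
    Copy-Item $PackagePath $zip

    # Unzip everything into a wwwroot folder
    Expand-Archive -Path $zip -DestinationPath (Join-Path $work 'wwwroot') -Force

    # Re-zip with wwwroot as the root folder, then drop the .zip extension
    Compress-Archive -Path (Join-Path $work 'wwwroot') -DestinationPath "$OutputPath.zip" -Force
    Move-Item "$OutputPath.zip" $OutputPath -Force
    Remove-Item $work -Recurse -Force
}

# Usage (names are illustrative, following the *.cms.app.* convention):
# Convert-ToCodePackage -PackagePath 'MySite.1.0.0.nupkg' -OutputPath 'mysite.cms.app.1.0.0.nupkg'
```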
The package created by TeamCity contains all transformation files for the different environments, together with other files specific to an environment (e.g. Episerver license files). The transformations are performed by Octopus (except for DXC-S Preproduction and Production). Before the transformations are done, I do some variable substitution in some of the transformation files. Those variables could be sensitive keys that you don’t want to check in to source control. I did not find an existing Octopus step that only performs the same variable substitution as the Web deploy step. So I would need to write custom PowerShell to perform the substitution…
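The custom substitution would amount to something like the sketch below, replacing Octopus-style `#{Variable}` tokens in a file (the function name and variable names are my own assumptions):

```powershell
# Sketch: replace #{Name} tokens in a file with values from a hashtable,
# mimicking the substitution the Web deploy step does out of the box.
function Invoke-VariableSubstitution {
    param(
        [string]$Path,          # file containing #{...} tokens
        [hashtable]$Variables   # token name -> value
    )
    $content = Get-Content $Path -Raw
    foreach ($key in $Variables.Keys) {
        # Plain string replace; each #{Name} becomes its value
        $content = $content.Replace('#{' + $key + '}', [string]$Variables[$key])
    }
    Set-Content -Path $Path -Value $content
}

# Usage in an Octopus script step (illustrative):
# Invoke-VariableSubstitution -Path 'Web.Production.config' -Variables @{
#     'ApiKey' = $OctopusParameters['ApiKey']
# }
```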
By this point I gave up. I don’t think I would have ended up with a maintainable solution. What would I think if I were to take over a solution like this from someone else? A very non-standard way of doing things, with quite a lot of custom scripts. A custom script might contain bugs, and probably needs to be maintained by someone.
I would not feel proud of this solution.
- The config transform files needed to work both as “chained” transforms and as normal transforms, which leads to quite messy transformation files.
- Needed to have custom script for substituting variables in files.
- Needed to have custom scripts for re-structuring the nuget package.
- Needed to write custom scripts that duplicate functionality already built into the regular Web deploy step in Octopus.
- Needed to install additional PowerShell modules, which apparently can affect other deployments on the same Octopus server.
I would really like to hear if anyone has set up code package deployments using Octopus, and what approach you used. It felt like I was constantly working against the tools; I might have missed something obvious.
My humble suggestion to Episerver is to maybe take another approach. I don’t know if it’s possible, but maybe you can make a PowerShell module that works something like this:
- Initialize a new deployment. This performs all the tasks that are done before the actual deploy begins. I don’t know all the details, but it takes a backup of the db, creates a slot, turns off auto-scaling, etc.
- Do the actual deploy to the slot using regular web deploy (the PowerShell module is not used for this step). This means that all the standard functionality of Octopus (or Azure DevOps) can be used.
- Finalize the deployment. Perform warmup of the slot, swap slots, turn auto-scaling back on, etc.
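Used from a deployment pipeline, the suggested module might look something like this. To be clear, this is purely hypothetical; these cmdlets do not exist in EpiCloud today, and the names are made up:

```powershell
# Hypothetical sketch of the suggested three-phase workflow
# (cmdlet names and parameters are invented for illustration)

# 1. Prepare: backup db, create slot, turn off auto-scaling
Initialize-EpiDeployment -Environment 'Production'

# 2. Deploy to the slot with the regular Octopus/Azure DevOps
#    web deploy step -- no Episerver-specific tooling needed here

# 3. Finish: warm up the slot, swap slots, turn auto-scaling back on
Finalize-EpiDeployment -Environment 'Production'
```

With something like this, all the packaging, transformation and variable substitution could stay in the deployment tool’s standard steps, and the custom scripting I described above would disappear.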