Showing posts with label DTAP. Show all posts

Monday 27 February 2023

Emergency fixes into a controlled production Power Platform environment

Power Platform makes it easy to push changes through environments using managed solutions.   This is a simple way to allow development to continue while you deploy an emergency fix quickly and then get it into the "main branch".

Deploying a hot fix into production within a DTAP regulated environment

If you build the Power Apps environment dynamically, then hot fixes are easy.

There are still open questions to be answered around how long backups are retained, and around taking backups and restoring them to new sandbox environments.

The Power Platform ALM Microsoft Knowledge base is very good.

Friday 24 February 2023

Environment variables for Power Automate

Overview: Environment variables are great in Power Platform but when using managed solutions and ALM there are a couple of points that will make your deployment and support easier.

Background:  Power Platform has Environment Variables (env vars) that are stored in two tables within the Dataverse, namely: definitions and values.

We deploy env vars through solutions and can easily amend them by adding an unmanaged layer.

Problem:  In your managed environment you end up with a tangle of env vars that makes upgrading solutions fail.  In a nutshell, deleting unmanaged layers using the UI only clears the value part of the env var, and not the definition part. The unmanaged-layer env var is made up of two parts, stored in the two environment variable tables in the Dataverse.  Both must be removed.

How this happens makes sense: in UAT we have env vars initially set up in one solution, then we have unmanaged layers added when we amend the values, and later we deploy the latest env vars from a different solution.

What I have seen is env vars being deployed as part of a "one solution for all artefacts" approach; as the project grows, more solutions are added for packaging, and each of these solutions has a few more env vars.  Eventually, as you use the env vars across multiple canvas apps and Power Automate flows, you build a dedicated solution for env vars.

Best Practice:  Well, what I recommend at least:

  1. Put all variables into a single solution from the start; this makes it easy to deploy them quickly across all your DTAP environments, e.g. UAT, Prod.
  2. In the unmanaged solution, ensure the env variables do not have a "current value".  Set the "default value" in Dev.
  3. Use settings files to fill in the "current value" in each DTAP environment; keep the current values for each environment in a single settings file that the pipeline pushes, i.e. settings-uat.json, settings-prd.json.
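A settings file for a given environment might look like the sketch below. The schema name and URL are made-up examples; the overall shape matches the deployment settings file that `pac solution create-settings` generates and that `pac solution import --settings-file` consumes.

```json
{
  "EnvironmentVariables": [
    {
      "SchemaName": "new_ApiBaseUrl",
      "Value": "https://api-uat.contoso.example"
    }
  ],
  "ConnectionReferences": []
}
```

One file per environment (settings-uat.json, settings-prd.json) keeps each environment's values in source control alongside the pipeline.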
Tip: If you need to change any value, merely rerun the solution containing the env vars; don't ever use an unmanaged layer to change env vars.

Tip: It's better to build your Power Platform environment in DevOps pipelines, but if you use existing environments and merely push solutions on top (much more common), then clean up your existing vars as outlined below.
  1. Delete both the definition and value parts (manually) until the environment (UAT, PRD) has no custom env vars - use the Dataverse Web API.
  2. Run the single env var solution.
  3. Never add unmanaged layers to env variables; if you need a change, change the solution package and redeploy, which should take minutes to do.
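The cleanup in step 1 can be sketched against the Dataverse Web API as below. The org URL and token are placeholders, and the expand/navigation property name is from memory of the Dataverse schema, so verify it against your org's metadata before relying on it. The key point the code shows is the order: delete the value rows first, then the definition.

```python
import json
import urllib.request

ORG = "https://yourorg.crm.dynamics.com"   # placeholder environment URL
API = ORG + "/api/data/v9.2"
TOKEN = "<azure-ad-access-token>"          # placeholder bearer token

def cleanup_paths(definitions):
    """Given definition records with their value rows expanded, return the
    relative delete paths: each definition's value rows first, then the
    definition itself (both parts must go, per the problem above)."""
    paths = []
    for d in definitions:
        for v in d.get("environmentvariabledefinition_environmentvariablevalue", []):
            paths.append(f"environmentvariablevalues({v['environmentvariablevalueid']})")
        paths.append(
            f"environmentvariabledefinitions({d['environmentvariabledefinitionid']})"
        )
    return paths

def _call(method, path):
    req = urllib.request.Request(
        f"{API}/{path}",
        method=method,
        headers={"Authorization": f"Bearer {TOKEN}", "Accept": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read() or b"{}")   # DELETE returns an empty 204 body

def cleanup():
    # Pull every env var definition together with its value rows in one call.
    data = _call(
        "GET",
        "environmentvariabledefinitions"
        "?$select=schemaname,environmentvariabledefinitionid"
        "&$expand=environmentvariabledefinition_environmentvariablevalue"
        "($select=environmentvariablevalueid)",
    )
    for path in cleanup_paths(data["value"]):
        _call("DELETE", path)
```

In a real run you would filter to your custom publisher prefix rather than deleting every definition.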

Sunday 9 January 2022

Azure DevOps Series - Azure Pipelines

Azure Pipelines are good at deploying solutions: setting up the infrastructure (I prefer to use PaaS and get out of the infrastructure world, using ARM templates) and deploying code with the appropriate DTAP environment configuration.  Azure Test Plans are used to verify builds.

Overview: DevOps allows for shorter durations to deploy code into production.  Some industries still require highly controlled code changes, think medical systems.  In the PaaS world with DevOps pipelines, the effort of deploying and verifying code is drastically reduced through automation.

It's key to figure out your DTAP deployment strategy.  Below I outline a mixture of an old-school deployment strategy with a PaaS DevOps model that is fairly risk averse:

This PaaS approach requires three types of pipelines:

  1. PaaS Infrastructure - One-offs and updates
  2. Build/App Deployments
  3. Database data or Schema (Bounded Context) updates 
Build agents
The instructions from Azure Pipelines require agents to run them.  The easiest option is to use Microsoft-hosted pipelines/agents.  If you need more power or extra software, use self-hosted agents; these can run on VMs or be hosted in Docker containers.  If you self-host, it's a good idea to keep the build agents on the latest agent version.  Self-host if you need to run software that is not part of the Microsoft-hosted agent set.  V3 of the agent is excellent; try it first until you outgrow it or have specific timing concerns.
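As a minimal sketch, the agent choice is just the `pool` section of an azure-pipelines.yml; the pool name and step contents below are placeholders, not a working deployment:

```yaml
# Microsoft-hosted agent - the simplest option
pool:
  vmImage: 'ubuntu-latest'

# Self-hosted alternative - the name is whatever pool you registered:
# pool:
#   name: 'MySelfHostedPool'

steps:
  - script: echo "build and deploy steps go here"
    displayName: 'Placeholder step'
```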

Azure DevOps Series Posts:

  1. Azure DevOps Series - Overview 
  2. Azure DevOps Series - Azure Boards 
  3. Missing
  4. Azure DevOps Series - Azure Pipelines (This Post)
  5. Azure DevOps Series - DevSecOps



Wednesday 12 February 2020

Power Apps - DTAP Azure diagram

Overview:  A sample architecture for DTAP in a highly controlled environment.  There are a lot of variations, including the option to use Power Apps' own publishing.


There are a few ways to manage environment deployments.
  1. Simple: Create in unmanaged solutions, then export and deploy managed solutions manually using the Power Platform UI.
  2. Enterprise: Build Azure DevOps pipelines to add unpacked code to GitHub and deploy solutions using Azure pipelines.
  3. 3rd-party ALM tooling; some of it is specific to Dynamics.
  4. Nov 2022 - Microsoft announced Power Platform Pipelines.

Custom Connectors:  As you step through the DTAP environments, using ALM to deploy, you need to point the custom connector to the appropriate APIM (API).
Environment variables are extremely useful for this part of ALM.


Saturday 30 November 2013

Planning Suggestion for SharePoint 2013

Overview:  It is always a good idea to have an exact breakdown of your SharePoint architecture. I do this using a diagram and a corresponding spreadsheet.  This post has an example of the spreadsheet; I have a tab for each DTAP environment before I build it out.

Server Name | Server Role | Logical Group | CPU (cores) | C: (GB) | D: (GB) | RAM (GB) | Location | IP | Environment
SVR-PR-WFE1 SharePoint Web Front End SP WFE 4 90 80 16 London 10.189.10.50 Production
SVR-PR-WFE2 SharePoint Web Front End SP WFE 4 90 80 16 London 10.189.10.51 Production
SVR-PR-WFE3 SharePoint Web Front End SP WFE 4 90 80 16 M 10.189.10.52 Production
SVR-PR-WFE4 SharePoint Web Front End SP WFE 4 90 80 16 M 10.189.10.53 Production
SVR-PR-APP1 SharePoint Application Server SP APP 4 90 80 16 London 10.189.10.54 Production
SVR-PR-APP2 SharePoint Application Server SP APP 4 90 80 16 London 10.189.10.55 Production
SVR-PR-APP3 SharePoint Application Server SP APP 4 90 80 16 M 10.189.10.56 Production
SVR-PR-APP4 SharePoint Application Server SP APP 4 90 80 16 M 10.189.10.57 Production
SVR-PR-OWA1 Office Web Applications OWA 8 90 80 16 London 10.189.10.58 Production
SVR-PR-OWA2 Office Web Applications OWA 8 90 80 16 London 10.189.10.59 Production
SVR-PR-OWA3 Office Web Applications OWA 8 90 80 16 M 10.189.10.60 Production
SVR-PR-OWA4 Office Web Applications OWA 8 90 80 16 M 10.189.10.61 Production
SVR-PR-WF1 Workflow Services SP WF 4 90 120 8 London 10.189.10.62 Production
SVR-PR-WF2 Workflow Services SP WF 4 90 120 8 M 10.189.10.63 Production
SVR-PR-SRCH1 SharePoint Search Type A Search 8 134 80 32 London 10.189.10.70 Production
SVR-PR-SRCH2 SharePoint Search Type A Search 8 134 80 32 M 10.189.10.71 Production
SVR-PR-SRCH3 SharePoint Search Type B Search 8 134 300 24 London 10.189.10.72 Production
SVR-PR-SRCH4 SharePoint Search Type B Search 8 134 300 24 M 10.189.10.73 Production
SVR-PR-SRCH5 SharePoint Search Type C Search 8 134 500 24 London 10.189.10.74 Production
SVR-PR-SRCH6 SharePoint Search Type C Search 8 134 500 24 M 10.189.10.75 Production
SVR-PR-SRCH7 SharePoint Search Type D Search 8 134 500 24 London 10.189.10.76 Production
SVR-PR-SRCH8 SharePoint Search Type D Search 8 134 500 24 M 10.189.10.77 Production
SVR-PR-SRCH9 SharePoint Search Type D Search 8 134 500 24 London 10.189.10.78 Production
SVR-PR-SRCH10 SharePoint Search Type D Search 8 134 500 24 M 10.189.10.79 Production
SVR-PR-DBS1 SharePoint Databases SQL 16 134 500 32 London 10.189.10.85 Production
SVR-PR-DBS2 SharePoint Databases SQL 16 134 500 32 M 10.189.10.86 Production
CL-PR-DBS Cluster 10.189.10.87
LS-PR-DBS Listener 10.189.10.88
SVR-PR-DBR1 SSRS & SSAS Databases SQL 8 134 500 32 London 10.189.10.89 Production
SVR-PR-DBR2 SSRS & SSAS Databases SQL 8 134 500 32 M 10.189.10.90 Production
CL-PR-DBR Cluster 10.189.10.91
LS-PF-SP-DBR Listener 10.189.10.92
SVR-PR-DBA1 TDS & K2 Databases SQL 16 134 500 32 London 10.189.10.93 Production
SVR-PR-DBA2 TDS & K2 Databases SQL 16 134 500 32 M 10.189.10.94 Production
CL-PR-DBA Cluster 10.189.10.95
LS-PR-DBA Listener 10.189.10.96

Note: The Windows "Page File" (or "Paging File") size is a contentious issue, depending on the recovery approach.  The crash-dump logs go to one of the drives, and I normally make sure they go to the C: drive, which is over-provisioned for this size.  I don't know exactly how important this is, but in my calculations I always allow for at least 2 times the maximum RAM on the C: drive.
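As a rough worked example of that sizing rule (the 40 GB OS/application footprint is my assumption for illustration, not a figure from the table):

```python
# Rough C: drive sizing per the note above: allow the OS/application
# footprint plus at least 2x the maximum RAM for page-file/crash-dump
# headroom. Numbers are illustrative, not a standard.
def c_drive_min_gb(ram_gb, os_and_apps_gb=40):
    """Minimum C: size in GB: base footprint plus 2x RAM."""
    return os_and_apps_gb + 2 * ram_gb

# A 16 GB WFE from the table: 40 + 2*16 = 72 GB,
# comfortably inside the 90 GB provisioned for C:.
print(c_drive_min_gb(16))
```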