
Sunday 18 June 2023

App Insights for Power Platform - Part 6 - Power Automate Logging

Note: Announcement 23 Aug 2023 - integration of Power Automate telemetry data with Azure Application Insights.

Series

App Insights for Power Platform - Part 1 - Series Overview 

App Insights for Power Platform - Part 2 - App Insights and Azure Log Analytics 

App Insights for Power Platform - Part 3 - Canvas App Logging (Instrumentation key)

App Insights for Power Platform - Part 4 - Model App Logging

App Insights for Power Platform - Part 5 - Logging for APIM 

App Insights for Power Platform - Part 6 - Power Automate Logging (this post)

App Insights for Power Platform - Part 7 - Monitoring Azure Dashboards 

App Insights for Power Platform - Part 8 - Verify logging is going to the correct Log analytics

App Insights for Power Platform - Part 9 - Power Automate Licencing

App Insights for Power Platform - Part 10 - Custom Connector enable logging

App Insights for Power Platform - Part 11 - Custom Connector Behaviour from Canvas Apps Concern

Overview:
  • Power Automate holds its own set of logs and history.  While Power Automate's internal logging is good and useful, it does not push logs into App Insights or Log Analytics for central monitoring.
  • You can manually export Power Automate logs from Power Platform and import them into Log Analytics, using the "Data Export" option.

  • Or you can create an Azure Function that writes to App Insights; below is a simple recording of the function (the recording aims to remove complexity, so there is no VS Code or publishing step).
Function to write into App Insights Logs

#r "Newtonsoft.Json"
using System.Net;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Extensions.Primitives;
using Newtonsoft.Json;
public static async Task<IActionResult> Run(HttpRequest req, ILogger log)
{
    string requestBody = await new StreamReader(req.Body).ReadToEndAsync();
    dynamic data = JsonConvert.DeserializeObject(requestBody);
    int eventId = data?.eventId;
    string errType = data?.errType;
    string errMsg = data?.errMsg;
    string correlationId = data?.correlationId;
    string workflowId = data?.workflowId;
    string workflowUrl = data?.workflowUrl;    
    string flowDisplayName = data?.flowDisplayName;

var custProps = new Dictionary<stringobject>()
{
    { "CorrelationId", correlationId},
    { "WorkflowId", workflowId},
    { "WorkflowUrl", workflowUrl},
    { "WorkflowDisplayName", flowDisplayName}
};

using (log.BeginScope(custProps))
{
    if (errType=="Debug"
    {
        log.Log(LogLevel.Debug, eventId, $"{errMsg}");   
    }
    else if (errType=="Critical")
    {
        log.Log(LogLevel.Critical, eventId, $"{errMsg}");  
    }
    else if (errType=="Warning")
    {
        log.Log(LogLevel.Warning, eventId, $"{errMsg}");   
    }
    else if (errType=="Trace")
    {
        log.Log(LogLevel.Trace, eventId, $"{errMsg}");          
    }
    else if (errType=="Error")
    {
        log.Log(LogLevel.Error, eventId, $"{errMsg}");          
    }
    else
    {        
      log.LogInformation($"Event is {eventId}, type is {errType}, and msg is {errMsg}");
    }        
};
    string responseMessage = $"This HTTP triggered function executed successfully. {errType} - {errMsg}";
    return new OkObjectResult(responseMessage);
}
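Once flows call this function, the scoped properties surface in each trace's customDimensions, so the entries for a single flow run can be pulled back with a query along these lines (a sketch; it assumes the default traces table and the dictionary keys used above, with a placeholder correlation id):

traces
| extend flowName = tostring(customDimensions["WorkflowDisplayName"])
| where tostring(customDimensions["CorrelationId"]) == "<your correlationId>"
| project timestamp, message, severityLevel, flowName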

Power Platform Admin Centre:

There are nice analytics inside the Power Platform Admin Centre, as shown below, for examining Flows/Power Automate:

The flows can also be reviewed on a per-environment basis:


Friday 19 May 2023

Logging and Monitoring Advanced Canvas Apps

Overview: Most Canvas apps reach back to 3rd parties or internal storage, e.g. SQL, Dataverse, blobs, to persist data.  In an ideal world we want to be able to identify errors and performance concerns and trace them quickly.

Scenario: A canvas app performs a search against a custom connector, which goes to an external API; for example, searching for company numbers from an Open API.

  • The annotated diagram, in orange, records the button pressed to run the search.  This only shows if the "Upcoming feature" "Enable Azure Application Insights correlation tracing" is enabled.
  • A custom connector, in blue, makes a call to a REST/Open API.
  • The last call is to an external API, highlighted in purple (in this case it was actually a mocked APIM endpoint, but the icon would change if it were truly external, such as an API search call to the IRS, HMRC, or Inland Revenue).

Tip: Always create App Insights instances using a workspace and not classic-mode App Insights, as classic is being deprecated by Microsoft in early 2024.

App Insights Understanding:

App Insights set up using the workspace-based option (Log Analytics) can be queried in two ways:

  • Query thru App Insights
  • Query thru the Log Analytics workspace (the Kusto/KQL is slightly different, but it's rather minor; see the short example below)
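As a quick illustration of the dialect difference (a sketch; same data, different table and column names):

// From the App Insights query pane
traces | where severityLevel >= 3 | take 10

// The same query from the Log Analytics workspace
AppTraces | where SeverityLevel >= 3 | take 10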

Tip: If you upgrade from classic to workspace-based App Insights, the log history is still "query-able" as App Insights combines the logs from App Insights classic (stored in App Insights directly) and the logs stored in Log Analytics.

Tip: Power Automate has a connector for Log Analytics, so it's good to use this in flows so you can trace canvas apps through flow journeys.  Most people tend to build a custom connector to a function that uses the App Insights SDK.  I've used both; they are equally valid approaches and are shown in the annotated diagram below.

The two options for logging Flows into App Insights.

Note: If you create a new App Insights workspace-based instance, remember to update the loggers in all your Azure services to the new instance (app key).  For example, Functions, APIM and Service Bus are common components.

Note: You can log to multiple workspaces/App Insights instances in a tenant and the correlations will still be retrieved, so you see the full history, assuming you have permissions to all the log sources.

Learning: the connection string for App Insights (often loosely called the instrumentation key) has 3 parts to it:
1) instrumentation key (basically a unique identifier that allows logs to be saved into App Insights),
2) ingestion endpoint (URL for the logs), and
3) monitoring metric endpoint (URL for metrics/performance counters/live metrics/failed requests).

Here is an example and you can see the 3 parts:
InstrumentationKey=2675bxxx-xxxb-xxxx-bf5558009ccf;IngestionEndpoint=https://uksouth-1.in.applicationinsights.azure.com/;LiveEndpoint=https://uksouth.livediagnostics.monitor.azure.com/

Tuesday 14 March 2023

Power Platform Logging, Monitoring, and Alerting

This post relates to a previous strategy blog post; read that first: https://www.pbeck.co.uk/2023/02/setting-up-azure-application-insights.html

Overview:  Microsoft uses Azure Application Insights to natively monitor Power Apps using an instrumentation key at the app level.  Logging for model-driven apps and Dataverse is a Power Platform configuration at the environment level, e.g. UAT, Prod.

When setting up Application Insights, use the Log Analytics workspace approach and not the "Classic" option as this is being deprecated.

Power Apps (Canvas Apps): Always add the instrumentation key to all canvas apps; it is set at the "App" level within each canvas app.  Deploying solutions brings challenges with changing keys for App Insights logging (unmanaged layers).

"Enable correlation tracing" feature imo. should always be turned on, it is still an experimental feature but with it off, the logging is based on a sessionid

"Pass errors to Azure Application Insights" is also an experimental feature.  Consider turning it on.

Canvas Apps have "Monitor", model-driven apps also have this ability to monitor, and Power Automate has its own monitoring.

Log to App Insights (put App Insights on Azure Log Analytics); a simple example with customDimensions on the trace record:

Trace("My PB app ... TaxAPI.NinoSearch Error - Search - btnABC",
        TraceSeverity.Error, // use the appropriate tracing level
        {
            myappName: $"PB App: {gblTheme.AppName}",
            myappError: FirstError.Message,  // optional
            myappEnvironment: gblEnv,
            myappErrorCode: 10010,
            myappCorrelationId: GUID() // unique correlationId
        }
    );
Query the logs using Kusto:
traces
| extend errCode = tostring(customDimensions["myappErrorCode"]), err = tostring(customDimensions["myappError"])
| where errCode == "10010"

Coming June 2023

Push cloud flow execution data into Application Insights | Microsoft Learn

This allows logging to tie Flows back to the calling Canvas app.  You can do this manually today, but it has to be applied at all calls to the flow and afterwards; a minimal sketch follows.
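A minimal sketch of the manual approach (hypothetical names: LogFlow is a flow added to the app, gblCorrelationId a global variable): generate your own correlation id, pass it into the flow, and trace it on the app side so both ends can be joined in the logs.

Set(gblCorrelationId, GUID());
LogFlow.Run(txtSearch.Text, gblCorrelationId);
Trace("Called LogFlow", TraceSeverity.Information, {myappCorrelationId: gblCorrelationId});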

Below is a basic checklist of decisions to ensure you have suitable logging

Logging Checklist:

  1. Setup Azure Log Analytics (1 per DTAP env e.g. uat, prd)
  2. Get the workspace key needed for logging to Log Analytics: "Agents" > "Log Analytics agent instructions"; copy the Workspace Id and the Secondary Key
  3. Create an Azure Application Insights per DTAP
  4. Each Canvas app needs an instrumentation key (check: have you aligned the DTAP log instances with the Canvas app DTAP?)
  5. Power Automate has great monitoring, but it is a good idea to set up logging for Dataverse (which shall cover model apps), done thru the Power Platform Admin Centre > Environment
  6. Enable the logging preview features for Canvas apps & check the Power Automate push cloud execution feature state.
  7. Do you have logging patterns in your Canvas app for errors, do you add tracing, and is it applied consistently?
  8. Do you have a Pattern for Power Automate runs from Canvas apps?  I like to log if the workflow errors after the call.
  9. Do you have a Pattern for Custom Connectors?
  10. Do you correlation-trace custom APIs (internal and 3rd party)?
  11. Do you have a Try, Catch, Finally scope/pattern for workflows?  How do you write to the logs?  Most common is to use an Azure Function with the C# SDK.  I like to use the Azure Log Analytics connector in my catch scope to push error info into the workspace log using a custom table.
  12. Ensure all Azure services have instrumentation keys.  Common examples are Azure Functions, Azure Service Bus, API Manager; the list goes on...
  13. Do you implement custom APIM monitoring configuration?
  14. Do you use the SDK in your code (Functions etc.)?
  15. Setup Availability tests - super useful for 3rd-party APIs.

Once you have the logs captured and traceable (Monitor & Alerting Checklist):

  1. Create Queries to help find data
  2. Create monitoring dashboard using the data
  3. Use OOTB monitoring for Azure and the platform
  4. Consider linking/embedding to other monitors i.e. Power Automate, DevOps, Postman Monitor
  5. Setup alerting within the Azure Log Analytics workspace using action groups; don't over-email.  For information alerts, send to Slack or Teams (very simple to set up a webhook or an incoming email on a channel to monitor).  A sample alert-style query follows this list.
  6. Power Automate has connectors for adaptive card channel messaging; consider using them directly in Flows or from alerts: push the data into a flow that logs the alert using an adaptive card right into the monitoring channel.
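For example, an alert rule can be driven by a scheduled query along these lines (a sketch; the custom dimension name assumes the Trace pattern shown earlier):

traces
| where timestamp > ago(15m)
| where severityLevel >= 3
| summarize failures = count() by appName = tostring(customDimensions["myappName"])
| where failures > 5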

Sunday 1 March 2020

Power Automate Notes

What is Power Automate?
Power Automate was previously called Flow.  Power Automate contains "Flows".  Power Automate is workflow including RPA options, referred to as Power Automate Desktop (PAD).  Power Automate is a workflow engine based on Azure Logic Apps: a powerful, extendable workflow solution for low-code automation.  It allows workflows to be easily automated with 3rd-party systems e.g. SAP.

Used for:
  1. Personal Workflows e.g. as a scrum master, I send an email to all people that have not updated the DevOps Scrum board in the last day.
  2. Business Processes e.g. a holiday request form: if more than 10 days, senior manager approval is needed.  Generate an RFP based on an event.  Historically, K2, Nintex or WCF workflows were used for business processes.
  3. Integration: e.g., move Twitter posts into my warehouse for data-mining later.
A key concept is who the flow shall run as: a user or a service principal.  Setup a Service Principal in Power Automate » Benedikt's Power Platform Blog (benediktbergmann.eu)

Flows run against the owner's account for calculating allowable usage (default); it is common practice to use a service account (a normal AAD account with username and password) to own specific flows.  A service account is a normal user account from AAD/Entra's perspective.  A dedicated account ensures that when a specific user leaves your business, the flow continues to run.
Img 1. Usage limits in Flows are counted against the flow owner.



Agree Power Automate Standards:
  1. Flow Names: start with a verb, format: Verb + What the Flow does + (trigger), e.g. "Get Tax Rates (Instant)".  I like to also prefix with the company and project, but feel free to have a standard to suit your business, e.g. EY-USTax Get State Taxes (Instant) or EY-USTax Get All US State Tax Rates (Scheduled) or Get SalesForce Data.  Optionally, for project-specific workflows I also prefix with the project name e.g. USTax-GetTaxRate.
  2. Description: a short description to help readers understand what the flow does.
  3. Actions: leave the action description and add info e.g. Compose: Tax Note.
  4. Variables: prefix with "v" in camelCase, e.g. vTaxTotal, vUserAge.
  5. Error handling & Logging: catch errors and log them to App Insights via an Azure Function, or log using the built-in Azure Log Analytics action.  More logging means better traceability.
  6. Scope: add Scope actions for try/catch logic; put multiple actions inside a "Scope" action.
  7. Terminate the flow with the Terminate action if the flow has failed.
  8. Environment Variables: great for logging as I go thru DTAP.  Also see.
  9. Connection Reference Name: agree a format; does this flow run as the user or as a specified user?
  10. Loop sparingly; use First() for performance.
  11. Owner: I like to use a service account in dev.  It's a good idea to add tech owners, so when the flow needs updating, support can easily find who to talk to.  Understand who you are running the flow as; this ties to licencing and is critical.  You need to know your actions and licencing limits on a project.
  12. Comments: I'm not a huge fan, as good naming should make most flows self-documenting, but for tricky logic comments are great.  Agree a standard.
  13. Retry policy: what happens if an action fails; do you want to try again?
I borrowed a lot of the standards from "Matthew Devaney"
I found this document later and it is excellent: Best Practices for Power Automate (Microsoft Flow) Development by Prasad Athalye.

Licencing:
  1. Seeded licence: part of O365.  Use standard functionality, such as standard connectors, without needing to pay more; the advanced/premium connectors are not part of the O365 licence.
  2. Per User licence: allows the user, at $15 retail (discounts available in bulk), to use the advanced connectors & the on-prem gateway.  Many users need multiple workflows, normally personal workflows.
  3. Per User RPA licence: same as above but also has amazing RPA capabilities.
  4. Per Flow/Process: $100 per process per month, minimum 5 flow licences.  Anyone can use the flow as part of the process.  Use where few people own processes that do a lot of workflow.  Can add processes one at a time after the first 5.
Licencing MS page
Power Automate has some licence add-ons available: AI builder and an unattended RPA add-on.
"Power Apps licenses will continue to include Power Automate capabilities", I don't know what is included but I assume it means any connector I can use in Power Apps, assuming I'm in Power Apps I can make Flows for.

Build workflows:
  • You can get a dedicated IDE tool for Power Apps or use the browser (which I always use).
  • There are over 350 connectors (in both standard and premium) and you can always use a custom connector to any OpenAPI (Swagger) endpoint.
  • Templates have some examples and often help as a starting point to make your own custom Flows in Power Automate.
  • Easy, clear tracing so you can see what part of the workflow is working and where it fails, and you can drill into the issue.  Super easy to use.
  • Example of an Instant Cloud flow triggered by a canvas Power App...

Query a Dataverse table in a Flow using OData
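For instance, the Dataverse "List rows" action accepts OData query options along these lines (a sketch; the table and column names are hypothetical):

Filter rows:    statecode eq 0 and contains(fullname, 'Beck')
Select columns: fullname,emailaddress1
Sort by:        createdon desc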

ParseJSON is fantastic for converting Open API/OData/JSON responses into an object; see the sketch below.
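A minimal Power Fx sketch of the pattern (hypothetical names: GetCompanyFlow is a flow returning a JSON string in a companyjson property):

Set(gblCompany, ParseJSON(GetCompanyFlow.Run(txtSearch.Text).companyjson));
Set(gblCompanyName, Text(gblCompany.name)) // untyped values need an explicit conversion, e.g. Text()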

Extending: break out for programmatic control; I use C# Functions from my Flows and call them via HTTP triggers.
Retrieving a row from the Dataverse custom "Subject" table.

Robotic Process Automation (RPA):
  • Also known as UI flows within Power Automate.  Microsoft purchased Softomotive and integrated WinAutomation for UI flows.
  • Attended (user logged in) and unattended versions (complete tasks without manual intervention)
  • Can have multiple instances
  • An API is generally better than RPA, as an API is versioned and generally not changeable, whereas a website can be changed, causing the RPA flow to fail.  RPA is useful, for instance, when the REST API is incomplete.
  • Recording tool for creating UI flows; web recording uses Selenium.
  • 3 types: 1) Windows/Desktop/Screen reader, 2) web/website (Selenium), and 3) WinAutomation (covers both Windows and web; easy to use but not as full-featured yet).
  • WinAutomation has a drag and drop IDE, has error handling.
  • UI flows are well priced.  Also get AI builder credits with UI flow licences.
  • "Power Automate Per user plan with attended RPA to use UI flows and WinAutomation" Microsoft.
AI Builder:
Cognitive builder, e.g. recognise forms and extract data, e.g. receive invoices and add them to accounting SaaS software.

Other: Zapier is a good tool for end-user automation.  It is easier than Power Automate but not as structured.  I'd use Zapier to automate in small businesses without O365 or Flow licencing, allowing end users to do it themselves.

Problem Solving:

Problem:  Another developer created a Flow I need to use from a Canvas page inside a model app.  The Flow shows up when I try to add it, but I get the error "Unable to add flow".
Initial Hypothesis: The flow is owned by another developer and the connection is made thru their account; take ownership and change the connection (a Dataverse connection in my case).
Resolution: To use an existing flow, you need: 1) the flow in the same solution as the canvas app, and 2) ownership, with the connection switched to the new developer.

Problem: Migrating solutions between environments, all the workflows fail when they use the Dataverse connector, with 403 errors.  Tracing the flows, I can see the error "Flow client error returned with status code 'BadRequest' and details {error: code: 'XrmApplyUserFailed... UserNotinActiveDirector ... does not exist in user tenantId".
Initial Hypothesis: The service principal account needs to be registered in the new Dataverse environment.
Resolution: The connector issue was fixed; we had to recreate the connection from scratch, making sure it was set up with the service principal.  Then we added the registered app into the environment as an application user.


Sunday 15 July 2018

Power Platform Notes - Power Apps, CDS, Power BI & Power Automate

Power Apps

Variables:
Microsoft Docs "PowerApps and Excel both automatically recalculate formulas as the input data changes".
  • Contextual variables - scoped at a screen level
  • Global variables - scoped app level
  • Fx> Set(MyUniqueClientNo, 12)
Functions:
Fx> UpdateContext({MyTimesheetId: 34})
Tip from Shane Young:  Note that setting a variable to the control itself stores a reference, so to capture the value use:
Fx> UpdateContext({MyTimesheetId: txtTimesheetId.Text}) not
Fx> UpdateContext({MyTimesheetId: txtTimesheetId}) unless you want the context to float with the control

To pass a variable to another screen, use the Navigate overload on the OnSelect property of a button:
Fx> Navigate(Screen2, ScreenTransition.None, {TSvar: MyTimesheetId})
MyTimesheetId is a contextual variable
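On Screen2 the passed value is just another context variable, so a label can show it by using TSvar as its Text property (lblTimesheet is a hypothetical control):
Fx> lblTimesheet.Text: TSvar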

If Statement:
Fx> lblAns.Text: If(MyUniqueClientNo = 12, "Yes", "No")

Environments:
  1. Power Platform allows you to create multiple environments; each environment is connected to your AAD tenant.
  2. Policies can be applied to prevent DLP against all environments or specific environments.
  3. Use Environments for DTAP, business units, or Extranet vs Intranet.
  4. Updated June 2022: Environments can be of type: Default (don't use and rename), Developer, Trial, Sandbox or Production.
  5. Each Power Apps environment can have no CDS/Dataverse database or a single CDS/Dataverse database connected.
Managed vs Unmanaged Solutions:
Solutions are useful for packaging and moving assets between environments or tenants.  Managed restricts who can edit the applications in the solution.  I like to keep each solution with a single app in it.  On large projects, it's a good idea to keep environment variables, custom connectors, cloud flows, and apps in separate solutions.  It makes it easier to deploy pieces.

Managed Environments:
Don't mix these up with managed solutions, as I did recently.  Managed environments allow a host to control all the environments.  You should use a dedicated host production environment with a Dataverse database to control all the environments; you need to install the solution on the host.  Managed environments have no correlation to managed solutions.  You need managed environments if you are going to use Power Platform's simple ALM for app deployment.

ALM:
Power Apps has its own source control, and you can manually export and import entire projects ("Export package"); note that connections are not exported as part of the package.  Solutions are the best way to move code between environments, and it's a good idea to use environment variables.  ADO is more secure and has better tooling using solutions, compared to manually exporting code using packages.  Power Apps Build Tools allows for ADO/DevOps integration, generally using solutions to deploy.  At a minimum you should have your own dev env (Sandbox or Developer type), a test env (Sandbox type) and a production env (Production type) for any serious app you build.

Add "Power Platform Tools"

Example Power Platform ALM deployment using DevOps - 5 environments

Update Jan 2022
Solutions are used in Power Apps to package and move code and resources, such as flows, env variables, canvas apps, and Dataverse tables, between environments.  It's a good idea not to use the default environment, as everyone has access to it.  The DTAP around ALM can get pretty complex, and I like to keep it fairly simple.  Connectors can get nasty in packaging.  ADO using the Power Apps Build Tools makes for a good CI/CD solution for Power Apps.

Environment variables are a great way to manage configuration; they can also easily be configured in the pipelines to be replaced with the appropriate values.  Tip: Don't ever set the current value in an env var; it gets carried thru to the next DTAP env.  Also, env vars support Azure Key Vault; use it for secrets.

Custom connectors, and getting connection references correct when using managed environments to deploy code, need environment variables so they can easily be set for each environment using CI/CD.  It's a good idea to create connections in solutions and use a dedicated owner account.  Connection references in solutions - Power Apps | Microsoft Learn

There is also a great tool to unpack .msapp solutions for raw review and storage in git: https://powerapps.microsoft.com/en-us/blog/source-code-files-for-canvas-apps/  For me, its main use is verifying code and standards and comparing versions.  Post on unpacking solutions.  You can also see how many controls you have in your Canvas app; MS have a guideline of 500 controls in an app.

Have a: 
  1. Dev environment of the "sandbox" type.  Have an un-managed solution that will act as your base canvas app for all new Power Apps.  When you export the solution, remember to export as a managed solution and ensure environment variables are emptied before publishing the export.  This is a good point to back up to a repository such as Azure DevOps/Git.
  2. Test env of the type "production" or "sandbox".  Only allow managed solutions to be deployed.  Watch out for custom connectors; I can't get them to deploy correctly in solutions.
  3. Prd env must be of type "production".
Manual ALM (i.e. using Power Apps to store dev versions, exporting solutions, and importing managed solutions into UAT and production) is fine; manually moving the applications by hand is simple and safe, and there are backups and manual exports even without a source code repository.  I also like the testing framework in Power Apps.  From there it is easy to move your Power Platform to a higher ALM/CI/CD level using the Power Platform Build Tools, DevOps, and PowerShell cmds.
Update April 2022: Great simple video on ADO for Power Platform 

Licencing:
Updated: Power Apps Licencing Summary as of 30 December 2019
"To run standalone apps, guest users need the same license as users in your tenant."  Microsoft Docs.  
  • PowerApps using AAD B2B users (both members and guests) in standalone Power Apps shall require the Power Apps licence (not necessarily Portal Apps).  Steps to Share Power Apps with Guests
  • SharePoint users interacting with a Power Apps form on a SharePoint list do not require a Power Apps licence.
Coding Standards for Power Apps:
  • Microsoft provide a whitepaper as a suggestion for using naming/coding standards.
  • Controls and variables should use prefixed Camel case e.g. txtContactEmail
  • Data sources use Pascal Case e.g. ContactUsers 
  • Screens must use full names e.g. Edit Contact User Screen, as they are read as named by screen readers like JAWS and Microsoft's Narrator.
  • Prefix controls and collections (Label is lbl, Textbox is txt, Button is btn, Collection is col, drop-down lists are ddl, Form is frm, Group is grp, Gallery is gal, Icon is ico, Images are img, Table is tbl).  I also prefix my components with com.
  • Variable naming: gbl for global variables and var for context variables.
  • Use Containers for layout and groups for control aggregation.
  • Use TabIndex so keyboard navigation makes sense.  Use the AccessibleLabel over HintText.  Ensure colour contrasts and use the approved colour templates stored in global variables.  All fonts, sizing and colouring must come from global variables so the app can be easily changed.  Use the App Checker and resolve all identified accessibility issues/concerns.  Test on actual devices and use screen readers.
  • Ensure App Insights is setup, capture and log errors, also use the monitoring (maybe not in production).
Power Apps Portals:
Build responsive websites using CDS.  Allow anonymous access or implement multiple Identity Providers (IdP) such as AAD B2C, AAD, Google+, LinkedIn.

Updated: 28 July 2018:
Common Data Service (CDS): comes from Dynamics CRM Sales; entities are pretty much used like Content Types in SharePoint.
  • Based on Azure SQL & uses CosmosDB, with a nice API that supports REST/OData v4;
  • Has row and field RBAC and uses AAD for authentication;
  • Allows Power BI, Power Apps (previously labelled PowerApps) & Flow (Power Automate: the new name since Ignite 2019) to work with CDS (renamed to Dataverse);
  • Use Power Query or dataflows to import data easily;
  • Multiple data sources, such as SQL Server, Cosmos, files, Excel, Search, and 3rd parties such as SAP;
  • CDS is not part of O365 licencing (including E5); you need a PowerApps P1 or P2 (now "Power Apps per user", as licencing changed circa Oct 2019).  Note: the Power Apps licencing included with O365 is extremely limited and, for instance, does not cover CDS;
  • One Power Platform Environment ties to one CDS/Dataverse database
  • Any type of data; any type of app.  
  • Easy to use; secure.
https://docs.microsoft.com/en-us/powerapps/maker/common-data-service/data-platform-intro

Examples: 
Insert or Update using Patch to CDS:
Patch(
        Vote,
        Defaults(Vote),
        {
            userid: UserID,
            vote: ThisItem.Vote,
            votedate: Now()
        }
    );
Select from CDS:
UpdateContext( { locVotes: LookUp('Vote', votedate = <date>) });
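LookUp returns only the first matching record; to pull back all matching rows, Filter is the equivalent (a sketch reusing the hypothetical columns above):
UpdateContext({ locAllVotes: Filter(Vote, userid = UserID) });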

Tip: XrmToolBox is a fantastic tool.  Multiple contributors have added to the client application and there are so many useful features; for instance, FetchXML Builder allows you to query the Dataverse tables.

Power Automate

Updated 11 December 2019:
Microsoft Power Automate (previously called Microsoft Flow) is a user-friendly workflow tool/service: a simple workflow engine sitting on Azure Logic Apps.
Basic business process flows are:
  1. Automated Flows - used when an event/status change occurs e.g. a CDS row status changes;
  2. Button Flow/Instant Flow - events such as a click trigger the flow;
  3. Scheduled Flow - in effect timer jobs;
  4. Business Process Flow - ensures users enter data consistently (think Windows Presentation Framework) & follow the same steps every time; and
  5. UI Flow (preview) - provides RPA capabilities.
https://docs.microsoft.com/en-us/power-automate/get-started-logic-flow

Building blocks for Power Automate are:
  • Trigger - what starts the workflow; can be a manual trigger or an automated trigger,
  • Actions - the steps the flow performs,
  • Conditions - branching logic,
  • Data operations,
  • Expressions - basic operations like adding two numbers together or making a string upper case (see the examples after this list), and
  • Flow Type - see the 5 flow types above.
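For example, a few workflow expressions as they would appear inside action inputs:

add(2, 3)                    // returns 5
toUpper('tax rate')          // returns TAX RATE
concat('Run at ', utcNow())  // string concatenation with the current UTC timestamp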
Display directions using a Map on an Image control in PowerApps:
https://dev.virtualearth.net/REST/v1/Imagery/Map/road/Routes/driving?waypoint.1=51.525503,-0.0822229&waypoint.2=SE9%204PN&mapSize=600,300&key=<key>
https://dev.virtualearth.net/REST/v1/Imagery/Map/road/Routes/driving?waypoint.1=" & EncodeUrl(txtDriverLocation.Text) & "&waypoint.2=" & EncodeUrl(txtDriverLocation.Text) & "&mapSize=600,300&key=AsR555key

Wednesday 6 June 2018

SharePoint Online Replacement Patterns in Diagrams

Overview: This post highlights my default position for achieving common SharePoint solutions using SharePoint Online, Flow and Azure Functions.


Matt Wade has a great resource on the components making up O365.
https://app.jumpto365.com/

Sunday 13 September 2015

SharePoint 2013 Workflow options - notes

Overview:  There are a lot of workflow options and each of the solutions lends itself favorably to different circumstances.  In this post I look at the more common options around workflow for SharePoint.  The 3 options I'm exploring are: K2 blackpearl, Nintex and SP2013 Workflow Manager.  Also note that existing SP2010 workflow still exists and is an option if your business already has workflows on the platform or you have this skill set available.  There are other products, but these are the mainstream options.

So each of these products has its place and suits different organisations.  This post is my opinion; I am not a workflow expert, and these are my thoughts on when I would favor one of the approaches.

Licencing:  Workflow Manager does not have additional licencing costs.  Nintex has a server and CAL licencing model, and K2 has a server licencing model.

Skills:  what are your existing in-house skills.  If you already have K2 or Nintex expertise it makes these products far more attractive.

Size:  How big is your organisation, how complex are the workflows, how many workflows are there, and how often do they change?  These factors shall influence the workflow option to select.

SharePoint 2013 Workflow Manager
SharePoint 2013 introduces a new standalone workflow engine based on Windows Workflow Manager 1.0 using .NET 4.5.  In the SP 2013 world, Office Web Apps (OWA) and Workflow Manager run as services separate from SharePoint 2013.
SharePoint Designer 2013:
  • Ideal for simple or medium complexity workflow processes
  • Limited to a pre-defined set of activities and actions
  • Relatively quick and easy to configure

Custom workflow development through Visual Studio:
  • Can implement state-machine workflows
  • Supports custom actions/activities
  • Supports debugging
  • Ideal for modelling complex processes
  • Requires a developer

Workflow Manager:
  • High Density and Multi-Tenancy
  • Elastic Scale
  • Fully Declarative Authoring
  • REST and Service Bus Messaging

Nintex
  • On-premise and cloud workflows – but cloud workflows do not allow custom actions
  • Nintex uses the SharePoint workflow engine
  • Easy to create Nintex workflows (good tooling) but not so easy to upgrade and maintain if complex – they require a proper dev environment if workflows require changing
  • Tight coupling with SharePoint – so upgrades need to be planned. Some workflows have broken after upgrade.
  • Can create custom activities but these are limited to constraints imposed by Nintex design surface
  • More suited to State machine workflows using reusable custom modules and user defined actions.
  • Nintex uses its own database which you will need intimate knowledge of when it comes to performance issues.

K2
K2 – technology agnostic – best suited if SharePoint is only a part of your technology snapshot; some folks consider K2 a BPM product.
Pros:
  • Off box WF hosting:  Allows for increasing the number of blackpearl servers with no resource overlap; flexible licencing model as it is server based
  • Well tried and tested workflow engine
  • Good reporting and troubleshooting
  • Excellent SOA layer (SmartObjects) with multiple products.  This is more an EA feature as it can be a great way to create an SOA.  Allows API to connect to custom SQL, CRM, SAP, Web Services.
  • Proven advanced tooling, good visual tooling (not as good as Nintex IMHO)
Cons:
  • Cost is relatively high, support costs are extensive, need to pay for dev and pre-prod licence
  • Not based on the latest MS workflow engine
  • Not easy for end users to build WF (despite marketing noise)
  • Setup and monitoring is specialised and will require advanced K2 expertise
  • Difficult to back out of the product
  • Tooling requires training and breaking out of OOTB features requires a high level of expertise and dependency on K2
  • Support tended to be patchy with technical knowledge
Updated: 2017-11-03.  Possible Extranet facing blackpearl Infrastructure design
Summary:
K2 is a good product if you need to build an SOA layer for integration and are prepared to install it correctly (cost) and maintain it.  You shall need dedicated workflow people to create the workflows.  So in the right circumstances it has its place.

Updated 11 December 2019:
Microsoft Power Automate (formerly Microsoft Flow) is the default workflow option when working with the Microsoft Power Platform (Power Apps, Power Automate and Power BI).  O365, SPO and D365 can also use Power Automate.  Azure Logic Apps is also a good option, especially if your application is C#/Azure based and not within one of the SaaS Azure offerings.