Tuesday, 6 August 2024
Using Power Automate to populate Excel files with data
Power Automate Flows containing Approvals failing with XrmApprovalsUserRoleNotFound
Overview: The Power Automate Approval action on an existing flow stops working and throws the error: "'XrmApprovalsUserRoleNotFound'. Error Message: The 'Approvals User' role is missing from the linked XRM instance." I did not have control of the environment, so I needed others to perform the fix so that I could verify it.
Problem:
The 'Start and wait for an approval' action/connector used in a Power Automate flow has been failing for 3 weeks in Dev. I now need this functionality to change the workflow 'IDC-StartApprovalCertificateWorkflow'.
Initial Hypothesis:
All the runs of my flow have been failing on the 'Start and wait for an approval' action for 3 weeks in the Dev environment. I have tried creating a new vanilla flow using the 'Start and wait for an approval' action, and it fails with the same issue.
Triggering the flow in other environments, including my production environment, the 'Start and wait for an approval' action works. I cannot see any difference except the environments. The error message "XrmApprovalsUserRoleNotFound" is basically telling me that my user should be in the 'Approvals User' role. I have the role assigned.
- Env: Client-DEV
- Type: Automated
- Plan: This flow runs on owner's plan (paul@client.com)
Resolution:
Microsoft Support: Check that the user running the flow has the 'Approvals User' role correctly assigned in the environment user security roles.
Admin: The user running the flow already has the role assigned. We have re-assigned the role again. Did not test; got the developer/owner to test.
Developer/me/flow owner: The approval has started working again in the Dev environment. I just retested, and the flow that was not firing yesterday is now working again. New flows also fire/work correctly.
Summary: The user role in Dataverse was correctly assigned; it looks like a refresh of the user in the 'Approvals User' role corrected the issue.
Research:
https://www.linkedin.com/pulse/power-automate-approvals-flows-failing-adrian-colquhoun
Sunday, 18 June 2023
App Insights for Power Platform - Part 6 - Power Automate Logging
Note: Announcement 23 Aug 2023 - integration of Power Automate telemetry data with Azure Application Insights.
Series
App Insights for Power Platform - Part 1 - Series Overview
App Insights for Power Platform - Part 2 - App Insights and Azure Log Analytics
App Insights for Power Platform - Part 3 - Canvas App Logging (Instrumentation key)
App Insights for Power Platform - Part 4 - Model App Logging
App Insights for Power Platform - Part 5 - Logging for APIM
App Insights for Power Platform - Part 6 - Power Automate Logging (this post)
App Insights for Power Platform - Part 7 - Monitoring Azure Dashboards
App Insights for Power Platform - Part 8 - Verify logging is going to the correct Log analytics
App Insights for Power Platform - Part 9 - Power Automate Licencing
App Insights for Power Platform - Part 10 - Custom Connector enable logging
App Insights for Power Platform - Part 11 - Custom Connector Behaviour from Canvas Apps Concern
- Power Automate holds its own set of logs and history. While Power Automate's internal logging is good and useful, it does not push logs into App Insights or Log Analytics for central monitoring.
- You can manually export Power Automate logs from Power Platform and import them into Log Analytics, using the "Data Export" option.
- You can use the built-in action to log directly to Azure Log Analytics from a flow. Here is a 4 min recording outlining the approach in a Power Automate Flow.
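Once the built-in action has pushed a few records, they land in a custom table in the workspace. A sketch of a query, assuming the custom log name was "FlowLogs" (an assumption for illustration); Log Analytics appends "_CL" to custom tables and type suffixes such as "_s" (string) and "_g" (GUID) to custom columns:

```
FlowLogs_CL
| where ErrorType_s == "Error"
| project TimeGenerated, WorkflowDisplayName_s, ErrorMessage_s, CorrelationId_g
| order by TimeGenerated desc
```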
- Or you can create an Azure Function that you can use to write to App Insights; below is a simple recording of the function (this recording aims to remove complexity, so there is no VS Code or publishing).
#r "Newtonsoft.Json"

using System.Net;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Extensions.Primitives;
using Newtonsoft.Json;

public static async Task<IActionResult> Run(HttpRequest req, ILogger log)
{
    string requestBody = await new StreamReader(req.Body).ReadToEndAsync();
    dynamic data = JsonConvert.DeserializeObject(requestBody);
    int eventId = data?.eventId;
    string errType = data?.errType;
    string errMsg = data?.errMsg;
    string correlationId = data?.correlationId;
    string workflowId = data?.workflowId;
    string workflowUrl = data?.workflowUrl;
    string flowDisplayName = data?.flowDisplayName;

    // Custom properties attached to every log entry written inside the scope below
    var custProps = new Dictionary<string, object>()
    {
        { "CorrelationId", correlationId },
        { "WorkflowId", workflowId },
        { "WorkflowUrl", workflowUrl },
        { "WorkflowDisplayName", flowDisplayName }
    };

    using (log.BeginScope(custProps))
    {
        if (errType == "Debug") { log.Log(LogLevel.Debug, eventId, $"{errMsg}"); }
        else if (errType == "Critical") { log.Log(LogLevel.Critical, eventId, $"{errMsg}"); }
        else if (errType == "Warning") { log.Log(LogLevel.Warning, eventId, $"{errMsg}"); }
        else if (errType == "Trace") { log.Log(LogLevel.Trace, eventId, $"{errMsg}"); }
        else if (errType == "Error") { log.Log(LogLevel.Error, eventId, $"{errMsg}"); }
        else { log.LogInformation($"Event is {eventId}, type is {errType}, and msg is {errMsg}"); }
    }

    string responseMessage = $"This HTTP triggered function executed successfully. {errType} - {errMsg}";
    return new OkObjectResult(responseMessage);
}
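A flow can call this function with an HTTP action; a sketch of a request body matching the fields the function reads from the request (all values below are illustrative placeholders, not real ids):

```
{
  "eventId": 10010,
  "errType": "Error",
  "errMsg": "TaxAPI.NinoSearch failed",
  "correlationId": "00000000-0000-0000-0000-000000000000",
  "workflowId": "@{workflow().name}",
  "workflowUrl": "https://make.powerautomate.com/",
  "flowDisplayName": "IDC-StartApprovalCertificateWorkflow"
}
```

The `@{...}` tokens are Power Automate expressions the flow resolves at run time before posting.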
Power Platform Admin Centre:
There are nice analytics inside the Power Platform Admin Centre, as shown below, to examine Flows/Power Automate:
The flows can also be reviewed on a per-environment basis.
Friday, 19 May 2023
Logging and Monitoring Advanced Canvas Apps
Overview: Most Canvas apps reach back to 3rd parties or internal storage, e.g. SQL, Dataverse, blobs, to persist data. In an ideal world we want to be able to identify error and performance concerns and trace them quickly.
Scenario: A canvas app performs a search against a custom connector, which goes to an external API; for example, it searches for Company Numbers from an Open API.
- The annotated diagram, in orange, records the button pressed to run the search. This only shows if the "Upcoming feature" "Enable Azure Application Insights correlation tracing" is enabled.
- A custom connector, in blue, makes a call to a REST/Open API.
- The last call is to an external API, highlighted in purple (in this case it was actually a mocked APIM endpoint, but the icon would change if it was totally external, such as an API search call to the IRS, HMRC, or Inland Revenue).
Tip: Always create App Insight Instances using a workspace and not the classic-mode app insights as it is being deprecated by Microsoft in early 2024.
App Insights Understanding:
App Insights setup using the Workspace-based setup (Log Analytics) can be queried in two ways:
- Query thru App Insights
- Query thru the Log Analytics workspace (the Kusto/KQL is slightly different, but it's rather minor)
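A sketch of that minor difference, querying the same trace rows both ways (the table and column names change between the two surfaces; with workspace-based App Insights, traces surface in the workspace as AppTraces):

```
// Query thru App Insights
traces
| where message contains "TaxAPI"

// Same data queried thru the Log Analytics workspace
AppTraces
| where Message contains "TaxAPI"
```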
The two options for logging Flows into App Insights.
Tuesday, 14 March 2023
Power Platform Logging, Monitoring, and Alerting
This post relates to a previous strategy blog post, read that first https://www.pbeck.co.uk/2023/02/setting-up-azure-application-insights.html
Overview: Microsoft uses Azure Application Insights to natively monitor Power Apps using an instrumentation key at the app level. To log for model-driven apps and Dataverse, this is a Power Platform config at the environment level, e.g. UAT, Prod.
When setting up Application Insights, use the Log Analytics workspace approach and not the "Classic" option as this is being deprecated.
Power Apps (Canvas Apps): Always add the instrumentation key to all canvas apps; it is set at the "App" level within each canvas app. Deploying solutions brings challenges with changing keys for App Insights logging (unmanaged layers).
The "Enable correlation tracing" feature, in my opinion, should always be turned on; it is still an experimental feature, but with it off the logging is based on a session id.
"Pass errors to Azure Application Insights" is also an experimental feature. Consider turning it on.
Canvas Apps have "Monitor", model-driven apps also have this ability to monitor, and Power Automate has its own monitoring.
Log to App Insights (put App Insights on Azure Log Analytics); a simple example with customDimensions/record follows.
Trace(
    "My PB app ... TaxAPI.NinoSearch Error - Search - btnABC",
    TraceSeverity.Error,  // use the appropriate tracing level
    {
        myappName: $"PB App: {gblTheme.AppName}",
        myappError: FirstError.Message,  // optional
        myappEnvironment: gblEnv,
        myappErrorCode: 10010,
        myappCorrelationId: GUID()  // unique correlationId
    }
);
Query the logs using kusto:
traces
| extend errCode = tostring(customDimensions["myappErrorCode"]), err = tostring(customDimensions["myappError"])
| where errCode == "10010"
Coming June 2023
Push cloud flow execution data into Application Insights | Microsoft Learn
Allows logging to tie Flows back to the calling Canvas app. You can do this manually now, but it has to be applied at all calls to or after the flow.
Below is a basic checklist of decisions to ensure you have suitable logging
Logging Checklist:
- Setup Azure Log Analytics (1 per DTAP env e.g. uat, prd)
- Get the workspace key needed for logging to Log Analytics: "Agents" > "Log Analytics agent instructions", then copy the Workspace Id and the Secondary Key
- Create an Azure Application Insights per DTAP
- Each Canvas app needs an instrumentation key (check, have you aligned DTAP log instances with the Canvas App DTAP)
- Power Automate has great monitoring, but it is a good idea to setup logging for Dataverse (which shall cover model apps), done thru Power Platform Admin Studio > Environment
- Enable Logging Preview Feature for Canvas apps & check the power automate push cloud execution feature state.
- Do you have logging patterns in your Canvas app for errors, do you add tracing, and is it applied consistently?
- Do you have a Pattern for Power Automate runs from Canvas apps? I like to log if the workflow errors after the call.
- Do you have a Pattern for Custom Connectors?
- Do you correlation trace Custom API (internal and 3rd party)?
- Do you have a Try, Catch, Finally scope/pattern for workflows? How do you write to the logs? Most common is to use an Azure Function with the C# SDK. I like to use the Azure Log Analytics connector in my catch scope to push error info into the workspace log using a custom table.
- Ensure all Azure Services have instrumentation keys. Common examples are Azure Functions, Azure Service Bus, API Manager, the list goes on...
- Do you implement custom APIM monitoring configuration?
- Do you use the SDK in your code (Functions etc.)?
- Setup Availability tests - super useful for 3rd party API's.
Once you have the logs captured and traceable (Monitor & Alerting Checklist):
- Create Queries to help find data
- Create monitoring dashboard using the data
- Use OOTB monitoring for Azure and the platform
- Consider linking/embedding to other monitors i.e. Power Automate, DevOps, Postman Monitor
- Setup alerting within the Azure Log Workspace using groups, don't over email. For information alerts, send to Slack or Teams (very simple to setup a webhook or incoming email on a channel to monitor)
- Power Automate has connectors for adaptive cards channel messaging, consider using directly in Flows or from alerts, push the data into a flow that will log the alert using an adaptive card right into the monitoring channel.
Sunday, 1 March 2020
Power Automate Notes
Power Automate was previously called Flow. Power Automate contains "Flows". Power Automate is workflow including RPA options, referred to as Power Automate Desktop (PAD). Power Automate is a workflow engine that is based on Azure Logic Apps: a powerful, extendable workflow solution for low-code automation. It allows workflows to be easily automated with 3rd party systems, e.g. SAP.
Used for:
- Personal Workflows, e.g. as a scrum master, I send an email to all people that have not updated the DevOps Scrum board in the last day.
- Business Process e.g. Holiday request form. If more than 10 days, need senior manager approval. Generate RFP based on an event. Historically, used K2 or Nintex or WCF workflows for business processes.
- Integration: e.g. move Twitter posts into my warehouse for data-mining later.
Img 1. Usage limits in Flows are counted against the flow owner.
- Flow Names: start with a verb, format: Verb + What the Flow does + (trigger), e.g. "Get Tax Rates (Instant)". I like to also prefix with the Company and Project, but feel free to have a standard to suit your business, e.g. EY-USTax Get State Taxes (Instant), EY-USTax Get All US State Tax Rates (Scheduled), or Get SalesForce Data. Optionally, for project-specific workflows, I also prefix with the project name, e.g. USTax-GetTaxRate.
- Description: Short description to help readers understand what the flow does.
- Actions: Leave the Action desc and add info e.g. Compose: Tax Note.
- Variables: prefix with "v" e.g. vTaxTotal in camelCase. e.g. vUserAge.
- Error handling & Logging: Catch errors and log them to App Insights via an Azure Function, or log using the built-in Azure Log Analytics action. More logging means better traceability.
- Scope: Add scope actions for try catch logic. Add multiple actions inside a "Scope" Action
- Terminate the flow with the Terminate Action if the flow has failed.
- Environment Variables: Great for logging as I go thru DTAP. Also see.
- Connection Reference Name: Agree a format, does this flow run as user or as a specified user.
- Loop Sparingly, use First() for performance.
- Owner: I like to use a service account in dev; it's a good idea to add tech owners so that when the flow needs updating, support can easily find who they should talk to. Understand who you are running the flow as; this ties to licencing but is critical. You need to know your Actions and licencing limits on a project.
- Comments: I'm not a huge fan, as the naming should make most flows self-documented, but for tricky logic, comments are great. Agree a standard.
- Retry policy: What happens if an action fails, do you want to try again?
- Seeded licence is part of O365. Use standard functionality such as standard connectors without needing to pay more for advance. The advance/premium connectors are not part of the O365 licence.
- Per User licence - Allows the user $15 retail, can get discount with bulk and can use the advanced connectors & on-prem. gateway. Many users need multiple workflows, normally personal workflows.
- Per User RPA licence - same as above but also has amazing RPA capabilities.
- Per Flow/Process - $100 per process per month, minimum 5 flow licences per month. Anyone can use it as part of the process. Use where few people are involved but the process does a lot of workflows. Can add processes one at a time after the first 5.
Power Automate has some licence add-ons available: AI builder and an unattended RPA add-on.
"Power Apps licenses will continue to include Power Automate capabilities". I don't know exactly what is included, but I assume it means that, while I'm in Power Apps, I can make Flows using any connector I can use in Power Apps.
Build workflows:
- You can get a dedicated IDE tool for Power Apps or use the browser (which I always use).
- There are over 350 connectors (in both standard and premium) and you can always use a custom connector to any OpenAPI (Swagger) endpoint.
- Templates have some examples and often help as a starting point to make your own custom Flows in Power Automate.
- Easy clear tracing so you can see what part of the workflow is working and where you fail, and you can drive into the issue. Super easy to use.
- Example of an Instant Cloud flow triggered by a canvas Power App...
Query a Dataverse table in a Flow using OData |
Extending - break out for programmatic control, I use C# Functions from my Flows and call them via HTTP triggers.
Retrieving a row from the Dataverse custom "Subject" table.
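The Dataverse "List rows" action in a flow accepts OData query options; a hedged sketch against the "Subject" table (the column names below are assumptions for illustration):

```
$select=title,subjectid
$filter=contains(title,'Tax')
$top=10
```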
- Also known as UI Flows within Power Automate. Microsoft have purchased and integrated Softomotive to add WinAutomation for UI flows.
- Attended (user logged in) and unattended (complete tasks without manual intervention) versions
- Can have multiple instances
- An API is generally better than using RPA, as it is versioned and generally not changeable, whereas a website can be changed, causing the RPA flow to fail. RPA is useful, for instance, when the REST API is incomplete.
- Recording tool for creating UI flows - Web use Selenium to record.
- 3 Types: 1) Windows/Desktop/Screen reader, 2) web/website (Selenium), and 3) WinAutomation (covers both Windows and Web; easy to use but not as full-featured yet).
- WinAutomation has a drag and drop IDE, has error handling.
- UI flows are well priced. Also get AI builder credits with UI flow licences.
- "Power Automate Per user plan with attended RPA to use UI flows and WinAutomation" Microsoft.
Problem: Migrating solutions between environments, all the workflows fail with 403 errors when they use the Dataverse connector. Tracing the flows, I can see the error "Flow client error returned with status code 'BadRequest' and details {error: code: 'XrmApplyUserFailed... UserNotinActiveDirector ... does not exist in user tenantId"
Sunday, 15 July 2018
Power Platform Notes - Power Apps, CDS, Power BI & Power Automate
Power Apps
Variables: Microsoft Docs: "PowerApps and Excel both automatically recalculate formulas as the input data changes".
- Contextual variables - scoped at a screen level
Fx> UpdateContext({MyTimesheetId: 34})
Fx> UpdateContext({MyTimesheetId: txtTimesheetId.Text}) unless you want the context to float
- Global variables - scoped at app level
Fx> Set(MyUniqueClientNo, 12)
To pass a variable to another screen, use the Navigate overload in the OnSelect property of a button:
Fx> Navigate(Screen2, ScreenTransition.None, {TSvar: MyTimesheetId})
MyTimesheetId is a contextual variable
If Statement:
Fx> If(MyUniqueClientNo = 12, "Yes", "No") e.g. used as the Text property of lblAns
Environments:
- Power Platform allows you to create multiple environments; each environment is connected to your AAD tenant.
- DLP policies can be applied against all environments or specific environments.
- Use Environments for DTAP, business units, or Extranet vs Intranet.
- Updated June 2022: Environments can be of type: Default (don't use and rename), Developer, Trial, Sandbox or Production.
- Each Power App environment can have no CDS/Dataverse, or a single CDS/Dataverse connection.
ALM:
Power Apps has its own source control, and you can manually export and import entire projects with "Export package"; note that connections are not exported as part of the package. Solutions are the best way to move code between environments, and it's a good idea to use environment variables. You get ADO integration, better security, and better tooling using solutions over manually exporting code with packages. Power Apps Build Tools allows for ADO/DevOps integration, generally using solutions to deploy. At a minimum, you should have your own dev env (Sandbox or Developer type), a test env (Sandbox type), and a production env (Production type) for any serious app you build.
Add "Power Platform Tools"
- Dev environment that is of the "sandbox" type. Have an un-managed solution that will act as your base canvas app for all new Power Apps. When you export the solution, remember to export as a managed solution and ensure environment variables are emptied before publishing the export. A good place to back up to is a repository such as Azure DevOps/Git.
- Test env of the type "production" or "sandbox". Only allow managed solutions to be deployed. Watch out for custom connectors, I can't get them to deploy correctly in solutions.
- Prd env must be of type "production".
Licencing:
Updated: Power Apps Licencing Summary as of 30 December 2019
- PowerApps using AAD B2B user (both members and guest) using standalone Power Apps shall require the Power Apps licence (not necessarily Portal Apps). Steps to Share Power Apps with Guests.
- SharePoint user interacting with a Power Apps form on a SharePoint list do not require a Power Apps licence.
- Microsoft provide a whitepaper with suggested naming/coding standards.
- Controls and variables should use prefixed Camel case e.g. txtContactEmail
- Data sources use Pascal Case e.g. ContactUsers
- Screens must have full names, e.g. Edit Contact User Screen, as they are read as named by screen readers like JAWS and Microsoft reader.
- Prefix controls and collections (Label is lbl, Textbox is txt, Button is btn, Collection is col, drop-down lists are ddl, Form is frm, Group is grp, Gallery is gal, Icon is ico, Images are img, Table is tbl). I also prefix my components with com.
- Variable naming: gbl for global variables and var for context variables
- Use Containers for layout and groups for control aggregation.
- Use TabIndex so keyboard navigation makes sense. Use the AccessibleLabel over HintText. Ensure colour contrasts and use the approved colour templates stored in global variables. All fonts, sizing and colouring must come from global variables so the app can be easily changed. Use the App Checker and complete all identified accessibility issues/concerns. Test on actual devices and use screen readers.
- Ensure App Insights is setup, capture and log errors, also use the monitoring (maybe not in production).
Build responsive website using CDS. Allow anonymous access or implement multiple Identity Providers (IdP) such as AAD B2C, AAD, Google+, LinkedIn.
Updated: 28 July 2018:
Common Data Service (CDS): Comes from Dynamics CRM Sales; pretty much used like content types in SharePoint.
- Based on Azure SQL & uses CosmosDB, with a nice API that supports REST/OData v4;
- Has row and field RBAC and uses AAD for authentication;
- Allows Power BI, Power Apps (previously labeled as PowerApps) & Flow (Power Automate - new name since Ignite 2019) to work with CDS (renamed to Dataverse);
- Use Power Query to Import data easily or data flows
- Multiple data source such as SQL Server, Cosmos, files, Excel, Search, 3rd parties such as SAP.
- CDS is not part of the O365 licencing (including E5), you need to get a PowerApps P1 or P2 (Now Power Apps for Users as licencing changed circa Oct 2019). Note Power Apps licencing included with O365 is extremely limited and for instance does not cover CDS;
- One Power Platform Environment ties to one CDS/Dataverse database
- Any type of data; any type of app.
- Easy to use; secure.
Power Automate
Updated 11 December 2019: Microsoft Power Automate (previously called Microsoft Flow) is a user-friendly workflow tool/service: a simple workflow engine sitting on Azure Logic Apps.
Basic business process flows are:
- Automated Flows - used when event/status change e.g. CDS row status changes
- Button Flow/Instant Flow - Events such as a click triggers the flow;
- Scheduled Flow - in effect timer jobs;
- Business Process Flow - ensures users enter data consistently (think Windows Presentation Framework) & follows the same steps every time; and
- UI Flow (preview) - Provides RPA capabilities;
Building blocks for Power Automate are:
- Trigger - what starts the workflow, can be manual trigger or an automated trigger
- Actions - the steps the flow performs, e.g. create a row or send an email,
- Conditions,
- Data operations,
- Expressions - basic operations like adding two numbers together, making a string upper case, and
- Flow Type - see the 5 flow types above.
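A few expression examples, as typed into a flow's expression editor (all are standard workflow definition language functions; the annotations are illustrative):

```
add(2, 3)                                  // returns 5
toUpper('power automate')                  // returns POWER AUTOMATE
concat('Run id: ', workflow().run.name)    // string concatenation
formatDateTime(utcNow(), 'yyyy-MM-dd')     // today's date as text
```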
https://dev.virtualearth.net/REST/v1/Imagery/Map/road/Routes/driving?waypoint.1=51.525503,-0.0822229&waypoint.2=SE9%204PN&mapSize=600,300&key=<key>
https://dev.virtualearth.net/REST/v1/Imagery/Map/road/Routes/driving?waypoint.1=" & EncodeUrl(txtDriverLocation.Text) & "&waypoint.2=" & EncodeUrl(txtDriverLocation.Text) & "&mapSize=600,300&key=AsR555key
Wednesday, 6 June 2018
SharePoint Online Replacement Patterns in Diagrams
Matt Wade has a great resource on the components making up O365.
https://app.jumpto365.com/
Sunday, 13 September 2015
SharePoint 2013 Workflow options - notes
Each of these products has its place and suits different organisations. This post is my opinion; I am not a workflow expert, and these are my thoughts on when I would favour one of the approaches.
Licencing: Workflow Manager does not have licencing costs. Nintex has a server and CAL licencing model, and K2 has a server licencing model.
Skills: What are your existing in-house skills? If you already have K2 or Nintex expertise, it makes these products far more attractive.
Size: How big is your organisation, how complex are the workflows, how many workflows are there, and how often do they change? These shall influence the workflow option to select.
SharePoint 2013 Workflow Manager
SharePoint 2013 introduces a new standalone workflow engine based on Windows Workflow Manager 1.0 using .NET 4.5. In the SP 2013 world, Office Web Apps (OWA) and Workflow Manager run as services separate from SharePoint 2013.
- SharePoint Designer 2013
- Ideal for simple or medium complexity workflow processes
- Limited to a pre-defined set of activities and actions
- Relatively quick and easy to configure
- Custom workflow development through Visual Studio
- Can implement state-machine workflows
- Supports custom actions/activities
- Supports debugging
- Ideal for modelling complex processes
- Requires a developer
- Workflow Manager
- High Density and Multi-Tenancy
- Elastic Scale
- Fully Declarative Authoring
- REST and Service Bus Messaging
Nintex
- On-premise and cloud workflows – but cloud workflows do not allow custom actions
- Nintex uses the SharePoint workflow engine
- Easy to create Nintex workflows (good tooling) but not so easy to upgrade and maintain if complex – they require a proper dev environment if workflows require changing
- Tight coupling with SharePoint – so upgrades need to be planned. Some workflows have broken after upgrade.
- Can create custom activities but these are limited to constraints imposed by Nintex design surface
- More suited to State machine workflows using reusable custom modules and user defined actions.
- Nintex uses its own database which you will need intimate knowledge of when it comes to performance issues.
K2
- Off-box WF hosting: allows for increasing the number of K2 blackpearl servers with no resource overlap; flexible licencing model as it is server based
- Well tried and tested workflow engine
- Good reporting and troubleshooting
- Excellent SOA layer (SmartObjects) with multiple products. This is more an EA feature as it can be a great way to create an SOA. Allows API to connect to custom SQL, CRM, SAP, Web Services.
- Proven advanced tooling, good visual tooling (not as good as Nintex IMHO)
- Cost is relatively high, support costs are extensive, need to pay for dev and pre-prod licence
- Not based on the latest MS workflow engine
- Not easy for end users to build WF (despite marketing noise)
- Setup and monitoring is specialised and will require advanced K2 expertise
- Difficult to back out of the product - tooling requires training, and breaking out of OOTB features requires a high level of expertise and dependency on K2
- Support tended to be patchy with technical knowledge
K2 is a good product if you need to build an SOA layer for integration and are prepared to install it correctly (cost) and maintain it. You shall need dedicated workflow people to create the workflows. So in the right circumstances it has its place.
Updated 11 December 2019:
Microsoft Power Automate (formerly Microsoft Flow) is the default workflow option when working with the Microsoft Power Platform (Power Apps, Power Automate and Power BI). O365, SPO and D365 can also use Power Automate. Azure Logic Apps is also a good option, especially if your application is C#/Azure based and not within one of the SaaS Azure offerings.