Wednesday 26 April 2023

Uploading files to a Dataverse table using the "Add picture" control in a Canvas App

Overview: I need to be able to upload pictures (or any file, for that matter) and persist the file in a Dataverse table.  This took me a little longer than it should have.

Notes: I added an "Add picture" control to the screen; I also added a label and an icon to clear the uploaded image.

I also added a Save button; this is where I persist to a Dataverse table named "Evidence". The Evidence table has a column called "File" of type "File". The part that took me a while in the Patch statement was getting the file into the correct format (the File column expects a record, not a picture); a sketch of the shape is below.  This works for any file upload, but has the drawback of needing to change the lookup to "All file types".
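For reference, a minimal sketch of the kind of Patch call I mean, on the Save button's OnSelect. The control name (AddMediaButton1, from the "Add picture" control) and the { FileName, Value } record shape are assumptions here, so adjust them to your own app:

// OnSelect of the Save button - a sketch; control and column names are assumptions
Patch(
    Evidence,
    Defaults(Evidence),
    {
        // The File column expects a record (file name + file contents), not a picture
        File: {
            FileName: AddMediaButton1.FileName,
            Value: AddMediaButton1.Media
        }
    }
)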

Saturday 22 April 2023

Microsoft's Well-Architected Framework

Overview: The goal is to make your use of Azure and your IT function operate as optimally as possible across performance, scalability, cost, reliability, DevOps, failure testing, geo/data sovereignty, application security and authentication security.  It is to be done consciously: 1) Collect/Gather (Well-Architected Review) > 2) Analyse > 3) Advise (build a plan) > 4) Implement

Five Pillars:

  1. Security;
  2. Performance Efficiency;
  3. Reliability;
  4. Cost Optimisation;
  5. Operational Excellence;

Tons of tools are part of the Well-Architected Framework:

  • Azure Advisor: analyses your workload and gives possible recommendations/improvements, which can be imported into the Azure Well-Architected Review Tool.
  • Azure Well-Architected Review Tool (amazing): answer questions and pull in the Azure Advisor recommendations. Pick a workload, pick one or more of the five pillars, then work through the Azure Well-Architected Review Tool to get a milestone, and look to implement the improvements iteratively.

  • Well-Architected checklist: provides templates to complete actions, e.g. RPO/RTO, security threat analysis and threat modelling (STRIDE is Microsoft's equivalent of DREAD).

1. Security Pillar (protect against, detect and respond to threats)

Tools: monitoring from Azure Security Centre (ASC) NB!, Azure Defender and Microsoft 365 Defender feed Azure Sentinel (the SIEM); on-premises sources can also be streamed into the SIEM.

2. Performance Efficiency Pillar
Trade off cost against reliability, scale and performance.  Chaos testing: break/remove resources to mimic problems.  Monitor resources for resilience and performance.  Decide whether you want to scale dynamically, react to performance, or scale the services up as load increases.  Cache (e.g. Redis), in as many layers as possible.  Use multiple regions/zones to keep services close to users (paired regions are a good idea); zone and region resilience can affect performance.  A health model in effect means having monitors and alerts that verify your system's health - think Azure Dashboards or Grafana.

3. Reliability Pillar
High Availability (HA) & Resilience of Azure Resources

4. Cost Optimisation Pillar
  • Understand cost (choose the right service, e.g. Cosmos DB can be cheaper than SQL, or vice versa).
  • Optimise: remove orphaned resources; weigh reservations vs PAYG (licence optimisation, scale consumption when needed / optimise instances); be pragmatic about the cost-to-benefit trade-offs.  Use cost modelling to understand what the cost is likely to be going forward.  Good RPO/RTO and multi-geo is expensive, but worth it if you need it.  Design choices affect the cost.  Optimise data transfers and auto-scaling (vertical and horizontal, both expanding and reducing resources).  Use the Azure Cost Management tool.  Automating provisioning helps with cost because the correct resource provisioning is implemented.  Bicep is stateful whereas ARM is stateless.
  • Control costs going forward (review periodically/constantly, use alerts to monitor usage).  Monitor your resource usage and ask whether it can be reduced.

5. Operational Excellence Pillar

Dr. Kai Dupé presented the Well-Architected Framework on behalf of Microsoft on 20 & 21 April 2023; I took notes during the sessions to build this post.

Monday 10 April 2023

Postman automation reminders

Also see "Postman to check Open API's are Running"

Fire Postman collections on demand using curl

A monitor is already set up: I need the Postman monitor id and an API key; a sketch of the call is below.
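A hedged example using the Postman API's run-monitor endpoint (POST https://api.getpostman.com/monitors/{id}/run), here called via curl from PowerShell. The monitor id and API key below are placeholders:

# Trigger an existing Postman monitor on demand (id and key are placeholders)
curl.exe -X POST "https://api.getpostman.com/monitors/<your-monitor-id>/run" `
         -H "X-Api-Key: <your-postman-api-key>"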

Run a local Postman collection using Newman via PowerShell (call it from CI pipelines or a shortcut on the desktop); a sketch follows.
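A minimal PowerShell sketch, assuming Newman is installed globally via npm and the collection/environment have been exported locally (the file names are made up):

# One-off: install Newman (requires Node.js/npm)
npm install -g newman

# Run the exported collection against an environment; the non-zero exit code on
# failures makes this easy to wire into a CI pipeline or a desktop shortcut
newman run .\MyApi.postman_collection.json `
    -e .\Dev.postman_environment.json `
    --reporters cli,junit --reporter-junit-export .\newman-results.xml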




Thursday 6 April 2023

Runas on Flows

Overview: If I use a connection in a Canvas App, the signed-in user uses their own permissions and the connector runs as the signed-in user.

Problem: I wish to run a flow as a specific user and not as the user calling the flow from the Canvas App.

Hypothesis: I wish to call a logging connector to write into Log Analytics, so I have created a flow. If I use the Power Apps V2 trigger, it offers an option to run in another user's context.

Resolution: Open the Workflow, ensure you are using the Power Apps V2 trigger, then...


Here I use Scopes to perform a Try / Catch / Finally set of logic


Tip: most people tend to use a custom connector to push the error message from the workflow into an Azure Function; the function app uses the App Insights SDK and logs the workflow error.

Simple C# code (an Azure Function) that writes to App Insights:

#r "Newtonsoft.Json"
using System.Net;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Extensions.Primitives;
using Newtonsoft.Json;
public static async Task<IActionResult> Run(HttpRequest req, ILogger log)
{
string requestBody = await new StreamReader(req.Body).ReadToEndAsync();
dynamic data = JsonConvert.DeserializeObject(requestBody);
int eventId = data?.eventId;
string errType = data?.errType;
string errMsg = data?.errMsg;
string correlationId = data?.correlationId;
string workflowId = data?.workflowId;
string workflowUrl = data?.workflowUrl;
string flowDisplayName = data?.flowDisplayName;
var custProps = new Dictionary<string, object>()
{
{ "CorrelationId", correlationId},
{ "WorkflowId", workflowId},
{ "WorkflowUrl", workflowUrl},
{ "WorkflowDisplayName", flowDisplayName}
};
using (log.BeginScope(custProps))
{
if (errType=="Debug") { log.Log(LogLevel.Debug, eventId, $"{errMsg}"); }
else if (errType=="Trace") { log.Log(LogLevel.Trace, eventId, $"{errMsg}"); }
else { log.LogInformation($"Event is {eventId}, type is {errType}, and msg is {errMsg}");}
};
string responseMessage = $"This HTTP triggered function executed successfully. {errType} - {errMsg}";
return new OkObjectResult(responseMessage);
}
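For reference, a hedged example of the JSON body the workflow/custom connector would post to this function - the property names come from the code above, the values are placeholders:

{
  "eventId": 1001,
  "errType": "Debug",
  "errMsg": "Something went wrong in the Catch scope",
  "correlationId": "00000000-0000-0000-0000-000000000000",
  "workflowId": "<flow-id>",
  "workflowUrl": "<workflow-run-url>",
  "flowDisplayName": "Log to App Insights"
}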

More Info:

Reza Dorrani has a great recording showing how to run Power Automate flows using elevated/shared accounts.