Tuesday, 20 January 2026

App Insights for Power Platform - Part 12 - A fix story

Overview: My monitoring notified me that production errors were going crazy.  I quickly realised that all the tenants at my largest customer were down.

Problem: Apps would open and then remain stuck in a loading state.  I could see the Canvas apps start, but they would just sit there trying to load.

Initial Hypothesis: Originally I thought it was one environment, and since I knew the app was loading a SharePoint list, I suspected permissions.  But it was happening on all my environments, and my continuous tests were picking them all up.

I checked the Microsoft service status page and all services were reported healthy: https://azure.status.microsoft/en-gb/status

I went to QA and Dev, and they were failing with the same issue.  I ran the Canvas app in debug mode and could see the error related to connectors calling the European APIM for Dataverse.

Next I checked the client's environments in other regions; they were failing too.  I was a bit surprised I wasn't getting any feedback from other clients or community feeds, so I logged on to my own company's Power Platform tenant, and its environments were working.  So the issue was specific to this one client.  Now I knew the extent: flows would not run, but only on the client's environments.

Here are the CI test report and results for a subset of the apps on the business unit's production environment.

I ran a test to check a single production department environment with 24 Canvas apps:

Simple test: runs on 3 worker processes against a single browser engine, Chromium.

The report logs show that all 8 of these apps are not working.

All 8 tests failed to find the title of the page.  I log from both Playwright and the Canvas apps using App Insights traces, so the error was super easy to pick up, even without the Power Platform trace in the Dev environment.
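
For context, a minimal sketch of the kind of title check each test performs (the app URL and expected title are hypothetical placeholders, not the client's real apps):

import { test, expect } from '@playwright/test';

test('Canvas app loads past the spinner', async ({ page }) => {
  // Hypothetical Canvas app play URL.
  await page.goto('https://apps.powerapps.com/play/<APP_ID>');
  // A stuck app never renders its page title, so this assertion times out and fails the run.
  await expect(page).toHaveTitle(/Expense Tracker/, { timeout: 60_000 });
});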

Resolution: Raise a ticket with Microsoft, explain the issue, and provide the supporting info (raised by a support engineer rather than by me).  30 minutes later, the company was advised to close all browsers and try again.  We did this manually and the issue was resolved.

I still had the old SPO token cached in Playwright's Chromium profile, so I ran the test on all 3 browser engines.  Chromium failed with the old token; Firefox and WebKit passed as they grabbed new login tokens.
Success: the same test using the WebKit browser engine worked, as it had a new SPO bearer token.
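
For reference, a sketch of the Playwright config used to run the same suite on all 3 engines (3 workers, as in the simple test above):

// playwright.config.ts
import { defineConfig, devices } from '@playwright/test';

export default defineConfig({
  workers: 3,
  projects: [
    { name: 'chromium', use: { ...devices['Desktop Chrome'] } },  // failed: reused the stale SPO token
    { name: 'firefox',  use: { ...devices['Desktop Firefox'] } }, // passed: acquired a fresh token
    { name: 'webkit',   use: { ...devices['Desktop Safari'] } },  // passed: acquired a fresh token
  ],
});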


If I find out what caused the issue, I'll post what Microsoft did and what they found.


Useful Dashboards I used:

Portal.azure.com OOTB App Insights URL to retrieve an Operation ID's run history (Logic Apps run blade):
https://portal.azure.com/#blade/HubsExtension/BladeRedirect/bladeName/Microsoft_Azure_LogicAppsRunBlade/runId/<OPERATION_ID>/logicAppName/<LOGIC_APP_NAME>/resourceGroupId/%2Fsubscriptions%2F<SUBSCRIPTION_ID>%2FresourceGroups%2F<RESOURCE_GROUP_NAME>%2Fproviders%2FMicrosoft.Logic%2Fworkflows%2F<LOGIC_APP_NAME>



Series

App Insights for Power Platform - Part 1 - Series Overview 

App Insights for Power Platform - Part 2 - App Insights and Azure Log Analytics 

App Insights for Power Platform - Part 3 - Canvas App Logging (Instrumentation key)

App Insights for Power Platform - Part 4 - Model App Logging

App Insights for Power Platform - Part 5 - Logging for APIM 

App Insights for Power Platform - Part 6 - Power Automate Logging

App Insights for Power Platform - Part 7 - Monitoring Azure Dashboards 

App Insights for Power Platform - Part 8 - Verify logging is going to the correct Log analytics

App Insights for Power Platform - Part 9 - Power Automate Licencing

App Insights for Power Platform - Part 10 - Custom Connector enable logging

App Insights for Power Platform - Part 11 - Custom Connector Behaviour from Canvas Apps Concern 

App Insights for Power Platform - Part 12 - A fix story (this post)


Thursday, 15 January 2026

MS Fabric - Storage underpinning

Microsoft Fabric gets all its data from OneLake.  I believe all storage, except Real-Time Intelligence (RTI), uses OneLake to ensure there is only one copy of the data.


Left side of the diagram/Data sources:

Any data source - copy data from any source into a Lakehouse; the Parquet/Delta files are stored in OneLake and exposed via the catalog layer.

Mirroring: Some data sources are mirrored into OneLake Parquet, including Snowflake, PostgreSQL, and Azure SQL 2025. 

MS Fabric SQL is part of Fabric, and the SQL database is mirrored into OneLake.

Shortcuts - third-party software holds the Parquet data, and a shortcut allows MS Fabric to query the data in place.

Sunday, 11 January 2026

Working with Snowflake and MS Fabric

Overview: Snowflake covers a small area of what Fabric does, but it covers that area unbelievably well.  Large enterprises can use the two together even though there is some overlap; Snowflake is great at what it does!

Five ways to use Snowflake data in Fabric: 

1. ETL - use Data Factory or another ETL tool to copy data from Snowflake into Fabric's OneLake (a point-in-time copy).  This should be your last option.

2. Direct query (no copy) - Fabric compute (Power BI, Notebooks, Dataflows, Pipelines) runs queries directly against Snowflake’s SQL endpoint. Best when you want zero‑copy and Snowflake stays the system of record.

3. Mirroring (copy + sync) - Fabric mirrors a Snowflake database using CDC into OneLake so Fabric can work locally with governed, accelerated data while staying synced with Snowflake.  Good for small and commonly accessed data. 

4. Shortcut to Snowflake‑hosted Iceberg (no data copy) - Fabric creates a Shortcut (virtual pointer) to Iceberg tables stored with Snowflake, so Fabric tools read them without moving data.

5. Snowflake writes Iceberg to OneLake - like option 3, but Snowflake handles the outbound copy: Snowflake materializes Iceberg tables into a OneLake location; Fabric then reads them natively (open‑format interop).

Reference:
Greg Beaumont's Architecture blog - Fantastic stuff! 

Saturday, 10 January 2026

SharePoint AI new features

SharePoint has added some great AI components when coupled with M365 Copilot.  These have been in GA for a few weeks:

The Summarize feature is fantastic:

  • It works well in Word and Excel and is okay in PPTX files.
  • It processes images using OCR and AI to interpret them and generate a summary.
  • It works with PDFs/Adobe documents, including text with embedded images and image-based PDFs.  PDF/A is also supported.

There is no MS Graph API for this functionality yet, so I used Playwright to scrape the summaries and add them to a summary metadata field.  Along the way I came across Knowledge Agent, which is in public preview, and it is fantastic.
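
A rough sketch of the scraping approach (the URL, selector, and column name are hypothetical placeholders, not SharePoint's actual markup):

import { test } from '@playwright/test';

test('capture a file summary', async ({ page }) => {
  // Hypothetical document library URL.
  await page.goto('https://contoso.sharepoint.com/sites/Docs/Shared%20Documents');
  // Open the first file's details pane (selector is a placeholder).
  await page.getByRole('button', { name: 'Open details pane' }).click();
  // Grab whatever summary text the pane renders.
  const summary = await page.locator('[data-automationid="file-summary"]').innerText();
  console.log(summary); // a later step writes this into the 'Summary' metadata column via the SharePoint REST API
});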

Knowledge Agent (for SharePoint Online, if you have an M365 Copilot licence):

Knowledge Agent in the SPO UI

If you need it enabled, a SharePoint Administrator has to enable it using PowerShell.

Generate metadata using your existing files and metadata, as the example below shows:

Sunday, 30 November 2025

SharePoint for BIM

Overview: Building Information Modelling (BIM) enterprise document management systems are expensive.  I decided to use SharePoint instead, as we already have SharePoint licencing, security is approved, and it works with Entra MFA.  SharePoint offers great APIs and is well known and easy to manage.  Lastly, I got much better performance from SharePoint than from the other two dedicated document management systems I evaluated.

Concerns:

  • BIM versioning, numbering, and workflows: no issues seen; it merely needs a simple workflow
  • Web Viewer for CAD files

Advantages:

  • Part of the Microsoft ecosystem
  • Free, as we all have E3 licences
  • Users know how to use and manage SharePoint, so no training is required
  • SaaS is running and is part of our business SLA, with 99.9% uptime.  SharePoint has an active-active architecture, built-in backups and data is stored in multiple locations
  • Reduced setup, and no external third party requiring contracts and approvals.  No supplier has nearly as many compliance attestations, including ISO 27001, SOC 1, SOC 2, SOC 3, and GDPR.
  • Security is already in place via the client's Entra user base with MFA, plus DLP and sensitivity labels.  Great built-in data residency, audit logs, and retention policies.  File sync is super helpful when working with large CAD files, especially in remote locations.  All data is encrypted at rest and in transit.
  • SharePoint is widely used in construction projects.  Customers and third parties can securely access SharePoint Online for central collaboration.
  • Mobile-friendly, tool-friendly, management-friendly approach to BIM.

Document numbering convention:
<ProjectCode>-<CreatedDivision/PartnerCode>-<DocType>-<Discipline/Category>-<IncrementNo>
e.g. BLD123-MACE-CAD-ELE-00001
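
A quick sketch of validating this convention; the exact segment lengths and allowed characters are my assumptions for illustration:

// Assumed shape: 3 letters + 3 digits, 2-6 alphanumerics, two 3-letter codes, 5-digit increment.
const DOC_NUMBER = /^[A-Z]{3}\d{3}-[A-Z0-9]{2,6}-[A-Z]{3}-[A-Z]{3}-\d{5}$/;

function isValidDocNumber(name: string): boolean {
  return DOC_NUMBER.test(name);
}

console.log(isValidDocNumber('BLD123-MACE-CAD-ELE-00001')); // true
console.log(isValidDocNumber('BLD123-MACE-CAD-ELE-1'));     // false: increment must be 5 digits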

HLD Architecture Designs for Finding SharePoint File data

SharePoint data source using Microsoft Foundry to build an agent



Sunday, 2 November 2025

Edge Computing Notes (IoT)

WIP

Edge computing is where computing is done close to IoT devices and machines/PLCs; basically, it happens on the "edge" of your network.  The processing occurs on local devices, gateways, or edge servers near IoT sensors, cameras, or machines.

  • Low Latency: Ideal for applications like autonomous vehicles, industrial automation, and AR/VR.
  • Bandwidth Efficiency: Only relevant data is sent to the cloud, reducing costs.
  • Reliability: Systems can continue functioning even with intermittent connectivity.

Over the past few weeks, I ordered a Raspberry Pi, which I intend to use for processing data from remote IoT sensors, namely cameras, LiDAR, and temperature.

    MQTT (Message Queuing Telemetry Transport) is a lightweight messaging protocol that allows remote devices to communicate reliably in real time using a pub-sub model, sending small packets.  MQTT 3.1.1 is standardized as ISO/IEC 20922; MQTT 5.0 is the latest approved edition, ratified in 2019.
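
    A minimal pub-sub sketch using the 'mqtt' npm package (the broker URL and topic are hypothetical placeholders):

    import mqtt from 'mqtt';

    const client = mqtt.connect('mqtt://broker.example.local:1883');

    client.on('connect', () => {
      // Subscribe to a sensor topic, then publish a small reading to it.
      client.subscribe('sensors/temperature', (err) => {
        if (!err) client.publish('sensors/temperature', JSON.stringify({ celsius: 21.5 }));
      });
    });

    client.on('message', (topic, payload) => {
      console.log(`${topic}: ${payload.toString()}`); // delivered in real time to every subscriber
    });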

    Node-RED provides a web browser-based flow editor, which can be used to create JavaScript functions (Wikipedia)

    Azure IoT Edge is a cloud-to-edge computing platform that extends Azure services and custom logic to IoT devices. It allows you to deploy containerised workloads (such as AI models, analytics, and business logic) directly on edge devices, enabling local data processing and reducing reliance on continuous cloud connectivity. This improves latency, bandwidth efficiency, and resilience in offline scenarios.

    IoT Edge Modules are Docker-compatible containers running Azure services, third-party apps, or custom code. Examples: Azure Stream Analytics, Azure Machine Learning models. [learn.microsoft.com]
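
    A rough sketch of a custom module sending telemetry upstream using the Node azure-iot-device SDK (the output name is a placeholder; routing is declared in the deployment manifest):

    import { ModuleClient, Message } from 'azure-iot-device';
    import { Mqtt } from 'azure-iot-device-mqtt';

    // The IoT Edge runtime injects the connection settings as environment variables.
    ModuleClient.fromEnvironment(Mqtt, (err, client) => {
      if (err) throw err;
      client.open((openErr) => {
        if (openErr) throw openErr;
        // Send a reading on a named output; a route in the manifest forwards it to IoT Hub.
        const msg = new Message(JSON.stringify({ celsius: 21.5 }));
        client.sendOutputEvent('temperatureOutput', msg, (sendErr) => {
          if (sendErr) console.error(sendErr);
        });
      });
    });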

    IoT Edge Runtime must be installed on each edge device. Handles module deployment, security, and communication between modules, devices, and the cloud.

    Includes:

    IoT Edge Agent: Deploys and monitors modules.

    IoT Edge Hub: Manages messaging between modules and the cloud. [learn.microsoft.com]


    Azure IoT Hub is a cloud service for managing IoT Edge devices, configurations, and monitoring.


    A Raspberry Pi has an OS, whereas an Arduino device has no OS:

    • Arduino is an Application + Microcontroller (MCU) - Compile code in C++ or C
    • Pi is Application + Library & System Functions (OS) + MCU



    Ada is a strongly typed, multi-paradigm programming language originally developed for the U.S. Department of Defense to ensure reliability in mission-critical systems.

    Why use Ada on MCUs?

    • High reliability (ideal for avionics, medical devices).
    • Built-in concurrency and timing features.
    • Safer than C/C++ in many cases due to strong typing and runtime checks.

    In short:
    • If you need maximum safety and reliability, Ada is superior.
    • If you need performance, simplicity, and broad support, C dominates.

    Setup:
    • Install GNAT (the GNU Ada toolchain).
    • Use GNAT Studio or VS Code with Ada extensions.
    • Run gnat --version in your terminal to verify the install.
    • Write the code:
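
    A minimal hello_mobility.adb to match the compile step below (GNAT expects the file name to match the unit name):

    -- hello_mobility.adb
    with Ada.Text_IO;

    procedure Hello_Mobility is
    begin
       Ada.Text_IO.Put_Line ("Hello from Ada!");
    end Hello_Mobility;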

    Compile & Run> gnatmake hello_mobility.adb  (then run the resulting ./hello_mobility executable)