Showing posts with label KQL. Show all posts

Friday, 26 September 2025

European Fabric Conference in Vienna Sept 2025 takeaways

FabConEurope25 in Vienna last week was terrific.  It was a great opportunity to meet Fabric and data experts, speak to the product teams, and the presentations were fantastic.  The hardest part was deciding which session to attend, as so many ran at the same time.  

My big takeaways:
  • Fabric SQL is excellent.  The HA, managed service, redundancy, and log shipping keep OneLake in near real-time sync.  Fabric SQL supports new native geospatial types.  SQL has temporal tables (old news), but row-, column- and object-level (incl. table) security is part of OneLake.   There are a couple of things security reviewers will query, but they are addressed.
  • Fabric Data Agent is interesting.  Connect to your SQL relational data and work with it.
  • User-defined functions (UDFs), including Translytical (write-back), HTTP in or out, wrapping stored procedures, notebooks, ... - amazing.
  • OneLake security is complex but can be understood, especially through its containers/layers: Tenant, Workspace, Item, and Data.  More is needed, but it's miles ahead of anything else, and Graph is the magic, so it will only continue to improve.  Amazing, but understand the security model.  Embrace Entra and OAuth; use keys only as a last resort.
  • Snowflake is our friend.  Parquet is fantastic, and Snowflake, including Iceberg, plays well with MS Fabric.  There are new versions of Delta Parquet on the way (and this will make Fabric even stronger, supporting both existing and the latest formats).
  • Mirroring and shortcuts - don't ETL unless you need to: shortcut first, then mirror, then ETL.
  • Use workspaces to build out simple medallion architectures.
  • AI Search/Vector Search and SQL are crazy powerful.
  • New Map functionality has arrived and is arriving on Fabric.  Org Apps for Maps is going to be helpful in the map space.  pmtiles are native... (if you know you know)
  • Dataverse is great with Fabric and shortcuts, as I learned from Scott Sewell at an earlier conference.  OneLake coupled with Dataverse is massively underutilised by most orgs.
  • Power BI also features new Mapping and reporting capabilities related to geospatial data.
  • Other storage: Cosmos DB (it has its place, and suddenly, with shortcuts, its biggest issue of cost can be massively reduced with the right design decisions).  Postgres is becoming a first-class citizen, which is excellent on multiple levels.  The CDC stuff is fantastic already.
  • RTI on Fabric is going to revolutionise Open Telemetry and AI, networking through the OSI model, application testing, digital twins, and live monitoring,....  I already knew this, but it keeps getting better.  EventHub and notebooks are my new best friends.  IoT is the future; we all knew this, but now with Fabric, it will be much easier to implement safely and get early value.
  • Direct Lake is a game changer for Power BI - not new, but it just keeps getting better and better thanks to MS Graph.
  • Managed Private Endpoints have improved and should be part of every company's governance.
  • Purview... It's excellent and solves/simplifies DLP, governance and permissions.  I'm out of my depth on Fabric Purview governance, even though I know more than most about DLP and governance.  Hire one of those key folks from Microsoft here.  
  • Warehouse lineage of data is so helpful.  
  • We need to understand Fabric Digital Twins, as it is likely to be a competitor or a solution we offer and integrate. 
  • Parquet is brilliant and fundamentally is why AI is so successful.
  • Powerful stuff in RDF for modelling domains - this is going to be a business in itself.  I'm clueless here, but I won't be in a few weeks.
Now for the gripes...
  • Pricing and capacity are not transparent.  Watch out for the unexpected monster bill!  That said, monitoring and controls are in place, but having my tenant switched off doesn't sit well with me if workloads aren't correctly set out.  Resource governance at the workspace level will help fix the situation, or you can design around it, but that will be more expensive.
  • Workspace resource reservation does not exist yet; however, it can be managed using multiple Fabric tenants.  Distribution for cost control will improve significantly once workspace resource management arrives.
  • Licensing needs proper thought for an enterprise, including ours.  Reserved Fabric is 40% cheaper, but it cannot be suspended, so use reserved capacity just as you would for most Azure services.  Good design results in much lower workload costs.  Once again, those who genuinely understand know my pain with workload costs.
  • Vendors and partners are too far behind (probably due to the pace of innovation).
Microsoft Fabric is brilliant; it is all under one simple managed autoscaling umbrella.  It integrates and plays nicely with other solutions, has excellent access to Microsoft storage, and is compatible with most of the others.  Many companies will move onto Fabric or increase their usage in the short term, as it is clearly the leader in multiple Gartner segments, all under one hood.  AI will continue to help drive its adoption by enterprises.

Friday, 19 May 2023

Logging and Monitoring Advanced Canvas Apps

Overview: Most Canvas apps reach back to 3rd parties or internal storage (e.g. SQL, Dataverse, blobs) to persist data.  In an ideal world, we want to be able to identify errors and performance concerns and trace them quickly.

Scenario: A canvas app performs a search against a custom connector that calls an external API, for example searching for Company Numbers via an Open API.

  • In the annotated diagram, orange records the button press that runs the search.  This only shows if the "Upcoming feature" "Enable Azure Application Insights correlation tracing" is enabled.
  • A custom connector, in blue, makes a call to a REST/Open API.
  • The last call is to an external API, highlighted in purple (in this case it was actually a mocked APIM endpoint, but the icon would change if it were totally external, such as an API search against the IRS, HMRC, or Inland Revenue).

Tip: Always create App Insights instances using a workspace and not classic-mode App Insights, as classic is being deprecated by Microsoft in early 2024.

App Insights Understanding:

App Insights setup using the Workspace-based setup (Log Analytics) can be queried in two ways:

  • Query thru App Insights
  • Query thru the Log Analytics workspace (the Kusto/KQL is slightly different, but it's rather minor)
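The KQL difference is mostly table names: the App Insights schema uses lowercase tables (traces, requests, ...), while the Log Analytics workspace exposes renamed App* tables.  A minimal sketch of the mapping (common tables only; verify against your own workspace schema):

```python
# Map App Insights table names to their Log Analytics
# workspace-based equivalents (the common tables).
APPINSIGHTS_TO_WORKSPACE = {
    "traces": "AppTraces",
    "requests": "AppRequests",
    "dependencies": "AppDependencies",
    "exceptions": "AppExceptions",
    "customEvents": "AppEvents",
    "pageViews": "AppPageViews",
}

def to_workspace_query(kql: str) -> str:
    """Naively rewrite the leading table name of a KQL query
    from the App Insights schema to the workspace schema."""
    table, _, rest = kql.partition("|")
    workspace_table = APPINSIGHTS_TO_WORKSPACE.get(table.strip(), table.strip())
    return workspace_table + (" | " + rest.strip() if rest else "")

# to_workspace_query("traces | where severityLevel > 2")
#   -> "AppTraces | where severityLevel > 2"
```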

Tip: If you upgrade from classic to workspace-based App Insights, the log history is still queryable, as App Insights combines the logs from App Insights Classic (stored in App Insights directly) with the logs stored in Log Analytics.

Tip: Power Automate has a connector for Log Analytics, so it's good to use this for flows so you can trace canvas apps through their flow journeys.  Most people tend to build a custom connector to a function that uses the App Insights SDK.  I've used both; they are both valid approaches and are shown in the annotated diagram below.

The two options for logging Flows into App Insights.

Note: If you create a new workspace-based App Insights instance, remember to update the loggers in all your Azure services to point at the new instance (connection string/instrumentation key).  For example, Functions, APIM and Service Bus are common components.

Note: You can log to multiple workspaces/App Insights instances in a tenant and the correlations will be retrieved so you see the full history, assuming you have permissions to all log sources.

Learning: the App Insights connection string has 3 parts: 
1) instrumentation key (basically a unique identifier that finds, and allows logs to be saved into, App Insights), 
2) ingestion endpoint (URL for the logs), and 
3) live metrics endpoint (URL for metrics/performance counters/live metrics/failed requests). 

Here is an example and you can see the 3 parts:
InstrumentationKey=2675bxxx-xxxb-xxxx-bf5558009ccf;IngestionEndpoint=https://uksouth-1.in.applicationinsights.azure.com/;LiveEndpoint=https://uksouth.livediagnostics.monitor.azure.com/
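Since the connection string is just semicolon-separated key=value pairs, the 3 parts can be pulled apart programmatically.  A quick sketch (the key below is a placeholder, not a real instrumentation key):

```python
def parse_connection_string(conn: str) -> dict:
    """Split an App Insights connection string into its parts
    (InstrumentationKey, IngestionEndpoint, LiveEndpoint)."""
    parts = {}
    for segment in conn.split(";"):
        if "=" in segment:
            # partition on the first '=' so URLs in values stay intact
            key, _, value = segment.partition("=")
            parts[key.strip()] = value.strip()
    return parts

conn = ("InstrumentationKey=00000000-0000-0000-0000-000000000000;"
        "IngestionEndpoint=https://uksouth-1.in.applicationinsights.azure.com/;"
        "LiveEndpoint=https://uksouth.livediagnostics.monitor.azure.com/")
parts = parse_connection_string(conn)
# parts["IngestionEndpoint"] -> "https://uksouth-1.in.applicationinsights.azure.com/"
```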

Sunday, 6 August 2017

Common KQL Search Helper


Overview: I am forever forgetting the intricacies of using search, so this post is a short note on common searches I use.  Ensure the search is working against "Everything":

Find items under a specific URL (Path)
path:https://www.radimaging.com/sites/*

To only see team sites (webtemplate)
webtemplate:STS

To see Content Types e.g. task list items
spcontenttype:Task

Example used in a search box for using a wildcard on the title
ClientSector:Finance AND Title:*Paul*

You can use the query approach for any Managed property to refine your search.  Example:
http://radimaging.net/sites/healthcare/_api/search/query?querytext='ClientSector:Finance*'&startrow=2&rowlimit=500

ClientSector is a Managed Property (property bag) at the root site collection level of each site collection.
startrow tells the query to skip the first 2 results.
rowlimit tells the search to return up to 500 results (max is 500; default is 50 if not specified).
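The same query URL can be assembled programmatically, which avoids hand-encoding the quotes and wildcards.  A sketch using only the Python standard library (radimaging.net is the example host from above):

```python
from urllib.parse import urlencode

def build_search_url(host: str, querytext: str,
                     startrow: int = 0, rowlimit: int = 50) -> str:
    """Compose a SharePoint search REST URL; querytext is wrapped
    in single quotes as the _api/search/query endpoint expects."""
    params = urlencode({
        "querytext": f"'{querytext}'",
        "startrow": startrow,
        "rowlimit": rowlimit,
    })
    return f"{host}/_api/search/query?{params}"

url = build_search_url("http://radimaging.net/sites/healthcare",
                       "ClientSector:Finance*", startrow=2, rowlimit=500)
# a URL-encoded version of the example query above
```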

Similar querying can be done using the search results page in the browser:
https://radimaging.net/Pages/results.aspx#k=ClientSector%3AFinance*#s=105
Client Sector MP search equivalent in the browser
s = startrow

It is a good idea to have a query tool (SharePoint Search Query Tool v2.7) to help build queries and validate query logic.  I have built consoles to do this in CSOM in the past, or used PowerShell with CSOM, which is pretty good.
https://github.com/SharePoint/PnP-Tools/tree/master/Solutions/SharePoint.Search.QueryTool (PnP Tools GitHub) - as of 10 Dec 2018, version 2.8.2 is the latest version.

Examples:
querytext='SharePoi*'
querytext='ManagedPropertyCreated:Test*'&startrow=10&rowlimit=500&sortlist='created:descending'&clienttype='ContentSearchRegular'


Read More:
https://gallery.technet.microsoft.com/office/Query-SharePoint-2013-373ff97a
http://nikcharlebois.com/get-search-results-in-sharepoint-2013-using-powershell/