
Wednesday, 30 April 2025

MS Fabric OneLake Shortcuts

 "Shortcuts in Microsoft OneLake allow you to unify your data across domains, clouds, and accounts by creating a single virtual data lake for your entire enterprise.MS Learn

Shortcuts allow data in open storage formats to stay in the source system: metadata is added to OneLake, and the data can be queried from Fabric, with the load predominantly carried by the source system, e.g., Dataverse/Dynamics.

Clarification: A shortcut is automatically added to MS Fabric for each Dataverse environment. Dataverse creates Parquet files (estimated 5-10% extra data storage, which counts against Dataverse storage). Via the shortcut, report writers and data engineers can access the Dataverse data as though it were inside MS Fabric's OneLake.

Understand: Dataverse creates Parquet files that MS Fabric reads to surface the data for datasets and queries.

"Shortcuts are objects in OneLake that point to other storage locations.MS Learn

External shortcuts (the data is held at the source system) support any open storage format, including: 

  • Apache Iceberg tables via Snowflake,
  • Parquet files on Snowflake,
  • Microsoft Dataverse,
  • Azure Data Lake Storage (ADLS), 
  • Google Cloud Storage, 
  • Databricks, 
  • Amazon S3 (including Iceberg tables),
  • Apache Spark (Iceberg)
Supported internal shortcuts:
  • SQL Databases: Connect to SQL databases within the Fabric environment.
  • Lakehouses: Reference data within different lakehouses.
  • Warehouses: Reference data stored in data warehouses.
  • Kusto Query Language (KQL) Databases: Connect to data stored in KQL databases.
  • Mirrored Azure Databricks Catalogs: Access data from mirrored Databricks catalogs.
I think these are also internal shortcuts:
  • PostgreSQL
  • MySQL
  • MongoDB
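
Shortcuts can also be created programmatically. Below is a minimal PowerShell sketch against the Fabric REST API's Create Shortcut endpoint, creating an ADLS Gen2 shortcut in a lakehouse; the GUIDs, token, storage account, and paths are placeholders, not values from this post.

# Minimal sketch: create an ADLS Gen2 shortcut via the Fabric REST API.
# All IDs, the token, and the storage paths are placeholders.
$workspaceId = "<workspace-guid>"
$lakehouseId = "<lakehouse-guid>"
$token       = "<bearer-token>"   # e.g. from: az account get-access-token --resource https://api.fabric.microsoft.com

$body = @{
    path   = "Files"               # folder in the lakehouse where the shortcut is created
    name   = "SalesDataShortcut"   # hypothetical shortcut name
    target = @{
        adlsGen2 = @{
            location     = "https://mystorageaccount.dfs.core.windows.net"
            subpath      = "/sales-container/sales"
            connectionId = "<connection-guid>"   # existing Fabric connection to the storage account
        }
    }
} | ConvertTo-Json -Depth 5

Invoke-RestMethod -Method Post `
    -Uri "https://api.fabric.microsoft.com/v1/workspaces/$workspaceId/items/$lakehouseId/shortcuts" `
    -Headers @{ Authorization = "Bearer $token" } `
    -ContentType "application/json" `
    -Body $body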

Sunday, 15 January 2023

Postman to verify OpenAPIs are running

Problem: Our teams rely on a 3rd-party API for a new project being delivered; the APIs are in a state of change and are constantly up and down, making life tough for the teams relying on them.

Hypothesis: I need a quick way to check whether the APIs are all working in dev and test. I have two Postman collections for the REST APIs. If I combine them and check the key APIs using Postman, I can save myself and others time, as I'll know the current state of the APIs.

Solution: Create a Postman collection that does the API verification; you can make it more complex with data and variables.

Problem: I can open Postman and run the tests, which takes a few minutes. We need to do this more quickly.

Hypothesis: I'd like to be able to run the tests quickly on demand. Use the Postman CLI and PowerShell to run the collection and display the result.

Solution

1) Add the Postman CLI to my machine:

PS> powershell.exe -NoProfile -InputFormat None -ExecutionPolicy AllSigned -Command "[System.Net.ServicePointManager]::SecurityProtocol = 3072; iex ((New-Object System.Net.WebClient).DownloadString('https://dl-cli.pstmn.io/install/win64.ps1'))"

2) In Postman, generate an API key for the collection: Run Collection > Automate runs via CLI > Generate the API Key, then copy the generated code.


3) Run the generated code in PowerShell to verify it works correctly.
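
For reference, the generated code is typically two Postman CLI commands along these lines (the API key and collection ID below are placeholders, not real values):

postman login --with-api-key PMAK-xxxxxxxxxxxxxxxxxxxxxxxx
postman collection run 12345678-aaaa-bbbb-cccc-000000000000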

4) Copy the PowerShell code into a newly created .ps1 file on your local machine. I added a read line so I can see the result before the window closes.
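
A minimal sketch of what API.ps1 ends up containing (placeholder key and collection ID again; the Read-Host at the end is the "read line" that keeps the console open):

# API.ps1 - log in, run the collection, and hold the window open to read the result
postman login --with-api-key "PMAK-xxxxxxxxxxxxxxxxxxxxxxxx"
postman collection run "12345678-aaaa-bbbb-cccc-000000000000"
Read-Host -Prompt "Press Enter to close"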


5) Run the API.ps1 file and verify the result.

6) Set up a desktop shortcut to run it and see the result. Right-click the API.ps1 file and create a shortcut on your desktop, then right-click the shortcut and amend the Target value:

C:\Windows\System32\WindowsPowerShell\v1.0\powershell.exe -ExecutionPolicy Bypass -File C:\Users\PaulBeck\Downloads\Projects\PoC\Postman\API.ps1

7) Save and run the shortcut to verify.

Problem: Monitor and alert on whether the DTAP (Development, Test, Acceptance, Production) APIs are working and performing well.

Resolution: I want to monitor that the endpoints specified in my Postman collection are working in Dev, UAT, et al.; this can cover more than one endpoint using Postman Monitors.

Next steps: Add this to automated DevOps processes using Newman, as sketched below.
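
As a rough sketch of that next step, a pipeline task could run the exported collection with Newman (the file names are placeholders, and Node.js is assumed to be available on the build agent):

# Install Newman on the agent
npm install -g newman

# Run the exported collection against the Dev environment and emit JUnit results
newman run .\API.postman_collection.json `
    --environment .\Dev.postman_environment.json `
    --reporters cli,junit `
    --reporter-junit-export .\newman-results.xml

# Fail the pipeline step if any request or assertion failed
if ($LASTEXITCODE -ne 0) { throw "Newman run failed" }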