
Monday 12 June 2023

App Insights for Power Platform - Part 5 - Logging for APIM

Series

App Insights for Power Platform - Part 1 - Series Overview 

App Insights for Power Platform - Part 2 - App Insights and Azure Log Analytics 

App Insights for Power Platform - Part 3 - Canvas App Logging (Instrumentation key)

App Insights for Power Platform - Part 4 - Model App Logging

App Insights for Power Platform - Part 5 - Logging for APIM (this post)

App Insights for Power Platform - Part 6 - Power Automate Logging

App Insights for Power Platform - Part 7 - Monitoring Azure Dashboards 

App Insights for Power Platform - Part 8 - Verify logging is going to the correct Log analytics

App Insights for Power Platform - Part 9 - Power Automate Licencing

App Insights for Power Platform - Part 10 - Custom Connector enable logging

App Insights for Power Platform - Part 11 - Custom Connector Behaviour from Canvas Apps Concern

Overview: APIM is often part of your Power Platform solutions, for example monitoring and controlling all inbound and outbound traffic, or wrapping Azure Functions.

Within APIM you can add multiple App Insights instances.  You can send all logging to a single instance and override specific APIs to log to different instances, which keeps the logging nice and granular.

Setup Logging

The diagram below shows where I used the operation ParentId to find a log entry. Using Transaction search in App Insights, I can see the APIM entry, the call to the 3rd-party backend, and its HTTP response. A query sketch follows the list below.
You can hook this up so you can see the Canvas App session, then the Function call, which calls APIM, and then the backend call to the gov 3rd-party API.

  1. Logging can be global or set at the API level in APIM.
  2. Telemetry "Sampling" will log a percentage of requests.
  3. "Always log errors" captures any errors APIM gets.
  4. Headers and body are not included in logs unless you specify them.
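A rough sketch of pulling that correlated view out of a workspace-based App Insights instance from Python (assuming the azure-monitor-query and azure-identity packages; the workspace ID and operation ID are placeholders):

from datetime import timedelta
from azure.identity import DefaultAzureCredential
from azure.monitor.query import LogsQueryClient

# Placeholders - use your Log Analytics workspace ID and an OperationId
# copied from a failing APIM request.
WORKSPACE_ID = "00000000-0000-0000-0000-000000000000"
OPERATION_ID = "<operation-id>"

# Pull the APIM request and its backend dependency call for one operation.
query = f"""
union AppRequests, AppDependencies
| where OperationId == "{OPERATION_ID}"
| project TimeGenerated, Type, Name, ResultCode, DurationMs, ParentId
| order by TimeGenerated asc
"""

client = LogsQueryClient(DefaultAzureCredential())
result = client.query_workspace(WORKSPACE_ID, query, timespan=timedelta(days=1))

for table in result.tables:
    for row in table.rows:
        print(row)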

Series

App Insights for Power Platform - Part 1 - Series Overview 

App Insights for Power Platform - Part 2 - App Insights and Azure Log Analytics 

App Insights for Power Platform - Part 3 - Canvas App Logging (Instrumentation key)

App Insights for Power Platform - Part 4 - Model App Logging

App Insights for Power Platform - Part 5 - Logging for APIM (this post)

App Insights for Power Platform - Part 6 - Power Automate Logging

App Insights for Power Platform - Part 7 - Monitoring Azure Dashboards 

App Insights for Power Platform - Part 8 - Verify logging is going to the correct Log analytics

App Insights for Power Platform - Part 9 - Power Automate Licencing

App Insights for Power Platform - Part 10 - Custom Connector enable logging

App Insights for Power Platform - Part 11 - Custom Connector Behaviour from Canvas Apps Concern

Friday 9 June 2023

App Insights for Power Platform - Part 1 - Series Overview

Overview: Microsoft have great capabilities for logging and monitoring.  In this series of posts I will be examining the various parts of logging that may be useful in building solutions that are well monitored, provide alerting and easy tracing, and identify issues or potential issues as soon as possible.

I am looking at App Insights for Power Platform monitoring.  So this includes: 

  • Power Apps (Canvas, and model apps),
  • Power Automate,
  • APIM, 
  • Azure Functions, 
  • Azure Service Bus, and
  • App Insights.

I shall be setting up a demo environment and these are the logical components being covered.


All the components making up the solution shall log into Log Analytics (left-hand side of the diagram).

For Continuous Integration, my clients will be Postman Monitor (it's awesome and makes it so easy to reuse all those Postman collections) and Azure DevOps, which is great for running smoke tests after new releases.  I also use flows to report on flows (sounds nuts, but I love it).  These are at the bottom of the diagram.

Lastly, on the right of the diagram, I look at extracting logs for reporting (Power BI) and monitoring using Azure DevOps (p.s. think about Grafana instead of DevOps Dashboards, it's so nice).

A couple of extras are: availability logging, alerting, automating Canvas app testing, and Playwright.

From the diagram, you can see the data is now held in Log Analytics and it can be queried via Log Analytics or App Insights using Kusto.  Note: the syntax is slightly different (a short comparison is below).
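As a rough illustration of that syntax difference, the same failed-request query phrased both ways (the role name "my-apim" is a placeholder):

# Classic App Insights query (run from the App Insights blade).
classic_query = """
requests
| where cloud_RoleName == "my-apim" and success == false
| project timestamp, name, resultCode, operation_Id
"""

# Workspace-based query (run from the Log Analytics workspace).
workspace_query = """
AppRequests
| where AppRoleName == "my-apim" and Success == false
| project TimeGenerated, Name, ResultCode, OperationId
"""

Same data, but the table and column names differ between the two query surfaces.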

Series

App Insights for Power Platform - Part 1 - Series Overview (this post)

App Insights for Power Platform - Part 2 - App Insights and Azure Log Analytics 

App Insights for Power Platform - Part 3 - Canvas App Logging (Instrumentation key)

App Insights for Power Platform - Part 4 - Model App Logging

App Insights for Power Platform - Part 5 - Logging for APIM

App Insights for Power Platform - Part 6 - Power Automate Logging

App Insights for Power Platform - Part 7 - Monitoring Azure Dashboards 

App Insights for Power Platform - Part 8 - Verify logging is going to the correct Log analytics

App Insights for Power Platform - Part 9 - Power Automate Licencing

App Insights for Power Platform - Part 10 - Custom Connector enable logging

App Insights for Power Platform - Part 11 - Custom Connector Behaviour from Canvas Apps Concern

Tip: The Power Platform Admin Centre has a good overview of the Power Platform, but to make logging and monitoring better, push the data into Azure Log Analytics and monitor and alert centrally.

Also see View and download Dataverse analytics - Power Platform | Microsoft Learn

Saturday 14 January 2023

APIM Logging

Overview: Azure's API Management is a big service; it is worth understanding the logging capability so you can effectively analyse traffic.

Thoughts:

  • Multiple App Insights instances can be set up, with default logs going to a specific App Insights instance.
  • Each API can be overridden to log to any of the App Insights instances added to APIM.
  • The old "Classic" App Insights stored data internally, whereas the new "workspace-based" App Insights (I think of it as "V2 App Insights connected to Log Analytics") stores the new data in the workspace.
  • If you upgrade App Insights, the results blend from two storage locations: the old data stored internally with App Insights and the new data stored within Log Analytics.  If you query Log Analytics, you only see the new Log Analytics data.
  • Security for App Insights should be done at the Resource Group (RG) level; there are App Insights roles for use at RG level.  If the workspace is in a different resource group to the connected App Insights instance, ensure you sort out the permissions in both RGs.
  • The OpenTelemetry project is making strides forward, and for APIs it will be great.

Problem: I recently migrated a customer's Dev, Test, Acceptance, Pre-prod and Production (not yet done) environments to use the App Insights instance running on Log Analytics (sometimes referred to as V2).  Logging wasn't working correctly.


Initial Hypothesis: I have complicated resource groups crossing DTAP boundaries.  By default, APIM has a catch-all logging setup per APIM instance, and then specific APIs' settings are changed to log to specific App Insights instances.

Steps:

The first step was to rename the old classic-type App Insights instance, e.g. "appinsights-dev" becomes "appinsights-dev-delete".

Create a new App Insights instance using the V2 Log Analytics option and give it the original name ("appinsights-dev").  The client opted for the name to be the same; it would have been simpler to give it a name like "appinsights-dev02".  The client also wanted to use a shared Log Analytics instance per environment, e.g. "loganalytics-dev-shared".






Monday 15 March 2021

APIM Authentication and authorisation

Overview:  APIM provides two methods for securing access to API operations:

  1. Subscription keys: passed in a header or the query string (a primary and a secondary key are generated for each subscription) - a request sketch follows below, and
  2. Client Certificates.
WIP
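A minimal sketch of calling an APIM operation with a subscription key from Python, either in the header or the query string (the URL and key are placeholders; the header and query parameter names are APIM's defaults):

import requests

API_URL = "https://my-apim.azure-api.net/customers"   # hypothetical APIM operation
SUBSCRIPTION_KEY = "<primary-or-secondary-key>"

# Option 1: subscription key in the default header.
r1 = requests.get(API_URL, headers={"Ocp-Apim-Subscription-Key": SUBSCRIPTION_KEY})

# Option 2: subscription key in the default query string parameter.
r2 = requests.get(API_URL, params={"subscription-key": SUBSCRIPTION_KEY})

print(r1.status_code, r2.status_code)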


Saturday 13 March 2021

APIM OpenAPI Specification Documentation Example within the Developer Portal

Overview:  I find documenting APIM contracts incredibly important, and yet it's often done very poorly.  This post provides a simple YAML and JSON example that can be imported into APIM, or any other gateway product for that matter.

The YAML file below can be imported into APIM and published to the developer portal.  It gives a clear example of the options and how an API should be documented.  Developers can see an example of the JSON to use when performing the PUT, and they can see more information about each property; for instance, a passport number must be a certain length, so rather than an optional free-text string with no description, the developer knows the format the property has to come in.

Simple Open API specification showing a single documented operation for a complex PUT object (YAML).

PB APIM Series:
Documenting your API in the Developer portal (this post)


Friday 12 February 2021

APIM debugging, tracing, monitoring tricks and tips

Debugging APIM requests from Visual Studio Code

Visual Studio Code has an extension for debugging APIM:

Azure API Management - Visual Studio Marketplace


It's also useful to have a REST client extension installed for making requests to APIM.

Tracing APIM

To get a full trace, add the HTTP header "Ocp-Apim-Trace: true" to the request; the response will contain a URL to retrieve the trace information.  A request sketch is below.
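A rough sketch of requesting a trace from Python (assumes a subscription key with "Allow tracing" enabled; the URL is a placeholder):

import requests

API_URL = "https://my-apim.azure-api.net/customers"   # hypothetical APIM operation

response = requests.get(
    API_URL,
    headers={
        "Ocp-Apim-Subscription-Key": "<key-with-tracing-allowed>",
        "Ocp-Apim-Trace": "true",
    },
)

print(response.status_code)
# The trace itself is a JSON document behind this SAS URL.
print(response.headers.get("Ocp-Apim-Trace-Location"))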




App Insights for APIM
APIM logs to App Insights in three places:
  1. Incoming requests (requests coming into APIM)
  2. Dependency requests (outgoing calls to the backend)
  3. Exceptions - if an exception occurs, it is also logged in App Insights
Logging can be set at either the global or the API level.
Set it up in APIM > Monitoring > Application Insights (link APIM to the App Insights logger).

Documentation Tips

Ensure you fill in relevant descriptions and summaries.  It's also key to provide examples.  

https://swagger.io/docs/specification/adding-examples/

My Technical Working Notes for Microsoft Technology: APIM OpenAPI Specification Documentation Example within the Developer Portal (pbeck.co.uk)

APIM documentation updates appear on the Developer portal after re-publishing.  It has a great UI, but ensure summaries are added for parameters/attributes to get a truly rich set of integration documentation (it will save so many questions and so much time).   I also like to add a getting-started guide: keep it short and simple, and most importantly include a simple explanation of security/authentication and connectivity.

- in: query
  name: age
  schema:
    type: integer
    maximum: 3
  examples:        # Multiple examples
    max:           # Distinct name
      value: 3     # Example value
      summary: The age is dependent on dob, min is 0, can't be negative

Monday 19 October 2020

APIM High Availability and Performance across Regions

Overview:  APIM can be set up in multiple regions, and incoming requests will be routed to the closest APIM endpoint.  If there is only one APIM region, it is best to ensure the API/App Service/Function is hosted in the same region.  With multiple APIM regions you can also host the API in each region.  The routing is done either automatically using Azure Front Door or via policy on the APIM.

Front Door can be substituted with Azure WAF, Cloudflare, or Barracuda's SaaS solution.

More Info

Sunday 13 September 2020

Options for Layering APIs on Data Sources - Microservices, kind of

Hasura takes data sources such as PostgreSQL & MySQL and converts them into GraphQL APIs; SQL Server support is in preview.  The service is available on Azure and hooks into AAD and AAD B2C.  Hasura looks extremely interesting and useful, and potentially a great time saver.

CDS/Dataflex/Oakdale - Allows for entity creation and provides REST APIs.

SharePoint lists provide HTTP API's for CRUD operations.

REST API's vs GraphQL

OpenAPI specification (previously known as the Swagger specification) is my default for an API; this allows for a known RESTful API that anyone with access can use.   OpenAPI has set contracts that return defined objects, which is great: you can work with the API like a database, with simple CRUD operations as defined by the specification.  The issue is that the returned objects are fixed in structure, so you may need two or more queries to get the data you are looking for.  Alternatively, GraphQL allows the developer to ask for the data exactly as they want it.
OpenAPI example (two separate requests):
/api/users/2            // Returns the user object for user 2
/api/users/2/orders/10  // Returns the last 10 orders for the user
GraphQL example (a single HTTP POST request):
query {
  User(id: "2") {
    name
    email
    orders(last: 10) {
      orderid
      totalamount
      datemodified
    }
  }
}
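Because GraphQL is just an HTTP POST with the query in a JSON body, the call is easy to script; a minimal sketch against a hypothetical Hasura endpoint (the URL and admin-secret header are placeholders):

import requests

GRAPHQL_URL = "https://my-hasura.azurewebsites.net/v1/graphql"   # hypothetical endpoint

query = """
query {
  User(id: "2") {
    name
    email
    orders(last: 10) {
      orderid
      totalamount
      datemodified
    }
  }
}
"""

response = requests.post(
    GRAPHQL_URL,
    json={"query": query},
    headers={"x-hasura-admin-secret": "<secret>"},   # or a bearer token from AAD B2C
)
print(response.json())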

Database Option Notes:
PostgreSQL - Hybrid relational database (Community edition is free; Standard and Enterprise for higher loads) that sits on Linux but can also use Windows OS.  Replicas are used for additional read-only nodes (I think up to 5 geo replicas are allowed).
Can be hosted on Azure or AWS IaaS.  PostgreSQL is also offered as a fully managed service (PaaS) on Azure (either single server or hyperscale).
MariaDB, a community fork of MySQL - Open-source relational database used in the LAMP app stack.  The Azure MySQL managed service can have up to 5 replicas for read-heavy workloads.

Wednesday 24 June 2020

Postman API Builder Intro

Overview: Tools for building and mocking APIs.  Swagger has good tooling and was my original preferred choice.  APIM has great tooling, is part of Azure, and makes it easy to replace mocks with the live implementation as you go along.  Postman is offering a great set of functionality to rival Swagger and APIM.  This post looks at Postman's new functionality around building APIs.

Postman API Builder:
Not only a test rig, it now offers the ability to build and mock APIs:
  • Mock - so you can test; supports key and OAuth authentication
  • Assert Tests - You can specify asserts in postman
  • Test suite - generate collections/Collection Runner - Allows a set of related tests to run sequentially.
  • Document the API
  • Monitor
  • Version control for changes e.g. GITHub
  • API Versions supported
  • Note: The free plan is limited on the number of APIs, but all the features are included.  The main notation formats are supported, including the OpenAPI Specification (OAS) & GraphQL
Summary:
I like the Swagger tooling, and having done a few projects with APIM I find it fantastic for building APIs quickly.  Postman historically was merely my test rig, but looking at the functionality, Postman API Builder is a great option for designing and building APIs.  Postman is also a good tool to build into CI/CD pipelines to validate APIs.

Few more assert examples:

Postman offers a service to monitor APIs using your Postman collections; these can be triggered using cURL, so they can be built into DevOps, Power Automate scheduled flows, etc. (a trigger sketch is below).
sentry.io looks good as an alternative option
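A sketch of kicking off a Postman monitor run from a pipeline step via Postman's public REST API (the monitor ID and API key are placeholders):

import requests

MONITOR_ID = "<monitor-id>"        # hypothetical monitor
POSTMAN_API_KEY = "<api-key>"

response = requests.post(
    f"https://api.getpostman.com/monitors/{MONITOR_ID}/run",
    headers={"X-Api-Key": POSTMAN_API_KEY},
)
response.raise_for_status()
print(response.json())   # run results, including pass/fail stats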

Wednesday 12 February 2020

Power Apps - DTAP Azure diagram

Overview:  A sample architecture for DTAP in a highly controlled environment.  There are a lot of variations and the ability to use Power Apps publishing.


There are a few ways to manage environment deployments.
  1. Simple: Create in unmanaged solutions, then export and deploy managed solutions manually using the Power Platform UX.
  2. Enterprise: Build Azure DevOps pipelines to add unpacked code to GitHub and deploy solutions using Azure pipelines.
  3. 3rd-party ALM tooling; some is specific to Dynamics.
  4. Nov 2022 - Microsoft announced Power Platform Pipelines.

Custom Connectors:  As you step through the DTAP environments using ALM to deploy, you need to point the custom connector to the appropriate APIM (API):
Environment Variables are extremely useful for ALM


Tuesday 14 January 2020

API Management Advanced Mocking

My earlier post shows an Azure APIM engineer the steps to set up API Management mocking.

This post contains two screenshots showing how to add multiple mock responses to a request in APIM:
Amend the Get Operation Inbound Response Policy

Two APIM REST Requests showing dynamic mocking

A simple OpenAPI 3 YAML example with mock-up examples for APIM.
To review:
  1. Open an existing APIM instance.
  2. Import the YAML file example (download and rename); this is an OpenAPI 3.0.1 specification.
  3. Add the "Mock responses" policy to all incoming requests; alternatively, you can do it at the operation level.
  4. Test & verify.

Sunday 12 January 2020

API Management Mocking

Problem:  Create an Open API Specification (OAS) endpoint for testing using APIM

Background:
  • Azure has a great service to bring multiple API's under a single publishable layer (think MuleSoft).  I like to use APIM to setup an initial contract that developers can use before setting up the actual API.  This allows both the consumer teams and the back-end development team to work independently to this OpenAPI agreed contract. 
  • Swagger originally owned the specification and ended up creating the OpenAPI Specification (OAS) that many companies now use.
  • Swagger has great tooling for creating OAS API's, documentation, stub hosting and generating code such as .NET core that you can import into your dev environment. 
  • Azure has a developer APIM instance licence that is relatively inexpensive (creating an APIM instance takes up to 20 minutes) but leaving it running for my personal dev is pretty expensive.
Overview:  This post outlines the steps to setup an APIM instance using an OpenAPI file created in swagger.  The APIM service shall be setup to return mock data.

Steps:
1. Create a new instance of APIM - Do this first as it takes up to 20 minutes before the service is ready.
1. Create an APIM instance
2. Open swagger.io in a browser and sign up, then choose "Create New"; this gives you your starting OAS file for your custom API.

2. Login to Swagger and create the Open API file using YAML
3. Using the Swagger Editor, create the desired endpoints.  It took me a few hours to get used to YAML as it is whitespace-sensitive, but it is very readable.
3. Swagger editor
4. On the top right of the Swagger editor is the "Export" option.  Click Export > Download API > JSON Unresolved.  And keep the .json file ready to import into your APIM service.
5. Open APIM, and add a new API.  APIs > Add API > Open API as show below.
Import OAS file into APIM
6. Import the file and the fields get populated per your instructions.
Upload the OpenAPI json file into APIM
7. The list of operations shows up - in my case I only have a single GET operation called \Get Customers
APIM Service is now created but not connected to a back-end

8. Add mocking to the endpoint.  Highlight the operation, i.e. \Get List all Customers > Inbound Processing (Add Policy) button.  Select "Mock responses" > Save.


9. Generate the JSON responses:
a) Select the Operation i.e. \GET List all customers
b) Frontend drop down > Form-based editor
c) Click the "Responses" tab
d) Click the "200 OK" link
e) Click in the Sample Box, and "Auto Generate", Save


The APIM service is now set up and we are ready to test.

1. Test using APIM's Testing tool


Check the 200 Response
Mocked response from APIM test tool

2. Testing using Postman
Open Postman, and craft the request as shown below:
Postman APIM testing
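If you would rather script the same check, a minimal Python sketch (the hostname, path and key are placeholders):

import requests

response = requests.get(
    "https://my-apim.azure-api.net/customers",              # hypothetical mocked operation
    headers={"Ocp-Apim-Subscription-Key": "<your-key>"},
)

assert response.status_code == 200
print(response.json())   # the auto-generated sample payload from the mock policy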

Note: There are several competitor products; MuleSoft, Amazon API Gateway, Postman and Swagger also offer a lot of these features.  There are other products that I have not used, such as Kong API Gateway and GCP's Apigee; Gartner has a list of competitors and a magic quadrant done each year.

Summary:  It is pretty straightforward to set up APIM mocking as shown above, and then easy to test it using Azure APIM tests and Postman.   This post shows how to add various mocked APIM responses.

APIM Notes

Five pricing tiers of APIM:

  1. Developer Version (No SLA) - No prod data but has all the premium features.
  2. Consumption Model (99.95% SLA) - priced per request serviced.  No developer portal, no static IP address; scales automatically.
  3. Basic (99.95% SLA) - No AAD integration; scales to 2 units.
  4. Standard  (99.95% SLA)  - Scales to 4 units
  5. Premium  (99.99% SLA)  - Allows Custom Domains - unlimited scale.  Also allows a single APIM instance to be scaled to more than 1 Azure Region.  Also direct VNET access is only available in premium.
One can move (scale up or down) between Azure APIM pricing tiers, but it takes up to 45 minutes.
All tiers other than Consumption require the owner to set up or perform scaling.
APIM has metrics to determine whether the number of APIM units should be increased or decreased.  The best metric is the Capacity metric (an aggregate of gateway CPU and memory utilisation).  Note: ignore short spikes; look for an average over 15 minutes.  Microsoft suggests that the APIM Capacity metric running over 60% to 70% for a period of 30 minutes indicates that scaling is appropriate.
When building your scaling strategy, understand that adding APIM units takes time (roughly 30 minutes), so scaling at 60% may not work for flash traffic; in that case you'll need to account for this outside of only using metrics for scaling (a metrics query sketch is below).
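A sketch of pulling the Capacity metric averaged over 15-minute windows to inform a scaling decision (assumes the azure-monitor-query and azure-identity packages; the resource ID is a placeholder):

from datetime import timedelta
from azure.identity import DefaultAzureCredential
from azure.monitor.query import MetricsQueryClient, MetricAggregationType

# Hypothetical APIM resource ID.
APIM_RESOURCE_ID = (
    "/subscriptions/<sub-id>/resourceGroups/<rg>/providers/"
    "Microsoft.ApiManagement/service/<apim-name>"
)

client = MetricsQueryClient(DefaultAzureCredential())
result = client.query_resource(
    APIM_RESOURCE_ID,
    metric_names=["Capacity"],
    timespan=timedelta(hours=6),
    granularity=timedelta(minutes=15),
    aggregations=[MetricAggregationType.AVERAGE],
)

for metric in result.metrics:
    for series in metric.timeseries:
        for point in series.data:
            print(point.timestamp, point.average)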
Tip: Regionally deployed APIM instances point to a single back-end URL, so it is best to keep traffic routed to APIM in the same region as the back-end for simple scenarios; use the other region for failures, and obviously you can multi-route back-end traffic using Front Door for larger deployments.

Tip: Use Azure Key Vault for secrets.

Developer Portal - part of APIM, this portal/website is highly customisable and allows users to discover/consume our APIs.  Users can check their access (they only see APIs they have access to) and try/test the API.

Tips:

  • Users > Groups > Products > API Endpoints
  • Subscriptions are assigned to 1) products and/or 2) API endpoints.
  • Transformation policies are great for turning SOAP into JSON.  There are a lot of OOTB policies, and it is easy to create policies using C#.
  • The APIM Extension for VS Code is nice for working with APIM.
More Info:
https://www.youtube.com/watch?v=0yf_xm5cPIo
https://www.youtube.com/watch?v=gA2yxwKo0M0

PB APIM Series:

Sunday 29 September 2019

OAuth for Custom Connectors

Problem:  I have an APIM endpoint that has both a subscription key and OAuth2 set up for security, and I need to connect Power Apps to it.

There are basically two parts to this problem: setting up your Power App to use OAuth, and passing in the subscription key on every APIM endpoint request.
  • This post assumes that both the APIM app registration and the client app have been registered in Azure AD.
  • Check that Postman can access the endpoint as shown below.  Postman will authenticate using the bearer token (OAuth) and the subscription key shown below to test:
Client ID: 5559555-555a-5558-5554-555d4
Auth URL: https://login.microsoftonline.com/555fd555-555f-5554-5551-555b444c3555/oauth2/v2.0/authorize
Token URL: https://login.microsoftonline.com/555fd555-555f-5554-5551-555b444c3555/oauth2/v2.0/token
Refresh URL: https://login.microsoftonline.com/555fd555-555f-5554-5551-555b444c3555/oauth2/v2.0/authorize
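Roughly the same test scripted in Python looks like the sketch below (all IDs, secrets, and the scope are placeholders, and it uses the client-credentials grant for simplicity rather than the interactive flow Postman uses):

import requests

TENANT_ID = "<tenant-id>"
CLIENT_ID = "<client-id>"
CLIENT_SECRET = "<client-secret>"
SCOPE = "api://<apim-app-registration-id>/.default"      # hypothetical scope
API_URL = "https://my-apim.azure-api.net/customers"      # hypothetical operation

# 1. Get an OAuth2 token from Azure AD.
token_response = requests.post(
    f"https://login.microsoftonline.com/{TENANT_ID}/oauth2/v2.0/token",
    data={
        "grant_type": "client_credentials",
        "client_id": CLIENT_ID,
        "client_secret": CLIENT_SECRET,
        "scope": SCOPE,
    },
)
access_token = token_response.json()["access_token"]

# 2. Call APIM with both the bearer token and the subscription key.
response = requests.get(
    API_URL,
    headers={
        "Authorization": f"Bearer {access_token}",
        "Ocp-Apim-Subscription-Key": "<subscription-key>",
    },
)
print(response.status_code)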


We now know the APIM is working with OAuth and the subscription key.
Configure OAuth from Power Apps:
  • Once Postman can get valid 200 responses using OAuth2 and the subscription key, set up the custom connector.  I could not get the Azure AD connector to work, so I used the OAuth connector as shown below:

Add a policy that will add the subscription key to every HTTPS request to the APIM resources, as shown below.

Updated 2 Feb 2020 - Example of custom OAuth code flow authentication for custom APIs using AAD security.

Note: Deploying Custom Connectors in solutions always seems to be an issue.  It is a good idea to keep all connection references in a separate Power Apps Solution.