Sunday 15 January 2023

Postman to verify OpenAPIs are running

Problem:  Our teams rely on a 3rd-party API for a new project being delivered. The APIs are in a state of change and are constantly up and down, making life tough for the teams relying on them.

Hypothesis:  I need a quick way to check whether the APIs are all working in dev and test.  I have two Postman collections for the REST APIs.  If I combine them and check the key APIs using Postman, I can save myself and others time, as I'll know the current state of the APIs.

Solution: Create a single Postman collection that does the API verification; you can make it more complex with data and variables.

Problem:  I can open Postman and run the tests, which takes a few minutes.  We need to do this more quickly.

Hypothesis: I'd like to be able to run the tests quickly on demand.  Use the Postman CLI and PowerShell to run the collection and display the result.

Solution

1) Add the Postman CLI to my machine:

PS> powershell.exe -NoProfile -InputFormat None -ExecutionPolicy AllSigned -Command "[System.Net.ServicePointManager]::SecurityProtocol = 3072; iex ((New-Object System.Net.WebClient).DownloadString('https://dl-cli.pstmn.io/install/win64.ps1'))"

2) In Postman, generate an API key for the collection: Collection > Run collection > Automate runs via CLI > Generate the API Key, then copy the generated code.


3) Run the generated code in PowerShell to verify it works correctly.

4) Copy the PowerShell code into a newly created .ps1 file on your local machine. I added a read line at the end so I can see the result before the window closes.
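For reference, a minimal sketch of what the API.ps1 file ends up looking like; the API key and collection ID below are placeholders, so substitute the values Postman generates for you:

# API.ps1 - run the combined Postman collection and keep the window open
postman login --with-api-key "PMAK-xxxxxxxxxxxxxxxxxxxx"          # authenticate the Postman CLI (placeholder key)
postman collection run "12345678-aaaa-bbbb-cccc-000000000000"     # run the collection by its ID (placeholder ID)
Read-Host -Prompt "Press Enter to close"                          # the read line, so the result stays on screen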


5) Run the API.ps1 file and verify the result.

6) Set up a desktop shortcut to run it and see the result.  Right-click the API.ps1 file and create a shortcut on your desktop, then right-click the shortcut and amend the target value:

C:\Windows\System32\WindowsPowerShell\v1.0\powershell.exe -ExecutionPolicy Bypass -File C:\Users\PaulBeck\Downloads\Projects\PoC\Postman\API.ps1

7) Save and run the shortcut to verify.

Problem:  Monitor that the DTAP APIs are working and performing, and alert when they are not.

Resolution: Use Postman Monitor to check that the endpoints specified in my Postman collection are working in Dev, UAT et al.; a monitor can cover more than one endpoint.

Next steps: Add the collection run to automated DevOps processes using Newman.
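As a rough sketch of that Newman step (assuming Node and the newman package are installed on the build agent; the collection and environment file names are placeholders exported from Postman):

# Run the exported collection against the dev environment and emit a JUnit report for the pipeline
newman run .\API.postman_collection.json -e .\dev.postman_environment.json --reporters cli,junit --reporter-junit-export .\results\newman-dev.xml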

Saturday 14 January 2023

APIM Logging

Overview: Azure's API Management (APIM) is a big service; it is worth understanding its logging capabilities so you can analyse traffic effectively.

Thoughts:

  • Multiple App Insights instances can be set up, with default logs going to a specific App Insights instance.
  • Each API can be overridden to log to any of the App Insights instances added to APIM.
  • The old "Classic" App Insights stored its data internally, whereas the new "workspace-based" App Insights (I think of it as "V2 App Insights connected to a Log Analytics workspace") stores its data in the workspace.
  • If you upgrade App Insights, the results blend from two storage locations: the old data stored internally with App Insights and the new data stored within Log Analytics. If you query Log Analytics directly, you only see the new data (see the query sketch after this list).
  • Security for App Insights should be done at the Resource Group (RG) level; there are App Insights roles for use at RG level. If the workspace is in a different resource group to the connected App Insights instance, ensure you sort out the permissions in both RGs.
  • The OpenTelemetry project is making strides forward, and for APIs it will be great.
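To sanity-check where the telemetry is landing, the workspace can be queried directly from PowerShell. A minimal sketch, assuming the Az.OperationalInsights module and a workspace-based instance (request telemetry lands in the AppRequests table); the workspace ID is a placeholder:

# Summarise the last hour of API requests from the Log Analytics workspace (V2 data only)
$workspaceId = "00000000-0000-0000-0000-000000000000"   # placeholder: the workspace (customer) ID
$query = "AppRequests | where TimeGenerated > ago(1h) | summarize count(), avg(DurationMs) by Name"
$result = Invoke-AzOperationalInsightsQuery -WorkspaceId $workspaceId -Query $query
$result.Results | Format-Table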

Problem: I recently migrated a customer's Dev, Test, Acceptance, Pre-prod and Production (not yet done) environments to use App Insights instances running on Log Analytics (sometimes referred to as V2).  Logging wasn't working correctly.


Initial Hypothesis: I have complicated resource groups that cross DTAP boundaries.  By default, APIM has a catch-all logging setup per APIM instance, and then specific APIs' settings are changed to log to specific App Insights instances.

Steps:

My first step was to rename the old classic-type App Insights instance, e.g. "appinsights-dev" becomes "appinsights-dev-delete".

Create a new App Insights instance using the V2 Log Analytics option and give it the original name ("appinsights-dev").  The client opted for the name to stay the same; it would be simpler to give it a new name like "appinsights-dev02".  The client also wanted to use a shared Log Analytics instance per environment, e.g. "loganalytics-dev-shared".
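A hedged sketch of the equivalent Az PowerShell, assuming the Az.OperationalInsights and Az.ApplicationInsights modules (a version that supports -WorkspaceResourceId); the resource group names and location are placeholders:

# Shared Log Analytics workspace per environment
$workspace = New-AzOperationalInsightsWorkspace -ResourceGroupName "rg-dev-shared" -Name "loganalytics-dev-shared" -Location "uksouth"

# Workspace-based (V2) App Insights connected to that workspace, reusing the original name
New-AzApplicationInsights -ResourceGroupName "rg-dev" -Name "appinsights-dev" -Location "uksouth" -WorkspaceResourceId $workspace.ResourceId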






Monday 2 January 2023

Power Platform Competitors

Overview:  I like the Power Platform, but there are other options out there.  Most competing products cover a piece of what the Power Platform covers.

Mendix: A friend has used Mendix and was not a fan.  I have played around with it and I like the product.  I certainly haven't found its limitations yet; it feels straightforward, logical and easy to learn.

Cons: 

  1. Publishing is very slow.
  2. On the free version you have to use a Mendix subdomain.  The Basic plan is $50/month, which is a high price for my small demo.



Airtable:  Has lots of templates and easily connects to various data sources.  It is easy to extend or build apps on the platform, and it provides storage and low-code apps; it's like having Dataverse and Power Apps.  The pre-built templates get the team off to a quick start.

Nintex: Bought K2 and has a long history in workflows (workflows for SharePoint and O365), screen/form generation and form building.  If your company already uses Nintex it is worthwhile, but I wouldn't use it for new projects or if the team does not have significant Nintex experience.

UiPath: RPA tool. A strong tool for automation; on the desktop automation and recording side I feel UiPath is ahead of Power Automate Desktop (PAD).  Both UiPath and Power Automate allow for attended and unattended runs.  PAD is part of the Power Automate Premium or Power Automate Process licences.  Automation Anywhere and Blue Prism are the other big players in the RPA space.

Postman Workflows: A bit of a dark horse, but people love Postman and I think this may become an interesting option for rapid development.

Amazon:  AWS has various services that allow for low-code solutions.  "Amazon Honeycode" is kind of like model-driven and canvas apps that can call Lambdas; Lambda is AWS's equivalent of Azure Functions.  Honeycode has predefined templates as starting points, so I feel it is more like Salesforce's low-code approach.  It allows the developer to break out and write complex logic or persist the data in S3 storage.  "Amazon QuickSight" works like Power BI for reporting on solution data.

Salesforce Lightning: Allows for building custom apps utilising Salesforce CRM and its data.

Retool: Good set of connectors to APIs.


Appian: 

OutSystems: