Monday 16 August 2021

Distributed tracing using Azure Application Insights across an Azure SaaS product

Overview:  Building SaaS products from multiple underlying Azure PaaS and IaaS services, with multiple microservices supporting and calling each other, is great.  The issue is that we need to be able to trace, debug, and observe the logic flow through multiple microservice calls.  Distributed tracing on Azure helps technical players such as DevOps teams, developers, support, technical leads and/or architects find and trace the entire execution to figure out what is/has happened.  Application Insights provides rich functionality on Azure PaaS services.

Distributed Tracing:  A good option for providing consistent, traceable logging is to use Application Insights with distributed tracing to trace the flow of each transaction.  The original request generates an operation Id, which downstream telemetry references as the operation_parentId.  Now we can easily follow the execution of a specific operation.

It is fairly easy to tie multiple operations together.  For example, for an operation that fires timer jobs, each timer job would be seen as a unique operation with its own full trace.  By referencing the original operation when the timer jobs are set up, long-running distributed jobs can be tied together, as in the sketch below.
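Below is a minimal C# sketch (class and parameter names are illustrative, not from the original solution) of tying a timer job back to the operation that scheduled it, by passing the original operation Id along with the job and using it as the parent of the new operation:

using Microsoft.ApplicationInsights;
using Microsoft.ApplicationInsights.DataContracts;

public class TimerJobRunner
{
    private readonly TelemetryClient telemetry;

    public TimerJobRunner(TelemetryClient telemetry)
    {
        this.telemetry = telemetry;
    }

    public void RunJob(string jobName, string originalOperationId)
    {
        // StartOperation begins a new, separately traced operation for the job.
        using (var operation = telemetry.StartOperation<RequestTelemetry>(jobName))
        {
            // Referencing the scheduling request's operation Id ties this job's
            // trace back to the original operation in Application Insights.
            operation.Telemetry.Context.Operation.ParentId = originalOperationId;

            // ... job logic; any TrackTrace/TrackException calls here are correlated.
        }
    }
}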

Thoughts:  Distributed tracing generally catches exceptions from the underlying infrastructure services such as SQL, Azure Functions, and App Service on Windows, but in code you can add additional tracing information.  The tracing info can be exception based (although most of this is picked up anyway) or trace based (recorded when an event happens that you want to capture).

It is a good idea to instantiate the TelemetryClient once per service (e.g., a Web API) and reuse the same telemetry client instance throughout each application.
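A minimal ASP.NET Core sketch of this (the controller name is illustrative): AddApplicationInsightsTelemetry registers a singleton TelemetryClient, which can then be injected and reused everywhere.

using Microsoft.ApplicationInsights;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Extensions.DependencyInjection;

public class Startup
{
    public void ConfigureServices(IServiceCollection services)
    {
        // Registers Application Insights and a single, shared TelemetryClient.
        services.AddApplicationInsightsTelemetry();
        services.AddControllers();
    }
}

[ApiController]
public class OrdersController : ControllerBase
{
    private readonly TelemetryClient telemetry;

    // The same TelemetryClient instance is injected wherever it is needed.
    public OrdersController(TelemetryClient telemetry)
    {
        this.telemetry = telemetry;
    }
}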

On exception, in both client and server side code, write to the App Insights telemetry.  Below is the C# server side code snippet:

try
{
    ...
    // Record a trace event with custom properties when something noteworthy happens.
    telemetry.TrackTrace(message, SeverityLevel.Warning, properties);
}
catch (Exception ex)
{
    // Record the exception so it appears in Application Insights, correlated with the operation.
    telemetry.TrackException(ex);
}

More Info:

Distributed Tracing in Azure Application Insights - Azure Monitor | Microsoft Docs

Application Insights API for custom events and metrics - Azure Monitor | Microsoft Docs

Sunday 8 August 2021

Tools for Architects

Overview:  As architects we use multiple tools, and it's a good idea to standardise tool usage, especially in larger businesses.  It's pretty common to see architects using different tools for drawing architecture diagrams (Visio, draw.io, ...).

Common drawing tools I've seen architects use:

  • Visio
  • Draw.io
  • Balsamiq (my odd tool of choice that offers low fidelity.  Alternatives are Figma, which is high fidelity for UI and UX design, though the issue is that end users think it is the real website and get confused.  Axure is also a nice UX prototyping tool).
  • Lucidchart (competitor products: Visio, Miro, Balsamiq).  Lucidscale can pull diagrams from existing Azure architecture.  There is also a cool Python Git project that allows you to specify a diagram using Python; it draws the diagram, and you can also then use the Python code to provision the infrastructure.
  • Miro (great for sharing).  Similar to Lucidchart, with good integration.  I tend not to introduce Miro as I use Teams and its whiteboarding.  FigJam, from the makers of Figma, is pretty good for brainstorming.
  • PowerPoint (sic. but it happens and some architects are good with it)

Tools for Retrieving Azure Architecture and creating documentation:

Enterprise Architecture Tooling:

  • Sparx
  • Archimate

Dev Tooling I use a lot:

  • Visual Studio
  • SQL Server Management Studio 
  • Postman/Swagger

WIP


Sunday 1 August 2021

JMeter - The basics

JMeter is an easy to use, open source load testing tool that works by simulating network requests.  JMeter is good for figuring out how well the server side responds under different test conditions.  JMeter is built with Java and can run on Linux, Mac or Windows using a Java Virtual Machine (JVM).


JMeter is Single Agent:

  • JMeter runs from the machine it is installed on, so it does not have multiple agents.  That said, it can simulate hundreds of users from fairly low spec machines.
  • To avoid network latency, test on the same subnet or data center.  A simple VM in Azure (with 2 vCPUs and 8 GB RAM) can mimic over a thousand requests per second.
  • You can run tests off multiple machines to generate extreme loads (first I would scale up to 8 cores and 64 GB RAM until the network traffic is maxed out).
  • Install the Windows JDK 11 before installing JMeter.
Updated 2 May 2023: The current version of JMeter is Apache JMeter 5.5

Download and record web tests using the JMeter GUI tool.
Azure Load Testing needs the recorded tests generated by the JMeter GUI.
Create a new Azure Load Test Resource and use the recorded JMX/Test script file.


JMeter GUI: open /bin/jmeter.bat to launch the GUI.
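Once a test plan has been recorded, it can be run without the GUI; a typical non-GUI run looks like the following (the file and folder names are just examples), and the same JMX file can be uploaded to an Azure Load Testing resource:

jmeter -n -t recorded-testplan.jmx -l results.jtl -e -o report

Here -n runs JMeter in non-GUI mode (recommended when generating load), -t points at the recorded JMX test plan, -l writes the sample results to a JTL file, and -e -o generate an HTML report into the report folder.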


Wednesday 28 July 2021

Azure DevOps User Stories Tips

Quick Point on User Stories and Acceptance Criteria.

1.      User Story descriptions must follow the format:

As the <role>, I want to <feature> so that <benefit>

Note: Always follow the exact same format and bold the standard/fixed parts of user stories.  Please keep consistency across your team's user stories.  Under the user story in the description, feel free to add more description, annotated images (very useful) and links to Figma, Axure, UI mocks or Miro.  User Stories should also follow the INVEST (Independent, Negotiable, Valuable, Estimable, Small, and Testable) breakdown.

2.      Acceptance Criteria (use the Gherkin language) under the user story (ensure it goes into the User Story section and not into the comments or the description)

Scenario:
  Given
  When
  Then

Example

Scenario: Employee requests leave
  Given an employee has sufficient leave available in the year
  When the employee schedules leave (holiday)
  Then the employee is informed his request is valid and his manager is informed of the request.

Note: Always follow the exact same format and bold the standard/fixed parts.  Please keep consistency across your acceptance criteria.  I bold and use the four parts as shown above in the example.  You can use "And" to extend the scenario; just try to keep them within the idea of INVEST.

3.      Other:  The order is: Tasks belong to User Stories, User Stories belong to a Feature, and Features belong to Epics.  These items must be related within Azure DevOps:  Epics > Features > User Stories (Acceptance Criteria) > Tasks.

Scrum - Part 1
Scrum - Part 2 
Scrum - Part 3 - Scaling Scrum
Scrum - Part 4 - Kanban and Scrum
Scrum - Part 5 - Certification PSM 1

Azure DevOps - Introduction

Tuesday 20 July 2021

Open Banking & Crypto currency - Capital Gains Tax

In the 2019/2020 & 2020/2021 UK tax years, I bought and sold cryptocurrencies; I happened to make a little money out of it through no skill of my own.  "A rising tide lifts all boats".  It has been bothering me, so I looked up how taxing crypto works, and it falls under Capital Gains.

UK individuals get a £12,300 tax-free allowance, and I am well below the threshold for owing HMRC additional tax.  I still need to report my gains on my SA100/Self Assessment with HMRC, using the SA108 for reporting Capital Gains.  There is also a great initiative, Payment Services Directive 2 (PSD2)/Open Banking, offered by companies like TrueLayer that provide APIs to get current account statements/transactions using OAuth2 permissions.  PSD2 is pretty useful for anti-money laundering (AML), specifically Know Your Customer (KYC), which falls under AML.

It is worth using this capital gains allowance if you can get solid returns off applicable assets such as shares or cryptocurrency.

"When you dispose of cryptoasset exchange tokens (known as cryptocurrency), you may need to pay Capital Gains Tax.

You only have to pay Capital Gains Tax on your overall gains above your tax-free allowance. The Capital Gains tax-free allowance is: £12,300".

TrueLayer and Token are services that connect to all UK commercial banks, instead of you having to individually connect to each bank's Open Banking APIs.

References

Payment Services Directive 2 and Open Banking | UK Finance

https://www.gov.uk/capital-gains-tax/allowances

Elastic Database Client Library for client database segregation on Azure PaaS for SaaS

Overview:  Provide a logically separated database instance for each client on my SaaS solution.  Using the Elastic Database client library from Microsoft on Azure PaaS services provides logical security separation of data, per-customer performance, and easy scalability.  Use Azure SQL Elastic Pools (HA redundant secondary database, built-in DR).  Also add temporal tables for a full history of all transactions.  A C# sketch of the application side follows the SQL snippets below.

PoC:

  1. Provision 3 databases - A Shard Map Manager (Catalogue) database and 2 client databases (tenants/shards).
  2. Add shard related metadata to the Catalogue database for each of these databases.
  3. Create the three service principals below in Azure AD:
    • Management Service Principal: for creating the shard metadata structure.  A contained database user in the Shard Map Manager db and each tenant db.
    • Access Service Principal: to load the shard mappings on the application side.  A contained database user in the Shard Map Manager db.
    • Connection Service Principal: to connect to the tenant databases.  A contained database user in each tenant db.


Management Service Principal: for creating the shard metadata structure

CREATE USER [shard-map-admin-sp] FROM EXTERNAL PROVIDER

EXEC sp_addrolemember N'db_ddladmin', N'shard-map-admin-sp'

EXEC sp_addrolemember N'db_datareader', N'shard-map-admin-sp'

EXEC sp_addrolemember N'db_datawriter', N'shard-map-admin-sp'

GRANT EXECUTE TO [shard-map-admin-sp]

 

Access Service Principal: to load the shard mappings on the application side

CREATE USER [shard-map-access-sp] FROM EXTERNAL PROVIDER

EXEC sp_addrolemember N'db_datareader', N'shard-map-access-sp'

GRANT EXECUTE TO [shard-map-access-sp]


Connection Service Principal: to connect to the client/tenant databases

CREATE USER [tenant-connection-sp] FROM EXTERNAL PROVIDER

EXEC sp_addrolemember N'db_datareader', N'tenant-connection-sp'

EXEC sp_addrolemember N'db_datawriter', N'tenant-connection-sp'

EXEC sp_addrolemember N'db_ddladmin', N'tenant-connection-sp'

GRANT EXECUTE TO [tenant-connection-sp]
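Below is a minimal C# sketch of the application side using the Elastic Database client library (server, shard map, and connection string names are illustrative placeholders): the access credentials load the shard map, and data-dependent routing opens a connection to the correct tenant database.

using Microsoft.Azure.SqlDatabase.ElasticScale.ShardManagement;
using System.Data.SqlClient;

public static class TenantConnections
{
    public static SqlConnection OpenTenantConnection(int tenantId)
    {
        // Load the shard map from the Shard Map Manager (Catalogue) database.
        ShardMapManager smm = ShardMapManagerFactory.GetSqlShardMapManager(
            "<shard-map-manager-connection-string>",
            ShardMapManagerLoadPolicy.Lazy);

        ListShardMap<int> shardMap = smm.GetListShardMap<int>("TenantShardMap");

        // Data-dependent routing: open a connection to whichever tenant database
        // this key maps to, using the connection service principal's credentials.
        return shardMap.OpenConnectionForKey(
            tenantId,
            "<tenant-connection-credentials>",
            ConnectionOptions.Validate);
    }
}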


References:

https://docs.microsoft.com/en-us/azure/azure-sql/database/elastic-database-client-library


Saturday 10 July 2021

Modeling and working with data on the Dataverse within the Power Platform

The Common Data Model - an industry-agreed approach to storing commonly used data.  Use it to store data consistently across the applications in your company/organisation.  This results in a single source of truth instead of multiple copies of the data held in different schemas.

There are two types of relationships, namely 1:N and N:N (Dataverse hides the associate/intersect entity that is created in the background).

Use "Option Sets" for small static data and use "Lookups" for larger or changing data.  In Multiselect Option sets for N:N relationships, rather use "Lookups".

SSMS can be used to view data using T-SQL, but it is better to use "SQL 4 CDS", as it provides the full ability to work with data from within the XrmToolBox.

XrmToolBox - a 3rd party download that has a ton of contributed tools, with good options for modelling data.  This is a collection of tools that are unbelievably useful and get continually updated, with new tools added.

Excel and browser plugins - Can be used to import/export data and there are some nice Edge/Browser plugins to help such as "Level up for Dynamics 365/Power Apps".  Level up for Dynamics 365/Power Apps is a fantastic tool that I encourage any developer to add to Edge or Chrome as an extension.

Level up for Dynamics 365/Power Apps Browser extension
Dynamics 365 Power Pane Browser extension.

Dynamics 365 Power Pane is also a useful extension (screenshot: Power Pane options).

The built-in browser Dataverse management tool is super easy to use to model your Common Data Model further.