Wednesday, 28 July 2021

Azure DevOps User Stories Tips

Quick Points on User Stories and Acceptance Criteria.

1.      User Story descriptions must follow this format:

As the <role>, I want to <feature> so that <benefit>

Note: Always follow the exact same format and bold up the standard/fixed parts for user stories.  Please keep consistency across your team's user stories.  Under the user story in the description, feel free to add more description, annotated images (very useful) and links to Figma, Axure, UI mocks or Miro.

2.      Acceptance Criteria (use Gherkin language) go under the user story (ensure they go into the User Story itself, not the comments or the description)

Scenario:
  Given
  When
  Then

Example

Scenario: Employee requests leave
  Given an employee has sufficient leave available in the year
  When the employee schedules leave (holiday)
  Then the employee is informed his request is valid and his manager is informed of the request.

Note: Always follow the exact same format and bold up the standard/fixed parts.  Please keep consistency across your acceptance criteria.  I bold and use the four parts as shown in the example above.

3.      Other:  The hierarchy is: Tasks belong to User Stories, User Stories belong to Features, and Features belong to Epics.  These items must be related within Azure DevOps. 

Scrum - Part 1
Scrum - Part 2 
Scrum - Part 3 - Scaling Scrum
Scrum - Part 4 - Kanban and Scrum
Scrum - Part 5 - Certification PSM 1

Tuesday, 20 July 2021

Open Banking & Crypto currency - Capital Gains Tax

In the 2019/2020 & 2020/2021 UK tax years, I bought and sold cryptocurrencies; I happened to make a little money out of it through no skill of my own.  "A rising tide lifts all boats".  It has been bothering me, so I looked up how crypto is taxed, and it falls under Capital Gains. 

UK individuals get a £12,300 annual Capital Gains allowance, so I am well below the threshold for owing HMRC additional tax.  I still need to report my gains on my SA100/Self Assessment with HMRC, using the SA108 form for reporting Capital Gains.  There is also a great initiative, Payment Services Directive 2 (PSD2)/Open Banking, offered by companies like TrueLayer that provide APIs to get current account statements/transactions using OAuth2 permissions.  PSD2 is pretty useful for anti-money laundering (AML), Know Your Customer (KYC) and a fair amount more that I'm not an expert in working with.

It is worth using this capital gains allowance if you can get solid returns off applicable assets such as shares or cryptocurrency.

"When you dispose of cryptoasset exchange tokens (known as cryptocurrency), you may need to pay Capital Gains Tax.

You only have to pay Capital Gains Tax on your overall gains above your tax-free allowance. The Capital Gains tax-free allowance is: £12,300"
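
To make the arithmetic concrete, here is a minimal C# sketch of the gain-above-allowance calculation.  Only the £12,300 allowance comes from the HMRC guidance above; the £15,000 gain figure is made up for illustration.

using System;

class CapitalGainsCheck
{
    const decimal AnnualExemptAmount = 12_300m; // 2020/21 Capital Gains tax-free allowance

    static void Main()
    {
        decimal totalGains = 15_000m; // illustrative: disposal proceeds minus allowable costs
        decimal taxableGain = Math.Max(0m, totalGains - AnnualExemptAmount);

        // Only the amount above the allowance is potentially taxed; the gains are still
        // reported on the SA108 as part of the Self Assessment return.
        Console.WriteLine($"Taxable gain: £{taxableGain:N0}"); // £2,700 in this example
    }
}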

References

Payment Services Directive 2 and Open Banking | UK Finance

https://www.gov.uk/capital-gains-tax/allowances

Elastic Database Client Library for client database segregation on Azure PaaS for SaaS

Overview:  Provide a logically separated database instance for each client on my SaaS solution.  Using the Elastic Database client library from Microsoft on Azure PaaS services provides logical security separation of data, per-customer performance, and easy scalability.  Use Azure SQL Elastic Pools (HA via a redundant secondary database, built-in DR).  Also add temporal tables for a full history of all transactions.

PoC:

  1. Provision 3 databases - a Shard Map Manager (Catalogue) database and 2 client databases (tenants/shards).
  2. Add shard-related metadata to the Catalogue database for each of these databases.
  3. Create the three service principals below in Azure AD: 
    • Management Service Principal: for creating the shard metadata structure.  A database-contained user in the Shard Map Manager db and each tenant db.
    • Access Service Principal: to load the shard mapping on the application side (see the C# sketch after the SQL grants below).  A database-contained user in the Shard Map Manager db.
    • Connection Service Principal: to connect to the tenant databases.  A database-contained user in each tenant db.


Management Service Principal: for creating shard metadata structure

CREATE USER [shard-map-admin-sp] FROM EXTERNAL PROVIDER

EXEC sp_addrolemember N'db_ddladmin', N'shard-map-admin-sp'

EXEC sp_addrolemember N'db_datareader', N'shard-map-admin-sp'

EXEC sp_addrolemember N'db_datawriter', N'shard-map-admin-sp'

GRANT EXECUTE TO [shard-map-admin-sp]

 

Access Service Principal: to load shard mapping at application side

CREATE USER [shard-map-access-sp] FROM EXTERNAL PROVIDER

EXEC sp_addrolemember N'db_datareader', N'shard-map-access-sp'

GRANT EXECUTE TO [shard-map-access-sp]

                                                         

Connection Service Principal: to connect client/tenant database

CREATE USER [tenant-connection-sp] FROM EXTERNAL PROVIDER

EXEC sp_addrolemember N'db_datareader', N'tenant-connection-sp'

EXEC sp_addrolemember N'db_datawriter', N'tenant-connection-sp'

EXEC sp_addrolemember N'db_ddladmin', N'tenant-connection-sp'

GRANT EXECUTE TO [tenant-connection-sp]
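
Once the catalogue metadata and contained users are in place, the application resolves each tenant's database at runtime with the Elastic Database client library (Microsoft.Azure.SqlDatabase.ElasticScale.Client NuGet package).  This is a minimal sketch of data-dependent routing, assuming a list shard map keyed on an integer tenant id; the shard map name and connection strings are placeholders.

using Microsoft.Azure.SqlDatabase.ElasticScale.ShardManagement;
using System.Data.SqlClient;

class TenantRouting
{
    // Placeholder connection strings for the access and connection service principals.
    const string CatalogueConnectionString = "<shard-map-manager-db-connection-string>";
    const string TenantUserCredentials = "<tenant-user-credentials-only-connection-string>";

    static SqlConnection OpenTenantConnection(int tenantId)
    {
        // Load the shard map from the Shard Map Manager (catalogue) database.
        ShardMapManager smm = ShardMapManagerFactory.GetSqlShardMapManager(
            CatalogueConnectionString, ShardMapManagerLoadPolicy.Lazy);

        // "TenantShardMap" is an assumed name created when the shards were registered.
        ListShardMap<int> tenantShardMap = smm.GetListShardMap<int>("TenantShardMap");

        // Data-dependent routing: open a connection to the shard holding this tenant.
        return tenantShardMap.OpenConnectionForKey(
            tenantId, TenantUserCredentials, ConnectionOptions.Validate);
    }
}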


References:

https://docs.microsoft.com/en-us/azure/azure-sql/database/elastic-database-client-library


Thursday, 3 June 2021

Post a message into a Teams Channel using any HTTP client

Overview:  I need to post messages into Teams channels from my application; it is extremely easy to do and took me 15 minutes. 

Steps to Post a message from Postman into a specific Teams Channel:

1. Setup a channel to accept POST requests


Add a connector to the Channel




Find the "Incoming Webhook" connector


Create/Configure the new Webhook

Copy the webhook endpoint

2. Send a Postman POST HTTP request to push the data into the Teams Channel (a C# alternative is sketched after the tip below)


3. Verify the result in the Teams Channel
The custom message is displayed in the channel.

Tip: Format the card/message using these instructions.
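
The same POST can be sent from any HTTP client, not just Postman.  A minimal C# sketch is below; the webhook URL is a placeholder for the one copied in step 1, and a plain text payload is shown rather than a formatted card.

using System.Net.Http;
using System.Text;
using System.Threading.Tasks;

class TeamsWebhookPost
{
    // Placeholder: paste the Incoming Webhook URL copied from the channel connector.
    const string WebhookUrl = "https://outlook.office.com/webhook/<your-webhook-id>";

    static async Task Main()
    {
        using var client = new HttpClient();

        // Minimal payload: "text" is the only property needed for a simple message.
        var payload = "{\"text\": \"Build 1234 completed successfully.\"}";
        var content = new StringContent(payload, Encoding.UTF8, "application/json");

        var response = await client.PostAsync(WebhookUrl, content);
        response.EnsureSuccessStatusCode(); // Teams returns 200 OK when the message is accepted
    }
}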

Tuesday, 1 June 2021

SaaS Azure Testing Thoughts

 Tooling:

  1. API Automation - Postman, Newman
  2. UI Automation - Selenium
  3. IDE - Visual Studio 2019
  4. Test Organization - Azure DevOps Test Plan
  5. CI/CD - Azure DevOps

Code reviews:

Code review is used as a verification technique to ensure that each unit is coded to the expected business logic and in line with coding standards and best practices.  Automated code review built into Azure Pipelines should include:  

  • SonarQube - static analysis.
  • Blackduck - Open-Source Scanning (OSS) tool, used to look for license risks and unused references.
  • Checkmarx - Static Application Security Testing (SAST) tool; benefits include detecting security vulnerabilities, improving developer practices, and reporting on code ownership.  VeraCode is a competitor product.
  • BugSuite
Code should pass OWASP (Open Web Application Security Project) checks, which cover the most common code vulnerabilities.  Also consider ASVS (Application Security Verification Standard), a framework of controls for building applications covering functional requirements and NFRs.

Unit testing:

Unit tests are written to ensure every unit of code works as expected and to prevent defects from reaching the next level, across all C# code.  xUnit and Moq are the tools to be used for unit testing, following the standard Arrange > Act > Assert pattern.

As long as unit test coverage is high and of a good standard, I don't mind whether the tests are written before the code (TDD) or, as most developers tend to do, after the code is written.
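
As a concrete illustration, here is a minimal xUnit + Moq test following Arrange > Act > Assert.  The LeaveService and ILeaveRepository types are invented for the example.

using Moq;
using Xunit;

public interface ILeaveRepository
{
    int GetRemainingDays(int employeeId);
}

public class LeaveService
{
    private readonly ILeaveRepository _repository;
    public LeaveService(ILeaveRepository repository) => _repository = repository;

    public bool CanRequestLeave(int employeeId, int daysRequested) =>
        _repository.GetRemainingDays(employeeId) >= daysRequested;
}

public class LeaveServiceTests
{
    [Fact]
    public void CanRequestLeave_ReturnsTrue_WhenEnoughDaysRemain()
    {
        // Arrange: mock the repository so employee 42 has 10 days remaining.
        var repository = new Mock<ILeaveRepository>();
        repository.Setup(r => r.GetRemainingDays(42)).Returns(10);
        var service = new LeaveService(repository.Object);

        // Act
        var result = service.CanRequestLeave(42, 5);

        // Assert
        Assert.True(result);
    }
}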

API testing:

All APIs must have Postman collections and Environments for local testing.  The tests need to cover authentication, authorisation, status codes, body responses, headers, data persistence, and post-test clean-up for every API.  Use Newman to integrate the Postman tests into Azure Pipelines:

https://www.npmjs.com/package/newman-reporter-htmlextra

Selenium testing:

UI testing must be automated where possible.
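
A minimal Selenium WebDriver sketch in C# (Selenium.WebDriver and ChromeDriver NuGet packages); the URL and element ids are illustrative only.

using System;
using OpenQA.Selenium;
using OpenQA.Selenium.Chrome;

class LoginSmokeTest
{
    static void Main()
    {
        // ChromeDriver must match the locally installed Chrome version.
        using IWebDriver driver = new ChromeDriver();

        driver.Navigate().GoToUrl("https://my-saas-app.example.com/login");
        driver.FindElement(By.Id("username")).SendKeys("test.user");
        driver.FindElement(By.Id("password")).SendKeys("Passw0rd!");
        driver.FindElement(By.Id("signin")).Click();

        // Basic check: the dashboard heading is shown after sign-in.
        bool loggedIn = driver.FindElement(By.TagName("h1")).Text.Contains("Dashboard");
        Console.WriteLine(loggedIn ? "PASS" : "FAIL");
    }
}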

SonarQube: "automatic code review tool to detect bugs, vulnerabilities, and code smells in your code" SonarQube documentation




Saturday, 8 May 2021

Enterprise Service Bus and Message Queue thoughts

Message queues have been around for many years and I've implemented them using SonicMQ, MSMQ and IBM MQ.  I like to keep my architecture as simple as possible, so I still use queues, but deciding on your "eventing" architecture brings Enterprise Service Buses (ESBs) and all the other players into the picture.  The other two native PaaS players for messaging are Event Grid and Event Hubs (IoT).  It comes down to what you are trying to achieve, whether you will need more functionality later, what your business already has available, and the skills you have to work with.

I like Azure Storage Queues: they are cheap and simple to set up and understand, so for simple message queue capability:

Azure Storage Queues sit on Azure Storage (type Queue Storage).  Multiple queues can live on a single storage account, and queue names must be lowercase.  Messages can be up to 64 KB.  Order of queued messages is not guaranteed.  Maximum message lifespan used to be 7 days (the default is still 7 days); now the maximum time-to-live can be set to -1, meaning never expire.  Use a SAS token for programmatic access.  There is a REST API to add and pull from the queue (it also has a peek API).  Azure Storage supports a "Poison queue": when the "Dequeue count" exceeds the threshold set on the queue, the message is moved into the poison queue.  Pricing is pretty cheap; LRS is the cheapest, with RA-GRS (read-access geo-redundant storage) being the most expensive.  

Get a message from the queue and update it, making it invisible for another 60 seconds (Code Source: Microsoft Docs)

QueueClient queueClient = new QueueClient(connectionString, queueName);

// Get the message from the queue
QueueMessage[] retrievedMessage = queueClient.ReceiveMessages();

// Update the message contents and make it invisible for another 60 seconds
queueClient.UpdateMessage(retrievedMessage[0].MessageId,
    retrievedMessage[0].PopReceipt,
    "Updated contents",
    TimeSpan.FromSeconds(60.0));

// DeQueue/Delete the message
queueClient.DeleteMessage(retrievedMessage[0].MessageId, retrievedMessage[0].PopReceipt);

Azure Service Bus is a fully managed enterprise service bus (ESB) service.  It allows standard queue (point-to-point) or topic, also called pub-sub (point-to-multipoint), messaging.  For external connectivity, Azure Relay offers two options: Hybrid Connections (WebSockets) and WCF Relays.  Messages up to 256 KB, except Premium, where up to 1 MB message size is allowed.  Unlimited message lifespan.  Dead-lettering option.  Programmatic access via SAS tokens or Azure AD.  Supports access via REST or AMQP (used for many years as the standard for message queues).  Has duplicate message detection (helps ensure "at-most-once" delivery).
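
To show the programmatic side, here is a minimal sketch of sending a message to a Service Bus queue with the Azure.Messaging.ServiceBus SDK; the connection string and queue name are placeholders.

using System.Threading.Tasks;
using Azure.Messaging.ServiceBus;

class ServiceBusSendDemo
{
    static async Task Main()
    {
        // Placeholders: use your namespace connection string (or AAD credentials) and queue name.
        await using var client = new ServiceBusClient("<service-bus-connection-string>");
        ServiceBusSender sender = client.CreateSender("orders");

        // The body is a simple UTF-8 string here; AMQP is used under the covers.
        await sender.SendMessageAsync(new ServiceBusMessage("New order received"));
    }
}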




Competitor options for Azure Service Bus, including other message-exchange technologies: AWS SQS, GCP Pub/Sub, NATS, Oracle ESB, JBoss Fuse, Mule ESB (from MuleSoft), IBM WebSphere ESB, BizTalk, Azure Event Grid, Azure Storage Queues, Sonic ESB, and I guess all the message queues like SonicMQ and IBM MQ. 

More Info:
NATS - Common ESB software gaining popularity



Friday, 30 April 2021

Azure Naming Conventions

 My Format (I simplify for smaller companies)

<Company>-<BusinessUnit>-<Region>-<Environment>-<ResourceType>-<Project>-<Instance>

GS-IT-UK-PR-RGP-Treetops-001

GS-HR-US-DV-NSG-Cloud-001

I like to enforce the same length for each part, just because it makes it easier to read in a list, e.g. Region could be the 2-digit country code. 

Environment is my DTAP environment, i.e. DV = Development, TS = Test, AS = Acceptance, PR = Production.

Resource Type is the Azure resource type, e.g. Network Security Group = NSG.  It is worth publishing a list, as App Services could be APP or APS.

Tip: In Azure you sometimes can't use hyphens or you need to use lowercase.  If I am forced, then I keep the same convention but merely abide by the rules of the service.

The key is to keep it consistent.  I find organisations use Tags poorly, so the naming convention helps replace the need for Tags, or tags can easily be added, as the name already gives the information away.
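
As a quick illustration of the tip above, a small helper can build the name from its parts and fall back to a lowercase, hyphen-free form when a service demands it.  The segment values are examples only.

using System;

class AzureNaming
{
    // Builds <Company>-<BusinessUnit>-<Region>-<Environment>-<ResourceType>-<Project>-<Instance>.
    static string BuildName(string company, string businessUnit, string region,
        string environment, string resourceType, string project, int instance,
        bool lowercaseNoHyphens = false)
    {
        string name = string.Join("-", company, businessUnit, region,
            environment, resourceType, project, instance.ToString("000"));

        // Some services (e.g. storage accounts) disallow hyphens and uppercase,
        // so keep the same convention but abide by the rules of the service.
        return lowercaseNoHyphens ? name.Replace("-", "").ToLowerInvariant() : name;
    }

    static void Main()
    {
        Console.WriteLine(BuildName("GS", "IT", "UK", "PR", "RGP", "Treetops", 1));
        // GS-IT-UK-PR-RGP-Treetops-001
    }
}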

Microsoft's recommended Azure naming convention:



Friday, 2 April 2021

Power Apps using Excel in OneDrive or SharePoint

Overview:  A Canvas Power App can easily connect to Excel held within SharePoint or OneDrive.  It is great for getting values in, or for reading static values from a list.

Limitations

1. You need to create a table in Excel to connect from Power Apps.  The problem is the table cannot contain any formulas.  I wanted to input a value and use Excel formulas to get a calculated risk value, but Power Apps can't connect to such a table.  The workaround is to build the formulas in Power Apps, but for my customer's complex Excel sheet I don't want to spend weeks re-engineering the logic in code.

Excel: Trying to read field B19 that references my calculation:


Power Apps: Trying to connect to the Excel table results in the "Excel file containing formula are currently not supported" error message:

2. Size limit, max 2 MB Excel file.  This may be bigger now.

Summary:  You can't work with large data sets and you can't read from calculated cells in tables.  Excel is fine for inputting into a document that will still be used as Excel, but it is pretty useless as a data store for Power Apps.  Rather use a SharePoint list, but that doesn't help if you actually want to use Excel because it contains complex logic.

Sunday, 21 March 2021

Building Responsive Canvas Power Apps

Problem:  Power Apps is great, but historically you either build for the tablet or the phone, and end users then get a merely alright experience on the other device type.  Microsoft has Horizontal and Vertical containers, and they are a great way to build Responsive/Progressive applications.  

On the Power App make the following settings changes:

  • Settings > Advanced Settings > Enable "Layout Containers"
  • Settings > Screen size + Orientation > Disable "Scale to Fit"
  • Settings > Screen size + Orientation > Disable "Lock Orientation"

Layout controls: Horizontal container & Vertical container - use these 2 controls to get a Responsive Design.  Screen layouts are only available for the Tablet layout. 

WIP

Monday, 15 March 2021

APIM Authentication and authorisation

Overview:  APIM provides two methods for securing access to API operations (a sketch of the first option follows the list):

  1. Subscription keys: passed in the header or the query string (a primary and a secondary subscription key are generated for each subscription), and 
  2. Client Certificates.
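
For the first option, here is a minimal C# sketch of calling an APIM-fronted operation with a subscription key.  The gateway URL, operation path and key are placeholders; APIM also accepts the key as the subscription-key query string parameter.

using System.Net.Http;
using System.Threading.Tasks;

class ApimSubscriptionKeyDemo
{
    static async Task Main()
    {
        using var client = new HttpClient();

        // APIM looks for the key in the Ocp-Apim-Subscription-Key header by default.
        client.DefaultRequestHeaders.Add("Ocp-Apim-Subscription-Key", "<primary-or-secondary-key>");

        var response = await client.GetAsync("https://my-apim-instance.azure-api.net/leave/requests");
        response.EnsureSuccessStatusCode();
    }
}
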
WIP


Saturday, 13 March 2021

APIM OpenAPI Specification Documentation Example within the Developer Portal

Overview:  I find documenting APIM contracts incredibly important, and yet it's often done very poorly.  This post provides a simple YAML and JSON example that can be imported into APIM, or any other gateway product for that matter.

The YAML file below can be imported into APIM and published to the developer portal.  It gives a clear example of the options available and how an API should be documented.  Developers can see an example of the JSON to use when performing the PUT, and they can see more information about each property; for instance, a passport number has a set length, so rather than being given an optional free-text string with no description, the developer knows the format the property has to arrive in.


Simple OpenAPI specification showing a single documented operation for a complex PUT object (YAML).

PB APIM Series:
Documenting your API in the Developer Portal (this post)


Sunday, 7 March 2021

SaaS Product Customer Experience - level 101 Thoughts

Overview: With SaaS products it is easy for our customers to leave.  We need to reduce turnover/churn, and technology can help deliver a great Customer Experience (CX). 


Topics to Review:
  1. SLA's & OLA's
  2. Service Status Page - Microsoft do a great job at showing status pages for their services.
  3. Incident Management - Convert incidents into a knowledge base, both internally and customer facing.  Align sales and customer management information: find a customer, see their past requests, and help reduce churn (e.g. ServiceNow incident management).
  4. Content - product documentation, community forum, online knowledge base, ability to have a good search to cross all the channels (think Coveo), support chat bots, live conversations with support.
  5. Certifications and gamification are useful for building a community. 
  6. Communicate new features, educate support, educate sales and evangelists and clients.  Training and consultancy.  Monitor communication, use Sentiment Analysis.
  7. Support levels, i.e. free support may not allow phone conversations and may have no SLA, whereas premium support may offer 24-hour production issue resolution with money-back guarantees.  Example: Azure Support Plans.  Phone support is expensive; offering it 24/7 and providing good support is costly, and people are becoming more familiar with digital self-service.  It is also a good idea to have a warm hand-off from automation to a person.
Getting your processes correct and clear is key to your Digital customer experience (CX).  Key areas to consider: 
  • Advertising/attracting clients, converting leads to clients;
  • Trials and paying (make it easy, cost effective, billing) - customer must understand what they are paying for;
  • First time user experience (easy good experience the client can use);
  • Habitual users (once used to the system, do my users have the best experience - get usage telemetry); and
  • Support (levels of support, chat bots, email, call support).
Thoughts: The SaaS world changes so quickly these days; great customer experience and support are more important than ever.  An interesting idea I heard: "You can lose a customer on price or customer service, you will only win them back through customer service".  Know Your Customer (KYC) to ensure you can delight your customer.  Get one step ahead; try to understand your customers' concerns early.


.NET versions no longer supported

Problem:  The client has several existing .NET applications/products.  These products run on Azure Service Fabric and have been developed over several years.  The Service Fabric applications use various .NET Core and .NET Framework versions, and many of the apps are out of Microsoft support for the .NET version they are written on.

Initial Hypothesis:  We have .NET Framework and .NET Core apps.  .NET Framework came first, and its last version was .NET Framework 4.8.  .NET Core followed, and after .NET Core 3.1 the name changed to .NET 5; the "Core" part was dropped.  Lastly, .NET Standard is often used by software companies to write DLLs: only class libraries are allowed, but any .NET Framework or .NET Core project can reference them.

  • .NET Framework - 4.8 was the last release version
  • .NET Core became .NET - the last Core release was 3.1, after which it was renamed to .NET 5, the latest version
  • .NET Standard - class library projects only

Proposed Resolution: My preference is to upgrade .NET Core and .NET Framework apps separately, as this is the lowest risk and gives the most extended life with minimised cost. 

 .NET Core

For .NET Core apps below .NET Core 3.1, migrate to .NET 5.  Apps on .NET Core 3.1 (LTS) don't need upgrading.  Any other versions should be upgraded to .NET 5. 

Version | Release date | Released with | Latest update | Latest update date | Support ends
.NET Core 1.0 | 2016-06-27 | Visual Studio 2015 Update 3 | 1.0.16 | 2019-05-14 | June 27, 2019
.NET Core 1.1 | 2016-11-16 | Visual Studio 2017 Version 15.0 | 1.1.13 | 2019-05-14 | June 27, 2019
.NET Core 2.0 | 2017-08-14 | Visual Studio 2017 Version 15.3 | 2.0.9 | 2018-07-10 | October 1, 2018
.NET Core 2.1 | 2018-05-30 | Visual Studio 2017 Version 15.7 | 2.1.26 (LTS) | 2021-03-09 | August 21, 2021
.NET Core 2.2 | 2018-12-04 | Visual Studio 2019 Version 16.0 | 2.2.8 | 2019-11-19 | December 23, 2019
.NET Core 3.0 | 2019-09-23 | Visual Studio 2019 Version 16.3 | 3.0.3 | 2020-02-18 | March 3, 2020
.NET Core 3.1 | 2019-12-03 | Visual Studio 2019 Version 16.4 | 3.1.13 (LTS) | 2021-03-09 | December 3, 2022
.NET 5 | 2020-11-10 | Visual Studio 2019 Version 16.8 | 5.0.5 | 2021-04-06 | 3 months after .NET 6 release


 
.NET Framework

For .NET Framework, anything less than .NET Framework 3.5 should move to 3.5.  Apps based on 3.5 can stay on 3.5.  Apps using .NET Framework 4.0 to 4.5.1 should upgrade to .NET Framework 4.8.  Apps using .NET Framework 4.5.2 to 4.7 can stay on their versions.

Version | Existing Application Support | Target New Applications?
1.x | Out of support - migrate ASAP | Out of support - do not target
2.x | Out of support - migrate ASAP | Out of support - do not target
3.x | Out of support - migrate ASAP | Out of support - do not target
3.5 | Supported with the Operating System (until 2029) | Supported with the Operating System (until 2029) - target with caution
4.0 to 4.5.1 | Out of support - migrate ASAP | Out of support - do not target
4.5.2 to 4.7 | ⚠️ Supported with the Operating System (mixed) - review the support policy of each version | Supported with the Operating System (mixed) - target with caution
4.8 | Supported with the Operating System (indefinitely) | Supported with the Operating System (indefinitely)


.NET Standard