Sunday 27 June 2021

Azure Elastic Pools - DB creation, schema alignment for SaaS

Overview:  Provisioning and seeding databases is pretty straightforward; however, ensuring multi-tenant database schemas are aligned is a little tricky.  Azure has the Elastic Job Agent service, which has been in preview for many years and is a good service.  It is extremely useful for updating multiple database instances en masse.

There are 2 common scenarios:

1. Provision a new database for a client.  This involves 1) creating the SQL database instance either on a server or in an elastic pool, 2) updating the new database with the appropriate schema, and 3) inserting any seed data into the database.

2. Update groups of database instances, or all of them, to a specific schema i.e. change the schema for all clients.

Scenario 2 needs to apply the schema to multiple databases and Elastic Jobs are perfect for this.  Scenario 1 only needs to update a single database, and it can use the same T-SQL code to ensure new databases have the correct schema applied.
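For scenario 1, a minimal sketch of applying the schema script to a newly provisioned tenant database from C# (the connection string, script path, and naive GO-splitting are my own placeholders/simplifications, not a prescribed approach):

using Microsoft.Data.SqlClient;
using System;
using System.IO;

// Placeholders: the newly created tenant database and the schema script held in source control
var connectionString = "<connection string to the new tenant database>";
var script = File.ReadAllText("schema/v1.0.0.sql");

using var conn = new SqlConnection(connectionString);
conn.Open();

// Naive split on GO batch separators; a real deployment might use DbUp, a DACPAC, or the Elastic Job itself
foreach (var batch in script.Split(new[] { "\r\nGO", "\nGO" }, StringSplitOptions.RemoveEmptyEntries))
{
    using var cmd = new SqlCommand(batch, conn);
    cmd.ExecuteNonQuery();
}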

Elastic Job Agent:

A job can be created that uses T-SQL (from source control) to ensure the schemas of the target databases are updated to a specific schema version.  A dedicated database is used to monitor and manage the jobs.  I call this the "agent-elasticdb" database.

Note:  The target can be all databases on a server, elastic pools, groups of named database instances, or a single database.

If a job has multiple target databases to update, the updates are run in parallel.

Thursday 3 June 2021

Post a message into a Teams Channel using any HTTP client

Overview:  I need to post messages into Teams channels from my application; it is extremely easy to do and took me 15 minutes.

Steps to Post a message from Postman into a specific Teams Channel:

1. Set up a channel to accept POST requests
  • Add a connector to the Channel
  • Find the "Incoming Webhook" connector
  • Create/configure the new Webhook
  • Copy the webhook endpoint

2. Send a POST HTTP request from Postman (or any HTTP client) to push the data into the Teams Channel - a C# sketch is shown below


3. Verify the result in the Teams Channel - the custom message is displayed in the channel.

Tip: Format the card/message using these instructions.
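For completeness, here is step 2 from C# rather than Postman - a minimal sketch assuming the webhook URL copied in step 1 (the URL and message text are placeholders):

using System.Net.Http;
using System.Text;

var webhookUrl = "<webhook URL copied in step 1>";   // placeholder
using var http = new HttpClient();

// The simplest payload an Incoming Webhook accepts is a JSON object with a "text" property
var payload = "{\"text\": \"Hello from my application\"}";
var response = await http.PostAsync(webhookUrl, new StringContent(payload, Encoding.UTF8, "application/json"));

// A 200 response means Teams accepted the message
response.EnsureSuccessStatusCode();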

Tuesday 1 June 2021

SaaS Azure Testing Thoughts

 Tooling:

  1. API Automation - Postman, Newman
  2. UI Automation - Selenium
  3. IDE - Visual Studio 2019
  4. Test Organization - Azure DevOps Test Plan
  5. CI/CD - Azure DevOps

Code reviews:

Code review is used as a verification technique to ensure that each unit is coded to the expected business logic and in line with coding standards and best practices.  Automated code review built into Azure Pipelines should include:

  • WhiteSource Bolt - Scan packages for vulnerabilities.
  • SonarQube - Static analysis.
  • Blackduck - Open-Source Scanning (OSS) tool.  Used to look for license risks and unused references.
  • Checkmarx - Static Application Security Testing (SAST) tool.  Benefits include: detects security vulnerabilities, improves developer practices, and reports on code ownership.  Static code analysis.  VeraCode is a competitor product.
  • BugSuite

Code should pass OWASP (Open Web Application Security Project) checks; OWASP lists the most common code vulnerabilities.  OWASP ASVS (Application Security Verification Standard) is a framework of controls for building applications, covering functional and non-functional requirements (NFRs) for web applications.

Unit testing:

Unit tests are written for all C# code to ensure every unit of code works as expected and to prevent defects from reaching the next level.  xUnit and Moq are the tools to be used for unit testing, following the standard Arrange > Act > Assert pattern.

As long as unit test coverage is high and of a good standard, I don't mind if the tests are written before the code (TDD) or, as most developers tend to do, after the code is written.
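A minimal sketch of the Arrange > Act > Assert pattern with xUnit and Moq (the InvoiceService and IInvoiceRepository types are hypothetical, defined here only to make the test self-contained):

using System.Linq;
using Moq;
using Xunit;

// Hypothetical production types used only for illustration
public interface IInvoiceRepository { decimal[] GetLineItems(int invoiceId); }
public class InvoiceService
{
    private readonly IInvoiceRepository _repo;
    public InvoiceService(IInvoiceRepository repo) => _repo = repo;
    public decimal GetTotal(int invoiceId) => _repo.GetLineItems(invoiceId).Sum();
}

public class InvoiceServiceTests
{
    [Fact]
    public void GetTotal_ReturnsSumOfLineItems()
    {
        // Arrange - mock the repository the service depends on
        var repo = new Mock<IInvoiceRepository>();
        repo.Setup(r => r.GetLineItems(42)).Returns(new[] { 10m, 15m });
        var service = new InvoiceService(repo.Object);

        // Act
        var total = service.GetTotal(42);

        // Assert
        Assert.Equal(25m, total);
        repo.Verify(r => r.GetLineItems(42), Times.Once());
    }
}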

API testing:

All APIs must use Postman collections and environments for local testing.  The tests need to cover all APIs, dealing with authentication, authorisation, checking status codes, body responses, headers, data persistence, and post-test clean-up.  Use Newman to integrate the Postman tests into Azure Pipelines:

https://www.npmjs.com/package/newman-reporter-htmlextra

Selenium testing:

Code for UI must be automated where possible.

SonarQube: "automatic code review tool to detect bugs, vulnerabilities, and code smells in your code" SonarQube documentation

Code Smells:  Bloaters, OO abusers, ....

Checkmarx detects potential security issues.

Disposable email addresses: You often need to test login/account creation, and it's useful to have temporary disposable email addresses.

Sunday 16 May 2021

Model Driven Apps in Power Apps

Overview:  Model-driven apps are one of the three types of Power Apps we can build.  They sit on the CDS/Dataverse.

Model-driven apps use the CDS to hold the data that drives the application.  The CDS has an encrypted (HTTPS) REST API to access the data, which makes it great for building and connecting 3rd-party applications.
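As a rough sketch, the CDS/Dataverse data can be read over its Web API with any HTTP client; the organisation URL, entity set, and bearer token below are placeholders (the token would come from an AAD app registration):

using System;
using System.Net.Http;
using System.Net.Http.Headers;

var http = new HttpClient { BaseAddress = new Uri("https://<your-org>.crm.dynamics.com/") };   // placeholder org URL
http.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Bearer", "<access token>");   // placeholder token

// Query the 'accounts' entity set via the OData Web API
var json = await http.GetStringAsync("api/data/v9.2/accounts?$select=name&$top=3");
Console.WriteLine(json);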

Choose from the three types of Power Apps available.

As the model-driven app is based on the CDS model, you need to ensure you have the appropriate entities and attributes.  Very similar to ERD/UML modelling.
  • Use either a standard or an activity (time-based) entity as a starting point for creating new entities.
  • There are lots of attribute types e.g. text field, number, or email address...
  • The security model is great for ensuring the correct people have the appropriate rights to the data and the CDS instance.  Users can have multiple security roles, which are additive (positive) in nature.
Note:  A lot of this information comes from the Power Apps App Maker training day from Microsoft.  The course is excellent and I wish I'd seen this before doing the PL-100 exam.

Sunday 9 May 2021

SaaS Go-to-Market (GTM)

Overview: Setting up a new SaaS product is about getting a product that fits into a market.  I like the 1/10/100 strategy, especially in the B2B market.  For me it is about getting the product offering right before pushing money into marketing or generating revenue.  The issue is you need to be fully ready, including making support as low-touch as possible.

A common mistake is not having private betas and not keeping the beta small.  It's dead easy: get 1 trusted customer that you have an open and honest relationship with to find the easy improvements, and figure out how to support them and answer questions ...

The customer must have the need, preferably with a bad experience of a competitor in the same market, a high-trust relationship, and high expectations for the offering.

So once the tech, monitoring, and onboarding are all done, we start with 1 customer in beta.  We learn lessons, improve the product, and then prepare a private beta for 10 customers.  Get comms right, and ensure all support and monitoring is in place.

Saturday 8 May 2021

Enterprise Service Bus and Message Queue thoughts

Message queues have been around for many years and I've implemented message queuing using SonicMQ, MSMQ, and IBM MQ.  I like to keep my architecture as simple as possible, so I still use queues, but deciding on your "eventing" architecture brings Enterprise Service Buses (ESBs) and all the other players into the picture.  The other two native Azure PaaS players for messaging are Event Grid and Event Hubs (IoT).  It comes down to what you are trying to achieve, whether you are going to need more functionality later, what your business has available, and the skills you have to work with.

I like Azure Storage Queues; they are cheap, and simple to set up and understand, so for simple message queue capability:

Azure Storage Queues sit on Azure Storage (type: Queue Storage).  Multiple queues can be on a single Queue Storage account, and they must be named in lowercase.  Messages can be up to 64 KB.  The order of queued messages is not guaranteed.  The maximum message lifespan used to be 7 days (the default is still 7 days); now the time-to-live can be set to -1, meaning never expire.  A SAS token is used for programmatic access.  There is a REST API to add messages to and pull messages from the queue (there is also a peek API).  A "poison queue" pattern is supported, so when a message's "DequeueCount" exceeds the threshold set on the queue, the message is moved into the poison queue.  Pricing is pretty cheap; LRS is the cheapest, with RA-GRS (read-access geo-redundant storage) being the most expensive.

Get a message from the queue and amend it so it stays invisible for another 60 seconds (code source: Microsoft Docs):

// Requires the Azure.Storage.Queues package
QueueClient queueClient = new QueueClient(connectionString, queueName);

// Get the next message from the queue
QueueMessage[] messages = queueClient.ReceiveMessages();

// Update the message contents and make it invisible for another 60 seconds
queueClient.UpdateMessage(messages[0].MessageId, messages[0].PopReceipt,
    "Updated contents", TimeSpan.FromSeconds(60.0));

// Dequeue/delete the message
queueClient.DeleteMessage(messages[0].MessageId, messages[0].PopReceipt);
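A minimal sketch of the poison-queue handling mentioned above, assuming a companion queue named "<queue>-poison" and a threshold of 5 (both are my own example choices, not values enforced by Storage Queues):

// Move messages that keep failing to a companion poison queue after too many delivery attempts
QueueClient poisonClient = new QueueClient(connectionString, queueName + "-poison");
poisonClient.CreateIfNotExists();

foreach (QueueMessage msg in queueClient.ReceiveMessages(maxMessages: 10).Value)
{
    if (msg.DequeueCount > 5)   // example threshold
    {
        poisonClient.SendMessage(msg.MessageText);                  // copy to the poison queue
        queueClient.DeleteMessage(msg.MessageId, msg.PopReceipt);   // remove from the main queue
        continue;
    }
    // ... normal message processing goes here ...
}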

Azure Service Bus is a fully managed enterprise service bus (ESB).  It allows standard queues (point-to-point) or topics, also called pub-sub (point-to-multipoint), for messaging.  For external connectivity it can use Hybrid Connections (WebSockets) over Azure Relay.  Messages can be up to 256 KB, except on Premium where messages up to 1 MB are allowed in queues; in topics, I think, the maximum message size allowed is 100 MB.  Unlimited message lifespan.  Dead-lettering option.  Programmatic access via SAS token or AAD.  Supports access via REST or AMQP (used for many years as the standard for message queues).  Has duplicate message detection (ensures "at-most-once" delivery).
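A minimal sketch of sending a message to a Service Bus topic with the Azure.Messaging.ServiceBus SDK (the connection string, topic name, and message body are placeholders):

using Azure.Messaging.ServiceBus;

await using var client = new ServiceBusClient("<Service Bus connection string>");
ServiceBusSender sender = client.CreateSender("orders-topic");   // placeholder topic name

// Duplicate detection on the entity keys off MessageId, so set it to something business-meaningful
var message = new ServiceBusMessage("Order 42 created") { MessageId = "order-42-created" };
await sender.SendMessageAsync(message);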



The Standard tier has limitations such as a smaller maximum message size, lower throughput, and no networking features (firewall/NSG rules are useful for limiting IPs).
Source: https://www.serverlessnotes.com/docs/service-bus-namespace-standard-or-premium-tier 

AMQP can run over TCP on ports 5671 and 5672, or over HTTPS (WebSockets) on port 443.

Competitor options to Azure Service Bus, including message-exchanging technologies: AWS SQS, GCP Pub/Sub, NATS, Oracle ESB, JBoss Fuse, Mule ESB (from MuleSoft), IBM WebSphere ESB, BizTalk, Azure Event Grid, Azure Storage Queues, Sonic ESB and, I guess, all the message queues like SonicMQ and IBM MQ.

Update 31 Jan 2022:  
Problem:  Messages are pushed onto the topic but are taking between 5 and 35 minutes to process.  The listener was an app registered on Service Fabric, and this started happening after we rebuilt a new instance of Service Fabric.  The Service Bus still worked, but 1 subscriber was taking this extra 5-35 minutes to process messages that had previously been processed within 1 second.
Resolution:  As always, reboot :)  I disabled the topic and re-enabled it, and the messages were being processed within 1 second again.  I could not find this behaviour in Google searches and, after a fair amount of checking messages and configuration, a good old restart fixed the delay in message processing.

More Info:
NATS - Common ESB software gaining popularity



Friday 30 April 2021

Azure Naming Conventions

 My Format (I simplify for smaller companies)

<Company>-<BusinessUnit>-<Region>-<Environment>-<ResourceType>-<Project>-<Instance>

GS-IT-UK-PR-RGP-Treetops-001

GS-HR-US-DV-NSG-Cloud-001

I like to enforce the same length for each part, just because it makes it easier to read in a list, e.g. Region could be the 2-digit country code.  Case consistency is also important.

Environment is my DTAP environment i.e. DV = Development, TS = Test, AS = Acceptance, PR=Production

Resource Type is the Azure resource type e.g. Network Security Group = NSG.  It is worth publishing a list, as App Services could be abbreviated "app" or "aps".

Tip: In Azure you sometimes can't use hyphens or you need to use lowercase.  If I am forced to, I keep the same convention but merely abide by the rules of the service.

The key is to just keep it consistent.  I find organisations use tags poorly, so a good naming convention helps replace the need for tags, or tags can easily be added, as the name already gives the information away.

Microsoft also publishes a recommended Azure naming convention.


Another example format:
<4digitApplicationName>-<2digitEnvironment>-apim-<2digitcounter> e.g. taxp-pr-apim-01 or  dvla-dv-appins-01 
I think it is just key to agree a standard format and stick to it.

I 100% ensure naming is consistently followed for resources; it's also useful to have a few tags, but I'm not dogmatic regarding tags.

Tags Examples
Env: Dev
Data Classification: Confidential
Project: Tax Treaties