Sunday 13 December 2020

ISO 27001 Certification & OWASP

Overview:  I have been through several ISO and security audits over the years for various companies offering SaaS products.  This post outlines some of my notes from the latest ISO 27001 audit I was involved in.

ISO 27001 covers Information Security Management Systems (ISMS), which is about protecting and managing your business's information assets to reduce your business risk.  It demonstrates that your organisation has good security practices in place.

Note: ISO 27001 is a management systems standard for an organisation; it is not awarded for a particular product.  

"An ISO 27001 ISMS consists of policies, procedures and other controls involving people, processes and technology".  https://www.itgovernance.co.uk/iso27001

Parts to an ISO 27001 audit:

  • Part 1 - Checks you have the correct documentation.  
            Output is a go-ahead and a visit plan from the auditor.
  • Part 2 - Checks that you as a business are complying with and working to the documentation.  Basically evidence-based reporting, using visual confirmations and staff interviews to verify compliance (sample-based auditing).  Findings are normally grouped into 3 types:
            1) Opportunity for improvement = suggestions; review before the next audit to decide whether they are worth implementing
            2) Non-conformance - Minor = you can have a few of these; look to fix them
            3) Non-conformance - Major = you won't get certification with a major.  There is a period to address/fix the major issue/issues.  Always complete the Part 2 audit, as the auditor may discover other majors.
            Output is a findings report and, several weeks later, the certification.
  • Certification
  • Yearly: Need to repeat the audit and show you are improving based on the findings; the audit will generally go into specific areas in more detail.
More Info:
Data Protection and Regulation note - see bottom of post for ISO27001

Notes
Business Continuity quarter check
Annual Security Policy & Standard Review 
Security training - different roles need different training
Annual penetration testing
Audit annual re-certification days
Risk Information: Non conformity & root cause analysis

Technical:  Encryption at Rest, Encryption in Transit, DAST/SAST on code, logically secured customer data/security, Azure Defender to harden infrastructure and continuously monitor, vulnerability or external penetration testing, ASVS/OWASP.

ISO 27701 - "ISO 27701 extends the meaning of "information security" detailed in ISO 27001. While the privacy and protection of personal data is part of ISO 27001, the newer standard extends the scope to include the "protection of privacy as potentially affected by the processing of PI"" source: https://www.learningcert.com/information-security/iso-27001-vs-iso-27701/

ISO 27017 - a security standard developed for cloud service providers and users to make a safer cloud-based environment and reduce the risk of security incidents (source: Wikipedia).  I think ISO 27017 is now part of the extended ISO 27001.

ISO 28000 - the specification for security management systems for the supply chain (partner dependencies, e.g. a software vendor or hosting company's service).

ISAE 3402/SOC 2/ISO 27001 - about verifying that the business processes/internal controls of the business are of a high standard.

Tuesday 1 December 2020

Testing your home Internet Speed using your iPhone

Problem:  Broadband is sold with various speed options; the actual speeds you get are usually well below those and depend on your specific circumstances. 

Initial Hypothesis:  iOS has multiple apps to monitor speed to your iPhone.  

Resolution:  Download "Speedtest" from the App Store on any Apple device.   5G performance is fantastic.

Below are my results; I live in South West London (Zone 4)

Sky broadband - SW London

Broadband       Download (Mbps)  Up Speed (Mbps)  Location
Sky phone       34.80            5.72             SW London
EE 4G - LTE     13.00            0.13             SW London
O2 4G - LTE     16.90            10.20            SW London
EE - 5G         372.00           19.80            Newcastle

[Screenshots of the Speedtest results: EE 4G - Mobile, EE 4G - SW London, O2 4G - SW London, 5G - Newcastle]

Thoughts:
Speed tests vary greatly, so it is worth doing at least 3 and taking the average.  70 Mbps download on EE 4G is very possible, so mobile download speeds can be as good as my Sky broadband.  5G performance is fantastic - the MiFi/5G routers are going to be awesome when 5G rolls out to my area.  Comparing O2 and EE at my home, O2 is faster on download, and interestingly the upload speed is amazing using O2 (the O2 tower is far better positioned).
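The advice above (run at least three tests and take the average) is simple arithmetic; a tiny sketch, with made-up sample readings rather than my actual measurements:

```python
# Average several speed-test runs; single readings vary a lot.
def average_speed(samples_mbps):
    """Return the mean of a list of Mbps readings, rounded to 2 dp."""
    return round(sum(samples_mbps) / len(samples_mbps), 2)

runs = [34.8, 31.2, 36.1]  # three illustrative download readings in Mbps
print(average_speed(runs))  # 34.03
```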

Update 15/03/2023
Nice package to check Internet upload and download speed on Mac, Windows and Linux, which I got from Tobias Zimmergren

Install cmd PS>  npm install --location=global fast-cli
Run cmd> fast -u

Sunday 29 November 2020

Azure SQL Basic Options Summary

Overview:  Azure SQL is incredible.  There are a lot of options when choosing how to host a database, and performance is good.  It "handles patching, backups, replication, failure detection, underlying potential hardware, software or network failures, deploying bug fixes, failovers, database upgrades, and other maintenance tasks", from Microsoft Docs and Azure SQL.

Azure SQL is the PaaS database service that performs the same functions SQL Server did for us for many years as the workhorse of many organisations.  Microsoft initially only offered creating VMs and installing SQL Server on them.  Azure SQL is Microsoft's PaaS SQL-database-as-a-service offering on the cloud: a fully managed platform for SQL databases where Microsoft handles patching, managed backups and high availability.  All the features available in the on-prem edition are also built into Azure SQL, with minor exceptions.  I also like that the minimum SLA provided by Azure SQL is 99.99%.

Three SQL Azure PaaS Basic Options:

  1. Single Database - a single isolated database with its own guaranteed CPU, memory and storage.
  2. Elastic Pool - a collection of single isolated databases that share DTUs (CPU, memory & I/O) or virtual cores.
  3. Managed Instance - you manage a set of databases with guaranteed resources.  Similar to IaaS with SQL installed, but Microsoft manages more parts for me.  Can only be purchased using the virtual core model (no DTU option).
Thoughts: Managed Instances are recommended up to 100 TB but can go higher.  Individual databases, whether single or in elastic pools, are limited to a respectable 4 TB.

Two Purchasing Options:

  1. DTU - a single blended metric that Microsoft uses to measure CPU, memory and I/O.  
  2. Virtual Cores - allows you to choose your hardware/infrastructure.  You can optimise for a higher memory-to-CPU ratio than the generalist DTU option.
Thoughts:  I prefer the DTU approach for SaaS and greenfield projects.  I generally only consider virtual cores if I have migrated on-prem SQL onto a Managed Instance, or for big workloads, where virtual cores can work out cheaper if the load is consistent.  There are exceptions, but that is my general rule for choosing the best purchasing option.
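My rule of thumb above can be written down as a tiny helper; the function and its inputs are my own illustration, not anything Azure provides:

```python
def purchasing_model(migrated_from_onprem: bool, large_consistent_load: bool) -> str:
    """Encode the rule of thumb: default to DTU for SaaS/greenfield;
    choose vCore when migrating on-prem SQL to a Managed Instance or
    when a big, consistent workload makes vCore pricing cheaper."""
    if migrated_from_onprem or large_consistent_load:
        return "vCore"
    return "DTU"

print(purchasing_model(False, False))  # greenfield SaaS -> DTU
print(purchasing_model(True, False))   # on-prem migration -> vCore
```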

Three Tiers:

  1. General Purpose/Standard (there is also a lower Basic level)
  2. Business Critical/Premium
  3. Hyperscale

Backups

Point-in-time backups are automatically stored for 7 to 35 days (default is 7 days) and protected using TDE; full, differential and transaction log backups are used for point-in-time recovery.  The backups are stored in RA-GRS blob storage (meaning in the primary region, with read-only copies stored in a secondary Azure region): 3 copies of the data in the active Azure region and 3 read-only copies of the data.

Long Term Retention (LTR) backups can be kept for up to 10 years; these are full backups only.  The smallest granularity is one retained full backup per week.  LTR is available in preview for Managed Instances.

Azure Defender for SQL 

Monitors SQL database servers, providing vulnerability assessments (best-practice recommendations) and Advanced Threat Protection, which monitors traffic for abnormal behavior.

Checklist:

  1. Only valid IPs can directly access the database; deny public access.
  2. Use AAD security credentials and service principals.
  3. Advanced Threat Protection provides real-time monitoring of logs and configuration (it also scans for vulnerabilities).
  4. The default is encryption in transit (TLS 1.2) and encryption at rest (TDE) - don't change this.
  5. Use dynamic data masking inside the db instance for sensitive data, e.g. credit cards.
  6. Turn on SQL auditing.
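Dynamic data masking is applied inside the database engine, but the effect of a typical credit-card mask (only the last four digits exposed) can be illustrated in plain code; this is a sketch of the behaviour, not the Azure feature itself:

```python
def mask_credit_card(number: str) -> str:
    """Mimic a credit-card mask: keep only the last four digits visible."""
    digits = [c for c in number if c.isdigit()]
    visible = "".join(digits[-4:])
    return "XXXX-XXXX-XXXX-" + visible

print(mask_credit_card("4111 1111 1111 1234"))  # XXXX-XXXX-XXXX-1234
```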

Note: Elastic Database Jobs (the equivalent of SQL Agent Jobs).

Azure also offers MySQL, PostgreSQL and MariaDB as hosted PaaS offerings. 

Note: The Azure SQL PaaS service does not support the filestream datatype; use varbinary or references to blobs. 

Monday 26 October 2020

Identity Server - OAuth and OIDC

Overview:  The current version of Identity Server is 4.  Identity Server is basically a .NET Core 3.1 application that acts as an Identity Provider (IdP), similar in role to PingID, SiteMinder or AAD B2C.  Identity Server allows applications (native mobile, web sites and servers) to securely authenticate users.  In this post OAuth means OAuth 2.0.

OAuth2 Grant Types:

  • Authorization with PKCE - the Authorization Code grant type; the default choice for authorization.  Clients: native mobile apps, Windows apps, browser apps.  Grant type: Code.
  • Client Credentials - server-to-server (S2S) communication, also referred to as machine-to-machine (M2M).  Clients: servers, consoles, services.  Grant type: ClientCredentials.
  • Implicit - rather use the Authorization Code flow with PKCE; native apps & SPAs often used the Implicit flow.  Grant type: Implicit.
  • Hybrid
  • Device
  • Resource Owner Pswd - don't use.

Scopes: The authorisation server specifies the "scope" that the user has consented to.  So for an API you can limit the actions the user can perform.  Scopes must be unique strings.  The recommendation is to name your scopes by the API and the verb, e.g. "pauls_api.read" is better than "read".  Scopes are used to give the user access to resources, so "read" alone is not a good idea.  Scopes also have length limits, so don't be crazy verbose in naming.
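The naming advice above can be demonstrated with a minimal scope check; "pauls_api.read" follows the API-plus-verb convention, and the helper itself is my own sketch, not part of any OAuth library:

```python
def has_scope(granted: str, required: str) -> bool:
    """OAuth scopes are space-delimited unique strings;
    check whether the required scope was consented to."""
    return required in granted.split()

granted_scopes = "pauls_api.read pauls_api.write"
print(has_scope(granted_scopes, "pauls_api.read"))    # True
print(has_scope(granted_scopes, "pauls_api.delete"))  # False
```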

Mandatory Endpoints:  OAuth specifies 2 endpoints namely:
  1. /authorization endpoint - gets the access grant and user consent (only the code and implicit flows use this endpoint)
  2. /token endpoint - issues tokens (client credentials only uses the token endpoint; obviously the code & implicit flows use both endpoints)
Optional Endpoint Extensions:
  • /revoke - used to revoke a token
  • /userinfo - used to return profile info for the user, e.g. name, email.  The /userinfo endpoint is used in OIDC implementations of OAuth, which specify the standard scopes: address, phone, email, profile.
  • /.well-known/oauth-authorization-server - useful to discover the actual OAuth implementation.
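For the client credentials flow only the /token endpoint is used; the form-encoded request body can be sketched like this (the client id, secret and scope values are placeholders, not real credentials):

```python
from urllib.parse import urlencode

# Form-encoded body for a client credentials grant (placeholder values).
token_request = {
    "grant_type": "client_credentials",
    "client_id": "my_client_id",
    "client_secret": "my_client_secret",
    "scope": "pauls_api.read",
}
body = urlencode(token_request)  # this string is POSTed to the /token endpoint
print(body)
```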
Tokens
Access Token:
  • A JSON Web Token (JWT), pronounced "JOT", is an access token that contains authorisation and profile data.  The alternative to a JWT is an opaque token, but most implementations use JWTs.  
  • JWTs need to be validated using the signature.  The JWT access token is base64-encoded and made up of three parts separated by period signs, i.e. HEADER.PAYLOAD.SIGNATURE
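The three-part structure can be seen by splitting a token and base64url-decoding its parts.  The token below is hand-built for illustration with a fake signature; a real token's signature must still be cryptographically validated:

```python
import base64
import json

def b64url_decode(part: str) -> bytes:
    # JWT parts are base64url without padding; restore the padding first.
    return base64.urlsafe_b64decode(part + "=" * (-len(part) % 4))

def b64url_encode(data: bytes) -> str:
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

# Hand-built example token: HEADER.PAYLOAD.SIGNATURE
header = b64url_encode(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
payload = b64url_encode(json.dumps({"sub": "paul", "scope": "pauls_api.read"}).encode())
token = header + "." + payload + ".fake-signature"

h, p, s = token.split(".")  # the three dot-separated parts
print(json.loads(b64url_decode(h)))  # {'alg': 'HS256', 'typ': 'JWT'}
print(json.loads(b64url_decode(p)))  # {'sub': 'paul', 'scope': 'pauls_api.read'}
```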


Refresh Token:
  • Refresh tokens are opaque.
  • A single endpoint with a single function: present the refresh token to get a new access token.


Interactive description of the OAuth Code Flow process:


Monday 19 October 2020

APIM High Availability and Performance across Regions

Overview:  APIM can be set up in multiple regions and incoming requests will be routed to the closest APIM endpoint.  If there is only 1 APIM region, it is best to ensure the API/App Service/Function is hosted in the same region.  With multiple APIM instances you can also host the API in each region.  The routing is done either automatically using Azure Front Door or via policy on the APIM.

Front Door can be substituted with Azure WAF, Cloudflare, or Barracuda's SaaS solution.

More Info

WAF Options

Overview:  HTTP traffic from users to web sites and APIs needs WAF protection.  Both Azure and AWS have good services to protect your APIs and applications.  There is also the option to use a dedicated WAF service.  When protecting large organizations with hybrid cloud providers, options like Barracuda, Imperva/Incapsula, F5 and Cloudflare are good enterprise-level choices.  Fundamentally, a WAF sits as an intermediary between the user and the resource they are requesting over HTTP.  I like to set my highest-priority rule to DENY all HTTP & HTTPS traffic, then specifically open the rules that I want to flow through; a lot of people do it the other way around in smaller implementations.

WAF Options:
  • Azure WAF is simple for a WAF in 1 region, especially with APIM; if you are an Azure customer, simply go for an Azure Application Gateway with WAF enabled.  DDoS protection is a separate service that can be integrated before Azure WAF or Azure Firewall.  There is a cheaper per-IP SKU option for specific IP addresses.
  • Azure Front Door WAF is pretty amazing; Cloudflare is historically the leader with similar functionality.  On Microsoft Azure the main two options for WAF are Front Door WAF (best, most expensive) and Azure Application Gateway WAF.
  • Competitor options: Barracuda WAF SaaS service, or any software firewall: KEMP, F5, Check Point, Fortinet/FortiGate, Cloudflare WAF, Akamai, AWS WAF, AWS Network Firewall, Cloud Armor (GCP's WAF, I believe), ....  
  • Check the WAF service has protection at least for DDoS, XSS and SQL injection attacks, SSL termination if you need it, and managed rulesets.
  • AWS WAF is for web traffic (layer 7); there is a separate AWS Shield service that is used for DDoS attacks.  AWS WAF can be applied at an Application Load Balancer, Amazon's API Gateway, and Amazon CloudFront.  With AWS WAF you also get Shield Standard free: it is always included by default and provides monitoring and DDoS protection, while Shield Advanced adds further features.
  • Barracuda WAF is a SaaS service that has worked fairly well for me.  It has a fair number of options and rules, and add-ons like anti-virus scanning.
  • Imperva WAF (previously called Incapsula WAF) provides a SaaS WAF service including smart DDoS (blocks dodgy traffic and passes through good requests), API security, SQL injection and XSS protection.  Multiple data centers around the world.
  • Cloudflare is a Secure Access Service Edge (SASE) provider.  Cloudflare provides a WAF service at hundreds of endpoints around the globe (for instance there are 5 Cloudflare endpoints in Australia).  WAF functionality like SSL, DDoS (L7), custom rules e.g. rate limiting, OWASP rules, "API protection", et al. is applied close to the user request (nice low latency) and, if successful, the request is pushed to the backend.
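Custom WAF rules like the rate limiting mentioned above boil down to counting requests per client per time window.  A toy fixed-window limiter illustrates the idea; it is my own sketch, not any vendor's implementation:

```python
from collections import defaultdict

class FixedWindowLimiter:
    """Allow at most `limit` requests per client per time window."""
    def __init__(self, limit: int):
        self.limit = limit
        self.counts = defaultdict(int)  # (client, window) -> request count

    def allow(self, client: str, window: int) -> bool:
        key = (client, window)
        self.counts[key] += 1
        return self.counts[key] <= self.limit

limiter = FixedWindowLimiter(limit=3)
# Four requests from one client in the same window; the fourth is blocked.
results = [limiter.allow("203.0.113.9", window=0) for _ in range(4)]
print(results)  # [True, True, True, False]
```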

 

Last Updated: 2022-03-15

Friday 9 October 2020

App Insights - Website and API Monitoring

Overview:  App Insights has functionality to run scheduled web requests and log the output in App Insights.  There are multiple advantages to this, including end-to-end active monitoring of web sites and APIs, and keeping the application warm.

Below I show a simple request to my blog (a public website) and the results; Azure refers to this test type as a URL ping test, which is basically a URL HTTP GET request.  


Wait a few minutes and Refresh to see the results:

This is a very easy way to include a constant check that your API or website is running.  There is also the option to create a "Multi-step web test" using Visual Studio.  You can record the authentication and assert on known response content to build advanced constant monitoring.

Tip: The URL does need to be publicly available.
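A URL ping test is essentially an HTTP GET plus a pass/fail rule; the rule part can be sketched as below.  The success criterion and timeout are illustrative assumptions, not the exact App Insights defaults:

```python
def ping_passed(status_code: int, elapsed_ms: float, timeout_ms: float = 30000) -> bool:
    """A ping test passes when the GET returns a 2xx status within the timeout."""
    return 200 <= status_code < 300 and elapsed_ms <= timeout_ms

print(ping_passed(200, 120.0))    # healthy response -> True
print(ping_passed(503, 120.0))    # server error -> False
print(ping_passed(200, 45000.0))  # too slow -> False
```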

The content I used to test out the functionality comes from the Microsoft Docs site.
Also see Live Metric Stream that is part of App Insights.

Monitoring using Azure Monitor Dashboards:
  • The image above shows a dashboard that can be used to monitor a SaaS application's PaaS infrastructure.
  • It's a good idea to create multiple dashboards; they can show the overview and allow the user to drill into specific areas.
  • Internal boards watching key APIs and HTTP uptime ping-type requests are also a good idea.

Updated: 11 Jan 2023
- Checkly looks like a great monitoring service that works with Playwright for continuous monitoring.  I like to keep all my SaaS on Azure, but as Checkly is doing the monitoring, its independence is a plus.

More Info: 
App Insights MultiStep Tests
Replacement Option for MultiStep Tests based on Azure Functions