
Sunday 5 June 2022

NIST/RMF - Cyber risk control

Overview: NIST (National Institute of Standards and Technology) provides the Risk Management Framework (RMF), a framework for reducing security risk to systems and data.


Goals of NIST RMF:
  • Consistent and cost-effective set of security controls
  • Repeatable assessment approach
  • Technology neutral
  • Implement an efficient risk-based security and privacy program.

Notes:

  • Each of the six RMF steps (Categorize, Select, Implement, Assess, Authorize, Monitor) has Special Publications (SP) that are applicable to that area.
  • The core document for RMF is NIST SP 800-37 Revision 2. 
  • Used to identify security/privacy risks at both the organisational and system level 

Sunday 22 May 2022

OWASP top 10 - 2021

 Always verify your systems have been reviewed against the OWASP Top 10 vulnerabilities (2021)

  1. Broken Access Control (ensure users only see what they should have access to).  Permissions need to be correctly specified and enforced, i.e. don't allow customers to have Admin account privileges or access to another customer's details.  Grant enough rights/privileges so people and clients can do what they need to, but do not over-allow privileges.  Prevention techniques must include least privilege (no access by default, grant only the minimum needed permissions) and auditing of access and of access-control changes (think tomb tables/temporal changes).  Lastly, QA must verify permissions work to the actual policy/specification; too often it is just assumed they work as the QA thinks,
  2. Cryptographic failure (data in transit and at rest must be securely encrypted, and more sensitive data needs stronger protection.  By default everything should be encrypted, but add extra encryption/controls to sensitive information like passwords, and for payment processing use a specialist provider like Stripe.  Encryption can be symmetric or asymmetric.  Avoid algorithms that are considered weak: replace Triple DES (3DES) with AES, and replace weak hashing algorithms such as SHA1 and RIPEMD160 with a SHA-2 algorithm.  Manage your keys properly, e.g. in Azure Key Vault, and rotate the keys regularly.  Salt and hash sensitive data such as passwords; see the small sketch after this list), 
  3. Injection (data input results in unintended processing; examples are SQL injection and Cross-site scripting (XSS).  Parameterized queries, as shown in the sketch after this list, are the standard prevention),
  4. Insecure Design (a new addition in 2021: don't include sensitive information in error messages, ensure the architecture can scale, avoid passwords stored in plain text or that cannot be rotated, deal with DDoS, and avoid designs where one part of an app can bring the whole solution down),
  5. Security misconfiguration (a default setup that is not very secure, e.g. the default password is not changed on installed software; ensure security hardening happens, i.e. no short passwords, change default passwords, keep software updated, ...)
  6. Vulnerable and outdated components (we use components in code, and if they have security weaknesses it's likely you have them too; Log4j was an example.  You need to know all your supply-chain dependencies so you can ensure they are not vulnerable.  Dependency/SAST scanning tools help identify vulnerabilities.  Know your component dependencies!), 
  7. Identification & Authentication failures (verify you are who you say you are: check claims, use OAuth/OIDC.  An example is session hijacking or stealing someone else's identity, e.g. on a public PC the next user logs in as the previous user),
  8. Software and Data integrity failures (insecure CI/CD pipelines; closely linked to item 6.  For example, automatically updating a NuGet package to the latest version could pull in a security flaw),
  9. Security Logging and Monitoring failures (check logs to detect security issues; incidents will happen, and you can often see people scanning for vulnerabilities),
  10. Server-side Request Forgery (SSRF) (a server is asked to fetch endpoint info on a caller's behalf; use allow-lists, e.g. only allow the specific IP of the partner, and ensure only relevant info is returned).
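
A minimal Python sketch of the preventions mentioned in items 2 and 3, assuming a local SQLite database and an illustrative customers table; the same ideas apply to any database driver that supports bound parameters.

import hashlib
import os
import sqlite3

def hash_password(password: str) -> tuple[bytes, bytes]:
    """Item 2: store a salted hash, never the plain-text password."""
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return salt, digest

def get_customer(conn: sqlite3.Connection, customer_id: str):
    """Item 3: bound parameters stop the input from being executed as SQL."""
    # Never build the statement with string concatenation or f-strings.
    cur = conn.execute("SELECT id, name FROM customers WHERE id = ?", (customer_id,))
    return cur.fetchone()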

Sunday 6 February 2022

Azure DevOps Series - Integrating Security into Azure DevOps

Overview: Integrate automated security testing into your CI/CD Azure Pipelines; this area of expertise is sometimes referred to as DevSecOps.  Azure DevOps provides build and automation servers.  In the OWASP Top 10 2021, number 8 is Software and Data Integrity Failures, which covers securing CI/CD pipelines.

CI/CD Pipeline hardening - Code is written and committed to a source code repository, then linted (e.g. SonarQube), built, tested, and deployed.  Pipelines can also include infrastructure and networking setup.  YAML and JSON are common formats for defining pipelines.  All of these steps need to be hardened.  

Ensure only authorized, intended actors can run/use the pipeline or parts within the pipeline.  E.g. ensure only developers with the right permissions can check in code.  

  • Harden, but be pragmatic so developers can do their work; just don't over-allow access.
  • Ensure logging is running.  
  • Keep plugins and referenced frameworks up to date to avoid weaknesses being exploited.  Ensure the OS and containers are up to date.
  • Use dedicated build/service accounts.  
  • Using Azure DevOps (ADO) does a lot of hardening automatically.  
  • Don't expose sensitive information such as passwords in your logs, as if the logs are leaked you have a problem (see the small redaction sketch after this list).
  • Ensure artifacts are correctly locked down.  To get artifacts the pipeline only needs read access.
  • Verify the SaaS services you use are secure.  Integrate external security SaaS software.
  • For security there are native tools, plugins, or external services as mentioned above.  Mend (formerly WhiteSource) Bolt is a tool for scanning packages for vulnerabilities, and Azure DevOps has Mend Bolt as an add-in.  There is a free tier but it is fairly limited.  These scans can also run at developer level and not just in the pipeline.
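
On the bullet above about not exposing passwords in logs, here is a minimal Python sketch of a logging filter that masks secret-looking values before they are written out; the patterns are illustrative assumptions, not an exhaustive list.

import logging
import re

SECRET_PATTERN = re.compile(r"(password|pwd|secret|token|key)\s*[=:]\s*\S+", re.IGNORECASE)

class RedactSecretsFilter(logging.Filter):
    """Masks anything that looks like 'password=...' before it reaches the log output."""
    def filter(self, record: logging.LogRecord) -> bool:
        record.msg = SECRET_PATTERN.sub(r"\1=***", str(record.msg))
        return True  # keep the record, just with secrets masked

logging.basicConfig(level=logging.INFO)
logger = logging.getLogger("pipeline")
logger.addFilter(RedactSecretsFilter())
logger.info("connecting with password=SuperSecret123")  # written out as password=***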

Azure DevOps Series Posts:

  1. Azure DevOps Series - Overview 
  2. Azure DevOps Series - Azure Boards 
  3. Missing
  4. Azure DevOps Series - Azure Pipelines 
  5. Azure DevOps Series - DevSecOps (This Post)
Note: AWS CodePipeline is the AWS equivalent of Azure DevOps Pipelines.  

Wednesday 7 July 2021

Microsoft Dataverse (CDS) - Overview

Overview:  Dataverse is CDS; there is a long story on the naming, but ultimately Dataverse is a data store with an advanced security model, Open APIs, workflows, pipeline injection...  It is awesome.

It is high performance, and it would take considerable effort and components to deliver similar, or even semi-close, functionality yourself.  It does have limitations, mainly around performance at massive scale: Dataverse is fast and powerful, but for massive industrialized storage it's not the right option.  The costs are also a key consideration.

The biggest mistake I see is people making the same mistakes as they do with relational databases, namely: 

Poor Dataverse implementation comes down to: 1) poor entity relationship design, 2) either too many tables containing duplicate data, or too few tables being expanded to suit a dev team's capability while ignoring existing systems, 3) poor security, 4) too many cooks.

Basically, like any database service, you need to have owners and try to keep the structure logical and expand it appropriately.  The idea behind the data model used by Dataverse is to have centralized, secure, shareable data such as customer or account information.  It's simple: treat Dataverse as you would your most precious core database, with an owner who needs to understand and approve changes.

Note:  Microsoft has had some trouble naming Dataverse; it was previously known as the Common Data Service (CDS).


Overview: Dataverse helps improve processes.  It helps reduce the time to build IT capability, remove shadow IT, and improve security and governance.  It is the common data store we need to use to be effective.  As part of the Power Platform, it allows us to build custom software fairly quickly.

Updated 07-July-2022

Dataverse provides relational data storage (it actually runs on Azure SQL (Azure Elastic Pools), Cosmos DB, and Blob storage) plus lots of tools, e.g. modelling tools.  I think of it as a SQL database with lots of extra features, most importantly business rules and workflow.

  • Dataverse relies on AAD for security
  • Easy data modelling, and it supports many-to-many relationships NB!
  • Easy to import data using Power Query compatible data sources
  • Role-based security with row (previously called record) and column (previously called field) level security.  See Dataverse security in a nutshell at the bottom of this post.
  • Provides a secure REST API over the Common Data Model; it's awesome (see the sketch after this list)
  • Easy to generate a UI using a Power Apps model-driven app
  • Ability to inject business rules when data comes into or out of Dataverse (can also use .NET Core code)
  • Can also store files (ultimately in blob storage)
  • Search that indexes entities and files
  • CDS used the term entities; Dataverse calls them tables.  Some of the UI still refers to entities, so just assume entity and table are interchangeable terms.
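
To illustrate the REST API bullet above, here is a minimal Python sketch that reads rows from the Dataverse Web API.  The organisation URL, the table, and the way the AAD bearer token is obtained are assumptions for illustration; in practice the token comes from an AAD app registration (e.g. via MSAL).

import requests

ORG_URL = "https://yourorg.crm.dynamics.com"        # assumed environment URL
TOKEN = "<AAD bearer token for the Dataverse API>"  # obtain via MSAL/app registration

headers = {
    "Authorization": f"Bearer {TOKEN}",
    "OData-MaxVersion": "4.0",
    "OData-Version": "4.0",
    "Accept": "application/json",
}

# Read the first three account rows (the 'accounts' entity set from the Common Data Model).
resp = requests.get(
    f"{ORG_URL}/api/data/v9.2/accounts?$select=name&$top=3",
    headers=headers,
    timeout=30,
)
resp.raise_for_status()
for row in resp.json().get("value", []):
    print(row["name"])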
Dataverse basically gives you a PaaS data hosting service that mimics what we have done for many years with databases and Open APIs, has advanced features and tooling, and it is all securely managed for you.  

The main con is that it is expensive.  You need to know your size and keep buying add-ons to the plans; scaling Dataverse is expensive.

Common Data Model: A collection of tables (previously called entities, and most CRM people still call them entities), e.g. account, email, customer, for a business to reuse.  It comes from CRM originally; the starting point consists of hundreds of pre-created entities.  A common standard for holding data.

Each Power Platform environment has a single Dataverse associated with it.  It's a good idea to have more than one environment, but at its simplest, use a trial to learn and progress to production.

Once I have a new environment, I can use Power Apps to access my environment's Dataverse and model out a new table to store info; in this example I am storing people's tax returns.
Go into the Dataverse and model directly

Model the table in your Dataverse instance

Dataverse Security in a Nutshell:
  1. A user is linked from AAD to the User entity in Dataverse.
  2. The User entity record is aligned to the AAD user.
  3. AAD users can be part of AAD security groups.
  4. Dataverse Teams (Dataverse Group Teams) can have users and/or security groups assigned.
  5. Dataverse Group Teams are aligned to Business Units.
  6. Business Units have roles (rights).
"Security is additive" in Dataverse (and generally across the Microsoft and security world these days), i.e. there are no remove/deny actions.  If you have permission via any of the groups, you can access the data/behaviour (see the small sketch after the definitions below).

Business Units are used to restrict access to data.  They can be hierarchical, i.e. Enterprise > Audit > EMEA > UK (don't use it like this, keep it simple)
Security Roles define a user's permissions across the Dataverse entities, i.e. a role can be read-only on the Accounts entity 
Teams consist of users and security groups, and get assigned roles.  There are two types of Teams in Dataverse: Owner teams & Access Teams
Field-level security only allows specified users to see the field data
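
A tiny sketch of the "security is additive" point above: a user's effective privileges are the union of the privileges granted by every role they hold, whether assigned directly or via teams/business units.  The role and privilege names are made up for illustration.

# Toy model: effective privileges are the union of all assigned roles (there is no "deny").
ROLE_PRIVILEGES = {
    "Account Reader": {"account:read"},
    "Account Writer": {"account:read", "account:write"},
}

def effective_privileges(user_roles: list[str]) -> set[str]:
    granted: set[str] = set()
    for role in user_roles:
        granted |= ROLE_PRIVILEGES.get(role, set())
    return granted

# A user granted "Account Reader" via a team and "Account Writer" via a business unit role:
print(effective_privileges(["Account Reader", "Account Writer"]))
# -> {'account:read', 'account:write'}  (permissions only ever add up)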

https://learn.microsoft.com/en-us/power-platform/admin/wp-security-cds (Good clear post on Dataverse security; core concepts are Business Units, Teams, Roles, Users & OAuth/AAD)

Sunday 13 December 2020

ISO 27001 Certification & OWASP

Overview:  I have been through several ISO and security audits over the years for various companies offering SaaS products.  This post outlines some of my notes from the latest ISO 27001 audit I was involved in.

ISO 27001 covers Information Security Management Systems (ISMS), which is about protecting and managing your business's information assets to reduce your business risks.  It demonstrates that your organisation has good security practices in place.

Note: ISO 27001 is a management of systems standard for an organisation, it is not done for a particular product.  

"An ISO 27001 ISMS consists of policies, procedures and other controls involving people, processes and technology."  https://www.itgovernance.co.uk/iso27001

Parts to an ISO 27001 audit:

  • Part 1 - Checks you have the correct documentation.  
            Output: a go-ahead and a visit plan from the auditor.
  • Part 2 - Checks you as a business are complying with/working to the documentation.  Basically evidence-based reporting based on visual confirmations and interviews with staff to verify compliance (sample-based auditing).  Findings are normally grouped into three types: 1) Opportunity for improvement = suggestions; review before the next audit to see if they are worth implementing. 2) Non-conformance - Minor = you can have a few of these; look to fix them. 3) Non-conformance - Major = you won't get certification with a major.  There is a period to address/fix major issues.  Always complete the phase 2 audit, as the auditor may discover other majors.
            Output: a findings report and, several weeks later, the certification.
  • Certification
  • Yearly: You need to repeat the audit and show you are improving based on the findings, and the audit will generally go into specific areas in more detail.
More Info:
Data Protection and Regulation note - see bottom of post for ISO27001

Notes
Business Continuity quarterly check
Annual Security Policy & Standard Review 
Security training - different roles need different training
Annual penetration testing
Audit annual re-certification days
Risk Information: Non conformity & root cause analysis

Technical:  Encryption at rest, encryption in transit, DAST/SAST on code, logically secure customer data/security, Azure Defender to harden infra and continuously monitor, vulnerability or external penetration testing, ASVS/OWASP.

ISO 27701 - "ISO 27701 extends the meaning of “information security” detailed in ISO 27001. While the privacy and protection of personal data is part of ISO 27001, the newer standard extends the scope to include the “protection of privacy as potentially affected by the processing of PI" source: https://www.learningcert.com/information-security/iso-27001-vs-iso-27701/

ISO 27017 - a security standard developed for cloud service providers and users to make a safer cloud-based environment and reduce security risk (source: Wikipedia).  I think ISO 27017 is now part of extended ISO 27001.

ISO 28000 - the spec for security management systems for the supply chain (partner dependencies, e.g. software vendor, hosting company service)

ISAE 3402/SOC 2/ISO 27001 - about verification that the business processes/internal controls of the business are of a high standard.

Sunday 8 March 2020

Handling Security Incidents

Security Incident: An incident that has potentially compromised a company's systems or data.

Goal:  Focus on restoring the confidentiality of systems/data and preventing further attack.  Contain the incident and eradicate the issue.  Meet the full-resolution target timeline for incidents; these incidents can take up to 100 days, depending on complexity.  

Examples:  Virus, Trojan Horse, Stolen data, increased unauthorized permissions, compromised server, copying data, DoS, unauthorized system access, ....

Record each event and work through the life-cycle (ISO 27035).  This can be done with dedicated software or modules such as ServiceNow's Security Incident Response (SIR); a minimal record sketch follows the steps below.

  1. Plan & Prepare
  2. Detection
  3. Assessment and Decision - Get logs, review/analyse, document the findings, and notify leadership teams.  Assign impact/priority, e.g. Critical vs Low business impact.
  4. Response - Limit the damage, decide on the approach, notify if needed, and remediate.
  5. Lessons Learnt - Ensure the threat is removed; lessons learnt can help reduce the attack surface for similar issues.
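
A minimal sketch of what recording each incident against the life-cycle above could look like; the fields and stage names simply mirror the steps listed and are not tied to any particular tool.

from dataclasses import dataclass, field
from datetime import datetime

STAGES = ["Plan & Prepare", "Detection", "Assessment and Decision", "Response", "Lessons Learnt"]

@dataclass
class SecurityIncident:
    title: str
    priority: str                      # e.g. "Critical" or "Low business impact"
    stage: str = STAGES[1]             # a new incident record usually starts at Detection
    history: list[str] = field(default_factory=list)

    def advance(self, note: str) -> None:
        """Move to the next life-cycle stage and keep an auditable note."""
        self.history.append(f"{datetime.utcnow().isoformat()} [{self.stage}] {note}")
        current = STAGES.index(self.stage)
        if current < len(STAGES) - 1:
            self.stage = STAGES[current + 1]

incident = SecurityIncident(title="Compromised server", priority="Critical")
incident.advance("Logs collected and reviewed; leadership notified.")
print(incident.stage)  # -> Assessment and Decision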

https://en.wikipedia.org/wiki/Computer_security_incident_management

Note: Be careful not to delete forensic evidence.

Tip: Organisations must have a Security Incident Plan.  Planning, being ready, and knowing what to do in advance improves the handling of security incidents.


Sunday 29 September 2019

OAuth for Custom Connectors

Problem:  I have an APIM end point that has both the subscription key and OAuth2 setup for security and I need to connect Power Apps.

There are basically two parts to this problem, setting up your Power App to use OAuth and then also passing in the subscription key on every APIM end point request.
  • This post assumes that both the APIM App registration and the Client App have been registered on Azure AD.
  • Check that Postman can access the endpoint as shown below.  Postman will authenticate and use the bearer token (OAuth) and the subscription key as shown below to test (a Python equivalent is sketched below).
Client ID: 5559555-555a-5558-5554-555d4
Auth URL: https://login.microsoftonline.com/555fd555-555f-5554-5551-555b444c3555/oauth2/v2.0/authorize
Token URL: https://login.microsoftonline.com/555fd555-555f-5554-5551-555b444c3555/oauth2/v2.0/token
Refresh URL: https://login.microsoftonline.com/555fd555-555f-5554-5551-555b444c3555/oauth2/v2.0/authorize
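
The Postman test above can also be reproduced with a short Python script.  The sketch below uses the client-credentials flow purely as a stand-in for the interactive flow Postman uses (it assumes a client secret and an exposed API scope exist for the APIM app registration), and it assumes the default APIM subscription header name Ocp-Apim-Subscription-Key.

import requests

TENANT = "555fd555-555f-5554-5551-555b444c3555"
CLIENT_ID = "5559555-555a-5558-5554-555d4"
CLIENT_SECRET = "<client secret from the app registration>"  # assumption for this sketch
SCOPE = "api://<apim-app-id>/.default"                        # assumption: exposed API scope
SUBSCRIPTION_KEY = "<APIM subscription key>"

# 1. Get a bearer token from AAD (Postman does the same against the Token URL above).
token_resp = requests.post(
    f"https://login.microsoftonline.com/{TENANT}/oauth2/v2.0/token",
    data={
        "grant_type": "client_credentials",
        "client_id": CLIENT_ID,
        "client_secret": CLIENT_SECRET,
        "scope": SCOPE,
    },
    timeout=30,
)
token = token_resp.json()["access_token"]

# 2. Call the APIM endpoint with both the bearer token and the subscription key.
api_resp = requests.get(
    "https://<your-apim>.azure-api.net/<api>/<operation>",  # assumed endpoint
    headers={
        "Authorization": f"Bearer {token}",
        "Ocp-Apim-Subscription-Key": SUBSCRIPTION_KEY,
    },
    timeout=30,
)
print(api_resp.status_code)  # expect 200 when both OAuth and the subscription key are accepted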


We now know the APIM is working with OAuth and the subscription key.
Configure OAuth from Power Apps:
  • Once Postman can get valid 200 responses using OAuth2 and the subscription key, set up the custom connector.  I could not get the Azure AD connector to work, so I used the OAuth connector as shown below: 

Add a policy that will add the subscription key to every https request to the APIM/resources as shown below.

Updated 2 Feb 2020 - Example of custom OAuth code flow authentication for custom APIs using AAD security.

Note: Deploying Custom Connectors in solutions always seems to be an issue.  It is a good idea to keep all connection references in a separate Power Apps Solution.

Sunday 7 April 2019

Azure Active Directory, B2C and Rights

Azure Identity Management is a fairly large body of knowledge.  Dividing it into different areas makes it easier to understand.

RBAC in Azure:
Azure AD and B2C both offer a way to authenticate a user through the user providing an identity.
The user is assigned to one or more groups, and the groups (or individual users) are assigned to roles.  The diagram below shows internal and external users and how permissions can be given out, resulting in Role Based Access Control (RBAC).  The application itself deals with the operations a user can perform, but having the user's roles/claims allows the individual applications to figure out what actions the user can perform; a minimal sketch follows.
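
A minimal sketch of that last point: once the identity platform has validated the token, the application only needs to read the roles/claims it contains to decide what the user may do.  The claim and role names here are illustrative assumptions.

# Claims as they might appear after the token has already been validated by middleware.
claims = {
    "name": "externaluser@partner.com",
    "roles": ["Invoice.Read"],  # app roles granted via AAD groups or direct user assignment
}

def can(user_claims: dict, required_role: str) -> bool:
    """RBAC check: the app defines the operation, the token supplies the roles."""
    return required_role in user_claims.get("roles", [])

if can(claims, "Invoice.Read"):
    print("show invoices")
if not can(claims, "Invoice.Approve"):
    print("hide the approve button")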

RBAC can be assigned at one of four levels to manage Azure resources: management group, subscription, resource group, or individual resource.

Tip: For small Azure tenants, managing resources at the resource level works well, but in most enterprises you should manage at the resource group or even subscription level to keep management controllable.
Note: There is the concept of a "Directory"; multiple subscriptions can be set up against a directory.  I believe all companies should have a single directory, but it is common to find even relatively small businesses with multiple directories. 
"Multiple subscriptions can trust the same Azure AD directory. Each subscription can only trust a single directory." Microsoft Docs

Saturday 30 March 2019

Azure Security Checklist

Overview:  Constantly improving security on Azure and Office 365 is essential for a lot of companies.  Microsoft provides outstanding infrastructure and monitoring, but it is also the company's responsibility to configure and secure O365 and Azure, ensuring security while opening up services appropriately so the business can operate effectively.  This post outlines some basic items to look at to optimise the balance between security and meeting your business needs.

I generally go through the infrastructure and write up a report for management in the form of:  Finding, Recommendation and Management Comment.

Finding:  Company users log in using their Azure AD accounts, and the credentials page has no customised branding to help users know they are logging into the company's secure resources (SharePoint, email etc.).
Recommendation:  In the Azure Portal, use the "Azure Active Directory" service > "Company Branding" to upload the company logo in banner and square formats and update the colour/theme to match the firm's branding.
Management Comment: We accept the finding and wish to make the changes immediately.

Microsoft provides tooling to help identify improvements; below are two tools you can use to help clarify the current environment so improvements can be recommended.

Network Security Groups (NSGs)
NSGs are basically firewalls on Azure.  Fantastic and simple, although they can get really complex with multiple policies.  Azure gives you great tooling for Azure networks called "Network Watcher".

Tuesday 26 February 2019

Microsoft Information Protection Update

As of February 2019, Microsoft is still using Microsoft Information Protection (MIP) and Azure Information Protection (AIP) interchangeably, as this video from Ignite October 2018 highlights.  Today I went to the Ignite tour, and AIP and MIP were being used to mean the same thing, which I refer to as AIP in this post.

MIP is a framework that includes AIP, the AIP scanner (file shares and SharePoint on-prem.), DLP (cloud), RMS, Azure Advanced Threat Protection, MCAS (cloud), and Windows Information Protection (which integrates with and understands AIP labels), with the "Security and Compliance Centre" (SCC) as the central portal for monitoring.

The screenshot from the Ignite London presentation shows where AIP is today, as presented by Maayan Nasman Rand.  The presentation was a good overview of AIP.  The big improvement to AIP over the past three months is the analytics/monitoring; this was not working before and is now very good, but still in preview.


  • AIP is getting closer, but I feel the big missing piece is that the encryption used by AIP does not allow SPO to provide previews and, more importantly, search cannot index the encrypted data in SPO.  Despite this key missing piece, I'd use it on O365 without encryption if my content is stored in SharePoint.   
  • The native applications auto labelling is improving quickly.
  • The Auto-labeling feature is new and useful.
  • A few months ago, AIP labels were merged into the Security & Compliance Centre.  Worth noting: if you had labels in AIP admin, you need to migrate the labels using the "Unified labelling" option, and the policies need to be manually brought into the Security & Compliance Centre.
  • Auto-labeling is now in the Mac Office suite and also it is coming to the Office apps in Droid and iOS (preview).
  • AIP is an add on, new Office and Office for Mac and Android have the AIP plug-in already installed.  Applies to all office products including Outlook, Word, Excel, PPTX.
  • The UI ribbon for AIP in Office on Windows has also been updated to a new look.
  • Microsoft Cloud App Security (MCAS) has scanners to perform labelling (like the AIP scanner) but also works on G Suite and Box; others are coming
  • AIP Scanner works on file shares (CIFS) and SP2013 and SP2016 on-prem.
  • The 3rd-party product Adobe Pro does not yet have the ability to update labels, but it's coming soon (June 2020?).  Adobe uses the same SDK that is available to all developers.  
  • The monitoring/reporting is actually working; a year back it was flaky, and the UI and find-ability are much improved.
  • A couple of preview screens shown today:



Previous AIP Posts:
AIP - Protect your companies documents (Catching up to Symantec's product quickly)
SharePoint Saturday AIP Notes

Saturday 8 September 2018

SharePoint Saturday 2018 - Cambridge

Here is my slide deck from SharePoint Saturday Cambridge 2018  Introduction to Azure Information Protection (10 MB includes recordings)

Sessions I attended:
1. PowerApps Jump Start by Sandy Ussia
I got some useful pointers in this session, Sandy presents well and focused on business/citizen developers. 
2. Office 365 Security and Compliance with Albert Hoitingh and Daniel Laskewitz
This was two sessions and amazing.  Hands-on how it works and what I need to know.  Absolutely brilliant double session.  These guys really know AIP, DLP and O365 security.  Great info in a small focused setting.
3. Managing Content in O365 with Erica Toelle
I did not know Erica, I do now!  And wow she is good, she covered O365 security center, Cloud App Security (new service looks for security on O365 tenants) and AIP.  Great knowledge, humble and so easy to talk to.
4. My presentation on AIP, I cover a few points from Erica's session, as most of the audience were in both our sessions, I skipped over the info Erica already provided.
5. Containers with Anthony Nocentino
Amazing presenter - very engaging and I learnt a lot about containers.

A great conference, well organised, and the session content was outstanding.  The speaker's dinner in Sidney Sussex College was quite an experience.  Thanks to the organisers: Paul Hunt, Mark Broadbent, & Andy Dawson

Sunday 30 July 2017

Tech Megatrends in 2017 - The bigger Picture

Problem: What are the major technology changes that are going to shape the enterprise over the next 5-10 years? 

Initial Hypothesis: Working within technology and making businesses more competitive, I see the speed at which technology changes.  Working with multiple large customers, I know that people are generally looking at a handful of strategic technology trends, while technology trends are actually all connected and form a hierarchy.  Basically, I am classifying the significant tech trends and not drilling down to lower levels such as mobile application patterns or particular technologies such as O365.

While technologies are not so kind as to make themselves easy to segregate, I feel the big trends are:

Analytics - Storage, mining, analysing data, and reporting.  Basically, this is an old industry with new trends such as larger data sets, new data sets (e.g. social media), and additional reporting formats/media; AI is really trend recognition on steroids.  The trend has been to make analytics available closer to real time, and now it's moving into the predictive space.  Robotics is closely related, as it works with the same data. 

Security - We have so much more data and so many more devices; the old world of protecting your own assets and monitoring outside people in a regulated fashion is no longer the main way of securing things.  The key is understanding that identity is king.  Be this an individual or a computer, we need to be able to know the person or system is who they say they are (non-repudiation).  Blockchain - while I think it's important, and in some industries absolutely critical, this to me falls under security: we need to trust and share between machines and transactions.  Once again an old industry with a shift in focus, in that identity and collaboration are now central to security rather than a centralized, castle-like security model.

Cloud Computing - Low-cost computing paid for on demand is simply a continuing trend for businesses, only now we are good at virtualization and cloud computing and the big three (AWS, Azure & Google) are getting better quickly.  Cloud computing ties to Analytics and security. Virtualization has progressed and we are getting better at providing computing safely and at a lower cost.

Sub Trends:

Robotics/Automation - Automation performs manual steps on behalf of a human, and historically this has been mostly around manufacturing.  Going forward I see it taking over rudimentary information-worker roles.  Ingesting legislative data is a great example: coupled with machine learning, we can build up models to identify which laws are applicable in different jurisdictions.  Medical diagnosis is often referred to as AI, but it is not intelligence; it's pattern recognition and analytics determining a likely problem from data that has been picked up and cleaned.  The medical example will be essential to us thriving on this planet: doctors will only get complex issues, and the rudimentary stuff will disappear and be dealt with better, meaning fewer returning patients.

IOT - Not exactly new; we just have to do it better and on a larger scale.  So while the technology is changing, systems communication is not.  I see drones more as a subset of IOT: hardware devices using analytics, pretty similar to fully automated self-driving cars.

Augmented Reality (AR)/Virtual Reality - Will be big, but like blockchain will have unique applications in different industries.  It relies on analytics and security as its underpinning.

Here is some research I gleaned from five key firms on current technology mega-trends (this is how I see these firms' views on the tech mega-trends):

PWC
===
Artificial Intelligence (AI)
Augmented Reality (AR)
Blockchain
Drones (UAV)
IOT
Robotics
Virtual Reality
3D printing

EY
===
Artificial Intelligence - Cognitive learning
Robotic Automation (RPA)
Blockchain
Analytics
Internet of Things (IOT)
Cybersecurity

Deloitte
=======
Dark Analytics
Machine Intelligence
Mixed Reality
Inevitable architecture (cloud computing)
Everything as a service (cloud computing)
Blockchain

KPMG
====
Big Data & analytics/AI
Cloud Computing
Cryptocurrency/Digital Payments
IOT
Robotics
CyberSecurity
Virtual and augmented reality - stock and glasses -Google glasses

McKinsey
=======
AI
Analytics
Robotics

Possible Resolution:  Most people in technology are aware of the mega trends, how relatively important each trend is and the details of how quickly change is happening.  For me analytics, speed to market and trust need to underpin everything I deliver.

Sunday 20 March 2016

Hacking SharePoint input field Validation


Problem: Here is an easy way to step around SharePoint 2013's input field validation for a drop-down list.  For any list containing a drop-down column (configured to only allow values chosen from the drop-down list), you can use the Internet Explorer (IE) developer tools to amend the DOM so that, when the form is posted, the changed value is inserted into the list.
Replication Steps:
  1. Open IE, go to the list and add a new list item (the list must have a drop-down field column), then hit Fn+F12 to open the IE dev toolbar.
  2. In the "DOM Explorer" tab select the "Select element" icon (top left).
  3. Click on the drop down control i.e. "Primary/Secondary" input control as shown below.
  4. Edit the DOM value for the item selected to some crazy text and save the form.
  5. Open the item in view mode and you will see the crazy data as shown below circled in red in the bottom picture.  


Tuesday 16 December 2014

SharePoint 2013 Public Website Check list

Ux:
  1. Responsive design vs Device channels - Does the site switch resolutions and browsers gracefully.  RWD vs AWD (Adaptive Web Design)
  2. Broken Links: Check My Links 3.3.4 is a plugin for Chrome to check a page for broken links (go over main pages at least)
  3. Fiddler - Use for 404, and other errors, look for dodgy urls and headers being passed around.
  4. Charles is a similar tool - helps with broken links, shows web calls, and lets you review response headers, file sizes, and speed of execution.
  5. Minification - minify JavaScript and CSS to reduce page weight.
  6. Alt labels, WCAG, valid html checker
SEO:


Testing:
All devices and browsers: 1. PC/laptop (IE 11-IE 7, Chrome, Firefox, Opera, Mac/Safari), 2. Phones (iPhone, Android OS, Windows OS), 3. Tablets (Android, MS/Surface, iOS/iPad).

Helper Tools:
AddThis.com - Nice tool to add a social bookmarking service to your websites.  Collects stats.
TypeKit - Nice for fonts; review the licensing needed.

Security:
  1. Check Internal Search is not returning passwords
  2. Check google is not picking up passwords/confidential data 
  3. Remove response headers that leak version information:
  4. MicrosoftSharePointTeamServices (version), X-Powered-By, X-SharePointHealthScore, X-AspNet-Version
  5. Check XSS and SQL injection, though with SharePoint you are testing the product; example payloads (a quick output-encoding check follows):
<script>alert(1);</script>
<img src=x onerror=alert(1)>
<iframe src=javascript:alert(1)>
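
While testing the payloads above, it helps to know what correctly encoded output should look like; a quick check using the Python standard library:

import html

payload = "<script>alert(1);</script>"
# A page that encodes output correctly renders the payload inert, like this:
print(html.escape(payload))  # -> &lt;script&gt;alert(1);&lt;/script&gt;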

Thursday 20 March 2014

Data Protection and Regulation

Update 2023 December - Accessibility

  • European Accessibility Act (EAA) EU directive will be law in member states by June 2025.
  • The Disability Equality Act came into force in 2022.
  • General Equal Treatment Act (AGG) came into force in 2006, prohibits discrimination based on disability.

Update 2022 Mar 16: Privacy Management | OneTrust implement Privacy Management

Update 2021 Nov 20:  Applies to organisations handling the personal data of EU and UK data subjects.  It limits what organisations can do with people's personal data.  It is enforced, but the key is to only use the data you need and protect people's personal data; basically how any company should behave.

GDPR remains part of UK law since Brexit via the Data Protection Act 2018 (UK GDPR).  Differences:

  • Territorial applicability of UK GDPR
  • International transfers of personal data

Overview:  Data protection in relation to SharePoint is a large body of information.  This post outlines my notes on holding data within SharePoint, generally applicable to the various regulations I have come across.  Also, see my post on Compliance for O365 and SharePoint.  Last updated: 18 July 2019.

Records Management:  Data needs to be disposed of according to the applicable rules; the rules depend on the industry, country, and category of data.  AvePoint has good records management and governance tools to help with the disposal/cleanup of data.

Search: Request for Information (Freedom of Information (FOI)).  SharePoint can be used to traverse over multiple systems/LOB to determine where information is held about individuals.  Configure to generate reports or as a starting point in trawling data in the enterprise.

United Kingdom:
Updated 24 May 2016 - The European Union (EU) General Data Protection Regulation (GDPR) "intends to strengthen and unify data protection for individuals within the European Union (EU). It also addresses export of personal data outside the EU" (Wikipedia). 
The EU GDPR applies to EU member states such as the UK, Germany et al. and covers personal data held by companies in the EU, extending to companies holding EU citizens' data.  Of interest is that under the GDPR companies can be fined up to 4% of turnover.  SharePoint and Office 365 hold a lot of company assets and data, and appropriate protection needs to be in place; part of any company's active defence needs to include SharePoint.  Of note here is that Office 365 has fantastic defence capabilities, which I believe will increase the speed at which enterprises move to the cloud.  

http://www.computerweekly.com/news/2240114326/EC-proposes-a-comprehensive-reform-of-data-protection-rules New EU Data Protection Directive, not yet legally binding.  Companies in the UK are bound by the Data Protection Act; the Freedom of Information Act 2000 also plays a part with personal data.  DPA 1998 explained.  DPA 2018 aligns with GDPR.

Purpose of GDPR (started 25 May 2018):
  • Protect personal data
  • Consistent legislation across the EU
  • Encourage competition between EU countries
GDPR is concerned with EU citizens personal data and protecting it.

DLP has a module for Health Records that adheres to the U.K. Access to Medical Reports Act 

G-Cloud allows public sector organizations to buy cloud services, from a range of suppliers on a validated secure network.  In effect, it is cloud services for local and central government.  G-Cloud in effect offers the cloud (think AWS & Azure type services) to government bodies.  Updated: 16 March 2016, the G-Cloud has been abandoned.

Dealing with Breaches:
SharePoint holds a ton of company data and needs to be part of any company's active defence strategy.  You still need the old-school basic defences: firewall, intrusion detection, and anti-virus.  Do you have a list of critical applications and data within SharePoint?  Do we know who we do business with (a client or HR could compromise our data)?  Who is likely to attack?  An employee, organised crime, ... and what happens when we are compromised?  (Do we shut down or restrict, how do we identify, legal and forensics, communication plan.)  DLP can help with breaches:
PII data
Theft - are employees mining SP data looking for highly confidential data, IP or client lists  
Security Centre helps with:
  • Investigation
  • Forensic collection
European Union (including the UK):
  • Companies will be required to appoint data protection officers if more than 250 employees.
  • Organisations will have to notify citizens in plain language what information is collected and how it is used as well as explicitly get consent before using any personal information.
  • Users of online services must also have the right to be forgotten, which means they must be able to remove or delete personal information from an online service.
  • Clear rules for data transfer across borders within multinational corporations with a streamlined process that once approved by one data authority, will be accepted by all others.
  • Requiring organisations to notify the national data protection authority and all individuals affected by a data breach within 24 hours.
  • Businesses operating in more than one EU country will, however, welcome the fact that they will be subject to oversight from one supervisory authority rather than multiple authorities
  • Once the directive is accepted companies will have 2 years to comply.
  • Organisations will only have to deal with a single national data protection authority in the EU country where they have their main establishment. Likewise, people can refer to the data protection authority in their country, even when their data is processed by a company based outside the EU. Wherever consent is required for data to be processed, it is clarified that it has to be given explicitly, rather than assumed.
  • People will have easier access to their own data and be able to transfer personal data from one service provider to another more easily
  • A ‘right to be forgotten’ will help people better manage data protection risks online: people will be able to delete their data if there are no legitimate grounds for retaining it.
  • EU rules must apply if personal data is handled abroad by companies that are active in the EU market and offer their services to EU citizens.
  • Penalties of up to €1 million or up to 2% of the global annual turnover of a company.
South Africa:

What is POPI?
Protection of Personal Information (POPI) is the legal requirement in South Africa for holding, collecting, distribution, amending and destruction of information involving people and companies. POPI controls how your personal information is used by organizations, businesses or the government.
With so much personal data held by an increasing number of companies, there needs to be some benchmark for companies to follow if they are to ensure that data is handled legitimately. POPI provides the laws/framework to guide how companies must store personal data relating to people and companies that it holds in either electronic or paper form.
In a nutshell, when holding parties' personal data, POPI attempts to enforce:
  • transparency
  • only collect information that you need
  • ensure the data is protected/secure
  • ensure the personal data held is correct, required and up to date
  • discard data when it is no longer needed
  • ensure the end person/subject has given his/her explicit consent to keep and use their personal data
  • allow the end person/subject to see their own data that you hold if they request it

Why should you adhere to POPI?

  • Customer confidence is improved
  • No superfluous data is stored
  • Data is more secure, accurate and old data is expired
  • Avoid criminal and civil actions

What you need to do?

POPI applies to all IT and paper-based data that your company holds.  Your company must take steps to ensure the security of personal data held in electronic and paper form.  You must prevent the unauthorized disclosure of data to third parties, and loss of or damage to data that may affect the interests of data subjects.  You must also ensure that any data processors your organization uses provide an appropriate level of security for the personal data they process on your behalf.  Access to data must be restricted to the appropriate people, and your company needs to take steps to ensure it is not allowing unauthorized access to data and information.

What happens if you Violate POPI - EY South Africa
United States
FATCA requires a financial institution to identify and report US customers. 

Safe Harbour - US companies storing EU customer data would self-certify that they adhere to 7 principles to comply with the EU Data Protection Directive and with Swiss requirements.  It was overturned in 2015.  The EU-US Privacy Shield is an agreement between the European Union and the United States to enable US businesses to store EU citizens' personal data in a way that complies with EU privacy laws; it is in effect the replacement for the Safe Harbour agreement.  

Patriot Act - Greatly affects companies as the US can request access to data.  This leads to multinationals choosing to host data outside of the US.

Internal State Laws - Each US state may have localized laws that your business needs to adhere to, for example the California Consumer Privacy Act (CCPA).

Brazil 
LGPD - General Data Protection Law - basically GDPR-style protection of personal data and people's privacy.

China has the PRC Cybersecurity Law relating to protecting personal data.
Hong Kong has the Personal Data (Privacy) Ordinance (Ap.486)
Middle East: Bahrain - Data Protection Law (Law No. 30 of 2018), Qatar - Data Protection Law (Law No. 13 of 2016), UAE has Digital payment Regulation and Data protection laws specific to each emirate
Turkey - Law on the Protection of Personal Data 6698 (LPPD)
South America, all the major countries have Data protection laws including Argentina (Law 25.326)
Canada - PIPEDA federally, plus provincial privacy laws (e.g. PIPA in Alberta and BC).
Australia has a host of Data privacy laws including the "Australian Privacy Principles"
Japan has APPI.
South Korea has PIPA

Other:
Common Reporting Standard (CRS) - the same idea as FATCA but not limited to US customers; heavier and adopted by most of Europe and others.  "CRS is a globally coordinated approach to the disclosure of income earned by individuals and organizations outside their country of tax residence", KPMG.com.

Pharma and Medical:
  1. HIPAA - "Health Insurance Portability and Accountability Act of 1996) is United States legislation that provides data privacy and security provisions for safeguarding medical information"
  2. DSP is broadly similar to HIPAA but from the UK, it is a toolkit for compliance for the NHS.
  3. HL7 - "Health Level-7 refers to a set of international standards for the transfer of clinical and administrative data between software applications used by various healthcare providers"
  4. FHIR R4 - the successor to HL7 v2/v3, which have been the data exchange and information modelling standards for over 20 years.  FHIR is a newer specification from the HL7 organisation.
  5. GxP - Good x Practice e.g. GCP, GMP (manufacturing), CLP, 
  6. GCP - Good Clinical Practice - Guidelines and regs used in the pharma industry
  7. GMP - Good Manufacturing Practice
  8. GAMP - Good Automated Manufacturing Practice - applies to software development.  GAMP defines software categories: Category 1 is infrastructure software, Category 4 is configurable software, and Category 5 is custom software, which is the most demanding for companies.  Depending on the software you provide to a customer, you need to be audited by external parties and clients to be compliant; if you have written your own software, you need to be GAMP 5 compliant.  GAMP is GxP but for IT systems: build in quality, which requires following procedures and principles when building software products.  ISPE runs GAMP and checks qualifications.
  9. Eudralex - Pharma industry in EU guidelines for dev, manufacture and control of medicinal products.  Rules governing medical products in the EU.
  10. EMA - European Medicines Agency, same as FDA but covers Europe.
  11. FDA - Food and Drug Agency out of the US.
  12. MHRA (Medical and Healthcare Products Regulation Agency), same as Eudralex but for the UK.
  13. Title 21 CFR Part 11 - The FDA regulation that makes electronic records and signatures equivalent to paper, hand-signed records and consent.  It is about storing e-records, including securing them and handling signatures (Code of Federal Regulations (CFR) Title 21 Part 11).  Ensure the system is secure and that audit logs of all transactions with timestamps maintain the integrity of the open or closed system.  Signatures must ensure non-repudiation (the signer can't claim it wasn't them).  E-signatures can be biometric-based, which is hard for web-based systems without specific hardware.  For e-signatures that are not biometric-based, the first signature in a session requires all components (generally username and password), even if the user is already logged into the system; see Section 11.200, Electronic Signature Components and Controls, point (a)(1)(i).  Subsequent e-signatures in the same session can use one component.  If using biometric signatures, each signature uses the biometric method again.
  14. ISO 27001 - ISO standards are best-practice guidelines, not regulations.  27001 is concerned with information security and information assets: assessment and treatment of security risks.  Also see ISO 27002 & ISO 27017.
  15. ISO 9001 - Quality Management, checklist / process orientated.
  16. FDA (Food and Drug Agency) is equivalent to EMA (European Medicines Agency) in Europe.  

ISO 27001 for SaaS:

It is pretty easy for SaaS companies to get ISO 27001 certified, and it brings huge benefits if implemented correctly.  Azure provides fantastic documentation, and if your product is based on Azure it is really easy, as the technology infrastructure is already validated.  

Azure has a fantastic set of documentation for certifications called Blueprints.

https://docs.microsoft.com/en-us/microsoft-365/compliance/offering-iso-27001?view=o365-worldwide

https://docs.microsoft.com/en-us/azure/governance/blueprints/samples/iso27001/

ISAE3402 (SOC):

ISAE 3402 is similar to ISO 27001

Accounting and Tax:

XBRL (eXtensible Business Reporting Language) - an XML-based format for exchanging business information.  iXBRL is a derivative of XBRL used in the UK for submitting company accounts, VAT, and self-assessments.  iXBRL is also used to submit annual accounts to Companies House each year.

ISO 21378 - Audit data


Saturday 29 June 2013

SharePoint Host Apps - User Permissions

Problem:  How do I give users or groups permissions in a SP Host App?  This post is not looking at giving the App Web permissions against the SharePoint Site.  It merely looks at giving users rights to work within the app.

Initial Hypothesis:
SharePoint has a structured object hierarchy that developers can secure users against, namely: SPSite (site collection), SPWeb (sites/webs), SPList (lists/document libraries) and SPListItem (items/rows).  Based on the hierarchy, I initially assumed SP Apps would maintain their own security.  Well, this would get rather confusing, as you would need to maintain both the host web and the app web user permissions.

There is no UI in the App web for setting security, but you can get to a list's permissions page.  In the SP Hosted App, navigate to the list; in my case the URL is http://dev-d8f436fea0378f.apps.dev.local/SPHostedApp/Lists/Comment/AllItems.aspx, which takes me to the "Comments" list.  Click the "List" tab on the ribbon, click the "Shared With" button and choose "Advanced".


Here we can stop inheriting the security permissions for the list.
I thought I could set the App web's permissions by selecting the "Web" link, but you receive an error.

Resolution:  You need to stop inheriting permissions on the Host App (parent); the App web then inherits these permissions.

How it works:
 

Monday 21 February 2011

Mapping internal users (LDAP) to the cloud

Overview: Steve Plank has a great video on "How ADFS and the Microsoft Federation Gateway work together up in the Office 365 Cloud". 

To get your internal ADFS users to authenticate in the Microsoft cloud (Azure and Office 365), you do need ADFS 2.0.  The claims based authentication that can be setup in SharePoint 2010 is how Office 365 and AZURE will authenticate AD users. 

Your users will access the SP2010/MS Online 365/Azure web application using their browser.  The application sends the browser a response redirecting it to the MS Federation Gateway (MFG)/App Fabric/STS web service (for on-site SP2010 editions), which in turn passes the user's browser on to ADFS.

ADFS generates the user a SAML token and they are redirected to the MFG; the MFG in turn generates its own SAML token containing its claims, and the browser is redirected back to the originally requested web application.

For a user trying to access SharePoint Online from their internal network, you can see the user makes several requests to different points along the chain, with the key result being that the user gets securely authenticated against your internal Active Directory (AD).
Steve Plank's video is easier to follow than this post, but it's worth understanding the process as it applies to Azure, SharePoint claims-based authentication and Office 365.  This, coupled with custom LDAP providers, results in a consistent manner of handling authentication in the cloud using your internal LDAP directory.

Below is an animation describing the process whereby a user is authenticated on their internal network and then they use SharePoint Online (Office 365).
More Info:
http://blogs.msdn.com/b/plankytronixx/archive/2011/01/25/whiteboard-video-how-adfs-and-the-microsoft-federation-gateway-work-together-up-in-the-office-365-cloud.aspx?wa=wsignin1.0

Setting up ADFS for SharePoint Reference:
 

Wednesday 16 February 2011

UK Cloud User Group

Yesterday I attended the 2nd Cloud Evening meeting and it was terrific - this event focused on Microsoft's offering, specifically the latest version of Azure (1.4 I think), and it looks impressive.  Previously, I felt limited by Azure; however, I now think Azure SQL is a good option for storing application data, especially when Office 365 is released, or to extend SharePoint 2010. 

Both Mark Rendle and Planky (Steve Plank) gave good presentations.  I came away with tons of information and I'm pleased I went along.  Planky's blog is definitely worth following.  The security around Azure and Office 365 is fairly complex but considering the integration with LDAP providers and the security considerations it is well thought out and tooled.

Additional Info:
http://cloudeve.ning.com/

Saturday 5 February 2011

What version of SharePoint do I have? Common mistake!

Problem: You have a SharePoint web site and you have no idea what version is running.

Initial Hypothesis: Using Central Admin (CA) you can look up the exact version.  However, if you are looking at a website or don't have CA access, a lot of people think they can look at the HTTP header.  (You can see the HTTP headers using Fiddler or the IE developer tools.)  However, the HTTP header with the version number is not the actual version installed; it is the version at the time the SharePoint install was performed!
This behaviour applied to MOSS and I have no idea if it is fixed in SP2010.  The HTTP header and your version number will be identical if you install using slipstreamed images, as the install will include the patches in your install media.
Resolution:
  • You can't rely on the HTTP header SharePoint version number returned via page requests to determine the version of SharePoint you are looking at.
  • I recommend removing the HTTP header info for SharePoint, as it makes a hacker's life marginally more difficult: they don't know what version you are patched to or originally installed, and they will also need to use another technique to determine whether your website is a SharePoint server. 
  • Todd Bleaker has a simple technique for identifying SharePoint public websites: "Since most people allow anonymous access to the images in the 60 Hive/12 Hive and there really isn't any reason to remove the default images (in fact it probably isn't supported), it is an easy litmus test to detect SharePoint.  However, I recently wanted to determine whether a site was SharePoint 2010 or not, but since visual upgrade is now a viable upgrade scenario and CPVW.GIF is not unique to the SharePoint 2010 I needed something new to look for. So, I poked around a bit and found that /_layouts/images/FGIMG.PNG isn't too hard to remember and it shows an equally unlikely image to be in a non-SharePoint 2010 site".  A small sketch of both checks follows.
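
A small Python sketch of the two checks described above: read the MicrosoftSharePointTeamServices response header (remembering it may reflect the originally installed version, not the patch level) and probe for the /_layouts/images/FGIMG.PNG litmus-test image.  The target URL is a placeholder.

import requests

site = "https://www.example-sharepoint-site.com"  # placeholder target

# 1. Header check: reflects the install-time version, not necessarily the current patch level.
resp = requests.get(site, timeout=30)
print("MicrosoftSharePointTeamServices:", resp.headers.get("MicrosoftSharePointTeamServices"))

# 2. Litmus test: FGIMG.PNG ships with SharePoint 2010.
img = requests.get(f"{site}/_layouts/images/FGIMG.PNG", timeout=30)
print("Looks like SharePoint 2010" if img.status_code == 200 else "FGIMG.PNG not found")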
More Info:
http://vspug.com/bobbyhabib/2008/05/16/getting-the-correct-moss-wss-version-for-each-server-in-the-farm/

http://www.sharepointdevwiki.com/display/SharePointAdministrationWiki/SharePoint+Versions

http://www.sharepointdevwiki.com/display/spadmin2010/SharePoint+2010+Versions

Update 28/03/2010 - Todd Carter has a post on SharePoint versions
Update 20/02/2012 - PS to determine SP edition and patching level

Saturday 31 July 2010

SharePoint 2010 Claims based security & Security notes

In MOSS, non-Active Directory users could be supported via Forms Based Authentication (FBA), so SQL or other providers could be used to authenticate users.
The claims-based model decouples authentication from SharePoint.  You can declaratively set up multiple providers.  Using Claims Based Authentication (CBA) you can now mix users from different sources in a single zone/site.
In MOSS you needed a separate web.config for each set of users.
Claims-based providers can use logic/metadata to give different users different rights.
SAML - Security Assertion Markup Language, used instead of Windows identity security tokens.  SAML is better in that the token is extendable to give additional authority/claims, i.e. it can carry additional info on the security token.
CBA allows us to authenticate internal Windows users and external FBA users in the same web app.
Note: Once a claim is validated, the user is added to the SPWeb properties Users, AllUsers & SiteCollectionUsers before they are authorised.  So as long as they have been authenticated, they are added to the properties shown above.

More Info:
Claims explained on Channel 9