Sunday 13 January 2019

Using Box.com Programmatically

Overview:  In the SaaS document management space, Box.com is a competitor to SharePoint Online.  A request recently came in from a medium-sized client to integrate with their systems and deliver files into Box.com; as it is something I have not done before, I was eager to see how easy it is.

Requirement:
  1. I merely need to create folders within the client's tenant (if they don't already exist) and drop files into specific folders from a scheduled job that runs every 5 minutes.
  2. Box.com has an API, similar to SharePoint CSOM, for programmatically working with your Box tenant.
  3. Box.com's functionality is specific to documents, so the API is small and easy to learn.
  4. There are multiple ways to programmatically authenticate to your Box.com tenant; the PoC keeps it simple, but the actual solution must use JWT for the connecting service account.
Implementation Details:
Box.com has several ways to connect programmatically and to test the APIs.  For the PoC and for exploring the APIs I used the Developer Token approach; the actual solution should be switched over to JWT OAuth, sketched briefly below before the Developer Token steps.
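For reference, the JWT route would look roughly like this, assuming the Box.V2 SDK's BoxJWTAuth helper and the JSON app settings file downloaded from the Box developer console (the file path is a placeholder and the exact helper names should be verified against the SDK version you install):

using System.IO;
using Box.V2;
using Box.V2.Config;
using Box.V2.JWTAuth;

// Load the JWT app settings exported from the Box developer console.
var jwtConfig = BoxConfig.CreateFromJsonString(File.ReadAllText(@"C:\Temp\box-config.json"));
var jwtAuth = new BoxJWTAuth(jwtConfig);

// Get a token for the service account and create a client with it.
var adminToken = jwtAuth.AdminToken();
BoxClient client = jwtAuth.AdminClient(adminToken);

The steps below carry on with the Developer Token approach.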
1. Once your tenant is set up and you have configured your client, generate the Developer Token (it lasts 1 hour) in the Box developer console.
2. Create a new C# console application and add the Box C# SDK reference (the Box.V2 NuGet package).
3. You will need the Developer Token, Client Secret and Client Id in order to programmatically connect from the console.  Below is my app.config.
<?xml version="1.0" encoding="utf-8"?>
<configuration>
    <startup> 
        <supportedRuntime version="v4.0" sku=".NETFramework,Version=v4.6.1" />
    </startup>
  <appSettings>
    <add key="ClientId" value="f9y555fiqwqcbv555lst88dmzbxzqa7n"/>
    <add key="ClientSecret" value="CoTT555U7oN555wKF555aPYz5555"/>
    <add key="DeveloperToken" value="TjxJh555ivvW555EE555NTerb555"/>
  </appSettings>
</configuration>
4. Connect to your tenant using Box.com's API/SDK, as sketched below.
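A minimal connection sketch, assuming the Box.V2 NuGet package and the app.config keys above; a Developer Token is just a short-lived access token, so for the PoC it can be wrapped straight into an OAuthSession:

using System;
using System.Configuration;
using Box.V2;
using Box.V2.Auth;
using Box.V2.Config;

class Program
{
    static void Main(string[] args)
    {
        // Read the values captured in app.config.
        var clientId = ConfigurationManager.AppSettings["ClientId"];
        var clientSecret = ConfigurationManager.AppSettings["ClientSecret"];
        var developerToken = ConfigurationManager.AppSettings["DeveloperToken"];

        // The Developer Token is a short-lived access token, so it is passed
        // straight into an OAuthSession (no refresh token needed for the PoC).
        var config = new BoxConfig(clientId, clientSecret, new Uri("http://localhost"));
        var session = new OAuthSession(developerToken, "NOT_NEEDED", 3600, "bearer");
        var client = new BoxClient(config, session);

        // List the items in the root folder ("0") to prove the connection works.
        var items = client.FoldersManager.GetFolderItemsAsync("0", 100).Result;
        foreach (var item in items.Entries)
        {
            Console.WriteLine($"{item.Type}: {item.Name}");
        }
        Console.ReadLine();
    }
}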

5. Run the console to confirm it connects to the tenant.
6. Code the file upload logic, as sketched below:
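A sketch of the upload step, reusing the BoxClient from step 4; the folder name and file path are placeholders.  It looks for the target folder under the root, creates it if missing (requirement 1), then streams the file up:

using System;
using System.IO;
using System.Linq;
using System.Threading.Tasks;
using Box.V2;
using Box.V2.Models;

static async Task UploadFileAsync(BoxClient client)
{
    // Find the target folder under the root folder ("0"), creating it if it does not exist.
    var rootItems = await client.FoldersManager.GetFolderItemsAsync("0", 500);
    var folder = rootItems.Entries.FirstOrDefault(i => i.Type == "folder" && i.Name == "Invoices");
    var folderId = folder?.Id;
    if (folderId == null)
    {
        var newFolder = await client.FoldersManager.CreateAsync(new BoxFolderRequest
        {
            Name = "Invoices",
            Parent = new BoxRequestEntity { Id = "0" }
        });
        folderId = newFolder.Id;
    }

    // Upload the file into the folder.
    using (var stream = File.OpenRead(@"C:\Temp\invoice-001.pdf"))
    {
        var request = new BoxFileRequest
        {
            Name = "invoice-001.pdf",
            Parent = new BoxRequestEntity { Id = folderId }
        };
        var uploadedFile = await client.FilesManager.UploadAsync(request, stream);
        Console.WriteLine($"Uploaded {uploadedFile.Name} (id {uploadedFile.Id})");
    }
}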

Final Thoughts:
  • Overall I think Box.com is a good option if you don't already have O365.  It's fairly expensive for a small feature set, but it is a valid option for clients.
  • Search indexing is very slow, so it is hard to build search-based solutions using the API.
  • Microsoft Flow has connectors for Box.com; before developing, check whether Flow can meet your migration or file-moving requirement.

Thursday 10 January 2019

NoSQL Document Database options on Azure - CosmosDB

Overview: Azure has a plethora of options for NoSQL; I used RavenDB and DocumentDB a couple of years back.  Both are easy, great tools for the right situation, and DocumentDB now falls under CosmosDB as a product at Microsoft.  However, I feel that CosmosDB would be anyone's default choice on Azure today, as DocumentDB is really a feature subset of CosmosDB.

CosmosDB"Azure Cosmos DB is a global distributed, multi-model database (db) that is used in a wide range of applications and use cases. It is a good choice for any serverless application that needs low order-of-millisecond response times, and needs to scale rapidly and globally."  CosmosDB is used by Microsoft's Skype, MSN, Xbox, Office 365, Azure products.

Def: CosmosDB is a planet-scale NoSQL JSON database with support for multiple APIs (including SQL (Core)).  Multiple copies/instances are kept around the world (think SQL Server AlwaysOn Availability Groups).
  • Encrypted on Azure at Rest and in Motion.
  • Multiple APIs are supported, including the SQL API (DocumentDB), Table and others; server-side logic (stored procedures, triggers) is written in JavaScript.  A minimal SQL API sketch follows this list.
  • A logical breakdown of the CosmosDB APIs.
  • Partitions are managed transparently, and users are routed based on geographic location and usage.
  • One write db and multiple read dbs, or you can set up multiple write dbs.  Automatic failover can be configured so that if the write db is unavailable, one of the read dbs becomes the write db.
  • The CosmosDB Data Migration Tool is great at bringing in data from JSON, SQL, CSV, MongoDB and Amazon DynamoDB.
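To get a feel for the SQL (Core) API, here is a minimal C# sketch using the Microsoft.Azure.DocumentDB SDK (DocumentClient), which was the current SDK at the time; the endpoint, key and database/collection names are placeholders:

using System;
using System.Linq;
using System.Threading.Tasks;
using Microsoft.Azure.Documents.Client;

static async Task CosmosQuickStartAsync()
{
    // Placeholder endpoint and key - copy these from the Azure portal.
    var client = new DocumentClient(
        new Uri("https://my-cosmos.documents.azure.com:443/"), "<primary-key>");

    // Write a JSON document into an existing database/collection.
    var collectionUri = UriFactory.CreateDocumentCollectionUri("OrdersDb", "Orders");
    await client.CreateDocumentAsync(collectionUri,
        new { id = "1001", customer = "Contoso", total = 42.5 });

    // Query it back using the SQL API.
    var orders = client.CreateDocumentQuery<dynamic>(collectionUri,
        "SELECT c.id, c.customer FROM c WHERE c.total > 10").ToList();
    foreach (var o in orders)
    {
        Console.WriteLine($"{o.id}: {o.customer}");
    }
}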
Concerns:
  • Geo-replication and data residency - It used to be one write (master) copy and multiple read copies of the data, so not all copies can be written to.  If you have country data-residency rules, you can't configure specific bits of data to stay within a specific region, i.e. I can't specify that certain bits of data are only stored in a specific region.  You can specify a region/location for a container, but not split a container.  (Check: I have not verified this.)
  • Backup and Recovery - Point-in-time recovery requires raising a Microsoft support ticket, and you can't structure complex backup plans.  It's a take-it-or-leave-it approach.
  • Limited LINQ support
  • The SQL API is very limited compared to relational SQL databases, offering no joins across documents or aggregation capability such as GROUP BY.
  • Temporal tables don't exist, but there are good auditing options such as the Change Feed, where all changes can be streamed into an external database/system.
  • Entity Framework support is limited. Consider a PoC before using.
  • Consistency (how data is copied to the other read-only dbs) has 5 options: Strong, Bounded Staleness, Session, Consistent Prefix and Eventual.  "Strong" commits to all dbs before acknowledging, so it is slower but everyone reads the same data; "Eventual" reads whatever is in the local db you hit, which may be stale.  The default is "Session".  As always, it depends on the requirement; see the sketch below for setting the level.
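The consistency level can be requested when the client is created; a small sketch with the same DocumentClient used in the SQL API example above (Session shown, which is the default):

using System;
using Microsoft.Azure.Documents;
using Microsoft.Azure.Documents.Client;

// Request Session consistency; Strong, BoundedStaleness, ConsistentPrefix
// and Eventual are the other options.
var client = new DocumentClient(
    new Uri("https://my-cosmos.documents.azure.com:443/"),
    "<primary-key>",
    new ConnectionPolicy { ConnectionMode = ConnectionMode.Direct },
    ConsistencyLevel.Session);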
Tip: Amazon DynamoDB is AWS's equivalent to Azure's CosmosDB.

More Info:
NoSQL options - https://www.nebbiatech.com/2017/02/09/exploring-the-nosql-options-on-azure/