Wednesday, October 18, 2017

Immersive And Conversational Apps And Why Improper Design Will Hurt Your Customer Experience...

Emerging trends point to entirely new channels and new forms of user experience, driven mainly by two technologies:

  • Conversational Platforms
  • Immersive Experience
These trends are pushing many customer-facing organizations to adopt them in order to differentiate themselves. However, some of these efforts, such as introducing automated BOTS or Cortana-based applications as part of your customer interactions, need to be planned carefully.

Many of us have seen the epic fail of Cortana at a large conference. That in itself is a smaller issue, perhaps caused by the crowd clapping and laughing in the background so that Cortana could not understand the voice fully. The bigger issue is that the customer is not there to experiment with voice or chat conversations unless they are so natural that they fully understand the customer's intent and actually help.

Most of the time we hear customers saying, "I could have typed or clicked through the screens faster" than going through a so-called intelligent or smart application that still has not understood what they really want.

So the bottom line is that your voice-based applications, as well as your chat-based BOT applications, should embed enough intelligence that they really help customers with what they are looking for, and do not serve as an experimentation tool where the customer has to speak or type only a few exact sentences before the system does the work.

Microsoft BOT Framework :   A BOT is basically an intelligent application, hosted on Azure, that lets its consumers interact with it in a conversational way, mainly through a chat-based interface, though it also supports voice and form-based inputs. BOTS guide the user towards a solution, and are typically useful when we don't know the user's intent up front and want to let the user choose from a free-form set of options.

There are multiple ways to build BOTS using the Microsoft framework.
  1. Using the Azure BOT Service, which provides an end-to-end integrated environment for building, testing and deploying BOTS. Many of the BOT management activities are automated in this option, which makes it easier to build less complex BOTS.
  2. Using the BOT Builder SDK, an open-source framework that uses the traditional Visual Studio IDE to develop BOTS with a standard programming approach. The BOT Builder SDK is available for both .NET and Node.js.
Using Industry Standard Design Principles :   As with the design of any user-interface-related computer program, a lot of common sense is needed in building BOTS. BOTS are not meant to showcase a technological feature; they are about ensuring that users can finish their work easily and get the best out of the interaction.

Microsoft has also provided guidelines on the wrong usage of conversations with BOT technology; some of them are below.

  • The first interaction with the customer is important, and it should not present unlimited, open-ended options in which customers may get lost. Most times, as an organization we know our list of services, and letting the user narrow down what he or she requires is the better usage.
  • The guidelines warn against building what is known as a "stubborn bot", which does not let the user change track during the course of a transaction and leaves the user frustrated. For example, a BOT for booking airline tickets that keeps asking about your destination without even letting you explore the various discounts and offers available on the site.
  • The guidelines warn against building a "clueless bot", which is unintelligent and does not understand context. For example, if your user is already logged in and you know their name and other details, there is no point in the BOT asking for those details again. We will talk more about how to fix the problem of the "clueless bot" in the sections below.
  • While unintelligent BOTS are unwelcome, it is also true that BOTS sometimes exhibit more intelligence than the conversation requires. Suppose a user is trying to book a hotel room for business travel to attend a conference: the BOT should not mine the user's social media profile and ask questions about the user's family or hobbies, such as offering services for the user's kids during the stay.
While these guidelines are only indicative, there is no single navigational flow that provides the blueprint of a perfect customer-centric BOT; it differs from case to case. However, applying common sense and utilizing the appropriate intelligence as part of the solution is key to its success. The following are some of the Azure AI & Cognitive Services that assist in BOT development and make BOTS more useful for customers.

Understand Customer Language (Basic Level):   Understanding natural language is fundamental to the success of these new user-experience interfaces. The best thing about human interaction is that, however differently a question is asked, humans will understand the intent most of the time. BOTS, unless given that intelligence, may act like dumb machines that merely replay what is pre-programmed. LUIS (Language Understanding Intelligent Service) is one important Azure Cognitive Service that provides the much-needed intelligence to BOT interfaces. LUIS extracts the intent from the user's input: the verb, i.e. the action the user wants to perform. For example, an intent could be "register for an event" or "book a flight". LUIS also extracts the entities of interest from a textual conversation, which is very important: when someone wants to book a ticket to Paris, "Paris" is the destination that needs to be understood, so that the BOT service can tailor its response accordingly.
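
To give an idea of what a BOT does with a LUIS result, the sketch below parses an illustrative JSON response shaped like the LUIS v2 REST API output (a "topScoringIntent" plus an "entities" list). The query, intent name and scores here are made-up examples, not real service output.

```python
import json

# Illustrative LUIS-style JSON response; values are invented for this example.
luis_response = json.loads("""
{
  "query": "book a flight to Paris",
  "topScoringIntent": {"intent": "BookFlight", "score": 0.97},
  "entities": [
    {"entity": "paris", "type": "Location.Destination",
     "startIndex": 17, "endIndex": 21, "score": 0.93}
  ]
}
""")

def extract_intent_and_entities(response):
    """Pull the verb (intent) and the nouns (entities) the BOT needs."""
    intent = response['topScoringIntent']['intent']
    entities = {e['type']: e['entity'] for e in response['entities']}
    return intent, entities

intent, entities = extract_intent_and_entities(luis_response)
# intent -> 'BookFlight'; entities -> {'Location.Destination': 'paris'}
```

With the intent and entities in hand, the BOT can branch its dialog instead of replaying canned text.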

Analyze Customer Language (Deeper Level) :   While LUIS helps with basic understanding and the direction in which the customer wants to take the conversation, you need to analyze further, much like humans do in a conversation. That is where Linguistic Analysis comes into the picture. One important aspect is understanding the phrases within the conversation, because sometimes individual words convey one meaning while the phrase conveys a totally different one. For example, someone says to a BOT, "I want to book a luxury hotel in New York". In this case "luxury hotel" is a phrase we need to respond to, rather than just any hotel. Similarly, other techniques like part-of-speech tagging and sentence separation are all important while conversing naturally with the user. Most of these APIs are available as part of the Azure Linguistic Analysis API.

Understand The Mood Of The Customer :   If you happen to listen to call-center conversations, some calls show the customer's anger, dejection, frustration and other sentiments, and most times humans adjust their conversation to the tone of the customer. For example, if the customer is extremely unhappy with the service, the call-center agent will first apologize before proceeding. BOTS are expected to behave the same way. That is why sentiment analysis of each part of the conversation is important: once the BOT determines that the customer is unhappy, the direction of the conversation should change. The Azure Text Analytics API helps with understanding sentiment.
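
A minimal sketch of this routing idea, assuming the Text Analytics v2.0 sentiment scheme (a score between 0 and 1, where values near 0 are negative) and its request payload shape ({"documents": [{"id", "language", "text"}]}). The threshold and flow names below are illustrative choices, not part of the API.

```python
def build_sentiment_request(conversation_id, text, language='en'):
    """Payload for a POST to the Text Analytics sentiment endpoint."""
    return {'documents': [{'id': conversation_id,
                           'language': language,
                           'text': text}]}

def route_conversation(sentiment_score, unhappy_threshold=0.3):
    """Switch the BOT to an apologetic flow for unhappy customers."""
    if sentiment_score < unhappy_threshold:
        return 'apologize-and-escalate'
    return 'normal-flow'

payload = build_sentiment_request('1', 'I am really unhappy with this service')
route = route_conversation(0.12)  # a low (unhappy) score returned by the API
```

The point is simply that the sentiment score, not the words alone, should pick the next branch of the dialog.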

Guide The Conversation With Corrections To Language:   Spelling is a huge part of language understanding. The world of social media, like Twitter, has also created a huge set of slang that the BOT may have to understand. While there is no exhaustive method that can correct all possible spelling issues in a language, integrating with the Bing Spell Check API (part of Azure Cognitive Services) mitigates most of the problems. This API recognizes common name errors, corrects word-breaking issues and recognizes slang and informal language. When US English is the language of communication, the API provides more advanced functionality. Whenever the BOT finds a misspelling in a conversation, it is better to suggest the correction to the user and proceed based on it.
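
The sketch below shows how a BOT might apply such corrections before passing text to LUIS, assuming a Spell Check-style response: a "flaggedTokens" list where each entry has the misspelt "token", its "offset" and ranked "suggestions". The sample response is made up for illustration.

```python
def apply_spelling_corrections(text, flagged_tokens):
    """Replace each flagged token with its top-ranked suggestion."""
    # Work right-to-left so earlier offsets stay valid after each replacement.
    for flag in sorted(flagged_tokens, key=lambda f: f['offset'], reverse=True):
        start, token = flag['offset'], flag['token']
        suggestion = flag['suggestions'][0]['suggestion']
        text = text[:start] + suggestion + text[start + len(token):]
    return text

flagged = [{'offset': 10, 'token': 'fligt',
            'suggestions': [{'suggestion': 'flight', 'score': 0.95}]}]
corrected = apply_spelling_corrections('book me a fligt', flagged)
# corrected -> 'book me a flight'
```

The corrected text is what should be sent onward for intent detection, so a typo does not derail the whole conversation.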

Make Best Utilization Of Your Existing Support Channels:   Most organizations already have proven, working support options for their customers. For example, most organizations' web sites have an FAQ (Frequently Asked Questions) section that has been refined over time and normally reflects the main issues their customers face. When we build a new automated BOT to respond to users, we should not start fresh, but rather make the best use of the existing FAQ. Azure QnA Maker is one such Cognitive Service, meant to extract the best possible conversations from existing Q&A content. The QnA Maker service accepts existing Q&A content in the form of TXT, DOC, PDF and HTML files, and uses artificial intelligence and natural language processing to match users' queries against these questions appropriately. The service also provides options to train the Q&A sequences as well as constantly update the knowledge base, which makes it more dynamic in nature.
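
QnA Maker does this matching with NLP on the service side; the sketch below only illustrates the underlying idea with naive word overlap against a hypothetical FAQ. It is not how the hosted service is implemented.

```python
# A tiny, made-up FAQ knowledge base.
faq = {
    'How do I reset my password?': 'Use the Forgot Password link on the login page.',
    'How do I cancel my order?': 'Open My Orders and choose Cancel within 24 hours.',
}

def best_faq_answer(user_query, faq_pairs):
    """Return the answer whose question shares the most words with the query."""
    query_words = set(user_query.lower().split())
    def overlap(question):
        return len(query_words & set(question.lower().rstrip('?').split()))
    question = max(faq_pairs, key=overlap)
    return faq_pairs[question]

answer = best_faq_answer('reset password', faq)
# answer -> 'Use the Forgot Password link on the login page.'
```

The real service ranks candidate answers far more robustly, but the principle is the same: reuse the FAQ you already trust rather than authoring BOT responses from scratch.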

The above are only an indicative set of services that can enrich the customer experience. As we engage with users we should keep improving the intelligence behind the BOTS and other automated assistants, and should not depend only on past training data, because training data may be unavailable or obsolete. One option is to use "Reinforcement Learning", which keeps learning and adapts itself.

Finally, all automated assistants should at some point hand over to humans, to ensure that situations beyond their ability are still handled. Without human intervention these automated agents may frustrate users and turn them away from your site or app.

Using these newer channels gives an organization many options to serve more people with fewer resources, because humans have limitations and resources are limited. However, immersive and conversational agents should be planned carefully and not just viewed as a nice technology add-on; their human-centric usage has to be understood and designed for. These interfaces may require manual testing rather than automated testing, to ensure that users actually like the flow and are satisfied with the service.

Tuesday, October 10, 2017

Implementing Digital Twins Using Azure IoT Hub & BigchainDB

Digital Twins:   Analyst firm Gartner has recently released its 2018 Technology Trends. Along with expected technologies like artificial intelligence and immersive user experiences, an important trend is the Digital Twin.

As the name suggests, a Digital Twin is the digital representation of a real-world entity, typically available in the cloud, that maintains the current status of the device along with its properties. Because it reduces the need to query the physical device (which is generally costly and subject to connectivity issues), Digital Twins find a lot of value in asset management for large machinery, especially in reducing maintenance and repair operations. Device twins also facilitate remote management of machinery, making it more efficient.

Digital Twins can apply to practically any industry, starting with smart cities, government, manufacturing, healthcare, retail and more.

BigchainDB As An Asset Management Solution :   BigchainDB is a scalable database with blockchain characteristics. As mentioned in my previous articles, while BigchainDB can be used in many situations, one of its important use cases is 'Asset Registrations and Transfers'. When you make a CREATE transaction in BigchainDB, it registers an asset and tags it with the respective metadata. BigchainDB subsequently tracks the asset's ownership throughout its lifetime.

Azure IoT Hub :   IoT Hub is part of Azure's IoT services and provides secure, bi-directional connectivity to devices, with per-device authentication. IoT Hub also provides options to send configuration and maintenance operations to a device and, in turn, record the status of the device after those operations.

So, considering the use cases of the above two products, it is worthwhile to implement Digital Twins (Device Twins) using BigchainDB and Azure IoT Hub.

Scenario :   Consider a scenario where a large crane manufacturer leases cranes to various real-estate builders and other organizations. In this scenario, BigchainDB is used to maintain the broader device specification of the crane itself, as well as the current lessee (owner). The specific usage agreements and other legal and regulatory compliance details for the crane can be maintained in a blockchain network like BigchainDB. The crane is also monitored for functional efficiency using IoT, and its status is updated in real time to Azure IoT Hub. The following implementation links BigchainDB and IoT Hub and showcases the value of Digital Twins.

Register  Device  In BigchainDB :
from bigchaindb_driver import BigchainDB
from bigchaindb_driver.crypto import generate_keypair

bdb_root_url = 'http://localhost:9984'  # Use YOUR BigchainDB Root URL here
bdb = BigchainDB(bdb_root_url)
manufacturer = generate_keypair()  # the manufacturer signs the CREATE transaction

crane_asset = {
    'data': {
        'crane': {
            'manufacturer': 'crane manufacturer',
            'model': 'my model',
            'enginemodel': 'engine model',
            'type': '4 cycle, 6 cylinder',
        }
    }
}
crane_asset_metadata = {
    'currentlocation': 'new york'
}
prepared_creation_tx = bdb.transactions.prepare(
    operation='CREATE', signers=manufacturer.public_key,
    asset=crane_asset, metadata=crane_asset_metadata)
fulfilled_creation_tx = bdb.transactions.fulfill(
    prepared_creation_tx, private_keys=manufacturer.private_key)
sent_creation_tx = bdb.transactions.send(fulfilled_creation_tx)

The above piece of code creates the initial registration of the crane. While this is not an exhaustive set of attributes for a crane, it gives an idea of how a device's metadata can be maintained in BigchainDB along with information about the current owner.

Now let us assume that, after registration, BigchainDB creates a Device ID as follows.

Crane Device Id  In BigchainDB :  9da06f5a07c3d8a3ae27cd9d5bb7019ec276651d0d49fc4275d2771834b344c2

Create Azure IoT Hub:   Azure IoT Hub is quite easy to create from the Azure Portal; the following screenshot shows an implementation of Azure IoT Hub.

The important values to look at are the Shared Access Policies, which provide the connectivity and authentication information for connecting to this IoT Hub.

Register The Device Created In BigchainDB With Azure IoT Hub:

The next step is to register the device created in BigchainDB with Azure IoT Hub, providing a link between the two systems, such that asset ownership and legal needs can be tracked using BigchainDB while device telemetry, remote monitoring and preventive maintenance are tracked using Azure IoT services. Thanks to the linked metadata, both implementations can be viewed in sync.

var iothub = require('azure-iothub');
// Prepend your HostName=... value, taken from the hub's 'Shared access policies'
var connectionString = ';SharedAccessKeyName=iothubowner;SharedAccessKey=**********=';
var registry = iothub.Registry.fromConnectionString(connectionString);

var device = {
    deviceId: '9da06f5a07c3d8a3ae27cd9d5bb7019ec276651d0d49fc4275d2771834b344c2'
};

registry.create(device, function (err, deviceInfo, res) {
    if (err) {
        // The device may already exist; fetch and print it instead
        registry.get(device.deviceId, printDeviceInfo);
    }
    if (deviceInfo) {
        printDeviceInfo(err, deviceInfo, res);
    }
});

function printDeviceInfo(err, deviceInfo, res) {
    if (deviceInfo) {
        console.log('Device ID: ' + deviceInfo.deviceId);
        console.log('Device key: ' + deviceInfo.authentication.symmetricKey.primaryKey);
    }
}

The above code creates the device identity in Azure IoT Hub. While the code is self-explanatory and uses the connection string obtained under 'Shared access policies', the important thing to look at is the Device ID.

The Device ID can be any meaningful identity that acts as a primary key for the device. In this case, however, we have used the asset ID created by BigchainDB, thus linking the asset representation (for legal purposes) with the device representation (for telemetry purposes).

After the above program is executed, you will find the device registered in IoT Hub with the same ID as established by BigchainDB.

Device Twins can provide even more benefits for large machinery, whose maintenance and legal compliance needs are always complicated. This kind of solution involving BigchainDB and Azure IoT Hub may be of interest to enterprises.

This solution not only makes machinery operations smarter with device telemetry; by linking with the asset management solution of BigchainDB, device ownership and its history over time can be tracked and made transparently available. This helps with many maintenance issues and the associated legal compliance.

(Please note that in this post Device/Digital Twin and Asset have been used interchangeably.)

Thursday, September 28, 2017

Serverless Event Driven Blockchain Applications With BigchainDB And Azure

Serverless Computing   is the new design pattern emerging for building business applications. It does not mean that there are no servers executing the functionality; rather, the complexities of managing servers and compute capacity are hidden from the application itself, and the application concentrates only on its functionality.

Serverless architectures are closely aligned with event-driven computing, where applications are viewed as a set of small functions that execute in response to triggering events. This model is different from traditional procedural and monolithic application architectures.

There is a lot to read about event-driven and serverless architectures in the context of agile applications that follow the microservices design pattern.

Azure Functions :  Azure Functions is an event-driven compute experience which allows you to execute your code, written in the programming language of your choice, without worrying about servers. Benefit from scale on demand and never pay for idle capacity.

BigchainDB :   In my earlier articles we have seen an introduction to BigchainDB as one of the scalable blockchain databases. With BigchainDB we can build decentralized applications whose transactions are immutable. With its current schema model, BigchainDB is a good choice for asset registrations, ownership and transfers. Typically BigchainDB will work in tandem with smart-contract blockchain platforms like Ethereum.

One  of  the  important  aspects  of  BigchainDB  is  its  support  for Web Sockets. BigchainDB provides real-time event streams over the WebSocket protocol with the Event Stream API. Connecting to an event stream from your application enables a BigchainDB node to notify you as events occur. 

Given BigchainDB's ability to store assets, and considering that transactions can be notified to client applications over WebSockets, we can think of many real-life applications that are event driven and use serverless architectures.

Below is the architecture of a serverless, event-driven application using the BigchainDB WebSocket interface.

The following is a brief description of the components involved in this architecture.

BigchainDB WebSocket Event Stream API :   BigchainDB provides a stream of all validated transactions. Each stream is a unidirectional communication channel, where the BigchainDB node is the only party sending messages. The following API response shows the endpoints for connecting to the BigchainDB event streams.

{
  "assets": "/assets/",
  "docs": "",
  "outputs": "/outputs/",
  "statuses": "/statuses/",
  "streams": "ws://localhost:9985/api/v1/streams/valid_transactions",
  "transactions": "/transactions/"
}
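
A WebSocket client connected to the streams endpoint receives one JSON message per valid transaction. The handler sketched below decodes such a message and forwards the ids; the field names ("transaction_id", "asset_id") are assumed here for illustration, so check your BigchainDB version's Event Stream API docs for the exact message shape.

```python
import json

def handle_stream_message(raw_message, notify):
    """Decode one stream event and forward the ids to a downstream trigger."""
    event = json.loads(raw_message)
    notify({'asset_id': event['asset_id'],
            'transaction_id': event['transaction_id']})

# Simulate one incoming message (made-up ids) and collect the notification.
received = []
handle_stream_message(
    '{"asset_id": "abc123", "transaction_id": "def456"}',
    received.append)
# received[0] -> {'asset_id': 'abc123', 'transaction_id': 'def456'}
```

In the architecture described here, `notify` would be an HTTP POST to the Azure Functions webhook rather than a list append.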

Azure WebJobs :   WebJobs is a feature of Azure App Service that enables you to run a program or script in the same context as a web app, API app, or mobile app. WebJobs are typically used for creating and running background jobs, and can be continuous or triggered. A WebJob in continuous mode can play the role of the WebSocket client, receiving events from BigchainDB and processing them. The following is simple code that listens on an event stream and logs the output.

using (var ws = new WebSocket("ws://****"))  // e.g. a WebSocketSharp client
{
    ws.OnMessage += (sender, e) => Console.WriteLine("Event Data: " + e.Data);
    ws.Connect();
    Console.ReadKey(true);  // keep the continuous WebJob listening
}

Azure Functions :   As mentioned earlier, Azure Functions is the event-driven, serverless component of the architecture. Use Azure Functions to run a script or piece of code in response to a variety of events. Azure Functions has multiple kinds of triggers that cause the underlying process to run.

The generic webhook is one such trigger, which processes webhook HTTP requests from any service that supports webhooks.

Now let us look at the above architecture through the following scenario.

  • BigchainDB stores the car leasing data for a major car leasing company
  • The initial car asset is created with the car leasing company as owner
  • Every time the car is leased out, a new TRANSFER transaction is created in BigchainDB
  • The BigchainDB event stream WebSocket notifies the WebJob
  • The WebJob, which is created and owned by an insurer, is thus notified of car leasing ownership changes
  • The WebJob triggers a serverless API call using Azure Functions
  • The data gets stored at its destination.
The insurance company, on being notified of a change of driver, can adjust the insurance premium according to the driver's past history.
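
The insurer-side adjustment could look like the sketch below. The rules here are entirely made up for illustration (a flat surcharge per at-fault incident, capped at double the base premium); a real insurer would use an actuarial model.

```python
def adjusted_premium(base_premium, at_fault_incidents):
    """Add a 15% surcharge per at-fault incident, capped at a 100% surcharge."""
    surcharge = min(0.15 * at_fault_incidents, 1.0)
    return round(base_premium * (1 + surcharge), 2)

# A new driver with two at-fault incidents in their history.
premium = adjusted_premium(1000.0, 2)
# premium -> 1300.0
```

The interesting part is not the arithmetic but that it runs automatically, triggered end to end by a TRANSFER transaction in BigchainDB.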

This is a simplistic example of an architecture that uses event-driven and serverless patterns from the perspective of BigchainDB and Azure.

Currently I face a few errors with the WebSocket implementation, but they will get solved. Write to me if you are interested in knowing how event-driven architectures can be of use in a blockchain application.

Also, Azure Functions does not currently support WebSockets directly, hence the need to introduce WebJobs as an intermediate component; this could be avoided if Azure Functions adds WebSockets as one of its triggers.

Friday, September 22, 2017

Implementing JPMC Quorum Using Azure Blockchain As A Service

One Size Does Not Fit All :   As we position blockchain-enabled, decentralized and distributed applications for the enterprise, different enterprises may have different needs. That is the reason permissionless public blockchains initially started the transition, and permissioned private blockchains subsequently continued the transformation.

However, even permissioned private blockchains are perceived to have some limitations with respect to enterprise compliance needs. Hence there has been a push to customize the private blockchain further, and this has resulted in the creation of a new blockchain platform, Quorum, by JP Morgan.

Quorum is an Ethereum-based distributed ledger protocol with transaction/contract privacy and new consensus mechanisms.

What makes Quorum different from standard Ethereum is the concept of private transactions. Quorum supports transaction-level privacy and network-wide transparency, customizable to business requirements.

Quorum Implementation Using Azure Blockchain As A Service:   Azure BaaS provides an easy-to-use template for a reference implementation of Quorum on Azure. The template only asks for minimal parameters:
  • VM size
  • Storage Account
  • Public Key  for  authentication
  • Network Related Information
Once the node is created, we can SSH into it and start the node operations.

Two script files, which need to be run in sudo mode, are required to start the Quorum blockchain.

This setup simulates 7 Quorum nodes on a single host, with a list of configuration files tm1.conf ... tm7.conf used as the configuration for the nodes.

For example, the following command is used to start logical node 1.

PRIVATE_CONFIG=tm1.conf nohup geth --datadir qdata/dd1 $GLOBAL_ARGS --rpcport 22000 --port 21000 --unlock 0 --password passwords.txt 2

Similarly, all 7 nodes are running at the end of execution.

The examples work well with the IPC interface of Quorum. For example, the command below executes a script against node 1.

 sudo PRIVATE_CONFIG=tm1.conf geth --exec 'loadScript("./chk.js")' attach ipc:qdata/dd1/geth.ipc

Private Transactions :   'Private transactions' are those transactions whose payload is visible only to the network participants whose public keys are specified in the privateFor parameter of the transaction. The following is an example of a private transaction.

var simple = simpleContract.new(42, {from:web3.eth.accounts[0], data: simpleCompiled[simpleRoot].code, gas: 300000, privateFor: ["ROAZBWtSacxXQrOe3FGAqJDyJjFePR5ce4TSIzmJ0Bc="]}, function(e, contract) { /* deployment callback */ });

Here privateFor is a Quorum-specific extension to the transaction parameters.

Will private transactions become an accepted standard in blockchain? They may sometimes be viewed as bringing back centralized control and removing transparency from transactions. However, consider a multi-bank consortium: all banks may want to share information such as KYC to prevent issues like money laundering, while any two particular banks can still enter into private transactions of which the other banks are not aware.
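
Since Quorum exposes private transactions by extending the standard eth_sendTransaction parameters with a "privateFor" list of recipient public keys, the two-bank case above can be pictured as a JSON-RPC request. The sketch below only builds the request body; the addresses are placeholder examples.

```python
def private_tx_request(from_addr, to_addr, data, private_for, request_id=1):
    """JSON-RPC body for a Quorum private transaction (privateFor extension)."""
    return {
        'jsonrpc': '2.0',
        'method': 'eth_sendTransaction',
        'params': [{'from': from_addr, 'to': to_addr,
                    'data': data, 'privateFor': private_for}],
        'id': request_id,
    }

body = private_tx_request(
    '0xed9d02e382b34818e88b88a309c7fe71e65f419d',  # example sender address
    '0x1932c48b2bf8102ba33b4a6b545c32236e342f34',  # example recipient address
    '0x0',
    ['ROAZBWtSacxXQrOe3FGAqJDyJjFePR5ce4TSIzmJ0Bc='])
```

Only the nodes whose public keys appear in privateFor (plus the sender) can read the payload; the rest of the network sees just a hash.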

Thursday, September 21, 2017

Implementing WorkFlow Applications Using BigchainDB

Blockchain Enterprise Applications :   As mentioned in an earlier article, thinking the blockchain way will transform how traditional enterprise applications have been built so far. As mentioned earlier, BigchainDB is a scalable blockchain database: a NoSQL big-data database with blockchain characteristics. This database is also good for asset registrations and transfers; in fact, viewing database transactions in terms of assets is one of its fundamental attributes.

Thinking about the database, the very nature of asset registration and subsequent transfer between multiple owners can be viewed from the angle of the traditional workflow applications prevalent in enterprises today.

Consider a loan application for a new car. The workflow could be:

1. The end user sends the request for a loan, say through a dealer
2. The application clerk (Alice) validates the data, performs pre-checks and creates the initial application in the system through a user interface
3. The loan application is then forwarded (transferred) to the next level, where the credit and document validations are done, by Bob
4. Finally the loan is passed on for disbursement of the final amount, by Mary

This may be a simplistic loan approval process, and in reality there could be multiple steps in between, but this particular scenario can easily be modelled in the BigchainDB database, and the whole process can be viewed from a different angle.

Now, considering that each of these individuals belongs to a different organization or group, a consortium blockchain database could be an ideal solution.

1. In the steps below, the initial asset, which is basically a loan request, is created by Alice.

request_asset = {
    'data': {
        'request': {
            'name': 'Allen Anderson',
            'requesttype': 'car loan',
            'carmake': 'Toyota',
            'address': '284 N Cross St XXXX ',
        }
    }
}

request_asset_metadata = {
    'comments': 'Request Created In The System'
}

# alice_public / alice_private are Alice's BigchainDB keypair
prepared_creation_tx = bdb.transactions.prepare(
    operation='CREATE', signers=alice_public,
    asset=request_asset, metadata=request_asset_metadata)

fulfilled_creation_tx = bdb.transactions.fulfill(
    prepared_creation_tx, private_keys=alice_private)

sent_creation_tx = bdb.transactions.send(fulfilled_creation_tx)

2. In the second step, Alice performs the pre-screening and transfers the asset, i.e. the request, to Bob. Even though, due to its immutable nature, the asset/request itself cannot be modified, the metadata changes to reflect the current status.

transfer_asset_metadata = {
    'comments': 'Pre Screening Done and Ready For Credit Check'}
prepared_transfer_tx = bdb.transactions.prepare(
    operation='TRANSFER', asset=transfer_asset, inputs=transfer_input,
    recipients=bob_public, metadata=transfer_asset_metadata)

3. Finally, Bob performs the credit check and documentation check and transfers the asset to Mary.

transfer_asset_metadata = {
    'comments': 'Credit Check Done, Documentation Check Done, Approved For Disbursement'}
prepared_transfer_tx = bdb.transactions.prepare(
    operation='TRANSFER', asset=transfer_asset, inputs=transfer_input,
    recipients=mary_public, metadata=transfer_asset_metadata)

At each step, the asset ownership changes, reflecting the steps in the workflow. The asset metadata can also be used effectively to get the details of the current approval status.

There are also many query constructs available to perform RDBMS-like searches on the whole data set for further reporting.
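
Because the CREATE/TRANSFER chain is immutable and ordered, the current workflow state is simply the latest transaction. The sketch below reduces a simplified, made-up list of transactions (not the full BigchainDB transaction schema) to the current owner and approval status.

```python
# Ordered oldest -> newest; shapes simplified for clarity.
transactions = [
    {'operation': 'CREATE',   'owner': 'alice_public',
     'metadata': {'comments': 'Request Created In The System'}},
    {'operation': 'TRANSFER', 'owner': 'bob_public',
     'metadata': {'comments': 'Pre Screening Done and Ready For Credit Check'}},
    {'operation': 'TRANSFER', 'owner': 'mary_public',
     'metadata': {'comments': 'Approved For Disbursement'}},
]

def current_state(tx_chain):
    """The last transaction holds the current owner and approval status."""
    latest = tx_chain[-1]
    return latest['owner'], latest['metadata']['comments']

owner, status = current_state(transactions)
# owner -> 'mary_public'; status -> 'Approved For Disbursement'
```

The earlier transactions are never lost, so the same chain doubles as a complete, tamper-proof audit trail of the workflow.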

While the above scenario can also be implemented using relational databases, thinking the blockchain and distributed way of building enterprise applications will lead to new possibilities.

Let me know of other use cases that fit these scenarios.

Monday, September 18, 2017

Why Blockchain (DApps) Will Transform Application Development

Traditional Transaction Processing :   Over the past decades, enterprises have used high-performance OLTP applications to meet their business transaction needs. If we really look into what a typical application consists of, it is all about executing a business transaction (a sales order, invoice, purchase order, employment, shipment, audit report) between two parties, with a multi-step workflow supporting the execution of that contract.

A Blockchain-based application, known as a DApp or Decentralized Application, is about creating and managing Smart Contracts on top of a blockchain-based database.

Data Integrity, Data Latency, Data Availability and Data Security have always been typical challenges for enterprise business applications, such that costly techniques like ETL, Role Based Security and Data Encryption have all been applied on top of existing applications to meet these demands.

However, the Blockchain platform and its underlying distributed databases provide these attributes automatically, and reimagining enterprise applications from the point of view of Blockchain-based Smart Contract DApps may provide many hidden benefits for enterprises.

In other words, the innovations that are automatic by-products of the Blockchain ecosystem will help enterprises in other areas as well, so enterprises benefit most from blockchain if they think of it as a Digital Platform and not just a Digital Currency.

A few thoughts are shared below on some recent developments in Microsoft Azure services that point in this direction.

Azure Confidential Computing:   Microsoft recently released a new security architecture/platform known as Confidential Computing. This platform ensures that the data owned by an application is protected inside a Trusted Execution Environment (TEE). A TEE ensures that the data is viewable only from the context of the application, and not even by the administrators of the underlying bare metal physical hardware. Microsoft's initial plans enable TEEs in both hardware-based and software-based infrastructure, namely:

  • Virtual Secure Mode (VSM) TEE that’s implemented by Hyper-V in Windows 10 and Windows Server 2016.
  • Intel SGX TEE capable servers in the public cloud
Microsoft Enterprise Blockchain Framework Coco :    There is another recent announcement from Microsoft about the Coco framework, which mitigates the limitations of permissionless public blockchain protocols and brings more enterprise-friendly characteristics, like high throughput, low latency and reduced energy usage.

The Coco framework uses exactly the TEE that forms the backbone of Azure Confidential Computing to create the network of nodes that forms the Enterprise Blockchain Consortium. Coco uses multiple layers of components that ultimately execute on a TEE. The following picture, courtesy of the Microsoft Coco Framework whitepaper, explains how a TEE provides a trusted blockchain environment, irrespective of the blockchain protocol.

Further information about the Coco Framework as well as Confidential Computing can be obtained from Microsoft documentation and blogs.

This opens opportunities for developing financial and healthcare applications using the Coco framework, which implicitly provides innovations in security architecture in the form of Confidential Computing. As mentioned in the earlier part of this blog, reimagining existing enterprise applications using Blockchain and Smart Contracts executed in a TEE can bring new possibilities for enterprises. This may ultimately lead to a transformation of application development.

Sunday, September 17, 2017

BigchainDB Automobile Use Cases - SOC Tracking

Blockchain Databases :   As Smart Contracts, the primary use case of the blockchain platform, emerge as a key interest for enterprises, one of the major concerns is the performance of blockchain as a data store from an enterprise perspective. There are a few attempts to inject RDBMS-like transaction processing capabilities and Big Data-like horizontal scaling into the blockchain data store, while maintaining the core tenets of blockchain such as immutability and peer to peer replication.

BigchainDB is one such database that merges traditional database characteristics with blockchain. Though the vendors for now claim that the product is nearing a production class implementation, there is already a reference implementation using Azure Kubernetes Container As A Service that provides an enterprise grade architecture. The product works on top of MongoDB, inheriting all the scalability of the underlying database.

One important aspect of BigchainDB is that it is fundamentally geared towards the concept of Assets and, subsequently, managing the life cycle of an Asset through its operations. A CREATE transaction can be used to register any kind of asset, along with arbitrary metadata. A TRANSFER transaction is used to transfer the asset from one owner to another, while the reference to the base asset is maintained. An asset can also have metadata, which is useful for tagging and searching for the asset.
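As a minimal sketch of these semantics, the in-memory model below shows the key property: a CREATE transaction registers new data, while every TRANSFER carries only a reference back to the original asset. This is an illustrative model, not the driver's actual signed-transaction format, and the dict layout is an assumption.

```python
import uuid

# Illustrative in-memory model of CREATE/TRANSFER semantics; the real
# BigchainDB driver builds cryptographically signed transactions instead.
def create_asset(owner, data, metadata=None):
    return {'id': uuid.uuid4().hex, 'operation': 'CREATE',
            'asset': {'data': data}, 'owner': owner, 'metadata': metadata}

def transfer_asset(prev_tx, new_owner, metadata=None):
    # A TRANSFER never copies the asset data; it only references the id of
    # the original CREATE transaction, however long the chain of transfers.
    asset_id = (prev_tx['id'] if prev_tx['operation'] == 'CREATE'
                else prev_tx['asset']['id'])
    return {'id': uuid.uuid4().hex, 'operation': 'TRANSFER',
            'asset': {'id': asset_id}, 'owner': new_owner, 'metadata': metadata}

create_tx = create_asset('oem', {'car': {'make': 'my make'}})
transfer_tx = transfer_asset(create_tx, 'dealer', {'comments': 'sold'})
```

Note how the asset id stays constant across transfers, which is exactly what allows the full history of an asset to be traced back to its creation.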

Managing Substances Of Concern In Automobiles:  As you may know, a typical automobile Original Equipment product like a car is made up of hundreds or even thousands of parts and components. Each of these components and sub components is manufactured by various suppliers, and these sub components could potentially be manufactured using SOCs (Substances of Concern), which are typically dangerous chemicals and contents like Lead or Nickel. Manufacturing a car involves multiple suppliers who may produce these components in different countries, and the supply chain is sometimes global in scale, such that parts are manufactured in one country but used in another. Also, once a car comes out of the OEM manufacturing facility, its ownership may be transferred between multiple owners before it reaches ELV (End of Life of Vehicles). Chemicals legislation, such as REACH, puts significant responsibility on the communication, notification and phase-out of substances of concern (SOCs) throughout the complete supply chain.

The OEM is typically responsible for keeping track of the SOC usage in a car and needs to report on it, but if this is not maintained through a centralized record, the information may get lost once the vehicle changes hands, or a part manufacturer may not have reported all the information; either way, this may result in a breach of compliance requirements.

Blockchain To The Rescue :    Considering that SOC tracking requires a single version of truth common to all parties involved, one that persists through the life cycle of the Asset, a platform like Blockchain, which provides a distributed database across all the stakeholders while maintaining the integrity of the data, is a natural fit for handling these issues.

We already see recent announcements of blockchain usage in the retail supply chain for tracking the quality of food items like farm products. Recently, major retailers and suppliers like Dole, Driscoll’s, Golden State Foods, Kroger, McCormick and Company, McLane Company, Nestlé, Tyson Foods, Unilever and Walmart collaborated using IBM blockchain to track food safety details. In the same way, a blockchain network can be used to track SOC usage in base components as well as assembled components, and the Asset can be tracked throughout its lifetime, including transfers between owners.

BigchainDB Creation of Asset :    The following is a subsection of the code that can be used with the BigchainDB python driver to create the initial asset, which will be done by the OEM after the product is manufactured. Here the OEM creates the Asset with its ID; the Asset representation is just for illustrative purposes and contains just a few fields to identify the vehicle and its SOC components.

from bigchaindb_driver import BigchainDB
from bigchaindb_driver.crypto import generate_keypair

bdb_root_url = 'http://localhost:9984'
bdb = BigchainDB(bdb_root_url)
oem = generate_keypair()  # the OEM owns and signs the CREATE transaction
car_asset = {
            'data': {
                        'car': {
                            'make': 'my make',
                            'model': 'my model',
                            # 'soc_components' is an illustrative key grouping the SOC entries
                            'soc_components': [
                                { 'name':'soc component 1', 'manufacturer':'vendor 1'},
                                { 'name':'soc component 2', 'manufacturer':'vendor 2'},
                                { 'name':'soc component 3', 'manufacturer':'vendor 3'}
                            ]
                        }
            }
}
car_asset_metadata = {
            'plant': 'USA'
}
prepared_creation_tx = bdb.transactions.prepare(
            operation='CREATE',
            signers=oem.public_key,
            asset=car_asset,
            metadata=car_asset_metadata)
fulfilled_creation_tx = bdb.transactions.fulfill(
            prepared_creation_tx,
            private_keys=oem.private_key)
sent_creation_tx = bdb.transactions.send(fulfilled_creation_tx)

Once the transaction is submitted to BigchainDB, the asset is tagged with the OEM as the owner. Subsequently, during the life cycle of the car, it may move between multiple parties, and all these transactions can refer to the original asset, so that the substance of concern information is known to everyone. Finally, when the car reaches its End of Life, appropriate action can be taken based on SOC handling procedures.
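The chain of transfers can then be read back to answer compliance questions. The sketch below walks an asset's ordered transaction history to recover the current owner and the SOC list; the record layout is assumed for illustration and is not the driver's exact response format.

```python
# Assumed, simplified shape of one asset's ordered transaction history;
# the real API returns full signed transactions.
history = [
    {'operation': 'CREATE', 'owner': 'OEM',
     'asset': {'data': {'car': {'soc_components': [
         {'name': 'soc component 1', 'manufacturer': 'vendor 1'},
         {'name': 'soc component 2', 'manufacturer': 'vendor 2'}]}}}},
    {'operation': 'TRANSFER', 'owner': 'dealer',
     'metadata': {'comments': 'sold'}},
    {'operation': 'TRANSFER', 'owner': 'first buyer',
     'metadata': {'comments': 'resold'}},
]

def current_owner(history):
    # The last transaction in the chain names the present owner.
    return history[-1]['owner']

def soc_components(history):
    # The SOC list always lives on the first (CREATE) transaction.
    return history[0]['asset']['data']['car']['soc_components']
```

Because every transfer references the original asset, this walk works no matter how many owners the car has passed through.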

The above is a simplistic example of how Asset Life Cycle Management can be improved using Blockchain technologies, and how BigchainDB facilitates use cases in this direction. I am analyzing further possibilities in this direction; let me know of any other use cases that fit.

One current issue is that the asset attributes, which are maintained as JSON, are static from the time of creation; however, during the maintenance of a car a new component may be added and the SOC list could be updated. It looks like there are attempts to make updates to the asset content using an ORM driver, but further details need to be obtained on this.
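One possible interim workaround, shown here as an assumption rather than a documented BigchainDB feature, is to record newly fitted components in the metadata of each TRANSFER and merge them with the immutable CREATE-time list at reporting time:

```python
# Assumed convention: each TRANSFER's metadata may carry an
# 'added_soc_components' list for parts fitted during maintenance.
create_components = [
    {'name': 'soc component 1', 'manufacturer': 'vendor 1'}]
transfer_metadata_history = [
    {'comments': 'sold to dealer'},
    {'comments': 'serviced',
     'added_soc_components': [
         {'name': 'soc component 4', 'manufacturer': 'vendor 4'}]},
]

def effective_soc_list(base, metadata_history):
    """Merge the immutable CREATE-time list with later additions."""
    components = list(base)
    for md in metadata_history:
        components.extend(md.get('added_soc_components', []))
    return components
```

This keeps the asset itself untouched while still giving the OEM a complete, queryable SOC picture across the vehicle's life.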