Thursday, September 28, 2017

Serverless Event Driven Blockchain Applications With BigchainDB And Azure

Serverless Computing is the new design pattern that is emerging for building business applications. It does not mean that there are no servers executing the functionality. Rather, the complexities of managing servers and compute capacity are hidden from the application, and the application concentrates only on its functionality.

Serverless architectures are closely aligned with event-driven computing, where applications are viewed as a set of small functions that execute in response to triggering events. This model is different from traditional procedural and monolithic application architectures.

There is a lot to be read about event-driven and serverless architectures in the context of agile applications that follow the microservices design pattern.

Azure Functions :  Azure Functions is an event-driven compute experience which allows you to execute your code, written in the programming language of your choice, without worrying about servers. Benefit from scale on demand and never pay for idle capacity.

BigchainDB : In my earlier articles we have seen an introduction to BigchainDB as one of the scalable blockchain databases. With BigchainDB we can build decentralized applications whose transactions are immutable. With its current schema model, BigchainDB is a good choice for asset registrations, ownership and transfers. Typically BigchainDB will work in tandem with smart contract blockchain platforms like Ethereum.

One of the important aspects of BigchainDB is its support for WebSockets. BigchainDB provides real-time event streams over the WebSocket protocol with the Event Stream API. Connecting to an event stream from your application enables a BigchainDB node to notify you as events occur.

Given BigchainDB's ability to store assets, and the fact that transactions can be notified to client applications over WebSockets, we can think of many real-life applications that are event driven and use serverless architectures.

The following is the architecture of a serverless, event-driven application using the BigchainDB WebSocket interface.


The following is a brief description of the components involved in this architecture.

BigchainDB WebSocket Event Stream API : BigchainDB provides a stream for all validated transactions. Each stream is meant as a unidirectional communication channel, where the BigchainDB node is the only party sending messages. The following API response shows the endpoints for connecting to the BigchainDB event streams.

{
"assets": "/assets/",
"docs": "https://docs.bigchaindb.com/projects/server/en/v1.0.1/http-client-server-api.html",
"outputs": "/outputs/",
"statuses": "/statuses/",
"streams": "ws://localhost:9985/api/v1/streams/valid_transactions",
"transactions": "/transactions/"
}
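As a sketch, the root-endpoint response shown above can be parsed to discover the event stream URL before a WebSocket connection is opened. The JSON below simply mirrors the sample response; in a real client it would be fetched with an HTTP GET against the node's API root.

```python
import json

# Sample root-endpoint response from a BigchainDB node (as shown above);
# in practice this would come from GET http://<node>:9984/api/v1/
api_root = json.loads("""
{
  "assets": "/assets/",
  "docs": "https://docs.bigchaindb.com/projects/server/en/v1.0.1/http-client-server-api.html",
  "outputs": "/outputs/",
  "statuses": "/statuses/",
  "streams": "ws://localhost:9985/api/v1/streams/valid_transactions",
  "transactions": "/transactions/"
}
""")

# The "streams" entry is the WebSocket endpoint for the Event Stream API
stream_url = api_root["streams"]
print(stream_url)
```

A WebSocket client (such as the WebJob shown later) would then connect to `stream_url` and receive one message per validated transaction.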

Azure Web Jobs : WebJobs is a feature of Azure App Service that enables you to run a program or script in the same context as a web app, API app, or mobile app. WebJobs are typically used for creating and running background jobs, and can be continuous or triggered. A WebJob in continuous mode can play the role of WebSocket client, so that it receives events from BigchainDB and processes them. The following is simple code that listens on an event stream and logs the output.


            using (var ws = new WebSocket("ws://****.westus.cloudapp.azure.com:9985/api/v1/streams/valid_transactions"))
            {
                // Log each event pushed by the BigchainDB node
                ws.OnMessage += (sender, e) =>
                {
                    Console.WriteLine("Event Data: " + e.Data);
                };

                ws.Connect();

                // A continuous WebJob keeps the process alive so events keep arriving
            }

Azure Functions : As mentioned earlier, Azure Functions is the event-driven and serverless component of the architecture. Use Azure Functions to run a script or piece of code in response to a variety of events. Azure Functions has multiple kinds of triggers that cause the underlying process to run.

Generic webhook is one of those triggers; it processes webhook HTTP requests from any service that supports webhooks.

Now let us look at the above architecture through the following scenario:

  • BigchainDB stores the car leasing data for a major car leasing company
  • The initial car asset is created with the ownership of the car leasing company
  • Every time the car is leased out, a new TRANSFER transaction is created in BigchainDB
  • The BigchainDB event stream WebSocket notifies the WebJob
  • The WebJob, which is created and owned by an insurer, is thereby notified of car leasing ownership changes
  • The WebJob triggers a serverless API call using Azure Functions
  • The data gets stored at its destination.
The insurance company, on getting notified of a change in driver, can adjust the insurance premium according to the driver's past history.
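The hand-off from the WebJob to the serverless API in the scenario above is just an HTTP POST to the function's webhook endpoint. The sketch below (in Python, for illustration; the function URL, key, and event fields are all hypothetical) builds that request using only the standard library:

```python
import json
import urllib.request

# Hypothetical Azure Functions webhook endpoint, including its function key;
# the URL shape is illustrative, not a real deployment.
FUNCTION_URL = "https://myfunctionapp.azurewebsites.net/api/OnAssetTransfer?code=<function-key>"

def forward_event(event_data: dict) -> urllib.request.Request:
    """Build the HTTP POST the WebJob would send to the Azure Function
    for each TRANSFER event received over the BigchainDB WebSocket."""
    body = json.dumps(event_data).encode("utf-8")
    return urllib.request.Request(
        FUNCTION_URL,
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    # In the running WebJob, urllib.request.urlopen(req) would deliver it.

# Illustrative event payload of the kind delivered on the stream
event = {"transaction_id": "abc123", "asset_id": "abc123"}
request = forward_event(event)
print(request.get_method(), request.full_url)
```

The Azure Function's generic webhook trigger then receives this JSON body and stores the data at its destination.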

This is a simplistic example of an architecture that uses event-driven and serverless patterns from the perspective of BigchainDB and Azure.

Currently I face a few errors with the WebSocket implementation, but they will get solved. Write to me if you are interested in knowing how event-driven architectures will be of use in a blockchain application.

Also, Azure Functions currently does not support WebSockets directly, and hence we need to introduce WebJobs as an intermediate component; this could be avoided if Azure Functions supported WebSockets as one of its triggers.

Friday, September 22, 2017

Implementing JPMC Quorum Using Azure Blockchain As A Service

One Size Does Not Fit All : As we position blockchain-enabled, decentralized and distributed applications for the enterprise, different enterprises can have different needs. That is the reason the permissionless public blockchain initially started the transition, and the permissioned private blockchain subsequently continued the transformation.

However, even the permissioned private blockchain is perceived to have some limitations with respect to enterprise compliance needs, and hence there is a thought process to customize the private blockchain further. This has resulted in the creation of a new blockchain platform, Quorum, by JP Morgan.

Quorum is an Ethereum-based distributed ledger protocol with transaction/contract privacy and new consensus mechanisms.

What makes Quorum different from standard Ethereum is the concept of private transactions. Quorum supports transaction-level privacy and network-wide transparency, customizable to business requirements.

Quorum Implementation Using Azure Blockchain As A Service: Azure BaaS provides an easy-to-use template for a reference implementation of Quorum on Azure. The template only asks for minimal parameters:
  • VM size
  • Storage Account
  • Public Key  for  authentication
  • Network Related Information
Once  the  node  is created,  we  can SSH into the  node  and  start  the node operations.

Two script files (init.sh, start.sh), which need to be run in sudo mode, are required to start the Quorum blockchain.

This setup simulates 7 Quorum nodes on a single host, and a list of configuration files tm1.conf ... tm7.conf is used as the configuration for the nodes.

For example, the following command is used to start logical node 1.

PRIVATE_CONFIG=tm1.conf nohup geth --datadir qdata/dd1 $GLOBAL_ARGS --rpcport 22000 --port 21000 --unlock 0 --password passwords.txt 2

Similarly, all 7 nodes are started by the end of the start.sh execution.

The examples work well with the IPC interface of Quorum. For example, the below command executes a script against node 1.

 sudo PRIVATE_CONFIG=tm1.conf geth --exec 'loadScript("./chk.js")' attach ipc:qdata/dd1/geth.ipc

Private Transactions : 'Private Transactions' are those transactions whose payload is only visible to the network participants whose public keys are specified in the privateFor parameter of the transaction. The following is an example of a private transaction.

var simple = simpleContract.new(42, {from:web3.eth.accounts[0], data: simpleCompiled[simpleRoot].code, gas: 300000, privateFor: ["ROAZBWtSacxXQrOe3FGAqJDyJjFePR5ce4TSIzmJ0Bc="]});

Here privateFor is a customization of the transaction specific to Quorum.
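The same privateFor field travels in the JSON-RPC request to the Quorum node. As a sketch (addresses, call data, and the RPC port are placeholders; only the public key is taken from the example above), the payload a client would POST to a node's RPC endpoint looks like this:

```python
import json

# Sketch of the JSON-RPC payload a client could send to a Quorum node's
# RPC endpoint (e.g. http://localhost:22000, as started by start.sh)
# to submit a private transaction. "from" and "data" are placeholders.
payload = {
    "jsonrpc": "2.0",
    "method": "eth_sendTransaction",
    "params": [{
        "from": "0x0000000000000000000000000000000000000001",
        "data": "0x...",  # compiled contract bytecode or call data
        "gas": hex(300000),
        # Only parties holding these public keys can read the payload
        "privateFor": ["ROAZBWtSacxXQrOe3FGAqJDyJjFePR5ce4TSIzmJ0Bc="],
    }],
    "id": 1,
}
body = json.dumps(payload)
print(body)
```

Everyone else on the network sees that a transaction happened, but not its payload, which is the transaction-level privacy described above.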

Will private transactions be an acceptable standard in blockchain? At times they may be viewed as bringing centralized control, and not bringing transparency to the transactions. However, consider a multi-bank consortium: all banks may want to share information like KYC to prevent issues like money laundering, while any particular two banks can still get into private transactions of which the other banks may not be aware.
 



Thursday, September 21, 2017

Implementing WorkFlow Applications Using BigchainDB

Blockchain Enterprise Applications : As mentioned in an earlier article, thinking the blockchain way will transform the way traditional enterprise applications have been built so far. As mentioned earlier, BigchainDB is a scalable blockchain database. It is a NoSQL big database with blockchain characteristics. This database is also good for asset registers and transfers. In fact, viewing database transactions in terms of assets is one of the fundamental attributes of this database.

Thinking about the database, the very nature of asset registration and subsequent transfer between multiple owners can be viewed from the angle of the traditional workflow applications that are prevalent in enterprises today.

Consider a loan application for a new car. The workflow could be:

1. The end user sends the request for a loan, say through a dealer
2. The application clerk (Alice) validates the data, performs pre-checks and creates the initial application in the system through a user interface
3. The loan application is then forwarded (transferred) to the next level, where the credit and document validations are done. Done by Bob.
4. Finally the loan is passed to disbursement of the final amount. Done by Mary.

This may be a simplistic loan approval process, and in reality there could be multiple steps in between, but this particular scenario can easily be modelled in the BigchainDB database and the whole process can be viewed from a different angle.

Now, considering that each of these individuals belongs to a different organization or group, a consortium blockchain database could be an ideal solution.

1. In the steps below, the initial asset, which is basically a loan request, is created by Alice.

request_asset = {
            'data': {
                        'request': {
                            'name':'Allen Anderson',
                            'requesttype':'car loan',
                            'carmake': 'Toyota',
                            'currency':'USD',
                            'dealerlocation':'chicago',
                            'loanamt':'15000',
                            'repaymentmonths':36,
                            'yearlyincome':60000,
                            'address':'284 N Cross St XXXX ',
                            'otherloans':'none'
                            },
                        },
            }

request_asset_metadata = {
            'comments': ' Request Created In The System'
            }

prepared_creation_tx = bdb.transactions.prepare(
    operation='CREATE',
    signers=alice_public,
    asset=request_asset,
    metadata=request_asset_metadata
)

fulfilled_creation_tx = bdb.transactions.fulfill(
    prepared_creation_tx,
    private_keys=alice_private
)

sent_creation_tx = bdb.transactions.send(fulfilled_creation_tx)

2. In the second step, Alice performs the pre-screening and transfers the asset, i.e. the request, to Bob. Even though, due to its immutable nature, the asset/request itself cannot be modified, it is the metadata that gets changed to reflect the current status.

transfer_input={
           'fulfillment':output['condition']['details'],
           'fulfills':{
               'output_index':output_index,
               'transaction_id':requestassetid,
            },
           'owners_before':output['public_keys'],
        }
transfer_asset={
           'id':requestassetid
        }
transfer_asset_metadata = {
            'comments': 'Pre Screening Done and  Ready For Credit Check'
            }
prepared_transfer=bdb.transactions.prepare(
            operation='TRANSFER',
            asset=transfer_asset,
            metadata=transfer_asset_metadata,
            inputs=transfer_input,
            recipients= bob_public
        )
fulfilled_transfer=bdb.transactions.fulfill(
        prepared_transfer,private_keys=alice_private
        )
transfertx=bdb.transactions.send(fulfilled_transfer)


3. Finally, Bob performs the credit check and documentation check and transfers the asset to Mary.

transfer_input={
           'fulfillment':output['condition']['details'],
           'fulfills':{
               'output_index':output_index,
               'transaction_id':prevtransid,
            },
           'owners_before':output['public_keys'],
        }
transfer_asset={
           'id':requestassetid
        }
transfer_asset_metadata = {
            'comments': 'Credit Check  Done, Documentation check Done, Approved For Disbursement'
            }
prepared_transfer=bdb.transactions.prepare(
            operation='TRANSFER',
            asset=transfer_asset,
            metadata=transfer_asset_metadata,
            inputs=transfer_input,
            recipients= mary_public
        )
fulfilled_transfer=bdb.transactions.fulfill(
        prepared_transfer,private_keys=bob_private
        )
transfertx=bdb.transactions.send(fulfilled_transfer)


In each step, the asset ownership changes, reflecting the steps in the workflow. Also, the asset metadata can be effectively used to get the details about the current approval status.

Also, there are many query constructs available to perform RDBMS-like search on the whole data set and do further reporting on it.
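For instance, BigchainDB's HTTP API exposes a text search over asset data, which could back such reporting. As a minimal sketch (the node address is assumed; the request itself would be issued with requests.get or the Python driver's bdb.assets.get):

```python
from urllib.parse import urlencode

# Assumed local BigchainDB node, as in the earlier examples
BDB_ROOT = "http://localhost:9984/api/v1"

def asset_search_url(query: str) -> str:
    """URL for a text search across registered assets (GET /assets/?search=...),
    e.g. to find all loan requests mentioning 'car loan'."""
    return "{}/assets/?{}".format(BDB_ROOT, urlencode({"search": query}))

url = asset_search_url("car loan")
print(url)
```

The response is a list of matching assets with their ids, which can then feed further reporting.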

While the above scenario can also be done using other relational databases, thinking the blockchain and distributed way of building enterprise applications will lead to new possibilities.

Let  me know  of  other use  cases  that  will   fit  these  scenarios.

Monday, September 18, 2017

Why Blockchain (DApps) Will Transform Application Development

Traditional Transaction Processing : Over the past decades enterprises have used high-performance OLTP applications to meet their business transaction needs. If we really look into what a typical application consists of, they are all about executing a business transaction (like a sales order, invoice, purchase order, employment, shipment, audit report) between two parties, with multiple workflow steps supporting the execution of that contract.

Blockchain-based applications, which are known as DApps or decentralized applications, are about creating and managing smart contracts on top of a blockchain-based database.

Also, data integrity, data latency, data availability and data security have been some of the typical challenges of enterprise business applications all along, such that costly techniques like ETL, role-based security and data encryption have all been applied on top of existing applications to meet these demands.

However, the blockchain platform and the underlying distributed databases automatically provide these attributes, and reimagining enterprise applications from the point of view of blockchain-based smart contract DApps may provide a lot of hidden benefits for enterprises.

In other words, the innovations that are an automatic by-product of the blockchain ecosystem will help enterprises in other areas also, so enterprises are best benefited by blockchain if they think of it as a digital platform and not just a digital currency.

A few thoughts are shared on some recent developments from Microsoft Azure services that point in this direction.

Azure Confidential Computing: Recently Microsoft released a new security architecture/platform/pattern known as Confidential Computing. This platform ensures that the data owned by an application is protected inside a Trusted Execution Environment (TEE). A TEE ensures that the data is viewable only from the context of the application and is not even viewable by the administrators of the underlying bare-metal physical hardware. The initial plans from Microsoft are to enable TEE in hardware-based and software-based infrastructure, namely:

  • Virtual Secure Mode (VSM) TEE that’s implemented by Hyper-V in Windows 10 and Windows Server 2016.
  • Intel SGX TEE capable servers in the public cloud
Microsoft Enterprise Blockchain Framework Coco : Recently there was another announcement from Microsoft about the Coco framework, which mitigates the limitations of permissionless public blockchain protocols into more enterprise-friendly characteristics, like high throughput, low latency and reduced energy usage.

The Coco framework uses exactly the TEE that forms the backbone of Azure Confidential Computing to create the network of nodes that form part of an enterprise blockchain consortium. The Coco framework uses multiple layers of components that ultimately execute on a TEE. The following picture, courtesy of the Microsoft Coco Framework whitepaper, explains how the TEE provides a trusted blockchain environment, irrespective of the blockchain protocol.



Further  information  about Coco Framework  as  well  as  Confidential computing  can be obtained from Microsoft Documentation and  Blogs.

This gives opportunities for developing financial and healthcare applications using the Coco framework, which implicitly provides innovations in security architecture in the form of Confidential Computing. As mentioned in the earlier part of the blog, reimagining existing enterprise applications using blockchain and smart contracts executed in a TEE can bring new possibilities for enterprises. This may ultimately lead to a transformation of application development.

Sunday, September 17, 2017

BigchainDB Automobile Use Cases - SOC Tracking

Blockchain Databases : As smart contracts, which are the primary use case of blockchain platforms, emerge as a key interest for enterprises, one of the major concerns is the performance of blockchain as a data store from the enterprise perspective. There are a few attempts to inject RDBMS-like transaction processing capabilities and big-data-like horizontal scaling into the blockchain data store while maintaining the core tenets of blockchain like immutability, peer-to-peer replication etc.

BigchainDB is one such database that merges traditional database characteristics with blockchain. Though the vendors for now claim that the product is nearing a production-class implementation, there is already a reference implementation using Azure Kubernetes Container as a Service that provides enterprise-grade architecture. This product works on top of MongoDB, which gives it all the scalability of the underlying database.

One important aspect of BigchainDB is that it is fundamentally geared towards the concept of assets, and subsequently managing the life cycle of an asset through its operations. A CREATE transaction can be used to register any kind of asset, along with arbitrary metadata. A TRANSFER transaction is used to transfer the asset from one owner to another, while the reference to the base asset is maintained. An asset can also have metadata, which is useful in tagging and searching for the asset.

Managing Substances Of Concern In Automobiles: As you may know, a typical automobile original equipment, like a car, is made up of hundreds or even thousands of parts and components. Each of these components and sub-components is manufactured by various suppliers. These sub-components could potentially be manufactured using SOCs (Substances of Concern), which are typically dangerous chemicals and contents like lead or nickel. Also, the manufacturing of a car involves multiple suppliers, who may produce these components in different countries. The supply chain of a car involves multiple parties and is sometimes global in scale, such that parts are manufactured in one country but may be used in another. Also, once a car comes out of the OEM manufacturing facility, its ownership may be transferred between multiple owners before it reaches ELV (End of Life of Vehicles). Chemicals legislation, such as REACH, puts significant responsibility on the communication, notification and phase-out of substances of concern (SOCs) throughout the complete supply chain.

Assume that the OEM is typically responsible for keeping track of the SOC usage in a car (automobile) and needs to report on it. If this is not maintained through a centralized record, the information may get lost once the vehicle is stored, or if one of the part manufacturers has not reported all the information; this may result in a breach of compliance requirements.

Blockchain To The Rescue : Considering the fact that SOC tracking requires a single version of truth common to all parties involved, and should also persist through the life cycle of the asset, a platform like blockchain, which provides a distributed database across all the stakeholders while maintaining the integrity of data, naturally fits as the best solution for handling these issues.

We already see recent announcements of blockchain usage in the retail supply chain for tracking the quality of food items like farm products. Recently, major retailers and suppliers like Dole, Driscoll’s, Golden State Foods, Kroger, McCormick and Company, McLane Company, Nestlé, Tyson Foods, Unilever and Walmart collaborated using IBM blockchain for tracking food safety details. In the same way, the blockchain network can be used to track the SOC usage in the base components as well as assembled components, and the asset can be tracked throughout its lifetime, including transfers between owners.

BigchainDB Creation of Asset : The following is a sub-section of the code which can be used with the BigchainDB Python driver to create the initial asset, which will be done by the OEM after the product is manufactured. Here the OEM creates the asset with its ID; the asset representation is just for illustrative purposes and contains just a few fields to identify the vehicle and its SOC components.

oem_public='***'
oem_private='***'
bdb_root_url = 'http://localhost:9984'
bdb = BigchainDB(bdb_root_url)
car_asset = {
            'data': {
                        'car': {
                            'make':'my make',
                            'model': 'my model',
                            'vin':'VIN0000001',
                            'type':'petrol',
                            'transmission':'automatic',
                            'cylinders':6,
                            'socinfo':[
                                { 'name':'soc component 1', 'manufacturer':'vendor 1'},
                                { 'name':'soc component 2', 'manufacturer':'vendor 2'},
                                { 'name':'soc component 3', 'manufacturer':'vendor 3'}
                            ]
                            },
                        },
            }
car_asset_metadata = {
            'plant': 'USA'
            }
prepared_creation_tx = bdb.transactions.prepare(
    operation='CREATE',
    signers=oem_public,
    asset=car_asset,
    metadata=car_asset_metadata
)
fulfilled_creation_tx = bdb.transactions.fulfill(
    prepared_creation_tx,
    private_keys=oem_private
)
sent_creation_tx = bdb.transactions.send(fulfilled_creation_tx)

Once the transaction is submitted to BigchainDB, it will be tagged with the OEM as the owner. Subsequently, during the life cycle of the car, it may move between multiple parties, and all these transactions can refer to the original asset, such that the substance-of-concern info will be known to everyone. Finally, when the car reaches end of life, appropriate action can be taken based on SOC handling procedures.
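The full chain of ownership can be reconstructed by listing every transaction that references the original CREATE transaction id. As a sketch (the node address is assumed and the asset id is a hypothetical placeholder), the HTTP API call looks like this:

```python
from urllib.parse import urlencode

# Assumed local BigchainDB node, as in the code above
BDB_ROOT = "http://localhost:9984/api/v1"

def ownership_history_url(asset_id: str) -> str:
    """GET /transactions?asset_id=... lists the CREATE and every TRANSFER
    for the asset, i.e. the audit trail of owners up to ELV."""
    return "{}/transactions?{}".format(BDB_ROOT, urlencode({"asset_id": asset_id}))

url = ownership_history_url("aa1f...")  # hypothetical CREATE transaction id
print(url)
```

Walking the returned transactions, plus the SOC list in the original asset, gives the complete compliance picture for the vehicle.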

The above is a simplistic example of how asset life cycle management can be improved by using blockchain technologies, and how BigchainDB facilitates use cases in this direction. I am analyzing further possibilities in this direction; let me know of any other use cases that can fit this.

One current issue is that the asset attributes, which are maintained as JSON, are static from the time of creation; however, during the maintenance of a car a new component may be added and the SOC list could be updated. It looks like there are attempts to make updates to the asset content using an ORM driver, but further details need to be obtained on this.



Friday, September 15, 2017

Smart Contract Management System


Smart Contracts : We all know that blockchain technology is the underlying platform for the emergence of Bitcoin and cryptocurrencies. Its secured distributed database and associated philosophies like immutability make blockchain a growing technology with increased adoption by various verticals, especially finance and government.

However, the industry is finding the biggest use of blockchain technologies in the form of smart contracts. Smart contracts are software-defined versions of existing paper-based contracts, in the sense that they are basically sets of compiled code that run inside a blockchain network. By inheriting the qualities of blockchain, like a distributed ledger, cryptographic signing, immutability etc., smart contracts enable enterprises to get into trust-based legal agreements which reduce the cycle time of business operations and reduce the cost of executing them.

Consortium Blockchain Network: As is evident, blockchain technologies and smart contracts facilitate the execution of business agreements between two business entities; however, in a typical business use case there is a set of players involved as stakeholders in that transaction. Hence it makes sense that a blockchain network is not just built with 2 members, but with a set of like-minded organizations who are the stakeholders in that particular transaction. Some examples are:

  • A set of financial institutions manage common contracts like customers' KYC details, or manage inter-bank transfers to prevent activities like money laundering.
  • Health care providers, insurance agencies and government welfare bodies share patient treatment information to prevent fraud and to improve the quality of health care.
  • Manufacturers, logistics providers and part OEMs share supply-chain-related smart contracts for the best utilization of inventory management.

In its simplest form, a consortium network is also a private blockchain network; however, considering the involvement of selected stakeholders, it is correct to use the term consortium blockchain network.

Smart Contract Management Platform: As enterprises adopt blockchain into their landscape, especially with their participation in consortium networks, they require an additional layer of management solution on top of the base blockchain layer. This layer is termed a “Smart Contract Management Platform” (SCMP). An SCMP is defined as an additional layer on top of the underlying blockchain network, which provides business-friendly features for enterprises to pilot, adopt, execute and expand their smart contract initiatives. An SCMP also helps enterprises add additional features and usability aspects to smart contracts which are not available in the base technology itself.

The following are the key features of an SCMP which make it useful in an enterprise context.

One Enterprise Part Of Many Private Consortiums: Since a consortium blockchain network consists of interested stakeholders, it is quite possible that one enterprise will be part of multiple private consortium networks. For example, a health care provider may be part of a healthcare consortium to share patient treatment and insurance claim information, and at the same time the same enterprise can be part of a vendor network that supplies hospital equipment, for better transparency. There is no need for these two sets of information to exist in the same blockchain network. However, an SCMP can help the health care provider manage both of these networks seamlessly, so that business users can submit, view and modify transactions on any of the networks that the organization is part of.

Multiple Blockchain Platforms In Place: Ethereum, Hyperledger, Monax and Ripple, to name a few, are currently available as blockchain platforms that implement smart contracts, and this list can grow. There is no uniformity in the application interface and implementation of these platforms. Considering the point above about one enterprise being part of many private consortiums, this can be further complicated by the fact that each of these consortiums can be implemented using a different platform. An SCMP can help enterprises adopt and use multiple different platforms.

Smart Contracts Are Inherently Complicated For Business Stakeholders : The primary stakeholders of smart contracts are business people like CFOs, purchase managers, government regulators, auditors and more; however, smart contracts in themselves are basically software code. Even in popular platforms like Ethereum, implementing a smart contract involves bytecode, the ABI, compilation and deployment to the network. Also, to get a handle to an existing contract, businesses need to get to its hexadecimal address. This means that businesses cannot derive value from and understand the usage unless it is simplified. An SCMP abstracts the technical complexities of smart contracts and provides a simple and easy-to-use interface for businesses to adopt.

More Security Controls Needed In Smart Contracts, Beyond What Is Supported Today : The programming languages and constructs for creating smart contracts, like Solidity, are evolving, which means that there may be some limitations which businesses cannot afford to leave in their implementation. For example, Ethereum smart contracts have the concept of addresses, which means that we could restrict a smart contract to be created only by certain sender addresses. But how about the need for further organizational controls, like only the CFO within an organization being allowed to create a certain financial contract? An SCMP can help enterprises provide these additional layers of security controls.
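As a minimal sketch of such an organizational control (role names and contract types are purely illustrative), the SCMP could gate contract deployment with a role check before the transaction is ever signed and sent to the chain:

```python
# Sketch of the extra authorization layer an SCMP could add in front of the
# blockchain node: organizational roles, which Ethereum itself does not
# model, decide who may deploy a given class of contract.
ROLE_PERMISSIONS = {
    "CFO": {"financial_contract"},
    "purchase_manager": {"supply_contract"},
}

def can_create(role: str, contract_type: str) -> bool:
    """Return True only if the user's organizational role is allowed to
    deploy this contract type; checked before signing the deployment."""
    return contract_type in ROLE_PERMISSIONS.get(role, set())

print(can_create("CFO", "financial_contract"))               # allowed
print(can_create("purchase_manager", "financial_contract"))  # rejected
```

The address-level checks of the underlying platform then act as a second line of defense behind this organizational rule.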

Off-Chain Metadata Needed For Efficient Usage Of Smart Contracts: As a smart contract is a legal contract, it may not contain meaningful comments or tagging information that would be useful to search for and identify the contract itself. In blockchain terminology, off-chain refers to transactions and events that occur outside the blockchain network. By effectively utilizing the off-chain concept, an SCMP can store important information that is useful for search and other reporting operations on the blockchain. With current implementations, performing search operations directly on blockchain platforms like Ethereum is quite complicated, so metadata-assisted search simplifies the process.
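A toy sketch of such an off-chain metadata store (the addresses and tags are invented): contracts are indexed by their on-chain address, and searching happens entirely against the metadata, never against the chain.

```python
# Hypothetical off-chain metadata store: maps a contract's on-chain
# address to searchable tags and a human-readable description.
metadata = {
    "0xA1": {"tags": {"purchase-order", "2017"}, "desc": "PO payment contract"},
    "0xB2": {"tags": {"lease", "2017"}, "desc": "Equipment lease contract"},
}

def search_by_tag(tag: str):
    """Return the on-chain addresses of contracts carrying the given tag."""
    return sorted(addr for addr, meta in metadata.items() if tag in meta["tags"])

print(search_by_tag("purchase-order"))
print(search_by_tag("2017"))
```

In a real SCMP this dictionary would be a database, but the principle is the same: the chain holds the contract, the off-chain store holds everything needed to find it.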

Security Challenges In Current Blockchain Platforms: As an evolving technology, there are some inherent limitations in these platforms that may prevent successful adoption. For example, geth, one of the client implementations of the Ethereum blockchain, exposes an RPC endpoint that can be exploited by anyone to access the network. Though features like address-level security can prevent certain attacks, the network is not fully protected. An SCMP can help enterprises overcome these limitations with features like IP-based access security and avoidance of the RPC protocol.

Changes To Smart Contracts And Versioning: Due to a current limitation, a smart contract, which is basically a set of code, cannot be changed once it is deployed. In practical scenarios, however, there is always a need for change. While a new version of a smart contract can always be deployed, what about the transactions executed against the old version of the code, and how are those records linked? An SCMP can provide options to upgrade a smart contract from one version to a newer one.
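One way an SCMP could handle this, sketched with invented names and addresses: each upgrade deploys to a new on-chain address, but an off-chain registry links every version under one logical contract name, so records tied to old addresses remain reachable.

```python
# Hypothetical SCMP version registry. The chain only knows the individual
# addresses; the registry preserves the lineage between versions.
class ContractRegistry:
    def __init__(self):
        self._versions = {}  # logical name -> list of (version, address)

    def register(self, name: str, address: str) -> None:
        """Record a new deployment as the next version of the named contract."""
        versions = self._versions.setdefault(name, [])
        versions.append((len(versions) + 1, address))

    def current(self, name: str):
        """Return the (version, address) pair of the latest deployment."""
        return self._versions[name][-1]

    def history(self, name: str):
        """Return all versions, oldest first, for linking old transactions."""
        return list(self._versions[name])

reg = ContractRegistry()
reg.register("po-payment", "0x01")  # initial deployment
reg.register("po-payment", "0x02")  # upgraded contract at a new address
print(reg.current("po-payment"))
print(reg.history("po-payment"))
```

New transactions go to `current()`, while reports and audits walk `history()` to pick up records made under earlier versions.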

Making Smart Contracts Really Smart: At this point, smart contracts have no intelligence beyond what is programmed into them. However, IoT and machine learning can be integrated with smart contracts to make them truly smart. Imagine a purchase order payment contract that is fulfilled at a warehouse as soon as the goods are received. An SCMP facilitates integration of smart contracts with other technologies like IoT and machine learning. Similarly, the self-executing capabilities of smart contracts are currently limited; coupled with machine learning, a smart contract could determine the next action and update itself accordingly.

Dynamic User Interface For Smart Contracts: A private consortium network will keep evolving, which means new contracts will be added and existing contracts may be renewed. Currently there is no option to provide a dynamic user interface for contracts without redeploying the user interface code. An SCMP can attempt to provide a dynamic user interface for smart contracts, so that adding new contracts to the network is seamless.

Infrastructure Operations On The Blockchain Network: While the distributed nature of blockchain relieves enterprises of traditional database management tasks, some tasks still need to be performed. For example, in the Ethereum blockchain, accounts need to be exported and imported to be available on new nodes. Several network operations also need to be performed when adding a new member to the consortium. Azure Blockchain as a Service and similar cloud offerings provide building blocks for automating the infrastructure management of a blockchain platform. An SCMP can provide these automation and infrastructure management capabilities for a smart contract consortium network.

The above is not a complete list of the points that support the need for an SCMP, but it definitely provides a start.



Wednesday, September 13, 2017

Multi Container Applications Using Azure Container Service

Azure Container Service: As the usage of containers increases in the enterprise, one important aspect is production-class clustering of containers and an architecture for hosting containerized applications. In this context, Azure Container Service provides preconfigured templates and built-in components to create, manage and configure clustered virtual machines, ready to run containerized applications.

Azure Container Service supports three popular container orchestration platforms for hosting containerized applications:
  • DC/OS, which is a distributed operating system based on the Apache Mesos distributed systems kernel
  • Docker Swarm, which provides native clustering for Docker
  • Kubernetes, which is a popular open-source, production-grade container orchestration tool
The good thing about Azure Container Service is that it provides a common template for all three major platforms; with the switch of a single parameter, the complete environment can be deployed. The following code fragment shows that aspect. The three parameter values are self-explanatory.

"orchestratorType": {
      "type": "string",
      "defaultValue": "Kubernetes",
      "allowedValues": [
        "Kubernetes",
        "DCOS",
        "Swarm"
      ],
      "metadata": {
        "description": "The type of orchestrator used to manage the applications on the cluster."
      }
    }
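As a rough sketch of what that single-parameter switch means, the following hypothetical helper reproduces the parameter definition above as a Python structure and validates a chosen orchestrator against its allowedValues, much as Azure Resource Manager does at deployment time. The function name is invented for illustration.

```python
import json

# The ARM template parameter above, reproduced as a Python structure.
PARAMETER = json.loads("""
{
  "type": "string",
  "defaultValue": "Kubernetes",
  "allowedValues": ["Kubernetes", "DCOS", "Swarm"]
}
""")

def choose_orchestrator(value=None):
    """Validate the requested orchestrator the way ARM checks allowedValues."""
    value = value or PARAMETER["defaultValue"]
    if value not in PARAMETER["allowedValues"]:
        raise ValueError(f"{value!r} is not one of {PARAMETER['allowedValues']}")
    return value

print(choose_orchestrator())         # falls back to the default: Kubernetes
print(choose_orchestrator("Swarm"))  # explicit, valid choice
```

Everything else in the template stays the same; only this one value decides which of the three cluster types gets provisioned.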

Organizations with an existing investment in Azure services are better placed to handle Azure Container Service, because these container orchestration platforms are built using standard Azure components, so there is no need to learn any additional networking or clustering concepts. For example, the following Azure services/components are created when deploying a DC/OS, Swarm or Kubernetes cluster.
  • Virtual Machine Scale Set: provides elastic scaling for agent nodes.
  • Availability Set: provides high availability for master nodes.
  • Load Balancer: provides load balancing across nodes.
  • Network Security Group: provides various access controls for nodes.
  • Network, Storage, Virtual Machines: traditional components.
Multi Container Applications:

While the container orchestration platforms can deploy any container, a typical application architecture involves many containers that are tied together. This is typically known as a multi container application. The following are the multi container frameworks that Azure Container Service supports.

Docker Compose is the multi container framework to be deployed on Docker Swarm clusters, and Azure Container Service supports it. The following code snippet (courtesy of the Microsoft Azure site) gives a simple view of a multi container application using Docker Compose. As is evident, it uses two containers that are linked.

web:
  image: adtd/web:0.1
  ports:
    - "80:80"
  links:
    - rest:rest-demo-azure.marathon.mesos
rest:
  image: adtd/rest:0.1
  ports:
    - "8080:8080"


Kubernetes service configuration is the multi container approach to be deployed on a Kubernetes cluster on ACS. The following example from the Azure GitHub page provides a multi container application that can be deployed to Azure Container Service using the kubectl command line.

guestbook-all-in-one.yaml

DC/OS also supports multi container applications using an option called pods. Azure supports deployment into DC/OS using the Marathon REST API. The following link shows how to create a pod with multiple containers.

DCOS Multi Containers

While the Azure Container Service documentation for DC/OS shows deployment of a single container using Marathon, the above multi container approach is also expected to work.

So with these options it is much easier for organizations to switch to container-based application development, with subsequent production deployment onto Azure Container Service.