Measurement and analytics

Before discussing Hyperledger Fabric in particular, let's understand what measurement and analytics mean for a distributed system, of which a blockchain application is an example. The process begins with a comprehensive understanding of the architecture of the system, its various components, and the degrees and natures of coupling among those components. The next step is to institute mechanisms to monitor the various components and collect data attributes that have any bearing on performance, either continuously or at periodic intervals. This data must be collected and communicated to a module that can then analyze it to generate meaningful representations of system performance, and possibly provide more insight into the workings of the application and its existing inefficiencies. The analyzed data can also be used to ensure that the system is working at a desired level of performance, and to detect when it is not, something that is of high (if not critical) importance to user-facing systems.

Such techniques and processes are well known in the world of distributed systems analytics, and also in mobile analytics (which can be considered a special case of the former). Agents can be configured to observe or monitor a system component, either actively or passively: in the former, systems can be instrumented (for example, by inserting special data collection code) to make them self-monitor their activities and gather information, whereas in the latter, data collection is done by a piece of software that is external to the component being monitored. A pipeline exists to communicate this data on a continuous or periodic basis to a central repository, where the data can be accumulated for later processing, or is immediately processed and consumed. The pipeline may also modify the data to make it ready for analytics. In data analytics parlance, this pipeline is typically referred to as extract-transform-load (ETL). If the volume and frequency of data generation are very high, and if the number of data sources is very large, such analytics is also referred to as big data analytics.
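As a minimal sketch of the active-instrumentation idea, the following Python snippet wraps a component function so that it self-monitors and records the latency of each call, then runs a toy "transform" step of an ETL pipeline that aggregates the raw samples into a summary. The names (`instrumented`, `submit_transaction`, the in-memory `METRICS` store) are illustrative assumptions, not part of any real monitoring framework; in practice the records would be shipped to a central repository rather than held in a list.

```python
import time
from statistics import mean

# Hypothetical in-memory metric store; a real pipeline would ship
# these records (the "extract" and "load" steps) to a central repository.
METRICS = []

def instrumented(func):
    """Active instrumentation: insert data collection code so the
    component self-monitors and records per-call latency."""
    def wrapper(*args, **kwargs):
        start = time.perf_counter()
        result = func(*args, **kwargs)
        METRICS.append({
            "component": func.__name__,
            "latency_s": time.perf_counter() - start,
        })
        return result
    return wrapper

@instrumented
def submit_transaction(payload):
    # Stand-in for a monitored operation, e.g. a blockchain client call
    return {"status": "VALID", "payload": payload}

def transform(records):
    """A minimal 'transform' step: aggregate raw latency samples
    into a per-component summary ready for analytics."""
    by_component = {}
    for r in records:
        by_component.setdefault(r["component"], []).append(r["latency_s"])
    return {name: {"calls": len(samples), "mean_latency_s": mean(samples)}
            for name, samples in by_component.items()}

for i in range(5):
    submit_transaction({"tx": i})

summary = transform(METRICS)
print(summary["submit_transaction"]["calls"])  # 5
```

A passive monitor would gather the same kind of data externally (for example, by scraping process or network statistics) without touching the component's code; the downstream transform-and-aggregate step is the same in either case.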

ETL processes or big data analytics are beyond the scope of this chapter and book, but the takeaway for a serious blockchain developer or administrator is that there exist frameworks to perform such analytics, either for distributed systems configured with servers and databases at their backends (and a Fabric blockchain application is an example of this) such as Splunk (https://www.splunk.com/en_us/solutions/solution-areas/business-analytics.html) or Apteligent (http://www.apteligent.com/), or for mobile applications such as Tealeaf (https://www.ibm.com/in-en/marketplace/session-replay-and-interaction-analytics) and Google Analytics (https://developers.google.com/analytics/solutions/mobile). The same frameworks can be used or adapted to monitor and analyze blockchain applications too.
