Stream processing

Goka (https://github.com/lovoo/goka) is a powerful Go stream-processing library that uses Apache Kafka. It can be used to build streaming patterns such as the CQRS and event sourcing architectures that we saw in Chapter 5, Going Distributed.

There are three main building blocks for a stream-processing application in Goka: emitters, processors, and views.

Emitters generate key-value messages into Kafka. For example, an emitter could listen on a database change log and emit each change as an event. A processor is a set of callback functions that consume messages and perform state transformations on them. Processors are clustered into processor groups. As with Kafka's consumer group rebalancing feature, Goka distributes topic partitions among the processors in a processor group. Each processor group maintains its state in something called a group table: a partitioned key-value table that is stored in Kafka itself, with local caching. Views provide read-only access to the group tables.
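To make the data flow concrete, the following is a minimal, self-contained sketch in plain Go that simulates this pattern with an in-memory map standing in for the group table. It does not use Goka itself (the real group table is partitioned and Kafka-backed); all names here are illustrative:

```go
package main

import "fmt"

// groupTable plays the role of the processor group's state. In Goka this
// would be a partitioned key-value table stored in Kafka with local caching;
// a plain map is used here purely for illustration.
type groupTable map[string]int

// message represents a key-value message that an emitter would produce.
type message struct {
	key   string
	value string
}

// process is the processor callback: it consumes a message and performs a
// state transformation keyed by the message key (here, counting messages).
func process(table groupTable, msg message) {
	table[msg.key]++
}

// view provides read-only access to the group table.
func view(table groupTable, key string) int {
	return table[key]
}

func main() {
	table := groupTable{}
	// An emitter might produce a stream of change-log events:
	for _, m := range []message{{"user-1", "a"}, {"user-2", "b"}, {"user-1", "c"}} {
		process(table, m)
	}
	fmt.Println(view(table, "user-1")) // 2
	fmt.Println(view(table, "user-2")) // 1
}
```

The essential idea carries over directly: processors own and mutate per-key state, while views only read it.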

The following diagram depicts the abstraction:

Source: https://github.com/lovoo/goka

The following is an example of a processor callback that counts the number of messages seen for a key:

func process(ctx goka.Context, msg interface{}) {
	var nMessages int
	if val := ctx.Value(); val != nil {
		nMessages = val.(int)
	}

	nMessages++
	ctx.SetValue(nMessages)
}

The goka.Context parameter is the most powerful construct here: it lets the callback read and update the group table (via ctx.Value() and ctx.SetValue()) and send messages to other processors. More details can be found at https://github.com/lovoo/goka, and further examples are available at https://github.com/lovoo/goka/tree/master/examples.
