Data structure

We'll make one assumption in this chapter: that we have a friendly backend developer on our team who's able to provide us with data in the format we need. Let's flesh out our requirements for the data that will power our application.

Another assumption is that we're looking for trends and statistics here, so we'll essentially be aggregating raw logs into something more suitable for user consumption.

Live charts

We planned to have two "live" charts on the dashboard (one to show SQL queries as they come in, and one to show web requests). In order for this to work, we need a URL we can poll every second or so. This will provide us with data on the last second of activity. Something like this:

GET /logStream
Accepts: n/a
Returns: [
    {
        "type":"web",
        "subType":"request",
        "time":"2014-11-04T12:10:14.465Z",
        "ms":10,
        "count":5
    },
    {
        "type":"sql",
        "subType":"query",
        "time":"2014-11-04T12:10:14.466Z",
        "ms":17,
           "count":34
    }
]

A GET request to /logStream gives us an array of objects, one for each log type. As mentioned, we're restricting ourselves to SQL and web logs only. The ms property is the average response time of the operations that occurred in the past second, and the count property is the number of operations that took place. We've designed the API with a little flexibility in mind, so it could be extended; for example, a log type could be appended to the URL (such as /logStream/sql) to filter on one log type.
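To make the polling concrete, here's a minimal client-side sketch in plain JavaScript. The appendTick helper and the webPoints/sqlPoints buffers are hypothetical names of our own; in the finished application, an Ext JS store will do this work for us:

```javascript
// Keep a rolling window of the last `max` data points per chart,
// so the live chart always shows a fixed span of recent activity.
function appendTick(buffer, tick, max) {
    buffer.push(tick);
    if (buffer.length > max) {
        buffer.shift(); // drop the oldest point
    }
    return buffer;
}

// Poll every second and feed each log type to its own buffer.
const webPoints = [];
const sqlPoints = [];

function poll() {
    fetch('/logStream')
        .then((response) => response.json())
        .then((entries) => {
            entries.forEach((entry) => {
                const buffer = entry.type === 'web' ? webPoints : sqlPoints;
                appendTick(buffer, entry, 60); // keep the last minute
            });
        });
}

// setInterval(poll, 1000); // start polling when the dashboard loads
```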

Historical logs

On the dashboard and the subpages of our application, we also need to show graphs of historical data. On the dashboard, it'll be from the past 30 days, but on the subpages, it could be an arbitrary time frame. Here's our API:

GET /logEntry
Accepts: filter=[{"property":"propertyName","operator":"=","value":"value"}, …]
Returns: [{
    "type":"sql",
    "subType":"query",
    "time":"2014-11-04T12:10:14.466Z",
    "ms":17,
    "count":34
}, ...]

We're going to rely on a feature of Ext.data.Store: remoteFilter. When this is set to true, Ext JS will defer filtering to the server and pass through the filter criteria as a JSON array. We can set an arbitrary number of filters, so in order to get SQL data within a date range, we'd be passing something like this:

[
    { property: 'type', operator: '=', value: 'sql' },
    { property: 'time', operator: '>=', value: '2014-01-01' },
    { property: 'time', operator: '<=', value: '2014-02-01' }
]

Our kind server-side developer will combine these filters into something that returns the correct response.
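To make it clear what the server actually receives, here's a sketch of that serialization in plain JavaScript. The buildFilterParam function is a hypothetical helper of our own that mimics the JSON-encoded filter parameter a remote-filtered store sends:

```javascript
// Mimic how a store with remoteFilter: true passes criteria to the
// server: the filter array is JSON-encoded into one `filter`
// query-string parameter.
function buildFilterParam(filters) {
    return 'filter=' + encodeURIComponent(JSON.stringify(filters));
}

const filters = [
    { property: 'type', operator: '=', value: 'sql' },
    { property: 'time', operator: '>=', value: '2014-01-01' },
    { property: 'time', operator: '<=', value: '2014-02-01' }
];

// The resulting request would look like:
// GET /logEntry?filter=%5B%7B%22property%22%3A%22type%22...
const query = buildFilterParam(filters);
```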

Log statistics

As well as general aggregated information about web and SQL operations, we want to display a grid of further detail on our tab pages. Again, these will be filterable by date range as well as by the category of information we want to view:

GET /statistic
Accepts:
filter=[
    { property: 'type', operator: '=', value: 'web' },
    { property: 'category', operator: '=', value: 'location' },
    { property: 'time', operator: '>=', value: '2014-01-01' },
    { property: 'time', operator: '<=', value: '2014-02-01' }
]
Returns: [{"category":"location","label":"Other","percentage":19.9}, ...]

We're using the remoteFilter feature again, meaning that Ext JS passes the JSON filter straight through to the server. Along with the type and time criteria from before, this time we add a category filter to specify which subset of information (such as location for web logs or query source for SQL) we'd like to retrieve.

In response, we get an array of all of the items within the chosen category and the percentages allocated to each one over the specified time frame.
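Producing this response is our backend developer's job, but a sketch of the aggregation makes the contract clearer. Here, toPercentages is a hypothetical server-side helper, assuming the raw rows have already been filtered by type and date range:

```javascript
// Group filtered log rows by label and express each group's share of
// the total as a percentage, matching the /statistic response shape.
function toPercentages(rows, category) {
    const counts = {};
    let total = 0;
    rows.forEach((row) => {
        counts[row.label] = (counts[row.label] || 0) + 1;
        total += 1;
    });
    return Object.keys(counts).map((label) => ({
        category: category,
        label: label,
        // Round to one decimal place, e.g. 19.9
        percentage: Math.round((counts[label] / total) * 1000) / 10
    }));
}

// Returns objects shaped like { category, label, percentage },
// e.g. toPercentages(locationRows, 'location')
```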

Model behavior

We've got our API, but how does this translate into the JavaScript data models we'll need? Looking at the API responses we just documented, we only need two shapes: /logEntry (and the /logStream feed, which shares its format) returns one type of record, and /statistic returns another. They'll look something like this:

Instrumatics.model.LogEntry: extends Instrumatics.model.BaseModel
- type
- subType
- time
- ms
- count

What's this BaseModel all about? In order to share schema configuration between models, we can use a base model from which all other models inherit. It looks like this:

Instrumatics.model.BaseModel: extends Ext.data.Model
- schema
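As a sketch of what this might look like in code, the schema can derive each model's proxy URL from its class name. The proxy details here, including the :uncapitalize formatter, are assumptions for illustration rather than final values:

```javascript
// A sketch of the shared base model, assuming Ext JS 5's schema
// support. The url template derives each proxy URL from the model's
// entity name (its class name minus the namespace), which is why a
// model's class name matters to the schema configuration.
Ext.define('Instrumatics.model.BaseModel', {
    extend: 'Ext.data.Model',

    schema: {
        namespace: 'Instrumatics.model',
        proxy: {
            type: 'rest',
            url: '/{entityName:uncapitalize}'
        }
    }
});
```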

Now, the model for statistics is as follows:

Instrumatics.model.Statistic: extends Instrumatics.model.BaseModel
- category
- label
- percentage

The percentage field represents the proportion of operations that are represented by this statistic. For example, if category is location and label is Japan, then the percentage could be something like 5 percent (5 percent of our requests come from Japan). This is flexible enough to be used for all the data categories we'd like to view.

Finally, we need one for the live log stream:

Instrumatics.model.LogStream: extends Instrumatics.model.LogEntry

The log stream has the same fields as the LogEntry model, but we have it as a separate class, so its class name can affect the schema configuration. We'll go into more detail later.
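In code, the subclass is almost empty. Assuming a schema that derives proxy URLs from the entity name, this class would resolve its requests to /logStream rather than /logEntry:

```javascript
// LogStream adds no fields of its own; it exists so that a
// class-name-driven schema gives it its own endpoint.
Ext.define('Instrumatics.model.LogStream', {
    extend: 'Instrumatics.model.LogEntry'
});
```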

Note

We're lucky with this theoretical API, in that we're allowed to shape it to our requirements. In the real world, it might not be that simple, but having a friendly backend developer will always make our lives as frontend developers much easier.

Our API has strongly informed our data layer. While it's great to keep things simple—as we've been able to here—it's important not to mistake simplicity for naivety or inflexibility. In this case, our UI components will happily work with our data layer, which in turn works with our API without having to shoehorn any one piece into working with the others.
