Although we won't explore this topic in depth in this chapter, IBM Watson Studio provides a straightforward way to accomplish this with a model you developed and deployed using the model builder. This method requires choosing a Spark Service or Environment option and establishing a feedback data store as a project resource, which is where the resulting model performance metrics are saved.
These resources can be configured using the Configure performance monitoring page shown in the following screenshot:
Once you establish a feedback data store, you can use Watson Studio and the model builder to define metrics and triggers as part of a continuous learning process. You can then periodically review the updated model performance metrics, using the chart controls to switch between metrics or to view the results as either a chart or a table.
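To make the metrics-and-triggers idea concrete, here is a minimal, hypothetical sketch of what happens behind the scenes in a continuous learning loop: labeled feedback records accumulate in a store, a performance metric is recomputed over them, and a retraining trigger fires when that metric drops below a configured threshold. The names used here (`accuracy`, `RETRAIN_THRESHOLD`, the `feedback` list) are illustrative assumptions, not part of the Watson Studio API:

```python
# Illustrative continuous-learning check (not the Watson Studio API):
# score accumulated feedback records and decide whether to trigger retraining.

def accuracy(predictions, labels):
    """Fraction of feedback records where the model's prediction matched the true label."""
    correct = sum(p == y for p, y in zip(predictions, labels))
    return correct / len(labels)

# Hypothetical trigger: flag the model for retraining if accuracy falls below 80%.
RETRAIN_THRESHOLD = 0.80

# (model prediction, true label) pairs standing in for rows in the feedback data store.
feedback = [(1, 1), (0, 0), (1, 0), (1, 1), (0, 0)]

preds, labels = zip(*feedback)
acc = accuracy(preds, labels)
should_retrain = acc < RETRAIN_THRESHOLD
print(acc, should_retrain)  # → 0.8 False
```

In Watson Studio, this bookkeeping is handled for you: the feedback data store plays the role of the `feedback` list above, and the thresholds you define on the Configure performance monitoring page play the role of `RETRAIN_THRESHOLD`.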