Sometimes, the difference between the domain model and database schema is so great that using NHibernate features to patch the gaps becomes a painful and buggy process. In such a situation, it is best to use custom Data Transfer Objects (DTOs) that are designed to match the database schema. Adding custom DTOs creates an additional layer in your architecture, which comes with its own overheads, and, if not designed with care, may lead to code that is inefficient and difficult to maintain. There is no magic formula to work with DTOs so that your persistence layer becomes more bearable. Fortunately, there is some common wisdom shared by veterans in the field, and a few tricks that I have learned from my experience that should help. This section is divided into two parts.
In the first part, we will cover the basics: at what level should DTOs sit in the context of the onion architecture, and how do we accommodate DTOs in the existing domain and domain service layers? Using DTOs nullifies the effect of lazy loading. In the second part, we will see how that happens and go over an experiment that I carried out to address this issue.
The most appropriate place for DTOs is close to the database. This makes the persistence layer a natural place to put them. You can either make DTOs an independent layer, or make them a part of the persistence layer.
The only time I would create a separate layer out of DTOs is when DTOs need to be shared beyond one application. Otherwise, I would just stick to adding DTOs under the existing persistence layer. If I had to add DTOs into the persistence layer of the employee benefits application, then the onion architecture, with DTOs included, would look like the following diagram:
In terms of runtime dependency, the following diagram depicts that DTOs are at the bottom, and all other layers depend on them:
Though we introduce DTOs into the solution, the domain layer still uses its own domain entities for everything. All the capabilities, such as IRepository&lt;T&gt; or IRunQuery&lt;T&gt;, that we saw in the previous two chapters continue to return domain types. But we load DTOs from the database, and hence a conversion layer is required to convert from DTOs to domain entities. This conversion layer could be very simple, as we will see next, or you can make use of a third-party library called AutoMapper.
Let's assume that we have the following DTOs defined to match our legacy database:

In the preceding figure, we have a Person class which holds the personal details, such as name and email address, of any person in the company. We then have another class named StaffMember, which has other properties describing the employment of that person. There are other subclasses at this level, but we will ignore them as we are not interested in them. StaffMember has a collection of Benefits on it. Let's say that the set of tables that model benefits more or less matches our domain model. Now, suppose we have a repository interface like the following, defined within the domain layer:
public interface IRepository<T>
{
    T GetById(int id);
}
In the implementation of the GetById method of this interface, we will load a StaffMember instance from the database, but we cannot return that instance. We will need to convert it into an Employee instance. The following code listing shows the implementation of IRepository&lt;Employee&gt;:
public class Repository : IRepository<Employee>
{
    private readonly ISession session;

    public Repository(ISession session)
    {
        this.session = session;
    }

    public Employee GetById(int id)
    {
        var staffMember = session.Get<StaffMember>(id);
        var employee = new Employee
        {
            Firstname = staffMember.Firstname,
            Lastname = staffMember.Lastname,
            //Initialize other properties here
        };
        return employee;
    }
}
After loading an instance of StaffMember, we instantiate a new Employee and initialize its properties from the StaffMember instance loaded from the database. Depending on the complexity of the domain model, the conversion code may grow big and complex. A lot of the time, it becomes repetitive as well. You can employ design or refactoring patterns to avoid the duplication, or you can use a library such as AutoMapper, which uses several default conventions to automatically map properties from DTOs to domain entities. You can find more details about AutoMapper on its website at http://automapper.org/.
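To get a feel for what such a conventions-based mapper does, here is a minimal sketch. The DtoMapper class and the trimmed-down StaffMember and Employee shapes below are hypothetical, not part of the chapter's code; the helper copies every source property to a same-named, same-typed destination property via reflection, which is roughly the default convention AutoMapper applies:

```csharp
using System;
using System.Linq;

// Hypothetical DTO and domain types, trimmed down from the chapter's model
public class StaffMember
{
    public string Firstname { get; set; }
    public string Lastname { get; set; }
}

public class Employee
{
    public string Firstname { get; set; }
    public string Lastname { get; set; }
}

// A minimal convention-based mapper: copies every readable source property
// to a writable destination property with the same name and a compatible type
public static class DtoMapper
{
    public static TDest Map<TSource, TDest>(TSource source) where TDest : new()
    {
        var dest = new TDest();
        foreach (var destProp in typeof(TDest).GetProperties().Where(p => p.CanWrite))
        {
            var sourceProp = typeof(TSource).GetProperty(destProp.Name);
            if (sourceProp != null && sourceProp.CanRead &&
                destProp.PropertyType.IsAssignableFrom(sourceProp.PropertyType))
            {
                destProp.SetValue(dest, sourceProp.GetValue(source));
            }
        }
        return dest;
    }
}
```

With this in place, the repository's conversion step collapses to a single call such as `DtoMapper.Map<StaffMember, Employee>(staffMember)`, and the same helper serves every DTO-to-entity pair that follows the naming convention.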
DTOs solve an important and difficult problem for you, but it all comes at a cost. Custom DTOs are obviously an overhead, because you need to maintain an extra set of classes and keep converting from domain model to DTO and vice versa. But one of the major downsides of custom DTOs is that, during the mapping from DTO to domain model, you iterate over each and every property of the DTO, thus loading every association into memory and throwing away all the benefits of lazy loading. Let me use an example from our employee benefits domain to show you how bad it can be.
Suppose you are working on a feature that involves updating the residential address of an employee. You load the employee details using some filter, and then pass the loaded instance into the layer that converts from DTO to domain entity. But since Employee is the root entity and everything else hangs off it, every association from Employee to any other entity will be loaded during the conversion. We only needed to load the ResidentialAddress property on Employee, but since the conversion layer has no knowledge of the context we are operating in, it blindly converts everything. This results in other collections, such as Benefits and Communities, being loaded when they are not required. There is no easy solution to this problem. I experimented with something I call "custom lazy loading retention" while working on a project that involved a legacy and completely disparate database schema. It was more of an experiment, but it seemed to address the issues with lazy loading quite nicely. Let's see what exactly "custom lazy loading retention" is.
Using NHibernate without lazy loading does seem a bit odd. While there are other benefits to using NHibernate, not being able to use lazy loading is a real handicap. On one of the projects I worked on, I faced a similar situation and experimented with a design construct that I called "custom lazy loading retention". It basically amounts to implementing part of NHibernate's lazy loading in our own persistence layer. In Chapter 6, Let's Retrieve Some Data from the Database, we briefly discussed how lazy loading works internally. There is a lot that goes on, but the core of the lazy loading algorithm can be summarized as follows. Suppose we have the Employee class, which has the Benefits collection on it, as follows:
public class Employee
{
    public virtual ICollection<Benefit> Benefits { get; set; }
}
When our application starts, NHibernate dynamically generates two classes. These classes are implemented as subclasses of Employee and Benefit, and override all the virtual properties present in the base classes. For collection properties, special code is added that loads the property from the database on first access. The following code listing tries to depict how such a class could look:
public class EmployeeProxy : Employee
{
    public override ICollection<Benefit> Benefits
    {
        get
        {
            //load the Benefits collection from the database,
            //set it on base.Benefits, and then return it
            return base.Benefits;
        }
    }
}
These classes are also called proxies. Note that the preceding code does not show actual proxy code; it is just a simplified version that shows how lazy loading works. If you look carefully, lazy loading is not very difficult to implement outside NHibernate. And by implementing custom lazy loading in our persistence layer, we can solve the problem introduced by DTOs. To see how this custom lazy loading works, let's refactor the preceding implementation of the repository to make use of custom lazy loading.
When the repository loads a StaffMember, instead of passing it through a conversion layer, it creates a new instance of an EmployeeProxy class and returns that. The EmployeeProxy class is a proxy implemented along the lines of the proxies that NHibernate builds dynamically. The following code shows how the EmployeeProxy class looks:
public class EmployeeProxy : Employee
{
    private readonly StaffMember staffMember;

    public EmployeeProxy(StaffMember staffMember)
    {
        this.staffMember = staffMember;
    }

    public override ICollection<Benefit> Benefits
    {
        get { return staffMember.Benefits; }
        set { staffMember.Benefits = value; }
    }

    public override Address ResidentialAddress
    {
        get { return staffMember.ResidentialAddress; }
        set { staffMember.ResidentialAddress = value; }
    }
}
There are only two important things going on here. First, this class accepts an instance of StaffMember through its constructor. Second, it overrides all the properties of the base class Employee and rewires them to the corresponding properties of the StaffMember instance that was passed in. With this proxy available, the repository code can be changed to look like the following:
public Employee GetById(int id)
{
    var staffMember = session.Get<StaffMember>(id);
    return new EmployeeProxy(staffMember);
}
The StaffMember instance returned by a call to ISession.Get&lt;StaffMember&gt; has its lazy associations backed by proxies that NHibernate generates dynamically. We pass that instance into the constructor of EmployeeProxy, which is returned to the code that calls the GetById method. Note that none of the properties of the underlying StaffMember instance have been converted to the Employee or EmployeeProxy class yet. As the calling code interacts with the EmployeeProxy instance that was returned to it, the corresponding properties of the underlying StaffMember are invoked, which load the data on demand, using the lazy loading offered by NHibernate.
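The on-demand behavior can be demonstrated with a self-contained sketch. The types below are hypothetical simplifications, and Lazy&lt;T&gt; again simulates NHibernate's deferred SQL; the point is that the proxy delegates each property access to the wrapped DTO, so reading only the address leaves the benefits collection unloaded:

```csharp
using System;
using System.Collections.Generic;

// Simplified domain entity with virtual members so a proxy can override them
public class Employee
{
    public virtual string ResidentialAddress { get; set; }
    public virtual ICollection<string> Benefits { get; set; }
}

// Hypothetical DTO; Lazy<T> simulates the SQL NHibernate would run on
// first access to the Benefits collection
public class StaffMember
{
    private readonly Lazy<List<string>> benefits =
        new Lazy<List<string>>(() => new List<string> { "Leave", "Pension" });

    public string ResidentialAddress { get; set; } = "1 High Street";
    public ICollection<string> Benefits => benefits.Value;
    public bool BenefitsLoaded => benefits.IsValueCreated;
}

// Custom lazy loading retention: no up-front conversion; each property
// simply delegates to the wrapped StaffMember when (and if) it is read
public class EmployeeProxy : Employee
{
    private readonly StaffMember staffMember;

    public EmployeeProxy(StaffMember staffMember)
    {
        this.staffMember = staffMember;
    }

    public override string ResidentialAddress
    {
        get { return staffMember.ResidentialAddress; }
        set { staffMember.ResidentialAddress = value; }
    }

    public override ICollection<string> Benefits
    {
        get { return staffMember.Benefits; }
    }
}
```

Reading ResidentialAddress on the proxy never touches Benefits; the collection is loaded only if the caller actually asks for it, which is exactly the lazy loading behavior the blind conversion layer threw away.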
This is a very simple example, used to show what is possible. If you want to use this approach in production code, you will need to handle all the edge cases and test your implementation thoroughly.