As almost anyone who tries to unit test a database application will quickly discover, databases present a huge problem for unit testing. Strictly speaking, if you are testing your C# or VB code and you actually hit a real database, then it isn't really a unit test; it is an integration test. However, I have found that it doesn't really matter what you call it: the end result is that your tests are much more useful if they actually hit a real database. You don't have to worry about whether a test failed because you screwed up your mock object or because the application is genuinely buggy, and you get better code coverage because even broken SQL leads to a failing test.
There are several methods that can be used to prevent your unit tests from actually using a real SQL Server database, but they all have their problems:
- Using an in-memory provider like SQLite
There is an Entity Framework provider for SQLite that allows you to interact with a database without touching the network or even the file system. This can certainly speed up your unit tests and makes it easy to prevent cross-contamination between them, but they are still integration tests. The only difference is that you are now testing whether your code works against SQLite rather than against the DBMS you will actually use in production. Every database system has different behaviors and feature sets, so your tests are no longer valid if you test against a different DBMS than you deploy on. There is also currently no system in place to automatically generate the SQLite schema from your entity data model, so you will need to find your own way of doing that, or manually maintain a separate SQLite schema. Gross. If you are going to use another provider, it needs to be specifically designed to behave exactly like your production database (i.e. a mock SQL Server provider), but to my knowledge no such provider exists (if I'm wrong, please let me know!).
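To give a rough idea of what this looks like in practice, here is a sketch (an illustration, not working production code) of pointing the generated DataTestEntities context at an in-memory SQLite database. It assumes the System.Data.SQLite ADO.NET provider is installed, and the metadata resource names are hypothetical placeholders for your own .edmx artifacts:

Imports System.Data.EntityClient

' Sketch only: build an EntityConnection string that swaps the store
' provider for System.Data.SQLite and uses an in-memory database.
' The metadata resource names below are hypothetical placeholders.
Module SQLiteContextFactory

    Public Function CreateContext() As DataTestEntities
        Dim builder As New EntityConnectionStringBuilder()
        builder.Provider = "System.Data.SQLite"
        builder.ProviderConnectionString = "Data Source=:memory:"
        builder.Metadata = "res://*/DataTestModel.csdl|res://*/DataTestModel.ssdl|res://*/DataTestModel.msl"
        Return New DataTestEntities(builder.ToString())
    End Function

End Module

Even then, the store schema (.ssdl) baked into the model was generated for SQL Server, so in practice you end up hand-maintaining a second, SQLite-flavored copy of the model, which is exactly the maintenance problem described above.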
- Mock the Entity Framework ObjectContext
If all you want to do is read data, then this works well and is easy to implement. Unfortunately, in the vast majority of cases we also need to write data, and that's where this method gets tricky. Your mock ObjectContext needs to be able to track changes and save them to an in-memory repository, and again you have to make sure that it behaves exactly like your production database. Because this method usually involves either a huge wrapper or major alterations to auto-generated code (which means you also need your own code generator, or you lose maintainability), the mock object itself becomes extremely complicated, leaving a high likelihood that it will have errors of its own. Since the mock is so complicated, one could argue that you are again doing integration tests rather than unit tests; but this time, instead of testing your code plus the database, you are testing your code plus the mock ObjectContext. As with the SQLite example, that is arguably worse, because you are testing whether your code integrates with something you will never use in production. If you are going to do integration tests anyway, you might as well integrate with the real thing. This method can lead to faster-running tests, but don't forget that a local SQL Server instance is actually extremely fast and might be just as good.
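To give a sense of what such a fake has to do, here is a bare-bones sketch. Everything in it is hypothetical: IDataTestEntities is a wrapper interface you would have to write (or generate) yourself, because the generated ObjectContext does not implement one out of the box, and the userId identity simulation stands in for all of the database behavior you would have to re-create:

Imports System.Collections.Generic
Imports System.Linq

' Bare-bones sketch of an in-memory fake context.
' IDataTestEntities and its members are hypothetical.
Public Class FakeDataTestEntities
    Implements IDataTestEntities

    Private ReadOnly _users As New List(Of Users)
    Private _nextId As Integer = 1

    Public ReadOnly Property Users() As IQueryable(Of Users) Implements IDataTestEntities.Users
        Get
            Return _users.AsQueryable()
        End Get
    End Property

    Public Sub AddToUsers(ByVal user As Users) Implements IDataTestEntities.AddToUsers
        _users.Add(user)
    End Sub

    Public Sub SaveChanges() Implements IDataTestEntities.SaveChanges
        ' Here is where the real work would be: to behave like SQL Server you
        ' would have to simulate identity generation, unique and foreign key
        ' constraints, cascades, string truncation, and so on. Even this tiny
        ' identity simulation assumes a hypothetical userId column.
        For Each newUser In _users.Where(Function(u) u.userId = 0)
            newUser.userId = _nextId
            _nextId += 1
        Next
    End Sub

End Class

And this only covers the happy path for a single entity set; every additional entity, association and constraint adds more behavior that the fake has to mimic.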
- Encapsulate your data access layer and then mock it
I see this response on message boards all the time. Whenever someone asks how to unit test their data access code, someone will respond, "You're doing it wrong; put all of your data access code into a separate module that you can mock." There are a couple of problems with this. First of all, you still need to test the code in the data access layer. If you have a function in your DAL that executes a complicated LINQ to Entities query, then you want to test that query, and without using one of the techniques mentioned above, that requires hitting the database. Secondly, making your client code completely unaware of the data access layer's implementation leads to some issues of its own. Let's pretend that my data access layer looks like this:
Public Interface IUsersModel
    Function GetUsers() As IEnumerable(Of Users)
    Sub Save()
End Interface

Public Class UsersModel
    Implements IUsersModel

    Private _context As New DataTestEntities

    Public Function GetUsers() As IEnumerable(Of Users) Implements IUsersModel.GetUsers
        Return _context.Users
    End Function

    Public Sub Save() Implements IUsersModel.Save
        _context.SaveChanges()
    End Sub
End Class
It's pretty simple: the code just allows you to get a collection of users and save any changes you make. UsersModel implements the interface using the Entity Framework. We also have a controller that accesses the DAL, which looks like this:
Public Class UsersController
    Private _usersModel As IUsersModel

    Public Sub New(ByVal usersModel As IUsersModel)
        _usersModel = usersModel
    End Sub

    Public Sub ChangeFirstUserNameToFoobar()
        _usersModel.GetUsers().First.userName = "foobar"
        _usersModel.Save()
    End Sub
End Class
UsersController has a dependency on IUsersModel, so when unit testing the ChangeFirstUserNameToFoobar method we pass in a mock implementation of IUsersModel. But we cannot simply verify that Save() was called; we also need to know what will happen when Save is called. Specifically, we need some way of checking that the first user's username was actually changed to "foobar". This means that a mocking framework like RhinoMocks or Moq will not be sufficient: there must be a fake implementation of IUsersModel that keeps track of the changes that have been made. Now we are getting back into "mock the ObjectContext" territory, because that is basically what we will have built.
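To make that concrete, here is a minimal sketch of such a hand-rolled fake and a test that uses it (MSTest syntax here, but any test framework works; the class and test names are illustrative):

Imports System.Collections.Generic
Imports Microsoft.VisualStudio.TestTools.UnitTesting

' Hand-rolled fake: holds the users in memory and records whether Save
' was called so the test can inspect what happened.
Public Class FakeUsersModel
    Implements IUsersModel

    Public Users As New List(Of Users)
    Public SaveWasCalled As Boolean

    Public Function GetUsers() As IEnumerable(Of Users) Implements IUsersModel.GetUsers
        Return Users
    End Function

    Public Sub Save() Implements IUsersModel.Save
        SaveWasCalled = True
    End Sub
End Class

<TestClass()> _
Public Class UsersControllerTests

    <TestMethod()> _
    Public Sub ChangeFirstUserNameToFoobar_RenamesFirstUser()
        Dim fake As New FakeUsersModel()
        fake.Users.Add(New Users With {.userName = "original"})

        Dim controller As New UsersController(fake)
        controller.ChangeFirstUserNameToFoobar()

        Assert.AreEqual("foobar", fake.Users(0).userName)
        Assert.IsTrue(fake.SaveWasCalled)
    End Sub
End Class

Even in this trivial case the fake has to carry its own state; as soon as the controller starts filtering, inserting or deleting, the fake grows into exactly the kind of in-memory repository described in the previous section.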
There is a definite trend here: each of the above methods is complicated enough that you lose the benefits of isolating your tests from the database. They are all integration tests. In every case you are testing your client code plus some kind of repository, and since you have to test a repository anyway, it might as well be the real one. Of course, this presents its own challenges. You will want to use a local instance of SQL Server (or whatever DBMS you use) to keep the tests fast and isolated from other developers, and you will need to roll back changes after each test. In subsequent articles I will look at how to deal with these issues.
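As a rough preview of the rollback piece (my sketch, not necessarily the technique the follow-up article uses), one common approach is to wrap each test in a System.Transactions.TransactionScope and never complete it:

Imports System.Linq
Imports System.Transactions
Imports Microsoft.VisualStudio.TestTools.UnitTesting

' Sketch of one common rollback technique: open a TransactionScope before
' each test and dispose it without calling Complete(), so everything the
' test wrote to the local test database is rolled back.
<TestClass()> _
Public Class UsersControllerIntegrationTests

    Private _scope As TransactionScope

    <TestInitialize()> _
    Public Sub Setup()
        _scope = New TransactionScope()
    End Sub

    <TestCleanup()> _
    Public Sub Teardown()
        ' Disposing without Complete() rolls the transaction back.
        _scope.Dispose()
    End Sub

    <TestMethod()> _
    Public Sub ChangeFirstUserNameToFoobar_UpdatesTheFirstUser()
        ' Assumes the local test database already contains at least one user.
        Dim model As New UsersModel()
        Dim controller As New UsersController(model)

        controller.ChangeFirstUserNameToFoobar()

        Assert.AreEqual("foobar", model.GetUsers().First().userName)
    End Sub

End Class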
Update: I have posted the second article: Unit testing an Entity Framework DAL part 2: Rolling back the test database