MongoDB integration testing in .NET Core – Vetting Mongo2go vs a MongoDB Docker Instance

Chances are you already know the difference between unit and integration tests. If you do, skip the paragraph directly below.

What is Integration Testing?

The core goal of integration testing is to evaluate the behavior of software components in combination, to ensure they will run properly when they interact in live scenarios. For example, combining an API endpoint with a real instance (or as close to a real instance as you can get) of a data store like MySQL tests whether the endpoint interacts properly with the entire pipeline of components leading from the API, through the data layer, to your database. The closer you get to a real instance, the more you can trust the outcomes of your tests. The main difference between integration and unit tests is granularity: with unit tests, the fewer components you combine, the better, so mocks and stubs stand in for data stores and other such dependencies wherever possible.

The Problem

I recently had to choose between Mongo2go and a MongoDB Docker instance for integration testing. It’s also worth mentioning that the goal was actually to find an integration testing solution for DocumentDB, AWS’s MongoDB-compatible document database. Using MongoDB as a substitute works, but ideally, if you’re planning to use DocumentDB in production, you’ve familiarized yourself with the subtle differences between the two and know which MongoDB features are not supported in DocumentDB, so that you’re not blindly testing features that won’t work post-deployment.

Using MongoDB as a substitute for DocumentDB in integration tests works well, not least because MongoDB has a substantial ecosystem of supporting tools built around it. Case in point: the Mongo2go wrapper.

Mongo2Go

On its GitHub page, Mongo2Go is described as a “…managed wrapper around the latest MongoDB binaries…” and is promoted as a candidate for use in unit and integration tests. Although its documentation doesn’t say much about being an in-memory instance of MongoDB, there is no indication that it writes anything to disk during operation, so I would classify it as such.

Making Mongo2go available to your application is just a matter of installing its NuGet package. When creating tests that have a data-store dependency, I prefer to use xUnit fixtures to initialize the database. What fixtures are, and the patterns and best practices for using them, deserve their own blog post, but it’s enough to think of them as test setup classes for xUnit. Below is an abbreviated example of the one I’m using for Mongo2go.

public class Mongo2GoFixture : IDisposable
{
    public MongoClient Client { get; }

    public IMongoDatabase Database { get; }

    public string ConnectionString { get; }

    public IMongoCollection<MyDataBoundObject> DataBoundCollection { get; }

    private readonly MongoDbRunner _mongoRunner;

    private readonly string _databaseName = "my-database";

    public Mongo2GoFixture()
    {
        // start the Mongo2go instance
        _mongoRunner = MongoDbRunner.Start();

        // store the connection string, which includes the port Mongo2go chose
        ConnectionString = _mongoRunner.ConnectionString;

        // create a client and database for use outside the class
        Client = new MongoClient(ConnectionString);
        Database = Client.GetDatabase(_databaseName);

        // initialize your collection
        DataBoundCollection = Database.GetCollection<MyDataBoundObject>("databoundobject");
    }

    public void SeedData(string file)
    {
        // only import the seed file if the collection is still empty
        var documentCount = DataBoundCollection.CountDocuments(Builders<MyDataBoundObject>.Filter.Empty);
        if (documentCount == 0)
        {
            _mongoRunner.Import(_databaseName, "databoundobject", GetFilePath(file), true);
        }
    }

    // resolve the seed file path; one simple approach is relative to the test output directory
    private string GetFilePath(string file)
    {
        return Path.Combine(Directory.GetCurrentDirectory(), file);
    }

    public void Dispose()
    {
        _mongoRunner.Dispose();
    }
}

One noteworthy gotcha from the code block above: until the Mongo2go runner has started, you do not know which port it’s going to run on, because it cycles through ports on your machine until it finds an open one. So you cannot pre-emptively configure a port for your database connection in a settings file.
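To see this in isolation, here is a minimal sketch using the same MongoDbRunner API as the fixture above; the printed value is only illustrative.

using System;
using Mongo2Go;

// The port is only assigned once Start() finds a free one, so the
// connection string can only be read after the runner is started.
using (var runner = MongoDbRunner.Start())
{
    // prints something like mongodb://127.0.0.1:<port>/
    Console.WriteLine(runner.ConnectionString);
}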

Seeding data into your data store is an important step for tests like these. The seeding function above can be called at any point in your tests, but for the purposes of this article it’s enough to demonstrate seeding the Mongo2go instance from the test constructor, via the fixture.

public class Mongo2GoIntegrationTest : IClassFixture<CustomWebApplicationFactory>, IClassFixture<Mongo2GoFixture>
{
    private readonly CustomWebApplicationFactory _factory;

    private readonly Mongo2GoFixture _mongoDb;

    public Mongo2GoIntegrationTest(CustomWebApplicationFactory factory, Mongo2GoFixture mongoDb)
    {
        _mongoDb = mongoDb;
        _factory = factory;

        // seed-data.json is the JSON data to be imported into MongoDB
        _mongoDb.SeedData("./test_env/mongodb/seed-data.json");
    }

    [Theory]
    [InlineData("/my-endpoint/resource/1?param1=1")]
    [InlineData("/my-endpoint/resource/1?param1=1&param2=1")]
    [InlineData("/my-endpoint/resource/1?param1=1&param2=1&param3=1")]
    public async Task GetSalesHistoryByProductId_Returns200(string url)
    {
        // Arrange: point the application at the Mongo2go connection string
        var client = _factory.InjectMongoDbConfigurationSettings(_mongoDb.ConnectionString).CreateClient();

        // Act
        var response = await client.GetAsync(url);

        // Assert
        Assert.Equal(200, (int)response.StatusCode);
    }
}

Another important part of this test strategy is feeding in a custom appsettings.json file to replace the settings that would otherwise be picked up by default. I declare a function for this on a subclass of WebApplicationFactory, CustomWebApplicationFactory. However, because the connection data for Mongo2go only becomes available late in test initialization, I call the injection function within the test itself. There’s probably a better way to handle this, I just haven’t found it yet.

public class CustomWebApplicationFactory : WebApplicationFactory<Startup>
{
    protected override void ConfigureWebHost(IWebHostBuilder builder)
    {
        builder.ConfigureAppConfiguration(config =>
        {
            config.AddConfiguration(new ConfigurationBuilder()
                .AddJsonFile("appsettings.Integration.json")
                .Build());
        });
    }

    protected override IHostBuilder CreateHostBuilder()
    {
        return Host.CreateDefaultBuilder()
            .ConfigureWebHostDefaults(
                x => x.UseStartup<Startup>().UseTestServer()
            ).UseEnvironment("Integration");
    }

    // Injects the Mongo2go configuration settings into the running application
    public WebApplicationFactory<Startup> InjectMongoDbConfigurationSettings(string connectionString)
    {
        return WithWebHostBuilder(builder =>
        {
            builder.ConfigureTestServices(services =>
            {
                services.Configure<MongoConfiguration>(opts =>
                {
                    opts.ConnectionStringTemplate = connectionString;
                });
            });
        });
    }
}
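For completeness, here is a sketch of what the MongoConfiguration options class might look like. Its exact shape is an assumption on my part: only ConnectionStringTemplate is required by the factory above, and the remaining properties simply mirror the configuration keys used later by the Docker fixture.

// Bound from the "MongoDb" section of appsettings.Integration.json (assumed layout).
// Only ConnectionStringTemplate is used by InjectMongoDbConfigurationSettings.
public class MongoConfiguration
{
    public string ConnectionStringTemplate { get; set; }
    public string UserName { get; set; }
    public string Password { get; set; }
    public string Endpoint { get; set; }
    public string Database { get; set; }
    public string MyDataCollection { get; set; }
}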

That’s pretty much all there is to putting together basic tests using Mongo2go.

Docker Instance

The alternative to an in-memory instance of MongoDB is to stand up an instance inside a Docker container. This has the advantage of always giving you the latest features available in MongoDB (if you pull the latest image), courtesy of a Docker image maintained by MongoDB Inc.

To deploy a MongoDB instance, you can use a Docker Compose file that looks like the following:

version: '3.4'

services:
  eminencedigital-mongodb:
    image: mongo:latest
    hostname: eminencedigital-mongodb
    container_name: mongodb
    environment:
      MONGO_INITDB_ROOT_USERNAME: admin
      MONGO_INITDB_ROOT_PASSWORD: my-top-secret-password
      MONGO_INITDB_DATABASE: my-database
    ports:
      - "27017:27017"
    networks:
      - mongo_net

  eminencedigital-mongo-seed:
    build: .
    links:
      - eminencedigital-mongodb
    networks:
      - mongo_net

  eminencedigital-mongo-express:
    image: mongo-express
    restart: always
    ports:
      - "8084:8081"
    depends_on:
      - eminencedigital-mongodb
    environment:
      ME_CONFIG_MONGODB_ADMINUSERNAME: admin
      ME_CONFIG_MONGODB_ADMINPASSWORD: my-top-secret-password
      ME_CONFIG_MONGODB_SERVER: eminencedigital-mongodb
    networks:
      - mongo_net

networks:
  mongo_net:
    driver: bridge

The first service (eminencedigital-mongodb) runs the MongoDB image, and the second (eminencedigital-mongo-seed) seeds data into it using a CMD script in a Dockerfile that looks like the following. The third (eminencedigital-mongo-express) runs Mongo Express, a web-based admin UI that is handy for inspecting the seeded data.

FROM mongo

COPY seed-data.json /seed-data.json

CMD mongoimport --host eminencedigital-mongodb --db my-database --username admin --password my-top-secret-password --authenticationDatabase admin --collection my-data-collection --type json --file /seed-data.json

I encountered another caveat here that had me spinning my wheels for a while. I initially tried to deploy the MongoDB instance and seed it from the same Dockerfile, which had the unexpected outcome of the seeding script being unable to communicate with the MongoDB instance running in the same container. Breaking the deployment and the seeding out into separate services that share the mongo_net network fixed this.

As in the Mongo2go example, you can create a fixture class that instantiates a MongoDB client connected to the Docker instance, and use it in your tests. It’s a cleaner class, because the connection details are known up-front and can live in your settings file.

public class DockerMongoDbFixture : IDisposable
{
    public IMongoCollection<MyDataBoundObject> DataBoundCollection { get; }

    private readonly MongoClient _mongoDbClient;

    public DockerMongoDbFixture()
    {
        // connection details are known up-front, so they come straight from the settings file
        var config = new ConfigurationBuilder()
            .AddJsonFile("appsettings.Integration.json")
            .Build();

        var connectionUrl = string.Format(
            config.GetValue<string>("MongoDb:ConnectionStringTemplate"),
            config.GetValue<string>("MongoDb:UserName"),
            config.GetValue<string>("MongoDb:Password"),
            config.GetValue<string>("MongoDb:Endpoint"));

        _mongoDbClient = new MongoClient(connectionUrl);
        var database = _mongoDbClient.GetDatabase(config.GetValue<string>("MongoDb:Database"));
        DataBoundCollection = database.GetCollection<MyDataBoundObject>(config.GetValue<string>("MongoDb:MyDataCollection"));
    }

    public void Dispose()
    {
        // nothing to clean up; the container's lifecycle is managed outside the test run
    }
}
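Consuming this fixture from a test looks almost identical to the Mongo2go version. Here is a sketch, assuming the Docker connection settings are already baked into appsettings.Integration.json, so no injection step is needed; the class and endpoint names are placeholders.

using System.Threading.Tasks;
using Xunit;

public class DockerMongoDbIntegrationTest : IClassFixture<CustomWebApplicationFactory>, IClassFixture<DockerMongoDbFixture>
{
    private readonly CustomWebApplicationFactory _factory;

    private readonly DockerMongoDbFixture _mongoDb;

    public DockerMongoDbIntegrationTest(CustomWebApplicationFactory factory, DockerMongoDbFixture mongoDb)
    {
        _factory = factory;
        _mongoDb = mongoDb;
    }

    [Fact]
    public async Task GetResource_Returns200()
    {
        // No InjectMongoDbConfigurationSettings call here: the connection string
        // for the Docker instance is already present in the configuration file.
        var client = _factory.CreateClient();

        var response = await client.GetAsync("/my-endpoint/resource/1?param1=1");

        Assert.Equal(200, (int)response.StatusCode);
    }
}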

In my experience, there was very little difference in performance between the two approaches. A more thorough load analysis would be required to pick out any performance benefit of one over the other, but I was impressed, perhaps more so by Mongo2go, for keeping up with the Docker version on almost every level. The same tests work with either approach.

To run your tests against the Docker instance, you will need to “docker-compose up” your containers, either from the command line or via tooling that runs it from within your code.
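If you prefer the second option, one rough way to do it is to shell out to docker-compose from a fixture. The sketch below is an assumption of mine rather than part of the original setup; it expects docker-compose to be on the PATH and the compose file to sit in the test’s working directory, and in practice you would also want to wait for MongoDB to accept connections before letting tests run.

using System;
using System.Diagnostics;

public class DockerComposeFixture : IDisposable
{
    public DockerComposeFixture()
    {
        // bring the containers up in the background before any test runs
        RunDockerCompose("up -d");
    }

    public void Dispose()
    {
        // tear the containers down once the tests are done
        RunDockerCompose("down");
    }

    private static void RunDockerCompose(string arguments)
    {
        using var process = Process.Start(new ProcessStartInfo
        {
            FileName = "docker-compose",
            Arguments = arguments,
            UseShellExecute = false
        });

        process?.WaitForExit();
    }
}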

To summarize, Mongo2go and a MongoDB instance running in Docker are both viable candidates for integration testing applications that use a MongoDB or DocumentDB database. If I had to pick a winner, it would be the Docker instance, simply because it stays up to date with the capabilities of a real MongoDB database. Mongo2go is nonetheless impressive for how much it can do given that it’s not a full instance of the database, or at least less of one relative to the Docker instance. If you go with the Docker instance like I did, bear in mind that Mongo2go is still a solid choice for your unit tests.
