
Unit testing Lua code

  • Has anyone here written unit tests for their Lua code?
    If so, which modules do you use? Do you run them in the Translator, or with a stand-alone Lua interpreter?

    If you don’t have unit tests, how do you make sure your software works correctly? Just a bunch of regression tests and keeping your fingers crossed?

    I have tried something with LuaUnit, and LeMock for mocking the neighbours (I mean the other Lua modules, not the farmer next to my office). It seems to work fine, although LeMock in particular is a bit cumbersome; the Moq framework for .NET is much easier to use.

    Just an example, to test this simple module chico.lua:

    
    require "groucho"
    chico = {}
    function chico.zeppo()
    	local result = groucho.harpo()
    	return result
    end
    

    I have come up with this test code:

    
    local LuaUnit = require( "luaunit" )
    require( "lemock" )

    test_chico = {}
    function test_chico:test_zeppo()
    	-- Record phase: declare the call we expect and what it returns
    	local mc = lemock.controller()
    	groucho = mc:mock()
    	groucho.harpo() ; mc:returns(123)

    	-- Make require("groucho") load the mock instead of the real module
    	package.loaded.groucho = nil
    	package.preload['groucho'] = function ()
    		return groucho
    	end
    	mc:replay()

    	-- Exercise the code under test
    	require( "chico" )
    	local result = chico.zeppo()

    	-- Verify all expected calls happened, then check the result
    	mc:verify()
    	assertEquals(result, 123)
    end
    LuaUnit:run()
    

    We’re doing a project on this ourselves. You can see a small unit test here:

    https://github.com/interfaceware/iguana-web-apps/blob/master/shared/test/file.lua

    and here is some helper code:

    https://github.com/interfaceware/iguana-web-apps/blob/master/shared/unittest.lua

    It wouldn’t be a big deal to slot in another framework for doing unit testing; we just haven’t bothered so far. Internally we wrote our own C++ unit-testing framework, and with Lua it seems even more trivial to write an equivalent ‘framework’, given the ease with which you can arrange a tree of functions in a nested Lua table and execute them.
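
    To make that concrete, here is a rough sketch of the nested-table idea (not our actual internal code; the names tests and runTests are made up for the example):

    -- A tree of test functions arranged in nested Lua tables
    local tests = {
    	strings = {
    		upper = function() assert(("abc"):upper() == "ABC") end,
    		rep   = function() assert(("ab"):rep(2) == "abab") end,
    	},
    	math = {
    		floor = function() assert(math.floor(1.7) == 1) end,
    	},
    }

    -- Walk the tree recursively, running every function it contains
    local function runTests(node, prefix, results)
    	results = results or {passed = 0, failed = 0}
    	for name, item in pairs(node) do
    		local label = prefix and (prefix.."."..name) or name
    		if type(item) == "function" then
    			local ok, err = pcall(item)
    			if ok then
    				results.passed = results.passed + 1
    			else
    				results.failed = results.failed + 1
    				print("FAIL "..label..": "..tostring(err))
    			end
    		elseif type(item) == "table" then
    			runTests(item, label, results)
    		end
    	end
    	return results
    end

    local r = runTests(tests)
    print(("passed %d, failed %d"):format(r.passed, r.failed))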

    For the most part we’re focusing on the problem of how to store the tests (using Git) and how to run them on a small farm of Iguana test servers. Wade is working on it in a fork of the main Iguana repository; see:

    http://help.interfaceware.com/kb/regression-testing-round-ii-ding-ding
    https://github.com/wshrewsbury/iguana-web-apps

    He’s starting off by putting the results of a test into a data-table view, but we’ll probably end up putting them into a tree view like the one we are using for another project, to make it easy to create a self-describing web service:

    http://help.interfaceware.com/kb/self-describing-web-service

    It’s on my TODO list to get both of these apps into a state I can demo and run for the California workshop road trip we’re doing the week of the 16th. Generally we’re trying to build out a set of tools that address common problems, and to use them internally so that we get them to the point where they are really usable at full scale.

    Hi Eliot,

    That sounds like quite a promising approach. I like the idea of the web service!

    I wouldn’t have rolled my own unit test module; after all there are plenty of usable modules out there. I’m quite happy with LuaUnit, but I think lunit or lunity would have worked fine, too.

    (I failed to install busted; it requires LuaRocks and a bunch of other modules, and installing them didn’t quite work out. Oh well :-))

    I don’t like LeMock very much, though, but I haven’t found any other mocking framework for Lua. Maybe, if I’m feeling very inspired, I’ll write something similar to Moq for Lua.

    About the tests you wrote: are they just proofs of concept or actual tests? I was surprised by the tests which write files to the temp dir (and leave them lying around afterwards). As I understand it, a unit test should not write to the file system; that should be mocked away. Well, everyone does as they please 🙂

    Take care,

    Robin

    Most of the effort for the unit test system is related to:

    1. Getting the distribution of code to machines over the network – i.e. the spinner module (see http://help.interfaceware.com/kb/the-anatomy-of-an-iguana-app/4)
    2. Getting the workflow right.

    It was a couple of hours’ work to write a bit of code to describe a collection of tests and write a compare function – less time than it would take to evaluate the frameworks out there. Having said that, I am not averse to someone else spending the time required to understand the big advantages of the existing frameworks – it just doesn’t seem like a big problem.

    Some people get very keen on having set-up and tear-down routines for unit tests – I usually prefer not to get so fancy…
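
    For reference, that sort of thing looks roughly like the sketch below in LuaUnit (this assumes LuaUnit’s setUp/tearDown support; the file contents are arbitrary):

    local LuaUnit = require( "luaunit" )

    test_files = {}

    -- setUp runs before each test in this table
    function test_files:setUp()
    	self.path = os.tmpname()
    	local f = assert(io.open(self.path, "w"))
    	f:write("hello")
    	f:close()
    end

    -- tearDown runs after each test, pass or fail
    function test_files:tearDown()
    	os.remove(self.path)
    end

    function test_files:test_read()
    	local f = assert(io.open(self.path, "r"))
    	assertEquals(f:read("*a"), "hello")
    	f:close()
    end

    LuaUnit:run()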

    The unit tests themselves haven’t been fleshed out much (5 unit tests hardly constitutes large test coverage 😉 ). The problem is that until Wade gets his test runner application finished and in production, it’s not convenient to run the tests, and until it’s convenient, unit tests are not much use. We do have a lot of other unit tests that we run in production off our typical TeamCity builds, but that isn’t very convenient either: it’s a pain when a test breaks, since the process for getting visibility on why the test broke isn’t smooth, and the solution doesn’t provide an easy mechanism for dropping the code into the Translator and just being able to see the problem.

    Where I would really like to get to is a workflow where, when a test breaks, you get a red icon you can click on. When you do, it should drop you into a live Translator IDE with the code that failed, with sample data priming the test so that it reproduces the problem right there. From that point it should be a simple matter to fix the problem in the Translator and commit the code back into the repository.

    It should be possible to run the tests before you commit too.

    It’s likely to take us a few iterations before we get to that point.

    For testing a file API it’s necessary to exercise the real file system; otherwise the tests won’t detect differences between the operating systems we support. Unit-test mocking is a double-edged sword: it can be helpful, but at the same time it can hide the real underlying foundation that you use in production (see http://help.interfaceware.com/kb/the-anatomy-of-an-iguana-app/5 for some background).

    The unit tests for files should have logic which cleans up the remnants of old tests, though.
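
    Something along these lines would do (a sketch only; the helper name withTempFile is made up):

    -- Run a test body against a scratch file, removing the file afterwards
    -- even if the body raises an error
    local function withTempFile(body)
    	local path = os.tmpname()
    	local ok, err = pcall(body, path)
    	os.remove(path)  -- clean up the remnant either way
    	if not ok then error(err, 0) end
    end

    -- Usage: the scratch file never outlives the test
    withTempFile(function(path)
    	local f = assert(io.open(path, "w"))
    	f:write("test data")
    	f:close()
    	local g = assert(io.open(path, "r"))
    	assert(g:read("*a") == "test data")
    	g:close()
    end)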

    Alrighty – Wade has hit another few milestones on the unit testing tool:

    http://help.interfaceware.com/kb/regression-testing-round-ii-ding-ding

    This has been ported over to Iguana 6 and is now available on GitHub:

    https://github.com/interfaceware/iguana-apps

    We sometimes get questions about how one can automate running unit tests. You have a number of options for doing this.

    The questions come down to things like how to trigger an automated test run when there is a check-in. There are a variety of solutions to that. If you are hosting on something like GitHub, then GitHub itself has webhooks that you can have call an Iguana instance via HTTP:

    https://developer.github.com/webhooks/
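
    On the Iguana side, a From HTTP(S) channel script could receive that webhook and kick off a test run. The sketch below assumes Iguana’s net.http.parseRequest/net.http.respond and json APIs; runTests is a placeholder for whatever actually executes your suite:

    -- From HTTP(S) channel script: receive a GitHub push webhook and
    -- trigger a test run. net.http.parseRequest/respond and json.parse
    -- are assumed APIs here; runTests is a placeholder.
    function main(Data)
    	local request = net.http.parseRequest{data = Data}
    	local ok, event = pcall(json.parse, {data = request.body})
    	if not ok then
    		net.http.respond{body = 'bad payload', code = 400}
    		return
    	end

    	-- e.g. check out event.after (the pushed commit) and run the suite
    	local results = runTests(event.repository.name, event.after)

    	net.http.respond{body = json.serialize{data = results}, code = 200}
    end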

    If you are just using a plain Git server, you can use a check-in hook to fire off a script, using something like the curl command-line tool to make an HTTP request. Ditto if you want to trigger runs from a build server like TeamCity.

    In terms of checking code out of Git, you can use the Git command line, which can be invoked from within Iguana. Then you can run a script using the spinner library, as documented here:

    http://help.interfaceware.com/kb/the-anatomy-of-an-iguana-app/4
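
    Invoking Git from a Translator script can be as simple as shelling out with io.popen; a sketch, assuming the git binary is on the server’s PATH (the directory below is a placeholder):

    -- Shell out to git to fetch the latest code before running tests
    local function gitPull(dir)
    	local pipe = assert(io.popen('cd "'..dir..'" && git pull 2>&1'))
    	local output = pipe:read('*a')
    	pipe:close()
    	return output
    end

    print(gitPull('/opt/iguana/tests'))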

    That gives you all the ingredients needed to run automated unit tests. If you are experienced in using Iguana with web services, popen for processes and so on, it’s all straightforward enough. If you do not have those skills, then not so much.

    We have a low-priority internal project looking at this for some of our own needs, which we will share when it is available.

    If you are interested and motivated around this topic, by all means do chime in.

