Flex Unit Testing and Continuous Integration: Part 2 of 2

This is part 2 of an article about testing Flex and integrating Flex into a continuous integration environment, so it's worth reading part 1 first.

FlexUnit in Ant

Once you've written your unit tests for your Flex project, how are you going to run them? Sure, you can just fire up the test runner that I wrote about before, but you've got to remember to run it and actually look at the green/red bar to know if things are going wrong. Ideally you'd get your unit tests to run as part of your build so that you get the advantages of continuous integration.

With this in mind, I took a look at Peter Martin's FlexAntTask, which allows you to run your FlexUnit tests from within an Ant build.

This works a treat on my local machine with a build something like this:
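
The interesting part boils down to two targets. Property names and paths here are placeholders, and the flexunit attributes are from memory, so check them against the version of the task you're using:

<!-- Adobe's Flex Ant tasks provide mxmlc; the flexunit taskdef comes from the FlexAntTasks jar
     (taskdef omitted here as it depends on where you keep that jar) -->
<taskdef resource="flexTasks.tasks" classpath="${FLEX_HOME}/ant/lib/flexTasks.jar"/>

<target name="compile-test-runner">
    <!-- Compile the custom runner that pushes test results out over a socket -->
    <mxmlc file="${test.src.dir}/AntTestRunner.mxml" output="${build.dir}/AntTestRunner.swf">
        <source-path path-element="${src.dir}"/>
        <library-path dir="${libs.dir}" append="true">
            <include name="*.swc"/>
        </library-path>
    </mxmlc>
</target>

<target name="run-tests" depends="compile-test-runner">
    <!-- Launch the compiled SWF in the standalone player and wait for the results -->
    <flexunit swf="${build.dir}/AntTestRunner.swf" toDir="${report.dir}"
              haltonfailure="true" verbose="true"/>
</target>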

What's happening here is that we use the mxmlc task to compile AntTestRunner.mxml, which is Peter's custom test runner that allows reporting of test results. We then tell the flexunit task to run that compiled SWF. What's really clever is that the flexunit task sits and listens for test results, which are sent from the AntTestRunner app over a socket connection, and then allows the build to continue or fail depending on the results it got back. What's important to note is that the flexunit task will fire up your tests in the standalone desktop Flash Player. If you've got code that does any networking or reads a config file from the assets directory, for instance, you may need to set FlashPlayerTrust to allow your app to run in the local-trusted sandbox (see the sparse Adobe docs on the matter, and the sketch below). This is all good so far, but what about running a build in a continuous integration environment such as Hudson or CruiseControl, where you might have a headless server on which you can't run the Flash Player?
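
As for that local-trusted sandbox: granting trust just means dropping a .cfg file listing the directories to trust into the player's FlashPlayerTrust directory. A quick sketch of doing it from Ant, assuming a per-user player install on Linux (the target name and paths are mine, so adjust them for your platform):

<property name="flash.trust.dir"
          value="${user.home}/.macromedia/Flash_Player/#Security/FlashPlayerTrust"/>

<target name="trust-test-swf">
    <mkdir dir="${flash.trust.dir}"/>
    <!-- Each line of the .cfg file is a path the player treats as trusted -->
    <echo file="${flash.trust.dir}/flexunit.cfg" message="${basedir}/build${line.separator}"/>
</target>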

FlexUnit and Hudson

The chances are that most Flex developers just have FlexBuilder on their local machine: they code away all day and build locally before uploading to the test or production website. In the continuous integration world, however, we want a build server that is watching our source code, building our app all the time and letting us know how the tests are getting on. We use Hudson for this purpose. Getting a new project into Hudson is simplicity itself, but it's a little trickier when you're building a Flex project. First of all you have to make sure that the Flex SDK is installed on your build machine; thankfully you can grab this for free without having to install FlexBuilder. The second tricky thing is that if you're running your build on a remote Linux server without a screen, you won't be able to pop up the Flash Player to run your tests. So we need to push it onto a virtual display by running something like a VNC server. As it happens, Hudson comes with an Xvnc plugin that starts and stops a VNC server around your build so that your entire Ant build runs in a virtual display. However...before I found this out, I had already decided that I'd like to customise my build and the FlexUnit task to do this for me.

So my Ant build starts the VNC server itself with an exec task and checks the result code: 0 means the service was correctly started, while 98 (on Linux) means it was already running.
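
A cut-down version of those targets; the display number, property and target names are assumptions rather than gospel:

<target name="start-vnc">
    <!-- Start a VNC server on display :99; 0 means it started cleanly,
         98 (on Linux) means one was already running, so accept both -->
    <exec executable="vncserver" resultproperty="vnc.result" failonerror="false">
        <arg value=":99"/>
    </exec>
    <fail message="Could not start VNC server (exit code ${vnc.result})">
        <condition>
            <not>
                <or>
                    <equals arg1="${vnc.result}" arg2="0"/>
                    <equals arg1="${vnc.result}" arg2="98"/>
                </or>
            </not>
        </condition>
    </fail>
</target>

<target name="stop-vnc">
    <!-- Tidy up once the tests have run -->
    <exec executable="vncserver" failonerror="false">
        <arg line="-kill :99"/>
    </exec>
</target>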

However, how do I tell the Flash Player to run on that display? I had a little fiddle with Peter's FlexUnit project and altered the UnixSWFLauncher to do this:

import org.apache.tools.ant.Project;
import org.apache.tools.ant.taskdefs.ExecTask;
import org.apache.tools.ant.types.Environment;

// SWFLauncher is the interface from Peter's FlexAntTasks project
public class UnixSWFLauncher implements SWFLauncher
{
    private String display;
    private String command = "gflashplayer";
    private Project project;

    public UnixSWFLauncher(Project project, String display, String cmd) {
        this.display = display;
        // Allow the player executable to be overridden; default to gflashplayer
        if (cmd != null && !cmd.equals("")) {
            this.command = cmd;
        }
        this.project = project;
    }

    public void launch( String swf ) throws Exception
    {
        // Normalise any Windows-style separators in the path to the SWF
        swf = swf.replace( '\\', '/' );

        ExecTask exec = new ExecTask();
        exec.setProject(project);
        exec.setExecutable(command);
        exec.createArg().setValue(swf);

        // If a display was supplied, point the player at it via the
        // DISPLAY environment variable (e.g. the VNC server's :99)
        if (display != null && !display.equals("")) {
            Environment.Variable var = new Environment.Variable();
            var.setKey("DISPLAY");
            var.setValue(display);
            exec.addEnv(var);
        }
        exec.execute();
    }
}

Essentially, if the user has sent through the "display" parameter on the flexunit task, we set the DISPLAY environment variable to that value before running the player.
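
With that change in place, the build just points the flexunit task at the display the VNC server is sitting on. Roughly like this, where display is the attribute my patch adds and command (assuming the patched task exposes it under that name) only matters if your standalone player isn't called gflashplayer:

<target name="run-tests" depends="compile-test-runner, start-vnc">
    <flexunit swf="${build.dir}/AntTestRunner.swf" toDir="${report.dir}"
              haltonfailure="true" verbose="true"
              display=":99" command="/usr/local/bin/flashplayer"/>
</target>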

Now for an aside regarding starting the Hudson server...it turns out that our Hudson server was running as the user "hudson", but if you log in as, say, "kieran" and do "sudo /etc/init.d/hudson restart", it does run as "hudson", except that deep down somewhere the VNC code somehow knows it was really "kieran" who ran it, and it ends up starting the VNC server under "kieran"'s environment while the Flash Player tries to run under the "hudson" environment...causing much chaos with permissions...bleah.

The Complete Build Process

So...once that is all done, we have a build process that runs something like this:

  1. Write a test to define your code behaviour
  2. Add test to test suite
  3. Test will fail or not even compile
  4. Implement code to make test pass
  5. Run the test/write code until it passes via FlexBuilder, for a quick turnaround without having to run a full Ant build
  6. Run a local Ant build to confirm that your project is building
  7. Check your changes into your source repository (SVN/CVS)
  8. Hudson will then detect the changes to the source code and schedule a build
  9. Hudson goes through your Ant build (a sketch of how the targets hang together follows this list):
    1. Clean/create build directory
    2. Compile test runner
    3. Start VNC server
    4. Run flexunit task on the remote display
    5. AntTestRunner reports the test results back to the flexunit task
    6. flexunit task generates JUnit-style XML reports
    7. Stop VNC server
    8. Compile the actual MXML application into a SWF with the mxmlc task
  10. Hudson notifies you if you've got a failed build, or hopefully generates nice 100% pass JUnit reports and keeps an archive of your build artifacts (probably your SWF).
  11. Repeat...
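
For reference, the top of the Ant file wires those targets together roughly like this (the target names are mine, and the Xvnc plugin would replace the start-vnc/stop-vnc pair if you go that route):

<target name="ci"
        depends="clean, compile-test-runner, start-vnc, run-tests, stop-vnc, compile-app"
        description="Full build as run by Hudson"/>

One wrinkle: with haltonfailure="true" a failing test stops the chain before stop-vnc runs, so you may prefer to record failures and only fail the build after the VNC server has been killed.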

What's really pleasing about all of this is that it essentially puts our Flex development on a par with our Java development in terms of a set of processes to ensure code quality. Having this process in place also forces you to develop your Flex code in a more test-driven manner, with business logic better separated from your views. Now we just need to get our automated Flex acceptance testing into place.
