Wednesday, December 23, 2009

Merry Christmas!

Christmas is closing in, and the whole team will be gone until 7 January. We're leaving mid-sprint: one week is done, and the remaining week will be completed from 7 January onward. It was either that or a week of unstructured slack time, and a four-day sprint didn't feel like an option. Anyway, I think a two-session sprint will work fine, and certainly better than unplanned time.

On another topic, I received a Christmas gift today, as I finally got the book I ordered three weeks ago. It should have been delivered in 3-6 days, but I guess these are busy times for the book shop... The book is The Art of Unit Testing by Roy Osherove. I've heard a lot of good things about it, so I hope it will be an interesting read. I'll try to post a review when we're back.

Merry Christmas to all of you, and a Happy New Year. See you in the next decade!

Wednesday, December 9, 2009

Selenium not a success story

Since the beginning of this project our Test Manager has been developing automated acceptance tests for our User Stories using Selenium. While it is great to have automated acceptance tests, it is not so great when you can't get them to run on your build server. The whole point of automated acceptance tests is to get feedback as soon as possible so you can correct either the tests or the code that broke them.

Our goal is to run the Selenium server as a Windows service. First I read this post http://clearspace.openqa.org/thread/17226, where I got the impression that it was not possible to run Selenium as a service. Then I read this post http://clearspace.openqa.org/message/39530, saying that it was possible. Anyhow, we have not gotten Selenium to run successfully as a service; or rather, it does run, but not reliably.

What are our options?
We decided to timebox the Selenium server experiments to a maximum of one week. During that time we will put another developer on the problem to get a fresh perspective. There are three options to try:
1. Get the Selenium server to run as a service. As we already know, this might not be possible.
2. Always log in to the server, start the Selenium server manually, and never log out. The Selenium server works fine when someone is interactively logged in.
3. Try to start the Selenium server from a scheduled task. I don't think we have tried this yet.
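For reference, option 3 can be sketched roughly like the command below. The task name, Java invocation, and jar path are placeholders I made up for illustration, not our actual setup:

```
rem Create a scheduled task that starts the Selenium RC server at boot.
rem "SeleniumServer" and C:\selenium\selenium-server.jar are placeholders.
schtasks /create /tn "SeleniumServer" ^
         /tr "java -jar C:\selenium\selenium-server.jar" ^
         /sc onstart /ru SYSTEM
```

Running under the SYSTEM account avoids depending on an interactive login, which is exactly what makes option 2 fragile.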

An alternative to running the acceptance tests on the build server is to run them on the developers' machines, just like the unit tests.

I will share our final decision in a future post, and would of course appreciate your feedback on how you work with automated acceptance tests and what tooling you use.

Friday, November 27, 2009

Easing up the XP

We have not reached the desired productivity level just yet, even though development gets faster every day. Since we have a challenging deadline, we have decided to ease up a little on the XP practices to move forward faster. This is what we did.

We first held a retrospective with the team to get their view of what is slowing each of them down. Pair programming and inexperience with TDD were the key problems. Some said that coding in a pair takes longer than coding alone, since you need to explain everything you do. Some said that thinking about writing the test first probably took three times as long as just writing the code.

The actions we decided on were:
  1. If you get stuck with Test First, just write the code and add the test afterwards.
  2. If you feel that you aren't adding anything while pairing, go to the task board and start investigating the next task.
  3. Don't ask too much and don't tell too much while pair programming. Stick to the problem you're working on.

Thursday, November 12, 2009

TDD is hard

Writing the tests isn't the hard part, at least not for me. The hard part is what comes once you have them. I often find myself doing my refactorings first and then correcting the tests afterwards, once the refactoring is done.

How do you work with refactoring in a TDD project?

Another hard part is knowing when to stop writing tests: finding the level where you have good tests and remain productive. I could spend an entire day writing tests to validate some data, but that wouldn't be very productive. Fifteen minutes with a small number of tests would probably produce the code needed; adding more tests would verify that even more cases work, but they probably all do from the start.
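To make the trade-off concrete, here is a small sketch of what "good enough" might look like for a data-validation function. It's in Python rather than our C# stack, and the function itself is invented for illustration: one happy-path case, the boundaries, the values just outside them, and a wrong type.

```python
def is_valid_age(value):
    """Accept whole numbers in the range 0-150; everything else is invalid."""
    # Exclude bool explicitly, since bool is a subclass of int in Python.
    return isinstance(value, int) and not isinstance(value, bool) \
        and 0 <= value <= 150

# Fifteen minutes' worth of tests: representative cases, not every input.
assert is_valid_age(30)        # ordinary valid value
assert is_valid_age(0)         # lower boundary
assert is_valid_age(150)       # upper boundary
assert not is_valid_age(-1)    # just below the range
assert not is_valid_age(151)   # just above the range
assert not is_valid_age("30")  # wrong type
```

You could keep adding cases all day, but past the boundaries and an obvious type check, each new test verifies something the existing ones already make very likely.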

When do you decide the tests are good enough and move on to the next function?

Wednesday, November 4, 2009

Intensity

In the past three weeks...

I've learnt the basics of XP.
I've learnt the basics of Planning Poker.
I've learnt the basics of Pair Programming.
I've learnt the basics of Test Driven Development (TDD) with Rhino Mocks.
I've learnt that a project that rapidly grows with new members is exhausting.
I've learnt that you must develop MOSS sites in a Virtual Machine.
I've learnt how to enable stack trace on MOSS.
I've learnt new keyboard shortcuts in Visual Studio.

and probably more...
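For anyone curious about the stack trace point above: in MOSS it comes down to two settings in the web application's web.config. From memory (so double-check the exact attributes on your own farm), the relevant fragments look like this:

```xml
<!-- Under <system.web>: show real errors instead of the friendly page -->
<customErrors mode="Off" />

<!-- In the SharePoint <SafeMode> element: include the call stack -->
<SafeMode ... CallStack="true" ...>
```

Remember to revert these on anything public-facing; they are a debugging aid, not a production setting.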