The article series
This is the second article in our dogfooding series, and here I’ll describe how we plan and track the features to be developed for Testlab. One could call this requirements management or backlog management. The target is very much a moving one: as I’m writing this, we are developing a Kanban board and tasks feature that will once again change our process – most probably for the better. Anyway, here’s what we do now.
If you haven’t yet read the first article, you should check it out first – this article makes more sense when you know the background. The whole series is planned as follows:
- Testing a test management tool – How we develop and how we use Milestones to schedule our releases
- From ideas to features – Working with development ideas as requirements and getting them done
- Test design – How we plan what to test
- Execution planning – How we plan the whole testing
- Issues – How we work with bugs and other issues
- Reporting – What we track and how we benefit from it
- Wrap-up – What we have learned and what happens next
Where do we stand
Each company, tool, application, and service works differently, and thus features are picked for development in many different ways. Sometimes designers try prototypes; sometimes customers define strict business requirements, which are sometimes required by law to be implemented in a certain way. There is no single true way that works for everyone. Our tool, Meliora Testlab, has a few important characteristics we have to take into account:
- Our customers. We have a steady group of customers, and we always want to serve them better with a better product, so we need to keep adding new, interesting features to our tool. This also serves our company: the better our test management tool is, the more attractive it is to future customers as well.
- Our product is central to many of our customers’ business, so we need to weigh carefully how changes affect its usage. Some of our end users are business people who use our product only to test something or to define business cases. We don’t want to change the behavior of our solution too much – or, if we are forced to make a more significant change, we make sure the solution is at least as intuitive as it was before.
- You would be surprised to see how differently our tool is used. We need to understand all these ways of working and support the most important ones.
- We need to stay fresh. This means always finding new ways to make the tool better, not just copying what others do.
Managing the ever-increasing amount of bright ideas
We are sometimes asked if our tool supports DevOps. It does. Do we practice it ourselves? Well, we do not follow any DevOps holy book or set of rules as such. We do, however, try to collect development ideas from all possible sources effectively, keep the development process lean, and maintain an effective QA process. The cooperation between development and operations is seamless. So in that sense, we pretty much do DevOps ourselves.
So how do we do it? To pool ideas from any source, we write them down as issue tickets in our own Testlab. We use the issue type “Enhancement” for changes to existing features and “New feature” for new functionality. Although we try to be QA superstars and sleep uneasily if we let bugs slip into production, it happens sometimes, so now and then we also need the type “Defect”. Some of the enhancement ideas originate from our own team, and some come from our customers. When customers send us ideas for making Testlab better – via our ticketing system, in meetings, in training sessions, or by any other means – they all end up in our Testlab. Most of the ideas from our own team come from lengthy conversations in our office. Testing produces enhancement ideas quite often, and so do the brainstorming sessions we hold from time to time. If the same idea comes from multiple sources, we update the existing ticket and add a tick to its request count. This gives us an idea of which features have been asked for more often than others.
So we have a pretty big list of these enhancement ideas. To cope with it, we classify the requests using the following fields:
- Area in Testlab
Classifying issues helps us group them into more manageable sets. We go through this list each quarter, but we also pick features from it mid-release when feasible. We focus on defects first and look at priority often as well. Sometimes we pick an area of Testlab to focus on and take multiple enhancements from that area. If a feature is easy to implement, it has a higher chance of jumping into the development queue. We use tags to mark the picked features.
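The actual triage happens inside Testlab, but the selection logic described above – defects first, then high-priority, frequently requested, easy-to-implement items – can be sketched roughly as follows. The `Ticket` fields and the sort order are illustrative assumptions, not Testlab’s real data model:

```python
from dataclasses import dataclass

@dataclass
class Ticket:
    title: str
    type: str           # "Defect", "Enhancement" or "New feature"
    priority: int       # 1 = highest priority
    request_count: int  # how many times the idea has been requested
    effort: int         # rough implementation effort, 1 = easiest

def triage(backlog, capacity):
    """Pick tickets for the next development round: defects first,
    then the rest ordered by priority, request count, and effort."""
    defects = [t for t in backlog if t.type == "Defect"]
    rest = sorted(
        (t for t in backlog if t.type != "Defect"),
        key=lambda t: (t.priority, -t.request_count, t.effort),
    )
    return (defects + rest)[:capacity]

backlog = [
    Ticket("Export report as PDF", "Enhancement", 2, 7, 2),
    Ticket("Login fails on Safari", "Defect", 1, 1, 1),
    Ticket("Kanban board", "New feature", 1, 12, 5),
]
print([t.title for t in triage(backlog, 2)])
# → ['Login fails on Safari', 'Kanban board']
```

In practice the picking is a human judgment call rather than a fixed formula; the sketch only shows how the classification fields feed into the decision.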
What happens after we decide to develop something varies. Some features require more research and design before we even know how the feature should be built. Sometimes we need to experiment with the idea; sometimes we can write a technical description right away. When developers get ideas on how Testlab could work better, we sometimes skip the ticket, discuss briefly how the feature would work, and just implement it. What is common to all these paths is that at some point we describe the feature in Testlab as a requirement (or as a user story – more about this later). We have organized our requirement tree by feature, so from our requirement list we can easily see how certain features should work and what other features we have designed for the same functional area.
Using Requirements and User stories to define how Testlab ticks
For defining how Testlab works, we have used both the traditional requirement format and the use case format. For clarity, in this post we talk about requirements, but they can be written in user story format as well – both formats work equally well. We try to maintain a good definition of each feature but do not try to document all possible combinations of functionality as requirements. In short, that would be impossible: if we somehow wrote everything down, the requirements would be so long that reading the documentation would become impractical. All in all, we keep our existing and upcoming features in Testlab’s requirements section. We use the milestone field to mark when a requirement was implemented, or when it should be implemented – most often this is the next milestone coming up. Either way, it is always easy for us to produce a list of what is being developed for the upcoming release. If we make a change to an existing feature, we “continue design” on the requirement instead of creating a new one. This way our requirement tree is always up to date, and the history information is preserved. When we do this, we uncheck the “covered” checkbox so we know to check whether the test cases for the requirement are up to date, and update them if not.
We often refine requirements into finer-grained test requirements. These are sub-requirements of the “official” requirements that describe what is to be tested for each requirement. They can be written early on, before the feature is actually developed. When developers see how the feature is to be tested, they can take that into account while developing it. This is a very cost-efficient way of averting bugs before the code is written, and it helps to design and maintain the test cases. It also drives communication and collaboration between the people responsible for testing and the developers implementing the feature, and makes the team tighter.
Because each requirement is split into smaller pieces, a change in a requirement does not force us to change all the test requirements – we only need to check the test cases that are linked to the affected test requirements.
What will the future bring us – are requirements still relevant?
When I think about how our process around designing features works now, I would say it works pretty well. Our product backlog is well organized and easy and fast to use. The list of wishes is quite big, but working with it is as efficient as it can be. We do store loads of information as requirements – a lot – which makes our requirement tree a big one. When someone new to Testlab sees it, it can be a bit overwhelming. Then again, we’ve kept the information in it quite relevant. That task is not always easy, as information arrives in different formats and there is no single correct way of storing it. Still, the idea of restructuring the requirement tree haunts us.
We are currently developing a task feature that will allow us to use our own tool to split design and other work into tasks. This will surely affect the way we work in the future.
Reacting to feature requests
As mentioned, we collect enhancement ideas from our customers and add them to our backlog. Sometimes customers of Meliora Testlab have more specific or more urgent needs and do not want to wait for them to be developed on the normal schedule. For this, Meliora offers sponsored features. When a customer requests a feature this way, Meliora writes a feature description document describing how the business need behind the request would be addressed. Meliora wants to keep control of how the feature details are implemented so that the feature supports all users. If the customer agrees with the design, we start developing the feature. Almost always, these sponsored features are deployed for all Testlab users. This has many benefits: all Testlab users have the same set of features available, and product management for us as a vendor stays as lean as possible.
This concludes this article about working with feature requests. Stay tuned for the next article, about designing test cases!