Getting started with Testlab

This article helps you plan your deployment of Testlab. It gives pointers on things to consider when putting Testlab into use, and food for thought when adapting Testlab to your software development process.

Adapting a test management tool to your software process usually takes a bit of planning. Tools are easy to use, but to make sure the features – especially the analysis features – are put to use in the best way possible, there are some things to plan and decide before jumping in head-on. In this post, we will discuss different details of Testlab’s features and its way of storing quality-related assets, and give pointers on how to put Testlab into use as efficiently as possible.

  • Familiarizing yourself with Testlab

    Even though we speak of Testlab as a software-centric test management tool, you should keep in mind that Testlab is a fully featured quality management suite. This means that you can use Testlab for specification, verification, and analysis of your product quality regardless of what your product actually is.

    For simplicity’s sake, in this article we’ll discuss Testlab in the context of software development projects. This is also natural because Testlab offers out-of-the-box integrations to commonly used software development tools such as Atlassian JIRA and Confluence.

    The easiest way to grasp what Testlab is about is to:

    1. Read through this article.
    2. Log into the “Demo” project and browse through the data. As you look around, it is good to read the description of the demo project, which gives you insight into the (fictional) software development project the Demo project is about.
    3. Give it a go: feel free to add assets, edit assets and execute some tests in your Demo project. This helps shed light on how different things affect each other in Testlab. You can also add a new blank project to your Testlab for sandboxing if you wish to keep your copy of the Demo project as it is.
    4. Browse through the help manual found in your Testlab. You can also click the question mark icons on each view to jump around the manual in a context-sensitive manner.
    5. Browse through the online documentation.
    6. Add a new project and some users, and evaluate Testlab in a small real-world project of yours. This is by far the easiest way to really experience the benefits of using such a tool and see the structure it brings to your development project.
  • How does Testlab store my data?

    It helps to understand how the central quality-related assets are stored in the tool. First, all your data (excluding your company details) is stored under a project. You should think of a project as a domain that includes all requirements (a specification of some kind), test cases for them, executed testing results, issues, etc. for a single product, application or system under testing.

    The next diagram shows, in simplified form, how the different assets inside a project relate to each other in Testlab.

    Each project can be split into milestones: project reference points that are meant to be completed at some point in time. Milestones can be used in various ways to separate the assets inside the project from each other, and in addition, using milestones makes it easier to track the progress of testing. For more examples of how milestones can be used, please see the 5.2 Milestones section of Testlab’s help manual.

    Every good test plan is based on a solid set of requirements. In Testlab, requirements can be anything that can be used to specify, define or qualify features and functionality for the product, service or application under testing. Testlab is development-process agnostic, so you can define your requirements as user stories, use cases, traditional business requirements or something else that fits your development process best. You can even use requirements simply to group your test cases into a grouping of your liking if you, for some reason, prefer to skip using requirements in the specification phase. See the next chapter for more pointers on requirements.

    For the testers to be able to test the system, they must have information about what they are testing and, most of the time, instructions on how something should be tested. For this, test cases are defined to include a description, pre-conditions, the actual step-by-step instructions and the expected end results. For the tool to be able to automatically track which functionality each test case is testing, test cases are usually linked to requirements. When a test case is linked to a requirement, we say the test case verifies that requirement.

    When test cases are executed, they are always executed in a test run. A test run records the results for each test case included in it, and specifies which version (of the software, for example) the testing was executed against and in which environment the testing was executed.

    During testing, test runs are recorded with testing results for each test case. Good. But when something is encountered that requires attention later on, it should be recorded as an issue. Issues can be seen as tasks that should be handled somehow later on. Issues can be defects encountered (i.e. bugs in software), future development ideas, new features or something else you wish to record for later use. Keep in mind that you can add issues at any time – they don’t have to be the result of some specific testing session.

    All major testing data assets – requirements, test cases, and issues – have automatically stored history data. This means you can find out what these assets looked like at each point in time.
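    To make the relationships concrete, here is a minimal sketch of the asset model described above. The class and field names are assumptions made for this example – an illustration of the concepts, not Testlab’s actual schema or API:

```python
from dataclasses import dataclass, field

@dataclass
class Requirement:
    name: str

@dataclass
class TestCase:
    name: str
    steps: list[str]
    verifies: list[Requirement] = field(default_factory=list)  # linked requirements

@dataclass
class TestRun:
    version: str        # what was tested, e.g. "1.0"
    environment: str    # where it was tested, e.g. "staging"
    results: dict = field(default_factory=dict)  # test case name -> "PASS"/"FAIL"

@dataclass
class Issue:
    title: str
    detected_in: str    # version the issue was found in

# A project groups all of the above for one product, application or system.
@dataclass
class Project:
    name: str
    requirements: list[Requirement] = field(default_factory=list)
    test_cases: list[TestCase] = field(default_factory=list)
    test_runs: list[TestRun] = field(default_factory=list)
    issues: list[Issue] = field(default_factory=list)

req = Requirement("Login")
tc = TestCase("Valid login", ["Open login page", "Enter credentials"], verifies=[req])
run = TestRun(version="1.0", environment="staging", results={tc.name: "PASS"})
project = Project("Webshop", [req], [tc], [run], [Issue("Typo on login page", "1.0")])
```

    The key relation here is the `verifies` link from a test case to its requirements – the coverage analysis discussed in the following chapters builds on exactly that link.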

    Is there any data in Testlab that can be shared between projects?

    Yes, there is.

    • Users: Users of your company are of course shared between all your projects.
    • Workflows: Workflow schemes can be configured once and shared between projects. When you set up a new project you can select a workflow scheme from a list of your previously configured schemes. A workflow scheme defines the “who can do what on which assets” in a project.
    • Notice schemes: Notice schemes can be shared between projects. A notice scheme defines the events which trigger e-mail notifications to your users in a project.
    • Reports: Reports can be configured and published to be shared between your projects.
    • Announcements: Announcements can be published on your dashboard to all your projects.
  • I just want to manage my testing – do I have to use requirements?

    It is common that the quality, coverage, precision, and availability of a system’s specification varies. Sometimes you would just like to manage your test cases, and the effort of tracking requirements in the tool seems too much (even though importing them is quite easy). That is fine – using requirements in Testlab is not required: you can add just your test cases, execute them and manage the issues found.

    As noted earlier, the requirements in Testlab don’t have to be traditional business requirements even though the term implies so. Requirements can be anything that fits well with your development process and can be used to specify the qualities and functionality of your system under testing. In agile projects, these can be user stories, for example – or better yet, a mixture of different kinds of specification assets.

    The type of the requirement is not really important, but the link between the requirement and its verifying test cases is: from this link, Testlab can analyze and report the coverage of your testing. In other words, you can easily see which functionality has been tested as passing and which parts of the system still have issues. From the testers’ point of view, the advantage is that you always know why you are testing certain things, and if a requirement or its implementation changes, you can check which test cases you might have to update or run again.

    We recommend that you consider some middle-ground alternatives for managing your requirements in Testlab even if you don’t prefer to use Testlab for specification:

    • If you keep your specification somewhere outside Testlab, you could add requirement “stubs” to your Testlab project that reference your existing specification. For example, if your specification lists the functions of your system with identifiers (which it should anyway), the easiest way is to add requirements to your Testlab project with the same identifiers set, leaving the descriptions and other fields as blank as you prefer. This way, linking the test cases to your (outside) specification is a breeze – just drag and drop the test cases – and you get transparency on how your testing is doing from your specification’s point of view.
    • If you don’t have any specification at hand to link the test cases to, you should think of some kind of categorization against which you would like to see and analyze your testing. Think of this as a grouping of test cases. For example, let’s presume you have a UI-centric web application against which you would like to execute test cases. You decide that it would be great to see the testing coverage from the user interface point of view. You could then add each page of your web application as a requirement to your Testlab project and link all test cases to these pages. When requirements and test cases are set up this way, you can see the testing status and coverage for each page and/or group of pages from the test coverage analysis.

    Keep in mind that requirements are added to a tree. The coverage analysis is hierarchical, so all leaf nodes in the tree are summed up when calculating the coverage.
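    To illustrate the hierarchical summing, here is a small sketch. The tree layout, status values and coverage formula are assumptions made for this example, not Testlab’s internal logic:

```python
# Each requirement node has child requirements; leaf nodes carry
# the results of their verifying test cases ("PASS"/"FAIL"/None for not run).
tree = {
    "Web UI": {
        "Front page": {"results": ["PASS", "PASS"]},
        "Checkout":   {"results": ["PASS", "FAIL"]},
        "Account":    {"results": [None]},          # linked test not run yet
    }
}

def leaf_results(node):
    """Collect test results from all leaf requirements under a node."""
    if "results" in node:
        return node["results"]
    collected = []
    for child in node.values():
        collected.extend(leaf_results(child))
    return collected

results = leaf_results(tree["Web UI"])
passed = results.count("PASS")
coverage = passed / len(results)  # 3 of 5 results passing -> 0.6
```

    Grouping test cases under “page” requirements, as suggested above, gives exactly this kind of per-branch coverage view.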

  • Deciding on a workflow

    When talking about workflow in Testlab, we mean a way of working with different types of assets in a project. A workflow basically specifies:

    • who (= which user role)
    • can do what (= execute an action)
    • on what (= a requirement, test case or an issue in a specific state).

    Testlab provides two different kinds of workflows out of the box for you to assign to your projects. The “Simple” style workflow is shown in the following diagram:

    The “Simple” workflow defines a simple asset workflow for requirements and test cases where assets are first created in “In design” status. When an asset seems ready, it is approved as ready. That’s it. If some requirements or test cases get stale, they can be deprecated and thereby regarded as removed. Choose this workflow for your projects if you prefer to keep the workflow for your test cases and requirements as simple as possible.

    If you prefer that your requirements and/or test cases be reviewed by a third party (for example your product owner, customer or someone else), you can use the “Default” workflow described in the following diagram:

    In this workflow, assets are created in “In design” status and, when they seem ready, they are marked as ready for review. Persons in the appropriate REVIEWER role (see the “Roles for users” chapter below for more information) then either reject or approve the asset. Pick this workflow for your projects if you prefer your assets to be reviewed through the above process.
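    Conceptually, such a workflow is a small state machine whose transitions are gated by roles. The sketch below paraphrases the review workflow described above; the exact status, action and role pairings are assumptions for the example, not Testlab’s configuration:

```python
# (current state, action) -> (required role, next state), sketching a
# review-style workflow for requirements and test cases.
REVIEW_WORKFLOW = {
    ("In design", "mark ready for review"): ("DESIGNER", "Ready for review"),
    ("Ready for review", "approve"):        ("REVIEWER", "Approved"),
    ("Ready for review", "reject"):         ("REVIEWER", "In design"),
    ("Approved", "deprecate"):              ("DESIGNER", "Deprecated"),
}

def apply_action(state, action, role, workflow=REVIEW_WORKFLOW):
    """Return the new state, or raise if the action or role is not allowed."""
    try:
        required_role, next_state = workflow[(state, action)]
    except KeyError:
        raise ValueError(f"'{action}' is not available in state '{state}'")
    if role != required_role:
        raise PermissionError(f"{role} may not '{action}' in state '{state}'")
    return next_state

state = apply_action("In design", "mark ready for review", "DESIGNER")
state = apply_action(state, "approve", "REVIEWER")  # -> "Approved"
```

    This also shows why a workflow scheme answers “who can do what on which assets”: every transition names both the action and the role allowed to take it.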

    Keep in mind that you can customize all workflows to your liking by copying an existing workflow and making the needed changes to it. You can customize permissions on user roles and asset workflows for requirements, test cases and issues separately.

  • Roles for users

    When you sign up for Testlab, you will be granted a user account as a company administrator. This is a special account granting you administration rights. Administrators have the power to do anything in their projects.

    The permission system in Testlab is based on user roles. When creating user accounts, you should grant each user an appropriate user role in the projects they are assigned to. User roles are project-specific, so each account can have a different role in each project.

    A role grants a set of permissions and workflow actions to the user in the project. The out-of-box workflows of Testlab define the user roles as follows (for more specific details, please see the 8.2 Default workflows and roles chapter in the Testlab help):

    • VIEWER: for users who should only be able to view and read the existing content in the project. Users in this role can view the data, generate reports and diagrams, and export data in CSV format. Grant this role to users who should only have read access to the content.

    • REQUIREMENTDESIGNER: for users who should be able to design and manage the requirements of the testing project. Designer roles have all permissions of the VIEWER and, in addition, permissions to add, edit, delete and import requirements. Grant this role to users who are responsible for managing the requirements in the project.

    • REQUIREMENTREVIEWER: for users who should be able to approve and reject existing requirements. Users in this role have all permissions of the VIEWER and, in addition, they can edit requirements to approve and reject them. If you are using the Default workflow with a review, grant this role to users who are responsible for approving or rejecting the requirements.

    • TESTDESIGNER: for users who should be able to design and manage the test cases of the testing project. This role is similar to the REQUIREMENTDESIGNER role but applies to test case design only. Grant this role to users who are responsible for managing the test cases in the project.

    • TESTREVIEWER: for users who should be able to approve and reject existing test cases. This role is similar to the REQUIREMENTREVIEWER role but applies to test case design only. If you are using the Default workflow with a review, grant this role to users who are responsible for approving or rejecting the test cases.

    • TESTER: for users who should be able to plan testing, execute test cases through test runs and manage issues. Users in this role have all permissions of the VIEWER and, in addition, permissions to manage test sets and test runs, execute tests, and add, edit and import issues. Grant this role to all users who should be able to execute test cases (i.e. testers).

    • ISSUEMANAGER: for users who should be able to manage issues and close them. Users in this role have all permissions of the VIEWER and, in addition, they can add and edit issues and mark them as closed. Grant this role to all users who should be able to review resolved issues and accept them as closed.

    • ISSUERESOLVER: for users who should be able to manage issues, excluding the permission to close them. Users in this role have all permissions of the VIEWER and, in addition, they can add and edit issues without permission to mark them as closed. Grant this role to all users who should handle and resolve issues (i.e. developers, programmers).

    • PROJECTMANAGER: for users who should be able to manage the details of the testing project and user rights for the project, and add new users. Users in this role have all permissions of the VIEWER and, in addition, they can edit project details, add new users and edit user details, grant project roles to users, and import and export data. Grant this role to all users who should handle administrative project-specific tasks (i.e. project managers).

    • TESTMANAGER: for users who should have full permissions to manage the assets and information included in the testing project. Users in this role have all permissions of the above roles combined. Keep in mind that TESTMANAGERs don’t have the company management permissions of the company administrator users. Grant this role to all users who should be able to handle all tasks in a testing project (i.e. test managers).

    When assigning roles for users with provided workflows, a simple strategy is as follows:

    • Grant the TESTMANAGER role to all users who should have permission to do just about everything in the project (manage requirements, test cases, and issues, execute tests, manage users, etc.). Typically this might be the test manager leading the testing project.
    • Grant the REQUIREMENTDESIGNER and/or TESTDESIGNER roles to all users who should have permission to manage the assets.
    • If you are using review-based workflows, grant the REQUIREMENTREVIEWER and/or TESTREVIEWER roles to all users who should have permission to approve or reject the assets.
    • Grant the ISSUERESOLVER role to all developers who should be able to resolve issues.
    • Grant the VIEWER role to all users who should have read-only access to the project’s data.
  • Customizing fields of assets

    The fields you see on the forms when editing requirements, test cases and issues are the default set of fields visible out of the box. If you have some information you would like to store in an additional field – say, for your issues – you can configure a custom field for it.

    In Testlab, custom fields are configured via project details and are always project-specific. You can add custom fields for requirements, test cases, and issues.

  • Milestones – do I need them?

    First of all, using milestones is not mandatory. You can start your project without milestones and, if and when you need one, add one.

    Milestones are project reference points that make it easier to track the progress of testing inside the milestone. In addition, you can target assets to milestones, which makes it easier to separate assets from each other inside the project. Think of a milestone as something that gets completed later on. In software projects, milestones are typically software releases, sprints, or other checkpoints you track your project by.

    You can set up your milestones in a way that controls how milestones relate to each other. This is called milestone inheritance, and it allows you to set up a milestone to inherit the requirements (and test cases) of another.

    Some examples follow.

    Separate milestones

    When no inheritance relations are set, the milestones are separate inside the project. In the above example, the project has three milestones: Release 1.0, New features and Billing. Non-targeted requirements illustrate requirements that have no target milestone set at all. In this example, all three milestones have their own requirements, test cases, and other assets. You can think of these milestones as small sub-projects of their own: reporting the progress and coverage of one of these milestones gives you a view of only the requirements that milestone is targeted with.

    Inherited milestones

    The above example has the same three milestones as the example presented earlier: Release 1.0, New features and Billing. Release 1.0 is a milestone developing the first release version of the product, the New features milestone is developing some additional features for the product, and the Billing milestone is developing billing-related functions for it.

    The inheritance relations between the milestones are set as presented:

    • The project has a bunch of assets (requirements in this example) with no target milestone set. These requirements are thought of as global and shared by all milestones of this project. A set of security-related requirements is a good example: they are expected to always be in effect, and all milestones inside the project should cover them.
    • Release 1.0 is set to inherit all [Non-targeted] requirements. This means that, from a testing point of view, Release 1.0 should cover all global requirements and all requirements explicitly targeted to the milestone.
    • The New features milestone is set to inherit the Release 1.0 milestone. The project team has decided that when testing the new features, the coverage should be as good as possible, meaning that the testing should cover all requirements of the (earlier) Release 1.0 and, in addition, all requirements and functions implemented in the New features milestone.
    • The Billing milestone is implemented by an outsourced team with little or no relation to the team developing the Release 1.0 and New features milestones. The milestone is set to inherit the [Non-targeted] assets to ensure that the Billing features are covered in testing against all global requirements (including the mentioned security requirements) set for the project.
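    The effective requirement set of each milestone can be thought of as its own requirements plus everything inherited, recursively. Here is a sketch of the idea, using the milestones from the example above (the data layout and requirement names are assumptions made for illustration):

```python
# Each milestone lists its own requirements and the milestones it inherits from.
milestones = {
    "[Non-targeted]": {"requirements": {"Security"}, "inherits": []},
    "Release 1.0":    {"requirements": {"Login", "Search"}, "inherits": ["[Non-targeted]"]},
    "New features":   {"requirements": {"Wishlist"}, "inherits": ["Release 1.0"]},
    "Billing":        {"requirements": {"Invoicing"}, "inherits": ["[Non-targeted]"]},
}

def effective_requirements(name):
    """Requirements a milestone should cover, including inherited ones."""
    milestone = milestones[name]
    covered = set(milestone["requirements"])
    for parent in milestone["inherits"]:
        covered |= effective_requirements(parent)
    return covered

# "New features" covers its own, Release 1.0's and the global requirements:
# {"Wishlist", "Login", "Search", "Security"} – while "Billing" covers only
# {"Invoicing", "Security"}.
```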

    You can always edit the inheritance relations of the milestones later on. The edits are non-destructive in the sense that Testlab keeps all inheritance relations of the milestones intact, and when you edit them, the reporting and the views of the project reflect the changes immediately.

  • How should I use versions and/or environments?

    To reiterate,

    • a version is something under testing. In software development projects, a version is typically some version of the application you are testing. When doing QA for some other kind of product, a version might be some variant or prototype of your product that you are verifying to be in spec.
    • An environment is a domain, setting or state the version is tested in. In software development projects, an environment is typically a server environment where the application runs (such as “system testing”, “staging” or “production”). In other kinds of QA projects, the environment can be something else that fits your needs.

    It is good to understand that a version is an important factor in analysis and reporting. It is often used as a term to group and/or filter the reported data. For example,

    • you should not mix versions up with milestones. A milestone is something that you aim to complete to satisfaction in your project. Typically, completing a milestone might include testing various versions.
    • Testing coverage can be reported for a version. You select a single version and the analysis shows you how your requirements and test cases are passing and failing for the selected version.
    • Issues are often handled against a version. In software projects, developers often need to know which version they should resolve an issue for and in which version of the application the issue was detected.
    • Many reports can be generated for a (single) version.
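    As a sketch of how a version acts as a filter in analysis, consider testing results as (test case, version, environment, outcome) records; the records below are made up for the example:

```python
# Recorded results: (test case, version, environment, outcome)
results = [
    ("Valid login", "1.0", "staging",    "PASS"),
    ("Valid login", "1.1", "staging",    "FAIL"),
    ("Checkout",    "1.1", "staging",    "PASS"),
    ("Checkout",    "1.1", "production", "PASS"),
]

def outcomes_for(version, results):
    """Keep only the results recorded against the given version."""
    return [(case, env, outcome)
            for case, ver, env, outcome in results
            if ver == version]

v11 = outcomes_for("1.1", results)
# Three results were recorded against version 1.1, one of them failing,
# so a coverage report for 1.1 would ignore the passing 1.0 result entirely.
```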
  • My first project

    As a quick guide, when you are ready to add and kick off your first project in Testlab, you technically need to:

    1. Add a new Project to Testlab. You’ll need to decide on a workflow – select “Simple” to keep it simple if you don’t want your requirements and test cases to be reviewed and approved.
    2. Add users to Testlab. You’ll need to create an account for each person participating in your project. Note that all users get their account details sent to them by e-mail – no need to inform your users of their account details separately.
    3. Grant users the roles they need in your project. As said, a good practice is to grant the TESTMANAGER role to someone who is in charge of testing in the project. If you need to limit access for the rest of the users, you might want to grant them some other roles.

    That’s it!