Posts tagged with: integration


4.6.2019

Upcoming releases with Automation workbench

In this post, we will discuss our upcoming release schedule a bit and offer a glimpse of the features coming in future releases.

 

Challenges with test automation

The business challenges in interpreting the results of automated tests are obvious. Most often, tests are created by developers with varying motives, the number of tests is large, and traceability to the specification is lacking. Features are needed to make sense of these results and gain business benefits beyond regression or smoke testing – which, in practice, tends to be the typical use for automated tests.

If you are able to map and scope the tests efficiently, you

  1. have a better understanding of what is currently passing or failing in your system under test,
  2. have your team collaborate better,
  3. ensure that key parties in your project can make educated decisions instead of being reactive and
  4. have a tool to really transition from manual to automated testing as existing test plans can be easily automated by mapping automated tests to existing test cases.

 

Automation in Testlab

In Testlab, results from automated tests can be mapped to test cases in your project. This way of working with your automated tests has benefits, as the results can be tracked and reported in exactly the same way as the manual tests in your current test plan. When results for automated tests are received (for example from a Jenkins job), a test run is added, with the mapped test cases holding the appropriate results. Currently, the mapping logic is fixed: if the ID of an automated test starts with an ID bound to a test case, that test case receives the results of the automated test.
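As a rough sketch, the fixed "starts with" mapping could look like the following (function and variable names here are illustrative, not Testlab's actual implementation):

```python
def map_results(bound_ids, automated_results):
    """Assign each automated result to the test case whose bound ID
    is a prefix of the automated test's ID (sketch of the fixed logic)."""
    mapped = {}
    for test_id, result in automated_results.items():
        for case_id in bound_ids:
            if test_id.startswith(case_id):
                mapped.setdefault(case_id, []).append((test_id, result))
    return mapped
```

A test with the ID `com.acme.LoginTest.testOk` would thus land on a test case bound to `com.acme.LoginTest`, while tests with no matching prefix are left unmapped.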

 

Upcoming Automation workbench

Future versions of Testlab will include a new Test automation view with a workbench aimed at managing the import and mapping of your automated tests. A screenshot of the current prototype can be seen below (click for a peek):


 

The workbench

  • allows you to define “sources” for your tests: a source is a new concept which lets you configure how the incoming results from that source are handled. You might want to think of a source as your “Jenkins job”, or as your Jenkins installation as a whole if your jobs follow the same semantics and practices. A source is defined with “rules”, which determine how the automated tests are mapped. The workbench
  • features a rule engine, which will allow comprehensive mapping rules to be defined with
    • ignores (which tests are ignored),
    • creation rules (how test cases for tests are automatically created) and
    • mapping rules (such as “starts-with”, “ends-with” or regular expressions, which define how the results for the tests are mapped to your test plan) and
  • the workbench allows you to
    • easily upload new results by dragging and dropping,
    • execute any rules as a dry run to see how the results would be mapped before the actual import,
    • create any test cases by hand – if preferred over the creation rules – and
    • use pre-built wizards which help you to easily create the needed rules by suggesting appropriate rules for your source.
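To illustrate the idea, a dry run over a source's rules might be sketched like this (the rule and report shapes below are our own assumptions, not the workbench's actual data model):

```python
import re

def dry_run(test_ids, ignore_patterns, mapping_rules):
    """Classify incoming automated test IDs against a source's rules:
    ignored, mapped to a test case, or left unmapped."""
    report = {"ignored": [], "mapped": {}, "unmapped": []}
    for tid in test_ids:
        # ignores: drop tests matching any ignore pattern
        if any(re.search(p, tid) for p in ignore_patterns):
            report["ignored"].append(tid)
            continue
        # mapping rules: first matching rule wins
        for kind, pattern, case_id in mapping_rules:
            if (kind == "starts-with" and tid.startswith(pattern)) or \
               (kind == "ends-with" and tid.endswith(pattern)) or \
               (kind == "regex" and re.search(pattern, tid)):
                report["mapped"][tid] = case_id
                break
        else:
            report["unmapped"].append(tid)
    return report
```

Running such a dry run before the actual import shows at a glance which results would end up where, and which tests would need a creation rule or a manual test case.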

We feel this is a unique concept in the industry, helping you gain better visibility into your automation efforts.

The prototype shown is a work in progress and is subject to change before release.

 

Upcoming releases

As our roadmap implies, the new major automation-related features are planned for the Q3 release of our 2019 release cycle. The team will be hard at work on these new features, which means that the Q2 release will be a bug-fix release only. That said, we will be releasing maintenance releases in between, but the next feature release will be the version with the automation features in Q3/2019.

Meliora Testlab Team



Tags for this post: automation features integration product release usage 


4.6.2019

Changes to authentication with Atlassian plugins

When integrating Testlab with your Atlassian products such as JIRA, credentials are involved on the Atlassian side. Previously, for example with JIRA, the integrations required you to create a user account in JIRA and configure its credentials (including the password) in your Testlab project. Atlassian has made some changes and, at least with JIRA Cloud, using passwords in this kind of scenario has been deprecated.

The documentation for setting up our integrations has been updated. In this post, we also provide instructions on how to get your integrations up and running.

 

Replacing passwords with API tokens

At least for some operations, the integrations rely on operations provided by JIRA’s so-called REST APIs. It may vary a bit depending on your JIRA version but, at least with the current JIRA Cloud, you should replace all passwords with so-called API tokens. For the (JIRA) user you are integrating with, create an API token and use the token as the password on the Testlab side. For technical details, you can read more about JIRA’s authentication via the links provided at the end of this article.

When using JIRA Cloud, to create an API token,

  1. With the user account you are integrating with, log in to https://id.atlassian.com/manage/api-tokens
  2. Click Create API token
  3. Give your token a label and click Create
  4. Click Copy to clipboard and paste the token as the “password” when configuring the integration on the Testlab side

That’s it. Your server-installed JIRA might also require the use of API tokens. If so, refer to your JIRA administrator or the JIRA documentation for help on how to do this.
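For illustration, the token is simply used as the password half of HTTP Basic authentication, with the account e-mail as the username. A minimal sketch (the helper name is ours):

```python
import base64

def jira_auth_header(account_email, api_token):
    """Build an HTTP Basic Authorization header value; JIRA Cloud accepts
    the account e-mail plus an API token in place of username + password."""
    raw = f"{account_email}:{api_token}".encode("utf-8")
    return "Basic " + base64.b64encode(raw).decode("ascii")
```

Any HTTP client configured with such a header (or with Basic auth using e-mail and token directly) can then call the JIRA REST APIs.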

 

Resources



Tags for this post: integration jira security usage 


13.4.2018

Testlab – Canis Majoris release

We’ve released a new version of Meliora Testlab – Canis Majoris. This version includes major additions to the REST interfaces of Testlab and several UI related enhancements. Please read more about the changes below.

 

REST API enhancements

The REST based integration API of Testlab has been enhanced with a number of new supported operations. With Canis Majoris, it is possible to add, update and remove

  • requirements,
  • test categories,
  • test cases and
  • issues.

The add operation maps to HTTP’s POST method, the update operation maps to the PUT method (and, in addition, supports partial updates) and the remove operation maps to the DELETE method. More thorough documentation for these operations can be found in your Testlab instance via the interactive API documentation (/api).
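As a sketch, the verb mapping could be expressed like this (the URL layout shown is an assumption for illustration, not Testlab's actual endpoint scheme – consult the interactive API documentation for the real paths):

```python
# Hypothetical mapping of asset operations to HTTP verbs.
OPERATION_METHODS = {"add": "POST", "update": "PUT", "remove": "DELETE"}

def build_request(operation, base_url, asset_type, asset_id=None):
    """Compose the HTTP method and URL for an asset operation."""
    method = OPERATION_METHODS[operation]
    url = f"{base_url}/api/{asset_type}"
    if asset_id is not None:
        url = f"{url}/{asset_id}"
    return method, url
```

With this shape, a partial update would be a PUT to the asset's URL carrying only the fields to change.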

 

Test categories with fields

In previous versions of Testlab, test cases were categorized into simple folders with just a name. In Canis Majoris, test categories have been enhanced into full assets with

  • a rich-text description,
  • time stamps to track creation and updates,
  • a change history and
  • a possibility to add comments on them.

 

In addition to the above
  • Date and time formats are now handled more gracefully in the user interface by respecting the browser-sent locale.
  • Test cases of a requirement can now be easily executed by choosing “Run tests…” from
    • the tree of requirements or
    • from the table view of requirements.
  • Similarly, the test cases linked to a requirement can be easily added to your work (test) set by choosing “Add to work set” from the table view of requirements.
  • The Test case listing report now renders execution steps in an easy-to-read table.

 

Thanking you for all your feedback,

Meliora team


Canis Majoris

VY Canis Majoris (VY CMa) is one of the largest stars detected so far and is located 3900 light-years from Earth. The estimates of its size vary, but it is estimated to be 1400 to 2200 solar radii (a solar radius being the distance from the center of the Sun to its photosphere).

The size of this object is difficult to comprehend. It would take 1100 years travelling in a jet aircraft at 900 km/h to circle it once. It would also take over 7,000,000,000 (7 billion) Suns or 7,000,000,000,000,000 (7 quadrillion) Earths to fill VY Canis Majoris. There are also a few videos on YouTube which try to explain the size for you.

(Source: Wikipedia, A Sidewalk Astronomer blog)



Tags for this post: announce features integration product release screenshots 


12.4.2017

Testlab – Lilliput Sight released

Meliora is proud to announce the release of a new major Testlab version: Lilliput Sight. This release includes major new features such as asset templating but, at the same time, bundles various smaller enhancements and changes for easier use. The new features are described below.

 

Asset templating

 

Creating assets for your projects in a consistent way is often important. This release of Testlab includes support for templates which you can apply when adding new requirements, test cases or issues.

A template is basically a simple asset with some fields set to predefined values. As you are adding a new asset, you have the option of applying templates. When you apply a template, the asset is set with the values from the applied template. You can also apply multiple templates, so designing templates to suit the needs of your testing project is very flexible.

A set of new permissions (testcase.templates, requirement.templates, defect.templates) has been added to control who has access to using the templates. These permissions have been granted to all your roles which had the permission to edit the matching asset type.
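Conceptually, applying multiple templates is an ordered merge of field values. A minimal sketch, assuming later templates overwrite values set by earlier ones (that overwrite order is our assumption, not documented behaviour):

```python
def apply_templates(asset_fields, *templates):
    """Merge template field values into an asset, in application order;
    unset (None) template fields leave the asset untouched."""
    result = dict(asset_fields)
    for template in templates:
        result.update({k: v for k, v in template.items() if v is not None})
    return result
```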

 

Robot Framework output support

When pushing automated results to Testlab, we now have native support for Robot Framework’s custom output file. By supporting the native format, the results include detailed keyword information from the output, which is pushed to Testlab as the executed steps.

The support for Robot Framework’s output has also been added to the Jenkins CI plugin. With the plugin, it is easy to publish Robot Framework’s test results to your Testlab project in a detailed format.

 

Copying and pasting steps

The editor for test case steps now includes a clipboard. You can select steps (select multiple steps by holding your shift and ctrl/cmd keys appropriately), copy them to the clipboard and paste them as you wish. The clipboard also persists between test cases so you can also copy and paste steps from a test case to another.

 

 

Filtering with operators in grids

The text columns in Testlab’s grids now feature operator filters which allow you to filter data from the grid in a more specific manner. You have the option of choosing the operator the column is filtered with, such as “starts with”, “ends with”, … , and of course the familiar default “contains”.

With a large amount of data in your project, this makes it easier to filter and find the relevant data.

 

 

Mark milestones, versions and environments as inactive

When managing milestones, versions, and environments for your project, you now can set these assets as active or inactive. For example, if a version is set as inactive, it is filtered out from relevant controls in the user interface. If your project has numerous versions, environments or milestones, keeping only the relevant ones active makes the use easier as the user interface is not littered with the non-relevant ones.

For versions and environments, the active flag is set in the Project management view. For milestones, the completed flag is respected: completed milestones are interpreted as inactive.

 

Usability related enhancements
  • Editing asset statuses in table views: You can now edit statuses of assets in table views – also in batch editing mode.
  • New custom field type – Link: A new type of custom field has been added which holds a single web link (such as an http, https or mailto link).
  • Support deletion of multiple selected assets: The context menu for requirements and test cases now includes “Delete selected” function to delete all assets chosen from the tree.
  • Delete key shortcut: The Delete key is now a keyboard shortcut for context menu’s Delete-function.
  • Execution history respects revision selection: The execution history tab for a test case previously showed a combined execution history for all revisions of the chosen test case. This has been changed so that the tab respects the revision selection in a similar manner to the other tabs in the Test case design view. When you choose a revision, the list of results in the execution history tab is only for the chosen revision of the test case.
  • Custom fields hold more data: Earlier, custom fields were limited to a maximum of 255 characters. This has been extended and custom fields can now hold a maximum of 4096 characters.
  • Test cases already run in test runs can be run again: If the user holds the permission for discarding a result of an already run test case, you can now choose and execute test cases which already have a result (pass or fail) directly from the list of test cases in the Test execution view. Earlier, you needed to discard the results one by one for the tests you wished to run again.
  • Enhancements for presenting diagrams: The presentation view for relation and traceability diagrams has been improved – you can now zoom and pan the view more easily by dragging and by double-clicking.
  • Copy link to clipboard: The popup dialog with a link to open up an asset from your Testlab has been added with a “Copy to clipboard” button. Clicking this button will copy the link directly to your clipboard.

 

Reporting enhancements
  • “Filter field & group values” option added for grouping reports: Requirement, test case, and issue grouping reports have been added with an option to apply the filter terms of the report to the values which are fetched via “Field to report” and “Group by”. For example, if you filter in a requirement grouping report for importance values “High” and “Critical”, choose to group the report by “Importance” and check the “Filter field & group values” option, the report rendered will not include reported groups for any other importance values than “High” and “Critical”.

 

Enhancements for importing and exporting
  • Export verified requirements for test cases: Test case export now includes a “verifies” field which includes the identifiers of requirements the test cases are set to verify.
  • Input file validation: The input file is now validated to hold the same number of columns in each row as the header row of the file. If rows with an incorrect number of columns are encountered when reading the file, an error is printed out, making it easier to track down missing separator characters and the like.
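The validation itself amounts to comparing each row's column count against the header's; a minimal sketch of the idea:

```python
import csv
import io

def check_column_counts(text, delimiter=","):
    """Return (line_number, column_count) for every data row whose
    column count differs from the header row's."""
    rows = list(csv.reader(io.StringIO(text), delimiter=delimiter))
    if not rows:
        return []
    expected = len(rows[0])
    return [(n, len(row))
            for n, row in enumerate(rows[1:], start=2)
            if len(row) != expected]
```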

 

Thanking you for all your feedback,

Meliora team


Alice

Various disorientating neurological conditions that affect perception are known to exist. One of them is broadly categorized as Alice in Wonderland Syndrome, in which people experience size distortion of perceived objects. Lilliput Sight – or micropsia – is a condition in which objects are perceived to be smaller than they actually are in the real world.

The condition is surprisingly common: episodes of micropsia or macropsia (seeing objects larger than they really are) occur in 9% of adolescents. Lewis Carroll – the author of Alice’s Adventures in Wonderland – is speculated to have drawn inspiration for the book from his own experiences of micropsia. Carroll was a well-known migraine sufferer, and migraine is one possible cause of these visual manifestations.

(Source: Wikipedia, The Atlantic, Image from Alice’s Adventures in Wonderland (1972 Josef Shaftel Productions))

 



Tags for this post: announce features integration jenkins plugin product release reporting 


30.1.2017

Testlab – Raining Animal released

Meliora is proud to announce a new version of Meliora Testlab – Raining Animal. This version brings in a concept of “Indexes” which enable you to more easily collaborate with others and copy assets between your projects.

Please read on for a more detailed description of the new features.

 

Custom columns for steps

Execution steps of test cases can now be configured with custom columns. This allows you to customize the way you enter your test cases in your project.

Custom columns can be renamed, ordered in the order you want them to appear in your test case and typed with different kinds of data types.

 

Collaborating with indexes

A new concept – indexes – has been added which enables you to pick assets from different projects onto an index and collaborate on them.

An index is basically a flat list of assets, such as requirements or test cases, from your projects. You can create as many indexes as you like and share them between the users in your Testlab. All users who have access to an index can comment on it and edit it – this makes it easy to collaborate on a set of assets in your Testlab.

 

Copying assets between your projects

Each asset on your index is shown with the project it belongs to. When you select assets from your index, you have the option to paste them into your current Testlab project. This enables you to easily copy content from one project to another.

 

SAML 2.0 Single Sign-On support

The authentication pipeline of Testlab now supports SAML 2.0 Single Sign-On (WebSSO profile). This makes it possible to use SAML 2.0 based user identity federation services, such as Microsoft’s ADFS, for user authentication.

The existing CAS based SSO is still supported but providing SAML 2.0 based federation offers more possibilities for integrating Testlab with your identity service of choice. You can read more about setting up the SSO from the documentation provided.

Better exports with XLS support

The data can now be exported directly to Excel in XLS format. CSV export is still available, but exporting data to Excel is now more straightforward.

Also, when exporting data from the table view, only the rows selected in the batch edit mode are exported. This makes it easier for you to hand pick the data when exporting.

In addition to the above and fixes under the hood,
  • the actions and statuses in workflows can now be rearranged by dragging and dropping,
  • stopping the testing session is made more straightforward by removing the buttons for aborting and finishing the run and
  • a new permission “testrun.setstatus” has been added to control who can change the status of a test run (for example mark the run as finished).
 

Meliora team


Throughout history, a rare meteorological phenomenon in which animals fall from the sky has been reported. There are reports of fish, frogs, toads, spiders, jellyfish and even worms coming raining down from the skies.

Curiously, the saying “raining cats and dogs” is not necessarily related to this phenomenon and is of unknown etymology. There are some other quite bizarre expressions for heavy rain, such as “chair legs” (Greek) and “husbands” (Colombian).

 

(Source: Wikipedia, Photo – public domain)



Tags for this post: announce features integration product release usage 


23.1.2016

New feature release: Oakville Blob

We are proud to announce a new major release of Meliora Testlab: Oakville Blob. A major new feature is a REST-based API for your integration needs. In addition, this release includes numerous enhancements and fixes. Read on.

 

Integration API

 

The Meliora Testlab Integration API offers REST/HTTP-based endpoints with data encapsulated in JSON format. The first iteration of the API offers endpoints for

  • fetching primary testing assets such as requirements, test cases and issues,
  • fetching file attachments and
  • pushing externally run testing results as test runs. 

We publish the API endpoints with Swagger, a live documentation tool, which enables you to make actual calls against the data of your Testlab installation with your browser. You can access this documentation today by selecting the “Rest API …” from the Help menu in your Testlab. You can also read more about the API from the corresponding documentation page.

Tagging users in comments

 
 

When commenting on assets in your project, such as issues or test cases, you can now tag users from your project in your comments. Just include the ID, e-mail address or name of the team member prefixed with the @ character and this user automatically gets a notification about the comment. The notice scheme rules have also been updated to make it possible to send e-mail notifications to users who are tagged in the comments of assets. Also, a new out-of-the-box notice scheme, “Commenting scheme”, has been added for this.

See the attached video for an example of how this feature works.

 

Execution history of a test case

Sometimes it is beneficial to go through the results of earlier executions of a test case. To make this as easy as possible, a new “Execution history” tab has been added to the Test design view.

This new tab lists all the times the chosen test case has been executed, with details similar to the listing of a test run’s items in the Test execution view. Each result can be expanded to show the detailed results of each step.

 

Notification enhancements

When significant things happen in your Testlab projects, notifications are generated for your team members, always accessible from the notifications panel at the top. Oakville Blob provides numerous enhancements to these notifications, such as

  • the notifications panel has been moved to the top of the viewport and is now accessible from all Testlab views,
  • the notifications panel can now be set to be kept open, meaning that the panel won’t automatically close when the mouse cursor moves away and
  • detailed documentation for all notification types has been added to the help manual.

In addition, the following new notification events have been added:

  • if an issue, a requirement or a test case assigned to you is commented on, you will be notified,
  • if a test case or one of its steps is commented on during testing, the creator of the test run will be notified and
  • if a comment is added during testing to a step of a test assigned to you, you will be notified.

 

Targeted keyword search

The quick search functionality has been enhanced with the possibility to target the keyword at specific information in your project’s assets. By adding a fitting prefix: to your keyword, you can, for example, target the search engine to search only the names of test cases. For example,

  • searching with “name:something” searches “something” from the names of all assets (for example, test cases, issues, …) or,
  • searching with “requirement.name:something” searches “something” from the names of requirements.

For a complete listing of possible targeting prefixes, see Testlab’s help manual.

 

Usability enhancements

Several usability related changes have been implemented.

  • Easy access to a test case step’s comments: A new “Expand commented” button has been added to the Test execution view. Clicking this button expands all of the test run’s test cases which have step-related comments.
  • Rich text editing: The rich text editors of Testlab have been upgraded to a new major version of the editor with a simpler and clearer UI.
  • Keyboard access:
    • Commenting: keyboard shortcuts have been added for saving, canceling and adding a new comment (where applicable).
    • Step editing: The step editor used to edit the steps of a test case now has proper keyboard shortcuts for navigating and inserting new steps.
  • Test coverage view: 
    • The test case coverage listing has a new column: “Covered requirements”. This column sums up and provides access to all covered requirements of the listed test cases.
    • The test run selector has been changed to an auto-complete field for easier use with a lot of runs.

 

Reporting enhancements

Several reporting engine related changes have been implemented.

  • Hidden fields in report configuration: When configuring reports, fields configured as hidden are no longer shown in the selectors listing the project’s fields.
  • Sorting of saved reports: Saved reports are now sorted into a natural “related asset” / “title of the report” order.
  • Better color scheme: The colors rendered for some fields of assets are changed to make them more distinct from each other.
  • Test run’s details: When test runs are shown on reports, they are now always shown with the test run’s title, milestone, version and environment included.
  • Wrapped graph titles: When graphs on reports are rendered with very long titles, the titles are now wrapped on multiple lines.
  • Results of run tests report: The report now sums up the time taken to execute the tests.
  • Test case listing report: A Created by field has been added.

 

Other changes

With the enhancements listed above, this feature release contains numerous smaller enhancements.

  • New custom field type – Unique ID: A new custom field type has been added which can be used to show a unique, non-editable numerical identifier for the asset.
  • Editing of test case step’s comments: The step related comments entered during the testing can now be edited from the Test execution view. This access is granted to users with “testrun.run” permission.
  • Confluence plugin: Listed assets can now be filtered with tags, and the test case listing macro has an option to filter out empty test categories.
  • Test run titles made unique: Testlab now prevents you from creating a test run with a title which is already present if the milestone, version and environment for these runs are the same. Also, when test runs are presented in the UI, they are now always shown with this information (milestone, version, …) included.
  • “Assigned to” tree filter: The formerly “Assigned to me” checkbox typed filter in the trees has been changed to a select list which allows you to filter in assets assigned to other users too.
  • File attachment management during testing: Controls have been added to add and remove attachments of the test case during testing.
  • Dynamic Change project menu: The Change project selection in the Testlab menu is now dynamic – if a new project is added for you, the project will be visible in this menu right away.
  • Permission-aware events: When the (upper-right-corner floating) events are shown to users, the events are filtered against the set of the user’s permissions. Users are now shown only events they should be interested in.
  • Number of filtered test runs: The list of test runs in Test execution view now shows the number of runs filtered in.
  • UI framework: The underlying UI framework has been upgraded to a new major version with many rendering fixes for different browsers.

 

Thanking you for all your feedback,

Meliora team


Oakville

“On August 7, 1994 during a rainstorm, blobs of a translucent gelatinous substance, half the size of grains of rice each, fell at the farm home of Sunny Barclift. Shortly afterwards, Barclift’s mother, Dotty Hearn, had to go to hospital suffering from dizziness and nausea, and Barclift and a friend also suffered minor bouts of fatigue and nausea after handling the blobs. … Several attempts were made to identify the blobs, with Barclift initially asking her mother’s doctor to run tests on the substance at the hospital. Little obliged, and reported that it contained human white blood cells. Barclift also managed to persuade Mike Osweiler, of the Washington State Department of Ecology’s hazardous materials spill response unit, to examine the substance. Upon further examination by Osweiler’s staff, it was reported that the blobs contained cells with no nuclei, which Osweiler noted is something human white cells do have.”

(Source: Wikipedia, Reddit, Map from Google Maps)

 



Tags for this post: announce features integration jenkins plugin product release reporting 


24.10.2015

Testlab Girus released

Meliora Testlab has evolved to a new major version – Girus. The release includes enhancements to reporting and multiple new integration possibilities.

Please read more about the new features and changes below. 

 

Automatic publishing of reports

Testlab provides an extensive set of reporting templates which you can pre-configure and save to your Testlab project. It used to be that to render the reports, you would go to Testlab’s “Reports” section and choose the preferred report for rendering.

Girus enhances reporting so that all saved reports can be scheduled for automatic publishing. Each report can be set with a

  • recurrence period (daily, weekly and monthly) for which the report is refreshed,
  • report type to render (PDF, XLS, ODF),
  • optionally, a list of e-mail addresses where to automatically send the report and
  • an option to web publish the report to a pre-determined web-address.

When configured so, the report will then be automatically rendered and sent to the interested parties and, optionally, made available at a relatively short URL.

 

Slack integration


Slack is a modern team communication and messaging app which makes it easy to set-up channels for messaging, share files and collaborate with your team in many ways.

Meliora Testlab now supports integrating with Slack’s webhook interfaces to post notifications about your Testlab assets to your Slack. The feature is implemented as a part of Testlab’s notice schemes. This way, you can set up rules for the events which you prefer to be published to your targeted Slack #channel.

You can read more about the Slack integration via this link.

 

Webhooks

Webhooks are HTTP-protocol based callbacks which enable you to react to changes in your Testlab project. Webhooks can be used as a one-way API to integrate your own system with Testlab.

Similarly to the Slack integration introduced above, the Webhooks are implemented as a channel in Testlab’s notice schemes. This way you can set up rules to pick the relevant events you wish to have the notices of.

An example: let’s say you have an in-house ticketing system and you need to mark a ticket resolved each time an issue in Testlab is closed. With Webhooks, you can implement a simple HTTP-based listener on your premises and set up a notification rule in Testlab to push an event to your listener every time an issue is closed. With a little programming, you can then mark the ticket in your own system as resolved.
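A minimal sketch of such a listener, assuming a hypothetical payload shape (check the Webhooks documentation for the actual event fields):

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

def handle_event(event):
    """Decide what to do with an incoming event; the field names here
    ('type', 'issueId') are assumptions for illustration only."""
    if event.get("type") == "ISSUE_CLOSED":
        return ("resolve-ticket", event.get("issueId"))
    return ("ignore", None)

class WebhookListener(BaseHTTPRequestHandler):
    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        event = json.loads(self.rfile.read(length) or b"{}")
        action, issue_id = handle_event(event)
        # ... call your own ticketing system here based on `action` ...
        self.send_response(204)
        self.end_headers()

# To run on your premises:
# HTTPServer(("", 8080), WebhookListener).serve_forever()
```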

You can read more about the Webhooks via this link.

 

Maven plugin

Apache Maven is a common build automation tool used primarily for Java projects. In addition, Maven can be used to build projects written in C#, Ruby, Scala and other languages.

Meliora Testlab’s Maven Plugin is an extension to Maven which makes it possible to publish xUnit-compatible testing results to your Testlab project. If you are familiar with Testlab’s Jenkins CI plugin, the Maven one provides similar features easily accessible from your Maven project’s POM.

You can read more about the Maven plugin via this link.

 

Automatic creation of test cases for automated tests

When JUnit/xUnit-compatible testing results of automated tests are pushed to Testlab (for example from Jenkins or the Maven plugin), the results are mapped and stored against the test cases of the project. For example, if you push a result for a test com.mycompany.junit.SomeTest, you should have a test case in your project with this identifier (or a prefix of this identifier) as a value in the custom field set up as the test case mapping field. The mapping logic is best explained in the Jenkins plugin documentation.

To make the pushing of results as easy as possible, the plugins now have an option to automatically import the stubs of test cases from the testing results themselves. This way, if a test case cannot be found from the project for some mapping identifier, the test case is automatically created in a configured test category.

 

Jenkins plugin: TAP support

Test Anything Protocol, or TAP, is a simple interface between testing modules in a test harness. Testing results are represented by simple text files (.tap) describing the steps taken. The Jenkins plugin of Testlab has been enhanced so that the results from .tap files produced by your job can be published to Testlab.

Each test step in your TAP result is mapped to a test case in Testlab via the mapping identifier, similarly to the xUnit-compatible way of working supported previously.
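To give an idea of the format, a .tap file is just lines of ok / not ok results, which could be parsed along these lines. This is a simplified sketch covering only basic result lines; the full TAP grammar also includes plans, directives and diagnostics:

```python
def parse_tap(tap_text):
    """Parse the 'ok' / 'not ok' result lines of a TAP document into a
    {description: passed} dict. A simplified sketch: the full TAP grammar
    also includes plans, directives and diagnostic lines."""
    results = {}
    for line in tap_text.splitlines():
        line = line.strip()
        if line.startswith("not ok"):
            passed, rest = False, line[len("not ok"):]
        elif line.startswith("ok"):
            passed, rest = True, line[len("ok"):]
        else:
            continue  # plan line, comment or diagnostic
        # Drop the optional test number and separator, keep the description.
        description = rest.strip().lstrip("0123456789").lstrip(" -").strip()
        results[description] = passed
    return results

# Example TAP output from a test harness:
sample = """1..2
ok 1 - Front page loads
not ok 2 - Login works
"""
```

Each parsed description would then act as the mapping identifier for a test case in Testlab.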

The support for TAP is best explained in the documentation of the Testlab’s Jenkins plugin.

 

Support for multiple JIRA projects in Webhook-based integration

When integrating JIRA with Testlab using the Webhooks-based integration strategy, the integration now supports integrating multiple JIRA projects to a single Testlab project. When multiple projects are specified and an issue is added, the user gets to choose which JIRA project the issue should be added to.

Due to this change, it is now also possible to specify which JIRA project the Testlab project is integrated with. Previously, the prefix and the key had to match between the JIRA and Testlab projects.

 

Miscellaneous enhancements and changes
  • The Results of run tests report has new options to help you report on your testing results:
    • Test execution date added as a supported field,
    • a “Group totals by” option added which sums up and reports totals for each group of the specified field,
    • sum total columns and rows added to the reports and
    • the test category selection supports selecting multiple categories and test cases for filtering.
  • Updates on multiple underlying 3rd party libraries.
  • Bugs squished.

 

Sincerely yours,

Meliora team


Pithovirus

Pithovirus sibericum was discovered buried 30 m (98 ft) below the surface of late Pleistocene sediment in a 30 000-year-old sample of Siberian permafrost. It measures approximately 1.5 micrometers in length, making it the largest virus yet found. Although the virus is harmless to humans, its viability after being frozen for millennia has raised concerns that global climate change and tundra drilling operations could lead to previously undiscovered and potentially pathogenic viruses being unearthed.

A girus is a very large virus (gigantic virus). Many different giruses have been discovered since then, and many are so large that they even have viruses of their own. The discovery of giruses has triggered some debate concerning their evolutionary origins, going so far as to suggest that the giruses provide evidence of a fourth domain of life. Some even hypothesize that the cell nucleus of life as we know it originally evolved from a large DNA virus.

(Source: Wikipedia, Pithovirus sibericum photo by Pavel Hrdlička, Wikipedia)

 



Tags for this post: announce features integration jenkins plugin product release reporting 


24.11.2014

New JIRA integration and Pivotal Tracker support

The latest release of Testlab includes a bunch of integration enhancements:

  • A new integration method (based on JIRA’s WebHooks) has been added. This brings support for JIRA OnDemand.
  • The new JIRA integration method supports pushing JIRA issues to Testlab as requirements / user stories. This makes it possible to push stories from your JIRA Agile to Testlab for verification.
  • Brand new support has been added for the Pivotal Tracker agile project management tool. You can now easily manage and track your project tasks in Pivotal Tracker and at the same time test, verify and report your implementation in Testlab.

Read further for more details on these exciting new features.

 

Support for JIRA OnDemand

To this day, Testlab has supported JIRA integration with a best-in-class two-way strategy, allowing issues to be freely edited in both systems. For this to work, JIRA has had to be installed with Testlab’s JIRA plugin, which provides the needed synchronization. Atlassian’s cloud-based offering, JIRA OnDemand, does not allow installing custom plugins, which means the two-way synchronized integration has only been possible with JIRA instances installed on customers’ own servers.

The new JIRA integration strategy can be set up using nothing but JIRA’s WebHooks. This requires no plugins and brings support for JIRA OnDemand instances. Keep in mind that the new, simpler integration strategy is also available for JIRA instances installed on your own servers.

 

One-way integration for JIRA’s issues and stories

The new integration method works so that issues and stories created in JIRA can be pushed to Testlab. This works for issues / bugs and also for requirements. For example, you can push stories from your JIRA Agile to Testlab as requirements, which makes it possible to include the specification you design in JIRA as part of the specification you aim to test in your Testlab project.

You can read more about the different integration strategies for Atlassian JIRA here.

 

Configuring the integrations

[Screenshot: integration configuration]

All new integrations, including the JIRA integration for issues and requirements and the Pivotal Tracker integration, can be taken into use by yourself. The project management view in Testlab has a new Integrations tab which allows you to configure the Testlab-side settings for these integrations. Some integrations also require preliminary setup to work; instructions for this are provided in the plugin-specific documentation.

 

 
Pivotal Tracker integration

Pivotal Tracker is an agile tool for software team project management and collaboration. The tool allows you to plan and track your project using stories.

Your Testlab project can be integrated with Pivotal Tracker to export assets from Testlab to Pivotal Tracker as stories and to push stories from Pivotal Tracker to your Testlab project as requirements.

[Screenshot: Pivotal Tracker integration]

 
Importing Testlab assets to Pivotal Tracker

[Screenshot: importing Testlab assets to Pivotal Tracker]

The integration works in a two-way manner. First, you have the option to import assets from your Testlab project to your Pivotal project as stories to be tracked. You can pull requirements, test runs, test sets, milestones, issues and even test cases from your Testlab project. When you do, a new story is created in your Pivotal Tracker project. It is as easy as dragging an asset to your project’s Icebox in Pivotal.

 

 
Pushing stories from Pivotal Tracker to Testlab

[Screenshot: stories pushed from Pivotal Tracker]

You also have the option to push stories from your Pivotal Tracker project to your Testlab project. This way, you can easily

  • push your stories from Pivotal Tracker to your Testlab project’s specification as user stories to be verified and
  • push bugs from your Tracker to your Testlab project as issues.

Using Pivotal Tracker with Testlab is an excellent choice for project management. This way you can plan and track your project activities in Pivotal Tracker and at the same time test, verify and report your implementation in Testlab.

You can read more about setting up the Pivotal Tracker integration here.

 

To recap, the new features

  • bring you support for Pivotal Tracker, one of the best agile project management tools in the industry,
  • add support for Atlassian JIRA integration where issues from JIRA are pushed to Testlab as issues and/or requirements and
  • add support for JIRA OnDemand.

We hope these new features make the use of Testlab more productive for all our existing and future clients.

 

 

 



Tags for this post: announce features integration jira pivotal tracker release screenshots 


25.9.2014

Integrating with Apache JMeter

Apache JMeter is a popular tool for load and functional testing and for measuring performance. In this article we will give you hands-on examples of how to integrate your JMeter tests with Meliora Testlab.

Apache JMeter in brief

Apache JMeter is a tool with which you can design load testing and functional testing scripts. Originally, JMeter was designed for testing web applications but it has since expanded to different kinds of load and functional testing. The scripts can be executed to collect performance statistics and testing results.

JMeter offers a desktop application with which the scripts can be designed and run. JMeter can also be used from different kinds of build environments (such as Maven, Gradle, Ant, …), from which running the tests can be automated. JMeter’s web site has a good set of documentation on how to use it.

 

Typical usage scenario for JMeter

A common scenario for using JMeter is some kind of load testing or smoke testing setup where JMeter is scripted to make a load of HTTP requests to a web application. Response times, request durations and possible errors are logged and analyzed later for defects. Interpreting performance reports and analyzing metrics is usually done by people, as automatically determining whether some metric should be considered a failure is often hard.

Keep in mind that JMeter can be used against various kinds of backends other than HTTP servers, but we won’t get into that in this article.

 

Automating load testing with assertions

The difficulty in automating load testing scenarios comes from the fact that performance metrics are often ambiguous. For automation, each test run by JMeter must produce a distinct result indicating whether the test passes. To tackle this problem, assertions can be added to the JMeter script.

Assertions are basically the criteria used to decide whether a sample recorded by JMeter indicates a failure. For example, an assertion might be set up to check that a request to your web application is executed in under some specified duration (i.e. your application is “fast enough”). Or, another assertion might check that the response code from your application is always correct (for example, 200 OK). JMeter supports a number of different kinds of assertions for you to design your script with.
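Conceptually, each assertion is just a pass/fail predicate over a recorded sample. The Python functions below are only an illustration of that logic (JMeter implements these checks internally):

```python
# Conceptual sketch of the two assertion types used later in this article.
# JMeter implements these checks internally; the functions here are only
# an illustration of the pass/fail logic.

def duration_assertion(elapsed_ms, max_ms):
    """Pass if the sampled request completed within max_ms milliseconds."""
    return elapsed_ms <= max_ms

def response_code_assertion(code, expected=200):
    """Pass if the HTTP response code matches the expected one."""
    return code == expected
```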

When your load testing script is set up with proper assertions, it suits automation well: it can be run automatically, periodically or in any way you prefer to produce passing and failing test results, which can then be pushed to your test management tool for analysis. There is a good set of documentation available online on how to use assertions in JMeter.

 

Integration to Meliora Testlab

Meliora Testlab has a Jenkins CI plugin which enables you to push test results to Testlab and open up issues according to the results of your automated tests. When JMeter-scripted tests are run in a Jenkins job, you can push the results of your load testing criteria to your Testlab project!

The technical scenario of this is illustrated in the picture below.

[Diagram: JMeter to Testlab integration]

You need your JMeter script (plan). This is designed with the JMeter tool and should include the needed assertions (in the picture: Duration and ResponseCode) to determine whether the tests pass. A Jenkins job should be set up to run your tests and translate the JMeter-produced log file to xUnit-compatible testing results, which are then pushed to your Testlab project as test case results. Each JMeter test (in this case Front page.Duration and Front page.ResponseCode) is mapped to a test case in your Testlab project, which receives the results when the Jenkins job is executed.

 

Example setup

In this chapter, we give you a hands-on example of how to set up a Jenkins job to push testing results to your Testlab project. To make things easy, download the testlab_jmeter_example.zip file which includes all the files and assets mentioned below.

 
Creating a build

You need some kind of build (Maven, Gradle, Ant, …) to execute your JMeter tests with. In this example we are going to use Gradle as it offers an easy-to-use JMeter plugin for running the tests. There are tons of options for running JMeter scripts, but using a build plugin is often the easiest way.

1. Download and install Gradle if needed

Go to www.gradle.org and download the latest Gradle binary. Install it as instructed to your system path so that you can run gradle commands.

2. Create build.gradle file

apply plugin: 'java'
apply plugin: 'idea'
apply plugin: 'jmeter'

buildscript {
    repositories {
        mavenCentral()
    }
    dependencies {
        classpath "com.github.kulya:jmeter-gradle-plugin:1.3.1-2.6"
    }
}

As we are going to run all the tests with plugin’s default settings this is all we need. The build file just registers the “jmeter” plugin from the repository provided.

3. Create src directory and needed artifacts

For the JMeter plugin to work, create a src/test/jmeter directory and drop in a jmeter.properties file, which is needed for running the actual JMeter tool. The file is easy to obtain: download JMeter and copy the default jmeter.properties from the distribution to this directory.

 
Creating a JMeter plan

When your Gradle build is set up as instructed, you can run the JMeter tool easily by changing to your build directory and running the command

# gradle jmeterEditor

This downloads all the needed artifacts and launches the graphical user interface for designing JMeter plans.

To make things easy, you can use the MyPlan.jmx provided in the zip package. The script is really simple: it has a single HTTP Request Sampler (named Front page) set up to make a request to the http://localhost:9090 address with two assertions:

  • A Duration assertion to check that the time to make the request does not exceed 5 milliseconds. For the sake of this example, this assertion should fail as the request probably takes longer than this.
  • A ResponseCode assertion to check that the response code from the server is 200 (OK). This should pass as long as there is a web server running on port 9090 (we’ll come to this later).

It is recommended to give your Samplers and Assertions sensible names, as you will refer directly to these names later when mapping the test results to your Testlab test cases.

The created plan(s) should be saved to the src/test/jmeter directory we created earlier, as Gradle’s JMeter plugin automatically executes all plans from this directory.

 

Setting up a Jenkins job

1. Install Jenkins

If you don’t happen to have a Jenkins CI server available, setting one up locally couldn’t be easier. Download the latest release to a directory and run it with

# java -jar jenkins.war --httpPort=9090

Wait a bit, and Jenkins should be accessible at http://localhost:9090 with your web browser.

The JMeter plan we went through earlier made a request to http://localhost:9090. When you run your Jenkins with the command above, JMeter will fetch the front page of your Jenkins CI server when the tests are run. If you prefer to use some other Jenkins installation, you may want to edit the provided MyPlan.jmx to point to that address.

2. Install needed Jenkins plugins

Go to Manage Jenkins > Manage Plugins > Available and install

  • Gradle Plugin
  • Meliora Testlab Plugin
  • Performance Plugin
  • xUnit Plugin

2.1 Configure plugins

Go to Manage Jenkins > Configure System > Gradle and add a new Gradle installation for your locally installed Gradle. 

3. Create a job

Add new “free-style software project” job to your Jenkins and configure it as follows:

3.1 Add build step: Execute shell

Add a new “Execute shell” typed build step to copy the contents of your earlier set up Gradle project to the job’s workspace. This is needed as the project is not in a version control repository. Set up the step, for example, as:

[Screenshot: Execute shell build step]

… or anything else that makes your Gradle project available in the Jenkins job’s workspace.

Note: The files should be copied so that the root of the workspace contains the build.gradle file for launching the build.

3.2 Add build step: Invoke Gradle script

Select your locally installed Gradle Version and enter “clean jmeterRun” in the Tasks field. This will run the “gradle clean jmeterRun” command for your Gradle project, which will clean up the workspace and execute the JMeter plan.

[Screenshot: Invoke Gradle script build step]

3.3 Add post-build action: Publish Performance test result report (optional)

Jenkins CI’s Performance plugin provides trend reports on how your JMeter tests have run. This plugin is not required for the Testlab integration, but provides handy performance metrics in your Jenkins job view. To set up the action, click “Add a new report”, select JMeter and set Report files to “**/jmeter-report/*.xml”:

[Screenshot: Performance plugin post-build action]

Other settings can be left at their defaults, or you can configure them to your liking.

3.4 Add post-build action: Publish xUnit test result report

Testlab’s Jenkins plugin needs the test results to be available in the so-called xUnit format. In addition, this step will generate test result trend graphs in your Jenkins job view. Add a post-build action to publish the test results resolved from the JMeter assertions by selecting “Custom Tool”:

[Screenshot: xUnit post-build action]

Note: The jmeter_to_xunit.xsl custom stylesheet is mandatory. It translates JMeter’s log files to the xUnit format. The .xsl file is located in the jmeterproject directory in the zip file and will be available in the Jenkins workspace root if the project is copied there as set up earlier.
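To illustrate what the stylesheet does, the translation can be sketched in Python as follows. The JTL element and attribute names used here (httpSample, lb, assertionResult, …) follow the common JMeter XML log layout but should be verified against your own output; this sketch is not a replacement for the provided .xsl.

```python
# Sketch of the translation the stylesheet performs: each assertion result
# in a JMeter JTL (XML) log becomes an xUnit <testcase> element named
# "<Sampler name>.<Assertion name>". The JTL element and attribute names
# (httpSample, lb, assertionResult, ...) follow the common JMeter XML log
# layout but should be verified against your own output.
import xml.etree.ElementTree as ET

def jtl_to_xunit(jtl_xml):
    root = ET.fromstring(jtl_xml)
    suite = ET.Element("testsuite")
    for sample in root.iter("httpSample"):
        label = sample.get("lb", "sample")
        for result in sample.iter("assertionResult"):
            case = ET.SubElement(
                suite, "testcase",
                name="%s.%s" % (label, result.findtext("name")))
            failed = (result.findtext("failure") == "true" or
                      result.findtext("error") == "true")
            if failed:
                # Record the assertion's failure message as an xUnit failure.
                ET.SubElement(case, "failure").text = \
                    result.findtext("failureMessage", "")
    return ET.tostring(suite, encoding="unicode")
```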

3.5 Add post-build action: Publish test results to Testlab

The above plugins set up the workspace, execute the JMeter tests, publish the needed reports to the Jenkins job view and translate the JMeter log file(s) to the xUnit format. What is left is to push the test results to Testlab. For this, add a “Publish test results to Testlab” post-build action and configure it as follows:

[Screenshot: Publish test results to Testlab post-build action]

For the sake of simplicity, we will be using the “Demo” project of your Testlab. Make sure to configure the “Company ID” and “Testlab API key” fields to match your Testlab environment. The Test case mapping field is set to “Automated”, which is configured by default as a custom field in the “Demo” project.

If you haven’t yet configured an API key for your Testlab, log on to your Testlab as a company administrator and configure one from Testlab > Manage company … > API keys. See Testlab’s help manual for more details.

Note: Your Testlab edition must include access to the API functions. If you cannot see the API keys tab in your Manage company view and wish to proceed, please contact us and we will get it sorted out.

 

Mapping JMeter tests to test cases in Testlab

For the Jenkins plugin to be able to record the test results to your Testlab project, the project must contain matching test cases. As explained in the plugin documentation, your project in Testlab must have a custom field set up which is used to map the incoming test results. In the “Demo” project this field is already set up (called “Automated”).

[Screenshot: JMeter plan]

Every assertion in JMeter’s test plan records a distinct test result when run. In the simple plan provided, we have a single HTTP Request Sampler named “Front page”. This Sampler is tied to two assertions (named “Duration” and “ResponseCode”) which check if the request was done properly. When translated to the xUnit format, these test results are identified as <Sampler name>.<Assertion name>, for example:

  • Front page/Duration will be identified as: Front page.Duration and
  • Front page/ResponseCode will be identified as: Front page.ResponseCode

To map these test results to test cases in the “Demo” project,

1. Add test cases for JMeter assertions

Log on to Testlab’s “Demo” project, go to Test case design and

  • add a new Test category called “Load tests”, and to this category,
  • add a new test case “Front page speed”, set the Automated field to “Front page.Duration” and Approve the test case as ready and
  • add a new test case “Front page response code”, set the Automated field to “Front page.ResponseCode” and Approve the test case as ready.

Now we have two test cases for which the mapping field we set up earlier (“Automated”) contains the JMeter assertion identifiers.

 

Running JMeter tests

What is left to do is to run the actual tests. Go to your Jenkins job view and click “Build now”. A new build should be scheduled, executed and completed – probably as FAILED. This is expected: the JMeter plan has the 5 millisecond assertion which should fail the job.

 

Viewing the results in Testlab

Log on to Testlab’s “Demo” project and select the Test execution view. If everything went correctly, you should now have a new test run titled “jmeter run” in your project:

[Screenshot: test run in Testlab]

As expected, the Front page speed test case reports as failed and the Front page response code test case reports as passed.

As we configured the publisher to open up issues for failed tests, we should also have an issue present. Change to the Issues view and verify that an issue has been opened:

[Screenshot: issue opened in Testlab]

 

Viewing the results in Jenkins CI

The matching results are present in your Jenkins job view. Open up the job view from your Jenkins:

[Screenshot: Jenkins job view]

The view holds the trend graphs from the plugins we set up earlier: “Responding time” and “Percentage of errors” from the Performance plugin and “Test result Trend” from the xUnit plugin.

To see the results of the assertions, click “Latest Test Result”:

[Screenshot: Jenkins test results]

The results show that the Front page.Duration test failed and the Front page.ResponseCode test passed.

 

Links referred

  • http://jmeter.apache.org/ – Apache JMeter home page
  • http://jmeter.apache.org/usermanual/index.html – Apache JMeter user manual
  • http://jmeter.apache.org/usermanual/component_reference.html#assertions – Apache JMeter assertion reference
  • http://blazemeter.com/blog/how-use-jmeter-assertions-3-easy-steps – BlazeMeter: How to Use JMeter Assertions in 3 Easy Steps
  • https://www.melioratestlab.com/wp-content/uploads/2014/09/testlab_jmeter_example.zip – Needed assets for running the example
  • https://github.com/kulya/jmeter-gradle-plugin – Gradle JMeter plugin
  • http://www.gradle.org/ – Gradle home page
  • http://mirrors.jenkins-ci.org/war/latest/jenkins.war – Latest Jenkins CI release
  • https://www.melioratestlab.com/wp-content/uploads/2014/09/jmeter_to_xunit.xsl_.txt – XSL file to translate a JMeter JTL file to the xUnit format
  • https://wiki.jenkins-ci.org/display/JENKINS/Meliora+Testlab+Plugin – Meliora Testlab Jenkins Plugin documentation

 

 

 



Tags for this post: example integration jenkins load testing usage 


 
 