Error: Cannot open the run results because the HP Run Results Viewer is not installed on this computer

After creating and running a test, the following error is displayed: "Cannot open the run results because the HP Run Results Viewer is not installed on this computer"

QuickTest Professional (QTP) / Service Test (ST) / Sprinter version 11 introduced a new reporting tool called the HP Run Results Viewer. Other products, such as LoadRunner, may use this tool as well.

Unlike previous versions, the viewer is installed automatically after the core product's installation completes. The installation runs silently in the background, and is only noticeable once the shortcut icons, among other things, are placed on the desktop. The QuickTest Add-in for Quality Center also uses this tool; its installer (when run on a machine without QTP) installs the viewer in the background as it finishes.

Note: This new reporting tool has its own installers on the core product disc.

· Ensure that the following registry entries, folders and files are created on the machine once the core product finishes its installation:

Folder: C:\Program Files\HP\Run Results Viewer

Desktop Shortcut: "HP Run Results Viewer.lnk"

Registry Entries:

HKEY_LOCAL_MACHINE\SOFTWARE\Mercury Interactive\Test Results

HKEY_CURRENT_USER\Software\Mercury Interactive\Test Results (generated after opening)

· If the core product was installed properly, run the HP Run Results Viewer installation files directly:

QTP: <Installation files>\RunResultsViewer\<Language>\setup.exe

ST: <Installation files>\STSetup\MSI\ThirdPartyInstallations\HP_Run_Results_Viewer.msi
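The file and folder checks above can be scripted. The sketch below is a minimal, illustrative example (the `find_missing` helper is not part of any HP product) that reports which expected artifacts are missing from the machine; the registry checks are omitted here because they require Windows-only APIs such as `winreg`.

```python
import os

def find_missing(expected_paths):
    # Return the subset of expected_paths that does not exist on disk.
    return [p for p in expected_paths if not os.path.exists(p)]

if __name__ == "__main__":
    # Artifacts listed in this article; adjust the drive/locale as needed.
    expected = [
        r"C:\Program Files\HP\Run Results Viewer",
        os.path.join(os.path.expanduser("~"), "Desktop",
                     "HP Run Results Viewer.lnk"),
    ]
    for path in find_missing(expected):
        print("Missing:", path)
```

If the script prints any "Missing:" lines, run the installer for the corresponding component as described below.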

Sprinter support in virtualization environments

The ALM 11 tool known as "Sprinter" currently supports the following virtualization environments/technologies and versions:

  • VMware Workstation: 5.5, 6, 7
  • VMware ESX: 4, 4.1

Note: Major versions (e.g. VMware Workstation 5) and minor versions (e.g. 5.5) are listed as separate items.

If a version is not listed, it should be considered not quality-assured and therefore not officially supported by the product. Note: versions not listed above may still work, but they remain not quality-assured.

Increasing the resolution of images captured by Sprinter, which appear in the Sprinter Story Board

Images captured during a Sprinter test run, which appear in the Sprinter Storyboard, are low resolution.

By default, the resolution of the images captured by Sprinter for the Storyboard is low.

Sprinter 11.0 Patch 9 enhanced the resolution quality of captured images.

After applying Patch 9, to view the images at their best quality, click the Maximize button in the upper part of the Storyboard and maximize the window to full screen.

If the resolution is still not satisfactory, try changing the "ImageQualityPercentage" key in the Sprinter configuration file.

This key controls the quality and size of the images in the Storyboard (it sets the JPEG compression level).

When this key is set to 100, image resolution is at its best quality; however, the size of the images increases.

The default value of the key is 30.

Steps to change this key:

1. Open Sprinter.exe.config file under <Sprinter Installation Folder>\Bin.

2. Find the key "ImageQualityPercentage".

3. Change the value as preferred (keep in mind that the larger the value, the larger the images).

4. Save and close the file.

5. Start Sprinter.
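Sprinter.exe.config is a standard .NET application configuration file, so the edit above can also be scripted. The sketch below assumes the key is stored as an `<add key="..." value="..."/>` entry (an assumption — verify against your actual file); the `set_app_setting` helper name and the installation path are illustrative.

```python
import os
import xml.etree.ElementTree as ET

def set_app_setting(tree, key, value):
    # Update an <add key="..." value="..."/> entry anywhere in the tree;
    # return True if the key was found and updated.
    for add in tree.getroot().iter("add"):
        if add.get("key") == key:
            add.set("value", str(value))
            return True
    return False

if __name__ == "__main__":
    # Adjust the path to your own Sprinter installation folder.
    config = r"C:\Program Files\HP\Sprinter\Bin\Sprinter.exe.config"
    if os.path.isfile(config):
        tree = ET.parse(config)
        if set_app_setting(tree, "ImageQualityPercentage", 100):
            tree.write(config)
```

Back up the original file before overwriting it, since an invalid configuration file may prevent Sprinter from starting.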

Error message "There are problems with your test configurations" when running a Sprinter test in Power Mode with instructions to start Internet Explorer

When running a web test with Sprinter in which Internet Explorer is used, the following error message is received: "There are problems with your test configurations". The Sprinter logs record the following detail: Path ‘C:\Program Files\Internet Exporer\iexplore.exe -nomerge’ not exists at this host.

The issue is caused by a modification to the registry key that controls how Internet Explorer is launched by default.

The problem is solved when the (Default) value under [HKEY_LOCAL_MACHINE\SOFTWARE\Clients\StartMenuInternet\IEXPLORE.EXE\shell\open\command] is set back to the default: "C:\Program Files\Internet Explorer\iexplore.exe" on a 32-bit OS, or "C:\Program Files (x86)\Internet Explorer\iexplore.exe" on a 64-bit OS.

To modify the registry, follow these steps:

1. Click Start > Run and type regedit.

2. Navigate to HKEY_LOCAL_MACHINE\SOFTWARE\Clients\StartMenuInternet\IEXPLORE.EXE\shell\open\command

3. Double-click the (Default) string value and set the default path to the Internet Explorer executable:

– C:\Program Files\Internet Explorer\iexplore.exe – for a 32-bit OS

– C:\Program Files (x86)\Internet Explorer\iexplore.exe – for a 64-bit OS
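Alternatively, the same fix can be applied by importing a .reg file instead of editing the value by hand. The fragment below is a sketch for a 32-bit OS ("@" denotes the (Default) value; backslashes and the surrounding quotes in the data must be escaped). On a 64-bit OS, use C:\Program Files (x86)\Internet Explorer\iexplore.exe instead, and double-check the path against your system before importing.

```
Windows Registry Editor Version 5.00

[HKEY_LOCAL_MACHINE\SOFTWARE\Clients\StartMenuInternet\IEXPLORE.EXE\shell\open\command]
@="\"C:\\Program Files\\Internet Explorer\\iexplore.exe\""
```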

Web object not working during recording, yet works fine on replay

When QuickTest Professional records against web-based applications that use a great deal of JavaScript, there are scenarios where some objects do not work during recording; however, once the same page/screen is refreshed while QTP is not recording, or when the Web Add-in is not loaded, the object works fine (for example, a link does not redirect the page to a new location during recording, yet works fine during replay).

One or more of the event hooks placed on the problematic web object conflict with the application's own handlers. This normally happens because the application's JavaScript-related objects are still being created or loaded in parallel while QTP injects its Web support into the page/screen/application. If the support-loading process finishes before an object is fully loaded (for example, its events are attached last), the hooks are still installed, but they may overwrite the event's actual functionality: the "Click" event on a Link or WebElement is captured for analysis, yet its behavior is broken (clicking does nothing) due to the conflict.

Determining which particular event is encountering problems will provide insight toward either a fix or a reconfiguration of QTP that allows automation to proceed.

To analyze which events are causing the problem:

· Access the Web Event Recording Configuration dialog (go to “Tools” and select the “Web Event Recording Configuration” option).

· Click the “Custom Settings…” button to open the “Custom Web Event Recording Configuration” dialog. Note: On the left side, a tree view shows the different types of web test objects; on the right side, all events being listened for and recorded are shown.

· Before moving forward, back up the current settings by exporting them to a file: go to “File” and select “Save Configuration As”. Note: the exported file can later be imported by going to “File” and selecting “Load Configuration”.

· Expand “Web Objects” (under the “Any Web Object” item on the left side) and select the test object showing the problematic behavior.

· Once a test object is selected, delete all the events displayed on the right side of the dialog (select an event in the list, then go to the “Event” menu and select “Delete”).

· Click “OK” in the currently open dialogs to confirm the changes and return to the QTP main screen.

· Confirm that the undesirable behavior against the problematic test object no longer occurs. Note: recording is expected not to work at this point (no script lines are generated and no objects are added to the repository).

· Now add back each deleted event (with the same “Listen” and “Record” settings as before), testing for the problem after each event is added back.

· Note: Once you have determined which particular event (or events) reproduces the issue, try the following options, if applicable.

The following are workarounds for scenarios where this situation presents itself:

A. If the "onclick" event is causing the problem, replace it with the “onmousedown” and “onmouseup” events:

· Delete the "onclick" event from under both the "Any Web Object" item and the item related to the problematic test object (for example "Link") in the Web Event Recording Configuration dialog.

· Add the "onmousedown" and "onmouseup" events under the items modified in the previous step, with "Listen" set to "Always" and "Record" set to "Enabled".

· Go to the QTP Tools menu, select "Options" and access the "Advanced" section of the "Web" tab.

· Enable "Record settings > Record MouseDown and MouseUp as Click" option

· Click "Apply", then "OK" to return to the main screen.

· Click "New" to create a new blank test (or go to File > New > Test), or load the desired test script (if it is already open, close and reopen it so the new settings are applied to the test).

· Test the new settings.

B. If it is not the "onclick" event that is causing the conflict:

· Start recording against problematic scenario

· Once problem happens, stop recording

· Refresh the page/screen and confirm the problematic object now works. Note: if refreshing the page/screen is not possible or presents problems/restrictions, it may be necessary to reload the entire browser so that the Web support loads appropriately.

· (Optional) In the QTP script, click the location where newly recorded script lines will be inserted/appended.

· Resume recording once the object is working correctly.

Note: Repeat these steps each time an object "loses" its functionality during recording.

IMPORTANT: If the script’s replay is also affected by this situation, then after the Sync method call on the Browser test object, include another line using the RefreshWebSupport method. For example (replace "MyBrowser" with the name of the Browser test object in your own object repository):

Browser("MyBrowser").Sync
Browser("MyBrowser").RefreshWebSupport
Pause and minimize Sprinter during a test run

It is possible to pause the capturing of user actions from the Run sidebar. While capturing is paused, the user can work with the application under test, or with any other application not relevant to the test, and none of these actions will be captured.

There is also an option to minimize Sprinter by pressing Ctrl+Backspace. Pressing the key combination again restores Sprinter to its normal size.

View all the differences detected in the difference viewer of HP Sprinter

When HP Sprinter detects many screen differences, only a summary view appears in the Difference Viewer, indicating the total number of differences detected; the individual differences cannot all be viewed.

Displaying a summary view when many differences are detected is by product design.

To work around this, set the "MaximumDiffNumberToReport" parameter in the settings.xml file (located in the %appdata%\HP\Sprinter folder).

Set the value to around 5000, and restart Sprinter.

You should be able to view all the differences detected.
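This edit can also be scripted. The schema of settings.xml is not documented in this article, so the sketch below assumes the parameter appears as an element named MaximumDiffNumberToReport (an assumption — verify against your actual file; it may instead be stored as an attribute). The `set_setting_text` helper name is illustrative.

```python
import os
import xml.etree.ElementTree as ET

def set_setting_text(tree, tag, value):
    # Set the text of the first element named `tag`, anywhere in the tree;
    # return True if such an element was found.
    for elem in tree.getroot().iter(tag):
        elem.text = str(value)
        return True
    return False

if __name__ == "__main__":
    # %appdata%\HP\Sprinter\settings.xml, per the article above.
    settings = os.path.join(os.environ.get("APPDATA", ""),
                            "HP", "Sprinter", "settings.xml")
    if os.path.isfile(settings):
        tree = ET.parse(settings)
        if set_setting_text(tree, "MaximumDiffNumberToReport", 5000):
            tree.write(settings)
```

As with any configuration change, back up the original file first and make the edit while Sprinter is closed.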