Saturday, May 26, 2012



OpenScript Errors: An internal error occurred during: "Launching Open Script".



Sometimes we get an internal error in OpenScript during recording and playback of a script, and the OpenScript workspace stops responding.


Error and exception: java.lang.NullPointerException


Resolution of this problem: 

a) Save the script and all assets related to the OpenScript workspace, then close OpenScript.

b) Delete the osworkspace directory, or rename it.

c) It is advisable to rename the directory rather than delete it; if renaming does not resolve the issue, delete it (but take a backup before deleting).
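The exact location of the osworkspace directory varies by installation, so the sketch below takes it as a parameter and demonstrates the idea on a throwaway temp directory. A minimal Python sketch of steps b) and c) above (back up first, then rename rather than delete):

```python
import os
import shutil
import tempfile

def backup_and_rename_workspace(workspace):
    """Back up the OpenScript workspace, then rename it so OpenScript
    recreates a fresh one on the next launch."""
    backup = workspace + ".bak"
    renamed = workspace + ".old"
    shutil.copytree(workspace, backup)   # keep a backup first (step c)
    os.rename(workspace, renamed)        # rename instead of deleting (step b)
    return backup, renamed

# Demo on a throwaway directory (the real osworkspace path varies by install).
demo_root = tempfile.mkdtemp()
ws = os.path.join(demo_root, "osworkspace")
os.makedirs(ws)
open(os.path.join(ws, "script.jwg"), "w").close()  # stand-in workspace file

backup, renamed = backup_and_rename_workspace(ws)
print(os.path.exists(backup), os.path.exists(renamed), os.path.exists(ws))
# → True True False
```

On the next launch OpenScript will create a clean osworkspace; if the problem disappears, the old directory (and its backup) can be discarded.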







Monday, May 14, 2012

Database Testing Using JMeter

Steps For Database Testing
  • Download JMeter
  • Download the latest version of JMeter.

  • Run jmeter.bat
  • Run the jmeter.bat file from the bin folder of the JMeter installation directory.

  • JMeter GUI
  • The JMeter GUI will be displayed.

  • Adding Basic controls to the Test Plan
    • Thread Group control
    • Right click on the Test Plan and add the Thread Group control.
      Note: Thread Group allows us to run the script with any number of users, with a defined ramp-up period, and to schedule the execution.
    • JDBC Request control
    • Add the JDBC Request control below the Thread Group. Note: JDBC Request allows us to run a SQL query, stored procedure, etc.
    • JDBC Connection Configuration control
    • Add the JDBC Connection Configuration control to the Test Plan.
      Note: The JDBC Connection Configuration control is used to configure the database connection (Oracle, SQL Server, MySQL, etc.). In this control, we need to provide values for the following parameters to communicate with the DB server:
      Database Connection Configuration
        For Oracle
        ============================
        Database URL: jdbc:oracle:thin:@localhost:port:service (e.g. port: 1521, service: oracle)
        JDBC Driver Class: oracle.jdbc.driver.OracleDriver
        Username: Username of the database (e.g. Username: scott)
        Password: Password of the user (e.g. Password: tiger)

        For SQL Server 2005
        ============================
        Database URL: jdbc:sqlserver://localhost:1433;databaseName=Learn_DB;
        JDBC Driver Class: com.microsoft.sqlserver.jdbc.SQLServerDriver
        Username: Username of the database (e.g. Username: sa)
        Password: Password of the user (e.g. Password: password)
        Copy the SQL Server 2005 JDBC driver JAR (downloaded from Microsoft’s site) to JMeter’s “lib” folder.
        For MySQL
        ============================
        Database URL: jdbc:mysql://localhost:3306/mydb
        JDBC Driver Class: com.mysql.jdbc.Driver
        Username: Username of the database (e.g. Username: guest)
        Password: Password of the user (e.g. Password: password)
        Here “mydb” is the database name. An example for MySQL is given in JMeter’s user manual: http://jakarta.apache.org/jmeter/usermanual/build-db-test-plan.html
    • View Results Tree control
    • Add the View Results Tree control to the Test Plan.
      Note: The View Results Tree control is used to view the result of the executed query.
    • Parameterization
    • User Parameters Control
      1. Add the User Parameters control to the Thread Group.
      2. Pass the variable to the query or procedure.
      Syntax: ${variable_name} (e.g. ${Dept_No})
      Sample Query: update dept set loc='JMeter' where deptno=${Dept_No}
    • Calling a Procedure
      1. Add the JDBC Request control to the Thread Group.
      2. The procedure we are going to call must exist in the database.
      3. Type the following command in the SQL Query parameter of the JDBC Request control:
      begin
      {call update_DEPT_PKG.update_DEPT(10)};
      end;
      Whichever procedure we want to call must be inside the begin…end block. The call keyword is used to invoke the procedure in the database.
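The ${Dept_No} parameterization above is the same idea as a bound parameter in any database client. As an illustration (not JMeter itself), here is a minimal Python sketch using an in-memory SQLite table as a stand-in for the classic dept table:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Stand-in for the dept table used in the sample query.
cur.execute("CREATE TABLE dept (deptno INTEGER, loc TEXT)")
cur.execute("INSERT INTO dept VALUES (10, 'NEW YORK'), (20, 'DALLAS')")

# Equivalent of: update dept set loc='JMeter' where deptno=${Dept_No}
dept_no = 10  # plays the role of the ${Dept_No} user parameter
cur.execute("UPDATE dept SET loc = ? WHERE deptno = ?", ("JMeter", dept_no))

cur.execute("SELECT loc FROM dept WHERE deptno = ?", (dept_no,))
print(cur.fetchone()[0])  # → JMeter
```

Binding the value instead of concatenating it into the SQL string is also what keeps the query safe when the parameter comes from test data.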

Monday, May 7, 2012

Performance, Load, Stress Testing and Related Terms

Performance testing: This type of testing determines or validates the speed, scalability, and/or stability characteristics of the system or application under test. Performance is concerned with achieving response times, throughput, and resource-utilization levels that meet the performance objectives for the project or product. In this guide, performance testing represents the superset of all of the other subcategories of performance-related testing.

Load testing: This subcategory of performance testing is focused on determining or validating performance characteristics of the system or application under test when subjected to workloads and load volumes anticipated during production operations.

Stress testing: This subcategory of performance testing is focused on determining or validating performance characteristics of the system or application under test when subjected to conditions beyond those anticipated during production operations. Stress tests may also include tests focused on determining or validating performance characteristics of the system or application under test when subjected to other stressful conditions, such as limited memory, insufficient disk space, or server failure. These tests are designed to determine under what conditions an application will fail, how it will fail, and what indicators can be monitored to warn of an impending failure.

Terminology:

Baselines: Creating a baseline is the process of running a set of tests to capture performance metric data for the purpose of evaluating the effectiveness of subsequent performance-improving changes to the system or application. A critical aspect of a baseline is that all characteristics and configuration options except those specifically being varied for comparison must remain invariant. Once a part of the system that is not intentionally being varied for comparison to the baseline is changed, the baseline measurement is no longer a valid basis for comparison.

Benchmarking: Benchmarking is the process of comparing your system’s performance against a baseline that you have created internally or against an industry standard endorsed by some other organization.

Capacity: The capacity of a system is the total workload it can handle without violating predetermined key performance acceptance criteria.

Capacity test : A capacity test complements load testing by determining your server’s ultimate failure point, whereas load testing monitors results at various levels of load and traffic patterns.

Component test: A component test is any performance test that targets an architectural component of the application. Commonly tested components include servers, databases, networks, firewalls, and storage devices.

Endurance test: An endurance test is a type of performance test focused on determining or validating performance characteristics of the product under test when subjected to workload models and load volumes anticipated during production operations over an extended period of time. Endurance testing is a subset of load testing.

Performance: Performance refers to information regarding your application’s response times, throughput, and resource utilization levels.

Performance test: A performance test is a technical investigation done to determine or validate the speed, scalability, and/or stability characteristics of the product under test.

Performance goals: Performance goals are the criteria that your team wants to meet before product release, although these criteria may be negotiable under certain circumstances.

Performance objectives: Performance objectives are usually specified in terms of response times, throughput (transactions per second), and resource-utilization levels and typically focus on metrics that can be directly related to user satisfaction.

Performance targets: Performance targets are the desired values for the metrics identified for your project under a particular set of conditions, usually specified in terms of response time, throughput, and resource-utilization levels.

Performance testing objectives: Performance testing objectives refer to data collected through the performance-testing process that is anticipated to have value in determining or improving product quality.

Performance thresholds: Performance thresholds are the maximum acceptable values for the metrics identified for your project, usually specified in terms of response time, throughput (transactions per second), and resource-utilization levels.

Resource utilization: Resource utilization is the cost of the project in terms of system resources. The primary resources are processor, memory, disk I/O, and network I/O.

Saturation: Saturation refers to the point at which a resource has reached full utilization.

Scalability: Scalability refers to an application’s ability to handle additional workload, without adversely affecting performance, by adding resources such as processor, memory, and storage capacity.

Stability: In the context of performance testing, stability refers to the overall reliability, robustness, functional and data integrity, availability, and/or consistency of responsiveness for your system under a variety of conditions.

Throughput: Throughput is the number of units of work that can be handled per unit of time; for instance, requests per second, calls per day, hits per second, reports per year, etc.

Source: http://msdn.microsoft.com
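The throughput definition above is simply units of work divided by elapsed time. A tiny worked example with illustrative numbers:

```python
# Throughput = units of work / unit of time (illustrative numbers).
requests_completed = 9_000
test_duration_s = 600          # a 10-minute measurement window

throughput_rps = requests_completed / test_duration_s
print(throughput_rps)  # → 15.0 requests per second
```

The same arithmetic applies whatever the unit of work is: hits, calls, transactions, or reports.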

Friday, May 4, 2012

Want To Do a LoadRunner Certification?


Load Runner Certifications:

1) HP AIS - LoadRunner v11 (Recommended for Beginners)
    Exam HP0-M48 - HP LoadRunner 11.x Software
    Exam HP0-M49 - HP Virtual User Generator 11.x Software

2) HP ASE - LoadRunner v11 (Recommended for Experts)
    Exam HP0-M99 - Advanced LoadRunner and Performance Center v11 Software

1) HP AIS - Exam HP0-M48:

Minimum Qualifications:
To pass this exam, it is recommended that you have at least three months' experience with HP LoadRunner 11.x software. Exams are based on an assumed level of industry-standard knowledge that may be gained from training, hands-on experience, or other prerequisite events.

Exam Details:
The following are details about this exam:
 Number of items: 67
 Item types: multiple choice and drag-and-drop
 Exam time: 105 minutes
 Passing score: 72%
 Reference material: No on-line or hard copy reference material will be allowed at the testing site.

Exam Content:
The following testing objectives represent the specific areas of content covered in the exam. Use this outline to guide your study and to check your readiness for the exam. The exam measures your understanding of these areas.

2) HP AIS - Exam HP0-M49:

Exam Details:
The following are details about this exam:
 Number of items: 63
 Item types: multiple choice and drag-and-drop
 Exam time: 105 minutes
 Passing score: 74%
 Reference material: No on-line or hard copy reference material will be allowed at the testing site.

Exam Content:
The following testing objectives represent the specific areas of content covered in the exam. Use this outline to guide your study and to check your readiness for the exam. The exam measures your understanding of these areas.

3) HP ASE - Exam HP0-M99:

Minimum Qualifications:
To pass this exam, you should have at least six months of field experience in scripting using the Virtual User Generator, creating load test scenarios using the Controller and Analysis tools, automated software testing, and the software testing lifecycle. Exams are based on an assumed level of industry-standard knowledge that may be gained from training, hands-on experience, or other prerequisite events. You should also be knowledgeable about:
 Web interfaces, HTML, software testing fundamentals
 C
 Basic SQL

Exam Details:
The following are details about this exam:
 Number of items: 85
 Item types: Multiple choice and performance-based
 Exam time: 3 hours
 Passing score: 71.76%
 Reference material: This is a performance-based test. The candidate will be provided with a LoadRunner environment to perform tasks directed during the exam. No other on-line or hard copy reference material will be allowed.

Exam Content:

The following testing objectives represent the specific areas of content covered in the exam. Use this outline to guide your study and to check your readiness for the exam. The exam measures your understanding of these areas.

Sections/Objectives:
1. Plan a load test
2. Install LoadRunner
3. Create and enhance Vuser scripts
4. Demonstrate advanced scripting
5. Configure load test scenarios
6. Analyze results
7. Demonstrate core Performance Center software knowledge
8. Performance-based activity (VuGen scripting, Scenario setup, Analysis)

Exam Registration:
To register for this exam, please go to the exam tab in The Learning Center and click on “Access more information”. Visit http://www.hp.com/go/ExpertONE for access.


Please let us know if you have any query or doubt regarding the exam.

We wish you success in passing the exam. All the best!

Wednesday, May 2, 2012

Approach While Starting a Performance Testing Project

When starting a performance testing project, the simple question in our mind is: what approach should be followed?

Following is the generic approach that we follow when starting a performance testing assignment:

1) Functional and architectural understanding of the application.

2) Identification of critical business workflows (in terms of user load and volume).

3) Identify first-level bottlenecks at the DB: start performing activities manually and capture the slow-running/CPU-consuming queries by database profiling.

4) Find index suggestions/requirements for the slow-running queries.

5) Find sessions and deadlocks (if any) by profiling [SQL Server Profiler for SQL Server / alert file for Oracle].

6) Monitor CPU, memory, and TCP connections side by side.

7) Identify first-level memory leak bottlenecks [perform an endurance run at night and check for out-of-memory errors in the server logs].

8) Use memory profiling tools like ANTS Profiler for .NET applications, or Eclipse for heap and thread dump analysis in the case of an Apache Tomcat server, to find memory-consuming objects.

9) Identify first-level TCP thread bottlenecks: perform a load execution with a few users (say 10) for a long duration and monitor TCP connections.
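The log check in step 7 above can be sketched in a few lines of Python; the log format, path, and error pattern below are assumptions to adapt to your own server:

```python
import re

# Hypothetical helper: scan server log text for JVM out-of-memory errors.
# The pattern is an assumption -- adjust it to your server's log format.
def find_oom_errors(log_text):
    """Return the log lines that indicate an out-of-memory condition."""
    pattern = re.compile(r"java\.lang\.OutOfMemoryError|Out of memory",
                         re.IGNORECASE)
    return [line for line in log_text.splitlines() if pattern.search(line)]

# Illustrative log excerpt standing in for a real overnight endurance run.
sample_log = """\
2012-05-02 01:14:03 INFO  request served in 120 ms
2012-05-02 03:22:41 ERROR java.lang.OutOfMemoryError: Java heap space
2012-05-02 03:22:42 INFO  restarting worker
"""

for line in find_oom_errors(sample_log):
    print(line)
```

Run against the real server log after the endurance run, any hit is a strong signal to move on to the memory profiling in step 8.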