Performance testing info
 
About Me

Info on all aspects of non-functional testing including performance testing, load testing, scalability testing, business continuity testing, disaster recovery testing, resilience testing, endurance testing and volume testing.


4/14/2006 - IBM Unleashes a Barrage of SOA Announcements
Posted in performance testing

"If quantity is in any way connected to quality or importance, then IBM's announcement last Monday of new software and services to help customers adopt service oriented architecture, or SOA, was a humdinger. All in all, Big Blue rolled out 11 new products (some of which had been previously announced, and not all of which have yet been released) and 20 enhancements to existing products--all designed to speed up adoption and implementation of SOA."

For the full story visit here

 


4/8/2006 - Software testing for FIX connectivity group
Posted in Load testing

NEW YORK --(Business Wire)-- March 21, 2006

 

Aegis Software Inc., a New York-based provider of trading software, announced today that eSpeed, Inc. (Nasdaq:ESPD), a leading developer of electronic marketplaces and trading technology for the global capital markets, has chosen "Exchange Simulator" and "Client Simulator" for testing and verification of trading systems that are used in the Global FIX Connectivity Group.
Exchange Simulator allows developers, QA engineers and production support departments to verify trading systems against simulations of popular exchanges and ECNs, including NYSE, NASDAQ, ARCA, INET, LAVA, CME, CBOE and ISE. Several exchanges can be run simultaneously and each user can have their own virtual exchange. Exchange Simulator includes capabilities for functional, regression, conformance and load testing.
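As background to the press release, the trading systems that Exchange Simulator verifies speak FIX, a wire format of tag=value fields separated by the SOH byte (0x01) and terminated by a checksum field (tag 10). The sketch below is purely illustrative and not from the announcement; all field values are invented.

```python
# Minimal illustrative FIX 4.2 encoder (not part of the Aegis products).
SOH = "\x01"

def encode_fix(msg_type, fields):
    """Build a FIX 4.2 message from a message type and (tag, value) pairs."""
    # The body starts at tag 35; BodyLength (tag 9) counts the body's bytes.
    body = f"35={msg_type}{SOH}" + SOH.join(f"{t}={v}" for t, v in fields) + SOH
    header = f"8=FIX.4.2{SOH}9={len(body)}{SOH}"
    partial = header + body
    checksum = sum(partial.encode()) % 256  # tag 10: byte sum modulo 256
    return partial + f"10={checksum:03d}{SOH}"

# A hypothetical new-order-single (35=D): buy (54=1) 100 shares (38) of IBM (55)
msg = encode_fix("D", [("55", "IBM"), ("54", "1"), ("38", "100")])
print(msg.replace(SOH, "|"))
```

A simulator in the Exchange Simulator mould would parse such messages from connecting clients and reply with executions, which is what makes conformance and load testing against a virtual exchange possible.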

For the full story visit here


4/1/2006 - Software test automation - new testing tool
Posted in performance testing

Original Software, a leader in software test automation, today advanced the field of automated software testing with the introduction of its new solution, TestDrive-Gold.  Unlike testing solutions that rely on skilled IT staff to constantly write new test scripts, TestDrive-Gold is the only testing solution that requires no coding knowledge, and provides self-healing scripts and advanced scripting controls.

The core of Original Software's solution is its ease of use and intuitive nature, which enables automated testing to be readily applied at every stage of development, cuts time-to-market development cycles, and dramatically reduces application errors and failure.  TestDrive-Gold combines several innovative software testing technologies:

For the full story visit here


3/24/2006 - Rational Performance Tester Extension for SAP Solutions demo
Posted in performance testing
"High Performance SAP Programming: Rational Performance Tester Extension for SAP Solutions extends the performance and scalability testing tool of Rational Performance Tester to SAP Solutions. View the demo for answers to SAP testing FAQs and more.

Rational Performance Tester Extension for SAP Solutions extends the performance and scalability testing of Rational Performance Tester to SAP Solutions. In doing so, it helps improve quality through pre-deployment scalability testing that mimics the true end-user experience and reduce the cost of system scalability testing with reusable test scenarios that can emulate a large population of users. It also helps reduce the time to develop performance tests with an easy to use, "no-code" test recording solution and increase the return on investment of your IT infrastructure through automated pre-deployment capacity planning tests"

 

For the full story visit here


3/17/2006 - SOA Testing
Posted in performance testing

(From Business Wire)

 

iTKO announced today the release of iTKO LISA 3 Complete SOA Testing Platform. LISA 3, generally available immediately, is a proven suite with the breadth to test service-oriented architecture (SOA) workflows across every component type and development phase. The complete LISA 3 suite was unveiled today at the Software Development Conference & Expo in Santa Clara, Calif.
LISA 3 expands on the concept of distributed testing for every development phase, from functional and unit testing, to regression, load and performance testing. It offers a comprehensive platform for deep testing and monitoring of dynamic Web applications and all of the supporting SOA layers behind them.

For the full story visit here


3/13/2006 - Performance testing - a different perspective
Posted in performance testing

Performance testing article in QAThreads:

 

"Today, applications are getting heavier, the number of users accessing them is increasing, and as a result the importance of performance testing is growing. On the other hand, fast servers that deliver great performance against millions of simultaneous users are available in the market.

The problem is that these servers don't come cheap, and neither do the required software and the experts available in this field. In this kind of scenario, when an organization thinks of carrying out performance testing of an application, a lot of factors need to be considered, from the basics of performance testing to which tool to use and how it should be used. In this article, "The declining importance of performance testing should change your priorities", Robert L. Bogue discusses performance testing with respect to today's changing face of software and its accessibility."

For the full story visit here


2/25/2006 - Performance testing approaches
Posted in performance testing

Found a great article on performance testing on Sys-con. The intro is below:

"Performance testing a J2EE application can be a daunting and seemingly confusing task if you don't approach it with the proper plan in place. As with any software development process, you must gather requirements, understand the business needs, and lay out a formal schedule well in advance of the actual testing.

The requirements for the performance testing should be driven by the needs of the business and should be explained with a set of use cases. These can be based on historical data (say, what the load pattern was on the server for a week) or on approximations based on anticipated usage. Once you have an understanding of what you need to test, you need to look at how you want to test your application."
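The article's point about deriving use cases from historical load data can be made concrete with Little's Law: the number of virtual users to simulate equals the arrival rate multiplied by the time each user occupies the system. The figures below are invented for illustration, not taken from the article.

```python
# Illustrative sketch: turning a week of historical load data into a
# performance-test target using Little's Law. All numbers are hypothetical.

hourly_requests = [1200, 4800, 9600, 15000, 22000, 18000, 7000]  # sampled hours
avg_response_time_s = 0.8   # assumed mean server response time
think_time_s = 15.0         # assumed user think time between requests

peak_per_hour = max(hourly_requests)
arrival_rate = peak_per_hour / 3600.0  # requests per second at peak

# Little's Law: concurrency = arrival rate x time each user occupies the system
virtual_users = arrival_rate * (avg_response_time_s + think_time_s)

print(f"Peak arrival rate: {arrival_rate:.2f} req/s")
print(f"Virtual users to simulate: {virtual_users:.0f}")
```

Sizing the virtual user population this way keeps the test anchored to observed peaks rather than guesswork, which is exactly the "historical data" route the quoted passage recommends.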

For the full story visit here


2/17/2006 - Performance testing consultant - guess the salary
Posted in performance testing

Tester (Performance Testing Specialist) - Banking - London

 

"As an experienced performance tester you will join a leading Investment Bank in a team responsible for testing system applications on small, medium and large projects. You will be responsible for the testing of systems applications on a Unix/Sybase/Java/C platform and will need to understand project test requirements, prepare test cases and test environments, and manage the performance test phase of a project / regular release."

 

For the full story visit here


2/12/2006 - IBM to boost testing in Russia with free testing tools

From EFY Times

 

"With a few clicks of a mouse, developers can download free versions of IBM middleware, IBM WebSphere Application Server Community Edition and IBM DB2 Universal Database Express-C, as well as access trial code, tutorials, technical forums, emerging technologies and blogs where IBM technical experts share their tips and expertise. Russia has one of the largest, growing developer populations in the world."

 

For the full story visit here


2/1/2006 - Penetration testing worms

"A researcher has reopened the subject of beneficial worms, arguing that the capabilities of self-spreading code could perform better penetration testing inside networks, turning vulnerable systems into distributed scanners."

 
For the full story visit here


1/23/2006 - Automated software testing solution emerges from R&D
Posted in Scalability testing
Interesting article on the results of the EUREKA ITEA Cluster TT-Medal project, which states that the software testing costs of embedded software in electronics systems can be dramatically reduced, providing significant savings for European industry.

"The EUREKA ITEA Cluster TT-Medal project has achieved a major breakthrough for the European electronics industry by developing a generic solution to enable automated testing of software systems. The methodologies and tools developed in the project were validated in industrial-scale demonstrators for automotive, railway, financial and telecommunications applications, proving the feasibility of a significant improvement in test efficiency, effectiveness and product quality. This, in turn, leads to significant cuts in testing costs. As a result, the TT-Medal project provides a unique opportunity for European suppliers and consultants to position themselves better in a world market previously dominated by the USA."

Generic automated testing tools and methods enabled these savings:

"The EUREKA Cluster TT-Medal project developed generic automated testing methodologies and tools based on TTCN-3, the internationally standardised testing language from the European Telecommunications Standards Institute (ETSI), which enables systems testing from beginning to end using common tools. This makes the reuse of test ware between different phases of a product's lifecycle possible - from initial simulation at the design stage to regression testing during maintenance - and also saves on training.

"An added advantage of the internationally recognised Testing and Test Control Notation (TTCN) language is that it is driven by Europe. It can be used for many applications, including mobile communications, wireless local area networks (LANs), digital cordless phones, broadband technologies and Internet protocols. It is more productive, powerful, flexible and extendable than previous approaches, as well as being easier to learn.

"Three application areas were selected to validate the approach:
  1. Transportation - telematics for information and entertainment systems in cars and interlock subsystems for railway signalling and control;
  2. Telecommunications - 3G radio access network operation and maintenance, GSM mobile terminal location, and 2.5G and 3G mobile module integration; and
  3. Finance - integration of TTCN-3 on both user and application sides of financial distribution systems testing."

For the full story visit Automated solution enhances European software testing capabilities



1/14/2006 - Automated software testing interview
Posted in performance testing

An interesting interview with Danny Faught at Qthreads. A couple of software testing snippets to whet your appetite to read the whole article:

"As for the evolution of the software testing field, there have been some innovations over the years. What's striking about the innovations, however, is how few people know about them, and even fewer people are using them. In terms of evolution, I think we're in the Dark Ages of testing. Many test teams work in isolation, knowing little of the existing literature on the subject, and providing little input to improve the state of the craft in general. I think that software testing will evolve into a proper engineering discipline, but I don't think that will happen for several decades yet."

"Most projects I see are not ready for functional test automation. Because they're irrelevant to most of my work, I have little experience with the big GUI automation tools and no opinion about how well they work. I do use other kinds of tools on a regular basis. James Bach and I wrote about this in “Not Your Father’s Test Automation: An Agile Approach to Test Automation", and it's a core part of what I do. I use mostly open source and freeware tools, along with lots of small scripts that are easy to develop and make my job a lot easier."
 
"The simple answer to the question “Can automated testing replace all manual testing” is “No.”"

 

 


1/9/2006 - Load, stress and performance testing tool for sites
Posted in performance testing

SoftLogica LLC announces WAPT 4.0, the new version of its load, stress and performance testing tool for web sites, web servers and applications with web interfaces.

WAPT creates a workload for performance testing that is virtually the same as the load experienced by a web site in the real world. For example, for retail sites, some users may be surfing the catalog, others searching for a specific product and submitting an order, while an administrator may be updating the catalog. Taking into account that various users perform different actions while browsing your website, the program lets you define as many different virtual user profiles as there are types of real web application users.

Virtual users in each profile can be set up with their own individual IP address (IP spoofing), user name and password, and persistent cookies, to name a few. Basic and Integrated Windows (NTLM) authentication methods are supported. The program handles user-specific dynamic hidden values and session variables assigned by a server.

Graphs and reports are shown in real-time, thus helping to manage the performance testing process. You don’t need to wait for the completion of the test to get results, so if you have already identified a problem, you can stop testing, fix the problem and start the load test again to check for performance changes.


The command line interface allows you to integrate WAPT into the existing development environment. Standard XML files are used to store performance testing scenarios and can be modified by third party software. WAPT supports different language encodings, so you can perform load testing for web sites in virtually any language, including forms and dynamic content.
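The announcement says WAPT stores scenarios as standard XML that third-party software can modify. The schema below is invented for illustration only (it is not WAPT's real file format; consult the product documentation for that), but it shows how weighted virtual user profiles like the retail example above might be represented and consumed programmatically.

```python
# Hypothetical scenario file for a load test with mixed user profiles.
# Element and attribute names here are made up for this sketch.
import xml.etree.ElementTree as ET

scenario_xml = """
<scenario name="retail-site">
  <profile name="browser" users="60" think_time="10">
    <step url="/catalog" method="GET"/>
  </profile>
  <profile name="buyer" users="30" think_time="20">
    <step url="/search?q=widget" method="GET"/>
    <step url="/order" method="POST"/>
  </profile>
  <profile name="admin" users="1" think_time="60">
    <step url="/admin/catalog" method="POST"/>
  </profile>
</scenario>
"""

root = ET.fromstring(scenario_xml)
total_users = sum(int(p.get("users")) for p in root.findall("profile"))
print(f"Scenario '{root.get('name')}': {total_users} virtual users")
for profile in root.findall("profile"):
    steps = [s.get("url") for s in profile.findall("step")]
    print(f"  {profile.get('name')}: {profile.get('users')} users -> {steps}")
```

Keeping scenarios in plain XML is what makes the command-line integration described above possible: a build script can generate or tweak the scenario file and then launch the load test unattended.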


For more information, visit the WAPT load, stress and performance testing tool site.

 


12/31/2005 - Performance testing tools review from Computerworld
Posted in performance testing

Computerworld runs some interesting items in its review of 2005 software application development and testing, including performance testing tools like OpenSTA.

 

"We also witnessed the emergence of static code analysis tools -- from vendors such as Compuware, Coverity, Fortify Software, and Secure Software -- as a promising way to identify software defects and security issues"

 

The testing tools it picks out include:

 

"Compuware DevPartner Fault Simulator 1.0
Fair 6.5
DevPartner Fault Simulator is a great idea that solves a difficult and rarely addressed problem: testing little-used program exception code. Unfortunately, this release is hampered by lack of integration with Compuware's other tools, as well as by limited coverage. Wait for the next release.

Empirix e-Test Suite 8.0
Very Good 8.0
e-Test Suite 8.0 greatly expands its reach with a revised Java-based agent architecture and greater back-end server support. Although still not as well-integrated or capable as higher-end products such as Segue's SilkPerformer, e-Test is headed in the right direction. It's an easy-to-master Web app testing solution.

Parasoft Jtest 7.0
Excellent 8.7
For industrial-strength Java application testing, Jtest 7.0 is the tool to get. Because it allows you to do so much without getting your hands in the code, it will be as useful to your QA department as it will be to your developers.

Parasoft SOAPtest 4.0
Very Good 8.6
Parasoft's SOAPtest 4.0 is a Web service testing tool that will be useful to developers and QA engineers alike. Its strength is the speed with which robust test suites can be built from humble unit-test beginnings. New security tests for Web services are a fine addition.

Minq PureLoad Enterprise Edition 3.3.1
Excellent 8.7
Sites that need to test Web protocols and enterprise applications will find PureLoad Enterprise Edition up to the task. Although its reporting is not as extensive as that of some competitors, PureLoad is very easy to set up and use for single- and multiple-machine load tests. A Web Edition that supports HTTP and HTTPS testing is also available.

OpenSTA 1.4.3
Very Good 8.3
OpenSTA is a solid, easy-to-use, open source Web load-testing tool. It does a nice job of answering how well a particular Web site or app might scale. Its proprietary scripting language may slow some developers a bit, however, and its basic reporting and lack of certain features (built-in comparison tools, for example) may prompt some IT departments to look elsewhere. "

There is a stack of other tools reviewed for application development and testing in the Computerworld round-up. There are some excellent white papers on the site, but unfortunately I couldn't find one on performance testing or performance monitoring.

 




12/22/2005 - Performance testing with Open Source testing tools
Posted in performance testing

I came across this brilliant site on Open Source performance testing which "aims to boost the profile of open source testing tools within the testing industry, principally by giving users easy access from one central location to the wide range of open source testing tools available." It contains tool information on:

 

Functional testing tools

Performance testing tools - such as OpenSTA

Test management tools

Bug databases

Link checkers

Security tools

 

It also contains a summary of information on feedback on Open Source testing tools.

"... some pearls of wisdom that I shall repeat here:

  • All of the tools have hidden costs, however, the major vendor provided tools have more up-front costs.
  • In the perfect world, vendor provided solutions would be made available cheaper.
  • Companies need to be aware of all solutions available, and whether to use “free," open source, or vendor provided tools should be part of the automated testing tool evaluation process.
  • Using a combination of very low cost vendor provided testing tools along with zero up-front cost, open-source testing tools would be the ideal solution, provided compatibility.
  • Testers need to be technical in order to be able to implement the most effective solution.
  • One of the main points: There is no such thing as a free tool."

A great source of useful open source performance testing tool information (and other testing tools too). Great site.

 


12/18/2005 - Performance monitoring, testing and management
Posted in performance testing

WiredCity, a performance management tool vendor for business enterprises, announced it is expanding the infrastructure monitoring market with IT Monitor(TM), a software architecture that crosses organizational boundaries for management and testing of end-to-end performance and processes.

 

WiredCity's IT Monitor platform integrates real-time events to trigger actions throughout  the business operations, providing enterprise performance monitoring and management that helps users anticipate issues before they happen, while also enabling them to continuously improve operational performance. The architecture is a scalable software solution that allows end-to-end performance testing, monitoring and management.

 


 

For more information go to the performance management tool page.

 

For an overview of performance testing tools.

 

 

 


12/17/2005 - Performance testing - SAP and LoadRunner


12/14/2005 - Whitepaper on performance testing in a SAP application environment.
Posted in performance testing

Just read an interesting new whitepaper from Mercury. It covers diagnostics and root-cause analysis capabilities for optimizing heterogeneous business processes in an SAP applications environment across the performance lifecycle. There's an interesting description of their root-cause analysis capabilities, which span from SAP NetWeaver to SAP R/3 back ends.

 

It's aimed at IT managers, SAP application performance testers, and SAP support and operations groups. So if you're interested in testing and are part of this group, then it's worth a read.

 

You can get a copy from Bitpipe; the paper is called: Diagnostics for SAP Solutions: A Performance Lifecycle White Paper

 


 

Performance testing services (including SAP)

 


12/11/2005 - Parasoft: security and reliability in web services

 


11/20/2005 - Performance testing and scalability testing
Posted in Scalability testing

Systems that work well during development, deployed on a small scale, can fail to meet performance goals when they are scaled up to support real levels of use. An apposite example of this comes from a major blue chip company that recently outsourced the development of an innovative high technology platform. Though development was behind schedule, that wasn't a major problem. The system passed through the functional elements of user acceptance testing, and eventually it looked like a deployment date could be set. But then the supplier started load testing and scalability testing. There followed a prolonged and costly period of architectural changes and changes to the system requirements. The supplier battled heroically to provide an acceptable system, until finally the project was mothballed.

 

This is not an isolated case. From an ambulance dispatch system to electronic submission systems for tax returns, systems fail as they scale and experience peak demands. All of these projects appear not to have identified and prioritised the major risks they faced. This is a fundamental stage of risk-based testing, and applies equally to scalability testing or load testing as it does to functionality testing or business continuity testing. With no risk assessment, they did not recognise that scaling was amongst the biggest risks, far more so than delivering all the functionality. Recent trends towards Service Oriented Architecture (SOA) attempt to address the issue of scalability but also introduce new issues. Incorporating externally provided services into your overall solution means that your ability to scale now depends upon how these external systems operate under load.

 

Assuring this is a demanding task, and sadly the load testing and stress testing here are often overlooked. Better practice is to start the development of a large-scale software system with its performance clearly in mind, particularly scalability testing, volume testing and load testing. To create this performance testing focus:

  1. Research and quantify the data volumes and transaction volumes the target market implies. Some of these figures can be eye openers and help the business users realise the full scale of the system. This alone can lead to reassessment of the priority of many features.
  2. Determine the way features could be presented to users, and the system structured, in order to make scaling of the system easier. Do not try to replicate the functionality you would have in a single-user desktop solution; provide an appropriate scalable alternative instead.
  3. Recognise that an intrinsic part of the development process is load testing at representative scale on each incremental software release. This is continual testing, focusing on the biggest risk to the project: the ability to operate at full scale.
  4. Ensure load testing is adequate both in scope and rigour. Load testing is not just about measuring response times with a performance test. The load testing programme needs to include other types of load testing including stress testing, reliability testing, and endurance testing.
  5. Don’t forget that failures will occur. Large scale systems generally include server clusters with fail-over behaviour. Failure testing, fail-over testing and recovery testing carried out on representative scale systems operating under load should be included.
  6. Don’t forget catastrophic failure could occur. For large scale problems, disaster testing and disaster recovery testing should be carried out at representative scale and loads. These activities can be considered the technical layers of business continuity testing.
  7. Recognise external services if you use them. Where you are adopting an SOA approach and are dependent on external services you need to be certain that the throughput and turnaround time on these services will remain acceptable as your system scales and its demands increase. A smart system architecture will include a graceful response and fall-back operation should the external service behaviour deteriorate or fail.
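Point 3 in the list above, load testing at representative scale on each incremental release, can be sketched as a small harness that ramps the user count and checks response times against a target. The stub service model and the SLA figure below are invented for illustration; in a real programme the stub would be replaced by measurements of the actual system under generated load.

```python
# Illustrative incremental load-test sweep (not a real tool): ramp the number
# of virtual users and flag the scale at which response time breaches the SLA.
import statistics

def service_response_time(concurrent_users):
    # Stub model: base latency plus a contention cost that grows with load.
    # Both constants are made up for this sketch.
    base_ms = 120.0
    contention_ms = 0.004 * concurrent_users ** 1.5
    return base_ms + contention_ms

SLA_MS = 500.0  # assumed response-time target

for users in (100, 500, 1000, 2000, 5000):
    samples = [service_response_time(users) for _ in range(50)]
    mean_ms = statistics.mean(samples)
    status = "OK" if mean_ms <= SLA_MS else "FAIL"
    print(f"{users:>5} users: mean {mean_ms:6.1f} ms  [{status}]")
```

Running a sweep like this on every release turns scalability from a late surprise into a tracked regression metric, which is the continual-testing discipline the article argues for.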
Copyright Acutest 2005



 