Software Testing and Types – An Overview

Introduction   

In this article, we will have a detailed look at software testing and its types. Software testing is the process of executing an application with the intention of finding bugs. Testing is also done to make sure the application is developed as per the customer's requirements, and to check that it satisfies those requirements in all situations. The key testing types and related activities covered include:

  • Independent verification and validation
  • Configuration control
  • Integration
  • User documentation
  • Unit testing
  • Function testing
  • Regression testing
  • Integration testing
  • Performance testing
  • Security testing
  • Usability testing
  • System testing
  • Cloud testing
  • Field (beta) testing
  • Acceptance testing
  • Independent testing

Integration Testing

Integration testing is conducted to verify that the combined parts of an application function together correctly. Any application is made up of smaller units called modules, which interact with each other via APIs or interfaces. On a larger scale, integration testing also involves testing the integration of one system with another and validating the communication between them.

The techniques: 

Integration testing is done using three approaches:

  • Big bang
  • Bottom-up
  • Top-down

In each technique, modules are integrated with each other level by level, either top-down or bottom-up.

[Figure: bottom-up and top-down integration approaches]

What is to be tested? 

  • User interface (GUI) interaction with the application databases, such as Oracle, MySQL, or MS SQL Server
  • All interaction with other applications (feeds into and out of the system)  
  • Examples of application testing involve testing the various features the application provides. Testing the sending of an email, deleting an email, storing an email in the draft folder, etc. in MS Outlook can be considered application testing. Each of these features is a module, and when integrated together, they form an application.
  • System-to-system integration and testing of communication between them.  
  • Validating job execution (proper triggering of job inflow and outflow)
  • Manually checking the database for accuracy  
  • Manually checking for notification messages on GUI  
  • Validation of all generated reports after the integration has been accomplished.  

Success criteria: Successful interaction of the modules in the application and successful interaction of various systems communicating with each other. 

Example: Testing of features in MS Outlook, like sending an email, deleting an email, composing an email, etc. All these modules integrate to form an application, and testing these integrated modules together is known as integration testing.
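As an illustration, here is a minimal integration-test sketch in Python (pytest style). The Composer and Outbox classes are hypothetical stand-ins for two modules of an email client; the test exercises their interaction rather than each unit in isolation:

    class Composer:
        """Builds an email message from raw fields (hypothetical module A)."""
        def compose(self, to, subject, body):
            if not to:
                raise ValueError("recipient required")
            return {"to": to, "subject": subject, "body": body}

    class Outbox:
        """Stores and 'sends' composed messages (hypothetical module B)."""
        def __init__(self):
            self.sent = []
        def send(self, message):
            self.sent.append(message)
            return True

    def test_compose_and_send_integration():
        # Integrate the two modules: the output of Composer feeds Outbox.
        composer, outbox = Composer(), Outbox()
        message = composer.compose("user@example.com", "Hi", "Hello!")
        assert outbox.send(message) is True
        assert outbox.sent[-1]["to"] == "user@example.com"

Running pytest on this file executes the integration test; a failure here points to the interface between the two modules rather than to either module alone.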

User Interface Testing

Test Objective: 

The user interface testing is performed to validate whether the Graphical User Interface (GUI) of the application is functioning as specified. For example, navigation through the target-of-test properly reflects business functions and requirements, including window-to-window (user screens), field-to-field, and use of access methods (tab keys, mouse movements, accelerator keys, etc.). 

It also validates window objects and characteristics, such as menus, size, position, state, and focus, for conformance to standards. GUI testing has gained a lot of importance in today’s world, where web and mobile apps are highly user-centric. Another important aspect of GUI testing is user experience: testing should focus on validating whether the screens match the wireframes provided by the client, and should also ensure that the developed UI is easy to use for the end user.

The techniques:

Create or modify tests for each window to verify proper navigation and object states for each application window and object. “State transition testing” should be performed to ensure the navigation flow from one page to another (back and forth). Test cases should focus on testing each component that is visible on the screen to the end user. Utmost care should be taken to ensure that all the hyperlinks on the web page redirect to their designated destinations.
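As a sketch of automating such GUI checks, the following assumes the Selenium WebDriver Python bindings and a matching ChromeDriver are installed; the site, title, and link text are taken from the public example.com page, purely for illustration:

    from selenium import webdriver
    from selenium.webdriver.common.by import By

    def test_homepage_navigation():
        driver = webdriver.Chrome()  # assumes ChromeDriver is on PATH
        try:
            driver.get("https://example.com")
            # Window property check: the page title should match the spec.
            assert "Example" in driver.title
            # Hyperlink check: the link should redirect to its designated destination.
            driver.find_element(By.LINK_TEXT, "More information...").click()
            assert driver.current_url.startswith("https://www.iana.org")
        finally:
            driver.quit()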

Success criteria: 

Successful verification of each window, which should remain consistent with the benchmark version or within an acceptable standard.

Special consideration: Not all properties of custom and third-party objects can be accessed.

Example: Navigate to any site, such as amazon.com. When you visit the site, you look at the various controls, menus, displays, fonts, colors, navigation between web pages, etc., which are part of the “look and feel” requirements. Testing these against the wireframes provided by the client and validating the exact requirements is called GUI testing.

Function Testing

Test Objective: 

Ensure proper target-of-test functionality, including navigation, data entry, processing, and retrieval. Functional testing should focus on validating whether the application does what it is intended to do. Expected results and actual results should be logged and compared with each other to ensure that there is no discrepancy in the application developed.

Technique:

Execute each use case, use case flow, or function, using valid and invalid data, to verify the following (a sketch follows the list):

  • The expected results occur when valid data is used.  
  • Test the corner cases and validate how the application reacts to the input data.
  • The appropriate error or warning messages are displayed when invalid data is used.  
  • Each business rule is properly applied.  
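A minimal functional-test sketch (Python, pytest style) pairing valid and invalid inputs with expected outcomes might look like this; validate_quantity is a hypothetical business rule, not one defined in this article:

    import pytest

    def validate_quantity(qty):
        """Hypothetical business rule: quantity must be an int from 1 to 100."""
        if not isinstance(qty, int) or not 1 <= qty <= 100:
            raise ValueError("quantity must be between 1 and 100")
        return qty

    @pytest.mark.parametrize("qty", [1, 50, 100])           # valid data, corner cases
    def test_valid_quantity(qty):
        assert validate_quantity(qty) == qty

    @pytest.mark.parametrize("qty", [0, 101, "ten", None])  # invalid data
    def test_invalid_quantity_raises(qty):
        with pytest.raises(ValueError):
            validate_quantity(qty)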

Various testing techniques can be applied when performing functional testing, viz.:

  • Unit testing
  • Smoke testing
  • Sanity testing
  • Integration testing
  • White-box testing
  • Black-box testing
  • User acceptance testing
  • Regression testing

Success Criteria: 

  • All planned tests have been executed using the various testing techniques.
  • All identified defects have been logged and reported.

Data Integrity Testing 

Test Objective: 

To ensure the accuracy and consistency of the database, as well as of its access methods and processes, and that it functions properly without the data getting corrupted. Data integrity testing should verify that the data in the database is accurate and functions in line with the application.

Technique:

  • Invoke each database access method and process, using each with valid as well as invalid data (or requests for data).
  • Inspect the database to ensure the data has been populated as planned, check that all database events occurred properly, and verify the returned data to ensure that the correct data was retrieved (a sketch follows).
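For illustration, a minimal data-integrity check using Python's built-in sqlite3 module could look like the following; the accounts table and its invariants are hypothetical:

    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE accounts (id INTEGER PRIMARY KEY, balance REAL NOT NULL)")
    conn.execute("INSERT INTO accounts (id, balance) VALUES (1, 100.0), (2, 250.5)")
    conn.commit()

    # Valid request: verify the correct data is returned.
    row = conn.execute("SELECT balance FROM accounts WHERE id = 1").fetchone()
    assert row == (100.0,)

    # Invalid request: a nonexistent key should return no rows, not bad data.
    assert conn.execute("SELECT * FROM accounts WHERE id = 999").fetchone() is None

    # Inspect the database: no record should violate the business invariant.
    bad = conn.execute("SELECT COUNT(*) FROM accounts WHERE balance < 0").fetchone()[0]
    assert bad == 0, "negative balances indicate corrupted data"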

Completion Criteria:

All database access methods and processes function as designed and without any data corruption.

Special Considerations: 

  • A DBMS development environment or driver may be required for entering or modifying data directly in the databases.
  • Processes should be invoked manually. A small sample database (a limited number of records) can be used to increase the visibility of any non-acceptable events.

Security and Access Control Testing

Test Objective: 

Application-level security: It is to verify that an actor can access only the functions or data for which his or her user type has been granted permission.

System-level security: It is to verify that only those actors who have been granted access to the system and application(s) are able to access them.

Technique:

Application-level: Identify and list each actor type and the functions/data each type has permissions for. Test the authentication and authorization for each role and validate whether the user can access only the designated part of the application. 

Test the admin login for authorization and for granting access to the list of users. Create tests for each actor type to verify each permission by setting up transactions specific to each actor. Then modify the user type and re-run the tests for the same users. In each case, verify whether the additional functions or data are correctly available or denied. (A minimal sketch follows.)
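A minimal role-based access-control test sketch (Python, pytest style) is shown below; the can_access function and the role/permission map are hypothetical stand-ins for the application's authorization layer:

    import pytest

    PERMISSIONS = {
        "admin":  {"view_reports", "edit_users", "delete_records"},
        "editor": {"view_reports", "edit_users"},
        "viewer": {"view_reports"},
    }

    def can_access(role, action):
        """Hypothetical authorization check for an actor type."""
        return action in PERMISSIONS.get(role, set())

    @pytest.mark.parametrize("role,action,allowed", [
        ("admin",  "delete_records", True),
        ("editor", "delete_records", False),   # permission correctly denied
        ("viewer", "view_reports",   True),
        ("guest",  "view_reports",   False),   # unknown actor type gets nothing
    ])
    def test_role_permissions(role, action, allowed):
        assert can_access(role, action) is allowed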

Completion Criteria:

This is to verify that the appropriate functions or data are available for each identified actor type, and that all the transaction functions run as expected.

Special Considerations:

This is to verify that the access to the system is reviewed and communicated to the appropriate network or systems administrator. This testing may not be required as it may be a function of network or systems administration.

Usability Testing

Test Objective: 

To verify that the application design fully integrates with the user’s business processes, giving the user of the application a smooth and seamless flow through the application while performing their job. The main focus of usability testing should be on ease of using the application and whether it fulfills the end-users’ requirements. 

Technique:

  • One-on-one interaction of a business user with a usability engineer.  
  • Encouraging user input as they work with the application.  
  • Recording a user’s nonverbal activity while using the application. 
  • Always questioning the business user for feedback on navigation, screen design, screen content, etc.
  • Recording the user experience and non-functional aspects such as ease of use, performance of the application, etc.  

Completion Criteria: 

  • The application is complete and intuitive to the user.  
  • System navigation is consistent with the business user’s workflow.  
  • Training is minimized due to a good intuitive design.

Failover and Recovery Testing

Test Objective:

To verify that recovery processes (manual or automated) properly restore the database, applications, and system to a desired or known state. The following types of conditions are to be included in the testing:

  • Power interruption to the client  
  • Power interruption to the server  
  • Communication interruption via network server(s)  
  • Interruption, communication, or power loss to DASD and/or DASD controller(s)
  • Incomplete cycles (interruption of data filter and data synchronization processes)
  • Invalid database pointers or keys; invalid or corrupted data elements in the databases

Technique:

Tests created for function and business cycle testing should be used to create a series of transactions. Once the desired starting test point is reached, the following actions should be performed or simulated individually:

  • Power interruption to the client: Power the PC down  
  • Power interruption to the server: Simulate or initiate power down procedures for the server  

Interruption via network servers: Simulate or initiate communication loss with the network (physically disconnect communication wires, or power down network server(s) or routers).

Interruption, communication, or power loss to DASD and/or DASD controller(s):

  • Simulate or physically eliminate communication with one or more DASD controllers or devices.
  • Once the above conditions (or simulated conditions) are achieved, additional transactions should be executed, and upon reaching this second test-point state, recovery procedures should be invoked.
  • Testing for incomplete cycles utilizes the same technique described above, except that the database processes themselves should be aborted or prematurely terminated.
  • Testing for the remaining conditions requires that a known database state be achieved. Several database fields, pointers, and keys should be corrupted manually and directly within the database (via database tools). Database checkpoints should be added; when the failure occurs, it should be validated that the transaction has been rolled back to the checkpoint (a minimal sketch of this follows). Additional transactions should then be executed, using the tests from application function and business cycle testing, and full cycles executed.
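As a scaled-down illustration of the checkpoint/rollback validation above, the following Python sketch uses sqlite3 transactions; the orders table and the simulated failure are hypothetical:

    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, status TEXT)")
    conn.execute("INSERT INTO orders VALUES (1, 'complete')")
    conn.commit()  # checkpoint: known good state

    try:
        conn.execute("INSERT INTO orders VALUES (2, 'pending')")
        raise RuntimeError("simulated power loss mid-transaction")
    except RuntimeError:
        conn.rollback()  # recovery procedure: roll back to the checkpoint

    # Verify the database returned to the known state: order 2 must not exist.
    assert conn.execute("SELECT COUNT(*) FROM orders").fetchone()[0] == 1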

Completion Criteria: 

In all of the cases above, the application, database, and system should, upon completion of the recovery procedures, return to a known or desirable state. This state includes data corruption limited to the known corrupted fields, pointers, or keys, and reports indicating the processes or transactions that were not completed due to the interruptions.

Special Considerations:

Recovery testing is highly intrusive. Procedures to disconnect cabling (simulating power or communication loss) may not be desirable or feasible. Alternative methods, such as diagnostic software tools, may be required. Resources from the systems (or computer operations), database, and networking groups are required. These tests should either be run after hours or on an isolated machine(s).

Performance Profiling

Test Objective: 

To verify performance behaviors for designated transactions or business functions under the following conditions: 

  • Normal anticipated workload
  • Anticipated worst-case workload

Technique:

Use test procedures developed for function or business cycle testing. 

  • Modify data files (to increase the number of transactions) or the scripts to increase the number of iterations.  
  • Scripts should be run on one machine (best case to benchmark single user or single transaction) and be repeated with multiple clients (virtual or actual, see special considerations below).  

Note the turnaround time for each transaction and validate it against the expected result (a timing sketch follows).
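A minimal turnaround-time check in Python might look like this; process_transaction and the 0.5-second budget are hypothetical placeholders:

    import time

    def process_transaction():
        """Hypothetical transaction under test."""
        time.sleep(0.1)  # stand-in for real work

    def test_single_transaction_turnaround():
        start = time.perf_counter()
        process_transaction()
        elapsed = time.perf_counter() - start
        # Validate the measured turnaround time against the expected budget.
        assert elapsed < 0.5, f"transaction took {elapsed:.3f}s, budget is 0.5s"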

Completion Criteria:

Single transaction/single user: Successful completion of the test scripts without any failures and within the expected/required time allocation (per transaction) 

Multiple transactions / multiple users:  Successful completion of the test scripts without any failures and within acceptable time allocation. 

Special Considerations: Comprehensive performance testing includes having a “background” workload on the server.

There are several methods that can be used to perform this, including:

  • “Drive transactions” directly to the server, usually in the form of SQL calls.
  • Create a “virtual” user load to simulate many (usually several hundred) clients. Remote terminal emulation tools are used to accomplish this load. This technique can also be used to load the network with “traffic.”
  • Use multiple physical clients, each running test scripts to place a load on the system.   
  • Performance testing should be performed on a dedicated machine or at a dedicated time. This permits full control and accurate measurement. 
  • The databases used for performance testing should be either actual size or scaled equally. 

Application-specific testing: 

The online banking application must be tested under the following conditions in order to certify performance response times: 

  • Connected to the network  
  • Not connected to the network (connected directly to the server)  
  • If response times are in question, this testing will determine whether the problem is in the application code or on the network. The “network” test will be conducted in a remote planner’s office (e.g., New York), while the “not connected to the network” test will be conducted at the location of the host server.
  • The performance metrics will be tested as per the requirements. These metrics will include transaction rates, uptime/downtime, screen refresh rate, etc.

The online banking application must also be tested both online and offline. Using a PC configured with the minimum acceptable hardware requirements (as defined in the online banking Release One business requirements document), the application will be tested both online and offline to ensure response times comply with the business requirements.

Load Testing

Test Objective: 

To verify performance behavior (response times) for designated transactions or business cases under varying workload conditions.

Technique:

Use tests developed for function or business cycle testing. 

Modify data files (to increase the number of transactions) or the tests to increase the number of times each transaction occurs.

Note the maximum load/transactions the system can handle, and test it to validate the breakpoint/failure point (a ramp-up sketch follows).
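The ramp-up idea can be sketched in Python with the standard-library concurrent.futures module; the transaction function, workload levels, and breakpoint criterion are all hypothetical:

    import time
    from concurrent.futures import ThreadPoolExecutor

    def transaction():
        time.sleep(0.05)  # stand-in for one business transaction
        return True

    def run_load(users):
        """Run `users` concurrent transactions; return total wall-clock time."""
        start = time.perf_counter()
        with ThreadPoolExecutor(max_workers=users) as pool:
            results = list(pool.map(lambda _: transaction(), range(users)))
        assert all(results)
        return time.perf_counter() - start

    for users in (1, 10, 50, 100):   # increasing workload levels
        elapsed = run_load(users)
        print(f"{users:>3} users -> {elapsed:.3f}s")
        if elapsed > 2.0:            # hypothetical breakpoint criterion
            print("breakpoint reached at", users, "users")
            break

In a real load test, a dedicated tool (e.g., JMeter or Locust) would replace this loop, but the ramp-and-measure structure is the same.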

Completion Criteria:

Multiple transactions/multiple users:  Successful completion of the tests without any failures and within an acceptable time allocation. 

Special Considerations:

Load testing should be performed on a dedicated machine or at a dedicated time. This permits full control and accurate measurement. The databases used for load testing should be either actual size or scaled equally.

Stress Testing

Test Objective: 

Verify that the target-of-test functions properly and without error under the following stress conditions: 

  • Little or no memory available on the server (RAM and DASD)  
  • More than the designated number of users (actual or physically capable) connected to the system (or simulated)  
  • Multiple users performing the same transactions against the same data/account 
  • Worst-case transaction volume/mix (see performance testing above).  

Note: The goal of a stress test might also be stated as: identify and document the conditions under which the system fails to continue functioning properly.

Stress testing of the client is described under the section on configuration testing. 

Technique:

Use tests developed for performance profiling or load testing. 

To test limited resources, tests should be run on a single machine, and RAM and DASD on the server should be reduced (or limited).

For the remaining stress tests, multiple clients should be used, either running the same tests or running complementary tests, to produce the worst-case transaction volume/mix (a concurrency sketch follows).
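One of the stress conditions above, multiple users performing the same transactions against the same data/account, can be sketched with standard-library threading; the Account class is a hypothetical shared record, and the test checks that simultaneous updates do not corrupt it:

    import threading

    class Account:
        def __init__(self, balance):
            self.balance = balance
            self._lock = threading.Lock()
        def deposit(self, amount):
            with self._lock:  # synchronization under test
                self.balance += amount

    account = Account(0)
    threads = [threading.Thread(target=lambda: [account.deposit(1) for _ in range(1000)])
               for _ in range(20)]
    for t in threads: t.start()
    for t in threads: t.join()

    # 20 clients x 1000 deposits of 1 should total exactly 20000.
    assert account.balance == 20_000, f"data corrupted: {account.balance}"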

Completion Criteria:

All planned tests are executed, and specified system limits are reached or exceeded without the software or hardware failing (or the conditions under which system failure occurs are outside of the specified conditions).

Special Considerations: 

  • Stressing the network may require network deployment of tools to load the network with messages/packets. 
  • The DASD used for the system should temporarily be reduced to restrict the available space for the database to grow. 
  • Also, test the synchronization of simultaneous clients accessing the same records or data accounts.

Volume Testing

Test Objective: 

To verify that the target-of-test successfully functions under the following high-volume scenarios: 

  • Maximum (actual or physically capable) number of clients connected (or simulated) all performing the same, worst-case (performance) business function for an extended period.  
  • Maximum database size has been reached (actual or scaled) and multiple queries or report transactions are executed simultaneously.

Technique:

Use tests developed for performance profiling or load testing. 

  • Multiple clients should be used, either running the same tests or complementary tests to produce the worst-case transaction volume/mix (see stress test above) for an extended period.  
  • A maximum database size is created (actual, scaled, or filled with representative data), and multiple clients are used to run queries/report transactions simultaneously for extended periods (a scaled-down sketch follows).
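A scaled-down Python sketch of this scenario, using an SQLite database filled with representative rows and queried by simultaneous clients, might look like the following; the row count and query are hypothetical and far below a real maximum database size:

    import sqlite3, time
    from concurrent.futures import ThreadPoolExecutor

    def query_once(_):
        # Each worker opens its own connection (sqlite3 connections are
        # not shared across threads by default).
        conn = sqlite3.connect("volume_test.db")
        count = conn.execute("SELECT COUNT(*) FROM records WHERE value > 500").fetchone()[0]
        conn.close()
        return count

    conn = sqlite3.connect("volume_test.db")
    conn.execute("DROP TABLE IF EXISTS records")
    conn.execute("CREATE TABLE records (id INTEGER PRIMARY KEY, value INTEGER)")
    conn.executemany("INSERT INTO records (value) VALUES (?)",
                     ((i % 1000,) for i in range(100_000)))  # stand-in for max size
    conn.commit()
    conn.close()

    start = time.perf_counter()
    with ThreadPoolExecutor(max_workers=10) as pool:  # simultaneous query clients
        results = list(pool.map(query_once, range(100)))
    print(f"100 queries over 100k rows in {time.perf_counter() - start:.2f}s")
    assert len(set(results)) == 1  # every query returned consistent data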

Completion Criteria:

All planned tests have been executed, and specified system limits are reached or exceeded without the software or hardware failing.

Special Considerations:

What is a time period that would be considered acceptable for high volume conditions (as noted above)?

Configuration Testing

Test Objective:

To verify that the target-of-test functions properly on the required hardware/software configurations. 

Technique: 

Use function test scripts. 

Open and close various non-target-of-test software, such as Microsoft Excel and Word, either as part of the test or prior to the start of the test.

Execute selected transactions to simulate actors interacting with the target-of-test and the non-target-of-test software.

Repeat the above process, minimizing the available conventional memory to the client.  

Completion Criteria:

For each combination of the target-of-test and non-target-of-test software, all transactions are successfully completed without failure. 

Special Considerations:

  • What non-target-of-test software is available or accessible on the desktop? 
  • What are the typical applications used? 
  • What data are the applications working with (e.g., a large spreadsheet opened in Excel or a 100-page document in Word)?

The entire system configuration (NetWare, network servers, databases, etc.) should also be documented as part of this test.

Installation Testing

Test Objective:

Verify that the target-of-test correctly installs onto each required hardware configuration, under the following conditions (as required): 

  • New installation: a new machine, never previously installed with [software].
  • Update: a machine previously installed with the same version of [software].
  • Update: a machine previously installed with an older version of [software].

Technique: 

Manually, or by developing automated scripts, validate the condition of the target machine: [software] never installed, the same version of [software] already installed, or an older version already installed (a pre-install state-check sketch follows).

  • Launch or perform the installation.
  • Use a predetermined subset of the function test scripts to run the transactions.
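The pre-install condition check could be sketched in Python with importlib.metadata (standard library, Python 3.8+); the package name, candidate version, and the assumption of purely numeric version parts are all hypothetical:

    from importlib.metadata import PackageNotFoundError, version

    TARGET = "software"      # hypothetical package name for the product under test
    NEW_VERSION = (2, 1, 0)  # hypothetical version about to be installed

    try:
        installed = tuple(int(part) for part in version(TARGET).split("."))
    except PackageNotFoundError:
        print("new installation: [software] was never installed on this machine")
    else:
        if installed == NEW_VERSION:
            print("update over the same version:", installed)
        elif installed < NEW_VERSION:
            print("update from an older version:", installed)
        else:
            print("installed version is newer than the candidate:", installed)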

Completion Criteria: 

The [software] transactions execute successfully without failure.

Special Considerations:

What [software] transactions should be selected to comprise a confidence test, so that the [software] application can be confirmed to have installed successfully without missing any major software components?

Business Cycle Testing

Test Objective:

Ensure proper target-of-test and background processes function according to required business models and schedules. 

Technique: Testing will simulate several business cycles by performing the following: 

The tests used for the target-of-test’s function testing will be modified or enhanced to increase the number of times each function is executed to simulate several different users over a specified period. 

  • All time- or date-sensitive functions will be executed using valid and invalid dates or time periods.
  • All functions that occur on a periodic schedule will be executed or launched at the appropriate time.
  • Testing will include using valid and invalid data to verify the following (a date-handling sketch follows the list):
    • The expected results occur when valid data is used.
    • The appropriate error or warning messages are displayed when invalid data is used.
    • Each business rule is correctly applied.
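Date-sensitive checks like those above can be made deterministic by injecting the date, as in this Python (pytest style) sketch; is_billing_day is a hypothetical periodic business rule:

    import datetime
    import pytest

    def is_billing_day(today):
        """Hypothetical rule: billing runs on the 1st of each month."""
        return today.day == 1

    @pytest.mark.parametrize("date,expected", [
        (datetime.date(2023, 3, 1), True),    # valid billing date
        (datetime.date(2023, 3, 2), False),   # ordinary day
        (datetime.date(2024, 2, 29), False),  # leap-day edge case
    ])
    def test_billing_schedule(date, expected):
        assert is_billing_day(date) is expected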

Completion Criteria: 

  • All planned tests have been executed.
  • All identified defects have been addressed. 

Special Considerations: 

  • System dates and events may require special support activities.
  • A business model is required to identify appropriate test requirements and procedures.

Conclusion

In this article, we had a detailed look at software testing and its types. I hope you found it useful. Feel free to share your feedback in the comments section.

Happy Learning!
