Monday, May 26, 2008

Error, Defect and Failure

Error: An error is a mistake, misconception or misunderstanding on the part of a software developer.

Defect: A defect is introduced into the software as a result of an error. It is an anomaly in the software that may cause it to behave incorrectly and not according to its specification.

Failure: A failure is the inability of a software system or component to perform its required functions within specified performance requirements.


Software testing

Software testing is a planned and scheduled activity which helps in the delivery of quality products.


Test Bed

A Test Bed is an environment that contains all the hardware and software needed to test a software component or a software system.


Black box testing Methods



1.Graph-based Testing Methods


Black-box methods based on the nature of the relationships (links) among the program objects (nodes); test cases are designed to traverse the entire graph


Transaction flow testing (nodes represent steps in some transaction
and links represent logical connections between steps that need to be
validated)


Finite state modeling (nodes represent user observable states of the
software and links represent transitions between states)


Data flow modeling (nodes are data objects and links are
transformations from one data object to another)


Timing modeling (nodes are program objects and links are sequential
connections between these objects, link weights are required
execution times)
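
To make finite state modeling concrete, here is a minimal sketch in Python. The login states and events are hypothetical, invented only to show how deriving one test case per link traverses the entire graph.

```python
# Hypothetical finite state model: nodes are user-observable states,
# links are the transitions between them.
STATE_MODEL = {
    "LoggedOut": {"submit_valid": "LoggedIn", "submit_invalid": "Error"},
    "Error":     {"retry": "LoggedOut"},
    "LoggedIn":  {"logout": "LoggedOut"},
}

def transition_test_cases(model):
    """Derive one test case per link so the entire graph is traversed."""
    for state, events in model.items():
        for event, expected in events.items():
            yield (state, event, expected)

for start, event, expected in transition_test_cases(STATE_MODEL):
    print(f"From {start!r}, event {event!r} should reach {expected!r}")
```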

2.Equivalence Partitioning


Black-box technique that divides the input domain into classes of
data from which test cases can be derived


An ideal test case uncovers a class of errors that might require many
arbitrary test cases to be executed before a general error is
observed


Equivalence class guidelines:


If an input condition specifies a range, one valid and two invalid equivalence classes are defined


If an input condition requires a specific value, one valid and
two invalid equivalence classes are defined


If an input condition specifies a member of a set, one valid
and one invalid equivalence class is defined


If an input condition is Boolean, one valid and one invalid
equivalence class is defined
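
A minimal pytest-style sketch of the range guideline above. The accept_age function and its 18-99 range are hypothetical; the point is that one representative value per equivalence class stands in for many arbitrary test cases.

```python
import pytest  # assumes pytest is available

def accept_age(age: int) -> bool:
    """Hypothetical system under test: accept ages 18 to 99 inclusive."""
    return 18 <= age <= 99

# A range yields one valid and two invalid equivalence classes;
# one representative value per class is enough.
@pytest.mark.parametrize("age, expected", [
    (35, True),    # valid class: inside the range
    (10, False),   # invalid class: below the range
    (120, False),  # invalid class: above the range
])
def test_age_equivalence_classes(age, expected):
    assert accept_age(age) == expected
```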

3.Boundary Value Analysis


Black-box technique that focuses on the boundaries of the input
domain rather than its center


BVA guidelines:


If an input condition specifies a range bounded by values a and b, test cases should include a and b, and values just above and just below a and b


If an input condition specifies a number of values, test cases should exercise the minimum and maximum numbers, as well as values just above and just below the minimum and maximum values


Apply guidelines 1 and 2 to output conditions; test cases should be designed to produce the minimum and maximum output reports


If internal program data structures have boundaries (e.g. size
limitations), be certain to test the boundaries
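
A minimal sketch of the first BVA guideline, assuming a hypothetical range bounded by values a and b:

```python
def bva_values(a: int, b: int) -> list:
    """Return BVA inputs for a range [a, b]: the bounds themselves
    plus values just above and just below each bound."""
    return [a - 1, a, a + 1, b - 1, b, b + 1]

# E.g. for a hypothetical 18..99 age range:
print(bva_values(18, 99))  # [17, 18, 19, 98, 99, 100]
```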

4.Comparison Testing


Black-box testing for safety critical systems in which independently
developed implementations of redundant systems are tested for
conformance to specifications


Often equivalence class partitioning is used to develop a common set
of test cases for each implementation
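
A minimal sketch of the idea, assuming two hypothetical, independently developed square-root implementations run against one common set of inputs (one value per equivalence class):

```python
import math

def sqrt_impl_a(x: float) -> float:
    """First (hypothetical) independent implementation."""
    return x ** 0.5

def sqrt_impl_b(x: float) -> float:
    """Second (hypothetical) independent implementation."""
    return math.sqrt(x)

common_cases = [0.0, 1.0, 2.0, 1e6]  # common test set for both versions
for x in common_cases:
    a, b = sqrt_impl_a(x), sqrt_impl_b(x)
    # Both implementations must conform to the same specification.
    assert abs(a - b) < 1e-9, f"implementations disagree at {x}: {a} vs {b}"
print("implementations agree on all common test cases")
```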

5.Orthogonal Array Testing


Black-box technique that enables the design of a reasonably small set
of test cases that provide maximum test coverage


Focus is on categories of faulty logic likely to be present in the
software component (without examining the code)


Priorities for assessing tests using an orthogonal array


Detect and isolate all single mode faults


Detect all double mode faults


Multimode faults
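
As a sketch of why a small orthogonal array gives broad coverage, the standard L4(2^3) array below exercises every pairwise combination of three two-level factors in only 4 runs instead of 8 exhaustive ones; the factor names are hypothetical.

```python
from itertools import combinations, product

# The standard L4(2^3) orthogonal array: 4 runs, 3 two-level factors.
L4 = [(0, 0, 0), (0, 1, 1), (1, 0, 1), (1, 1, 0)]
factors = ["browser", "os", "locale"]  # hypothetical factor names

# Every pair of columns exhibits all four level combinations, which is
# what allows single and double mode faults to be detected and isolated.
for c1, c2 in combinations(range(len(factors)), 2):
    pairs = {(row[c1], row[c2]) for row in L4}
    assert pairs == set(product([0, 1], repeat=2))
print("all pairwise combinations covered in", len(L4), "runs")
```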

6.Specialized Testing


Graphical user interfaces


Client/server architectures


Documentation and help facilities


Real-time systems


Task testing (test each time dependent task independently)


Behavioral testing (simulate system response to external
events)


Intertask testing (check communications errors among tasks)


System testing (check interaction of integrated system software
and hardware)

7.Advantages of Black Box Testing


More effective on larger units of code than glass box testing


Tester needs no knowledge of implementation, including specific
programming languages


Tester and programmer are independent of each other


Tests are done from a user's point of view


Will help to expose any ambiguities or inconsistencies in the
specifications


Test cases can be designed as soon as the specifications are complete

8.Disadvantages of Black Box Testing


Only a small number of possible inputs can actually be tested; to test every possible input stream would take nearly forever


Without clear and concise specifications, test cases are hard to
design


There may be unnecessary repetition of test inputs if the tester is
not informed of test cases the programmer has already tried


May leave many program paths untested


Cannot be directed toward specific segments of code which may be very
complex (and therefore more error prone)


Most testing related research has been directed toward glass box
testing


Screen Validation Checklist

Aesthetic Conditions:

Is the general screen background the correct color?

Are the field prompts the correct color?

Are the field backgrounds the correct color?

In read-only mode, are the field prompts the correct color?

In read-only mode, are the field backgrounds the correct color?

Are all the screen prompts specified in the correct screen font?

Is the text in all fields specified in the correct screen font?

Are all the field prompts aligned perfectly on the screen?

Are all the field edit boxes aligned perfectly on the screen?

Are all group boxes aligned correctly on the screen?

Should the screen be resizable?

Should the screen be allowed to minimize?

Are all the field prompts spelt correctly?

Are all character or alphanumeric fields left justified? This is the default unless otherwise specified.

Are all numeric fields right justified? This is the default unless otherwise specified.

Is all the micro-help text spelt correctly on this screen?

Is all the error message text spelt correctly on this screen?

Is all user input captured in UPPER case or lowercase consistently?

Assure that all windows have a consistent look and feel.

Assure that all dialog boxes have a consistent look and feel.


Validation Conditions:

Does a failure of validation on every field cause a sensible user error message?

Is the user required to fix entries which have failed validation tests?

Have any fields got multiple validation rules and, if so, are all rules being applied?

If the user enters an invalid value and clicks on the OK button (i.e. does not TAB off the field), is the invalid entry identified and highlighted correctly with an error message?

Is validation consistently applied at screen level unless specifically required at field level?

For all numeric fields, check whether negative numbers can and should be able to be entered.

For all numeric fields, check the minimum and maximum values and also some mid-range allowable values.

For all character/alphanumeric fields, check the field to ensure that there is a character limit specified and that this limit is exactly correct for the specified database size.

Do all mandatory fields require user input?

If any of the database columns don't allow null values then the corresponding screen fields must be mandatory. (If any field which initially was mandatory has become optional, check whether null values are allowed in this field.)


Navigation Conditions:

Can the screen be accessed correctly from the menu?

Can the screen be accessed correctly from the toolbar?

Can the screen be accessed correctly by double clicking on a list control on the previous screen?

Can all screens accessible via buttons on this screen be accessed correctly?

Can all screens accessible by double clicking on a list control be accessed correctly?

Is the screen modal? (i.e. is the user prevented from accessing other functions when this screen is active, and is this correct?)

Can a number of instances of this screen be opened at the same time, and is this correct?


Usability Conditions:

Are all the dropdowns on this screen sorted correctly? Alphabetic sorting is the default unless otherwise specified.

Is all date entry required in the correct format?

Have all pushbuttons on the screen been given appropriate shortcut keys?

Do the shortcut keys work correctly?

Have the menu options that apply to your screen got fast keys associated, and should they have?

Does the tab order specified on the screen go in sequence from top left to bottom right? This is the default unless otherwise specified.

Are all read-only fields avoided in the TAB sequence?

Are all disabled fields avoided in the TAB sequence?

Can the cursor be placed in the microhelp text box by clicking on the text box with the mouse?

Can the cursor be placed in read-only fields by clicking in the field with the mouse?

Is the cursor positioned in the first input field or control when the screen is opened?

Is there a default button specified on the screen?

Does the default button work correctly?

When an error message occurs, does the focus return to the field in error when the user cancels it?

When the user Alt+Tabs to another application, does this have any impact on the screen upon return to the application?

Do all the field edit boxes indicate the number of characters they will hold by their length? E.g. a 30 character field should be a lot longer.


Data Integrity Conditions:

Is the data saved when the window is closed by double clicking on the close box?

Check the maximum field lengths to ensure that there are no truncated characters.

Where the database requires a value (other than null), this should be defaulted into fields. The user must either enter an alternative valid value or leave the default value intact.

Check maximum and minimum field values for numeric fields.

If numeric fields accept negative values, can these be stored correctly on the database, and does it make sense for the field to accept negative numbers?

If a set of radio buttons represents a fixed set of values such as A, B and C, then what happens if a blank value is retrieved from the database? (In some situations rows can be created on the database by other functions which are not screen based, and thus the required initial values can be incorrect.)

If a particular set of data is saved to the database, check that each value gets saved fully to the database, i.e. beware of truncation (of strings) and rounding of numeric values.


Modes (Editable/Read-only) Conditions:

Are the screen and field colors adjusted correctly for read-only mode?

Should a read-only mode be provided for this screen?

Are all fields and controls disabled in read-only mode?

Can the screen be accessed from the previous screen/menu/toolbar in read-only mode?

Can all screens available from this screen be accessed in read-only mode?

Check that no validation is performed in read-only mode.


General Conditions:

Assure the existence of the "Help" menu.

Assure that the proper commands and options are in each menu.

Assure that all buttons on all tool bars have a corresponding key command.

Assure that each menu command has an alternative (hot-key) key sequence which will invoke it where appropriate.

In drop down list boxes, ensure that the names are not abbreviations / cut short.

In drop down list boxes, assure that the list and each entry in the list can be accessed via appropriate key / hot key combinations.

Ensure that duplicate hot keys do not exist on each screen.

Ensure the proper usage of the escape key (which is to undo any changes that have been made) and that it generates a caution message "Changes will be lost - Continue yes/no".

Assure that the cancel button functions the same as the escape key.

Assure that the Cancel button operates as a Close button when changes have been made that cannot be undone.

Assure that only command buttons which are used by a particular window, or in a particular dialog box, are present (i.e. make sure they don't work on the screen behind the current screen).

When a command button is used sometimes and not at other times, assure that it is grayed out when it should not be used.

Assure that OK and Cancel buttons are grouped separately from other command buttons.

Assure that command button names are not abbreviations.

Assure that all field labels/names are not technical labels, but rather are names meaningful to system users.

Assure that command buttons are all of similar size and shape, and the same font and font size.

Assure that each command button can be accessed via a hot key combination.

Assure that command buttons in the same window/dialog box do not have duplicate hot keys.

Assure that each window/dialog box has a clearly marked default value (command button, or other object) which is invoked when the Enter key is pressed - and NOT the Cancel or Close button.

Assure that focus is set to an object/button which makes sense according to the function of the window/dialog box.

Assure that all option button (and radio button) names are not abbreviations.

Assure that option button names are not technical labels, but rather are names meaningful to system users.

If hot keys are used to access option buttons, assure that duplicate hot keys do not exist in the same window/dialog box.

Assure that option box names are not abbreviations.

Assure that option boxes, option buttons, and command buttons are logically grouped together in clearly demarcated areas ("Group Box").

Assure that the Tab key sequence which traverses the screens does so in a logical way.

Assure consistency of mouse actions across windows.

Assure that the color red is not used to highlight active objects (many individuals are red-green color blind).

Assure that the user will have control of the desktop with respect to general color and highlighting (the application should not dictate the desktop background characteristics).

Assure that the screen/window does not have a cluttered appearance.

Ctrl+F6 opens the next tab within a tabbed window.

Shift+Ctrl+F6 opens the previous tab within a tabbed window.

Tabbing will open the next tab within a tabbed window if on the last field of the current tab.

Tabbing will go onto the 'Continue' button if on the last field of the last tab within a tabbed window.

Tabbing will go onto the next editable field in the window.

Banner style, size and display are exactly the same as in existing windows.

If there are 8 or fewer options in a list box, display all options on open of the list box - there should be no need to scroll.

Errors on continue will cause the user to be returned to the tab, and the focus should be on the field causing the error (i.e. the tab is opened, highlighting the field with the error on it).

Pressing continue while on the first tab of a tabbed window (assuming all fields are filled correctly) will not open all the tabs.

On open of a tab, focus will be on the first editable field.

All fonts are to be the same.

Alt+F4 will close the tabbed window and return you to the main screen or previous screen (as appropriate), generating a "changes will be lost" message if necessary.

Microhelp text exists for every enabled field and button.

Ensure all fields are disabled in read-only mode.

Progress messages are shown on load of tabbed screens.

Return operates continue.

If the retrieve on load of a tabbed window fails, the window should not open.

Specific Field Validation Checklist

Date Field Checks

Assure that leap years are validated correctly & do not cause
errors/miscalculations.


Assure that month codes 00 and 13 are validated correctly & do not
cause errors/miscalculations.


Assure that 00 and 13 are reported as errors.


Assure that day values 00 and 32 are validated correctly & do not
cause errors/miscalculations.


Assure that Feb. 28, 29, 30 are validated correctly & do not cause
errors/miscalculations.


Assure that Feb. 30 is reported as an error.


Assure that century change is validated correctly & does not cause
errors/miscalculations.


Assure that out of cycle dates are validated correctly & do not cause
errors/miscalculations.
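
A minimal sketch of these date checks in Python; it leans on the standard datetime module, which rejects impossible dates, so a validator built on it reports them as errors.

```python
from datetime import date

def is_valid_date(year: int, month: int, day: int) -> bool:
    """Report whether the given components form a real calendar date."""
    try:
        date(year, month, day)
        return True
    except ValueError:
        return False

assert is_valid_date(2008, 2, 29)       # leap year: Feb. 29 is valid
assert not is_valid_date(2007, 2, 29)   # non-leap year: Feb. 29 is an error
assert not is_valid_date(2008, 0, 15)   # month code 00 reported as an error
assert not is_valid_date(2008, 13, 15)  # month code 13 reported as an error
assert not is_valid_date(2008, 1, 0)    # day value 00 reported as an error
assert not is_valid_date(2008, 1, 32)   # day value 32 reported as an error
assert not is_valid_date(2008, 2, 30)   # Feb. 30 reported as an error
assert is_valid_date(2000, 2, 29)       # century change: 2000 is a leap year
```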

Numeric Fields


Assure that lowest and highest values are handled correctly.


Assure that invalid values are logged and reported.


Assure that valid values are handled by the correct procedure.


Assure that numeric fields with a blank in position 1 are processed
or reported as an error.


Assure that fields with a blank in the last position are processed or
reported as an error.


Assure that both + and - values are correctly processed.


Assure that division by zero does not occur.


Include value zero in all calculations.


Include at least one in-range value.


Include maximum and minimum range values.


Include out of range values above the maximum and below the minimum.


Assure that upper and lower values in ranges are handled correctly.
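
A minimal sketch covering several of the numeric checks above: range boundaries, the value zero, signed values, and a division-by-zero guard. The -100..100 field range is hypothetical.

```python
FIELD_MIN, FIELD_MAX = -100, 100  # hypothetical field range

def validate_numeric(value: float) -> bool:
    """Accept values within the field's allowed range."""
    return FIELD_MIN <= value <= FIELD_MAX

# Minimum, maximum, zero, and in-range + and - values are accepted.
for v in (FIELD_MIN, -1, 0, 1, FIELD_MAX):
    assert validate_numeric(v)

# Out-of-range values above the maximum and below the minimum are rejected.
for v in (FIELD_MIN - 1, FIELD_MAX + 1):
    assert not validate_numeric(v)

def safe_ratio(numerator: float, divisor: float) -> float:
    """Assure that division by zero does not occur."""
    if divisor == 0:
        raise ValueError("divisor must be non-zero")
    return numerator / divisor
```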

Alpha Field Checks


Use blank and non-blank data.


Include lowest and highest values.


Include invalid characters & symbols.


Include valid characters.


Include data items with first position blank.


Include data items with last position blank.
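
A minimal sketch of the alpha field checks; the 10-character limit and the letters-plus-spaces rule are hypothetical stand-ins for whatever the specification demands.

```python
import re

def validate_alpha(value: str) -> bool:
    """Accept 1-10 letters/spaces, but reject all-blank input."""
    return bool(re.fullmatch(r"[A-Za-z ]{1,10}", value)) and value.strip() != ""

assert validate_alpha("Smith")       # valid characters
assert not validate_alpha("")        # blank data
assert not validate_alpha("   ")     # all-blank data
assert not validate_alpha("Sm1th$")  # invalid characters & symbols
assert not validate_alpha("A" * 11)  # above the character limit
assert validate_alpha(" Smith")      # data item with first position blank
assert validate_alpha("Smith ")      # data item with last position blank
```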

Testing process and the Software Testing Life Cycle


Every testing project has to follow the waterfall model of the testing
process.
The waterfall model is as given below:
1.Test Strategy & Planning

2.Test Design

3.Test Environment setup

4.Test Execution

5.Defect Analysis & Tracking

6.Final Reporting

Depending on the respective project, the scope of testing can be tailored,
but the process mentioned above is common to any testing activity.


Testing Techniques


Black-Box testing technique:
This technique is used for testing based solely on analysis of requirements (specification, user documentation, etc.). Also known as functional testing.

White-Box testing technique:
This technique is used for testing based on analysis of internal logic (design, code, etc.), but expected results still come from the requirements. Also known as structural testing.


Steps to start Testing



Before starting testing we need to follow these steps:

1) Get requirements.
2) Make a test plan.
3) Write test cases.
4) Build a test bed.
5) Get all required software tools.
6) Divide work between team members.
7) Finally, start testing.
8) Execute test cases.
9) Report bugs.
10) Retest fixed bugs.
11) Generate a test report.




Test Plan

Brief introduction to the test plan


A test plan can be defined as a document that describes the scope, approach,
resources and schedule of intended test activities. The main purpose of
preparing a test plan is that everyone concerned with the project is
synchronized with regard to the scope, deliverables, deadlines and
responsibilities for the project.

Test planning can and should occur at several levels. The first plan to
consider is the Master Test Plan. The purpose of the Master Test Plan is to
consider testing at all levels (unit, integration, system, acceptance,
beta, etc.). The Master Test Plan is to testing what the Project Plan is to
the entire development/testing effort.

General contents of a test plan:


Purpose:


This section should contain the purpose of preparing the test plan.


Scope:


This section should talk about the areas of the application which are to be
tested by the QA team, and specify those areas which are definitely out of
scope.


Test approach:


This would contain details on how the testing is to be performed and whether
any specific strategy is to be followed.


Entry criteria:


This section explains the various steps to be performed before the start of
testing, i.e. the prerequisites.

E.g. environment setup, starting the web server/application server,
successful implementation of the latest build, etc.


Resources:


This lists out the people who would be involved in the project, their
designations, etc.


Tasks and responsibilities:


This talks about the tasks to be performed and the responsibilities assigned
to the various members of the project.


Exit criteria:


This contains tasks like bringing down the system or server, restoring the
system to the pre-test environment, database refresh, etc.


Schedules/Milestones:


This section deals with the final delivery date and the various milestone
dates to be met in the course of the project.


Hardware/software requirements:


This section contains the details of the system/server required to install
the application or perform the testing, the specific s/w that needs to be
installed on the system to get the application running or to connect to the
database, connectivity related issues, etc.


Risks and mitigation process:


This section should list out all the possible risks that can arise during
the testing, and the mitigation plans that the QA team plans to implement in
case a risk actually turns into a reality.


Tools to be used:


This would list out the testing tools or utilities that are to be used in
the project.


E.g. WinRunner, QTP, Test Director, PCOM, etc.


Deliverables:


This section contains the various deliverables that are due to the client at
various points of time, i.e. daily, weekly, at the start of the project, at
the end of the project, etc. These could include test plans, test
procedures, test matrices, status reports, test scripts, etc. Templates for
all of these can also be attached.


Annexure:


This section contains the embedded documents, or links to documents, which
have been/will be used in the course of testing. E.g. templates used for
reports, test cases, etc. Reference documents can also be attached here.


Sign off:


This section contains the mutual agreement between the client and the QA
team, with both leads/managers signing off their agreement on the test plan.