
Testing Situational-Based Questions and Answers


Situational-based:

1- Tell me about yourself?

My name is XXX, and I have X years of experience. I am currently working as a Lead QA with YY company.

I am an ISTQB-certified software engineer,

handling a team of 4 members.

Roles and responsibilities:

Involved in test planning, resource planning, preparing the test bed (environment), and setting up the test process in the current project; maintaining the test artifacts (test plan, test cases, bug reports, weekly project status report, etc.).

Involved in integration, database, and system testing activities.

Involved in production deployment calls, and performing sanity checks on the production environment.

Experience with e-commerce, travel, and power domain applications.

I have a "never say no to anyone" attitude.

Looking for opportunities and assignments that will enhance my technical and personal growth.

2- What do you mean by a use case?

Use cases are high-level representations of the business logic in graphical form: workflows, activity diagrams, or business process models.

They show the relationship between the actor (whoever is going to use the system) and the system.

We can have multiple actors in one business process,

and a single actor in multiple business processes.

They show the interactions with the system / software items.

They can be used as the basis for creating test cases.

3- What is the difference between sanity and smoke testing?

Smoke: in smoke testing we validate whether the build is alive or dead. It is a test suite that covers the main functionality of a component or system, run to determine whether it works properly before planned testing begins.

Alive - the developer has unit-tested the code and the basic functionality works. With such builds we can meet the time estimates allotted for testing, which keeps the relationship between the testing and development activities healthy.

Dead - the developer has not unit-tested the code and the basic functionality does not work. Validating such builds wastes time and testing effort, and we can deviate from our testing estimates.

Smoke testing is carried out just after build deployment. Typically the deployment team, one member from the development team, and one member from the QC team are involved.

Sanity: a type of functional testing in which we validate all the business processes at a broad, high level.

It is carried out after smoke testing, mostly by the QC team.

In it we validate the issues that were observed in the last build.

It is a subset of functional testing.

Strictly speaking, no such separate test level exists in formal quality concepts.
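As an illustration of the smoke check described above, here is a minimal suite sketched in pytest (the base URL, endpoints, and the `smoke` marker are assumptions for the example, not part of any standard):

```python
# Minimal pytest smoke suite; run right after deployment with: pytest -m smoke
# (register the "smoke" marker in pytest.ini to silence the unknown-marker warning).
import urllib.request

import pytest

BASE_URL = "http://localhost:8080"  # hypothetical application under test

@pytest.mark.smoke
def test_build_is_alive():
    # A dead build fails here immediately, before planned testing begins.
    with urllib.request.urlopen(f"{BASE_URL}/health", timeout=5) as resp:
        assert resp.status == 200

@pytest.mark.smoke
def test_login_page_loads():
    with urllib.request.urlopen(f"{BASE_URL}/login", timeout=5) as resp:
        assert resp.status == 200
```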

4- What is the difference between retesting and regression testing?

Re-testing: testing that re-runs test cases that failed the last time they were run, in order to verify the success of corrective actions.

It is an activity performed after a bug fix, in which we simply confirm that the bug is fixed and its business feature is working fine.

We also call it bug verification.

Regression testing: in regression testing we run the same tests again and again, in a repetitive manner, to confirm that previously working functionality still works.

Regression testing should be performed after:

any bug fixes;

addition / removal of a feature from the current flow;

any code optimization;

any business logic change;

any third-party integration with the currently running system.

When any of the above conditions is met, we check that the rest of the business processes remain intact.

Like re-testing it is change-related, but its focus is on unintended side effects rather than on confirming a specific fix.
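A small way to see the distinction in practice, assuming a pytest-based suite: re-running only the previously failed tests is re-testing, while re-running the whole suite is regression.

```python
import pytest

# Re-testing (bug verification): after the fix, run only the tests that
# failed on the last run.
pytest.main(["--last-failed"])

# Regression: run the full suite to catch unintended side effects of the change.
pytest.main([])
```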

5- What is the difference between sanity testing and retesting?

Sanity: a type of functional testing in which we validate all the business processes at a broad, high level.

It is carried out after smoke testing, mostly by the QC team.

In it we validate the issues that were observed in the last build.

It is a subset of functional testing.

Strictly speaking, no such separate test level exists in formal quality concepts.

Re-testing: testing that re-runs test cases that failed the last time they were run, in order to verify the success of corrective actions.

It is an activity performed after a bug fix, in which we simply confirm that the bug is fixed and its business feature is working fine.

We also call it bug verification.

6- How would you handle the situation when the client has found a bug and is very angry about it, and your team mate is not in the mood to co-operate?

I keep personal and professional matters separate; in this case my first priority is to get the issue resolved and deployed to the production environment.

Once the firefighting dies down, I will do an RCA (root cause analysis) for the issue.

After analyzing the root cause, I will try to mitigate it so that we do not face the same situation again in the future.

For mitigation, I will look at:

training (technical / project-related);

any interpersonal issues running between team mates;

keeping communication between peers healthy.

If such escalations happen on a regular basis, I will escalate to my manager / senior.

7- What are your roles and responsibilities?

As Lead QA I report to the QA manager.

I keep a close eye on the testing process and controls, and try to mitigate risks before they materialize.

I am involved in system / integration testing.

8- What is your daily activity?

Before starting my day, I first check my e-mails.

Then I have a discussion with my QA manager covering:

what we did yesterday;

what is planned for today and the coming sprints;

any impediment that restricts me from proceeding, and whether there is any schedule variance;

any priority task that should be taken care of first.

Based on the discussion with the QA manager, I hold peer meetings where:

I look at the work status;

we check whether we are on the right track for the timelines;

we cover what we did yesterday and what we have planned for today.

I keep the QA manager and other stakeholders posted on the testing status and challenges,

sharing the daily status report.

I am involved in calls with business users (on an alternating basis).

At the end of the day, I try to ensure that every team member has tasks on their plate.

I update the test artifacts on a regular basis (RTM / coverage / testing status / WBS, etc.).

9- How do you start your day?

My day starts exactly as described in question 8: checking e-mails first, then the sync with my QA manager, the peer meetings, status reporting to stakeholders, and keeping the test artifacts up to date.

10- What will you do when your team members do not support each other?

TBC....

11- The client is asking for an early release and you don't have enough time to test the application. What will you do?

I will ask the client to identify the most critical points first.

Priority tasks will be taken care of first.

I will try to negotiate for more time.

I will split my team across multiple areas so that we can cover as much of the application as possible.

I will ask the team to put in some extra effort to get the work completed on time.

12- What will you do when a major functionality is missed by your team member?

The approach mirrors question 6. I keep personal and professional matters separate; the first priority is to get the missed functionality tested, fixed, and deployed to production. Once the firefighting dies down, I do an RCA for the miss, then put mitigation in place (training, resolving any interpersonal issues, healthy peer communication), and if it keeps recurring I escalate to my manager / senior.

13- What is the difference between a test case, a test scenario, and a use case?

Use cases: a sequence of transactions in a dialogue between an actor and a component or system with a tangible result, where an actor can be a user or anything that can exchange information with the system.

Use cases represent the business logic in graphical form through workflows, activity diagrams, or business process models.

They show the relationship between the actor (whoever is going to use the system) and the system.

We can have multiple actors in one business process,

and a single actor in multiple business processes.

They show the interactions with the system / software items.

They can be used as the basis for creating test cases.

Test cases: a set of input values, execution preconditions, expected results, and execution postconditions, developed for a particular objective or test condition, such as to exercise a particular program path or to verify compliance with a specific requirement.

Test cases can be written with the help of use cases.

Creating / drafting test cases is part of the test implementation and execution phase of the software testing life cycle.

Test cases are formal documents that show clear traceability to the requirements and contain expected results.

Attributes of a test case: test case ID, test name, date of creation, steps, description, test data, expected and actual results, pass/fail status.

In test cases we spell out the test conditions / scenarios.

Test scenarios: a document specifying a sequence of actions for the execution of a test.

It is also known as a test script or manual test script.
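To make the test-case attributes listed above concrete, here is a minimal sketch of one way to represent them in code (the field names are illustrative, not from any standard template):

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class TestCase:
    """One record holding the test-case attributes listed above."""
    test_case_id: str
    name: str
    created_on: date
    description: str
    steps: list[str]
    test_data: dict
    expected_result: str
    actual_result: str = ""
    status: str = "Not Run"  # Pass / Fail / Not Run

tc = TestCase(
    test_case_id="TC-101",
    name="Login with valid credentials",
    created_on=date.today(),
    description="Verify that a registered user can log in.",
    steps=["Open login page", "Enter valid credentials", "Click Login"],
    test_data={"user": "demo", "password": "secret"},
    expected_result="User lands on the dashboard",
)
```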

14- What testing techniques are you following?

The main techniques are grouped below; a short sketch of two of the black-box techniques follows the list.

Static testing techniques 

Informal Reviews

Walkthroughs

Technical Reviews

Inspection

Static Analysis

Data Flow

Control Flow

Dynamic Testing Techniques:

Structure based (White Box Testing)

Statement

Decision

Condition / Multiple condition 

Experience-based

Error Guessing 

Exploratory Testing

Specification based (Black Box testing)

Equivalence class partitioning 

Boundary value Analysis

Decision Tables 

State Transition

Use case testing
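As promised above, here is a minimal sketch of equivalence class partitioning and boundary value analysis, applied to a made-up rule that an age field accepts only values 18 to 60:

```python
# Equivalence class partitioning (ECP) and boundary value analysis (BVA)
# for a made-up rule: an "age" field is valid only from 18 to 60 inclusive.
def is_valid_age(age: int) -> bool:
    return 18 <= age <= 60

# ECP: one representative value from each partition is enough.
assert not is_valid_age(10)   # invalid partition: below the range
assert is_valid_age(35)       # valid partition: inside the range
assert not is_valid_age(70)   # invalid partition: above the range

# BVA: test on and around each boundary, where defects cluster.
for age, expected in [(17, False), (18, True), (19, True),
                      (59, True), (60, True), (61, False)]:
    assert is_valid_age(age) is expected
print("ECP and BVA checks passed")
```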

15- Which tool do you like most and why?

To be honest, scripting is my weaker area. I am comfortable with any test / configuration management tool, for example TFS, HP QC, or JIRA.

16- You have mentioned in your resume that you follow the agile methodology. Can you explain?

Agile methodology: a group of software development methodologies based on iterative, incremental development, where requirements and solutions evolve through collaboration between self-organizing, cross-functional teams.

Agile testing: a test practice for projects using agile methodologies, such as XP (Extreme Programming), treating development as the customer of testing and emphasizing the test-first design paradigm.

Agile Manifesto: a statement of the values that underpin agile software development. The values are:

individuals and interactions over processes and tools;

working software over comprehensive documentation;

customer collaboration over contract negotiation;

responding to change over following a plan.

17- If the client has given the requirements only in verbal communication, how will you proceed with the project?

In this case, after the client call I will share the MoM (minutes of meeting) with the client, covering all the points we discussed.

After getting the client's consent on those points, we will create a demo / prototype of the application.

We will showcase the prototype to the client and gather feedback.

After incorporating the feedback and getting confirmation on the prototype, we will start our development and testing activities.

18- When do you decide that testing should start, and when it should stop?

Entry criteria: the set of generic and specific conditions for permitting a process to go forward with a defined task, e.g., a test phase. The purpose of entry criteria is to prevent a task from starting that would entail more (wasted) effort than the effort needed to remove the failed entry criteria.

Testing can start when we have unit-tested code,

when all prerequisites of the application are installed, the application is in working condition, and the correct user rights are assigned to everyone,

and when we have the FRS / requirements with us.

Exit criteria: the set of generic and specific conditions, agreed upon with the stakeholders, for permitting a process to be officially completed. The purpose of exit criteria is to prevent a task from being considered complete when parts of it are still outstanding. Exit criteria are used to report against and to plan when to stop testing.

Testing can stop when the defect arrival rate is lower than the defect fix rate,

when we have covered all the points mentioned in the requirement document and they are working fine,

when we meet the client's acceptance criteria for the application,

when we have covered all the priority / most critical points and observed that they work fine,

when the project budget / timelines are exhausted,

and when all high-priority / high-severity bugs are fixed and verified.

19- What is system testing?

The process of testing an integrated system to verify that it meets specified requirements.

System testing is performed after component / integration testing is complete.

It is the testing phase in which we test the complete, whole system; it also includes systems-of-systems testing.

20- Please describe verification, validation, smoke, alpha, beta, and integration testing.

Verification: confirmation by examination and through provision of objective evidence that specified requirements have been fulfilled.

It is a process in which we do not run / execute the code; we try to find flaws in the implementation of the solution.

It falls under the static software testing techniques (informal reviews / inspections / walkthroughs).

In it we check whether the solution is being built as per the requirements ("are we building the product right?").

QA comes under verification.

Validation: confirmation by examination and through provision of objective evidence that the requirements for a specific intended use or application have been fulfilled.

It is a process in which we run / execute the code and try to find flaws in the developed solution.

It falls under the dynamic software testing techniques (specification-based / structure-based).

It checks whether the software product actually meets the exact needs of the customer ("are we building the right product?").

Quality control comes under validation.

Smoke: a subset of all defined / planned test cases that covers the main functionality of a component or system, used to ascertain that the most crucial functions of a program work, without bothering with the finer details.

A daily build and smoke test is among industry best practices.

It is the type of testing in which we check the health of the build.

If the build is working fine, then we can start testing the application.

Alpha: simulated or actual operational testing by potential users / customers or an independent test team at the developer's site, but outside the development organization.

Alpha testing is often employed for off-the-shelf software as a form of internal acceptance testing.

Beta: operational testing by potential and / or existing users / customers at an external site not otherwise involved with the developers, to determine whether or not a component or system satisfies the user / customer needs and fits within the business processes.

Beta testing is often employed as a form of external acceptance testing for off-the-shelf software in order to acquire feedback from the market.

Integration testing: testing performed to expose defects in the interfaces and in the interactions between integrated components or systems.

It is carried out after component / unit testing is done.

In integration testing we check at the boundaries of components / units.

It often also covers database testing (data validation / data migration / reports).


21- CMMI L5, ISO 9000, and the respective documentation?

CMMI (Capability Maturity Model Integration): a framework that describes the key elements of an effective product development and maintenance process. CMMI covers best practices for planning, engineering, and managing product development and maintenance. CMMI is the designated successor of the CMM.

It has 5 levels:

CMMI L1 (Initial): no formal process documentation.

CMMI L2 (Managed):

CM – Configuration Management

MA – Measurement and Analysis

PPQA – Process and Product Quality Assurance

REQM – Requirements Management

SAM – Supplier Agreement Management

SD – Service Delivery

WMC – Work Monitoring and Control

WP – Work Planning

CMMI L3 (Defined):

CAM – Capacity and Availability Management

DAR – Decision Analysis and Resolution

IRP – Incident Resolution and Prevention

IWM – Integrated Work Management

OPD – Organizational Process Definition

OPF – Organizational Process Focus

OT – Organizational Training

RSKM – Risk Management

SCON – Service Continuity

SSD – Service System Development

SST – Service System Transition

STSM – Strategic Service Management

CMMI L4 (Quantitatively Managed):

OPP – Organizational Process Performance

QWM – Quantitative Work Management

CMMI L5 (Optimizing):

CAR – Causal Analysis and Resolution.

OPM – Organizational Performance Management.

22- Describe: FRS, test plan, RTM, release notes, test cases, release status report, daily status report, configuration management (the last to be discussed tomorrow).

FRS / SRS / PDS : _______________________


Test plan: a document describing the scope, approach, resources, and schedule of intended test activities. It identifies, amongst others, the test items, the features to be tested, the testing tasks, the degree of tester independence, the test environment, the test design techniques, the entry / exit criteria to be used (with the rationale for their choice), and any risks requiring contingency planning.

It is the record of the test-planning process.

RTM : _______________________

Release notes / release status report: a document identifying test items, their configuration, current status, and other delivery information, delivered by development to testing (and possibly to other stakeholders) at the start of a test execution phase.

Test cases: a set of input values, execution preconditions, expected results, and execution postconditions, developed for a particular objective or test condition, such as to exercise a particular program path or to verify compliance with a specific requirement.


23- What is the difference between a primary key and a unique key?

Primary key: the PRIMARY KEY constraint uniquely identifies each record in a table.

Primary keys must contain UNIQUE values and cannot contain NULL values.

A table can have only ONE primary key, and that primary key can consist of single or multiple columns (fields).

A primary key is often auto-incremented; duplicate values are never acceptable.

Primary key values are normally never updated.

Unique key: a set of one or more fields / columns of a table that uniquely identifies a record in a database table.

It is a little like a primary key, but it can accept a NULL value (only one in some databases, such as SQL Server) and it cannot have duplicate values.

Both the unique key and the primary key provide a guarantee of uniqueness for a column or a set of columns.

A unique-key constraint is automatically defined within a primary-key constraint.

There may be many unique-key constraints on one table, but only one PRIMARY KEY constraint.

The UNIQUE constraint ensures that all values in a column are different.

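A quick way to see the difference in practice, sketched with Python's built-in sqlite3 module (the table and columns are invented for the example; NULL handling in UNIQUE columns varies by database):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# One PRIMARY KEY per table; UNIQUE may appear on many columns.
cur.execute("""
    CREATE TABLE users (
        id    INTEGER PRIMARY KEY,  -- no NULLs, no duplicates
        email TEXT UNIQUE,          -- no duplicates, but NULL is allowed here
        phone TEXT UNIQUE
    )
""")

cur.execute("INSERT INTO users VALUES (1, 'a@x.com', '111')")
# OK in SQLite; how many NULLs a UNIQUE column may hold varies by database.
cur.execute("INSERT INTO users VALUES (2, NULL, '222')")

try:
    cur.execute("INSERT INTO users VALUES (1, 'b@x.com', '333')")
except sqlite3.IntegrityError as err:
    print("duplicate primary key rejected:", err)
```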


24- What are a stub and a driver?

Stub: a skeletal, special-purpose implementation of a software component, used to develop or test a component that calls or otherwise depends on it. It replaces a called component.

We use stubs in the top-down integration testing technique.

We may also call stubs mock objects.

Stubs are called from the software component to be tested.

Driver: a software component or test tool that replaces the component that takes care of the control and / or the calling of a component or system.

We use drivers in the bottom-up integration testing technique.

When we integrate our developed solution into an existing running product, we use drivers.

A driver calls the component to be tested.
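To make the call direction concrete, a minimal sketch (all component names are invented): the stub replaces a component that the unit under test calls down into, while the driver is the code that calls up into the unit under test.

```python
# Unit under test: depends on a payment gateway that is not built yet.
def checkout(cart_total: float, gateway) -> str:
    # Calls DOWN into a dependency; in top-down integration testing
    # that dependency is replaced by a stub.
    return "ORDER OK" if gateway.charge(cart_total) else "PAYMENT FAILED"

class PaymentGatewayStub:
    """Stub: canned, predictable replacement for the real gateway."""
    def charge(self, amount: float) -> bool:
        return amount > 0  # hard-coded behavior, no real payment call

def driver():
    # Driver: test code that calls UP into the unit under test; in
    # bottom-up integration testing it replaces the missing caller.
    assert checkout(100.0, PaymentGatewayStub()) == "ORDER OK"
    assert checkout(0.0, PaymentGatewayStub()) == "PAYMENT FAILED"
    print("integration slice passed")

driver()
```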

25- What are the types of functional and non-functional testing?

Functional: testing based on an analysis of the specification of the functionality of a component or system. The dynamic techniques from question 14 apply:

Dynamic Testing Techniques:

Structure based (White Box Testing)

Statement

Decision

Condition / Multiple condition 

Experience-based

Error Guessing 

Exploratory Testing

Specification based (Black Box testing)

Equivalence class partitioning 

Boundary value Analysis

Decision Tables 

State Transition

Use case testing

Non-functional: testing the attributes of a component or system that do not relate to functionality (these follow the ISO 9126 quality characteristics):

Reliability: robustness, fault tolerance, recoverability, compliance.

Efficiency.

Usability: understandability, learnability, operability, attractiveness, compliance.

Maintainability: analyzability, changeability, testability, compliance.

Portability: adaptability, installability, co-existence, replaceability, compliance.

Performance: load, stress, volume, spike.

26- What are severity and priority? Give an example as per the AMS project.

Severity: the degree of impact that a defect has on the development or operation of a component or system.

Priority: the level of (business) importance assigned to an item. For example, a spelling mistake in the company name on the home page is low severity but high priority, while a crash in a rarely used report is high severity but may be given a lower priority.

27- You have worked in your current company for the last 8+ years; why are you looking for a new job?

Over the past 8 years I have been placed on multiple assignments, which is why I have stayed with my company so long.

I have played multiple roles.

Now I wish to take my QC skills to the next, more senior role,

and to multiply my technical skills.
