Quality Testing


I am working on OBIEE reports testing.
The documents we have been provided with are the metrics and report descriptions.

I am facing difficulty in visualizing those report descriptions as images (actual reports).

Is there anything else that should be provided for writing the test cases for the reports?

Any suggestions would be helpful.

 

Thanks,

 Smita


Replies to This Discussion

Hi Smita,

The documents you have (metrics and report descriptions) won't help you much in creating detailed test cases. Basically, you need the functional documents, or else try to understand the purpose of each report and compare that against its description to picture the actual report. Based on that, derive test cases for each test scenario.

Regards,
Satheesh C.M

Thanks Satheesh,
Actually, in the metrics doc we have:
1. Metrics
2. Reports
3. Report paths
4. Dashboards

When we asked for the images, they said the reports are not developed yet and we need to write our test cases based on these docs.
Please let me know what general process is followed for report testing.
Are they supposed to give us the images and a functional doc, or is the metric document all that is provided for report testing?

Hi Smita,

Before understanding the general process, you first need to understand the functional spec document. Smita, you need to have the functional spec before creating test cases. The functional spec should clearly describe, with screenshots, the metrics, reports, report paths, and dashboards. Going forward, that will help you identify test scenarios based on the metrics, reports, etc., and create good test cases.

Regards,
Satheesh C.M
cm.satheesh@yahoo.com

I am not sure I understand the question, and I apologize if I am not answering what you are asking.

Reports can be challenging to test.  In my experience, reports are the last part of a project that anyone pays attention to. There are frequently no requirements.  If there are requirements, they are frequently not specific enough to translate into test cases; someone describes a general goal for the report, a developer translates that goal into a report, and then they iterate until all parties are satisfied.  It is difficult to translate a loosely stated goal into a set of rigorous test cases.

 

If the goal isn't specific enough, sometimes you can ask the developer to help you understand the requirements in enough detail to write test cases.  The requirements might be a natural language description or they might be a set of SQL queries.  

 

Another option is to ask for a set of sample inputs and corresponding sample reports. 
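
To make that concrete, here is a rough sketch in Python, assuming the report can be exported to CSV and the developer supplies a sample report as CSV for the same input data (both file names below are made up):

import csv

def load_rows(path):
    """Read a CSV report export into a list of row dictionaries."""
    with open(path, newline="", encoding="utf-8") as f:
        return list(csv.DictReader(f))

# Hypothetical files: the sample report provided with the sample inputs,
# and the report actually produced by OBIEE for those same inputs.
expected = load_rows("sample_report_revenue_by_region.csv")
actual = load_rows("obiee_export_revenue_by_region.csv")

assert len(actual) == len(expected), "Row counts differ"
for exp_row, act_row in zip(expected, actual):
    assert exp_row == act_row, f"Mismatch: expected {exp_row}, got {act_row}"
print("Report matches the provided sample.")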

 

If you start with SQL queries, now you have two problems.  First problem: you need to verify that the report does what the developer intended for it to do, i.e. you need to write test cases that verify what you think the queries are intended to do. Of course, even if the SQL queries work perfectly, they still may not produce the report that the original requirements (or the original goal) described.  That brings you to the second problem: you need to decide whether the developer understood the report's original goal.
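
As a rough illustration of that first problem, here is a sketch only, using an in-memory sqlite3 database as a stand-in for the real warehouse; the table, column, and figures are invented for the example:

import sqlite3

# In-memory stand-in for the warehouse the report reads from; in practice
# you would connect to the same source system and skip this setup.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?)",
    [("WEST", 50000.0), ("WEST", 75000.0), ("EAST", 30000.0)],
)

# Independently recompute the metric the report claims to show.
expected_total = conn.execute(
    "SELECT SUM(amount) FROM sales WHERE region = ?", ("WEST",)
).fetchone()[0]

# Value taken from the rendered report (hard-coded here; in a real test it
# would be read from the report's export).
report_total = 125000.0

assert abs(report_total - expected_total) < 0.01, (
    f"Report shows {report_total}, query returns {expected_total}"
)
print("Report metric matches the independent query.")

Even when this check passes, it only tells you the report matches the query, not that the query matches the original goal.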

 

Another way to approach the report testing is to give up on understanding the report's goal, and instead to focus on regression testing.  For example, if you are working with an existing report, and now the report needs to be re-tested because the underlying schema has changed, you may want to *assume* the original report was correct and instead focus on whether the report still produces the same output if you begin with the same input.
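
Here is a minimal sketch of that approach, assuming the report can be exported to CSV with the same prompts and filters before and after the schema change (the file names are made up):

import csv

def report_rows(path):
    """Load a report's CSV export as a sorted list of row tuples."""
    with open(path, newline="", encoding="utf-8") as f:
        return sorted(tuple(row) for row in csv.reader(f))

# Baseline export captured before the schema change, and the export from
# the same report run with the same input after the change.
baseline = report_rows("revenue_report_baseline.csv")
current = report_rows("revenue_report_after_change.csv")

if baseline == current:
    print("No regression: the report output is unchanged.")
else:
    missing = set(baseline) - set(current)
    added = set(current) - set(baseline)
    print(f"{len(missing)} baseline rows are missing, {len(added)} rows are new.")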

 

I hope that helps.
