Quality is delighting customers. One part of that quality is data integrity: each field should hold the kind of data it is meant to hold, whether numeric or alphabetic.
Some general data integrity test cases:
1. A numeric field should accept only numeric values.
2. If a field has a specified maximum length, input longer than that should be rejected; also verify the behavior for null values and for input exactly at the limit.
3. Verify that you can create, modify, and delete data in tables.
4. Verify that sets of radio buttons represent fixed sets of values.
5. Verify that a blank value can be retrieved from the database.
6. Verify that when a particular set of data is saved to the database, each value is stored in full: strings are not truncated and numeric values are not rounded.
7. Verify that default values are saved in the database when the user does not supply input.
8. Verify compatibility with old data, old hardware, old versions of operating systems, and interfaces with other software.
9. Verify that after an insert, update, or delete, any trigger defined on the table fires and correctly updates the columns of the dependent tables.
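Several of the cases above can be checked directly against the database rather than through the UI. Here is a minimal sketch in Python with SQLite, using a hypothetical "employees" table and "audit" table (both invented for illustration) to cover the numeric/length constraints (cases 1 and 2) and trigger firing (case 9):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Cases 1-2: a length-limited name and a salary that must be numeric.
cur.execute("""
    CREATE TABLE employees (
        emp_id  INTEGER NOT NULL,
        name    TEXT NOT NULL CHECK (length(name) <= 10),
        salary  REAL CHECK (typeof(salary) IN ('integer', 'real'))
    )
""")

# Case 9: a trigger that maintains a row count in a second table.
cur.execute("CREATE TABLE audit (total INTEGER)")
cur.execute("INSERT INTO audit VALUES (0)")
cur.execute("""
    CREATE TRIGGER trg_emp_insert AFTER INSERT ON employees
    BEGIN
        UPDATE audit SET total = total + 1;
    END
""")

# A valid insert succeeds and the trigger updates the other table.
cur.execute("INSERT INTO employees VALUES (1, 'Alice', 50000.0)")
assert cur.execute("SELECT total FROM audit").fetchone()[0] == 1

# An over-length name must be rejected (case 2).
try:
    cur.execute("INSERT INTO employees VALUES (2, 'a-very-long-name', 100.0)")
    raise AssertionError("length constraint was not enforced")
except sqlite3.IntegrityError:
    pass  # expected: CHECK constraint failed
```

The same pattern applies to any database that supports CHECK constraints and triggers; only the connection setup changes.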
Data integrity testing verifies that the system stores data in such a way that it is not compromised by update, restoration, or retrieval processing.
- This type of testing is intended to uncover design flaws that may result in data corruption, unauthorized data access, lack of data integrity across multiple tables, and lack of adequate transaction performance.
- The databases, data files, and the database or data file processes should be tested as a subsystem within the application.
Following are the things that we should keep in mind while testing for data integrity:
--> The first step in determining what needs to be tested is to identify the types of database activity (insert, delete, update) that will be invoked, and to identify when these transactions will occur within the application-under-test. It can also be helpful to work with the application designers and architects to determine the calculations or processing rules used by the application, as well as the time-critical transactions / functions / conditions, and possible causes of poor performance. At this time the dataflow for the application should also be examined, and tests that mimic its path(s) should be earmarked for planning and execution.
--> The next step is to look in depth at the design, code, databases, and file structures. You will want to check data fields for alphabetic and numeric characters. You will need to look at files and fields to ensure proper spacing and length. You will want to develop some method to verify correct data format(s). Your data consistency checks should include both internal and external validations of essential data fields. Internal checks involve data type checking and ensure that columns are of the correct types; external checks involve validation of relational integrity, for example determining whether duplicate data are being loaded from different files. Also at this time you will want to be sure to plan tests for all data files, including clip art, templates, tutorials, samples, etc.
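The internal and external checks described above can be sketched as two small helpers. This is an illustration only, assuming hypothetical "files" already loaded into memory as lists of dicts (the function names and sample data are invented):

```python
# Internal check: every value in a column has the declared type.
def check_column_types(rows, schema):
    """Return (row_index, column) pairs whose values violate the schema."""
    errors = []
    for i, row in enumerate(rows):
        for column, expected_type in schema.items():
            if not isinstance(row[column], expected_type):
                errors.append((i, column))
    return errors

# External check: detect duplicate keys when data is loaded from different files.
def find_duplicate_keys(*files, key):
    seen, duplicates = set(), set()
    for rows in files:
        for row in rows:
            if row[key] in seen:
                duplicates.add(row[key])
            seen.add(row[key])
    return duplicates

file_a = [{"id": 1, "name": "Alice"}, {"id": 2, "name": "Bob"}]
file_b = [{"id": 2, "name": "Bob"}, {"id": 3, "name": 7}]  # id 2 duplicated; bad name type

print(check_column_types(file_a + file_b, {"id": int, "name": str}))  # [(3, 'name')]
print(find_duplicate_keys(file_a, file_b, key="id"))                  # {2}
```

In practice the schema would come from the design documents or the database catalog rather than being hard-coded.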
Finally, you will want to take a look at conditions of the system that can affect the performance or stability of the data file(s) or database(s). This will include testing conditions like low memory, low disk space, and other limited resources. It will also include backup integrity and recovery testing at multiple levels; including, but not limited to, restoring individual files, restoring application data, restoring a database or other information stores, and full system recovery.
When planning and designing your data and database tests it will be helpful if you keep the following in mind. It is a good practice to keep backup copies of test files and test databases. Before reporting an error, check your working copies of the input and comparison files against the backups. Further, all data and database sub-systems should be tested, at some point, without the target-of-test's user interface. This removes the possibility of errors being introduced by the user interface.
It is a wonderful question. Later I will try to share practical issues caused by a lack of data integrity testing, especially while working on massive databases with real-time live records. The comments posted so far are useful for creating a document on data integrity testing.
Referential integrity can also be tested, i.e., the PK/FK relationships among dimension and fact tables.
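A referential integrity check of this kind can be sketched with SQLite, using a hypothetical dimension table and fact table (the table and column names are invented for illustration). Note that SQLite enforces foreign keys only when the pragma is enabled:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")  # FK checks are off by default in SQLite
conn.execute("CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, name TEXT)")
conn.execute("""
    CREATE TABLE fact_sales (
        sale_id    INTEGER PRIMARY KEY,
        product_id INTEGER NOT NULL REFERENCES dim_product(product_id),
        amount     REAL
    )
""")
conn.execute("INSERT INTO dim_product VALUES (1, 'Widget')")
conn.execute("INSERT INTO fact_sales VALUES (100, 1, 9.99)")  # valid FK

# A fact row pointing at a missing dimension key must be rejected.
try:
    conn.execute("INSERT INTO fact_sales VALUES (101, 999, 5.00)")
    raise AssertionError("orphan fact row was accepted")
except sqlite3.IntegrityError:
    pass  # expected: FOREIGN KEY constraint failed

# Orphan scan: fact rows whose product_id has no matching dimension row.
orphans = conn.execute("""
    SELECT sale_id FROM fact_sales f
    LEFT JOIN dim_product d USING (product_id)
    WHERE d.product_id IS NULL
""").fetchall()
print(orphans)  # []
```

The LEFT JOIN orphan scan is also useful against databases where FK constraints are not declared, which is common in data-warehouse fact/dimension schemas.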