Quality Testing

Quality is delighting customers

Dear Friends,

Can you please help me understand what a Bug Leak (or Defect Leak) is, and what a Memory Leak is?

Thanks in advance.


Replies to This Discussion

Bug Leak: a defect that is missed by the developer and caught by the tester.
Defect Leak: a defect that is missed by the tester and caught by the client.
Memory Leak: when a program allocates buffer memory to hold data (for example, a large amount of data fetched from a database) but fails to release it after use. The allocated memory stays unavailable, so the free memory gradually shrinks and MEMORY LEAKAGE happens.
Example: if we fetch 100 records, the program requests a matching amount (100 units) of buffer memory; if that buffer is never released after the records are processed, those 100 units are leaked.

Please let me know if I missed something.
Good explanation of memory leakage.
Thanks Rupesh and Mohan for your inputs.

Guys, is it that Bug Leak and Defect Leak are what Rupesh said, and Memory Leak is what Mohan replied?
Adding to Mohan,

Memory leakage could be due to process starvation, indefinite postponement, or deadlock.

Nice explanation... good.

Somanathan R.
Hi Rupesh, nice information.

Could you please explain a little about buffer memory?

Thanks in advance.
Thanks Vanita
Bug leakage or defect leakage is the opposite of "phase containment". The idea is that defects may be introduced at any stage of software development (planning, requirements definition, design, construction, documentation, even testing), but appropriate quality control techniques (including inspection and static analysis) should detect them in that phase and prevent them from leaking to downstream phases--i.e., the defects and their effects are "contained" within the phase that created them. This practice can counter the "1:10:100" ratio effect--the round-number increase in the relative cost of finding and fixing bugs in the stage in which they are introduced, versus in subsequent stages if the bugs "leak". (The actual ratios tend to be worse than 10:1 but rarely as bad as 100:1, and in any case vary widely from project to project.)

Memory leakage (just adding to earlier comments) refers to a characteristic of programming languages that were designed for writing operating systems, but are used for writing applications. Since operating systems need to be able to manipulate the allocation and deallocation of memory, applications written in these languages have effectively the same capability (via the actual operating system), so they can request extra memory whenever they need more--for example, to provide buffers for an additional user. (The safer alternative would be for the operating system to detect the application's memory needs automatically, which some OS's do.) If the application then fails to return the memory when it's finished with it (e.g., when a user logs off), it will gradually--or swiftly!--absorb more and more memory, so that "spare" memory available to the same or other applications appears to "leak" out of the system. Eventually, the offending application and/or other applications will run out of steam (memory) to work with.


© 2020   Created by Quality Testing.   Powered by
