US20050071807A1 - Methods and systems for predicting software defects in an upcoming software release - Google Patents

Methods and systems for predicting software defects in an upcoming software release

Info

Publication number
US20050071807A1
US20050071807A1 (application US 10/718,400)
Authority
US
United States
Prior art keywords
software
upcoming
software release
release
defects
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US10/718,400
Inventor
Aura Yanavi
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
JPMorgan Chase Bank NA
Original Assignee
JPMorgan Chase Bank NA
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by JPMorgan Chase Bank NA filed Critical JPMorgan Chase Bank NA
Priority to US10/718,400 priority Critical patent/US20050071807A1/en
Assigned to JP MORGAN CHASE BANK reassignment JP MORGAN CHASE BANK ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: YANAVI, AURA
Publication of US20050071807A1 publication Critical patent/US20050071807A1/en
Abandoned legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00Error detection; Error correction; Monitoring
    • G06F11/008Reliability or availability analysis


Abstract

The present invention provides a novel way to forecast the number of software defects for an upcoming software release. The systems and methods of the present invention involve evaluating the relative size of the upcoming software release with respect to a baseline software release, and estimating the number of expected defects based on the relative size of the upcoming software release and the number of observed software defects for the baseline release. Additional robustness may be achieved by adjusting the forecast to take into consideration regression defects that were detected in the baseline release as well as any code re-factoring. The present invention may be used in various applications such as a project management system to allow a project manager to allocate sufficient resources to handle software defects, and to plan accordingly. In various embodiments, a metric is provided to measure the quality achieved after product implementation, based on the forecasted number of software defects.

Description

    CROSS REFERENCE TO RELATED APPLICATIONS
  • This application claims the benefit of U.S. Provisional Application Ser. No. 60/506,794, filed by Aura Yanavi on Sep. 29, 2003 and entitled “Methods and Systems For Predicting Software Defects In an Upcoming Software Release”, which is incorporated herein by reference.
  • FIELD OF THE INVENTION
  • The present invention relates generally to software engineering, and, more particularly, to methods and systems for predicting software defects in an upcoming software release.
  • BACKGROUND OF THE INVENTION
  • In an effort to improve software quality, various project management systems have been developed. Although these project management systems improve the chances that projects will be completed in a timely manner, managers continue to find it difficult to predict the number of software defects for upcoming software releases. If the number of software defects could be reliably predicted, then managers would be able to commit the necessary resources to more accurately deal with problems that arise.
  • In the academic world, this area of software defect prediction has been the subject of considerable research. There are complex, quantitative methods that focus on the relationship between the number of defects and software complexity. Typically, these models make numerous, unrealistic assumptions. Still other models focus on the quality of the development process as the best predictor of a product's quality. Unfortunately, none of these approaches have yielded accurate results. Accordingly, it would be desirable and highly advantageous to provide improved and simplified techniques for predicting software defects.
  • SUMMARY OF THE INVENTION
  • The present invention provides a novel way to forecast the number of software defects for an upcoming software release. According to the methods and systems of the present invention, the relative size of an upcoming software release with respect to a baseline software release is determined, and the number of software defects for the upcoming software release is forecast based on the relative size of the upcoming software release and the number of observed software defects for the baseline software release. The relative size of the upcoming software release can be obtained by determining the number of new test requirements for the upcoming software release, determining the number of test requirements for the baseline software release, and dividing the number of new test requirements for the upcoming software release by the number of test requirements for the baseline software release. The forecasted number of software defects can then be calculated by multiplying the number of observed software defects for the baseline software release by the relative size of the upcoming software release.
  • According to an embodiment of the invention, a quality measurement for the upcoming software release can be determined based on the actual number of software defects for the upcoming software release relative to the forecasted number of software defects for the upcoming software release. This quality measurement value can be calculated by dividing the forecasted number of software defects by the actual number of software defects. A quality measurement value greater than one indicates that the software release achieved higher quality than the baseline software release. A quality measurement value of one indicates that the software release achieved the same level of quality as the baseline software release. A quality measurement value less than one indicates that the software release has a lower quality level than the baseline software release.
  • According to another embodiment of the invention, the forecasting step includes multiplying the number of observed software defects for the baseline software release by the sum of the relative size of the upcoming software release and a regression defect factor.
  • According to another embodiment of the invention, the forecasting step includes multiplying the number of observed software defects for the baseline software release by the sum of the relative size of the upcoming software release and a refactoring factor.
  • According to another embodiment of the invention, aspects of the present invention are incorporated into a project management system.
  • These and other aspects, features and advantages of the present invention will become apparent from the following detailed description of preferred embodiments, which is to be read in connection with the accompanying drawings.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 is a block diagram of a computer processing system to which the present invention may be applied according to an embodiment of the present invention;
  • FIG. 2 shows a flow diagram outlining an exemplary technique for forecasting the number of software defects for an upcoming software release; and
  • FIG. 3 shows an exemplary screen display of a project management system incorporating the software defect prediction features of the present invention.
  • DESCRIPTION OF PREFERRED EMBODIMENTS
  • The present invention provides a technique to forecast the number of software defects for an upcoming software release that involves evaluating the relative size of the upcoming software release with respect to a baseline software release, and estimating the number of expected defects based on the relative size of the upcoming software release and the number of observed software defects for the baseline software release. In various embodiments, a metric is provided to measure the quality achieved after product implementation.
  • It is to be understood that the present invention may be implemented in various forms of hardware, software, firmware, special purpose processors, or a combination thereof. Preferably, the present invention is implemented in software as a program tangibly embodied on a program storage device. The program may be uploaded to, and executed by, a machine comprising any suitable architecture. Preferably, the machine is implemented on a computer platform having hardware such as one or more central processing units (CPU), a random access memory (RAM), and input/output (I/O) interface(s). The computer platform also includes an operating system and microinstruction code. The various processes and functions described herein may either be part of the microinstruction code or part of the program (or combination thereof) which is executed via the operating system. In addition, various other peripheral devices may be connected to the computer platform such as an additional data storage device and a printing device.
  • It is to be understood that, because some of the constituent system components and method steps depicted in the accompanying figures are preferably implemented in software, the actual connections between the system components (or the process steps) may differ depending upon the manner in which the present invention is programmed.
  • FIG. 1 is a block diagram of a computer processing system 100 to which the present invention may be applied according to an embodiment of the present invention. The system 100 includes at least one processor (hereinafter processor) 102 operatively coupled to other components via a system bus 104. A read-only memory (ROM) 106, a random access memory (RAM) 108, an I/O interface 110, a network interface 112, and external storage 114 are operatively coupled to the system bus 104. Various peripheral devices such as, for example, a display device, a disk storage device (e.g., a magnetic or optical disk storage device), a keyboard, and a mouse, may be operatively coupled to the system bus 104 by the I/O interface 110 or the network interface 112.
  • The computer system 100 may be a standalone system or be linked to a network via the network interface 112. The network interface 112 may be a hard-wired interface. However, in various exemplary embodiments, the network interface 112 can include any device suitable to transmit information to and from another device, such as a universal asynchronous receiver/transmitter (UART), a parallel digital interface, a software interface or any combination of known or later developed software and hardware. The network interface may be linked to various types of networks, including a local area network (LAN), a wide area network (WAN), an intranet, a virtual private network (VPN), and the Internet.
  • The external storage 114 may be implemented using a database management system (DBMS) managed by the processor 102 and residing on a memory such as a hard disk. However, it should be appreciated that the external storage 114 may be implemented on one or more additional computer systems.
  • FIG. 2 is a flow diagram illustrating an exemplary technique for predicting the number of software defects in an upcoming software release.
  • In step 202, the number of new test requirements for a software release (TRn) is input. In general, a test requirement can include any software feature that will be the subject of testing. The test requirements will generally have been determined during the course of project planning. For example, many project management systems employ function point analysis. Function point analysis requires a project manager to estimate the number of software features that will be needed for a software system. The time necessary to develop the project is taken as the sum of the development time for each feature of the software. In this case, the number of new functions to be implemented could be used as the number of test requirements for the upcoming software release. This value could be manually input, or obtained directly from the project management system, for example.
  • Next, in step 204, the number of test requirements for a baseline software release (TRn−y) is determined. Generally, this “baseline release” will be a major software release, whereas the upcoming release will include relatively fewer new features. In the software industry, major releases are often designated by a whole number such as “Release 2.0”. Minor releases are often designated with a decimal value, such as “Release 2.1”. The number of test requirements for the baseline release will generally be a known quantity.
  • In step 206, the New Functionality Factor for the upcoming release is calculated. The following formula specifies one way to determine the New Functionality Factor:
    NFFn = TRn / TRn-y   (1)
    where
      • NFFn is the New Functionality Factor for release n;
      • TRn is the number of new test requirements for release n; and
      • TRn-y is the number of test requirements for release n-y, where y = 1, . . . , m-1, and y < n.
  • Next, in step 208, the actual number of defects for the baseline release (Dn-y) is input. In general, this will be a known value and will reflect defects that have so far been observed. Defects could include critical defects, major defects, minor defects, etc. However, it is important that the type of defect counted in this step be of the type that the user wishes to have forecast. Thus, if only critical defects were to be forecasted, then the value for Dn-y should only include observed critical defects for the baseline release.
  • Next, in step 210, the number of defects for the upcoming software release (Dn) is calculated. One way to calculate the number of defects is to use the following formula:
    Dn = Dn-y * NFFn   (2)
    where
      • Dn is the estimated number of defects for release n,
      • Dn-y is the number of observed software defects for release n-y, and
      • NFFn is the New Functionality Factor (determined in Formula 1) for release n.
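  • The two computations above translate directly into code. The following is a minimal Python sketch (not part of the patent text; function and parameter names are illustrative assumptions) implementing Formulas 1 and 2:

```python
def new_functionality_factor(new_test_reqs: int, baseline_test_reqs: int) -> float:
    """Formula 1: NFFn = TRn / TRn-y (relative size of the upcoming release)."""
    return new_test_reqs / baseline_test_reqs


def forecast_defects(baseline_defects: int, nff: float) -> float:
    """Formula 2: Dn = Dn-y * NFFn (forecasted defects for the upcoming release)."""
    return baseline_defects * nff
```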
  • Finally, it may be desirable to measure the quality of the new software release. In step 212, a quality measurement value (Qn) can optionally be determined after product implementation, using the following formula:
    Qn = Dn / An   (3)
    where
      • Qn is the quality measurement value,
      • Dn is the estimated number of defects for release n, and
      • An is the actual number of defects for release n.
  • The quality measurement value (Qn) may be interpreted as shown in Table 1.
    TABLE 1
    Interpretation of Quality Measurement Value
    Qn < 1    Release n is of lower quality than the baseline release
    Qn = 1    Release n has the same quality as the baseline release
    Qn > 1    Release n is of higher quality than the baseline release
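  • A minimal sketch of the quality measurement and its interpretation per Table 1 follows (illustrative code, not part of the patent text):

```python
def quality_measurement(forecasted_defects: float, actual_defects: int) -> float:
    """Formula 3: Qn = Dn / An."""
    return forecasted_defects / actual_defects


def interpret_quality(q: float) -> str:
    """Interpretation of Qn per Table 1, relative to the baseline release."""
    if q > 1:
        return "higher quality than the baseline release"
    if q == 1:
        return "same quality as the baseline release"
    return "lower quality than the baseline release"
```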
  • Although the method described above, with reference to FIG. 2, is a relatively straightforward technique to forecast the number of software defects, it is to be appreciated that variations to the above formula(s) may be made without departing from the spirit and scope of the present invention.
  • The following will now describe additional ways in which the basic methodology may be expanded to create a more robust tool.
  • As discussed above, the New Functionality Factor (NFFn) may be determined by dividing the number of new test requirements for an upcoming software release by the number of test requirements for a “benchmark” software release. However, this assumes that all defects are discovered only in the new functionality. We can overcome this assumption by taking into account the factor of actual regression defects (R) (the percentage of actual regression defects divided by 100) in the release that we are using as the benchmark. The following formula may be used in lieu of Formula 2 to calculate the estimated number of defects in an upcoming software release, taking into consideration regression defects:
    Dn = Dn-y * (NFFn + Rn-y)   (4)
    where
      • Rn-y is the percentage of actual regression defects divided by 100.
  • The present invention can also be used in the situation where software code is re-factored. Software code is refactored when it is substantially re-written. We can overcome the problem of code re-factoring by adding the value “1” (or another suitable value) to the New Functionality Factor for that release. This means that we expect regression defects across the functionality as a benchmark. (If the regression defects were expected across 80% of the functionality, then the value “0.80” could be added to the New Functionality Factor). The following formula expresses this concept (where the assumption is that regression defects will be across all functionalities).
    Dn = Dn-y * (NFFn + 1)   (5)
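  • The two adjusted forecasts can be sketched as follows (illustrative code, not part of the patent text; parameter names are assumptions):

```python
def forecast_with_regression(baseline_defects: int, nff: float,
                             regression_factor: float) -> float:
    """Formula 4: Dn = Dn-y * (NFFn + Rn-y), where regression_factor (Rn-y) is
    the percentage of actual regression defects in the baseline release
    divided by 100."""
    return baseline_defects * (nff + regression_factor)


def forecast_with_refactoring(baseline_defects: int, nff: float,
                              refactoring_factor: float = 1.0) -> float:
    """Formula 5: Dn = Dn-y * (NFFn + 1); a smaller factor such as 0.80 may be
    used when regression defects are expected across only part of the
    functionality."""
    return baseline_defects * (nff + refactoring_factor)
```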
  • The invention will be clarified by the following examples.
  • EXAMPLE 1
  • FIG. 3 illustrates an exemplary screen display of a project management system incorporating features of the present invention. As depicted in FIG. 3, a baseline release (“Release 1.0”) had 241 test requirements, and an upcoming software release (“Release 2.0”) had 82 new test requirements. Applying Formula 1, the New Functionality Factor was calculated, as follows:
    NFFn = 82/241 = 0.34
  • As indicated, Release 1.0 had 32 Critical Defects and 41 Major Defects.
  • Applying Formula 2, the estimated number of critical defects for Release 2.0 was calculated as follows:
    Dn = (32 * 0.34) = 11
  • Applying Formula 2, the estimated number of major defects for Release 2.0 was calculated as follows:
    Dn = (41 * 0.34) = 14
  • EXAMPLE 2
  • Suppose, after implementation of Release 2.0, there were actually 10 critical defects and 12 major defects. Using the estimated number of software defects from Example 1 and applying Formula 3, the quality measurements would be calculated as follows:
    Qn = 11/10 = 1.10 (critical defect quality)
    Qn = 14/12 = 1.17 (major defect quality).
    In this case, the project achieved slightly higher critical defect quality and major defect quality than the baseline.
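  • For reference, running the sketches above with the figures from Examples 1 and 2 reproduces the reported values (a hypothetical usage example, not part of the patent text; it assumes the functions defined earlier are in scope):

```python
nff = new_functionality_factor(82, 241)        # ~0.34
critical_forecast = forecast_defects(32, nff)  # ~10.9, reported as 11
major_forecast = forecast_defects(41, nff)     # ~13.9, reported as 14

q_critical = quality_measurement(11, 10)       # 1.10
q_major = quality_measurement(14, 12)          # ~1.17
print(interpret_quality(q_critical))           # higher quality than the baseline release
```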
  • Although illustrative embodiments of the present invention have been described herein with reference to the accompanying drawings, it is to be understood that the invention is not limited to those precise embodiments, and that various other changes and modifications may be effected therein by one skilled in the art without departing from the scope or spirit of the invention.

Claims (22)

1. A method for predicting the number of software defects for an upcoming software release, comprising the steps of:
determining the relative size of the upcoming software release with respect to a baseline software release; and
forecasting the number of software defects for the upcoming software release based on the relative size of the upcoming software release and the number of observed software defects for the baseline software release.
2. The method of claim 1, wherein determining the relative size of the upcoming software release includes the steps of:
determining the number of new test requirements for the upcoming software release;
determining the number of test requirements for the baseline software release; and
dividing the number of new test requirements for the upcoming software release by the number of test requirements for the baseline software release.
3. The method of claim 1, wherein the forecasting step includes multiplying the number of observed software defects for the baseline software release by the relative size of the upcoming software release.
4. The method of claim 1, wherein the forecasting step includes multiplying the number of observed software defects for the baseline software release by the sum of the relative size of the upcoming software release and a regression defect factor.
5. The method of claim 1, wherein the forecasting step includes multiplying the number of observed software defects for the baseline software release by the sum of the relative size of the upcoming software release and a refactoring factor.
6. The method of claim 1, further including determining a quality measurement for the upcoming software release based on the actual number of software defects for the upcoming software release relative to the forecasted number of software defects for the upcoming software release.
7. The method of claim 6, wherein the quality measurement is used by a project management system.
8. The method of claim 1, wherein the number of software defects for the upcoming software release is used by a project management system.
9. The method of claim 1, wherein information used to forecast the software defects is graphically depicted.
10. The method of claim 1, wherein the baseline software release is selected by a user.
11. A system for predicting the number of software defects for an upcoming software release, comprising:
an input device for obtaining information regarding an upcoming software release and a baseline software release;
a processor for determining the relative size of the upcoming software release with respect to a baseline software release and forecasting the number of software defects for the upcoming software release based on the relative size of the upcoming software release and the number of observed software defects for the baseline software release; and
an output device for outputting the forecasted number of software defects for the upcoming software release.
12. The system of claim 11, wherein the information obtained by the input device includes the number of new test requirements for the upcoming software release and the number of test requirements for the baseline software release, and the processor determines the relative size of the upcoming software release by dividing the number of new test requirements for the upcoming software release by the number of test requirements for the baseline software release.
13. The system of claim 11, wherein the processor forecasts the number of software defects for the upcoming software release by multiplying the number of observed software defects for the baseline software release by the relative size of the upcoming software release.
14. The system of claim 11, wherein the processor forecasts the number of software defects for the upcoming software release by multiplying the number of observed software defects for the baseline software release by the sum of the relative size of the upcoming software release and a regression defect factor.
15. The system of claim 11, wherein the processor forecasts the number of software defects for the upcoming software release by multiplying the number of observed software defects for the baseline software release by the sum of the relative size of the upcoming software release and a refactoring factor.
16. The system of claim 11, wherein the processor further determines a quality measurement for the upcoming software release based on the actual number of software defects for the upcoming software release relative to the forecasted number of software defects for the upcoming software release.
17. The system of claim 16, wherein the quality measurement is used by a project management system.
18. The system of claim 11, wherein the number of software defects for the upcoming software release is used by a project management system.
19. The system of claim 11, wherein the output device is configured to graphically depict information regarding the forecasted number of software defects.
20. The system of claim 11, wherein the input device is configured to allow a user to select the baseline software release.
21. A program storage device readable by a machine, tangibly embodying a program of instructions executable on the machine to perform method steps for predicting the number of software defects for an upcoming software release, the method steps comprising:
determining the relative size of the upcoming software release with respect to a baseline software release; and
forecasting the number of software defects for the upcoming software release based on the relative size of the upcoming software release and the number of observed software defects for the baseline software release.
22. The program storage device of claim 21, wherein the instructions for performing the step of determining the relative size of the upcoming software release include instructions for performing the steps of:
determining the number of new test requirements for the upcoming software release;
determining the number of test requirements for the baseline software release; and
dividing the number of new test requirements for the upcoming software release by the number of test requirements for the baseline software release.
US10/718,400 2003-09-29 2003-11-20 Methods and systems for predicting software defects in an upcoming software release Abandoned US20050071807A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US10/718,400 US20050071807A1 (en) 2003-09-29 2003-11-20 Methods and systems for predicting software defects in an upcoming software release

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US50679403P 2003-09-29 2003-09-29
US10/718,400 US20050071807A1 (en) 2003-09-29 2003-11-20 Methods and systems for predicting software defects in an upcoming software release

Publications (1)

Publication Number Publication Date
US20050071807A1 true US20050071807A1 (en) 2005-03-31

Family

ID=34381279

Family Applications (1)

Application Number Title Priority Date Filing Date
US10/718,400 Abandoned US20050071807A1 (en) 2003-09-29 2003-11-20 Methods and systems for predicting software defects in an upcoming software release

Country Status (1)

Country Link
US (1) US20050071807A1 (en)

Cited By (61)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030018573A1 (en) * 2001-06-28 2003-01-23 Andrew Comas System and method for characterizing and selecting technology transition options
US20040083158A1 (en) * 2002-10-09 2004-04-29 Mark Addison Systems and methods for distributing pricing data for complex derivative securities
US20040088278A1 (en) * 2002-10-30 2004-05-06 Jp Morgan Chase Method to measure stored procedure execution statistics
US20040153535A1 (en) * 2003-02-03 2004-08-05 Chau Tony Ka Wai Method for software suspension in a networked computer system
US20050204029A1 (en) * 2004-03-09 2005-09-15 John Connolly User connectivity process management system
US20060041864A1 (en) * 2004-08-19 2006-02-23 International Business Machines Corporation Error estimation and tracking tool for testing of code
US20060085492A1 (en) * 2004-10-14 2006-04-20 Singh Arun K System and method for modifying process navigation
US20070018823A1 (en) * 2005-05-30 2007-01-25 Semiconductor Energy Laboratory Co., Ltd. Semiconductor device and driving method thereof
US20080263507A1 (en) * 2007-04-17 2008-10-23 Ching-Pao Chang Action-based in-process software defect prediction software defect prediction techniques based on software development activities
US20080313633A1 (en) * 2007-06-15 2008-12-18 Microsoft Corporation Software feature usage analysis and reporting
US20080313617A1 (en) * 2007-06-15 2008-12-18 Microsoft Corporation Analyzing software users with instrumentation data and user group modeling and analysis
US20080313507A1 (en) * 2007-06-15 2008-12-18 Microsoft Corporation Software reliability analysis using alerts, asserts and user interface controls
US20090319984A1 (en) * 2008-06-24 2009-12-24 International Business Machines Corporation Early defect removal model
US7665127B1 (en) 2004-06-30 2010-02-16 Jp Morgan Chase Bank System and method for providing access to protected services
US20100293072A1 (en) * 2009-05-13 2010-11-18 David Murrant Preserving the Integrity of Segments of Audio Streams
US7870114B2 (en) 2007-06-15 2011-01-11 Microsoft Corporation Efficient data infrastructure for high dimensional data analysis
US7895565B1 (en) 2006-03-15 2011-02-22 Jp Morgan Chase Bank, N.A. Integrated system and method for validating the functionality and performance of software applications
US20110061041A1 (en) * 2009-09-04 2011-03-10 International Business Machines Corporation Reliability and availability modeling of a software application
US20110066887A1 (en) * 2009-09-11 2011-03-17 International Business Machines Corporation System and method to provide continuous calibration estimation and improvement options across a software integration life cycle
US20110066490A1 (en) * 2009-09-11 2011-03-17 International Business Machines Corporation System and method for resource modeling and simulation in test planning
US20110066558A1 (en) * 2009-09-11 2011-03-17 International Business Machines Corporation System and method to produce business case metrics based on code inspection service results
US20110066890A1 (en) * 2009-09-11 2011-03-17 International Business Machines Corporation System and method for analyzing alternatives in test plans
US20110066893A1 (en) * 2009-09-11 2011-03-17 International Business Machines Corporation System and method to map defect reduction data to organizational maturity profiles for defect projection modeling
US20110067006A1 (en) * 2009-09-11 2011-03-17 International Business Machines Corporation System and method to classify automated code inspection services defect output for defect analysis
US20110066557A1 (en) * 2009-09-11 2011-03-17 International Business Machines Corporation System and method to produce business case metrics based on defect analysis starter (das) results
US20110066486A1 (en) * 2009-09-11 2011-03-17 International Business Machines Corporation System and method for efficient creation and reconciliation of macro and micro level test plans
US20110067005A1 (en) * 2009-09-11 2011-03-17 International Business Machines Corporation System and method to determine defect risks in software solutions
US7913249B1 (en) 2006-03-07 2011-03-22 Jpmorgan Chase Bank, N.A. Software installation checker
US20120017195A1 (en) * 2010-07-17 2012-01-19 Vikrant Shyamkant Kaulgud Method and System for Evaluating the Testing of a Software System Having a Plurality of Components
US8126987B2 (en) 2009-11-16 2012-02-28 Sony Computer Entertainment Inc. Mediation of content-related services
US8181016B1 (en) 2005-12-01 2012-05-15 Jpmorgan Chase Bank, N.A. Applications access re-certification system
US20130061202A1 (en) * 2011-09-05 2013-03-07 Infosys Limited Methods for assessing deliverable product quality and devices thereof
US8433759B2 (en) 2010-05-24 2013-04-30 Sony Computer Entertainment America Llc Direction-conscious information sharing
US8572516B1 (en) 2005-08-24 2013-10-29 Jpmorgan Chase Bank, N.A. System and method for controlling a screen saver
US8635056B2 (en) 2009-09-11 2014-01-21 International Business Machines Corporation System and method for system integration test (SIT) planning
US20140033174A1 (en) * 2012-07-29 2014-01-30 International Business Machines Corporation Software bug predicting
US20140033176A1 (en) * 2012-07-26 2014-01-30 Infosys Limited Methods for predicting one or more defects in a computer program and devices thereof
US20140366140A1 (en) * 2013-06-10 2014-12-11 Hewlett-Packard Development Company, L.P. Estimating a quantity of exploitable security vulnerabilities in a release of an application
US8966557B2 (en) 2001-01-22 2015-02-24 Sony Computer Entertainment Inc. Delivery of digital content
US9088459B1 (en) 2013-02-22 2015-07-21 Jpmorgan Chase Bank, N.A. Breadth-first resource allocation system and methods
CN104899135A (en) * 2015-05-14 2015-09-09 工业和信息化部电子第五研究所 Software defect prediction method and system
US9213624B2 (en) 2012-05-31 2015-12-15 Microsoft Technology Licensing, Llc Application quality parameter measurement-based development
US9483405B2 (en) 2007-09-20 2016-11-01 Sony Interactive Entertainment Inc. Simplified run-time program translation for emulating complex processor pipelines
US9542259B1 (en) 2013-12-23 2017-01-10 Jpmorgan Chase Bank, N.A. Automated incident resolution system and method
CN106528417A (en) * 2016-10-28 2017-03-22 中国电子产品可靠性与环境试验研究所 Intelligent detection method and system of software defects
US9619410B1 (en) 2013-10-03 2017-04-11 Jpmorgan Chase Bank, N.A. Systems and methods for packet switching
US9720655B1 (en) 2013-02-01 2017-08-01 Jpmorgan Chase Bank, N.A. User interface event orchestration
CN107133179A (en) * 2017-06-06 2017-09-05 中国电力科学研究院 A kind of website failure prediction method based on Bayesian network and its realize system
CN107168868A (en) * 2017-04-01 2017-09-15 西安交通大学 A kind of software based on sampling and integrated study changes failure prediction method
US9868054B1 (en) 2014-02-10 2018-01-16 Jpmorgan Chase Bank, N.A. Dynamic game deployment
US10002041B1 (en) 2013-02-01 2018-06-19 Jpmorgan Chase Bank, N.A. System and method for maintaining the health of a machine
CN109543707A (en) * 2018-09-29 2019-03-29 南京航空航天大学 Semi-supervised change level Software Defects Predict Methods based on three decisions
CN109634833A (en) * 2017-10-09 2019-04-16 北京京东尚科信息技术有限公司 A kind of Software Defects Predict Methods and device
US20190163706A1 (en) * 2015-06-02 2019-05-30 International Business Machines Corporation Ingesting documents using multiple ingestion pipelines
US20190377665A1 (en) * 2015-03-19 2019-12-12 Teachers Insurance And Annuity Association Of America Evaluating and presenting software testing project status indicators
WO2020047173A1 (en) * 2018-08-31 2020-03-05 Procore Technologies, Inc. Computer system and method for predicting risk level of punch items
US11036613B1 (en) 2020-03-30 2021-06-15 Bank Of America Corporation Regression analysis for software development and management using machine learning
US11144435B1 (en) 2020-03-30 2021-10-12 Bank Of America Corporation Test case generation for software development using machine learning
US11144308B2 (en) 2017-09-15 2021-10-12 Cognizant Technology Solutions India Pvt. Ltd. System and method for predicting defects in a computer program
US11151023B2 (en) 2017-11-20 2021-10-19 Cognizant Technology Solutions India Pvt. Ltd. System and method for predicting performance failures in a computer program
US11288065B2 (en) 2018-07-02 2022-03-29 International Business Machines Corporation Devops driven cognitive cost function for software defect prediction

Citations (23)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5446895A (en) * 1991-12-13 1995-08-29 White; Leonard R. Measurement analysis software system and method
US5655074A (en) * 1995-07-06 1997-08-05 Bell Communications Research, Inc. Method and system for conducting statistical quality analysis of a complex system
US5758061A (en) * 1995-12-15 1998-05-26 Plum; Thomas S. Computer software testing method and apparatus
US5903897A (en) * 1996-12-18 1999-05-11 Alcatel Usa Sourcing, L.P. Software documentation release control system
US5960196A (en) * 1996-12-18 1999-09-28 Alcatel Usa Sourcing, L.P. Software release metric reporting system and method
US6073107A (en) * 1997-08-26 2000-06-06 Minkiewicz; Arlene F. Parametric software forecasting system and method
US6363524B1 (en) * 1999-09-10 2002-03-26 Hewlett-Packard Company System and method for assessing the need for installing software patches in a computer system
US6405364B1 (en) * 1999-08-31 2002-06-11 Accenture Llp Building techniques in a development architecture framework
US20020147961A1 (en) * 2001-03-05 2002-10-10 Charters Graham Castree Method, apparatus and computer program product for integrating heterogeneous systems
US20020162090A1 (en) * 2001-04-30 2002-10-31 Parnell Karen P. Polylingual simultaneous shipping of software
US6477471B1 (en) * 1995-10-30 2002-11-05 Texas Instruments Incorporated Product defect predictive engine
US20030018952A1 (en) * 2001-07-13 2003-01-23 Roetzheim William H. System and method to estimate resource usage for a software development project
US6513154B1 (en) * 1996-10-21 2003-01-28 John R. Porterfield System and method for testing of computer programs in programming effort
US6519763B1 (en) * 1998-03-30 2003-02-11 Compuware Corporation Time management and task completion and prediction software
US20030033586A1 (en) * 2001-08-09 2003-02-13 James Lawler Automated system and method for software application quantification
US6546506B1 (en) * 1999-09-10 2003-04-08 International Business Machines Corporation Technique for automatically generating a software test plan
US6601018B1 (en) * 1999-02-04 2003-07-29 International Business Machines Corporation Automatic test framework system and method in software component testing
US6601233B1 (en) * 1999-07-30 2003-07-29 Accenture Llp Business components framework
US6601017B1 (en) * 2000-11-09 2003-07-29 Ge Financial Assurance Holdings, Inc. Process and system for quality assurance for software
US6629266B1 (en) * 1999-11-17 2003-09-30 International Business Machines Corporation Method and system for transparent symptom-based selective software rejuvenation
US6626953B2 (en) * 1998-04-10 2003-09-30 Cisco Technology, Inc. System and method for retrieving software release information
US20030188290A1 (en) * 2001-08-29 2003-10-02 International Business Machines Corporation Method and system for a quality software management process
US20030196190A1 (en) * 2002-04-12 2003-10-16 International Business Machines Corporation Generating and managing test plans for testing computer software

Patent Citations (23)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5446895A (en) * 1991-12-13 1995-08-29 White; Leonard R. Measurement analysis software system and method
US5655074A (en) * 1995-07-06 1997-08-05 Bell Communications Research, Inc. Method and system for conducting statistical quality analysis of a complex system
US6477471B1 (en) * 1995-10-30 2002-11-05 Texas Instruments Incorporated Product defect predictive engine
US5758061A (en) * 1995-12-15 1998-05-26 Plum; Thomas S. Computer software testing method and apparatus
US6513154B1 (en) * 1996-10-21 2003-01-28 John R. Porterfield System and method for testing of computer programs in programming effort
US5903897A (en) * 1996-12-18 1999-05-11 Alcatel Usa Sourcing, L.P. Software documentation release control system
US5960196A (en) * 1996-12-18 1999-09-28 Alcatel Usa Sourcing, L.P. Software release metric reporting system and method
US6073107A (en) * 1997-08-26 2000-06-06 Minkiewicz; Arlene F. Parametric software forecasting system and method
US6519763B1 (en) * 1998-03-30 2003-02-11 Compuware Corporation Time management and task completion and prediction software
US6626953B2 (en) * 1998-04-10 2003-09-30 Cisco Technology, Inc. System and method for retrieving software release information
US6601018B1 (en) * 1999-02-04 2003-07-29 International Business Machines Corporation Automatic test framework system and method in software component testing
US6601233B1 (en) * 1999-07-30 2003-07-29 Accenture Llp Business components framework
US6405364B1 (en) * 1999-08-31 2002-06-11 Accenture Llp Building techniques in a development architecture framework
US6363524B1 (en) * 1999-09-10 2002-03-26 Hewlett-Packard Company System and method for assessing the need for installing software patches in a computer system
US6546506B1 (en) * 1999-09-10 2003-04-08 International Business Machines Corporation Technique for automatically generating a software test plan
US6629266B1 (en) * 1999-11-17 2003-09-30 International Business Machines Corporation Method and system for transparent symptom-based selective software rejuvenation
US6601017B1 (en) * 2000-11-09 2003-07-29 Ge Financial Assurance Holdings, Inc. Process and system for quality assurance for software
US20020147961A1 (en) * 2001-03-05 2002-10-10 Charters Graham Castree Method, apparatus and computer program product for integrating heterogeneous systems
US20020162090A1 (en) * 2001-04-30 2002-10-31 Parnell Karen P. Polylingual simultaneous shipping of software
US20030018952A1 (en) * 2001-07-13 2003-01-23 Roetzheim William H. System and method to estimate resource usage for a software development project
US20030033586A1 (en) * 2001-08-09 2003-02-13 James Lawler Automated system and method for software application quantification
US20030188290A1 (en) * 2001-08-29 2003-10-02 International Business Machines Corporation Method and system for a quality software management process
US20030196190A1 (en) * 2002-04-12 2003-10-16 International Business Machines Corporation Generating and managing test plans for testing computer software

Cited By (106)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8966557B2 (en) 2001-01-22 2015-02-24 Sony Computer Entertainment Inc. Delivery of digital content
US8234156B2 (en) 2001-06-28 2012-07-31 Jpmorgan Chase Bank, N.A. System and method for characterizing and selecting technology transition options
US20030018573A1 (en) * 2001-06-28 2003-01-23 Andrew Comas System and method for characterizing and selecting technology transition options
US20040083158A1 (en) * 2002-10-09 2004-04-29 Mark Addison Systems and methods for distributing pricing data for complex derivative securities
US20040088278A1 (en) * 2002-10-30 2004-05-06 Jp Morgan Chase Method to measure stored procedure execution statistics
US20040153535A1 (en) * 2003-02-03 2004-08-05 Chau Tony Ka Wai Method for software suspension in a networked computer system
US7702767B2 (en) 2004-03-09 2010-04-20 Jp Morgan Chase Bank User connectivity process management system
US20050204029A1 (en) * 2004-03-09 2005-09-15 John Connolly User connectivity process management system
US7665127B1 (en) 2004-06-30 2010-02-16 Jp Morgan Chase Bank System and method for providing access to protected services
US20060041864A1 (en) * 2004-08-19 2006-02-23 International Business Machines Corporation Error estimation and tracking tool for testing of code
US20060085492A1 (en) * 2004-10-14 2006-04-20 Singh Arun K System and method for modifying process navigation
US20070018823A1 (en) * 2005-05-30 2007-01-25 Semiconductor Energy Laboratory Co., Ltd. Semiconductor device and driving method thereof
US8572516B1 (en) 2005-08-24 2013-10-29 Jpmorgan Chase Bank, N.A. System and method for controlling a screen saver
US8972906B1 (en) 2005-08-24 2015-03-03 Jpmorgan Chase Bank, N.A. System and method for controlling a screen saver
US10200444B1 (en) 2005-08-24 2019-02-05 Jpmorgan Chase Bank, N.A. System and method for controlling a screen saver
US8181016B1 (en) 2005-12-01 2012-05-15 Jpmorgan Chase Bank, N.A. Applications access re-certification system
US7913249B1 (en) 2006-03-07 2011-03-22 Jpmorgan Chase Bank, N.A. Software installation checker
US7895565B1 (en) 2006-03-15 2011-02-22 Jp Morgan Chase Bank, N.A. Integrated system and method for validating the functionality and performance of software applications
US9477581B2 (en) 2006-03-15 2016-10-25 Jpmorgan Chase Bank, N.A. Integrated system and method for validating the functionality and performance of software applications
US20080263507A1 (en) * 2007-04-17 2008-10-23 Ching-Pao Chang Action-based in-process software defect prediction software defect prediction techniques based on software development activities
US7856616B2 (en) * 2007-04-17 2010-12-21 National Defense University Action-based in-process software defect prediction software defect prediction techniques based on software development activities
US7739666B2 (en) 2007-06-15 2010-06-15 Microsoft Corporation Analyzing software users with instrumentation data and user group modeling and analysis
US7870114B2 (en) 2007-06-15 2011-01-11 Microsoft Corporation Efficient data infrastructure for high dimensional data analysis
US7747988B2 (en) 2007-06-15 2010-06-29 Microsoft Corporation Software feature usage analysis and reporting
US7681085B2 (en) 2007-06-15 2010-03-16 Microsoft Corporation Software reliability analysis using alerts, asserts and user interface controls
US20080313507A1 (en) * 2007-06-15 2008-12-18 Microsoft Corporation Software reliability analysis using alerts, asserts and user interface controls
US20080313617A1 (en) * 2007-06-15 2008-12-18 Microsoft Corporation Analyzing software users with instrumentation data and user group modeling and analysis
US20080313633A1 (en) * 2007-06-15 2008-12-18 Microsoft Corporation Software feature usage analysis and reporting
US9483405B2 (en) 2007-09-20 2016-11-01 Sony Interactive Entertainment Inc. Simplified run-time program translation for emulating complex processor pipelines
US20090319984A1 (en) * 2008-06-24 2009-12-24 International Business Machines Corporation Early defect removal model
US8352904B2 (en) * 2008-06-24 2013-01-08 International Business Machines Corporation Early defect removal model
US20100293072A1 (en) * 2009-05-13 2010-11-18 David Murrant Preserving the Integrity of Segments of Audio Streams
US20110061041A1 (en) * 2009-09-04 2011-03-10 International Business Machines Corporation Reliability and availability modeling of a software application
US8689188B2 (en) 2009-09-11 2014-04-01 International Business Machines Corporation System and method for analyzing alternatives in test plans
US10235269B2 (en) * 2009-09-11 2019-03-19 International Business Machines Corporation System and method to produce business case metrics based on defect analysis starter (DAS) results
US10372593B2 (en) 2009-09-11 2019-08-06 International Business Machines Corporation System and method for resource modeling and simulation in test planning
US20110067005A1 (en) * 2009-09-11 2011-03-17 International Business Machines Corporation System and method to determine defect risks in software solutions
US20110066486A1 (en) * 2009-09-11 2011-03-17 International Business Machines Corporation System and method for efficient creation and reconciliation of macro and micro level test plans
US20110066887A1 (en) * 2009-09-11 2011-03-17 International Business Machines Corporation System and method to provide continuous calibration estimation and improvement options across a software integration life cycle
US10185649B2 (en) 2009-09-11 2019-01-22 International Business Machines Corporation System and method for efficient creation and reconciliation of macro and micro level test plans
US8495583B2 (en) 2009-09-11 2013-07-23 International Business Machines Corporation System and method to determine defect risks in software solutions
US8527955B2 (en) 2009-09-11 2013-09-03 International Business Machines Corporation System and method to classify automated code inspection services defect output for defect analysis
US8539438B2 (en) 2009-09-11 2013-09-17 International Business Machines Corporation System and method for efficient creation and reconciliation of macro and micro level test plans
US8566805B2 (en) 2009-09-11 2013-10-22 International Business Machines Corporation System and method to provide continuous calibration estimation and improvement options across a software integration life cycle
US20110066557A1 (en) * 2009-09-11 2011-03-17 International Business Machines Corporation System and method to produce business case metrics based on defect analysis starter (das) results
US8578341B2 (en) 2009-09-11 2013-11-05 International Business Machines Corporation System and method to map defect reduction data to organizational maturity profiles for defect projection modeling
US9753838B2 (en) 2009-09-11 2017-09-05 International Business Machines Corporation System and method to classify automated code inspection services defect output for defect analysis
US8635056B2 (en) 2009-09-11 2014-01-21 International Business Machines Corporation System and method for system integration test (SIT) planning
US9710257B2 (en) 2009-09-11 2017-07-18 International Business Machines Corporation System and method to map defect reduction data to organizational maturity profiles for defect projection modeling
US9594671B2 (en) 2009-09-11 2017-03-14 International Business Machines Corporation System and method for resource modeling and simulation in test planning
US8645921B2 (en) 2009-09-11 2014-02-04 International Business Machines Corporation System and method to determine defect risks in software solutions
US8667458B2 (en) 2009-09-11 2014-03-04 International Business Machines Corporation System and method to produce business case metrics based on code inspection service results
US20110067006A1 (en) * 2009-09-11 2011-03-17 International Business Machines Corporation System and method to classify automated code inspection services defect output for defect analysis
US8893086B2 (en) 2009-09-11 2014-11-18 International Business Machines Corporation System and method for resource modeling and simulation in test planning
US9558464B2 (en) 2009-09-11 2017-01-31 International Business Machines Corporation System and method to determine defect risks in software solutions
US8924936B2 (en) 2009-09-11 2014-12-30 International Business Machines Corporation System and method to classify automated code inspection services defect output for defect analysis
US20110066893A1 (en) * 2009-09-11 2011-03-17 International Business Machines Corporation System and method to map defect reduction data to organizational maturity profiles for defect projection modeling
US20110066890A1 (en) * 2009-09-11 2011-03-17 International Business Machines Corporation System and method for analyzing alternatives in test plans
US20110066490A1 (en) * 2009-09-11 2011-03-17 International Business Machines Corporation System and method for resource modeling and simulation in test planning
US9052981B2 (en) 2009-09-11 2015-06-09 International Business Machines Corporation System and method to map defect reduction data to organizational maturity profiles for defect projection modeling
US20110066558A1 (en) * 2009-09-11 2011-03-17 International Business Machines Corporation System and method to produce business case metrics based on code inspection service results
US9442821B2 (en) 2009-09-11 2016-09-13 International Business Machines Corporation System and method to classify automated code inspection services defect output for defect analysis
US9292421B2 (en) 2009-09-11 2016-03-22 International Business Machines Corporation System and method for resource modeling and simulation in test planning
US9176844B2 (en) 2009-09-11 2015-11-03 International Business Machines Corporation System and method to classify automated code inspection services defect output for defect analysis
US9262736B2 (en) 2009-09-11 2016-02-16 International Business Machines Corporation System and method for efficient creation and reconciliation of macro and micro level test plans
US8126987B2 (en) 2009-11-16 2012-02-28 Sony Computer Entertainment Inc. Mediation of content-related services
US8433759B2 (en) 2010-05-24 2013-04-30 Sony Computer Entertainment America Llc Direction-conscious information sharing
US8601441B2 (en) * 2010-07-17 2013-12-03 Accenture Global Services Limited Method and system for evaluating the testing of a software system having a plurality of components
US20120017195A1 (en) * 2010-07-17 2012-01-19 Vikrant Shyamkant Kaulgud Method and System for Evaluating the Testing of a Software System Having a Plurality of Components
US9134997B2 (en) * 2011-09-05 2015-09-15 Infosys Limited Methods for assessing deliverable product quality and devices thereof
US20130061202A1 (en) * 2011-09-05 2013-03-07 Infosys Limited Methods for assessing deliverable product quality and devices thereof
US9213624B2 (en) 2012-05-31 2015-12-15 Microsoft Technology Licensing, Llc Application quality parameter measurement-based development
US20140033176A1 (en) * 2012-07-26 2014-01-30 Infosys Limited Methods for predicting one or more defects in a computer program and devices thereof
US9038030B2 (en) * 2012-07-26 2015-05-19 Infosys Limited Methods for predicting one or more defects in a computer program and devices thereof
US20140033174A1 (en) * 2012-07-29 2014-01-30 International Business Machines Corporation Software bug predicting
US9898262B2 (en) 2013-02-01 2018-02-20 Jpmorgan Chase Bank, N.A. User interface event orchestration
US9720655B1 (en) 2013-02-01 2017-08-01 Jpmorgan Chase Bank, N.A. User interface event orchestration
US10664335B2 (en) 2013-02-01 2020-05-26 Jpmorgan Chase Bank, N.A. System and method for maintaining the health of a machine
US10002041B1 (en) 2013-02-01 2018-06-19 Jpmorgan Chase Bank, N.A. System and method for maintaining the health of a machine
US9537790B1 (en) 2013-02-22 2017-01-03 Jpmorgan Chase Bank, N.A. Breadth-first resource allocation system and methods
US9088459B1 (en) 2013-02-22 2015-07-21 Jpmorgan Chase Bank, N.A. Breadth-first resource allocation system and methods
US9882973B2 (en) 2013-02-22 2018-01-30 Jpmorgan Chase Bank, N.A. Breadth-first resource allocation system and methods
US20140366140A1 (en) * 2013-06-10 2014-12-11 Hewlett-Packard Development Company, L.P. Estimating a quantity of exploitable security vulnerabilities in a release of an application
US9619410B1 (en) 2013-10-03 2017-04-11 Jpmorgan Chase Bank, N.A. Systems and methods for packet switching
US9900267B2 (en) 2013-10-03 2018-02-20 Jpmorgan Chase Bank, N.A. Systems and methods for packet switching
US9542259B1 (en) 2013-12-23 2017-01-10 Jpmorgan Chase Bank, N.A. Automated incident resolution system and method
US10678628B2 (en) 2013-12-23 2020-06-09 Jpmorgan Chase Bank, N.A. Automated incident resolution system and method
US9868054B1 (en) 2014-02-10 2018-01-16 Jpmorgan Chase Bank, N.A. Dynamic game deployment
US10901875B2 (en) * 2015-03-19 2021-01-26 Teachers Insurance And Annuity Association Of America Evaluating and presenting software testing project status indicators
US20190377665A1 (en) * 2015-03-19 2019-12-12 Teachers Insurance And Annuity Association Of America Evaluating and presenting software testing project status indicators
CN104899135A (en) * 2015-05-14 2015-09-09 工业和信息化部电子第五研究所 Software defect prediction method and system
US20190163706A1 (en) * 2015-06-02 2019-05-30 International Business Machines Corporation Ingesting documents using multiple ingestion pipelines
US10318591B2 (en) 2015-06-02 2019-06-11 International Business Machines Corporation Ingesting documents using multiple ingestion pipelines
US10572547B2 (en) * 2015-06-02 2020-02-25 International Business Machines Corporation Ingesting documents using multiple ingestion pipelines
CN106528417A (en) * 2016-10-28 2017-03-22 China Electronic Product Reliability and Environmental Testing Research Institute Intelligent software defect detection method and system
CN107168868A (en) * 2017-04-01 2017-09-15 Xi'an Jiaotong University Software change defect prediction method based on sampling and ensemble learning
CN107133179A (en) * 2017-06-06 2017-09-05 China Electric Power Research Institute Website failure prediction method based on Bayesian networks and implementation system therefor
US11144308B2 (en) 2017-09-15 2021-10-12 Cognizant Technology Solutions India Pvt. Ltd. System and method for predicting defects in a computer program
CN109634833A (en) * 2017-10-09 2019-04-16 Beijing Jingdong Shangke Information Technology Co., Ltd. Software defect prediction method and device
US11151023B2 (en) 2017-11-20 2021-10-19 Cognizant Technology Solutions India Pvt. Ltd. System and method for predicting performance failures in a computer program
US11288065B2 (en) 2018-07-02 2022-03-29 International Business Machines Corporation Devops driven cognitive cost function for software defect prediction
WO2020047173A1 (en) * 2018-08-31 2020-03-05 Procore Technologies, Inc. Computer system and method for predicting risk level of punch items
CN109543707A (en) * 2018-09-29 2019-03-29 Nanjing University of Aeronautics and Astronautics Semi-supervised change-level software defect prediction method based on three-way decisions
US11036613B1 (en) 2020-03-30 2021-06-15 Bank Of America Corporation Regression analysis for software development and management using machine learning
US11144435B1 (en) 2020-03-30 2021-10-12 Bank Of America Corporation Test case generation for software development using machine learning
US11556460B2 (en) 2020-03-30 2023-01-17 Bank Of America Corporation Test case generation for software development using machine learning

Similar Documents

Publication Publication Date Title
US20050071807A1 (en) Methods and systems for predicting software defects in an upcoming software release
Ibbs et al. Quantified impacts of project change
US7788127B1 (en) Forecast model quality index for computer storage capacity planning
US8498887B2 (en) Estimating project size
EP1624397A1 (en) Automatic validation and calibration of transaction-based performance models
CA2707916C (en) Intelligent timesheet assistance
US20030033586A1 (en) Automated system and method for software application quantification
US8818922B2 (en) Method and apparatus for predicting application performance across machines with different hardware configurations
Staron et al. A method for forecasting defect backlog in large streamline software development projects and its industrial evaluation
Adrian et al. Modeling method-productivity
US20120310697A1 (en) Variance management
EP3942416B1 (en) Estimating treatment effect of user interface changes using a state-space model
EP4086824A1 (en) Method for automatically updating unit cost of inspection by using comparison between inspection time and work time of crowdsourcing-based project for generating artificial intelligence training data
US20040024673A1 (en) Method for optimizing the allocation of resources based on market and technology considerations
US20150254584A1 (en) Estimates using historical analysis
CN112905435B (en) Workload assessment method, device, equipment and storage medium based on big data
Stikkel Dynamic model for the system testing process
Gana et al. Statistical modeling applied to managing global 5ESS®‐2000 switch software development
Blocher et al. Updating analytical procedures
Liao et al. Implementation of traffic data quality verification for WIM sites
US9218582B2 (en) Quantifying the quality of trend lines
Kapur et al. A general software reliability growth model for a distributed environment
KR100901357B1 (en) Measuring method of Software maintenance development size and the System thereof
CN115222149A (en) Resource conversion parameter management method, device, equipment and storage medium
Happe et al. Black-box performance models: prediction based on observation

Legal Events

Date Code Title Description
AS Assignment

Owner name: JP MORGAN CHASE BANK, NEW YORK

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:YANAVI, AURA;REEL/FRAME:015510/0376

Effective date: 20040316

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION