US9373144B1 - Diversity analysis with actionable feedback methodologies - Google Patents


Info

Publication number
US9373144B1
Authority
US
United States
Prior art keywords
entities
variables
maximum loss
end user
probable maximum
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
US14/931,510
Other versions
US20160189301A1 (en)
Inventor
George Y. Ng
Philip A. Rosace, III
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guidewire Software Inc
Cyence LLC
Original Assignee
Cyence LLC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority to US14/931,510 priority Critical patent/US9373144B1/en
Application filed by Cyence LLC filed Critical Cyence LLC
Assigned to Cyence Inc. reassignment Cyence Inc. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: NG, GEORGE Y., ROSACE, PHILIP A., III
Priority to US15/099,297 priority patent/US10341376B2/en
Priority to US15/141,779 priority patent/US9521160B2/en
Priority to US15/142,997 priority patent/US9699209B2/en
Publication of US9373144B1 publication Critical patent/US9373144B1/en
Application granted granted Critical
Publication of US20160189301A1 publication Critical patent/US20160189301A1/en
Priority to PCT/US2016/058711 priority patent/WO2017078986A1/en
Priority to US15/371,047 priority patent/US10230764B2/en
Priority to US15/373,298 priority patent/US10050989B2/en
Priority to US15/374,212 priority patent/US10050990B2/en
Priority to US15/457,921 priority patent/US10218736B2/en
Assigned to CYENCE LLC reassignment CYENCE LLC CHANGE OF NAME (SEE DOCUMENT FOR DETAILS). Assignors: CAESAR ACQUISITION SUB II, LLC
Priority to US15/971,909 priority patent/US10491624B2/en
Assigned to CAESAR ACQUISITION SUB II, LLC reassignment CAESAR ACQUISITION SUB II, LLC MERGER (SEE DOCUMENT FOR DETAILS). Assignors: Cyence Inc.
Assigned to GUIDEWIRE SOFTWARE, INC. reassignment GUIDEWIRE SOFTWARE, INC. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: CYENCE LLC
Priority to US15/972,027 priority patent/US10498759B2/en
Priority to US15/971,946 priority patent/US10511635B2/en
Priority to US16/582,977 priority patent/US11146585B2/en
Priority to US16/662,936 priority patent/US11153349B2/en
Priority to US17/465,739 priority patent/US11855768B2/en
Priority to US17/477,294 priority patent/US11863590B2/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q40/00Finance; Insurance; Tax strategies; Processing of corporate or income taxes
    • G06Q40/06Asset management; Financial planning or analysis
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L63/00Network architectures or network communication protocols for network security
    • H04L63/14Network architectures or network communication protocols for network security for detecting or protecting against malicious traffic
    • H04L63/1433Vulnerability analysis

Definitions

  • the present technology relates generally to systems and methods for determining metrics, such as diversity or similarity, between entities and the application of those metrics as actionable feedback loops which can be used to increase diversity or reduce similarity amongst groups of entities.
  • metrics may relate to diversity of aggregate cyber security risk for use in planning or filtering new entities so as to increase that diversity.
  • Embodiments of the present technology include a method, comprising: (a) for each of a plurality of entities, receiving a set of variables that are indicative of attributes of an entity; (b) comparing the sets of variables for the plurality of entities to each other; (c) locating clusters of similar variables shared between two or more of the plurality of entities; (d) determining a probable maximum loss for the plurality of entities that share the clusters, the probable maximum loss comprising a loss value attributed to a cyber event against one or more of the shared variables; (e) receiving, using a processor coupled to a memory, feedback from an end user in response to providing the probable maximum loss to the end user; (f) automatically converting, using the processor and in response to the feedback, the probable maximum loss to an updated probable maximum loss; and (g) providing one or more actions to the end user based on the probable maximum loss and the clusters of similar variables, enactment by the end user of at least one of the one or more actions decreasing the probable maximum loss.
  • Other embodiments of the present technology include a method, comprising: (a) for each of a plurality of entities, receiving a set of variables that are indicative of attributes of an entity; (b) receiving from an end user a selection of variables that comprise a subset of the set of variables that the end user desires to analyze; (c) comparing the subsets of variables for the plurality of entities to each other; (d) locating clusters of similar variables shared between two or more of the plurality of entities; (e) determining a probable maximum loss for the plurality of entities that share the clusters, the probable maximum loss comprising a loss value attributed to a cyber event against one or more of the shared variables; and (f) providing, using a processor coupled to a memory, one or more actions to the end user based on the probable maximum loss and the clusters of similar variables, enactment by the end user of at least one of the one or more actions decreasing the probable maximum loss.
  • Additional embodiments of the present technology include a system, comprising: (a) a processor; and (b) a memory for storing executable instructions, the processor executing the instructions to: (i) compare sets of variables for a plurality of entities to each other, each of the sets of variables being indicative of attributes of an entity; (ii) locate clusters of similar variables shared between two or more of the plurality of entities; (iii) calculate a probable maximum loss for the plurality of entities that share the clusters by: (1) determining singular vulnerabilities of each of the plurality of entities; (2) determining vulnerabilities of pairs of the plurality of entities; and (3) performing a randomized scenario analysis using the singular and pairwise vulnerabilities, the probable maximum loss comprising a loss value; (iv) receive, using the processor, feedback from an end user in response to providing the probable maximum loss to the end user; (v) automatically convert, using the processor and in response to the feedback, the probable maximum loss to an updated probable maximum loss; and (vi) provide one or more actions to the end user based on the probable maximum loss and the clusters of similar variables, enactment by the end user of at least one of the one or more actions decreasing the probable maximum loss.
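The randomized scenario analysis of step (iii) can be sketched as a small Monte Carlo simulation over singular (per-entity) and pairwise vulnerabilities. The entity names, probabilities, loss amounts, and coupling strength below are illustrative assumptions, not values from the specification:

```python
import random

# Singular vulnerabilities: per-entity (event probability, loss value).
singular = {"ent_a": (0.05, 1.0e6), "ent_b": (0.05, 2.0e6), "ent_c": (0.02, 5.0e5)}
# Pairwise vulnerability: coupling (0..1) between entities that share
# infrastructure, e.g. the same CDN.
pairwise = {("ent_a", "ent_b"): 0.8}

def simulate_losses(n_trials=20000, seed=7):
    """Randomized scenario analysis: sample cyber events per trial and
    propagate them across coupled pairs, returning sorted portfolio losses."""
    rng = random.Random(seed)
    losses = []
    for _ in range(n_trials):
        hit = {e: rng.random() < p for e, (p, _) in singular.items()}
        # If one member of a coupled pair is hit, the other follows with
        # probability equal to the coupling strength.
        for (a, b), coupling in pairwise.items():
            if hit[a] and not hit[b] and rng.random() < coupling:
                hit[b] = True
            elif hit[b] and not hit[a] and rng.random() < coupling:
                hit[a] = True
        losses.append(sum(loss for e, (_, loss) in singular.items() if hit[e]))
    return sorted(losses)

def probable_maximum_loss(losses, percentile=0.99):
    """Read the PML off as a high percentile of the simulated distribution."""
    return losses[int(percentile * (len(losses) - 1))]
```

The stronger the pairwise coupling (i.e., the lower the diversity), the fatter the tail of the loss distribution and the higher the resulting PML.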
  • FIG. 1 is a high level schematic diagram of computing architecture for practicing aspects of the present technology.
  • FIG. 2 is a flowchart of an example method for determining entity diversity.
  • FIG. 3 is a flowchart of an example action and feedback loop method for updating a diversity score and improving client diversity.
  • FIG. 4 is a flowchart of a method for analyzing a new client's impact on an existing diversity calculation.
  • FIG. 5 is a schematic diagram of a computing system that is used to implement embodiments according to the present technology.
  • FIG. 6 is a flowchart of an example method for determining a probable maximum loss value for a portfolio of entities.
  • FIG. 7 is a flowchart of another example method for determining a probable maximum loss value for a portfolio of entities.
  • FIG. 8 is a diagram illustrating an example pairwise correlation matrix and a corresponding example covariance matrix.
  • FIG. 9 is a diagram illustrating an example distribution with markings showing the corresponding values in an example PML table.
  • Various embodiments of the present technology are directed to systems and methods for determining diversity and/or similarity between entities with respect to risk, e.g., cyber security risk, and the utilization of these metrics in various ways to improve diversity between the analyzed entities.
  • an insurer may desire to understand the diversity of their insured entities with respect to aggregate cyber risk and utilize a measure of diversity to prevent too much similarity between insured entities, and/or to compare their diversity to their industry peers.
  • reinsurers, rating agencies, and/or insurance brokers may also utilize the present technology. For example, reinsurers may want to compare one insurer's portfolio to another's when deciding what to buy, invest in, and/or cover.
  • Brokers may wish to review their portfolio of clients, and rating agencies may review an insurer's portfolio and use it in rating the financial strength of the insurer.
  • cyber insurance and other insurance risks can be a function of similarity.
  • a loss e.g., due to a cyber attack, by one of these insured entities might imply that other insured entities having similar attributes will also experience a loss.
  • a plurality of web hosting providers may source their servers from the same company. A cyber attack on that company's servers may equally affect all of the web hosting providers that use them, and consequently affect an insured that utilizes one of those web hosting providers to host the insured's website and other web services.
  • an end user may determine similar attributes shared between pluralities of entities. These shared attributes can be aggregated into clusters to locate groups of entities with shared attributes.
  • entities use the same content delivery network (CDN), the same cloud service provider, a similar website traffic profile, have overlapping executives, and report similar revenue. While these entities may also share attributes with other entities, these attributes are used in various embodiments to create a cluster or grouping of entities that, when considered in the aggregate, have a low diversity score due to the similarities in this example.
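One way to form such commonality clusters is to bucket entities by each shared (variable, value) pair and keep any bucket with two or more members. The entity names and attribute values below are hypothetical:

```python
from collections import defaultdict

# Hypothetical entity records; variable names and values are illustrative.
portfolio = {
    "ent_a": {"cdn": "FastCDN", "cloud": "CloudOne", "traffic": "high"},
    "ent_b": {"cdn": "FastCDN", "cloud": "CloudOne", "traffic": "high"},
    "ent_c": {"cdn": "OtherCDN", "cloud": "CloudTwo", "traffic": "low"},
}

def commonality_clusters(portfolio, min_size=2):
    """Group entities by each (variable, value) pair; keep groups of at
    least `min_size` entities as commonality clusters."""
    buckets = defaultdict(set)
    for name, attrs in portfolio.items():
        for var, val in attrs.items():
            buckets[(var, val)].add(name)
    return {k: v for k, v in buckets.items() if len(v) >= min_size}

clusters = commonality_clusters(portfolio)
# ent_a and ent_b cluster together on every shared attribute, e.g.
# clusters[("cdn", "FastCDN")] == {"ent_a", "ent_b"}
```

Entities appearing in many of the same clusters would, in the aggregate, drive a low diversity score.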
  • End users may use the present technology to learn their aggregate cyber risk compared to industry peers and use that information to, for example, screen potential target entities for inclusion into a group based upon how the potential addition of their attributes to the group would affect the diversity score for the end user's collection of entities.
  • the system may, instead of or in addition to outputting a diversity or clustering score, output a different value analyzing the entities, for example, a probable maximum loss (PML) and/or an expected portfolio value.
  • the present technology can be used to analyze diversity/similarity between many entities.
  • the diversity/similarity analyses can use hundreds and even thousands of attributes, looking for diversity or commonality therebetween.
  • the end user can adjust the attributes and/or select which attributes are important to them and the system will analyze only these attributes when determining diversity, e.g., a diversity score for aggregate cyber risk.
  • FIG. 1 is a high level schematic diagram of a computing architecture (hereinafter architecture 100 ) of the present technology.
  • the architecture 100 comprises a diversity analysis system 105 (hereinafter also referred to as system 105 ), which in some embodiments comprises a server or cloud-based computing device configured specifically to perform the diversity analyses described herein. That is, the system 105 is a particular purpose computing device that is specifically designed and programmed (e.g., configured or adapted) to perform any of the methods described herein.
  • the system 105 can be coupled with an end user device 105 A, such as a computer, tablet, smartphone, or other similar end user computing device. End users can interact with the system 105 using their end user device 105 A.
  • the end user device 105 A and system 105 can be coupled using a network 105 B.
  • a suitable network 105 B may include or interface with any one or more of, for instance, a local intranet, a PAN (Personal Area Network), a LAN (Local Area Network), a WAN (Wide Area Network), a MAN (Metropolitan Area Network), a virtual private network (VPN), a storage area network (SAN), a frame relay connection, an Advanced Intelligent Network (AIN) connection, a synchronous optical network (SONET) connection, a digital T1, T3, E1 or E3 line, Digital Data Service (DDS) connection, DSL (Digital Subscriber Line) connection, an Ethernet connection, an ISDN (Integrated Services Digital Network) line, a dial-up port such as a V.90, V.34 or V.34bis analog modem connection, a cable modem, an ATM (Asynchronous Transfer Mode) connection, or an FDDI (Fiber Distributed Data Interface) or CDDI (Copper Distributed Data Interface) connection.
  • communications may also include links to any of a variety of wireless networks, including WAP (Wireless Application Protocol), GPRS (General Packet Radio Service), GSM (Global System for Mobile Communication), CDMA (Code Division Multiple Access) or TDMA (Time Division Multiple Access), cellular phone networks, GPS (Global Positioning System), CDPD (cellular digital packet data), RIM (Research in Motion, Limited) duplex paging network, Bluetooth radio, or an IEEE 802.11-based radio frequency network.
  • the system 105 comprises a processor 110 and memory 115 for storing instructions.
  • the memory 115 can include an attribute module 120 , a comparator module 125 , a clustering module 130 , a weighting module 135 and a recommendation module 140 .
  • the term “module” may also refer to any of an application-specific integrated circuit (“ASIC”), an electronic circuit, a processor (shared, dedicated, or group) that executes one or more software or firmware programs, a combinational logic circuit, and/or other suitable components that provide the described functionality.
  • the diversity analyses begin with input for the attribute module 120 .
  • a set of variables that are indicative of attributes of an entity may be input into the attribute module 120 .
  • the variables can include technologies a company might employ (e.g., internally and externally for Internet communication such as e-mail, website, and social media online presence) such as CDN provider, cloud service provider, server type, OS type, visitor traffic knowledge, customer profiles, as well as other non-technical information such as revenue, number of employees, years in business, and so forth.
  • the breadth and type of variables that can be analyzed and correlated are unlimited.
  • the breadth and type of variables that can be analyzed and correlated for the company and for their industry peers, for comparison, may be limited by the breadth and type of information available from online sources concerning the same. Again, an end user can define or specify the types of variables that are of interest to them.
  • the insurer may desire to know how diverse their insured entities are with respect to cyber security risk relative to a wide and divergent set of variables.
  • interest in such diversity may be only in technological variables such as traffic, page views, bandwidth, and other variables related to cyber risk.
  • the system 105 can use the dashboard to display messages or notifications as well as diversity scores, similarity scores, and/or recommendations.
  • the system may gather variables for an entity by querying the entity for information, scraping available online sources such as websites, corporate filings, news sources, other public record databases and resources. Additionally, data may be gathered from the entity's network using devices already present there or by placing a new device on the entity's network to gather more data.
  • the data collecting device may be a server, router, firewall, switch, or repeater, or may be a software agent or routine that monitors traffic and/or performs packet inspection.
  • the data collecting device may be on the company's network and/or its periphery, and may collect and/or analyze the data, while also transmitting it to system 105 . In this manner, additional, proprietary data may be gleaned from a particular entity's network.
  • the variables are input into the attribute module 120 .
  • the attribute module 120 can format or normalize the input as needed for consistency.
  • the comparator module 125 is executed to perform a variable comparison on all or a subset of the variables.
  • the comparison can be for all or only a subset of all entities.
  • the subset of variables can be selected by the end user, as well as the entities analyzed.
  • the comparator module 125 is configured to identify variables shared between entities or groups of entities. The implications of this analysis are multifaceted. For instance, the same variable can be shared between many entities, which leads to an inference that the particular variable might be problematic. Such widespread commonality represents a more pointed, granular lack of diversity.
  • Localized commonality can be found among small groups of entities (even between just two). This type of similarity can be inferred to be less problematic than the more prolific examples provided above, where similarity exists among numerous entities.
  • the comparator module 125 can cooperate with the clustering module 130 to create commonality clusters (e.g., various clusters of commonly shared variables). In one embodiment, if five entities are being analyzed, many different clusters can be identified. By example, if variables A-D are being analyzed with respect to entities 1-5, the comparator module 125 finds commonality between entities 1 and 3 with respect to variables B and C. Also, the comparator module 125 finds commonality between entities 1-5 with respect to variable A. Other similar correlations can be found.
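The A-D example above can be reproduced with a short sketch. The attribute values are made up so that all five entities share variable A, while entities 1 and 3 additionally share variables B and C:

```python
from itertools import combinations

# Hypothetical values for variables A-D across entities 1-5.
entities = {
    1: {"A": "cdn_x", "B": "cloud_y", "C": "os_z", "D": "rev_1"},
    2: {"A": "cdn_x", "B": "cloud_q", "C": "os_w", "D": "rev_2"},
    3: {"A": "cdn_x", "B": "cloud_y", "C": "os_z", "D": "rev_3"},
    4: {"A": "cdn_x", "B": "cloud_r", "C": "os_v", "D": "rev_4"},
    5: {"A": "cdn_x", "B": "cloud_s", "C": "os_u", "D": "rev_5"},
}

def shared_variables(e1, e2):
    """Variables on which two entities hold the same value."""
    return {v for v in e1 if v in e2 and e1[v] == e2[v]}

# Pairwise commonality: which variables each pair of entities shares.
pairwise = {
    (a, b): shared_variables(entities[a], entities[b])
    for a, b in combinations(sorted(entities), 2)
}

# Variables on which all five entities hold the same value.
common_to_all = {
    v for v in entities[1]
    if len({attrs[v] for attrs in entities.values()}) == 1
}
```

Here `pairwise[(1, 3)]` contains B and C (plus the universally shared A), and `common_to_all` contains only A, matching the commonality described in the text.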
  • the clustering module 130 can display to the end user these commonality clusters, which indicate areas of non-diversity. Also, these commonality clusters can be utilized by the recommendation module 140 to create action items for the end user that if enacted would change the diversity score. Details regarding the diversity score are found in greater detail below.
  • the comparator module 125 creates a diversity score or index. This diversity score represents how dissimilar the analyzed group of entities is relative to one another in view of their variables.
  • the diversity score can include a percentage of the overall number of compared variables that are dissimilar to those that are shared.
  • the diversity score can be represented variously as a fraction, a decimal, or a percentage, and may be included in the graphical user interface (e.g., the dashboard). Additionally, or alternatively, the diversity score may be normalized into a number within a user-defined, or predefined, range, similar to a credit score.
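Under the "percentage of compared variables that are dissimilar" reading above, a minimal scoring sketch might look as follows; the pairwise-comparison formulation and the credit-score-like normalization bounds are assumptions for illustration:

```python
from itertools import combinations

def diversity_score(entity_variables):
    """Diversity as the fraction of pairwise variable comparisons whose
    values differ -- one possible reading of the score described above."""
    total = differing = 0
    for a, b in combinations(entity_variables, 2):
        for var in a.keys() & b.keys():
            total += 1
            differing += a[var] != b[var]
    return differing / total if total else 1.0

def normalize(score, lo=300, hi=850):
    """Map the 0-1 score into a credit-score-like range (bounds assumed)."""
    return round(lo + score * (hi - lo))
```

Two entities that share a CDN but differ on cloud provider would score 0.5 (half of their compared variables are dissimilar), which normalizes to the midpoint of the assumed range.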
  • the comparator module 125 can cooperate with the weighting module 135 to apply a weighting to one or more variables.
  • the weighting is selected by an end user such as an insurer. For example, an insurer determines that industry serviced, gross revenue, and customer country of origin are important variables to analyze, e.g., for assessing individual and aggregate cyber risk. For instance, if the insurer knows that gross revenue is very important to the calculation, the insurer can specify that the gross revenue variable is to be given greater importance in the analysis than other variables. In another example, the insurer can assign a weight to each variable based upon importance.
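Under a pairwise-dissimilarity reading of the diversity score, end-user weights can be folded in directly, with a weight of zero effectively excluding a variable from the calculation. The variable names and weight values below are illustrative assumptions:

```python
from itertools import combinations

def weighted_diversity_score(entity_variables, weights):
    """Pairwise dissimilarity where each variable counts with an
    end-user-assigned weight; weight 0 drops the variable entirely."""
    total = differing = 0.0
    for a, b in combinations(entity_variables, 2):
        for var in a.keys() & b.keys():
            w = weights.get(var, 1.0)  # unlisted variables default to 1
            total += w
            differing += w * (a[var] != b[var])
    return differing / total if total else 1.0
```

For example, an insurer who considers gross revenue three times as important as CDN choice would pass `{"revenue": 3.0, "cdn": 1.0}`, so a revenue mismatch moves the score three times as far as a CDN mismatch.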
  • the system can determine weightings and variables based on industry knowledge acquired, and use machine learning, big data and other “tools” to make an “educated” determination.
  • the weighting of variables can also be determined by the system 105 based on information such as actuarial data, industry practices, or other rules established by end users but which are intended to be applied by default.
  • the selection of a weighting schema by the system 105 can be based on variables for the entities. For example, if the system 105 determines that the entities are all physicians, the system 105 can select weightings that are appropriate for medical practices or hospitals. Such determinations by the system may be adjusted and/or otherwise specified by the end user (e.g., using the dashboard) to tailor them for their particular circumstances, preferences, or other factors.
  • the diversity score can be represented as a diversity graph that illustrates the connection between entities.
  • Entities can be graphically connected based on commonality of variables between entities. For example, certain entities may be connected as being banks that present particularly enticing targets for cyber criminals and thus particular cyber risks.
  • the recommendation module 140 can be executed to provide the end user with some type of actionable feedback. For example, the recommendation module 140 can provide one or more actions to the end user based on the diversity score and the clusters of similar variables. These one or more actions potentially increase the diversity score if enacted by the end user.
  • the recommendation module 140 can automatically identify variables which, if changed, would affect the diversity score. For example, if the entities are or utilize technology company service providers that use a particular CDN, the recommendation module 140 can output a recommendation that diversification in this area would be beneficial. The end user can alert the entities and encourage them to explore other options for CDNs. If the end user is an insurer, for example, the insurer can encourage this change by offering rate concessions to the insured entities. Various embodiments of the system thus may automatically provide the diversity score or other information regarding diversity to the end user, who can utilize it to encourage or effect various changes.
  • the diversity score might also be used to inform the insurer as to which policies should be renewed and which should not. For example, if a potential new (target) entity presents an unacceptable cyber risk based on the diversity analysis, the insurer may choose not to offer the entity a policy, or to provide the policy at a rate commensurate with the risk.
  • the recommendation module 140 can identify problematic common variables that negatively impact diversity scores.
  • the recommendation module 140 may identify shared infrastructure such as CDNs and cloud service providers as particularly problematic variables that are commonly shared between several entities.
  • the recommendation module 140 can also identify network traffic, network traffic patterns, firewalls, firewall policies that are commonly shared. Changing these shared variables would likely increase the diversity score for these entities.
  • the recommendation module 140 can determine key variables that if changed would negatively affect a diversity score. The recommendation module 140 can identify these variables to the end user as desirable.
  • Actions that could be taken in response to this information could include a project plan that specifies that the insurer is to find new customers that do not share these problematic variables.
  • the project plan could also or alternatively specify that the insurer is to find new customers that do share key positive variables.
  • an action includes the recommendation module 140 creating and providing the end user with a variable profile of a target entity that when added to the plurality of entities increases the diversity score.
  • the recommendation module 140 could create a profile for a prototypical new client that is in a different technology sector or a completely different industry sector.
  • the recommendation module 140 could create a profile for a prototypical new client that includes desirable variables, rather than merely a client that excludes certain disfavored variables.
  • the recommendation module 140 can provide the end user with a list of entities of the plurality of entities that are lowering the diversity score.
  • certain clusters of variables may be found in common between entities. Certain ones of these clusters may have more of a negative impact on the diversity score than others. For example, commonality between headquarters or domicile may have no impact on the diversity score, even if this variable is shared in common between several entities. On the other hand, commonality in gross revenue or average employee age may have a drastic impact on the diversity score for one reason or another. To be sure, commonality of a variable(s) does not always negatively affect the end user or the end user's business. In these instances the commonality can be ignored or weighted so as not to affect the calculated diversity score.
  • the recommendation module 140 can provide the end user with a list of entities of the plurality of entities that, if lost, would lower the diversity score, which can prompt the end user to take action to avoid losing them.
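Both lists (entities lowering the score, and entities whose loss would lower it) can be derived from a leave-one-out analysis: recompute the score with each entity removed and compare against the baseline. This sketch assumes the pairwise-dissimilarity form of the score:

```python
from itertools import combinations

def diversity_score(portfolio):
    """Fraction of pairwise variable comparisons whose values differ
    (one possible reading of the diversity score)."""
    total = differing = 0
    for a, b in combinations(portfolio.values(), 2):
        for var in a.keys() & b.keys():
            total += 1
            differing += a[var] != b[var]
    return differing / total if total else 1.0

def score_lowering_entities(portfolio):
    """Leave-one-out analysis: entities whose removal would raise the
    group's diversity score are the ones currently lowering it."""
    base = diversity_score(portfolio)
    lowering = []
    for name in portfolio:
        rest = {k: v for k, v in portfolio.items() if k != name}
        if diversity_score(rest) > base:
            lowering.append(name)
    return lowering
```

In a portfolio where two banks share a CDN and a third entity does not, the leave-one-out check flags both banks as score-lowering, since removing either raises the remaining group's diversity.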
  • As another action, the recommendation module 140 can compare a variable profile for a new entity to determine whether the addition of the new entity to the analysis will negatively or positively impact the diversity score of the group.
  • the attribute module 120 can receive a variable profile for a new entity and parse out the variables which are indicative of attributes of the new entity.
  • This profile could include an application form, a survey, or any other content that is capable of conveying variables.
  • the comparator module 125 adds a set of variables of the new entity to the comparison described above and repeats the calculation of the diversity score.
  • the recommendation module 140 can alert the end user if the addition of the new entity decreases the diversity score.
  • the recommendation module 140 can alert the end user if the addition of the new entity increases the diversity score as well.
  • the recommendation module 140 updates the diversity score based on feedback received from the end user. For example, if the end user wants to view how the addition of a proposed new client will affect an existing diversity score, the profile for the new client is added to the system and the variables for the new client are processed and added to the comparison process. A new or updated diversity score is calculated and displayed to the end user.
  • the difference between the new diversity score and the old diversity score is expressed as a diversity delta.
  • the system 105 can apply thresholds to the diversity delta to determine if a proposed change to the entity grouping is sufficient to warrant the proposed change. For example, the system 105 may require at least a net change or diversity delta of 20%. Other percentages can also be utilized.
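The delta-and-threshold check can be sketched as follows. The 20% figure mirrors the example above; in practice it would be a configurable parameter, and the delta is computed here as a relative change (an assumption, since the text does not fix the formula):

```python
def diversity_delta(old_score, new_score):
    """Relative change in the diversity score when the grouping changes."""
    return (new_score - old_score) / old_score

def warrants_change(old_score, new_score, threshold=0.20):
    """Apply the example 20% net-change threshold to a proposed change,
    such as adding a prospective new client to the group."""
    return diversity_delta(old_score, new_score) >= threshold
```

A proposed client that lifts the score from 0.50 to 0.65 (a 30% delta) would clear the threshold, while a lift to 0.55 (10%) would not.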
  • the present technology provides information related to the updated information (the new diversity score), including differences (the amount of the change made in one or more updates, namely the delta), and trends (patterns over many time steps).
  • the present technology also provides attribution information when a diversity score changes.
  • the methods and system indicate to a user why the score has changed, namely what exactly has changed in the underlying data sets to effect that higher level score change.
  • the systems and methods of the present technology provide detailed information to the user to identify the changed data, and thereby understand the positive and negative impacts of the user's actions on the diversity score.
  • the system 105 can also build an entity portfolio for an end user with knowledge gained from an analysis of variables for a plurality of entities. For instance, the system 105 can create a report that informs the end user as to how many and what type of entities a portfolio should have to be balanced in terms of diversity, e.g., with respect to cyber risk. For example, the report may indicate that an insurer should have a certain percentage of clients in the banking sector, a certain percentage in the technology sector, and a certain percentage in the medical industry. These sectors of the portfolio are deduced by comparing variables for various entities in a given industry that lead to a suitable diversity score.
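The sector breakdown in such a balance report reduces to tallying the share of entities per industry sector. The `sector` variable name is an assumption; any categorical variable from the entity profiles could be summarized the same way:

```python
from collections import Counter

def sector_mix(portfolio):
    """Share of entities per industry sector, for the balance report."""
    counts = Counter(attrs["sector"] for attrs in portfolio.values())
    n = len(portfolio)
    return {sector: count / n for sector, count in counts.items()}
```

Comparing the reported mix against a target mix (derived from groupings that achieved a suitable diversity score) would tell the insurer which sectors are over- or under-represented.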
  • the diversity score can be counterbalanced by other factors such as revenue for the end user. That is, an insurer may be more likely to accept a lower diversity score from a group of entities that pay higher premiums or a group of entities that is at least partially self-insured.
  • FIG. 2 is a flowchart 200 of an example method that is executed by the system, in accordance with the present technology.
  • the method includes the system 105 (for each of a plurality of entities), receiving 205 a set of variables that are indicative of attributes of an entity.
  • These variables can include any number or type of variables that represent the attributes of the entity.
  • variables are collected for numerous entities that may belong, in some embodiments, to a particular class or group.
  • the entities could include all employees in a company, all insured customers of an insurance agency, investors in a mutual fund, or other groups.
  • the method includes the system 105 comparing 210 the sets of variables for the plurality of entities to each other and locating 215 clusters of similar variables shared between two or more of the plurality of entities.
  • the method includes the system 105 clustering 220 common variables and identifying the entities that share the common variables. These clusters are indicative of non-diversity between these entities.
  • the method includes the system 105 calculating 225 a diversity score that represents how different the plurality of entities are to one another based on variables that are not shared between the plurality of entities.
  • This diversity is directly related to the commonality discovered above. As a general rule, the more variables the entities share in common, the less diverse the entities are relative to one another. Again, as mentioned above, some variables will have little to no impact on diversity, as dictated by weighting or variable selection by the end user. For example, if a commonly shared variable is not included in the diversity calculation by the end user, that variable has no impact on the diversity score.
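A minimal sketch of how such a score might be computed follows. It treats each entity as a dictionary of attribute variables and scores diversity as one minus the average pairwise overlap, honoring the end user's variable selection so that excluded variables have no impact. The function name, the 0–100 scale, and the sample clients are illustrative assumptions, not part of the patented method.

```python
# Illustrative diversity-score sketch: entities are attribute dictionaries,
# and `selected` optionally restricts the calculation to end-user-chosen variables.
from itertools import combinations

def diversity_score(entities, selected=None):
    """Return a 0-100 score: 100 = fully diverse, 0 = identical entities."""
    pairs = list(combinations(entities, 2))
    if not pairs:
        return 100.0
    overlaps = []
    for a, b in pairs:
        keys = set(a) | set(b)
        if selected is not None:       # end-user variable selection:
            keys &= selected           # excluded variables have no impact
        if not keys:
            overlaps.append(0.0)
            continue
        shared = sum(1 for k in keys if a.get(k) == b.get(k))
        overlaps.append(shared / len(keys))
    # more commonly shared variables -> less diversity, per the rule above
    return round(100.0 * (1.0 - sum(overlaps) / len(overlaps)), 1)

clients = [
    {"sector": "banking", "cdn": "X", "cloud": "Y"},
    {"sector": "banking", "cdn": "X", "cloud": "Z"},
    {"sector": "tech",    "cdn": "W", "cloud": "Q"},
]
print(diversity_score(clients))                    # → 77.8 (all variables)
print(diversity_score(clients, selected={"cdn"}))  # → 66.7 (CDN only)
```

Two banking clients on the same CDN pull the score down; restricting the calculation to a single shared variable changes the result, mirroring the end-user weighting/selection described above.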
  • the method includes the system 105 receiving 230 feedback from an end user in response to providing the diversity score to the end user. Also, the method includes the system 105 updating 235 the diversity score in response to the feedback.
  • the feedback can take the form of a suggestion, option, report, or other output that is actionable by the end user.
  • Exemplary methods and systems according to the present technology may also provide benchmarking over time. In this manner, an insurance company or other entity tracking aggregate cyber risk may track their diversity score over an adjustable time period, for example days, weeks, months, and/or years.
  • the flowchart 300 illustrates the method including the system 105 providing, at 305, one or more actions/suggestions to the end user based on the diversity score and the clusters of similar variables. These actions can potentially increase the diversity score if enacted by the end user.
  • an action includes providing the end user with a variable profile of a target entity that when added to the plurality of entities increases the diversity score.
  • an action includes providing the end user with a list of entities of the plurality of entities that are lowering the diversity score.
  • an action includes providing the end user with a list of entities of the plurality of entities that, if lost, would lower the diversity score.
  • the feedback is used in calculating, at step 325 , an updated diversity score and delivering, at step 330 , the updated diversity score to the end user.
  • FIG. 4 is a flowchart 400 of a new entity analysis method.
  • the system is utilized to compare the variables of a new entity to an existing diversity analysis.
  • an insurer desires to determine how the addition of this new entity will affect the diversity of an existing client base.
  • This aggregate risk analysis can be used to ensure that diversity is maintained or increased when a new client is added to an existing pool of clients.
  • the method includes receiving 405 a variable profile for a new entity.
  • the variable profile either includes a set of variables or a set of variables is deduced from the variable profile.
  • the variable profile can include an application form, a resume, a corporate filing such as a tax return, or any other document that includes attributes of an entity.
  • the method includes adding 410 the set of variables of the new entity to the variables of the previously analyzed entities and performing 415 an updated comparison of variables.
  • the method includes generating 420 an updated diversity score calculation.
  • the method includes alerting 425 the end user if the addition of the new entity decreases (or increases) the diversity score.
  • the end user can decide to accept or reject this new client based upon how the client affects the diversity score.
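The FIG. 4 flow could be sketched as follows: add a candidate's variables to the existing pool, recompute the score, and report whether the candidate increases or decreases diversity. A simple set-overlap score is inlined purely for illustration; the entity sets, attribute names, and the accept/reject verdict string are assumptions.

```python
# Hypothetical sketch of the new-entity analysis (FIG. 4): recompute a
# pairwise-overlap diversity score before and after adding a candidate.
from itertools import combinations

def score(entities):
    """Diversity of a pool of variable sets: 100 = fully diverse."""
    pairs = list(combinations(entities, 2))
    if not pairs:
        return 100.0
    avg_overlap = sum(len(a & b) / len(a | b) for a, b in pairs) / len(pairs)
    return 100.0 * (1.0 - avg_overlap)

def evaluate_candidate(pool, candidate):
    before = score(pool)
    after = score(pool + [candidate])
    verdict = "increases" if after > before else "decreases"
    return before, after, verdict

pool = [{"aws", "cdn_x", "bank"}, {"aws", "cdn_y", "tech"}]
# candidate identical to an existing client lowers diversity
before, after, verdict = evaluate_candidate(pool, {"aws", "cdn_x", "bank"})
print(f"{before:.1f} -> {after:.1f}: candidate {verdict} diversity")
```

The alert at step 425 then reduces to comparing `after` against `before`, and the end user accepts or rejects the client accordingly.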
  • the present technology can be used in scenarios where diversity of clientele is desirable.
  • the present technology can perform diversity analyses on potentially thousands of attributes across countless entities in ways that would be impossible to accomplish absent the use of the diversity analysis system.
  • the diversity analyses of the present technology can bring clarity to business planning and project management, where integration of new clients/entities may affect the diversity of a current client base (either positively or negatively).
  • the present technology provides a means for facilitating and maintaining this diversity in a way that is actionable and usable to the end user. That is, the present technology provides a way for end users to mitigate risk through diversification of their customer base or however diversity impacts their particular business or operations.
  • FIG. 5 is a diagrammatic representation of an example machine in the form of a computer system 1 , within which a set of instructions for causing the machine to perform any one or more of the methodologies discussed herein may be executed.
  • the machine operates as a standalone device or may be connected (e.g., networked) to other machines.
  • the machine may operate in the capacity of a server or a client machine in a server-client network environment, or as a peer machine in a peer-to-peer (or distributed) network environment.
  • the machine may be a personal computer (PC), a tablet PC, a set-top box (STB), a personal digital assistant (PDA), a cellular telephone, a portable music player (e.g., a portable hard drive audio device such as a Moving Picture Experts Group Audio Layer 3 (MP3) player), a web appliance, a network router, switch or bridge, or any machine capable of executing a set of instructions (sequential or otherwise) that specify actions to be taken by that machine.
  • machine shall also be taken to include any collection of machines that individually or jointly execute a set (or multiple sets) of instructions to perform any one or more of the methodologies discussed herein.
  • the example computer system 1 includes a processor or multiple processor(s) 5 (e.g., a central processing unit (CPU), a graphics processing unit (GPU), or both), and a main memory 10 and static memory 15 , which communicate with each other via a bus 20 .
  • the computer system 1 may further include a video display 35 (e.g., a liquid crystal display (LCD)).
  • the computer system 1 may also include an alpha-numeric input device(s) 30 (e.g., a keyboard), a cursor control device (e.g., a mouse), a voice recognition or biometric verification unit (not shown), a drive unit 37 (also referred to as disk drive unit), a signal generation device 40 (e.g., a speaker), and a network interface device 45 .
  • the computer system 1 may further include a data encryption module (not shown) to encrypt data.
  • the disk drive unit 37 includes a computer or machine-readable medium 50 on which is stored one or more sets of instructions and data structures (e.g., instructions 55 ) embodying or utilizing any one or more of the methodologies or functions described herein.
  • the instructions 55 may also reside, completely or at least partially, within the main memory 10 and/or within the processor(s) 5 during execution thereof by the computer system 1 .
  • the main memory 10 and the processor(s) 5 may also constitute machine-readable media.
  • the instructions 55 may further be transmitted or received over a network (see e.g. 105 B in FIG. 1 ) via the network interface device 45 utilizing any one of a number of well-known transfer protocols (e.g., Hyper Text Transfer Protocol (HTTP)).
  • While the machine-readable medium 50 is shown in an example embodiment to be a single medium, the term “computer-readable medium” should be taken to include a single medium or multiple media (e.g., a centralized or distributed database and/or associated caches and servers) that store the one or more sets of instructions.
  • computer-readable medium shall also be taken to include any medium that is capable of storing, encoding, or carrying a set of instructions for execution by the machine and that causes the machine to perform any one or more of the methodologies of the present application, or that is capable of storing, encoding, or carrying data structures utilized by or associated with such a set of instructions.
  • the term “computer-readable medium” shall accordingly be taken to include, but not be limited to, solid-state memories, optical and magnetic media, and carrier wave signals. Such media may also include, without limitation, hard disks, floppy disks, flash memory cards, digital video disks, random access memory (RAM), read only memory (ROM), and the like.
  • the example embodiments described herein may be implemented in an operating environment comprising software installed on a computer, in hardware, or in a combination of software and hardware.
  • the Internet service may be configured to provide Internet access to one or more computing devices that are coupled to the Internet service, and the computing devices may include one or more processors, buses, memory devices, display devices, input/output devices, and the like.
  • the Internet service may be coupled to one or more databases, repositories, servers, and the like, which may be utilized in order to implement any of the embodiments of the disclosure as described herein.
  • the present technology can be utilized to calculate a probable maximum loss (PML) for a portfolio of companies in a cyber risk pool.
  • the PML value can include a loss value calculated against various probability likelihood values.
  • a PML probability likelihood value could include 1/10, 1/20, 1/100, or 1/1,000.
  • other likelihood values can likewise be utilized in some embodiments. The smaller the likelihood value, the lower the probability of a loss event, and vice versa.
  • a smaller likelihood value can be correlated with a cyber risk event or scenario that is not likely to occur, but nevertheless if it were to occur, it would result in a higher PML value than a cyber risk event that is more likely to occur.
  • a smaller likelihood value could be associated with the breach or failure of three out of five shared services between a group of companies. While this type of scenario is unlikely to occur, the PML of this type of event would be greater than what would occur if the loss only involved a single service.
  • the present technology can be configured to calculate one or more PML values for entities in a cyber risk insurance portfolio. If the PML risk is too great, the mixture of entities within the cyber risk insurance portfolio can be selectively altered and updated PML values calculated. In this way, desired permutations of entities can be placed into a cyber risk insurance portfolio so as to maximize optimal PML values, according to various embodiments. To be sure, the lower the PML values are across the board, the better for the insurer.
  • these tactics can be utilized by a large company to analyze its utilization of technology services, non-technology vendors, and business partners amongst its constituent operations and selectively adjust the services to reduce a cyber risk for the company.
  • a quantification of the aggregation paths among service providers, vendors, business partners or insurance policyholders across the cloud-based continuum of services ranging from shared infrastructure to software as a service, other types of service providers, and common exploitable vectors of attack will be crucial to an insurer's or a company's ability to understand and manage the aggregation of risk within their portfolio or operations.
  • Cyber security exposure is relevant with respect to how a company is protecting itself against cyber security threats, both internal and external.
  • Various embodiments of the present technology provide various methods for determining and aggregating such exposure and calculating PML measurements. These may be used, for example, for underwriting cyber security insurance and other uses.
  • the present technology provides methodologies which examine aggregate exposures and potential disaster scenarios involving correlated cyber risk. These exemplary methods can be crucial to insurance companies with portfolios of cyber insurance policies and companies with complex supply chains and business partner networks.
  • FIG. 6 illustrates a flowchart 600 for an exemplary method for determining a probable maximum loss for a group of entities, according to various embodiments.
  • these entities can be selected for inclusion in a possible insurance pool.
  • the selected entities can also comprise entities already in an insurance pool and insured for cyber risk.
  • the exemplary method can also be utilized to help large organizations or companies determine a probable maximum loss across an organization by comparing the services utilized by various parts of an organization and performing a random simulation analysis to calculate PML. The company can then adjust the distribution and use of services, recalculate PML and determine if an increase or decrease in PML is caused by these redistribution/reorganization efforts.
  • the exemplary method implicitly includes a process for identifying a plurality of entities that are or may be included in a cyber risk insurance policy.
  • the exemplary method in the flowchart 600 begins with, for each of a plurality of entities, receiving, at 605 , a set of variables that are indicative of attributes of an entity.
  • variables may include, but are not limited to, for example: content delivery network utilized, ISP usage, cloud storage service providers, database technology, any other hardware or software services, as well as non-technology services utilized by the entities.
  • the exemplary method can comprise comparing, at 610 , the sets of variables for the plurality of entities to each other and locating, at 615 , clusters of similar variables shared between two or more of the plurality of entities.
  • entities can be aggregated into groups based on their shared utilization of services.
  • each of the selected entities may be determined to use Amazon™ AWS and be aggregated based on this shared variable.
  • many correlations can be determined for the entities based on one or more shared variables.
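The aggregation step above might be sketched as grouping entities by the value of a shared service variable, keeping only values shared by two or more entities. The portfolio names and attribute keys below are illustrative assumptions.

```python
# Illustrative sketch: group entities into clusters by a shared variable
# (e.g., all entities using the same cloud provider).
from collections import defaultdict

def cluster_by_variable(entities, variable):
    clusters = defaultdict(list)
    for name, attrs in entities.items():
        clusters[attrs.get(variable)].append(name)
    # keep only true clusters: a value shared by two or more entities
    return {v: names for v, names in clusters.items() if len(names) >= 2}

portfolio = {
    "acme":    {"cloud": "aws", "cdn": "x"},
    "globex":  {"cloud": "aws", "cdn": "y"},
    "initech": {"cloud": "gcp", "cdn": "x"},
}
print(cluster_by_variable(portfolio, "cloud"))  # → {'aws': ['acme', 'globex']}
print(cluster_by_variable(portfolio, "cdn"))    # → {'x': ['acme', 'initech']}
```

Repeating this over each variable yields the many correlations mentioned above: one entity can appear in several clusters at once (here, "acme" shares a cloud provider with one peer and a CDN with another).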
  • the exemplary method can include determining, at 620 , a probable maximum loss for the plurality of entities that share the clusters.
  • the probable maximum loss can comprise a (monetary) loss value attributed to a cyber breach or other cyber event, against one or more of the shared variables.
  • the PML may be calculated with respect to a probability or likelihood for failure or breach of the shared variables. One non-limiting example is calculating PML with respect to a probability or likelihood for a breach of Amazon™ AWS. According to various embodiments, the PML will be higher if multiple services are involved in a failure or cyber event, and lower if fewer services are breached.
  • the PML may be affected by both the likelihood of a breach and the extent of the resulting damage. For example, according to various embodiments, if the likelihood of a service being breached is low but the damage would be widespread, the PML will be higher than if there is a high likelihood of a breach but limited damage.
  • the method can include receiving, at 625 , using a processor coupled to a memory, feedback from an end user in response to providing the probable maximum loss to the end user.
  • the feedback can include removing, replacing, or adding entities to the calculation that affect the calculation of an updated PML.
  • the method can include automatically converting, at 630 , using the processor and in response to the feedback, the probable maximum loss to an updated probable maximum loss.
  • the method includes providing, at 635 , one or more actions to the end user based on the probable maximum loss and the clusters of similar variables. To be sure, enactment by the end user of at least one of the one or more actions may decrease the probable maximum loss.
  • an action could include adding, removing, and/or replacing entities within the group of entities that were analyzed to generate the PML value according to various embodiments.
  • FIG. 7 illustrates a flowchart 700 for an example method for calculating PML, in one embodiment.
  • the method comprises performing, at 705 , a singular vulnerability analysis for each of the entities that share at least one common variable therebetween. That is, each entity will have a set of variables that represent the services or attributes. These services or attributes can lead to an increase or decrease in the likelihood that the entity will experience a cyber event or loss.
  • the method provides an analysis of each entity individually to determine their vulnerability.
  • the exemplary method includes performing, at 710 , a pairwise vulnerability analysis for each entity in the group.
  • vulnerability may be determined for each entity in comparison with every other entity in the group that shares at least one common variable with it.
  • This step can also be referred to as a pair-wise correlation.
  • Both steps 705 and 710 can produce areas of interest that can be utilized as the focus of a PML analysis, as described below. That is, according to various embodiments, both individual and pairwise vulnerabilities are indicative of potential group-wise vulnerabilities, which are in turn indicative of losses that can be expected for the selected group of entities as a whole.
  • the method can comprise building, at 715 , a covariant matrix using the individual and pairwise vulnerability correlations.
  • FIG. 8 is an exemplary diagram 800 illustrating an example pairwise correlation matrix and a corresponding example covariance matrix.
  • Diagram 800 shows just a non-limiting example embodiment of a way correlated risks may be represented and incorporated into a PML calculation based on the diversity analysis and score.
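The matrix assembly of step 715 and FIG. 8 might be sketched as follows, assuming (purely for illustration) that the singular analysis yields a per-entity loss standard deviation and the pairwise analysis yields a correlation matrix; the covariance entries then follow the usual relation cov[i, j] = corr[i, j] · σᵢ · σⱼ. All numbers are made up.

```python
# Illustrative covariance-matrix construction from singular and pairwise
# vulnerability outputs (all values are assumptions for the sketch).
import numpy as np

# per-entity loss standard deviations, in $M (singular vulnerability analysis)
sigma = np.array([2.0, 3.0, 1.5])

# pairwise correlations: shared services imply correlated breach losses
corr = np.array([
    [1.0, 0.6, 0.2],
    [0.6, 1.0, 0.1],
    [0.2, 0.1, 1.0],
])

# cov[i, j] = corr[i, j] * sigma[i] * sigma[j]
cov = corr * np.outer(sigma, sigma)
print(cov)
```

Entities with more shared variables get larger off-diagonal correlations, so a less diverse portfolio produces a "heavier" covariance matrix, which is how the diversity analysis feeds the PML calculation.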
  • the method includes performing, at 720 , randomized simulations of breach events using the covariant matrix and generating, at 725 , a probable maximum loss.
  • some embodiments include a feedback loop where entities are added, removed, and/or replaced to change the vulnerability analyses, covariant matrix assembly, and PML calculation.
  • a Monte Carlo analysis can be utilized to perform a randomized simulation on the covariant matrix.
  • the PML calculation may be a distribution of losses (where each loss in that distribution corresponds to a simulated state).
  • the calculation itself can be related to the diversity score of the portfolio via the correlation/covariance matrix in various embodiments.
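Steps 720 and 725 could be sketched with a Monte Carlo draw against the covariance matrix, reading PML off the simulated loss distribution at the likelihood levels discussed earlier (1/10, 1/100, 1/1,000). The multivariate-normal loss model, the means and covariances, and the quantile readout are all illustrative assumptions, not the patent's specified model.

```python
# Hedged sketch of the randomized simulation: draw correlated entity losses,
# sum to portfolio losses, and take tail quantiles as PML values.
import numpy as np

rng = np.random.default_rng(42)

mean = np.array([1.0, 1.5, 0.8])   # expected loss per entity, $M (assumed)
sigma = np.array([2.0, 3.0, 1.5])
corr = np.array([[1.0, 0.6, 0.2],
                 [0.6, 1.0, 0.1],
                 [0.2, 0.1, 1.0]])
cov = corr * np.outer(sigma, sigma)

# simulate correlated breach losses; negative draws are floored at zero
draws = rng.multivariate_normal(mean, cov, size=100_000)
portfolio_losses = np.clip(draws, 0.0, None).sum(axis=1)

for likelihood in (1 / 10, 1 / 100, 1 / 1000):
    pml = np.quantile(portfolio_losses, 1.0 - likelihood)
    print(f"PML at 1/{int(1 / likelihood)}: ${pml:.2f}M")
```

As noted above, the rarer scenario sits further out in the tail, so the 1/1,000 PML exceeds the 1/10 PML, and a heavier covariance matrix (less diversity) fattens that tail.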
  • FIG. 9 is a diagram 900 illustrating an example distribution with markings showing the corresponding values in an example PML table.
  • the PML value can comprise an aggregate PML that accounts for every breach likely or expected for the plurality of entities over a period of time, such as a month or year.
  • the likelihood of the breach event affects the PML calculated according to various embodiments. If events that have a 1 in 10 chance of occurring are considered, their contribution to the PML, according to various embodiments, is likely lower than that of a breach event having a likelihood of occurrence of 1 in 1,000 or 1 in 10,000.
  • the exemplary method can optionally include performing a plurality of randomized simulations against the same covariant matrix and obtaining a plurality of PML values. An average or other statistical representation of the plurality of PML values can be obtained if desired.
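The optional repetition could look like the sketch below: run the randomized simulation several times against the same covariance matrix and average the resulting PML values. The covariance matrix, the multivariate-normal loss model, and the 1/100 likelihood level are illustrative assumptions.

```python
# Illustrative sketch: repeat the simulation against one covariance matrix
# and report a statistical summary of the resulting PML values.
import numpy as np

rng = np.random.default_rng(7)

mean = np.array([1.0, 1.5, 0.8])          # assumed expected losses, $M
cov = np.array([[4.0, 3.6, 0.6],
                [3.6, 9.0, 0.45],
                [0.6, 0.45, 2.25]])       # assumed covariance matrix

def one_pml(rng, likelihood=1 / 100, n=50_000):
    """One randomized simulation run -> one PML value at the given likelihood."""
    draws = rng.multivariate_normal(mean, cov, size=n)
    losses = np.clip(draws, 0.0, None).sum(axis=1)
    return np.quantile(losses, 1.0 - likelihood)

pmls = [one_pml(rng) for _ in range(10)]
print(f"mean PML: {np.mean(pmls):.2f}, std across runs: {np.std(pmls):.3f}")
```

The spread across runs gives a sense of simulation noise; the mean (or another statistic, as the text notes) is then reported as the portfolio's PML.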
  • the system of FIG. 1 can be adapted to perform the methods described herein.
  • the system 100 can further comprise a simulation and PML module 141 that is stored in memory 115 .
  • the simulation and PML module 141 can perform the individual and pairwise analyses, as well as the covariant matrix construction and random simulation processes.
  • the simulation and PML module 141 can be configured to calculate the PML values from the covariant matrix construction and random simulation processes.
  • These computer program instructions may also be stored in a computer readable medium that can direct a computer, other programmable data processing apparatus, or other devices to function in a particular manner, such that the instructions stored in the computer readable medium produce an article of manufacture including instructions which implement the function/act specified in the flowchart and/or block diagram block or blocks.
  • the computer program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other devices to cause a series of operational steps to be performed on the computer, other programmable apparatus or other devices to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide processes for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
  • each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s).
  • the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved.
  • In this description, a hyphenated term (e.g., “on-demand”) may occasionally be used interchangeably with its non-hyphenated version (e.g., “on demand”), a capitalized entry (e.g., “Software”) with its non-capitalized version (e.g., “software”), a plural term may be indicated with or without an apostrophe (e.g., PE's or PEs), and an italicized term (e.g., “N+1”) may be used interchangeably with its non-italicized version (e.g., “N+1”). Such occasional interchangeable uses shall not be considered inconsistent with each other.
  • a “means for” may be expressed herein in terms of a structure, such as a processor, a memory, an I/O device such as a camera, or combinations thereof.
  • the “means for” may include an algorithm that is descriptive of a function or method step, while in yet other embodiments the “means for” is expressed in terms of a mathematical formula, prose, or as a flow chart or signal diagram.

Abstract

Various embodiments of the present technology relate to diversity and similarity analysis. In some exemplary embodiments, a method includes, for each of a plurality of entities, receiving a set of variables that are indicative of attributes of an entity. The exemplary method also includes comparing the sets of variables for the plurality of entities to each other, locating clusters of similar variables shared between two or more of the plurality of entities, determining a probable maximum loss for the plurality of entities that share the clusters, the probable maximum loss being a loss value attributed to a cyber event against one or more of the shared variables, receiving feedback from an end user in response to providing the probable maximum loss to the end user, and updating the probable maximum loss in response to the feedback.

Description

CROSS-REFERENCE TO RELATED APPLICATION
This application is a Continuation-in-Part of U.S. patent application Ser. No. 14/585,051 (now issued as U.S. Pat. No. 9,253,203), filed Dec. 29, 2014, entitled “Diversity Analysis with Actionable Feedback Methodologies”, which is hereby incorporated by reference herein in its entirety including all references and appendices cited therein.
FIELD OF THE PRESENT TECHNOLOGY
The present technology relates generally to systems and methods for determining metrics, such as diversity or similarity, between entities and the application of those metrics as actionable feedback loops which can be used to increase diversity or reduce similarity amongst groups of entities. These metrics may relate to diversity of aggregate cyber security risk for use in planning or filtering new entities so as to increase that diversity.
SUMMARY
Embodiments of the present technology include a method, comprising: (a) for each of a plurality of entities, receiving a set of variables that are indicative of attributes of an entity; (b) comparing the sets of variables for the plurality of entities to each other; (c) locating clusters of similar variables shared between two or more of the plurality of entities; (d) determining a probable maximum loss for the plurality of entities that share the clusters, the probable maximum loss comprising a loss value attributed to a cyber event against one or more of the shared variables; (e) receiving, using a processor coupled to a memory, feedback from an end user in response to providing the probable maximum loss to the end user; (f) automatically converting, using the processor and in response to the feedback, the probable maximum loss to an updated probable maximum loss; and (g) providing one or more actions to the end user based on the probable maximum loss and the clusters of similar variables, enactment by the end user of at least one of the one or more actions decreasing the probable maximum loss.
Other embodiments of the present technology include a method, comprising: (a) for each of a plurality of entities, receiving a set of variables that are indicative of attributes of an entity; (b) receiving from an end user, a selection of variables that comprise a subset of the set of variables that the end user desires to analyze; (c) comparing the subsets of variables for the plurality of entities to each other; (d) locating clusters of similar variables shared between two or more of the plurality of entities; (e) determining a probable maximum loss for the plurality of entities that share the clusters, the probable maximum loss comprising a loss value attributed to a cyber event against one or more of the shared variables; and (f) providing, using a processor coupled to a memory, one or more actions to the end user based on the probable maximum loss and the clusters of similar variables, enactment by the end user of at least one of the one or more actions decreasing the probable maximum loss.
Additional embodiments of the present technology include a system, comprising: (a) a processor; and (b) a memory for storing executable instructions, the processor executing the instructions to: (i) compare sets of variables for a plurality of entities to each other, each of the sets of variables being indicative of attributes of an entity; (ii) locate clusters of similar variables shared between two or more of the plurality of entities; (iii) calculate a probable maximum loss for the plurality of entities that share the clusters by: (1) determining singular vulnerabilities of each of the plurality of entities; (2) determining vulnerabilities of pairs of the plurality of entities; and (3) performing a randomized scenario analysis using the singular and pairwise vulnerabilities, the probable maximum loss comprising a loss value; (iv) receive, using the processor, feedback from an end user in response to providing the probable maximum loss to the end user; (v) automatically convert, using the processor and in response to the feedback, the probable maximum loss to an updated probable maximum loss; and (vi) provide one or more actions to the end user based on the probable maximum loss and the clusters of similar variables, enactment by the end user of at least one of the one or more actions decreasing the probable maximum loss.
BRIEF DESCRIPTION OF THE DRAWINGS
Certain embodiments of the present technology are illustrated by the accompanying figures. It will be understood that the figures are not necessarily to scale and that details not necessary for an understanding of the technology or that render other details difficult to perceive may be omitted. It will be understood that the technology is not necessarily limited to the particular embodiments illustrated herein.
FIG. 1 is a high level schematic diagram of computing architecture for practicing aspects of the present technology.
FIG. 2 is a flowchart of an example method for determining entity diversity.
FIG. 3 is a flowchart of an example action and feedback loop method for updating a diversity score and improving client diversity.
FIG. 4 is a flowchart of a method for analyzing a new client's impact on an existing diversity calculation.
FIG. 5 is a schematic diagram of a computing system that is used to implement embodiments according to the present technology.
FIG. 6 is a flowchart of an example method for determining a probable maximum loss value for a portfolio of entities.
FIG. 7 is a flowchart of another example method for determining a probable maximum loss value for a portfolio of entities.
FIG. 8 is a diagram illustrating an example pairwise correlation matrix and a corresponding example covariance matrix.
FIG. 9 is a diagram illustrating an example distribution with markings showing the corresponding values in an example PML table.
DETAILED DESCRIPTION
Various embodiments of the present technology are directed to systems and methods for determining diversity and/or similarity between entities with respect to risk, e.g., cyber security risk, and the utilization of these metrics in various ways to improve diversity between the analyzed entities. In one embodiment, an insurer may desire to understand the diversity of their insured entities with respect to aggregate cyber risk and utilize a measure of diversity to prevent too much similarity between insured entities, and/or to compare their diversity to their industry peers. Additionally, reinsurers, rating agencies and/or insurance brokers may also utilize the present technology. For example, reinsurers may want to compare one insurer's portfolio to another insurer's to buy, invest, and/or cover. Brokers may wish to review their portfolio of clients, and ratings agencies may review an insurer's portfolio and use it to provide a rating of the financial strength of the insurer. To be sure, cyber insurance and other insurance risks can be a function of similarity. For cyber insurance risk, if insured entities are very similar to one another in a variety of key attributes such as revenue, clientele, industry, technology utilized such as cloud computing service provider, content delivery network (CDN) provider, operating system, firewall vendor, intrusion detection system vendor, security services provider, etc., or other factors, a loss, e.g., due to a cyber attack, by one of these insured entities might imply that other insured entities having similar attributes will also experience a loss. For example, a plurality of web hosting providers may source their servers from the same company. A cyber attack of that company's servers may equally affect all of these web hosting providers that use those servers, and consequently affect an insured that utilizes one of those web hosting providers to host the insured's website and other web services.
To be sure, diversity in attributes between entities can decrease the likelihood that a covered loss by any particular entity will also likely affect the other entities. Thus, the desire is to have the insured entities be as diverse as possible in the aggregate, to reduce overall risk. Conversely, similarity of attributes between insured entities can increase risk for the insurer.
Using the present technology, an end user may determine similar attributes shared between pluralities of entities. These shared attributes can be aggregated into clusters to locate groups of entities with shared attributes. In one example, several entities use the same content delivery network (CDN) and the same cloud service provider, have a similar website traffic profile, have overlapping executives, and report similar revenue. While these entities may also share attributes with other entities, these attributes are used in various embodiments to create a cluster or grouping of entities that, when considered in the aggregate, have a low diversity score due to the similarities in this example.
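The grouping of entities into clusters by shared attributes, as described above, can be sketched in a few lines of code. All entity names and attribute values in this sketch are hypothetical, invented only for illustration; an actual system would draw on far larger attribute sets.

```python
from collections import defaultdict

def commonality_clusters(entities):
    """Group entities by each (attribute, value) pair they share.

    `entities` maps an entity name to a dict of attribute values.
    Only clusters with two or more members are kept, since a value
    held by a single entity indicates no commonality.
    """
    clusters = defaultdict(set)
    for name, attrs in entities.items():
        for attr, value in attrs.items():
            clusters[(attr, value)].add(name)
    return {k: v for k, v in clusters.items() if len(v) > 1}

# Hypothetical insured entities sharing a CDN and a cloud provider.
entities = {
    "AcmeBank": {"cdn": "CDN-X", "cloud": "Cloud-A", "revenue_band": "high"},
    "BetaShop": {"cdn": "CDN-X", "cloud": "Cloud-B", "revenue_band": "high"},
    "GammaMed": {"cdn": "CDN-Y", "cloud": "Cloud-A", "revenue_band": "low"},
}
clusters = commonality_clusters(entities)
```

Here "AcmeBank" and "BetaShop" would fall into a common cluster on both CDN provider and revenue band, marking them as a locus of non-diversity.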
End users may use the present technology to learn their aggregate cyber risk compared to industry peers and use that information to, for example, screen potential target entities for inclusion into a group based upon how the potential addition of their attributes to the group would affect the diversity score for the end user's collection of entities. In alternative exemplary embodiments, the system may, instead of or in addition to outputting a diversity or clustering score, output a different value analyzing the entities, for example, a probable maximum loss (PML) and/or an expected portfolio value.
The present technology can be used to analyze diversity/similarity between many entities. The diversity/similarity analyses can use hundreds and even thousands of attributes, looking for diversity or commonality therebetween. In some instances, the end user can adjust the attributes and/or select which attributes are important to them and the system will analyze only these attributes when determining diversity, e.g., a diversity score for aggregate cyber risk.
While the examples above mention the suitability of the present technology for use with insurance planning, in general, and cyber insurance planning, in particular, the present technology is not so limited. Other examples of technologies that can implement the present technology are financial portfolio managers, technology companies that desire infrastructure robustness, human resources, venture capital investment, and so forth.
These and other advantages of the present technology are provided below with reference to the collective drawings.
FIG. 1 is a high level schematic diagram of a computing architecture (hereinafter architecture 100) of the present technology. The architecture 100 comprises a diversity analysis system 105 (hereinafter also referred to as system 105), which in some embodiments comprises a server or cloud-based computing device configured specifically to perform the diversity analyses described herein. That is, the system 105 is a particular purpose computing device that is specifically designed and programmed (e.g., configured or adapted) to perform any of the methods described herein.
The system 105 can be coupled with an end user device 105A, such as a computer, tablet, smartphone, or other similar end user computing device. End users can interact with the system 105 using their end user device 105A. The end user device 105A and system 105 can be coupled using a network 105B.
A suitable network 105B may include or interface with any one or more of, for instance, a local intranet, a PAN (Personal Area Network), a LAN (Local Area Network), a WAN (Wide Area Network), a MAN (Metropolitan Area Network), a virtual private network (VPN), a storage area network (SAN), a frame relay connection, an Advanced Intelligent Network (AIN) connection, a synchronous optical network (SONET) connection, a digital T1, T3, E1 or E3 line, Digital Data Service (DDS) connection, DSL (Digital Subscriber Line) connection, an Ethernet connection, an ISDN (Integrated Services Digital Network) line, a dial-up port such as a V.90, V.34 or V.34bis analog modem connection, a cable modem, an ATM (Asynchronous Transfer Mode) connection, or an FDDI (Fiber Distributed Data Interface) or CDDI (Copper Distributed Data Interface) connection. Furthermore, communications may also include links to any of a variety of wireless networks, including WAP (Wireless Application Protocol), GPRS (General Packet Radio Service), GSM (Global System for Mobile Communication), CDMA (Code Division Multiple Access) or TDMA (Time Division Multiple Access), cellular phone networks, GPS (Global Positioning System), CDPD (cellular digital packet data), RIM (Research in Motion, Limited) duplex paging network, Bluetooth radio, or an IEEE 802.11-based radio frequency network.
In one embodiment, the system 105 comprises a processor 110 and memory 115 for storing instructions. The memory 115 can include an attribute module 120, a comparator module 125, a clustering module 130, a weighting module 135 and a recommendation module 140. As used herein, the term “module” may also refer to any of an application-specific integrated circuit (“ASIC”), an electronic circuit, a processor (shared, dedicated, or group) that executes one or more software or firmware programs, a combinational logic circuit, and/or other suitable components that provide the described functionality.
For context, the diversity analyses according to various embodiments of the present technology begin with input for the attribute module 120. A set of variables that are indicative of attributes of an entity may be input into the attribute module 120. In one embodiment, the variables can include technologies a company might employ (e.g., internally and externally for Internet communication such as e-mail, website, and social media online presence) such as CDN provider, cloud service provider, server type, OS type, visitor traffic knowledge, customer profiles, as well as other non-technical information such as revenue, number of employees, years in business, and so forth. In various embodiments, the breadth and type of variables that can be analyzed and correlated are unlimited. In some embodiments, the breadth and type of variables that can be analyzed and correlated for the company and for their industry peers, for comparison, may be limited by the breadth and type of information that is available at online sources concerning the same. Again, an end user can define or specify the types of variables that are of interest to them.
For example, if the end user is an insurer, the insurer may desire to know how diverse their insured entities are with respect to cyber security risk relative to a wide and divergent set of variables. In regard to a cloud computing provider, for example, interest in such diversity may be only in technological variables such as traffic, page views, bandwidth, and other variables related to cyber risk.
In some embodiments, entites and end users can access and interact with the system 105 using a variety of graphical user interfaces (GUIs) such as a dashboard, including various elements as described herein. The system 105 can use the dashboard to display messages or notifications as well as diversity scores, similarity scores, and/or recommendations.
The system may gather variables for an entity by querying the entity for information, or by scraping available online sources such as websites, corporate filings, news sources, and other public record databases and resources. Additionally, data may be gathered from the entity's network using devices already present there or by placing a new device on the entity's network to gather more data. The data collecting device may be a server, router, firewall, switch, or repeater, or may be a software agent or routine that monitors traffic and/or performs packet inspection. The data collecting device may be on the company's network and/or its periphery, and may collect and/or analyze the data, while also transmitting it to system 105. In this manner, additional, proprietary data may be gleaned from a particular entity's network. Regardless of how the variables are obtained, the variables are input into the attribute module 120. The attribute module 120 can format or normalize the input as needed for consistency.
In one embodiment, the comparator module 125 is executed to perform a variable comparison on all or a subset of the variables. The comparison can be for all or only a subset of all entities. The subset of variables can be selected by the end user, as well as the entities analyzed.
The comparator module 125 is configured to identify variables shared between entities or groups of entities. The implications of this analysis are multifaceted. For instance, the same variable can be shared between many entities, which leads to an inference that the particular variable might be problematic. Such a widely shared single variable represents a more pointed or granular lack of diversity.
In another example, multiple variables are shared between numerous entities. This diversity relationship between the entities signifies a more prolific lack of diversity.
Localized commonality can be found between small groups of entities (even between just two). This type of similarity can be inferred to be less problematic than the more prolific examples provided above, where similarity exists between numerous entities.
The comparator module 125 can cooperate with the clustering module 130 to create commonality clusters (e.g., various clusters of commonly shared variables). In one embodiment, if five entities are being analyzed, many different clusters can be identified. For example, if variables A-D are being analyzed with respect to entities 1-5, the comparator module 125 finds commonality between entities 1 and 3 with respect to variables B and C. Also, the comparator module 125 finds commonality between entities 1-5 with respect to variable A. Other similar correlations can be found.
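The entities 1-5 / variables A-D example above can be made concrete with a pairwise comparison sketch. The data values below are hypothetical, chosen only so that all five entities agree on variable A while entities 1 and 3 additionally agree on B and C:

```python
from itertools import combinations

def shared_variables(entities):
    """For each pair of entities, collect the variables on which
    their values agree."""
    shared = {}
    for (n1, v1), (n2, v2) in combinations(entities.items(), 2):
        common = {k for k in v1.keys() & v2.keys() if v1[k] == v2[k]}
        if common:
            shared[(n1, n2)] = common
    return shared

# Entities 1-5 with variables A-D, arranged so that all entities
# share A, and entities 1 and 3 additionally share B and C.
entities = {
    1: {"A": 1, "B": 7, "C": 3, "D": 10},
    2: {"A": 1, "B": 8, "C": 4, "D": 20},
    3: {"A": 1, "B": 7, "C": 3, "D": 30},
    4: {"A": 1, "B": 9, "C": 5, "D": 40},
    5: {"A": 1, "B": 6, "C": 6, "D": 50},
}
pairs = shared_variables(entities)
```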
The clustering module 130 can display to the end user these commonality clusters, which indicate areas of non-diversity. Also, these commonality clusters can be utilized by the recommendation module 140 to create action items for the end user that if enacted would change the diversity score. Details regarding the diversity score are found in greater detail below.
In some embodiments, the comparator module 125 creates a diversity score or index. This diversity score represents how dissimilar the analyzed group of entities is relative to one another in view of their variables.
The diversity score can reflect the percentage of the overall number of compared variables that are dissimilar rather than shared. The diversity score can be represented variously as a fraction, a decimal, or a percentage, and may be included in the graphical user interface (e.g., dashboard). Additionally, or alternatively, the diversity score may be normalized into a number within a user-defined, or predefined, range, similar to a credit score.
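As a minimal, hypothetical sketch of such a score, the fraction of pairwise variable comparisons that are dissimilar can be computed and then rescaled into a credit-score-like range. The 300-850 range and all entity data below are merely assumed examples of a predefined range and input set:

```python
from itertools import combinations

def diversity_score(entities, lo=300, hi=850):
    """Return (fraction of dissimilar comparisons, scaled score).

    Compares every variable shared by every pair of entities and
    counts how many of those comparisons differ.
    """
    total = differ = 0
    for (_, v1), (_, v2) in combinations(entities.items(), 2):
        for k in v1.keys() & v2.keys():
            total += 1
            if v1[k] != v2[k]:
                differ += 1
    frac = differ / total if total else 0.0
    return frac, round(lo + frac * (hi - lo))

entities = {
    "E1": {"cdn": "X", "os": "Linux", "sector": "banking"},
    "E2": {"cdn": "X", "os": "Windows", "sector": "technology"},
    "E3": {"cdn": "Y", "os": "Linux", "sector": "healthcare"},
}
frac, scaled = diversity_score(entities)
```

With these three entities, 7 of the 9 pairwise comparisons differ, so the fraction is 7/9 and the scaled score lands near the top of the assumed range.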
In some embodiments, the comparator module 125 can cooperate with the weighting module 135 to apply a weighting to one or more variables. In one embodiment, the weighting is selected by an end user such as an insurer. For example, an insurer determines that industry serviced, gross revenue, and customer country of origin are important variables to analyze, e.g., for assessing individual and aggregate cyber risk. For instance, if the insurer knows that gross revenue is very important to the calculation, the insurer can specify that the gross revenue variable is to be given greater importance in the analysis than other variables. In another example, the insurer can assign a weight to each variable based upon importance.
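A weighted variant of the diversity calculation is a small extension of the unweighted fraction. In this hypothetical sketch, the weight values (e.g., 3.0 for gross revenue) stand in for whatever importance the insurer assigns:

```python
from itertools import combinations

def weighted_diversity(entities, weights):
    """Diversity fraction in which each variable contributes its
    end-user-assigned weight instead of a count of 1. Variables
    absent from `weights` default to a weight of 1.0."""
    total = differ = 0.0
    for (_, v1), (_, v2) in combinations(entities.items(), 2):
        for k in v1.keys() & v2.keys():
            w = weights.get(k, 1.0)
            total += w
            if v1[k] != v2[k]:
                differ += w
    return differ / total if total else 0.0

# Hypothetical insurer-assigned weights and two insured entities.
weights = {"gross_revenue": 3.0, "industry": 2.0, "country": 1.0}
entities = {
    "E1": {"gross_revenue": "high", "industry": "banking", "country": "US"},
    "E2": {"gross_revenue": "high", "industry": "banking", "country": "DE"},
}
score = weighted_diversity(entities, weights)
```

Because the only differing variable (country) carries the lowest weight, the weighted score (1/6) is lower than the unweighted fraction (1/3) would be, reflecting the insurer's judgment that the shared, heavily weighted variables matter more.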
In some embodiments, the system can determine weightings and variables based on industry knowledge acquired, and use machine learning, big data and other “tools” to make an “educated” determination. For example, the weighting of variables can also be determined by the system 105 based on information such as actuarial data, industry practices, or other rules established by end users but which are intended to be applied by default. The selection of a weighting schema by the system 105 can be based on variables for the entities. For example, if the system 105 determines that the entities are all physicians, the system 105 can select weightings that are appropriate for medical practices or hospitals. Such determinations by the system may be adjusted and/or otherwise specified by the end user (e.g., using the dashboard) to tailor them for their particular circumstances, preferences, or other factors.
In some embodiments, the diversity score can be represented as a diversity graph that illustrates the connection between entities. Entities can be graphically connected based on commonality of variables between entities. For example, certain entities may be connected as being banks that present particularly enticing targets for cyber criminals and thus particular cyber risks.
In response to calculating a diversity and/or similarity score, the recommendation module 140 can be executed to provide the end user with some type of actionable feedback. For example, the recommendation module 140 can provide one or more actions to the end user based on the diversity score and the clusters of similar variables. These one or more actions potentially increase the diversity score if enacted by the end user.
In one example, the recommendation module 140 can automatically identify variables which, if changed, would affect the diversity score. For example, if the entities are or utilize technology company service providers that use a particular CDN, the recommendation module 140 can output a recommendation that diversification in this area would be beneficial. The end user can alert the entities and encourage them to explore other options for CDNs. If the end user is an insurer, for example, the insurer can encourage this change by offering rate concessions to the insured entities. Various embodiments of the system thus may automatically provide the diversity score or other information to the end user regarding diversity, which the end user can utilize to encourage or effect various changes (e.g., via rate concessions, screening of potential new entities, adjusting rates based on diversity, or other actions prompted by the system's determinations). The diversity score might also be used to inform the insurer as to which policies should be renewed and which policies should not be renewed. For example, if a potential new (target) entity to add presents an unacceptable cyber risk, based on the diversity analysis, the insurer may choose not to provide the entity a policy, or to provide the policy at a rate commensurate with the risk.
In another example, the recommendation module 140 can identify problematic common variables that negatively impact diversity scores. For example, the recommendation module 140 may identify shared infrastructure such as CDNs and cloud service providers as particularly problematic variables that are commonly shared between several entities. In some embodiments, the recommendation module 140 can also identify network traffic, network traffic patterns, firewalls, and firewall policies that are commonly shared. Changing these shared variables would likely increase the diversity score for these entities. Conversely, the recommendation module 140 can determine key variables that, if changed, would negatively affect a diversity score. The recommendation module 140 can identify these variables to the end user as desirable.
Actions that could be taken in response to this information could include a project plan that specifies that the insurer is to find new customers that do not share these problematic variables. Likewise, the project plan could also or alternatively specify that the insurer is to find new customers that do share key positive variables.
In one example, an action includes the recommendation module 140 creating and providing the end user with a variable profile of a target entity that when added to the plurality of entities increases the diversity score. For example, the recommendation module 140 could create a profile for a prototypical new client that is in a different technology sector or a completely different industry sector. In another embodiment, the recommendation module 140 could create a profile for a prototypical new client that includes desirable variables, rather than merely a client that excludes certain disfavored variables.
In one embodiment, the recommendation module 140 can provide the end user with a list of entities of the plurality of entities that are lowering the diversity score. Again, as mentioned above, certain clusters of variables may be found in common between entities. Certain ones of these clusters may have more of a negative impact on the diversity score than others. For example, commonality between headquarters or domicile may have no impact on the diversity score, even if this variable is shared in common between several entities. On the other hand, commonality in gross revenue or average employee age may have a drastic impact on the diversity score for one reason or another. To be sure, commonality of a variable(s) does not always negatively affect the end user or the end user's business. In these instances the commonality can be ignored or weighted so as not to affect the calculated diversity score.
In another example, the recommendation module 140 can provide the end user with a list of entities of the plurality of entities that, if lost, would lower the diversity score, which can prompt the end user to take action to avoid losing those entities.
In another example action, the recommendation module 140 can compare a variable profile for a new entity to determine whether the addition of the new entity to the analysis will negatively or positively impact the diversity score of the group.
For example, the attribute module 120 can receive a variable profile for a new entity and parse out the variables which are indicative of attributes of the new entity. This profile could include an application form, a survey, or any other content that is capable of conveying variables.
Next, the comparator module 125 adds a set of variables of the new entity to the comparison described above and repeats the calculation of the diversity score. The recommendation module 140 can alert the end user if the addition of the new entity decreases the diversity score. The recommendation module 140 can alert the end user if the addition of the new entity increases the diversity score as well.
In some embodiments, the recommendation module 140 updates the diversity score based on feedback received from the end user. For example, if the end user wants to view how the addition of a proposed new client will affect an existing diversity score, the profile for the new client is added to the system and the variables for the new client are processed and added to the comparison process. A new or updated diversity score is calculated and displayed to the end user.
The difference between the new diversity score and the old diversity score is expressed as a diversity delta. In some embodiments, the system 105 can apply thresholds to the diversity delta to determine if a proposed change to the entity grouping is sufficient to warrant the proposed change. For example, the system 105 may require at least a net change or diversity delta of 20%. Other percentages can also be utilized. The present technology provides information related to the updated information (the new diversity score), including differences (the amount of the change made in one or more updates, namely the delta), and trends (patterns over many time steps).
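The diversity delta and its threshold test can be sketched as follows. Treating the 20% figure as a relative change is an assumption for illustration, since the text does not specify relative versus absolute change, and the score values are hypothetical:

```python
def diversity_delta(old_score, new_score, threshold=0.20):
    """Return the relative change in diversity score and whether
    its magnitude clears the end user's minimum-change threshold."""
    delta = (new_score - old_score) / old_score
    return delta, abs(delta) >= threshold

# Hypothetical scores before and after adding a proposed client.
delta, warranted = diversity_delta(0.50, 0.65)
```

A jump from 0.50 to 0.65 is a 30% relative change, clearing the example 20% threshold, whereas a move to 0.55 (10%) would not.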
The present technology also provides attribution information when a diversity score changes. In particular, the methods and system indicate to a user why the score has changed, namely what exactly has changed in the underlying data sets to effect that higher level score change. In this manner, the systems and methods of the present technology provide detailed information to the user to identify the changed data, and thereby understand the positive and negative impacts of the user's actions on the diversity score.
The system 105 can also build an entity portfolio for an end user with knowledge gained from an analysis of variables for a plurality of entities. For instance, the system 105 can create a report that informs the end user as to how many and what type of entities a portfolio should have to be balanced in terms of diversity, e.g., with respect to cyber risk. For example, the report may indicate that an insurer should have a certain percentage of clients in the banking sector, a certain percentage in the technology sector, and a certain percentage in the medical industry. These sectors of the portfolio are deduced by comparing variables for various entities in a given industry that lead to a suitable diversity score.
It will be understood that the diversity score can be counterbalanced by other factors such as revenue for the end user. That is, an insurer may be more likely to accept a lower diversity score from a group of entities that pay higher premiums or a group of entities that is at least partially self-insured.
FIG. 2 is a flowchart 200 of an example method that is executed by the system, in accordance with the present technology. The method includes the system 105 (for each of a plurality of entities), receiving 205 a set of variables that are indicative of attributes of an entity. These variables can include any number or type of variables that represent the attributes of the entity.
These variables are collected for numerous entities that may belong, in some embodiments, to a particular class or group. For example, the entities could include all employees in a company, all insured customers of an insurance agency, investors in a mutual fund, or other groups.
The method includes the system 105 comparing 210 the sets of variables for the plurality of entities to each other and locating 215 clusters of similar variables shared between two or more of the plurality of entities.
The method includes the system 105 clustering 220 common variables and identifying the entities that share the common variables. These clusters are indicative of non-diversity between these entities.
Next, the method includes the system 105 calculating 225 a diversity score that represents how different the plurality of entities are from one another based on variables that are not shared between the plurality of entities. This diversity is directly related to the commonality discovered above. The more similar or commonly shared variables that exist, the less diverse the entities are relative to one another, as a general rule. Again, as mentioned above, some variables will have little to no impact on diversity as dictated by weighting or variable selection by the end user. For example, if a commonly shared variable is not included in the diversity calculation by the end user, the variable has no impact on the diversity score.
Next, the method includes the system 105 receiving 230 feedback from an end user in response to providing the diversity score to the end user. Also, the method includes the system 105 updating 235 the diversity score in response to the feedback.
Various types of feedback are contemplated and illustrated in FIG. 2. The feedback can take the form of a suggestion, option, report, or other output that is actionable by the end user. Exemplary methods and systems according to the present technology may also provide benchmarking over time. In this manner, an insurance company or other entity tracking aggregate cyber risk may track their diversity score over an adjustable time period, for example days, weeks, months, and/or years.
It will be understood that the methods illustrated in flowchart form are susceptible to execution in various forms such that not all steps may be required. In some instances, additional steps can be added. Some steps may be rephrased or replaced with other steps, in accordance with the claimed technology.
In FIG. 3, the flowchart 300 illustrates the method including the system 105 providing, at 305, one or more actions/suggestions to the end user based on the diversity score and the clusters of similar variables. These actions can potentially increase the diversity score if enacted by the end user.
In step 310, an action includes providing the end user with a variable profile of a target entity that when added to the plurality of entities increases the diversity score.
In step 315, an action includes providing the end user with a list of entities of the plurality of entities that are lowering the diversity score.
In step 320, an action includes providing the end user with a list of entities of the plurality of entities that, if lost, would lower the diversity score.
Regardless of the action taken (step 310, 315, and/or 320), the feedback is used in calculating, at step 325, an updated diversity score and delivering, at step 330, the updated diversity score to the end user.
Again, these options are merely examples and are not intended to be limiting. These options can be provided individually or in combination, if desired.
FIG. 4 is a flowchart 400 of a new entity analysis method. In this method, the system is utilized to compare the variables of a new entity to an existing diversity analysis. For example, an insurer desires to determine how the addition of this new entity will affect the diversity of an existing client base. This aggregate risk analysis can be used to ensure that diversity is maintained or increased when a new client is added to an existing pool of clients.
The method includes receiving 405 a variable profile for a new entity. The variable profile either includes a set of variables or a set of variables is deduced from the variable profile. As mentioned above, the variable profile can include an application form, a resume, a corporate filing such as a tax return, or any other document that includes attributes of an entity.
Next, the method includes adding 410 the set of variables of the new entity to the variables of the previously analyzed entities and performing 415 an updated comparison of variables. Next, the method includes generating 420 an updated diversity score calculation.
In some embodiments, the method includes alerting 425 the end user if the addition of the new entity decreases (or increases) the diversity score.
The end user can decide to accept or reject this new client based upon how the client affects the diversity score.
Advantageously, the present technology can be used in scenarios where diversity of clientele is desirable. The present technology can perform diversity analyses on potentially thousands of attributes across countless entities in ways that would be impossible to accomplish absent the use of the diversity analysis system. The diversity analyses of the present technology can bring clarity to business planning and project management, where integration of new clients/entities may affect the diversity of a current client base (either positively or negatively). Where diversification is desirable or required, the present technology provides a means for facilitating and maintaining this diversity in a way that is actionable and usable to the end user. That is, the present technology provides a way for end users to mitigate risk through diversification of their customer base, or in whatever way diversity impacts their particular business or operations.
FIG. 5 is a diagrammatic representation of an example machine in the form of a computer system 1, within which a set of instructions for causing the machine to perform any one or more of the methodologies discussed herein may be executed. In various example embodiments, the machine operates as a standalone device or may be connected (e.g., networked) to other machines. In a networked deployment, the machine may operate in the capacity of a server or a client machine in a server-client network environment, or as a peer machine in a peer-to-peer (or distributed) network environment. The machine may be a personal computer (PC), a tablet PC, a set-top box (STB), a personal digital assistant (PDA), a cellular telephone, a portable music player (e.g., a portable hard drive audio device such as a Moving Picture Experts Group Audio Layer 3 (MP3) player), a web appliance, a network router, switch or bridge, or any machine capable of executing a set of instructions (sequential or otherwise) that specify actions to be taken by that machine. Further, while only a single machine is illustrated, the term “machine” shall also be taken to include any collection of machines that individually or jointly execute a set (or multiple sets) of instructions to perform any one or more of the methodologies discussed herein.
The example computer system 1 includes a processor or multiple processor(s) 5 (e.g., a central processing unit (CPU), a graphics processing unit (GPU), or both), and a main memory 10 and static memory 15, which communicate with each other via a bus 20. The computer system 1 may further include a video display 35 (e.g., a liquid crystal display (LCD)). The computer system 1 may also include an alpha-numeric input device(s) 30 (e.g., a keyboard), a cursor control device (e.g., a mouse), a voice recognition or biometric verification unit (not shown), a drive unit 37 (also referred to as disk drive unit), a signal generation device 40 (e.g., a speaker), and a network interface device 45. The computer system 1 may further include a data encryption module (not shown) to encrypt data.
The disk drive unit 37 includes a computer or machine-readable medium 50 on which is stored one or more sets of instructions and data structures (e.g., instructions 55) embodying or utilizing any one or more of the methodologies or functions described herein. The instructions 55 may also reside, completely or at least partially, within the main memory 10 and/or within the processor(s) 5 during execution thereof by the computer system 1. The main memory 10 and the processor(s) 5 may also constitute machine-readable media.
The instructions 55 may further be transmitted or received over a network (see e.g. 105B in FIG. 1) via the network interface device 45 utilizing any one of a number of well-known transfer protocols (e.g., Hyper Text Transfer Protocol (HTTP)). While the machine-readable medium 50 is shown in an example embodiment to be a single medium, the term “computer-readable medium” should be taken to include a single medium or multiple media (e.g., a centralized or distributed database and/or associated caches and servers) that store the one or more sets of instructions. The term “computer-readable medium” shall also be taken to include any medium that is capable of storing, encoding, or carrying a set of instructions for execution by the machine and that causes the machine to perform any one or more of the methodologies of the present application, or that is capable of storing, encoding, or carrying data structures utilized by or associated with such a set of instructions. The term “computer-readable medium” shall accordingly be taken to include, but not be limited to, solid-state memories, optical and magnetic media, and carrier wave signals. Such media may also include, without limitation, hard disks, floppy disks, flash memory cards, digital video disks, random access memory (RAM), read only memory (ROM), and the like. The example embodiments described herein may be implemented in an operating environment comprising software installed on a computer, in hardware, or in a combination of software and hardware.
One skilled in the art will recognize that the Internet service may be configured to provide Internet access to one or more computing devices that are coupled to the Internet service, and that the computing devices may include one or more processors, buses, memory devices, display devices, input/output devices, and the like.
Furthermore, those skilled in the art may appreciate that the Internet service may be coupled to one or more databases, repositories, servers, and the like, which may be utilized in order to implement any of the embodiments of the disclosure as described herein.
According to some embodiments, the present technology can be utilized to calculate a probable maximum loss (PML) for a portfolio of companies in a cyber risk pool. The PML value can include a loss value calculated against various probability likelihood values. For example, a PML probability likelihood value could include 1/10, 1/20, 1/100, or 1/1,000. To be sure, other likelihood values can likewise be utilized in some embodiments. The smaller the likelihood value, the lower the probability of a loss event, and vice versa. In some instances, a smaller likelihood value can be correlated with a cyber risk event or scenario that is not likely to occur but, if it were to occur, would result in a higher PML value than a cyber risk event that is more likely to occur. In one non-limiting example, a smaller likelihood value could be associated with the breach or failure of three out of five shared services between a group of companies. While this type of scenario is unlikely to occur, the PML of this type of event would be greater than what would occur if the loss involved only a single service.
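The relationship between likelihood values and PML described above can be illustrated with a short sketch. The loss distribution, figures, and function name below are illustrative assumptions, not part of the disclosure: a 1-in-N PML is read off as the loss exceeded with probability 1/N in a simulated loss distribution.

```python
import random

def pml_at_likelihood(losses, likelihood):
    """Return the loss exceeded with the given probability.

    `losses` is a list of simulated portfolio losses; `likelihood`
    is an exceedance probability such as 1/10 or 1/1000. The name
    and quantile convention are illustrative assumptions.
    """
    ordered = sorted(losses)
    # The 1/N PML is the loss at the (1 - 1/N) quantile of the
    # simulated distribution: rarer events sit further in the tail.
    index = min(len(ordered) - 1, int((1 - likelihood) * len(ordered)))
    return ordered[index]

random.seed(7)
# Toy loss distribution: most simulated events are small, a few severe.
losses = [random.expovariate(1 / 1_000_000) for _ in range(10_000)]

pml_10 = pml_at_likelihood(losses, 1 / 10)
pml_1000 = pml_at_likelihood(losses, 1 / 1000)
# Consistent with the text: the rarer scenario carries the larger PML.
assert pml_1000 > pml_10
```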
In general, the present technology can be configured to calculate one or more PML values for entities in a cyber risk insurance portfolio. If the PML risk is too great, the mixture of entities within the cyber risk insurance portfolio can be selectively altered and updated PML values calculated. In this way, desired permutations of entities can be placed into a cyber risk insurance portfolio so as to achieve optimal PML values, according to various embodiments. To be sure, the lower the PML values are across the board, the better for the insurer.
In some embodiments, these tactics can be utilized by a large company to analyze its utilization of technology services, non-technology vendors, and business partners amongst its constituent operations and selectively adjust the services to reduce a cyber risk for the company.
As background, cyber security exposure can be analyzed to increase awareness of these threats and assess the impact on an organization's financial strength. Insurers can and will be required to provide detailed information on their specific cyber security insurance policies. Through the utilization of various techniques, various embodiments of the present technology can aggregate exposures and arrive at a probable maximum loss (PML) in the foreseeable future for an insurer or a company.
A quantification of the aggregation paths among service providers, vendors, business partners or insurance policyholders across the cloud-based continuum of services ranging from shared infrastructure to software as a service, other types of service providers, and common exploitable vectors of attack will be crucial to an insurer's or a company's ability to understand and manage the aggregation of risk within their portfolio or operations.
Cyber security exposure is relevant with respect to how a company is protecting itself against cyber security threats, both internal and external. Various embodiments of the present technology provide various methods for determining and aggregating such exposure and calculating PML measurements. These may be used, for example, for underwriting cyber security insurance and other uses.
The present technology, according to various embodiments, provides methodologies which examine aggregate exposures and potential disaster scenarios involving correlated cyber risk. These exemplary methods can be crucial to insurance companies with portfolios of cyber insurance policies and companies with complex supply chains and business partner networks.
It will be understood that the interconnectedness of cyber risk among companies is not necessarily correlated to attributes like physical location and class of business, so carriers must act accordingly and take a more thorough look at the businesses they are underwriting when examining potential aggregated loss scenarios affecting their portfolio. An understanding of a portfolio's exposure to major service providers and other common vectors of attack is beneficial and requires adequate analysis of the potential catastrophe scenarios affecting various insured companies to arrive at reliable measurements of PML.
FIG. 6 illustrates a flowchart 600 for an exemplary method for determining a probable maximum loss for a group of entities, according to various embodiments. As mentioned above, these entities can be selected for inclusion in a possible insurance pool. The selected entities can also comprise entities already in an insurance pool and insured for cyber risk. As also mentioned above, the exemplary method can also be utilized to help large organizations or companies determine a probable maximum loss across an organization by comparing the services utilized by various parts of the organization and performing a random simulation analysis to calculate PML. The company can then adjust the distribution and use of services, recalculate PML, and determine whether an increase or decrease in PML is caused by these redistribution/reorganization efforts.
Thus, the exemplary method implicitly includes a process for identifying a plurality of entities that are or may be included in a cyber risk insurance policy.
The exemplary method in the flowchart 600 begins with, for each of a plurality of entities, receiving, at 605, a set of variables that are indicative of attributes of an entity. These variables may include, but are not limited to: the content delivery network utilized, ISP usage, cloud storage service providers, database technology, any other hardware or software services, as well as non-technology services utilized by the entities.
Next, the exemplary method can comprise comparing, at 610, the sets of variables for the plurality of entities to each other and locating, at 615, clusters of similar variables shared between two or more of the plurality of entities. Thus, entities can be aggregated into groups based on their shared utilization of services. By way of just one example, each of the selected entities may be determined to use Amazon™ AWS and be aggregated based on this shared variable. To be sure, according to various embodiments, many correlations can be determined for the entities based on one or more shared variables.
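The comparing and clustering of steps 610 and 615 can be sketched as follows. The entity names and service variables are hypothetical; only the grouping logic reflects the described method.

```python
from collections import defaultdict

# Hypothetical entity profiles: each maps an entity to the set of
# services (variables) it utilizes. Names are illustrative only.
entities = {
    "AcmeCo": {"aws", "cloudflare", "postgres"},
    "BetaInc": {"aws", "akamai", "mysql"},
    "GammaLLC": {"azure", "cloudflare", "postgres"},
}

# Group entities by each variable, then keep only clusters of
# variables shared between two or more entities (step 615).
clusters = defaultdict(set)
for name, variables in entities.items():
    for var in variables:
        clusters[var].add(name)
shared = {var: members for var, members in clusters.items() if len(members) >= 2}

# AcmeCo and BetaInc are aggregated by their shared use of "aws";
# a service used by a single entity forms no cluster.
assert shared["aws"] == {"AcmeCo", "BetaInc"}
assert "akamai" not in shared
```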
Next, the exemplary method can include determining, at 620, a probable maximum loss for the plurality of entities that share the clusters. As mentioned above, the probable maximum loss can comprise a (monetary) loss value attributed to a cyber breach or other cyber event, against one or more of the shared variables. Again, the PML may be calculated with respect to a probability or likelihood of failure or breach of the shared variables. One non-limiting example is calculating PML with respect to a probability or likelihood of a breach of Amazon™ AWS. According to various embodiments, the PML will be higher if multiple services are involved in a failure or cyber event, and lower if fewer services are breached. Likewise, if the likelihood of a shared service being breached is low, the PML may be affected according to various embodiments. For example, if the likelihood of a service being breached is low but the damage would be widespread, the PML will be higher than if there is a high likelihood of a breach but limited damage.
Once the PML is known, actions can be taken to decrease the PML, if needed. For example, the method can include receiving, at 625, using a processor coupled to a memory, feedback from an end user in response to providing the probable maximum loss to the end user. The feedback can include removing, replacing, or adding entities, which affects the calculation of an updated PML.
Next, the method can include automatically converting, at 630, using the processor and in response to the feedback, the probable maximum loss to an updated probable maximum loss.
According to some embodiments, the method includes providing, at 635, one or more actions to the end user based on the probable maximum loss and the clusters of similar variables. To be sure, enactment by the end user of at least one of the one or more actions may decrease the probable maximum loss. Again, an action could include adding, removing, and/or replacing entities within the group of entities that were analyzed to generate the PML value according to various embodiments.
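The feedback loop of steps 625 through 635 can be illustrated with a toy concentration proxy. A real implementation would rerun the full PML simulation; the entity names and the proxy metric below are assumptions for illustration only.

```python
# Hypothetical portfolio: entity -> set of shared services used.
portfolio = {
    "AcmeCo": {"aws", "cloudflare"},
    "BetaInc": {"aws", "cloudflare"},
    "GammaLLC": {"aws", "azure"},
}

def concentration(entities):
    """Crude stand-in for a PML driver: the largest number of
    entities depending on any single service. (The actual method
    recalculates PML via randomized simulation.)"""
    services = set().union(*entities.values())
    return max(sum(s in v for v in entities.values()) for s in services)

# Feedback action: remove a highly correlated entity and recompute.
before = concentration(portfolio)
after = concentration({k: v for k, v in portfolio.items() if k != "BetaInc"})
assert after < before  # removing the correlated entity reduces the proxy
```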
FIG. 7 illustrates a flowchart 700 for an example method for calculating PML, in one embodiment. The method comprises performing, at 705, a singular vulnerability analysis for each of the entities that share at least one common variable therebetween. That is, each entity will have a set of variables that represent its services or attributes. These services or attributes can increase or decrease the likelihood that the entity will experience a cyber event or loss. The method, according to various embodiments, analyzes each entity individually to determine its vulnerability.
Next, the exemplary method includes performing, at 710, a pairwise vulnerability analysis for each entity in the group. Thus, vulnerability may be determined for each entity in comparison with every other entity in the group that shares at least one common variable. This step can also be referred to as a pair-wise correlation.
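One plausible way to realize the pairwise analysis of step 710 is as a set-overlap correlation between entity variable profiles. The disclosure does not fix a specific metric, so the Jaccard measure and the profiles below are illustrative assumptions.

```python
from itertools import combinations

# Hypothetical per-entity variable sets (names are illustrative).
profiles = {
    "A": {"aws", "cloudflare"},
    "B": {"aws", "akamai"},
    "C": {"azure", "akamai"},
}

def jaccard(x, y):
    """Overlap of two variable sets: one possible stand-in for a
    pairwise vulnerability correlation."""
    return len(x & y) / len(x | y)

# Pair-wise correlation for every pair of entities in the group.
pairwise = {
    (a, b): jaccard(profiles[a], profiles[b])
    for a, b in combinations(sorted(profiles), 2)
}

# Entities sharing a service correlate; disjoint ones do not.
assert pairwise[("A", "B")] > pairwise[("A", "C")]
```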
Both steps 705 and 710 can produce areas of interest that can be utilized as the focus of a PML analysis, as described below. That is, according to various embodiments, both individual and pairwise vulnerabilities are indicative of potential group-wise vulnerabilities, which are in turn indicative of losses that can be expected for the selected group of entities as a whole.
In some embodiments, the method can comprise building, at 715, a covariant matrix using the individual and pairwise vulnerability correlations. FIG. 8 is an exemplary diagram 800 illustrating an example pairwise correlation matrix and a corresponding example covariance matrix. Diagram 800 shows just a non-limiting example embodiment of a way correlated risks may be represented and incorporated into a PML calculation based on the diversity analysis and score.
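A covariance matrix as in step 715 can be assembled from the pairwise correlations together with per-entity loss volatilities. All figures below are invented for illustration; the construction cov[i][j] = corr(i, j) * sigma_i * sigma_j is the standard one.

```python
# Per-entity loss standard deviations and pairwise correlations
# (all numbers illustrative, e.g. from the vulnerability analyses).
names = ["A", "B", "C"]
sigma = {"A": 2.0, "B": 1.5, "C": 1.0}
corr = {("A", "B"): 0.5, ("A", "C"): 0.0, ("B", "C"): 0.25}

def correlation(i, j):
    # Self-correlation is 1; off-diagonal lookups are symmetric.
    if i == j:
        return 1.0
    return corr.get((i, j), corr.get((j, i)))

# Covariance from correlation: cov[i][j] = corr(i, j) * sigma_i * sigma_j.
cov = [[correlation(i, j) * sigma[i] * sigma[j] for j in names] for i in names]

assert cov[0][0] == 4.0          # variance of A = sigma_A**2
assert cov[0][1] == cov[1][0]    # the covariance matrix is symmetric
```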
According to some embodiments, the method includes performing, at 720, randomized simulations of breach events using the covariant matrix and generating, at 725, a probable maximum loss.
As mentioned above, some embodiments include a feedback loop where entities are added, removed, and/or replaced to change the vulnerability analyses, covariant matrix assembly, and PML calculation. In one example, a Monte Carlo analysis can be utilized to perform a randomized simulation on the covariant matrix.
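Steps 720 and 725, randomized simulation over the covariant matrix to generate a PML, might be sketched for two correlated entities as follows. The lognormal severities, all parameter values, and the two-variable Cholesky shortcut for drawing correlated normals are assumptions for illustration, not the disclosed model.

```python
import math
import random

random.seed(42)

# Two entities whose losses are correlated (rho) through a shared
# service; parameters are invented for illustration.
mu_a, sigma_a = 1.0, 0.5   # per-entity mean/std of log-loss
mu_b, sigma_b = 1.2, 0.4
rho = 0.6                  # off-diagonal term from the covariant matrix

portfolio_losses = []
for _ in range(20_000):
    z1, z2 = random.gauss(0, 1), random.gauss(0, 1)
    # Correlate the second draw with the first (2x2 Cholesky factor).
    w = rho * z1 + math.sqrt(1 - rho ** 2) * z2
    loss_a = math.exp(mu_a + sigma_a * z1)   # lognormal severities
    loss_b = math.exp(mu_b + sigma_b * w)
    portfolio_losses.append(loss_a + loss_b)

# PML at the 1-in-100 level: the loss exceeded in 1% of simulations.
portfolio_losses.sort()
pml_100 = portfolio_losses[int(0.99 * len(portfolio_losses))]
# The tail loss exceeds the average simulated loss.
assert pml_100 > sum(portfolio_losses) / len(portfolio_losses)
```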
In various embodiments, the PML calculation may be a distribution of losses (where each loss in that distribution corresponds to a simulated state). The calculation itself can be related to the diversity score of the portfolio via the correlation/covariance matrix in various embodiments. FIG. 9 is a diagram 900 illustrating an example distribution with markings showing the corresponding values in an example PML table.
According to some embodiments, the PML value can include a PML that includes every breach likely or expected for the plurality of entities over a period of time, such as a month or year. Again, the likelihood of the breach event affects the PML calculated according to various embodiments. If events that have a 1 in 10 chance of occurring are considered, their contribution to the PML, according to various embodiments, is likely lower than that of a breach event having a likelihood of occurrence of 1 in 1,000 or 1 in 10,000.
It will be understood that, since the PML calculation process uses randomized simulations (using Monte Carlo analysis, for example) on a covariant matrix, repeated randomized simulations may generate PML values that are not identical to one another. Thus, the exemplary method can optionally include performing a plurality of randomized simulations against the same covariant matrix and obtaining a plurality of PML values. An average or other statistical representation of the plurality of PML values can be obtained if desired.
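Averaging a plurality of simulation runs, as described above, can be sketched as follows. The toy exponential losses stand in for the covariance-driven draws; the function name and parameters are assumptions.

```python
import random
import statistics

def simulate_pml(seed, likelihood=1 / 100, trials=5_000):
    """One randomized run returning a PML estimate (toy model:
    exponential losses stand in for covariance-driven simulation)."""
    rng = random.Random(seed)
    losses = sorted(rng.expovariate(1 / 100.0) for _ in range(trials))
    return losses[int((1 - likelihood) * trials)]

# Repeated runs against the same inputs give slightly different PML
# values; a statistical representation such as the mean is reported.
estimates = [simulate_pml(seed) for seed in range(5)]
mean_pml = statistics.mean(estimates)

assert len(set(estimates)) > 1                    # runs are not identical
assert min(estimates) <= mean_pml <= max(estimates)
```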
It will be understood that the system of FIG. 1 can be adapted to perform the methods described herein. For example, the system 100 can further comprise a simulation and PML module 141 that is stored in memory 115. The simulation and PML module 141 can perform the individual and pairwise analyses, as well as the covariant matrix construction and random simulation processes. Also, the simulation and PML module 141 can be configured to calculate the PML values from the covariant matrix construction and random simulation processes.
The corresponding structures, materials, acts, and equivalents of all means or step plus function elements in the claims below are intended to include any structure, material, or act for performing the function in combination with other claimed elements as specifically claimed. The description of the present technology has been presented for purposes of illustration and description, but is not intended to be exhaustive or limited to the present technology in the form disclosed. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the present technology. Exemplary embodiments were chosen and described in order to best explain the principles of the present technology and its practical application, and to enable others of ordinary skill in the art to understand the present technology for various embodiments with various modifications as are suited to the particular use contemplated.
Aspects of the present technology are described above with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the present technology. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer readable medium that can direct a computer, other programmable data processing apparatus, or other devices to function in a particular manner, such that the instructions stored in the computer readable medium produce an article of manufacture including instructions which implement the function/act specified in the flowchart and/or block diagram block or blocks.
The computer program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other devices to cause a series of operational steps to be performed on the computer, other programmable apparatus or other devices to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide processes for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
The flowchart and block diagrams in the Figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present technology. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems that perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular embodiments, procedures, techniques, etc. in order to provide a thorough understanding of the present invention. However, it will be apparent to one skilled in the art that the present invention may be practiced in other embodiments that depart from these specific details.
Reference throughout this specification to “one embodiment” or “an embodiment” means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the present invention. Thus, the appearances of the phrases “in one embodiment” or “in an embodiment” or “according to one embodiment” (or other phrases having similar import) at various places throughout this specification are not necessarily all referring to the same embodiment. Furthermore, the particular features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. Furthermore, depending on the context of discussion herein, a singular term may include its plural forms and a plural term may include its singular form. Similarly, a hyphenated term (e.g., “on-demand”) may be occasionally interchangeably used with its non-hyphenated version (e.g., “on demand”), a capitalized entry (e.g., “Software”) may be interchangeably used with its non-capitalized version (e.g., “software”), a plural term may be indicated with or without an apostrophe (e.g., PE's or PEs), and an italicized term (e.g., “N+1”) may be interchangeably used with its non-italicized version (e.g., “N+1”). Such occasional interchangeable uses shall not be considered inconsistent with each other.
Also, some embodiments may be described in terms of “means for” performing a task or set of tasks. It will be understood that a “means for” may be expressed herein in terms of a structure, such as a processor, a memory, an I/O device such as a camera, or combinations thereof. Alternatively, the “means for” may include an algorithm that is descriptive of a function or method step, while in yet other embodiments the “means for” is expressed in terms of a mathematical formula, prose, or as a flow chart or signal diagram.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. As used herein, the singular forms “a”, “an” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms “comprises” and/or “comprising,” when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It is noted at the outset that the terms “coupled,” “connected”, “connecting,” “electrically connected,” etc., are used interchangeably herein to generally refer to the condition of being electrically/electronically connected. Similarly, a first entity is considered to be in “communication” with a second entity (or entities) when the first entity electrically sends and/or receives (whether through wireline or wireless means) information signals (whether containing data information or non-data/control information) to the second entity regardless of the type (analog or digital) of those signals. It is further noted that various figures (including component diagrams) shown and discussed herein are for illustrative purpose only, and are not drawn to scale.
While specific embodiments of, and examples for, the system are described above for illustrative purposes, various equivalent modifications are possible within the scope of the system, as those skilled in the relevant art will recognize. For example, while processes or steps are presented in a given order, alternative embodiments may perform routines having steps in a different order, and some processes or steps may be deleted, moved, added, subdivided, combined, and/or modified to provide alternative or sub-combinations. Each of these processes or steps may be implemented in a variety of different ways. Also, while processes or steps are at times shown as being performed in series, these processes or steps may instead be performed in parallel, or may be performed at different times.
While various embodiments have been described above, it should be understood that they have been presented by way of example only, and not limitation. The descriptions are not intended to limit the scope of the invention to the particular forms set forth herein. To the contrary, the present descriptions are intended to cover such alternatives, modifications, and equivalents as may be included within the spirit and scope of the invention as defined by the appended claims and otherwise appreciated by one of ordinary skill in the art. Thus, the breadth and scope of a preferred embodiment should not be limited by any of the above-described exemplary embodiments.

Claims (28)

What is claimed is:
1. A method, comprising:
for each of a plurality of entities, receiving a set of variables that are indicative of attributes of an entity;
comparing the sets of variables for the plurality of entities to each other;
locating clusters of similar variables shared between two or more of the plurality of entities;
determining a probable maximum loss for the plurality of entities that share the clusters, the probable maximum loss comprising a loss value attributed to a cyber event against one or more of the shared variables;
receiving, using a processor coupled to a memory, feedback from an end user in response to providing the probable maximum loss to the end user;
automatically converting, using the processor and in response to the feedback, the probable maximum loss to an updated probable maximum loss; and
providing one or more actions to the end user based on the probable maximum loss and the clusters of similar variables, enactment by the end user of at least one of the one or more actions decreasing the probable maximum loss; the providing of one of the one or more actions comprising providing the end user with a list of a subset of entities of the plurality of entities that, if the subset of entities was removed from the plurality of entities, the probable maximum loss is lowered.
2. The method according to claim 1, wherein the probable maximum loss is calculated for a likelihood value that represents how likely the cyber event is for the one or more of the shared variables.
3. The method according to claim 1, wherein determining a probable maximum loss further comprises:
determining a likelihood of the cyber event for each of the plurality of entities; and
performing a pairwise analysis of the likelihood of the cyber event for all pairs of the plurality of entities.
4. The method according to claim 3, wherein determining a probable maximum loss further comprises performing a randomized scenario analysis for the plurality of entities.
5. The method according to claim 1, further comprising calculating a diversity score that represents how different the plurality of entities are from one another based on variables that are not shared between the plurality of entities.
6. The method according to claim 5, further comprising:
receiving feedback from the end user in response to providing the diversity score to the end user; and
updating the diversity score in response to the feedback.
7. The method according to claim 6, wherein the diversity score comprises a determination of diversity with regard to aggregate cyber security risk, the cyber security risk including risks associated with computer networks, the diversity score being configured at least for planning for increasing diversity to reduce the aggregate cyber security risk.
8. The method according to claim 7, further comprising calculating a similarity score based on the clusters of similar variables.
9. The method according to claim 1, wherein the set of variables are selected by the end user.
10. The method according to claim 1, wherein each of the set of variables is provided with a weight that affects how the probable maximum loss is calculated.
11. The method according to claim 10, further comprising receiving, from the end user, an adjustment to the weight of the one of the set of variables.
12. The method according to claim 1, wherein an action of the one or more actions comprises providing the end user with a variable profile of a target entity that, when added to the plurality of entities, decreases the probable maximum loss.
13. The method according to claim 1, wherein an action of the one or more actions comprises providing the end user with a list of entities of the plurality of entities that are increasing the probable maximum loss.
14. The method according to claim 1, further comprising:
receiving a variable profile for a new entity;
adding a set of variables of the new entity to the comparison and probable maximum loss calculation; and
alerting the end user if the addition of the new entity decreases a diversity score.
15. The method according to claim 6, further comprising determining common local attributes shared between two or more of the plurality of entities.
16. A method, comprising:
for each of a plurality of entities, receiving a set of variables that are indicative of attributes of an entity;
receiving from an end user, selection of variables that comprise a subset of the set of variables that the end user desires to analyze;
comparing the subsets of variables for the plurality of entities to each other;
locating clusters of similar variables shared between two or more of the plurality of entities;
determining a probable maximum loss for the plurality of entities that share the clusters, the probable maximum loss comprising a loss value attributed to a cyber event against one or more of the shared variables; and
providing, using a processor coupled to a memory, one or more actions to the end user based on the probable maximum loss and the clusters of similar variables, enactment by the end user of at least one of the one or more actions decreasing the probable maximum loss; the providing of one of the one or more actions comprising providing the end user with a list of a subset of entities of the plurality of entities that, if the subset of entities was removed from the plurality of entities, the probable maximum loss is lowered.
17. The method according to claim 16, wherein an action of the one or more actions comprises removal or replacement of one or more of the plurality of entities with a replacement entity that comprises different variables.
18. The method according to claim 16, wherein the probable maximum loss is calculated for a likelihood value that represents how likely the cyber event is for the one or more of the shared variables.
19. The method according to claim 16, wherein determining a probable maximum loss further comprises:
determining a likelihood of the cyber event for each of the plurality of entities; and
performing a pairwise analysis of the likelihood of the cyber event for all pairs of the plurality of entities.
20. The method according to claim 19, wherein determining a probable maximum loss further comprises performing a randomized scenario analysis for the plurality of entities.
21. The method according to claim 16, wherein an action of the one or more actions comprises providing the end user with a variable profile of a target entity that, when added to the plurality of entities, decreases the probable maximum loss.
22. The method according to claim 16, wherein an action of the one or more actions comprises providing the end user with a list of entities of the plurality of entities that are increasing the probable maximum loss.
23. A system, comprising:
a processor; and
a memory for storing executable instructions, the processor executing the instructions to:
compare sets of variables for a plurality of entities to each other, each of the sets of variables being indicative of attributes of an entity;
locate clusters of similar variables shared between two or more of the plurality of entities;
calculate a probable maximum loss for the plurality of entities that share the clusters by:
determining singular vulnerabilities of each of the plurality of entities;
determining vulnerabilities of pairs of the plurality of entities; and
performing a randomized scenario analysis using the singular and pairwise vulnerabilities, the probable maximum loss comprising a loss value;
receive, using the processor, feedback from an end user in response to providing the probable maximum loss to the end user;
automatically convert, using the processor and in response to the feedback, the probable maximum loss to an updated probable maximum loss; and
provide one or more actions to the end user based on the probable maximum loss and the clusters of similar variables, enactment by the end user of at least one of the one or more actions decreasing the probable maximum loss; the providing of one of the one or more actions comprising providing the end user with a list of a subset of entities of the plurality of entities that, if the subset of entities was removed from the plurality of entities, the probable maximum loss is lowered.
24. The system according to claim 23, further comprising:
generating a covariant matrix using the individual and pairwise vulnerability correlations; and
performing randomized simulations using the covariant matrix to generate the probable maximum loss.
25. The system according to claim 23, wherein the one or more actions comprises removal or replacement of one or more of the plurality of entities with a replacement entity that comprises different variables.
26. The system according to claim 23, wherein the probable maximum loss is calculated for a likelihood value that represents how likely the cyber event is for the one or more of the shared variables.
27. The system according to claim 23, wherein determining a probable maximum loss further comprises:
determining a likelihood of a cyber event for each of the plurality of entities; and
performing a pairwise analysis of the likelihood of a cyber event for all pairs of the plurality of entities.
28. The system according to claim 23, wherein determining a probable maximum loss further comprises performing a randomized scenario analysis for the plurality of entities, wherein a Monte Carlo analysis is used for at least one of the randomized scenario analyses.
US14/931,510 2014-12-29 2015-11-03 Diversity analysis with actionable feedback methodologies Active US9373144B1 (en)

Priority Applications (16)

Application Number Priority Date Filing Date Title
US14/931,510 US9373144B1 (en) 2014-12-29 2015-11-03 Diversity analysis with actionable feedback methodologies
US15/099,297 US10341376B2 (en) 2014-12-29 2016-04-14 Diversity analysis with actionable feedback methodologies
US15/141,779 US9521160B2 (en) 2014-12-29 2016-04-28 Inferential analysis using feedback for extracting and combining cyber risk information
US15/142,997 US9699209B2 (en) 2014-12-29 2016-04-29 Cyber vulnerability scan analyses with actionable feedback
PCT/US2016/058711 WO2017078986A1 (en) 2014-12-29 2016-10-25 Diversity analysis with actionable feedback methodologies
US15/371,047 US10230764B2 (en) 2014-12-29 2016-12-06 Inferential analysis using feedback for extracting and combining cyber risk information
US15/373,298 US10050989B2 (en) 2014-12-29 2016-12-08 Inferential analysis using feedback for extracting and combining cyber risk information including proxy connection analyses
US15/374,212 US10050990B2 (en) 2014-12-29 2016-12-09 Disaster scenario based inferential analysis using feedback for extracting and combining cyber risk information
US15/457,921 US10218736B2 (en) 2014-12-29 2017-03-13 Cyber vulnerability scan analyses with actionable feedback
US15/971,946 US10511635B2 (en) 2014-12-29 2018-05-04 Inferential analysis using feedback for extracting and combining cyber risk information
US15/972,027 US10498759B2 (en) 2014-12-29 2018-05-04 Disaster scenario based inferential analysis using feedback for extracting and combining cyber risk information
US15/971,909 US10491624B2 (en) 2014-12-29 2018-05-04 Cyber vulnerability scan analyses with actionable feedback
US16/582,977 US11146585B2 (en) 2014-12-29 2019-09-25 Disaster scenario based inferential analysis using feedback for extracting and combining cyber risk information
US16/662,936 US11153349B2 (en) 2014-12-29 2019-10-24 Inferential analysis using feedback for extracting and combining cyber risk information
US17/465,739 US11855768B2 (en) 2014-12-29 2021-09-02 Disaster scenario based inferential analysis using feedback for extracting and combining cyber risk information
US17/477,294 US11863590B2 (en) 2014-12-29 2021-09-16 Inferential analysis using feedback for extracting and combining cyber risk information

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US14/585,051 US9253203B1 (en) 2014-12-29 2014-12-29 Diversity analysis with actionable feedback methodologies
US14/931,510 US9373144B1 (en) 2014-12-29 2015-11-03 Diversity analysis with actionable feedback methodologies

Related Parent Applications (5)

Application Number Title Priority Date Filing Date
US14/585,051 Continuation-In-Part US9253203B1 (en) 2014-12-29 2014-12-29 Diversity analysis with actionable feedback methodologies
US201514614897A Continuation 2014-12-29 2015-02-05
US15/099,297 Continuation-In-Part US10341376B2 (en) 2014-12-29 2016-04-14 Diversity analysis with actionable feedback methodologies
US15/141,779 Continuation-In-Part US9521160B2 (en) 2014-12-29 2016-04-28 Inferential analysis using feedback for extracting and combining cyber risk information
US15/142,997 Continuation-In-Part US9699209B2 (en) 2014-12-29 2016-04-29 Cyber vulnerability scan analyses with actionable feedback

Related Child Applications (4)

Application Number Title Priority Date Filing Date
PCT/US2015/067968 Continuation-In-Part WO2016109608A1 (en) 2014-12-29 2015-12-29 System for cyber insurance policy including cyber risk assessment/management service
US15/099,297 Continuation-In-Part US10341376B2 (en) 2014-12-29 2016-04-14 Diversity analysis with actionable feedback methodologies
US15/141,779 Continuation-In-Part US9521160B2 (en) 2014-12-29 2016-04-28 Inferential analysis using feedback for extracting and combining cyber risk information
US15/142,997 Continuation-In-Part US9699209B2 (en) 2014-12-29 2016-04-29 Cyber vulnerability scan analyses with actionable feedback

Publications (2)

Publication Number Publication Date
US9373144B1 true US9373144B1 (en) 2016-06-21
US20160189301A1 US20160189301A1 (en) 2016-06-30

Family

ID=55175096

Family Applications (2)

Application Number Title Priority Date Filing Date
US14/585,051 Active US9253203B1 (en) 2014-12-29 2014-12-29 Diversity analysis with actionable feedback methodologies
US14/931,510 Active US9373144B1 (en) 2014-12-29 2015-11-03 Diversity analysis with actionable feedback methodologies

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US14/585,051 Active US9253203B1 (en) 2014-12-29 2014-12-29 Diversity analysis with actionable feedback methodologies

Country Status (3)

Country Link
US (2) US9253203B1 (en)
TW (1) TW201636937A (en)
WO (1) WO2016109162A1 (en)

Cited By (25)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160234247A1 (en) 2014-12-29 2016-08-11 Cyence Inc. Diversity Analysis with Actionable Feedback Methodologies
US9537884B1 (en) * 2016-06-01 2017-01-03 Cyberpoint International Llc Assessment of cyber threats
US9699209B2 (en) 2014-12-29 2017-07-04 Cyence Inc. Cyber vulnerability scan analyses with actionable feedback
US10050990B2 (en) 2014-12-29 2018-08-14 Guidewire Software, Inc. Disaster scenario based inferential analysis using feedback for extracting and combining cyber risk information
US10050989B2 (en) 2014-12-29 2018-08-14 Guidewire Software, Inc. Inferential analysis using feedback for extracting and combining cyber risk information including proxy connection analyses
US10230764B2 (en) 2014-12-29 2019-03-12 Guidewire Software, Inc. Inferential analysis using feedback for extracting and combining cyber risk information
US10404748B2 (en) 2015-03-31 2019-09-03 Guidewire Software, Inc. Cyber risk analysis and remediation using network monitored sensors and methods of use
US10970787B2 (en) * 2015-10-28 2021-04-06 Qomplx, Inc. Platform for live issuance and management of cyber insurance policies
US11032244B2 (en) 2019-09-30 2021-06-08 BitSight Technologies, Inc. Systems and methods for determining asset importance in security risk management
US11050779B1 (en) 2020-01-29 2021-06-29 BitSight Technologies, Inc. Systems and methods for assessing cybersecurity state of entities based on computer network characterization
US11126723B2 (en) 2018-10-25 2021-09-21 BitSight Technologies, Inc. Systems and methods for remote detection of software through browser webinjects
US11200323B2 (en) 2018-10-17 2021-12-14 BitSight Technologies, Inc. Systems and methods for forecasting cybersecurity ratings based on event-rate scenarios
US11265330B2 (en) 2020-02-26 2022-03-01 BitSight Technologies, Inc. Systems and methods for improving a security profile of an entity based on peer security profiles
US11329878B2 (en) 2019-09-26 2022-05-10 BitSight Technologies, Inc. Systems and methods for network asset discovery and association thereof with entities
US11627109B2 (en) 2017-06-22 2023-04-11 BitSight Technologies, Inc. Methods for mapping IP addresses and domains to organizations using user activity data
US11652834B2 (en) 2013-09-09 2023-05-16 BitSight Technologies, Inc. Methods for using organizational behavior for risk ratings
US11671441B2 (en) 2018-04-17 2023-06-06 BitSight Technologies, Inc. Systems and methods for external detection of misconfigured systems
US11675912B2 (en) 2019-07-17 2023-06-13 BitSight Technologies, Inc. Systems and methods for generating security improvement plans for entities
US11689555B2 (en) 2020-12-11 2023-06-27 BitSight Technologies, Inc. Systems and methods for cybersecurity risk mitigation and management
US11720679B2 (en) 2020-05-27 2023-08-08 BitSight Technologies, Inc. Systems and methods for managing cybersecurity alerts
US11770401B2 (en) 2018-03-12 2023-09-26 BitSight Technologies, Inc. Correlated risk in cybersecurity
US11777976B2 (en) 2010-09-24 2023-10-03 BitSight Technologies, Inc. Information technology security assessment system
US11777983B2 (en) 2020-01-31 2023-10-03 BitSight Technologies, Inc. Systems and methods for rapidly generating security ratings
US11855768B2 (en) 2014-12-29 2023-12-26 Guidewire Software, Inc. Disaster scenario based inferential analysis using feedback for extracting and combining cyber risk information
US11863590B2 (en) 2014-12-29 2024-01-02 Guidewire Software, Inc. Inferential analysis using feedback for extracting and combining cyber risk information

Families Citing this family (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10938822B2 (en) * 2013-02-15 2021-03-02 Rpr Group Holdings, Llc System and method for processing computer inputs over a data communication network
US9521160B2 (en) 2014-12-29 2016-12-13 Cyence Inc. Inferential analysis using feedback for extracting and combining cyber risk information
US9253203B1 (en) 2014-12-29 2016-02-02 Cyence Inc. Diversity analysis with actionable feedback methodologies
US20170134418A1 (en) * 2015-10-16 2017-05-11 Daniel Minoli System and method for a uniform measure and assessement of an institution's aggregate cyber security risk and of the institution's cybersecurity confidence index.
US11182720B2 (en) * 2016-02-16 2021-11-23 BitSight Technologies, Inc. Relationships among technology assets and services and the entities responsible for them
CN106453417B (en) * 2016-12-05 2019-01-22 国网浙江省电力有限公司电力科学研究院 A kind of network attack target prediction method based on neighbour's similitude
US9930062B1 (en) 2017-06-26 2018-03-27 Factory Mutual Insurance Company Systems and methods for cyber security risk assessment
US20200265267A1 (en) * 2019-02-20 2020-08-20 Sony Interactive Entertainment LLC Creating Diversity in Artificial Intelligence and Machine Learning
US20210058421A1 (en) 2019-08-23 2021-02-25 BitSight Technologies, Inc. Systems and methods for inferring entity relationships via network communications of users or user devices
US11556635B2 (en) 2020-04-28 2023-01-17 Bank Of America Corporation System for evaluation and weighting of resource usage activity
US20220236977A1 (en) * 2021-01-22 2022-07-28 SQOR Technologies Inc. System and method for analyzing first party data from one or more software tools
CN114936942B (en) * 2022-07-21 2022-11-01 深圳市绽放工场科技有限公司 Computer network data processing and analyzing system and method for insurance users

Citations (57)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6269349B1 (en) 1999-09-21 2001-07-31 A6B2, Inc. Systems and methods for protecting private information
US20020026335A1 (en) 2000-07-21 2002-02-28 Tadashi Honda Data security insurance system
US20020091551A1 (en) 2000-09-19 2002-07-11 Robert Parisi Internet insurance product
US20030014344A1 (en) 2001-06-05 2003-01-16 Varkki Chacko System and method for determining the liquidity of a credit
US20030028803A1 (en) * 2001-05-18 2003-02-06 Bunker Nelson Waldo Network vulnerability assessment system and method
US20030040942A1 (en) 2001-08-23 2003-02-27 Hooten Joseph E. System and method for insuring personal computers against damage caused by computer viruses
US20030084349A1 (en) 2001-10-12 2003-05-01 Oliver Friedrichs Early warning system for network attacks
US20030126049A1 (en) * 2001-12-31 2003-07-03 Nagan Douglas A. Programmed assessment of technological, legal and management risks
US20030135758A1 (en) 2001-07-19 2003-07-17 Turner Elliot B. System and method for detecting network events
US20030154393A1 (en) 2002-02-12 2003-08-14 Carl Young Automated security management
US20040006532A1 (en) 2001-03-20 2004-01-08 David Lawrence Network access risk management
US20040010709A1 (en) 2002-04-29 2004-01-15 Claude R. Baudoin Security maturity assessment method
US20040024693A1 (en) 2001-03-20 2004-02-05 David Lawrence Proprietary risk management clearinghouse
US20040064726A1 (en) 2002-09-30 2004-04-01 Mario Girouard Vulnerability management and tracking system (VMTS)
US20050044418A1 (en) 2003-07-25 2005-02-24 Gary Miliefsky Proactive network security system to protect against hackers
US20050261943A1 (en) * 2004-03-23 2005-11-24 Quarterman John S Method, system, and service for quantifying network risk to price insurance premiums and bonds
US20050278786A1 (en) 2004-06-09 2005-12-15 Tippett Peter S System and method for assessing risk to a collection of information resources
US7047419B2 (en) 1999-09-17 2006-05-16 Pen-One Inc. Data security system
US20070192867A1 (en) 2003-07-25 2007-08-16 Miliefsky Gary S Security appliances
US20080016563A1 (en) 2006-07-12 2008-01-17 Verizon Services Corp. Systems and methods for measuring cyber based risks in an enterprise organization
US7324952B2 (en) 2001-08-29 2008-01-29 International Business Machines Corporation Insurance method, insurance system, transaction monitoring method, transaction monitoring system, and program
US20080047016A1 (en) 2006-08-16 2008-02-21 Cybrinth, Llc CCLIF: A quantified methodology system to assess risk of IT architectures and cyber operations
US20080167920A1 (en) 2006-11-29 2008-07-10 Robert Schmidt Methods and apparatus for developing cyber defense processes and a cadre of expertise
US20080250064A1 (en) 2007-01-17 2008-10-09 Aptima, Inc. Method and system to compare data entities
US20090024663A1 (en) 2007-07-19 2009-01-22 Mcgovern Mark D Techniques for Information Security Assessment
US20090319342A1 (en) 2008-06-19 2009-12-24 Wize, Inc. System and method for aggregating and summarizing product/topic sentiment
US7680659B2 (en) 2005-06-01 2010-03-16 Microsoft Corporation Discriminative training for language modeling
US7711646B2 (en) 1999-09-10 2010-05-04 Transurety, Llc Methods and apparatus for providing coverage for receiver of transmission data
US20100114634A1 (en) 2007-04-30 2010-05-06 James Christiansen Method and system for assessing, managing, and monitoring information technology risk
US20100205014A1 (en) 2009-02-06 2010-08-12 Cary Sholer Method and system for providing response services
US20100229187A1 (en) 2009-03-06 2010-09-09 Manish Marwah Event detection from attributes read by entities
US20110078073A1 (en) 2009-09-30 2011-03-31 Suresh Kumar Annappindi System and method for predicting consumer credit risk using income risk based credit score
US20110154497A1 (en) 2009-12-17 2011-06-23 American Express Travel Related Services Company, Inc. Systems, methods, and computer program products for collecting and reporting sensor data in a communication network
US20120041790A1 (en) 2007-10-24 2012-02-16 Koziol Joseph D Insurance Transaction System and Method
US20120089617A1 (en) 2011-12-14 2012-04-12 Patrick Frey Enhanced search system and method based on entity ranking
US20120096558A1 (en) 2009-05-27 2012-04-19 Quantar Solutions Limited Assessing Threat to at Least One Computer Network
US20120215575A1 (en) 2011-02-22 2012-08-23 Bank Of America Corporation Risk Assessment And Prioritization Framework
US20130055404A1 (en) 2010-04-01 2013-02-28 21Ct, Inc. System And Method For Providing Impact Modeling And Prediction Of Attacks On Cyber Targets
US8448245B2 (en) 2009-01-17 2013-05-21 Stopthehacker.com, Jaal LLC Automated identification of phishing, phony and malicious web sites
US8468599B2 (en) 2010-09-20 2013-06-18 Sonalysts, Inc. System and method for privacy-enhanced cyber data fusion using temporal-behavioral aggregation and analysis
US8484066B2 (en) 2003-06-09 2013-07-09 Greenline Systems, Inc. System and method for risk detection reporting and infrastructure
US20130283336A1 (en) 2012-04-23 2013-10-24 Abb Technology Ag Cyber security analyzer
US8577775B1 (en) 2012-08-31 2013-11-05 Sander Gerber Systems and methods for managing investments
US8601587B1 (en) 2009-09-04 2013-12-03 Raytheon Company System, method, and software for cyber threat analysis
US20130347060A1 (en) 2012-04-23 2013-12-26 Verint Systems Ltd. Systems and methods for combined physical and cyber data security
US20140019171A1 (en) 2007-10-24 2014-01-16 Joseph D. Koziol Insurance Transaction System and Method
WO2014036396A1 (en) 2012-08-31 2014-03-06 Gerber Sander Systems and methods for measuring relationships between investments and other variables
US20140067716A1 (en) 2012-08-31 2014-03-06 Sander Gerber Systems and methods for measuring relationships between investments and other variables
US20140067713A1 (en) 2012-08-31 2014-03-06 Sander Gerber Systems and methods for managing investments
US8699767B1 (en) 2006-10-06 2014-04-15 Hrl Laboratories, Llc System for optimal rapid serial visual presentation (RSVP) from user-specific neural brain signals
US20140137257A1 (en) 2012-11-12 2014-05-15 Board Of Regents, The University Of Texas System System, Method and Apparatus for Assessing a Risk of One or More Assets Within an Operational Technology Infrastructure
US20140200930A1 (en) 2001-11-28 2014-07-17 Deloitte Development Llc Methods and Systems for Determining the Importance of Individual Variables in Statistical Models
US20140379708A1 (en) 2011-10-05 2014-12-25 Amazon Technologies, Inc. Diversity Within Search Results
US20150088595A1 (en) 2013-09-25 2015-03-26 General Electric Company Systems and Methods for Evaluating Risks Associated with a Contractual Service Agreement
US9031951B1 (en) 2012-04-02 2015-05-12 Google Inc. Associating interest and disinterest keywords with similar and dissimilar users
US20150331932A1 (en) * 2014-05-13 2015-11-19 Sas Institute Inc. Techniques for generating a clustered representation of a network based on node data
US9253203B1 (en) 2014-12-29 2016-02-02 Cyence Inc. Diversity analysis with actionable feedback methodologies

Patent Citations (58)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7711646B2 (en) 1999-09-10 2010-05-04 Transurety, Llc Methods and apparatus for providing coverage for receiver of transmission data
US7047419B2 (en) 1999-09-17 2006-05-16 Pen-One Inc. Data security system
US6269349B1 (en) 1999-09-21 2001-07-31 A6B2, Inc. Systems and methods for protecting private information
US20020026335A1 (en) 2000-07-21 2002-02-28 Tadashi Honda Data security insurance system
US20020091551A1 (en) 2000-09-19 2002-07-11 Robert Parisi Internet insurance product
US20040006532A1 (en) 2001-03-20 2004-01-08 David Lawrence Network access risk management
US20040024693A1 (en) 2001-03-20 2004-02-05 David Lawrence Proprietary risk management clearinghouse
US20030028803A1 (en) * 2001-05-18 2003-02-06 Bunker Nelson Waldo Network vulnerability assessment system and method
US20030014344A1 (en) 2001-06-05 2003-01-16 Varkki Chacko System and method for determining the liquidity of a credit
US20030135758A1 (en) 2001-07-19 2003-07-17 Turner Elliot B. System and method for detecting network events
US20030040942A1 (en) 2001-08-23 2003-02-27 Hooten Joseph E. System and method for insuring personal computers against damage caused by computer viruses
US7324952B2 (en) 2001-08-29 2008-01-29 International Business Machines Corporation Insurance method, insurance system, transaction monitoring method, transaction monitoring system, and program
US20030084349A1 (en) 2001-10-12 2003-05-01 Oliver Friedrichs Early warning system for network attacks
US20140200930A1 (en) 2001-11-28 2014-07-17 Deloitte Development Llc Methods and Systems for Determining the Importance of Individual Variables in Statistical Models
US20030126049A1 (en) * 2001-12-31 2003-07-03 Nagan Douglas A. Programmed assessment of technological, legal and management risks
US20030154393A1 (en) 2002-02-12 2003-08-14 Carl Young Automated security management
US20040010709A1 (en) 2002-04-29 2004-01-15 Claude R. Baudoin Security maturity assessment method
US20040064726A1 (en) 2002-09-30 2004-04-01 Mario Girouard Vulnerability management and tracking system (VMTS)
US8484066B2 (en) 2003-06-09 2013-07-09 Greenline Systems, Inc. System and method for risk detection reporting and infrastructure
US20070192867A1 (en) 2003-07-25 2007-08-16 Miliefsky Gary S Security appliances
US20050044418A1 (en) 2003-07-25 2005-02-24 Gary Miliefsky Proactive network security system to protect against hackers
US20050261943A1 (en) * 2004-03-23 2005-11-24 Quarterman John S Method, system, and service for quantifying network risk to price insurance premiums and bonds
US8494955B2 (en) * 2004-03-23 2013-07-23 John S. Quarterman Method, system, and service for quantifying network risk to price insurance premiums and bonds
US20050278786A1 (en) 2004-06-09 2005-12-15 Tippett Peter S System and method for assessing risk to a collection of information resources
US7680659B2 (en) 2005-06-01 2010-03-16 Microsoft Corporation Discriminative training for language modeling
US20080016563A1 (en) 2006-07-12 2008-01-17 Verizon Services Corp. Systems and methods for measuring cyber based risks in an enterprise organization
US20080047016A1 (en) 2006-08-16 2008-02-21 Cybrinth, Llc CCLIF: A quantified methodology system to assess risk of IT architectures and cyber operations
US8699767B1 (en) 2006-10-06 2014-04-15 Hrl Laboratories, Llc System for optimal rapid serial visual presentation (RSVP) from user-specific neural brain signals
US20080167920A1 (en) 2006-11-29 2008-07-10 Robert Schmidt Methods and apparatus for developing cyber defense processes and a cadre of expertise
US20080250064A1 (en) 2007-01-17 2008-10-09 Aptima, Inc. Method and system to compare data entities
US20100114634A1 (en) 2007-04-30 2010-05-06 James Christiansen Method and system for assessing, managing, and monitoring information technology risk
US20090024663A1 (en) 2007-07-19 2009-01-22 Mcgovern Mark D Techniques for Information Security Assessment
US20120041790A1 (en) 2007-10-24 2012-02-16 Koziol Joseph D Insurance Transaction System and Method
US20140019171A1 (en) 2007-10-24 2014-01-16 Joseph D. Koziol Insurance Transaction System and Method
US20090319342A1 (en) 2008-06-19 2009-12-24 Wize, Inc. System and method for aggregating and summarizing product/topic sentiment
US8448245B2 (en) 2009-01-17 2013-05-21 Stopthehacker.com, Jaal LLC Automated identification of phishing, phony and malicious web sites
US20100205014A1 (en) 2009-02-06 2010-08-12 Cary Sholer Method and system for providing response services
US20100229187A1 (en) 2009-03-06 2010-09-09 Manish Marwah Event detection from attributes read by entities
US20120096558A1 (en) 2009-05-27 2012-04-19 Quantar Solutions Limited Assessing Threat to at Least One Computer Network
US8601587B1 (en) 2009-09-04 2013-12-03 Raytheon Company System, method, and software for cyber threat analysis
US20110078073A1 (en) 2009-09-30 2011-03-31 Suresh Kumar Annappindi System and method for predicting consumer credit risk using income risk based credit score
US20110154497A1 (en) 2009-12-17 2011-06-23 American Express Travel Related Services Company, Inc. Systems, methods, and computer program products for collecting and reporting sensor data in a communication network
US20130055404A1 (en) 2010-04-01 2013-02-28 21Ct, Inc. System And Method For Providing Impact Modeling And Prediction Of Attacks On Cyber Targets
US8468599B2 (en) 2010-09-20 2013-06-18 Sonalysts, Inc. System and method for privacy-enhanced cyber data fusion using temporal-behavioral aggregation and analysis
US20120215575A1 (en) 2011-02-22 2012-08-23 Bank Of America Corporation Risk Assessment And Prioritization Framework
US20140379708A1 (en) 2011-10-05 2014-12-25 Amazon Technologies, Inc. Diversity Within Search Results
US20120089617A1 (en) 2011-12-14 2012-04-12 Patrick Frey Enhanced search system and method based on entity ranking
US9031951B1 (en) 2012-04-02 2015-05-12 Google Inc. Associating interest and disinterest keywords with similar and dissimilar users
US20130283336A1 (en) 2012-04-23 2013-10-24 Abb Technology Ag Cyber security analyzer
US20130347060A1 (en) 2012-04-23 2013-12-26 Verint Systems Ltd. Systems and methods for combined physical and cyber data security
US8577775B1 (en) 2012-08-31 2013-11-05 Sander Gerber Systems and methods for managing investments
US20140067713A1 (en) 2012-08-31 2014-03-06 Sander Gerber Systems and methods for managing investments
US20140067716A1 (en) 2012-08-31 2014-03-06 Sander Gerber Systems and methods for measuring relationships between investments and other variables
WO2014036396A1 (en) 2012-08-31 2014-03-06 Gerber Sander Systems and methods for measuring relationships between investments and other variables
US20140137257A1 (en) 2012-11-12 2014-05-15 Board Of Regents, The University Of Texas System System, Method and Apparatus for Assessing a Risk of One or More Assets Within an Operational Technology Infrastructure
US20150088595A1 (en) 2013-09-25 2015-03-26 General Electric Company Systems and Methods for Evaluating Risks Associated with a Contractual Service Agreement
US20150331932A1 (en) * 2014-05-13 2015-11-19 Sas Institute Inc. Techniques for generating a clustered representation of a network based on node data
US9253203B1 (en) 2014-12-29 2016-02-02 Cyence Inc. Diversity analysis with actionable feedback methodologies

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
Böhme et al., "Models and Measures for Correlation in Cyber-Insurance," Workshop on the Economics of Information Security (WEIS), Jun. 2006, Retrieved from <http://www.econinfosec.org/archive/weis2006/docs/16.pdf>.
International Search Report & Written Opinion dated Feb. 10, 2016 in Patent Cooperation Treaty Application No. PCT/US2015/065365, filed Dec. 11, 2015.
Raftery et al., "Variable Selection for Model-Based Clustering," 2006, pp. 168-178, Retrieved from <https://www.stat.washington.edu/raftery/Research/PDF/dean2006.pdf>. *

Cited By (36)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11882146B2 (en) 2010-09-24 2024-01-23 BitSight Technologies, Inc. Information technology security assessment system
US11777976B2 (en) 2010-09-24 2023-10-03 BitSight Technologies, Inc. Information technology security assessment system
US11652834B2 (en) 2013-09-09 2023-05-16 BitSight Technologies, Inc. Methods for using organizational behavior for risk ratings
US20160234247A1 (en) 2014-12-29 2016-08-11 Cyence Inc. Diversity Analysis with Actionable Feedback Methodologies
US11863590B2 (en) 2014-12-29 2024-01-02 Guidewire Software, Inc. Inferential analysis using feedback for extracting and combining cyber risk information
US10218736B2 (en) 2014-12-29 2019-02-26 Guidewire Software, Inc. Cyber vulnerability scan analyses with actionable feedback
US10230764B2 (en) 2014-12-29 2019-03-12 Guidewire Software, Inc. Inferential analysis using feedback for extracting and combining cyber risk information
US10341376B2 (en) 2014-12-29 2019-07-02 Guidewire Software, Inc. Diversity analysis with actionable feedback methodologies
US9699209B2 (en) 2014-12-29 2017-07-04 Cyence Inc. Cyber vulnerability scan analyses with actionable feedback
US10491624B2 (en) 2014-12-29 2019-11-26 Guidewire Software, Inc. Cyber vulnerability scan analyses with actionable feedback
US10498759B2 (en) 2014-12-29 2019-12-03 Guidewire Software, Inc. Disaster scenario based inferential analysis using feedback for extracting and combining cyber risk information
US10511635B2 (en) 2014-12-29 2019-12-17 Guidewire Software, Inc. Inferential analysis using feedback for extracting and combining cyber risk information
US11153349B2 (en) 2014-12-29 2021-10-19 Guidewire Software, Inc. Inferential analysis using feedback for extracting and combining cyber risk information
US10050990B2 (en) 2014-12-29 2018-08-14 Guidewire Software, Inc. Disaster scenario based inferential analysis using feedback for extracting and combining cyber risk information
US10050989B2 (en) 2014-12-29 2018-08-14 Guidewire Software, Inc. Inferential analysis using feedback for extracting and combining cyber risk information including proxy connection analyses
US11855768B2 (en) 2014-12-29 2023-12-26 Guidewire Software, Inc. Disaster scenario based inferential analysis using feedback for extracting and combining cyber risk information
US11146585B2 (en) 2014-12-29 2021-10-12 Guidewire Software, Inc. Disaster scenario based inferential analysis using feedback for extracting and combining cyber risk information
US10404748B2 (en) 2015-03-31 2019-09-03 Guidewire Software, Inc. Cyber risk analysis and remediation using network monitored sensors and methods of use
US11265350B2 (en) 2015-03-31 2022-03-01 Guidewire Software, Inc. Cyber risk analysis and remediation using network monitored sensors and methods of use
US10970787B2 (en) * 2015-10-28 2021-04-06 Qomplx, Inc. Platform for live issuance and management of cyber insurance policies
US9537884B1 (en) * 2016-06-01 2017-01-03 Cyberpoint International Llc Assessment of cyber threats
US11627109B2 (en) 2017-06-22 2023-04-11 BitSight Technologies, Inc. Methods for mapping IP addresses and domains to organizations using user activity data
US11770401B2 (en) 2018-03-12 2023-09-26 BitSight Technologies, Inc. Correlated risk in cybersecurity
US11671441B2 (en) 2018-04-17 2023-06-06 BitSight Technologies, Inc. Systems and methods for external detection of misconfigured systems
US11200323B2 (en) 2018-10-17 2021-12-14 BitSight Technologies, Inc. Systems and methods for forecasting cybersecurity ratings based on event-rate scenarios
US11783052B2 (en) 2018-10-17 2023-10-10 BitSight Technologies, Inc. Systems and methods for forecasting cybersecurity ratings based on event-rate scenarios
US11727114B2 (en) 2018-10-25 2023-08-15 BitSight Technologies, Inc. Systems and methods for remote detection of software through browser webinjects
US11126723B2 (en) 2018-10-25 2021-09-21 BitSight Technologies, Inc. Systems and methods for remote detection of software through browser webinjects
US11675912B2 (en) 2019-07-17 2023-06-13 BitSight Technologies, Inc. Systems and methods for generating security improvement plans for entities
US11329878B2 (en) 2019-09-26 2022-05-10 BitSight Technologies, Inc. Systems and methods for network asset discovery and association thereof with entities
US11032244B2 (en) 2019-09-30 2021-06-08 BitSight Technologies, Inc. Systems and methods for determining asset importance in security risk management
US11050779B1 (en) 2020-01-29 2021-06-29 BitSight Technologies, Inc. Systems and methods for assessing cybersecurity state of entities based on computer network characterization
US11777983B2 (en) 2020-01-31 2023-10-03 BitSight Technologies, Inc. Systems and methods for rapidly generating security ratings
US11265330B2 (en) 2020-02-26 2022-03-01 BitSight Technologies, Inc. Systems and methods for improving a security profile of an entity based on peer security profiles
US11720679B2 (en) 2020-05-27 2023-08-08 BitSight Technologies, Inc. Systems and methods for managing cybersecurity alerts
US11689555B2 (en) 2020-12-11 2023-06-27 BitSight Technologies, Inc. Systems and methods for cybersecurity risk mitigation and management

Also Published As

Publication number Publication date
US9253203B1 (en) 2016-02-02
US20160189301A1 (en) 2016-06-30
WO2016109162A1 (en) 2016-07-07
TW201636937A (en) 2016-10-16

Similar Documents

Publication Publication Date Title
US9373144B1 (en) Diversity analysis with actionable feedback methodologies
US11153349B2 (en) Inferential analysis using feedback for extracting and combining cyber risk information
US10050989B2 (en) Inferential analysis using feedback for extracting and combining cyber risk information including proxy connection analyses
US11146585B2 (en) Disaster scenario based inferential analysis using feedback for extracting and combining cyber risk information
US10491624B2 (en) Cyber vulnerability scan analyses with actionable feedback
US9521160B2 (en) Inferential analysis using feedback for extracting and combining cyber risk information
US10341376B2 (en) Diversity analysis with actionable feedback methodologies
US20190035027A1 (en) Synthetic Diversity Analysis with Actionable Feedback Methodologies
US11657352B2 (en) Reducing cybersecurity risk level of a portfolio of companies using a cybersecurity risk multiplier
US10614401B2 (en) Reducing cybersecurity risk level of portfolio of companies using a cybersecurity risk multiplier
AU2023206104A1 (en) Network-based automated prediction modeling
US20160110819A1 (en) Dynamic security rating for cyber insurance products
US11855768B2 (en) Disaster scenario based inferential analysis using feedback for extracting and combining cyber risk information
WO2016109608A9 (en) System for cyber insurance policy including cyber risk assessment/management service
US11863590B2 (en) Inferential analysis using feedback for extracting and combining cyber risk information

Legal Events

Date Code Title Description
AS Assignment

Owner name: CYENCE INC., CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:NG, GEORGE Y.;ROSACE, PHILIP A., III;REEL/FRAME:038057/0987

Effective date: 20151102

STCF Information on status: patent grant

Free format text: PATENTED CASE

AS Assignment

Owner name: GUIDEWIRE SOFTWARE, INC., CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:CYENCE LLC;REEL/FRAME:045716/0282

Effective date: 20171005

Owner name: CAESAR ACQUISITION SUB II, LLC, CALIFORNIA

Free format text: MERGER;ASSIGNOR:CYENCE INC.;REEL/FRAME:045716/0257

Effective date: 20171031

Owner name: CYENCE LLC, CALIFORNIA

Free format text: CHANGE OF NAME;ASSIGNOR:CAESAR ACQUISITION SUB II, LLC;REEL/FRAME:046080/0138

Effective date: 20171102

FEPP Fee payment procedure

Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.)

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 4

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 8TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1552); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 8