WO2005122715A2 - A mandatory access control (MAC) method - Google Patents

A mandatory access control (MAC) method

Info

Publication number
WO2005122715A2
WO2005122715A2 (PCT/US2005/019192)
Authority
WO
WIPO (PCT)
Prior art keywords
integrity
subject
low
objects
conflict
Application number
PCT/US2005/019192
Other languages
French (fr)
Other versions
WO2005122715A3 (en)
Inventor
Jinhong Katherine Guo
Stephen L. Johnson
Il-Pyung Park
Original Assignee
Matsushita Electric Industrial Co. Ltd.
Application filed by Matsushita Electric Industrial Co. Ltd. filed Critical Matsushita Electric Industrial Co. Ltd.
Publication of WO2005122715A2 publication Critical patent/WO2005122715A2/en
Publication of WO2005122715A3 publication Critical patent/WO2005122715A3/en

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 63/00 Network architectures or network communication protocols for network security
    • H04L 63/10 Network architectures or network communication protocols for network security for controlling access to devices or network resources
    • H04L 63/102 Entity profiles
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 21/00 Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F 21/60 Protecting data
    • G06F 21/62 Protecting access to data via a platform, e.g. using keys or access control rules
    • G06F 21/6218 Protecting access to data via a platform, e.g. using keys or access control rules, to a system of files or objects, e.g. local or distributed file system or database
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 2221/00 Indexing scheme relating to security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F 2221/21 Indexing scheme relating to G06F21/00 and subgroups addressing additional information or applications relating to security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F 2221/2113 Multi-level security, e.g. mandatory access control
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y10 TECHNICAL SUBJECTS COVERED BY FORMER USPC
    • Y10S TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y10S 707/00 Data processing: database and file management or data structures
    • Y10S 707/99931 Database or file accessing
    • Y10S 707/99939 Privileged access

Definitions

  • the operations may be performed by specific hardware components that contain hard-wired logic for performing the operations, or by any combination of programmed computer components and custom hardware components.
  • the methods may be provided as a computer program product that may include a machine-readable medium having stored thereon instructions which may be used to program a computer (or other electronic devices) to perform the methods.
  • machine-readable medium includes any medium that is capable of storing or encoding a sequence of instructions for execution by the machine and that cause the machine to perform any one of the methodologies of the present invention.
  • the term “machine-readable medium” includes, but is not limited to, solid-state memories, optical and magnetic disks, and carrier wave signals.

Abstract

A mandatory access control method for securing an operating system is presented. A first integrity subject (120) reads a first object (130). The first integrity subject attempts to read a second object (110). It is determined that a conflict exists between the first and second objects. At least one security rule is applied to the conflict between the first and the second objects.

Description

A MANDATORY ACCESS CONTROL (MAC) METHOD
FIELD OF THE INVENTION [0001] The present invention relates to a security system for an operating system. More particularly, the invention relates to a security system technique that uses previous behavior of a subject to control access to an object.
BACKGROUND OF THE INVENTION [0002] One of the most significant issues related to operating systems for a computer system involves security. Security systems are designed to protect the confidentiality, the integrity, and the availability of an operating system. One aspect of a security system involves reducing opportunities for malicious computer instructions (e.g., a virus, a Trojan horse, etc.) to affect the operating system. Operating systems such as UNIX or operating systems derived from UNIX (e.g., LINUX) incorporate a security system that can be vulnerable to malicious computer instructions. [0003] There are several types of mechanisms that are presently used to secure operating systems, such as a discretionary access control (DAC) or a mandatory access control (MAC). The DAC is unable to completely secure these operating systems for a variety of reasons. For example, the DAC restricts access to objects based solely on the identity of a subject. This makes the operating system vulnerable to Trojan horses. [0004] Other operating systems use a MAC. A Biba low watermark mandatory access control mechanism (LOMAC) protects the operating system by dividing processes into different security areas such as HIGH and LOW integrity data. HIGH integrity data relates to highly sensitive data, whereas LOW integrity data relates to less sensitive data. [0005] The LOMAC security rules require that a write-up not occur between objects and subjects. To illustrate, a LOW integrity subject cannot write to a HIGH integrity object. Additionally, if a HIGH integrity subject attempts to read a LOW integrity object, the HIGH integrity subject is automatically demoted to the same level as the LOW integrity object that it attempted to read. Accordingly, in this instance, the HIGH integrity subject is demoted to a LOW integrity subject. 
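The Biba low-watermark behavior described in paragraph [0005] can be sketched in a few lines. This is an illustrative sketch, not code from the patent; the class and method names are hypothetical, and integrity is reduced to two numeric levels.

```python
# Sketch of the LOMAC rules of paragraph [0005] (hypothetical names):
# no write-up, and a subject that reads a lower-integrity object is
# demoted to that object's level (the "low watermark").
HIGH, LOW = 1, 0

class Subject:
    def __init__(self, level):
        self.level = level

    def read(self, obj_level):
        # Low-watermark rule: reading drags the subject's integrity
        # level down to the minimum of its level and the object's.
        self.level = min(self.level, obj_level)

    def can_write(self, obj_level):
        # No write-up: writes are allowed only to objects at or
        # below the subject's current integrity level.
        return self.level >= obj_level

s = Subject(HIGH)
s.read(LOW)                    # HIGH subject reads a LOW object...
assert s.level == LOW          # ...and is demoted to LOW
assert not s.can_write(HIGH)   # the demoted subject cannot write up
```

As the background notes, this coarse-grained demotion is what forces LOMAC to carve out trusted-process exceptions such as syslogd.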
[0006] A practical implementation of the LOMAC requires that exceptions to the security rules be granted in order to overcome an access control mechanism that is too coarse-grained. For example, certain computer programs are granted a trusted status with special hard-coded privileges. Trusted status means that the operating system automatically recognizes computer instructions without checking for security issues. Syslogd in Linux exemplifies the trusted status that is granted to a Linux system logging utility by the LOMAC. Syslogd is implemented as a trusted process because syslogd needs to access user profiles and also write to the system LOG files. In addition to granting trusted status to computer programs, the LOMAC performs poorly with respect to confining computer programs to their least required privileges. It is therefore desirable to have a system or a method that overcomes the disadvantages associated with conventional security systems.
BRIEF DESCRIPTION OF THE DRAWINGS [0007] The present invention will become more fully understood from the detailed description and the accompanying drawings, wherein: [0008] Figure 1 is a block diagram of a HIGH integrity subject that reads a HIGH integrity object and then reads a LOW integrity object in accordance with one embodiment of the invention; [0009] Figure 2A is a block diagram that illustrates the demotion of a HIGH integrity subject to a LOW integrity subject in accordance with one embodiment of the invention; [0010] Figure 2B is a block diagram of a demoted LOW subject that is unable to access a HIGH integrity object in accordance with one embodiment of the invention; [0011] Figure 3 is a block diagram of a HIGH integrity subject attempting to read a HIGH integrity object after reading another HIGH integrity object in accordance with one embodiment of the invention; [0012] Figure 4 illustrates a block diagram of a LOW integrity subject attempting to read a LOW integrity object after reading another LOW integrity object in accordance with one embodiment of the invention; [0013] Figure 5A illustrates a fourth integrity subject that reads a fourth integrity object and then attempts to read a third integrity object that is in conflict with the fourth integrity object in accordance with one embodiment of the invention; [0014] Figure 5B illustrates a fourth integrity subject that reads a fourth integrity object and then attempts to read another fourth integrity object in which a conflict exists in accordance with one embodiment of the invention; [0015] Figure 5C illustrates a third integrity subject that reads a second integrity object and then attempts to read a fourth integrity object that is in conflict with the second integrity object in accordance with one embodiment of the invention; [0016] Figure 5D illustrates a third integrity subject that reads a second integrity object and then attempts to read another second integrity object that is in 
conflict in accordance with one embodiment of the invention; and [0017] Figure 6 is a flow diagram of a method of implementing a security system for an operating system in accordance with one embodiment of the invention.
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS The following description of the preferred embodiments is merely exemplary in nature and is in no way intended to limit the invention, its application, or uses. Techniques of the invention improve the security of an operating system by incorporating a mandatory access control (MAC) that uses the prior behavior of a subject. In particular, the MAC considers the action of a subject that reads one object and then attempts to read another object that has a conflict of interest with the first object. A conflict of interest occurs when it is possible that data of one object may be used to corrupt the data of another object. In one embodiment, a conflict of interest between two objects that have been read by a subject requires that the subject be demoted, whereas a non-conflict of interest between objects does not result in the demotion of the subject. In another embodiment, a conflict of interest between objects causes the subject to be denied access to the second object. [0018] To better understand these concepts, the definitions of a subject, an object, and integrity levels are now presented. A subject (also referred to as a process) is a set of instructions that performs an action such as a read operation on the object, a write back operation to the object, or other suitable operation. An object, on the other hand, is a resource. Examples of objects include a file or a file directory. Each object includes an attribute or a plurality of attributes. An attribute is a quality or characteristic that is associated with or inherent to the object. At least one or more of the attributes are security attributes. [0019] The integrity of a subject relates to the sensitivity or trust level associated with the subject. A higher integrity subject corresponds to items of higher sensitivity or levels of trust. 
In one embodiment, a higher integrity subject is represented by the term HIGH integrity subject and a lower integrity subject is represented by the term LOW integrity subject. A similar analysis applies to objects. [0020] In another embodiment, there are multiple levels of subjects at multiple integrity levels. For example, a subject may include a first integrity level, a second integrity level, a third integrity level . . . or an N integrity level. In one embodiment, the highest level may be associated with the highest level of sensitive or confidential material that is stored. Given these definitions, general examples are presented below that relate to HIGH and LOW integrity subjects and objects followed by a more detailed description relative to Figures 1-6. [0021] One example involves a HIGH integrity subject that reads for a first time a HIGH integrity object. The HIGH integrity subject then attempts to read a LOW integrity object that has a conflict of interest with the first HIGH integrity object. In this embodiment, the HIGH integrity subject is then demoted to a LOW integrity subject based upon the conflict of interest. [0022] In another embodiment, a HIGH integrity subject reads for a first time a first HIGH integrity object. The HIGH integrity subject then attempts to read a second HIGH integrity object that has a conflict of interest with the first HIGH integrity object. In this embodiment, the HIGH integrity subject is denied access to the second HIGH integrity object. [0023] In still yet another embodiment, a LOW integrity subject first reads a LOW integrity object. The LOW integrity subject then attempts to read a
HIGH integrity object that has a conflict of interest with the LOW integrity object.
Based upon the conflict of interest between the objects, the LOW integrity subject is denied access to the HIGH integrity object. [0024] In another embodiment, a LOW integrity subject first reads a first LOW integrity object. The LOW integrity subject then attempts to read a second LOW integrity object that has a conflict of interest with the first LOW integrity object. The LOW integrity subject is denied access to the second LOW integrity object. While these examples are presented in terms of HIGH and LOW subjects and objects, skilled artisans understand that other security rules may be designed to apply to multiple integrity levels for both the subject and the objects.
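The four examples of paragraphs [0021]-[0024] follow one pattern: a conflict with an object of lower integrity than the one previously read leads to demotion, while any other conflict leads to denial. The following sketch, with hypothetical function name and return values, is one reading consistent with all four examples; it is not the patent's implementation.

```python
# Sketch of the example rules in paragraphs [0021]-[0024]
# (hypothetical names; integrity levels as numbers, higher = more trusted).
HIGH, LOW = 1, 0

def mediate(subject_level, prior_object_level, requested_object_level, conflict):
    """Return (decision, new_subject_level) for a read attempt."""
    if not conflict:
        return "allow", subject_level
    if requested_object_level < prior_object_level:
        # A conflicting object of lower integrity: the subject is
        # demoted to the requested object's level, as in [0021].
        return "allow-demote", requested_object_level
    # A conflicting object at the same or a higher level: access is
    # denied, as in [0022], [0023], and [0024].
    return "deny", subject_level

assert mediate(HIGH, HIGH, LOW, conflict=True) == ("allow-demote", LOW)   # [0021]
assert mediate(HIGH, HIGH, HIGH, conflict=True) == ("deny", HIGH)         # [0022]
assert mediate(LOW, LOW, HIGH, conflict=True) == ("deny", LOW)            # [0023]
assert mediate(LOW, LOW, LOW, conflict=True) == ("deny", LOW)             # [0024]
assert mediate(HIGH, HIGH, LOW, conflict=False) == ("allow", HIGH)        # no conflict
```

Unlike LOMAC, no read triggers demotion by itself; only a conflict of interest between the objects does.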
Implementation of this fine-grained integrity security system eliminates granting of exceptions to certain security rules (e.g., trusted status for computer programs). [0025] Several embodiments are now described in detail with respect to the figures. Figure 1 illustrates a block diagram of the interrelationship between two objects and a HIGH integrity subject 120. A HIGH integrity subject 120 is configured to read from a LOW integrity object 110 or a HIGH integrity object 130. In this embodiment, the HIGH integrity subject 120 will not be demoted or penalized since it is the first time that the HIGH integrity subject 120 accesses one of the objects such as the HIGH integrity object 130 or the LOW integrity object 110. [0026] Figures 2A-2B are block diagrams that illustrate the process in which a HIGH integrity subject 220 is demoted. Referring to Figure 2A, the HIGH integrity subject 220 attempts to read the HIGH integrity object 230. This generates a system call that prompts the mediator 135 to act. [0027] The mediator 135, disposed between the HIGH integrity subject
220 and objects 210, 230, is a set of instructions that is configured to perform security operations such as reading the security attributes of an object and comparing those security attributes to another set of security attributes from a different object. A conflict is declared by the mediator 135 when, for example, one object includes security attributes that limit the object to one section of the file system (e.g., files used merely for logging into the operating system, which are of low sensitivity) and the other object includes security attributes that allow access to very sensitive data (e.g., financial files) in another section of the file system. [0028] In this embodiment, the HIGH integrity subject 220 is allowed to read the HIGH integrity object 230 since the mediator 135 has not previously read attributes from another object that conflict with the HIGH integrity object 230. Thereafter, the HIGH integrity subject 220 attempts to read the LOW integrity object 210. A system call is again generated, prompting the mediator 135 to read the security attributes of the LOW integrity object 210. The mediator 135 determines that a conflict exists between the HIGH and LOW integrity objects 230, 210. Since the LOW integrity object 210 has a conflict of interest with the HIGH integrity object 230, the HIGH integrity subject 220 is demoted to the same integrity level as the LOW integrity object 210, which is the LOW integrity subject level 235 shown in Figure 2B. Figure 2B further illustrates, in one embodiment, that a LOW integrity subject 235 is unable to access a HIGH integrity object 230 after the demotion process shown in Figure 2A. [0029] Figure 3 is a block diagram that illustrates a HIGH integrity subject 410 that reads a HIGH integrity object 420 but is unable to read a HIGH integrity object 430 due to a conflict of interest existing between the objects. Specifically, in this embodiment, the HIGH integrity subject 410 first attempts to read the HIGH integrity object 420. 
A system call prompts the mediator 135 to then read the security attributes associated with the HIGH integrity object 420. The HIGH integrity subject 410 then attempts to read the HIGH integrity object 430. This prompts another system call to the mediator 135. The mediator 135 then compares the attributes of the two objects. The mediator 135 determines that these two objects have a conflict of interest with each other. Since a conflict of interest exists between the HIGH integrity objects 420 and 430, the HIGH integrity subject 410 is denied access to read the HIGH integrity object 430, even though no demotion was involved with respect to the HIGH integrity subject 410. [0030] Figure 4 illustrates a block diagram of a LOW integrity subject 510 first attempting to read the LOW integrity object 520. This triggers a system call that prompts the mediator 135 to act. The mediator 135 reads the security attributes related to the LOW integrity object 520. The LOW integrity subject 510 then attempts to read the LOW integrity object 530. This generates another system call that prompts the mediator 135 to act. The mediator 135 then reads the attributes associated with the LOW integrity object 530. The mediator 135 determines that a conflict of interest exists between the two objects. Thereafter, the LOW integrity subject 510 is denied access to the LOW integrity object 530. This is due to the conflict of interest between the LOW integrity objects 520 and 530. [0031] Figures 5A-5D illustrate the implementation of security rules in which both the subjects and the objects possess multiple integrity levels. Figure 5A is a block diagram of a plurality of integrity levels associated with a fourth integrity subject 620 and first, second, third, and fourth integrity objects 640, 650, 610, 630 in accordance with one embodiment of the invention. 
In this embodiment, a fourth integrity subject 620 reads for a first time a fourth integrity object 630. The fourth integrity subject 620 then attempts to read a second integrity object 650 that is determined by the mediator 135 to have a conflict of interest with the fourth integrity object 630. In this embodiment, the fourth integrity subject 620 is then demoted to a second integrity subject. [0032] Figure 5B is a block diagram of a plurality of integrity levels associated with subjects and objects in accordance with one embodiment of the invention. In this embodiment, a fourth integrity subject 735 reads for a first time a fourth integrity object 740. The fourth integrity subject 735 then attempts to read another fourth integrity object 745 that has a conflict of interest with the previous fourth integrity object 740. In this embodiment, the fourth integrity subject 735 is denied access to the latter fourth integrity object 745. [0033] Figure 5C is a block diagram of a plurality of integrity levels associated with subjects and objects in accordance with one embodiment of the invention. Here, a third integrity subject 810 first reads a second integrity object 820. The third integrity subject 810 then attempts to read a fourth integrity object 830 that has a conflict of interest with the second integrity object 820. The third integrity subject 810 is denied access to the fourth integrity object 830. [0034] Figure 5D is a block diagram of a plurality of integrity levels associated with subjects and objects in accordance with one embodiment of the invention. In this embodiment, a third integrity subject 910 first reads a second integrity object 920. The third integrity subject 910 then attempts to read another second integrity object 930 that has a conflict of interest with the second integrity object 920. The third integrity subject 910 is denied access to the second integrity object 930. 
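The four outcomes of Figures 5A-5D (together with the two-level cases of Figures 2-4) can be read as one generalized mediation rule. The sketch below is an illustrative reconstruction rather than code from the patent; the function name `access`, the tuple layout, and the conflict table are all invented. It assumes that demotion applies only when the newly requested object sits at a strictly lower integrity level than the conflicting object already read, and that access is denied otherwise:

```python
# Illustrative reconstruction of the mediator's decision rule; the names
# below (access, reads, conflicts) are hypothetical and not from the patent.

def access(subject_level, reads, new_obj, conflicts):
    """Mediate a read request.

    subject_level -- the subject's current integrity level (higher = more trusted)
    reads         -- list of (name, level) objects the subject has already read
    new_obj       -- (name, level) object now being requested
    conflicts     -- set of frozenset name pairs that conflict with each other

    Returns (granted, updated_subject_level).
    """
    name, level = new_obj
    for prior_name, prior_level in reads:
        if frozenset({prior_name, name}) in conflicts:
            if level < prior_level:
                # Fig. 5A: the conflicting object is at a strictly lower
                # level, so the subject is demoted rather than refused.
                subject_level = min(subject_level, level)
            else:
                # Figs. 5B-5D (and Figs. 3-4): equal or higher level -> deny.
                return False, subject_level
    reads.append(new_obj)
    return True, subject_level
```

Replaying Figure 5A, a fourth integrity subject that reads the fourth integrity object 630 and then the conflicting second integrity object 650 ends up demoted to the second integrity level; the scenarios of Figures 5B-5D all return a denial instead.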
[0035] Figure 6 is a flow diagram of one method of implementing a security system for an operating system in accordance with one embodiment of the invention. A first integrity subject reads a first object at operation 1000. The first integrity subject attempts to read a second object at operation 1010. It is determined that a conflict exists between the first and second objects at operation 1020. At least one security rule is applied to the conflict between the first and the second objects at operation 1030. One security rule relates to demoting the first integrity subject to a second integrity subject based upon the conflict between the first and the second objects. In this embodiment, the first integrity subject and the first integrity object are considered to have a higher level of trust than the second integrity subject and the second integrity object. Another security rule relates to denying the first integrity subject access to the second object based upon the conflict between the first and the second objects. This security rule is typically implemented when the two objects that are read possess similar integrity levels. A trusted process for computer instructions that access the operating system is eliminated at operation 1040. [0036] It will be appreciated that more or fewer processes may be incorporated into the method illustrated in Figure 6 without departing from the scope of the invention and that no particular order is implied by the arrangement of blocks shown and described herein. Skilled artisans will appreciate that the method described in conjunction with Figure 6 may be embodied in machine-executable instructions (e.g., software). The instructions can be used to cause a general-purpose or special-purpose processor that is programmed with the instructions to perform the operations described. 
Alternatively, the operations may be performed by specific hardware components that contain hard-wired logic for performing the operations, or by any combination of programmed computer components and custom hardware components. The methods may be provided as a computer program product that may include a machine-readable medium having stored thereon instructions which may be used to program a computer (or other electronic devices) to perform the methods. For the purposes of this specification, the term "machine-readable medium" includes any medium that is capable of storing or encoding a sequence of instructions for execution by the machine and that causes the machine to perform any one of the methodologies of the present invention. The term "machine-readable medium" includes, but is not limited to, solid-state memories, optical and magnetic disks, and carrier wave signals. Furthermore, it is common in the art to speak of software, in one form or another (e.g., program, procedure, process, application, module, logic, etc.), as taking an action or causing a result. Such expressions are merely a shorthand way of saying that the execution of the software by a computer causes the processor of the computer to perform an action or produce a result. [0037] The description of the invention is merely exemplary in nature and, thus, variations that do not depart from the gist of the invention are intended to be within the scope of the invention. Such variations are not to be regarded as a departure from the spirit and scope of the invention.
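Paragraph [0036] notes that the method may be embodied in machine-executable instructions. As a purely illustrative sketch of such an embodiment, the mediation could be interposed on ordinary file reads; the attribute table, paths, and function name below are invented for the example and do not appear in the patent:

```python
# Hypothetical read hook implementing the Figure 6 flow (operations
# 1000-1040); security attributes here are (integrity_level, conflict_class).

ATTRS = {
    "/secure/finance.db": (4, "finance"),
    "/var/log/login.log": (1, "login"),
}

def mediated_read(state, path):
    """state holds the subject's current 'level' and its 'reads' history.

    Applies the two security rules of operation 1030: demote the subject on
    a conflict with a lower-level object, deny access on a conflict with an
    object at the same or a higher level.
    """
    level, cclass = ATTRS[path]
    for prior in state["reads"]:
        prior_level, prior_cclass = ATTRS[prior]
        if prior_cclass != cclass:            # conflict determined (op. 1020)
            if level < prior_level:
                state["level"] = min(state["level"], level)   # rule (a): demote
            else:
                return False                  # rule (b): deny access
    state["reads"].append(path)
    return True
```

A subject that starts at integrity level 4, reads the financial file, and then reads the conflicting login log is silently demoted to level 1; a subsequent attempt to re-read the financial file is then refused, mirroring Figures 2A and 2B.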

Claims

What is claimed is:
1. A mandatory access control method for securing an operating system comprising: providing a first integrity subject; reading a first object through the first integrity subject; attempting to read a second object through the first integrity subject; determining that a conflict exists between the first and second objects; and applying at least one security rule to the conflict between the first and the second objects.
2. The method of claim 1 wherein the at least one security rule is one of: (a) demoting the first integrity subject to a second integrity subject based upon the conflict between the first and the second objects; and (b) denying the first integrity subject access to the second object based upon the conflict between the first and the second objects.
3. The method of claim 2 wherein the first integrity subject is associated with highly sensitive data.
4. The method of claim 3 wherein the second integrity subject is associated with low sensitive data.
5. The method of claim 4 wherein the first integrity subject is associated with highly sensitive data.
6. The method of claim 5 wherein the low sensitive data is separated from the highly sensitive data in the operating system.
7. A mandatory access control method for securing an operating system comprising: providing a HIGH integrity subject; reading a HIGH integrity object through the HIGH integrity subject; attempting to read a LOW integrity object; determining that a conflict exists between the HIGH and LOW integrity objects; and demoting the HIGH integrity subject to a LOW integrity subject.
8. The method of claim 7 wherein the HIGH integrity subject is associated with highly sensitive data.
9. The method of claim 8 wherein the LOW integrity subject is associated with low sensitive data.
10. The method of claim 9 wherein the HIGH integrity subject is associated with highly sensitive data.
11. The method of claim 10 wherein the low sensitive data is separated from the highly sensitive data in the operating system.
12. A mandatory access control method for securing an operating system comprising: providing a HIGH integrity subject; reading a first HIGH integrity object through the HIGH integrity subject; attempting to read a second HIGH integrity object; determining that a conflict exists between the first and the second HIGH integrity objects; denying access of the second HIGH object to the HIGH integrity subject; and eliminating a trusted process for computer instructions that access the operating system.
13. A mandatory access control method for securing an operating system comprising: providing a LOW integrity subject; reading a LOW integrity object through the LOW integrity subject; attempting to read a HIGH integrity object; determining that a conflict exists between the LOW and the HIGH integrity objects; and denying access of the HIGH object to the LOW integrity subject.
14. A mandatory access control method for securing an operating system comprising: providing a LOW integrity subject; reading a first LOW integrity object through the LOW integrity subject; attempting to read a second LOW integrity object; determining that a conflict exists between the first and the second LOW integrity objects; and denying access of the second LOW object to the LOW integrity subject.
15. An article comprising: a storage medium including instructions stored thereon which, when executed, cause a computer system to perform a method including: providing a first integrity subject; reading a first object through the first integrity subject; attempting to read a second object through the first integrity subject; determining that a conflict exists between the first and second objects; applying at least one security rule to the conflict between the first and the second objects; and eliminating a trusted process for computer instructions that access the operating system.
16. The article of claim 15 wherein the at least one security rule is one of: (a) demoting the first integrity subject to a second integrity subject based upon the conflict between the first and the second objects; and (b) denying the first integrity subject access to the second object based upon the conflict between the first and the second objects.
17. A mandatory access control method for securing an operating system comprising: providing a first integrity subject; reading a first object through the first integrity subject; attempting to write to a second object through the first integrity subject; determining that a conflict exists between the first and second objects; and applying at least one security rule to the conflict between the first and the second objects.
18. The method of claim 17 wherein the at least one security rule is one of: (a) demoting the first integrity subject to a second integrity subject based upon the conflict between the first and the second objects; and (b) denying the first integrity subject access to the second object based upon the conflict between the first and the second objects.
19. The method of claim 18 wherein the first integrity subject is associated with highly sensitive data.
20. The method of claim 19 wherein the second integrity subject is associated with low sensitive data.
21. A mandatory access control method for securing an operating system comprising: providing a HIGH integrity subject; reading a first HIGH integrity object through the HIGH integrity subject; attempting to write to a second HIGH integrity object; determining that a conflict exists between the first and the second HIGH integrity objects; and denying the HIGH integrity subject a write operation to the second HIGH integrity object.
22. The method of claim 21 further comprising: eliminating a trusted process in the operating system.
23. The method of claim 21, wherein determining the conflict occurs when data of the first HIGH integrity object is able to corrupt the data of the second HIGH integrity object.
PCT/US2005/019192 2004-06-08 2005-06-01 A mandatory access control (mac) method WO2005122715A2 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US10/863,784 US7243235B2 (en) 2004-06-08 2004-06-08 Mandatory access control (MAC) method
US10/863,784 2004-06-08

Publications (2)

Publication Number Publication Date
WO2005122715A2 true WO2005122715A2 (en) 2005-12-29
WO2005122715A3 WO2005122715A3 (en) 2006-11-30

Family

ID=35450319

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2005/019192 WO2005122715A2 (en) 2004-06-08 2005-06-01 A mandatory access control (mac) method

Country Status (2)

Country Link
US (1) US7243235B2 (en)
WO (1) WO2005122715A2 (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7600117B2 (en) * 2004-09-29 2009-10-06 Panasonic Corporation Mandatory access control scheme with active objects
US7954135B2 (en) * 2007-06-20 2011-05-31 Novell, Inc. Techniques for project lifecycle staged-based access control
US20090048993A1 (en) * 2007-08-13 2009-02-19 Motorola, Inc. Implementation of operating system securing

Citations (1)

Publication number Priority date Publication date Assignee Title
US6430561B1 (en) * 1999-10-29 2002-08-06 International Business Machines Corporation Security policy for protection of files on a storage device

Family Cites Families (5)

Publication number Priority date Publication date Assignee Title
US6023765A (en) 1996-12-06 2000-02-08 The United States Of America As Represented By The Secretary Of Commerce Implementation of role-based access control in multi-level secure systems
US5937159A (en) 1997-03-28 1999-08-10 Data General Corporation Secure computer system
US6044466A (en) 1997-11-25 2000-03-28 International Business Machines Corp. Flexible and dynamic derivation of permissions
US6304973B1 (en) 1998-08-06 2001-10-16 Cryptek Secure Communications, Llc Multi-level security network system
US6289462B1 (en) 1998-09-28 2001-09-11 Argus Systems Group, Inc. Trusted compartmentalized computer operating system

Also Published As

Publication number Publication date
US7243235B2 (en) 2007-07-10
WO2005122715A3 (en) 2006-11-30
US20050273619A1 (en) 2005-12-08

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A2

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KM KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NA NG NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SM SY TJ TM TN TR TT TZ UA UG UZ VC VN YU ZA ZM ZW

AL Designated countries for regional patents

Kind code of ref document: A2

Designated state(s): BW GH GM KE LS MW MZ NA SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LT LU MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG

NENP Non-entry into the national phase

Ref country code: DE

WWW Wipo information: withdrawn in national office

Country of ref document: DE

122 Ep: pct application non-entry in european phase