Saltzer and Schroeder

They presented the following security principles [SS75].
  1. Economy of mechanism
    Keep it simple. 

    KISS ("keep it simple, stupid") is a design principle noted by the U.S. Navy in 1960.

    This is a good guideline whenever a problem needs to be solved, and it also reflects on the quality of the solution: a simple mechanism is easier to inspect, test, and verify than a complex one.



  2. Fail-safe Default
     A system should be designed to remain secure even if it encounters an error or crashes. In particular, unless an entity is given explicit access to an object, it should be denied access to that object.

    That is, the default situation is denial of access.

    For example, suppose you are writing a firewall policy: the fail-safe design is to block all traffic by default and then explicitly allow only the services that must be reachable.

    This approach is also called whitelisting (or allowlisting).

    Most file access systems are based on this principle, as are virtually all protected services on client/server systems.
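
    To make this concrete, here is a minimal sketch of a default-deny check in Python; the rule set and the is_allowed helper are illustrative assumptions, not a real firewall API.

      # Fail-safe default: anything not explicitly allowlisted is denied.
      ALLOWED = {
          ("tcp", 22),    # ssh
          ("tcp", 443),   # https
      }

      def is_allowed(protocol: str, port: int) -> bool:
          # Denial is the default; access requires an explicit entry.
          return (protocol, port) in ALLOWED

      assert is_allowed("tcp", 443)       # explicitly granted
      assert not is_allowed("udp", 53)    # denied: no explicit rule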

  3. Complete Mediation
     Every access must be monitored and controlled. 

    That is, there must be no way to bypass the access control mechanism.

    Care should be taken to ensure that the access control mechanism cannot be circumvented. For example, if the result of an access check is cached and reused, a later revocation of that permission may silently fail to take effect; such cached decisions must be invalidated whenever authority changes.
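
    As a small illustration, the Python sketch below funnels every read through a single check; the class, the ACL layout, and the names are hypothetical, not a standard API.

      class AccessDenied(Exception):
          pass

      class Monitor:
          def __init__(self):
              self._objects = {"report.txt": "quarterly numbers"}
              self._acl = {("alice", "report.txt"): {"read"}}  # explicit grants only

          def _check(self, subject, obj, right):
              # Complete mediation: runs on every access; results are not cached.
              if right not in self._acl.get((subject, obj), set()):
                  raise AccessDenied(f"{subject} may not {right} {obj}")

          def read(self, subject, obj):
              self._check(subject, obj, "read")  # no path to the data skips this
              return self._objects[obj]

      m = Monitor()
      print(m.read("alice", "report.txt"))   # allowed
      # m.read("bob", "report.txt")          # would raise AccessDenied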

  4. Open Design
     The security of a system should not depend on the obscurity of its protection mechanism. 

    The security of a system should not depend on potential adversaries being ignorant of its protection mechanism, but rather on specific keys or passwords being kept secret.

    In cryptography, this is known as Kerckhoffs's principle: a cryptosystem should be secure even if everything about the system, except the keys, is public knowledge.

    As reverse engineering techniques continue to advance, it is unrealistic to consider binary code as a black box that divulges nothing about the inner workings of a program.

    As another benefit, because the design is decoupled from the keys, many reviewers can examine the design without compromising security. This provides independent confirmation of the design's soundness.
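
    The sketch below illustrates the principle with the Python cryptography package: the Fernet scheme is completely public, and security rests on the key alone.

      from cryptography.fernet import Fernet

      key = Fernet.generate_key()                    # the only secret
      token = Fernet(key).encrypt(b"attack at dawn")

      # The algorithm and this code may be published; without the key,
      # the token reveals nothing useful about the plaintext.
      print(Fernet(key).decrypt(token))              # b'attack at dawn'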

  5. Separation of Privilege
     A system should not grant permission based on a single condition.  

    A good example of this is multi-factor user authentication, which requires multiple independent credentials, say, both a password and a smart card, to authenticate a user.

    Another example is corporate purchasing, where a large purchase typically requires two separate approvals, e.g., the requester's and a manager's, so that no single person can authorize it alone.
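
    A minimal Python sketch of the idea follows; the user record and the one-time code are illustrative (a real system would use salted password hashing and TOTP).

      import hashlib, hmac

      USERS = {"alice": {"pw_sha256": hashlib.sha256(b"hunter2").hexdigest(),
                         "otp": "492817"}}   # code from a second device

      def authenticate(user: str, password: str, otp: str) -> bool:
          rec = USERS.get(user)
          if rec is None:
              return False                   # fail-safe default
          pw_ok = hmac.compare_digest(
              rec["pw_sha256"], hashlib.sha256(password.encode()).hexdigest())
          otp_ok = hmac.compare_digest(rec["otp"], otp)
          return pw_ok and otp_ok            # one condition alone never suffices

      assert authenticate("alice", "hunter2", "492817")
      assert not authenticate("alice", "hunter2", "000000")  # password alone fails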

  6. Least Privilege
     Every process or user of the system should operate using the least set of privileges necessary to perform the task.

    For example, most office workers do not need the privileges to install new software on a corporate computer, create new accounts, or sniff network traffic. These employees can do their jobs with fewer privileges, e.g., access to office applications and a directory in which to store data.

    As another example, a web server's processes need not run with administrative privileges. They should instead run under an unprivileged user account.
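
    The POSIX sketch below shows the usual pattern: do the privileged setup (e.g., binding port 80) first, then irreversibly drop to an unprivileged account. The account name "www-data" is an assumption; adjust it to your system.

      import os, pwd

      def drop_privileges(username: str = "www-data") -> None:
          if os.getuid() != 0:
              return                     # already unprivileged
          user = pwd.getpwnam(username)
          os.setgroups([])               # drop supplementary groups first
          os.setgid(user.pw_gid)         # set group before user, since setuid
          os.setuid(user.pw_uid)         #   would make the setgid call fail

      # bind_socket()      # privileged setup happens before the drop
      drop_privileges()
      # serve_requests()   # everything after this runs unprivileged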

  7. Least Common Mechanism (Isolation)
    Minimize the mechanisms shared by multiple users or programs.

    Shared objects provide potential channels for information flow. Systems employing physical or logical separation reduce the risk from sharing.

    Process isolation is one example. The memory contents of a process cannot be observed by other processes unless they are explicitly shared through system calls. In a sense, the system minimizes the ways of sharing memory --- only through system calls.
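
    The small Python demonstration below makes this concrete: the child process cannot affect the parent's ordinary memory, and information flows back only through an explicitly shared object.

      from multiprocessing import Process, Value

      counter = 0              # ordinary memory: private to each process
      shared = Value("i", 0)   # sharing must be requested explicitly

      def worker(shared):
          global counter
          counter += 1            # modifies only the child's copy
          with shared.get_lock():
              shared.value += 1   # the one sanctioned channel back

      if __name__ == "__main__":
          p = Process(target=worker, args=(shared,))
          p.start(); p.join()
          print(counter)       # 0: the parent's memory is untouched
          print(shared.value)  # 1: only the explicit channel carried data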

    As another example, sandboxing is the practice of isolating an application, a web browser, or a piece of code inside a restricted environment. A browser sandbox, for instance, runs web content in isolation so that browser-based malware cannot spread to the rest of the system or the network.

    As another example, in 2011, Lockheed Martin relied not only on the security of RSA's product (SecurID) but also on the security of RSA's own IT infrastructure. That trust was evidently misplaced: RSA was breached, and information stolen there was reportedly used in a subsequent attack on Lockheed Martin.

  8. Psychological Acceptability (Ease of use)
    Security mechanisms should be easy to use.

    If a protection mechanism is easy to use, it is unlikely to be avoided.

    However, the more difficult a security mechanism is to use, the more likely it is that users will circumvent it to get their job done or will apply it incorrectly, thereby introducing new vulnerabilities.

    An example would be enforcing complex, difficult-to-remember passwords that must be changed frequently for access to trivial resources, e.g., an office printer. Complex password requirements can also cause users to store their passwords unsafely so they don't have to remember them, such as on a sticky note or in an unencrypted file.

    Safety usability failures are estimated to kill about as many people as road traffic accidents --- a few tens of thousands a year in the USA. The most lethal medical devices are probably infusion pumps, which are used to administer intravenous drugs and other fluids to patients in hospital. An emergency room often has equipment from several vendors, all with different user interfaces.

    If doctors and nurses press the wrong button, the wrong dose gets administered, or the dose intended for an eight-hour infusion is delivered all at once --- and patients die.

Remembering the 8 principles

Remember:
ELLF COPS
Economy of Mechanisms
Least Privilege 
Least Common Mechanism
Fail-safe Default

Complete Mediation
Open Design
Psychological Acceptability
Separation of Privilege

Other Principles

  1. Defense in depth: Design the system so that it can resist attack even if a single security vulnerability is discovered or a single security feature is bypassed. Defense in depth may involve layering multiple security mechanisms or designing the system so that it crashes rather than allowing an attacker to gain complete control; a small sketch follows this list.
  2. Design for updating: No system is likely to remain free from security vulnerabilities forever, so developers should plan for the safe and reliable installation of security updates.
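
    A minimal Python sketch of layered checks follows; the layer functions and field names are illustrative assumptions.

      def layers_pass(request: dict) -> bool:
          # Each independent layer must pass; a flaw in one layer
          # alone does not expose the system.
          checks = [
              lambda r: r.get("src_ip") == "10.0.0.5",       # network filter
              lambda r: r.get("token") == "valid-session",   # authentication
              lambda r: r.get("action") in {"read"},         # authorization
          ]
          return all(check(request) for check in checks)

      ok  = {"src_ip": "10.0.0.5", "token": "valid-session", "action": "read"}
      bad = dict(ok, action="write")
      print(layers_pass(ok))    # True: every layer agrees
      print(layers_pass(bad))   # False: the authorization layer blocks it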

References

  1. [SS75] Jerome H. Saltzer and Michael D. Schroeder. The protection of information in computer systems. Proceedings of the IEEE, 63(9):1278–1308, September 1975.