  • Software security is currently not widely regulated directly. The following types of laws have (or have had) aspects in Finland that may spill over to the software security side, at least if interpreted broadly:
    • Data protection (privacy) laws, most specifically the General Data Protection Regulation, but also including healthcare-specific and workplace-specific laws
      • Must implement the necessary technical controls (but the definition of “necessary” is left for interpretation)
      • Typically, must guarantee confidentiality, integrity and availability
    • Product liability laws
      • Especially those products which have safety aspects
    • Criminal law covering intrusion into information systems
    • Security evaluation of various governmental systems
    • Secrecy of information in government and laws concerning archival
      • Requires the necessary information security controls that are based on threat analysis
      • Archives must guard against unauthorised access
    • Information security requirements for information exchanged with other countries
    • Copyright law (DRM circumvention aspects; definition of ”effective” DRM)
    • Laws on strong authentication and digital signatures
    • Finnish Constitution
      • Confidentiality of telecommunications is constitutionally protected
  • Of course, in most of these cases, there isn’t any reference to software security per se, or even to information security. Whether or not a privacy or security requirement is interpreted so that it would actually trigger software security activities is just that: interpretation. Many of these laws use language that treats security as an operational aspect.
  • Regulation and standards usually fulfil one or more of the following needs:
    • Internalising an externality (that is, mandating something that would not be done otherwise). This is mainly the role of government regulation, but may also be a factor in vendor management.
    • Having a ready-made set of requirements that can be referred to, so that everyone doesn’t have to reinvent the wheel. The risk here is that these lists will become ”gold standards” and meeting the requirements is what gets optimised, not security.
    • Some standards aim at keeping parties out of litigation and in a contractual world of private arbitration. These are usually security standards of closed ecosystems where the standard is a prerequisite of a membership.
    • Providing a common vocabulary and set of concepts.
  • Regulation through legislation usually does not impose specific technical requirements.
    • An exception was, for example, Germany’s digital signature legislation.
    • Usually, legislation just requires something that is ”good enough”, and interpretation is left to the courts.
    • Often legal help (a lawyer) is required. However, lawyers who can effectively interpret law into engineering requirements are not common.
  • Product liability legislation may affect software security if software is a part of a physical device. It is driven by safety considerations.
    • This liability requires the product to cause damage (death, bodily injury, damage to an item or property) to a consumer, and the product to be defective (in ”reasonable use”).
      • Autonomous vehicles are very interesting in this sense, and the German liability regulations for autonomous cars put software security pretty much in the driving seat (sorry).
    • Interesting examples that may be covered are therefore embedded systems: software in vehicles, battery charging software, software that controls actuators in consumer products, and software that has a role in how much energy the device emits (radio transmitters, flashes, volume levels).
  • Some specific areas of software (again, mostly embedded software) fall under safety requirements. Often these are sectoral safety requirements, which will in turn require software security engineering.
    • Robotics, vehicles
    • Healthcare technology
      • Notably, the FDA in the US released guidance documents (2014) on the security of medical devices, where guidance was previously rather scarce. In the US, there are also large healthcare players (Veterans’ Affairs, as an example) who have their own software security guidance.
    • Software security activities are usually not specifically included; the requirements mainly list technical controls, often with an emphasis on operational security.
  • In the financial sector, PCI-DSS (Payment Card Industry Data Security Standard) has had a large impact on application security awareness.
    • In order to process credit card data (of the large card brands), PCI-DSS is a contractual requirement.
    • On the surface, mainly aims at protecting credit card information. May have positive spill-over effects on other software security aspects.
    • It is likely that the main reason is, however, to keep everyone who processes card data in a single contractually enforced system, with less chance of litigation outside the card issuers’ control sphere.
    • Specifically for payment application development, PA-DSS (Payment Application Data Security Standard) is applicable. It has quite a few specific security feature requirements (for example, on using encryption), and also a general requirement of secure and documented software development.
    • Compliance is measured by specifically qualified companies and people, typically security consulting companies and consultants, called QSAs (PCI-DSS) or PA-QSAs (PA-DSS). At the time of writing in Spring 2018, there are 386 QSA companies and 62 PA-QSA companies (Payment Application Qualified Security Assessor companies). In practice, getting a QSA status has traditionally been a ticket to guaranteed full employment (if you like that sort of a job and don’t smell very bad).
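As a hedged illustration of the kind of feature requirement these standards contain: PCI-DSS famously requires the PAN (primary account number) to be masked when displayed. The function name and the exact masking policy below are illustrative, not taken from the standard.

```python
def mask_pan(pan: str) -> str:
    # Illustrative PCI-style display masking: reveal only the last four
    # digits of the PAN and mask the rest. (The standard's actual rules
    # on which digits may be shown are more detailed than this sketch.)
    digits = pan.replace(" ", "")
    return "*" * (len(digits) - 4) + digits[-4:]

print(mask_pan("4111 1111 1111 1111"))  # ************1111
```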
  • The most software security centric regulatory instrument in Finland appears to be VAHTI 1/2013, the application security guideline for government (”Sovelluskehityksen tietoturvaohje” in Finnish).
    • Specifies 118 requirements for application development
    • In theory, ought to become the de facto requirements document for any public sector procurement.
  • Another fairly important guideline in Finland is the National Security Audit Criteria (Kansallinen turvallisuusauditointikriteeristö, KATAKRI)
    • Applies, for example, when private companies get access to documents having national security significance.
    • Covers not only IT security but also physical security.
    • For example, it has an audit question, ”How has the security of executable code been ensured”. On levels III & II (”elevated” and ”high” security levels), secure software engineering principles are required from suppliers.
    • Example requirements include threat modelling, robustness testing, and static analysis.
    • Other requirements set out a fairly large number of specific controls on, e.g., use of cryptography.
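To make concrete what a requirement like ”static analysis” is after, here is a hypothetical example (not from KATAKRI) of the kind of defect such tools flag: untrusted input concatenated into an SQL query, shown next to the parameterised form an analyser would accept.

```python
import sqlite3

def find_user_unsafe(conn, name):
    # Flagged by static analysis: untrusted input concatenated into SQL
    # (injection risk).
    return conn.execute(
        "SELECT id FROM users WHERE name = '" + name + "'").fetchall()

def find_user_safe(conn, name):
    # Accepted: parameterised query; the driver handles escaping.
    return conn.execute(
        "SELECT id FROM users WHERE name = ?", (name,)).fetchall()

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER, name TEXT)")
conn.execute("INSERT INTO users VALUES (1, 'alice')")

payload = "' OR '1'='1"
print(find_user_safe(conn, payload))    # [] - no user literally named that
print(find_user_unsafe(conn, payload))  # [(1,)] - injection returns all rows
```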
  • On the EU level, Directive 2013/40/EU ”on attacks against information systems” is in force and is being implemented nationally
    • Makes the ”intentional production, sale, procurement for use, import, distribution or otherwise making available” of intrusion tools illegal


  • ISO 27001:2013 contains new, more specific language on software security.
    • The previous version of ISO 27001 (2005) largely treated software security as an input validation issue.
    • The new version has fairly strong language that requires actual software security activities:
      • A.14.2.1 Secure development policy
      • A.14.2.5 Secure system engineering principles
      • A.14.2.6 Secure development environment
      • A.14.2.8 System security testing
      • A.15.1.1 Information security policy for supplier relationships
    • It is likely that once contracts that have references to ISO 27001 / 27002 get updated, software security development practices get quite a bit more pull.
  • Software security itself is being standardised in ISO under ISO/IEC 27034.
      • Only the first deliverable is currently ready. The second deliverable is due to be published during 2015.
      • Whether or not 27034 becomes as popular as 27001/27002 (the more general information security standards) is yet to be seen, but especially in circles who like standards, this is something to keep an eye on.
      • ISO/IEC 27034-1 defines the following terms:
        • Organizational Normative Framework (ONF): All aspects affecting the software security of an organization, including their ASC library (below)
        • Application Normative Framework (ANF): All aspects affecting the software security of an application [project], a subset of ONF
        • ONF Committee: The folks in the organisation who decide and shepherd the ONF. Loosely aligned with the ”software security group” in an enterprise.
        • Application Security Management Process: The process consisting of requirements definition, risk assessment, selecting the ANF subset for an application from the ONF, actual implementation related issues, use of the application, and auditing.
        • Application Security Control (ASC): A software security activity. Collected into an ASC library.
        • The standard doesn’t go much further in fleshing out what these specifically mean, but includes a mapping of the Microsoft SDL to ISO 27034 terms in an appendix.
    • Privacy impact assessments (but not the EU GDPR Data Protection Impact Assessments) are standardised by ISO/IEC 29134.
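The ONF/ANF/ASC vocabulary above can be sketched as a simple data model. This is a loose reading of the ISO/IEC 27034-1 terms; the class and field names, and the idea of selecting controls by a numeric level, are illustrative assumptions rather than anything the standard prescribes.

```python
from dataclasses import dataclass, field

@dataclass(frozen=True)
class ASC:
    """Application Security Control: one software security activity."""
    name: str
    level: int  # targeted assurance level (illustrative, not from the standard)

@dataclass
class ANF:
    """Application Normative Framework: the per-application subset."""
    controls: list

@dataclass
class ONF:
    """Organizational Normative Framework: org-wide, holds the ASC library."""
    asc_library: list = field(default_factory=list)

    def select_anf(self, required_level: int) -> ANF:
        # The Application Security Management Process picks a subset of
        # the organisation's controls appropriate for one application.
        return ANF([a for a in self.asc_library if a.level <= required_level])

onf = ONF([ASC("threat modelling", 1), ASC("static analysis", 1),
           ASC("fuzz testing", 2)])
anf = onf.select_anf(required_level=1)
print([c.name for c in anf.controls])  # ['threat modelling', 'static analysis']
```

The point of the sketch is only the containment relationship: every ANF is a subset of the ONF, and the ASC library is the pool both draw from.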

Reading list

This is a list of useful documents that will enhance your understanding of the course material. The reading list is organised session by session. “Primary” material means something you would be expected to read if you are serious about the course, and it may help you to do the weekly exercise; “Additional” material you may want to read if you would like to deepen your understanding of a specific area.