This presentation on the OWASP Software Assurance Maturity Model (SAMM) was given by Pravir Chandra, the project lead.  I was really excited to see this topic on the schedule, as SAMM is something I’ve been toying with for my organization for a while.  It’s a very simple and intuitive approach to assessing where your organization stands in software security maturity, where you want to get to, and how to get there.  My notes on this presentation are below:

By the end of the presentation, attendees should be able to…

  • Evaluate an organization’s existing software security practices
  • Build a balanced software security assurance program in well-defined iterations
  • Demonstrate concrete improvements to a security assurance program
  • Define and measure security-related activities throughout the organization

Lessons Learned

  • Microsoft SDL
    • Heavyweight, good for large ISVs
  • Touchpoints
    • High-level, not enough details to execute against
    • Large collection of activities, but no priority ordering
  • ALL: Good for experts to use as a guide, but hard for non-security folks to use off the shelf

Drivers for a Maturity Model

  • An organization’s behavior changes slowly over time
    • Changes must be iterative while working toward long-term goals
  • There is no single recipe that works for all organizations
    • A solution must enable risk-based choices tailored to the organization
  • Guidance related to security activities must be prescriptive
    • A solution must provide enough details for non-security people
  • Overall, must be simple, well-defined, and measurable

Therefore, a viable model must…

  • Define building blocks for an assurance program
    • Delineate all functions within an organization that could be improved over time
  • Define how building blocks should be combined
    • Make creating change in iterations a no-brainer

SAMM Business Functions (4 in total)

  • Start with the core activities tied to any organization performing software development
  • Named generically, but should resonate with any developer or manager
  • Governance, Construction, Verification, Deployment

SAMM Security Practices (12 in total)

  • From each of the Business Functions, 3 Security Practices are defined
  • The Security Practices cover all areas relevant to software security assurance
  • Each one is a ‘silo’ for improvement
  • Governance: Strategy & Metrics, Education & Guidance, Policy & Compliance
  • Construction: Threat Assessment, Security Requirements, Secure Architecture
  • Verification: Design Review, Code Review, Security Testing
  • Deployment: Vulnerability Management, Environment Hardening, Operational Enablement
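To keep the 4×3 structure straight, the Business Functions and their Security Practices can be sketched as a simple mapping (the names come straight from the lists above; the data-structure layout itself is just an illustration, not anything SAMM prescribes):

```python
# The four SAMM Business Functions, each mapped to its three Security
# Practices. Illustrative layout only; names are from the SAMM model.
SAMM = {
    "Governance":   ["Strategy & Metrics", "Education & Guidance", "Policy & Compliance"],
    "Construction": ["Threat Assessment", "Security Requirements", "Secure Architecture"],
    "Verification": ["Design Review", "Code Review", "Security Testing"],
    "Deployment":   ["Vulnerability Management", "Environment Hardening", "Operational Enablement"],
}

# 4 Business Functions x 3 Security Practices = 12 'silos' for improvement
practice_count = sum(len(practices) for practices in SAMM.values())
```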

What is “software”?

  • Lots of different aspects of what software is
  • Could be a tarball of source code, UML and specifications, or a server running the code

Under each Security Practice

  • Three successive Objectives under each Practice define how it can be improved over time
  • Level 1, Level 2, and Level 3
  • “Going from crawling to walking to running”
  • 72 different activities, each about the size of a bread box

Per Level, SAMM defines…

  • Objectives
  • Activities
  • Results
  • Success Metrics (2-4 metrics for each objective)
  • Costs (training, content, license, or buildout)
  • Personnel (overhead on different roles from operating at this level)
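As a way to visualize what SAMM specifies per level, the bullets above could be captured in a record like the following (a hypothetical schema of my own, mirroring the bullet names; SAMM itself defines this content in prose worksheets, not code):

```python
from dataclasses import dataclass
from typing import List

# Hypothetical record of what SAMM defines for each of the three levels
# under a Security Practice. Field names mirror the presentation bullets.
@dataclass
class MaturityLevel:
    objective: str
    activities: List[str]
    results: List[str]
    success_metrics: List[str]  # SAMM gives 2-4 metrics per objective
    costs: str                  # e.g. training, content, license, or buildout
    personnel: str              # overhead on roles from operating at this level
```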

Conducting Assessments

  • SAMM includes assessment worksheets for each Security Practice

Assessment Process

  • Supports both lightweight and detailed assessments
  • Organizations may fall in between levels (scored with a “+”)

Creating Scorecards

  • Gap Analysis
    • Capturing scores from detailed assessments versus expected performance levels
  • Demonstrating Improvement
    • Capturing scores from before and after an iteration of assurance program build-out
  • Ongoing Measurement
    • Capturing scores over consistent time frames for an assurance program that is already in place
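The gap-analysis idea above can be sketched in a few lines: compare assessed levels against target levels per practice and see where the shortfall is largest. The specific practices, scores, and half-point encoding of the “+” in-between scores are illustrative assumptions, not part of the SAMM specification:

```python
# Hypothetical gap analysis for three practices. Levels run 0-3; an
# in-between ('+') score is encoded here as a 0.5 step (my convention).
assessed = {"Strategy & Metrics": 1.0, "Code Review": 0.5, "Security Testing": 2.0}
targets  = {"Strategy & Metrics": 2.0, "Code Review": 2.0, "Security Testing": 2.0}

# Gap = expected performance level minus the score from the assessment
gaps = {practice: targets[practice] - assessed[practice] for practice in targets}

# The practice furthest from its target is the natural next build-out focus
worst = max(gaps, key=gaps.get)
```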

Roadmap Templates

  • To make the “building blocks” usable, SAMM defines Roadmap templates for typical kinds of organizations
    • Independent SW Vendors
    • Online Service Providers
    • Financial Services Organizations
    • Government Organizations
  • Organization types chosen because
    • They represent common use-cases
    • Each organization has variations in typical software-induced risk
    • Optimal creation of an assurance program is different for each

Expert Contributions

  • Built based on collected experience with hundreds of organizations
    • Including security experts, developers, architects, development managers, IT managers

Industry Support

  • Several case studies already
  • Several more case studies underway

The OpenSAMM Project

  • Dedicated to defining, improving, and testing the SAMM framework
  • Always vendor-neutral, but lots of industry participation
  • Targeting new releases every ~18 months
  • Change management process

Future Plans

  • Mappings to existing standards and regulations
  • Additional roadmaps where need is identified
  • Additional case studies