Social Engineering Your Metrics: Using Data Science to Provide Value in Reporting

By Joe Gray

Elevator Pitch

Reporting is generally boring. As social engineers, we often get wrapped up in the hustle and bustle of performing the engagement, and report writing falls by the wayside. Are we providing meaningful measurements, metrics, and advice to the client? This presentation introduces actionable metrics.

Description

Abstract:

Reporting is generally boring. As social engineers, we often get wrapped up in the hustle and bustle of performing the engagement, and report writing falls by the wayside. While the reports do go out and we meet our client obligations, a serious question arises: Are we providing meaningful measurements, metrics, and advice to the client?

We certainly highlight deficiencies and areas for improvement in a report; that much is standard. But how do we measure the things that matter most to the client? Measuring opens only tells us how many people read their email, and while clicks are risky, they do not always translate to negative outcomes. Instead of focusing on email opens or links clicked by users, this presentation introduces:

  • Measurements rooted in statistics
  • Data science techniques
  • Indicators that actually speak to the organization’s security posture and culture

A distance metric measures the time between one event (a click or an open) and another (entering information or reporting the message). These metrics are far more indicative of how an organization would fare against real social engineering than a count of who opened an email.
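As a concrete illustration, the ratio and distance metrics described here can be computed from a per-user event log. This is a minimal sketch using Python's standard library; the event names, log structure, and timestamps are invented for illustration and are not from any specific phishing platform:

```python
from datetime import datetime

# Hypothetical engagement event log: user -> event name -> timestamp.
events = {
    "alice": {"open": datetime(2023, 5, 1, 9, 0),
              "click": datetime(2023, 5, 1, 9, 2),
              "report": datetime(2023, 5, 1, 9, 5)},
    "bob":   {"open": datetime(2023, 5, 1, 9, 1)},
    "carol": {"open": datetime(2023, 5, 1, 9, 3),
              "click": datetime(2023, 5, 1, 9, 10),
              "action": datetime(2023, 5, 1, 9, 12)},
}

def ratio(numerator_event, denominator_event):
    """Ratio metric: users who did numerator_event per user who did denominator_event."""
    num = sum(1 for e in events.values() if numerator_event in e)
    den = sum(1 for e in events.values() if denominator_event in e)
    return num / den if den else 0.0

def distances(first_event, second_event):
    """Distance metric: seconds between two events, for each user who did both."""
    return [(e[second_event] - e[first_event]).total_seconds()
            for e in events.values()
            if first_event in e and second_event in e]

print(ratio("click", "open"))      # open-to-click ratio: 2 of 3 openers clicked
print(distances("open", "click"))  # open-to-click distances: [120.0, 420.0] seconds
```

The same two functions cover every ratio and distance pairing in the outline below (open-to-report, click-to-action, action-to-report, and so on) by swapping the event names.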

Outline:

  • Intro (1:00)
  • Current Infosec Metrics (3:00)
      • General Metrics
      • Social Engineering specific metrics
          • Clicks
          • Opens
          • Credentials stolen
  • Why these metrics suck (7:00)
      • They cannot measure an actual adversary’s activity
      • They put the burden on users instead of on security and the organization
  • New Ratio metrics and relevance (18:00)
      • Open to Click Ratio
      • Open to Action (Entering data or downloading file) ratio
      • Open to reporting (to security management) ratio
      • Click to Action (Entering data or downloading file) ratio
      • Click to reporting (to security management) ratio
      • Action (Entering data or downloading file) to reporting (to security management) ratio
  • New Distance metrics and relevance (28:00)
      • Open to Action distance (time between two events)
      • Click to Action distance (time between two events)
      • Open to Report distance (time between two events)
      • Click to Report distance (time between two events)
      • Action to Report distance (time between two events)
  • Additional consulting metrics (35:00)
      • Per engagement metrics (client only)
      • Sliced per engagement metrics (all clients)
      • Sample per engagement metrics (client only, but all engagements)
      • Population per engagement metrics (all clients, all engagements)
      • Use of basic statistical measures:
          • Mean
          • Median
          • Standard Deviation
  • Quantitative versus Qualitative versus Mixed Methods (39:00)
      • Challenges in measuring and in choosing the desired perspective
  • Challenges in integrating with Big Data, Data Science, and Machine Learning (42:00)
      • Regression Analysis
          • Weak or absent relationships between actions and outcomes
          • Limited number of data points available for analysis
  • Possible integration with data science (45:00)
      • Cluster analysis
          • Measurement of “statistically true” and “statistically false” data
  • Conclusion and Questions (48:00)
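The per-engagement, sample, and population slices above reduce to the basic summary statistics named in the outline. A minimal sketch using Python's standard library; the engagement names and click-to-report distances are invented for illustration:

```python
import statistics

# Hypothetical click-to-report distances (minutes) from three engagements.
engagements = {
    "client_a_q1": [5, 12, 30, 45],
    "client_a_q2": [4, 9, 20],
    "client_b_q1": [60, 90, 15, 22, 40],
}

# Sample view: one client, all of their engagements.
client_a = engagements["client_a_q1"] + engagements["client_a_q2"]

# Population view: all clients, all engagements.
population = [d for ds in engagements.values() for d in ds]

for label, data in [("client A", client_a), ("population", population)]:
    print(f"{label}: mean={statistics.mean(data):.1f} "
          f"median={statistics.median(data)} "
          f"stdev={statistics.stdev(data):.1f}")
```

Comparing a single client's mean and median against the population gives the client context: whether their users report faster or slower than the consultant's client base as a whole.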