Monday, October 17, 2016

Summary of Findings: Red Teaming (3.5 out of 5 Stars)

Note: This post represents a synthesis of the thoughts, procedures, and experiences of others as represented in the articles read in advance (see previous posts) and the discussion among the students and instructor during the Advanced Analytic Techniques class at Mercyhurst University in October 2016 regarding Red Teaming specifically as an analytic technique. The technique was evaluated on its overall validity, simplicity, flexibility, and its effectiveness when applied to structured data.

Description:

Red teaming is a broad class of analytic methods that examines scenarios and situations from the perspective of an enemy, adversary, or partner.  Descriptions and processes for precisely how this is done vary widely by source and by the application of the technique.

Strengths:

  • Provides an alternative perspective for situational analysis and for evaluating scenarios
  • Can be highly effective for some applications (e.g. the use of opposing forces (OPFOR) in military training)
  • Flexible and applicable to a wide variety of topics and scenarios

Weaknesses:

  • Lacks a clear, standardized definition and empirical evidence supporting its effectiveness
  • Easy to set up but difficult to replicate consistently

How-To:

  1. Identify a scenario that requires analysis from the perspective of the “enemy”
  2. Designate a team to “think like the enemy” and develop plausible actions the enemy may take (nominal group technique, screening criteria, and other methods or modifiers are recommended for producing plausible plans; a minimal scoring sketch follows this list)
  3. Ask for as much detail as possible about the surroundings of the scenario and the actors within the operating area to refine the analytic assessments
  4. Present the team's findings to the decision maker(s)
  5. The decision maker(s) then take action to mitigate the threats, or exploit the opportunities, identified by the red team
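
Step 2 leaves the screening of brainstormed ideas open-ended. Below is a minimal Python sketch of one way that screening might be organized: each candidate enemy course of action is scored against a few screening criteria, and the highest-scoring candidates are kept for the briefing. The criteria names, weights, and the 1-to-5 scoring scale are illustrative assumptions, not part of the technique as described in the sources.

    # Minimal sketch of the screening step (step 2): score each brainstormed
    # enemy course of action against a few criteria and keep the most plausible.
    # Criteria names, weights, and the 1-5 scale are illustrative assumptions.

    from dataclasses import dataclass, field

    CRITERIA_WEIGHTS = {
        "feasibility": 0.40,  # can the adversary realistically pull this off?
        "impact": 0.35,       # how badly would it disrupt the defended objective?
        "likelihood": 0.25,   # how likely is the adversary to actually try it?
    }

    @dataclass
    class CourseOfAction:
        name: str
        scores: dict = field(default_factory=dict)  # criterion name -> group score, 1-5

        def weighted_score(self) -> float:
            """Combine the group's 1-5 criterion scores into a single weighted value."""
            return sum(CRITERIA_WEIGHTS[c] * s for c, s in self.scores.items())

    def shortlist(candidates, keep=5):
        """Rank brainstormed courses of action and keep the top `keep` for briefing."""
        return sorted(candidates, key=lambda coa: coa.weighted_score(), reverse=True)[:keep]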

Application of Technique:

The following exercise was presented to a group of students:

Scenario: There is a presentation titled “The Joys of Big Brother” being given in CAE 204 at Mercyhurst in several days. We have reason to believe there is a group, or several groups, connected to the campus (aka college kids) that wants to subvert/disrupt/stop this presentation. We have security measures in place (which amount to locking the door), but we need to know what these groups could or would do.

Enemy Team: Mostly college kids. Rumors suggest the group consists predominantly of computer science majors and recreational sports (dirty hippies) types. We have no real information beyond the fact that they are college kids, so they could bring a wealth of skills.

Target: The presentation in CAE 204. We have locked the doors, but we are certain there are other avenues of approach to the presentation (windows, power, network, etc.).

Red Team: Will have 10 minutes to “think like the enemy” and assess different ways to attack this presentation. The team is free to use computers, ask experts, and generally seek information in any way it wants in order to produce a top-five list of ways this opposing force could attack.

The Red Team came up with several novel ideas that the decision makers could act upon.
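
As an illustration only, the sketch from the how-to section could be used to screen the team's brainstormed ideas. The candidate actions below are hypothetical, loosely drawn from the avenues of approach named in the scenario (windows, power, network); they are not the actual ideas the Red Team produced.

    # Hypothetical use of the shortlist() sketch above for the classroom exercise.
    candidates = [
        CourseOfAction("Cut power to CAE 204 during the talk",
                       {"feasibility": 3, "impact": 5, "likelihood": 2}),
        CourseOfAction("Disrupt the room's network connection",
                       {"feasibility": 4, "impact": 4, "likelihood": 3}),
        CourseOfAction("Enter through a window and heckle the speaker",
                       {"feasibility": 2, "impact": 3, "likelihood": 2}),
    ]

    # Print the ranked shortlist that would be briefed to the decision maker(s).
    for coa in shortlist(candidates, keep=5):
        print(f"{coa.weighted_score():.2f}  {coa.name}")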

For Further Information:

  • Red Team: How to Succeed By Thinking Like the Enemy - Micah Zenko
  • Micah Zenko on Red Teaming
  • Defense Science Board Task Force on “The Role and Status of DoD Red Teaming Activities”
  • Red Teaming and Alternative Analysis
  • National Training Center
  • Penetration Testing
  • Cyber Red Team Operations
  • Red Team (Wikipedia)

