The Rise, Fall, and Re-creation of the Counter-terrorism Simulation (Part 2)

Part 2: The Re-creation

This isn’t a joke!  They are learning!  Aren’t they?
Try this. Watch this three-minute video about the Simulation. Make a note every time someone mentions learning, learning objectives, or outcomes for the students. Make another note when someone says something about a feature or a technology we used. Which gets mentioned more?

Were the students learning? Further, if they were learning something, how would we prove it?

I talked with one of the faculty members here at the College of Law (CoL) and asked her what was going on with the Simulation. She said that although alumni were familiar with the Simulation, it drew a lukewarm response every year, like a soap opera the students were acting in. I mean, we were streaming it online and had documentaries made about it, but nothing outlined what the students were learning through the use of this Simulation. What were the objectives? What were the outcomes? What was the rubric?

What are the students learning? No one could answer that definitively. Sure, we could make things up. We could say they are learning decision making in a high-pressure environment. We could say we’re operating under the situated cognition theoretical framework. We could say they are learning valuable decision-making and communication skills. Fine. They do that through the course of life in law school, don’t they?

Man, this bugged me. We invested so much time, thought, energy, and ideas into the Simulation. We wanted this to be successful for EVERYONE involved, all stakeholders from the students to the donors.

We had to stop and reflect as a group. We needed to push the reset button. We had to identify problem areas and address them before we made the simulation “bigger and better.”

What should come first: realism through technology or learning objectives?
We needed to change the model. We had equated high technology usage and realism with good learning outcomes. We never stopped to ask ourselves whether the technology facilitated a good learning environment for the students or was just something extra to add to the pomp. What learning purpose does all this technology have? We had the right idea on some things: the reporters were a good mechanism for feedback to the students as they “learn” in the simulation. But good learning environments don’t happen by chance. They happen through good planning. By setting out your learning objectives in advance, you can be assured that the technology you implement has a direct effect.

The class structure was wrong
The class was virtually all theory and readings. Students went through the chapters in the book, talked Socratic style, and learned and discussed terrorism in various capacities. Then, two weeks before the end of the semester, the Simulation was dropped on them.

What’s wrong with that?

No opportunity to practice. It’s like reading a book about Vince Lombardi and then playing a football game as your final test. Students spent 99% of their class time reading, scribing, and listening, and no time practicing and performing the skills important to doing well in the simulation. We invested all this time and energy into using technology to create a realistic scenario, but we never assessed whether this technological environment was conducive to learning based on what they were taught. Shouldn’t the students have an opportunity to practice the skills we were grading in the Simulation? How much of the Simulation was them trying to “survive,” and how much was an actual test of whether or not they were doing it right?

What kind of changes did we make?
Alright, so instead of thinking of ways to make it more real or bigger and better for next year, we wrote up an analysis of what we thought we should have. Instead of finding ways to project outwards, we did some self-reflection.

  • More quantifiable outcomes: The Simulation is a highly qualitative event. With so much going on, it’s hard to objectively quantify student outcomes during the event. We’d like to facilitate an environment in which the students can be rated quantifiably on their performance, something like a performance score.
  • More practice with relevant skills and constructs: Because the students’ only exposure to the Simulation environment occurs during the Simulation at the end of the semester, no time is allocated for them to identify and practice the skills a successful Simulation requires. We want to give the students more time to develop the skills directly relevant to the Simulation.
  • More formative feedback on student progress: Students learn best with appropriate feedback. By providing a formative structure for feedback, students can further develop their skills in areas where they are deficient. This gives the students an opportunity to keep working on the skills relevant to the Simulation and to carry those skills with them into the workplace.
  • An overall assessment of student skill performance: By giving the students an aggregation of the quantifiable scores along with the constructive qualitative feedback, they will essentially have a formative assessment report that provides insight into their strengths and weaknesses, so they can take the necessary steps to improve their performance in the main Simulation.

Here are the changes that came about:

  • Breakdown of skills and constructs: We’ve identified four skill areas necessary for successful performance in the Simulation: decision making, teamwork, information gathering and analysis, and advocacy and articulation.
  • Mini-simulations to test/reinforce relevant skills & constructs: We’ve created four mini-simulations, one targeting the development of each identified skill area. Each mini-simulation is approximately one hour long and is developed in parallel with the coursework. This lets the students work on these specific skills prior to the main Simulation, in a Simulation context.
  • Formative feedback given to students pre-Simulation: Performance rubrics have been created for each identified skill. With the rubric, we can provide two different types of useful feedback as the students work through a mini-simulation. First, we can give them quantifiable information (a score) on their performance as it relates to the rubric. Second, qualitative feedback is provided for each criterion of the rubric.
  • Assessment reports were created and given (feedback): After each mini-simulation, the student is given a printed report that aggregates the quantitative and qualitative feedback provided by the raters of that mini-simulation (a minimal sketch of this aggregation appears after this list). The report clearly outlines the student’s performance, lets the student identify and improve on weaknesses, and serves as the basis for the individual meetings students schedule with the professor.
  • Main simulation changed to become more efficacious: To focus on the quality of the learning experience, we’ve made some changes to the main simulation. Instead of one giant nine-hour simulation, we separate the students into three groups, and each group participates in a four-hour main Simulation. This levels the importance of each role within the simulation and gives us a better opportunity to rate the students on their performance. Each of the three groups runs through the same simulation scenario, so in addition to within-student comparisons, the raters can also make between-group comparisons of performance.
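
To make the rubric-and-report mechanics concrete, here’s a minimal sketch of how rater input might be structured and aggregated. To be clear, this is an illustration, not our actual tooling: the skill names come from the list above, but the class names, the assumed 1-to-5 scale, and the simple averaging are all hypothetical.

```python
from dataclasses import dataclass, field
from statistics import mean

# Hypothetical structure for one rater's feedback on one rubric criterion.
# The 1-5 scale and all field names are illustrative assumptions.
@dataclass
class CriterionRating:
    skill: str      # e.g. "decision making" or "teamwork" (from the post)
    criterion: str  # one row of that skill's rubric
    score: int      # quantitative feedback (assumed 1-5 scale)
    comment: str    # qualitative feedback on this criterion

@dataclass
class AssessmentReport:
    student: str
    ratings: list[CriterionRating] = field(default_factory=list)

    def skill_score(self, skill: str) -> float:
        """Average every rater's score for one skill area."""
        scores = [r.score for r in self.ratings if r.skill == skill]
        return mean(scores) if scores else 0.0

    def summary(self) -> str:
        """Aggregate scores and comments into the printed report
        handed to the student after a mini-simulation."""
        lines = [f"Assessment report for {self.student}"]
        for skill in sorted({r.skill for r in self.ratings}):
            lines.append(f"  {skill}: {self.skill_score(skill):.1f}/5")
            for r in self.ratings:
                if r.skill == skill and r.comment:
                    lines.append(f"    - {r.criterion}: {r.comment}")
        return "\n".join(lines)

if __name__ == "__main__":
    report = AssessmentReport("Student A", [
        CriterionRating("decision making", "weighs alternatives", 4,
                        "Considered options but was slow to commit."),
        CriterionRating("decision making", "acts under pressure", 3,
                        "Hesitated when the scenario escalated."),
    ])
    print(report.summary())
```

Even something this simple is the difference between “we think the team did well” and “this student averaged 3.5 on decision making, with specific comments explaining why.”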

These questions and answers brought about the creation of a simulation design course. Instead of relying on a group of students doing this in their spare time, or even as a research project, we wanted to provide students an environment in which they can learn how to write a good simulation: one in which the students can learn the skills and not just perform them, and one where each activity is deliberate, chosen to reinforce something we feel they should learn. To refer back to our football example, if we feel blocking is a good skill for our students to learn, we should not only talk about it but have them actually practice doing it. That way students can receive feedback on their performance, hone their skill, and have an opportunity to implement what they learned in an overall activity. Not only are the students in Amos’ class learning about Counterterrorism, the students in the design class are learning how to train effectively. Everyone wins!

We rely heavily on technology to facilitate this course. We use Canvas as a Learning Management System to manage the course schedule and readings. We bring in subject matter experts through Skype (or even Polycom if they’re advanced) to give lectures on the skills and to assist the students in creating their learning environments. The students use Google Docs to collaborate on script writing for both the main simulation and the mini-simulations.

Technology’s new role
Technology is awesome. It can facilitate learning opportunities and learning environments that didn’t exist before. Technology can bring content experts from all over the world to your classroom. Technology can turn brief writing into a collaborative experience for students. Technology can even be the learning environment for students.

Technology is not a substitute for good pedagogical planning. Technology cannot take a broken class and make it better just because we’re using clickers. Technology needs good planning. Technology needs good insight. Technology needs to be collaborative. Technologists need to understand what the professor is trying to do with their class. Professors need to be receptive to new technologies that can make once tedious or impossible tasks easy. After all, that’s what technology is there for, right?

The role of technology in the Counterterrorism simulation is now tied to a learning objective. Some examples?

  • Streams are no longer recorded and pushed out to the public for promotional purposes. Instead, the simulation writers rely on the archived stream after the fact to watch the recording and create feedback for the students; the external stream is just a convenient by-product of that need.
  • Technology was created to facilitate feedback and rating of students. The iPad app isn’t just a fancy promotional piece; rather, it streamlines the aggregation of scores and feedback, so the simulation writers can get it done efficiently and effectively and get the feedback to the students in a timely manner.
  • Technology helps us communicate those results to the students and the community, informing other students, faculty, and alumni. Using websites (authenticated, of course) or even printed reports, we can get information to students quickly, so they can reflect on their performance and prepare questions for skill review. We can also tell sponsors, donors, and alumni how the students are performing in the simulation. Instead of anecdotal stories, we actually have some hard evidence of student outcomes.
  • Websites like our fake CNN site are now tied to a skill: information gathering and analysis. We can write the simulation around what the students do (or don’t do) when information is coming at them a mile a minute, and having this lets us run the mini-simulations much more efficiently (a sketch of how such timed releases might be scripted appears after this list).
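
To show what “information coming at them a mile a minute” might look like in practice, here’s a minimal sketch of scripting timed news releases for a mini-simulation. This is purely hypothetical: the NewsItem structure, the release loop, and the sample headlines are illustrative assumptions, not the code behind our actual site.

```python
import time
from dataclasses import dataclass

# Hypothetical scripted item for the fake news feed. Field names
# and the sample scenario below are illustrative assumptions.
@dataclass
class NewsItem:
    offset_seconds: int  # when to publish, relative to simulation start
    headline: str
    relevant: bool       # writers flag which items actually matter

def run_feed(script: list[NewsItem]) -> None:
    """Release scripted items at their offsets, so raters can watch
    how students triage information arriving faster than they can read."""
    start = time.monotonic()
    for item in sorted(script, key=lambda i: i.offset_seconds):
        delay = item.offset_seconds - (time.monotonic() - start)
        if delay > 0:
            time.sleep(delay)
        print(f"[{item.offset_seconds:>4}s] {item.headline}")

if __name__ == "__main__":
    run_feed([
        NewsItem(0, "BREAKING: explosion reported downtown", relevant=True),
        NewsItem(5, "Celebrity spotted at local restaurant", relevant=False),
        NewsItem(10, "Officials schedule press briefing", relevant=True),
    ])
```

Because the noise is scripted, the writers know exactly which items a student should have acted on, which is what makes the information skill ratable at all.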

We’ve also developed a primer that other faculty can use when creating their own experiential learning exercises. It outlines the different stages of planning and offers ideas on how technology can be used at each stage. This gives faculty a sort of “menu” of technologies and the situations in which each might best facilitate learning in their simulation.

I know this is long, and it’s sort of about technology, sort of not. Either way, it’s a learning process. Hopefully, in telling this story, we can offer it as a thought experiment for you, and the path we’ve moved along will help you when you try to integrate technology into your school’s activities. These are the sorts of things we’re hoping to accomplish with our Center for Innovation in Legal Education.

Next week is a little more technical.  I have a blog post telling you a little bit about a blended classroom environment we created for a first-year Contracts course.  We wrapped it up with a survey and I have some interesting thoughts and ideas to share with you.  Look for that one on December 18th!

As always, if you have any questions, thoughts, advice, or comments, leave them in the section below or drop me a line, I’d love to hear from you. You can always follow me on Twitter as well.
