My Research Groups Are No Longer Active

As professor emeritus at IU, I no longer lead research and development projects nor do I mentor research groups. I encourage you to consider working with other faculty in Instructional Systems Technology.


Past Research Groups: 2005 through 2016

Research goal

We are investigating the effectiveness of First Principles of Instruction as embodied in the Plagiarism Tutorial. This online learning resource is used by hundreds of thousands of people every year. We have redesigned the Plagiarism Tutorial based on First Principles of Instruction. See Frick, T. & Dagli, C. (2016). MOOCs for Research: The Case of the Indiana University Plagiarism Tutorials and Tests. Technology, Knowledge and Learning, 21(2), 255-276. See especially Frick et al. (2022), Analysis of Patterns in Time for Evaluating Effectiveness of First Principles of Instruction, which illustrates the fruitfulness of this approach.

First Principles of Instruction (FPI) include:

  1. Provision of authentic tasks or problems, sequenced from simple to complex.
  2. Activation (helping students connect what they already know with what is to be newly learned).
  3. Demonstration (of what is to be learned).
  4. Application (where students try to do the tasks or solve problems with instructor guidance and feedback).
  5. Integration (of what is learned into students' own lives).

Research questions we are addressing

To the extent students experience First Principles of Instruction (FPI) in the Plagiarism Tutorial, what is the likelihood of student learning achievement?

  • If fewer First Principles are experienced, is the likelihood of student learning achievement lower?
  • If more First Principles are experienced, is the likelihood of student learning achievement higher?

In 2019 and 2020 we collected big data on usage of the Indiana University Plagiarism Tutorials and Tests (IPTAT): over 936,000 student learning journeys. We found that successful students (those who passed one of trillions of Certification Tests [CTs]) were nearly four times more likely to select IPTAT web pages that implement FPI learning activities than were unsuccessful students (those who had not passed a CT). We describe this Big Study in our recent book and in a follow-up study.

What is MAPSAT?

We used a research method called Analysis of Patterns in Time (APT). APT is part of the MAPSAT research methodology (Map & Analyze Patterns & Structures Across Time).

MAPSAT differs from traditional quantitative educational research methods, in which variables are measured separately and then relations among variables are analyzed statistically. In MAPSAT, relations themselves are empirically observed and coded. MAPSAT was invented decades ago and is well suited to modern learning analytics, as well as to many other kinds of research.

In APT, measures of relations are determined by relative frequency and/or duration of occurrences of observed temporal patterns. In other words, researchers code sequences of occurrences of events using defined categories from multiple classifications in an observation system. This results in a temporal map for each unique observed entity (e.g., each student who tries to learn via our Plagiarism Tutorial).

Each temporal map in APT can be represented by a spreadsheet. The rows in the spreadsheet represent successive moments in time; the columns represent the classifications in the observation system; and category names are entered by an observer into spreadsheet cells. The cell entries represent the temporal order of specific empirical events observed to occur within each classification column. The rows are labeled by the date and time of each event occurrence. After observations are completed, a researcher can count specific qualitative patterns within each unique temporal map, as well as sum the durations of occurrences of a particular temporal pattern.
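To make this concrete, here is a minimal sketch in Python of one such temporal map and a simple tally of counts and durations. The classification names, category codes, timestamps, and the helper function are hypothetical illustrations, not part of the IPTAT software.

```python
from datetime import datetime

# Hypothetical temporal map for one student: each row is one observed event,
# a timestamp plus the category recorded under each classification column.
temporal_map = [
    (datetime(2020, 3, 1, 9, 0),  {"Activity": "Activation",         "Outcome": ""}),
    (datetime(2020, 3, 1, 9, 7),  {"Activity": "Demonstration",      "Outcome": ""}),
    (datetime(2020, 3, 1, 9, 20), {"Activity": "Application",        "Outcome": ""}),
    (datetime(2020, 3, 1, 9, 45), {"Activity": "Certification Test", "Outcome": "Master"}),
]

def count_and_duration(tmap, classification, category):
    """Count occurrences of `category` in one classification column and sum
    the time spent in each occurrence (measured to the next event in the map)."""
    count, seconds = 0, 0.0
    for i, (when, cats) in enumerate(tmap):
        if cats.get(classification) == category:
            count += 1
            if i + 1 < len(tmap):
                seconds += (tmap[i + 1][0] - when).total_seconds()
    return count, seconds

print(count_and_duration(temporal_map, "Activity", "Application"))  # -> (1, 1500.0)
```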

For example, in the studies of First Principles of Instruction, we will be counting patterns that represent sequences of specific instructional principles followed by student mastery of the learning objectives. There will be a temporal map for each student who goes through the online tutorial or plays the online game. Observations of event occurrences will be made by computer software embedded in the online instruction, using codes that the researchers have previously associated with each activity (e.g., this activity is an instance of the Application principle, or that activity is an instance of the Activation principle). When a student takes a test, computer software will classify the student as a master or nonmaster of the learning objectives. This will result in thousands of temporal maps. Probabilities of event sequences leading to student mastery can be estimated by APT software, which we are also developing; it will scan the temporal maps for occurrences of temporal patterns and count them.
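A sketch of the pattern-counting step, under the same caveat: the temporal maps, activity codes, and functions below are invented for illustration and do not reproduce the APT software described above. Each map is reduced to an ordered list of activity codes plus a flag for passing a Certification Test, and the probability of mastery is estimated among students whose maps contain a given sequence.

```python
def contains_in_order(events, pattern):
    """True if the categories in `pattern` occur in `events` in that order
    (not necessarily adjacent)."""
    it = iter(events)
    return all(any(e == p for e in it) for p in pattern)

# Hypothetical temporal maps: (ordered activity codes, passed a Certification Test)
maps = [
    (["Activation", "Demonstration", "Application"], True),
    (["Demonstration"], False),
    (["Activation", "Application", "Demonstration", "Application"], True),
    (["Activation"], False),
]

pattern = ["Demonstration", "Application"]   # temporal pattern of interest
with_pattern = [passed for events, passed in maps if contains_in_order(events, pattern)]

# APT-style estimate: P(mastery | the pattern occurred in the student's map)
if with_pattern:
    print(sum(with_pattern) / len(with_pattern))  # -> 1.0 for this toy data
```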

Analysis of Patterns in Configurations (APC) is the other MAPSAT method. In APC, measures of 17 different properties of structural configurations are determined, including interdependence, wholeness, integration, hierarchical order, and complexity. For further information on APC, see ATIS Graph Theory (Thompson, 2008). Structural properties are represented by a digraph of specific affect relations. A digraph consists of vertices (points) and edges (directed lines connecting points).
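Purely to illustrate the digraph representation, the following sketch encodes a few hypothetical affect relations as an adjacency list and runs a simple acyclicity check. The vertex names are invented, and the check is only a rough stand-in for structural analysis; the 17 ATIS property measures themselves are defined in Thompson (2008) and are not implemented here.

```python
# Adjacency list: each vertex maps to the vertices it affects (hypothetical example).
affect_relations = {
    "teacher": ["student_A", "student_B"],
    "student_A": ["student_B"],
    "student_B": [],
}

def has_cycle(graph):
    """Detect whether any directed cycle exists, via depth-first search."""
    visiting, done = set(), set()

    def dfs(v):
        if v in visiting:          # back edge found: a cycle
            return True
        if v in done:
            return False
        visiting.add(v)
        if any(dfs(w) for w in graph.get(v, [])):
            return True
        visiting.remove(v)
        done.add(v)
        return False

    return any(dfs(v) for v in graph)

print(has_cycle(affect_relations))  # -> False: this toy digraph contains no cycles
```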

The most salient difference: MAPSAT measures relations, whereas quantitative statistical methods relate measures. This is not a play on words; rather, it is a profound difference in approach to measurement and analysis in empirical research studies.

MAPSAT measures can be subsequently analyzed with traditional statistical methods. Measures of relations can be treated in aggregate through means (averages), standard deviations, probability estimates, confidence intervals, etc.
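For instance, per-student APT measures might be aggregated as in this sketch, which computes the mean and a normal-approximation 95% confidence interval over hypothetical relative frequencies of one temporal pattern.

```python
from statistics import mean, stdev
from math import sqrt

# Hypothetical per-student APT measures: relative frequency of one temporal
# pattern within each student's map.
pattern_frequencies = [0.40, 0.55, 0.35, 0.60, 0.50, 0.45, 0.52, 0.38]

m, s, n = mean(pattern_frequencies), stdev(pattern_frequencies), len(pattern_frequencies)
half_width = 1.96 * s / sqrt(n)   # normal approximation; a t-based interval would be wider
print(f"mean = {m:.3f}, 95% CI = ({m - half_width:.3f}, {m + half_width:.3f})")
```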


Past Research: A Sample

To get an idea of what students in my research groups and I have previously accomplished, see the reports below. Some links below to copyrighted reports require authentication with your IU username and password; other links do not.

SimEd: Developing and Studying Simulations and Games for Learning

Enfield, J., Myers, R., Lara, M., & Frick, T. (2012). Innovation diffusion: Assessment of strategies within the DIFFUSION SIMULATION GAME. Journal of Simulation and Gaming, 43(2), 188-214.

Kwon, S. & Frick, T. (2014). Design theory for instructional overlays within complex simulation games. Under review, Educational Technology Research and Development.

Kwon, S., Lara, M., Enfield, J. & Frick, T. (2013). Design and evaluation of a prompting instrument to support learning within the Diffusion Simulation Game. Journal of Educational Technology Systems, 41(3), 231-253.

Lara, M. (2013). Personality traits and performance in online game-based learning: Collaborative vs. individual settings. Bloomington, IN: Doctoral dissertation.

Lara, M., Myers, R., Frick, T., Aslan, S., & Michaelidou, T. (2010). A design case: Developing an enhanced version of the diffusion simulation game. International Journal of Designs for Learning, 1(1). IJDL online.

Myers, R. (2012). Analyzing interaction patterns to verify a simulation/game model. Bloomington, IN: Doctoral dissertation.

Myers, R. & Frick, T. (2015). Using pattern matching to assess gameplay. In C. S. Loh, Y. Sheng, & D. Ifenthaler (Eds.), Serious games analytics: Methodologies for performance measurement, assessment, and improvement (Chapter 19, pp. 435-458). Heidelberg, Germany: Springer.


MAPSAT: Map & Analyze Patterns & Structures Across Time

Frick, T., Howard, C., Barrett, A., Enfield, J., & Myers, R. (2009). Alternative research methods: MAPSAT your data to prevent aggregation aggravation. Paper presented at the annual conference of the Association for Educational Communications & Technology, Louisville, KY.

Frick, T., Myers, R., Thompson, K. & York, S. (2008). New ways to measure systemic change: Map & Analyze Patterns & Structures Across Time (MAPSAT). Featured research paper presented at the annual conference of the Association for Educational Communications & Technology, Orlando, FL.

Howard, C. D., Barrett, A. F., & Frick, T. W. (2010). Anonymity to Promote Peer Feedback: Pre-Service Teachers' Comments in Asynchronous Computer-Mediated Communication. Journal of Educational Computing Research, 43(1), 89-112.

Koh, J., & Frick, T. (2009). Instructor and student classroom interactions during technology skills instruction for facilitating preservice teachers’ computer self-efficacy. Journal of Educational Computing Research, 40(2), 207-224.

Seminal work:

Frick, T. (1990). Analysis of Patterns in Time (APT): A Method of Recording and Quantifying Temporal Relations in Education. American Educational Research Journal, 27(1), 180-204.

Frick, T. (1992). Computerized Adaptive Mastery Tests as Expert Systems. Journal of Educational Computing Research, 8(2), 187-213.


IDCL: Instructional Design for Complex Learning

Frick, T. & Dagli, C. (2016). MOOCs for Research: The Case of the Indiana University Plagiarism Tutorials and Tests. Technology, Knowledge and Learning, 21(2), 255-276.

Enfield, J. (2012). Designing an educational game with Ten Steps to Complex Learning. Bloomington, IN: Doctoral dissertation.

Frick, T., Chadha, R., Watson, C., Wang, Y. & Green, P. (2009). College student perceptions of teaching and learning quality. Educational Technology Research and Development, 57(5), 705-720.

Frick, T., Chadha, R., Watson, C. & Zlatkovska, E. (2010). Improving Course Evaluations to Improve Instruction and Complex Learning in Higher Education. Educational Technology Research and Development, 58(2), 115-136.

Seminal work:

Frick, T. (2018). The theory of totally integrated education: TIE. In J. M. Spector, B. B. Lockee, and M. D. Childress (Eds.), Learning, Design, and Technology: An International Compendium of Theory, Research, Practice and Policy: Learning theory and the learning sciences (J. Elen, Section Ed.).

Additional Research Indices