Teaching and learning with Dashboards
Introduction
This page summarizes ways that dashboards have been introduced to EOAS faculty and students for teaching and learning purposes. Design and deployment guidelines are provided on separate pages.
These recommendations derive partly from experience gained during the OCESE project, partly from other efforts to implement dashboards (or similar resources) within the Department (EOAS), and partly from the experiences of others outside UBC, as reported in the literature or at geoscience education conference sessions.
Success is measured in terms of both (a) whether students gained new insights, conceptual understanding, or skills, and (b) whether they were demonstrably “inspired” or “motivated” by using the interactive resources (dashboards) to engage in meaningful ways with concepts or skills they are still learning.
One area deserving further research is development of ways to measure the impact of dashboard resources on learning or student motivation. This requires some attention to “Fidelity of Implementation” - that is, whether the resource itself, the content and contexts, and the corresponding teaching tactics (the pedagogy) were all applied in consistent, verifiable ways. This is not trivial - see Stains and Vickrey (2017), referenced below. That said, most instructors can tell quite quickly whether their use of a dashboard has been effective, based on student feedback and observations of enthusiasm and engagement. Other references below include means of evaluating the effectiveness of resources, of the tactics used, or both, so these and other sources in the literature provide precedent for such research.
Recommendations
It is always important that students are given time to become familiar with a tool before being asked to make good use of it; they cannot solve problems with a new tool while they are still learning it. Students are wrestling with three things at once: 1) new concepts, 2) a new tool, and 3) a “problem” to solve (e.g., “what happens to the water divide, both conceptually and mathematically?”).
If possible, ask them to explore the tool before coming to class, perhaps with a few pointers about what to look for. They will then arrive ready to use the tool productively.
Or, have them explore a little in class at the end of one lesson, then pick up the “real” task in the following lesson.
The first questions students are given can be “trivial” (i.e., more about the tool than the concepts), with the real problem you want them to solve coming after that.
Some questions that students address can be “find the answer” types of questions. However, students learn more when their task causes them to ‘puzzle’ over some issue, concept, implication, or outcome. Dashboards should not be used to replace “calculators” or other tools that focus on answers.
Similarly, questions that are incomplete or “ill-posed” can result in significant learning, especially if there is no single “answer” to the task. Seeing that their colleagues have produced different, equally legitimate choices (given the assumptions they used or were given) helps these “beginners” gain confidence in dealing with incomplete data, ill-posed situations, and the like.
More effective still are questions that prompt exploratory use of concepts or data. Ask “what if …” or “assume this …” types of questions.
For in-class activities, providing instructions for the activity on lecture slides can work. However, it is more effective to provide a worksheet with settings, questions, and spaces for answers; this helps students focus and keep up.
Students are more motivated by questions that aim to address or solve issues with consequences for people, property, or ecosystems than by purely abstract or theoretical questions. This is about connecting concepts to reality.
Encourage students to pose questions, and do this strategically. As Kastens, Zrada, and Turrin (2020; see references below) report: “Encouragingly, over 70% of participants generated at least one question at the highest Bloom’s level. Implications for instruction include that assigned question-asking can be an opportunity to engage all students in question asking, and that students can ask good questions about complex data before the data are explained to them.”
Also, be sure to consider the wisdom expressed by others. The references below are a short “starter” list with guidelines for facilitating effective learning with simulations, demonstrations, and interactive resources.
Courses that have used OCESE dashboards
Use links in this list to explore ways that EOAS courses have used or are using interactive dashboard resources.
ENVR 300, Introduction to Research in Environmental Science.
EOSC 112, The Fluid Earth: Atmosphere and Ocean.
EOSC 116, Mesozoic Earth: Time of the Dinosaurs.
EOSC 310, The Earth and the Solar System.
EOSC 323, Structural Geology I (students explored Mohr’s circles).
EOSC 325, Principles of Physical Hydrogeology.
EOSC 340, Global Climate Change.
EOSC 372, Introductory Oceanography: Circulation and Plankton.
EOSC 429, Groundwater Contamination.
References
Chamberlain, J. M., K. Lancaster, R. Parson, and K. K. Perkins, How guidance affects student engagement with an interactive simulation, Chemistry Education Research and Practice 15, p. 628–638, 2014.
Kastens, Kim A., Melissa Zrada, and Margie Turrin, What Kinds of Questions Do Students Ask While Exploring Data Visualizations?, Journal of Geoscience Education 68, no. 3 (July 2, 2020): 199–219. https://doi.org/10.1080/10899995.2019.1675447.
Perkins, Katherine, Noah Podolefsky, Kelly Lancaster, and Emily Moore, Creating Effective Interactive Tools for Learning: Insights from the PhET Interactive Simulations Project, in Proceedings of World Conference on Educational Multimedia, Hypermedia and Telecommunications 2012, p. 436–441, 2012. http://editlib.org/p/40781.
Rehn, D. A., E. B. Moore, N. S. Podolefsky, and N. Finkelstein, Tools for high-tech tool use: A framework and heuristics for using interactive simulations, Journal of Teaching and Learning with Technology (JoTLT) 2(1), p. 31–55, 2013.
Stains, Marilyne, and Trisha Vickrey, Fidelity of Implementation: An Overlooked Yet Critical Construct to Establish Effectiveness of Evidence-Based Instructional Practices, CBE—Life Sciences Education 16, no. 1, March 2017. https://doi.org/10.1187/cbe.16-03-0113.
Wieman, C., W. Adams, P. Loeblein, and K. Perkins, Teaching Physics using PhET Simulations, The Physics Teacher, 2010.
See also …
Scholarly references at https://phet.colorado.edu/en/research.
References on the QuEST project references page.