Virtual Field Trips (VFTs) can be used to visit places that are impossible to reach in person (e.g. the mid-Atlantic Ridge), or to act as a replacement for students unable to physically attend a field trip. An example produced by colleagues at the Open University is previewed in the video below (source):
VFTs have been produced using 3D platforms such as Google Earth, but only recently have developments in software and hardware made the technology robust enough for everyday teaching.
Tracking Students: One idea we had in our Google research project was to see whether tracking students as they fly around VFTs can inform tutors and students about the students' learning. This topic isn't well covered in the literature, so it is worth investigating. A paper Muki, Paolo Viterbo and I have just submitted to a journal describes our work in this area. We collected 4D data (3D plus time) using the Google Earth API as students navigated around to complete an educational search task. In some VFTs students are limited to walking, but in ours they had access to zoom and pan.
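The paper defines the exact logging format; as a minimal sketch, assuming each sample simply records the virtual camera's position plus a timestamp, the 4D data might look like this (the field names and values are illustrative, not the ones we actually used):

```python
from dataclasses import dataclass

@dataclass
class TrackSample:
    """One 4D sample of the virtual camera: 3D position plus a timestamp."""
    lat: float  # degrees north
    lon: float  # degrees east
    alt: float  # metres above the terrain
    t: float    # seconds since the task started

# A toy track: the student starts high up, zooms in, then pulls back out.
track = [
    TrackSample(52.02, -0.71, 5000.0, 0.0),
    TrackSample(52.03, -0.70, 1200.0, 8.5),
    TrackSample(52.03, -0.70, 300.0, 15.0),
    TrackSample(52.05, -0.68, 4500.0, 30.0),
]
```

A real logger would append samples like these to a server at a fixed interval while the student navigates.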
Two Visualisations: In the paper we describe two visualisations which help users (either tutors or students) make sense of the complex 4D tracking data. One is a static graphic (not covered today), the other is an animation:
The animation links an altitude vs distance graph with a 3D view of the track in space using Google Earth's cross-section functionality. We think these visualisations are quick and effective ways to evaluate students' search activities.
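As a rough sketch of what the altitude vs distance graph plots: given (lat, lon, alt) samples along the track, the x-axis is cumulative ground distance and the y-axis is camera altitude. This assumes a simple great-circle (haversine) distance, an approximation rather than whatever Google Earth uses internally:

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle ground distance in metres between two lat/lon points."""
    R = 6_371_000.0  # mean Earth radius, metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * R * math.asin(math.sqrt(a))

def altitude_profile(points):
    """Turn (lat, lon, alt) samples into (distance-along-path, alt) pairs."""
    profile = [(0.0, points[0][2])]
    total = 0.0
    for (lat1, lon1, _), (lat2, lon2, alt2) in zip(points, points[1:]):
        total += haversine_m(lat1, lon1, lat2, lon2)
        profile.append((total, alt2))
    return profile

# Example: a single hop 0.01 degrees north (roughly 1.1 km on the ground)
profile = altitude_profile([(52.0, -0.7, 5000.0), (52.01, -0.7, 1000.0)])
```

Plotting `profile` with any charting library gives the bottom panel of the animation; a dip in the curve is a zoom-in.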
Experiment Summary: In the experiment students:
1. Viewed a Google Earth tour which explained how to identify paleo-geographical features (banks surrounding a lake that dried up long ago).
2. Were then set a search task: find their own example in a defined study area. An important feature of the task was that students could not complete their search without zooming in to check characteristics in more detail. Their route through 3D space was tracked and saved to a server.
3. Marked their answer on the map.
Visualised data: The simple 3D path in space looks like spaghetti thrown into the air (top section of the diagram above) and is difficult to interpret. However, by plotting altitude against distance along the path in a linked graph (bottom part), the student's actions of zooming in and out on targets can be clearly seen. In the main view (top of image) the red arrow shows the camera location, and the hair line on the graph (bottom) marks the corresponding point on the graph. You can control the hair line to explore the path; this page links to a sample KML file, and the YouTube clip explains how to set it up and what it shows in more detail.
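For readers who want to experiment, a track like this can be serialised as a KML LineString and opened directly in Google Earth. This sketch is not the sample KML file linked above, just an illustration of the format (note that KML orders coordinates as lon,lat,alt):

```python
def track_to_kml(points, name="student-track"):
    """Serialise (lat, lon, alt) samples as a minimal KML LineString.

    KML expects coordinates as lon,lat,alt triples separated by whitespace.
    """
    coords = " ".join(f"{lon},{lat},{alt}" for lat, lon, alt in points)
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<kml xmlns="http://www.opengis.net/kml/2.2">\n'
        f"  <Placemark><name>{name}</name>\n"
        "    <LineString>\n"
        "      <altitudeMode>absolute</altitudeMode>\n"
        f"      <coordinates>{coords}</coordinates>\n"
        "    </LineString>\n"
        "  </Placemark>\n"
        "</kml>\n"
    )
```

Saving the returned string to a `.kml` file and opening it in Google Earth draws the path in 3D.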
What Does it Show? From interacting with this visualisation, several aspects of a student's performance can easily be gauged:
· Did the student zoom in on sensible targets (i.e. the ‘answer’ area and other areas that needed checking out)?
· Did the student get disorientated (stray outside the yellow study area box or spend an overly long time in one area)?
· Were they thorough in their search or did they just do the bare minimum (did they zoom in on a number of sensible locations, just a few, or did they fail to zoom in at all)?
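Questions like these could in principle be computed from the track as well as read off the animation. A hedged sketch, assuming (lat, lon, alt) samples, a rectangular study area, and the crude heuristic that a dip below a low-altitude threshold after flying high counts as one "zoom in" (the thresholds here are made up for illustration):

```python
def fraction_inside(points, south, west, north, east):
    """Fraction of samples whose lat/lon fall inside the study-area box."""
    inside = sum(
        1 for lat, lon, _ in points
        if south <= lat <= north and west <= lon <= east
    )
    return inside / len(points)

def count_zoom_ins(points, high=2000.0, low=500.0):
    """Count dips below `low` altitude after having been above `high`."""
    zooms, armed = 0, False
    for _, _, alt in points:
        if alt >= high:
            armed = True
        elif alt <= low and armed:
            zooms += 1
            armed = False
    return zooms
```

A high zoom-in count on sensible targets suggests a thorough search; a low inside-box fraction suggests the student became disorientated and strayed outside the study area.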
Possible Uses: This technique could be applied to a number of virtual field trip situations. The case study we've already looked at represents a physical geography/earth science application. It could also be used for:
Human geography: e.g. if students are taught that poorer neighbourhoods are likely to be further from the centre of a city, you could then ask students to identify poor neighbourhoods in a sample city. Tracking a successful search would show students navigating to sample sites around the edge of the city and then zooming into Street View to check whether they were right.
Student-created maps: students are first tasked with identifying volcanoes in a country, marking three answers on a shared class map. In the second stage they assess their peers' work and are tracked zooming in on each other's placemarks. The tracking animation would show how well they performed in the second stage, e.g. did they check out suggestions in enough detail? IMHO this last example has the advantage of representing deeper learning: it challenges students to think critically about each other's work.
Ethics: Learning Analytics is a powerful new tool for teaching; used carefully, it has huge potential to assist students and tutors. However, it also raises real teaching issues, such as: will students react well to this new kind of feedback? Will institutions use it to measure tutors' performance in a confrontational manner? IMHO we need to approach this new tool with an open, student-focused frame of mind.