Erik Duval is a professor at the Catholic University of Leuven. His team works on Human-Computer Interaction. In the last few years, he has done a lot of work around Learning Analytics, which he defines as collecting the traces that learners leave behind and using those traces to improve learning.
His students at the university do everything (and he means everything) using blogs and Twitter. He stopped giving lectures and instead works with students in a single place a few times a week. This makes it very hard for him to follow what is going on: the number of posts generated in his courses is too large for him to read them all. If you are facilitating a Massive Open Online Course (MOOC), this gets even worse. This is why we do learning analytics. The field is getting a lot of attention now, with its own conference and a Society for Learning Analytics Research.
Next he mentioned the quantified self movement: self-knowledge through self-tracking. If a tool gives you a good mirror of your behaviour, it might become easier to actually change that behaviour. He showed many examples from the consumer market (e.g. the Nike+ FuelBand or the Fitbit). He is trying to see whether you could develop similar applications for learning. Imagine setting a goal for how many words you want to learn every day, and a device that shows you how many you've learned so far that day. He wants to create awareness in students, so that they can "drive" themselves better. This is different from most current efforts in learning analytics, which are mostly used to give more information to the institution (Duval doesn't like that). He showed us an example of the dashboard he uses to see student activity on the blogs and on Twitter. The students have access to this information too and can see the data for their peers: openness and full transparency. This measuring leads to externalities that aren't necessarily good (think of students writing tweetbots to get a good score). Duval depends on the self-regulating abilities of the group of students.
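To make the word-learning example concrete, here is a minimal sketch of what such a "mirror" could look like. This is purely illustrative: the `VocabTracker` class and its methods are my own invention, not part of any tool Duval showed.

```python
# Hypothetical sketch of a quantified-self style learning tracker:
# set a daily goal for words learned and get a simple "mirror" of
# your progress. All names here are illustrative.

from dataclasses import dataclass, field


@dataclass
class VocabTracker:
    daily_goal: int                                       # words you aim to learn per day
    learned: list = field(default_factory=list)           # words logged today

    def log(self, word: str) -> None:
        """Record a newly learned word."""
        self.learned.append(word)

    def progress(self) -> str:
        """Return a one-line text 'mirror' of today's progress."""
        done = len(self.learned)
        bar = "#" * done + "-" * max(self.daily_goal - done, 0)
        return f"[{bar}] {done}/{self.daily_goal} words"


tracker = VocabTracker(daily_goal=5)
for w in ["serendipity", "ubiquitous", "analytics"]:
    tracker.log(w)
print(tracker.progress())  # [###--] 3/5 words
```

The point of the mirror is exactly what Duval describes: the learner, not the institution, sees the feedback and can adjust their own behaviour.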
At the beginning of each course he tells his students that everything in the course will be open. He might have a debate about this, but he never gives in: he doesn't think you can become an engineer without the ability to engage openly with society. If a student has genuine conscientious objections around privacy, he sometimes allows them to publish under an alias.
If you collect a lot of data about people, you can make technology-enhanced learning more of an exact (i.e. hard) science. He wrote a paper titled "Dataset-driven Research for Improving Recommender Systems for Learning".
This whole field has a couple of issues:
- What can we measure? Time spent, artefacts produced, social interactions, location. Many other things might be important too.
- Privacy might become an issue: we will know so many things about everybody. One solution might be Attention Trust, which defines four consumer rights for your (attention) data: property, mobility, economy and transparency. Our ideas about privacy are changing; he referred to Public Parts by Jeff Jarvis.
- When does support become enslaving? (see this blogpost)
His solution for the problems (once again): openness.
Duval’s talk had a lot of similarities with the talk I will be delivering tomorrow. Luckily we come from slightly different angles and don’t share all our examples. If you attended his talk and didn’t enjoy it, you can skip mine! If you loved it, come and get more tomorrow morning.