As part of my doctoral process, I have to review a lot of literature. A LOT. To motivate me to keep track of the resources as I read them, I am going to try to post summaries of the pertinent articles here. My hope is that I will get in the habit of writing and summarizing so that the literature review portion of my dissertation is easier to write. We’ll see if that works, and if I can keep up with the summaries!
Oshima, J., Oshima, R., & Matsuzawa, Y. (2012). Knowledge building discourse explorer: A social network analysis application for knowledge building discourse. Educational Technology Research and Development, 60(5), 903–921. Available from http://link.springer.com/article/10.1007%2Fs11423-012-9265-2
In this article, the authors argue that current assessment techniques are insufficient to accurately identify and assess learning in the knowledge creation metaphor. Instead, they suggest using social network analysis (SNA) to describe and assess how knowledge develops through discourse and community.
Traditionally, knowledge is thought of in one of two models: acquisition, in which the learner accumulates and stores knowledge, and participation, in which the learner experiences “knowing” by participating in cultural phenomena. The knowledge-creation metaphor seeks to unify acquisition and participation into a single metaphor. Knowledge-creation is focused on newness. This could be the creation of new knowledge or the transformation of current activity into a new system.
Assessment techniques commonly used for acquisition and participation models fall short when applied to learning in a knowledge-creation context because they do not capture the full complexity of the knowledge community. SNA, by contrast, can capture and analyze interactions within complex networks. While it has not been widely used in education, there is research to support its use within knowledge-creation organizations.
For this study, the researchers used KBDeX, a software program that visualizes network structures as a bipartite graph of words × discourse units. In addition to the visualizations, KBDeX calculates network measures such as the betweenness, degree, and closeness centrality coefficients.
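The general idea behind a word × discourse-unit network can be sketched in a few lines of Python with networkx. This is only an illustration of the concept, not KBDeX's actual algorithm: the sample utterances and keyword list below are invented, and KBDeX's real co-occurrence and centrality computations may differ in detail.

```python
# Sketch of a word-by-discourse-unit network: link two tracked words
# whenever they co-occur in the same discourse unit (utterance), then
# compute the centrality measures KBDeX reports. Sample data is invented.
import networkx as nx

# Discourse units (utterances) and the target vocabulary to track.
utterances = [
    "the force on the cart equals mass times acceleration",
    "so acceleration depends on the mass",
    "we measured velocity not acceleration",
    "velocity changes over time",
]
keywords = {"force", "mass", "acceleration", "velocity"}

G = nx.Graph()
G.add_nodes_from(keywords)
for utterance in utterances:
    present = [w for w in utterance.split() if w in keywords]
    # Connect every pair of keywords appearing in this unit.
    for i in range(len(present)):
        for j in range(i + 1, len(present)):
            G.add_edge(present[i], present[j])

# Network measures of the kind reported by KBDeX.
print(nx.degree_centrality(G))
print(nx.betweenness_centrality(G))
print(nx.closeness_centrality(G))
```

In this toy data, "acceleration" ends up most central because it co-occurs with every other keyword, which is exactly the kind of structural observation the authors use to characterize how connected a group's discourse is.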
The data for the study came from a previously published study in which two groups' problem-solving discussions were compared using discourse analysis. That analysis found that one group, the Gillian group, went beyond pure calculation in its discussions, resulting in deeper conceptual understanding than the Matt group, and that the Gillian group demonstrated more characteristics of knowledge-creation. To compare discourse analysis with SNA, the researchers applied SNA techniques to the same transcripts.
Distinct differences were found in the vocabulary network of the conversations each group engaged in during problem-solving. In the Gillian group, the majority of the terms formed a cohesive network, while in the Matt group the terms formed two completely separated clusters of words. This shows that the Gillian group discussed the problem in a more connected manner.
The cumulative centrality coefficient of the terms affirms this. The Gillian group quickly reached and maintained a high cumulative centrality coefficient, which demonstrates that the concepts were closely connected in their conversation. By contrast, the cumulative centrality coefficient for the Matt group remained low and fluctuated several times, suggesting that they used concepts in isolation from one another.
Social network analysis led to similar conclusions as the original discourse analysis. However, because SNA can be carried out incrementally throughout the progression of the discourse, the researchers were also able to identify the moment in the conversation that the Gillian group connected the concepts. They were also able to assess individual student contributions to the group dynamic, which was not possible with the in-depth discourse analysis.
This paper is a very interesting application of SNA because it uses multiple techniques in a quantitative manner, rather than simply generating network visualizations. In addition, by using SNA to analyze data previously studied under more traditional techniques, the researchers have validated the use of SNA techniques. I think SNA would be a more useful tool to assess MOOCs than traditional pre- and post-test measures, given the distributed nature of the learners and the openness of the content.