Data Analytics and Evaluating Faculty/Student Experience

I just finished reading “Visualizing Knowledge Networks in Online Courses” by Brian Dashew of Columbia University. It is one of the clearest papers I have read on LMS web analytics as a tool to guide faculty on the dynamics of their online courses. In plain English, the purpose of online graphical analytics is this: when in front of a live class, we can tell how we are doing, whether the students are silent and bored to death or the room is vibrant with discussion. Graphical web analytics allows us to see, at least at a superficial level, how our class is interacting. Online graphical models serve up visual patterns between students and faculty at a granular level, enabling faculty to see the dynamics of their online classes when they are blind to the facial reactions and body language of their students.
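
To make the idea concrete, here is a minimal sketch, entirely my own illustration and not Dashew’s method, of how an LMS discussion log might be turned into the kind of interaction graph these tools visualize. The reply data and student names are hypothetical, and it assumes the networkx Python package.

```python
# Minimal sketch: turning hypothetical LMS discussion-forum replies
# into an interaction graph a faculty member could inspect visually.
# Assumes the networkx package (pip install networkx).
import networkx as nx

# Each tuple: (author_of_reply, author_being_replied_to) -- made-up data.
replies = [
    ("maria", "instructor"), ("dev", "maria"), ("maria", "dev"),
    ("chen", "instructor"), ("dev", "chen"), ("alex", "instructor"),
]

G = nx.DiGraph()
for source, target in replies:
    # Weight edges by how often one person responds to another.
    if G.has_edge(source, target):
        G[source][target]["weight"] += 1
    else:
        G.add_edge(source, target, weight=1)

# One structural signal a visual analytics tool would surface:
# who anchors the conversation...
centrality = nx.degree_centrality(G)
for person, score in sorted(centrality.items(), key=lambda kv: -kv[1]):
    print(f"{person:12s} centrality={score:.2f}")

# ...and who is enrolled but absent from the graph entirely --
# the online equivalent of the silent back row in a live classroom.
enrolled = {"maria", "dev", "chen", "alex", "sam"}
print("silent:", enrolled - set(G.nodes))
```

Even this toy graph surfaces the two signals a live classroom gives a teacher for free: who is driving the discussion, and who is sitting silently.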

My exploration of learning transformation examines the personal, cultural/experiential side of life in poverty and trauma. After 15 years of exploration, it led to the development of the Global Learning Framework, the Personal Learning Framework and the Transformation Learning Framework. All of these frameworks manage the collaborative contextual flow around learning content in global/local formats. The question is: how do personal frameworks play a role in contemporary online analytic models?

To be fair, the detailed web analytics of a single curriculum and class describe a fairly closed system (the lesson’s bubble), one subject to great variance when transplanted into different cultures or personal experiences. It is on this issue that web analytics, or the analytics of human collaboration, may in the future unfold the hidden mysteries of collaborative learning and, better yet, distributed intelligence. Perhaps evidence of shared memeplexes (Richard Dawkins) will reveal their importance in this field of cognitive collaboration.

While immersed in Full Sail University’s highly collaborative graduate curriculum model, I wondered how different cultural environments impact the student’s experience and, therefore, the instructional design strategy. Evaluating online faculty suffers the same problem as evaluating classroom teachers: each year the class makeup varies enormously. One year may be smooth and the next a class management nightmare. Even with the same content each semester, students create completely new sets of challenges, thereby altering the collaborative patterns of the course. This is as true for kindergarten as it is for evening classes. Each class has its unique cognitive personality that is shared between the students. This is one of the reasons standardized teacher evaluations are not fair.

In Africa, learning is a social process, something Eduard Lindeman’s landmark book, “The Meaning of Adult Education,” recognized back in 1926 (Lindeman). His ideals on meaning and relevance were validated in my work with homeless women at the Bridgeport, CT Rescue Mission. The moment I put two women to work together on a PC, their confidence and their output in writing their life stories grew; the technology greatly accelerated the learning experience. It is interesting to note that when students feel safe, collaboration accelerates; conversely, an unsafe environment shuts learning down.

At Full Sail University, they used the RISE Model because it yields a culture of positive critique in a creative process. Full Sail is a media school, and for learning to work, the creative process must keep flowing. RISE also encouraged the stronger, more experienced students to mentor classmates who needed help in specific areas. RISE is a superior tool for enhancing collaborative constructs. In addition, our pod of eighteen graduate students working through an extremely complex Master’s in Instructional Design created a safe collaborative environment by launching a separate Google+ circle that blocked faculty from viewing student issues. This renegade circle would sometimes become more interactive than the classroom environment (outside the eyes of any analytics tool). The group often freely discussed material contrary to the curriculum and offered collaborative technical support on Adobe’s very buggy, overpriced Creative Suite.

It is also interesting to note how the dynamics changed when we went to the weekly instructor webinar. Professors who used the webinar time to present had small attendance. However, professors who used it to tap into open debate and student backgrounds would usually see one-hour sessions bleed over into two (keep in mind these were non-gradable moments, time for real learning). In many ways, we were hungry for the analog faculty’s views (opinions) and life experiences to enrich (give meaning to) the content in the lesson’s module.

This structure of the Full Sail experience seemed to deliver layers of emotional context (an academic way of saying meaning): LMS content/blog > Google+ group > webinar collaboration > late-night phone calls. Perhaps this all walks us into the space of distributed intelligence (Roy Pea). How do we do analytics in the cognitive space? We need to keep in mind that web analytics are a very small snapshot in time and do not capture the transcendent variances flowing in the background of the actual collaborative experience. What this means is that I could design a highly interactive page-turning sexual harassment course for hospital MDs, who must pass with 90% or greater to receive a paycheck, and yet yield no significant change in behavior toward the RNs. Analytics may tell us great detail about interaction without yielding hidden data on transformational perceptions and values. And yet analytics still has great value.
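
As a small illustration of that gap, here is a sketch of the kind of metrics an analytics snapshot can honestly report. The event log and names are hypothetical, my own example rather than any particular LMS’s export format; the point is that every measurable field concerns interaction, not transformation.

```python
# Minimal sketch of what a web-analytics snapshot actually measures:
# counts and durations, not changes in values or behavior.
# The event log below is hypothetical, standing in for an LMS export.
from collections import defaultdict

# Each event: (learner, action, seconds_spent) -- made-up data.
events = [
    ("dr_lee", "page_view", 40), ("dr_lee", "page_view", 35),
    ("dr_lee", "quiz_submit", 300),
    ("dr_kim", "page_view", 5), ("dr_kim", "page_view", 4),
    ("dr_kim", "quiz_submit", 120),
]

time_spent = defaultdict(int)
quiz_attempts = defaultdict(int)
for learner, action, seconds in events:
    time_spent[learner] += seconds
    if action == "quiz_submit":
        quiz_attempts[learner] += 1

for learner in time_spent:
    print(learner, "total_seconds:", time_spent[learner],
          "quiz_attempts:", quiz_attempts[learner])

# Every field above describes interaction with the course pages.
# Nothing here can say whether behavior toward colleagues changed
# after the 90% passing score was logged.
```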

At SUNY Empire State, I took a series of adult learning graduate courses that were poorly developed. The professor’s anti-Christian culture clashed with that of the majority of students, who were working in Christian learning centers. This created much conflict that stayed hidden from the faculty. You would think faculty familiar with Malcolm Knowles’s work would know better. The culture clash was so intimidating that webinars were often completely silent during professor questions. Students offered minimal, highly filtered responses to blogs out of fear that they might receive low grades or have their theses rejected.

Variances in the cultural views of the instructor and the experiential frameworks of the students will yield different results each time a class is run with the identical course design. In K-12, two different third-grade classes in the same school, using the identical Pearson Common Core curriculum, will show great performance variance between them. One class can be easy to manage and another a class management nightmare. Personalities, emotional makeups and even the male/female ratio greatly transform the global and personal experience of the lesson. These are highly complex variances. I have taught similar content in both high-entitlement and poverty settings, and it always yielded dissimilar results. Two teachers using the same Common Core history textbook will have completely different outputs if one teacher is a Republican and the other a Democrat. The goal of achieving a Common Core in an uncommon world is delusional at best. Our challenge for analytics is to grasp the deeper human experience. Sometimes no link or response at all indicates that a deeper form of transformation, or defiance, is taking place. In other words, deep learning can happen when nothing is showing on the surface. Deep reflection may not create any visible link to track.

We would like to believe we can dissect culture, values, and personal experience out of the instructional design experience for predictable control-group outcomes; however, these beliefs can lead to delusion or disinformation in the research. The downside is that we can fool ourselves into believing real transformation is taking place simply by designing in more collaborative interaction. The underlying truth is that we may be forcing online collaboration at a mechanical level that, to the student, is simply irrelevant, boring or forced-compliance instruction. Perhaps the greatest learning experiences in school are exceedingly messy, emotionally challenging and unpredictable, making measurement and grading extremely difficult.

Until we can honestly grasp the cultural, emotional and experiential framework of each individual, and its distributed impact on the lesson, the analytics of instructional design may be more like pressing a wine glass against a door of Grand Central Station during rush hour and declaring, “A lot is going on here.” The truth is, many lonely people are sitting in boxcars, living lives that do not make much sense.

Learning web analytics can greatly enhance the faculty’s understanding of “what” is going on (or not going on) in their online classes. We just have to dig deeper into “why” it is happening at the student’s experiential level.

