Monday, July 16, 2012

Gagne's 9 Instructional Events

Before any "instruction" can happen, first you must 1. Decide on outcomes, 2. Decide on performance objectives (i.e., measurable assessments), and 3. Decide on curriculum sequencing. Only then do you begin diving into designing your actual instruction.

There are virtually no limits on what may or may not be classified as instruction, which is kind of crazy. But essentially instruction boils down to one thing: "COMMUNICATION", i.e., helping the student move "from one state of mind to another". When you realize that communication can happen almost anywhere, anytime, and in any random setting, you realize that instructional design has a very broad subject matter indeed. By this criterion, even going grocery shopping can be an instructional experience, as you learn the average prices of various goods. Importantly, good communication doesn't necessarily involve a large number of words. In fact, in many instances irrelevant speech may detract from learning.

Now to discuss Gagne's view of "instruction".

In Gagne's mind, there exists in the human brain an essentially linear set of processes which for him constitutes learning, i.e., the information-processing model. In short: 1) the learner is exposed to some sensory input; 2) some of this information ends up in short-term memory (working memory), where it can be rehearsed and recombined with other bits of information; 3) some of it ends up in long-term memory once it has been sufficiently encoded and supported with linking schema; 4) when so desired, this information is pulled out of long-term memory and put back into short-term memory, where it can be processed; 5) a "response generator" inside the brain takes this information floating around in short-term memory and makes the appropriate response; and 6) finally, some sort of behavior is manifested. In Gagne's view, if any of these processes is short-circuited or cut off prematurely, real learning cannot occur.
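Gagne's linear model can be caricatured as a simple pipeline. Here is a toy Python sketch; the stage names and data are my own invention, purely for illustration, and this is obviously not a cognitive simulation:

```python
# Toy sketch of Gagne's information-processing model as a linear pipeline.
# Skipping any stage (Gagne's "short circuit") prevents the final response.

def sensory_register(stimulus):
    # 1) raw sensory input enters the system
    return {"percept": stimulus}

def short_term_memory(state):
    # 2) a subset is held in working memory for rehearsal
    state["rehearsed"] = True
    return state

def encode_to_long_term(state):
    # 3) only rehearsed material gets encoded and linked into schema
    state["encoded"] = state.get("rehearsed", False)
    return state

def retrieve(state):
    # 4) retrieval pulls encoded material back into working memory
    state["retrieved"] = state.get("encoded", False)
    return state

def response_generator(state):
    # 5) + 6) an appropriate response/behavior is produced, or not
    return "response" if state.get("retrieved") else "no learning"

stages = [sensory_register, short_term_memory, encode_to_long_term, retrieve]

def learn(stimulus, skip=None):
    state = stimulus
    for stage in stages:
        if stage is skip:
            continue  # cut a stage off prematurely
        state = stage(state)
    return response_generator(state)

print(learn("blue tongue"))                          # -> "response"
print(learn("blue tongue", skip=short_term_memory))  # -> "no learning"
```

The point of the sketch is just the linearity: each stage depends on the output of the one before it, so dropping any middle stage zeroes out everything downstream.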

Therefore the whole question for Gagne becomes how to ensure that none of these crucial brain processes is ignored when designing instruction. He proposes "9 events of instruction" that an instructor can explicitly bear in mind when designing an instructional experience, which will go a long way towards ensuring none of these important steps is missed. Although I'm not sure I agree with this whole premise, I see a fair amount of value in using these "9 events" to guide instructional design.

1. GAIN THE STUDENT'S ATTENTION!!! Did I get yours? This may go without saying, but if the learner is not attending to the words you are saying or the actions you are performing, then learning cannot occur. It is important that the attention-grabber be somewhat related to what you are going to teach, or else the student will soon realize he's been bamboozled and check out. Witness the merely temporary benefit that gummy bears give to unmotivated 5th graders.

2. Inform the learner of the objective. This allows the student to filter the torrent of incoming stimuli and focus only on the aspects relevant to the objective at hand. I was particularly intrigued by Gagne's reference to "goal schemas", which, as I gathered, refers to making hierarchies of skills and sub-skills explicit in learners' minds so they know how to situate the current instruction in the context of the ultimate objective. As a lifelong nerd, this makes me think of games like Diablo 2, where there is an explicit skill tree from the very start of the game. You can see the uber-powerful strike of death awaiting you at level 20 if only you gain the prerequisite skills by that time, but it's a long process.
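The goal-schema idea, like the Diablo 2 skill tree, amounts to an explicit prerequisite graph. A minimal Python sketch, with skill names made up to echo the game analogy:

```python
# A hypothetical skill tree: each skill lists its prerequisites,
# making the hierarchy of skills and sub-skills explicit to the learner.
skill_tree = {
    "basic_strike": [],
    "double_strike": ["basic_strike"],
    "strike_of_death": ["double_strike", "basic_strike"],
}

def can_learn(skill, mastered, tree=skill_tree):
    """A learner may attempt a skill only once every prerequisite is mastered."""
    return all(prereq in mastered for prereq in tree[skill])

print(can_learn("strike_of_death", {"basic_strike"}))                   # False
print(can_learn("strike_of_death", {"basic_strike", "double_strike"}))  # True
```

Making this graph visible from the start is what lets a learner situate today's lesson within the ultimate objective.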

3. Stimulating recall of prerequisite learned capabilities. The sad fact is that all learners except the self-motivated usually won't subject themselves to the inconvenience of having to remember old information they already have and relate the current information to it. It's just a lot of work. From what I've read about schema theory (not discussed in this paper), though, this process of relating bits of information to other bits of information is the only way that learning can be permanent. If the student puts the new information in its own isolated silo in the brain, that information will be effectively useless. Given this, the instructor must help the student begin to remember connected pieces of information so that they can begin forming this dense schema.

4. Presenting the stimulus material, i.e., the situations or visuals that you hope the learner will respond to differently by the end of your instruction. These stimuli are preferably specific rather than vague. In order to create as wide a range of response-provoking stimuli as possible, Gagne also discusses the importance of using a "variety of examples". After having seen the stimulus in its various guises, the learner is better able to recognize the core of the stimulus that should evoke the appropriate response.

5. Provide scaffolding to allow the learner to make the appropriate logical connections in his own mind. Instructors can only produce a climate conducive to the formation of these logical connections; they cannot force them to occur. Building a nice scaffolding to help the learner jump from one logical place to the next without being explicitly told where to go is one way in which this conducive environment can be developed. Gagne, however, is vague on the ideal degree of heavy-handedness in this scaffolding process. As a rule of thumb, he suggests encouraging more advanced learners to take increasingly large logical leaps, while novices should be provided with fairly step-by-step guidance. There are also other rules of thumb with regard to different learner characteristics (curious vs. practical) and learning domains (creative writing vs. vocabulary building).

6. Elicit the performance. Using the same stimuli you used to teach the learner, provide the learner with an opportunity to make the connection all the way from stimulus to correct response, without pointing out any of the intervening steps for him.

7. Provide feedback. If the student responded appropriately to the stimulus, give him two hearty pats on the back (figuratively), whereas if the student responded incorrectly, he needs to be informed of the discrepancy and perhaps provided with additional instruction. Depending on the domain, the nature of this feedback may vary considerably: verbal or non-verbal, explicit or tacit, and so on. This step of the process is crucial if correct stimulus-response patterns are to be built up in memory. It reminds me of the immortal saying I once heard from my 7th grade wrestling coach and have treasured ever since: "Only perfect practice makes perfect."

8. Assessing performance. As far as I could tell, this step primarily refers to analyzing the validity and reliability of your assessment mechanisms. A good instructor wants to be sure there are no large gaps in the student's knowledge and that the stimulus-response encoding will be robust to a wide variety of appropriate cues. Reliability increases as n (the number of assessment items) increases, while validity increases to the extent the instructor believes the assessment mechanism accurately captures an authentic stimulus-response opportunity.
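The claim that reliability increases with n can be illustrated with a quick simulation: treat each test item as a noisy observation of a student's true ability and watch the spread of scores shrink as the test gets longer. A sketch under an assumed simple additive-noise model of test items (numbers are illustrative):

```python
import random
import statistics

random.seed(0)

def assess(true_ability, n_items):
    # each test item is a noisy observation of the student's true ability
    item_scores = [true_ability + random.gauss(0, 1.0) for _ in range(n_items)]
    return statistics.mean(item_scores)

def spread_of_estimates(n_items, trials=500):
    # how much does the test result vary across repeated administrations?
    return statistics.stdev(assess(5.0, n_items) for _ in range(trials))

short_test = spread_of_estimates(n_items=4)
long_test = spread_of_estimates(n_items=64)
print(short_test > long_test)  # the longer test gives more consistent scores
```

Under this model the spread falls roughly with the square root of n, which is the statistical content behind the reliability rule of thumb.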

9. Enhancing retention and transfer. To me, this stage essentially comes down to the concept of "fluency building". It's one thing to know a concept and be able to spit out a correct answer after several hours of laborious thought, but it's an entirely different thing to get the stimulus-response pattern encoded so completely in your brain that it takes only seconds to produce the correct response. This comes down to lots and lots and lots of practice in a wide variety of relevant situations. For example, a typical learner "knows" his times tables long before he is able to answer them in an appropriate amount of time.

So this is Gagne in a nutshell. Interesting points; it seems fairly valid to me. I'm skeptical of his framework as gospel truth, but it's hard for me to imagine any sort of effective instruction that doesn't include most or all of these instructional events. Going down this list explicitly is probably a good place for an instructional designer to begin. I have few reservations; now I only need to complete events 6 through 9 with regard to learning the concept! :)


For my related article I selected Marcy Driscoll's highly cited summary of Gagne's complete instructional approach in Psychology of Learning for Instruction (pdf here). It provides context for Gagne's approach that you don't get in this piece, making it a good complement.






Wednesday, July 11, 2012

Badges and Analysis - My Meandering Thoughts

As an idea with potential, badges have a lot going for them. However, it's going to be a pretty incremental process, and frankly I think many of the models pronounced by proponents of badging systems will fall by the wayside as impractical. So here are the things I see as the most promising developments:

  1. Badges for gamification purposes. People in general are bad at self-regulating their education; game-like systems, including badges, give students clear goals to strive for and clear benchmarks for success. Furthermore, if there's an element of social prestige attached to them, that only sweetens the deal. Badges have an almost magnetic appeal to most children. Case in point: as a kid I remember playing a certain video game over and over just to obtain a badge, even though I had long been able to move on to a different mission. Putting systems like this in place will produce awesome results.
  2. Badges as certifications for "hard" skills like programming and science, but probably not for soft skills such as writing or business. For these soft skills there are too many intangibles and immeasurables in play to make a digital test worth very much. I imagine that in time some particular badges for these soft skills could gain a brand similar to that held by successful universities, but it would take a very long time for such brands to develop. Probably more promising in the soft-skill domains are rigorous tournament-like processes where there can only be one winner.
  3. Badges as scaffolding for self-directed and self-motivated students. A well-designed badge can provide a framework for an individual passionate about some particular interest to guide their education. Take me, for instance. I'm trying to learn the guitar. When I practice I typically just pick it up and begin strumming, usually without any real purpose. A badge could let me know what to focus on and when, i.e., you need to be able to accomplish such and such a task while strumming before you should move on to such and such other task. This would be tremendously helpful. Such scaffolding could also be useful for self-motivated learners in domains traditionally taught inside universities, such as economics, science, and programming.
However, there are several things I believe badges probably won't be able to accomplish, despite the hopeful optimism of people like those at the Mozilla Foundation.
  1. Digital badge aggregation as a quasi-substitute for monolithic university degrees. Here's the deal: employers want to gain as much information about potential employees with the least effort possible. That's why prestigious schools with good brands are so valuable; they inform the world with some degree of assurance that you've got what employers are looking for. The employers already know about the institution, so they don't need to do any extra research, and the signal is fairly reliable. Contrast that with digital badge aggregation. The aggregating process requires more time from an employer's perspective than a simple examination of the "education" section on a resume. Moreover, chances are that the employers will have no clue whether the particular badges you have obtained are actually worth anything. Once again, finding out would require more time on their part. If certain badges or certain badge-granting institutions find a way to generate trusted and reliable brands, then this could be overcome, but it would certainly be a slow process.
  2. Badges as a scaffolding for learning. For any given assessment, badges provide wonderful clarity and scaffolding for students to base their learning decisions on. However, over the long term of their education I imagine that most students would flounder, looking this way and that for badges that seem to fit, but at the end of the day the badges wouldn't fit together in any sort of meaningful pattern. Or worse, students could spend countless hours obtaining badges they later discover to be utterly worthless. Humans are looking for a good return on the time and money they spend on education, with a minimum amount of risk. A mishmashed hodgepodge of digital badges scattered across the internet isn't going to provide this in almost all cases. So students will continue to go to monolithic degree-granting institutions, especially as long as the government keeps footing much of the bill for such a large portion of degree-seeking students.
So those are my thoughts on badges. Turning now to pre-instruction analysis...

The two papers discuss two separate elements of pre-instruction analysis. Broadly speaking, the piece by Dick et al. exhorts you to fully consider the external motivating need for the instruction, while the piece by Smith et al. discusses the need to properly account for the characteristics of your learners. These seem like two logical places to start before diving into creating instruction, although I imagine most teachers do this tacitly rather than in the explicit manner used in these papers.

Here's the 6-step process I gleaned from the external analysis and learning outcomes piece.

0. First decide if this is even an instructional issue. Some issues are technological in nature, and attempts to change behavior through instruction are simply a fool's errand.
1. Decide learning outcomes. Make them as fleshed out and as grounded in reality and real needs as possible. Specific is usually better, as long as it doesn't come at the cost of omitting important skills. Don't leave an outcome vague, such as "be better at economic analysis"; specify the specific skills that will enable students to be better at economic analysis.
            -They describe this process as "de-fuzzification" (in its academic parlance) and cite two case studies where they go through it. Watching them start from a broad goal and narrow it down to specific criteria and measurable behaviors made it readily apparent to me how much easier the instruction would be to design.
2. Gather the content. What things will students need to know to accomplish the learning objectives? It's really nice if you're already an expert in the subject, but if need be you can enlist the aid of a subject matter expert.
3. Create the instruction. 
4. Design the assessments based on the learning outcomes and your instruction. 
5. Continually test and revise the learning materials with real students.
6. Do all of the above in a reasonable amount of time. Make sure you haven't bitten off more than you can chew. 

This process made a lot of sense to me, although I wonder if the explicitness of the description given in the paper is fully necessary. I also wish they had provided examples of how the resulting instruction is far better than it would have been if the instructional designers had simply dived into the design process without a thorough analysis. I guess it comes down to the authors' tone for me: they treat these steps as gospel truth, but a) I've never even heard of them before, and b) they provide little support beyond their own word. Anyhow, these are just questions to think about.

The piece on learner characteristics essentially boiled down to analyzing four categories of the student's life and personality.
1. Cognitive - What do they already know and how capable are they intellectually? Is there a high variance in this regard among the students?
2. Physiological characteristics - Do they have health limitations? (I can imagine this question being particularly important for special educators.) 
3. Affective characteristics - How self-motivated are the students? What is their purpose in taking this class? How tired or energetic do they feel?
4. Social characteristics - How well do the students interact with each other?

Each of these four characteristics, once analyzed, leads to some fairly straightforward design implications. If there's wide variance in the students' cognitive ability, consider how to break the class into subgroups. If they have low motivation, consider using external rewards. And so on. While the authors list many implications, they barely scratch the surface in their paper, I suppose leaving the rest to the instructional designer's own ingenuity.

While all teachers implicitly conduct these analyses, I can see a fair amount of benefit to be had from approaching them systematically and explicitly. I suspect, however, that many of the teachers most skilled at conducting these types of analyses probably don't think of them in this explicit and systematic manner.

Tuesday, July 10, 2012

My Insightful Thoughts on Learning Analytics

I feel like I don't have a lot to say about Learning Analytics even though I find the topic fascinating. Maybe it's because most of the articles talked about Learning Analytics from a high-level view without really getting into the nitty-gritty of what learning analytics engines have already accomplished, how, and what we hope they might eventually be able to do. In short, a lot was said about something; I'm just not sure what exactly. I liked it, however, and I'm excited to learn more of the details.

In the excellent LAK 2012 lecture by George Siemens (which I found the most interesting of all the materials), he describes the three main levels of learning analytics research and development:
  1. Networks and social media analysis
  2. Learning Analytics and Data Mining
  3. The (digital) future of learning and learning institutions. 
I think his discussion of learning analytics and data mining is the most related to what we hope to accomplish in the class we will be designing. According to Siemens's framework, there are three major subdivisions within Learning Analytics and Data Mining itself: classroom-level data for helping teachers make better interventions, student-level learning analytics to provide on-the-spot guided instruction, and institution-level data analysis and analytics.

Furthermore, the tools employed in each of these subdivisions can be roughly divided into several categories as well, although all of them are more or less intertwined in one way or another:
  1. Data mining (looking for new and useful patterns not already known)
  2. Predictive modelling (allowing for timely interventions)
  3. Visualization (tools to help with the analysis of the data)
  4. Social network analysis (it was never entirely clear how such analyses were going to potentially improve student performance)
  5. Customizable dashboards allowing for personalized data analysis.
  6. Intelligent curriculum and recommender systems.
What I find interesting is that each of these tools is created using an almost entirely different methodology. We have Bayesian analysis in one, good ol' software programming of basic visualizations in another, and in yet another a clustering algorithm from computer science.
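To make the clustering case concrete, here's a minimal 1-D k-means sketch of the sort of algorithm a learning-analytics engine might use to group students, say by weekly time-on-task. The data and scenario are invented purely for illustration:

```python
# Toy dataset: minutes per week each student spends in the course system.
minutes_per_week = [5, 8, 7, 6, 120, 130, 118, 125]

def kmeans_1d(data, iters=10):
    """A bare-bones two-cluster k-means on 1-D data.

    Initializes the two centers at the min and max, then alternates
    between assigning points to the nearest center and re-averaging.
    Assumes neither cluster ever empties (fine for this toy data).
    """
    lo, hi = min(data), max(data)
    for _ in range(iters):
        near_lo = [x for x in data if abs(x - lo) <= abs(x - hi)]
        near_hi = [x for x in data if abs(x - lo) > abs(x - hi)]
        lo = sum(near_lo) / len(near_lo)
        hi = sum(near_hi) / len(near_hi)
    return lo, hi

low_engagement, high_engagement = kmeans_1d(minutes_per_week)
print(low_engagement, high_engagement)  # centers of the two engagement clusters
```

A real engine would of course use many variables and a library implementation, but the core idea of letting the data partition students into groups a teacher can act on is exactly this simple.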

Moreover, these learning analytic tools are to be used for completely separate purposes and for completely separate people, ranging from student-centric analysis all the way up to aggregate data analysis for high level administrators. In some respects, sometimes it seems as if the only similarity between large domains of research classified as "learning analytics" is the use of computers and some usage of databases.

I cannot end my discussion of learning analytics without expressing my healthy skepticism towards much of the work and effort that has gone into the field. Have you ever used Amazon's recommendation engine? Occasionally it surfaces something interesting, but very rarely. As an economics student and erstwhile statistician who's designed some data mining algorithms of his own, I've also come to realize that finding truly useful variables among the vast array of possible data you could collect, or choosing a truly effective means of analyzing that data, is a very imprecise process. Sad to say, the data does not naturally want to sing. Moreover, all learning analytics are implemented through algorithms. By their very nature, algorithms are a fixed set of instructions for processing inputs to produce an output. They don't change or adapt, so for the foreseeable future we're still going to need humans.

-------------------------------------------

I have to add a little on the bit from Creative Commons, which was a good refresher on stuff I learned last year. I had never learned about the remixing rules, and quite honestly they saddened me. In some ways, if it's not public domain or CC-BY, I think it may not be worth your time to use the material. The author protests that regular copyright law has an even larger set of rules, and he is undoubtedly correct, but in that case people aren't EXPECTING to remix the material. Still, what Creative Commons has going already seems like an excellent step in the right direction towards a greatly diminished copyright future.

Wednesday, July 4, 2012

Brandon Sanderson Creative Writing 2012

So for my MOOC, I chose a class that I've been helping organize this past summer: Brandon Sanderson Creative Writing. Here's the description from my website:

There’s a group of us who are going to be semi-hardcore and try our best to follow along with the lectures as they come out, submit 1000 words weekly, and finish writing 50,000 words by the end of the summer. If this sounds like you then this is your place.
You can post without being a member but if you want to become a member, there’s a box somewhere that lets you apply for memberships. This board exists primarily for weekly submissions and craft related questions, not random discussions of random things. But please feel free to peruse, we’ve got a great community going. 
Just so new members know though, about 60 of the members who joined before May 20 have been assigned into writing subgroups. Sadly, we will not be creating any more subgroups this summer. Sorry. But you can still join the main forum!
Let there be dragons—
So essentially there are three components to this MOOC

  1. The lecture videos:  http://www.writeaboutdragons.com/home/brandon_w2012/
  2. The online forum:   https://groups.google.com/forum/#!forum/brandon-sanderson-creative-writing-online-2012
  3. The writing subgroups. These writing groups are simply a collection of blogs that the writers post to weekly. Currently we have around 6 active groups with some 50 or so people, and roughly 20 of those have been posting about every week. So it's relatively small scale, but still cool.
So I'm not quite sure if my little class qualifies as a Massive Open Online Course, but it's certainly at least an open online class. I'm excited to hear some of your ideas on where I should go next with this, so please let me know!

Tuesday, July 3, 2012

The Brave New World of the Inauspiciously Named MOOCs

First off, my favorite video was this one:




It's classy and informative, and it doesn't miss any key points (or at least misses as few as possible in 3 minutes) about what OER is all about.

Now to the articles. "Open Education Resources" by Wiley et al. does a great job of summarizing OER. First, they explain the relative ambiguity of the term "open" and go on to discuss its various incarnations, mostly hinging on the degree of "openness" a resource can be said to exhibit. The main criteria across several researchers appeared to be 1. freedom to access the materials, and 2. freedom to do what you will with the resource, such as revising, remixing, or redistributing the materials.

To some extent, these questions are determined by the mode of production of the OER, which ranges from extremely loosely organized volunteer "peer production", such as Wikipedia, all the way to prestigious institutions such as MIT spending millions of dollars every year on carefully managed OER outreach initiatives.


The next big thing discussed is the set of problems currently facing the OER community. This is the part I found most interesting.


1. The discovery problem. Sure, tons of great resources may exist out there, but the cost of any resource is not just the monetary price you pay for it; it's the cost in time and effort as well. OER has relatively little marketing clout (since content creators don't directly receive a benefit for each extra OER distributed). Moreover, it is my opinion that a large quantity of low-quality OER material is out there, making it even more difficult to sift through the morass of information. Also, when people type keywords into Google for some content area they want information about, they're typically not looking for a 4-month class on the subject, further exacerbating the problem. The places that are into the whole 4-month-class thing, on the other hand, have little incentive to begin using these resources, since by their very design many OERs are currently more a substitute than a complement for instructors.


2. The Sustainability Problem. Who the heck is going to pay for all this awesome, and free, content? Chances are it's not going to be the content consumer, because most people, if given the option between free and not free, will choose the free option every time as long as the quality is somewhat decent. Although some effort has been made in discovering new open business models, this is still a huge question to be decided. The millions of dollars MIT spends on its open education resources each year highlights this issue.

3. The Quality Problem. It boils down to this: would you rather watch a commercial produced for the Super Bowl or a commercial produced for the 3 a.m. cooking channel? If you're a sane individual, you would choose the Super Bowl. The point is that the amount of resources poured into producing something is directly related to the quality of that something. However, the amount of money poured into something depends crucially on the return on investment and the potential size of the gains. OER suffers in both respects from the perspective of most content producers.

4. The Localization Problem (or recontextualization of content)
OER is in its infancy, as are "learning objects". Knowledge and instructional materials have yet to become an interchangeable commodity in the way that memory sticks in the camera industry are. Each resource is typically optimized for a particular purpose, and unless a content producer shares that purpose, chances are it may actually be less work to create the resource from scratch than to spend time and effort patching together an inferior solution from existing materials.

5. The Remix Problem. This isn't so much a problem as it is a disappointment to the chief proponents of OER. Here we have these free educational resources with no restrictive copyrights. It seems logical that we should be able to do some awesome remixing, creating a patchwork out of these various pieces to make an awesome instructional resource. Sadly, such is not always the case. This issue hinges on the localization and discovery problems: if those costs are prohibitively high, it doesn't make any sense to remix content (except for the occasional copy and paste).


Online Self Organizing Systems
This article could be summarized in one question and answer. Q: Who's in charge of Wikipedia? A: Essentially no one. Despite this, Wikipedia manages to produce material of quality at least comparable to the Encyclopedia Britannica.

One of the central problems of online education is the "teacher-bandwidth" problem. Although raw content now has almost infinitely scalable distribution, student-teacher relationships still exhibit the same limitations as in the pre-digital era. Two main solutions have been proposed to solve this problem:

  1. Automated "smart" learning systems, which adaptively respond to learners' needs.
  2. Students supporting students
This paper mostly deals with the second option. By leveraging the internet, students can collaborate on a scale never before seen. Although any one given student may be quite unable to help you with your particular learning problem (speaking as a student), it is quite likely that in a class of 10,000, some student out there can.

The trick then is to put the right conditions and software infrastructure in place such that these type of useful and mutually beneficial interactions can naturally arise. Several solutions have been developed: 
  • Meta moderation, such as seen at Slashdot.org
  • Blog like ecosystems of learners collaboratively building and critiquing. 
  • Q&A forums, such as Stack Overflow, which rely on either altruism or a skillful design that appeals to the natural competitive instinct.
  • Let the people figure out how to use the commons. We don't need no instructional designers to tell us how to do it. 
  • Collaborative problem solving
  • Scenario based instruction

The main takeaway of the article is this: maybe the brave new era of instructional design isn't about figuring out how to make the perfect textbook or college course. Maybe it's about knowing how to create these scalable, learning-heavy educational environments.


MOOC Guide

Monday, July 2, 2012

Why Constructivism=Fails (According to Kirschner et al.)

The very long-titled article "Why Minimal Guidance During Instruction Does Not Work: An Analysis of the Failure of Constructivist, Discovery, Problem-Based, Experiential, and Inquiry-Based Teaching" raised a multitude of interesting points and provoked several (interesting) thoughts.

First, "experts use schema based pattern recognition". Learning is all about getting a series of if-then statements hard coded into long term memory. Problem based learning requires learners to store too many things at once in working memory, making it difficult for them to build the necessary schemas in memory. In other words, constructivist methods tend to bite off too much at a time, and to masticate on it too little. For example, take discovery learning methods and medical residents. A problem based approach instructional approach would lay all the data out on the table for the resident: "She has a blue tongue, oddly speckled saliva, and non-focusing eyes. How would you proceed?" The student is then required to jump into an extremely complex inductive/deductive process: remembering the symptons, multiple competing intermediate hypotheses on diagnosis, and so on. All this combines to tax the resident's working memory to the limit, thus retarding schema formation. Direct instructional methods on the other hand take more of the approach: "This is a patient with a blue tongue, speckled saliva, etc. In almost every case like this, Egyptian scurvy is the diagnosis. Here are a variety of photos showing different manifestations of this disease. In some cases, Nairobian scurvy may in fact be the culprit however. In such cases when you are in doubt, you should perform test X on their saliva." After these instructions are repeated and practiced sufficiently, so goes the hope, the student will automatically perform the same thought process, perhaps with a little modification. 

There is a huge difference between these approaches. On the one hand, the direct approach believes that a "best practice" for a given situation does exist, and that the purpose of instruction is to lock that best practice into students' minds. The constructivist method makes no such assumption; although minimal scaffolding may be provided, the student is expected to figure out on his own what is essentially the best way to proceed when confronted with a blue tongue, speckled saliva, etc.

Which approach is more effective? Direct. Hands down. Almost every rigorously measured head-to-head matchup between direct and constructivist methods has resulted in a clear win for the direct approach. Psychological experiments have pointed towards the centrality of long-term memory in expertise rather than any kind of "faster processing speed" (e.g., Grand Master chess players were no better than less skilled players at recalling briefly viewed chessboards that would never be encountered in real play, but this finding reversed when the patterns were changed to chessboards commonly encountered in real play). Cognitive load theory further suggests that discovery-based learning methods may in fact retard the formation of these detailed mental schemas, since so much processing power is required to juggle multiple bits of information concurrently in working memory.

Although a little arrogant in tone, Kirschner et al. make a strong case. They do cut constructivist methods some slack in places (for example, constructivist methods have been shown to be at least as effective as direct methods when learners are at an advanced stage of domain proficiency), but by and large the paper has nothing positive to say about constructivism. This lack of counter-evidence is worrisome, suggesting either that the authors did not do their homework or that they are ideologically committed to direct methods, blinding them to any of constructivism's strong points. Still, I'm inclined to trust most of their assertions, both because of the professionalism of their tone and because of things I have experienced in my own life.