We have talked many times about the importance of assessment tests in learning, especially in e-learning. But is it better to put them on the platform or inside the SCORMs?
In an educational context that lacks the physical presence of an expert who can assess how well the content has been understood and how much knowledge has been acquired, quizzes become a fundamental replacement tool.
Assessment tests in e-learning
More than that: in e-learning, tests answer not only strictly didactic needs (facilitating and/or evaluating each user's learning) but also statistical ones, supporting conclusions about the validity and clarity of the content, or the degree of appreciation and engagement, and enabling virtuous, rewarding mechanisms such as gamification.
Listen to the expert
For those who work in e-learning and, above all, for those who actually implement it, the potential and possible applications of this tool tend to be quite clear. In particular, it is clear that the most suitable solution among the many depends (as usual?) on the need to be met.
But this step is not at all obvious.
What seems obvious to the expert may appear nebulous to a less accustomed interlocutor or, perhaps even more often, equally obvious but in the opposite sense!
It is natural for those who see only their own reality every day to have a single point of view in mind but, however legitimate, that perspective risks being at least a little limiting.
SCORM or platform?
For example, one rather widespread constant, which we would like to try to clarify here, seems to be the (quite understandable) difficulty in distinguishing between assessment tests integrated into the platform hosting the course and, conversely, quizzes built into the SCORM objects that make up the course itself.
Not just sexy. Maybe
To the inexperienced eye, in fact, little seems to change except the appearance, which is certainly more captivating in the case of the SCORM, because it is extremely customizable.
But can the appeal alone tip the scales?
If an absolute score is all I need from the assessment test (which is what a SCORM test gives: the user has either passed the quiz or not, and has therefore either completed the course or not), and I am not interested in a more detailed report (for example, how many users out of the total selected which answer options for a given question), then yes: appeal can be the deciding factor, and I do well to opt for the SCORM, if only for the sake of aesthetic coherence.
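To make the "absolute score" model concrete, here is a minimal sketch of the calls a SCORM 1.2 package typically makes to report a quiz result. In a real course the LMS injects the runtime API as `window.API`; the mock object below is ours, added only so the sketch runs standalone, and the passing threshold and score are hypothetical values.

```javascript
// Minimal mock of the LMS-provided SCORM 1.2 runtime API object.
// A real LMS exposes this as window.API; this mock just stores the
// values so the example is self-contained.
const API = {
  data: {},
  LMSInitialize() { return "true"; },
  LMSSetValue(key, value) { this.data[key] = String(value); return "true"; },
  LMSGetValue(key) { return this.data[key] ?? ""; },
  LMSCommit() { return "true"; },
  LMSFinish() { return "true"; },
};

// What a SCORM quiz typically reports back to the platform: one
// absolute score plus a pass/fail status -- nothing about which
// answers each user actually picked.
function reportResult(score, passingScore) {
  API.LMSInitialize("");
  API.LMSSetValue("cmi.core.score.raw", score);
  API.LMSSetValue("cmi.core.score.min", 0);
  API.LMSSetValue("cmi.core.score.max", 100);
  API.LMSSetValue("cmi.core.lesson_status",
                  score >= passingScore ? "passed" : "failed");
  API.LMSCommit("");
  API.LMSFinish("");
}

reportResult(85, 60); // hypothetical: user scored 85, pass mark is 60
console.log(API.data["cmi.core.lesson_status"]); // "passed"
```

From the platform's point of view, this is the whole conversation: a score and a status per user, which is exactly why a SCORM quiz alone cannot feed per-question statistics.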
I would also do well to opt for the SCORM if I were a course creator and my single learning object could potentially be loaded onto many different platforms of many different customers: clearly, porting the test would be easier, since everything is included in one package that can be conveniently transferred anywhere.
But what if one day I wanted, for example, to put the same SCORM into another, more complex course, perhaps composed of several didactic objects, or even one without tests? Well, I would necessarily have to create a new, "cleaned" object, repackage it and upload it in the right place.
And what if, more simply, I had to correct a typo, add or remove questions, change answer options, or integrate different question types; in short, MODIFY my quiz? Well, I would have to edit my SCORM source, repackage it and upload it to the right place all over again.
If only for reasons of turnaround time, choosing the SCORM would begin to look more like stubbornness than an optimal response to my practical needs.
Our advice, usually, is to include in SCORMs only small intermediate quizzes (mostly to liven things up and facilitate learning), and not complex final assessment tests, for which we think it is preferable to opt for the platform, basically for practical reasons.
The question is not so much whether one approach is right or wrong, but whether and how much one is worth it (in terms of time and, therefore, money) over the other.
If you've chosen an LMS platform for all the amazing time-saving and error-reducing features it brings with it, does it make sense to misuse it, wasting time and increasing the risk of error?
To get out of the platform what you wanted it for, there is no more effective way than using its integrated functions as much as possible, especially, as mentioned, for reports and statistics: even the most advanced SCORM reports are less easy and immediate to consult than those of the platform's own tests, which were designed for exactly this purpose.
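SCORM 1.2 does let a package push per-question data through the `cmi.interactions` elements, but these are write-only from the content's side, and how (or whether) each LMS turns them into usable reports varies from platform to platform. A hedged sketch of what that recording looks like, again against a mock API object of our own (the question id and answer are hypothetical):

```javascript
// Minimal mock of the SCORM 1.2 API object (normally injected by the
// LMS as window.API); it only stores values so the sketch runs standalone.
const API = {
  data: {},
  LMSSetValue(key, value) { this.data[key] = String(value); return "true"; },
};

// In SCORM 1.2 the cmi.interactions elements are write-only for the
// content: the course can record each answer, but cannot read it back,
// and the reporting side is entirely up to the individual LMS.
function recordInteraction(index, questionId, answer, correct) {
  const base = `cmi.interactions.${index}`;
  API.LMSSetValue(`${base}.id`, questionId);
  API.LMSSetValue(`${base}.type`, "choice");
  API.LMSSetValue(`${base}.student_response`, answer);
  API.LMSSetValue(`${base}.result`, correct ? "correct" : "wrong");
}

// Hypothetical question and answer, for illustration only.
recordInteraction(0, "q1_capital_of_italy", "rome", true);
console.log(API.data["cmi.interactions.0.result"]); // "correct"
```

So even when a SCORM quiz records this level of detail, the data still lands in whatever generic interaction report the LMS happens to offer, rather than in the purpose-built quiz statistics of a platform-native test.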
Not to mention perhaps the most banal aspect of all, which lies behind every consideration on this subject: WHO has to create and upload the test?
Doing it with a SCORM requires not only specific expertise but also dedicated software. On the platform, anyone can do it: teachers, tutors, hapless novices (as long as they are trained) or experts in the best systems (as long as they are available), copy-paste fanatics or manual data-entry enthusiasts, without instructional design skills, without SCORM authoring software, and at any time. Even on the spot during a face-to-face lesson, should that ever help. Why not?
Yes we know, you never thought about it.
But that's what we're here for.