Our SharePoint space is now working. A number of documents and a discussion area are already there, and the code will be added very soon (we’re just removing usernames/passwords now).
If you’d like access to this site please e-mail firstname.lastname@example.org with your details and I’ll send you an invite to register/access the content.
Southampton is just starting its second academic year using the JISC-funded e-Assignment system. This year it will handle the majority of assignments from three Faculties, along with pilot assignments from all the others. We’ll also be linking e-Assignment to a new feature of the University’s student administration system so that final marks can be derived from the non-final marks held in e-Assignment.
Over the summer we’ve had a number of requests for the code and other detailed information about the system. We’ll shortly be adding a link on this site and on www.jisc-ea.soton.ac.uk for registering for access to our SharePoint site. We will be putting a copy of the current code in that repository and will also enable cross-institution discussion on how e-Assignment is (or can be) implemented, as well as on other features that could be developed.
So, another big gap since our last update. We performed several more tests of submission and marking over the summer and created our pre-production and production environments. Unfortunately, we were still lagging behind on functionality for the administration of assignments, so we decided not to ‘go live’ with the system in semester 1.
The Schools of Humanities and Medicine both decided to continue using the new system in order to test and fix issues as they were identified. The School of Health Sciences ‘fell back’ to the old Faculty eAssignment demonstrator for their current assignments.
Since that decision, many advances have been made in the administration of assignments, as well as minor revisions to the marking process. These advances include:
- a dedicated interface for downloading submitted files from more than one assignment at once, with all late submissions visible at a glance.
- submission to TurnItIn (where specified) and retrieval and display of the overall plagiarism score (we are currently working on retrieving additional information).
- date/time stamps for when students start and finish the submission process, so that it’s clear whether a student started before or after the assignment deadline.
We expect these developments to continue and are committed to a formal go-live for semester 2, when other Schools will be invited to test assignments using the system.
So far nearly 300 assignments have been set up in the system, over 4500 student submissions have been made (by comparison, the old demonstrator handled only 9200 submissions across the whole 2009/10 academic year), and 3340 files have been submitted to the TurnItIn plagiarism service.
It’s been a while since the last post, as we’ve been busy preparing for, and conducting, tests on the eAssignment environment.
We have now conducted one successful test of marking with real data (this was in addition to ‘normal’ marking, so it didn’t affect student results). Alongside this we ran a survey asking markers how they found electronic marking and the marking descriptors used (the use of marking descriptors was also a new ‘test’ for this particular assignment). Details of the survey results will appear in a future posting, but in general feedback was very positive.
We had a second test of student submission and marking scheduled for the end of June and the beginning of July (using a group accustomed to the old Faculty eAssignment demonstrator system). However, only 12 students submitted their work using the new system (they had the choice of either the demonstrator or the new system), and due to the use of a slightly unexpected marking method we had to abandon the second test of marking in the system.
The next submission and marking test is scheduled for late August.
In the meantime we’ll be testing and transitioning from our current development environment to a pre-production environment. We also hope to run a number of load tests on both environments to get a clear picture of how many students/markers can be supported at once.
New Captivate videos are now online on the www.jisc-ea.soton.ac.uk web site. Under Resources there are links to instructional videos on student submission and marking. These have been updated with the latest build of the e-Assignment system. Student submission has had a mostly visual make-over, while the marking video shows some of the new features of the system.
The marking workflows as described in previous posts are now under resources on the www.jisc-ea.soton.ac.uk web site.
Also worth mentioning is that the old information about the demonstrator has been moved onto the jisc-ea web site from its former home (also on the Resources page). The next step will be updating the history document, as this will be useful in our upcoming demonstrations in April.
Marking within e-Assignment takes the form of a series of criteria, each of which has a weighting. Scores awarded to criteria are combined with the weightings to create an overall percentage (where necessary this percentage can subsequently be converted to a letter grade). If more ‘old fashioned’ marking is needed, this can of course be achieved by having only one criterion (with scores A, B, C, D, E and F corresponding to appropriate percentages).
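To make the arithmetic concrete, here is a minimal sketch of the weighted-criteria calculation described above. The criterion weightings, marks and grade boundaries are illustrative assumptions, not values taken from the e-Assignment system itself:

```python
# Sketch: combine per-criterion scores with their weightings into an
# overall percentage, then (optionally) map that to a letter grade.
# All numbers here are hypothetical examples.

def overall_percentage(criteria):
    """criteria: list of (score_out_of_100, weighting) pairs."""
    total_weight = sum(w for _, w in criteria)
    return sum(score * w for score, w in criteria) / total_weight

def letter_grade(pct):
    # Hypothetical grade boundaries; real schemes vary.
    for boundary, grade in [(70, "A"), (60, "B"), (50, "C"),
                            (40, "D"), (30, "E")]:
        if pct >= boundary:
            return grade
    return "F"

marks = [(80, 30), (60, 50), (100, 20)]  # (score %, weighting %)
pct = overall_percentage(marks)          # (80*30 + 60*50 + 100*20) / 100 = 74.0
print(pct, letter_grade(pct))            # 74.0 A
```

The same code covers the single-criterion ‘old fashioned’ case: one criterion with weighting 100 simply passes the score straight through.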
In our previous work on e-Assignment handling and marking we initially supported only one kind of scoring: radio buttons. Every criterion for an assignment had the same number of descriptors (and therefore buttons). This was usually deployed with 6 values, so the radio buttons corresponded to 100%, 80%, 60%, 40%, 20% and 0% of the weighting for that criterion.
This is of course very limiting, particularly if you are marking on few criteria. With so few possible marks it is difficult to distinguish between students, particularly if you have a small cohort.
So an extension was developed to allow markers to enter their own value rather than use the radio buttons. This brought its own issues: markers were unclear as to whether this value was out of 100% (which it was) or out of the total weighting for that specific criterion (e.g. whether a 30% weighting meant numbers 0–30 could be entered).
To maintain flexibility and reduce confusion, this project is deploying radio buttons (much as before) and, instead of a textbox, a slider normalised to the weighting of the criterion, with the currently selected value displayed. In addition, each criterion for an assignment may have a different number of descriptors (obviously, where the slider is used, descriptors are less useful). We believe this gives the most flexibility: where descriptors or only a few possible ‘scores’ exist for a criterion, radio buttons can be used, but where more granularity is needed a slider can be used instead.
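The two input styles above can be sketched as follows. This is an illustrative assumption about the mapping, not the system’s actual implementation; it just shows how radio descriptors map to evenly spaced fractions of a weighting, and how a weighting-normalised slider value converts back to a 0–100 score:

```python
# Sketch of the two scoring inputs: radio buttons with N descriptors
# evenly spaced over the criterion's weighting, and a slider that runs
# 0..weighting directly. Illustrative assumption, not the real code.

def radio_values(n_descriptors, weighting):
    """Evenly spaced marks from 100% down to 0% of the weighting."""
    step = 100 / (n_descriptors - 1)
    return [weighting * (100 - i * step) / 100
            for i in range(n_descriptors)]

def slider_to_score(slider_value, weighting):
    """The slider is normalised to the weighting, so convert the
    selected value back to a 0-100 score for the criterion."""
    return 100 * slider_value / weighting

print(radio_values(6, 30))      # [30.0, 24.0, 18.0, 12.0, 6.0, 0.0]
print(slider_to_score(18, 30))  # 60.0
```

This also illustrates the earlier confusion: a marker dragging a slider for a 30%-weighted criterion picks a value between 0 and 30, which the system then treats as 0–100 internally.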
We’d be interested to hear about any other scoring mechanisms used in the past (not just online, of course). Also, as far as I’m aware, we’ve never used non-linear marking (although scoring doesn’t always start at 0% and finish at 100%), so it would be interesting to hear if anyone has done something like that before.
As our development pushes deep into the marking and administration modules we’ve had some discussion about standardising the naming of the different marking methods that the e-Assignment system supports. This is also a common question people ask when first finding out about the system, so here’s a run-down of the different methodologies. If you’d like e-Assignment to support a different method or would like more explanation please leave a comment.
- Single Marking – Simple: one marker submits one marking report per student or student group.
- Averaged (Blind) Marking – More than one marker (not limited to just 2) submits one report per student or student group. When all reports have been submitted for a particular student or group, the average mark is calculated (all marks added together and divided by the number of markers) and feedback comments/files are merged into a single report. This methodology can also be used for peer assessment (to be discussed in a later posting).
- Agreed (Semi-blind) Marking – Two markers submit reports as per Averaged Marking, i.e. blind. Once these two reports are submitted, the two markers meet (or discuss on the phone) to create a single agreed marking report (under one marker’s login) using elements from each other’s reports (or indeed coming up with new marks/comments based on their discussion).
- Approved Marking – One marker submits a report as per Single Marking above. Once submitted, a second marker must review and approve the report (or return it to the first marker for revision).
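The combination step in Averaged (Blind) Marking can be sketched as below. The report structure (marker name, mark, feedback text) is a hypothetical shape chosen for illustration, not the system’s actual data model:

```python
# Sketch of the Averaged (Blind) Marking combination step: once every
# marker's report is in, average the marks and merge the feedback into
# a single report. Illustrative only.

def combine_reports(reports):
    """reports: list of {'marker': str, 'mark': float, 'feedback': str}."""
    avg = sum(r["mark"] for r in reports) / len(reports)
    merged = "\n".join(f"{r['marker']}: {r['feedback']}" for r in reports)
    return {"mark": avg, "feedback": merged}

reports = [
    {"marker": "Marker 1", "mark": 68.0, "feedback": "Strong argument."},
    {"marker": "Marker 2", "mark": 62.0, "feedback": "Cite more sources."},
]
combined = combine_reports(reports)
print(combined["mark"])  # 65.0
```

Because the function takes any number of reports, the same step works for peer assessment with a whole group of student markers.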
We have some flow charts for the above and I’ll put them on the www.jisc-ea.soton.ac.uk web site soon.
Recently I saw a (very new) draft policy that was sent round our institution. This was all about late submission of student work. I won’t go into the details of the proposed policy (technically I’m probably not allowed), but the main thrust was to apply a blanket policy institution wide on how to deal with lateness of submission.
From an e-Assignment point of view this is great. It makes marking and administration simpler because any late penalty is the same whatever the assignment. The system we’re building does allow for any number of penalties that can be applied differently (from School level down to individual assignments) – this allows for other institutions, or for the possibility that ours fails to ratify the new policy – but having one policy does simplify things greatly.
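A layered penalty configuration like the one described can be sketched as a most-specific-wins lookup. The structure and the particular penalty figures here are illustrative assumptions, not the system’s actual configuration:

```python
# Sketch of a layered late-penalty lookup: an assignment-level setting
# overrides a School-level one, which overrides the institution-wide
# policy. All names and numbers are hypothetical.

def late_penalty(assignment, school, institution_default):
    """Return the penalty per day late; most specific setting wins."""
    for source in (assignment, school):
        if source.get("penalty_per_day") is not None:
            return source["penalty_per_day"]
    return institution_default

institution_default = 10            # e.g. 10 marks per working day late
school = {"penalty_per_day": None}  # no School-level override
assignment = {"penalty_per_day": 5} # assignment-level override
print(late_penalty(assignment, school, institution_default))  # 5
```

Under a single ratified institution-wide policy, every School and assignment would simply leave its override unset and the default would always apply.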
Maybe there is hope that one day the use of letter grades, percentages and other marking schemes may be used in a more universal manner.
There haven’t been many blog posts as we’ve all been working hard to catch up on lost time and bring the project back on track.
In future I expect we’ll publish blog items as elements get tested or deployed, or when demonstrations have been produced. I am considering creating a Twitter account so that we can keep everyone updated on the more minor aspects of the project (meetings, minor decision making etc).
Next week Trevor (project PI) and I are attending the JISC Institutional Innovation meeting, where hopefully we’ll get more feedback on what people think of e-Assignment generally and how it might benefit their institutions (along with finding some synergy with other institutional projects).