Mike and Yolie ‘attended’ the first online Elluminate session, which introduced the evaluation and synthesis part of the OER programme and raised some interesting questions for future discussion. The E&S team will synthesise the discussion at a later date, but what follows is a brief summary from our point of view, with appropriate links to other blogs/wikis/PowerPoints etc.
The accompanying PowerPoint slides and a live recording of this Elluminate session can be found here: http://www.caledonianacademy.net/spaces/oer/index.php?n=Main.Presentations
The JISC OER ‘Synthesis and Evaluation Function’ aims to build a shared framework for the evaluation and synthesis of the OER pilot programme, so it does what it says on the tin! There is a project wiki: http://www.caledonianacademy.net/spaces/oer/index.php?n=Main.HomePage, which will hold information for programme participants and, hopefully, allow continuing dialogue with the E&S consultants. Each of the three OER strands (individual, subject and institutional) has a specific consultant working on it: the Subject Strand has Helen Beetham as contact, and Mike and I are already in discussion with her, sharing a (very early) draft of notes on possible evaluation strategies (document to follow). The current draft evaluation and synthesis framework can be found here: http://www.caledonianacademy.net/spaces/oer/index.php?n=Main.GenericFramework.
The first Elluminate discussion session covered the following topics: marketing and branding; possible benefits and support for staff; pedagogy of OER; managing databases and metadata (this will be the topic for the next Elluminate session, on the second Tuesday of August, 2pm); reuse of OER and accompanying quality issues; general E&S queries. Given the nature of Elluminate as a forum, the discussion was bitty and a little disjointed, but some interesting questions emerged that are worth highlighting here for further consideration by the team and partners.
Firstly, however, here is a brief introduction to the E&S framework development:
How should we develop our evaluation strategy?
To allow our evaluation strategy to map to the generic E&S framework, we need to identify our project’s key outcomes, especially in relation to the following: findings, impact, benefits and lessons learned.
We need to begin to define ‘measures of openness’ (not quite sure what this means at this stage!).
The E&S team will try to help us identify appropriate factors, methods, timings and measures that will help us achieve the above: but this relies on us being proactive in seeking advice.
Once we have these methods/timings/measures agreed, it is up to us as a project to apply these through our evaluation processes.
The rationale for working in this way is that such a structure allows for the development of a common language for collation of data, challenges, solutions and outputs, encourages the sharing of questions/issues and allows the identification of key areas of interest and useful approaches for the future.
Problems/issues/questions might fit in to three categories:
1. those that can be answered by JISC/HEA Programme Managers (these tend to apply to all projects), where an answer/solution is already available;
2. those that can be answered by programme support (the OU team) and need expert advice from those already working on OER;
3. those that are issues for evaluation: that is, questions that don’t have a clear answer, but which the programme is investigating. These issues/questions can also be separated into:
- organisational and IPR
- social and cultural
Issues that were raised by Elluminate participants during discussion:
Marketing and Branding
- Our University’s reputation is at stake with this project: how do we approach marketing and branding? Reputational benefits may depend on identifying authorship.
- Do we need to look at business models?
- What is the difference between ‘branding’ and author/institutional acknowledgement?
- Should this just sit in the accompanying metadata?
- Is anyone thinking of using licensing that doesn’t mandate attribution?
- If an OER is branded, the branding should not impede re-use.
- Could we just use a UKOER logo (in addition to UKOER tag)? How about logo alongside originator (project partner logo?) details?
- Will a lack of branding/acknowledgement make it harder to reassure already reticent academic staff?
- Creative Commons licences wouldn’t mandate retention of a UKOER logo.
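On the question of whether acknowledgement could just sit in the accompanying metadata: a simple Dublin Core record is one way this might look. The sketch below is purely illustrative (the title, creator and publisher values are invented), with attribution and a UKOER tag carried in the metadata rather than stamped onto the resource itself:

```xml
<metadata xmlns:dc="http://purl.org/dc/elements/1.1/">
  <!-- Illustrative values only: title/creator/publisher are invented -->
  <dc:title>Introductory Enzyme Kinetics</dc:title>
  <dc:creator>Dr A. N. Author, Example University</dc:creator>
  <dc:publisher>Example University</dc:publisher>
  <!-- Licence carried as a rights statement rather than branding on the resource -->
  <dc:rights>http://creativecommons.org/licenses/by-nc-sa/2.0/uk/</dc:rights>
  <!-- Programme-wide tag for discoverability -->
  <dc:subject>ukoer</dc:subject>
</metadata>
```

Whether this is sufficient acknowledgement, or whether visible branding is also needed, is exactly the open question above.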
Benefits and support for staff
- Links investigating benefits for staff (individuals and institutions): http://ie-repository.jisc.ac.uk/265/ (Good Intentions Report); CETIS OER Briefing Paper
- Who are the stakeholders, and how can we support them and evaluate their experience? Academic and teaching staff, technical staff, learners.
- Do we need to consider evaluation of end-user communities? Who are they/likely to be? What do they need?
- How do we investigate benefits of OER whilst, at the same time, encouraging engagement?
- What motivates enthusiasts? Ask E&S to investigate participants with OpenLearn at the OU- a very enthusiastic and aware group.
- Do we need to look at how discipline cultures work across institutions? How do different disciplines share research, for example? Might this have some effect on how a discipline engages with OER?
- Colleagues at Caledonian University can cite ‘reuse of resources they have authored’ in their application for promotion on the basis of learning and teaching.
- Is anyone doing a formal survey on academics’ preferences for release of material under creative commons?
- Fragmentation of resources makes it difficult for the background pedagogy (good or bad) to be transmitted wholesale to the user.
- do we even want to put such caveats (e.g. ‘must be used in this pedagogic context’) against material?
- OpenSpires thinks that we should not wrap material up with unnecessary contextual material.
- Making resources easier to use overcomes barriers
- For the purposes of this programme, cannot evaluate learning and teaching quality (above and beyond initial accredited material) in terms of reuse, but maybe we can explore changes in practice and perceptions around quality?
- If we are not looking at pedagogic value, then surely this brings the programme’s credibility on sustainability into question?
- the JISC Mosaic Report has a good overview of this issue
Managing Metadata and Databases
- Need advice on subject schemas vs. subject tags?
- Tracking: how do we track across both OpenJorum and other repositories?
- Can track visits using Google Analytics.
- Should not just track the ‘what’ but also the ‘why’
- p.s. from Yolie: I’m happy with metadata discussions, but can’t guarantee that I’ve necessarily got the right words or ‘ideas’ down with respect to e.g. tracking. Far too technical for me!
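On the Google Analytics point above: for anyone curious what tracking visits actually involves, it amounts to embedding a short snippet in each page. The sketch below is the standard ga.js snippet as Google currently documents it, not anything specific to our projects, and the account ID (UA-XXXXXXX-1) is a placeholder:

```html
<!-- Standard Google Analytics (ga.js) page-tracking snippet -->
<script type="text/javascript">
var gaJsHost = (("https:" == document.location.protocol) ? "https://ssl." : "http://www.");
document.write(unescape("%3Cscript src='" + gaJsHost + "google-analytics.com/ga.js' type='text/javascript'%3E%3C/script%3E"));
</script>
<script type="text/javascript">
// 'UA-XXXXXXX-1' is a placeholder account ID, not a real one
var pageTracker = _gat._getTracker("UA-XXXXXXX-1");
pageTracker._trackPageview();
</script>
```

This records the ‘what’ (visits, referrers, pages); the ‘why’ still needs surveys or other qualitative evaluation.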
Reuse and Quality
- Evaluating reuse of OERs may well be out of scope of programme: programme is focussed on creation/repurposing and release of OERs rather than on their reuse
- However, reuse will influence quality of OER, and a couple of the institutional projects will be looking at reuse, so should keep an eye on these.
- However, we may well get hints at reuse, the hows and wherefores, so we should make sure we pick these up as and when they appear.
- Should ‘intention to reuse’ be seen as part of the release? Open release implies re-use and so is part of the change of mind-set
- The programme is starting on the road of ‘building an OER culture into course design and delivery’
- Credibility for an OER project is affected if the actual resources are not valuable for reuse. It is difficult to evaluate the potential of the OERs in context until someone else has tried to embed them. How do we maximise component reuse?
- For Biosciences, each project partner is finding a ‘client’ as a user for their OER, to act as a critical friend, rather than simply relying on the repository and reactions to/use of it.
- Is there a difference between quality in terms of original resource (accredited programmes, therefore quality assured) vs. quality within new context (reuse)? (we think yes to this)
- We have no control over context of reuse?
- Institutional programme blogs can be found at http://www.netvibes.com/hwilliamson#oer-institutional_projects
- If anyone is interested in the Oxford project on Audio Visual enterprise level infrastructure, you can find more information at http://steeple.oucs.ox.ac.uk
- Could projects use Elluminate with their partners? (programme leaders investigating licensing for this)
- In terms of the evaluation framework, could we have a ‘phase 1’ and a ‘phase 2’, so that we can get the project and partners started (evaluating as we go), without necessarily knowing all the questions we want to ask overall?
- Projects must gather evidence throughout lifecycle, but mapping this to the framework may well take a couple of attempts. The E&S team will help with this.
As the project continues, we will pick out particular issues raised here (and at other meetings) and ask partners for their thoughts.