Archi meets Business Model Canvas

One of the key aims of the Coeducate project is to develop tools that can support staff in the development of new courses through the Validation Process.

One approach that we identified some time ago is the Business Model Canvas (BMC), which, as its name suggests, is designed to support thinking around business models, something that we are not particularly good at when we develop new programmes.

We have done some work with colleagues using the BMC (see earlier post) and through this identified the need for a tool to capture the outputs of the planning activity.

In discussion with Phil Beauvoir, the developer of the Archi open source ArchiMate modelling tool, we arrived at the idea of building a ‘blank canvas’ feature into Archi that would enable anyone to create a template for approaches such as the BMC. The blank canvases are fully editable and lockable/unlockable, making them very powerful and flexible tools. In addition, and particularly useful for the BMC, a canvas can easily be exported and printed as an A0 PDF for use in workshop sessions.

Rather than simply building a ‘hard wired’ representation of the BMC into Archi, we hope that we have added a whole new dimension for people who wish to tie together different approaches and techniques with the practice of Enterprise Architecture using the ArchiMate modelling language. The important point to understand is that this isn’t just a visual representation: the tool captures relationships between objects with associated properties, so that more can be done with the data in an automated way.
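To illustrate the automation point: because a canvas is stored as a structured model rather than a picture, its contents can be read back out programmatically. The sketch below is a minimal example, assuming the model has been saved in Archi’s XML file format; the tag names, attribute names and file name are illustrative guesses rather than taken from the actual schema.

```python
# Minimal sketch: list every named object in an exported Archi model
# together with any key/value properties attached to it.
# Assumption: the model is plain XML with "name" attributes and child
# <property key="..." value="..."/> elements (adjust to the real schema).
import xml.etree.ElementTree as ET

def list_canvas_data(path):
    root = ET.parse(path).getroot()
    for node in root.iter():
        name = node.get("name")
        if not name:
            continue
        props = {p.get("key"): p.get("value") for p in node.findall("property")}
        print(name, props if props else "")

if __name__ == "__main__":
    list_canvas_data("course-bmc.archimate")  # hypothetical file name
```

From output like this it is a short step to, say, pulling every ‘Customer Segments’ entry from a set of course canvases into a single spreadsheet.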

One possible example is the JISC work on Student Life Cycle Management Service Design in Higher and Further Education, which takes a Blueprinting approach, in particular the Front Stage / Back Stage identification of fail points – those who know about it will hopefully see the connection!

We think this is a cool bit of work and expect to see the commercial vendors following suit :^) Phil’s work will ship with the next release of Archi in early December 2011.

An early example of the tool in use…


Learning Design Support Environment

The observations below are from a Coeducate project team member. We hope to conduct a fuller evaluation of the tool in October with about 20 University staff and will share our findings then.

————————–

In July I sat in on an evaluation day for the Learning Design Support Environment (LDSE) tool[1]. The emphasis of the evaluation process was to obtain user feedback about the tool, with particular attention paid to participants’ existing work and requirements. The participants seemed to be engaged with the evaluation process and generally enthusiastic towards the LDSE tool. There seemed to be several key factors in the design of both the evaluation process and the LDSE tool itself which contributed to this positivity.

 

Relevancy to Participants

Each participant was asked to bring along details of a module that they were currently redesigning or needed to redesign. This seemed to have two benefits. Firstly, the participants were already in an open frame of mind towards the design of the module and therefore might be more receptive to new ideas and thought processes. Secondly, as opposed to working through a pre-created scenario, the subject matter was of interest and relevance to each participant because it was their own choice. As well as encouraging participants to engage with the design process, their knowledge of the module content and context seemed to allow them to interact more deeply with the process than might have been the case with a sterile pre-created scenario.

 

User Involvement in Tool Design

In discussing the LDSE tool’s development with members of the evaluation team, it became apparent that user input had been key to the design process. From what I was told, the first evaluation sessions involved paper-based exercises to map out the functionality of the tool. This meant that, before coding of the tool had begun, the team could be reasonably confident that it was at least starting off in the right direction to meet users’ needs. The value of this approach seemed to be confirmed by seeing how quickly participants were able to make meaningful progress with the tool. There may have been minor quirks and issues with the user interface, but the participants seemed happy to work through these because of what they were able to achieve with the tool.

 

Activity Palette

The evaluation participants seemed to appreciate the balance of freedom and guidance provided by the LDSE tool. The timeline onto which Teaching and Learning Activities (TLAs) are added is a blank canvas. However, this is balanced by an on-screen palette of pre-defined TLAs that users can choose from if they do not want to create their own. This seemed to be well received in terms of showcasing activities that users might not otherwise have thought to use.

These positive aspects of the evaluation process and the LDSE tool do need to be balanced by some other observations.

 

Added Value and Use in Isolation

At least one participant suggested that the LDSE tool would be more useful, and therefore more appealing to them, if it could feed into other systems, in particular generating the course content and time breakdown statistics required for their University’s administrative processes. For other users, the “pay off” for the effort involved in the LDSE design process might be the ability to generate the outline of a LAMS file or the basic structure of an IMS LD Unit of Learning.

Providing this added value from the user’s perspective helps to address the question of “what’s in it for me?” when faced with a new process or tool. The LDSE tool exports to XML, so there is the potential for data translation and re-use. Without this potential to interact with other systems, there seems to be a risk that using the LDSE tool becomes an isolated exercise. If this is the case, then could some of the benefits, for example the palette of TLAs outlined earlier, be just as easily provided by a reference list or a set of 8LEM[2]/HLM-style reference cards[3]?
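As a rough illustration of what that data translation might look like, the sketch below walks a hypothetical LDSE XML export and produces a flat outline of sessions and activities that another system could import. The element names, attribute names and file name are invented for the example; the real LDSE export schema may be quite different.

```python
# Sketch: turn a (hypothetical) LDSE XML export into a simple outline
# of sessions and the TLAs they contain, as a starting point for
# feeding other systems such as a LAMS or IMS LD authoring tool.
import xml.etree.ElementTree as ET

def outline_from_ldse(path):
    root = ET.parse(path).getroot()
    outline = []
    for session in root.iter("session"):               # hypothetical tag name
        title = session.get("title", "Untitled session")
        activities = [tla.get("name", "Unnamed TLA")
                      for tla in session.iter("tla")]  # hypothetical tag name
        outline.append((title, activities))
    return outline

for title, activities in outline_from_ldse("module-design.xml"):  # hypothetical export
    print(title)
    for activity in activities:
        print("  -", activity)
```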

 

Hijacking for Box Ticking Purposes

The LDSE is intended to “support teachers in designing effective technology-enhanced learning”[1]. One of its key features is to provide a graphical breakdown of the learning experience in terms of how time is used (for example the percentage of time learners spend acquiring knowledge versus discussion and practice). This could be used by teachers to reflect on their module designs. However, there does also seem to be the potential for these percentages to become the driving force behind module design, especially if targets are set by higher academic powers. Would this alter the use of the LDSE or reduce the quality of experience from a teacher’s perspective?
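For concreteness, the breakdown being described amounts to simple arithmetic over the time allocated to each kind of learner activity; the category labels and minutes below are invented for illustration.

```python
# Sketch of the time-use breakdown: total the minutes allocated to each
# kind of learner activity and report the percentage split.
from collections import defaultdict

tlas = [
    ("acquisition", 60),   # e.g. lecture or reading (figures invented)
    ("discussion", 30),
    ("practice", 45),
    ("acquisition", 25),
]

totals = defaultdict(int)
for kind, minutes in tlas:
    totals[kind] += minutes

grand_total = sum(totals.values())
for kind, minutes in totals.items():
    print(f"{kind}: {100 * minutes / grand_total:.0f}% of learner time")
```

It is exactly this kind of figure that could be turned into a target, which is where the box-ticking risk comes in.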

 

8LEM Wookie Widget

Just before our full-time developer was poached by a mobile phone app developer, we started work on Wookie Widgets to support curriculum development activities. This early beta version demonstrates the concept, I think; we will return to this as soon as we appoint a new developer, but in the meantime, if anyone wants to take it forward, please do.

Seeking Internal Longer Term Institutional Commitment for IT Process Support

Background

For any software development carried out by the project, its longer term sustainability is an issue that needs to be resolved before the end of the project.

When the project proposed software to support the validation process, polite resistance was expressed by the head of IT services: his staff had been cut and the team was finding it difficult to support existing software, let alone take on any new software, especially if it had the potential to proliferate many copycat bespoke software solutions in other process improvement projects.

The initial response was to look for generic workflow software that could be used to support any process improvement project.

Strengthening this was a circular from the VC stating that he would drive forward improvement in seven areas, including: “improvements in efficiency and effectiveness, reviewing all operating units and services.” As workflow support would be involved in almost any efficiency and effectiveness improvements of operating units and services, a paper was prepared for the Technology and Infrastructures Committee (which is evolving into an EA governance group) to the effect that processes and resources would be needed to evaluate workflow software, prioritise improvement projects, and develop, implement and sustain them.

The meeting did not come to a conclusion on the paper (it will be raised again at the next), but in discussing support for validation, a Dean strongly recommended that the existing process was itself too heavyweight and should be revised before any attempt was made to support it with software. The project put this to the Pro-VC, who immediately accepted the proposal to review the validation process and set up a group for the purpose, in which the project now participates.

Currently, for process support, rather than looking for software to be brought in, we are exploring the possibility of cloud-based solutions, removing the need for local support. In particular, we propose to evaluate an online service, BaseCamp. Although this is project management software, we wish to see whether it can be used for process support, particularly where the main task is one of providing transparency as to the current stage any course has reached. BaseCamp has a relatively cheap start-up cost model ($99/month) and is easy to set up. Initially it is proposed to use it to support a lightweight revalidation process which, using existing processes, would be an enormous task.
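To make ‘transparency as to the current stage’ concrete, the sketch below (with invented course and stage names) shows the one thing we want any such tool to answer at a glance: which stage has each course reached? A BaseCamp project with one to-do list per course could play the same role.

```python
# Sketch: track each course against the same ordered list of validation
# stages; the current stage is the first one not yet signed off.
# Course and stage names are invented for illustration.
STAGES = ["Outline approved", "Business case", "Documentation drafted",
          "Panel review", "Validation event", "Sign-off"]

courses = {
    "MA Professional Writing": {"Outline approved", "Business case"},
    "BSc Creative Technologies": {"Outline approved"},
}

for course, completed in courses.items():
    remaining = [stage for stage in STAGES if stage not in completed]
    current = remaining[0] if remaining else "Complete"
    print(f"{course}: {current}")
```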

We may also evaluate Salesforce.com’s cloud-based workflow service.

Innovation Support Network

Background

Working with staff who had been innovating new courses, to learn what they wanted by way of ICT support, it emerged that, more than online tools, what they would most value was a group (like the present one!) where they could share and strengthen their ideas before they were submitted to the rigours of the validation process and the work it demanded.

At the Summer 2010 Co-Educate SG meeting, chaired by our previous Pro-VC, it was proposed and agreed that the Coeducate project should set up an Innovation Support Network (ISN) that would work with staff (and students, to encourage a co-creation approach) who wanted to participate.

Development

In planning the Innovation Support Network, two categories of course developers were envisaged:

  • those who wanted to think outside the box, i.e. those who wanted to innovate
  • those whose courses, for whatever reason, had too few participants, high dropout rates or declining enrolments, and who would therefore like to rethink their offering, i.e. those who needed to innovate.

A further issue that had repeatedly arisen, both with the innovation group and with earlier baseline work, was the lack of support for gaining market intelligence for the business plan that is required as part of the validation process.

The changing climate for higher education has resulted in changed circumstances in the university, at least temporarily, requiring the Coeducate project to re-focus somewhat.

All courses were in the process of being reviewed, with those judged to be non-viable being withdrawn and the remaining courses required to comply with a new Core Curriculum Framework, resulting in all of them needing to be revalidated.

At the next Co-Educate SG meeting, the new Pro-VC and chair asked if the ISN could initially focus on the task of supporting courses to comply with the Curriculum Framework and assisting with streamlining the revalidation process.

Activity

To this end, the ISN has begun engaging at three levels:

  1. Deans of School
  2. Principal Lecturers, Quality
  3. Lecturers piloting courses through the Curriculum Framework

Initial engagement has been with the School of Business and Creative Technology. The Dean welcomed the project’s involvement and saw it as an opportunity to maintain innovation whilst conforming to the Curriculum Framework. Two subsequent meetings were held with School staff: the first group being Business, Law and Accountancy staff, the second Creative Technologies. Both identified areas where they felt innovation is needed, and the ISN will hold further meetings with each group, focussed on these.

The meetings also introduced the Business Model Canvas, discussing how to adapt and use it.

The Business team thought it would probably be too difficult for other staff to use, so they would need expert assistance, but agreed it might be useful in helping to establish dialogue between staff and a business model expert.

The Creative Technologies team took to it rapidly and produced a model for a platform to support students developing a realistic ePortfolio that could be used to record and then present their work to employers.

Business Model Canvas – Support for Programme Development

In our work with staff developing new programmes, a common comment is the difficulty of creating the business model that is required for validation, and in particular the difficulty of obtaining reliable market intelligence for expected student numbers. The more innovative a course is, the less it is possible to rely on data from other courses and sources, either internal or external.

UoB has recently gone through an Academic Review which examined all courses with respect to a number of viability criteria, and a significant number of courses will be discontinued.

These considerations make it clear that, going forward, it will be necessary to put more weight on the viability of new courses while they are being developed. This in turn will require a change in approach on the part of those developing courses, so we have been seeking an approach to business modelling that would be easy for staff to adopt. To this end we have been trialling the Business Model Canvas, which has been released under a Creative Commons license, with a view to adapting it for the purpose of developing business models for new courses.

This has been presented for comment to staff from the Business School and used in a workshop by staff from the Creative Technologies team with a positive outcome, sufficient to encourage a further workshop with them to develop their ideas further.

This encourages us to work further on adapting the wording and to continue trialling it. In a separate development, Phil Beauvoir, the Archi developer, has prototyped an implementation of the Canvas as an add-on to Archi. We have discussed the changes that would be needed for our purposes.

Should the canvas trials continue to prove positive, the work needed to implement it as a tool, adapted for our purposes, could be funded at relatively low cost from the Coeducate budget.

WRITERS’ LAB @BOLTON 2010

An output from the ‘Planning and Developing Open Learning Courses’ module run by the project was this resource, created by Anna Zaluczkowska and colleagues, which was used to present the model she developed at a departmental meeting – this is exactly the kind of outcome we had hoped for.

The Writers’ Lab developed and delivered a master’s course designed to explicitly align teaching and learning with employers’ and students’ needs, so that as students demonstrated their capability, employers would recruit those that best met their needs.

In carrying out this work, Anna initially sought to use the IDIBL framework, as it offered the flexibility that she required to develop a student teaching and learning experience that closely mirrored that of the workplace, including an approach to assessment that didn’t distort the experience by requiring a ‘false’ set of outputs for assessment purposes.

She found, however, that the terminology used by the framework encountered resistance from employers and some colleagues as she sought to move away from a content-based curriculum. Wrestling with these issues led Anna to the conclusion that she had to wrap the course in familiar terminology so that it was acceptable, but to continue to innovate in practice with the learning experience the students had.