Summary
This page describes the didactic vision behind the use of Portflow and provides a clear and manageable MinMax approach to its implementation.
Portflow is a plug-in for an LMS such as Canvas. In Portflow, students can visualise and account for their own development by collecting evidence, requesting feedback from lecturers, fellow students and experts, and building up an assessment file. Lecturers can support students throughout this process (source: Drieam).
To help students adopt the MinMax implementation below, a Portflow template can be created. A template that implements some of the jobs can be found here:
<Link to Portfolio template to follow soon> [Ed.]
The 'front page' of the portfolio contains a personal description of the student, as shown in Figure 1. The idea is that this description is continuously updated to reflect the current situation.
Figure 1. Screenshot of a Portflow front page.
The description includes the following sections:
A section is a part that comprises a logically defined unit of study with a descriptive title, or a separate activity that is not tied to the curriculum (e.g. participation in a robotics community of practice (CoP), activities from a governance role, etc.). Figure 2 shows a number of sections.
Figure 2. Screenshot of some sections within Portflow.
'Collections' bundle logically related pieces of evidence within the semester. These may include:
Bad practice:
A collection such as this should NOT be used to bundle professional development goals: from a didactic point of view, that evidence should be created in the context of one of the other collections.
Within Portflow, such bundles are created as collections, as shown in Figure 3.
Figure 3. Screenshot of a Portflow collection.
Per collection:
Good practice:
For example, the reading guide can be structured chronologically by describing which professional products/evidence were produced for each sprint.
Bad practice:
Organising collections by learning objective.
To track progress towards goals, Portflow allows users to set personal learning objectives. Goals can also be included in Portflow templates. Use your semester's learning outcomes as the learning objectives in Portflow. See Figure 4.
Figure 4. Table showing some learning objectives (Goals) within Portflow.
Good practice:
In some semesters, learning objectives are defined very precisely, in line with the HBO-i competences model. In those semesters, learning objectives consist of competences (Analysis, Advice, …) and professional skills.
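To make the relationship between the front page, sections, collections, and goals concrete, the sketch below models the structure described on this page as plain Python data classes. This is purely illustrative: the class and field names are assumptions made for this example and do not correspond to Portflow's internal data model or API.

```python
from dataclasses import dataclass, field
from typing import List

# Illustrative data model only; the names below are assumptions, not Portflow's API.

@dataclass
class Evidence:
    title: str
    description: str            # answer the 'essential questions' here
    own_contribution: str = ""  # fill in when the evidence is a group product

@dataclass
class Collection:
    title: str                  # a logically related bundle of evidence
    evidence: List[Evidence] = field(default_factory=list)

@dataclass
class Section:
    title: str                  # a unit of study, or an activity outside the curriculum
    collections: List[Collection] = field(default_factory=list)

@dataclass
class Portfolio:
    front_page: str             # continuously updated personal description
    goals: List[str]            # the semester's learning outcomes, used as Portflow goals
    sections: List[Section] = field(default_factory=list)

# Example: a portfolio skeleton for one semester (contents are hypothetical).
portfolio = Portfolio(
    front_page="Who I am, my background, and my current learning focus.",
    goals=["Analysis", "Advice", "Design", "Realisation"],
    sections=[
        Section(
            title="Semester project",
            collections=[Collection(title="Sprint deliverables")],
        ),
        Section(title="Robotics community of practice"),
    ],
)
print(f"{len(portfolio.sections)} sections, {len(portfolio.goals)} goals")
```

Read top-down, the sketch mirrors the MinMax setup: one front page, the semester's learning outcomes as goals, and sections that each bundle collections of evidence.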
In this collection, you add supporting documents. Bundle the supporting documents so that the granularity is appropriate: the reader should not be overloaded with small pieces of evidence that have little relevance on their own, but one big document with 'everything in it' is not desirable either. Ideally, aim for 2 or 3 supporting documents per sprint or iteration. These should be relevant work products, such as a software requirements document, a user interaction design (with all sketches and design iterations integrated), a hardware test plan and results report, etc.
If the piece of evidence is a group product, also make it clear what your contribution to this piece of evidence is.
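As a concrete illustration of the granularity advice above (roughly 2 or 3 supporting documents per sprint or iteration), the sketch below checks a collection's evidence against that rule of thumb. The helper function and its thresholds are assumptions made for this example; Portflow does not perform such a check itself.

```python
from collections import defaultdict
from typing import Dict, List, Tuple

# Hypothetical helper: flags sprints whose evidence is too fine- or too coarse-grained.
# The 2-3 documents per sprint target comes from the guidance above; the code itself
# is illustrative and not part of Portflow.

def check_granularity(
    evidence_per_sprint: List[Tuple[str, str]],  # (sprint, document title) pairs
    minimum: int = 2,
    maximum: int = 3,
) -> Dict[str, str]:
    counts: Dict[str, int] = defaultdict(int)
    for sprint, _title in evidence_per_sprint:
        counts[sprint] += 1

    warnings: Dict[str, str] = {}
    for sprint, count in counts.items():
        if count < minimum:
            warnings[sprint] = (
                f"only {count} document(s): consider splitting out more work products"
            )
        elif count > maximum:
            warnings[sprint] = (
                f"{count} documents: consider bundling related pieces of evidence"
            )
    return warnings

# Example usage with hypothetical evidence; Sprint 2 is flagged as too thin.
evidence = [
    ("Sprint 1", "Software requirements document"),
    ("Sprint 1", "User interaction design"),
    ("Sprint 2", "Hardware test plan and results report"),
]
print(check_granularity(evidence))
```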
Good practice - intermediate steps and products:
In certain semesters, the types of evidence allowed are limited. For example, you are only allowed to hand in:
Good practice - 'essential questions':
To make the process transparent, answer the 'essential questions' (devlog style) in the description of each piece of evidence:
Professional products are collected by students in Portflow during the semester. Students can request feedback on these products, and that feedback is recorded in Portflow. Task-specific feedback belongs in the comment field of the relevant product; feedback at a more holistic level belongs with the collection.
At regular milestone moments (e.g. a fixed number of times per semester, or at the end of each sprint), a snapshot of the portfolio is taken and submitted to a Canvas Assignment. This assignment has a rubric that records the assessment per competency level or learning outcome. For a fixed set of potential competences or learning outcomes, this rubric can be created manually.
In situations where competences are flexible (e.g. as in Open Learning), this is more difficult because the rubric of potential competences can be very large. In this case, the tooling as used in Education for Professionals (developed by Dennis Cools) can be used. This tooling makes it possible to enter an assessment in a user-friendly way and translate it into a completed rubric.
In addition to the completed rubric, each milestone delivery also includes a brief explanation of how the assessment was arrived at.
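To illustrate the translation step described above, in which a flexible set of assessed competences is turned into a completed rubric, the sketch below shows one possible shape of that translation. This is not the Education for Professionals tooling and does not use the Canvas API; the competency levels and names are assumptions made for this example.

```python
from typing import Dict, List

# Hypothetical competency levels, ordered from low to high.
LEVELS: List[str] = ["Orienting", "Beginning", "Proficient", "Advanced"]

def complete_rubric(assessment: Dict[str, str]) -> List[dict]:
    """Turn {competence: achieved level} into one rubric row per assessed competence."""
    rubric = []
    for competence, achieved in assessment.items():
        if achieved not in LEVELS:
            raise ValueError(f"Unknown level '{achieved}' for {competence}")
        rubric.append({"criterion": competence, "levels": LEVELS, "achieved": achieved})
    return rubric

# Example: a flexible assessment covering only the competences relevant at this milestone.
milestone_assessment = {"Analysis": "Proficient", "Advice": "Beginning"}
for row in complete_rubric(milestone_assessment):
    print(f"{row['criterion']}: {row['achieved']} (levels: {', '.join(row['levels'])})")
```

The point of the sketch is only that a compact, user-friendly entry (competence plus achieved level) can be expanded mechanically into the full rubric rows that a Canvas Assignment expects to be filled in.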