ICS-E4020: Course feedback


Overview

Many thanks to all of you who took part in the course, and especially many thanks to all of you who gave some course feedback! Overall, 61 students passed the course and 52 students gave some course feedback; see below for more details. I am very pleased that so many of you gave course feedback, and especially that many of you also took the time to answer the free-form questions.

This was a brand-new course, and the large number of registrations caught us by surprise; our plan was to beta-test the course with a much smaller number of students. We had to improvise a lot, and the instructions, workflow, and course automation were far from perfect. Many details clearly need improvement for next year.

Nevertheless, it seems that most of you were fairly happy with this course, and there clearly was a lot of demand for this kind of course. If you want to join our course team next year as a teaching assistant, just send me an email.

Thanks, and see you again!

Jukka Suomela, on behalf of the whole course team.

Statistics

Participation

Feedback

We got course feedback as follows:

Even though most students got very good grades, we also got some feedback from students with lower grades:

Conversely, many of those who did not give feedback also did fairly well:

Based on these numbers, it would seem that the feedback is fairly representative. The large number of free-form answers suggests that most of the students who gave feedback also spent some time thinking about it.

Who did well?

                   ≥ 1 point  ≥ 30 points  ≥ 50 points  ≥ 60 points
                                 (pass)     (grade 5/5)
CS                     42         40           26           12
physics                11          8            8            5
EE                      7          6            4            0
mathematics             4          4            4            0
Bachelor students      29         25           19            7
Master students        26         24           16            9
Doctoral students       6          6            4            1

(Only categories with at least 4 students with a grade of 5/5 are shown.)

Numerical assessment

“My overall assessment of the course”

“Teaching methods supported my learning”

“I am pleased with my study effort”

Time vs. credits

“Will benefit from what I learned”

Open-form questions

There were two free-form questions in the feedback form: what was good about the course, and what should be improved.

Good

Overall, the students seemed to be mostly happy with the following aspects. These are listed roughly in order of how often they were mentioned in answers to the “what was good” question:

Good/bad

Opinions on the following aspects were more divided; some of the students were pleased with them, while others thought they needed improvement:

Bad

These aspects will clearly need some improvement:

There was also some dislike of the particular choices that we made regarding tools and topics, e.g., CUDA vs. OpenCL, and Linux vs. Windows.

What will change for 2016

Based on the course feedback and our experiences during the course, we seem to be on the right track, but many details will need some attention.

Course content and course material

The course content will be revised a bit. We will pay attention in particular to the following aspects:

We will try to publish as much of the course material as possible in advance. In particular, all exercises and their detailed grading rules will be published when the course begins.

Workflow, submissions, and grading

The workflow for submitting and grading exercises will be changed. Most of the students seemed to be happy with the use of GitHub, so we will keep using it. However, the grading will be done in a more “self-service” fashion, using a simpler and more efficient process.

In essence, for each task the student can run a grading script that performs all the checks and benchmarks that are needed for grading. The grade thresholds are fixed in advance, and the grading script can directly tell how many points the current version will give. Reports will carry less weight in the grading, and most of the work will be automated.
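
To make this concrete, the following is a minimal sketch of what such a self-service grading script might look like. The helper scripts test.sh and benchmark.sh and the point thresholds below are hypothetical placeholders, not the actual grading rules of the course:

    #!/usr/bin/env python3
    # Hypothetical self-service grading script: runs the correctness tests
    # and the benchmark for one task, and maps the measured running time
    # to points using thresholds that are fixed and published in advance.

    import subprocess
    import sys
    import time

    # Placeholder thresholds (seconds -> points); the real grading rules
    # would be published together with each exercise.
    THRESHOLDS = [(10.0, 1), (5.0, 2), (2.0, 3), (1.0, 4), (0.5, 5)]

    def run_tests():
        # Run the task's correctness tests; a non-zero exit code means failure.
        return subprocess.run(["./test.sh"]).returncode == 0

    def run_benchmark():
        # Time the benchmark run; a real script would repeat the measurement.
        start = time.perf_counter()
        subprocess.run(["./benchmark.sh"], check=True)
        return time.perf_counter() - start

    def points_for(seconds):
        # The fastest threshold that the measured time beats gives the points.
        points = 0
        for limit, p in THRESHOLDS:
            if seconds <= limit:
                points = max(points, p)
        return points

    def main():
        if not run_tests():
            print("Tests failed: 0 points for the current version.")
            sys.exit(1)
        seconds = run_benchmark()
        print(f"Benchmark time: {seconds:.2f} s")
        print(f"The current version would give {points_for(seconds)} points.")

    if __name__ == "__main__":
        main()

A student could run such a script locally at any time and see immediately which grade threshold the current version reaches.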

This way the students will already know before the deadline how many points they are going to get that week. There should be few surprises related to grading, and no need to wait for the teaching assistants to run the benchmarks. The teaching assistants can then spend less time on mechanical work and more time giving helpful feedback to the students.

The instructions related to exercise specifications, grading, and submissions will be streamlined. In particular, there was some confusion about various special cases, e.g., how one can resubmit solutions or what to do with late submissions. There will no longer be any weekly reports; each task is an independent entity with its own deadline and its own grading rules, and it can be resubmitted independently of all other tasks.
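
For illustration, the independence of the tasks could be captured in a simple per-task configuration like the sketch below; the task names, dates, and helper function are invented, not part of the actual course infrastructure:

    from datetime import datetime

    # Hypothetical per-task deadlines: each task is an independent entity and
    # can be resubmitted on its own schedule. Task names and dates are made up.
    DEADLINES = {
        "task1": datetime(2016, 1, 20, 23, 59),
        "task2": datetime(2016, 1, 27, 23, 59),
    }

    def accept_submission(task, submitted_at):
        # A resubmission of one task never depends on the state of other
        # tasks; only the task's own deadline matters.
        return submitted_at <= DEADLINES[task]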

Main challenges for 2016

Some challenges remain to be solved before the course is given again next year. We will need to work together with Aalto IT support to make sure that we have sufficient computing resources available throughout the course. We will need to find a solution that prevents background jobs from disturbing the benchmarks, and we will also need to find a solution to the problem of crashing GPUs.
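
As an example of one common mitigation for the benchmarking problem (not necessarily the solution we will end up adopting), each benchmark can be repeated several times and only the minimum time kept, since transient background jobs can only slow individual runs down; a minimal sketch:

    import subprocess
    import time

    def benchmark(command, repetitions=5):
        # Repeat the measurement and keep the minimum running time: transient
        # background jobs can only make individual runs slower, so the minimum
        # is a more stable estimate of the true running time.
        best = float("inf")
        for _ in range(repetitions):
            start = time.perf_counter()
            subprocess.run(command, check=True)
            best = min(best, time.perf_counter() - start)
        return best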