
1. Introduction

The CodeRunner question type enables using Python code to read and grade file submissions. This allows even large courses to use Excel-based exercises or lab reports without significant time and labor costs. The downside is that building a functional question from scratch can take a lot of work. For this reason, we have created easily customizable template questions for use by Aalto staff, which require no programming knowledge. Instead, all the parameters for grading are defined within Excel. After learning how to set up these questions once, it should be relatively fast to do so again in the future. This wiki page includes links to videos that show, step by step, how to set up these question types.

Currently, two question types have been tested and are known to work: LocationMatch and DataExchange. In short, LocationMatch is used when you supply all the data, and DataExchange when students use their own data. Both work by comparing the values of a certain location between a model answer and the submission: for example, whether the values in C2:D8 match within an accepted error margin.

DataExchange takes this one step further by first reading the values from predefined cells and writing them into the model file. These student-supplied values can then be used in the model answer’s calculations. For example, this feature can be used to read a student’s submitted measurement data and calculations, and then check that their calculations are correct based on the values they measured.

2. Setting up a graded Excel-question

An example workflow for setting up an automatically graded Excel-submission is the following:

  1. Create a model answer in Excel.
  2. Download the question XML-file and import it to your MyCourses space.
  3. Customize the question text, mark, and other parameters you wish to change.
  4. Download the Excel-support file that is already in the Support files field, and add your model answer to this file as a new tab.
  5. Define how the submission is graded within the supporting Excel-file.
  6. Replace the old supporting Excel-file with the one you customized.
  7. Save your question and add the question to a quiz.

Two important rules apply to file names: the names must not contain spaces, and the name of the supporting Excel-file must end with '_support.xlsx'. The second requirement ensures that the grader can always identify where the model answer and instructions are located. More detailed instructions, including step-by-step videos, are given below. These questions have been used in course environments, both as parts of weekly quizzes and as larger projects.
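The two naming rules can be sketched as a small check. This is a hypothetical helper for illustration, not part of the actual grader; the '.xlsm' variant is included here on the assumption that the macro-enabled support files linked below are also accepted.

```python
def is_valid_support_name(filename):
    """Return True if `filename` satisfies both naming rules."""
    if " " in filename:
        return False  # rule 1: no spaces anywhere in the name
    # rule 2: the support file must end with '_support.xlsx'
    # (or '_support.xlsm' for the macro-enabled templates)
    return filename.endswith(("_support.xlsx", "_support.xlsm"))
```

For example, 'LocationMatch_support.xlsm' passes both checks, while 'my file_support.xlsx' fails the first one.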

3.  LocationMatch-template

3.1. File


LocationMatch_support.xlsm (this support file has a ready Macro for creating Locations)

3.2. Use

The simplest template to use is the LocationMatch template. It works by comparing different locations (defined by columns and rows) between the submitted file and a model answer in the support file. To make sure the values to be graded are in the same location in both files, it is recommended to create a template file for the student to fill in.

3.3. Instructions for use

Instructions on how to set up a LocationMatch-type graded Excel-submission are shown in the video below.

Link to video

After importing the question, scroll down to the ‘Support files’ section of the question authoring form, where you will find the supporting Excel-file. Download this file, customize it, delete the old one, and upload the customized file in its place. The file should contain the model answer against which the submitted file is compared. The locations to be graded are defined in the Locations-tab of the supporting Excel-file. The support file name must always end with ‘_support’, so the grader can locate it.

Each location is given a name so it can be identified; note that two locations must not share the same name. The actual location is then defined by giving its columns and rows, for example B-C, 2-3.

The location can be graded in three ways, defined in the mode column:

  1. Relative error (e.g. rel=0.01 for a 1% error margin)
  2. Absolute error (e.g. abs=1 for an error margin of 1)
  3. Exact for exact comparison (meant for text)
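The three modes can be sketched as a single comparison function. This is a hedged illustration of the logic described above, not the grader’s actual code; the parsing of the mode strings here is an assumption based on the examples.

```python
def values_match(model, submitted, mode):
    """Compare one submitted value against the model answer under a mode
    string such as 'rel=0.01', 'abs=1', or 'Exact' (illustrative only)."""
    mode = mode.replace(" ", "").lower()
    if mode.startswith("rel="):
        tol = float(mode[4:])
        # relative error: difference within tol * |model value|
        return abs(submitted - model) <= tol * abs(model)
    if mode.startswith("abs="):
        tol = float(mode[4:])
        # absolute error: difference within a fixed tolerance
        return abs(submitted - model) <= tol
    # 'exact': plain equality, intended for text
    return submitted == model
```

So with mode rel=0.01 a model value of 100 accepts anything in 99–101, while Exact requires identical content.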

Enter the sheet names for both the model answer and the submission, so that the grader knows where to look. If the submitted file does not contain the appropriate sheet, an error is raised. Finally, you can set feedback for correct, partially correct, and wrong submissions. The feedback is displayed as an unordered html-list.

Once you are done customizing the support file, delete the old one from the question authoring form and upload the customized one in its place. All the other parts of the question, such as mark, penalties, question text etc. can be customized like any other CodeRunner question.

3.4.  Missing data

Sometimes the area to be graded may contain missing values. There are many conventions for marking empty cells, such as leaving them blank, entering a dash, and so on. To enable relative and absolute grading in areas that include missing data, the grader substitutes any value it cannot convert to a number with -1 before comparison. For example, if the submitted location contains the values 10, '-', 5, and the model answer contains 10, 'missing', 5, the grader reads both as 10, -1, 5. As such, values that cannot be converted to numbers always compare as identical.
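The substitution described above can be sketched as follows. This is a minimal illustration of the behavior, not the grader’s actual implementation:

```python
def to_number(value):
    """Convert a cell value to a number; anything non-numeric becomes -1."""
    try:
        return float(value)
    except (TypeError, ValueError):
        return -1

def normalise(values):
    """Apply the substitution to a whole graded location."""
    return [to_number(v) for v in values]
```

Both 10, '-', 5 and 10, 'missing', 5 normalize to 10, -1, 5, so the two non-numeric cells compare as identical.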

3.5.  Feedback

The feedback you set up in the supporting Excel-file is treated by MyCourses as Specific feedback. If you set feedback to ‘Force show’ in the question authoring form, the feedback will always be shown. If instead you choose ‘Set by Quiz’, make sure that you have enabled specific feedback in the Quiz settings. Note also that the text is interpreted as ready html. This means you can use standard html elements within the feedback. For example, if you want wrong answers to produce a link to some learning resource, you can use the html link-element. An example feedback could be:

Your answer was incorrect, please see learning material <a href="">here.</a>

The text inside the html-element would be displayed as a hyperlink to the students who receive the feedback. 

General feedback is not displayed during the Quiz. However, if you enable it in Quiz settings, general feedback is displayed to the student during review. 

4. DataExchange-template

4.1.  File


DataExchange_support.xlsm (this support file has ready Macros for creating Locations and exchanging data)

4.2.  Use

The DataExchange question type works in the following way:

  1. Reads data from the submitted file.
  2. Writes the read data to the supporting Excel-file.
  3. Recalculates all cell values, so that the newly written data is taken into account.
  4. Grades all of the chosen locations, just like the LocationMatch question type.

Reading data from the submission allows using student-measured or student-generated data in your model answer. The downside is that ‘refreshing’ the cell values is not simple, as the grading code cannot open the Excel file. Not all Excel formulas refresh properly, and it is important to take this into account when designing your assignment; for more details, see the Complications section below. The grader reads the values of the cells, not the formulas. This means that submitted cells can safely contain references to other cells in the submitted file without causing errors.

If the cells being read from the student file are empty, a small positive value (default 10⁻²⁰) is used instead. This prevents the errors in Excel-formulas that occur when referencing an empty cell. If the graded cells contained error messages in both the submission and the model answer, the grader would see them as identical and mark them correct by default. The small positive number allows using student-submitted values as variables without fearing these error messages. If you want to change the default value for empty cells, simply change the value of the variable empty_cell_value to whatever works best for you. The variable should be easy to find at the top of the grader code, under Customisation in the question authoring form.
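The empty-cell substitution can be sketched like this. The default 1e-20 and the variable name empty_cell_value mirror the description above; the surrounding helper is illustrative only:

```python
# Customisation: value written in place of empty cells read from the
# student file, so Excel formulas referencing them do not error out.
empty_cell_value = 1e-20

def fill_empty(values, default=empty_cell_value):
    """Replace empty cells (None or empty string) with the small default."""
    return [default if v in (None, "") else v for v in values]
```

A read row such as 3, (empty), (empty), 7 would thus be written into the support file as 3, 1e-20, 1e-20, 7.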

4.3.  Instructions for use

Watch this video for step-by-step instructions. 

To set up this type of question, start by constructing the model answer and a template file for the student to fill in. If you want to check that the values read from the student file are within acceptable limits, you can create two model sheets: one that uses the values read from the submitted file, and another that remains constant.

Import the DataExchange base question to the course space where you want to use it, and download the supporting Excel-file it contains. Attach sheets containing your model answers to this supporting Excel-file. Next, tell the grader which cells in the submitted file you want to read, and where in the support file they should be written. This is done in the ‘Data exchange’ sheet; note that these values are entered as one typically enters a cell range in Excel, for example B2:C5. Finally, set up which locations to grade, exactly as in the LocationMatch grader.
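A B2:C5-style range resolves to a rectangle of rows and columns. The sketch below shows one way such a range could be parsed into zero-based indices; it is a hypothetical illustration, and the real grader’s internals may differ:

```python
import re

def col_to_index(col):
    """Convert an Excel column letter to a zero-based index:
    'A' -> 0, 'B' -> 1, ..., 'AA' -> 26."""
    index = 0
    for ch in col:
        index = index * 26 + (ord(ch.upper()) - ord("A") + 1)
    return index - 1

def parse_range(a1_range):
    """Return ((row1, col1), (row2, col2)) for a range like 'B2:C5'."""
    m = re.fullmatch(r"([A-Za-z]+)(\d+):([A-Za-z]+)(\d+)", a1_range)
    c1, r1, c2, r2 = m.groups()
    return (int(r1) - 1, col_to_index(c1)), (int(r2) - 1, col_to_index(c2))
```

For example, 'B2:C5' covers rows 2–5 of columns B–C, i.e. zero-based rows 1–4 and columns 1–2.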

4.4.  Complications

Not all Excel-formulas work properly in the DataExchange question type. This is due to the process of updating cell values after reading data from the submission. Normally, values in Excel cells are updated when the file is opened in Excel. However, MyCourses cannot open Excel files, so a Python-based workaround is used. The workaround relies on the Python formulas library, which has not implemented all Excel formulas. This only matters for the support file; the submitted file can use any Excel-formulas.

If your assignment is not working properly, try using different (preferably more primitive) formulas in your model answer. For example, VAR.S does not work properly, but VAR does. Alternatively, you can try adding your own Python-based replacement to the Formulas file, which can be found in the Support files section of the question authoring form.
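As an illustration of what such a replacement might look like, a plain-Python sample variance (the computation behind Excel’s VAR and VAR.S) could be written as below. The registration lines are a sketch based on the formulas library’s documented get_functions() mechanism; the template’s Formulas file may wire replacements in differently.

```python
def sample_variance(*values):
    """Plain-Python sample variance, matching Excel's VAR / VAR.S."""
    nums = [float(v) for v in values]
    n = len(nums)
    mean = sum(nums) / n
    # divide by n - 1 for the *sample* (not population) variance
    return sum((x - mean) ** 2 for x in nums) / (n - 1)

# Registration sketch (requires the formulas package; names illustrative):
# import formulas
# FUNCTIONS = formulas.get_functions()
# FUNCTIONS['VAR.S'] = sample_variance
```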
