Tests, Surveys & the Grade Centre

This user guide explains how to work with the Blackboard Grade Centre to view the results of tests and surveys. It follows on from Creating a Blackboard Test/Survey and Deploying a Blackboard Test/Survey.

The Grade Centre

Each test, survey or assignment you create on Blackboard has a column associated with it in the Grade Centre. Once a student has completed one of these, the Grade Centre can be used to view the results. The Grade Centre will also alert you to any tests or assignments that require your attention, for example a test containing an essay question that requires manual marking.

Accessing the Grade Centre

From inside your module Control Panel, click on Grade Centre and then click Tests. This will show you all of the columns for Tests within the module.

Viewing Individual Results

If you want to see how an individual student has performed in a test or survey, hover your mouse pointer over the appropriate cell and click the arrow button that appears. Select View Grade Details from the menu to go to the Grade Details page.


The Grade Details page shows a summary where you can view attempt information and grade history. You can also clear attempts made by a student and override the calculated grade for that particular test. If you allowed more than one attempt at the test, each attempt will be listed here under the Attempts section. If you only allowed one attempt, but the student failed to complete the test for any particular reason, you can click the Allow Additional Attempt button to grant another attempt.

Click the Grade Attempt button for the attempt you want to view. You will then be taken to the Grade Test page showing details of the submission made by the student.

The Grade Test page shows each question in the Test, the student’s answer and the correct answer. It also shows how many points the student was allocated for each question. You can click in the Points box to override the points given if needed.

Under Feedback and Notes for Attempt you can enter feedback for the student and add notes that are kept private to you and other module leaders.

When you’ve finished, click Save and Exit.

Viewing Group Results

You can also view how each group of students performed in a particular test. You may wish to do this in
order to identify weak or strong areas across the group.

  1. Enter the Grade Centre.
  2. Identify the column for the test you want to view.
  3. Click the chevron arrow in the column header of a test and then click Attempts Statistics from the drop-down menu.
  4. You will then be taken to the Tests Statistics page which will show you all of the questions for the test and a breakdown of how they were answered by the group. You can view what percentage of the students responded to each question along with other statistics such as Average Score.

Downloading Results

You can download the results of a test or survey to perform statistical analysis in packages such as
Excel or SPSS.

  1. Enter the Grade Centre.
  2. Identify the column for the test that you want to download the results for.
  3. Click the chevron arrow at the top of the column and then click Download Results from the drop-down menu.
  4. You will then be presented with the choice of delimiter for the output file. You can choose between Comma and Tab – both will work fine in Excel.
  5. You can then choose the Format of results – pick either By User or By Question and User. By User will include all of the questions for a user in one row. By Question and User will list each question for each user in a separate row. It’s best to choose By Question and User for tests with more than 40 questions; the loading example after these steps assumes this format.
  6. You can also choose the Attempts to download. Pick either Only valid attempts or All attempts. All attempts will contain data for every attempt each student made at the test including those that may not be complete.
  7. Once you have chosen the required settings, click the button titled Click to download results and save the file when prompted.

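Once downloaded, the results file can be opened directly in Excel or SPSS, or read into a scripting language for further analysis. Below is a minimal sketch of loading a tab-delimited download in Python with pandas and summarising scores per question. The file name and the column labels used here (Question ID, Auto Score) are illustrative assumptions; check the headings in your own download and adjust to match.

    import pandas as pd

    # Load a tab-delimited download in "By Question and User" format.
    # Use sep="," instead if you chose the Comma delimiter.
    results = pd.read_csv("test_results.txt", sep="\t")

    # Number of responses and average score per question across the group.
    # "Question ID" and "Auto Score" are assumed column names - rename them
    # to match the headings in your own file.
    per_question = (
        results.groupby("Question ID")["Auto Score"]
               .agg(["count", "mean"])
               .rename(columns={"count": "responses", "mean": "average_score"})
    )
    print(per_question)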

Item Analysis

The Item Analysis tool provides statistics on overall test performance and individual test questions to help you recognise questions that might be poor discriminators of student performance. You can use this information to improve questions for future tests or to adjust credit on current attempts. You can run item analyses on deployed tests, but not on surveys.

To run Item Analysis on a test:

  1. In the Grade Centre, click on the chevron arrow in the column header of a test and select Item Analysis from the menu.
  2. Select a test from the drop-down menu and click Run. The Item Analysis will start to run in the background; the time this takes will depend on the size of the test (e.g. number of questions, number of attempts). You will receive an email notification when it is complete and you can return to this page to view the analysis.
  3. You can access previously run item analyses under the Available Analysis heading. Simply click on the date and time to view the analysis.
  4. The Item Analysis will display summary information about the test, such as the number of completed attempts, the average score and the average time taken to complete the test. You can also view Discrimination and Difficulty summaries (see below for more detailed explanations about these two indicators).
  5. Further down the page there is a breakdown of each question in the question statistics table. This provides item analysis statistics for each question in the test. Questions that are recommended for your review are indicated with red circles so that you can quickly identify those which might need revision. A short worked example of these statistics follows this list.
    • Discrimination indicates how well a question differentiates between students who know the subject matter and those who do not. A question is a good discriminator when students who answer the question correctly also do well on the test. Values can range from -1.0 to +1.0. Questions are flagged for review in the Filter Questions section if their discrimination value is less than 0.1 or is negative. Discrimination values are calculated with the Pearson correlation coefficient but cannot be calculated when the question’s difficulty score is 100% or when all students receive the same score on a question.
    • Difficulty of a question is displayed as a percentage of students who answered the question correctly. Difficulty values can range from 0% to 100%, with a high percentage indicating that the question was easy. Questions in the Easy (greater than 80%) or Hard (less than 30%) categories are flagged for review.
    • Standard Deviation is a measure of how far the scores deviate from the average score. If the scores are tightly grouped, with most of the values being close to the average, the standard deviation is small. If the data set is widely dispersed, with values far from the average, the standard deviation is larger.
    • Standard Error is an estimate of the amount of variability in a student’s score due to chance. The smaller the standard error of measurement, the more accurate the measurement provided by the test question.
  6. You can click on the title of a question in the question statistics table to view more details about that individual question. The Question Details page displays student performance on the individual test question you selected.
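
To make the Discrimination and Difficulty calculations above more concrete, here is a minimal sketch that applies the definitions to a small, invented response matrix in Python. It simply computes the percentage correct and the Pearson correlation between each question’s scores and the students’ total scores; it is an illustration of the definitions above, not Blackboard’s own implementation.

    import numpy as np

    # Invented data: rows are students, columns are questions (1 = correct, 0 = incorrect).
    responses = np.array([
        [1, 1, 1, 0],
        [1, 1, 0, 1],
        [1, 0, 1, 0],
        [0, 1, 0, 0],
        [1, 0, 0, 0],
        [0, 0, 0, 0],
    ])

    totals = responses.sum(axis=1)          # each student's total test score

    for q in range(responses.shape[1]):
        item = responses[:, q]
        difficulty = item.mean() * 100      # percentage of students answering correctly
        std_dev = item.std(ddof=1)          # sample standard deviation of scores on this question
        if item.std() == 0 or totals.std() == 0:
            # Correlation is undefined when every student scores the same on the question
            discrimination = float("nan")
        else:
            discrimination = np.corrcoef(item, totals)[0, 1]
        # Flag questions matching the review criteria described above
        flag = difficulty > 80 or difficulty < 30 or discrimination < 0.1
        print(f"Q{q + 1}: difficulty {difficulty:.0f}%, discrimination {discrimination:.2f}, "
              f"std dev {std_dev:.2f}, review {flag}")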