Webserver use, configuration and management

Unit WUCM1

Feedback on the WUCM1 exam 2008-2009s2

General points

Results in the exam this year were quite poor, with an average mark of 46% (down from 58% last year); 20 of the 55 people who took it failed (last year only 3 out of 43 did), and 4 of those failures were due to absence from the exam. However, the top score was 80%, and 4 other students got marks of 70% or more, indicating that the unit was by no means impossible.

Despite my giving lots of specific advice about how to answer the questions relating to the scenario, few people drew anything relevant out of it. Instead, exam answers were full of generic material clearly copied (often inaccurately, indicating a lack of understanding) from the lecture notes.

There is a clear correlation between lecture attendance and results. People who attended 10 or more lectures averaged 10% more than those who did not. Based on the number of students who:

I have come to the conclusion that the student group as a whole failed to engage with this unit. I know some found it boring (as a matter of fact, so did I, because I had so little interaction from students!); I suspect that some thought that a quick read of the lecture notes the night before the exam would guarantee success. I don't know how many people conscientiously carried out all the practicals; judging from some of the answers (or lack of serious answers) to some of the more technical questions, I doubt it was many.

Hopefully the 20 failing students will now address the course seriously; otherwise they are likely to repeat their failure in the resit exam.

Question 1

A nice easy question about requirements to start off! This was reflected in its being the highest-scoring question.

Most people were able to say that they would talk to staff, read existing documentation and observe the existing system(s). The most common reason for losing marks was giving an answer about the requirements process in the abstract rather than about Supercars.

Another problem was giving a non-functional requirement when the question specifically asked for a functional one, or vice-versa.

Question 2

Q2 was about the intended audience of the website.

Most people were able to easily identify three audience groups, but many had difficulty in "profiling" one of them. Goodish answers related to the gender, age range and IT skills of a group (usually customers); poorer answers talked about their requirements rather than their attributes.

Question 3

This was about the DocumentRoot and ServerName directives.

An occasional mistake was to misread the latter as ServerRoot.

However, the most common mistake was failing to specify the relevant httpd.conf entries accurately.
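For reference, a minimal sketch of the sort of entries involved is shown below; the hostname and paths are hypothetical, not the ones from the Supercars scenario:

    # ServerName gives the hostname (and optionally port) the server uses to identify itself
    ServerName www.supercars.example.com

    # DocumentRoot gives the directory out of which web pages are served
    DocumentRoot "/var/www/supercars/htdocs"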

Question 4

This was about security policy.

There were some obvious but good answers to this, but a few people talked about implementation mechanisms rather than policies.

In the second part, the most common problem was not giving specific examples, as required by the question.

Question 5

This question was about CGI. Answers fell into four groups:

  1. those people who knew what CGI is and who could talk about it in layman's terms (as asked)
  2. those people who knew what CGI is but who used jargon to describe it
  3. those people who thought that CGI was Computer Generated Imagery - clearly they didn't notice that WUCM1 was a course about web sites!
  4. those people who completely failed to attempt the question; perhaps this was because the lecture that covered CGI was the most sparsely attended of the unit
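For anyone still unclear, the sketch below (written in Python purely for illustration; the unit's own examples may have used a different language) shows the essence of CGI: the web server runs an ordinary program and sends whatever the program prints back to the browser as the response.

    #!/usr/bin/env python3
    # Minimal, hypothetical CGI program: the server executes it and
    # relays its standard output to the browser.
    import os

    # A CGI response must start with at least one header line and a blank line
    print("Content-Type: text/html")
    print()

    # Data appended to the URL (if any) arrives in an environment variable
    query = os.environ.get("QUERY_STRING", "")

    print("<html><body>")
    print("<p>This page was generated by a program, not read from a file.</p>")
    print("<p>Query string received: " + query + "</p>")
    print("</body></html>")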

Question 6

This question was about setting up Apache configuration to restrict access to a particular folder to authorised users.

This should have been free marks to everyone, but common problems included:

A significant number of people scored 0 on this. As with Question 5, perhaps this was because they had neither attended the relevant lecture nor attempted the relevant practical.
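For the record, one standard way of doing this is HTTP basic authentication configured in httpd.conf; the sketch below uses hypothetical paths and names:

    # Restrict a particular folder to authorised users (hypothetical paths)
    <Directory "/var/www/supercars/htdocs/private">
        AuthType Basic
        AuthName "Supercars staff area"
        AuthUserFile /usr/local/apache/conf/passwords
        Require valid-user
    </Directory>

The password file referred to by AuthUserFile would be created and maintained with the htpasswd utility, for example: htpasswd -c /usr/local/apache/conf/passwords someuser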

Question 7

Virtually everyone chose this as one of the optional questions to answer.

Most people who answered this question said that emailing a new password to the user was the best way of solving the problem of forgotten passwords. The discussion of whether this was a good approach or not, and whether other information should be used to verify the identity of the user, was variable.

Some completely infeasible solutions were suggested; I rank having a user ring a call centre to have their password reset among these: how many e-commerce sites do you know that do this?

Also, asking the user to give their mother's maiden name (or similar personal information) isn't very useful: since this is not one of the pieces of information the website holds on its users, how would the site know whether the answer was correct?

Question 8

Only a minority of people attempted this question.

In the first part, there was considerable confusion between the ways of passing data and the HTTP methods that use them (i.e. GET, POST, etc.).

In the second part, the most common fault was not providing the steps in sufficient detail.

In the third part, we had some novel and innovative environment variables mentioned that are not part of CGI.
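To restate the distinction relevant to the first and third parts: with GET the form data travels in the URL and reaches a CGI program via the QUERY_STRING environment variable, whereas with POST it travels in the request body and is read from standard input, with its size given by CONTENT_LENGTH. The sketch below (again Python, hypothetical) shows both cases:

    #!/usr/bin/env python3
    # Hypothetical CGI program showing where GET and POST data arrive
    import os
    import sys

    method = os.environ.get("REQUEST_METHOD", "GET")

    if method == "GET":
        # GET: data is appended to the URL and passed in QUERY_STRING
        data = os.environ.get("QUERY_STRING", "")
    else:
        # POST: data is sent in the request body and read from standard input
        length = int(os.environ.get("CONTENT_LENGTH", "0"))
        data = sys.stdin.read(length)

    print("Content-Type: text/plain")
    print()
    print("Method used: " + method)
    print("Raw form data: " + data)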

Question 9

This question was about sizing a server installation.

In the first part, it was remarkable how many people did not understand what hardware is, and instead provided details of a number of software parameters.

In the second part, credit was given for a wide range of operational statistics. Many people lost marks because they could only think of 2 or 3, when I was looking for at least 4 (for 8 marks).

The third part was generally well answered, with most people mentioning the strategies that had been presented in the lecture material.

Question 10

This question was about log files.

The first part was about using tools as a substitute for manual inspection. Most answers were reasonable except for those that didn't mention tools!

The second part was about misuse of log files. Some of the abuses suggested here were quite imaginative in that they used information that couldn't possibly be in a log file!
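It is worth remembering exactly what a typical access log records. Apache's standard "combined" format, for example, is defined as follows and contains nothing beyond these fields (client address, remote identity, authenticated user, time, request line, status code, bytes sent, referring page and browser user-agent):

    LogFormat "%h %l %u %t \"%r\" %>s %b \"%{Referer}i\" \"%{User-Agent}i\"" combined
    CustomLog logs/access_log combined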

The third part was about using log files as evidence in legal proceedings. The question asked for an answer that outlined an example situation where this might sensibly occur. Few answers described a realistic situation.

General exam issues

Poor handwriting: a few scripts had sentences that were illegible. Fragments that couldn't be deciphered with normal effort were marked as 0.

Only a minority of students followed the instruction on the answer book to commence each question on a new page.

Many students ignored the instruction on the answer book to insert the numbers of the questions answered on the front cover.


Last updated by Prof Jim Briggs of the School of Computing at the University of Portsmouth