
Storing time taken to evaluate submissions

Open · donnm opened this issue 7 years ago • 1 comment

This is more of a general question because I couldn't find anything in the documentation.

What is the best practice for timing student-submitted code? My goal is to put code timings on a scoreboard to highlight the most efficient (with respect to time) submissions. In my case submissions are currently in Python, but this is applicable to any language.

I see a few alternatives:

  • Run student code from a Python template that internally times the submission. The disadvantage is that we only get user time here (a sketch of this approach follows the list).
  • Time from the run script using /usr/bin/time installed in the container. This lets us see the difference between system and user time.
  • Record the time using date in the run script. This is simple, but the problem is that we also count the time needed to start up the Docker container.
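For illustration, here is a minimal sketch of the first alternative using only the Python standard library. The file name `student.py`, the timeout, and the way the result is printed are placeholders rather than INGInious conventions; note that `resource.getrusage(RUSAGE_CHILDREN)` actually gives both user and system CPU time of the child process, in addition to the wall-clock time.

```python
# Sketch: grading-side wrapper that runs the student's script as a child
# process and records wall-clock time plus the child's user/system CPU time.
# "student.py" and the 60 s timeout are assumptions for the example.
import resource
import subprocess
import time

def time_student_code(script="student.py"):
    before = resource.getrusage(resource.RUSAGE_CHILDREN)
    start = time.perf_counter()

    subprocess.run(["python3", script], check=True, timeout=60)

    wall = time.perf_counter() - start
    after = resource.getrusage(resource.RUSAGE_CHILDREN)

    return {
        "wall": wall,                              # real (elapsed) time
        "user": after.ru_utime - before.ru_utime,  # CPU time in user mode
        "system": after.ru_stime - before.ru_stime,  # CPU time in kernel mode
    }

if __name__ == "__main__":
    timings = time_student_code()
    print("wall={wall:.3f}s user={user:.3f}s system={system:.3f}s".format(**timings))
```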

Is there some INGInious standard for doing this?

donnm · Feb 12 '18 13:02

I would eliminate the last option, as it will not give accurate results. Beyond that, it depends on whether you are interested in system time (or did you mean real time?), but I think system time will be affected by other submissions being graded in parallel on the same machine, since they share the same kernel.

Currently, INGInious does not log execution time, mainly because it cannot know exactly what should be measured, so a scoreboard based on a custom field output by the container is a good workaround.
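A hedged sketch of that workaround, assuming the container exposes a feedback helper along the lines of `feedback.set_custom_value` (the exact module path and function names depend on the INGInious version and container API, so treat them as assumptions to verify):

```python
# Sketch: run script that times the student process and reports the
# measurement as a custom feedback field for a scoreboard to read.
# The "from inginious import feedback" import and set_custom_value call are
# assumptions about the container-side API; adjust to your INGInious version.
import subprocess
import time

from inginious import feedback  # assumed container-side helper module

start = time.perf_counter()
subprocess.run(["python3", "student.py"], timeout=60)  # file name is a placeholder
elapsed = time.perf_counter() - start

# Grading and result handling are omitted here; only the timing is reported.
feedback.set_custom_value("execution_time", round(elapsed, 3))  # key name is arbitrary
```

The custom value should then be stored alongside the submission, where a scoreboard plugin can pick it up.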

That said, it would be a nice feature to be able to store the measured time "natively" in the database, either by telling INGInious when to start and stop measuring or by reporting the measured time directly.

anthonygego · Feb 23 '18 07:02