Rework humanTimeUnit() formats of meters
The humanTimeUnit() function is currently used only in GPUMeter for Linux (#1288). This is my proposal for a new time format, along with the code that prints it.
Before:

```
0ns   - 999ns
1.0us - 9.9us
10us  - 999us
1.0ms - 9.9ms
10ms  - 999ms
1.0s  - 9.9s
10s   - 599s
10m   - 599m
10h   - 95h
4d    - 213503d
```
After:

```
0ns    - 9999ns
10.0us - 99.9us
100us  - 9999us
.0100s - .9999s
1.000s - 9.999s
10.00s - 59.99s
1m00s  - 59m59s
1h00m  - 23h59m
1d00h  - 99d23h
100d   - 364d
1y000d - 9y364d
10y    - 584y
```
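For reference, the proposed format table could be sketched roughly as follows. This is a standalone approximation, not the actual patch: it uses plain `snprintf` in place of htop's `xSnprintf`, and the range boundaries are inferred from the "After" table above.

```c
#include <stdio.h>

/* Sketch of a humanTimeUnit() producing the proposed "After" formats.
   All values are truncated (not rounded), matching the table boundaries. */
static int humanTimeUnit(char* buffer, size_t size, unsigned long long ns) {
   if (ns < 10000)                          /* 0ns - 9999ns */
      return snprintf(buffer, size, "%uns", (unsigned int)ns);
   if (ns < 100000)                         /* 10.0us - 99.9us */
      return snprintf(buffer, size, "%u.%uus", (unsigned int)(ns / 1000), (unsigned int)(ns / 100 % 10));
   if (ns < 10000000ULL)                    /* 100us - 9999us */
      return snprintf(buffer, size, "%uus", (unsigned int)(ns / 1000));
   if (ns < 1000000000ULL)                  /* .0100s - .9999s */
      return snprintf(buffer, size, ".%04us", (unsigned int)(ns / 100000));
   unsigned long long totalSeconds = ns / 1000000000ULL;
   if (totalSeconds < 10)                   /* 1.000s - 9.999s */
      return snprintf(buffer, size, "%u.%03us", (unsigned int)totalSeconds, (unsigned int)(ns / 1000000 % 1000));
   if (totalSeconds < 60)                   /* 10.00s - 59.99s */
      return snprintf(buffer, size, "%u.%02us", (unsigned int)totalSeconds, (unsigned int)(ns / 10000000 % 100));
   if (totalSeconds < 3600)                 /* 1m00s - 59m59s */
      return snprintf(buffer, size, "%um%02us", (unsigned int)(totalSeconds / 60), (unsigned int)(totalSeconds % 60));
   unsigned long long minutes = totalSeconds / 60;
   if (minutes < 1440)                      /* 1h00m - 23h59m */
      return snprintf(buffer, size, "%uh%02um", (unsigned int)(minutes / 60), (unsigned int)(minutes % 60));
   unsigned long long hours = minutes / 60;
   if (hours < 2400)                        /* 1d00h - 99d23h */
      return snprintf(buffer, size, "%ud%02uh", (unsigned int)(hours / 24), (unsigned int)(hours % 24));
   unsigned long long days = hours / 24;
   if (days < 365)                          /* 100d - 364d */
      return snprintf(buffer, size, "%ud", (unsigned int)days);
   if (days < 3650)                         /* 1y000d - 9y364d */
      return snprintf(buffer, size, "%uy%03ud", (unsigned int)(days / 365), (unsigned int)(days % 365));
   return snprintf(buffer, size, "%luy", (unsigned long)(days / 365)); /* 10y - 584y */
}
```

The 584y upper bound falls out of the input type: 2^64 nanoseconds is about 584.5 years, so an "inf" case is unreachable for `unsigned long long` input.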
Compared to the discussion I left in that PR, this proposal uses a maximum of 6 characters (increased from 5). The main motivation was to make the "hours + minutes" and "minutes + seconds" presentations clear and unambiguous.
A few comments:
- While writing this patch, I found a missed optimization in both GCC and Clang.
- Considering this function is used in meters rather than table cells, I think it would be better not to print padding spaces on the left. If the padding is indeed unwanted, the following change would remove it:
```diff
diff --git a/linux/GPUMeter.c b/linux/GPUMeter.c
index 6854eaf3..5614eb46 100644
--- a/linux/GPUMeter.c
+++ b/linux/GPUMeter.c
@@ -40,7 +40,7 @@ static const int GPUMeter_attributes[] = {
 static int humanTimeUnit(char* buffer, size_t size, unsigned long long totalNanoseconds) {
    if (totalNanoseconds < 10000)
-      return xSnprintf(buffer, size, "%4uns", (unsigned int)totalNanoseconds);
+      return xSnprintf(buffer, size, "%uns", (unsigned int)totalNanoseconds);

    unsigned long long value = totalNanoseconds / 100;
@@ -50,7 +50,7 @@ static int humanTimeUnit(char* buffer, size_t size, unsigned long long totalNano
    value /= 10; // microseconds

    if (value < 10000)
-      return xSnprintf(buffer, size, "%4uus", (unsigned int)value);
+      return xSnprintf(buffer, size, "%uus", (unsigned int)value);

    value /= 100;
@@ -69,22 +69,22 @@ static int humanTimeUnit(char* buffer, size_t size, unsigned long long totalNano
    value = totalSeconds;
    if (value < 3600)
-      return xSnprintf(buffer, size, "%2um%02us", (unsigned int)value / 60, (unsigned int)value % 60);
+      return xSnprintf(buffer, size, "%um%02us", (unsigned int)value / 60, (unsigned int)value % 60);

    value /= 60; // minutes
    if (value < 1440)
-      return xSnprintf(buffer, size, "%2uh%02um", (unsigned int)value / 60, (unsigned int)value % 60);
+      return xSnprintf(buffer, size, "%uh%02um", (unsigned int)value / 60, (unsigned int)value % 60);

    value /= 60; // hours
    if (value < 2400)
-      return xSnprintf(buffer, size, "%2ud%02uh", (unsigned int)value / 24, (unsigned int)value % 24);
+      return xSnprintf(buffer, size, "%ud%02uh", (unsigned int)value / 24, (unsigned int)value % 24);

    value /= 24; // days
    if (value < 365)
-      return xSnprintf(buffer, size, "%5ud", (unsigned int)value);
+      return xSnprintf(buffer, size, "%ud", (unsigned int)value);

    if (value < 3650)
       return xSnprintf(buffer, size, "%uy%03ud", (unsigned int)(value / 365), (unsigned int)(value % 365));
@@ -92,9 +92,9 @@ static int humanTimeUnit(char* buffer, size_t size, unsigned long long totalNano
    value /= 365; // years (ignore leap years)

    if (value < 100000)
-      return xSnprintf(buffer, size, "%5luy", (unsigned long)value);
+      return xSnprintf(buffer, size, "%luy", (unsigned long)value);

-   return xSnprintf(buffer, size, " inf.");
+   return xSnprintf(buffer, size, "inf");
 }

 static void GPUMeter_updateValues(Meter* this) {
```
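To illustrate what the padding question is about, here is a minimal demonstration of the two formats. It uses plain `snprintf` rather than htop's `xSnprintf`, and the helper name is invented for this example:

```c
#include <stdio.h>

/* Illustrative only: format the same nanosecond count with and without
   the "%4u" field width that the optional diff above removes. */
static void formatNs(char* padded, char* unpadded, size_t size, unsigned int ns) {
   snprintf(padded, size, "%4uns", ns);   /* "  42ns" -- right-aligned, table style */
   snprintf(unpadded, size, "%uns", ns);  /* "42ns" -- flush left, meter style */
}
```

With the field width, small values are right-aligned in a fixed column (useful in a table); without it, the meter text starts flush left with no leading blanks.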
static void GPUMeter_updateValues(Meter* this) {
Building an htop with your patch:

```
* 58524a3b - (HEAD -> main) Merge branch 'meter-human-time-unit' of htop-1 (13 minutes ago) [Daniel Lange]
|\
| * 602d4f40 - Rework humanTimeUnit() formats of meters (3 days ago) [Explorer09]
```

htop 3.4.0-dev-3.3.0-247-g58524a3

I still see "2h02:07" in the GPU_TIME column and "313043ns". What am I doing wrong?

/DLange
Well, this patch is not about the GPU_TIME column but the time values displayed in the GPU meter, so you are probably looking in the wrong place. Enable a GPU meter in the meters menu to see it.
That only shows me a usage "%". But in any case why is humanTimeUnit() not even used consistently throughout the GPU meter where it was conceived? I think you need to make it apply to at least the columns so we have something to compare before and after and see whether your changes make sense in practice.
@fasterit Sorry, but I have only an emulator at the moment. You should set the GPU meter to text mode to see the time. It's not displayed in the bar mode.
I made this PR as a follow-up to #1288. The humanTimeUnit() function only applies to the meter, because Row_printTime() (#1325) and Row_printNanoseconds() (#1425) handle the time printing in the columns. The code of Row_printNanoseconds() and humanTimeUnit() is not shared.
During the review of #1606, I noticed that the totalGPUTimeDiff display needs a conditional: if the total GPU time is not available on a machine, GPUMeter_display should hide that field entirely. This would be a separate commit from the humanTimeUnit() changes.
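The shape of that conditional might look something like the following. This is purely a hypothetical sketch: the helper name, the output-string handling, and the negative-value sentinel for "not available" are all invented for illustration and do not reflect htop's actual GPUMeter_display API.

```c
#include <stdio.h>
#include <string.h>

/* Hypothetical sketch: append a GPU-time field to a meter text only when
   the platform actually reported a total GPU time. A negative value is
   assumed here to mean "not available"; htop's real sentinel may differ. */
static void appendGpuTime(char* out, size_t size, long long totalGPUTimeDiff) {
   if (totalGPUTimeDiff < 0)
      return; /* value unavailable: hide the field entirely */
   size_t len = strlen(out);
   snprintf(out + len, size - len, " time %lldns", totalGPUTimeDiff);
}
```

The point is simply that the field is skipped, not rendered as a placeholder, when the machine provides no total GPU time.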