fix: uptime RPC returns 0 on first call

The monotonic uptime fix (#34328) used a function-local static for `g_startup_time`, which was initialized on the first `GetUptime()` call rather than at application startup.
This caused the first `uptime()` call to always return 0.

Move `g_startup_time` to namespace scope so it is initialized at program start, ensuring the first call returns the actual elapsed time. Note that `static` is no longer needed: the unnamed namespace already gives the variable internal linkage, so it is only visible in this translation unit.

The test was updated to simulate some work before the first call.

Co-authored-by: Carlo Antinarella <carloantinarella@users.noreply.github.com>
Lőrinc
2026-01-28 23:45:33 +01:00
parent a6cdc3ec9b
commit e67a676df9
2 changed files with 9 additions and 6 deletions


@@ -127,8 +127,8 @@ std::optional<size_t> GetTotalRAM()
     return std::nullopt;
 }
-SteadyClock::duration GetUptime()
-{
-    static const auto g_startup_time{SteadyClock::now()};
-    return SteadyClock::now() - g_startup_time;
-}
+namespace {
+const auto g_startup_time{SteadyClock::now()};
+} // namespace
+SteadyClock::duration GetUptime() { return SteadyClock::now() - g_startup_time; }


@@ -26,8 +26,11 @@ class UptimeTest(BitcoinTestFramework):
         assert_raises_rpc_error(-8, "Mocktime must be in the range [0, 9223372036], not -1.", self.nodes[0].setmocktime, -1)
 
     def _test_uptime(self):
-        wait_time = 20_000
+        time.sleep(1)  # Do some work before checking uptime
         uptime_before = self.nodes[0].uptime()
+        assert uptime_before > 0, "uptime should begin at app start"
+
+        wait_time = 20_000
         self.nodes[0].setmocktime(int(time.time()) + wait_time)
         uptime_after = self.nodes[0].uptime()
         self.nodes[0].setmocktime(0)