
Conversation

@rustyrussell
Contributor

@Lagrang3 pointed out that our singly linked list of layers will become a bottleneck, so this adds a quick benchmark for creating many layers, and replaces the linked list with a hashtable.

Changelog-None
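
For intuition, here is a minimal sketch of the difference, assuming layers are looked up by a unique name (as the askrene-* RPC parameters suggest); this is not the actual askrene code, and the struct/function names are made up. With a singly linked list, finding a layer by name walks every existing layer, so any per-create lookup (e.g. rejecting a duplicate name) makes creating N layers O(N²) overall; hashing the name into a bucket keeps each lookup close to O(1):

```c
#include <stddef.h>
#include <stdint.h>
#include <string.h>

struct layer {
	struct layer *next;	/* next in list, or next in bucket chain */
	const char *name;	/* unique layer name */
};

/* Old scheme: finding a layer by name walks the whole list, so any
 * per-create lookup makes creating N layers O(N^2) overall. */
static struct layer *list_find(struct layer *head, const char *name)
{
	for (struct layer *l = head; l; l = l->next)
		if (strcmp(l->name, name) == 0)
			return l;
	return NULL;
}

/* New scheme (sketched with a fixed-size chained table; a real table
 * would resize as it grows): hash the name, then only compare the
 * entries that land in the same bucket. */
#define NBUCKETS 65536

struct layer_table {
	struct layer *bucket[NBUCKETS];
};

static size_t hash_name(const char *name)
{
	uint64_t h = 14695981039346656037ULL;	/* FNV-1a, just as an example hash */
	for (const char *p = name; *p; p++)
		h = (h ^ (uint8_t)*p) * 1099511628211ULL;
	return h % NBUCKETS;
}

static void table_add(struct layer_table *t, struct layer *l)
{
	size_t b = hash_name(l->name);
	l->next = t->bucket[b];
	t->bucket[b] = l;
}

static struct layer *table_find(const struct layer_table *t, const char *name)
{
	for (struct layer *l = t->bucket[hash_name(name)]; l; l = l->next)
		if (strcmp(l->name, name) == 0)
			return l;
	return NULL;
}
```

(The real change presumably reuses the project's existing CCAN htable helpers rather than a hand-rolled fixed-size table like this.)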

Simple bench.

Before:
   Creating 20,000 layers:  32 seconds

After:
   Creating 20,000 layers:  13 seconds
   Creating 50,000 layers:  30 seconds
   Creating 100,000 layers: 57 seconds

Signed-off-by: Rusty Russell <rusty@rustcorp.com.au>
…ers on startup.

We used non-persistent layers before, but what if we save to the datastore and restore?

This takes creating 100,000 layers from 57 to 87 seconds.

Signed-off-by: Rusty Russell <rusty@rustcorp.com.au>
@rustyrussell rustyrussell added this to the v26.03 milestone Dec 5, 2025
@Lagrang3
Collaborator

Lagrang3 commented Jan 6, 2026

Nice.
On the benchmark results I just want to point out that this only measures the time it takes to create X many layers.
Adding a new element is O(1) for a linked list as well as for the hashtable, though the hashtable no doubt has a higher cost due to the hashing (and possibly string comparisons in case of collisions?).
The amount of time surprises me: about a minute for just 100k layers!?
My guess is that most of the time is spent in IO, since it is plain text and goes through pyln -> lightningd -> askrene and back.

I would like to see a benchmark to measure the time we spend fetching layers.
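
One cheap way to split the data-structure cost from the IO cost would be a stand-alone micro-benchmark of the old list behaviour with no RPC or plain-text encoding at all: if scanning a linked list for a duplicate name on every create accounts for only a small fraction of the minute reported above, that would support the IO hypothesis. A rough, hypothetical sketch (the duplicate-name check and all names here are assumptions, not taken from the PR):

```c
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <time.h>

#define NLAYERS 100000

struct layer {
	struct layer *next;
	char name[32];
};

int main(void)
{
	struct layer *head = NULL;
	struct timespec start, end;

	clock_gettime(CLOCK_MONOTONIC, &start);
	for (int i = 0; i < NLAYERS; i++) {
		char name[32];
		snprintf(name, sizeof(name), "layer-%d", i);

		/* Old scheme, assumed: scan every existing layer to
		 * reject a duplicate name before creating. */
		for (struct layer *l = head; l; l = l->next)
			if (strcmp(l->name, name) == 0)
				abort();

		struct layer *l = malloc(sizeof(*l));
		if (!l)
			abort();
		strcpy(l->name, name);
		l->next = head;
		head = l;
	}
	clock_gettime(CLOCK_MONOTONIC, &end);

	printf("created %d layers in %.2f s (in-memory list scan only)\n",
	       NLAYERS,
	       (end.tv_sec - start.tv_sec)
	       + (end.tv_nsec - start.tv_nsec) / 1e9);
	return 0;
}
```

A matching end-to-end number could then come from timing the same loop through pyln, so the difference would be attributable to the RPC path rather than the data structure.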
