Comment by Cricket_Initial on 03/08/2020 at 01:54 UTC

6 upvotes, 0 direct replies (showing 0)

View submission: Celer Joins Reddit Great Scaling Bakeoff: Exactly Matching Use Cases, in Production with Happy Users

This looks very interesting and promising! A few questions though:

1. Can you share the sources for these benchmarks, and how much data the commit chains handle under said benchmarking?

2. Can you elaborate on how exactly the backbone linking is achieved, and how it does or doesn't affect performance?

> Each node will serve users in its “proximity”, and across different subreddits where tokens can be shared, nodes can be connected simply via some “backbone” links into a single, interoperable network.

If some token is accounted for on one layer-2 node and said token is routed to another layer-2 node that's N hops away, when the recipient wants to withdraw the token from the layer-2 network, how do the end and intermediary layer-2 nodes involved in processing this flow behave?
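To make the scenario in this question concrete, here is a minimal sketch of a multi-hop relay (all node names, balances, and the `route_payment` helper are hypothetical illustrations, not Celer's actual protocol):

```python
# Hypothetical sketch: a token accounted for on one layer-2 node is
# routed N hops to another node via "backbone" channel links.

def route_payment(nodes, amount):
    """Relay `amount` hop by hop; each adjacent pair of nodes updates
    the off-chain channel balance they share."""
    channels = {}
    for sender, receiver in zip(nodes, nodes[1:]):
        channels[(sender, receiver)] = amount  # sender owes `amount` downstream
    return channels

# A token on node A routed 3 hops to node D:
hops = ["A", "B", "C", "D"]
channels = route_payment(hops, amount=10)
# channels == {("A", "B"): 10, ("B", "C"): 10, ("C", "D"): 10}
```

The open question is then what each party does at withdrawal time: presumably only the recipient's node settles on-chain while intermediaries net out their channel balances off-chain, but that is exactly what the question above asks the authors to confirm.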

3. How does this relate to the state storage cost *per channel* for a user? IIUC if a user ends up having many channels, the subscription cost would increase linearly with their number of channels which I assume isn't very appealing for most users (unless it's *negligibly* cheap).

> Each user’s state proof is around 200 bytes no matter how many transactions a user sends, as the state is constantly updated to the newest state instead of appending new states.
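For what it's worth, the constant-size claim in that quote can be sketched as follows (the `ChannelState` type and field names are my illustrative assumptions, not the actual Celer data structure):

```python
# Sketch of constant-size state updates: each transfer overwrites the
# previously signed state rather than appending to a log, so the proof
# a user must store stays roughly fixed in size per channel.

from dataclasses import dataclass

@dataclass
class ChannelState:
    seq: int      # monotonically increasing sequence number
    balance: int  # latest balance; older states are discarded

def apply_transfer(state: ChannelState, delta: int) -> ChannelState:
    # The newest state supersedes all prior ones.
    return ChannelState(seq=state.seq + 1, balance=state.balance + delta)

state = ChannelState(seq=0, balance=100)
for delta in (-10, +5, -20):
    state = apply_transfer(state, delta)
# After any number of transfers, the stored state is still one record,
# which is why per-transaction history doesn't grow the proof -- but the
# question above is about cost scaling with the *number of channels*.
```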
