Conversation

NShahri commented on Nov 1, 2025

Why #222 Should Be Reverted

  1. Real-World Applications Have Complex Resolver Patterns
    In large-scale applications, many resolvers fetch different entities with varying keys and in unpredictable order. It is unrealistic to expect all keys for a given entity to be requested within the same tick. Designing the batching logic around that assumption leads to brittle behavior and missed optimization opportunities.
  2. Excessive Promise Creation Harms Performance
    This implementation creates new Promise instances even when they are not strictly necessary. In large-scale Node.js applications, that adds up to significant performance degradation: the event loop and garbage collector slow down dramatically when thousands of unresolved promises accumulate. Reusing one pending promise per key instead of allocating new ones avoids that pressure (see the sketch after this list).
  3. Batching Behavior Should Be Internally Managed and Predictable
    Batching is an internal optimization detail of the module and should not rely on external timing guarantees. If some keys are already waiting to be loaded in the current tick, the module should proactively batch them, even if more keys arrive in the next tick. This keeps data loading consistent and efficient regardless of when callers happen to request keys.

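To make points 2 and 3 concrete, here is a minimal sketch of a batching loader in TypeScript. It is not this module's actual implementation; the `SimpleBatcher` name, the `batchLoadFn` callback, and the microtask-based dispatch are illustrative assumptions. The point is that repeated requests for the same key share one pending promise instead of allocating new ones, and whatever keys are queued when the batch dispatches are loaded together, while keys that arrive later simply start the next batch.

```ts
// Minimal, hypothetical sketch of a batching loader (not this module's real code).
// Assumes a DataLoader-style batch function that maps an array of keys to values in order.
type BatchLoadFn<K, V> = (keys: readonly K[]) => Promise<V[]>;

class SimpleBatcher<K, V> {
  // One pending promise per key, shared by every caller that asks for that key.
  private pending = new Map<K, Promise<V>>();
  private resolvers = new Map<K, (value: V) => void>();
  private rejecters = new Map<K, (error: unknown) => void>();
  private scheduled = false;

  constructor(private batchLoadFn: BatchLoadFn<K, V>) {}

  load(key: K): Promise<V> {
    // Reuse the existing promise if the key is already queued: no extra Promise allocation.
    const existing = this.pending.get(key);
    if (existing) return existing;

    const promise = new Promise<V>((resolve, reject) => {
      this.resolvers.set(key, resolve);
      this.rejecters.set(key, reject);
    });
    this.pending.set(key, promise);
    this.scheduleDispatch();
    return promise;
  }

  private scheduleDispatch(): void {
    if (this.scheduled) return;
    this.scheduled = true;
    // Every key queued before this microtask runs joins the current batch;
    // keys requested later simply schedule the next batch.
    queueMicrotask(() => void this.dispatch());
  }

  private async dispatch(): Promise<void> {
    this.scheduled = false;
    const keys = [...this.pending.keys()];
    const resolvers = this.resolvers;
    const rejecters = this.rejecters;
    this.pending = new Map();
    this.resolvers = new Map();
    this.rejecters = new Map();

    try {
      const values = await this.batchLoadFn(keys);
      keys.forEach((key, i) => resolvers.get(key)?.(values[i]));
    } catch (error) {
      keys.forEach((key) => rejecters.get(key)?.(error));
    }
  }
}

// Usage: both loads of "a" in the same tick share one promise and one batch call.
const batcher = new SimpleBatcher(async (keys: readonly string[]) =>
  keys.map((k) => `value-for-${k}`),
);
void Promise.all([batcher.load("a"), batcher.load("b"), batcher.load("a")]).then(
  (values) => console.log(values), // ["value-for-a", "value-for-b", "value-for-a"]
);
```

Whether the batch window is a microtask, a timer, or something the module controls entirely is a design choice; the sketch only shows that batching can be scheduled internally without requiring every key to arrive in the same tick.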
NShahri force-pushed the remove-extra-promises branch from 3e031f2 to 67ae31e on November 12, 2025 09:33
NShahri force-pushed the remove-extra-promises branch from 22d9c02 to 7ca8681 on November 12, 2025 10:09