Introduce a new FlushMode in the interest of not creating too many batches #14060
Conversation
@fluid-example/bundle-size-tests: -606 Bytes
Baseline commit: 2aebc74
@@ -96,6 +96,7 @@ export type FluidDataStoreRegistryEntry = Readonly<Partial<IProvideFluidDataStor

 // @public
 export enum FlushMode {
+    Async = 2,
I think you can have most of these changes in main, with the exception of this change in the enum.
I.e. you can easily have
export enum FlushModeEx {
Async = 2,
};
and use it in main, and merge FlushModeEx & FlushMode in next
How would that work? Can you expand a bit? The runtime would still expect FlushMode | FlushModeEx in its APIs, right? Isn't that still a back-compat breaking change?
No, everywhere in the code I'd keep FlushMode alone.
If the caller (Loop) wants to experiment, they can do FlushModeEx.timeout as FlushMode.
And the only place where we need to do the cast is the switch:
switch (mode) {
    case FlushMode.Immediate: ...
    case FlushModeEx.timeout as FlushMode:
}
This adds the ability to push most of the code into main (which is a good thing from all points of view), without officially changing the API, while still allowing experimentation (if we want to).
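To make that concrete, here is a minimal sketch of the FlushModeEx approach, assuming FlushMode's existing members are Immediate and TurnBased (their order and numeric values here are placeholders) and that a double cast through unknown is used to satisfy the compiler when converting between the two enums; applyFlushMode is an invented stand-in for wherever the runtime dispatches on the mode:

// Public enum in main -- unchanged (member order/values assumed).
export enum FlushMode {
    Immediate,
    TurnBased,
}

// Experimental enum, not part of the official API surface.
export enum FlushModeEx {
    Async = 2,
}

// Invented dispatch helper for illustration only.
function applyFlushMode(mode: FlushMode): void {
    switch (mode) {
        case FlushMode.Immediate:
            // flush each op as soon as it is produced
            break;
        case FlushMode.TurnBased:
            // flush accumulated ops in a microtask at the end of the turn
            break;
        case FlushModeEx.Async as unknown as FlushMode:
            // experimental: flush accumulated ops in a scheduled macrotask
            break;
    }
}

// An experimenting caller opts in with an explicit cast; the official
// FlushMode type and the runtime's public API stay unchanged.
applyFlushMode(FlushModeEx.Async as unknown as FlushMode);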
I will prepare the change for main
Ok, so the plan is to merge #14106 first, then wait for the main->next integration, and afterwards I'll re-adjust this PR to accommodate the changes from main. My goal is to avoid merge conflicts while also making sure that next uses the proper FlushMode.
Before we add this experiment, should we check that current applications will even use this? Otherwise, we might not get any data showing whether it even works.
…tches (#14106)

## Description

When the client uses async code, there is a risk of creating too many batches (enough to trigger server-side throttling), because with FlushMode.TurnBased the flush is scheduled as a microtask. This change allows flushing to happen in a scheduled macrotask, capturing all the ops from all microtasks and bundling them into a single batch.

**Note: this ability is not part of the public API. The runtime does not accept this new flush mode without explicit casting.**

This change is #14060 but for `main`.
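Since the new flush mode is deliberately kept out of the public API, an experimenting caller has to opt in with an explicit cast. The sketch below is only an illustration of what that opt-in could look like: the raw value 2 comes from the Async = 2 entry in the diff above, while the option object and its flushMode field are assumptions rather than the actual runtime options surface.

import { FlushMode } from "@fluidframework/runtime-definitions";

// The experimental mode is not exported, so the caller casts the raw value.
// Treat both the value and the option name as assumptions.
const experimentalAsyncFlush = 2 as FlushMode;

// Hypothetical runtime options object -- the real option name may differ.
const runtimeOptions = {
    flushMode: experimentalAsyncFlush,
};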
This PR has been automatically marked as stale because it has had no activity for 60 days. It will be closed if no further activity occurs within 8 days of this comment. Thank you for your contributions to Fluid Framework!
Description

When the client uses async code, there is a risk of creating too many batches (enough to trigger server-side throttling), because with FlushMode.TurnBased the flush is scheduled as a microtask. This change allows flushing to happen in a scheduled macrotask, capturing all the ops from all microtasks and bundling them into a single batch.
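The scheduling difference can be illustrated with a small, self-contained sketch. None of these names come from the Fluid Framework runtime; the op queue, submitOp, and flush are invented for illustration only.

type Op = { contents: unknown };

const pendingOps: Op[] = [];
let flushScheduled = false;

function submitOp(op: Op, useMacrotaskFlush: boolean): void {
    pendingOps.push(op);
    if (flushScheduled) {
        return;
    }
    flushScheduled = true;
    if (useMacrotaskFlush) {
        // Macrotask: runs after all pending microtasks (awaited async code)
        // have finished, so ops produced across awaits land in one batch.
        setTimeout(flush, 0);
    } else {
        // Microtask: runs before the continuation of the next `await`,
        // so ops produced after that await end up in a new batch.
        queueMicrotask(flush);
    }
}

function flush(): void {
    flushScheduled = false;
    const batch = pendingOps.splice(0, pendingOps.length);
    console.log(`flushed a batch of ${batch.length} op(s)`);
}

// Two ops submitted across an `await` boundary:
async function produceOps(useMacrotaskFlush: boolean): Promise<void> {
    submitOp({ contents: "op1" }, useMacrotaskFlush);
    await Promise.resolve(); // microtask flush fires before this continuation
    submitOp({ contents: "op2" }, useMacrotaskFlush);
}

// produceOps(false) -> two batches of 1 op each (microtask flush).
// produceOps(true)  -> one batch of 2 ops (macrotask flush).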