I've encountered a checkpoint in the code that issues a warning for large `nseg` values: "Warning, too many segments chosen, falling back to nseg = {}". This check appears designed to keep at least 100 frames per segment for the longest timestep.
However, under the default usage, where only `m = 20` values are used for the GLS calculation, the rationale behind this limitation isn't entirely clear. Is there a theoretical foundation for this constraint? If not, it might be worth allowing larger `nseg` values to improve statistical accuracy from the same dataset.
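For concreteness, here is a minimal sketch of the kind of guard being described; the function name, parameters, and the 100-frame threshold are assumptions for illustration, not the package's actual API:

```python
def choose_nseg(n_frames, nseg, min_frames_per_seg=100):
    """Hypothetical sketch: cap nseg so each segment keeps at least
    min_frames_per_seg frames, falling back when nseg is too large."""
    max_nseg = n_frames // min_frames_per_seg
    if nseg > max_nseg:
        # Mirrors the warning quoted above
        print(f"Warning, too many segments chosen, falling back to nseg = {max_nseg}")
        return max_nseg
    return nseg
```

For example, with a 1000-frame trajectory this sketch would cap `nseg` at 10, even though a GLS fit using only `m = 20` lag values might tolerate shorter segments.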
I'm curious to hear your thoughts on this.
Thank you for your time and consideration.