feat: add new move count metric #1072
Conversation
I'm leaving a review before I leave for PTO.
It's OK to merge once my comments are resolved, provided that someone else reviews the final PR as well.
I couldn't come up with a useful sub-metric for the move count metric. Do you have any suggestions?
For the benchmark report, it makes sense to me to present it as "Move Calculation Speed". On the other hand, when defining the termination configuration, I think using only the count is better.
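A count-based termination along those lines might be configured roughly as sketched below. This is an illustrative assumption, not the PR's actual config: the `moveCountLimit` element name and its placement are hypothetical, so check the solver's documented termination properties for the real name.

```xml
<solver>
  <termination>
    <!-- Hypothetical sketch: stop after a fixed number of evaluated moves.
         The element name moveCountLimit is assumed, not confirmed by this PR. -->
    <moveCountLimit>1000000</moveCountLimit>
  </termination>
</solver>
```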
@triceo, I see some possible points of discussion beside the ones you may find during the review:
I couldn't find any useful sub-metric for the move count. We already have the move count per step.
I initially included it but then removed it. My point is that it may confuse the user as it is the same number when not using R&R moves. I believe that the benchmark should be used to compare move speeds, and the score calculation speed should still be the primary metric.
I considered adding
I believe the primary metric remains score calculation speed, and we should delegate the comparison of move speed to the benchmark module. With that in mind, the performance page will not change significantly beyond adding a reference to the new metric. It doesn't make sense to me to compare a configuration using R&R with one that doesn't; hence, when comparing two configurations that both use R&R, score calculation speed still makes sense for evaluating performance.
I am okay with the move evaluation speed name, but I am concerned that one of the tests, where Ruin moves are not used, has different values for the score calculation count and the move evaluation count (which smells like a bug). Otherwise LGTM.
Some other metrics can track their values per-move. So, for example, you could have "move execution count for change moves" etc. This is what I mean. There is precedent.
Good point on it being the same number in many cases.
I do think it is worth the effort, for the same reason as I said above. Score calculation count is not stable; its definition changes based on the moves used. Move execution count is stable and always means the same thing.
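The per-move-type idea mentioned above ("move execution count for change moves") can be sketched with plain JDK types. This is a minimal illustration, not the solver's implementation: the class and move-type names here are made up, and the real project would presumably expose such counts through its metrics registry (tagged per move type) rather than a hand-rolled map.

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.atomic.LongAdder;

// Hypothetical sketch: count executed moves per move type, similar in spirit
// to tagging a metric with the move's class name. Not actual solver code.
public class MoveCountRegistry {

    private final Map<String, LongAdder> countByMoveType = new ConcurrentHashMap<>();

    // Record one executed move of the given type (e.g. "ChangeMove").
    public void record(String moveType) {
        countByMoveType.computeIfAbsent(moveType, k -> new LongAdder()).increment();
    }

    // Count for a single move type; 0 if that type was never recorded.
    public long count(String moveType) {
        LongAdder adder = countByMoveType.get(moveType);
        return adder == null ? 0L : adder.sum();
    }

    // Total across all move types; stable regardless of which moves are used.
    public long total() {
        return countByMoveType.values().stream().mapToLong(LongAdder::sum).sum();
    }

    public static void main(String[] args) {
        MoveCountRegistry registry = new MoveCountRegistry();
        registry.record("ChangeMove");
        registry.record("ChangeMove");
        registry.record("SwapMove");
        System.out.println(registry.count("ChangeMove")); // 2
        System.out.println(registry.total()); // 3
    }
}
```

The appeal of such a breakdown is exactly the stability argument above: each per-type counter means the same thing no matter which move selectors are configured.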
LGTM after the suggested changes are applied.
There was something still bugging me about this PR, so I opened the IDE and looked at it, and now I know what it was. Disabling metric collection is a very specific thing: it only ever exists for RnR. Yet it proliferated everywhere, to every phase and to every step. I fixed it by only dealing with disabling metric collection when already inside RnR. If we ever need it in more phases, we can generalize it, but right now the correct decision was to not introduce this anywhere but RnR. I have similarly updated the enterprise PR, which no longer needs to check whether metrics are enabled. As long as RnR doesn't trigger any metrics, nobody else (not LS, not PS, ...) needs to know anything about metrics being enabled or disabled. Please take a look and check my thinking.
Looks good to me!
Quality Gate passed.
This pull request adds a new metric for counting the number of moves, which differs from the score calculation count when R&R moves are used.