Commit: Fix broken links (#2720)
Summary:
## Motivation

The recent website upgrade moved the location of the tutorials and the API reference, breaking existing links to them. Here we fix those links and also configure Docusaurus to raise an error on broken links in the future.
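The Docusaurus behavior described above is controlled by its documented `onBrokenLinks` option; a minimal sketch of what the relevant configuration might look like (the two `onBroken*` option names are Docusaurus's own, the surrounding fields are illustrative):

```javascript
// docusaurus.config.js -- illustrative sketch, not the project's actual config.
// Setting onBrokenLinks / onBrokenMarkdownLinks to 'throw' makes
// `docusaurus build` fail when it encounters a broken internal link,
// instead of merely logging a warning.
module.exports = {
  title: 'BoTorch',
  url: 'https://botorch.org',
  baseUrl: '/',
  onBrokenLinks: 'throw',          // fail the production build on broken links
  onBrokenMarkdownLinks: 'throw',  // same for links inside markdown files
};
```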

Pull Request resolved: #2720

Test Plan:
Docusaurus checks for broken links when creating a production build. Running `./scripts/build_docs.sh -b` now results in a clean build with no broken links reported.

## Related PRs
- #2715

Reviewed By: saitcakmak, Balandat

Differential Revision: D69034874

Pulled By: CristianLara

fbshipit-source-id: 3a3c9488a6bb1c0a21d0cbb854972e84432eb467
CristianLara authored and facebook-github-bot committed Feb 3, 2025
1 parent 37f3f7d commit b197bf1
Showing 36 changed files with 118 additions and 118 deletions.
2 changes: 1 addition & 1 deletion README.md
@@ -130,7 +130,7 @@ pip install -e ".[dev, tutorials]"

Here's a quick run down of the main components of a Bayesian optimization loop.
For more details see our [Documentation](https://botorch.org/docs/introduction) and the
-[Tutorials](https://botorch.org/tutorials).
+[Tutorials](https://botorch.org/docs/tutorials).

1. Fit a Gaussian Process model to data
```python
# ... (remainder of this hunk is collapsed in the diff view)
```
4 changes: 2 additions & 2 deletions botorch/models/cost.py
@@ -9,7 +9,7 @@
Cost are useful for defining known cost functions when the cost of an evaluation
is heterogeneous in fidelity. For a full worked example, see the
-`tutorial <https://botorch.org/tutorials/multi_fidelity_bo>`_ on continuous
+`tutorial <https://botorch.org/docs/tutorials/multi_fidelity_bo>`_ on continuous
multi-fidelity Bayesian Optimization.
"""

@@ -29,7 +29,7 @@ class AffineFidelityCostModel(DeterministicModel):
cost = fixed_cost + sum_j weights[j] * X[fidelity_dims[j]]
For a full worked example, see the
-`tutorial <https://botorch.org/tutorials/multi_fidelity_bo>`_ on continuous
+`tutorial <https://botorch.org/docs/tutorials/multi_fidelity_bo>`_ on continuous
multi-fidelity Bayesian Optimization.
Example:
2 changes: 1 addition & 1 deletion botorch/models/gp_regression_fidelity.py
@@ -8,7 +8,7 @@
Multi-Fidelity Gaussian Process Regression models based on GPyTorch models.
For more on Multi-Fidelity BO, see the
-`tutorial <https://botorch.org/tutorials/discrete_multi_fidelity_bo>`__.
+`tutorial <https://botorch.org/docs/tutorials/discrete_multi_fidelity_bo>`__.
A common use case of multi-fidelity regression modeling is optimizing a
"high-fidelity" function that is expensive to simulate when you have access to
4 changes: 2 additions & 2 deletions botorch/models/transforms/input.py
@@ -1242,7 +1242,7 @@ class AppendFeatures(InputTransform):
`RiskMeasureMCObjective` to optimize risk measures as described in
[Cakmak2020risk]_. A tutorial notebook implementing the rhoKG acqusition
function introduced in [Cakmak2020risk]_ can be found at
-https://botorch.org/tutorials/risk_averse_bo_with_environmental_variables.
+https://botorch.org/docs/tutorials/risk_averse_bo_with_environmental_variables.
The steps for using this to obtain samples of a risk measure are as follows:
@@ -1505,7 +1505,7 @@ class InputPerturbation(InputTransform):
on optimizing risk measures.
A tutorial notebook using this with `qNoisyExpectedImprovement` can be found at
-https://botorch.org/tutorials/risk_averse_bo_with_input_perturbations.
+https://botorch.org/docs/tutorials/risk_averse_bo_with_input_perturbations.
"""

is_one_to_many: bool = True
12 changes: 6 additions & 6 deletions docs/acquisition.md
@@ -9,7 +9,7 @@ black box function.

BoTorch supports both analytic as well as (quasi-) Monte-Carlo based acquisition
functions. It provides a generic
-[`AcquisitionFunction`](../api/acquisition.html#acquisitionfunction) API that
+[`AcquisitionFunction`](https://botorch.readthedocs.io/en/latest/acquisition.html#botorch.acquisition.acquisition.AcquisitionFunction) API that
abstracts away from the particular type, so that optimization can be performed
on the same objects.

@@ -64,7 +64,7 @@ where $\mu(X)$ is the posterior mean of $f$ at $X$, and $L(X)L(X)^T = \Sigma(X)$
is a root decomposition of the posterior covariance matrix.

All MC-based acquisition functions in BoTorch are derived from
-[`MCAcquisitionFunction`](../api/acquisition.html#mcacquisitionfunction).
+[`MCAcquisitionFunction`](https://botorch.readthedocs.io/en/latest/acquisition.html#botorch.acquisition.monte_carlo.MCAcquisitionFunction).

Acquisition functions expect input tensors $X$ of shape
$\textit{batch\_shape} \times q \times d$, where $d$ is the dimension of the
@@ -122,15 +122,15 @@ above.

BoTorch also provides implementations of analytic acquisition functions that
do not depend on MC sampling. These acquisition functions are subclasses of
-[`AnalyticAcquisitionFunction`](../api/acquisition.html#analyticacquisitionfunction)
+[`AnalyticAcquisitionFunction`](https://botorch.readthedocs.io/en/latest/acquisition.html#botorch.acquisition.analytic.AnalyticAcquisitionFunction)
and only exist for the case of a single candidate point ($q = 1$). These
include classical acquisition functions such as Expected Improvement (EI),
Upper Confidence Bound (UCB), and Probability of Improvement (PI). An example
-comparing [`ExpectedImprovement`](../api/acquisition.html#expectedimprovement),
+comparing [`ExpectedImprovement`](https://botorch.readthedocs.io/en/latest/acquisition.html#botorch.acquisition.analytic.ExpectedImprovement),
the analytic version of EI, to its MC counterpart
-[`qExpectedImprovement`](../api/acquisition.html#qexpectedimprovement)
+[`qExpectedImprovement`](https://botorch.readthedocs.io/en/latest/acquisition.html#botorch.acquisition.monte_carlo.qExpectedImprovement)
can be found in
-[this tutorial](../tutorials/compare_mc_analytic_acquisition).
+[this tutorial](tutorials/compare_mc_analytic_acquisition).

Analytic acquisition functions allow for an explicit expression in terms of the
summary statistics of the posterior distribution at the evaluated point(s).
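For contrast with these closed-form expressions, recall how the MC approach earlier in this file uses the root decomposition $L(X)L(X)^T = \Sigma(X)$: posterior samples are drawn via the standard reparameterization (this is the textbook construction consistent with the definitions above, shown here for reference):

```latex
f(X) \approx \mu(X) + L(X)\,\epsilon_i, \qquad \epsilon_i \sim \mathcal{N}(0, I),
```

and the MC acquisition value is an average of the utility evaluated at these samples, whereas analytic acquisition functions evaluate a closed-form function of $\mu(X)$ and $\Sigma(X)$ directly.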
12 changes: 6 additions & 6 deletions docs/batching.md
@@ -19,7 +19,7 @@ referred to as q-Acquisition Functions. For instance, BoTorch ships with support
for q-EI, q-UCB, and a few others.

As discussed in the
-[design philosophy](design_philosophy#batching-batching-batching),
+[design philosophy](/docs/design_philosophy#parallelism-through-batched-computations),
BoTorch has adopted the convention of referring to batches in the
batch-acquisition sense as "q-batches", and to batches in the torch
batch-evaluation sense as "t-batches".
@@ -35,9 +35,9 @@ with samples from the posterior in a consistent fashion.

#### Batch-Mode Decorator

-In order to simplify the user-facing API for evaluating acquisition functions,
+In order to simplify the user-facing API for evaluating acquisition functions,
BoTorch implements the
-[`@t_batch_mode_transform`](../api/utils.html#botorch.utils.transforms.t_batch_mode_transform)
+[`@t_batch_mode_transform`](https://botorch.readthedocs.io/en/latest/utils.html#botorch.utils.transforms.t_batch_mode_transform)
decorator, which allows the use of non-batch mode inputs. If applied to an
instance method with a single `Tensor` argument, an input tensor to that method
without a t-batch dimension (i.e. tensors of shape $q \times d$) will automatically
@@ -66,7 +66,7 @@ distribution:
of $b_1 \times \cdots \times b_k$, with $n$ data points of $d$-dimensions each in every batch)
yields a posterior with `event_shape` being $b_1 \times \cdots \times b_k \times n \times 1$.
In most cases, the t-batch-shape will be single-dimensional (i.e., $k=1$).
-- Evaluating a multi-output model with $o$ outputs at a $b_1 \times \cdots \times b_k
+- Evaluating a multi-output model with $o$ outputs at a $b_1 \times \cdots \times b_k
\times n \times d$ tensor yields a posterior with `event_shape` equal to
$b_1 \times \cdots \times b_k \times n \times o$.
- Recall from the previous section that internally, with the help of the
Expand Down Expand Up @@ -123,7 +123,7 @@ The shape of the test points must support broadcasting to the $\textit{batch_sha
necessary over $\textit{batch_shape}$)

#### Batched Multi-Output Models
-The [`BatchedMultiOutputGPyTorchModel`](../api/models.html#batchedmultioutputgpytorchmodel)
+The [`BatchedMultiOutputGPyTorchModel`](https://botorch.readthedocs.io/en/latest/models.html#botorch.models.gpytorch.BatchedMultiOutputGPyTorchModel)
class implements a fast multi-output model (assuming conditional independence of
the outputs given the input) by batching over the outputs.

@@ -157,5 +157,5 @@ back-propagating.

#### Batched Cross Validation
See the
-[Using batch evaluation for fast cross validation](../tutorials/batch_mode_cross_validation)
+[Using batch evaluation for fast cross validation](tutorials/batch_mode_cross_validation)
tutorial for details on using batching for fast cross validation.
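The t-batch convention that the `@t_batch_mode_transform` decorator handles (see the Batch-Mode Decorator hunk above) can be illustrated with a dependency-free sketch. This is a hypothetical simplification using nested lists in place of tensors, not BoTorch's actual implementation:

```python
# Hypothetical, simplified stand-in for BoTorch's @t_batch_mode_transform:
# nested Python lists play the role of tensors. If the input lacks a
# t-batch dimension (shape (q, d) instead of (b, q, d)), add one before
# calling the wrapped method, then squeeze it back out of the result.

def t_batch_mode(method):
    def wrapper(self, X):
        # (q, d) inputs have scalar entries; (b, q, d) inputs have list entries
        added_batch = not isinstance(X[0][0], list)
        if added_batch:
            X = [X]  # unsqueeze: (q, d) -> (1, q, d)
        out = method(self, X)
        return out[0] if added_batch else out
    return wrapper

class SumAcquisition:
    """Toy 'acquisition function' that reduces each (q, d) batch to a scalar."""

    @t_batch_mode
    def __call__(self, X):
        return [sum(sum(row) for row in batch) for batch in X]

acqf = SumAcquisition()
print(acqf([[1.0, 2.0], [3.0, 4.0]]))      # non-batched (q=2, d=2) input -> 10.0
print(acqf([[[1.0, 2.0]], [[3.0, 4.0]]]))  # t-batched (b=2, q=1, d=2) -> [3.0, 7.0]
```

The point of the convention is exactly this symmetry: callers may pass either batched or non-batched candidate sets, and the acquisition function body only ever sees t-batched input.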
8 changes: 4 additions & 4 deletions docs/botorch_and_ax.md
@@ -18,7 +18,7 @@ it easy to drive the car.


Ax provides a
-[`BotorchModel`](https://ax.dev/api/models.html#ax.models.torch.botorch.BotorchModel)
+[`BotorchModel`](https://ax.readthedocs.io/en/latest/models.html#ax.models.torch.botorch.BotorchModel)
that is a sensible default for modeling and optimization which can be customized
by specifying and passing in bespoke model constructors, acquisition functions,
and optimization strategies.
@@ -43,7 +43,7 @@ the Bayesian Optimization loop untouched. It is then straightforward to plug
your custom BoTorch model or acquisition function into Ax to take advantage of
Ax's various loop control APIs, as well as its powerful automated metadata
management, data storage, etc. See the
-[Using a custom BoTorch model in Ax](../tutorials/custom_botorch_model_in_ax)
+[Using a custom BoTorch model in Ax](tutorials/custom_botorch_model_in_ax)
tutorial for more on how to do this.


@@ -53,8 +53,8 @@ If you're working in a non-standard setting, such as structured feature or
design spaces, or where the model fitting process requires interactive work,
then using Ax may not be the best solution for you. In such a situation, you
might be better off writing your own full Bayesian Optimization loop in BoTorch.
-The [q-Noisy Constrained EI](../tutorials/closed_loop_botorch_only) tutorial and
-[variational auto-encoder](../tutorials/vae_mnist) tutorial give examples of how
+The [q-Noisy Constrained EI](tutorials/closed_loop_botorch_only) tutorial and
+[variational auto-encoder](tutorials/vae_mnist) tutorial give examples of how
this can be done.

You may also consider working purely in BoTorch if you want to be able to
2 changes: 1 addition & 1 deletion docs/constraints.md
@@ -41,7 +41,7 @@ the constrained expected improvement variant is mathematically equivalent to the
unconstrained expected improvement of the objective, multiplied by the probability of
feasibility under the modeled outcome constraint.

-See the [Closed-Loop Optimization](../tutorials/closed_loop_botorch_only)
+See the [Closed-Loop Optimization](tutorials/closed_loop_botorch_only)
tutorial for an example of using outcome constraints in BoTorch.
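Written out, the equivalence stated above takes the standard constrained-EI form (with $c(x) \le 0$ denoting feasibility and the constraint model independent of the objective model):

```latex
\alpha_{\text{CEI}}(x) \;=\; \alpha_{\text{EI}}(x)\,\cdot\,\Pr\!\left(c(x) \le 0 \,\middle|\, \mathcal{D}\right).
```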


2 changes: 1 addition & 1 deletion docs/design_philosophy.md
@@ -69,7 +69,7 @@ all data available. In typical machine learning model training, a stochastic
version of the empirical loss, obtained by "mini-batching" the data, is
optimized using stochastic optimization algorithms.

-In BoTorch, [`AcquisitionFunction`](../api/acquisition.html#acquisitionfunction)
+In BoTorch, [`AcquisitionFunction`](https://botorch.readthedocs.io/en/latest/acquisition.html#botorch.acquisition.acquisition.AcquisitionFunction)
modules map an input design $X$ to the acquisition function value. Optimizing
the acquisition function means optimizing the output over the possible values of
$X$. If the acquisition function is deterministic, then so is the optimization
4 changes: 2 additions & 2 deletions docs/getting_started.mdx
@@ -89,13 +89,13 @@ Here's a quick run down of the main components of a Bayesian Optimization loop.
## Tutorials

Our Jupyter notebook tutorials help you get off the ground with BoTorch.
-View and download them [here](../tutorials).
+View and download them [here](tutorials).


## API Reference

For an in-depth reference of the various BoTorch internals, see our
-[API Reference](../api).
+[API Reference](https://botorch.readthedocs.io/).


## Contributing
48 changes: 24 additions & 24 deletions docs/models.md
@@ -13,15 +13,15 @@ the posterior distribution is a multivariate normal. While BoTorch supports many
GP models, **BoTorch makes no assumption on the model being a GP** or the
posterior being multivariate normal. With the exception of some of the analytic
acquisition functions in the
-[`botorch.acquisition.analytic`](../api/acquisition.html#analytic-acquisition-function-api)
+[`botorch.acquisition.analytic`](https://botorch.readthedocs.io/en/latest/acquisition.html#analytic-acquisition-function-api)
module, BoTorch’s Monte Carlo-based acquisition functions are compatible with
any model that conforms to the `Model` interface, whether user-implemented or
provided.

Under the hood, BoTorch models are PyTorch `Modules` that implement the
-light-weight [`Model`](../api/models.html#model-apis) interface. When working
+light-weight [`Model`](https://botorch.readthedocs.io/en/latest/models.html#model-apis) interface. When working
with GPs,
-[`GPyTorchModel`](../api/models.html#module-botorch.models.gp_regression)
+[`GPyTorchModel`](https://botorch.readthedocs.io/en/latest/models.html#module-botorch.models.gp_regression)
provides a base class for conveniently wrapping GPyTorch models.

Users can extend `Model` and `GPyTorchModel` to generate their own models. For
@@ -84,36 +84,36 @@ BoTorch provides several GPyTorch models to cover most standard BO use cases:
These models use the same training data for all outputs and assume conditional
independence of the outputs given the input. If different training data is
required for each output, use a
-[`ModelListGP`](../api/models.html#module-botorch.models.model_list_gp_regression)
+[`ModelListGP`](https://botorch.readthedocs.io/en/latest/models.html#module-botorch.models.model_list_gp_regression)
instead.

-- [`SingleTaskGP`](../api/models.html#botorch.models.gp_regression.SingleTaskGP):
+- [`SingleTaskGP`](https://botorch.readthedocs.io/en/latest/models.html#botorch.models.gp_regression.SingleTaskGP):
a single-task exact GP that supports both inferred and observed noise. When
noise observations are not provided, it infers a homoskedastic noise level.
-- [`MixedSingleTaskGP`](../api/models.html#botorch.models.gp_regression_mixed.MixedSingleTaskGP):
+- [`MixedSingleTaskGP`](https://botorch.readthedocs.io/en/latest/models.html#botorch.models.gp_regression_mixed.MixedSingleTaskGP):
a single-task exact GP that supports mixed search spaces, which combine
discrete and continuous features.
-- [`SaasFullyBayesianSingleTaskGP`](../api/models.html#botorch.models.fully_bayesian.SaasFullyBayesianSingleTaskGP):
+- [`SaasFullyBayesianSingleTaskGP`](https://botorch.readthedocs.io/en/latest/models.html#botorch.models.fully_bayesian.SaasFullyBayesianSingleTaskGP):
a fully Bayesian single-task GP with the SAAS prior. This model is suitable
for sample-efficient high-dimensional Bayesian optimization.

### Model List of Single-Task GPs

-- [`ModelListGP`](../api/models.html#module-botorch.models.model_list_gp_regression):
+- [`ModelListGP`](https://botorch.readthedocs.io/en/latest/models.html#module-botorch.models.model_list_gp_regression):
A multi-output model in which outcomes are modeled independently, given a list
of any type of single-task GP. This model should be used when the same
training data is not used for all outputs.

### Multi-Task GPs

-- [`MultiTaskGP`](../api/models.html#module-botorch.models.multitask): a
+- [`MultiTaskGP`](https://botorch.readthedocs.io/en/latest/models.html#module-botorch.models.multitask): a
Hadamard multi-task, multi-output GP using an ICM kernel. Supports both known
observation noise levels and inferring a homoskedastic noise level (when noise
observations are not provided).
-- [`KroneckerMultiTaskGP`](../api/models.html#botorch.models.multitask.KroneckerMultiTaskGP):
+- [`KroneckerMultiTaskGP`](https://botorch.readthedocs.io/en/latest/models.html#botorch.models.multitask.KroneckerMultiTaskGP):
A multi-task, multi-output GP using an ICM kernel, with Kronecker structure.
Useful for multi-fidelity optimization.
-- [`SaasFullyBayesianMultiTaskGP`](../api/models.html#saasfullybayesianmultitaskgp):
+- [`SaasFullyBayesianMultiTaskGP`](https://botorch.readthedocs.io/en/latest/models.html#botorch.models.fully_bayesian_multitask.SaasFullyBayesianMultiTaskGP):
a fully Bayesian multi-task GP using an ICM kernel. The data kernel uses the
SAAS prior to model high-dimensional parameter spaces.

@@ -128,33 +128,33 @@ additional context on the default hyperparameters.

## Other useful models

-- [`ModelList`](../api/models.html#botorch.models.model.ModelList): a
+- [`ModelList`](https://botorch.readthedocs.io/en/latest/models.html#botorch.models.model.ModelList): a
multi-output model container in which outcomes are modeled independently by
individual `Model`s (as in `ModelListGP`, but the component models do not all
need to be GPyTorch models).
-- [`SingleTaskMultiFidelityGP`](../api/models.html#botorch.models.gp_regression_fidelity.SingleTaskMultiFidelityGP):
+- [`SingleTaskMultiFidelityGP`](https://botorch.readthedocs.io/en/latest/models.html#botorch.models.gp_regression_fidelity.SingleTaskMultiFidelityGP):
A GP model for multi-fidelity optimization. For more on Multi-Fidelity BO, see
-the [tutorial](../tutorials/discrete_multi_fidelity_bo).
-- [`HigherOrderGP`](../api/models.html#botorch.models.higher_order_gp.HigherOrderGP):
+the [tutorial](tutorials/discrete_multi_fidelity_bo).
+- [`HigherOrderGP`](https://botorch.readthedocs.io/en/latest/models.html#botorch.models.higher_order_gp.HigherOrderGP):
A GP model with matrix-valued predictions, such as images or grids of images.
-- [`PairwiseGP`](../api/models.html#module-botorch.models.pairwise_gp): A
+- [`PairwiseGP`](https://botorch.readthedocs.io/en/latest/models.html#module-botorch.models.pairwise_gp): A
probit-likelihood GP that learns via pairwise comparison data, useful for
preference learning.
-- [`ApproximateGPyTorchModel`](../api/models.html#botorch.models.approximate_gp.ApproximateGPyTorchModel):
+- [`ApproximateGPyTorchModel`](https://botorch.readthedocs.io/en/latest/models.html#botorch.models.approximate_gp.ApproximateGPyTorchModel):
for efficient computation when data is large or responses are non-Gaussian.
-- [Deterministic models](../api/models.html#module-botorch.models.deterministic),
+- [Deterministic models](https://botorch.readthedocs.io/en/latest/models.html#module-botorch.models.deterministic),
such as
-[`AffineDeterministicModel`](../api/models.html#botorch.models.deterministic.AffineDeterministicModel),
-[`AffineFidelityCostModel`](../api/models.html#botorch.models.cost.AffineFidelityCostModel),
-[`GenericDeterministicModel`](../api/models.html#botorch.models.deterministic.GenericDeterministicModel),
+[`AffineDeterministicModel`](https://botorch.readthedocs.io/en/latest/models.html#botorch.models.deterministic.AffineDeterministicModel),
+[`AffineFidelityCostModel`](https://botorch.readthedocs.io/en/latest/models.html#botorch.models.cost.AffineFidelityCostModel),
+[`GenericDeterministicModel`](https://botorch.readthedocs.io/en/latest/models.html#botorch.models.deterministic.GenericDeterministicModel),
and
-[`PosteriorMeanModel`](../api/models.html#botorch.models.deterministic.PosteriorMeanModel)
+[`PosteriorMeanModel`](https://botorch.readthedocs.io/en/latest/models.html#botorch.models.deterministic.PosteriorMeanModel)
express known input-output relationships; they conform to the BoTorch `Model`
API, so they can easily be used in conjunction with other BoTorch models.
Deterministic models are useful for multi-objective optimization with known
objective functions and for encoding cost functions for cost-aware
acquisition.
-- [`SingleTaskVariationalGP`](../api/models.html#botorch.models.approximate_gp.SingleTaskVariationalGP):
+- [`SingleTaskVariationalGP`](https://botorch.readthedocs.io/en/latest/models.html#botorch.models.approximate_gp.SingleTaskVariationalGP):
an approximate model for faster computation when you have a lot of data or
your responses are non-Gaussian.

Expand All @@ -169,7 +169,7 @@ configurable model class whose implementation is difficult to understand.
Instead, we advocate that users implement their own models to cover more
specialized use cases. The light-weight nature of BoTorch's Model API makes this
easy to do. See the
-[Using a custom BoTorch model in Ax](../tutorials/custom_botorch_model_in_ax)
+[Using a custom BoTorch model in Ax](tutorials/custom_botorch_model_in_ax)
tutorial for an example.

The BoTorch `Model` interface is light-weight and easy to extend. The only