mattertune.finetune.optimizer

Functions

create_optimizer(config, parameters)

Classes

AdamConfig(*[, name, eps, betas, ...])

AdamWConfig(*[, name, eps, betas, ...])

SGDConfig(*[, name, momentum, weight_decay, ...])

class mattertune.finetune.optimizer.AdamConfig(*, name='Adam', lr, eps=1e-08, betas=(0.9, 0.999), weight_decay=0.0, amsgrad=False)[source]
Parameters:
  • name (Literal['Adam'])

  • lr (Annotated[float, Gt(gt=0)])

  • eps (Annotated[float, Ge(ge=0)])

  • betas (tuple[Annotated[float, Gt(gt=0)], Annotated[float, Gt(gt=0)]])

  • weight_decay (Annotated[float, Ge(ge=0)])

  • amsgrad (bool)

name: Literal['Adam']

Name of the optimizer.

lr: C.PositiveFloat

Learning rate.

eps: C.NonNegativeFloat

Epsilon; a small term added to the denominator to improve numerical stability.

betas: tuple[C.PositiveFloat, C.PositiveFloat]

Betas; coefficients used for computing running averages of the gradient and its square.

weight_decay: C.NonNegativeFloat

Weight decay.

amsgrad: bool

Whether to use the AMSGrad variant of Adam.
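
A minimal usage sketch; the nn.Linear model below is only a hypothetical placeholder for whatever module is being fine-tuned:

    import torch.nn as nn

    from mattertune.finetune.optimizer import AdamConfig, create_optimizer

    model = nn.Linear(16, 1)  # placeholder module; any iterable of Parameters works

    # lr is the only required field; everything else falls back to the defaults above.
    config = AdamConfig(lr=1e-4, weight_decay=1e-5)
    optimizer = create_optimizer(config, model.parameters())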

class mattertune.finetune.optimizer.AdamWConfig(*, name='AdamW', lr, eps=1e-08, betas=(0.9, 0.999), weight_decay=0.01, amsgrad=False)[source]
Parameters:
  • name (Literal['AdamW'])

  • lr (Annotated[float, Gt(gt=0)])

  • eps (Annotated[float, Ge(ge=0)])

  • betas (tuple[Annotated[float, Gt(gt=0)], Annotated[float, Gt(gt=0)]])

  • weight_decay (Annotated[float, Ge(ge=0)])

  • amsgrad (bool)

name: Literal['AdamW']

Name of the optimizer.

lr: C.PositiveFloat

Learning rate.

eps: C.NonNegativeFloat

Epsilon; a small term added to the denominator to improve numerical stability.

betas: tuple[C.PositiveFloat, C.PositiveFloat]

Betas; coefficients used for computing running averages of the gradient and its square.

weight_decay: C.NonNegativeFloat

Weight decay.

amsgrad: bool

Whether to use the AMSGrad variant of Adam.
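
Construction mirrors AdamConfig; a minimal sketch (the model is again a placeholder). The practical difference is the nonzero weight_decay default, applied in AdamW's decoupled form:

    import torch.nn as nn

    from mattertune.finetune.optimizer import AdamWConfig, create_optimizer

    model = nn.Linear(16, 1)  # placeholder module

    # weight_decay defaults to 0.01 here, versus 0.0 for AdamConfig.
    config = AdamWConfig(lr=1e-4)
    optimizer = create_optimizer(config, model.parameters())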

class mattertune.finetune.optimizer.SGDConfig(*, name='SGD', lr, momentum=0.0, weight_decay=0.0, nestrov=False)[source]
Parameters:
  • name (Literal['SGD'])

  • lr (Annotated[float, Gt(gt=0)])

  • momentum (Annotated[float, Ge(ge=0)])

  • weight_decay (Annotated[float, Ge(ge=0)])

  • nestrov (bool)

name: Literal['SGD']

Name of the optimizer.

lr: C.PositiveFloat

Learning rate.

momentum: C.NonNegativeFloat

Momentum factor.

weight_decay: C.NonNegativeFloat

Weight decay.

nestrov: bool

Whether to use Nesterov momentum.
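
A minimal sketch (placeholder model). Note that the field spelling nestrov matches the signature above, and that Nesterov momentum only takes effect with a nonzero momentum factor:

    import torch.nn as nn

    from mattertune.finetune.optimizer import SGDConfig, create_optimizer

    model = nn.Linear(16, 1)  # placeholder module

    # Nesterov momentum requires momentum > 0.
    config = SGDConfig(lr=1e-2, momentum=0.9, nestrov=True)
    optimizer = create_optimizer(config, model.parameters())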

mattertune.finetune.optimizer.create_optimizer(config, parameters)[source]
Parameters:
  • config (OptimizerConfig)

  • parameters (Iterable[Parameter])

Return type:

Optimizer
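
Builds the torch.optim.Optimizer selected by the config and binds it to the supplied parameters. The result drops into an ordinary training step; a sketch with a hypothetical model and dummy batch:

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    from mattertune.finetune.optimizer import AdamConfig, create_optimizer

    model = nn.Linear(16, 1)  # placeholder module
    optimizer = create_optimizer(AdamConfig(lr=1e-4), model.parameters())

    x, y = torch.randn(8, 16), torch.randn(8, 1)  # dummy batch
    loss = F.mse_loss(model(x), y)

    optimizer.zero_grad()  # clear accumulated gradients
    loss.backward()        # backpropagate
    optimizer.step()       # apply the Adam update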