
AdamW

Implements the AdamW optimizer (Adam with decoupled weight decay) for your model's parameters.

Learn more about AdamW in Decoupled Weight Decay Regularization (Loshchilov & Hutter, 2017).

Usage

from tinygrad.nn.state import get_parameters
from tinygrad.nn import optim
from models.resnet import ResNet

model = ResNet()

# create the optimizer once, outside the training loop,
# so its moment buffers persist across steps
optimizer = optim.AdamW(get_parameters(model), lr=0.01)

for _ in range(5):
  # train, eval ...
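
A single training step then looks roughly like the sketch below. It is illustrative only: x and y are placeholder tensors standing in for a real batch, and sparse_categorical_crossentropy is just one possible choice of loss.

from tinygrad.tensor import Tensor

x = Tensor.randn(2, 3, 224, 224)  # placeholder batch of images
y = Tensor([0, 1])                # placeholder integer class labels

Tensor.training = True            # enable training behaviour (e.g. batchnorm updates)
loss = model(x).sparse_categorical_crossentropy(y)
optimizer.zero_grad()             # clear gradients from the previous step
loss.backward()                   # backpropagate through the graph
optimizer.step()                  # apply the AdamW update to the parameters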

Arguments

params

The parameters to optimize, typically collected with get_parameters(model).

lr

The learning rate, i.e. the step size applied at each update.

b1

The exponential decay rate for the first-moment (running mean of the gradients) estimate.

b2

The exponential decay rate for the second-moment (running mean of the squared gradients) estimate.

eps

A small constant added to the denominator of the update for numerical stability.

wd

The weight decay coefficient. In AdamW the decay is decoupled from the gradient-based update and applied directly to the weights.
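
These arguments map directly onto the AdamW update rule. The following is a plain-Python sketch of a single scalar update, not tinygrad's implementation; the default values shown are the conventional AdamW choices and are used here only for illustration.

import math

def adamw_step(param, grad, m, v, t, lr=0.001, b1=0.9, b2=0.999, eps=1e-8, wd=0.01):
  # illustrative scalar version of one AdamW step (t is the 1-based step count)
  m = b1 * m + (1 - b1) * grad         # first-moment estimate, decayed by b1
  v = b2 * v + (1 - b2) * grad * grad  # second-moment estimate, decayed by b2
  m_hat = m / (1 - b1 ** t)            # bias-corrected first moment
  v_hat = v / (1 - b2 ** t)            # bias-corrected second moment
  # adaptive step plus decoupled weight decay (the wd * param term)
  param -= lr * (m_hat / (math.sqrt(v_hat) + eps) + wd * param)
  return param, m, v

Because the wd * param term sits outside the adaptive scaling, the decay acts as a direct shrinkage of the weights, which is what distinguishes AdamW from Adam with L2 regularization.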
