Adam

Implements the Adam optimization algorithm for your model's parameters.

Learn more about the Adam optimizer in the original paper, Adam: A Method for Stochastic Optimization (Kingma & Ba, 2014).
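
For reference, Adam keeps exponential moving averages of the gradient and its square and applies a bias-corrected update. This is the standard formulation from the paper; the symbols correspond to the lr, b1, b2, and eps arguments documented below:

$$
\begin{aligned}
m_t &= \beta_1 m_{t-1} + (1-\beta_1)\,g_t \\
v_t &= \beta_2 v_{t-1} + (1-\beta_2)\,g_t^2 \\
\hat{m}_t &= \frac{m_t}{1-\beta_1^t}, \quad \hat{v}_t = \frac{v_t}{1-\beta_2^t} \\
\theta_t &= \theta_{t-1} - \mathrm{lr}\cdot\frac{\hat{m}_t}{\sqrt{\hat{v}_t}+\epsilon}
\end{aligned}
$$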

Usage

from tinygrad.nn.state import get_parameters
from tinygrad.nn import optim
from models.resnet import ResNet
 
model = ResNet()

# create the optimizer once, outside the loop, so Adam's moment
# estimates persist across training steps
optimizer = optim.Adam(get_parameters(model), lr=0.01)

for _ in range(5):
  # train, eval ...
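
A minimal sketch of what one training step inside that loop might look like, assuming a classification setup; the input shape, labels, and loss choice below are hypothetical placeholders, not part of this API:

from tinygrad.tensor import Tensor

Tensor.training = True                         # enable training mode (affects e.g. batchnorm, dropout)
x = Tensor.randn(2, 3, 224, 224)               # hypothetical input batch
y = Tensor([0, 1])                             # hypothetical integer class labels

out = model(x)                                 # forward pass
loss = out.sparse_categorical_crossentropy(y)  # assumed classification loss

optimizer.zero_grad()                          # clear gradients from the previous step
loss.backward()                                # backpropagate through the graph
optimizer.step()                               # apply the Adam parameter update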

Arguments

params

The parameters to optimize, e.g. the list returned by get_parameters(model).

lr (default: 0.001)

The learning rate, i.e. the step size applied to each parameter update.

b1 (default: 0.9)

The exponential decay rate for the first-moment estimate (the running mean of the gradient).

b2 (default: 0.999)

The exponential decay rate for the second-moment estimate (the running mean of the squared gradient).

eps (default: 1e-8)

A small constant added to the denominator of the update to avoid division by zero.
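
As a quick illustration, every argument above can also be passed explicitly; the values here are just the defaults written out, not tuned recommendations:

optimizer = optim.Adam(get_parameters(model), lr=0.001, b1=0.9, b2=0.999, eps=1e-8)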
