
Schedule

This module provides commonly used schedulers for hyperparameters such as the learning rate.

It is very important to apply hyperparameter updates, such as the learning rate update, after the optimizer's update. In this framework, that means the update should happen in either before_batch or after_batch.

We have two options for scheduling hyperparameters:

BatchScheduler

Bases: Scheduler

Change hyperparameters after every batch using the wrapped scheduler.

EpochScheduler

Bases: Scheduler

Change hyperparameters after every epoch using the wrapped scheduler.
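
As a rough sketch of the difference, the snippet below builds one callback of each kind. It assumes both classes reuse Scheduler's constructor described further down (a partially applied PyTorch scheduler factory whose optimizer is filled in during before_fit); treat it as an illustration rather than a definitive recipe.

    from functools import partial
    import torch

    # Steps the wrapped scheduler after every batch; a per-step policy
    # such as OneCycleLR is a natural fit.
    batch_cb = BatchScheduler(
        partial(torch.optim.lr_scheduler.OneCycleLR, max_lr=1e-2, total_steps=1000)
    )

    # Steps the wrapped scheduler after every epoch; a per-epoch policy
    # such as CosineAnnealingLR fits here.
    epoch_cb = EpochScheduler(
        partial(torch.optim.lr_scheduler.CosineAnnealingLR, T_max=10)
    )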

ParamScheduler

Bases: Callback

This class is used to schedule the values of hyperparameters during the training process.

__init__(pname, sched_funcs)

Parameters:

pname (str, required)
The name of the hyperparameter to be scheduled.

sched_funcs (list[Callable] | tuple[Callable], required)
A list or tuple of schedulers, one per parameter group. Each scheduler should accept a single argument (the position) and return a value for the hyperparameter.
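
A minimal usage sketch, assuming the position-based scheduler functions documented below (e.g. cos_sched) can be partially applied so that only the position argument remains, and that one function is given per parameter group:

    from functools import partial

    # Anneal the learning rate of a single parameter group from 1e-2 to
    # 1e-4 along a cosine curve; the callback supplies the training
    # position as the remaining argument.
    lr_cb = ParamScheduler("lr", [partial(cos_sched, 1e-2, 1e-4)])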

Scheduler

Bases: Callback

Base callback for changing hyperparameters using a PyTorch scheduler.

Note

PyTorch's schedulers take the optimizer as their first argument. Therefore, pass a scheduler with all of its other arguments already bound; the optimizer itself is supplied in Scheduler's before_fit method. For example:

Scheduler(partial(torch.optim.lr_scheduler.OneCycleLR, max_lr=1e-2, total_steps=1000))

combine_scheds(pcts, scheds)

Combine multiple schedulers, each run for a given percentage of the training process.
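
For example, a one-cycle-style learning rate can be built from two cosine pieces. This is a sketch assuming the percentages sum to 1 and each piece is a function of position, as with the partially applied cos_sched below:

    from functools import partial

    # 30% of training ramps from 0.01 up to 0.1, the remaining 70%
    # anneals back down to 0.001; the result is one function of position.
    one_cycle = combine_scheds(
        [0.3, 0.7],
        [partial(cos_sched, 0.01, 0.1), partial(cos_sched, 0.1, 0.001)],
    )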

cos_1cycle_anneal(start, high, end)

Combine two cosine schedulers, where the first goes from start to high and the second from high to end.
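
Assuming it returns the two pieces as a list (an assumption about the return value), it pairs naturally with combine_scheds:

    # Equivalent to the two partial(cos_sched, ...) pieces above.
    one_cycle = combine_scheds([0.3, 0.7], cos_1cycle_anneal(0.01, 0.1, 0.001))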

cos_sched(start, end, pos)

Cosine scheduler.

exp_sched(start, end, pos)

Exponential scheduler.

lin_sched(start, end, pos)

Linear scheduler.

no_sched(start, end, pos)

Constant scheduler.
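
For reference, one common way such position-based schedulers are defined is sketched below; the exact formulas in this module may differ, so treat these as illustrative only.

    import math

    def lin_sched(start, end, pos):
        # Straight line from start at pos=0 to end at pos=1.
        return start + pos * (end - start)

    def cos_sched(start, end, pos):
        # Half-cosine curve from start to end.
        return start + (1 + math.cos(math.pi * (1 - pos))) * (end - start) / 2

    def exp_sched(start, end, pos):
        # Geometric interpolation; assumes start and end are positive.
        return start * (end / start) ** pos

    def no_sched(start, end, pos):
        # Constant value regardless of position.
        return start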