Safemotion Lib
fastreid.engine.hooks.LRScheduler Class Reference
Inheritance diagram for fastreid.engine.hooks.LRScheduler:
fastreid.engine.train_loop.HookBase

Public Member Functions

 __init__ (self, optimizer, scheduler)
 
 after_step (self)
 
Public Member Functions inherited from fastreid.engine.train_loop.HookBase
 before_train (self)
 
 after_train (self)
 
 before_step (self)
 

Protected Attributes

 _optimizer
 
 _scheduler
 
 _best_param_group_id
 

Detailed Description

A hook which executes a torch built-in LR scheduler and summarizes the learning rate.
It is executed after every iteration.

Definition at line 192 of file hooks.py.
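
The per-iteration contract can be illustrated with plain-Python stand-ins (the ToyScheduler and ToyLRSchedulerHook classes below are hypothetical, not part of fastreid or torch): the hook logs the current LR first, then advances the schedule, so the recorded value is the LR that was actually used for that iteration.

```python
class ToyScheduler:
    """Halves the LR on every step, mimicking a torch LR scheduler."""
    def __init__(self, param_groups):
        self.param_groups = param_groups

    def step(self):
        for g in self.param_groups:
            g["lr"] *= 0.5


class ToyLRSchedulerHook:
    """Stand-in for LRScheduler: log the LR, then step the scheduler."""
    def __init__(self, param_groups, scheduler):
        self.param_groups = param_groups
        self.scheduler = scheduler
        self.logged = []

    def after_step(self):
        # Same order as the real hook: record the LR used for this
        # iteration, then advance the schedule for the next one.
        self.logged.append(self.param_groups[0]["lr"])
        self.scheduler.step()


groups = [{"lr": 0.1}]
hook = ToyLRSchedulerHook(groups, ToyScheduler(groups))
for _ in range(3):  # three training iterations
    hook.after_step()
print(hook.logged)  # LR as seen at each iteration: [0.1, 0.05, 0.025]
```

Because logging happens before `step()`, the summary reflects the LR each iteration trained with, not the one scheduled for the next iteration.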

Constructor & Destructor Documentation

◆ __init__()

fastreid.engine.hooks.LRScheduler.__init__(self, optimizer, scheduler)
Args:
    optimizer (torch.optim.Optimizer): the optimizer whose learning rate is summarized.
    scheduler (torch.optim._LRScheduler): the scheduler stepped after every iteration.

Definition at line 198 of file hooks.py.

def __init__(self, optimizer, scheduler):
    """
    Args:
        optimizer (torch.optim.Optimizer):
        scheduler (torch.optim._LRScheduler)
    """
    self._optimizer = optimizer
    self._scheduler = scheduler

    # NOTE: some heuristics on what LR to summarize
    # summarize the param group with most parameters
    largest_group = max(len(g["params"]) for g in optimizer.param_groups)

    if largest_group == 1:
        # If all groups have one parameter,
        # then find the most common initial LR, and use it for summary
        lr_count = Counter([g["lr"] for g in optimizer.param_groups])
        lr = lr_count.most_common()[0][0]
        for i, g in enumerate(optimizer.param_groups):
            if g["lr"] == lr:
                self._best_param_group_id = i
                break
    else:
        for i, g in enumerate(optimizer.param_groups):
            if len(g["params"]) == largest_group:
                self._best_param_group_id = i
                break
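
The group-selection heuristic above can be exercised in isolation. The sketch below lifts the constructor logic into a hypothetical free function, `best_param_group_id` (not part of fastreid), driven by plain dicts shaped like `optimizer.param_groups`:

```python
from collections import Counter


def best_param_group_id(param_groups):
    # Same heuristic as LRScheduler.__init__: summarize the group with
    # the most parameters; if every group holds a single parameter,
    # pick the first group whose LR is the most common initial LR.
    largest_group = max(len(g["params"]) for g in param_groups)
    if largest_group == 1:
        lr = Counter(g["lr"] for g in param_groups).most_common()[0][0]
        return next(i for i, g in enumerate(param_groups) if g["lr"] == lr)
    return next(i for i, g in enumerate(param_groups)
                if len(g["params"]) == largest_group)


# Every group has one parameter, so the most common LR (0.01) wins;
# the first group carrying it is index 1.
groups = [{"params": ["w0"], "lr": 0.1},
          {"params": ["w1"], "lr": 0.01},
          {"params": ["w2"], "lr": 0.01}]
print(best_param_group_id(groups))  # 1
```

With groups of unequal size the first branch is skipped and the first largest group is chosen regardless of its LR.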

Member Function Documentation

◆ after_step()

fastreid.engine.hooks.LRScheduler.after_step(self)
Called after each iteration.

Reimplemented from fastreid.engine.train_loop.HookBase.

Definition at line 226 of file hooks.py.

def after_step(self):
    lr = self._optimizer.param_groups[self._best_param_group_id]["lr"]
    self.trainer.storage.put_scalar("lr", lr, smoothing_hint=False)
    self._scheduler.step()

Member Data Documentation

◆ _best_param_group_id

fastreid.engine.hooks.LRScheduler._best_param_group_id
protected

Definition at line 218 of file hooks.py.

◆ _optimizer

fastreid.engine.hooks.LRScheduler._optimizer
protected

Definition at line 204 of file hooks.py.

◆ _scheduler

fastreid.engine.hooks.LRScheduler._scheduler
protected

Definition at line 205 of file hooks.py.


The documentation for this class was generated from the following file: hooks.py