easycv.models package
Subpackages
- easycv.models.backbones package
- Submodules
- easycv.models.backbones.benchmark_mlp module
- easycv.models.backbones.bninception module
- easycv.models.backbones.darknet module
- easycv.models.backbones.genet module
- easycv.models.backbones.hrnet module
- easycv.models.backbones.inceptionv3 module
- easycv.models.backbones.lighthrnet module
- easycv.models.backbones.mae_vit_transformer module
- easycv.models.backbones.mnasnet module
- easycv.models.backbones.mobilenetv2 module
- easycv.models.backbones.network_blocks module
- easycv.models.backbones.pytorch_image_models_wrapper module
- easycv.models.backbones.resnest module
- easycv.models.backbones.resnet module
- easycv.models.backbones.resnet_jit module
- easycv.models.backbones.resnext module
- easycv.models.backbones.shuffle_transformer module
- easycv.models.backbones.swin_transformer_dynamic module
- easycv.models.backbones.vit_transfomer_dynamic module
- easycv.models.backbones.xcit_transformer module
- easycv.models.classification package
- easycv.models.detection package
- easycv.models.heads package
- easycv.models.loss package
- easycv.models.pose package
- easycv.models.selfsup package
- Submodules
- easycv.models.selfsup.byol module
- easycv.models.selfsup.dino module
- easycv.models.selfsup.mae module
- easycv.models.selfsup.mixco module
- easycv.models.selfsup.moby module
- easycv.models.selfsup.moco module
- easycv.models.selfsup.necks module
- easycv.models.selfsup.simclr module
- easycv.models.selfsup.swav module
- easycv.models.utils package
- Submodules
- easycv.models.utils.accuracy module
- easycv.models.utils.activation module
- easycv.models.utils.conv_module module
- easycv.models.utils.conv_ws module
- easycv.models.utils.dist_utils module
- easycv.models.utils.gather_layer module
- easycv.models.utils.init_weights module
- easycv.models.utils.multi_pooling module
- easycv.models.utils.norm module
- easycv.models.utils.ops module
- easycv.models.utils.pos_embed module
- easycv.models.utils.res_layer module
- easycv.models.utils.scale module
- easycv.models.utils.sobel module
Submodules
easycv.models.base module
- class easycv.models.base.BaseModel(init_cfg=None)[source]
Bases: torch.nn.modules.module.Module
Base class for models.
- __init__(init_cfg=None)[source]
Initializes internal Module state, shared by both nn.Module and ScriptModule.
- property is_init: bool
- abstract forward_train(img: torch.Tensor, **kwargs) → Dict[str, torch.Tensor][source]
Abstract interface for the model's forward pass during training.
- Parameters
img (Tensor) – image tensor
kwargs (keyword arguments) – specific to the concrete implementation
- forward_test(img: torch.Tensor, **kwargs) → Dict[str, torch.Tensor][source]
Interface for the model's forward pass during testing.
- Parameters
img (Tensor) – image tensor
kwargs (keyword arguments) – specific to the concrete implementation
- forward(mode='train', *args, **kwargs)[source]
Defines the computation performed at every call.
Should be overridden by all subclasses.
Note
Although the recipe for the forward pass needs to be defined within this function, one should call the Module instance afterwards instead of this, since the former takes care of running the registered hooks while the latter silently ignores them.
- train_step(data, optimizer)[source]
The iteration step during training.
This method defines an iteration step during training, except for the back propagation and optimizer updating, which are done in an optimizer hook. Note that in some complicated cases or models, the whole process including back propagation and optimizer updating is also defined in this method, such as GAN.
- Parameters
data (dict) – The output of the dataloader.
optimizer (torch.optim.Optimizer | dict) – The optimizer of the runner is passed to train_step(). This argument is unused and reserved.
- Returns
It should contain at least 3 keys: loss, log_vars, num_samples. loss is a tensor for back propagation, which can be a weighted sum of multiple losses. log_vars contains all the variables to be sent to the logger. num_samples indicates the batch size (when the model is DDP, it means the batch size on each GPU), which is used for averaging the logs.
- Return type
dict
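The return contract above can be sketched as follows. Plain Python floats stand in for torch tensors, and the helper name is illustrative, not part of the EasyCV API:

```python
# Illustrative sketch of the dict that train_step() is expected to return.
# Floats stand in for torch tensors; make_train_step_output is hypothetical.

def make_train_step_output(losses, batch_size):
    """Combine named loss values into the loss/log_vars/num_samples dict."""
    total = sum(losses.values())           # a weighted sum would also be valid
    log_vars = {name: float(v) for name, v in losses.items()}
    log_vars['loss'] = float(total)        # loggers usually want the total too
    return dict(loss=total, log_vars=log_vars, num_samples=batch_size)

out = make_train_step_output({'cls_loss': 0.5, 'reg_loss': 0.25}, batch_size=32)
# out['loss'] is the scalar used for back propagation; out['log_vars'] goes to
# the logger; out['num_samples'] is used to average logs across DDP workers.
```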
- val_step(data, optimizer)[source]
The iteration step during validation.
This method shares the same signature as train_step(), but is used during val epochs. Note that the evaluation after training epochs is not implemented with this method, but with an evaluation hook.
- training: bool
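The train/test dispatch that forward() performs can be sketched with a plain Python stand-in, so the example stays self-contained without torch installed. The class and method names mirror BaseModel but are illustrative, not EasyCV's actual code:

```python
from typing import Dict

# Stand-in for BaseModel's mode dispatch: forward() routes to forward_train()
# or forward_test() based on `mode`. Real EasyCV models subclass
# torch.nn.Module and return dicts of tensors rather than floats.

class ToyBaseModel:
    def forward(self, img, mode='train', **kwargs) -> Dict[str, float]:
        if mode == 'train':
            return self.forward_train(img, **kwargs)
        return self.forward_test(img, **kwargs)

    def forward_train(self, img, **kwargs):
        raise NotImplementedError   # abstract: subclasses must implement

    def forward_test(self, img, **kwargs):
        raise NotImplementedError

class ToyClassifier(ToyBaseModel):
    def forward_train(self, img, **kwargs):
        return {'loss': 0.5}        # would compute losses from the batch

    def forward_test(self, img, **kwargs):
        return {'prob': 0.9}        # would return predictions

model = ToyClassifier()
train_out = model.forward(img=None, mode='train')
test_out = model.forward(img=None, mode='test')
```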
easycv.models.builder module
- easycv.models.builder.build_positional_encoding(cfg, default_args=None)[source]
Builder for positional encoding.
- easycv.models.builder.build_feedforward_network(cfg, default_args=None)[source]
Builder for feed-forward network (FFN).
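Both builders follow the common config-driven registry pattern: the cfg dict's type key names a registered class, and the remaining keys (plus default_args) become constructor arguments. A simplified standalone sketch of that pattern, not EasyCV's actual registry implementation:

```python
# Simplified config-driven builder illustrating the pattern behind
# build_positional_encoding / build_feedforward_network. The registry,
# decorator, and SinePositionalEncoding class here are all illustrative.

REGISTRY = {}

def register(cls):
    """Register a class under its own name so cfg dicts can refer to it."""
    REGISTRY[cls.__name__] = cls
    return cls

def build_from_cfg(cfg, default_args=None):
    args = dict(cfg)                      # copy so the caller's cfg is untouched
    obj_type = args.pop('type')           # which registered class to build
    if default_args:
        for key, value in default_args.items():
            args.setdefault(key, value)   # cfg keys win over defaults
    return REGISTRY[obj_type](**args)

@register
class SinePositionalEncoding:
    def __init__(self, num_feats, normalize=False):
        self.num_feats = num_feats
        self.normalize = normalize

pe = build_from_cfg(dict(type='SinePositionalEncoding', num_feats=128),
                    default_args=dict(normalize=True))
```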