NeptuneLogger
- class lightning.pytorch.loggers.NeptuneLogger(*, api_key=None, project=None, name=None, run=None, log_model_checkpoints=True, prefix='training', **neptune_run_kwargs)[source]
Bases: lightning.pytorch.loggers.logger.Logger
Log using Neptune.
Install it with pip:
pip install neptune
or conda:
conda install -c conda-forge neptune-client
Quickstart
Pass a NeptuneLogger instance to the Trainer to log metadata with Neptune:
from lightning.pytorch import Trainer
from lightning.pytorch.loggers import NeptuneLogger
import neptune

neptune_logger = NeptuneLogger(
    api_key=neptune.ANONYMOUS_API_TOKEN,  # replace with your own
    project="common/pytorch-lightning-integration",  # format "workspace-name/project-name"
    tags=["training", "resnet"],  # optional
)
trainer = Trainer(max_epochs=10, logger=neptune_logger)
How to use NeptuneLogger?
Use the logger anywhere in your LightningModule as follows:
from neptune.types import File
from lightning.pytorch import LightningModule


class LitModel(LightningModule):
    def training_step(self, batch, batch_idx):
        # log metrics
        loss = ...
        self.log("train/loss", loss)

    def any_lightning_module_function_or_hook(self):
        # log images
        img = ...
        self.logger.experiment["train/misclassified_images"].append(File.as_image(img))

        # generic recipe
        metadata = ...
        self.logger.experiment["your/metadata/structure"] = metadata
Note that the syntax self.logger.experiment["your/metadata/structure"].append(metadata) is specific to Neptune and extends the logger capabilities. It lets you log various types of metadata, such as scores, files, images, interactive visuals, and CSVs. Refer to the Neptune docs for details. You can also use the regular logger methods log_metrics() and log_hyperparams() with NeptuneLogger.
Log after fitting or testing is finished
You can log objects after the fitting or testing methods are finished:
import lightning.pytorch as pl
from lightning.pytorch.loggers import NeptuneLogger

neptune_logger = NeptuneLogger(project="common/pytorch-lightning-integration")

trainer = pl.Trainer(logger=neptune_logger)
model = ...
datamodule = ...
trainer.fit(model, datamodule=datamodule)
trainer.test(model, datamodule=datamodule)

# Log objects after `fit` or `test` methods
# model summary
neptune_logger.log_model_summary(model=model, max_depth=-1)
# generic recipe
metadata = ...
neptune_logger.experiment["your/metadata/structure"] = metadata
Log model checkpoints
If you have ModelCheckpoint configured, the Neptune logger automatically logs model checkpoints. Model weights will be uploaded to the "model/checkpoints" namespace in the Neptune run. You can disable this option with:
neptune_logger = NeptuneLogger(log_model_checkpoints=False)
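Conversely, when checkpoint logging is left enabled, the setup is just a ModelCheckpoint callback plus the logger. A minimal sketch (the monitor and save_top_k values are illustrative, not prescribed by the logger):
from lightning.pytorch import Trainer
from lightning.pytorch.callbacks import ModelCheckpoint
from lightning.pytorch.loggers import NeptuneLogger

# Hypothetical checkpoint settings; any ModelCheckpoint configuration works the same way.
checkpoint_callback = ModelCheckpoint(monitor="val/loss", save_top_k=2)
neptune_logger = NeptuneLogger(project="common/pytorch-lightning-integration")

# Checkpoints saved by the callback are uploaded under "model/checkpoints" in the run.
trainer = Trainer(logger=neptune_logger, callbacks=[checkpoint_callback], max_epochs=5)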
Pass additional parameters to the Neptune run
You can also pass neptune_run_kwargs to add details to the run, like tags or description:
from lightning.pytorch import Trainer
from lightning.pytorch.loggers import NeptuneLogger

neptune_logger = NeptuneLogger(
    project="common/pytorch-lightning-integration",
    name="lightning-run",
    description="mlp quick run with pytorch-lightning",
    tags=["mlp", "quick-run"],
)
trainer = Trainer(max_epochs=3, logger=neptune_logger)
Check the run documentation for more info about additional run parameters.
Details about Neptune run structure
Runs can be viewed as nested dictionary-like structures that you define in your code. This lets you organize your metadata in whatever way is most convenient for you.
The hierarchical structure that you apply to your metadata is reflected in the Neptune web app.
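As an illustration, a minimal sketch of how nested namespaces map onto the run; the key names used here (such as "data/version" and "metrics/train/accuracy") are arbitrary:
from lightning.pytorch.loggers import NeptuneLogger

neptune_logger = NeptuneLogger(project="common/pytorch-lightning-integration")
run = neptune_logger.experiment  # the underlying Neptune run object

# Each "/" in a key creates a level of nesting in the Neptune web app.
run["data/version"] = "v1.2"
run["metrics/train/accuracy"].append(0.95)
run["metrics/val/accuracy"].append(0.93)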
See also
Read about what objects you can log to Neptune.
Check out an example run with multiple types of metadata logged.
For more detailed examples, see the user guide.
- Parameters
api_key (Optional[str]) – Optional. Neptune API token, found on https://neptune.ai upon registration. You should save your token to the NEPTUNE_API_TOKEN environment variable and leave the api_key argument out of your code. Instructions: Setting your API token.
project (Optional[str]) – Optional. Name of a project in the form "workspace-name/project-name", for example "tom/mask-rcnn". If None, the value of the NEPTUNE_PROJECT environment variable is used. You need to create the project on https://neptune.ai first.
name (Optional[str]) – Optional. Editable name of the run. The run name is displayed in the Neptune web app.
run – Optional. Default is None. A Neptune Run object. If specified, this existing run will be used for logging, instead of a new run being created. You can also pass a namespace handler object; for example, run["test"], in which case all metadata is logged under the "test" namespace inside the run (see the sketch after this parameter list).
log_model_checkpoints (Optional[bool]) – Optional. Default is True. Log model checkpoints to Neptune. Works only if ModelCheckpoint is passed to the Trainer.
prefix (str) – Optional. Default is "training". Root namespace for all metadata logging.
**neptune_run_kwargs – Additional arguments like tags, description, capture_stdout, etc. used when a new run is created.
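For example, to continue logging to an existing run instead of creating a new one, a minimal sketch; the with_id value is a hypothetical run ID, not a real one:
import neptune
from lightning.pytorch import Trainer
from lightning.pytorch.loggers import NeptuneLogger

# Reconnect to an existing run; "PYTOR-123" is a placeholder run ID.
run = neptune.init_run(project="common/pytorch-lightning-integration", with_id="PYTOR-123")

# All Lightning metadata is appended to that run rather than a new one.
neptune_logger = NeptuneLogger(run=run)
trainer = Trainer(logger=neptune_logger)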
- Raises
ModuleNotFoundError – If the required Neptune package is not installed.
ValueError – If an argument passed to the logger’s constructor is incorrect.
- after_save_checkpoint(checkpoint_callback)[source]
Automatically log checkpointed model. Called after model checkpoint callback saves a new checkpoint.
- log_hyperparams(params)[source]
Log hyperparameters to the run.
Hyperparameters will be logged under the “<prefix>/hyperparams” namespace.
Note
You can also log parameters by directly using the logger instance:
neptune_logger.experiment["model/hyper-parameters"] = params_dict
This way, you can retain the hierarchical structure of the parameters.
- Parameters
params (Union[Dict[str, Any], Namespace]) – dict. Python dictionary structure with parameters.
Example:
from lightning.pytorch.loggers import NeptuneLogger
import neptune

PARAMS = {
    "batch_size": 64,
    "lr": 0.07,
    "decay_factor": 0.97,
}

neptune_logger = NeptuneLogger(
    api_key=neptune.ANONYMOUS_API_TOKEN,
    project="common/pytorch-lightning-integration",
)

neptune_logger.log_hyperparams(PARAMS)
- Return type
None
- property experiment: Run
Actual Neptune run object. Allows you to use Neptune logging features in your LightningModule.
Example:
class LitModel(LightningModule):
    def training_step(self, batch, batch_idx):
        # log metrics
        acc = ...
        self.logger.experiment["train/acc"].append(acc)

        # log images
        img = ...
        self.logger.experiment["train/misclassified_images"].append(File.as_image(img))
Note that the syntax self.logger.experiment["your/metadata/structure"].append(metadata) is specific to Neptune and extends the logger capabilities. It lets you log various types of metadata, such as scores, files, images, interactive visuals, and CSVs. Refer to the Neptune docs for more detailed explanations. You can also use the regular logger methods log_metrics() and log_hyperparams() with NeptuneLogger.
- Return type
Run
- property name: Optional[str]
Return the experiment name, or "offline-name" when the experiment is run in offline mode.
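For instance, a minimal sketch assuming offline mode is enabled through **neptune_run_kwargs (the mode argument is forwarded to the underlying Neptune run):
from lightning.pytorch.loggers import NeptuneLogger

# "mode" is passed through **neptune_run_kwargs to the Neptune run.
neptune_logger = NeptuneLogger(
    project="common/pytorch-lightning-integration",
    mode="offline",
)

print(neptune_logger.name)  # "offline-name" when running in offline mode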