#1592 Documentation issues

Merged · TrigonaMinima
Coverage Reach
[Interactive coverage-reach visualization: all 67 project files, spanning classification/, labeling/, slicing/, augmentation/, utils/, analysis/, map/, preprocess/, synthetic/, and types/, plus version.py and __init__.py.]

No flags found


Showing 1 of 1 file from the diff (snorkel/labeling/model/label_model.py).

```diff
@@ -420,9 +420,10 @@
         """Return predicted labels, with ties broken according to policy.
 
         Policies to break ties include:
-        "abstain": return an abstain vote (-1)
-        "true-random": randomly choose among the tied options
-        "random": randomly choose among tied option using deterministic hash
+
+        - "abstain": return an abstain vote (-1)
+        - "true-random": randomly choose among the tied options
+        - "random": randomly choose among tied option using deterministic hash
 
         NOTE: if tie_break_policy="true-random", repeated runs may have slightly different
         results due to difference in broken ties
```
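To make the tie-break policies above concrete, here is a minimal sketch of how they might be exercised. It is not part of this PR; the import path, the toy label matrix, and the constructor arguments are assumptions and may differ across snorkel versions.

```python
import numpy as np
from snorkel.labeling.model import LabelModel  # import path assumed; may vary by version

# Toy label matrix: 4 data points x 3 labeling functions, -1 means abstain.
# Row 2 has one vote for each class, so it may produce a tie.
L = np.array([
    [0, 0, -1],
    [1, 1, 0],
    [0, 1, -1],
    [-1, -1, -1],
])

label_model = LabelModel(cardinality=2, verbose=False)
label_model.fit(L, n_epochs=100, seed=123)

preds_abstain = label_model.predict(L, tie_break_policy="abstain")          # ties -> -1
preds_random = label_model.predict(L, tie_break_policy="random")            # deterministic hash
preds_true_random = label_model.predict(L, tie_break_policy="true-random")  # may vary run to run
```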
```diff
@@ -472,9 +473,14 @@
         Y
             Gold labels associated with data points in L
         metrics
-            A list of metric names
+            A list of metric names. Possible metrics are - `accuracy`, `coverage`,
+            `precision`, `recall`, `f1`, `f1_micro`, `f1_macro`, `fbeta`,
+            `matthews_corrcoef`, `roc_auc`. See `sklearn.metrics
+            <https://scikit-learn.org/stable/modules/classes.html#module-sklearn.metrics>`_
+            for details on the metrics.
         tie_break_policy
-            Policy to break ties when converting probabilistic labels to predictions
+            Policy to break ties when converting probabilistic labels to predictions.
+            Same as :func:`.predict` method above.
 
 
         Returns
```
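As a companion to the expanded `metrics` documentation, a small self-contained sketch of requesting a few of the listed metric names; again not from the PR, with the import path and toy data assumed:

```python
import numpy as np
from snorkel.labeling.model import LabelModel  # import path assumed; may vary by version

L = np.array([[0, 0, -1], [1, 1, 0], [0, 1, 1], [-1, 0, 0]])
Y = np.array([0, 1, 1, 0])  # hypothetical gold labels for the same four data points

label_model = LabelModel(cardinality=2, verbose=False)
label_model.fit(L, n_epochs=100, seed=123)

# Any subset of the metric names listed in the docstring can be requested.
results = label_model.score(L, Y, metrics=["accuracy", "coverage", "f1"],
                            tie_break_policy="abstain")
print(results)  # dict mapping each requested metric name to its value
```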
```diff
@@ -810,7 +816,35 @@
         class_balance
             Each class's percentage of the population, by default None
         **kwargs
-            Arguments for changing train config defaults
+            Arguments for changing train config defaults.
+
+            n_epochs
+                The number of epochs to train (where each epoch is a single
+                optimization step), default is 100
+            lr
+                Base learning rate (will also be affected by lr_scheduler choice
+                and settings), default is 0.01
+            l2
+                Centered L2 regularization strength, default is 0.0
+            optimizer
+                Which optimizer to use (one of ["sgd", "adam", "adamax"]),
+                default is "sgd"
+            optimizer_config
+                Settings for the optimizer
+            lr_scheduler
+                Which lr_scheduler to use (one of ["constant", "linear",
+                "exponential", "step"]), default is "constant"
+            lr_scheduler_config
+                Settings for the LRScheduler
+            prec_init
+                LF precision initializations / priors, default is 0.7
+            seed
+                A random seed to initialize the random number generator with
+            log_freq
+                Report loss every this many epochs (steps), default is 10
+            mu_eps
+                Restrict the learned conditional probabilities to
+                [mu_eps, 1-mu_eps], default is None
 
 
         Raises
         ------
```
```diff
@@ -823,8 +857,8 @@
         >>> Y_dev = [0, 1, 0]
         >>> label_model = LabelModel(verbose=False)
         >>> label_model.fit(L)
-        >>> label_model.fit(L, Y_dev=Y_dev)
-        >>> label_model.fit(L, class_balance=[0.7, 0.3])
+        >>> label_model.fit(L, Y_dev=Y_dev, seed=2020, lr=0.05)
+        >>> label_model.fit(L, class_balance=[0.7, 0.3], n_epochs=200, l2=0.4)
         """
         # Set random seed
         self.train_config: TrainConfig = merge_config(  # type:ignore
```
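The doctest snippet above is intentionally fragmentary (`L` is defined earlier in the docstring), so here is a runnable sketch, under the same assumptions as the previous examples, of overriding several of the newly documented train-config defaults through `**kwargs`:

```python
import numpy as np
from snorkel.labeling.model import LabelModel  # import path assumed; may vary by version

L = np.array([[0, 0, -1], [1, 1, 0], [0, 1, 1], [-1, 0, 0]])

label_model = LabelModel(cardinality=2, verbose=False)
label_model.fit(
    L,
    class_balance=[0.7, 0.3],  # explicit fit argument
    n_epochs=200,              # train-config overrides passed through **kwargs
    lr=0.05,
    l2=0.4,
    optimizer="adam",
    log_freq=50,
    seed=2020,
)
```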

Showing 12 files with coverage changes.

New files:
- snorkel/labeling/model/__init__.py
- snorkel/preprocess/__init__.py
- snorkel/classification/__init__.py
- snorkel/classification/training/loggers/__init__.py
- snorkel/augmentation/__init__.py
- snorkel/slicing/__init__.py
- snorkel/__init__.py
- snorkel/utils/__init__.py
- snorkel/slicing/sf/__init__.py
- snorkel/labeling/__init__.py
- snorkel/map/__init__.py

Changes in snorkel/slicing/utils.py (+2)
Files coverage:
- snorkel: 97.18% (Δ 0.05%)
- Project Totals (67 files): 97.18%