Note

All plotting functions require a valid Makie backend, e.g. CairoMakie, to be loaded. If no backend is loaded, the functions will not render anything. On Julia 1.9+, Makie is a weak dependency and so incurs no compilation or dependency cost unless a backend is loaded. On Julia < 1.9, Makie will still be loaded, but without a backend the resulting figure will not be rendered.

Confusion matrices

Lighthouse.plot_confusion_matrix — Function
plot_confusion_matrix!(subfig::GridPosition, args...; kw...)

plot_confusion_matrix(confusion::AbstractMatrix{<: Number},
                      class_labels::AbstractVector{String},
                      normalize_by::Union{Symbol,Nothing}=nothing;
                      size=(800,600), annotation_text_size=20)

Lighthouse plots confusion matrices: simple tables showing the empirical distribution of predicted classes (the rows) versus elected classes (the columns). These can optionally be normalized:

  • row-normalized (:Row): this means each row has been normalized to sum to 1. Thus, the row-normalized confusion matrix shows the empirical distribution of elected classes for a given predicted class. E.g. the first row of the row-normalized confusion matrix shows the empirical probabilities of the elected classes for a sample which was predicted to be in the first class.
  • column-normalized (:Column): this means each column has been normalized to sum to 1. Thus, the column-normalized confusion matrix shows the empirical distribution of predicted classes for a given elected class. E.g. the first column of the column-normalized confusion matrix shows the empirical probabilities of the predicted classes for a sample which was elected to be in the first class.
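The two normalizations above can be sketched directly on a matrix of raw counts (a minimal illustration, not Lighthouse's implementation):

```julia
# Row- and column-normalizing a 2x2 confusion matrix of raw counts.
confusion = [8 2; 3 7]
row_normalized = confusion ./ sum(confusion; dims=2)  # each row sums to 1
col_normalized = confusion ./ sum(confusion; dims=1)  # each column sums to 1
```

Here `row_normalized[1, 1]` is 8/10 = 0.8: the empirical probability that a sample predicted to be in class 1 was also elected class 1.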
fig, ax, p = plot_confusion_matrix(rand(2, 2), ["1", "2"])
fig = Figure()
ax = plot_confusion_matrix!(fig[1, 1], rand(2, 2), ["1", "2"], :Row)
ax = plot_confusion_matrix!(fig[1, 2], rand(2, 2), ["1", "2"], :Column)
Note

This function requires a valid Makie backend (e.g. CairoMakie) to be loaded.

using Lighthouse: plot_confusion_matrix, plot_confusion_matrix!

classes = ["red", "orange", "yellow", "green"]
ground_truth =     [1, 1, 1, 2, 2, 2, 3, 3, 3, 4, 4, 4]
predicted_labels = [1, 1, 1, 1, 2, 2, 4, 4, 4, 4, 4, 3]
confusion = Lighthouse.confusion_matrix(length(classes), zip(predicted_labels, ground_truth))

fig, ax, p = plot_confusion_matrix(confusion, classes)
fig = Figure(size=(800, 400))
plot_confusion_matrix!(fig[1, 1], confusion, classes, :Row, annotation_text_size=14)
plot_confusion_matrix!(fig[1, 2], confusion, classes, :Column, annotation_text_size=14)
fig
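For intuition, the confusion matrix built above simply counts (predicted, elected) label pairs, with predicted classes as rows and elected classes as columns. A hand-rolled equivalent (a sketch, not Lighthouse's implementation) could look like:

```julia
# Count (predicted, elected) label pairs into a matrix whose
# rows are predicted classes and columns are elected classes.
function simple_confusion(n_classes, pairs)
    C = zeros(Int, n_classes, n_classes)
    for (predicted, elected) in pairs
        C[predicted, elected] += 1
    end
    return C
end

ground_truth     = [1, 1, 1, 2, 2, 2, 3, 3, 3, 4, 4, 4]
predicted_labels = [1, 1, 1, 1, 2, 2, 4, 4, 4, 4, 4, 3]
C = simple_confusion(4, zip(predicted_labels, ground_truth))
```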

Theming

All plots can be themed globally by setting a theme entry named after the plot function in CamelCase (e.g. `ConfusionMatrix` for `plot_confusion_matrix`). Each entry usually has a few sub-categories, e.g. for the axis, text, and subplots.

Warning

Make sure that you spell the names correctly and fully construct the named tuples in these calls. E.g. `(color=:red)` is not a named tuple; it needs to be `(color=:red,)`. Misspelled names and badly constructed named tuples cannot easily be turned into errors, since these theming attributes are global and may be valid for other plots.
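The pitfall warned about above is easy to check in the REPL: without the trailing comma, the parentheses are just grouping around an assignment expression, not a named tuple constructor:

```julia
nt  = (color = :red,)  # a one-element NamedTuple
val = (color = :red)   # an assignment: `color` is set, and `:red` is the value
```

`nt` is a `NamedTuple`, while `val` is just the `Symbol` `:red`, which a theme would silently ignore or misapply.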

with_theme(
        ConfusionMatrix = (
            Text = (
                color=:yellow,
            ),
            Heatmap = (
                colormap=:greens,
            ),
            Axis = (
                backgroundcolor=:black,
                xticklabelrotation=0.0,
            )
        )
    ) do
    plot_confusion_matrix(confusion, classes, :Row)
end

Reliability calibration curves

Lighthouse.plot_reliability_calibration_curves — Function
plot_reliability_calibration_curves!(fig::SubFigure, args...; kw...)

plot_reliability_calibration_curves(per_class_reliability_calibration_curves::SeriesCurves,
                                    per_class_reliability_calibration_scores::NumberVector,
                                    class_labels::AbstractVector{String};
                                    legend=:rb, size=(800, 600))
Note

This function requires a valid Makie backend (e.g. CairoMakie) to be loaded.

using Lighthouse: plot_reliability_calibration_curves
classes = ["class $i" for i in 1:5]
curves = [(LinRange(0, 1, 10), range(0, stop=i/2, length=10) .+ (stable_randn(10) .* 0.1)) for i in -1:3]

plot_reliability_calibration_curves(
    curves,
    stable_rand(5),
    classes
)
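For background, a reliability calibration curve bins predicted probabilities and compares each bin's mean prediction to its empirical positive rate; a perfectly calibrated classifier lies on the diagonal. A minimal sketch for the binary case (`reliability_curve` is illustrative, not a Lighthouse function):

```julia
# Bin predicted probabilities and compare each bin's mean prediction
# to the fraction of positives it contains.
function reliability_curve(probs, labels; nbins=10)
    edges = range(0, 1; length=nbins + 1)
    mean_pred, frac_pos = Float64[], Float64[]
    for i in 1:nbins
        inbin = findall(p -> edges[i] <= p < edges[i + 1] ||
                             (i == nbins && p == 1.0), probs)
        isempty(inbin) && continue
        push!(mean_pred, sum(probs[inbin]) / length(inbin))
        push!(frac_pos, sum(labels[inbin]) / length(inbin))
    end
    return mean_pred, frac_pos
end

probs  = [0.05, 0.15, 0.85, 0.95]
labels = [0, 0, 1, 1]
x, y = reliability_curve(probs, labels; nbins=2)
```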

Note that all curve plot types accept the same curve argument types.

Theming

All generic series and axis attributes can be themed via `SeriesPlot.Series` / `SeriesPlot.Axis`. Have a look at the series documentation for an idea of the applicable attributes. To style a specific subplot inside the curve plot, e.g. the ideal line plot, use the CamelCase function name (without `plot_`) and pass the attributes there. E.g. the ideal curve inside the reliability curve plot can be themed like this:

# The axis is getting created in the seriesplot,
# to always have these kind of probabilistic series have the same axis
series_theme = (
    Axis = (
        backgroundcolor = (:gray, 0.1),
        bottomspinevisible = false,
        leftspinevisible = false,
        topspinevisible = false,
        rightspinevisible = false,
    ),
    Series = (
        color=:darktest,
        marker=:circle
    )
)
with_theme(
        ReliabilityCalibrationCurves = (
            Ideal = (
                color=:red, linewidth=3
            ),
        ),
        SeriesPlot = series_theme
    ) do
    plot_reliability_calibration_curves(
        curves,
        stable_rand(5),
        classes
    )
end

Binary Discrimination Calibration Curves

Lighthouse.plot_binary_discrimination_calibration_curves — Function
plot_binary_discrimination_calibration_curves!(fig::SubFigure, args...; kw...)

plot_binary_discrimination_calibration_curves(calibration_curve::SeriesCurves, calibration_score,
                                              per_expert_calibration_curves::SeriesCurves,
                                              per_expert_calibration_scores, optimal_threshold,
                                              discrimination_class::AbstractString;
                                              marker=:rect, markersize=5, linewidth=2)
Note

This function requires a valid Makie backend (e.g. CairoMakie) to be loaded.

using Lighthouse: plot_binary_discrimination_calibration_curves

Lighthouse.plot_binary_discrimination_calibration_curves(
    curves[3],
    stable_rand(5),
    curves[[1, 2, 4, 5]],
    nothing, nothing,
    "",
)

PR curves

Lighthouse.plot_pr_curves — Function
plot_pr_curves!(subfig::GridPosition, args...; kw...)

plot_pr_curves(per_class_pr_curves::SeriesCurves,
            class_labels::AbstractVector{<: String};
            size=(800, 600),
            legend=:lt, title="PR curves",
            xlabel="True positive rate", ylabel="Precision",
            linewidth=2, scatter=NamedTuple(), color=:darktest)
  • scatter::Union{Nothing, NamedTuple}: can be set to a named tuple of attributes that are forwarded to the scatter call (e.g. `markersize`). If `nothing`, no scatter is added.
Note

This function requires a valid Makie backend (e.g. CairoMakie) to be loaded.

using Lighthouse: plot_pr_curves
plot_pr_curves(
    curves,
    classes
)
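As a refresher on what each point on such a curve means: at a given threshold, precision is TP/(TP+FP) and recall (the true positive rate on the x axis above) is TP/(TP+FN). A sketch for binary labels (`pr_point` is illustrative, not a Lighthouse function):

```julia
# One precision/recall point for binary 0/1 labels at a fixed threshold.
function pr_point(scores, labels, threshold)
    predicted = scores .>= threshold
    tp = count(predicted .& (labels .== 1))
    fp = count(predicted .& (labels .== 0))
    fn = count(.!predicted .& (labels .== 1))
    return tp / (tp + fp), tp / (tp + fn)  # (precision, recall)
end

precision, recall = pr_point([0.9, 0.8, 0.3, 0.1], [1, 0, 1, 0], 0.5)
```

Sweeping the threshold from 1 down to 0 traces out the full PR curve.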

Theming

# The plots with only a series don't have a special keyword
with_theme(SeriesPlot = series_theme) do
    plot_pr_curves(
        curves,
        classes
    )
end

ROC curves

Lighthouse.plot_roc_curves — Function
plot_roc_curves!(subfig::GridPosition, args...; kw...)

plot_roc_curves(per_class_roc_curves::SeriesCurves,
                per_class_roc_aucs::NumberVector,
                class_labels::AbstractVector{<: String};
                size=(800, 600),
                legend=:lt,
                title="ROC curves",
                xlabel="False positive rate",
                ylabel="True positive rate",
                linewidth=2, scatter=NamedTuple(), color=:darktest)
  • scatter::Union{Nothing, NamedTuple}: can be set to a named tuple of attributes that are forwarded to the scatter call (e.g. `markersize`). If `nothing`, no scatter is added.
Note

This function requires a valid Makie backend (e.g. CairoMakie) to be loaded.

using Lighthouse: plot_roc_curves

plot_roc_curves(
    curves,
    stable_rand(5),
    classes,
    legend=:lt)
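The `per_class_roc_aucs` argument expects one area-under-curve value per class. Given an ROC curve as (false positive rate, true positive rate) vectors, a trapezoidal AUC can be sketched like this (illustrative; not necessarily how Lighthouse computes its AUCs):

```julia
# Trapezoidal area under a curve given as sorted x and matching y vectors.
trapz(x, y) = sum((x[i + 1] - x[i]) * (y[i + 1] + y[i]) / 2 for i in 1:(length(x) - 1))

fpr = [0.0, 0.5, 1.0]
tpr = [0.0, 1.0, 1.0]
auc = trapz(fpr, tpr)  # 0.75 for this toy curve
```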

Theming

# The plots with only a series don't have a special keyword
with_theme(SeriesPlot = series_theme) do
    plot_roc_curves(
        curves,
        stable_rand(5),
        classes,
        legend=:lt)
end

Kappas (per expert agreement)

Lighthouse.plot_kappas — Function
plot_kappas!(subfig::GridPosition, args...; kw...)

plot_kappas(per_class_kappas::NumberVector,
            class_labels::AbstractVector{String},
            per_class_IRA_kappas=nothing;
            size=(800, 600),
            annotation_text_size=20)
Note

This function requires a valid Makie backend (e.g. CairoMakie) to be loaded.

using Lighthouse: plot_kappas
plot_kappas(stable_rand(5), classes)
using Lighthouse: plot_kappas
plot_kappas(stable_rand(5), classes, stable_rand(5))
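For context, a kappa statistic measures chance-corrected agreement between two sets of labels: κ = (pₒ − pₑ)/(1 − pₑ), where pₒ is the observed agreement and pₑ the agreement expected by chance. A minimal sketch of Cohen's kappa for two raters (illustrative; not necessarily the exact estimator Lighthouse uses):

```julia
# Cohen's kappa: observed agreement corrected by chance agreement
# computed from each rater's marginal class frequencies.
function cohens_kappa(a, b, n_classes)
    n = length(a)
    po = count(a .== b) / n
    pe = sum((count(==(k), a) / n) * (count(==(k), b) / n) for k in 1:n_classes)
    return (po - pe) / (1 - pe)
end

κ = cohens_kappa([1, 1, 2, 2], [1, 1, 2, 1], 2)  # 0.5 for this toy example
```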

Theming

with_theme(
        Kappas = (
            Axis = (
                xticklabelsvisible=false,
                xticksvisible=false,
                leftspinevisible = false,
                rightspinevisible = false,
                bottomspinevisible = false,
                topspinevisible = false,
            ),
            Text = (
                color = :blue,
            ),
            BarPlot = (color=[:black, :green],)
        )) do
    plot_kappas((1:5) ./ 5 .- 0.1, classes, (1:5) ./ 5)
end

Evaluation metrics plot

Lighthouse.evaluation_metrics_plot — Function
evaluation_metrics_plot(predicted_hard_labels::AbstractVector,
                        predicted_soft_labels::AbstractMatrix,
                        elected_hard_labels::AbstractVector,
                        classes,
                        thresholds=0.0:0.01:1.0;
                        votes::Union{Nothing,AbstractMatrix}=nothing,
                        strata::Union{Nothing,AbstractVector{Set{T}} where T}=nothing,
                        optimal_threshold_class::Union{Nothing,Integer}=nothing)

Return a plot and dictionary containing a battery of classifier performance metrics that each compare predicted_soft_labels and/or predicted_hard_labels against elected_hard_labels.

See evaluation_metrics for a description of the arguments.

This method is deprecated in favor of calling evaluation_metrics and evaluation_metrics_plot separately.

Note

This function requires a valid Makie backend (e.g. CairoMakie) to be loaded.

using Lighthouse: evaluation_metrics_plot
data = Dict{String, Any}()
data["confusion_matrix"] = stable_rand(0:100, 5, 5)
data["class_labels"] = classes

data["per_class_kappas"] = stable_rand(5)
data["multiclass_kappa"] = stable_rand()
data["per_class_IRA_kappas"] = stable_rand(5)
data["multiclass_IRA_kappas"] = stable_rand()

data["per_class_pr_curves"] = curves
data["per_class_roc_curves"] = curves
data["per_class_roc_aucs"] = stable_rand(5)

data["per_class_reliability_calibration_curves"] = curves
data["per_class_reliability_calibration_scores"] = stable_rand(5)

evaluation_metrics_plot(data)

Optionally, one can also add a binary discrimination calibration curve plot:

data["discrimination_calibration_curve"] = (LinRange(0, 1, 10), LinRange(0,1, 10) .+ 0.1randn(10))
data["per_expert_discrimination_calibration_curves"] = curves

# These are currently not used in plotting, but are still passed to `plot_binary_discrimination_calibration_curves`!
data["discrimination_calibration_score"] = missing
data["optimal_threshold_class"] = 1
data["per_expert_discrimination_calibration_scores"] = missing
data["optimal_threshold"] = missing

evaluation_metrics_plot(data)

Plots can also be generated directly from an EvaluationV1:

data_row = EvaluationV1(data)
evaluation_metrics_plot(data_row)