Scruff.Algorithms

Scruff.Algorithms.QueryableType

A query target is either a variable instance or a variable. Allowing queries to be defined in terms of instances rather than variables makes it possible to ask queries across multiple instances of a variable at different times. However, in many cases only the current instance of a variable is needed, and then it is easier to query by variable.
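
For example, the query functions documented below (marginal, probability, expectation) accept either form. A minimal sketch, assuming alg and runtime come from an earlier inference run, x is a Variable in the network, and current_instance is a runtime accessor (an assumption, not documented on this page):

    # Query by variable: refers to the current instance of x.
    marginal(alg, runtime, x)

    # Query by instance: pins down exactly which instance is meant,
    # e.g. one retrieved from the runtime for a particular time.
    inst = current_instance(runtime, x)    # assumed accessor
    marginal(alg, runtime, inst)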

source
Scruff.Algorithms.AlgorithmType
Algorithm

The supertype of all algorithms.

A standard set of queries is defined for algorithms. Any given subtype of Algorithm will implement a subset of these queries.

source
Scruff.Algorithms.CoherentWindowType
struct CoherentWindow <: WindowCreator end

A variant of AsyncWindow that ensures that parent values are never stale for any variable that gets updated in a filter step. In other words, if any parent of a direct parent has been updated more recently than a variable to be updated, the direct parent is added to the variables to be updated. This condition then recurses for the direct parents.

source
Scruff.Algorithms.FilterType
abstract type Filter <: Algorithm end

General type of filtering algorithms.

Must implement init_filter and filter_step methods.

source
Scruff.Algorithms.ImportanceType
Importance <: InstantAlgorithm

Representation of an importance sampling algorithm.

Arguments

  • proposal_function: Specifies how the algorithm should make proposals. This is a function that takes a runtime and an instance and returns a proposer. The proposer takes parent values and proposes a value for the instance along with a log score.
  • num_particles: The number of completed particles to use. This is not necessarily the number attempted. If there are rejections, the algorithm will continue to create particles until num_particles have been completed. Warning: with impossible evidence, the process will not terminate.
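
A minimal sketch of constructing and running an importance sampler. It assumes the constructor takes the arguments in the order listed above, that net is an InstantNetwork built elsewhere, and that Runtime and infer are the usual entry points for instant algorithms; lw_proposal is documented later in this section:

    runtime = Runtime(net)               # assumed instant runtime constructor

    # Keep sampling until 1000 particles have been completed,
    # proposing with likelihood weighting.
    alg = Importance(lw_proposal, 1000)
    infer(alg, runtime)                  # evidence and interventions assumed to default to empty

    probability(alg, runtime, x, true)   # x is a Variable in net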

source
Scruff.Algorithms.IterativeAlgorithmType
abstract type IterativeAlgorithm <: InstantAlgorithm

Algorithm that runs iteratively on an InstantNetwork.

The algorithm should support two methods: prepare and refine.

An IterativeAlgorithm is also trivially an InstantAlgorithm, where infer is implemented by calling prepare and refine once.
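
A sketch of the iterative protocol, assuming alg is an IterativeAlgorithm such as LSFI and runtime is an InstantRuntime; prepare and refine are documented later in this section:

    prepare(alg, runtime, evidence)   # store evidence and initial state in the runtime
    for _ in 1:10
        refine(alg, runtime)          # each call improves the current answer
        # queries such as marginal(alg, runtime, x) can be answered between refinements
    end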

source
Scruff.Algorithms.LazyInferenceType
struct LazyInference <: IterativeAlgorithm

An iterative algorithm that expands recursively and increases the ranges of instances on every iteration.

source
Scruff.Algorithms.LazyStateType
mutable struct LazyState

Maintains the state of a lazy algorithm

Fields

  • previous_algorithm: The last instant algorithm used, if any
  • evidence: The evidence supplied in prepare
  • interventions: The interventions supplied in prepare
  • placeholder_beliefs: The placeholder beliefs supplied in prepare
  • next_size: The range size to use in the next call to refine
  • next_depth: The depth to use in the next call to refine
  • next_iteration: The number of the next iteration
  • is_complete: A flag indicating whether the network has been fully expanded
  • order: The order of nodes used in computations
source
Scruff.Algorithms.LoopyBPType
LoopyBP

An instant algorithm that runs loopy belief propagation.

Arguments

  • defaultrangesize: The size to use as default when calling support on a node.
  • epsilon: The allowable difference between beliefs on successive iterations for termination.
  • maxiterations: The maximum number of iterations to run. infer will terminate if this number of iterations is reached, even if it has not converged.

source
Scruff.Algorithms.VEType
VE(query_vars::Vector{Variable})

An instant algorithm that runs variable elimination.

Arguments

  • network
  • query_vars: The variables to query, which are not eliminated
  • depth: A depth of 1 means not to expand expanders, otherwise expands recursively to the given depth
  • bounds: If true, return lower and upper bounds factors, otherwise just return a single factor
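
A minimal sketch of running variable elimination and querying the result; net, x, and y are assumed to exist, and Runtime and infer are assumed to be the usual instant-algorithm entry points:

    alg = VE([x, y])            # x and y are kept (not eliminated) so they can be queried
    runtime = Runtime(net)      # assumed instant runtime constructor
    infer(alg, runtime)

    marginal(alg, runtime, x)   # marginal distribution over x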
source
Scruff.Algorithms.WindowFilterType
struct WindowFilter <: Filter

General construction for a filter based on a flexible windowing scheme.

Arguments

  • windowcreator: Defines the method used to create windows
  • inferencealgorithm: Defines the algorithm to use on a window
  • postprocess!: A postprocessing function that takes the runtime and does any additional processing needed to carry to the next iteration. Defaults to doing nothing.

source
Scruff.Algorithms.AsyncBPFunction
AsyncBP(range_size = 10, T = Float64)

A window filter that uses an asynchronous window with ThreePassBP with the given range size. T represents the time type and must be the same as used in creation of the runtime.

source
Scruff.Algorithms.AsyncLoopyFunction
AsyncLoopy(range_size = 10, T = Float64)

A window filter that uses an asynchronous window with LoopyBP with the given range size. T represents the time type and must be the same as used in creation of the runtime.

source
Scruff.Algorithms.AsyncPFFunction
AsyncPF(num_particles::Int, resampling_size::Int = num_particles, T = Float64)

A particle filter that uses an asynchronous window with the given number of particles and resampling buffer size. T represents the time type and must be the same as used in creation of the runtime.

source
Scruff.Algorithms.CoherentBPFunction
CoherentBP(range_size = 10, T = Float64)

A window filter that uses a coherent window with ThreePassBP with the given range size. T represents the time type and must be the same as used in creation of the runtime.

source
Scruff.Algorithms.CoherentLoopyFunction
CoherentLoopy(range_size = 10, T = Float64)

A window filter that uses a coherent window with LoopyBP with the given range size. T represents the time type and must be the same as used in creation of the runtime.

source
Scruff.Algorithms.CoherentPFFunction
CoherentPF(num_particles::Int, resampling_size::Int = num_particles, T = Float64)

A particle filter that uses a coherent window with the given number of particles and resampling buffer size. T represents the time type and must be the same as used in creation of the runtime.

source
Scruff.Algorithms.LSFIMethod
function LSFI(query_vars; 
    increment = 10, start_size = increment, max_iterations = 100, start_depth = 1)

A lazy inference algorithm that uses variable elimination at every step.

Arguments

  • query_vars: Variables that can be queried after each refine step
  • increment: The increment to range size on every iteration
  • start_size: The starting range size
  • max_iterations: The maximum number of refinement steps
  • start_depth: The depth of recursive expansion in the first iteration
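
A sketch of lazy inference with bounds, assuming runtime is an InstantRuntime for the network containing x, and range_of_x is the list of values of interest:

    alg = LSFI([x]; start_size = 10, increment = 10, max_iterations = 5)
    prepare(alg, runtime)        # evidence and interventions default to empty
    for _ in 1:5
        refine(alg, runtime)     # expand ranges and recursion depth
        lower, upper = probability_bounds(alg, runtime, x, range_of_x)
    end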
source
Scruff.Algorithms.SyncBPFunction
SyncBP(range_size = 10)

A window filter that uses a synchronous window with ThreePassBP with the given range size.

source
Scruff.Algorithms.SyncPFFunction
SyncPF(num_particles::Int, resampling_size::Int = num_particles)

A particle filter that uses a synchronous window with the given number of particles and resampling buffer size.

source
Scruff.Algorithms.answerMethod
answer(::Query, ::Algorithm, ::Runtime, ::VariableInstance)
answer(::Query, ::Algorithm, ::Runtime, ::Vector{VariableInstance})
answer(::Query, ::Algorithm, ::Runtime, ::Queryable)
answer(::Query, ::Algorithm, ::Runtime, ::Vector{Queryable})

Answer the query.

An implementation of an algorithm should implement an `answer` method for any queries
it can handle. The type hierarchies of `Query` and `Algorithm` will enable
query answering methods to be used wherever appropriate with the right specialization.
The implementations of `answer` are differentiated along two dimensions:
- single or multiple items
- queryable items in general or specifically instances

An algorithm is expected to implement one of the first two methods for the queries it
can handle, i.e., to handle either a single instance or a vector of instances.
If it can handle multiple instances, it should implement the second method; a single-instance
implementation is then derived by default using a singleton vector. An algorithm can still override
this default method if it handles single instances differently from multiple instances.

Algorithms will generally not implement the latter two methods, which are provided for convenience.
Default implementations are provided that delegate to the instance-specific methods.

Defining a very high-level default implementation that throws an error enables implementations
to fall back through a sequence of preferred methods before failing.
source
Scruff.Algorithms.create_windowMethod
create_window(::SyncWindow, runtime::Runtime, variables::Vector{<:Variable}, time::Int)::Vector{Instance}

Creates a window by instantiating all variables at all intermediate time steps from the earliest parent to the given time. The variables argument is ignored.

source
Scruff.Algorithms.expectationMethod
expectation(alg::Algorithm, runtime::Runtime, item::Queryable, f::Function)::Float64

Return the expectation of the function f over the marginal distribution of item.

The default implementation uses the expectation operation on the SFunc representing the marginal distribution.
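
For example, the mean and variance of a numeric item can be computed from two calls (a sketch; alg, runtime, and x are assumed to come from an earlier inference run):

    m  = expectation(alg, runtime, x, v -> v)     # mean, using the identity function
    m2 = expectation(alg, runtime, x, v -> v^2)   # second moment
    var_x = m2 - m^2                              # variance from the two moments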

source
Scruff.Algorithms.filter_stepMethod
filter_step(filter::Filter, runtime::Runtime, variables::Vector{Variable}, time::T, evidence::Dict{Symbol, Score})

Run one step of the filter by instantiating the given variables at the given time and passing in the given evidence.
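
A sketch of a filtering loop, assuming net is a DynamicNetwork with variables x and y, Runtime(net) constructs a dynamic runtime, init_filter initializes the filter state (see Filter above), observations is some data source, and HardScore is the hard-evidence Score type:

    pf = SyncPF(1000)
    runtime = Runtime(net)       # assumed dynamic runtime constructor
    init_filter(pf, runtime)     # assumed initialization method

    for (t, obs) in enumerate(observations)
        evidence = Dict{Symbol, Score}(:y => HardScore(obs))   # HardScore assumed
        filter_step(pf, runtime, [x, y], t, evidence)
        # beliefs can be queried between steps, e.g. marginal(pf, runtime, x)
    end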

source
Scruff.Algorithms.instant_runtime_from_instancesMethod
instant_runtime_from_instances(runtime::DynamicRuntime, instances::Vector{Instance})

Create an instant runtime from the given instances in the given dynamic runtime.

This runtime has an instant network that contains a variable for each variable instance in instances, tagged with the time of the instance, and a placeholder for each placeholder instance. The function also instantiates the variables in the instant runtime and stores any runtime values from the dynamic runtime with the corresponding instances in the instant runtime. This function is useful for running instant algorithms on a time window for dynamic reasoning.

source
Scruff.Algorithms.jointMethod
joint(alg::Algorithm, run::Runtime, items::Vector{Queryable})::Union{Score{O}, Tuple{Score{O}, Score{O}}}

Return the joint distribution over items, or lower and upper distributions, depending on the algorithm.

The returned Score assigns a score for each Vector of values of the items.

source
Scruff.Algorithms.lw_proposalMethod
lw_proposal(runtime::Runtime, instance::VariableInstance, parent_values::Tuple)

Return a proposer and scorer to implement likelihood weighting.

This proposal scheme is the same as the prior proposal unless a variable has hard evidence. In the case of hard evidence, the proposer sets the value of the variable to the evidence value and scores it by the log conditional probability of the evidence given the parent values.

source
Scruff.Algorithms.make_custom_proposalMethod
make_custom_proposal(custom_sfs::Dict{Symbol, SFunc})

Create a proposal function for a custom proposal scheme.

Returns a proposal function that can be provided to the Importance constructor. Evidence is handled similarly to lw_proposal, except that the custom proposal is used for soft evidence.

Arguments

  • custom_sfs: A dictionary mapping variable names to custom sfuncs used for their proposal. The dictionary need not be complete; if a variable is not in this dictionary, its standard sfunc will be used.
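
A sketch of a custom proposal for one variable, assuming Cat is the categorical sfunc and takes values and probabilities:

    custom = Dict{Symbol, SFunc}(:x => Cat([1, 2, 3], [0.2, 0.3, 0.5]))   # Cat constructor assumed
    prop = make_custom_proposal(custom)   # variables other than :x keep their standard sfuncs
    alg = Importance(prop, 1000)          # pass the proposal function to Importance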

source
Scruff.Algorithms.marginalMethod
marginal(parts::Particles, x::Symbol)::Cat

Returns a Cat representing the marginal distribution over the given symbol according to parts

source
Scruff.Algorithms.marginalMethod
marginal(alg::Algorithm, runtime::Runtime, item::Queryable{O})::Union{Dist{O}, Tuple{Dist{O}, Dist{O}}} where O

Return the marginal distribution over item, or lower and upper marginals, depending on the algorithm.

The returned distribution assigns a probability to each value of item.

source
Scruff.Algorithms.prepareMethod
prepare(algorithm::IterativeAlgorithm, runtime::InstantRuntime,
    evidence::Dict{Symbol, <:Score}, 
    interventions::Dict{Symbol, <:Dist},
    placeholder_beliefs::Dict{Symbol, <:Dist})

Prepare the inference algorithm for iteration.

Stores the algorithm state in runtime.

Arguments

  • algorithm: The iterative algorithm to run.
  • runtime: The runtime in which to run the algorithm.
  • evidence: The supplied evidence, which defaults to Dict().
  • interventions: The supplied interventions, which defaults to Dict().
  • placeholder_beliefs: The beliefs associated with the placeholders in the network, which default to Dict().

source
Scruff.Algorithms.probabilityMethod
probability(alg::Algorithm, run::Runtime, items::Vector{Queryable}, predicate::Function)::Union{Float64, Tuple{Float64, Float64}}

Return the probability that items satisfy the predicate, or lower and upper probabilities.

predicate is a function from Vector{Any} to Bool.
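
For example, the probability that one queried item exceeds another (a sketch; alg, runtime, x, and y are assumed to come from an earlier inference run):

    # values arrives as a Vector in the same order as the queried items
    probability(alg, runtime, Queryable[x, y], values -> values[1] > values[2])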

source
Scruff.Algorithms.probabilityMethod
probability(alg::Algorithm, run::Runtime, item::Queryable{O}, value::O)::Union{Float64, Tuple{Float64, Float64}} where O

Return the probability that item has the given value, or lower and upper probabilities.

The default implementation tries to use the more general probability of a query. If that fails, it uses the cpdf operation on the marginal of item.

source
Scruff.Algorithms.probability_boundsMethod
probability_bounds(alg::Algorithm, run::Runtime, item::Queryable, range::Vector)::Tuple{Vector{Float64}, Vector{Float64}}

For an algorithm that produces lower and upper bounds, return vectors of lower and upper bounds on probabilities for values in the range.

The range is important for computing the bounds, because it is assumed that values outside the range have probability zero.
source
Scruff.Algorithms.refineMethod
refine(algorithm::IterativeAlgorithm, runtime::InstantRuntime)

Perform the next iteration of the algorithm.

Uses the algorithm state stored in runtime and stores the next state in runtime.

source
Scruff.Algorithms.rejection_proposalMethod
rejection_proposal(::Runtime, instance::VariableInstance, parent_values::Tuple)

Return a proposer and scorer to implement standard rejection sampling from the prior. It proposes a value for the instance from its sfunc, and scores it by the evidence, if any.

source