greedy.sampler.SteepestDescentSolver.sample
- SteepestDescentSolver.sample(bqm, num_reads=None, initial_states=None, initial_states_generator='random', seed=None, large_sparse_opt=False, **kwargs)
Sample from a binary quadratic model.
Starts from initial_states, and converges to local minima using a discrete steepest-descent method.

- Parameters

bqm (BinaryQuadraticModel) – The binary quadratic model to be sampled.

num_reads (int, optional, default=len(initial_states) or 1) – Number of reads. Each read is generated by one run of the steepest descent algorithm. If num_reads is not explicitly given, it is selected to match the number of initial states given. If initial states are not provided, only one read is performed.

initial_states (samples-like, optional, default=None) – One or more samples, each defining an initial state for all the problem variables. Initial states are given one per read, but if fewer than num_reads initial states are defined, additional values are generated as specified by initial_states_generator. See dimod.as_samples() for a description of "samples-like".

initial_states_generator ({'none', 'tile', 'random'}, optional, default='random') –
Defines the expansion of initial_states if fewer than num_reads are specified:
- "none":
If the number of initial states specified is smaller than num_reads, raises ValueError.
- "tile":
Reuses the specified initial states if fewer than num_reads or truncates if greater.
- "random":
Expands the specified initial states with randomly generated states if fewer than num_reads or truncates if greater.
seed (int (32-bit unsigned integer), optional) – Seed to use for the PRNG. Specifying a particular seed with a constant set of parameters produces identical results. If not provided, a random seed is chosen.
large_sparse_opt (bool, optional, default=False) – Use optimizations for large and sparse problems (search tree for next descent variable instead of linear array).
- Returns

A dimod SampleSet object. The number of descents (single-variable flips) taken to reach the local minimum for each sample is stored in a data vector called num_steps.

- Return type

SampleSet
Examples
This example samples a simple two-variable Ising model.
>>> import dimod
>>> bqm = dimod.BQM.from_ising({}, {'ab': 1})
>>> import greedy
>>> sampler = greedy.SteepestDescentSampler()
>>> samples = sampler.sample(bqm)
>>> samples.first.energy
-1.0
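The seed parameter makes runs repeatable. The following sketch is an addition here, not part of the original example; it reuses the dimod and greedy imports above and assumes the dimod SampleSet exposes its samples through the record array. With a fixed seed and otherwise identical parameters, two calls should return identical samples:

>>> bqm = dimod.BQM.from_ising({}, {'ab': 1})
>>> sampler = greedy.SteepestDescentSampler()
>>> # identical seed and parameters should reproduce the same samples
>>> s1 = sampler.sample(bqm, num_reads=10, seed=1234)
>>> s2 = sampler.sample(bqm, num_reads=10, seed=1234)
>>> bool((s1.record.sample == s2.record.sample).all())
True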
Run steepest descent one million times (takes ~150ms on an average laptop), converging to local minima, each time starting from a random state:
>>> bqm = dimod.BQM.from_ising({}, {'ab': 1})
>>> sampler = greedy.SteepestDescentSampler()
>>> samples = sampler.sample(bqm, num_reads=10**6)
>>> print(samples.aggregate())
   a  b energy num_oc. num_st.
0 -1 +1   -1.0  500115       1
1 +1 -1   -1.0  499885       1
['SPIN', 2 rows, 1000000 samples, 2 variables]
Use a combination of one fixed initial state and two randomly generated ones to sample from a simple convex Ising problem with global minimum at (-1, -1):

>>> bqm = dimod.BQM.from_ising({'x': 2, 'y': 2}, {'xy': -1})
>>> sampler = greedy.SteepestDescentSampler()
>>> samples = sampler.sample(bqm, initial_states=([1, 1], 'xy'), num_reads=3)
>>> print(samples)
   x  y energy num_oc. num_st.
0 -1 -1   -5.0       1       2
1 -1 -1   -5.0       1       1
2 -1 -1   -5.0       1       2
['SPIN', 3 rows, 3 samples, 2 variables]
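As described under initial_states_generator, the 'none' option does not expand the given initial states. The sketch below is an addition, not part of the original documentation; it assumes the same bqm and sampler as in the example above and illustrates the documented failure mode when fewer initial states than num_reads are supplied:

>>> try:
...     sampler.sample(bqm, initial_states=([1, 1], 'xy'), num_reads=3,
...                    initial_states_generator='none')
... except ValueError:
...     print("too few initial states for num_reads")
too few initial states for num_reads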
Notice it required 2 variable flips (the num_steps field in the last column) to reach the minimum state, (-1, -1), from the initial state, (1, 1).
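The per-read num_steps values can also be read programmatically. The following is a sketch added here rather than taken from the original docs; it reuses the bqm and sampler from the example above and assumes dimod's SampleSet exposes the extra vector through its record array and data_vectors mapping:

>>> samples = sampler.sample(bqm, initial_states=([1, 1], 'xy'), num_reads=1)
>>> # the single fixed initial state (1, 1) should take two flips to reach (-1, -1)
>>> int(samples.record.num_steps[0])
2
>>> int(samples.data_vectors['num_steps'][0])
2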