Computing Properties
Inner Products and Norms
For general ITensorNetwork states, inner products are computed by constructing and contracting the combined bra–ket network. The default algorithm is belief propagation (alg="bp"), which scales to large and loopy networks at the cost of being approximate when loops are present. Use alg="exact" for exact contraction (only practical for small networks or trees).
z = inner(phi, psi) # ⟨ϕ|ψ⟩
n = norm(psi) # √⟨ψ|ψ⟩

1.0000000000000004

For numerically large tensor networks where the inner product would overflow, use the logarithmic variant:
logz = loginner(phi, psi) # log(⟨ϕ|ψ⟩) (numerically stable)

-1.0182252764206963 + 3.141592653589793im

For TreeTensorNetwork, specialised exact methods exploit the tree structure directly without belief propagation:
z = inner(x, y) # ⟨x|y⟩ via DFS contraction
n = norm(psi) # uses ortho_region if available for efficiency

1.0000000000000004

ITensors.inner — Method
inner(ϕ::AbstractITensorNetwork, ψ::AbstractITensorNetwork; alg="bp", kwargs...) -> Number

Compute the inner product ⟨ϕ|ψ⟩ by contracting the combined bra-ket network.
Keyword Arguments
alg="bp": Contraction algorithm. "bp" uses belief propagation (default, efficient for large or loopy networks); "exact" uses full contraction with an optimized sequence.
See also: loginner, norm, inner(ϕ, A, ψ).
ITensors.inner — Method
inner(ϕ::AbstractITensorNetwork, A::AbstractITensorNetwork, ψ::AbstractITensorNetwork; alg="bp", kwargs...) -> Number

Compute the matrix element ⟨ϕ|A|ψ⟩ where A is a tensor network operator.
Keyword Arguments
alg="bp": Contraction algorithm. "bp" (default) or "exact".
See also: inner(ϕ, ψ).
ITensorNetworks.loginner — Function
loginner(ϕ::AbstractITensorNetwork, ψ::AbstractITensorNetwork; alg="bp", kwargs...) -> Number

Compute log(⟨ϕ|ψ⟩) in a numerically stable way by accumulating logarithms during contraction rather than computing the inner product directly.
Useful when the inner product would overflow or underflow in floating-point arithmetic.
Keyword Arguments
alg="bp": Contraction algorithm. "bp" (default) or "exact".
See also: inner, lognorm.
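The benefit of log-domain accumulation can be checked with a plain-Julia sketch (not the library routine; the numbers are illustrative). A product of many moderate factors overflows Float64, while the sum of their logarithms stays finite, and a negative overlap shows up as an imaginary part of π in the complex logarithm, as in the loginner output shown earlier:

```julia
# Sketch (plain Julia, not the library routine): accumulating logarithms
# avoids overflow where a direct product does not.
factors = fill(1e3, 400)
direct = prod(factors)        # overflows to Inf
logz = sum(log, factors)      # 400 * log(1e3) ≈ 2763.1, finite

# A negative overlap appears as an imaginary part of π in the complex log;
# exp recovers the value itself from a loginner result.
z = exp(-1.0182252764206963 + 3.141592653589793im)   # ≈ -0.361 + 0.0im
```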
ITensors.inner — Method
inner(x::AbstractTreeTensorNetwork, y::AbstractTreeTensorNetwork) -> Number

Compute the inner product ⟨x|y⟩ by contracting the bra-ket network using a post-order DFS traversal rooted at root_vertex.
Both networks must have the same graph structure and compatible site indices.
See also: loginner, norm, inner(y, A, x).
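The post-order traversal order can be illustrated with a toy sketch (plain Julia, with scalars standing in for bra-ket tensor contractions — a deliberate simplification): each vertex's contribution is contracted child-first before being absorbed toward the root.

```julia
# Toy sketch: post-order DFS accumulation over a tree rooted at vertex 1,
# with scalars standing in for tensor contractions.
children = Dict(1 => [2, 3], 2 => [4], 3 => Int[], 4 => Int[])
value = Dict(1 => 2.0, 2 => 0.5, 3 => 3.0, 4 => 4.0)

function contract_tree(v)
    acc = value[v]
    for c in children[v]      # children are contracted first (post-order)
        acc *= contract_tree(c)
    end
    return acc
end

z = contract_tree(1)   # 2.0 * (0.5 * 4.0) * 3.0 = 12.0
```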
Normalization
normalize rescales all tensors in the network by the same factor so that norm(ψ) ≈ 1. For TreeTensorNetwork, the normalisation is applied directly at the orthogonality centre.
psi = normalize(psi) # exact (default)
psi_bp = normalize(psi; alg = "bp") # belief-propagation (for large loopy networks)

ITensorNetworks.ITensorNetwork{Tuple{Int64}} with 4 vertices:
4-element NamedGraphs.OrderedDictionaries.OrderedIndices{Tuple{Int64}}:
(1,)
(2,)
(3,)
(4,)
and 3 edge(s):
(1,) => (2,)
(2,) => (3,)
(3,) => (4,)
with vertex data:
4-element Dictionaries.Dictionary{Tuple{Int64}, Any}:
(1,) │ ((dim=2|id=957|"S=1/2,Site,n=1×"), (dim=2|id=474|"1×,2×"))
(2,) │ ((dim=2|id=430|"S=1/2,Site,n=2×"), (dim=2|id=474|"1×,2×"), (dim=2|id=48…
(3,) │ ((dim=2|id=254|"S=1/2,Site,n=3×"), (dim=2|id=480|"2×,3×"), (dim=2|id=27…
(4,) │ ((dim=2|id=285|"S=1/2,Site,n=4×"), (dim=2|id=272|"3×,4×"))

LinearAlgebra.normalize — Method
normalize(tn::AbstractITensorNetwork; alg="exact", kwargs...) -> AbstractITensorNetwork

Return a copy of tn rescaled so that norm(tn) ≈ 1.
The rescaling is distributed evenly across all tensors in the network (each tensor is multiplied by the same scalar factor).
Keyword Arguments
alg="exact": Normalization algorithm. "exact" contracts ⟨ψ|ψ⟩ exactly; "bp" uses belief propagation for large networks.
Example
julia> using NamedGraphs.NamedGraphGenerators: named_grid
julia> using LinearAlgebra: norm
julia> g = named_grid((4,));
julia> s = siteinds("S=1/2", g);
julia> psi = random_ttn(s; link_space = 2);
julia> psi = normalize(psi);
julia> norm(psi) ≈ 1
true

See also: norm, inner.
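The even-rescaling rule can be checked with a scalar sketch (plain Julia, illustrative numbers only): multiplying each of the n tensors by the same factor s scales the contracted state, and hence its norm, by s^n, so choosing s = norm(ψ)^(-1/n) brings the norm to one.

```julia
# Sketch (plain Julia): distributing the normalization evenly over n tensors.
n = 4                 # number of tensors, as in the 4-vertex network above
c = 3.7               # stand-in for norm(psi) before rescaling
s = c^(-1 / n)        # factor applied to every tensor
new_norm = s^n * c    # the contracted state picks up a factor of s^n
```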
Expectation Values
General ITensorNetwork
For arbitrary (possibly loopy) tensor networks, expectation values are computed via belief propagation by default. This is approximate for loopy networks but can be made exact with alg="exact" (at exponential cost).
# Expectation of "Sz" at every vertex
sz = expect(psi, "Sz")
# Selected vertices only
sz = expect(psi, "Sz", [(1,), (3,)])
# Exact contraction
sz = expect(psi, "Sz"; alg = "exact")

4-element Vector{Float64}:
-0.1345088311294428
0.14679402171750452
0.17312505023823688
0.23815952832791223

ITensorNetworks.expect — Method
expect(ψ::AbstractITensorNetwork, op::String; alg="bp", kwargs...) -> Dictionary

Compute local expectation values ⟨ψ|op_v|ψ⟩ / ⟨ψ|ψ⟩ for the operator named op at every vertex of ψ.
Arguments
ψ: The tensor network state.
op: Name of the local operator (e.g. "Sz", "N", "Sx"), passed to ITensors.op.
alg="bp": Contraction algorithm. "bp" uses belief propagation (efficient for loopy or large networks); "exact" performs full contraction.
Keyword Arguments (alg="bp" only)
cache!: Optional Ref to a pre-built belief propagation cache. If provided, the cache is reused across multiple expect calls for efficiency.
update_cache=true: Whether to update the cache before computing expectation values.
Returns
A Dictionary mapping each vertex of ψ to its expectation value.
Example
julia> using NamedGraphs.NamedGraphGenerators: named_grid
julia> g = named_grid((4,));
julia> s = siteinds("S=1/2", g);
julia> psi = random_ttn(s; link_space = 2);
julia> sz = expect(psi, "Sz");
julia> sz_exact = expect(psi, "Sz"; alg = "exact");
See also: expect(ψ, op::String, vertices), expect(operator, state::AbstractTreeTensorNetwork).
ITensorNetworks.expect — Method
expect(ψ::AbstractITensorNetwork, op::String, vertices; alg="bp", kwargs...) -> Dictionary

Compute local expectation values ⟨ψ|op_v|ψ⟩ / ⟨ψ|ψ⟩ for the operator named op at each vertex in vertices.
See expect(ψ, op::String) for full documentation.
ITensorNetworks.expect — Method
expect(ψ::AbstractITensorNetwork, op::Op; alg="bp", kwargs...) -> Number

Compute the expectation value ⟨ψ|op|ψ⟩ / ⟨ψ|ψ⟩ for a single ITensors.Op object.
The default algorithm is belief propagation ("bp"); use alg="exact" for exact contraction.
See also: expect(ψ, op::String).
TreeTensorNetwork
For TTN/MPS states, a specialised exact method exploiting successive orthogonalisations is available. The operator name is passed as the first argument (note the different argument order from the general form above):
sz = expect("Sz", psi) # all sites
sz = expect("Sz", psi; vertices = [(1,), (3,)]) # selected sites

2-element Dictionaries.Dictionary{Tuple{Int64}, Float64}:
(1,) │ -0.1345088311294427
(3,) │ 0.17312505023823657

This is more efficient than the belief propagation approach for tree-structured networks because it reuses the orthogonal gauge.
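Why the gauge helps can be sketched with plain linear algebra (not library code, and a deliberate matrix-level simplification of the tensor picture): in an orthogonal gauge, every tensor away from the orthogonality centre is an isometry, so its bra-ket environment contracts to the identity and a single-site expectation value involves only the tensor at that site.

```julia
using LinearAlgebra

# Sketch: an isometry Q satisfies Q' * Q == I, so the off-centre parts of
# the bra-ket network collapse instead of needing to be recontracted.
A = randn(6, 3)
Q = qr(A).Q * Matrix(I, 6, 3)   # first 3 columns of the QR factor: a 6×3 isometry
env = Q' * Q                    # environment contribution of an off-centre tensor
```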
ITensorNetworks.expect — Method
expect(operator::String, state::AbstractTreeTensorNetwork; vertices=vertices(state), root_vertex=...) -> Dictionary

Compute local expectation values ⟨state|op_v|state⟩ / ⟨state|state⟩ for each vertex v in vertices using exact contraction via successive orthogonalization.
The state is normalized before computing expectation values. The operator name is passed to ITensors.op; each vertex must carry exactly one site index.
Arguments
operator: Name of the local operator, e.g. "Sz", "N", "Sx".
state: The tree tensor network state.
vertices: Subset of vertices at which to evaluate the operator. Defaults to all vertices.
root_vertex: Root used for the DFS traversal order.
Returns
A Dictionary mapping each vertex to its (real-typed) expectation value.
Example
sz = expect("Sz", psi)
sz_sub = expect("Sz", psi; vertices = [1, 3, 5])

See also: expect(ψ, op::String) for general ITensorNetwork states with belief propagation support.