2 changes: 1 addition & 1 deletion Project.toml
@@ -1,6 +1,6 @@
name = "ITensorNetworks"
uuid = "2919e153-833c-4bdc-8836-1ea460a35fc7"
version = "0.15.20"
version = "0.15.21"
authors = ["Matthew Fishman <mfishman@flatironinstitute.org>, Joseph Tindall <jtindall@flatironinstitute.org> and contributors"]

[workspace]
3 changes: 2 additions & 1 deletion docs/Project.toml
@@ -1,12 +1,13 @@
[deps]
Documenter = "e30172f5-a6a5-5a46-863b-614d45cd2de4"
Graphs = "86223c79-3864-5bf0-83f7-82e725a168b6"
ITensorNetworks = "2919e153-833c-4bdc-8836-1ea460a35fc7"
Literate = "98b081ad-f1c9-55d3-8b20-4c87d4299306"

[sources.ITensorNetworks]
path = ".."

[compat]
Documenter = "1.10"
Documenter = "1"
ITensorNetworks = "0.15"
Literate = "2.20.1"
14 changes: 13 additions & 1 deletion docs/make.jl
@@ -1,5 +1,8 @@
using Documenter: Documenter, DocMeta, deploydocs, makedocs
using Graphs: Graphs
using ITensorNetworks: ITensorNetworks
using ITensors: ITensors
using LinearAlgebra: LinearAlgebra

DocMeta.setdocmeta!(
ITensorNetworks, :DocTestSetup, :(using ITensorNetworks); recursive = true
@@ -16,7 +19,16 @@ makedocs(;
edit_link = "main",
assets = ["assets/favicon.ico", "assets/extras.css"]
),
pages = ["Home" => "index.md", "Reference" => "reference.md"],
pages = [
"Home" => "index.md",
"Manual" => [
"Tensor Networks" => "tensor_networks.md",
"Tree Tensor Networks" => "tree_tensor_networks.md",
"Computing Properties" => "computing_properties.md",
"Solvers" => "solvers.md",
],
"API Reference" => "reference.md",
],
warnonly = true
)

104 changes: 104 additions & 0 deletions docs/src/computing_properties.md
@@ -0,0 +1,104 @@
# Computing Properties

## Inner Products and Norms

For general `ITensorNetwork` states, inner products are computed by constructing and
contracting the combined bra–ket network. The default algorithm is **belief propagation**
(`alg="bp"`), which is efficient for large and loopy networks. Use `alg="exact"` for
exact contraction (only practical for small networks or trees).

```julia
z = inner(phi, psi) # ⟨ϕ|ψ⟩ (belief propagation by default)
z = inner(phi, psi; alg="exact") # ⟨ϕ|ψ⟩ (exact contraction)
z = inner(phi, A, psi) # ⟨ϕ|A|ψ⟩
n = norm(psi) # √⟨ψ|ψ⟩
```

For large networks, the inner product can overflow or underflow floating-point arithmetic;
in that case use the logarithmic variant:

```julia
logz = loginner(phi, psi) # log(⟨ϕ|ψ⟩) (numerically stable)
```

For `TreeTensorNetwork`, specialised exact methods exploit the tree structure directly
without belief propagation:

```julia
z = inner(x, y) # ⟨x|y⟩ via DFS contraction
z = inner(y, A, x) # ⟨y|A|x⟩
n = norm(psi) # uses ortho_region if available for efficiency
```

```@docs; canonical=false
ITensors.inner(::ITensorNetworks.AbstractITensorNetwork, ::ITensorNetworks.AbstractITensorNetwork)
ITensors.inner(::ITensorNetworks.AbstractITensorNetwork, ::ITensorNetworks.AbstractITensorNetwork, ::ITensorNetworks.AbstractITensorNetwork)
ITensorNetworks.loginner
ITensors.inner(::ITensorNetworks.AbstractTreeTensorNetwork, ::ITensorNetworks.AbstractTreeTensorNetwork)
ITensors.inner(::ITensorNetworks.AbstractTreeTensorNetwork, ::ITensorNetworks.AbstractTreeTensorNetwork, ::ITensorNetworks.AbstractTreeTensorNetwork)
```

## Normalization

`normalize` rescales all tensors in the network by the same factor so that `norm(ψ) ≈ 1`.
For `TreeTensorNetwork`, the normalization is applied directly at the orthogonality center.

```julia
psi = normalize(psi) # exact (default)
psi = normalize(psi; alg = "bp") # belief-propagation (for large loopy networks)
```

```@docs; canonical=false
LinearAlgebra.normalize(::ITensorNetworks.AbstractITensorNetwork)
```

## Expectation Values

### General `ITensorNetwork`

For arbitrary (possibly loopy) tensor networks, expectation values are computed via
**belief propagation** by default. This is approximate for loopy networks but can be made
exact with `alg="exact"` (at exponential cost).

```julia
# Expectation of "Sz" at every vertex
sz = expect(psi, "Sz")

# Selected vertices only
sz = expect(psi, "Sz", [(1,), (3,), (5,)])

# Exact contraction
sz = expect(psi, "Sz"; alg = "exact")
```

Operators can also be specified as `ITensors.Op` objects, which name both the operator and
the vertex it acts on:

```julia
using ITensors: Op
sz = expect(psi, Op("Sz", v)) # single-operator form
```

```@docs; canonical=false
ITensorNetworks.expect(::ITensorNetworks.AbstractITensorNetwork, ::String)
ITensorNetworks.expect(::ITensorNetworks.AbstractITensorNetwork, ::String, ::Any)
ITensorNetworks.expect(::ITensorNetworks.AbstractITensorNetwork, ::ITensors.Ops.Op)
```

### `TreeTensorNetwork`

For TTN/MPS states, a specialised exact method exploiting successive orthogonalisations is
available. The operator name is passed as the **first** argument (note the different
argument order from the general form above):

```julia
sz = expect("Sz", psi) # all sites
sz = expect("Sz", psi; vertices = [1, 3, 5]) # selected sites
```

This is more efficient than the belief propagation approach for tree-structured networks
because it reuses the orthogonal gauge.

```@docs; canonical=false
ITensorNetworks.expect(::String, ::ITensorNetworks.AbstractTreeTensorNetwork)
```
5 changes: 4 additions & 1 deletion docs/src/reference.md
@@ -1,4 +1,7 @@
# Reference
# API Reference

Complete listing of all documented public functions and types in ITensorNetworks.jl,
ITensorNetworks.ModelNetworks, and ITensorNetworks.ModelHamiltonians.

```@autodocs
Modules = [ITensorNetworks, ITensorNetworks.ModelNetworks, ITensorNetworks.ModelHamiltonians]
58 changes: 58 additions & 0 deletions docs/src/solvers.md
@@ -0,0 +1,58 @@
# Solvers

ITensorNetworks.jl provides sweep-based solvers for variational problems on tree tensor
networks. All solvers follow the same high-level pattern:

1. Start from an initial `ITensorNetwork` guess.
2. Sweep over the network, solving a small local problem at each site or pair of sites.
3. After each local solve, truncate the updated bond to control bond dimension growth.
4. Repeat for `nsweeps` sweeps.
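The loop structure above can be written as schematic pseudocode. This is illustrative only:
`sweep_regions`, `project_local`, `solve_local`, and `update_region` are hypothetical helper
names, not part of the package API.

```julia
# Schematic skeleton of a sweep-based solver (illustrative only;
# the helper functions are hypothetical, not ITensorNetworks.jl API):
psi = psi0                                        # 1. initial guess
for sweep in 1:nsweeps                            # 4. repeat for nsweeps sweeps
    for region in sweep_regions(psi)              # 2. visit each site / pair of sites
        problem = project_local(H, psi, region)   #    build the small local problem
        updated = solve_local(problem)            #    e.g. a local eigensolve
        # 3. insert the update and truncate the bond to limit growth
        psi = update_region(psi, region, updated; cutoff = 1e-10, maxdim = 50)
    end
end
```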

## Eigenvalue Problems — `eigsolve` / `dmrg`

[`eigsolve`](@ref ITensorNetworks.eigsolve) finds the lowest eigenvalue and corresponding
eigenvector of an operator (e.g. a Hamiltonian) using a DMRG-like
variational sweep algorithm.
[`dmrg`](@ref ITensorNetworks.dmrg) is an alias for `eigsolve`.

```julia
using NamedGraphs.NamedGraphGenerators: named_comb_tree
using ITensors: OpSum
using ITensorNetworks: dmrg, dst, edges, normalize, random_ttn, siteinds, src, ttn
using TensorOperations

let
# Build a Heisenberg Hamiltonian on a comb tree
g = named_comb_tree((4, 3))
s = siteinds("S=1/2", g)
h = OpSum()
for e in edges(g)
h += 0.5, "S+", src(e), "S-", dst(e)
h += 0.5, "S-", src(e), "S+", dst(e)
h += "Sz", src(e), "Sz", dst(e)
end
H = ttn(h, s)

# Random initial state (normalize first!)
psi0 = normalize(random_ttn(s; link_space = 4))

# Run DMRG
energy, psi = dmrg(H, psi0;
nsweeps = 10,
nsites = 2,
factorize_kwargs = (; cutoff = 1e-10, maxdim = 50),
outputlevel = 1,
)
end
```

```@docs
ITensorNetworks.eigsolve
ITensorNetworks.dmrg
```

## Time Evolution — `time_evolve`

```@docs
ITensorNetworks.time_evolve
```
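The call pattern mirrors `dmrg` above. A sketch, assuming a TDVP-style signature
`time_evolve(H, t, psi0; kwargs...)`; the time argument and keyword names here are
illustrative, so consult the reference entry for the exact interface.

```julia
# Sweep-based time evolution of psi0 under H (signature assumed,
# TDVP-style; see the docstring for the authoritative interface):
t = 1.0                      # total evolution time (illustrative)
psi_t = time_evolve(H, t, psi0;
    nsweeps = 10,
    nsites = 2,
    factorize_kwargs = (; cutoff = 1e-10, maxdim = 50),
)
```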
104 changes: 104 additions & 0 deletions docs/src/tensor_networks.md
@@ -0,0 +1,104 @@
# Tensor Networks

## The `ITensorNetwork` Type

An `ITensorNetwork` is the central data structure of this package. It represents a
collection of [`ITensor`](https://itensor.github.io/ITensors.jl/stable/)s arranged on a
graph, where each edge encodes a shared (contracted) index between the neighboring tensors.

Key facts:

- The underlying graph is a [`NamedGraph`](https://github.com/ITensor/NamedGraphs.jl), so
vertices can be any hashable Julia value: integers, tuples, strings, etc.
- Each vertex holds exactly one `ITensor`.
- Edges and link indices are either inferred from shared `Index` objects (when constructing
from a collection of `ITensor`s) or inserted automatically (when constructing from an
`IndsNetwork`).

## Construction

The most common entry point is an `IndsNetwork`, a graph whose vertices and edges carry
`Index` objects. Generate site indices with `siteinds`, which takes a site-type string
(such as `"S=1/2"` or `"Electron"`) and a `NamedGraph`. Graphs can be generated with
functions such as `named_grid` and `named_comb_tree` from the `NamedGraphGenerators`
module of NamedGraphs.jl:

```julia
using ITensorNetworks, ITensors, NamedGraphs.NamedGraphGenerators

# 3×3 square-lattice tensor network
g = named_grid((3, 3))
s = siteinds("S=1/2", g) # one spin-½ Index per vertex

# Zero-initialized, bond dimension 2
ψ = ITensorNetwork(s; link_space = 2)

# Product state — every site in the |↑⟩ state
ψ = ITensorNetwork("Up", s)

# Staggered initialization with a vertex-dependent function
ψ = ITensorNetwork(v -> isodd(sum(v)) ? "Up" : "Dn", s)
```

When you already have `ITensor`s in hand, edges are inferred automatically from shared
indices:

```julia
i, j, k = Index(2,"i"), Index(2,"j"), Index(2,"k")
A, B, C = ITensor(i,j), ITensor(j,k), ITensor(k)

tn = ITensorNetwork([A, B, C]) # integer vertices 1, 2, 3
tn = ITensorNetwork(["A","B","C"], [A, B, C]) # named vertices
tn = ITensorNetwork(["A"=>A, "B"=>B, "C"=>C]) # from pairs
```

```@docs; canonical=false
ITensorNetworks.ITensorNetwork
```

## Accessing Data

```julia
ψ[(1,2)] # ITensor at vertex (1,2)
ψ[(1,2)] = T # replace tensor at a vertex
vertices(ψ) # all vertex labels
edges(ψ) # all edges
neighbors(ψ, v) # neighbouring vertices of v
nv(ψ), ne(ψ) # vertex / edge counts
siteinds(ψ) # IndsNetwork of site (physical) indices
linkinds(ψ) # IndsNetwork of bond (virtual) indices
```

## Adding Two `ITensorNetwork`s

Two networks with the same graph and site indices can be added. The result represents the
quantum state `ψ₁ + ψ₂` and has bond dimension equal to the **sum** of the two input bond
dimensions. Individual bonds of the result can be recompressed with `truncate(tn, edge)`.
For `TreeTensorNetwork`, the no-argument form `truncate(ttn; kwargs...)` sweeps and
recompresses all bonds at once.

```julia
ψ12 = add(ψ1, ψ2)
```
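Combining `add` with the per-bond recompression mentioned above gives a common pattern;
a sketch, with illustrative `cutoff`/`maxdim` values:

```julia
ψ12 = add(ψ1, ψ2)                 # bond dimensions of ψ1 and ψ2 add up
for e in edges(ψ12)               # recompress every bond of the sum by SVD
    ψ12 = truncate(ψ12, e; cutoff = 1e-12, maxdim = 64)
end
```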

```@docs; canonical=false
ITensorNetworks.add(::ITensorNetworks.AbstractITensorNetwork, ::ITensorNetworks.AbstractITensorNetwork)
```

## Bond Truncation

A single bond (edge) of any `ITensorNetwork` can be truncated by SVD:

```julia
tn = truncate(tn, (1,2) => (1,3)) # truncate the bond between vertices (1,2) and (1,3)
tn = truncate(tn, edge) # or pass an AbstractEdge directly
```

Truncation parameters (`cutoff`, `maxdim`, `mindim`, …) are forwarded to `ITensors.svd`.
For a `TreeTensorNetwork`, the sweep-based `truncate(ttn; kwargs...)` is usually more
convenient because it recompresses the entire network at once with controlled errors;
see the [Tree Tensor Networks](@ref) page.

```@docs; canonical=false
Base.truncate(::ITensorNetworks.AbstractITensorNetwork, ::Graphs.AbstractEdge)
```