
Fix name conflicts on ITensorMPS extension #322

Closed
wants to merge 13 commits
2 changes: 2 additions & 0 deletions Project.toml
@@ -18,6 +18,7 @@ ScopedValues = "7e506255-f358-4e82-b7e4-beb19740aa63"
Serialization = "9e88b42a-f829-5b0c-bbe9-9e923198166b"
SparseArrays = "2f01184e-e22b-5df5-ae63-d93ebab69eaf"
UUIDs = "cf7118a7-6976-5b1a-9a39-7adc72f591a4"
ValSplit = "0625e100-946b-11ec-09cd-6328dd093154"

[weakdeps]
Adapt = "79e6a3ab-5dfb-504d-930d-738a2a938a0e"
@@ -84,5 +85,6 @@ ScopedValues = "1"
Serialization = "1.10"
SparseArrays = "1.10"
UUIDs = "1.10"
ValSplit = "0.1.1"
YaoBlocks = "0.13"
julia = "1.10"
1 change: 1 addition & 0 deletions docs/make.jl
@@ -27,6 +27,7 @@ makedocs(;
"The Product ansatz" => "manual/design/product.md",
"The MPS/MPO ansatz" => "manual/design/mps.md",
],
"Interfaces" => "manual/interfaces.md",
"🤝 Interoperation" => "manual/interop.md",
"Acceleration with Reactant.jl" => "manual/reactant.md",
],
135 changes: 135 additions & 0 deletions docs/src/manual/interfaces.md
@@ -0,0 +1,135 @@
# Interfaces

Julia doesn't have a formal notion of interface built into the language; instead, it relies on [duck typing](https://wikipedia.org/wiki/Duck_typing).
A formal interface is therefore declared through the documentation written for it.

## [TensorNetwork](@id man-interface-tensornetwork) interface

A [TensorNetwork (interface)](@ref man-interface-tensornetwork) is a collection of [`Tensor`](@ref)s forming a graph structure.

| Required method | Brief description |
| :----------------------- | :-------------------------------------------------- |
| `tensors(tn; kwargs...)` | Returns a list of [`Tensor`](@ref)s present in `tn` |
| `copy(tn)` | Returns a shallow copy of `tn` |
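
As an illustration, a minimal hypothetical implementation might store its tensors in a plain vector. `MockTensor` and `VecTensorNetwork` below are toy stand-ins, not Tenet's actual types:

```julia
# `MockTensor` stands in for Tenet's `Tensor`; only the index list matters here.
struct MockTensor
    inds::Vector{Symbol}
end

# A toy tensor network: just a vector of tensors.
struct VecTensorNetwork
    tensorlist::Vector{MockTensor}
end

# Required: return the tensors contained in the network
tensors(tn::VecTensorNetwork; kwargs...) = tn.tensorlist

# Required: shallow copy, so mutating the copy's list doesn't touch the original
Base.copy(tn::VecTensorNetwork) = VecTensorNetwork(copy(tn.tensorlist))
```

With just these two methods in place, generic fallbacks such as `ntensors(tn) = length(tensors(tn))` work out of the box.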

The following methods are optional, but you might be interested in implementing them for performance reasons.

| Method | Default definition | Brief description |
| :------------------------ | :----------------------------------------------------------- | :---------------------------------------------------------------- |
| `inds(tn; kwargs...)` | `mapreduce(inds, ∪, tensors(tn))` | Returns a list of indices present in `tn` |
| `hasind(tn, ind)`         | `ind in inds(tn)`                                             | Returns `true` if `ind` is an existing index in `tn`              |
| `hastensor(tn, tensor)`   | `tensor in tensors(tn)`                                       | Returns `true` if `tensor` is an existing [`Tensor`](@ref) in `tn` |
| `size(tn)` | Get index sizes from `tensors(tn)` | Returns a `Dict` that maps indices to their sizes |
| `size(tn, i)` | Get first matching tensor from `tensors(tn)` and query to it | Returns the size of the given index `i` |
| `ntensors(tn; kwargs...)` | `length(tensors(tn; kwargs...))` | Returns the number of tensors contained in `tn` |
| `ninds(tn; kwargs...)` | `length(inds(tn; kwargs...))` | Returns the number of indices in `tn` |
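
To make the defaults concrete, here is how they compose over a toy list of tensors (assumed stand-in types, not Tenet's):

```julia
struct MockTensor
    inds::Vector{Symbol}
end
inds(t::MockTensor) = t.inds

tensorlist = [MockTensor([:i, :j]), MockTensor([:j, :k])]

# Default definitions from the table, written over the plain list:
allinds = mapreduce(inds, ∪, tensorlist)  # union of every tensor's indices
has_j = :j in allinds                     # the `hasind` fallback
n = length(tensorlist)                    # the `ntensors` fallback
```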

### Mutating methods

Tensor Networks are not static entities: they change.
To support mutation, the Tensor Network type needs to implement the following methods.

| Method | Brief description |
| :----------------------------------- | :------------------------------------------------------------- |
| `push!(tn, tensor)` | Adds a new [`Tensor`](@ref) to `tn` |
| `pop!(tn, tensor)` | Removes a specific [`Tensor`](@ref) from `tn` |
| `replace!(tn, index => new_index)` | Renames the `index` with `new_index`, if `index` is in `tn` |
| `replace!(tn, tensor => new_tensor)` | Replace the `tensor` with `new_tensor`, if `tensor` is in `tn` |
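
For instance, renaming an index amounts to walking every tensor and swapping the index in place. A sketch over hypothetical mutable toy tensors (not Tenet's implementation):

```julia
struct MockTensor
    inds::Vector{Symbol}
end

# Sketch of `replace!(tn, index => new_index)`: rename `old` to `new` in every tensor
function rename_index!(tensorlist::Vector{MockTensor}, old_new::Pair{Symbol,Symbol})
    old, new = old_new
    for t in tensorlist
        # Base.replace! swaps every occurrence of `old` for `new` in the vector
        replace!(t.inds, old => new)
    end
    return tensorlist
end
```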

!!! warning
    These methods are not forwarded because mutating the `TensorNetwork` can break mappings of composed objects in [Pluggable](@ref man-interface-pluggable) and [Ansatz](@ref man-interface-ansatz).

!!! todo
    We are considering moving to an _effect handling_ system, which would ease tracking mutations on subtypes. In particular, the effects we are currently considering are:

    - `ContractIndexEffect`, called on `contract!(tn, i)`
    - `ReplaceIndexEffect`, called on `replace!(tn, old_index => new_index)`
    - `ReplaceTensorEffect`, called on `replace!(tn, old_tensor => new_tensor)`

### Keyword methods

#### `tensors` keyword methods

| Method                       | Default implementation                         | Brief description                                                |
| :--------------------------- | :--------------------------------------------- | :--------------------------------------------------------------- |
| `tensors(tn; contains=is)`   | `filter(⊇(is), tensors(tn))`                   | Returns the [`Tensor`](@ref)s containing all the indices `is`    |
| `tensors(tn; intersects=is)` | `filter(t -> !isdisjoint(t, is), tensors(tn))` | Returns the [`Tensor`](@ref)s intersecting with the indices `is` |
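
The distinction between the two filters can be sketched over toy tensors (assumed stand-in types): `contains` demands a superset of the given indices, while `intersects` only demands a non-empty overlap, which is why the latter is naturally expressed with `!isdisjoint`:

```julia
struct MockTensor
    inds::Vector{Symbol}
end
inds(t::MockTensor) = t.inds

tensorlist = [MockTensor([:i, :j]), MockTensor([:j, :k])]

# contains=[:i, :j]: keep tensors whose indices are a superset of [:i, :j]
contains_ij = filter(t -> [:i, :j] ⊆ inds(t), tensorlist)

# intersects=[:i, :k]: keep tensors sharing at least one index with [:i, :k]
meets_ik = filter(t -> !isdisjoint(inds(t), [:i, :k]), tensorlist)
```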

#### `inds` keyword methods

| Method | Brief description |
| :--------------------- | :-------------------------------------------------------------------------------------------------------------- |
| `inds(tn; set)` | Return a subset of the indices present in `tn`, where `set` can be one of `:all`, `:open`, `:inner` or `:hyper` |
| `inds(tn; parallelto)` | Return the indices parallel to the index `parallelto` |

### [WrapsTensorNetwork](@id man-interface-wrapstensornetwork) trait

Many types in Tenet work by composing a [`TensorNetwork` (type)](@ref TensorNetwork), and all of the optional methods above have a faster implementation for it.
By simply forwarding to their [`TensorNetwork` (type)](@ref TensorNetwork) field, wrapper types can accelerate their [TensorNetwork (interface)](@ref man-interface-tensornetwork) methods.

| Required method | Default definition | Brief description |
| :--------------------------------- | :----------------- | :--------------------------------------------------------------- |
| `Wraps(::Type{TensorNetwork}, tn)` | `No()` | Return `Yes()` if `tn` contains a [`TensorNetwork`](@ref) object |
| `TensorNetwork(tn)` | (_undefined_) | Return the [`TensorNetwork`](@ref) object wrapped by `tn` |
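
A sketch of the forwarding pattern over toy types (`InnerTN` plays the role of Tenet's `TensorNetwork`; the `Yes`/`No` singletons mirror the ones this PR adds in `src/Helpers.jl`):

```julia
# Toy inner network type standing in for Tenet's `TensorNetwork`
struct InnerTN
    tensorlist::Vector{Any}
end
tensors(tn::InnerTN; kwargs...) = tn.tensorlist

# Trait answers, as plain singleton types
struct Yes end
struct No end

# A wrapper that composes an InnerTN
struct NamedTN
    name::String
    tn::InnerTN
end

wraps(::Type{InnerTN}, ::NamedTN) = Yes()  # trait: "I contain an InnerTN"
wraps(::Type{InnerTN}, ::Any) = No()       # fallback for everything else
InnerTN(x::NamedTN) = x.tn                 # unwrap the inner network

# Forwarded method: as fast as the inner implementation
tensors(x::NamedTN; kwargs...) = tensors(InnerTN(x); kwargs...)
```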

## [Pluggable](@id man-interface-pluggable) interface

A [`Pluggable`](@ref man-interface-pluggable) is a [`TensorNetwork`](@ref man-interface-tensornetwork) together with a mapping between [`Site`](@ref)s and open indices.

| Required method | Brief description |
| :-------------- | :-------------------------------------------------- |
| `sites(tn)`     | Returns the list of [`Site`](@ref)s present in `tn` |
| `inds(tn; at)`  | Returns the index linked to the [`Site`](@ref) `at` |
| `sites(tn; at)` | Returns the [`Site`](@ref) linked to the index `at` |
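
The site ↦ index mapping can be sketched with a plain `Dict`, using integers as stand-ins for [`Site`](@ref)s (hypothetical names, not Tenet's methods):

```julia
# Toy Pluggable: the site => open-index mapping stored as a Dict
struct MockPluggable
    siteinds::Dict{Int,Symbol}
end

sites_of(tn::MockPluggable) = sort!(collect(keys(tn.siteinds)))
ind_at(tn::MockPluggable, site::Int) = tn.siteinds[site]
# findfirst over a Dict returns the key whose value matches
site_at(tn::MockPluggable, ind::Symbol) = findfirst(==(ind), tn.siteinds)
```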

As with the interface above, there are optional methods with default implementations that you might be interested in overriding for performance reasons.

| Method | Default definition | Brief description |
| :---------------------- | :----------------------------- | :---------------------------------------------------- |
| `nsites(tn; kwargs...)` | `length(sites(tn; kwargs...))` | Returns the number of [`Site`](@ref)s present in `tn` |
| `hassite(tn, site)`     | `site in sites(tn)`            | Returns `true` if `site` exists in `tn`               |

### Keyword methods

| Method | Brief description |
| :----------------- | :------------------------------------------------------------------------------------------- |
| `sites(tn)`        | Returns all the [`Site`](@ref)s                                                              |
| `sites(tn; plugs)` | |
| `inds(tn; plugs)` | Return the indices linked to some [`Site`](@ref); i.e. the ones behaving as physical indices |

### Mutating methods

!!! warning
    If you use these methods directly, you are responsible for leaving the Tensor Network in a coherent state.

| Mutating methods | Brief description |
| :-------------------------- | :------------------------------- |
| `addsite!(tn, site => ind)` | Register mapping `site` to `ind` |
| `rmsite!(tn, site)` | Unregister `site` |

## [Ansatz](@id man-interface-ansatz) interface

An [`Ansatz`](@ref man-interface-ansatz) is a [`TensorNetwork`](@ref man-interface-tensornetwork) together with a mapping between [`Lane`](@ref)s and [`Tensor`](@ref)s.

| Required method | Brief description |
| :---------------- | :--------------------------------------------------------------------------------------------------------- |
| `lanes(tn)` | Returns the list of [`Lane`](@ref)s present in `tn` |
| `tensors(tn; at)` | Returns the [`Tensor`](@ref) linked to the [`Lane`](@ref) `at`. Dispatched through `tensors(tn; at::Lane)` |
| `lattice(tn)` | Returns the [`Lattice`](@ref) associated to `tn` |
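
The lane ↦ tensor mapping can likewise be sketched with a `Dict`, using integers as stand-ins for [`Lane`](@ref)s (hypothetical names, not Tenet's methods):

```julia
struct MockTensor
    inds::Vector{Symbol}
end

# Toy Ansatz: the lane => tensor mapping stored as a Dict
struct MockAnsatz
    lanetensors::Dict{Int,MockTensor}
end

lanes_of(tn::MockAnsatz) = sort!(collect(keys(tn.lanetensors)))
tensor_at(tn::MockAnsatz, lane::Int) = tn.lanetensors[lane]
# The `nlanes` fallback derives directly from `lanes`
nlanes_of(tn::MockAnsatz) = length(lanes_of(tn))
```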

As in the interfaces defined above, there are optional methods with default definitions which you might be interested in overriding for performance reasons.

| Method | Default definition | Brief description |
| :----------------- | :------------------ | :---------------------------------------------------- |
| `nlanes(tn)` | `length(lanes(tn))` | Returns the number of [`Lane`](@ref)s present in `tn` |
| `haslane(tn, lane)` | `lane in lanes(tn)` | Returns `true` if `lane` exists in `tn`               |

### Mutating methods

!!! warning
    If you use these methods directly, you are responsible for leaving the Tensor Network in a coherent state.

| Method | Brief description |
| :----------------------------- | :------------------------------------------ |
| `addlane!(tn, lane => tensor)` | Registers the mapping of `lane` to `tensor` |
| `rmlane!(tn, lane)`            | Unregisters `lane` from the mapping         |
13 changes: 6 additions & 7 deletions ext/TenetITensorMPSExt.jl
@@ -1,10 +1,9 @@
module TenetITensorMPSExt

using Tenet
using Tenet: Tenet, MPS, tensors, form, inds, lanes, id, Site, Lane
using ITensors
using ITensorMPS
using ITensors: ITensor, Index, dim
using Tenet: Tenet, MPS, tensors, form, inds, lanes, Site, Lane
using ITensors: ITensors, ITensor, Index, dim, siteinds, linkinds
using ITensorMPS: ITensorMPS

# Convert an AbstractMPS to an ITensor MPS
function Base.convert(::Type{ITensorMPS.MPS}, mps::Tenet.AbstractMPS)
@@ -76,15 +75,15 @@ function Base.convert(::Type{MPS}, itensors_mps::ITensorMPS.MPS)
links = linkinds(itensors_mps)

tensors_vec = []
first_ten = array(itensors_mps[1], sites[1], links[1])
first_ten = ITensors.array(itensors_mps[1], sites[1], links[1])
push!(tensors_vec, first_ten)

# Extract the bulk tensors
for j in 2:(length(itensors_mps) - 1)
ten = array(itensors_mps[j], sites[j], links[j - 1], links[j]) # Indices are ordered as (site index, left link, right link)
ten = ITensors.array(itensors_mps[j], sites[j], links[j - 1], links[j]) # Indices are ordered as (site index, left link, right link)
push!(tensors_vec, ten)
end
last_ten = array(itensors_mps[end], sites[end], links[end])
last_ten = ITensors.array(itensors_mps[end], sites[end], links[end])
push!(tensors_vec, last_ten)

mps = Tenet.MPS(tensors_vec)
28 changes: 28 additions & 0 deletions src/Helpers.jl
@@ -60,3 +60,31 @@ Base.values(uc::UnsafeScope) = map(x -> x.value, uc.refs)

# from https://discourse.julialang.org/t/sort-keys-of-namedtuple/94630/3
@generated sort_nt(nt::NamedTuple{KS}) where {KS} = :(NamedTuple{$(Tuple(sort(collect(KS))))}(nt))

mutable struct CachedField{T}
isvalid::Bool
value::T
end

CachedField{T}() where {T} = CachedField{T}(false, T())

invalidate!(cf::CachedField) = cf.isvalid = false
function Base.get!(f, cf::CachedField)
!cf.isvalid && (cf.value = f())
cf.isvalid = true
return cf.value
end

struct Yes end
struct No end

function hist(x; init=Dict{eltype(x),Int}())
for xi in x
if haskey(init, xi)
init[xi] += 1
else
init[xi] = 1
end
end
return init
end
10 changes: 10 additions & 0 deletions src/Interfaces/Ansatz.jl
@@ -0,0 +1,10 @@
function lanes end

function tensorat end
function laneat end

nlanes(tn) = length(lanes(tn))
haslane(tn, lane) = lane ∈ lanes(tn)

# sugar
Base.in(lane::Lane, tn::AbstractTensorNetwork) = haslane(tn, lane)