Add absorb for putting (part of) the contents of one tensor in another
#283
Changes from 3 commits
````julia
@@ -512,6 +512,36 @@ function catcodomain(t1::TT, t2::TT) where {S,N₂,TT<:AbstractTensorMap{<:Any,S
    return t
end

"""
    embed!(tdst::AbstractTensorMap, tsrc::AbstractTensorMap)

Embed the contents of `tsrc` into `tdst`, which may have different sizes of data.
This is equivalent to the following operation on dense arrays, but also works for
symmetric tensors. Note that this only overwrites the regions that are shared and
does nothing on those that are not, so it is up to the user to properly initialize
the destination.

```julia
sub_axes = map((x, y) -> 1:min(x, y), size(tdst), size(tsrc))
tdst[sub_axes...] .= tsrc[sub_axes...]
```
"""
function embed!(tdst::AbstractTensorMap, tsrc::AbstractTensorMap)
````
**Member:** Is zeroing out existing data in …

**Author (Member):** It's not; it is actually useful to be able to seed the destination with small random noise.
```julia
    numin(tdst) == numin(tsrc) && numout(tdst) == numout(tsrc) ||
        throw(DimensionMismatch("Incompatible number of indices for source and destination"))
    S = spacetype(tdst)
    S == spacetype(tsrc) || throw(SpaceMismatch("incompatible spacetypes"))
    dom = mapreduce(infimum, ⊗, domain(tdst), domain(tsrc); init=one(S))
    cod = mapreduce(infimum, ⊗, codomain(tdst), codomain(tsrc); init=one(S))
    for (f1, f2) in fusiontrees(cod ← dom)
        @inbounds data_dst = tdst[f1, f2]
        @inbounds data_src = tsrc[f1, f2]
        sub_axes = map(Base.OneTo ∘ min, size(data_dst), size(data_src))
        data_dst[sub_axes...] .= data_src[sub_axes...]
    end
    return tdst
end

# tensor product of tensors
"""
    ⊗(t1::AbstractTensorMap, t2::AbstractTensorMap, ...) -> TensorMap
```
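The dense-array operation described in the `embed!` docstring above can be wrapped as a standalone sketch (`embed_dense!` is a hypothetical name for illustration, not part of the PR):

```julia
# Hypothetical dense-array analogue of `embed!`: copy only the overlapping
# region of `tsrc` into `tdst`, leaving the rest of `tdst` untouched.
function embed_dense!(tdst::AbstractArray, tsrc::AbstractArray)
    # Shared region: in each dimension, the smaller of the two extents.
    sub_axes = map((x, y) -> 1:min(x, y), size(tdst), size(tsrc))
    tdst[sub_axes...] .= tsrc[sub_axes...]
    return tdst
end
```

For example, embedding a 2×2 array of ones into a 3×3 array of zeros overwrites only the top-left 2×2 block; the remaining entries of the destination keep their previous values, which is why the caller is responsible for initializing it.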
**Member:** Out of curiosity, are both implementations functionally equivalent and it is just nicer to have the error checks at the top? Or is there something technically superior about the new implementation?
**Author (Member):** They would be, but there is a weird error that got introduced at some point, which I actually traced to a formatting change with a very subtle difference: unfortunately, both iterators and keyword arguments can be specified using `in` and `=`, so because previously we weren't being explicit about the `;` to separate arguments from keyword arguments, it seems we messed up the `dual` keyword argument and it became a product iterator, therefore losing the dual flag altogether. To avoid this confusion, I felt that instead of changing the `,` to `;` it might be more obvious to just have the error upfront.
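The parsing subtlety described above can be illustrated in isolation (`f` is a hypothetical function for demonstration, not from the PR):

```julia
# In Julia comprehensions and generator expressions, `in` and `=` are
# interchangeable, so without an explicit `;` a would-be keyword
# argument can silently become a second iteration specification.
f(itr; dual=false) = (collect(itr), dual)

# With an explicit `;`, `dual` is passed as a keyword argument:
res1, flag1 = f((s for s in 1:3); dual=true)   # flag1 == true

# Without the `;`, `dual = true` parses as part of the generator,
# yielding a product iterator over 1:3 × (true,); the flag is lost
# and `dual` silently keeps its default value:
res2, flag2 = f(s for s in 1:3, dual = true)   # flag2 == false
```

This is why the author prefers requiring the explicit `;` (and erroring upfront) over relying on the comma: the incorrect spelling is valid syntax and fails only silently.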