
Use native gradient API for ForwardDiff, Enzyme, Mooncake #1354

Open

yebai wants to merge 11 commits into main from hg/mooncake-forward-mode

Conversation

Member

@yebai yebai commented Apr 12, 2026

This PR calls the ForwardDiff, Enzyme, and Mooncake APIs directly, as DI has
robustness issues with both Enzyme and Mooncake.

This PR improves gradient performance in all benchmarked cases.

To support backend-specific preparation and evaluation, _prepare_gradient and
_value_and_gradient are extracted into overridable dispatch methods (see the
sketch after the list below). The ADP type parameter on LogDensityFunction is
relaxed to be unconstrained accordingly, since AutoMooncakeForward's prep is a
NamedTuple rather than a GradientPrep. A tangent_type(LogDensityAt) = NoTangent
declaration tells Mooncake to treat the function object as a constant.

- Refactor `_prepare_gradient` and `_value_and_gradient` into
  overridable dispatch methods so backends can bypass DI entirely
- Implement AutoMooncakeForward in the Mooncake extension using
  Mooncake's native derivative cache and a column-by-column sweep
- Force `friendly_tangents=false` in `_cache_config` for both
  AutoMooncake and AutoMooncakeForward to keep caches valid across calls
- Declare `tangent_type(LogDensityAt) = NoTangent` so Mooncake treats
  the function object as a constant
- Relax ADP type parameter on LogDensityFunction from
  `Union{Nothing,DI.GradientPrep}` to unconstrained, to accommodate
  custom prep objects (e.g. the NamedTuple used by AutoMooncakeForward)
- Add AutoMooncakeForward to the precompile workload and test suite,
  including a test that a `friendly_tangents=true` config is handled correctly
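
A minimal sketch of the dispatch pattern described above; the signatures and fallback bodies here are illustrative assumptions, not the PR's exact code:

using ADTypes: AbstractADType
import DifferentiationInterface as DI

# Generic fallbacks route through DI; a backend extension overrides both
# methods for its own adtype to bypass DI entirely.
function _prepare_gradient(adtype::AbstractADType, f, x::AbstractVector)
    return DI.prepare_gradient(f, adtype, x)
end

function _value_and_gradient(adtype::AbstractADType, prep, f, x::AbstractVector)
    return DI.value_and_gradient(f, prep, adtype, x)
end

# The Mooncake extension can additionally declare the callable constant,
# along the lines of (hedged; the exact form in the PR may differ):
# Mooncake.tangent_type(::Type{<:DynamicPPL.LogDensityAt}) = Mooncake.NoTangent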

@yebai yebai changed the title from "Add AutoMooncakeForward (forward-mode AD) support" to "Use native Mooncake cache API" Apr 12, 2026
@yebai yebai force-pushed the hg/mooncake-forward-mode branch 3 times, most recently from 99de26f to b9bca61 on April 12, 2026 18:40
@github-actions
Contributor

github-actions Bot commented Apr 12, 2026

Benchmark Report

  • this PR's head: 8be907f517400a2443ec93bcbf27e0faac19813b
  • base branch: c636aaaa564ddf38de2823ea869542edfa4d8af2

Computer Information

Julia Version 1.11.9
Commit 53a02c0720c (2026-02-06 00:27 UTC)
Build Info:
  Official https://julialang.org/ release
Platform Info:
  OS: Linux (x86_64-linux-gnu)
  CPU: 4 × AMD EPYC 9V74 80-Core Processor
  WORD_SIZE: 64
  LLVM: libLLVM-16.0.6 (ORCJIT, znver4)
Threads: 1 default, 0 interactive, 1 GC (on 4 virtual cores)

Benchmark Results

┌───────────────────────┬───────┬─────────────┬────────┬───────────────────────────────┬────────────────────────────┬─────────────────────────────────┐
│                       │       │             │        │       t(eval) / t(ref)        │     t(grad) / t(eval)      │        t(grad) / t(ref)         │
│                       │       │             │        │ ─────────┬──────────┬──────── │ ───────┬─────────┬──────── │ ──────────┬───────────┬──────── │
│                 Model │   Dim │  AD Backend │ Linked │     base │  this PR │ speedup │   base │ this PR │ speedup │      base │   this PR │ speedup │
├───────────────────────┼───────┼─────────────┼────────┼──────────┼──────────┼─────────┼────────┼─────────┼─────────┼───────────┼───────────┼─────────┤
│               Dynamic │    10 │    mooncake │   true │   298.22 │   252.08 │    1.18 │   7.47 │    5.98 │    1.25 │   2226.41 │   1508.19 │    1.48 │
│                   LDA │    12 │ reversediff │   true │  2653.14 │  2006.77 │    1.32 │   2.01 │    2.08 │    0.97 │   5341.90 │   4170.91 │    1.28 │
│   Loop univariate 10k │ 10000 │    mooncake │   true │ 30480.32 │ 27449.35 │    1.11 │   7.43 │    6.78 │    1.09 │ 226363.47 │ 186231.48 │    1.22 │
├───────────────────────┼───────┼─────────────┼────────┼──────────┼──────────┼─────────┼────────┼─────────┼─────────┼───────────┼───────────┼─────────┤
│    Loop univariate 1k │  1000 │    mooncake │   true │  4175.06 │  3612.79 │    1.16 │   5.49 │    5.73 │    0.96 │  22937.72 │  20684.20 │    1.11 │
│      Multivariate 10k │ 10000 │    mooncake │   true │ 35785.74 │ 28379.58 │    1.26 │   8.81 │    8.62 │    1.02 │ 315398.16 │ 244673.66 │    1.29 │
│       Multivariate 1k │  1000 │    mooncake │   true │  3971.96 │  3532.87 │    1.12 │   8.48 │    7.33 │    1.16 │  33689.69 │  25907.11 │    1.30 │
├───────────────────────┼───────┼─────────────┼────────┼──────────┼──────────┼─────────┼────────┼─────────┼─────────┼───────────┼───────────┼─────────┤
│ Simple assume observe │     1 │ forwarddiff │  false │     0.93 │     5.67 │    0.16 │  11.25 │    1.36 │    8.26 │     10.44 │      7.72 │    1.35 │
│           Smorgasbord │   201 │ forwarddiff │  false │   964.64 │   786.01 │    1.23 │  84.86 │   77.19 │    1.10 │  81856.53 │  60673.14 │    1.35 │
│           Smorgasbord │   201 │      enzyme │   true │  1392.77 │  1058.73 │    1.32 │   4.40 │    4.51 │    0.98 │   6128.58 │   4772.68 │    1.28 │
├───────────────────────┼───────┼─────────────┼────────┼──────────┼──────────┼─────────┼────────┼─────────┼─────────┼───────────┼───────────┼─────────┤
│           Smorgasbord │   201 │ forwarddiff │   true │  1342.58 │  1077.43 │    1.25 │  70.09 │   77.68 │    0.90 │  94097.19 │  83693.33 │    1.12 │
│           Smorgasbord │   201 │    mooncake │   true │  1778.52 │  1073.46 │    1.66 │   4.00 │    4.45 │    0.90 │   7113.71 │   4772.68 │    1.49 │
│           Smorgasbord │   201 │ reversediff │   true │  1328.12 │  1087.35 │    1.22 │ 127.27 │  117.72 │    1.08 │ 169031.73 │ 128001.64 │    1.32 │
├───────────────────────┼───────┼─────────────┼────────┼──────────┼──────────┼─────────┼────────┼─────────┼─────────┼───────────┼───────────┼─────────┤
│              Submodel │     1 │    mooncake │   true │     0.92 │     2.47 │    0.37 │  29.26 │    8.56 │    3.42 │     27.04 │     21.13 │    1.28 │
└───────────────────────┴───────┴─────────────┴────────┴──────────┴──────────┴─────────┴────────┴─────────┴─────────┴───────────┴───────────┴─────────┘

@yebai yebai force-pushed the hg/mooncake-forward-mode branch from b9bca61 to 9262d08 on April 12, 2026 18:44
@github-actions
Contributor

DynamicPPL.jl documentation for PR #1354 is available at:
https://TuringLang.github.io/DynamicPPL.jl/previews/PR1354/

@yebai yebai force-pushed the hg/mooncake-forward-mode branch from 9262d08 to 2a88f94 on April 12, 2026 18:49
@TuringLang TuringLang deleted a comment from github-actions Bot Apr 12, 2026
@gdalle
Contributor

gdalle commented Apr 12, 2026

@yebai can you explain the rationale behind getting rid of DifferentiationInterface?

@yebai yebai force-pushed the hg/mooncake-forward-mode branch from 9df03ef to 31cc13b on April 12, 2026 18:54
@codecov

codecov Bot commented Apr 12, 2026

Codecov Report

❌ Patch coverage is 67.12329% with 24 lines in your changes missing coverage. Please review.
✅ Project coverage is 78.30%. Comparing base (c636aaa) to head (8be907f).

Files with missing lines        Patch %   Lines
ext/DynamicPPLEnzymeExt.jl        0.00%   20 Missing ⚠️
ext/DynamicPPLMooncakeExt.jl     93.33%    2 Missing ⚠️
src/logdensityfunction.jl        50.00%    2 Missing ⚠️
Additional details and impacted files
@@            Coverage Diff             @@
##             main    #1354      +/-   ##
==========================================
- Coverage   78.62%   78.30%   -0.33%     
==========================================
  Files          50       52       +2     
  Lines        3631     3697      +66     
==========================================
+ Hits         2855     2895      +40     
- Misses        776      802      +26     

☔ View full report in Codecov by Sentry.

@yebai yebai changed the title from "Use native Mooncake cache API" to "Use native Mooncake gradient API" Apr 12, 2026
@yebai yebai changed the title from "Use native Mooncake gradient API" to "Use native gradient API for ForwardDiff, Enzyme, Mooncake" Apr 12, 2026
@yebai yebai force-pushed the hg/mooncake-forward-mode branch from 0fe3386 to e70bf72 Compare April 12, 2026 19:52
@TuringLang TuringLang deleted a comment from github-actions Bot Apr 12, 2026
Contributor

@github-actions github-actions Bot left a comment

Remaining comments which cannot be posted as a review comment to avoid GitHub Rate Limit

JuliaFormatter v1.0.62

[JuliaFormatter v1.0.62] reported by reviewdog 🐶

key in vn_leaves,
j in eachindex(axes(split_dicts, 2))


[JuliaFormatter v1.0.62] reported by reviewdog 🐶

i in eachindex(axes(params_and_stats, 1)),
key in stat_keys,


[JuliaFormatter v1.0.62] reported by reviewdog 🐶

@inline DynamicPPL.maybe_view_ad(vect::ReverseDiff.TrackedArray, range) = getindex(
vect, range
)


[JuliaFormatter v1.0.62] reported by reviewdog 🐶

getlogprior(vi) + getloglikelihood(vi) - getlogjac(vi)


[JuliaFormatter v1.0.62] reported by reviewdog 🐶

generate_mainbody!(mod, Symbol[], expr, warn, warn_threads)


[JuliaFormatter v1.0.62] reported by reviewdog 🐶

has_cf_value(CF, context, vn)


[JuliaFormatter v1.0.62] reported by reviewdog 🐶

has_cf_value_nested(Condition, context, vn)


[JuliaFormatter v1.0.62] reported by reviewdog 🐶

get_cf_value_nested(Condition, context, vn)


[JuliaFormatter v1.0.62] reported by reviewdog 🐶

remove_cf_values(Condition, context, args...)


[JuliaFormatter v1.0.62] reported by reviewdog 🐶

condition(model, values)


[JuliaFormatter v1.0.62] reported by reviewdog 🐶

contextualize(model, unfix_context(model.context, syms...))


[JuliaFormatter v1.0.62] reported by reviewdog 🐶

s.params[1].subparams[:, 1, :] ~
reshape(product_distribution(fill(InverseGamma(2, 3), n)), d, 2)


[JuliaFormatter v1.0.62] reported by reviewdog 🐶

return (vi1.transform_strategy == vi2.transform_strategy) & (vi1.values == vi2.values) &


[JuliaFormatter v1.0.62] reported by reviewdog 🐶

idx_size_type = Dims{_ndims(template, coptic.ix...;coptic.kw...)}


[JuliaFormatter v1.0.62] reported by reviewdog 🐶

push!(exs, quote
val = vnt.data.$name
if val isa VarNamedTuple || val isa PartialArray
if !Base.isempty(val)


[JuliaFormatter v1.0.62] reported by reviewdog 🐶

else
return false
end
end)


[JuliaFormatter v1.0.62] reported by reviewdog 🐶

DynamicPPL.accumulate_assume!!(::ErrorAccumulator, ::Any, ::Any, ::Any, ::VarName, ::Distribution, ::Any) = throw(
ErrorAccumulatorException()
)
DynamicPPL.accumulate_observe!!(::ErrorAccumulator, ::Distribution, ::Any, ::Union{VarName,Nothing}, ::Any) = throw(
ErrorAccumulatorException()
)



…ional

- Move DifferentiationInterface to [weakdeps]; add DynamicPPLDifferentiationInterfaceExt
  as fallback for backends without native implementations
- Add native ForwardDiff gradient via GradientConfig (DynamicPPLForwardDiffExt)
- Add native Enzyme gradient via autodiff(ReverseWithPrimal, ...) (new DynamicPPLEnzymeExt)
- Keep native Mooncake reverse/forward gradient (DynamicPPLMooncakeExt)
- Add Enzyme to test env; drop DI from test env

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
@yebai yebai force-pushed the hg/mooncake-forward-mode branch from 4f6efda to 0cb79f8 on April 12, 2026 20:09
@gdalle
Contributor

gdalle commented Apr 12, 2026

@yebai can you please open issues on the DI repository to describe the "robustness issues" you're referring to above?

@yebai yebai force-pushed the hg/mooncake-forward-mode branch from 6e4ac27 to b3c9ed1 on April 12, 2026 20:54
f = DynamicPPL.LogDensityAt(
    model, getlogdensity, varname_ranges, transform_strategy, accs
)
dx = prep.dx
Contributor

@penelopeysm @yebai you should use the version without a closure here [otherwise it's breaking]

- ForwardDiff: use DiffResults (via ForwardDiff.DiffResults) for single-pass
  value+gradient, removing the double primal evaluation (see the sketch after
  this commit message)
- ForwardDiff: remove redundant chunk_size guard in _prepare_gradient
  (tweak_adtype already normalises it to a concrete positive integer)
- AutoMooncakeForward: handle empty params edge case (loop doesn't execute)
- Mooncake _cache_config: use Accessors.@set to preserve all Config fields
  when overriding friendly_tangents=false, instead of forwarding only two
  known fields
- Mooncake @compile_workload: remove redundant single-element for-loop
- EnzymeExt: document that adtype.mode is intentionally ignored (always reverse)
- src/logdensityfunction.jl: add fallback error for _value_and_gradient with
  unknown AD backends, pointing users to ForwardDiff (the default) or DI
- test/logdensityfunction.jl: revert formatter noise (accumulate_assume!!,
  accumulate_observe!!, ::Type{T}=... syntax)
- test/Project.toml: remove accidentally-added DynamicPPL dep

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
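
A minimal sketch of the single-pass DiffResults pattern from the first bullet above (stand-in function and sizes; not the PR's exact code):

using ForwardDiff

# DiffResults ships as a ForwardDiff dependency; the PR accesses it through
# ForwardDiff's namespace rather than adding a separate (weak)dep.
const DiffResults = ForwardDiff.DiffResults

f(x) = -sum(abs2, x) / 2                     # stand-in log density
x = randn(5)
cfg = ForwardDiff.GradientConfig(f, x)       # reusable prep object
result = DiffResults.GradientResult(x)
result = ForwardDiff.gradient!(result, f, x, cfg)   # one pass fills both fields
val  = DiffResults.value(result)
grad = DiffResults.gradient(result)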
@yebai yebai force-pushed the hg/mooncake-forward-mode branch from b3c9ed1 to de0e6b4 on April 12, 2026 21:30
@yebai yebai force-pushed the hg/mooncake-forward-mode branch from d446e29 to 11fae89 on April 12, 2026 21:42
@yebai yebai force-pushed the hg/mooncake-forward-mode branch from 8f212fe to da2b77f on April 12, 2026 22:32
transform_strategy::DynamicPPL.AbstractTransformStrategy,
accs::DynamicPPL.AccumulatorTuple,
)
# Pass the plain function plus Const arguments; Enzyme is brittle with closure-like callables.
Contributor

this is not true; Enzyme (like Julia in general) is higher performance without closures
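
A minimal sketch of the closure-free call pattern under discussion; the function and arguments are illustrative, not the PR's exact code:

using Enzyme

# The callable takes its fixed state as explicit Const arguments instead of
# capturing it in a closure.
logdensity(model, x) = -sum(abs2, x) / 2   # stand-in; `model` is fixed state

x = randn(4)
dx = zero(x)                               # shadow buffer; gradient accumulates here
_, val = Enzyme.autodiff(
    Enzyme.ReverseWithPrimal,
    logdensity,
    Enzyme.Const("model state"),           # non-differentiated argument
    Enzyme.Duplicated(x, dx),
)
# dx now holds the gradient with respect to x; val is the primal value.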

Enzyme.Const(accs),
),
)
return val, copy(dx)
Contributor

you shouldn't copy here?

@yebai yebai force-pushed the hg/mooncake-forward-mode branch from da2b77f to 8c6c52a on April 12, 2026 22:37
@yebai
Member Author

yebai commented Apr 12, 2026

@wsmoses, feel free to suggest concrete code changes for Enzyme. Coding agents don't always give optimal solutions.

@yebai yebai force-pushed the hg/mooncake-forward-mode branch from 28d9100 to 8be907f on April 12, 2026 23:46
@langestefan

Please don't do this. We want more interoperability, not less. This will hurt the ecosystem.

@brenhinkeller

This seems like a somewhat strange decision. What were the robustness issues?

@@ -0,0 +1,65 @@
module DynamicPPLDifferentiationInterfaceExt
Contributor

Is there a reason for making this a package extension instead of keeping it as a dependency? DI only depends on ADTypes and LinearAlgebra, it doesn't get much more lightweight

fill!(dx, zero(eltype(dx)))
_, val = Enzyme.autodiff(
    _enzyme_gradient_mode(adtype),
    logdensity_at,
Contributor

Is the function annotation in the AutoEnzyme backend taken into account?
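
For reference, a hedged sketch of the annotation being asked about; the wiring shown is an illustrative assumption, not this PR's code:

using ADTypes: AutoEnzyme
using Enzyme

# ADTypes' AutoEnzyme can carry a function annotation type such as Enzyme.Const.
adtype = AutoEnzyme(; function_annotation=Enzyme.Const)

# A native path honouring the annotation would wrap the callable accordingly:
f(x) = -sum(abs2, x) / 2
x = randn(3)
dx = zero(x)
_, val = Enzyme.autodiff(
    Enzyme.ReverseWithPrimal, Enzyme.Const(f), Enzyme.Duplicated(x, dx)
)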

Comment on lines +5 to +6
# DiffResults is a direct dependency of ForwardDiff; access it through ForwardDiff's namespace
# rather than listing it as a separate (weak)dep of DynamicPPL.
Contributor

This is bad practice because you cannot version-bound DiffResults separately, and nothing guarantees that ForwardDiff will keep it as a dep

Comment on lines +101 to +107
@inbounds for i in eachindex(grad, dx)
    dx[i] = one(eltype(dx))
    result = value_and_derivative!!(cache, Dual(f, NoTangent()), Dual(params, dx))
    value = primal(result)
    grad[i] = tangent(result)
    dx[i] = zero(eltype(dx))
end
Contributor

I thought there was a chunked forward mode for this kind of stuff?
Related:

Comment thread src/logdensityfunction.jl
Comment on lines +410 to +420
function _value_and_gradient(adtype::ADTypes.AbstractADType, args...)
    throw(
        ArgumentError(
            "No gradient implementation found for AD backend $adtype. " *
            "If you intended to use the default (ForwardDiff), ensure that ForwardDiff is " *
            "loaded (e.g. `using ForwardDiff`). For other backends, load the corresponding " *
            "package (e.g. `using Mooncake`, `using Enzyme`) or load " *
            "DifferentiationInterface as a fallback.",
        ),
    )
end
Contributor

This is a breaking change

Contributor

And it makes ReverseDiff no longer supported by Turing, if I understand properly?

Member

Technically it is still supported but you need to import DI separately to trigger the extension
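
A hedged sketch of that usage; the model definition and constructor keyword are illustrative assumptions, not this PR's exact API:

using DynamicPPL, Distributions
using ReverseDiff
using DifferentiationInterface   # loading DI is what triggers the fallback extension
using ADTypes: AutoReverseDiff
using LogDensityProblems

@model function demo()
    x ~ Normal()
end

# With DI loaded, ReverseDiff goes through the DI fallback extension.
ldf = DynamicPPL.LogDensityFunction(demo(); adtype=AutoReverseDiff())
LogDensityProblems.logdensity_and_gradient(ldf, [0.5])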

Contributor

I think this PR also removes the relevant tests anyway?

Member

It's still tested, but even though DI is removed as a test dep, it still gets pulled into the test env via Bijectors and MarginalLogDensities, which is why the tests pass. So I guess this is dangerous since it implicitly relies on them still having DI as a dep.

pkg> why DifferentiationInterface
  Bijectors → DifferentiationInterface
  MarginalLogDensities → DifferentiationInterface
  MarginalLogDensities → Optimization → OptimizationBase → DifferentiationInterface
  MarginalLogDensities → OptimizationOptimJL → Optim → LineSearches → NLSolversBase → DifferentiationInterface
  MarginalLogDensities → OptimizationOptimJL → Optim → NLSolversBase → DifferentiationInterface
  MarginalLogDensities → OptimizationOptimJL → OptimizationBase → DifferentiationInterface

Contributor

Especially since @yebai is also removing DI from Bijectors as part of the great spring cleaning 😅
