Feature: TGCN should support non linear_activations #596
CarloLucibello merged 10 commits into JuliaGraphs:master
Conversation
…n and gate activations
I believe the test failures are not related to my changes, but to the tests of the GNNGraph class itself, as far as I can tell.
Send it on Slack or by email.
The fact that the Lux and Flux implementations are not in sync is issue #562; the Lux ones need to be updated (not in this PR, of course).
Perfect, I will try to tackle this after that issue is solved.
I have already implemented your recommendations; the tests pass correctly.
Let me know if it's okay and I will start tackling other issues.

Hi again @CarloLucibello
Changes in this PR – Solving Issue #591
- Added a configurable gate activation (`gate_activation`) in the temporal graph layer cells for GraphNeuralNetworks, with corresponding tests.
- Added a configurable gate activation (`gate_activation`) in the temporal graph layer cells for GNNLux, with corresponding tests.

Notes & Open Questions
Currently, only the sigmoid gate activation functions have been modified based on the referenced paper.
Should we also modify the tanh activation?
- In Flux, this change is straightforward.
- In Lux, we might need an alternative to `Lux.GruCell`, as it uses tanh by default (as far as I understand) and may not be easily modifiable.

I'll have the proposal ready between Tuesday and Wednesday. If you could review it against the recommended GSoC Julia application guidelines, I'd really appreciate your help.
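To make the discussion concrete, here is a minimal sketch of the kind of change this PR describes: a GRU-style cell step where the sigmoid gates are replaced by a user-supplied `gate_activation`, while the candidate state keeps tanh. All names here (`SimpleGRUStep`, the weight fields) are illustrative, not the actual GraphNeuralNetworks.jl API; in the real TGCN cell the linear maps would be graph convolutions.

```julia
using Flux  # for sigmoid

# Hypothetical, simplified GRU-style step illustrating a configurable gate activation.
# Field names are illustrative only; the real TGCN cell uses graph convolutions.
struct SimpleGRUStep
    Wz; Uz; Wr; Ur; Wh; Uh
end

function (c::SimpleGRUStep)(h, x; gate_activation = Flux.sigmoid)
    z  = gate_activation.(c.Wz * x .+ c.Uz * h)   # update gate
    r  = gate_activation.(c.Wr * x .+ c.Ur * h)   # reset gate
    h̃ = tanh.(c.Wh * x .+ c.Uh * (r .* h))       # candidate state (still tanh)
    return (1 .- z) .* h .+ z .* h̃
end
```

With `gate_activation = Flux.sigmoid` this reduces to the standard GRU update, which is why only the sigmoid gates were made configurable and the tanh question is left open.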
Let me know your thoughts! 🚀 It's working on my end; please confirm it runs for you.