apply power limit to gross imports/exports #123
Merged
Conversation
JulianGeis
approved these changes
Oct 13, 2025
Contributor
JulianGeis
left a comment
I reviewed the PR:
- the workflow runs through in 365H resolution
- the import and export limits are met (15 MW in 2020, 35 MW in 2045)
- the code is well documented
Possible problems:
- This PR creates many auxiliary variables. For the 365H run this did not significantly increase the solving time, but it should maybe be checked with a 3H run.
- When I import the networks now, there is a PerformanceWarning:

      /home/julian-geis/mambaforge/envs/p-de-public_de/lib/python3.12/site-packages/pypsa/io.py:442: PerformanceWarning: DataFrame is highly fragmented. This is usually the result of calling `frame.insert` many times, which has poor performance. Consider joining all columns at once using `pd.concat(axis=1)` instead. To get a de-fragmented frame, use `newframe = frame.copy()`

  I am not sure if this is directly related, but it seems likely. I tried a few things, like creating all variables at once and assigning them, but that did not really work out; maybe just keep it as is.
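The pattern the warning points at is generic pandas behaviour, independent of this PR: adding columns one at a time fragments the frame's memory layout. A minimal sketch of the fix the warning suggests (the column names and sizes here are made up for illustration, not taken from the PyPSA code):

```python
# Building a DataFrame by repeated column insertion fragments its memory
# layout and can trigger the PerformanceWarning; collecting the columns
# first and joining them in one pd.concat call avoids that.
import numpy as np
import pandas as pd

# Hypothetical auxiliary columns, e.g. one per line and year (8760 snapshots).
cols = {f"aux_{i}": np.zeros(8760) for i in range(50)}

# Fragmenting pattern (warns for many columns):
#   df = pd.DataFrame(index=range(8760))
#   for name, values in cols.items():
#       df[name] = values

# Join all columns at once instead:
df = pd.concat({name: pd.Series(values) for name, values in cols.items()}, axis=1)
print(df.shape)  # (8760, 50)
```

Whether this is applicable inside `pypsa/io.py` is a separate question; the sketch only shows the idiom the warning recommends.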
Collaborator
Author
Thanks for the review! I rewrote the constraint with more Linopy-esque syntax; in the low-resolution model the warnings are gone, hopefully for the high-res as well. We will see.
Description
I learned a nice trick (for some of you this is probably old news, but I didn't know it and I think it is very helpful):

In PyPSA-DE we want to limit the total capacity of power imports. A relatively straightforward way to attempt this is to define a constraint for every snapshot `t`:

    sum_l  Line-s[l, t]  <=  import_limit

However, the line flow variables `Line-s` may be both positive and negative. Hence this limit only applies to the *net* import. In other words: if the model would like to import more from France, it can do so as long as it exports a bit to Switzerland. (This can get pretty bad; sometimes we observe gross imports that are twice as big as the limit.)

To avoid this behaviour, what you actually want to constrain are only the gross imports, i.e., the positive parts of the flows:

    sum_l  max(Line-s[l, t], 0)  <=  import_limit

However, `max` is a nonlinear function. Here I thought it was impossible to achieve this in PyPSA-DE without converting all lines to links, or the LP to a MILP. But actually there is another way, and ChatGPT gave it away pretty quickly: for every incoming flow, introduce an auxiliary variable

    aux[l, t] >= 0,    aux[l, t] >= Line-s[l, t]

then add the constraint

    sum_l  aux[l, t]  <=  import_limit

and voilà, the `max` has been linearized and the constraint applies to the gross import. (Since `aux[l, t] >= max(Line-s[l, t], 0)` by construction, bounding the sum of the auxiliary variables bounds the sum of the positive flow parts.)

Before asking for a review for this PR, make sure to complete the following checklist:
- `ariadne_all` completes without errors
- `export_ariadne_variables` has been adapted to the changes
- `Changelog.md` has been updated
- `main` has been merged into the PR
- `YYYYMMDDdescriptive_title`
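The auxiliary-variable trick from the description can be checked on a toy LP. This is a minimal sketch using scipy's `linprog`, not the actual PyPSA-DE/Linopy implementation; the two "lines", the 15 MW limit, and the variable names are illustrative assumptions. It shows that with only a net constraint the model imports twice the limit by simultaneously exporting, while the auxiliary variables cap the gross import:

```python
# Two flows into the region: f1 (import, maximized) and f2 (export, down
# to -limit). Variable order in the gross case: [f1, f2, fp1, fp2], where
# fp_i are the auxiliary positive parts of the flows.
from scipy.optimize import linprog

limit = 15.0

# Net formulation: f1 + f2 <= limit. Exporting on line 2 "frees up" room,
# so line 1 can import up to 2x the limit.
net = linprog(
    c=[-1, 0],                       # maximize f1
    A_ub=[[1, 1]],                   # f1 + f2 <= limit
    b_ub=[limit],
    bounds=[(0, None), (-limit, 0)], # f2 may only export
)

# Gross formulation: fp_i >= 0 (via bounds), fp_i >= f_i, and the limit
# applies to the auxiliary variables, i.e. the positive flow parts.
gross = linprog(
    c=[-1, 0, 0, 0],
    A_ub=[
        [0, 0, 1, 1],    # fp1 + fp2 <= limit
        [1, 0, -1, 0],   # f1 <= fp1
        [0, 1, 0, -1],   # f2 <= fp2
    ],
    b_ub=[limit, 0, 0],
    bounds=[(0, None), (-limit, 0), (0, None), (0, None)],
)

print(-net.fun)    # 30.0: net limit lets gross imports reach 2x the limit
print(-gross.fun)  # 15.0: auxiliary variables enforce the gross limit
```

The same structure maps onto the PR's constraint: one auxiliary variable per incoming line and snapshot, each lower-bounded by the corresponding flow and by zero, with the import limit applied to their sum.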