documentsReadOnly implemented #10659
Conversation
🦋 Changeset detected
Latest commit: efabaa5
The changes in this PR will be included in the next version bump. This PR includes changesets to release 6 packages.
Force-pushed from c88a311 to 3005df7
```ts
  throw error;
  const readOnlyHash = JSON.stringify(readOnlyPointerMap);
  const outputDocumentsReadOnly = await cache(
    'documents',
```
A readonly document is also just a document for hashing purposes.
This ensures there is only one copy of each document in the final merged array, because we use the hash to detect file uniqueness.
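A minimal sketch of the dedup idea described above. `hashContent` here is a hypothetical stand-in (based on node's `crypto`) for Codegen's real helper; the point is that a read-only and a standard document with identical content hash the same, so the merged array keeps a single copy:

```typescript
import { createHash } from 'crypto';

// Hypothetical stand-in for Codegen's hashContent helper.
function hashContent(content: string): string {
  return createHash('sha256').update(content).digest('hex');
}

// One standard and one read-only document with the same SDL.
const standardDoc = { rawSDL: 'query A { a }', type: 'standard' };
const readOnlyDoc = { rawSDL: 'query A { a }', type: 'read-only' };

// Identical content means identical hashes, regardless of type,
// so hash-based dedup collapses them into one document.
const isDuplicate =
  hashContent(standardDoc.rawSDL) === hashContent(readOnlyDoc.rawSDL);
```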
```diff
- async loadDocuments(pointer: Types.OperationDocument[]): Promise<Types.DocumentFile[]> {
+ async loadDocuments(
+   pointer: UnnormalizedTypeDefPointer,
```
The only caller of this function will be passing in UnnormalizedTypeDefPointer, and nothing else. May as well narrow the type to make the code easier to work with.
```diff
- function hashDocument(doc: Types.DocumentFile) {
-   if (doc.rawSDL) {
-     return hashContent(doc.rawSDL);
-   }
+ async function addHashToDocumentFiles(
+   documentFilesPromise: Promise<Source[]>,
+   type: 'standard' | 'read-only',
+ ): Promise<Types.DocumentFile[]> {
+   function hashDocument(doc: Source) {
+     if (doc.rawSDL) {
+       return hashContent(doc.rawSDL);
+     }

-   if (doc.document) {
-     return hashContent(print(doc.document));
-   }
+     if (doc.document) {
+       return hashContent(print(doc.document));
+     }

    return null;
  }
```
hashDocument is only used by one function, so I moved it inline to avoid jumping around.
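The inlined helper can be sketched as below. `hashContent` and `printDoc` are simplified stand-ins (assumptions) for Codegen's hash helper and graphql's `print`; the branching order matches the diff: prefer `rawSDL`, fall back to the parsed document, and return `null` when neither is present:

```typescript
import { createHash } from 'crypto';

interface Source {
  rawSDL?: string;
  document?: { body: string }; // simplified stand-in for a graphql DocumentNode
}

// Stand-ins for the real hashContent and graphql print (assumptions).
const hashContent = (s: string): string =>
  createHash('sha256').update(s).digest('hex');
const printDoc = (doc: { body: string }): string => doc.body;

// Prefer the raw SDL string; otherwise print the parsed document;
// otherwise there is nothing to hash.
function hashDocument(doc: Source): string | null {
  if (doc.rawSDL) {
    return hashContent(doc.rawSDL);
  }
  if (doc.document) {
    return hashContent(printDoc(doc.document));
  }
  return null;
}
```

Note the `null` branch: a document with neither `rawSDL` nor `document` has no hash, which matters for the dedup discussion further down.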
```diff
  documentPointers: UnnormalizedTypeDefPointer | UnnormalizedTypeDefPointer[],
  config: Types.Config,
- ): Promise<Types.DocumentFile[]> {
+ ): Promise<Source[]> {
```
When this function finishes, we only have the base Source. Types.DocumentFile is what we get after processing the Source (adding hash and type).
```ts
  );

  return newDocuments.map((document, index) => ({
    ...documents[index],
```
Context for future me:
Previously we threw away the hash when optimising the doc.
Now that we introduce type ('read-only' or 'standard'), we need it for further processing, so we need to keep it like this.
Maybe we should not add these custom Codegen meta fields directly to the node, since the optimizeDocuments implementation may remove them in the future.
Maybe we can add a special key/value to reduce this risk, e.g. `__meta: { hash: 'abc', type: 'standard' }` 🤔
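A hypothetical sketch of the `__meta` grouping idea (all names here are assumptions from the comment, not the PR's actual implementation): keeping Codegen's bookkeeping under a single key makes it less likely that an optimizer strips an individual field.

```typescript
// Codegen-specific bookkeeping grouped under one key (hypothetical).
interface DocumentMeta {
  hash: string;
  type: 'standard' | 'read-only';
}

interface DocumentWithMeta {
  rawSDL?: string;
  __meta?: DocumentMeta;
}

// Attach the grouped metadata without mutating the original document.
function withMeta(doc: { rawSDL?: string }, meta: DocumentMeta): DocumentWithMeta {
  return { ...doc, __meta: meta };
}

const tagged = withMeta({ rawSDL: 'query { a }' }, { hash: 'abc', type: 'standard' });
```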
```ts
  hash: string;
  type: 'standard' | 'read-only';
```
Source is the default result of parsing the docs.
DocumentFile is the "processed" source, with Codegen metadata.
We may want to group them to avoid the edge cases mentioned in: https://github.com/dotansimha/graphql-code-generator/pull/10659/changes#r3032421307
- Flow documentsReadOnly to typescript-operations
- Set up documents-readonly dev-test
- Update dev-tests output
- Merge standard and read-only documents
- Fix Source vs Types.DocumentFile usage for type correctness
Force-pushed from 8f4ba67 to df25ba7
Documents now have a type of 'standard' | 'read-only'. If we don't supply it, the plugin will generate nothing.
This utility helps by creating Types.DocumentFile with hash and type hardcoded. If we need to make the params partial with overrides in the future, we could.
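A minimal sketch of such a utility, under the assumption that `hashContent` wraps a content hash (node's `crypto` here) and that the field names match the PR discussion; `makeDocumentFile` itself is a hypothetical name:

```typescript
import { createHash } from 'crypto';

interface DocumentFile {
  rawSDL: string;
  hash: string;
  type: 'standard' | 'read-only';
}

// Stand-in for Codegen's hashContent helper (assumption).
const hashContent = (s: string): string =>
  createHash('sha256').update(s).digest('hex');

// Builds a complete DocumentFile so callers never forget hash/type.
// Could later accept a Partial<DocumentFile> override if needed.
function makeDocumentFile(
  rawSDL: string,
  type: 'standard' | 'read-only' = 'standard',
): DocumentFile {
  return { rawSDL, hash: hashContent(rawSDL), type };
}

const file = makeDocumentFile('query { a }', 'read-only');
```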
Force-pushed from 631abbd to 40f5ca8
Force-pushed from 40f5ca8 to 819cf71
```ts
const populateDocumentPointerMap = (
  allDocumentsDenormalizedPointers: Types.OperationDocument[],
): UnnormalizedTypeDefPointer => {
  const pointer: UnnormalizedTypeDefPointer = {};
  for (const denormalizedPtr of allDocumentsDenormalizedPointers) {
    if (typeof denormalizedPtr === 'string') {
      pointer[denormalizedPtr] = {};
    } else if (typeof denormalizedPtr === 'object') {
      Object.assign(pointer, denormalizedPtr);
    }
  }
  return pointer;
};
```
Inlining populateDocumentPointerMap makes it a bit easier to read, as we don't jump around the file. We could take this pure function and put it at the bottom of the file if needed.
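A usage sketch of the pointer-map normalization, with the types simplified to self-contained stand-ins: string pointers become keys with an empty config, and object pointers are merged in as-is.

```typescript
// Simplified stand-ins for the real Codegen types (assumptions).
type UnnormalizedTypeDefPointer = Record<string, unknown>;
type OperationDocument = string | Record<string, unknown>;

const populateDocumentPointerMap = (
  allDocumentsDenormalizedPointers: OperationDocument[],
): UnnormalizedTypeDefPointer => {
  const pointer: UnnormalizedTypeDefPointer = {};
  for (const denormalizedPtr of allDocumentsDenormalizedPointers) {
    if (typeof denormalizedPtr === 'string') {
      // A bare glob/path gets an empty per-pointer config.
      pointer[denormalizedPtr] = {};
    } else if (typeof denormalizedPtr === 'object') {
      // An object pointer already maps paths to configs; merge it in.
      Object.assign(pointer, denormalizedPtr);
    }
  }
  return pointer;
};

const map = populateDocumentPointerMap([
  'src/**/*.graphql',
  { 'ops.graphql': { noRequire: true } },
]);
```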
```diff
  export interface DocumentFile extends Source {
-   hash?: string;
+   hash: string;
+   type: 'standard' | 'read-only';
```
Two non-optional fields are introduced here. Any existing custom plugin or preset that constructs a DocumentFile manually will now have a TypeScript compile error. This is a breaking change, not a minor one (as mentioned in the changeset).
> Any existing custom plugin or preset that constructs a DocumentFile manually will now have a TypeScript compile error.

That's a good point, I can make this optional and treat no type as standard.
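A sketch of the "no type means standard" fallback suggested here, with both new fields made optional again so manually constructed DocumentFiles keep compiling (field names follow the discussion; the helper name is hypothetical):

```typescript
interface DocumentFile {
  rawSDL?: string;
  hash?: string;
  // Optional to avoid the breaking change; absence means 'standard'.
  type?: 'standard' | 'read-only';
}

// Single place that resolves the default, so plugins don't each
// re-implement the fallback.
function effectiveType(doc: DocumentFile): 'standard' | 'read-only' {
  return doc.type ?? 'standard';
}
```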
```ts
    processedFile[file.hash] = true;

    return prev;
  }, []);
```
Looks like the reduce accumulator `prev` is never used — `outputDocuments.push(file)` writes to an external array, and the return value of the entire reduce is discarded.
It works, by chance, but this is confusing. Maybe a plain for...of loop would be cleaner?
Additionally, if a document has `hash: null` (which hashDocument() can return when rawSDL and document are both absent), `processedFile[null]` coerces the key to the string "null" — meaning all null-hash documents get deduped down to one, which may be wrong.
> Looks like the reduce accumulator `prev` is never used — `outputDocuments.push(file)` writes to an external array. The return value of the entire reduce is discarded. It works, by chance, but this is confusing. Maybe a plain for...of loop would be cleaner?

That's true! I think I was trying to reduce the result into `outputDocuments`, but it's accessible inside the loop anyway. I've made it for...of in 7ec7e06.

> Additionally, if a document has `hash: null` (which hashDocument() can return when rawSDL and document are both absent), `processedFile[null]` will be "null" — meaning all null-hash documents get deduped down to 1, which may be wrong.

Huh, that's true, the typecheck in this repo is not strict enough. Something I'll need to configure after the major release.
I wonder when null would happen though 🤔 When Codegen is parsing an empty file?
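A sketch of the for...of dedup with an explicit guard for null hashes, so documents without content are kept rather than collapsed into one entry (a suggestion based on this thread, not the PR's actual code):

```typescript
interface Doc {
  hash: string | null;
  rawSDL?: string;
}

function dedupeByHash(docs: Doc[]): Doc[] {
  const seen = new Set<string>();
  const out: Doc[] = [];
  for (const doc of docs) {
    if (doc.hash === null) {
      // No content to compare: keep the document instead of letting
      // all null hashes collide on the string key "null".
      out.push(doc);
      continue;
    }
    if (seen.has(doc.hash)) {
      continue; // already emitted a document with this content
    }
    seen.add(doc.hash);
    out.push(doc);
  }
  return out;
}
```

Using a `Set<string>` instead of a plain object also sidesteps the implicit key coercion that caused the `"null"` collision in the first place.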
```
'@graphql-codegen/client-preset': minor
---

Add support for `documentsReadOnly`
```
It's the last chance to consider other names, such as externalDocuments or referenceDocuments.
documentsReadOnly has its clear benefits as well, so I don't mind keeping it.
The pro of documentsReadOnly is that it looks related to documents.
The con of documentsReadOnly is that it doesn't sound like English.
externalDocuments sounds better. I'm happy to go with that if you are 🙂
Sorry for bringing this one up!
It's really hard to decide!

- `documentsReadOnly` has clear benefits. It is self-descriptive and it appears near `documents` (for code completion), which is quite valuable as well. But it is not proper English.
- `externalDocuments`: The feature is about documents that are loaded for reference but don't produce output. ReadOnly describes how the current output treats them, not a property of the documents themselves, which makes the `documentsReadOnly` name slightly confusing. `externalDocuments` is a proper name, in terms of English and semantics, but it is not self-explanatory.

I guess a user of this new feature would have to go to the documentation in any case (either a slightly confusing name or a not-informative-enough name), so a better-sounding name has a slight benefit, imho.
```diff
  documentPointers: UnnormalizedTypeDefPointer | UnnormalizedTypeDefPointer[],
  config: Types.Config,
- ): Promise<Types.DocumentFile[]> {
+ ): Promise<Source[]> {
```
The 2 new fields were added to Types.DocumentFile, not to Source, which makes it harder to see where the contract is established.
Sorry, I don't fully understand this statement. However, I've traced the CLI code to use the right type:

- before adding metadata (hash, type), the documents are `Source`
- after adding metadata, the documents are `DocumentFile`

I feel this may not answer your concern. Could you please help me understand this a bit more?
I was a bit worried that Types.DocumentFile[] has 2 new properties (hash and type) and Source doesn't have them, which might mislead a user.
But it is correct from a polymorphism perspective. So we can leave it as is.
```diff
  const processedFile: Record<string, true> = {};
  const mergedDocuments = [
    ...outputDocumentsStandard,
    ...outputDocumentsReadOnly,
  ];
  for (const file of mergedDocuments) {
    if (processedFile[file.hash]) {
      continue;
    }
-   });
-
-   outputDocuments = result.documents;
+   outputDocuments.push(file);
+   processedFile[file.hash] = true;
  }
```
I've considered forwarding documentsReadOnly as-is but hit a few annoying cases:

- documents are processed a lot after the initial loading stage here. If we kept documentsReadOnly separate, we'd need to run the same functions over them.
- Plugins use positional params, which means `documentsReadOnly` would need to be at the last position to avoid a breaking change. This works, but feels disconnected.

For these reasons, I feel it's easier to merge them as early as possible and run the same processing over both types, so plugins like typescript-operations can handle the filtering without having to access another param.
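The merge-early approach described above can be sketched like this (function and variable names are assumptions for illustration, not the PR's actual identifiers): both sets are combined into one array at load time, each document tagged with its type, and a plugin later filters on that tag.

```typescript
interface TaggedDoc {
  rawSDL: string;
  type: 'standard' | 'read-only';
}

// Merge both document sets early, tagging each with its origin so
// every downstream processing step sees a single homogeneous array.
function mergeDocuments(standard: string[], readOnly: string[]): TaggedDoc[] {
  return [
    ...standard.map(rawSDL => ({ rawSDL, type: 'standard' as const })),
    ...readOnly.map(rawSDL => ({ rawSDL, type: 'read-only' as const })),
  ];
}

const merged = mergeDocuments(
  ['query A { a }'],
  ['fragment F on T { f }'],
);

// A plugin like typescript-operations can then skip read-only docs
// when emitting output, while still resolving their fragments.
const emitted = merged.filter(doc => doc.type === 'standard');
```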
Description

`documentsReadOnly` implemented, for safe handling of fragment includes between packages in the monorepo.

Related: [#10658](https://github.com//issues/10658)
Type of change
How Has This Been Tested?