Convert Android LLM extension from Java to Kotlin (#19211)
kirklandsign wants to merge 1 commit into main from
Conversation
🔗 Helpful Links
🧪 See artifacts and rendered test results at hud.pytorch.org/pr/pytorch/executorch/19211
Note: Links to docs will display an error until the docs builds have been completed.
❗ 1 Active SEV: there is 1 currently active SEV. If your PR is affected, please view it below.
❌ 1 New Failure, 1 Cancelled Job, 3 Unrelated Failures as of commit cf99b56 with merge base d767516:
CANCELLED JOB: the following job was cancelled. Please retry.
FLAKY: the following job failed but was likely due to flakiness present on trunk.
BROKEN TRUNK: the following jobs failed but were present on the merge base. 👉 Rebase onto the `viable/strict` branch to avoid these failures.
This comment was automatically generated by Dr. CI and updates every 15 minutes.
@kirklandsign has exported this pull request. If you are a Meta employee, you can view the originating Diff in D102880053.
This PR needs a
Pull request overview
Converts the Android LLM extension surface area from Java to Kotlin while aiming to preserve the existing public API and JNI/fbjni HybridData integration.
Changes:
- Replaced Java implementations of `LlmModule`, `LlmModuleConfig`, and `LlmGenerationConfig` with Kotlin equivalents.
- Updated `LlmCallback` to Kotlin and adjusted BUCK to compile the target as Kotlin sources.
- Preserved the HybridData init pattern and external/native method declarations for JNI compatibility.
Reviewed changes
Copilot reviewed 8 out of 8 changed files in this pull request and generated 2 comments.
| File | Description |
|---|---|
| extension/android/executorch_android/src/main/java/org/pytorch/executorch/extension/llm/LlmModule.kt | Kotlin port of the LLM module wrapper and JNI entrypoints. |
| extension/android/executorch_android/src/main/java/org/pytorch/executorch/extension/llm/LlmModule.java | Removed Java version after Kotlin conversion. |
| extension/android/executorch_android/src/main/java/org/pytorch/executorch/extension/llm/LlmModuleConfig.kt | Kotlin port of module initialization configuration + builder. |
| extension/android/executorch_android/src/main/java/org/pytorch/executorch/extension/llm/LlmModuleConfig.java | Removed Java version after Kotlin conversion. |
| extension/android/executorch_android/src/main/java/org/pytorch/executorch/extension/llm/LlmGenerationConfig.kt | Kotlin port of generation configuration + builder with Java getter name compatibility. |
| extension/android/executorch_android/src/main/java/org/pytorch/executorch/extension/llm/LlmGenerationConfig.java | Removed Java version after Kotlin conversion. |
| extension/android/executorch_android/src/main/java/org/pytorch/executorch/extension/llm/LlmCallback.kt | Kotlin port of callback interface for generation results/stats/errors. |
| extension/android/BUCK | Switches the LLM extension BUCK target from Java to Kotlin sources. |
Comments suppressed due to low confidence (1)
extension/android/executorch_android/src/main/java/org/pytorch/executorch/extension/llm/LlmCallback.kt:46
`onStats`/`onError` are declared with Kotlin interface method bodies. Unless the module is compiled with Kotlin JVM default methods enabled (e.g., `-Xjvm-default=all` or `all-compatibility`), these will be emitted via `DefaultImpls` and will still be abstract to Java callers, breaking existing Java implementers that previously relied on Java 8 default methods (e.g., code that only overrides `onResult`). To preserve Java source/binary compatibility, either enable JVM default mode for this Android library and/or use `@JvmDefaultWithCompatibility` on these methods, or keep `LlmCallback` as a Java interface with real default methods.
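The shape at issue can be sketched in isolation. This is an illustrative interface with invented, simplified signatures (the real `LlmCallback` declares different members); the bodies below make the methods optional for Kotlin implementers, but they only become Java `default` methods under a JVM default mode such as `-Xjvm-default=all-compatibility`:

```kotlin
// Illustrative only: simplified, invented signatures, not the real LlmCallback.
interface CallbackSketch {
    fun onResult(result: String)

    // Bodies make these optional for Kotlin implementers. They compile to
    // Java `default` methods only when the module is built with a JVM
    // default mode (e.g. -Xjvm-default=all-compatibility); otherwise they
    // land in a DefaultImpls class and stay abstract for Java callers.
    fun onStats(tokensPerSecond: Double) {}
    fun onError(message: String) {}
}

// Kotlin implementer that overrides only onResult:
class ResultOnly : CallbackSketch {
    val seen = mutableListOf<String>()
    override fun onResult(result: String) { seen += result }
}
```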
```kotlin
require(width > 0 && height > 0 && channels > 0 && image.remaining() >= expectedBytes) {
    "ByteBuffer remaining (${image.remaining()}) must be at least width*height*channels*4 ($expectedBytes)."
}
```
Same issue here: image.remaining() is an Int but expectedBytes is a Long, so image.remaining() >= expectedBytes won’t compile. After you’ve ensured the computed byte count is within Int range, cast once and do all remaining comparisons using the same type.
```kotlin
expectedBytes <= Int.MAX_VALUE &&
image.remaining() >= expectedBytes) {
```
This require(...) mixes Long (expectedBytes) with Int (Int.MAX_VALUE, image.remaining()), which won’t compile in Kotlin because numeric types aren’t implicitly converted. Convert the Int values to Long (e.g., Int.MAX_VALUE.toLong() / image.remaining().toLong()) or cast expectedBytes to Int after the bounds check and compare as Int.
Suggested change:

```diff
-expectedBytes <= Int.MAX_VALUE &&
-image.remaining() >= expectedBytes) {
+expectedBytes <= Int.MAX_VALUE.toLong() &&
+image.remaining().toLong() >= expectedBytes) {
```
@claude review
Claude finished @kirklandsign's task in 3m 30s
Review: Convert Android LLM extension from Java to Kotlin
Overall this is a clean conversion that preserves the public API surface well. Good use of Kotlin idioms (
I have two concerns — one potentially blocking and one worth investigating:
Issue 1:
Summary:
Pull Request resolved: #19211
Convert LlmModule, LlmCallback, LlmGenerationConfig, and LlmModuleConfig from Java to Kotlin while preserving all public API signatures and JNI compatibility.
Key changes:
- Use Kotlin idioms: `external fun` for native methods, `companion object` for static members, `require()` for validation, `apply {}` in Builders
- Preserve HybridData/fbjni pattern with DoNotStrip and JvmStatic
- Add JvmName annotations for boolean getters (isEcho, isWarming) to maintain Java source compatibility
- Keep all constructor overloads as secondary constructors for backward compatibility with Java callers (e.g., benchmark LlmModelRunner)
- Update BUCK target language from JAVA to KOTLIN
- Remove package-info.java (no Kotlin equivalent needed)
Differential Revision: D102880053
498ee55 to a6a8b8d (Compare)
a6a8b8d to cf99b56 (Compare)
Pull request overview
Copilot reviewed 8 out of 8 changed files in this pull request and generated 2 comments.
Comments suppressed due to low confidence (2)
extension/android/executorch_android/src/main/java/org/pytorch/executorch/extension/llm/LlmCallback.kt:47
`LlmCallback` previously provided Java 8 default methods for `onStats`/`onError`, so Java callers could implement only `onResult`. In Kotlin, interface methods with bodies will not become Java default methods unless the module is compiled with `-Xjvm-default` (e.g. `all-compatibility`), so this likely breaks Java source compatibility. Consider enabling `-Xjvm-default=all-compatibility` for this module (Gradle + Buck), or reverting `LlmCallback` to a Java interface / providing a Java-friendly adapter that preserves optional overrides.
extension/android/executorch_android/src/main/java/org/pytorch/executorch/extension/llm/LlmCallback.kt:28
The Android tests exercise `LlmCallback` from Kotlin, but the stated goal is Java source compatibility. Adding a small Java-side compilation/runtime test (e.g., a Java class implementing `LlmCallback` overriding only `onResult`) would catch regressions like missing Java default methods / incorrect nullability annotations.
```kotlin
@Experimental
class LlmModule
private constructor(
    modelType: Int,
    modulePath: String?,
    tokenizerPath: String?,
    temperature: Float,
    dataFiles: List<String>,
    numBos: Int,
    numEos: Int,
    loadMode: Int,
) {
```
In Java, LlmModule was non-final and had public constructors, so it was technically extensible/overridable by API consumers. Kotlin classes/methods are final by default, so this conversion makes the class non-extendable and methods non-overridable, which is a source-compatibility break for any downstream code that subclasses LlmModule. If preserving Java source compatibility is required, mark the class (and any intended-to-be-overridable methods) as open, or explicitly document/commit to it being non-inheritable and provide an alternative extension mechanism.
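The `open` option can be sketched minimally. The class and method names here are invented for illustration and are not the actual `LlmModule` API:

```kotlin
// Kotlin classes and members are final by default; `open` restores the
// Java-style extensibility the review comment describes.
open class ModuleSketch(val modelPath: String) {
    open fun load(): Int = 0  // overridable only because it is `open`
}

// A downstream subclass like this compiles only because both the class
// and the overridden method are marked `open` above; omitting either
// keyword breaks it.
class TracingModule(path: String) : ModuleSketch(path) {
    var loads = 0
    override fun load(): Int {
        loads += 1
        return super.load()
    }
}
```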
```diff
 srcs = [
-    "executorch_android/src/main/java/org/pytorch/executorch/extension/llm/LlmCallback.java",
-    "executorch_android/src/main/java/org/pytorch/executorch/extension/llm/LlmGenerationConfig.java",
-    "executorch_android/src/main/java/org/pytorch/executorch/extension/llm/LlmModule.java",
-    "executorch_android/src/main/java/org/pytorch/executorch/extension/llm/LlmModuleConfig.java",
+    "executorch_android/src/main/java/org/pytorch/executorch/extension/llm/LlmCallback.kt",
+    "executorch_android/src/main/java/org/pytorch/executorch/extension/llm/LlmGenerationConfig.kt",
+    "executorch_android/src/main/java/org/pytorch/executorch/extension/llm/LlmModule.kt",
+    "executorch_android/src/main/java/org/pytorch/executorch/extension/llm/LlmModuleConfig.kt",
 ],
 autoglob = False,
-language = "JAVA",
+language = "KOTLIN",
```
PR description says package-info.java was removed, but org/pytorch/executorch/extension/llm/package-info.java still exists in this package. Either remove it as described, or update the PR description to avoid confusion about what changed.
Summary:
Convert LlmModule, LlmCallback, LlmGenerationConfig, and LlmModuleConfig from Java to Kotlin while preserving all public API signatures and JNI compatibility.
Key changes:
- Use Kotlin idioms: `external fun` for native methods, `companion object` for static members, `require()` for validation, `apply {}` in Builders
- Preserve HybridData/fbjni pattern with DoNotStrip and JvmStatic
- Add JvmName annotations for boolean getters (isEcho, isWarming) to maintain Java source compatibility
- Keep all constructor overloads as secondary constructors for backward compatibility with Java callers (e.g., benchmark LlmModelRunner)
- Update BUCK target language from JAVA to KOTLIN
- Remove package-info.java (no Kotlin equivalent needed)
Differential Revision: D102880053
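Two of the idioms the commit message mentions, `apply {}` in Builders and JvmName for boolean getters, can be sketched together. The class and field names below are invented for illustration, not the actual `LlmGenerationConfig` API:

```kotlin
// `@get:JvmName("isEcho")` makes Java callers see isEcho() rather than
// Kotlin's default getEcho(), preserving the Java-era getter name.
class GenerationConfigSketch private constructor(
    @get:JvmName("isEcho") val echo: Boolean,
    val maxNewTokens: Int,
) {
    class Builder {
        private var echo = true
        private var maxNewTokens = 128

        // `apply { ... }` mutates the Builder and returns it, giving the
        // fluent chaining style Java builders use.
        fun echo(value: Boolean) = apply { this.echo = value }
        fun maxNewTokens(value: Int) = apply { this.maxNewTokens = value }
        fun build() = GenerationConfigSketch(echo, maxNewTokens)
    }
}
```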