
Convert Android LLM extension from Java to Kotlin (#19211)

Open
kirklandsign wants to merge 1 commit into main from export-D102880053

Conversation

@kirklandsign
Contributor

@kirklandsign kirklandsign commented Apr 29, 2026

Summary:

Convert LlmModule, LlmCallback, LlmGenerationConfig, and LlmModuleConfig
from Java to Kotlin while preserving all public API signatures and JNI
compatibility.

Key changes:

  • Use Kotlin idioms: external fun for native methods, companion object
    for static members, require() for validation, apply {} in Builders
  • Preserve HybridData/fbjni pattern with DoNotStrip and JvmStatic
  • Add JvmName annotations for boolean getters (isEcho, isWarming) to
    maintain Java source compatibility
  • Keep all constructor overloads as secondary constructors for backward
    compatibility with Java callers (e.g., benchmark LlmModelRunner)
  • Update BUCK target language from JAVA to KOTLIN
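
The idioms listed above can be sketched in a minimal, hypothetical module (names such as `DemoModule` are illustrative and not the actual `LlmModule` source):

```kotlin
class DemoModule private constructor(modelPath: String) {
    init {
        // require() replaces explicit IllegalArgumentException throws
        require(modelPath.isNotEmpty()) { "model path must not be empty" }
    }

    // `external fun` is Kotlin's spelling of Java's `native`; the generated
    // JNI symbol names are identical, so existing C++ bindings keep working.
    external fun generate(prompt: String): Int

    // @get:JvmName preserves a Java-visible boolean getter name like isEcho()
    // instead of Kotlin's default getEcho()
    @get:JvmName("isEcho")
    val echo: Boolean = false

    companion object {
        // const val compiles to a public static final field for Java callers
        const val LOAD_MODE_MMAP = 1

        // @JvmStatic keeps DemoModule.create(...) callable from Java
        // without going through DemoModule.Companion
        @JvmStatic
        fun create(path: String): DemoModule = DemoModule(path)
    }
}
```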

Differential Revision: D102880053

Copilot AI review requested due to automatic review settings April 29, 2026 20:40
@pytorch-bot

pytorch-bot Bot commented Apr 29, 2026

🔗 Helpful Links

🧪 See artifacts and rendered test results at hud.pytorch.org/pr/pytorch/executorch/19211

Note: Links to docs will display an error until the docs builds have been completed.

❗ 1 Active SEVs

There is 1 currently active SEV. If your PR is affected, please view it below:

❌ 1 New Failure, 1 Cancelled Job, 3 Unrelated Failures

As of commit cf99b56 with merge base d767516:

NEW FAILURE - The following job has failed:

CANCELLED JOB - The following job was cancelled. Please retry:

FLAKY - The following job failed but was likely due to flakiness present on trunk:

BROKEN TRUNK - The following jobs failed but were present on the merge base:

👉 Rebase onto the `viable/strict` branch to avoid these failures

This comment was automatically generated by Dr. CI and updates every 15 minutes.

@meta-cla meta-cla Bot added the CLA Signed This label is managed by the Facebook bot. Authors need to sign the CLA before a PR can be reviewed. label Apr 29, 2026
@meta-codesync
Contributor

meta-codesync Bot commented Apr 29, 2026

@kirklandsign has exported this pull request. If you are a Meta employee, you can view the originating Diff in D102880053.

@github-actions

This PR needs a release notes: label

If your change should be included in the release notes (i.e. would users of this library care about this change?), please use a label starting with release notes:. This helps us keep track and include your important work in the next release notes.

To add a label, you can comment to pytorchbot, for example
@pytorchbot label "release notes: none"

For more information, see
https://github.com/pytorch/pytorch/wiki/PyTorch-AutoLabel-Bot#why-categorize-for-release-notes-and-how-does-it-work.

Contributor

Copilot AI left a comment


Pull request overview

Converts the Android LLM extension surface area from Java to Kotlin while aiming to preserve the existing public API and JNI/fbjni HybridData integration.

Changes:

  • Replaced Java implementations of LlmModule, LlmModuleConfig, and LlmGenerationConfig with Kotlin equivalents.
  • Updated LlmCallback to Kotlin and adjusted BUCK to compile the target as Kotlin sources.
  • Preserved the HybridData init pattern and external/native method declarations for JNI compatibility.

Reviewed changes

Copilot reviewed 8 out of 8 changed files in this pull request and generated 2 comments.

Show a summary per file
File Description
extension/android/executorch_android/src/main/java/org/pytorch/executorch/extension/llm/LlmModule.kt Kotlin port of the LLM module wrapper and JNI entrypoints.
extension/android/executorch_android/src/main/java/org/pytorch/executorch/extension/llm/LlmModule.java Removed Java version after Kotlin conversion.
extension/android/executorch_android/src/main/java/org/pytorch/executorch/extension/llm/LlmModuleConfig.kt Kotlin port of module initialization configuration + builder.
extension/android/executorch_android/src/main/java/org/pytorch/executorch/extension/llm/LlmModuleConfig.java Removed Java version after Kotlin conversion.
extension/android/executorch_android/src/main/java/org/pytorch/executorch/extension/llm/LlmGenerationConfig.kt Kotlin port of generation configuration + builder with Java getter name compatibility.
extension/android/executorch_android/src/main/java/org/pytorch/executorch/extension/llm/LlmGenerationConfig.java Removed Java version after Kotlin conversion.
extension/android/executorch_android/src/main/java/org/pytorch/executorch/extension/llm/LlmCallback.kt Kotlin port of callback interface for generation results/stats/errors.
extension/android/BUCK Switches the LLM extension BUCK target from Java to Kotlin sources.
Comments suppressed due to low confidence (1)

extension/android/executorch_android/src/main/java/org/pytorch/executorch/extension/llm/LlmCallback.kt:46

  • onStats / onError are declared with Kotlin interface method bodies. Unless the module is compiled with Kotlin JVM default methods enabled (e.g., -Xjvm-default=all or all-compatibility), these will be emitted via DefaultImpls and will still be abstract to Java callers—breaking existing Java implementers that previously relied on Java-8 default methods (e.g., code that only overrides onResult). To preserve Java source/binary compatibility, either enable JVM default mode for this Android library and/or use @JvmDefaultWithCompatibility on these methods, or keep LlmCallback as a Java interface with real default methods.


Comment on lines +413 to +415
require(width > 0 && height > 0 && channels > 0 && image.remaining() >= expectedBytes) {
"ByteBuffer remaining (${image.remaining()}) must be at least width*height*channels*4 ($expectedBytes)."
}

Copilot AI Apr 29, 2026


Same issue here: image.remaining() is an Int but expectedBytes is a Long, so image.remaining() >= expectedBytes won’t compile. After you’ve ensured the computed byte count is within Int range, cast once and do all remaining comparisons using the same type.
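
With that suggestion applied, and assuming `expectedBytes` stays a `Long`, the check at line 413 might read (a sketch, widening `image.remaining()` once for the comparison):

```kotlin
require(width > 0 && height > 0 && channels > 0 &&
    image.remaining().toLong() >= expectedBytes) {
  "ByteBuffer remaining (${image.remaining()}) must be at least " +
      "width*height*channels*4 ($expectedBytes)."
}
```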

Comment on lines +361 to +362
expectedBytes <= Int.MAX_VALUE &&
image.remaining() >= expectedBytes) {

Copilot AI Apr 29, 2026


This require(...) mixes Long (expectedBytes) with Int (Int.MAX_VALUE, image.remaining()), which won’t compile in Kotlin because numeric types aren’t implicitly converted. Convert the Int values to Long (e.g., Int.MAX_VALUE.toLong() / image.remaining().toLong()) or cast expectedBytes to Int after the bounds check and compare as Int.

Suggested change
expectedBytes <= Int.MAX_VALUE &&
image.remaining() >= expectedBytes) {
expectedBytes <= Int.MAX_VALUE.toLong() &&
image.remaining().toLong() >= expectedBytes) {

@kirklandsign
Contributor Author

kirklandsign commented Apr 29, 2026

@claude review


@claude

claude Bot commented Apr 29, 2026

Claude finished @kirklandsign's task in 3m 30s


Review: Convert Android LLM extension from Java to Kotlin

  • Read changed files and understand the conversion
  • Check for type safety issues (Long/Int mixing flagged by Copilot)
  • Verify JNI compatibility and API preservation
  • Check Kotlin idiom usage and backward compatibility
  • Post review feedback

Overall this is a clean conversion that preserves the public API surface well. Good use of Kotlin idioms (external fun, companion object with const val, require(), apply {} in Builders, listOfNotNull). The @JvmName annotations for boolean getters and @JvmStatic on create() methods ensure Java source compatibility.

I have two concerns — one potentially blocking and one worth investigating:


Issue 1: Int vs Long comparison in require() (potential compilation failure)

LlmModule.kt:361-362 and LlmModule.kt:413

Kotlin does not implicitly widen Int to Long for comparisons the way Java does. The require block mixes types:

expectedBytes <= Int.MAX_VALUE &&          // Long <= Int
image.remaining() >= expectedBytes         // Int >= Long

In the original Java, this worked via implicit widening. In Kotlin, these comparisons will fail to compile. Fix:

expectedBytes <= Int.MAX_VALUE.toLong() &&
image.remaining().toLong() >= expectedBytes

Same pattern applies in prefillNormalizedImage at line 413:

image.remaining() >= expectedBytes   // Int >= Long

should become image.remaining().toLong() >= expectedBytes.



Issue 2: Kotlin interface default methods and Java caller compatibility

LlmCallback.kt:37,46: onStats and onError have default (empty) bodies.

The build.gradle kotlinOptions block only sets jvmTarget = "11" without -Xjvm-default=all:

kotlinOptions {
    jvmTarget = "11"
}

With Kotlin 1.9.x (used in this project), interface default method bodies are still compiled using the DefaultImpls pattern by default — meaning they appear abstract to Java callers. The existing Java caller extension/benchmark/android/benchmark/app/src/main/java/org/pytorch/minibench/LlmModelRunner.java implements LlmCallback and does NOT override onError, which would cause a compilation error.

Recommended fix — add to kotlinOptions in executorch_android/build.gradle:

kotlinOptions {
    jvmTarget = "11"
    freeCompilerArgs += ["-Xjvm-default=all"]
}

This will emit proper JVM default methods instead of DefaultImpls, matching the original Java interface behavior.

Alternatively, for the BUCK build, ensure the equivalent Kotlin compiler arg is passed. (Meta's internal build may handle this differently.)
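
The pitfall can be sketched with a hypothetical callback interface (not the actual LlmCallback source):

```kotlin
interface DemoCallback {
    fun onResult(result: String)

    // Without -Xjvm-default=all, this body is compiled into a nested
    // DemoCallback$DefaultImpls class and the interface method itself
    // remains abstract in the bytecode: a Java class implementing
    // DemoCallback is then still forced to override onError().
    fun onError(code: Int) { /* default: ignore */ }
}
```

With `-Xjvm-default=all`, the body is emitted as a real JVM default method instead, so a Java implementer that overrides only `onResult` compiles, matching the old Java-8 default-method behavior.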


Minor observations (non-blocking)

  1. LlmModule.kt:41-42: The requireNotNull calls happen before validateFilePath, which means a null path throws IllegalArgumentException("model path must not be null") rather than whatever validateFilePath would throw. This is fine and arguably better than the original Java which would NPE inside validateFilePath.

  2. Constants accessibility from Java ✓ — const val in companion objects correctly compiles to public static final fields, so LlmModule.MODEL_TYPE_TEXT and LlmModuleConfig.LOAD_MODE_MMAP remain directly accessible.

  3. Constructor overloads ✓ — All secondary constructors properly delegate to the primary private constructor, preserving backward compat with Java callers like LlmModelRunner.

  4. JNI compatibility ✓ — external fun generates the same JNI method names as Java native. The @DoNotStrip + @JvmStatic on initHybrid in the companion object preserves the fbjni HybridData pattern.

  5. LlmModuleConfig.Builder fields are internal — this means they're visible within the same Kotlin module but not from Java. The original Java used package-private. If any Java code in the same package accessed builder fields directly (unlikely given the builder pattern), this could break. Probably fine for this codebase.


Summary

The conversion is well-executed. The two issues above (Int/Long comparisons and interface default method compilation) should be verified — if the internal Meta build passes, it may be that their Kotlin compiler configuration handles both (via different Kotlin version or compiler flags). But for OSS Gradle builds, both are likely problems.


kirklandsign added a commit that referenced this pull request Apr 29, 2026
Summary:
Pull Request resolved: #19211

Convert LlmModule, LlmCallback, LlmGenerationConfig, and LlmModuleConfig
from Java to Kotlin while preserving all public API signatures and JNI
compatibility.

Key changes:
- Use Kotlin idioms: `external fun` for native methods, `companion object`
  for static members, `require()` for validation, `apply {}` in Builders
- Preserve HybridData/fbjni pattern with DoNotStrip and JvmStatic
- Add JvmName annotations for boolean getters (isEcho, isWarming) to
  maintain Java source compatibility
- Keep all constructor overloads as secondary constructors for backward
  compatibility with Java callers (e.g., benchmark LlmModelRunner)
- Update BUCK target language from JAVA to KOTLIN
- Remove package-info.java (no Kotlin equivalent needed)

Differential Revision: D102880053
@meta-codesync meta-codesync Bot changed the title Convert Android LLM extension from Java to Kotlin Convert Android LLM extension from Java to Kotlin (#19211) Apr 29, 2026
Copilot AI review requested due to automatic review settings April 29, 2026 21:31
Contributor

Copilot AI left a comment


Pull request overview

Copilot reviewed 8 out of 8 changed files in this pull request and generated 2 comments.

Comments suppressed due to low confidence (2)

extension/android/executorch_android/src/main/java/org/pytorch/executorch/extension/llm/LlmCallback.kt:47

  • LlmCallback previously provided Java 8 default methods for onStats/onError, so Java callers could implement only onResult. In Kotlin, interface methods with bodies will not become Java default methods unless the module is compiled with -Xjvm-default (e.g. all-compatibility), so this likely breaks Java source compatibility. Consider enabling -Xjvm-default=all-compatibility for this module (Gradle + Buck) or reverting LlmCallback to a Java interface / providing a Java-friendly adapter that preserves optional overrides.
extension/android/executorch_android/src/main/java/org/pytorch/executorch/extension/llm/LlmCallback.kt:28
  • The Android tests exercise LlmCallback from Kotlin, but the stated goal is Java source compatibility. Adding a small Java-side compilation/runtime test (e.g., a Java class implementing LlmCallback overriding only onResult) would catch regressions like missing Java default methods / incorrect nullability annotations.


Comment on lines +24 to +35
@Experimental
class LlmModule
private constructor(
modelType: Int,
modulePath: String?,
tokenizerPath: String?,
temperature: Float,
dataFiles: List<String>,
numBos: Int,
numEos: Int,
loadMode: Int,
) {

Copilot AI Apr 29, 2026


In Java, LlmModule was non-final and had public constructors, so it was technically extensible/overridable by API consumers. Kotlin classes/methods are final by default, so this conversion makes the class non-extendable and methods non-overridable, which is a source-compatibility break for any downstream code that subclasses LlmModule. If preserving Java source compatibility is required, mark the class (and any intended-to-be-overridable methods) as open, or explicitly document/commit to it being non-inheritable and provide an alternative extension mechanism.
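
If subclassing must remain supported, the port would need explicit `open` modifiers, e.g. (hypothetical names):

```kotlin
// Kotlin classes and their members are final unless marked `open`.
open class ExtensibleModule {
    open fun generate(prompt: String): Int = 0
}

// Without `open` on both the class and the method, this would not compile.
class TracingModule : ExtensibleModule() {
    override fun generate(prompt: String): Int {
        println("generate called")
        return super.generate(prompt)
    }
}
```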

Comment thread: extension/android/BUCK
Comment on lines 49 to +56
srcs = [
"executorch_android/src/main/java/org/pytorch/executorch/extension/llm/LlmCallback.java",
"executorch_android/src/main/java/org/pytorch/executorch/extension/llm/LlmGenerationConfig.java",
"executorch_android/src/main/java/org/pytorch/executorch/extension/llm/LlmModule.java",
"executorch_android/src/main/java/org/pytorch/executorch/extension/llm/LlmModuleConfig.java",
"executorch_android/src/main/java/org/pytorch/executorch/extension/llm/LlmCallback.kt",
"executorch_android/src/main/java/org/pytorch/executorch/extension/llm/LlmGenerationConfig.kt",
"executorch_android/src/main/java/org/pytorch/executorch/extension/llm/LlmModule.kt",
"executorch_android/src/main/java/org/pytorch/executorch/extension/llm/LlmModuleConfig.kt",
],
autoglob = False,
language = "JAVA",
language = "KOTLIN",

Copilot AI Apr 29, 2026


PR description says package-info.java was removed, but org/pytorch/executorch/extension/llm/package-info.java still exists in this package. Either remove it as described, or update the PR description to avoid confusion about what changed.


Labels

CLA Signed This label is managed by the Facebook bot. Authors need to sign the CLA before a PR can be reviewed. fb-exported meta-exported
