SKaiNET-developers/skainet-notebook

SKaiNET Kotlin Notebook

License: MIT

Maven Central

Important
About the name “SKaiNET”: it is a working name chosen early in the project’s life as part of a personal learning and experimentation effort, before any trademark considerations were known. The name is not intended to reference, infringe, or imply association with any existing trademarks, companies, or products. It is not a commercial brand and is not claimed or assignable to any company or organization that contributors may be affiliated with. If a naming conflict arises, the project name may be changed in the future.

Empower your deep learning and data science workflows with Kotlin’s type safety and expressiveness in Kotlin Notebooks.

Overview

SKaiNET is an open-source deep learning framework written in Kotlin, designed to let developers build modern AI-powered applications with ease. It integrates seamlessly with Jupyter notebooks, providing a powerful environment for interactive data analysis, machine learning experimentation, and model development.

Features

  • Full Kotlin language support in Kotlin Notebooks

  • Interactive data visualization capabilities

  • Seamless integration with popular ML libraries

  • Type-safe data manipulation

  • Rich markdown and documentation support

Using with Jupyter Notebook

IntelliJ IDEA

You can create Kotlin notebooks directly in IntelliJ IDEA using one of these methods:

Within a Project

  • Right-click on source root/folder in Project view

  • Select New → Kotlin Notebook

Scratch Notebook

  • Press Cmd+Shift+N (macOS) or Ctrl+Alt+Shift+Insert (Windows/Linux)

  • Select Kotlin Notebook

SKaiNET notebook dependency

There are two ways to pull SKaiNET into a Kotlin Notebook cell. Pick whichever fits your kernel — both load the same uber-jar and run the same SKaiNETJupyterIntegration (default imports, tensor / image renderers, ready banner, SIMD-availability check).

Via %use magic (registry-driven)

Once skainet-notebook.json lands in the Kotlin Jupyter library registry, a single line is enough — the kernel resolves the artifact coordinate and version pin from the registry and the in-classpath integration takes over from there:

%use skainet-notebook

Pin a specific version (otherwise the registry’s properties.v wins):

%use skainet-notebook(0.22.1)
%use skainet-notebook@0.22.1

Combine with other registry libraries in the usual comma-separated form:

%use skainet-notebook, dataframe, lets-plot

Note

JVM startup arguments (notably --add-modules jdk.incubator.vector, see Enabling SIMD) cannot be set from a registry descriptor — they have to be configured on the kernel itself. The integration will still load on a vanilla kernel; it will just emit the SIMD-not-active warning and run the scalar CPU path.

Via @file:DependsOn (explicit coordinate)

When you need to pin against a specific Maven coordinate without depending on the registry (offline kernels, internal mirrors, locked-down environments):

@file:DependsOn("sk.ainet.app:kotlin-notebook:0.22.1")

In either case the first cell that needs SKaiNET looks the same:

val input = data<FP32, Float>(ctx) {
    tensor<FP32, Float> {
        shape(5) {
            fromArray(
                floatArrayOf(1f, 2f, 3f, 4f, 5f)
            )
        }
    }
}

Inline image display (Quick Start)

You can convert tensors to images and show them in a Kotlin notebook using our utils:

import sk.ainet.app.notebook.tools.Layout
import sk.ainet.app.notebook.tools.toImage

imageTensor.toImage(Layout.CHW)

Enabling SIMD

SKaiNET’s CPU backend uses the JDK Vector API (jdk.incubator.vector) for SIMD-accelerated matmul, quantized kernels (Q4_K, Q6_K, Q8_0, TurboQuant), and elementwise/reduction ops. The Vector API is an incubator module — it is shipped with the JDK but not in the default module graph, so the kernel JVM has to be started with --add-modules jdk.incubator.vector for SKaiNET to install the SIMD kernels. Without that flag, SKaiNET silently falls back to scalar DefaultCpuOps.

When the notebook integration loads, it runs the same probe SKaiNET’s CPU backend uses and renders a yellow warning if the SIMD path is unreachable. You can also call checkSimd() from any cell:

checkSimd()
// SimdReport(jdkFeatureVersion=21, jdkOk=true, vectorApiAvailable=true,
//            configEnabled=true, simdActive=true, reason=Vector API kernels active)

The first DirectCpuExecutionContext.create() call also prints one of:

[SKaiNET] Using SIMD-accelerated CPU operations (Vector API)
[SKaiNET] Using standard CPU operations (Vector API not available)

IntelliJ Kotlin Notebook

In Settings → Languages & Frameworks → Kotlin → Kotlin Notebook → JVM options for Kotlin Notebook, add:

--add-modules jdk.incubator.vector

Restart the kernel (toolbar → kernel menu → Restart Kernel).

Plain Jupyter with the Kotlin kernel

Either set the env var before launching Jupyter:

export KOTLIN_JUPYTER_JAVA_OPTS="--add-modules=jdk.incubator.vector"
jupyter notebook

…or edit the kernel descriptor (jupyter kernelspec list to find the kotlin entry’s directory, open its kernel.json, and add "--add-modules", "jdk.incubator.vector" to the argv array before -jar).
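A sketch of what the edited descriptor might look like. The exact argv varies by installation and kernel version, and the jar path below is a placeholder — only the two inserted entries before -jar matter:

```json
{
  "display_name": "Kotlin",
  "language": "kotlin",
  "argv": [
    "java",
    "--add-modules", "jdk.incubator.vector",
    "-jar", "/path/to/kotlin-jupyter-kernel.jar",
    "{connection_file}"
  ]
}
```

After saving, restart the kernel so the new JVM arguments take effect.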

Disabling SIMD

If you need to compare against the scalar path (e.g. for a benchmark or to triage a numerical regression), set either:

  • JVM property: -Dskainet.cpu.vector.enabled=false

  • Environment variable: SKAINET_CPU_VECTOR_ENABLED=false
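For an A/B benchmark in plain Jupyter, one way (a sketch, reusing the KOTLIN_JUPYTER_JAVA_OPTS variable from the section above) is to keep the Vector API module on the module path but force the scalar kernels, so the only variable in the comparison is the SIMD toggle itself:

```shell
# SIMD module stays enabled; only SKaiNET's vectorized kernels are switched off.
export KOTLIN_JUPYTER_JAVA_OPTS="--add-modules=jdk.incubator.vector -Dskainet.cpu.vector.enabled=false"
```

Launch Jupyter afterwards as shown earlier; checkSimd() should then report the scalar path.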
