Merged
1 change: 1 addition & 0 deletions .github/actions/integration/snowflake.sh
@@ -12,6 +12,7 @@ export CUBEJS_DB_SNOWFLAKE_WAREHOUSE=COMPUTE_WH
export CUBEJS_DB_USER=$DRIVERS_TESTS_SNOWFLAKE_CUBEJS_DB_USER
export CUBEJS_DB_PASS=$DRIVERS_TESTS_SNOWFLAKE_CUBEJS_DB_PASS

yarn lerna run --concurrency 1 --stream --no-prefix integration:snowflake
yarn lerna run --concurrency 1 --stream --no-prefix smoke:snowflake

echo "::endgroup::"
1 change: 0 additions & 1 deletion docs-mintlify/admin/deployment/deployment-types.mdx
@@ -54,7 +54,6 @@ You can try a Development deployment by
<Info>

Available on [all paid plans](https://cube.dev/pricing).
You can also choose a [deployment tier](/admin/account-billing/pricing#deployment-tiers).

</Info>

2 changes: 1 addition & 1 deletion docs-mintlify/admin/deployment/environments.mdx
@@ -115,7 +115,7 @@ credentials**][ref-credentials].
[ref-dev-mode]: /docs/data-modeling/dev-mode
[ref-deployment-types]: /docs/deployment/cloud/deployment-types
[ref-api-instance-scalability]: /docs/deployment/cloud/scalability#auto-scaling-of-api-instances
[ref-pricing-deployment-tiers]: /admin/account-billing/pricing#deployment-tiers
[ref-pricing-deployment-tiers]: /admin/account-billing/pricing
[ref-suspend]: /docs/deployment/cloud/auto-suspension
[ref-overview]: /docs/workspace/integrations#review-integrations
[ref-credentials]: /docs/workspace/integrations#view-api-credentials
6 changes: 2 additions & 4 deletions docs-mintlify/docs.json
@@ -37,7 +37,6 @@
{
"group": "Migrate from Cube Core",
"pages": [
"docs/getting-started/migrate-from-core",
"docs/getting-started/migrate-from-core/upload-with-cli",
"docs/getting-started/migrate-from-core/import-github-repository",
"docs/getting-started/migrate-from-core/import-gitlab-repository-via-ssh",
@@ -166,8 +165,8 @@

{
"group": "Dynamic Data Models",
"root": "docs/data-modeling/dynamic/index",
"pages": [
"docs/data-modeling/dynamic",
"docs/data-modeling/dynamic/jinja",
"docs/data-modeling/dynamic/javascript",
"docs/data-modeling/dynamic/code-reusability-export-and-import",
@@ -179,8 +178,7 @@
"pages": [
"docs/data-modeling/visual-modeler",
"docs/data-modeling/data-model-ide",
"docs/data-modeling/dev-mode",
"docs/data-modeling/sql-runner"
"docs/data-modeling/dev-mode"
]
}
]
33 changes: 23 additions & 10 deletions docs-mintlify/docs/data-modeling/dynamic/index.mdx
@@ -3,19 +3,32 @@ title: Dynamic Data Models
description: Generate data models programmatically using Jinja with Python or JavaScript for dynamic schema creation.
---

Dynamic data models
Cube supports authoring data models dynamically — useful for de-duplicating
common patterns across cubes, generating models from a remote source, or
adapting the schema per tenant at runtime.

<Frame>
Pick the approach that matches the language your data models are written in:

<CardGroup cols={2}>
<Card title="Jinja & Python" icon="brand-python" href="/docs/data-modeling/dynamic/jinja">
Template YAML data models with Jinja, and use Python for loops, includes,
and runtime generation.
</Card>
<Card title="JavaScript" icon="brand-javascript" href="/docs/data-modeling/dynamic/javascript">
Generate cubes and views on-the-fly from JavaScript data models using
`asyncModule()`.
</Card>
</CardGroup>

## How it fits together

The diagrams below show how YAML and JavaScript data models are parsed and
compiled before they're served by Cube.

<Frame caption="YAML data models compiled with COMPILE_CONTEXT">
<img src="https://ucarecdn.com/44e17cc4-60cb-40c4-889b-aba17def7136/" />
</Frame>

<Frame>
<Frame caption="JavaScript data models compiled with COMPILE_CONTEXT">
<img src="https://ucarecdn.com/7379561c-418e-49e2-9b94-be7c13b0f2de/" />
</Frame>

<CardGroup cols={2}>
<Card title="Jinja & Python" img="https://static.cube.dev/icons/python.svg" href="dynamic/jinja">
</Card>
<Card title="JavaScript" img="https://static.cube.dev/icons/javascript.svg" href="dynamic/javascript">
</Card>
</CardGroup>

This file was deleted.

1 change: 0 additions & 1 deletion docs-mintlify/docs/integrations/power-bi/kerberos.mdx
@@ -9,7 +9,6 @@ It can be used to authenticate requests to [DAX API][ref-dax-api].
<Info>

Available on [Enterprise and above plans](https://cube.dev/pricing).
Also requires the M [deployment tier](/admin/account-billing/pricing#deployment-tiers).

</Info>

1 change: 0 additions & 1 deletion docs-mintlify/docs/integrations/power-bi/ntlm.mdx
@@ -9,7 +9,6 @@ authenticate requests to [DAX API][ref-dax-api].
<Info>

Available on [Enterprise and above plans](https://cube.dev/pricing).
Also requires the M [deployment tier](/admin/account-billing/pricing#deployment-tiers).

</Info>

1 change: 0 additions & 1 deletion docs-mintlify/reference/core-data-apis/dax-api/index.mdx
@@ -15,7 +15,6 @@ support for Power BI features.
<Info>

Available on [Enterprise and above plans](https://cube.dev/pricing).
Also requires the M [deployment tier](/admin/account-billing/pricing#deployment-tiers).

</Info>

1 change: 0 additions & 1 deletion docs-mintlify/reference/core-data-apis/mdx-api.mdx
@@ -14,7 +14,6 @@ native [PivotTable][link-pivottable] in Excel.
<Info>

Available on [Enterprise and above plans](https://cube.dev/pricing).
Also requires the M [deployment tier](/admin/account-billing/pricing#deployment-tiers).

</Info>

2 changes: 1 addition & 1 deletion packages/cubejs-backend-shared/src/env.ts
@@ -2012,7 +2012,7 @@ const variables: Record<string, (...args: any) => any> = {
.default('true')
.asBoolStrict(),
cubestoreSendableParameters: () => get('CUBEJS_CUBESTORE_SENDABLE_PARAMETERS')
.default('false')
.default('true')
.asBoolStrict(),
externalDefault: () => get('CUBEJS_EXTERNAL_DEFAULT')
.default('true')
25 changes: 25 additions & 0 deletions packages/cubejs-backend-shared/src/time.ts
@@ -12,6 +12,31 @@ export type TimeSeriesOptions = {
};
type ParsedInterval = Partial<Record<unitOfTime.DurationConstructor, number>>;

// Hand-rolled zero-padders for date/time formatting hot paths. `String(n).padStart`
// allocates an extra intermediate string per call; in driver result hydration
// (Snowflake TIMESTAMP*, Postgres TIMESTAMPTZ) we measured the range-checked
// template-literal versions ~15–20% faster than padStart, so we keep them
// here as a shared utility rather than reimplementing per driver.
export const pad2 = (n: number): string => (n < 10 ? `0${n}` : `${n}`);

export const pad3 = (n: number): string => {
if (n < 10) return `00${n}`;
if (n < 100) return `0${n}`;

return `${n}`;
};

export const pad4 = (n: number): string => {
if (n < 1000) {
if (n < 10) return `000${n}`;
if (n < 100) return `00${n}`;

return `0${n}`;
}

return `${n}`;
};

const GRANULARITY_LEVELS: Record<string, number> = {
second: 1,
minute: 2,
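The padders added to `time.ts` above can be exercised with a quick standalone sketch (the functions are reproduced here verbatim from the diff so the snippet runs on its own; the sample date is arbitrary):

```typescript
// Range-checked zero-padders, as added to @cubejs-backend/shared/src/time.ts.
const pad2 = (n: number): string => (n < 10 ? `0${n}` : `${n}`);

const pad3 = (n: number): string => {
  if (n < 10) return `00${n}`;
  if (n < 100) return `0${n}`;
  return `${n}`;
};

const pad4 = (n: number): string => {
  if (n < 1000) {
    if (n < 10) return `000${n}`;
    if (n < 100) return `00${n}`;
    return `0${n}`;
  }
  return `${n}`;
};

// Typical hot-path use: formatting a UTC timestamp with six pad calls per
// value, avoiding padStart's intermediate string allocation on each call.
const d = new Date(Date.UTC(2024, 2, 7, 5, 9, 3, 42));
const formatted =
  `${pad4(d.getUTCFullYear())}-${pad2(d.getUTCMonth() + 1)}-${pad2(d.getUTCDate())}` +
  `T${pad2(d.getUTCHours())}:${pad2(d.getUTCMinutes())}:${pad2(d.getUTCSeconds())}.${pad3(d.getUTCMilliseconds())}`;
console.log(formatted); // 2024-03-07T05:09:03.042
```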
24 changes: 2 additions & 22 deletions packages/cubejs-postgres-driver/src/type-parsers.ts
@@ -1,3 +1,5 @@
import { pad2, pad3, pad4 } from '@cubejs-backend/shared';

/** OID 1082 — Postgres emits `YYYY-MM-DD`. */
export const dateTypeParser = (val: string): string => `${val}T00:00:00.000`;

@@ -12,28 +14,6 @@ export const timestampTypeParser = (val: string): string => {
return `${val.slice(0, 10)}T${val.slice(11, 19)}.${ms}`;
};

// Hand-rolled zero-padders for the TIMESTAMPTZ hot path. `String(n).padStart`
// allocates an extra intermediate string per call; with six pad calls per value
// that measured ~15–20% slower in our microbenchmark than these range-checked
// template literals, so we keep the explicit versions.
const pad2 = (n: number): string => (n < 10 ? `0${n}` : `${n}`);
const pad3 = (n: number): string => {
if (n < 10) return `00${n}`;
if (n < 100) return `0${n}`;

return `${n}`;
};
const pad4 = (n: number): string => {
if (n < 1000) {
if (n < 10) return `000${n}`;
if (n < 100) return `00${n}`;

return `0${n}`;
}

return `${n}`;
};

/**
* OID 1184 — same as TIMESTAMP, suffixed with `(+|-)HH`, `(+|-)HH:MM`, or
* `(+|-)HH:MM:SS`. We shift the value into UTC before formatting.
@@ -10,7 +10,7 @@ export async function testWithPreAggregation(
const preAggSql = preAggregationsDescription
.loadSql[0]
// Without `ON COMMIT DROP` temp tables are session-bound, and can live across multiple transactions
.replace(/CREATE TABLE (.+) AS SELECT/, 'CREATE TEMP TABLE $1 ON COMMIT DROP AS SELECT');
.replace(/CREATE TABLE (.+) AS\s+(SELECT|WITH)/, 'CREATE TEMP TABLE $1 ON COMMIT DROP AS $2');
const preAggParams = preAggregationsDescription.loadSql[1];

const queries = [
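The widened pattern can be sanity-checked in isolation (inputs are hypothetical; the second one is the case the change enables, since pre-aggregation SQL may now begin with a CTE rather than a plain `SELECT`):

```typescript
// The updated rewrite turns a pre-aggregation's CREATE TABLE into a
// transaction-scoped temp table, handling both `AS SELECT` and `AS WITH`.
const toTempTable = (sql: string): string =>
  sql.replace(/CREATE TABLE (.+) AS\s+(SELECT|WITH)/, 'CREATE TEMP TABLE $1 ON COMMIT DROP AS $2');

console.log(toTempTable('CREATE TABLE pre_agg AS SELECT id FROM orders'));
// CREATE TEMP TABLE pre_agg ON COMMIT DROP AS SELECT id FROM orders

console.log(toTempTable('CREATE TABLE pre_agg AS WITH base AS (SELECT id FROM orders) SELECT * FROM base'));
// CREATE TEMP TABLE pre_agg ON COMMIT DROP AS WITH base AS (SELECT id FROM orders) SELECT * FROM base
```

The old pattern (`AS SELECT` only) left CTE-based SQL untouched, so those pre-aggregations escaped the `ON COMMIT DROP` rewrite.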
6 changes: 4 additions & 2 deletions packages/cubejs-snowflake-driver/package.json
@@ -21,14 +21,15 @@
"build": "rm -rf dist && npm run tsc",
"tsc": "tsc",
"watch": "tsc -w",
"integration": "vitest run",
"integration:snowflake": "vitest run",
"lint": "eslint src/* --ext .ts",
"lint:fix": "eslint --fix src/* --ext .ts"
},
"dependencies": {
"@aws-sdk/client-s3": "^3.726.0",
"@cubejs-backend/base-driver": "1.6.39",
"@cubejs-backend/shared": "1.6.39",
"date-fns-timezone": "^0.1.4",
"snowflake-sdk": "^2.2.0"
},
"license": "Apache-2.0",
@@ -40,6 +41,7 @@
},
"devDependencies": {
"@cubejs-backend/linter": "1.6.39",
"typescript": "~5.2.2"
"typescript": "~5.2.2",
"vitest": "^4"
}
}
91 changes: 22 additions & 69 deletions packages/cubejs-snowflake-driver/src/SnowflakeDriver.ts
@@ -4,7 +4,7 @@
* @fileoverview The `SnowflakeDriver` and related types declaration.
*/

import { assertDataSource, getEnv, } from '@cubejs-backend/shared';
import { assertDataSource, getEnv } from '@cubejs-backend/shared';
import snowflake, { Column, Connection, RowStatement } from 'snowflake-sdk';
import {
BaseDriver,
@@ -20,82 +20,19 @@ import {
TableStructure,
UnloadOptions,
} from '@cubejs-backend/base-driver';
import { formatToTimeZone } from 'date-fns-timezone';
import fs from 'fs/promises';
import crypto from 'crypto';
import { S3ClientConfig } from '@aws-sdk/client-s3';
import { HydrationMap, HydrationStream } from './HydrationStream';
import { hydrators } from './type-parsers';

const SUPPORTED_BUCKET_TYPES = ['s3', 'gcs', 'azure'];

type HydrationConfiguration = {
types: string[], toValue: (column: Column) => ((value: any) => any) | null
};

type UnloadResponse = {
// eslint-disable-next-line camelcase
rows_unloaded: string
};

// It's not possible to declare own map converters by passing config to snowflake-sdk
const hydrators: HydrationConfiguration[] = [
{
types: ['fixed', 'real'],
toValue: (column) => {
if (column.isNullable()) {
return (value) => {
// We use numbers as strings by fetchAsString
if (value === 'NULL') {
return null;
}

return value;
};
}

// Nothing to fix, let's skip this field
return null;
},
},
{
// The TIMESTAMP_* variation associated with TIMESTAMP, default to TIMESTAMP_NTZ
types: [
'date',
// TIMESTAMP_LTZ internally stores UTC time with a specified precision.
'timestamp_ltz',
// TIMESTAMP_NTZ internally stores “wallclock” time with a specified precision.
// All operations are performed without taking any time zone into account.
'timestamp_ntz',
// TIMESTAMP_TZ internally stores UTC time together with an associated time zone offset.
// When a time zone is not provided, the session time zone offset is used.
'timestamp_tz'
],
toValue: () => (value) => {
if (!value) {
return null;
}

return formatToTimeZone(
value,
'YYYY-MM-DDTHH:mm:ss.SSS',
{
timeZone: 'UTC'
}
);
},
},
{
types: ['object'], // Workaround for HLL_SNOWFLAKE
toValue: () => (value) => {
if (!value) {
return null;
}

return JSON.stringify(value);
},
}
];

const SnowflakeToGenericType: Record<string, GenericDataBaseType> = {
// It's a limitation for now, because anyway we don't work with JSON objects in Cube Store.
object: 'HLL_SNOWFLAKE',
@@ -663,7 +600,12 @@ export class SnowflakeDriver extends BaseDriver implements DriverInterface {
return new Promise((resolve, reject) => connection.execute({
sqlText: `${sql} LIMIT 0`,
binds: <string[] | undefined>params,
fetchAsString: ['Number'],
fetchAsString: [
// It's not possible to store big numbers in Number, It's a common way how to handle it in Cube
'Number',
// VARIANT, OBJECT, ARRAY are mapped to JSON type in Snowflake SDK
'JSON'
],
complete: (err, stmt) => {
if (err) {
reject(err);
@@ -876,12 +818,18 @@ export class SnowflakeDriver extends BaseDriver implements DriverInterface {
return new Promise((resolve, reject) => connection.execute({
sqlText: query,
binds: <string[] | undefined>values,
fetchAsString: ['Number'],
fetchAsString: [
// It's not possible to store big numbers in Number, It's a common way how to handle it in Cube
'Number',
// VARIANT, OBJECT, ARRAY are mapped to JSON type in Snowflake SDK
'JSON'
],
complete: (err, stmt, rows) => {
if (err) {
reject(err);
return;
}

const hydrationMap = this.generateHydrationMap(stmt.getColumns());
const types: {name: string, type: string}[] =
this.getTypes(stmt);
@@ -913,7 +861,7 @@
sqlText: query,
binds: <string[] | undefined>values,
fetchAsString: [
// It's not possible to store big numbers in Number, It's a common way how to handle it in Cube.js
// It's not possible to store big numbers in Number, It's a common way how to handle it in Cube
'Number',
// VARIANT, OBJECT, ARRAY are mapped to JSON type in Snowflake SDK
'JSON'
@@ -999,7 +947,12 @@
return new Promise((resolve, reject) => connection.execute({
sqlText: query,
binds: <string[] | undefined>values,
fetchAsString: ['Number'],
fetchAsString: [
// It's not possible to store big numbers in Number, It's a common way how to handle it in Cube
'Number',
// VARIANT, OBJECT, ARRAY are mapped to JSON type in Snowflake SDK
'JSON'
],
complete: (err, stmt, rows) => {
if (err) {
reject(err);
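For reference, the value-fixing behavior of the hydrators deleted from `SnowflakeDriver.ts` (now imported from the driver's `type-parsers` module) can be sketched standalone. The function names here are hypothetical; the logic mirrors the removed code:

```typescript
// With fetchAsString, a SQL NULL in a nullable numeric column arrives as the
// literal string 'NULL' and must be mapped back to a real null; big numbers
// stay as strings because they don't fit in a JS Number.
const hydrateNullableNumber = (value: string): string | null =>
  (value === 'NULL' ? null : value);

// OBJECT values (the HLL_SNOWFLAKE workaround) are serialized to JSON text.
const hydrateObject = (value: unknown): string | null =>
  (value ? JSON.stringify(value) : null);

console.log(hydrateNullableNumber('NULL')); // null
console.log(hydrateNullableNumber('9007199254740993')); // kept as a string
console.log(hydrateObject({ state: 'hll' })); // {"state":"hll"}
```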