
Commit 96e0f8f

srielau authored and dtenedor committed
[SPARK-56489][SQL] Add CURRENT_PATH() builtin expression and keywords
### What changes were proposed in this pull request?

Add the `CURRENT_PATH()` builtin function that returns the current SQL resolution search path as a comma-separated string of qualified schema names (e.g. `system.builtin,system.session,spark_catalog.default`).

Also register the grammar keywords needed by the upcoming SQL PATH feature: `CURRENT_PATH`, `CURRENT_SCHEMA`, `CURRENT_DATABASE`, `DEFAULT_PATH`, `SYSTEM_PATH`, `PATH`. `CURRENT_PATH` and `CURRENT_SCHEMA` are reserved in ANSI mode per SQL:2023; the others are non-reserved. In non-ANSI mode, `CURRENT_PATH`, `CURRENT_DATABASE`, and `CURRENT_SCHEMA` always resolve to their respective expressions (not `UnresolvedAttribute`), matching the behavior of `CURRENT_CATALOG`.

This is part 1 of the SQL PATH feature (SPARK-54810), split out to keep the review scope manageable.

### Why are the changes needed?

`CURRENT_PATH()` is a SQL-standard function (SQL:2023) that exposes the resolution search path to users. The grammar keywords are prerequisites for the `SET PATH` command and path-based resolution coming in follow-up PRs.

### Does this PR introduce _any_ user-facing change?

Yes. New builtin function `CURRENT_PATH()` and new reserved/non-reserved keywords.

### How was this patch tested?

- Added test in `FunctionQualificationSuite` verifying `current_path()` returns a non-empty qualified path string.
- Updated keyword golden files (`keywords.sql.out`, `keywords-enforced.sql.out`, `nonansi/keywords.sql.out`).
- Updated `sql-expression-schema.md` and `SparkConnectDatabaseMetaDataSuite` keyword assertions.

### Was this patch authored or co-authored using generative AI tooling?

Generated-by: Claude Opus 4.6

Closes #55354 from srielau/SPARK-56489-path-syntax.

Authored-by: Serge Rielau <serge@rielau.com>

Signed-off-by: Daniel Tenedorio <daniel.tenedorio@databricks.com>
1 parent 96ebbaa commit 96e0f8f

15 files changed: 123 additions & 8 deletions

docs/sql-ref-ansi-compliance.md

Lines changed: 6 additions & 0 deletions

```diff
@@ -479,7 +479,10 @@ Below is a list of all the keywords in Spark SQL.
 |CROSS|reserved|strict-non-reserved|reserved|
 |CUBE|non-reserved|non-reserved|reserved|
 |CURRENT|non-reserved|non-reserved|reserved|
+|CURRENT_DATABASE|non-reserved|non-reserved|non-reserved|
 |CURRENT_DATE|reserved|non-reserved|reserved|
+|CURRENT_PATH|reserved|non-reserved|reserved|
+|CURRENT_SCHEMA|reserved|non-reserved|reserved|
 |CURRENT_TIME|reserved|non-reserved|reserved|
 |CURRENT_TIMESTAMP|reserved|non-reserved|reserved|
 |CURRENT_USER|reserved|non-reserved|reserved|
@@ -502,6 +505,7 @@ Below is a list of all the keywords in Spark SQL.
 |DEFAULT|non-reserved|non-reserved|non-reserved|
 |DEFINED|non-reserved|non-reserved|non-reserved|
 |DEFINER|non-reserved|non-reserved|non-reserved|
+|DEFAULT_PATH|non-reserved|non-reserved|not a keyword|
 |DELAY|non-reserved|non-reserved|non-reserved|
 |DELETE|non-reserved|non-reserved|reserved|
 |DELIMITED|non-reserved|non-reserved|non-reserved|
@@ -671,6 +675,7 @@ Below is a list of all the keywords in Spark SQL.
 |PARTITION|non-reserved|non-reserved|reserved|
 |PARTITIONED|non-reserved|non-reserved|non-reserved|
 |PARTITIONS|non-reserved|non-reserved|non-reserved|
+|PATH|non-reserved|non-reserved|not a keyword|
 |PERCENT|non-reserved|non-reserved|non-reserved|
 |PIVOT|non-reserved|non-reserved|non-reserved|
 |PLACING|non-reserved|non-reserved|non-reserved|
@@ -754,6 +759,7 @@ Below is a list of all the keywords in Spark SQL.
 |SUBSTR|non-reserved|non-reserved|non-reserved|
 |SUBSTRING|non-reserved|non-reserved|non-reserved|
 |SYNC|non-reserved|non-reserved|non-reserved|
+|SYSTEM_PATH|non-reserved|non-reserved|not a keyword|
 |SYSTEM_TIME|non-reserved|non-reserved|non-reserved|
 |SYSTEM_VERSION|non-reserved|non-reserved|non-reserved|
 |TABLE|reserved|non-reserved|reserved|
```

sql/api/src/main/antlr4/org/apache/spark/sql/catalyst/parser/SqlBaseLexer.g4

Lines changed: 6 additions & 0 deletions

```diff
@@ -198,7 +198,10 @@ CREATE: 'CREATE';
 CROSS: 'CROSS';
 CUBE: 'CUBE';
 CURRENT: 'CURRENT';
+CURRENT_DATABASE: 'CURRENT_DATABASE';
 CURRENT_DATE: 'CURRENT_DATE';
+CURRENT_PATH: 'CURRENT_PATH';
+CURRENT_SCHEMA: 'CURRENT_SCHEMA';
 CURRENT_TIME: 'CURRENT_TIME';
 CURRENT_TIMESTAMP: 'CURRENT_TIMESTAMP';
 CURRENT_USER: 'CURRENT_USER';
@@ -219,6 +222,7 @@ DEC: 'DEC';
 DECIMAL: 'DECIMAL';
 DECLARE: 'DECLARE';
 DEFAULT: 'DEFAULT';
+DEFAULT_PATH: 'DEFAULT_PATH';
 DEFINED: 'DEFINED';
 DEFINER: 'DEFINER';
 DELAY: 'DELAY';
@@ -389,6 +393,7 @@ OVERWRITE: 'OVERWRITE';
 PARTITION: 'PARTITION';
 PARTITIONED: 'PARTITIONED';
 PARTITIONS: 'PARTITIONS';
+PATH: 'PATH';
 PERCENTLIT: 'PERCENT';
 PIVOT: 'PIVOT';
 PLACING: 'PLACING';
@@ -474,6 +479,7 @@ SUBSTRING: 'SUBSTRING';
 SYNC: 'SYNC';
 SYSTEM_TIME: 'SYSTEM_TIME';
 SYSTEM_VERSION: 'SYSTEM_VERSION';
+SYSTEM_PATH: 'SYSTEM_PATH';
 TABLE: 'TABLE';
 TABLES: 'TABLES';
 TABLESAMPLE: 'TABLESAMPLE';
```

sql/api/src/main/antlr4/org/apache/spark/sql/catalyst/parser/SqlBaseParser.g4

Lines changed: 11 additions & 1 deletion

```diff
@@ -1297,7 +1297,7 @@ datetimeUnit
     ;

 primaryExpression
-    : name=(CURRENT_DATE | CURRENT_TIMESTAMP | CURRENT_USER | USER | SESSION_USER | CURRENT_TIME) #currentLike
+    : name=(CURRENT_DATE | CURRENT_TIMESTAMP | CURRENT_USER | USER | SESSION_USER | CURRENT_TIME | CURRENT_PATH) #currentLike
     | name=(TIMESTAMPADD | DATEADD | DATE_ADD) LEFT_PAREN (unit=datetimeUnit | invalidUnit=stringLit) COMMA unitsAmount=valueExpression COMMA timestamp=valueExpression RIGHT_PAREN #timestampadd
     | name=(TIMESTAMPDIFF | DATEDIFF | DATE_DIFF | TIMEDIFF) LEFT_PAREN (unit=datetimeUnit | invalidUnit=stringLit) COMMA startTimestamp=valueExpression COMMA endTimestamp=valueExpression RIGHT_PAREN #timestampdiff
     | CASE whenClause+ (ELSE elseExpression=expression)? END #searchedCase
@@ -1961,6 +1961,7 @@ ansiNonReserved
     | CURSOR
     | CUBE
     | CURRENT
+    | CURRENT_DATABASE
     | DATA
     | DATABASE
     | DATABASES
@@ -1977,6 +1978,7 @@ ansiNonReserved
     | DECIMAL
     | DECLARE
     | DEFAULT
+    | DEFAULT_PATH
     | DEFINED
     | DEFINER
     | DELAY
@@ -2112,6 +2114,7 @@ ansiNonReserved
     | PARTITION
     | PARTITIONED
     | PARTITIONS
+    | PATH
     | PERCENTLIT
     | PIVOT
     | PLACING
@@ -2187,6 +2190,7 @@ ansiNonReserved
     | SUBSTR
     | SUBSTRING
     | SYNC
+    | SYSTEM_PATH
     | SYSTEM_TIME
     | SYSTEM_VERSION
     | TABLES
@@ -2342,7 +2346,10 @@ nonReserved
     | CUBE
     | CURRENT
     | CURSOR
+    | CURRENT_DATABASE
     | CURRENT_DATE
+    | CURRENT_PATH
+    | CURRENT_SCHEMA
     | CURRENT_TIME
     | CURRENT_TIMESTAMP
     | CURRENT_USER
@@ -2362,6 +2369,7 @@ nonReserved
     | DECIMAL
     | DECLARE
     | DEFAULT
+    | DEFAULT_PATH
     | DEFINED
     | DEFINER
     | DELAY
@@ -2524,6 +2532,7 @@ nonReserved
     | PARTITION
     | PARTITIONED
     | PARTITIONS
+    | PATH
     | PERCENTLIT
     | PIVOT
     | PLACING
@@ -2604,6 +2613,7 @@ nonReserved
     | SUBSTR
     | SUBSTRING
     | SYNC
+    | SYSTEM_PATH
     | SYSTEM_TIME
     | SYSTEM_VERSION
     | TABLE
```

sql/api/src/main/scala/org/apache/spark/sql/functions.scala

Lines changed: 8 additions & 0 deletions

```diff
@@ -4412,6 +4412,14 @@ object functions {
    */
   def current_schema(): Column = Column.fn("current_schema")

+  /**
+   * Returns the current SQL path as a comma-separated list of qualified schema names.
+   *
+   * @group misc_funcs
+   * @since 4.2.0
+   */
+  def current_path(): Column = Column.fn("current_path")
+
   /**
    * Returns the user name of current execution context.
    *
```

sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/FunctionRegistry.scala

Lines changed: 1 addition & 0 deletions

```diff
@@ -854,6 +854,7 @@ object FunctionRegistry {
     expression[CurrentDatabase]("current_database"),
     expression[CurrentDatabase]("current_schema", true, Some("3.4.0")),
     expression[CurrentCatalog]("current_catalog"),
+    expression[CurrentPath]("current_path", true, Some("4.2.0")),
     expression[CurrentUser]("current_user"),
     expression[CurrentUser]("user", true, Some("3.4.0")),
     expression[CurrentUser]("session_user", true, Some("4.0.0")),
```

sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/misc.scala

Lines changed: 22 additions & 0 deletions

```diff
@@ -249,6 +249,28 @@ case class CurrentCatalog()
   final override val nodePatterns: Seq[TreePattern] = Seq(CURRENT_LIKE)
 }

+/**
+ * Returns the current SQL path as a comma-separated list of qualified schema names
+ * (catalog.schema). The result reflects the current catalog and schema context.
+ */
+@ExpressionDescription(
+  usage = "_FUNC_() - Returns the current SQL path (qualified schema names).",
+  examples = """
+    Examples:
+      > SELECT _FUNC_();
+       system.builtin,system.session,spark_catalog.default
+  """,
+  since = "4.2.0",
+  group = "misc_funcs")
+case class CurrentPath()
+  extends LeafExpression
+  with DefaultStringProducingExpression
+  with Unevaluable {
+  override def nullable: Boolean = false
+  override def prettyName: String = "current_path"
+  final override val nodePatterns: Seq[TreePattern] = Seq(CURRENT_LIKE)
+}
+
 // scalastyle:off line.size.limit
 @ExpressionDescription(
   usage = """_FUNC_() - Returns an universally unique identifier (UUID) string. The value is returned as a canonical UUID 36-character string.""",
```

sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/optimizer/finishAnalysis.scala

Lines changed: 9 additions & 1 deletion

```diff
@@ -34,6 +34,7 @@ import org.apache.spark.sql.catalyst.util.DateTimeUtils.{convertSpecialDate, con
 import org.apache.spark.sql.catalyst.util.SparkDateTimeUtils.{instantToNanosOfDay, truncateTimeToPrecision}
 import org.apache.spark.sql.catalyst.util.TypeUtils.toSQLExpr
 import org.apache.spark.sql.connector.catalog.CatalogManager
+import org.apache.spark.sql.internal.SQLConf
 import org.apache.spark.sql.types._


@@ -148,21 +149,28 @@ object ComputeCurrentTime extends Rule[LogicalPlan] {
 }

 /**
- * Replaces the expression of CurrentDatabase, CurrentCatalog, and CurrentUser
+ * Replaces the expression of CurrentDatabase, CurrentCatalog, CurrentPath, and CurrentUser
  * with the current values.
  */
 case class ReplaceCurrentLike(catalogManager: CatalogManager) extends Rule[LogicalPlan] {
   def apply(plan: LogicalPlan): LogicalPlan = {
     import org.apache.spark.sql.connector.catalog.CatalogV2Implicits._
     lazy val currentNamespace = catalogManager.currentNamespace.quoted
+    lazy val currentNamespaceSeq = catalogManager.currentNamespace.toSeq
     lazy val currentCatalog = catalogManager.currentCatalog.name()
     lazy val currentUser = CurrentUserContext.getCurrentUser
+    lazy val currentPathStr = {
+      val catalogPath = (currentCatalog +: currentNamespaceSeq).toSeq
+      SQLConf.get.resolutionSearchPath(catalogPath).map(_.quoted).mkString(",")
+    }

     plan.transformAllExpressionsWithPruning(_.containsPattern(CURRENT_LIKE)) {
       case CurrentDatabase() =>
         Literal.create(currentNamespace, StringType)
       case CurrentCatalog() =>
         Literal.create(currentCatalog, StringType)
+      case CurrentPath() =>
+        Literal.create(currentPathStr, StringType)
       case CurrentUser() =>
         Literal.create(currentUser, StringType)
     }
```
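The constant-folding above replaces `CurrentPath()` with a string literal computed once per plan. The sketch below illustrates the shape of that computation in Python; the fixed `system.builtin,system.session` prefix, the search-path ordering, and the simplified `quoted` helper are assumptions based on the example output in the commit message, not Spark's actual `resolutionSearchPath` implementation.

```python
# Illustration of how ReplaceCurrentLike folds CurrentPath() into a literal.
# The system schemas and ordering below are assumptions taken from the
# commit message's example output, not from Spark internals.

SYSTEM_SCHEMAS = [["system", "builtin"], ["system", "session"]]

def quoted(name_parts):
    # Minimal stand-in for CatalogV2Implicits' `quoted`: join the parts,
    # backquoting any part that is not a plain identifier.
    def q(part):
        return part if part.replace("_", "a").isalnum() else f"`{part}`"
    return ".".join(q(p) for p in name_parts)

def current_path(current_catalog, current_namespace):
    # Assumed resolution order: system schemas first, then the current
    # catalog-qualified namespace (mirrors `currentCatalog +: currentNamespaceSeq`).
    catalog_path = [current_catalog, *current_namespace]
    search_path = SYSTEM_SCHEMAS + [catalog_path]
    return ",".join(quoted(p) for p in search_path)

print(current_path("spark_catalog", ["default"]))
# system.builtin,system.session,spark_catalog.default
```

Because the rule folds the path into a literal at analysis time, every occurrence of `CURRENT_PATH()` in one query sees the same value, matching the behavior of `CURRENT_CATALOG` and `CURRENT_DATABASE`.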

sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/parser/AstBuilder.scala

Lines changed: 10 additions & 4 deletions

```diff
@@ -3434,14 +3434,20 @@ class AstBuilder extends DataTypeAstBuilder
         CurrentTimestamp()
       case SqlBaseParser.CURRENT_TIME =>
         CurrentTime()
+      case SqlBaseParser.CURRENT_PATH =>
+        CurrentPath()
       case SqlBaseParser.CURRENT_USER | SqlBaseParser.USER | SqlBaseParser.SESSION_USER =>
         CurrentUser()
     }
   } else {
-    // If the parser is not in ansi mode, we should return `UnresolvedAttribute`, in case there
-    // are columns named `CURRENT_DATE` or `CURRENT_TIMESTAMP` or `CURRENT_TIME`.
-    // ctx.name is a token, not an identifier context.
-    UnresolvedAttribute.quoted(ctx.name.getText)
+    ctx.name.getType match {
+      case SqlBaseParser.CURRENT_PATH =>
+        CurrentPath()
+      case _ =>
+        // If the parser is not in ansi mode, we should return `UnresolvedAttribute`, in case
+        // there are columns named `CURRENT_DATE` or `CURRENT_TIMESTAMP` or `CURRENT_TIME`.
+        UnresolvedAttribute.quoted(ctx.name.getText)
+    }
   }
 }

```
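The parser change above means `CURRENT_PATH` always builds a `CurrentPath()` expression, while the other `currentLike` tokens still fall back to a column reference outside ANSI mode. A small sketch of that dispatch (Python for illustration; the token names mirror the grammar, everything else is simplified):

```python
# Simplified model of the currentLike dispatch in AstBuilder.

def resolve_current_like(token, ansi_mode):
    # CURRENT_PATH is newly reserved-like: it resolves to an expression
    # in both modes, so it can never shadow a column name.
    always_expression = {"CURRENT_PATH"}
    if ansi_mode or token in always_expression:
        return ("expression", token)
    # Non-ANSI mode: keep treating the token as a possible column
    # reference named e.g. `CURRENT_DATE`, as before this commit.
    return ("unresolved_attribute", token)

assert resolve_current_like("CURRENT_PATH", ansi_mode=False) == ("expression", "CURRENT_PATH")
assert resolve_current_like("CURRENT_DATE", ansi_mode=False) == ("unresolved_attribute", "CURRENT_DATE")
```

Note the commit message also exempts `CURRENT_DATABASE` and `CURRENT_SCHEMA` from the non-ANSI fallback; those are handled outside the hunk shown here, so the sketch covers only the `CURRENT_PATH` case visible in this diff.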
sql/connect/client/jdbc/src/test/scala/org/apache/spark/sql/connect/client/jdbc/SparkConnectDatabaseMetaDataSuite.scala

Lines changed: 2 additions & 1 deletion

```diff
@@ -209,7 +209,8 @@ class SparkConnectDatabaseMetaDataSuite extends ConnectFunSuite with RemoteSpark
     withConnection { conn =>
       val metadata = conn.getMetaData
       // scalastyle:off line.size.limit
-      assert(metadata.getSQLKeywords === "ADD,AFTER,AGGREGATE,ALWAYS,ANALYZE,ANTI,ANY_VALUE,ARCHIVE,ASC,BINDING,BUCKET,BUCKETS,BYTE,CACHE,CASCADE,CATALOG,CATALOGS,CHANGE,CHANGES,CLEAR,CLUSTER,CLUSTERED,CODEGEN,COLLATION,COLLATIONS,COLLECTION,COLUMNS,COMMENT,COMPACT,COMPACTIONS,COMPENSATION,COMPUTE,CONCATENATE,CONTAINS,CONTINUE,COST,DATA,DATABASE,DATABASES,DATEADD,DATEDIFF,DATE_ADD,DATE_DIFF,DAYOFYEAR,DAYS,DBPROPERTIES,DEFINED,DEFINER,DELAY,DELIMITED,DESC,DFS,DIRECTORIES,DIRECTORY,DISTRIBUTE,DIV,DO,ELSEIF,ENFORCED,ESCAPED,EVOLUTION,EXCHANGE,EXCLUDE,EXCLUSIVE,EXIT,EXPLAIN,EXPORT,EXTEND,EXTENDED,FIELDS,FILEFORMAT,FIRST,FLOW,FOLLOWING,FORMAT,FORMATTED,FOUND,FUNCTIONS,GENERATED,GEOGRAPHY,GEOMETRY,HANDLER,HOURS,IDENTIFIED,IDENTIFIER,IF,IGNORE,ILIKE,IMMEDIATE,INCLUDE,INCLUSIVE,INCREMENT,INDEX,INDEXES,INPATH,INPUT,INPUTFORMAT,INVOKER,ITEMS,ITERATE,JSON,KEY,KEYS,LAST,LAZY,LEAVE,LEVEL,LIMIT,LINES,LIST,LOAD,LOCATION,LOCK,LOCKS,LOGICAL,LONG,LOOP,MACRO,MAP,MATCHED,MATERIALIZED,MEASURE,METRICS,MICROSECOND,MICROSECONDS,MILLISECOND,MILLISECONDS,MINUS,MINUTES,MONTHS,MSCK,NAME,NAMESPACE,NAMESPACES,NANOSECOND,NANOSECONDS,NORELY,NULLS,OFFSET,OPTION,OPTIONS,OUTPUTFORMAT,OVERWRITE,PARTITIONED,PARTITIONS,PERCENT,PIVOT,PLACING,PRECEDING,PRINCIPALS,PROCEDURES,PROPERTIES,PURGE,QUARTER,QUERY,RECORDREADER,RECORDWRITER,RECOVER,RECURSION,REDUCE,REFRESH,RELY,RENAME,REPAIR,REPEAT,REPEATABLE,REPLACE,RESET,RESPECT,RESTRICT,ROLE,ROLES,SCHEMA,SCHEMAS,SECONDS,SECURITY,SEMI,SEPARATED,SERDE,SERDEPROPERTIES,SETS,SHORT,SHOW,SINGLE,SKEWED,SORT,SORTED,SOURCE,STATISTICS,STORED,STRATIFY,STREAM,STREAMING,STRING,STRUCT,SUBSTR,SYNC,SYSTEM_TIME,SYSTEM_VERSION,TABLES,TARGET,TBLPROPERTIES,TERMINATED,TIMEDIFF,TIMESTAMPADD,TIMESTAMPDIFF,TIMESTAMP_LTZ,TIMESTAMP_NTZ,TINYINT,TOUCH,TRANSACTION,TRANSACTIONS,TRANSFORM,TRUNCATE,TRY_CAST,TYPE,UNARCHIVE,UNBOUNDED,UNCACHE,UNLOCK,UNPIVOT,UNSET,UNTIL,USE,VAR,VARIABLE,VARIANT,VERSION,VIEW,VIEWS,VOID,WATERMARK,WEEK,WEEKS,WHILE,X,YEARS,ZONE")
+      // CURRENT_PATH is excluded: getSQLKeywords drops SQL:2003 reserved words (see companion).
+      assert(metadata.getSQLKeywords === "ADD,AFTER,AGGREGATE,ALWAYS,ANALYZE,ANTI,ANY_VALUE,ARCHIVE,ASC,BINDING,BUCKET,BUCKETS,BYTE,CACHE,CASCADE,CATALOG,CATALOGS,CHANGE,CHANGES,CLEAR,CLUSTER,CLUSTERED,CODEGEN,COLLATION,COLLATIONS,COLLECTION,COLUMNS,COMMENT,COMPACT,COMPACTIONS,COMPENSATION,COMPUTE,CONCATENATE,CONTAINS,CONTINUE,COST,CURRENT_DATABASE,CURRENT_SCHEMA,DATA,DATABASE,DATABASES,DATEADD,DATEDIFF,DATE_ADD,DATE_DIFF,DAYOFYEAR,DAYS,DBPROPERTIES,DEFAULT_PATH,DEFINED,DEFINER,DELAY,DELIMITED,DESC,DFS,DIRECTORIES,DIRECTORY,DISTRIBUTE,DIV,DO,ELSEIF,ENFORCED,ESCAPED,EVOLUTION,EXCHANGE,EXCLUDE,EXCLUSIVE,EXIT,EXPLAIN,EXPORT,EXTEND,EXTENDED,FIELDS,FILEFORMAT,FIRST,FLOW,FOLLOWING,FORMAT,FORMATTED,FOUND,FUNCTIONS,GENERATED,GEOGRAPHY,GEOMETRY,HANDLER,HOURS,IDENTIFIED,IDENTIFIER,IF,IGNORE,ILIKE,IMMEDIATE,INCLUDE,INCLUSIVE,INCREMENT,INDEX,INDEXES,INPATH,INPUT,INPUTFORMAT,INVOKER,ITEMS,ITERATE,JSON,KEY,KEYS,LAST,LAZY,LEAVE,LEVEL,LIMIT,LINES,LIST,LOAD,LOCATION,LOCK,LOCKS,LOGICAL,LONG,LOOP,MACRO,MAP,MATCHED,MATERIALIZED,MEASURE,METRICS,MICROSECOND,MICROSECONDS,MILLISECOND,MILLISECONDS,MINUS,MINUTES,MONTHS,MSCK,NAME,NAMESPACE,NAMESPACES,NANOSECOND,NANOSECONDS,NORELY,NULLS,OFFSET,OPTION,OPTIONS,OUTPUTFORMAT,OVERWRITE,PARTITIONED,PARTITIONS,PATH,PERCENT,PIVOT,PLACING,PRECEDING,PRINCIPALS,PROCEDURES,PROPERTIES,PURGE,QUARTER,QUERY,RECORDREADER,RECORDWRITER,RECOVER,RECURSION,REDUCE,REFRESH,RELY,RENAME,REPAIR,REPEAT,REPEATABLE,REPLACE,RESET,RESPECT,RESTRICT,ROLE,ROLES,SCHEMA,SCHEMAS,SECONDS,SECURITY,SEMI,SEPARATED,SERDE,SERDEPROPERTIES,SETS,SHORT,SHOW,SINGLE,SKEWED,SORT,SORTED,SOURCE,STATISTICS,STORED,STRATIFY,STREAM,STREAMING,STRING,STRUCT,SUBSTR,SYNC,SYSTEM_PATH,SYSTEM_TIME,SYSTEM_VERSION,TABLES,TARGET,TBLPROPERTIES,TERMINATED,TIMEDIFF,TIMESTAMPADD,TIMESTAMPDIFF,TIMESTAMP_LTZ,TIMESTAMP_NTZ,TINYINT,TOUCH,TRANSACTION,TRANSACTIONS,TRANSFORM,TRUNCATE,TRY_CAST,TYPE,UNARCHIVE,UNBOUNDED,UNCACHE,UNLOCK,UNPIVOT,UNSET,UNTIL,USE,VAR,VARIABLE,VARIANT,VERSION,VIEW,VIEWS,VOID,WATERMARK,WEEK,WEEKS,WHILE,X,YEARS,ZONE")
       // scalastyle:on line.size.limit
     }
   }
```

sql/core/src/test/resources/sql-functions/sql-expression-schema.md

Lines changed: 1 addition & 0 deletions

```diff
@@ -107,6 +107,7 @@
 | org.apache.spark.sql.catalyst.expressions.CurrentDatabase | current_database | SELECT current_database() | struct<current_schema():string> |
 | org.apache.spark.sql.catalyst.expressions.CurrentDatabase | current_schema | SELECT current_schema() | struct<current_schema():string> |
 | org.apache.spark.sql.catalyst.expressions.CurrentDate | current_date | SELECT current_date() | struct<current_date():date> |
+| org.apache.spark.sql.catalyst.expressions.CurrentPath | current_path | SELECT current_path() | struct<current_path():string> |
 | org.apache.spark.sql.catalyst.expressions.CurrentTime | current_time | SELECT current_time() | struct<current_time(6):time(6)> |
 | org.apache.spark.sql.catalyst.expressions.CurrentTimeZone | current_timezone | SELECT current_timezone() | struct<current_timezone():string> |
 | org.apache.spark.sql.catalyst.expressions.CurrentTimestamp | current_timestamp | SELECT current_timestamp() | struct<current_timestamp():timestamp> |
```
