
Commit 92d2a15

AbinayaJayaprakasam authored and cloud-fan committed
[SPARK-54541][SQL] Rename _LEGACY_ERROR_TEMP_1007 and add sqlState
### What changes were proposed in this pull request?

Renamed `_LEGACY_ERROR_TEMP_1007` to `VIEW_WRITE_NOT_ALLOWED` and added sqlState `42809`. The changes include:

- A new error class `VIEW_WRITE_NOT_ALLOWED` in `error-conditions.json` with sqlState `42809`
- Updated `QueryCompilationErrors.scala` to reference the new error class
- Refactored 6 test cases in `DataFrameWriterV2Suite` to use `checkError()` instead of string matching

### Why are the changes needed?

Users frequently encounter this error when attempting to write to temporary views through the DataFrameWriter V2 API. The legacy error name is non-descriptive and lacks an sqlState, which makes the error difficult to handle programmatically and reduces compatibility with JDBC/ODBC clients.

### Does this PR introduce any user-facing change?

No. Users will see essentially the same error, now prefixed with its error class and SQLSTATE; the underlying error class is properly named and includes a sqlState for better tooling support.

### How was this patch tested?

- Manually triggered the error in spark-shell to verify the new error class is used
- Updated 6 existing tests in `DataFrameWriterV2Suite` to use `checkError()` for structured validation of the error class and parameters
- Ran the full test suite: all tests pass

#### Manual testing in spark-shell

**Before (legacy error):**

```
scala> val sourceData = Seq((1, "a"), (2, "b"), (3, "c")).toDF("id", "name")
scala> sourceData.createOrReplaceTempView("source")
scala> spark.range(10).createOrReplaceTempView("temp_view")
scala> spark.table("source").writeTo("temp_view").append()
org.apache.spark.sql.AnalysisException: Cannot write into temp view temp_view as it's not a data source v2 relation.
  at org.apache.spark.sql.errors.QueryCompilationErrors$.writeIntoTempViewNotAllowedError(QueryCompilationErrors.scala:551)
  at org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveRelations$$anonfun$apply$12.$anonfun$applyOrElse$73(Analyzer.scala:1237)
```

**After (new error with class and sqlState):**

```
scala> val sourceData = Seq((1, "a"), (2, "b"), (3, "c")).toDF("id", "name")
scala> sourceData.createOrReplaceTempView("source")
scala> spark.range(10).createOrReplaceTempView("temp_view")
scala> spark.table("source").writeTo("temp_view").append()
org.apache.spark.sql.AnalysisException: [VIEW_WRITE_NOT_ALLOWED] Cannot write into view temp_view, please write into a table instead. SQLSTATE: 42809
  at org.apache.spark.sql.errors.QueryCompilationErrors$.writeIntoTempViewNotAllowedError(QueryCompilationErrors.scala:551)
  at org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveRelations$$anonfun$apply$12.$anonfun$applyOrElse$73(Analyzer.scala:1237)
```

*Note: the error now displays the error class `[VIEW_WRITE_NOT_ALLOWED]` and includes sqlState `42809` (accessible via `getSqlState()`).*

Full `DataFrameWriterV2Suite` run: all tests pass.

```
[info] Test run finished: 0 failed, 0 ignored, 6 total, 0.602s
[info] ScalaTest
[info] Run completed in 8 seconds, 683 milliseconds.
[info] Total number of tests run: 48
[info] Suites: completed 1, aborted 0
[info] Tests: succeeded 48, failed 0, canceled 0, ignored 0, pending 0
[info] All tests passed.
[info] Passed: Total 54, Failed 0, Errors 0, Passed 54
[success] Total time: 38 s, completed 13-Dec-2025, 10:51:11 pm
```

### Was this patch authored or co-authored using generative AI tooling?

No

Closes apache#53251 from AbinayaJayaprakasam/SPARK-fix-54541-PR.

Authored-by: AbinayaJayaprakasam <jaiabiman@gmail.com>
Signed-off-by: Wenchen Fan <wenchen@databricks.com>
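With a named error class and SQLSTATE, clients can branch on structured fields instead of matching message text. As a minimal illustration (not Spark code; the `[ERROR_CLASS] message SQLSTATE: NNNNN` shape is taken from the spark-shell output above, and real callers should prefer accessors such as `getSqlState()`), a client-side parser might look like:

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class ErrorFormat {
    // Matches the "[ERROR_CLASS] message SQLSTATE: NNNNN" shape shown in the
    // spark-shell transcript above; the exact format is an assumption here.
    private static final Pattern P =
        Pattern.compile("^\\[([A-Z_]+)\\]\\s.*SQLSTATE:\\s(\\w{5})");

    // Returns {errorClass, sqlState}, or null if the message is unstructured.
    public static String[] classAndState(String message) {
        Matcher m = P.matcher(message);
        return m.find() ? new String[] {m.group(1), m.group(2)} : null;
    }

    public static void main(String[] args) {
        String msg = "[VIEW_WRITE_NOT_ALLOWED] Cannot write into view temp_view, "
            + "please write into a table instead. SQLSTATE: 42809";
        String[] r = classAndState(msg);
        System.out.println(r[0] + " " + r[1]);
        // The legacy message has neither field, so parsing yields null:
        System.out.println(classAndState(
            "Cannot write into temp view temp_view as it's not a data source v2 relation."));
    }
}
```

This is exactly the kind of brittle string handling the rename makes unnecessary: with the new error class, tools can read the class and SQLSTATE directly rather than from the rendered message.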
1 parent e3569ba commit 92d2a15

File tree

3 files changed: +52 additions, −41 deletions

common/utils/src/main/resources/error/error-conditions.json

Lines changed: 6 additions & 10 deletions

```diff
@@ -7482,6 +7482,12 @@
     ],
     "sqlState" : "42P01"
   },
+  "VIEW_WRITE_NOT_ALLOWED" : {
+    "message" : [
+      "Cannot write into view <name>, please write into a table instead."
+    ],
+    "sqlState" : "42809"
+  },
   "WINDOW_FUNCTION_AND_FRAME_MISMATCH" : {
     "message" : [
       "<funcName> function can only be evaluated in an ordered row-based window frame with a single offset: <windowExpr>."
@@ -7706,21 +7712,11 @@
       "Aggregate expression required for pivot, but '<sql>' did not appear in any aggregate function."
     ]
   },
-  "_LEGACY_ERROR_TEMP_1007" : {
-    "message" : [
-      "Cannot write into temp view <quoted> as it's not a data source v2 relation."
-    ]
-  },
   "_LEGACY_ERROR_TEMP_1008" : {
     "message" : [
       "<quoted> is not a temp view of streaming logical plan, please use batch API such as `DataFrameReader.table` to read it."
     ]
   },
-  "_LEGACY_ERROR_TEMP_1011" : {
-    "message" : [
-      "Writing into a view is not allowed. View: <identifier>."
-    ]
-  },
   "_LEGACY_ERROR_TEMP_1012" : {
     "message" : [
       "Cannot write into v1 table: <identifier>."
```
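The JSON change can be read as moving the entry from the legacy bucket (message only) to a fully specified error condition (message plus SQLSTATE). A hedged sketch of that consistency rule, as a check one might run over `error-conditions.json` entries (illustrative only; the rule and names here are assumptions, not Spark's actual validation):

```java
import java.util.List;
import java.util.Map;

public class ErrorConditionCheck {
    // Assumed rule for this sketch: every entry needs a "message" array, and
    // non-legacy entries additionally need a five-character "sqlState" --
    // exactly what the rename above adds for VIEW_WRITE_NOT_ALLOWED.
    static boolean wellFormed(String name, Map<String, Object> entry) {
        boolean legacy = name.startsWith("_LEGACY_ERROR_TEMP_");
        boolean hasMessage = entry.get("message") instanceof List;
        Object sqlState = entry.get("sqlState");
        return hasMessage
            && (legacy || (sqlState instanceof String s && s.length() == 5));
    }

    public static void main(String[] args) {
        Map<String, Object> renamed = Map.of(
            "message", List.of("Cannot write into view <name>, please write into a table instead."),
            "sqlState", "42809");
        System.out.println(wellFormed("VIEW_WRITE_NOT_ALLOWED", renamed));

        // A named (non-legacy) entry without a sqlState would fail the check.
        Map<String, Object> missingState = Map.of(
            "message", List.of("Cannot write into view <name>."));
        System.out.println(wellFormed("VIEW_WRITE_NOT_ALLOWED", missingState));
    }
}
```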

sql/catalyst/src/main/scala/org/apache/spark/sql/errors/QueryCompilationErrors.scala

Lines changed: 4 additions & 4 deletions

```diff
@@ -547,8 +547,8 @@ private[sql] object QueryCompilationErrors extends QueryErrorsBase with Compilat
 
   def writeIntoTempViewNotAllowedError(quoted: String): Throwable = {
     new AnalysisException(
-      errorClass = "_LEGACY_ERROR_TEMP_1007",
-      messageParameters = Map("quoted" -> quoted))
+      errorClass = "VIEW_WRITE_NOT_ALLOWED",
+      messageParameters = Map("name" -> quoted))
   }
 
   def readNonStreamingTempViewError(quoted: String): Throwable = {
@@ -578,8 +578,8 @@ private[sql] object QueryCompilationErrors extends QueryErrorsBase with Compilat
 
   def writeIntoViewNotAllowedError(identifier: TableIdentifier, t: TreeNode[_]): Throwable = {
     new AnalysisException(
-      errorClass = "_LEGACY_ERROR_TEMP_1011",
-      messageParameters = Map("identifier" -> identifier.toString),
+      errorClass = "VIEW_WRITE_NOT_ALLOWED",
+      messageParameters = Map("name" -> toSQLId(identifier.nameParts)),
       origin = t.origin)
   }
```
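Note that the second hunk also switches the parameter from `identifier.toString` to `toSQLId(identifier.nameParts)`, so view names are rendered as backtick-quoted parts like `` `spark_catalog`.`default`.`v` ``. A rough re-implementation of that quoting, for illustration only (the backtick-doubling escape rule here is an assumption of this sketch, not Spark's exact helper):

```java
import java.util.List;
import java.util.stream.Collectors;

public class SqlId {
    // Quote each identifier part in backticks and join with dots; embedded
    // backticks are doubled (assumed escaping rule for this sketch).
    static String toSQLId(List<String> parts) {
        return parts.stream()
            .map(p -> "`" + p.replace("`", "``") + "`")
            .collect(Collectors.joining("."));
    }

    public static void main(String[] args) {
        // Produces the form the updated tests expect for the view `v`.
        System.out.println(toSQLId(List.of("spark_catalog", "default", "v")));
    }
}
```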

sql/core/src/test/scala/org/apache/spark/sql/DataFrameWriterV2Suite.scala

Lines changed: 42 additions & 27 deletions

```diff
@@ -165,19 +165,24 @@ class DataFrameWriterV2Suite extends QueryTest with SharedSparkSession with Befo
 
   test("Append: fail if it writes to a temp view that is not v2 relation") {
     spark.range(10).createOrReplaceTempView("temp_view")
-    val exc = intercept[AnalysisException] {
-      spark.table("source").writeTo("temp_view").append()
-    }
-    assert(exc.getMessage.contains("Cannot write into temp view temp_view as it's not a " +
-      "data source v2 relation"))
+    checkError(
+      exception = intercept[AnalysisException] {
+        spark.table("source").writeTo("temp_view").append()
+      },
+      condition = "VIEW_WRITE_NOT_ALLOWED",
+      parameters = Map("name" -> "temp_view")
+    )
   }
 
   test("Append: fail if it writes to a view") {
     spark.sql("CREATE VIEW v AS SELECT 1")
-    val exc = intercept[AnalysisException] {
-      spark.table("source").writeTo("v").append()
-    }
-    assert(exc.getMessage.contains("Writing into a view is not allowed"))
+    checkError(
+      exception = intercept[AnalysisException] {
+        spark.table("source").writeTo("v").append()
+      },
+      condition = "VIEW_WRITE_NOT_ALLOWED",
+      parameters = Map("name" -> "`spark_catalog`.`default`.`v`")
+    )
   }
 
   test("Append: fail if it writes to a v1 table") {
@@ -270,19 +275,24 @@ class DataFrameWriterV2Suite extends QueryTest with SharedSparkSession with Befo
 
   test("Overwrite: fail if it writes to a temp view that is not v2 relation") {
     spark.range(10).createOrReplaceTempView("temp_view")
-    val exc = intercept[AnalysisException] {
-      spark.table("source").writeTo("temp_view").overwrite(lit(true))
-    }
-    assert(exc.getMessage.contains("Cannot write into temp view temp_view as it's not a " +
-      "data source v2 relation"))
+    checkError(
+      exception = intercept[AnalysisException] {
+        spark.table("source").writeTo("temp_view").overwrite(lit(true))
+      },
+      condition = "VIEW_WRITE_NOT_ALLOWED",
+      parameters = Map("name" -> "temp_view")
+    )
   }
 
   test("Overwrite: fail if it writes to a view") {
     spark.sql("CREATE VIEW v AS SELECT 1")
-    val exc = intercept[AnalysisException] {
-      spark.table("source").writeTo("v").overwrite(lit(true))
-    }
-    assert(exc.getMessage.contains("Writing into a view is not allowed"))
+    checkError(
+      exception = intercept[AnalysisException] {
+        spark.table("source").writeTo("v").overwrite(lit(true))
+      },
+      condition = "VIEW_WRITE_NOT_ALLOWED",
+      parameters = Map("name" -> "`spark_catalog`.`default`.`v`")
+    )
   }
 
   test("Overwrite: fail if it writes to a v1 table") {
@@ -375,19 +385,24 @@ class DataFrameWriterV2Suite extends QueryTest with SharedSparkSession with Befo
 
   test("OverwritePartitions: fail if it writes to a temp view that is not v2 relation") {
     spark.range(10).createOrReplaceTempView("temp_view")
-    val exc = intercept[AnalysisException] {
-      spark.table("source").writeTo("temp_view").overwritePartitions()
-    }
-    assert(exc.getMessage.contains("Cannot write into temp view temp_view as it's not a " +
-      "data source v2 relation"))
+    checkError(
+      exception = intercept[AnalysisException] {
+        spark.table("source").writeTo("temp_view").overwritePartitions()
+      },
+      condition = "VIEW_WRITE_NOT_ALLOWED",
+      parameters = Map("name" -> "temp_view")
+    )
   }
 
   test("OverwritePartitions: fail if it writes to a view") {
     spark.sql("CREATE VIEW v AS SELECT 1")
-    val exc = intercept[AnalysisException] {
-      spark.table("source").writeTo("v").overwritePartitions()
-    }
-    assert(exc.getMessage.contains("Writing into a view is not allowed"))
+    checkError(
+      exception = intercept[AnalysisException] {
+        spark.table("source").writeTo("v").overwritePartitions()
+      },
+      condition = "VIEW_WRITE_NOT_ALLOWED",
+      parameters = Map("name" -> "`spark_catalog`.`default`.`v`")
+    )
   }
 
   test("OverwritePartitions: fail if it writes to a v1 table") {
```
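The rewritten tests rely on `checkError()`, which asserts on the structured error condition and parameters rather than substring-matching `getMessage`. A minimal stand-in showing the shape of such a helper (hypothetical types for illustration; not Spark's actual test utility):

```java
import java.util.Map;

public class CheckErrorSketch {
    // Hypothetical stand-in for a structured error: a condition name plus its
    // message parameters, mirroring the fields checkError() inspects.
    record SparkError(String condition, Map<String, String> parameters) {}

    // Compare the structured fields; fail with a readable expected/got message.
    static void checkError(SparkError e, String condition, Map<String, String> params) {
        if (!e.condition().equals(condition) || !e.parameters().equals(params)) {
            throw new AssertionError(
                "expected " + condition + " " + params +
                " but got " + e.condition() + " " + e.parameters());
        }
    }

    public static void main(String[] args) {
        SparkError e = new SparkError(
            "VIEW_WRITE_NOT_ALLOWED", Map.of("name", "temp_view"));
        checkError(e, "VIEW_WRITE_NOT_ALLOWED", Map.of("name", "temp_view"));
        System.out.println("ok");
    }
}
```

Compared with `assert(exc.getMessage.contains(...))`, this style keeps tests stable when message wording changes and verifies the parameter map exactly.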
