This code:

```scala
val df = spark.read.raster //....
  .select(rf_tile($"red"), rf_extent($"red") as "red_extent", rf_crs($"red") as "red_crs")
  .toLayer(tlm)
```

fails with an error about column `crs as crs` not found. Pinpointed this to https://github.com/locationtech/rasterframes/blob/develop/core/src/main/scala/org/locationtech/rasterframes/extensions/RasterJoin.scala#L44

I believe in this case, since there is no projected raster tile column (note the use of `rf_tile`), it is trying to select `crs` and `extent` by name. It should look for columns by type instead.
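A minimal, Spark-free sketch of the "look up by type" idea. All names here (`ColType`, `Field`, `columnsOfType`) are hypothetical illustrations, not RasterFrames API; the real fix would match on the Catalyst column types instead of hard-coded column names:

```scala
// Hypothetical model of a DataFrame schema: each field carries a type tag.
sealed trait ColType
case object CrsType    extends ColType
case object ExtentType extends ColType
case object TileType   extends ColType

case class Field(name: String, colType: ColType)

// Resolve columns by their type rather than by a fixed name like "crs",
// so renamed columns such as "red_crs" are still found.
def columnsOfType(schema: Seq[Field], t: ColType): Seq[String] =
  schema.collect { case Field(name, `t`) => name }

val schema = Seq(
  Field("red",        TileType),
  Field("red_extent", ExtentType),
  Field("red_crs",    CrsType)
)

columnsOfType(schema, CrsType)    // finds "red_crs" despite the rename
columnsOfType(schema, ExtentType) // finds "red_extent"
```

With a name-based lookup, both calls above would miss the renamed columns and fail, which is consistent with the `crs as crs` error seen here.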