Add ETL option to specify upper zoom limit for raster layer ingestion #1494
Conversation
val (extent, cellType, cellSize, bounds, crs) = collectMetadataWithCRS(rdd)
val LayoutLevel(zoom, layout) = scheme.levelFor(extent, cellSize)
val LayoutLevel(zoom, layout) = (if (maxZoom > 0) scheme.asInstanceOf[ZoomedLayoutScheme].levelForZoom(maxZoom)
Probably should use an Option[Int] type for the maxZoom, so if someone calls the argument with a maxZoom of 0, we can deal with that error differently than if the argument was not there at all.
We wouldn't want to put this asInstanceOf in here (asInstanceOf is considered Evil in Scala and we should only use it as a last resort). The maxZoom parameter shouldn't be used with anything but a tms layout, which could be detected in the EtlConfig code so that an error is issued at that point (a rough sketch of such a check follows the example below). Here we could avoid the asInstanceOf by matching:
val LayoutLevel(zoom, layout) = {
  maxZoom match {
    case Some(z) =>
      scheme match {
        case zoomedLayoutScheme: ZoomedLayoutScheme =>
          // An explicit max zoom only makes sense for a zoomed (tms) layout scheme.
          zoomedLayoutScheme.levelForZoom(z)
        case _ =>
          // throw a descriptive error
          sys.error("maxZoom is only supported with a ZoomedLayoutScheme")
      }
    case None =>
      scheme.levelFor(extent, cellSize)
  }
}
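For the EtlConfig side, the check could be a small guard along these lines (a sketch only; the layoutScheme field name and its string value are assumptions about how the parsed config exposes the layout choice, not the actual API):
// Sketch: assumes the parsed ETL config exposes the layout-scheme name and an Option[Int] maxZoom.
maxZoom.foreach { z =>
  if (layoutScheme != "tms")
    sys.error(s"maxZoom ($z) is only valid with the tms layout scheme")
  if (z <= 0)
    sys.error(s"maxZoom must be a positive zoom level, got $z")
}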
@lossyrob Thanks for the feedback and sorry for the evil Scala! I've updated the code.
This is looking good! Just added some more comments. Before we merge these changes, you'll have to sign an Eclipse CLA. Can you sign one with an Eclipse account tied to the same email address you commit to GitHub with, and let me know when you have that? You can follow the instructions on the right-hand side here: https://www.eclipse.org/legal/CLA.php Thanks!
Thanks @lossyrob, I made some more updates to the code and signed the Eclipse CLA.
@mbertrand I am failing to verify the Eclipse CLA through: https://projects.eclipse.org/user/cla/validate
K: (? => TilerKeyMethods[K, K2]),
V <: CellGrid,
K2: SpatialComponent: Boundable
](rdd: RDD[(K, V)], crs: CRS, scheme: ZoomedLayoutScheme, maxZoom: Option[Int] = None):
Unfortunately Scala does not allow default arguments on a method that is overloaded. The rules around this are strange, and sometimes the compiler does not catch this by default. I think it might only happen on 2.11, but if you run these commands, you'll get these outputs:
./sbt
geotrellis > project spark
spark > +package
[info] Compiling 230 Scala sources to /Users/rob/proj/gt/geotrellis-codereview/spark/target/scala-2.11/classes...
[error] /Users/rob/proj/gt/geotrellis-codereview/spark/src/main/scala/geotrellis/spark/TileLayerMetadata.scala:66: in object TileLayerMetadata, multiple overloaded alternatives of method fromRdd define default arguments.
[error] object TileLayerMetadata {
[error] ^
[error] one error found
[error] (spark/compile:compileIncremental) Compilation failed
[error] Total time: 75 s, completed Jun 7, 2016 11:30:43 AM
spark >
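For reference, a minimal (hypothetical, not from the codebase) object that trips the same restriction, since more than one overloaded alternative defines a default argument:
object Demo {
  // Both overloads declare a default argument, which Scala forbids for overloaded methods.
  def f(x: Int, y: Int = 0): Int = x + y
  def f(s: String, n: Int = 1): Int = s.length * n
}
// error: in object Demo, multiple overloaded alternatives of method f define default arguments.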
That means we have to create overloads for the default options. And since we have to do that anyway, it might be good to not require the Option, e.g.
def fromRdd[
  K: (? => TilerKeyMethods[K, K2]),
  V <: CellGrid,
  K2: SpatialComponent: Boundable
](rdd: RDD[(K, V)], crs: CRS, scheme: ZoomedLayoutScheme, maxZoom: Option[Int] = None):
  (Int, TileLayerMetadata[K2]) = {
  val (extent, cellType, cellSize, bounds) = collectMetadata(rdd)
  val LayoutLevel(zoom, layout) = maxZoom match {
    case Some(z) => scheme.levelForZoom(z)
    case _       => scheme.levelFor(extent, cellSize)
  }
  val kb = bounds.setSpatialBounds(KeyBounds(layout.mapTransform(extent)))
  (zoom, TileLayerMetadata(cellType, layout, extent, crs, kb))
}
becomes something like
def fromRdd[
  K: (? => TilerKeyMethods[K, K2]),
  V <: CellGrid,
  K2: SpatialComponent: Boundable
](rdd: RDD[(K, V)], crs: CRS, scheme: ZoomedLayoutScheme):
  (Int, TileLayerMetadata[K2]) =
  _fromRdd[K, V, K2](rdd, crs, scheme, None)

def fromRdd[
  K: (? => TilerKeyMethods[K, K2]),
  V <: CellGrid,
  K2: SpatialComponent: Boundable
](rdd: RDD[(K, V)], crs: CRS, scheme: ZoomedLayoutScheme, maxZoom: Int):
  (Int, TileLayerMetadata[K2]) =
  _fromRdd[K, V, K2](rdd, crs, scheme, Some(maxZoom))

// Single private implementation; the public overloads above stand in for the default argument.
private def _fromRdd[
  K: (? => TilerKeyMethods[K, K2]),
  V <: CellGrid,
  K2: SpatialComponent: Boundable
](rdd: RDD[(K, V)], crs: CRS, scheme: ZoomedLayoutScheme, maxZoom: Option[Int]):
  (Int, TileLayerMetadata[K2]) = {
  val (extent, cellType, cellSize, bounds) = collectMetadata(rdd)
  val LayoutLevel(zoom, layout) = maxZoom match {
    case Some(z) => scheme.levelForZoom(z)
    case _       => scheme.levelFor(extent, cellSize)
  }
  val kb = bounds.setSpatialBounds(KeyBounds(layout.mapTransform(extent)))
  (zoom, TileLayerMetadata(cellType, layout, extent, crs, kb))
}
You might be able to restructure it so that something takes a LayoutLevel explicitly, which might look a bit prettier.
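One way to read that suggestion (a sketch only; it passes a level-selection function rather than a concrete LayoutLevel, since levelFor needs the extent and cell size collected from the RDD, and the helper name is just an assumption):
private def _fromRdd[
  K: (? => TilerKeyMethods[K, K2]),
  V <: CellGrid,
  K2: SpatialComponent: Boundable
](rdd: RDD[(K, V)], crs: CRS, levelFor: (Extent, CellSize) => LayoutLevel):
  (Int, TileLayerMetadata[K2]) = {
  val (extent, cellType, cellSize, bounds) = collectMetadata(rdd)
  // The caller decides how the LayoutLevel is chosen.
  val LayoutLevel(zoom, layout) = levelFor(extent, cellSize)
  val kb = bounds.setSpatialBounds(KeyBounds(layout.mapTransform(extent)))
  (zoom, TileLayerMetadata(cellType, layout, extent, crs, kb))
}

// The public overloads then just pick the strategy:
//   _fromRdd[K, V, K2](rdd, crs, (e, cs) => scheme.levelFor(e, cs))        // no maxZoom
//   _fromRdd[K, V, K2](rdd, crs, (_, _) => scheme.levelForZoom(maxZoom))   // explicit maxZoom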
@lossyrob sorry for the delay, I just updated the branch with your suggested changes.
@mbertrand thanks! This is a great change 👍
This PR addresses #1484 by adding an optional "maxZoom" argument to ETLConf (with a default value of 0), and modifies both the Etl and TileLayerMetadata classes to handle that option.
In testing with the Chattanooga demo, however, setting maxZoom above 10 often led to the ingestion process being killed due to lack of memory. Increasing the VM's memory to 6 GB, and the memory available to the ingestion process to 4 GB, allowed the ingest to succeed up to zoom level 12.