DEM Resampling & Smoothing Tool
Free DEM resampling and smoothing calculator for geomorphology & mapping. Enter variables to compute results with formulas and detailed steps.
Formula
New Dimensions = Original Dimensions / (Target Res / Original Res); Nyquist = 2 * Cell Size
Output grid dimensions equal original dimensions divided by the resample factor. The Nyquist wavelength is twice the cell spacing, representing the minimum resolvable feature size.
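As a minimal sketch, the two formulas above translate directly into Python; the function names are illustrative, not part of the calculator:

```python
def resample_dims(rows, cols, orig_res, target_res):
    """New grid dimensions after resampling: original dims / resample factor."""
    factor = target_res / orig_res
    return int(rows / factor), int(cols / factor)

def nyquist_wavelength(cell_size):
    """Minimum resolvable feature size: twice the cell spacing."""
    return 2 * cell_size

# Inputs from Example 1: 10 m lidar DEM, 1000x1200 cells, resampled to 30 m
print(resample_dims(1000, 1200, 10, 30))  # (333, 400)
print(nyquist_wavelength(30))             # 60
```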
Worked Examples
Example 1: Lidar to Regional Resolution
Problem: Resample a 10 m lidar DEM of 1000x1200 cells to 30 m resolution with a 3x3 smoothing kernel.
Solution:
Resample factor = 30/10 = 3.0
New rows = 1000/3 = 333
New cols = 1200/3 = 400
Original cells = 1,200,000
New cells = 133,200
Reduction = 88.9%
Smoothing radius = 1 * 10 = 10 m
Result: 333x400 grid | 88.9% reduction | Method: Cubic Convolution
Example 2: Moderate Resolution Adjustment
Problem: Resample a 30 m SRTM DEM of 3601x3601 cells to 90 m with a 5x5 kernel.
Solution:
Resample factor = 90/30 = 3.0
New size = 1200x1200
Original cells = 12,967,201
New cells = 1,440,000
Reduction = 88.9%
Nyquist = 180 m
Result: 1200x1200 grid | 88.9% reduction | Nyquist: 180 m
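Both worked examples can be reproduced with a short helper; `resample_summary` is a hypothetical name, not part of the tool:

```python
def resample_summary(rows, cols, orig_res, target_res, kernel):
    """Resampled grid size, percent cell-count reduction, and smoothing radius."""
    factor = target_res / orig_res
    new_rows, new_cols = int(rows / factor), int(cols / factor)
    reduction = 100 * (1 - (new_rows * new_cols) / (rows * cols))
    radius = (kernel - 1) // 2 * orig_res  # smoothing radius in ground units
    return new_rows, new_cols, round(reduction, 1), radius

print(resample_summary(1000, 1200, 10, 30, 3))  # Example 1: (333, 400, 88.9, 10)
print(resample_summary(3601, 3601, 30, 90, 5))  # Example 2: (1200, 1200, 88.9, 60)
```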
Frequently Asked Questions
What is DEM resampling and when is it needed?
DEM resampling is the process of changing the spatial resolution of a digital elevation model by creating a new grid with a different cell size from the original data. It is needed when combining DEMs from different sources and resolutions into a consistent dataset for analysis. Resampling to a coarser resolution reduces file size and processing time for regional-scale analyses where fine detail is unnecessary. Resampling to a finer resolution is sometimes done to match a high-resolution dataset, but it cannot create new information, only interpolate between existing values. The choice of resampling method affects the accuracy of the output, with different methods appropriate for different situations.
What are the main resampling methods and their differences?
The three primary resampling methods are nearest neighbor, bilinear interpolation, and cubic convolution, each with distinct characteristics. Nearest neighbor assigns each output cell the value of the closest input cell, preserving original values but creating a blocky appearance. Bilinear interpolation uses a distance-weighted average of the four nearest input cells, producing smoother output suitable for continuous data like elevation. Cubic convolution uses the 16 nearest input cells with a more complex weighting function, producing the smoothest result but potentially creating values outside the original data range. For elevation data, bilinear or cubic convolution is generally preferred, while nearest neighbor is reserved for categorical data like land cover.
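Bilinear interpolation in particular is easy to sketch in pure Python. This is an illustrative implementation over a 2-D list (indices outside the grid are clamped to the edge), not the calculator's internal code:

```python
def bilinear(grid, r, c):
    """Bilinear interpolation at fractional row/col (r, c) in a 2-D list."""
    r0, c0 = int(r), int(c)
    r1 = min(r0 + 1, len(grid) - 1)      # clamp at the grid edge
    c1 = min(c0 + 1, len(grid[0]) - 1)
    fr, fc = r - r0, c - c0              # fractional offsets within the cell
    top = grid[r0][c0] * (1 - fc) + grid[r0][c1] * fc
    bot = grid[r1][c0] * (1 - fc) + grid[r1][c1] * fc
    return top * (1 - fr) + bot * fr

dem = [[10.0, 20.0],
       [30.0, 40.0]]
print(bilinear(dem, 0.5, 0.5))  # 25.0, the distance-weighted mean of all four cells
```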
What is spatial smoothing and how does it reduce noise?
Spatial smoothing applies a moving window filter across the DEM, replacing each cell value with a statistical summary of its neighborhood to reduce random noise and small-scale artifacts. Mean filters average all values within the kernel window, producing uniform smoothing, while median filters take the middle value, preserving edges better while still removing spike noise. Gaussian filters apply distance-weighted averaging in which nearby cells contribute more than distant cells, producing natural-looking smoothing. The kernel size determines the degree of smoothing: larger kernels remove more detail but also eliminate more noise. The trade-off between noise reduction and feature preservation is the central challenge in DEM smoothing.
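A minimal moving-window filter, assuming the DEM is a plain 2-D list of elevations; the `smooth` helper is illustrative only, and edge cells simply use the part of the window that falls inside the grid:

```python
from statistics import mean, median

def smooth(dem, k=3, stat=mean):
    """Apply a k x k moving-window filter (mean by default, or median)."""
    h = k // 2
    rows, cols = len(dem), len(dem[0])
    out = [[0.0] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            window = [dem[i][j]
                      for i in range(max(0, r - h), min(rows, r + h + 1))
                      for j in range(max(0, c - h), min(cols, c + h + 1))]
            out[r][c] = stat(window)
    return out

# A single 100 m spike in flat 10 m terrain: the median filter removes it
# entirely, while the mean filter only dilutes it.
dem = [[10.0] * 5 for _ in range(5)]
dem[2][2] = 100.0
print(smooth(dem, 3, median)[2][2])  # 10.0
print(smooth(dem, 3, mean)[2][2])    # 20.0
```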
How does resampling affect terrain derivatives like slope?
Resampling to a coarser resolution systematically reduces calculated slope values because it smooths out fine-scale topographic variation. A 1-meter lidar DEM will show much higher maximum slopes than the same area resampled to 30 meters because small, steep features are averaged into gentler slopes. Aspect calculations become less variable but may shift systematically if the dominant terrain orientation changes with scale. Curvature, the second derivative of elevation, is even more sensitive to resolution changes than slope. These scale-dependent effects must be considered when comparing terrain analyses performed at different resolutions and when selecting the appropriate resolution for a specific application.
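The slope-reduction effect can be demonstrated on a 1-D elevation profile. The block-averaging `coarsen` helper below is a simplified stand-in for full 2-D resampling, and the sample profile is invented for illustration:

```python
def max_abs_slope(profile, spacing):
    """Maximum absolute slope (rise/run) along a 1-D elevation profile."""
    return max(abs(b - a) / spacing for a, b in zip(profile, profile[1:]))

def coarsen(profile, factor):
    """Block-average the profile into cells `factor` times wider."""
    return [sum(profile[i:i + factor]) / factor
            for i in range(0, len(profile) - factor + 1, factor)]

# A small 3 m scarp sampled at 1 m spacing...
fine = [100.0, 100.0, 100.0, 103.0, 103.0, 103.0]
coarse = coarsen(fine, 3)          # 3 m cells: [100.0, 103.0]
print(max_abs_slope(fine, 1.0))    # 3.0  (steep step fully resolved)
print(max_abs_slope(coarse, 3.0))  # 1.0  (same rise spread over a wider run)
```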
What is the optimal kernel size for smoothing?
Optimal kernel size depends on the noise characteristics of the DEM and the scale of the features to be preserved. A 3x3 kernel provides minimal smoothing, suitable for removing single-pixel noise spikes while preserving most terrain detail. A 5x5 kernel smooths at a moderate level, appropriate for reducing systematic noise patterns in photogrammetric or radar-derived DEMs. Larger kernels of 7x7 or more aggressively smooth the terrain and are used when the goal is to extract only regional-scale landforms. The smoothing radius in ground units equals (kernel size - 1) / 2 multiplied by the cell resolution, so a 3x3 kernel on a 10 m DEM smooths within a 10 m radius. Adaptive filtering that varies kernel size based on local terrain roughness can outperform fixed-size approaches.
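The radius formula reduces to a one-line helper; the name `smoothing_radius` is illustrative:

```python
def smoothing_radius(kernel_size, cell_res):
    """Smoothing radius in ground units: ((kernel size - 1) / 2) * cell resolution."""
    return (kernel_size - 1) // 2 * cell_res

print(smoothing_radius(3, 10))  # 10  (3x3 kernel on a 10 m DEM, as in Example 1)
print(smoothing_radius(5, 30))  # 60  (5x5 kernel on a 30 m DEM)
```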
How do you assess DEM quality before and after processing?
DEM quality assessment uses several methods, including comparison with ground-truth survey points, calculation of difference statistics against reference data, and visual inspection of hillshade images. Root mean square error (RMSE) between the DEM and ground control points quantifies overall vertical accuracy. Difference maps between the original and smoothed DEM reveal the spatial pattern of removed features, helping assess whether noise or real terrain was eliminated. Slope and curvature maps highlight artifacts such as striping, terracing, or pit-and-mound patterns common in specific DEM sources. Cross-validation using a subset of ground points held back during processing provides an unbiased accuracy estimate.
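The RMSE check can be sketched in a few lines of Python; the sample elevations below are invented for illustration:

```python
from math import sqrt

def rmse(dem_values, control_values):
    """Root mean square error between DEM elevations and ground control points."""
    n = len(dem_values)
    return sqrt(sum((d - g) ** 2 for d, g in zip(dem_values, control_values)) / n)

dem_at_points = [101.2, 99.5, 150.3]   # DEM elevations sampled at survey locations
ground_truth  = [101.0, 100.0, 150.0]  # surveyed elevations at the same locations
print(round(rmse(dem_at_points, ground_truth), 3))  # 0.356
```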