I'm working on generating spatially varying noise, and I'm confused by how different the outputs of two seemingly similar Gaussian operations look. Following this answer, I broke the process into three steps:
- Generate a Gaussian-like kernel:
import numpy as np
import scipy.signal
import matplotlib.pyplot as plt
# Compute filter kernel with radius correlation_scale (can probably be a bit smaller)
correlation_scale = 150
x = np.arange(-correlation_scale, correlation_scale)
y = np.arange(-correlation_scale, correlation_scale)
X, Y = np.meshgrid(x, y)
print(X.shape, Y.shape)
dist = np.sqrt(X*X + Y*Y)
filter_kernel = np.exp(-dist**2/(2*correlation_scale))
This kernel, when visualized, looks like the following:

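One sanity check worth doing on this kernel (a small sketch of mine that just reuses the code above): matching the exponent -dist**2/(2*correlation_scale) against the standard Gaussian form exp(-d²/(2σ²)) term by term gives σ² = correlation_scale, so the kernel's effective standard deviation is sqrt(correlation_scale) ≈ 12.2, not 150:

```python
import numpy as np

correlation_scale = 150
x = np.arange(-correlation_scale, correlation_scale)
X, Y = np.meshgrid(x, x)
dist = np.sqrt(X * X + Y * Y)
filter_kernel = np.exp(-dist**2 / (2 * correlation_scale))

# Compare with exp(-d**2 / (2 * sigma**2)): here sigma**2 == correlation_scale,
# so the effective width of the kernel is sqrt(correlation_scale).
sigma_eff = np.sqrt(correlation_scale)

center = correlation_scale  # index where x == 0
print(sigma_eff)                      # roughly 12.25
print(filter_kernel[center, center])  # peak value at the center
```

So although the array is 300×300, almost all of its mass sits within a dozen pixels of the center.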
- Create a random noise grid:
n = 512
noise = np.random.randn(n, n)
The noise appears as:

- Convolve the noise with the kernel:
noise1 = scipy.signal.fftconvolve(noise, filter_kernel, mode='same')
The result is:

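A side note on amplitude: the kernel above is not normalized, and its sum is roughly 2π·correlation_scale ≈ 940, so the convolved values are scaled up by that factor. That doesn't change the spatial pattern, but it matters if you compare pixel values between the two results. A sketch that normalizes the kernel to unit sum first (the seed is my addition, purely for reproducibility):

```python
import numpy as np
import scipy.signal

rng = np.random.default_rng(0)  # seeded only so the run is reproducible
correlation_scale = 150
x = np.arange(-correlation_scale, correlation_scale)
X, Y = np.meshgrid(x, x)
filter_kernel = np.exp(-(X * X + Y * Y) / (2 * correlation_scale))

n = 512
noise = rng.standard_normal((n, n))

# Dividing by the sum turns the kernel into a weighted average, so the
# output stays on an amplitude scale comparable to the input; only the
# overall scale changes, not the structure.
kernel_norm = filter_kernel / filter_kernel.sum()
noise1 = scipy.signal.fftconvolve(noise, kernel_norm, mode='same')
```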
However, if I apply a standard Gaussian filter:
from scipy.ndimage import gaussian_filter
noise2 = gaussian_filter(noise, sigma=1, radius=10)  # radius requires SciPy >= 1.10
The resulting image is:

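To check that fftconvolve with a hand-built Gaussian kernel and gaussian_filter really perform the same operation when their widths match, here is a small comparison sketch (sigma, the radius r, and the 128×128 test size are arbitrary choices of mine; the two only disagree near the borders, since mode='same' zero-pads while gaussian_filter reflects by default):

```python
import numpy as np
import scipy.signal
from scipy.ndimage import gaussian_filter

rng = np.random.default_rng(1)
noise = rng.standard_normal((128, 128))

sigma = 5.0
r = 20  # kernel radius, wide enough to capture the Gaussian tail
x = np.arange(-r, r + 1)
X, Y = np.meshgrid(x, x)
kernel = np.exp(-(X**2 + Y**2) / (2 * sigma**2))
kernel /= kernel.sum()  # unit-sum kernel, like gaussian_filter uses internally

a = scipy.signal.fftconvolve(noise, kernel, mode='same')
b = gaussian_filter(noise, sigma=sigma, radius=r)

# Away from the borders, where boundary handling can't reach, the two agree
# to floating-point precision.
interior = np.abs(a - b)[r:-r, r:-r]
print(interior.max())
```

With matching sigma the two outputs are effectively identical, which suggests the visual difference in my results comes from the kernels themselves rather than from the convolution routines.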
My question is: Why do these two images look so different even though both involve a form of Gaussian convolution? What aspect of the custom kernel and convolution process leads to a noise pattern with large-scale spatial variations compared to the more locally blurred result from gaussian_filter?