I really should write these brain waves down before I forget. First of all, an unspoken assumption in statistical signal processing is that the signal is normalized before you do anything; the variance of the noise will then be some fractional value. I spent an entire week puzzling out how to make Stein's Unbiased Risk Estimate (SURE) work with a large signal amplitude (15-30) and simulated noise variance of 1, 2, or 3.
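As a minimal sketch of what I mean by normalizing first (the amplitude, noise level, and the crude max-abs scale estimate are all made up for illustration): dividing the noisy signal by an amplitude estimate rescales the noise standard deviation by the same factor, so the effective noise variance becomes fractional.

```python
import numpy as np

rng = np.random.default_rng(1)

amp = 20.0      # large signal amplitude, as in my experiments
sigma = 2.0     # simulated noise standard deviation
y = amp * np.sin(np.linspace(0, 2 * np.pi, 256)) + rng.normal(0, sigma, 256)

# Crude amplitude estimate (an assumption for this sketch; a robust
# estimator would be better on real data).
scale = np.max(np.abs(y))

y_norm = y / scale          # normalized signal, roughly in [-1, 1]
sigma_norm = sigma / scale  # noise std scales by the same factor
```

After this step `sigma_norm**2` is a small fraction, which is the regime the tutorials seem to assume silently.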
The risk estimates I got were too large, which on its own would be fine (just a scaling problem), but the position of the minimum wouldn't agree with the MSE. Worse, the result was unstable and changed from one run to the next.
Then the idea came that I should normalize the signal first. Only then did SURE work as expected, finding the optimal width of the Gaussian kernel used for smoothing the test signal. It also works with a rectangular kernel.
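A minimal sketch of the kind of SURE sweep I mean, under assumptions of my own (an O(1)-amplitude sine test signal, known noise variance, circular convolution): for a linear smoother y_hat = A y, SURE = ||y - Ay||^2 - n*sigma^2 + 2*sigma^2*tr(A), and for a circulant Gaussian smoother tr(A) is n times the central kernel weight.

```python
import numpy as np

rng = np.random.default_rng(0)

n = 512
x = np.linspace(0, 1, n)
signal = np.sin(2 * np.pi * 3 * x)      # clean test signal, O(1) amplitude
sigma = 0.3                              # known noise standard deviation
y = signal + rng.normal(0, sigma, n)

def gaussian_kernel(width, n):
    """Normalized Gaussian kernel of given width (in samples), centered at n//2."""
    t = np.arange(n) - n // 2
    k = np.exp(-0.5 * (t / width) ** 2)
    return k / k.sum()

def smooth(y, width):
    """Circular convolution with the Gaussian kernel via the FFT."""
    k = gaussian_kernel(width, len(y))
    return np.real(np.fft.ifft(np.fft.fft(y) * np.fft.fft(np.fft.ifftshift(k))))

def sure_risk(y, width, sigma):
    """SURE for the linear smoother A: ||y - Ay||^2 - n*sigma^2 + 2*sigma^2*tr(A)."""
    n = len(y)
    k = gaussian_kernel(width, n)
    # A is circulant, so every diagonal entry is the central kernel weight.
    trace_A = n * k[n // 2]
    resid = y - smooth(y, width)
    return np.sum(resid ** 2) - n * sigma ** 2 + 2 * sigma ** 2 * trace_A

widths = np.linspace(1, 30, 60)
risks = np.array([sure_risk(y, w, sigma) for w in widths])
best = widths[int(np.argmin(risks))]
```

With the signal at unit scale, the width minimizing SURE should track the width minimizing the true MSE against the clean signal, which is exactly the agreement I couldn't get before normalizing.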
It's somewhat frustrating that the tutorials don't mention normalizing the signal. Maybe it's a given for people who have done statistics, but my physics training has been mostly deterministic, and I regarded the statistical parts as messy add-ons.
I will try to implement SURE for other convolution kernels.
Another idea came to me an hour ago, but since I didn't write it down, alas, it fled from me. Senior moment. Hopefully it will return.