State and explain about Gibbs phenomenon.
The Gibbs phenomenon occurs when a function has a jump discontinuity, like a square wave. Near the jump, the partial sums of the Fourier series overshoot the true value, and that overshoot does not shrink to zero no matter how many terms are added — it converges to roughly 9% of the jump height (the peak just moves closer to the discontinuity). It would only disappear with the full infinite series, which we can't compute in practice. This means that if an algorithm approximates a function by a truncated Fourier series and that function has a jump discontinuity, the approximation will contain ringing artifacts near the jump.
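A quick numerical sketch of this (my own illustration, not from the original answer): the partial Fourier sums of a unit square wave are (4/π) Σ sin((2k+1)x)/(2k+1). The true wave equals 1 on (0, π), but the partial sums peak near ≈ 1.18 no matter how many terms we take — the overshoot persists.

```python
import numpy as np

def square_wave_partial_sum(x, n_terms):
    """Partial Fourier sum of a square wave of amplitude 1:
    (4/pi) * sum over odd harmonics m = 1, 3, 5, ... of sin(m*x)/m."""
    s = np.zeros_like(x)
    for k in range(n_terms):
        m = 2 * k + 1
        s += np.sin(m * x) / m
    return 4 / np.pi * s

# Sample just inside (0, pi), where the true square wave equals 1.
x = np.linspace(0.001, np.pi - 0.001, 200_000)
for n in (10, 100, 1000):
    peak = square_wave_partial_sum(x, n).max()
    # The peak stays near 1.18 (about a 9% overshoot of the jump from -1 to 1)
    # no matter how large n gets; only its location moves toward the jump.
    print(f"{n:5d} terms: peak value = {peak:.4f}")
```

Adding terms narrows the overshoot spike but never flattens it, which is exactly why truncated Fourier reconstructions of sharp edges show ringing.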