Given a 2D array of size (width=x, height=y), where each entry holds a height value in meters. The distance in meters between two horizontally neighboring entries in a row varies with the y-position: in one row the distance between two entries is 30 m, while in the row above it is, e.g., 31 m.
How do I resample and interpolate the 2D array so that the horizontal distance between pixels equals a given value in every row? If possible, are there multiple options for the interpolation?
I think you can solve this in four steps (see the sketch below):
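A minimal sketch of what those steps could look like, assuming the per-row spacings are available as a 1D array; the function name `resample_rows`, the argument names, and the use of NumPy's `np.interp` are my choices here, not fixed requirements:

```python
import numpy as np

def resample_rows(arr, row_spacings, target_spacing):
    """Resample each row of a 2D height array onto a uniform
    horizontal grid with `target_spacing` meters between samples.

    arr            -- 2D array, shape (height, width), heights in meters
    row_spacings   -- 1D array, one horizontal spacing (m) per row
    target_spacing -- desired spacing (m) between output samples
    """
    height, width = arr.shape
    # Step 1: the physical extent of a row is (width - 1) * spacing;
    # use the smallest extent so every output sample is covered by
    # data in every row (no extrapolation needed).
    extent = (width - 1) * row_spacings.min()
    # Step 2: build the uniform target x-coordinates shared by all rows
    # (small epsilon so an exactly divisible endpoint is included).
    new_x = np.arange(0.0, extent + 1e-9, target_spacing)
    out = np.empty((height, new_x.size), dtype=float)
    for i in range(height):
        # Step 3: reconstruct this row's original x-coordinates
        # from its own spacing.
        old_x = np.arange(width) * row_spacings[i]
        # Step 4: interpolate the row's heights at the target positions
        # (np.interp performs piecewise-linear interpolation).
        out[i] = np.interp(new_x, old_x, arr[i])
    return out
```

Regarding the interpolation options: `np.interp` only does piecewise-linear interpolation. For other schemes you could swap step 4 for `scipy.interpolate.interp1d(old_x, arr[i], kind='cubic')(new_x)` (which also supports `'nearest'`, `'quadratic'`, and others) or for `scipy.interpolate.CubicSpline`.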