Some software draws a bent keyframe path that suggests a proportional increase or decrease of playback speed. I write "suggests" because, if you do the maths yourself, you will see that this is not possible with whole, integer frames as the timebase. Frame interpolation appears to allow it, but it doesn't: while it IS possible to generate a "stepless", smooth pixel motion with optical flow, the phases have to be remapped to whole frames again, and there is no "floating point" playback speed. A ramp built from sections of calculated slowdown and speedup is actually as good as it gets. Wouldn't this be more accurate with more time blades? Only in theory and in the imagination. You can calculate the position a single pixel probably had in frame x, given that just this one frame had been shot at 43.578 fps. What you really get then is a slightly different amount of motion blur - that's what optical flow does.
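To see why the maths doesn't work out, here is a minimal sketch (hypothetical, not any NLE's actual retiming code): it integrates a linearly changing playback speed across whole output frames and prints the source position each output frame would need. The positions land between integer frames almost immediately, so the renderer must either round to a whole frame or synthesize an in-between phase via interpolation.

```python
def source_positions(num_out_frames, start_speed, end_speed):
    """For a linear speed ramp, return the (generally fractional)
    source frame index that each whole output frame would need."""
    positions = []
    t = 0.0
    for i in range(num_out_frames):
        positions.append(t)
        # speed changes linearly from start_speed to end_speed over the ramp
        speed = start_speed + (end_speed - start_speed) * i / max(num_out_frames - 1, 1)
        t += speed
    return positions

pos = source_positions(8, 0.25, 1.0)  # ramp from 25% up to 100% speed
rounded = [round(p) for p in pos]     # what integer-frame playback actually shows
print(pos)
print(rounded)
```

Only the rounded list can ever reach the screen; the fractional remainders are exactly the phases that optical flow tries to fake.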
Finally: what would such a linear speed progression be good for in terms of perceived flow of time? We are not trained to see and judge anything but realtime. We already accept an FCPX speed ramp with speed transitions and optical flow as "linear".