Is this based on simulation, lab results, or both?
If I understand your notation properly, you are seeing a drop of 120uV on the high side versus 30uV on the low side as temperature is increased (by how much?) with DC signal inputs.
Given that each opamp is in a voltage follower configuration, I believe the difference is opamp-related. Several parameters play into your 'room temperature' results, including the offset voltage, the bias currents, the non-symmetrical drive capability of the opamp, the open-loop gain, etc. All of those parameters drift over temperature.
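To get a feel for how much of that shift the input-side parameters alone could explain, here is a rough error-budget sketch for a unity-gain follower. All of the numbers are assumed, illustrative values (not guaranteed OP295 limits; check the grade in your datasheet), and `follower_dc_error` is just a hypothetical helper for the arithmetic:

```python
# Rough DC error budget for a unity-gain follower.
# All values below are ASSUMED illustrative numbers, not datasheet limits.
V_OS_25C = 100e-6   # input offset voltage at 25 C [V] (assumed)
TC_VOS   = 2e-6     # offset voltage drift [V/degC]  (assumed)
I_B      = 15e-9    # input bias current [A]         (assumed)
TC_IB    = 0.1e-9   # bias-current drift [A/degC]    (assumed)
R_SOURCE = 10e3     # source impedance at the + input [ohm] (assumed)

def follower_dc_error(delta_T):
    """Output error of the follower after a temperature rise of delta_T [degC]."""
    v_os = V_OS_25C + TC_VOS * delta_T
    i_b  = I_B + TC_IB * delta_T
    # Follower gain is 1, so the input-referred error appears directly at the output.
    return v_os + i_b * R_SOURCE

for dT in (0, 25, 50):
    print(f"dT = {dT:3d} C  ->  output error ~ {follower_dc_error(dT)*1e6:7.1f} uV")
```

If your measured 120uV/30uV change is much larger than what reasonable drift numbers predict, that points toward the output-stage/drive effects below rather than offset and bias current alone.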
Looking at the datasheet for the OP295, the output-current-versus-temperature curves show that, in a single-supply configuration, the ability to source current drops off faster with increasing temperature than the ability to sink current, which would affect the high-side opamp more. The curves show the opposite behavior for a dual-supply configuration. That may be something you can test.
The opamp can also drive closer to its negative rail than its positive rail, and in this configuration the high-side opamp has to drive Vbe above the output voltage (closer to the supply rail), while the low-side opamp only has to drive Vbe below the output voltage (closer to ground). That drive capability decreases with increased temperature.
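To make the headroom asymmetry concrete, here is a small sketch of where each opamp output has to sit. I'm guessing at your topology (an emitter-follower style pass transistor on each side) and the supply, output level, and Vbe values are all assumed for illustration:

```python
# Where each opamp output must sit, using ASSUMED example values
# (guessed topology: emitter-follower pass transistor on each side).
V_SUPPLY = 5.0     # single positive supply [V] (assumed)
V_LOAD   = 3.5     # output voltage both channels try to hold [V] (assumed)
VBE_25C  = 0.65    # base-emitter drop at 25 C [V] (assumed)
TC_VBE   = -2e-3   # typical silicon-BJT Vbe tempco [V/degC]

def opamp_output_levels(delta_T):
    """Required opamp output voltages after a temperature rise of delta_T [degC]."""
    vbe = VBE_25C + TC_VBE * delta_T
    v_high = V_LOAD + vbe          # high side: one Vbe ABOVE the load voltage
    v_low  = V_LOAD - vbe          # low side:  one Vbe BELOW the load voltage
    return v_high, V_SUPPLY - v_high, v_low, v_low

for dT in (0, 50):
    vh, hi_margin, vl, lo_margin = opamp_output_levels(dT)
    print(f"dT={dT:2d}C: high-side opamp at {vh:.2f} V ({hi_margin:.2f} V below the +rail), "
          f"low-side opamp at {vl:.2f} V ({lo_margin:.2f} V above ground)")
```

With numbers like these the high-side opamp output is squeezed up near the positive rail while the low-side output sits comfortably above ground, so the high-side channel is the one exposed to the weaker sourcing/swing capability there as temperature rises.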