Another question.

Some state that AKVs adapt better than TEVs to the actual load in the cold room.
I disagree with this. My line of reasoning, which is perhaps not correct, goes as follows:

Let’s say we have a cold room under normal load, set at 0°C and for the moment at 0°C.
The fans keep running all the time, so the air volume stays the same.
The thermostat differential is set at 1 K.
As soon as the temperature rises 1 K due to transmission losses through the construction, the solenoid valve (SV) opens and the compressor starts.
The air-on temperature is 1°C, and the mass of air flowing over the coil carries a certain enthalpy.
The room temperature drops back to 0°C and the SV closes.
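
To put a rough number on that enthalpy, here is a minimal sketch of the sensible duty the coil sees; the airflow, air density, cp and air-off temperature are all assumed figures for illustration, not data from the post:

```python
# Sensible duty the coil sees from the air stream: Q = V_dot * rho * cp * dT.
# All numbers below are assumptions for illustration.

airflow = 0.5          # m^3/s, assumed fan airflow over the coil
rho = 1.29             # kg/m^3, density of air near 0 degC
cp = 1.006             # kJ/(kg.K), specific heat of air

t_air_on = 1.0         # degC, air entering the coil (room at +1 degC)
t_air_off = -4.0       # degC, assumed air leaving the coil

q_sensible = airflow * rho * cp * (t_air_on - t_air_off)  # kW
print(f"Sensible coil duty: {q_sensible:.2f} kW")  # ~3.2 kW with these numbers
```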

Now some warmer goods are stored in the cold room. The temperature rises faster to 1°C, but as soon as the SV opens, the room starts cooling, though it hangs at 1°C for a longer time.
The air-on temperature stays at 1°C for longer, so the enthalpy remains the same, but the evaporator has to run longer to pull the temperature down.
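
A quick check with assumed figures shows why only the run time stretches: with the air-on temperature pinned at 1°C the coil capacity is fixed, so the extra product heat just eats into the net pull-down capacity:

```python
# With the air-on temperature pinned at 1 degC, the coil capacity is fixed,
# so extra product heat only stretches the run time. Figures are assumed.

coil_capacity = 3.2      # kW, coil duty from the sketch above (assumed)
transmission_loss = 0.8  # kW, assumed losses through the construction
product_load = 1.5       # kW, assumed extra heat from the warmer goods

net_pull_down = coil_capacity - transmission_loss - product_load  # kW
print(f"Net pull-down capacity: {net_pull_down:.1f} kW")
# Smaller net capacity -> the same 1 K differential takes longer to pull down.
```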

Now a lot of warm goods are loaded into the cold room (no longer a normal load, more of a quick chill).
The temperature rises faster to 1°C, and even once the SV opens, the cooling can’t hold it at 1°C any longer; it rises to, let’s say, 3°C. The entering air is now 3°C, so the load is higher: the TEV opens further and the LP rises (single unit), the cooling capacity of the compressor increases, so it cools relatively faster (perhaps with an increase in HP due to the higher load).
With a pack, the LP is held constant, so the DT across the evaporator increases as the room temperature rises, so it cools faster and perhaps the temperature never gets the chance to swing far away from the setpoint, thanks to the increased capacity.
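
A minimal balance-point sketch of that comparison, with all coefficients assumed: the coil gives Q = UA x (T_room - T_evap); on a single unit T_evap floats up to where coil and compressor capacities meet, while on a pack T_evap is clamped, so the full rise in DT turns into extra capacity:

```python
# Compare evaporator capacity vs room temperature for a single unit
# (evap temperature floats to a balance point) and a pack (evap
# temperature held constant). All coefficients are assumed.

UA = 0.8  # kW/K, assumed evaporator UA value

def q_coil(t_room, t_evap):
    return UA * (t_room - t_evap)

def q_comp(t_evap):
    # Assumed linear compressor capacity curve: more capacity at
    # higher suction (evap) temperature.
    return 4.0 + 0.25 * t_evap  # kW

def balance_evap_temp(t_room):
    # Single unit: find t_evap where coil and compressor capacities match:
    # UA * (t_room - t_evap) = 4.0 + 0.25 * t_evap  ->  solve for t_evap.
    return (UA * t_room - 4.0) / (UA + 0.25)

T_EVAP_PACK = -8.0  # degC, assumed constant suction on the pack

for t_room in (1.0, 3.0):
    t_ev = balance_evap_temp(t_room)
    print(f"room {t_room:+.0f} degC: single unit "
          f"t_evap {t_ev:+.1f} degC, Q {q_coil(t_room, t_ev):.2f} kW | "
          f"pack Q {q_coil(t_room, T_EVAP_PACK):.2f} kW")
```

With these assumed numbers the pack gains the full UA = 0.8 kW per extra kelvin of room temperature, while the single unit gains only about 0.19 kW/K because its evap temperature floats up with the load.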

But in most cold room applications, as soon as the SV opens on a rise in temperature, there is no further increase in temperature, because the ‘injected cooling capacity’ is then much higher than the heat losses. So the load is always the same, and the TEV always sits in the same position.
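
A rough check with assumed figures makes the point: if the installed capacity is about four times the steady losses, the temperature stops climbing the moment the SV opens:

```python
# Rough check that the 'injected' capacity dwarfs the steady losses in a
# normally sized cold room. Figures are assumed for illustration.

installed_capacity = 3.2  # kW, coil duty when the SV opens (assumed)
heat_losses = 0.8         # kW, transmission losses at design dT (assumed)

runtime_fraction = heat_losses / installed_capacity
print(f"Required runtime fraction: {runtime_fraction:.0%}")
# ~25%: capacity exceeds the losses roughly fourfold, so the room stops
# warming straight away and the TEV sees essentially the same load every cycle.
```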

Any comments?