Urban Myth? - A 15% leak will increase energy usage by 100%?



buddy
12-09-2011, 03:25 AM
Hi Folks,

I have heard over the years, and it has been stated in training sessions I have attended, that a 15% leak (of the full system charge) will increase the electricity use of the compressors by 100%.

I have searched on the internet and cannot find any definitive source for this claim.

Is it an Urban Myth or is it true?

best regards

chillerman2006
12-09-2011, 03:48 AM
hi Buddy

I can't give you any %, but if you think about it, at a 15% loss you will probably not have a clear feed of liquid to the TXV.

So you are going to get a partially wet evap coil, lowering efficiency and increasing run time.

Chilled temps may still be reached, depending on the equipment in use, but lower temps will be a struggle as you just don't get much work/heat transfer from a vapour.

You will also be superheating the suction vapour too far, and in turn get no compressor cooling and increased head pressure and temperature.

It's lose, lose, lose once your charge falls too low.

R's chillerman

Brian_UK
13-09-2011, 12:01 AM
I've seen the myth in 'official' documents but I've found a test report that disproves it...

Read the introduction on page 2; the worst case is 22%.

http://eetd.lbl.gov/ie/pdf/LBNL-47476.pdf

chillerman2006
13-09-2011, 12:05 AM
Without seeing either:

a system that incorporates good subcooling will still work, but less efficiently, while one that does not will struggle to pull temp - a system with low/no subcooling normally will not receive a full feed of liquid to the TXV and will hunt.

Now I'll have a butchers.

R's chillerman

chillerman2006
13-09-2011, 12:10 AM
Having browsed the doc, I will rest my case on that one as conclusive.

Lower efficiency increases run time, which increases costs.

R's chillerman

buddy
13-09-2011, 01:51 PM
Having browsed the doc, I will rest my case on that one as conclusive.

Lower efficiency increases run time, which increases costs.

R's chillerman

Hi Chillerman2006,

I think most experienced refrigeration engineers will understand that on systems short of gas (low refrigerant levels), as display case temperatures creep up, the compressors will keep on running because cooling is constantly being called for.

BUT!... at what percentage of the system charge do the compressors start running 100% of the time?

That is my question: where is the proof that a 15% loss of refrigerant charge means the compressors are running at 100%?

There must be a tipping point... but where is the evidence??

best regards

buddy
13-09-2011, 02:01 PM
Hi Brian_UK,

Thanks for your input.

I actually pulled this document off the internet myself; if you do a search, nothing comes up about a 15% gas leak causing compressors to run 100% of the time (at least I couldn't find anything).

It is not as though it's new... this claim has been a constant for many years, but I just never challenged it.

I even emailed Cool Concerns, who are co-authors of the Zero Leak project in partnership with the Institute of Refrigeration in London, thinking they should have an answer... no reply?

I am still stumped as to the source of this claim.

best regards

chillerman2006
13-09-2011, 02:12 PM
Hi Chillerman2006,

I think most experienced refrigeration engineers will understand that on systems short of gas (low refrigerant levels), as display case temperatures creep up, the compressors will keep on running because cooling is constantly being called for.

BUT!... at what percentage of the system charge do the compressors start running 100% of the time?

That is my question: where is the proof that a 15% loss of refrigerant charge means the compressors are running at 100%?

There must be a tipping point... but where is the evidence??

best regards

Hi Buddy

As with my first post, mate... I can't give any %... even if I were at the level of some of these really clever tech gurus here (which I am obviously not), I still would not be able to, as the way I see it the percentage of loss and its effect on a system will vary from unit/type to unit/type.

Fair statement???

Another thing: all these so-called official docs released with their tests and conclusions don't really mean much either, do they? They need to declare the exact make and model and test only that make/model; then and only then does their testing give an accurate reflection.

Fair statement???


R's chillerman

paul_h
13-09-2011, 04:10 PM
BUT!... at what percentage of the system charge do the compressors start running 100% of the time?

That is my question: where is the proof that a 15% loss of refrigerant charge means the compressors are running at 100%?
I think you have to understand what they mean by 100% to answer that question.
It may be that they don't mean a 100% running/duty cycle of the compressor. It may just mean it's running twice as long as it would with a full charge. The art is in the interpretation of the article and in the questions asked of it. They could just mean twice as long, or double the run hours, not running 100% as in 'all of the time'.

Gary
13-09-2011, 11:12 PM
I would think overcharge would affect the efficiency more than undercharge... but I could be wrong.

buddy
14-09-2011, 04:56 PM
Hi paul_h,

I understand your thinking on this, but if, for example, you had a correctly designed unit that runs for 18 hours a day and it then had, for argument's sake, a 15% reduction in charge: do the maths and another 6 hours' running takes you to 24-hour running, so the only outcome I see is the unit running 100% of the time.
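To put that arithmetic into a quick sketch (a hedged illustration: the capacity fractions below are assumed purely for the example, as nothing in this thread ties a 15% charge loss to a particular capacity loss):

def required_run_hours(base_hours, capacity_fraction):
    # Hours needed to deliver the same daily cooling duty when the
    # system only produces capacity_fraction of its rated output,
    # capped at the 24 hours available in a day.
    return min(base_hours / capacity_fraction, 24.0)

# Full charge: 18 h/day. Assume (illustration only) the undercharged
# system delivers 75% of its rated capacity.
hours = required_run_hours(18.0, 0.75)
print(hours)              # 24.0 - pegged at continuous running
print(hours / 18.0 - 1)   # ~0.33 - only a 33% rise in run time/energy

# A lightly loaded system (6 h/day) could double its run time (a
# "100% increase") and still be nowhere near a 100% duty cycle.
print(required_run_hours(6.0, 0.5))   # 12.0 h/day, i.e. 50% duty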

I am pretty certain that is what they meant by this....but where is the evidence?

That is the question I am asking.

best regards

buddy
14-09-2011, 04:58 PM
Hi Gary,

Ahh, but would overcharge increase compressor run time by 100%?

...that's another question (-:

best regards

paul_h
14-09-2011, 05:32 PM
Brian has already answered the question.
Whatever this 100% figure was from, it could have been a random number based on a system that only ran 6 hrs a day with a full charge, for all I know.
The basic fact is that an undercharged system will have to run longer to do the same work as a fully charged one, as there may not be sufficient liquid feed to the evap. How much longer it has to run depends on the design and the load put upon it.
So there isn't a universal tipping point where x% of refrigerant loss always equals 100% extra compressor run time IMO.
It's all about the work required to be done, and that's too variable.

750 Valve
16-09-2011, 10:17 AM
There is no way you can apply such a broad rule of thumb; each system would need to be taken on its own merits and a full analysis done before any sort of % could be applied to that particular system. Even then, it is only applicable for that specific set of conditions, such as RH, ambient temp and any other external influences.

It was probably on literature from an electronic leak detector manufacturer :D

buddy
18-09-2011, 01:32 PM
Next time I hear this % figure quoted in a training session or otherwise I shall challenge it, as I think it is just an urban myth that has taken on a life of its own.

Grizzly
18-09-2011, 04:59 PM
Only applicable on critical charge systems, surely?
And even then, as many have said before, these doubtful statistics are variable.
I know one statistic which may be relevant, and that is:
For every 1C rise in set-point there is a 10% saving in power consumed.
Conversely, for every 1C drop in set-point there is a 10% increase.
This would be relevant to the above discussion, would it not?
Grizzly
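Taking that rule of thumb at face value (a hedged sketch: the 10%-per-degree figure is only the rule quoted above, not a measured constant, and compounding it over several degrees is an assumption):

# Grizzly's rule of thumb: roughly 10% more power for each 1C the
# set-point is lowered. Compounding it (an assumption) over a 3C drop:
per_degree = 0.10
drop_c = 3
extra = (1 + per_degree) ** drop_c - 1
print(extra)   # ~0.331 - about 33% more power consumed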

chillerman2006
18-09-2011, 08:52 PM
Next time I hear this % figure quoted in a training session or otherwise I shall challenge it, as I think it is just an urban myth that has taken on a life of its own.

Good for you mate

Magoo
19-09-2011, 02:45 AM
Hi buddy,
I would definitely go with the "urban myth" idea - scaremongering tactics by so-called experts.

mad fridgie
19-09-2011, 06:35 AM
Let's put the cat among the pigeons (and we do have to presume critical charge systems).
Your 18-hour run time is based upon meeting the duty for a 24-hour period at design conditions.
You could lose a % of refrigerant and still maintain temperature at design conditions, up to the point of running 24 hrs in 24; any more loss and you would not meet temperature. But this in itself only gives a 33% increase in power use (24/18 = 1.33).
However, when you lose refrigerant your suction pressure drops and your suction superheat increases (some of it will be useful). This alone reduces the capacity and the efficiency of the refrigeration compressor, increasing the power used per nett cooling effect; to counteract this you will have a reduced discharge pressure, so less power drawn, though your discharge temperature will be higher.

So, all being said, can we increase power usage by 100% based upon a normal run time of 18 hrs ("in my mind this means double")? The simple answer: no, not on a simple single-compressor system. On a complex system with variable refrigerant volume (more than one comp, or VSDs, or unloading) and load diversity, while driven solely by application temperature (not pressure, as most would do it), there is a possibility.

Who dreams up these rules of thumb!

Are you sure of the wording of the rumour?

It is more than likely that a 15% reduction in refrigerant could reduce efficiency by 50%, which at first look seems like double the power, but it is not.

Here is an example.

A Copeland comp:

At normal conditions, 0C SST and 45C SCT, it has a COP of 2.79.
With loss of refrigerant, -20C SST and 40C SCT, it has a COP of 1.39.

We have increased the efficiency by 100% (doubled) if the undercharged system was our base figure.
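Putting those Copeland figures into numbers (a sketch under the assumption that the same total cooling duty must be delivered either way; COP is cooling effect divided by electrical power in, so energy per unit of cooling scales as 1/COP):

# COP figures quoted in the post above.
cop_full_charge = 2.79   # 0C SST / 45C SCT
cop_undercharged = 1.39  # -20C SST / 40C SCT

# Energy used per unit of cooling is proportional to 1 / COP, so the
# undercharged-to-full ratio of energy per unit of cooling is:
energy_ratio = cop_full_charge / cop_undercharged
print(energy_ratio)   # ~2.0 - roughly double the energy per unit of cooling

Note this is energy per unit of cooling delivered; the instantaneous power drawn by the undercharged compressor can actually be lower, as the post says, because capacity and discharge pressure both fall.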

4 evr learnin
13-10-2011, 10:28 AM
Well, in reading this thread I've come to the conclusion that unless you actually sit down and work out all of the variables to this "myth", there is no exact answer to your question. However, I can say that a 100% leak will cause power consumption to drop considerably.