I've been trying to measure system efficiency without a lot of instrumentation. I have two thermistors installed in the air stream: one ahead of the coil in the return air, the other three feet beyond the air handler in the supply duct. I'm measuring the resistance of the two thermistors with a Fluke 8050A digital meter and converting the readings to temperature rise. The thermistors are Johnson AB99B sensors; they agree within 1 °F with the fan running and no heat on.

I turned on the heat strips alone and measured voltage, current to the elements, and temperature rise. I got a 58 °F (32.2 °C) rise at 10.344 kW input power. From this I calculated 584 CFM of airflow for the 35,325 BTU/hr of input heat. This airflow seems way off compared with the air handler specs, which say it should move roughly twice that.
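Here's the airflow arithmetic sketched in Python, assuming the common sea-level sensible-heat factor of 1.08 BTU/hr per CFM per °F (with that factor it comes out a bit below my 584 CFM figure; a slightly different constant would account for the gap):

```python
# Airflow from the electric-strip test.
# Assumes sea-level air and a sensible-heat factor of
# 1.08 BTU/hr per CFM per degree F.

strip_kw = 10.344                # measured electrical input to the strips, kW
delta_t = 58.0                   # measured temperature rise, deg F

q_in = strip_kw * 1000 * 3.412   # watts -> BTU/hr (~35,294 BTU/hr)
cfm = q_in / (1.08 * delta_t)    # CFM = Q / (1.08 * dT), ~563 CFM

print(round(q_in), round(cfm))
```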

I repeated the test with the heat strips off and the heat pump running. I measured a 37.75 °F (21 °C) temperature rise. Assuming airflow didn't change from 584 CFM, I calculated 23,809 BTU/hr of output. Outdoor unit power consumption (compressor plus fan) was 4.67 kW.
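The heat pump output comes from reusing the strip-test airflow (the key assumption being that the blower moves the same CFM in both tests):

```python
# Heat pump output, assuming airflow is unchanged from the strip test.
cfm = 584.0                   # airflow carried over from the strip test
delta_t = 37.75               # measured rise with the heat pump alone, deg F

q_hp = 1.08 * cfm * delta_t   # sensible heat picked up by the air, BTU/hr

print(round(q_hp))            # ~23,810 BTU/hr
```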

Dividing output heat by outdoor-unit input energy, I get 23,809 / 15,947 = 1.49 for a coefficient of performance (COP), or about 5.1 BTU/hr per watt (an instantaneous HSPF-like figure).
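The COP arithmetic, converting electrical input to BTU/hr at 3.412 BTU/hr per watt:

```python
# COP: heat delivered vs. outdoor-unit electrical input.
q_out = 23809.0                  # heat delivered to the air stream, BTU/hr
p_in_kw = 4.67                   # outdoor compressor + fan input, kW

p_in = p_in_kw * 1000 * 3.412    # electrical input as BTU/hr (~15,934)
cop = q_out / p_in               # dimensionless COP
btu_per_watt = q_out / (p_in_kw * 1000)  # the HSPF-like figure

print(round(cop, 2), round(btu_per_watt, 1))
```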

This is at an outdoor temperature of around 32 °F (0 °C).

Do I have an error in my figuring somewhere? The measurements don't match my expectations.