I've been pulling out what little hair I have left trying to get useful energy cost numbers for my HVAC system. I need to replace the existing system and want to see if it makes economic sense to install a higher-efficiency unit. However, my energy usage calculations are way out of line with my historical usage, and I can't figure out what the problem is. I tried posting this on another forum and they weren't able to provide any insights. Any assistance from the experts here would be greatly appreciated!
My home is a single-story, slab-on-grade double-brick house (two layers of brick, 8" thick in total, with no insulation) located in southern Arizona. It was built in 1950 and has 1400 square feet of conditioned space. It has a hip roof with a fully insulated attic (both the house ceiling and the attic rafters are insulated with R-19 batts). The windows are the original single-pane casement windows. The existing HVAC unit is a 12 SEER, 4 ton, 80% AFUE, 80 kBTU (input) Rheem natural gas furnace/AC on the roof. I plan to install a split unit in the attic (the original furnace was attic mounted).
I used HVAC Calc 4.0 and two other methods to estimate my heating and cooling loads. The HVAC Calc results are:
Heating: 56,656 [BTU/hr] / 38° design delta = 1,490 [BTU/deg*hr]
Cooling: 26,170 [BTU/hr] / 24° design delta = 1,090 [BTU/deg*hr]
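In case it helps anyone check my arithmetic, the per-degree-hour figures are just the design loads divided by the design temperature deltas; here is the same math as a quick Python snippet (no new numbers, just the ones above):

# Per-degree-hour load = design load / design temperature delta
heating_load = 56_656        # HVAC Calc heating load [BTU/hr]
cooling_load = 26_170        # HVAC Calc cooling load [BTU/hr]
print(heating_load / 38)     # 38 deg heating delta -> ~1,490 [BTU/deg*hr]
print(cooling_load / 24)     # 24 deg cooling delta -> ~1,090 [BTU/deg*hr]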
During the winter, we keep our thermostat at 70° during the day and 65° at night; our average 24/7 set point is 68.2°. In the summer, we keep it at 78° during the day and 74° at night; our average 24/7 set point is 76.6°.
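For anyone checking the averages, here is how they fall out of the schedule (assuming a 15.5-hour day period and an 8.5-hour night period, which is the split that reproduces those averages):

day_hours = 15.5          # assumed length of the daytime set-point period [hr]
night_hours = 24 - day_hours
print((70 * day_hours + 65 * night_hours) / 24)   # winter -> ~68.2 deg
print((78 * day_hours + 74 * night_hours) / 24)   # summer -> ~76.6 deg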
To get Heating Degree Day (HDD) and Cooling Degree Day (CDD) values, I downloaded 3 years of monthly 68° HDD and 77° CDD data for station KDMA from degreedays.net and averaged the values by month. For two peak months, the average values are:
January (avg 51.8°): 494 [deg*day] * 24 [hr/day] = 11,900 [deg*hr]
July (avg 87.0°): 269 [deg*day] * 24 [hr/day] = 6,460 [deg*hr]
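The deg*hr values are simply the degree-day values multiplied by 24 hours per day:

print(494 * 24)   # January HDD -> 11,856, rounded to 11,900 [deg*hr]
print(269 * 24)   # July CDD    ->  6,456, rounded to  6,460 [deg*hr]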
I calculated my estimated energy use, in BTUs, as follows:
January heating: 1,490 [BTU/deg*hr] * 11,900 [deg*hr] = 17,700,000 [BTU]
July cooling: 1090 [BTU/deg*hr] * 6,460 [deg*hr] = 7,040,000 [BTU]
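The same two multiplications in Python:

print(1_490 * 11_900)   # January heating -> 17,731,000, i.e. ~17,700,000 [BTU]
print(1_090 * 6_460)    # July cooling    ->  7,041,400, i.e.  ~7,040,000 [BTU]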
I converted my January heating energy to Therms, as follows:
January heating: 17,700,000 [BTU] * 1 [Therm] / 100,000 [BTU] = 177 [Therm]
I converted my July cooling energy to W*hr, as follows:
July cooling: 7,040,000 [BTU] / 12 [BTU/W*hr] = 587,000 [W*hr] = 587 KWH
(the 12 [BTU/W*hr] value is my existing unit's SEER rating)
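Both conversions together, for completeness:

print(17_700_000 / 100_000)   # 1 Therm = 100,000 BTU -> 177 Therms
print(7_040_000 / 12 / 1000)  # SEER 12 = 12 BTU per W*hr -> ~587 KWH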
The problem is, my actual average January heating usage is only 20.5 Therms, and the highest monthly gas bill I've had over the last two years is 50 Therms (I subtracted the off-season average gas usage to get those numbers). Over the course of a heating season I average 84 Therms, but my load-based calculations predict 570 Therms, nearly 7 times actual. On the AC side, my average July bill is 1280 KWH, and over a season I average 5630 KWH, but the calculations predict only 3750 KWH, roughly two-thirds of actual. If both calculated numbers were high (or low), I would assume something was wrong with my heat load analysis, but the heating calculation is way over actual usage while the cooling calculation is under.
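Put as plain ratios, the mismatch looks like this:

print(570 / 84)       # heating: calculated is ~6.8x actual seasonal usage
print(3750 / 5630)    # cooling: calculated is only ~0.67x (two-thirds of) actual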
I can easily see how cooling performance would fall below theoretical with an aging unit, a poor duct setup, and so on. But I can't figure out how that same decrepit unit and those lousy ducts could perform so much better than expected on the heating side.
What am I doing wrong?
Thanks in advance for any help.