Volume 41, Issue 9 - September 2006

Much to Discuss
NFRC Members Meet in Minneapolis 
by Megan Headley

Minneapolis was the meeting place for the National Fenestration Rating Council’s (NFRC) summer membership meeting, July 24-28 at the Crowne Plaza Northstar Hotel. 

Numerous committees were called to order to discuss new business and ongoing tasks. 

Rating Thermal Comfort 
The inaugural meeting of the new thermal comfort rating task group opened with plenty of comments. The NFRC board had approved a thermal comfort rating research project at its spring meeting, and a task group subsequently formed to develop recommendations to the board for further consideration of the project findings. The task group’s creation has been controversial, and members debated whether a thermal comfort rating really falls within the NFRC’s scope.

“Our discussion here today is [whether] we want to pursue a rating,” said Jim Larsen of Cardinal Glass Inc., the task group’s chair. “Using a statistical basis can we come up with a measure where the bulk of the population is comfortable?”

“Thermal comfort is subjective,” argued Thomas Culp with Birch Point Consulting LLC, “and its effect on energy performance is secondary.”

John Hogan with the Seattle Department of Planning and Development opined that the Council’s mission statement—which says the “NFRC develops and administers comparative energy and related rating programs”—allows for them to pursue activities that are “not just energy-related.”

Others argued that thermal comfort is in fact energy-related, since people will use whatever energy is necessary to be comfortable. 

Larsen noted that part of the group’s research would have to include how people use the space in a room near the windows. 
“If people sit 2 feet away from the window the thermal impact will be much greater than if they sit 6 feet from the window,” said Larsen. 

He added that it would be very helpful to have the thermal comfort tool to help “further fuel the credibility of the [NFRC’s] annual energy performance rating.”

At the end of the discussion, a motion was passed to maintain the task group under the technical committee to investigate a thermal comfort rating. 

The long-term energy performance subcommittee heard a presentation from Margaret Webb with the Insulating Glass Manufacturers Alliance (IGMA) about insulating glass (IG) certification. The goal of the group is to ensure that all Energy Star®-rated products also have IG certification.

“What we mean by IG certification is not just the testing, it’s the ongoing assurance of compliance,” said Webb. “The reason we’re doing this is to upgrade the long-term performance of fenestration products.” 

Webb explained that IG product lines would be certified for durability, since more durable units are more likely to remain energy-efficient over time and, in turn, to reduce energy usage.

In a study recently completed by IGMA, IG units certified at the highest level had a failure rate of 3.6 percent after 25 years, “which I think is reasonable,” Webb said. 

The cost to the manufacturer for certification would run between $2,500 and $4,000 annually per IG product line, according to Webb. 

The subcommittee also supported the push toward certification.

Researching Condensation Resistance 
Charlie Curcija of Carli Inc., a simulation laboratory, opened discussion on the “Condensation Resistance Procedure for the Non-Residential Component Modeling Approach” during the research subcommittee meeting. There was a discussion about the correlation between the NFRC’s condensation-resistance (CR) number and the American Architectural Manufacturers Association’s (AAMA) condensation resistance factor (CRF). Some subcommittee members questioned whether they should note an equivalency between the two ratings. 

Bruce Croak of Graham Architectural Products supported educating architects about the equivalence between NFRC’s CR rating and AAMA’s CRF—so that architects would stop confusing them. 

Chris Mathis of MC Squared argued the opposite, saying that the Council had no responsibility to show how its ratings related to another organization.

William DuPont of Synergy Consulting pointed out that the CR number is used for simulation only, while the CRF is used for testing only, so it could be dangerous to compare two such different numbers. 

The subcommittee finally resolved to first study the CR rating further and then potentially look into its correlation with CRF. 

The Technical Committee Reports
During the report from the U-factor subcommittee, a negative ballot came up on NFRC 100 addressing between-glass shading systems. Larry Livermore of AAMA questioned the use of 1.75 inches as an indicator of whether or not to simulate the glass shading system, asking where the number had come from. 

“I would think it’s difficult for the NFRC to say ‘if it’s bigger than 1.75 inches you do this, if you’re less than 1.75 inches you do this,’” said Livermore. 

“That’s the number we said did not impact the overall performance,” replied Michael Thoman of Architectural Testing Inc. 

Marles McDonald of Quality Testing Inc. agreed that 1.75 inches probably wouldn’t change the heat gain significantly.

A second negative, from Steve Johnson of Andersen Corp., asked why no simulation is required for glass shading systems smaller than 1.75 inches when simulation is required for grilles between the glass. 

A motion was passed to change the language to have everything simulated and to explore further the science behind the number.

The annual energy performance (AEP) subcommittee opened its discussions with a review of the proposed calculation reports for finding the AEP of a house. The tool will allow homeowners to view energy performance information for a default home, or input specific values for their home. 

The new draft of NFRC 901 “Guidelines to Estimate Fenestration AEP in Single-Family Residences” uses a range of values to show the AEP of the default home. Some subcommittee members were concerned that a range of variables wide enough to cover all possible lifestyles would offer too wide a spread to be valuable. Use of a single number with an error bar was proposed. However, a vote of the membership supported continuing with the range for the default house.

Some members also questioned whether the model should include only energy consumption or energy costs as well. The concern was that, because energy costs fluctuate and reflect past rates, homeowners would not get an accurate picture of how their windows would perform in the long term. After discussing their options, subcommittee members voted to allow homeowners to input their energy costs into the user-specific model home. Members also voted to keep an option that allows homeowners to find the AEP for existing buildings rather than for new construction alone. 
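
To illustrate the structure under discussion, here is a minimal sketch of a default-versus-user-specified AEP estimate. It is purely hypothetical: the field names, default values, and placeholder arithmetic below are assumptions for illustration, not the NFRC 901 calculation.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

# Hypothetical sketch only: field names and numbers are illustrative assumptions,
# not values or methods from NFRC 901.

@dataclass
class HomeInputs:
    window_area_sqft: float = 300.0   # default-house assumption
    u_factor: float = 0.35            # whole-window U-factor
    shgc: float = 0.30                # solar heat gain coefficient
    energy_cost_per_kwh: Optional[float] = None  # optional, supplied by the homeowner

def estimate_aep_range(home: HomeInputs,
                       lifestyle_spread: float = 0.30) -> Tuple[float, float]:
    """Return a (low, high) range of annual fenestration energy use in kWh.

    The spread stands in for the varied occupant lifestyles the subcommittee
    discussed; the arithmetic is a placeholder, not the NFRC method.
    """
    nominal_kwh = home.window_area_sqft * (home.u_factor * 25.0 + home.shgc * 15.0)
    return nominal_kwh * (1 - lifestyle_spread), nominal_kwh * (1 + lifestyle_spread)

def estimate_annual_cost(home: HomeInputs) -> Optional[Tuple[float, float]]:
    """Convert the kWh range to dollars only if the homeowner supplied energy costs."""
    if home.energy_cost_per_kwh is None:
        return None
    low, high = estimate_aep_range(home)
    return low * home.energy_cost_per_kwh, high * home.energy_cost_per_kwh

# A default home reports a range; a user-specified home can also report costs.
print(estimate_aep_range(HomeInputs()))
print(estimate_annual_cost(HomeInputs(window_area_sqft=250.0, u_factor=0.30,
                                      energy_cost_per_kwh=0.11)))
```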

The software review subcommittee began with a motion from Webb requesting that the component modeling approach (CMA) software request for proposal (RFP) be reviewed and tightened by a professional software writer before the RFP is sent to the board. 

The decision was supported by the subcommittee and later by the technical committee. 

CMA Status
During the CMA technical subcommittee’s meeting, subcommittee chair Gary Curtis, with Westwall Group, reported that “the board has agreed it would be reasonable to have component level testing.” He added that, under the CMA, whole-product testing would not be needed.

However, throughout the CMA meetings there was a repeated concern about how the modeling approach would suit custom projects. 

“There’s a component of the industry that may not be well served by CMA,” said McDonald. 

McDonald mentioned a problem with what he called “one-off” projects: those in which extrusions are designed specifically for a single project and may never be used again. The technical documents say there is only one way to handle these products: if you can simulate it, you must simulate it. That means the timeline will be significantly longer for custom products than for standard products. 

“If I’m a standard off-the-wall manufacturer … and I’ve got a project, I call up the ACE [approved calculation entity] and say ‘give me a number,’” McDonald elaborated. 

That process, he said, can take a couple of hours. However, for custom jobs, simulation takes several days. Next comes the peer review, which can also take several days, and then finally the information must be entered into the database. The entire process can take weeks, rather than days.

“If you’re in the bid process that can be a killer,” McDonald said. 

“I can assure you we do feel the pressure,” added Catherine Best with Benson Industries. “More steps add time.”

“It is an issue about timeliness, especially when using tools for the pre-bid process,” said Mike Turner with YKK AP.

“We don’t want to develop a program that would result in less [custom work],” said Greg Carney with the Glass Association of North America.

Carney added that manufacturers need to have the capability to do a lot of this approval work during the design phase.

Accepting the CMA 
Carney also stressed the importance of developing the CMA as a program that is acceptable across the board to building owners, code officials, architects, glaziers, etc.

“That should be priority one, not meeting one state’s deadline,” said Carney. The deadline he referred to is from the California Energy Commission, which has requested that the CMA be ready to implement in California this November. 

Ron Burton, the vice president for advocacy and research for the Building Owners and Managers Association International (BOMA), supported Carney’s point about the importance of reaching out to other segments of the industry related to non-residential glazing. 

“BOMA’s concerned that you have insufficient input from the segments that we represent,” Burton told the CMA subcommittee. “[You] should reach out to more of those groups.” 

He pointed out that many of those groups don’t understand the need for labeling their glazing products. 

“My guys don’t need a label on the jobsite to figure out whether they’ve got the right part or not—because they’re not going to check,” said Burton.

Rich Biscoe of Architectural Testing Inc. aimed to “reach out” to members whom he’d heard discussing alternate approaches to the CMA. He suggested forming a task group to examine “tweaking” the site-built rating system currently in use to suit non-residential buildings. 

As Jeff Baker with WESTLab noted, “If we can’t finish CMA in time to meet California’s deadline, we have to have a back-up plan.” 

“There’s a variety of ways to meet the California commitment,” added Alicia Ward of Midwest Energy Efficiency Alliance and vice chairperson of the NFRC board. She added, “The CMA has the potential to be a robust vehicle [with other applications].” 

Baker added that he preferred to seek the board’s input on forming a task group, “because this issue is a hot potato.”

However, the CMA technical subcommittee voted to approve the motion, forming a task group to examine an alternative to component modeling for non-residential glazing. 

Paths for CMA Approval
In the meantime, the CMA ratings subcommittee was hard at work. The group heard from Culp on a new flow chart delineating two paths for component approval. The first path followed the “traditional method,” in which the frame is simulated and some amount of testing is done for validation. The second path would be for manufacturers of custom products or small manufacturers who don’t want to go through the rigor of the first path, Culp explained. With this path, no simulation would be done. Instead, a cross-sectional diagram of the component would be presented and, based on an inspection of the diagram, the component would be assigned to a given generic category. The worst-case value would then be assigned to the component. 
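
As a rough illustration of the two paths Culp described, the logic might be sketched as follows. The category names and worst-case values here are hypothetical assumptions, not NFRC data.

```python
from typing import Optional

# Hypothetical sketch of the two component-approval paths described by Culp.
# Category names and worst-case U-factors are illustrative assumptions only.
GENERIC_WORST_CASE_U = {
    "thermally_broken_aluminum": 0.65,
    "non_thermally_broken_aluminum": 1.20,
    "vinyl": 0.50,
}

def approve_component(simulated_u: Optional[float] = None,
                      generic_category: Optional[str] = None) -> float:
    """Return the U-factor carried forward for a frame component.

    Path 1 (traditional): the frame is simulated (with some validation testing),
    so the simulated value is used directly.
    Path 2: no simulation; a cross-sectional diagram is inspected, the component
    is assigned to a generic category, and that category's worst-case value applies.
    """
    if simulated_u is not None:
        return simulated_u                                      # Path 1
    if generic_category in GENERIC_WORST_CASE_U:
        return GENERIC_WORST_CASE_U[generic_category]           # Path 2
    raise ValueError("Component needs either a simulation or a generic category.")

print(approve_component(simulated_u=0.58))                               # Path 1
print(approve_component(generic_category="thermally_broken_aluminum"))   # Path 2
```

Because the second path always assigns the worst-case value, manufacturers that want more competitive numbers still have reason to pursue the simulation path.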

“There’s always an incentive to get products certified,” Culp noted. 

The next chart Culp presented showed the two options for approving calculations, each beginning with the specifying authority. The first path showed the order of the process when calculations are made by an independent approved calculation entity (ACE); the second showed a slightly altered process for using a manufacturer’s ACE. To demonstrate its independence, a manufacturer ACE’s calculations would be followed by a review from an Inspection Agency (IA). 
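
A minimal sketch of the two approval sequences in that chart follows; the step wording is paraphrased from the discussion rather than taken from an NFRC document.

```python
# Hypothetical sketch of the two calculation-approval paths in Culp's second chart.
# Step names are paraphrased assumptions, not text from an NFRC procedure.

def approval_steps(manufacturer_ace: bool) -> list:
    """Return the ordered steps, each path beginning with the specifying authority."""
    steps = ["Specifying authority initiates the project"]
    if manufacturer_ace:
        steps += [
            "Manufacturer's ACE performs the calculations",
            "Inspection Agency (IA) reviews the calculations to show independence",
        ]
    else:
        steps += ["Independent ACE performs the calculations"]
    steps.append("Results are entered into the database")
    return steps

for label, flag in (("Independent ACE", False), ("Manufacturer ACE", True)):
    print(f"{label}: " + " -> ".join(approval_steps(flag)))
```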

Allowing a manufacturer to have its own ACE was a point of major discussion in the group. 

In his negative ballot, Culp had stated that manufacturers should be allowed to act as an ACE and do final whole-product certification, rather than relying on an accredited simulation laboratory. A negative from Croak stated that requiring an ACE to meet the criteria in NFRC’s Laboratory Accreditation Program limits the ACE to being an accredited laboratory or simulator. Webb pointed out that having to use an accredited simulator would add to the testing cost for manufacturers.

Some committee members argued that without proper training, the manufacturer’s ACE would not be able to simulate the components properly. Others argued that once the information was in the database, it would be relatively simple to use the software to make the calculations. 

“I’m not concerned about using the software,” said Baker. “I’m concerned about what it takes to interpret the drawings.”

Adding to the discussion, Thoman moved that the term “approved calculation entity” be officially changed to “accredited calculation entity.”
 
Thoman added that the definition of accreditation included in the NFRC glossary, which was also being balloted at the meeting, could be changed to not be laboratory-specific. After a vote, however, the motion failed. 

A second motion passed: to prevent future confusion, the full term “approved calculation entity,” rather than the abbreviation “ACE,” will be used throughout the document. 

With so much to discuss, the NFRC anticipates another large gathering for its fall membership meeting, November 6-9, 2006, in Arlington, Va. For more information about the upcoming meeting, visit www.nfrc.org.

© Copyright 2006 Key Communications Inc. All rights reserved.
No reproduction of any type without expressed written permission.