The audit concluded that Los Alamos National Laboratory was in compliance with the 10 mrem/year dose limit required
by 40 CFR 61, Subpart H for the year 2001 (referred to below as Subpart H for brevity). The audit also found LANL to
be in compliance with all other requirements of Subpart H of 40 CFR 61 and related Appendixes. Further, the audit team
did not find any substantive technical deficiencies in LANL's compliance program. It did make some recommendations for
"continued improvement" (p.1) without finding that any of the areas in which these improvements were desirable
constituted a substantive technical deficiency or a violation of Subpart H.
IEER is in general agreement with only one of these overall conclusions of the ITAT. Despite the uncertainties
and the technical deficiencies, as well as the essential lack of compliance in one area, IEER agrees
with the ITAT that LANL complied with the 10 mrem/year dose limit. This is because the maximum estimated dose is
so much below 10 mrem per year (in part due to the fact that the main source of emissions, the Los Alamos Neutron
Science Center (LANSCE), is not in full operation) that it is highly unlikely that the dose limit of 10 mrem per year
was exceeded.
In monitoring the audit and reviewing the final report, IEER has concluded that the ITAT should have called out four
substantive technical deficiencies:
In relation to the first of these substantive technical deficiencies, IEER has also concluded that the ITAT should
have found LANL to be in substantive breach of its compliance obligations under Subpart H and related requirements
under the Clean Air Act. As a result, IEER finds the ITAT's main findings, namely that LANL is in compliance with Subpart H
and that LANL's compliance program has no substantive technical deficiencies, to be in error.
IEER's conclusions regarding the substantive breach of Subpart H are based on the monitoring of the audit, which
included review of the data, review of the regulations, and review of the specific examples of a lack of quality assurance
in user supplied data that came up in the course of the audit. As regards these examples, IEER detailed them to the ITAT in
the course of its monitoring. (The IEER memoranda, as reprinted in the ITAT Final Report, are appended to this report.)
In these memoranda, IEER also specifically recommended to the ITAT that it investigate the issue of quality assurance in
regard to user supplied data in more detail with specific reference to compliance.
In reviewing the ITAT's findings and analysis as well as the conduct of the audit itself, IEER has concluded that the ITAT's
failure to find a substantive technical deficiency in this area arose partly from a near-exclusive focus of the ITAT's audit
on the work of the MAQ, rather than on the performance of LANL as a whole, in complying with Subpart H. The problem in this
case does not lie in the work of the MAQ, but in the failure of LANL as a whole to require users to adopt a quality assurance
program to ensure the integrity of the data supplied to MAQ. In effect, the ITAT Final Report implicitly deals with the
compliance issue as if it were MAQ rather than LANL that must be in compliance. This implicit narrowing of focus is
incorrect, since Subpart H does not apply to MAQ but to LANL as a whole. IEER therefore finds that the ITAT's third audit
was not as complete as it should have been, even given the limitations of the resources available for the audit.
We found that the ITAT's evaluation of the 1989 EPA Guidance Document cited in Appendix D of 40 CFR 61 was inadequate.
Further, the ITAT Final Report did not present a careful evaluation of:
The ITAT did not evaluate at all the internal DOE quality assurance (QA) requirements that contractors are obliged to
follow to protect health and the environment. Further, despite the prominence of quality assurance issues during the
third audit, and despite the fact that they were part of the original lawsuit filed by CCNS, the ITAT did not interview
any LANL quality assurance personnel outside the MAQ, past or present, during the third audit.
In view of these omissions, IEER finds that the ITAT third audit was not thorough, even within the limitations of the
resources available to it.
Finally, in view of our conclusion of LANL's substantive breach in compliance with Subpart H, as well as the other
substantive technical deficiencies itemized above and discussed in more detail below, IEER has concluded that the ITAT
should have called for a fourth audit in order to ensure that LANL comes into full compliance. The Consent Decree requires
the auditor to make a judgment about whether a fourth audit is needed based on whether there are substantive deficiencies
in the program. Since IEER finds that the audit was in error in not finding such deficiencies (because the audit was
neither complete nor thorough), we find that the ITAT also erred in terminating the audit process at the third audit.
Usage data are part of an estimation process that serves as a substitute for the periodic confirmatory measurements
of unmonitored sources required under 40 CFR 61, Subpart H. IEER has reviewed Subpart H as well
as related regulations and guidance from the EPA regarding quality assurance (QA) as it applies to usage data. We have
also reviewed the June 1996 Federal Facilities Compliance Agreement (FFCA) between DOE and the Environmental Protection
Agency (EPA) in this regard.
The issue of quality assurance in regard to compliance has a long history at LANL. CCNS raised it in the lawsuit it
filed against DOE that resulted in the Consent Decree. Years before that, in early 1992, the Tiger Team report raised
QA issues in regard to LANL's air quality compliance. In 1991, the DOE scientist responsible for evaluating LANL's
clean air program, Frank L. Sprague, noted in regard to dose estimation that "the model and its output is valid;
it is the input data that is questionable." (DOE Albuquerque Operations Office, August 7, 1991.)
One principal problem with the current system is that the expertise regarding usage estimation lies with the users.
The third audit process showed that MAQ does not possess the technical expertise to understand all the essential details
of the processes in order to set up a proper estimation process for usage and emissions in the absence of periodic
confirmatory measurements. Indeed, in IEER's view, it would be unreasonable to expect MAQ to have such expertise,
since there are literally hundreds of users of radionuclides at LANL carrying out a large variety of operations and
experiments. Only the full and engaged involvement of the personnel who are actually responsible for designing and carrying
out these multifarious activities can be relied on to make valid estimates of usage. Yet, the attitude of at least
some of the users, revealed both by their lack of desire to be involved in the data collection process and by the casualness
with which the data are reported and changed, indicates a lack of the kind of involvement needed to assure
the scientific integrity of the result. Indeed, the risk of such an outcome is precisely the scientific basis for
instituting a quality assurance program. That is one reason why IEER has concluded that the ITAT should have called
out the lack of a quality assurance process at the users' end for user supplied data as a substantive technical
deficiency.
The ITAT report notes that "because LANL relies on emission and dose calculations based on usage data as a very integral
part of their compliance program, establishing an effective mechanism to assure the quality of facility-level data when
they are initially provided to MAQ is of high importance." (p. 23) The ITAT then argues as follows (on p. 23):
We find this argument to be misleading and incorrect. The documentation maintained by MAQ is not the issue at hand. It is
the quality of the data that is reported by the facilities that is at issue. The MAQ does not have the technical expertise
to judge the validity of the data supplied to it. The MAQ does not review raw data or experiment logbooks or other sources
of basic data that would be expected to go into the preparation of scientifically sound usage estimates. The QA procedures
at MAQ generally consist of checks of calculations supplied to it and of asking for verification of suspect data in some cases.
This is fundamentally insufficient for the goal of adequate record keeping cited above by the ITAT. Adequacy of record
keeping requires the maintenance and verification of records at the users' end so that the raw data can be checked by the
regulatory agency. The first sentence of the next paragraph reveals much of the problem with this part of the audit. The ITAT
evaluated the "MAQ quality assurance program," but it failed to evaluate the LANL QA program as a whole as it pertains to
Subpart H. All of LANL must be in compliance with Subpart H, not MAQ alone. Moreover, the MAQ quality assurance program
is fundamentally insufficient to ensure the quality of the facility supplied data since the MAQ does not review the raw
data, logbooks and the like. IEER therefore does not agree with ITAT's conclusion even as regards the adequacy of the MAQ's
QA program, especially in light of the absence of a QA program at the facilities.
IEER also does not agree with the ITAT's view that the procedures outlined in 40 CFR 61, Subpart H do not explicitly define
the method to be used for estimating potential emissions from point sources that do not require continuous monitoring.
40 CFR 61.93(b)(4)(i) requires "periodic confirmatory measurements" for unmonitored sources to ensure that emissions
from these sources remain below the level required for continuous monitoring. In the FFCA, the EPA allowed LANL to substitute
dose estimates based on usage surveys to the exclusion of periodic confirmatory measurements. The scientific integrity and
validity of this permission depends in large measure on the quality of the data supplied by the users. The lack of quality
assurance in facility supplied user data undermines the premise of the compliance program in regard to calculations based on
radionuclide usage. LANL is not doing these measurements. One crucial part of IEER's point regarding the unmonitored sources
is based on the fact that Subpart H is explicit in its requirements for periodic confirmatory measurements. Moreover, as noted
above, the FFCA has been terminated by the parties.
Appendix D of 40 CFR 61 has implicit QA requirements for user supplied data. While Appendix D of 40 CFR 61 does not itself
make explicit reference to quality of data, it does refer to an EPA guidance document for compliance as it applies to
NRC-regulated and other non-DOE facilities and suggests that the procedures in it be used (reference 1 in Appendix D).
This EPA document contains the following statements regarding data that are to be used in calculations:
Again, your report must include enough information for the EPA to judge the validity of the input used in the
calculations.
Not all the parameters listed below are needed for any given facility. You do not have to report any that you do not use.
12. The physical form and quantity of each radionuclide emitted from each stack, vent, or other release point and
the method(s) by which these quantities were determined.
...
16. The values used for all other user-supplied input parameters (e.g., meteorological data) and the source of these data.
Even a limited review of the usage data and the manner in which it was acquired during the third audit revealed that
MAQ does not have all the necessary expertise to evaluate the processes at the using facilities and therefore the
quality of the user-supplied data. The recommendation of the EPA guidance in Appendix D therefore cannot be systematically
fulfilled in the absence of QA at the users' end.
Further, the fact that the FFCA has exempted LANL from the requirement under Subpart H that it make periodic confirmatory
measurements of unmonitored sources itself places a requirement upon LANL to ensure that the quality of the input data
into the process of estimation of doses is equivalent to that which would have been obtained by those periodic confirmatory
measurements. Without the assurance of input data quality, the FFCA exemption is itself invalid, since it then comes into
conflict with the requirement of periodic confirmatory measurements.
Finally, since the FFCA has expired and the federal government has not replaced this with another agreement, LANL would appear
to be employing usage data in place of periodic confirmatory measurements without any explicit legal basis. The ITAT should
have investigated this issue because it was raised during the course of the audit; it did not do so.
In sum, the ITAT seems to have evaluated the compliance of the MAQ, rather than LANL. Its failure to audit the relevant parts
of LANL contributed to its erroneous conclusion that LANL was in compliance. IEER has concluded that the ITAT should have
found LANL in substantive breach of its Subpart H compliance obligations in regard to dose estimation for unmonitored sources.
Section 4 of DOE Order 414.1A sets forth the specifics of DOE QA program requirements. Among other things, it requires
the development of procedures to "detect and prevent quality problems." (Italics added). The LANL program for estimation
of radionuclide usage, and hence of doses based on this data, completely fails the test of prevention of quality problems,
since there is neither a QA program nor any institutionalized check on the quality of data supplied by the facilities
themselves. The MAQ has a procedure for checking some of the data, and has corrected mistakes in this way. But the
quality of most of the usage data from sources deemed to have very low emissions (Tier IV sources) is not subject to any
such check.
Mr. Mechels, an interested member of the public who had raised similar questions during a public meeting at the first or
second audit, raised some questions as to whether and when the ITAT had consulted LANL personnel outside MAQ regarding QA
procedures and requirements. He also pointed us to the general laboratory QA requirements, and at his suggestion we
interviewed a former lab employee, Mr. William J. Parras. Yet the ITAT neither followed up with Mr. Mechels nor reviewed
the DOE QA program requirements, of which he has considerable knowledge, particularly as they concern LANL.
Besides the problem of quality of data that is routinely maintained and reported by the facilities to MAQ, the lack of an
independent QA program that is thoroughly implemented for all data also raises the possibility of cover-ups of embarrassing
incidents. This possibility is raised by the charges that Bill Parras made when IEER interviewed him:
Bill Parras: ...This is an example - you asked me for an example. We had a fire in a glovebox in TA-55 processing
area - I want to say 1993, I don't remember the exact year....That's a reportable occurrence. Now, here's the interesting
thing about it. The TA-55 Operations Center (which is the central focal point for controlling all plant operation activity
especially emergency response requirements) didn't know there was a fire going on in the critical plant processing area.
Personnel manning the TA-55 Operations Center couldn't have called anybody to respond to it. I was told by somebody who
walked out of the plant and walked down the hall, and knew that I was responsible for occurrence reporting, that there
was a fire in a glovebox located in the plant processing area. I said, doesn't the Operations Center know that?
He said, no, they don't have the slightest idea. So I called the center and said, don't you know there's a fire in a
glovebox, someone just told me. They came out of the plant - because my office was not in the plant, it was in a
cold office area - they said, no, we don't have any idea. So somebody from that operations office went back to see
what was going on at the plant processing area of TA-55. What had happened was somebody had pulled the fire alarm out
of the glovebox when the fire had started, because he knew that would alert the Operations Center. So he had actually
pulled that out while the fire was going on. It turned out it was some rags that had caught on fire while they were
doing some soldering in the glovebox. They didn't have any special nuclear material [SNM] in the glovebox. So it wasn't
related to SNM catching on fire.
I immediately went to [XXX] and said, we have a serious situation here. It was okay to put the fire out but disassembling -
unplugging the fire alarm or without first notifying the Operations Center was an obvious reportable incident. They are never
supposed to do that, particularly if they haven't let the Operations Center know about it. Operations Center needs to know
when anything that is done in that processing plant, because if any alarm is disconnected then they have to send somebody
there to be on guard in case there is a reportable emergency. That was sort of standing operating procedure. Make sure it
wasn't a fire that was going to burn the building down. Because it's kind of hard to see what's going on in the plant from
where the Operations Center is.
...
The Operations Center didn't know whether there was any SNM back in processing plant area where the fire occurred. They could
have had some and they wouldn't have known it. I go to [XXX] who says, let me look into this myself. So he goes back there and
it's a friend of his that was involved in the incident.
Bernd Franke (IEER): How long after the incident - couple [of] hours?
Bill Parras: Probably at least an hour or two hours later. And then I got to him within 15 minutes after I'd talked
to the Operations Center. He said let me go back and check into this. He went back there and came back and informed me that
this wasn't an incident that he wanted reported. I said, how can you do that? This is something that sort of showing us that
we don't have some good procedures in place. And I said, even if the fire was put out, was a trivial matter, it was serious
enough to alarm somebody who came out of there and told us there was a fire, and somebody unplugged the alarm system,
pulled it out of the glovebox, without the Operations Center knowing about it. He insisted that that was not going to be
reported. He stated to me that he needed somebody else for this job. So I was immediately reassigned and this all happened
within a week of when he took over as division leader. He did assign me to a very trivial job of developing a records management
office.
...
Bill Parras: In terms of replacing me, he brought in [YYY] to do exactly what the division leader wanted which was
in line with what they had been told to do lab-wide, and that is, you are going to be careful about what you report out to
DOE because that is going to bring back some negative review of what the lab is doing with its safety program, in general.
Bernd Franke: What other dangers, curtail funding?
Bill Parras: No, DOE could shut you down. At TA-55 if you have something that is real serious, there are certain things
that DOE can then say, shut TA-55 down until we see whether or not appropriate procedures are in place and done right. And we
did have a couple of occurrences like that. We had an airborne contamination, one that literally DOE Headquarters shut the
plant down for at least several weeks to make sure that everything was safe before they brought it back up.
Bernd Franke: Before or after this change?
Bill Parras: Prior.
These are serious allegations, because this kind of process for not reporting incidents that may cause problems for
LANL's operations may also directly lead to the fabrication of data required for environmental analysis and reporting.
IEER has not independently investigated the allegations made by Mr. Parras. For that reason, IEER has omitted all
names of parties not present at the interviews from the quoted text. One and a half months after the interview, we asked
Mr. Parras to review a draft of the transcript and a draft of this report and to consider very carefully the statements
quoted here. He reaffirmed them.
Further, the issue of QA and lab whistleblowers was raised in a vigorous way by CCNS during the very first audit. But apart
from one interview with one whistleblower, Joe Gutierrez, during the first audit, the ITAT did not systematically follow
up this issue. The ITAT did not conduct interviews with whistleblowers during the third audit even when very specific
issues regarding QA came up during that time.
The allegations made by Mr. Parras are not part of IEER's findings. But they have raised our level of concern regarding
the integrity of environmental, health and safety data at LANL. If the allegations made by Mr. Parras are verified, and if
the problems have not been systematically corrected, the problem of non-compliance may be even more complex and broad than
indicated here. However, an investigation of these problems is beyond the scope of IEER's monitoring work. We find that it
was the responsibility of the ITAT to investigate them, but it did not.