Configuration Audit Guide


PURPOSE: This guide prescribes the requirements for conducting both Functional Configuration Audits and Physical Configuration Audits on Systems, Equipment and Computer Software Programs.

CLASSIFICATIONS: A configuration audit is a formal examination of the functional and/or physical characteristics of a configuration item to verify its conformance to documented requirements.  There are two types of audits within the configuration management process covered in this guide.  They are described as follows:

  1. Functional Configuration Audit (FCA).  A formal examination of the "as tested" characteristics of a configuration item (CI) that focuses on design and performance suitability.  An FCA validates that the performance characteristics specified in the configuration item's functional or allocated configuration identification documents have been achieved and that its operational and support documents are complete and satisfactory. The FCA also demonstrates that System Development & Demonstration is sufficiently mature for entrance into Low Rate Initial Production.
  2. Physical Configuration Audit (PCA).  A formal examination verifying that the "as built" physical configuration, or "as coded" total system software, conforms to the design and construction process or technical documentation that defines it.  A PCA verifies the product configuration documentation, including design specifications, drawings, and code listings. It is performed during the Production and Deployment phase and focuses on production suitability. Satisfactory completion of the audits is necessary to establish the product baseline.

APPLICATION:  This guide is applicable for the performance of configuration audits as specified in the contract, Statement of Work (SOW), and the Contract Data Requirements Lists (CDRL).

The configuration item or system shall not be audited separately without prior Government approval of the functional baseline and allocated baseline for the CI, or system, involved.  Recommendation to approve or disapprove an audit shall be based upon and governed by the procedures and requirements delineated in this guide.


Audit performance ultimately resides with the Program Management Office. The Program Manager (PM) has overall disposition authority on audit results and reports.  The PM will designate a representative to be the focal point for audit issues. The designee will serve as audit team leader, develop an audit plan (Attachment 1), and write an FCA/PCA Review Report (Attachment 2) within two weeks of audit completion that identifies findings, any action items, and the plans for their resolution.  The plan and review report will be submitted as recommendations for PM approval.


ATTACHMENTS:

  1. Audit Plan
  2. Format of Audit Report
  3. General Guidance for Conducting Configuration Audits
  4. FCA Checklist for information needed prior to or at audit
  5. PCA Checklist for information needed prior to or at audit
  6. Criteria for conducting the FCA or PCA



ATTACHMENT 1: AUDIT PLAN

  1. Background and Objectives: This section describes the background circumstances for FCA or PCA audits. It should identify the contract, SOW, and CDRL references, the program/system, and tentative dates as indicated in the program schedule.

  2. Purpose: This section identifies the purpose of the audit, the Configuration Item(s) and the principal result (validation of requirements or design).

  3. Review Procedures: This section should identify specific events that have to be accomplished to ensure a successful audit.

    1. The FCA or PCA date should be established at least 60 days prior to the audit. This date should be coordinated with the contractor and approved by the PM. FCA/PCA Date: ______________________________.

    2. The designated representative should formally solicit support from the concerned disciplines and functional-area experts. This should be accomplished once a specific date is approved. Date letters sent: _____________________________.

    3. The contract should be reviewed at least 45 days prior to the audit to determine what requirements are on contract, what data items are due prior to FCA or PCA, whether they require approval action (agendas, specifications, drawings, etc.), and the status of delivery/approval of the data. Date of review: __________________.

    4. All team members should be identified by name at least three weeks prior to the audit.

    5. A preparatory meeting should be held at least one week before the scheduled audit. At this meeting, team members will be assigned specific responsibilities to be accomplished at the FCA or PCA.

    6. The audit shall be conducted using EIA-649 or MIL-STD-973 as guidance. Minutes are to be provided by the contractor with action items.

    7. Explain how action items will be tracked for completion.

  4. Reports:

    1. A formal FCA/PCA review report with a copy of the minutes of the audit, in accordance with Attachment 2, will be written NLT ten days after the audit.

    2. A final status letter will be provided to the Program Manager when all action items have been completed.
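The lead times in the Review Procedures above can be computed mechanically once the audit date is set. The sketch below is illustrative only (the helper name and dates are invented, not part of this guide); it derives each preparation milestone from a planned audit date using Python's standard library:

```python
from datetime import date, timedelta

# Hypothetical helper, not prescribed by this guide: derive the preparation
# milestones called for in the Review Procedures from a planned audit date.
def audit_milestones(audit_date: date) -> dict:
    return {
        "establish_audit_date_by": audit_date - timedelta(days=60),  # step 1
        "review_contract_by": audit_date - timedelta(days=45),       # step 3
        "name_team_members_by": audit_date - timedelta(weeks=3),     # step 4
        "preparatory_meeting_by": audit_date - timedelta(weeks=1),   # step 5
        "review_report_due": audit_date + timedelta(days=10),        # NLT 10 days after
    }

milestones = audit_milestones(date(2025, 9, 1))
print(milestones["review_contract_by"])  # 2025-07-18
```

Holding these dates in one place makes it easy to flag, for example, a contract review scheduled inside the 45-day window.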





ATTACHMENT 2: FORMAT OF AUDIT REPORT

  1. INTRODUCTION:

    1. Background: This section describes the background circumstances of the FCA/PCA. It should identify the location and dates and provide a brief synopsis of events leading to the requirements of the audit.

    2. Contractual Requirements: This section identifies the specific contractual requirements.

  2. PURPOSE: This section identifies the purpose of the audit. Audits are essentially a validation that the Configuration Item (CI) achieves its specified performance and functional characteristics and that the "as built" configuration conforms to the technical documentation.

  3. AUDITS: This section documents in some detail the thoroughness and extent of the audit. Areas not investigated will be discussed, with reasons provided. This section also identifies the methodology used in conducting the audit. Team members will be identified along with their functional specialties. Specific methods and depth of review will be described. Formal minutes with action items shall be attached. This section should identify how the action items will be corrected and tracked to completion.

  4. SUMMARY OF FINDINGS: Summarize the findings of the audit. Identify those areas having an impact on production. Include a general discussion of the overall evaluation of the contractor's efforts.

  5. RECOMMENDATIONS: The report will contain a recommendation to the PM for approval/disapproval of the audit. The recommendation will suggest specific methods to correct identified deficiencies. If applicable, make recommendations concerning the necessity for any audit follow-ups. The report is signed by the team leader and is submitted to the PM with a summary of the recommendations.
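The review report and the final status letter both hinge on tracking every action item to closure. A minimal sketch of such a tracker follows; the record fields and item names are invented for illustration, not a prescribed format:

```python
from dataclasses import dataclass

# Illustrative record, not prescribed by this guide: one entry per action
# item recorded in the audit minutes.
@dataclass
class ActionItem:
    number: int
    description: str
    assignee: str
    due: str
    closed: bool = False

def all_closed(items) -> bool:
    """True once every action item is closed -- the trigger for the
    final status letter to the Program Manager."""
    return all(item.closed for item in items)

items = [
    ActionItem(1, "Update CSCI code listing", "SW CM", "2025-10-01"),
    ActionItem(2, "Revise drawing 123-A", "HW Engineer", "2025-10-15", closed=True),
]
print(all_closed(items))  # False -- item 1 is still open
```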



ATTACHMENT 3: GENERAL GUIDANCE FOR CONDUCTING CONFIGURATION AUDITS

  1. On-Site Introduction (Team Orientation)

    1. Given by contractor host

    2. Identification of items to be audited

      1. Nomenclature

      2. Specification number

      3. Configuration identifier

      4. Serial number(s)

      5. Assembly/computer program/software identification numbers

  2. Conducting the Audit

    1. Contractor shall provide:

      1. Documentation described in EIA-649 or MIL-STD-973

      2. Facilities for team members to work

      3. Personnel to go through the audit exercise with the team members, with access to contractor's expertise or other contractor areas (facilities) or organizations as necessary to obtain resolutions or provide answers.

      4. Formal minutes based on mutually agreed-to rough draft provided by the team.

    2. Start the meeting by introducing your team, describing your FCA or PCA plans, and assuring that the contractor understands that you have no authority to change the contract.

      1. Break into groups after the on-site introduction and the start of the audit.

      2. Suggested disciplines for each group (including but not limited to):

        • Audit Co-Chairperson:
          • Configuration Management

        • Design Specifications and Test Data Review:
          • Hardware Program Manager
          • Software Program Manager
          • Hardware Responsible Engineer
          • Software Responsible Engineer
          • Hardware Quality Assurance
          • Software Quality Assurance
          • Hardware Configuration Management
          • Software Configuration Management

        • Logistics/Technical Specialties (includes Safety, Human Factors, Reliability, Equipment Qualification):
          • Specialty Engineer
          • Logistics
          • Quality Assurance
          • Configuration Management

        • Drawings and Associated Documentation/Controls/Records:
          • Program Management
          • Responsible Engineer
          • Management

    3. Final Actions of the Audit

      1. Each audit day is concluded with a short program office team caucus
          • Members identify subjects that they feel need attention by team members
          • Identify suspect areas, data, and activity and suggest corrective action, as appropriate, or request direction.
          • Discuss problems, identify next day's activities/events, and provide information of interest or use to team members.
          • Receive any instructions/assignments/suggestions necessary for successful task accomplishment.

      2. Generate Audit Minutes
          • Certification sheets similar to examples in EIA-649 or MIL-STD-973 will be completed for the entire audit, plus any action item sheets generated as part of the audit.  Assure that the minutes include the disclaimer of any authority to change the contract.
          • Upon completion of the audit, minutes will be prepared and distributed to the chairperson.  Minutes prepared in final format will be signed by both the government and contractor representatives and delivered to the government in accordance with the Contract Data Requirements List.

    4. Government representatives are observers at subcontract audits.




ATTACHMENT 4: FCA CHECKLIST FOR INFORMATION NEEDED PRIOR TO OR AT AUDIT

Note: Items herein may vary according to
the provisions of the contract.

  1. Objectives

    1. Verify the Hardware Configuration Item (HWCI) and/or Computer Software Configuration Item (CSCI) performance complies with its performance specification (e.g. B or development specification).

    2. Verify the compatibility/completeness of specification requirements versus test procedures/plans/results and analyses.

  2. Information Needed Prior to or at Audit

    1. Approved performance specification and draft design specification (e.g. approved development specification and draft of design specification).

    2. A current listing of all non-conformances (e.g. deviations, waivers, etc.) either approved or requested.

    3. Status of test programs to test configured items with automatic test equipment.

    4. Test plans/procedures and available acceptance test plans/procedures.

    5. A complete list of successfully accomplished functional tests during which pre-acceptance data was recorded.

    6. A complete list of functional tests even if detailed test data are not recorded.

    7. A complete list of functional tests required by the specification but not yet performed (to be performed as a subsystem or system test).

    8. Test reports (Validated data).

    9. A list of tests that failed (see items 5, 6, and 7 above); action items are generated from this data.

    10. Drawings to identify items to be audited, to include:
      1. Nomenclature

      2. Specification identification number

      3. Configuration identifier

      4. Serial number(s)

      5. Assembly/computer program/software identification numbers

    11. Review Preliminary Design Review (PDR) and/or Critical Design Review (CDR) minutes and action items for closure and items requiring further checks and/or verification. Review pre-FCA minutes.

    12. For CSCI the following additional items should be checked:

      1. The contractor should provide the FCA team with a briefing for each CSCI being audited and should delineate the test results and findings for each CSCI. As a minimum, the discussion should include CSCI requirements that were not met, including a proposed solution to each item, an account of the Engineering Change Proposals (ECP) incorporated and tested as well as proposed, and a general presentation of the entire CSCI test effort delineating problem areas as well as accomplishments.

      2. Audit the formal test plans/descriptions/procedures and compare them against the official test data. Check results for completeness and accuracy. Deficiencies should be documented and made a part of the FCA minutes. Establish and document completion dates for all discrepancies.

      3. Perform an audit of the Software Test Reports to validate that the reports are accurate and completely describe the CSCI tests.

      4. Review all approved ECPs to ensure that they have been technically incorporated and verified.

      5. Review all updates to previously delivered documents to ensure accuracy and consistency throughout the documentation set.

      6. Review the PDR and CDR to ensure that all findings have been incorporated and completed.

      7. Review the interface requirements and the testing of these requirements for each CSCI.

      8. Review database characteristics, storage allocation data and timing, and sequencing characteristics for compliance with specified requirements.
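Checklist items 5 through 9 amount to an accounting identity: every test required by the specification must appear as passed, failed (with an action item), or deferred to subsystem/system test. A sketch of that cross-check, with invented test identifiers:

```python
# Invented test identifiers for illustration; the sets mirror FCA checklist
# items 5-9: passed tests, failed tests (action items), and tests deferred
# to subsystem or system level.
required = {"T-01", "T-02", "T-03", "T-04"}
passed = {"T-01", "T-02"}
failed = {"T-03"}    # each failure generates an action item
deferred = {"T-04"}  # to be performed as a subsystem or system test

# Any required test not accounted for is an audit finding.
unaccounted = required - (passed | failed | deferred)
print(sorted(unaccounted))  # [] -- every required test is accounted for
```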



ATTACHMENT 5: PCA CHECKLIST FOR INFORMATION NEEDED PRIOR TO OR AT AUDIT

Note: All items herein may vary according
to the provisions of the contract.

  1. Objectives--The Physical Configuration Audit is:
    1. A formal examination of the as-built version of a configuration item against its design documentation in order to establish the product baseline. (A successful PCA is necessary to establish the product baseline.)

    2. A determination that the as-built configuration is reflected by its released engineering documentation and quality assurance records.

    3. A determination that the acceptance test requirements prescribed by the documentation are adequate for acceptance of production units of a configuration item (CI) (or end item) by quality assurance activities.

    4. A detailed audit of engineering drawings, specifications, technical data, and tests utilized in production of hardware configuration items (HWCI) (or end items).

    5. A detailed audit of specifications (or technical descriptions, flow charts/Program Design Language, code listing, manuals/handbooks) for CSCIs (or software end items).

  2. Information Needed Prior to or at Audit:
    1. Successful accomplishment of a PCA:
      1. May be conditional; e.g., higher-level tests/actions to be completed, as long as this will not impact PCA accomplishment. The FCA and PCA are frequently combined.

    2. A copy of FCA minutes and associated data (attachments), if FCA and PCA are not combined.

    3. An approved performance specification and the final draft design specification. (The design specification should be submitted sufficiently in advance of the PCA to permit government review.)
      1. Includes top level and any sub-tier specifications.

      2. For computer programs/software this shall include diagrams, code listings, etc., which comprise part of the design specification.

    4. A current listing of all non-conformances (e.g. deviations, waivers, etc.) either approved or requested.
      1. Identify any changes actually made during test.

      2. Identify changes not completed.

    5. Identification of items to be audited, to include:
      1. Nomenclature

      2. Specification identification number

      3. Configuration Item identifier

      4. Serial number(s)

      5. Assembly and/or computer program/software identification number(s).

    6. Engineering drawings, associated lists and related technical data.
      1. Includes engineering drawing index containing revision letters assembled by top drawing number (may be indentured listing).

    7. Approved nomenclature and nameplate(s).

    8. Operating maintenance and illustrated parts breakdown manuals (as applicable).

    9. Computer Program/Software manuals; e.g., users' manuals, positional handbooks, programming manuals, etc. as applicable.

    10. Version Description Document.

    11. Media for delivery of software; e.g., card decks, tapes, etc.

    12. Acceptance Test Plan(s), procedures and test data (report(s)) for each item.

    13. A complete list of functional tests required by the CI (or end item) specification but not yet performed.

    14. A review of PDR and CDR minutes and action items for closure and items for further check/verification.

    15. A complete shortage list.
      1. To include both hardware and software shortages (may be on separate lists).

    16. A proposed DD 250 (or subcontractor documents).

    17. Manufacturing instruction sheets must be available on site (on request).

    18. A list of approved and outstanding changes against a CI (or end item).

    19. Logistics data to include spares and repair parts lists or recommendations, common and peculiar tools, and other support requirements data as applicable shall be available on site.
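Because so many prerequisites must be in hand before the audit, a simple receipt tracker helps surface gaps well ahead of the audit date. The item names below paraphrase a subset of the checklist; the structure is a sketch, not a prescribed format:

```python
# Receipt status for a subset of the PCA prerequisites above (paraphrased);
# True means the item has been received.
pca_items = {
    "FCA minutes and attachments": True,
    "performance spec and final draft design spec": True,
    "non-conformance listing": False,
    "engineering drawing index": True,
    "Version Description Document": False,
    "proposed DD 250": True,
}

# Anything still outstanding is raised with the contractor before the audit.
outstanding = [name for name, received in pca_items.items() if not received]
print(outstanding)  # ['non-conformance listing', 'Version Description Document']
```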




ATTACHMENT 6: CRITERIA FOR CONDUCTING THE FCA OR PCA

In conducting a Functional Configuration Audit (FCA) or a Physical Configuration Audit (PCA), evaluate the contractor's policies, methods, and techniques against the following general criteria and other specifically developed criteria. You may not need all of these criteria, or you may have to add to them. Exercise discretion, and make sure to document any discrepancies not resolved. Group efforts are to be oriented toward FCA/PCA objectives.

* Items marked with an asterisk apply to the FCA only

  1. Compare configuration of the CI (or end item) as documented with the design specification.
    • Is the configuration of the CI (or end item) the same as the "as built" product configuration?

    • Review the list of contractor's internal documentation for hardware configuration items (HWCI) and computer software configuration items (CSCI).

  2. Compare configuration of CI as documented with the performance specification.
    • Does the documentation reflect the physical configuration of the items for which test data are verified? (For software this includes documents such as code listings, version description documents, manuals, media identification; e.g., card decks, tapes, etc. as appropriate).

  3. Compare configuration audited at FCA with that to be audited at the PCA.
    • Review FCA Minutes for recorded discrepancies that require action (Has proper closure been achieved and tested?)

  4. Differences between product and specification and those between FCA and PCA configurations tested shall be recorded in the minutes.
    • Differences that are due to approved and tested changes that are compatible with approved specifications and reflected in the engineering data are acceptable.

    • Determine impact on and validity of the previous FCA and the current PCA activity. Establish course of action (how to proceed or call it off). Differences due only to test instrumentation are acceptable.

  5. Review all approved non-conformances (e.g. deviations, waivers, etc.) to determine extent of variance from applicable specifications and standards.
    • Does a basis for compliance with the specifications and standards exist?

    • Record any part of the PCA that fails to meet specifications or standards but is not an approved waiver/deviation.

    • Identify and record any deviation/waiver found which is not on the list supplied. (Initiate action item/corrective action.)

  6. Review list of shortages and un-incorporated design changes.
    • Accomplish as a coordinated engineering/Quality Assurance effort.

    • Determine impact on PCA results and validity of any shortages/changes.

    • Determine where and by whom any necessary configuration changes will be made (before or after delivery).

  7. * Verify all requirements of the performance specification, whether qualification testing is required or not for a given requirement.
    • When qualification testing does not cover or does not adequately cover a test/design requirement, review contractor internal/engineering test data to verify all design requirements have been met.

  8. Verify all requirements of the design specification, whether acceptance testing is required or not for a given requirement.
    • When acceptance testing does not cover or does not adequately cover a test/design requirement, review FCA or contractor internal/engineering test data to verify all design requirements have been met.

  9. Review test plans, procedures, and descriptions, and review test reports for compliance with specification requirements.
    • Confirm inspection and test of contractor equipment end items. (Review contractor records to confirm. Data should support proper buy-off.)

    • For software this includes Preliminary Qualification Test (PQT) and Formal Qualification Test (FQT) test plans/procedures/results audit versus specification. (Objective is to verify PCA results are still exactly applicable or that any discrepancies are authorized, properly tested and documented in the engineering data and the test results are compatible).

    • Check completeness, accuracy, tolerances, and compatibility.

    • Are the tests properly witnessed?

  10. Review CSCI (or end item) for format and completeness.
    • Compare top-level (CSCI) flow charts with sub-tier computer software component (CSC) flow charts.

    • Review CSC flow charts and description.

    • Review flow charts for proper entries, symbols, labels, tags, and exits.

    • Review CSC interface requirements.

    • Compare detailed CSC flow charts with coded program for accuracy and completeness.

    • Crosscheck a current listing of instructions with the listing in the design specification.

    • Review database characteristics, storage allocation charts, and timing and sequencing characteristics.

    • Examine an actual tape, card deck or other media for conformance to the design specification. (Packaging and marking requirements shall be complete and compatible with all documents, material requirements conform, and serials are compatible with documentation of items audited.)

  11. Review the Version Description Document for completeness and compatibility with data audited.
    • Document discrepancies.

  12. Check positional handbooks, user and computer programmer manuals for format and conformance with applicable data items.
    • Formal acceptance/verification of handbooks/manuals is not part of the PCA; it must wait until system testing to ensure the procedural content is correct.

  13. Review microprocessor software portion of hardware design specifications for the characteristics noted herein for completeness and adequacy of software requirements and compatibility with the hardware design and interface.
    • Document discrepancies.

  14. Interface Requirements
    • Do all requirements interface adequately?

    • Are the test results acceptable? (Including hardware/software or software/software interfaces.)

  15. Engineering Change Proposals (ECP)/Changes
    • Have all ECPs/Changes been included in the development and acceptance tests, and are the results compatible with the requirements?

    • Are any outstanding items documented and appropriate actions and schedules established?

  16. PDR/CDR Minutes
    • Have all PDR/CDR findings been incorporated and tested?

  17. Verify the validity and adequacy of the acceptance test requirements defined in the specifications.
    • Are all tests clearly applicable?

    • Are test requirements completely defined?

    • Are all tests supported by applicable data requirements?

    • Make a direct comparison of test methods/procedures and test results with the performance/design requirements of the deliverable HWCI/CSCI (or end item) as required. (The comparison must be detailed enough to establish that the methods and instrumentation required by the contractor's internal procedures satisfy the design specification and are adequate to verify the performance quality of the HWCI/CSCI (or end item).)

    • If acceptance test requirements changes are required, their validity must be reconciled with the results of the performance tests conducted at FCA. (Determine impact on PCA objectives and audit completion, recovery, or stop.)

    • Are any changes made during the test identified? (Are they properly authorized; are they acceptable in terms of test results/retest procedure?)

  18. Final HWCI/CSCI (or end item) test report.
    • Audit for accuracy and completeness.

  19. Test reports, procedures, and analyses and other data.
    • These are generally made attachments to the certification, with worksheets filled out as an audit record.

  20. Review HWCI/CSCI (or end item) specification(s) to assure that they have been validated as adequate to define the product.
    • Is the document complete, and is it in agreement with the approved issue?

    • Does the spec contain the necessary testing, mobility/transportability and packaging requirements?

  21. Audit a representative number of drawings.

    • Are the drawings accurate, and do they include authorized changes?

    • List drawings, revision letter, title, and approval date of drawing.

    • Are the drawings accurate and complete, and do they describe the equipment?

    • Do the drawings conform to contractual requirements?

    • Are engineering documents and/or release records capable of:
      1. Determining composition of any part at any level?

      2. Determining the composition of the end item (or CI) with respect to other end items (CIs)?

      3. Determining the CI (end item) and associated serial number on which subordinate parts are used?

      4. Determining accountability of Class I and Class II changes which have been partially or completely released against the CI (end item)?

      5. Determining the CI (end item) and serial number effectivity of any change?

      6. Determining the standard spec or part number used with any non-standard part number?

      7. Determining the contractor spec document and specification control number associated with any sub-contractor, vendor or supplier part number?

      8. Identifying changes and retaining records of superseded configuration formally accepted by the procuring activity?

      9. Determining the next higher using assembly?

      10. Identifying all Class I and Class II engineering changes released for production incorporation?

      11. Determining the configuration released for each CI (end item) at the time of formal acceptance?

  22. Review drawings of parts/assemblies to be provisioned.
    • How is the referenced test data to be furnished?

  23. Audit samples of drawings versus Manufacturing instruction sheets.
    • Check drawings to assure accuracy of translation and conformance with hardware "as built."

    • Do the drawing numbers, revision letters and approval dates comply (are compatible) with Manufacturing sheets?

    • Are materials called for the same?

    • Are instructions complete and compatible?

    • Are the processes called out on Manufacturing sheets compatible?

    • Are all marking requirements in conformance?

    • Are all changes incorporated in Manufacturing sheets?

    • Are there more than 5 Engineering Orders attached to the drawing?

    • Are all proprietary markings justified?

    • Are all drawings released and identified in release records?

    • Are all drawings accounted for in a selected black box? Is there drawing continuity?

    • Do part numbers agree with the drawings?

  24. Audit engineering release and change control records.
    • Are engineering change control numbers unique?

    • Are drawing/part number systems controlled, and do they satisfy requirements?

    • Are serial number assignments controlled?

    • Verify there is one release record for each drawing number and that it contains as applicable: Drawing number, revision, title, number of sheets, date of release, date of change release(s), ECO (EO) number, serial numbers, top drawing number and spec number.

    • Are controls adequate for the processing and formal release of engineering changes?

  25. Review Logistics data for adequacy, completeness and compatibility with CI configuration.
    • Review Logistics Support planning documents, plans, lists (e.g. provisioned or to be provisioned).

    • Are manuals/training documentation/data acceptable?

    • Review long lead-time items.

    • Are all long lead or provisioned items acquired before PCA of current configuration?

    • Are interim releases of spares built to the current configuration? (Do change/release control methods control this?)

    • Review data of parts/assemblies to be provisioned for adequacy.

  26. Review prepared back-up data for correct types and quantities to ensure adequate coverage at the time of shipment to the user.
    • Review the proposed DD 250 (or comparable shipping data) to ensure that it adequately defines the equipment/computer programs. (Are open items/tasks listed as deficiencies/shortages?)

    • Review any shortages and un-incorporated design changes. (Assure compatibility of DD 250 (or comparable documents) and audit deficiency list and supporting action items or other back-up data.)

    • Does the DD 250 or equivalent document include any discrepancies uncovered at the PCA? (Specify any such items on the DD 250.)

    • Has a determination been made as to where any required configuration changes should be made? (Before or after delivery or in the field.)

    • Do quality control records reflect the configuration under audit?
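Several of the criteria above (notably items 21 and 24) require the engineering release records to answer structural queries: the composition of any part at any level, the next higher using assembly, and exactly one release record per drawing number. A minimal sketch of those queries, using invented part and drawing numbers:

```python
from collections import Counter

# Illustrative parent->children structure standing in for engineering
# release records; all part and drawing numbers are invented.
bom = {
    "CI-100": ["ASSY-10", "ASSY-20"],
    "ASSY-10": ["PART-1", "PART-2"],
    "ASSY-20": ["PART-2"],
}

def composition(part):
    """All subordinate parts of `part`, at any level (depth-first)."""
    out = []
    for child in bom.get(part, []):
        out.append(child)
        out.extend(composition(child))
    return out

def next_higher(part):
    """Assemblies that use `part` directly (the next higher using assembly)."""
    return [p for p, kids in bom.items() if part in kids]

# One release record per drawing number, and none missing from the index.
drawing_index = ["DWG-001", "DWG-002", "DWG-003"]
release_records = ["DWG-001", "DWG-002", "DWG-002"]  # DWG-002 duplicated, DWG-003 unreleased
counts = Counter(release_records)
duplicate_records = [d for d, n in counts.items() if n > 1]
unreleased = [d for d in drawing_index if d not in counts]

print(composition("CI-100"))          # ['ASSY-10', 'PART-1', 'PART-2', 'ASSY-20', 'PART-2']
print(next_higher("PART-2"))          # ['ASSY-10', 'ASSY-20']
print(duplicate_records, unreleased)  # ['DWG-002'] ['DWG-003']
```

Records that cannot answer queries like these are themselves an audit finding under item 21.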


[Guidelines for audits courtesy of Mr. M. Ucchino, DSN 986-0815.]


(Last Updated December 08, 2011)