U.S. DEPARTMENT OF LABOR
Employment and Training Administration
Washington, D.C. 20210

CLASSIFICATION: UI/RQC
CORRESPONDENCE SYMBOL: TEUC
ISSUE DATE: April 23, 1992
RESCISSIONS: None
EXPIRATION DATE: ??

DIRECTIVE: UNEMPLOYMENT INSURANCE PROGRAM LETTER NO. 24-92

TO: ALL STATE EMPLOYMENT SECURITY AGENCIES

FROM: DONALD J. KULICK
      Administrator for Regional Management

SUBJECT: Summaries of Comments on Revenue Quality Control (RQC) Design, with Responses

  1. Purpose. To summarize comments received in response to the September 21, 1990, request for comments on Revenue Quality Control (RQC) design documents, and to present major policy and design decisions reflecting those comments.

  2. References. UIPL 44-90 (September 21, 1990); UIPL 8-90 (November 15, 1989); UIPL 14-89 (February 8, 1989).

  3. Background. By means of UIPL 44-90, State employment security agencies (SESAs) received for comment four documents explaining or embodying the proposed design for the RQC program. These were an overview paper; a set of performance indicators based on automated reports data; the handbook used to pretest "Core RQC," the first of the four modules that make up the complete RQC design; and a glossary. This package was the third in a series of design materials, each presenting more specific design features.

    Twenty-nine comments were received on the materials. Twenty-five of these came from SESAs, two from Employment and Training Administration (ETA) Regional Offices (ROs), and two from offices within the National Office of the Department of Labor (the Department). In addition, the six States which pretested RQC materials offered comments on the revised handbook early in 1991.

  4. Overview and Major Themes of the Comments. In the broadest sense, two-thirds of the respondents (19) regarded the overall design favorably, six were unfavorable, and the overall position of the remainder could not be determined. The 19 favorable respondents indicated that they thought RQC would be an effective vehicle for assessing the quality of tax operations, although most recommended specific additions or deletions. Those expressing general disfavor did so because (a) portions of the design remain unfinished and they do not trust the Department to complete it to their satisfaction (all unfavorable respondents); (b) it is too much a Federal oversight system and not enough a tool for State managers (5 unfavorable respondents); (c) RQC seems to be a disguised way for DOL to tell States how to do their jobs (4 respondents); or (d) they had concerns about RQC's true cost (3 respondents).

    1. Broad Design Issues. Respondents raised numerous specific issues, which are presented along with RQC's responses in Attachment I. The following broad issues were also raised:

      1. RQC's Cost and Cost-Effectiveness. Eleven respondents were concerned about RQC's cost and whether resources released from Benefits Quality Control (BQC) would be sufficient to fund it.

        Response. Although the RQC pilot will give better estimates, pretest results seem to indicate that Core RQC will require a continuing level of only one (1) staff year. Transferring this level of resources should have a minimal impact on BQC efforts, especially if alternative methods (i.e., telephone and mail) are permitted. Other costs are associated with Core RQC implementation and operation. The RQC position will entail some interstate travel, as it is envisioned that RO-SESA teams will collaborate on Core RQC program reviews. SESAs will have to do some mainframe programming for acceptance samples and required reports; attempts will be made to minimize this effort through carefully drafted specifications and/or Department-written software (e.g., COBOL sample selection and downloading for Core RQC). If the Employer Compliance (EC) module is eventually added to RQC, we plan to provide three audit positions and possibly a supervisor (again, within BQC's resource limits if alternative methods are used) and programming support. EC staffing would not be a concern before 1995.

      2. Organizational Placement of RQC. Four SESAs raised the question of where RQC should be lodged.

        Response. Within the broad limits of the existing QC regulations, SESAs will be allowed to decide where to locate the RQC function. The regulation, at 20 CFR 602.20, requires that QC be "independent of, and not accountable to, any unit performing functions subject to evaluation by the QC unit. The organizational location of this unit shall be positioned to maximize its objectivity, to facilitate its access to information necessary to carry out its responsibilities, and to minimize organizational conflict of interest."

        DOL will propose adding "to maximize its credibility and the use of data generated" to the regulation's list of criteria for positioning RQC.

        The regulation permits locating RQC with BQC, or separate placement with accountability to the Tax Director (but not to the manager of a subordinate unit such as Status or Field Audit which RQC evaluates). Each SESA needs to consider the criteria, in consultation with the RO, before making a decision about placement of RQC.

        A comprehensive policy review about the placement of all performance measurement units will be developed through the Performance Measurement Review (PMR) process.

      3. Ensuring Qualified Staff. Five respondents noted the demanding nature of the RQC program review and expressed concern that the Department might not be planning to ensure that RQC would be staffed with appropriately qualified people.

        Response. Qualifications not routinely found in all SESA operations are likely to be required for two parts of RQC: the Core RQC position, which demands program audit-type skills in addition to tax experience; and EC, which may involve auditing large firms with highly computerized accounting systems. The Department recommends that each SESA decide how best to train or obtain qualified personnel for RQC, and will provide a list of the desirable qualifications as early as possible. Ways to provide training in program audit skills are being explored, and a tax-auditor training program, which will include some computerized audit training, is being developed through a cooperative agreement with a SESA. The SESA tax experts detailed to the RQC workgroup agree that it is easier to develop a Core RQC specialist by training a tax specialist in audit principles and skills than to teach an auditor UI tax processes.

      4. Concern that RQC will Set Desired Levels of Achievement (DLAs) or Release Program Review Findings. In various ways, nine respondents expressed concern that excessively high DLAs might be attached to certain performance measures, or that findings would be publicized.

        Response. RQC is developing a system for objectively measuring the performance of tax operations. This is only one part of the Department's overall system for exercising performance oversight. The rest concerns how the Department uses those findings to determine whether some corrective action is warranted, and how it urges, induces, or requires SESAs to take that action. Setting DLAs and requiring corrective actions through the PBP process, or publicly releasing data, are alternative approaches to linking performance measurements to corrective action. Decisions about these elements are outside RQC's mandate. How RQC findings are linked with what States should do in response is being decided through the PMR process, which is charged with developing a consistent approach to Federal performance oversight.

        One State expressed concern that the exception rates RQC sets for acceptance samples could become the equivalent of DLAs, and argued that States should be allowed to set their own exception rates. It is true that exception rates do represent standards of accuracy and may require the RQC regulation to contain language different from that in BQC regulation section 602.43 (regarding the absence of sanctions or incentives to achieve specified error rates). Two points are pertinent, however: (1) acceptance sample results do not stand alone as indicators of quality; they are only intended to confirm a systems review finding that all internal controls and quality assurance systems are adequate. Consistent positive findings in the systems review and the acceptance samples together provide reasonable assurance of accuracy in a process (e.g., new status determinations). (2) The exception rates were selected not to establish standards of accuracy but because the size of an acceptance sample cannot be determined until an exception rate is chosen. The rates chosen (5 percent in most instances, 2 percent in the others) seemed to represent reasonable standards of quality for their functions while economizing on sample sizes (46 or 115, respectively); an illustrative calculation showing how such sample sizes can arise appears at the end of this subsection.

      5. RQC as a Disguised Policy-Setting Tool. Four respondents charged or implied that RQC was not useful for tax managers but was instead a way for DOL to second-guess State policies.

        Response. RQC has from the beginning been acknowledged as an aspect of Federal oversight, and it is well recognized that the very decision by the Department to measure some aspect of performance gives it added importance in many States' eyes. However, the Department has made great efforts to ensure that the RQC measures are relevant to tax managers' needs, as effective State management and Federal oversight share many common objectives and must complement one another. From the beginning, the Department solicited and incorporated SESA concerns and suggestions into the design, apparently to the satisfaction of many, as other respondents complimented the Department on the relevance of the proposed RQC measures for assessing performance. The Department has no intention of preempting State policies; it does intend to put into place performance measures sufficient to assess how well State operations carry out their own policy decisions, and to judge how well those policies contribute to achieving the quality consistent with the requirements of Federal legislation. The Secretary of Labor is responsible for seeing that such quality is delivered to employers and claimants. If SESA policies fail to promote this kind of quality, changes in SESA policies may be required.
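
        For illustration only, the sketch below shows one way the acceptance sample sizes of 46 and 115 quoted in item 4 above can be reproduced. It assumes a zero-exceptions acceptance plan read at roughly 90 percent confidence and uses a Poisson approximation; neither the plan type nor the confidence level is stated in this UIPL, and Python is used here purely for illustration (the Department's planned software support for SESAs is in COBOL).

# Illustrative sketch only; not RQC software. It assumes a zero-exceptions
# acceptance-sampling plan read at roughly 90 percent confidence -- an
# assumption made to show how sample sizes of 46 and 115 can arise, not a
# statement of RQC policy.
import math

def poisson_sample_size(exception_rate, confidence=0.90):
    # Poisson approximation: with zero exceptions allowed, the probability of
    # accepting a process whose true exception rate is p is about exp(-n * p).
    # Solving exp(-n * p) = 1 - confidence for n and rounding gives the size.
    return round(-math.log(1.0 - confidence) / exception_rate)

def achieved_confidence(n, exception_rate):
    # Exact (binomial) probability that a process running at the stated
    # exception rate yields at least one exception in a sample of size n.
    return 1.0 - (1.0 - exception_rate) ** n

for rate in (0.05, 0.02):
    n = poisson_sample_size(rate)
    print(f"exception rate {rate:.0%}: sample size {n}, "
          f"confidence {achieved_confidence(n, rate):.3f}")

# Output: sample sizes 46 and 115, each with confidence slightly above 0.90,
# matching the sizes quoted above under the stated assumptions.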

    2. Major Themes by Functional Area. Most comments were function-specific. The detailed comments, and DOL responses, are given in Attachment I. Many comments summarized above as cross-cutting issues (e.g., whether there should be DLAs and where DLAs should be set) were raised in particular functional contexts (e.g., the question of DLAs was raised regarding timeliness for Status Determinations, completeness for Cashiering, etc.). Where possible, those issues are not repeated in this section.

      1. Status Determination. (1) Fourteen respondents commented on a proposed new timeliness measure for New and Successor Status Determinations; the comments led to reconfiguring the indicators. (2) Two comments said a nonsubject status acceptance sample should be dropped as nonessential and not cost-effective. In response, this element was reexamined. Pretest results were inconclusive (the instructions were misunderstood), so the accuracy of nonsubject determinations remains unknown; however, the panel of tax experts who advise the RQC technical contractor, Abt Associates, also recommended dropping this element, and it has been eliminated from the design. (3) There were minor comments about another acceptance sample and the need for definitions, which have since been developed.

      2. Cashiering. There was only one comment specific to cashiering (aside from DLA issues): One respondent asked that lockbox operations be reviewed. The review now includes a set of inquiries about lockbox operations which the Department will consider expanding if findings in the pilot States warrant.

      3. Report Delinquency and Collections. Many comments concerned these areas.

        (1) Eight comments concerned the cost of obtaining new delinquency data. This should not be a major problem; it appears RQC would have SESAs report only one new element, available as a byproduct of continuing operations.

        (2) Ten comments addressed the need for, appropriateness, usefulness, and number of collections measures and indicators. In response, some measures have been revised, four indicators were deleted or combined, and the search for more universally acceptable indicators of collections performance, now three years old, continues. The tremendous variation across States in both overall collections performance and its constituent elements indicates the need for better indicators, better ways of managing past-due accounts, or both. Definitions of measures continue to be refined, and data will be validated as part of the Reports Validation effort.

        (3) Two respondents questioned the validity of delinquency data and asked how the comparability of States' measured voluntary compliance might be affected by differing SESA delinquency dates. This led to more specific data element definitions. Although different State delinquency dates might cause the RQC measure to imply different degrees of voluntary compliance, this matters only if a national standard (e.g., a DLA) causes SESAs to be compared with one another. If States are judged against their own past performance, different delinquency dates will not matter.

      4. Benefit Charging (BC). Four comments, all from different respondents, concerned the need to examine suspense accounts, the number of cases required to examine Benefit Charging, whether only proportionate charging systems can be examined, and whether tax or Benefits QC staff should review. The BC pilot, now underway, should suggest answers to the last three questions. In the pilot, which includes a reverse chronological charging State, Benefits QC staff verify charge decision accuracy but other staff may check allocation accuracy; if the discrepancy rates found warrant adding BC to RQC at all, they will suggest sample sizes. The comment about suspense accounts was based on a misunderstanding: there are no suspense accounts for charges before liability has been established.

      5. Field Audit/Employer Compliance (EC). The most substantive comments concerned: the value of the EC exercise; the need to refine some audit impact indicators; EC's possible cost (a preliminary Abt design report implied 7,000 audits, all meeting ES Manual standards); and how this would relate to the 4 percent audit penetration DLA. One respondent questioned whether a total wage change measure would put low-wage States in an unfair light. Other comments asked for glossary definitions, and one noted that in its experience, "most significant" audit discrepancies came from smaller firms, not the largest ones. In response, the EC pilot is scheduled to run for 12 months starting in July 1993; it is a feasibility study intended to assess the value of random audits, but it will include other modes of audit selection for comparison purposes. EC audits will probably average 1,600 per State and will count toward a penetration indicator or DLA (if the DLA is not suspended during the pilot). In response to the comments, some of the indicators have been adjusted. Also, before the pilot begins, the quality of audits will be assessed using the RQC audit checklist.

      6. Account Maintenance. Comments on this function were quite scattered. Two respondents said one review should be more detailed; no other point was raised by more than one commenter. Most asked for clarifications. One respondent asked for reimbursable employers to be reviewed more; another argued that the delinquent report acceptance sample misses the true "delinquent employers"--those who remain hidden from UI. Most points have been reflected in changed definitions or instructions.

      7. Data Processing. Six respondents were concerned about the cost implications of programming for required reports and acceptance samples; one comment concerned defining the universe of transactions, and another the need to ensure consistent measures by surveying States first. In response (cost issues aside), the objectives for any new elements will be explained more fully as soon as the Core RQC pilot results have been assimilated, and the assistance of States in developing them will be sought. The Department will provide extensive COBOL programming and specifications development to help minimize State costs.

      8. Sampling. Comments related to ensuring that samples all refer to consistent time periods, and to specifying what is to be done in acceptance samples (AS) when information cannot be retrieved on a sampled case or when the program review findings and the AS findings are inconsistent (this latter point was raised by six respondents). A further comment asked whether SESA disagreements might slow results by requiring additional sampling. In response, reference periods for samples have been adjusted, and a technical appendix now includes rules regarding missing cases, rules for reconciling differences between systems review and AS findings, and a method for resolving SESA disagreements short of resorting to large-scale sampling, which should keep resolution time short.

      9. The Program Review Methodology. One respondent questioned whether review of internal controls ensured a quality tax program; three respondents wanted clarification on the frequency and scheduling of reviews; two suggested that other audits might substitute for some program reviews; and one said none should be done until the SESA indicated the existence of a problem. Other single comments offered suggestions for planning the program review and structuring the review package; another said an executive summary of results would be helpful. In response, although the combination of a review of internal control systems plus acceptance samples appears sufficient to provide reasonable assurance of a quality tax program, its sufficiency is being tested in the formal pilot study. Rules for the scheduling and frequency of reviews were developed by the Abt Expert Panel: in brief, complete Program Reviews will be required only every third year, although components may be required if there has been an important system change or corrective action. Acceptance samples will be taken every year, however. To assist executives, the results of each completed Program Review will be summarized in a brief compendium.

    3. Review of Revised Handbook by Pretest States and Regions. RQC staff made numerous revisions to the Handbook in response to comments received from both UIPL 44-90 respondents and the nine States which participated in the pretest. In late winter, these States and their ROs were asked to review the revised Handbook. All nine States plus one Region replied. All found the Handbook much improved, and all were pleased to see that their suggestions had been incorporated. The Handbook is now the basis of the Core RQC pilot test underway in eight States; it will be completed in April 1992. The findings of the pilot and input from the pilot States will help further refine the Handbook. Copies of the revised Handbook will be provided to all SESAs.

  5. Additional Guidance. Very shortly, SESAs will be provided with the schedule for the implementation of Core RQC, along with suggestions for preparing for the program, including the number and qualifications of staff needed.

  6. Action Required. SESA Administrators are requested to provide this information to appropriate SESA staff.

  7. Inquiries. Questions should be directed to the appropriate RO.

  8. Attachment. Comprehensive Summary of Comments Received in Response to UIPL 44-90, with Responses