Experts tell DE WIR team that possible OFCCP compliance issues depend on what factors are considered

Background: The American Community Survey (ACS) and other Census Bureau data products contain many kinds of statistical uncertainties, but to what extent do those uncertainties matter for OFCCP compliance? It all depends on which factors are taken into account in conducting availability analyses for Protected Groups, a group of experts told the DirectEmployers (DE) WIR team in interviews this week.

Census Bureau uses data uncertainty as a means to protect privacy

Before the Census Bureau publishes any statistic, it applies safeguards designed to help prevent someone from tracing that statistic back to a specific respondent. On its “Statistical Safeguards” webpage, the Census Bureau notes:

“We call these safeguards ‘disclosure avoidance,’ although these methods are also known as ‘statistical disclosure controls’ or ‘statistical disclosure limitations.’”

Although it might appear that a published table shows information about a specific individual, the Census Bureau has taken steps to disguise the original data in such a way that the results are still useful. These steps include using statistical methods such as “data swapping” and “noise injection.”

Carnegie Mellon study examined differential privacy and the impacts of data uncertainty on federal policy goals

Addressing some aspects of this practice, researchers at Carnegie Mellon University published a paper in the journal Science entitled “Policy impacts of statistical uncertainty and privacy.” Science is one of several journals the American Association for the Advancement of Science publishes. The researchers explained that a privacy protection system referred to in the scientific community as “differential privacy” is “the cornerstone of the Census Bureau’s updated disclosure avoidance system.” The Census Bureau states that it used differential privacy in the 2020 Decennial Census to keep pace with privacy threats emerging in the digital age.

But the researchers cautioned that “[i]f quantified and unquantified errors alike are not acknowledged and accounted for, policies that rely on census data sources may not distribute the impacts of uncertainty equally.”

The Carnegie Mellon researchers explained how differential privacy and other forms of data uncertainty affect Census Bureau data:

“Designed to rigorously prevent reconstruction, reidentification, and other attacks on personal data, differential privacy formally guarantees that published statistics are not sensitive to the presence or absence of any individual’s data by injecting transparently structured statistical uncertainty (noise). But even before differential privacy is applied, estimates from the Decennial Census, surveys such as the American Community Survey (ACS), and other Census Bureau data products used for critical policy decisions already contain many kinds of statistical uncertainty, including sampling, measurement, and other kinds of nonsampling error. Some amount of those errors is quantified, but numerous forms of error are not, including some nonresponses, misreporting, collection errors, and even hidden distortions introduced by previous disclosure avoidance measures such as data swapping.” [citations omitted].
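For readers curious about the mechanics, the “noise injection” the researchers describe can be illustrated with the classic Laplace mechanism. The sketch below is a simplified illustration only, not the Census Bureau’s production system (its 2020 “TopDown” algorithm is considerably more elaborate), and the count and epsilon values are hypothetical.

```python
# A minimal sketch of differential privacy's noise injection, assuming the
# classic Laplace mechanism. Not the Census Bureau's production code.
import numpy as np

rng = np.random.default_rng(0)

def laplace_release(true_count: int, epsilon: float, sensitivity: float = 1.0) -> float:
    """Release a count with Laplace noise scaled to sensitivity/epsilon.

    Smaller epsilon means stronger privacy and therefore more noise
    (i.e., more statistical uncertainty in the published figure).
    """
    scale = sensitivity / epsilon
    return true_count + rng.laplace(loc=0.0, scale=scale)

# Adding or removing any one respondent changes the true count by at most 1
# (the "sensitivity"), so the noisy output barely depends on any individual.
print(laplace_release(1234, epsilon=0.5))   # roughly 1234, give or take a few
print(laplace_release(1234, epsilon=0.05))  # noisier: tighter privacy budget
```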

The researchers examined several issues related to statistical uncertainty when sharing sensitive data. In the opening paragraph of their article, they explain:

“Differential privacy is an increasingly popular tool for preserving individuals’ privacy by adding statistical uncertainty when sharing sensitive data. Its introduction into US Census Bureau operations, however, has been controversial. Scholars, politicians, and activists have raised concerns about the integrity of census-guided democratic processes, from redistricting to voting rights. The debate raises important issues, yet most analyses of trade-offs around differential privacy overlook deeper uncertainties in census data. To illustrate, we examine how education policies that leverage census data misallocate funding because of statistical uncertainty, comparing the impacts of quantified data error and of a possible differentially private mechanism. We find that misallocations due to our differentially private mechanism occur on the margin of much larger misallocations due to existing data error that particularly disadvantage marginalized groups. But we also find that policy reforms can reduce the disparate impacts of both data error and privacy mechanisms.” [citations omitted].

Lead researcher Ryan Steed told WIR that uncertainty may impact availability calculations, depending on a variety of factors

As most of our readers well know, federal contractors use the Census Bureau’s ACS data to calculate the availability of minorities and women for their regulation-mandated written Affirmative Action Programs (AAPs). (For details on this process, see our story, Revised EEO Tabulation Data for AAPs Must Be Used Starting January 1, 2022.) Last year, we detailed some concerns federal contractors have regarding the 2018 ACS data in our story, “AAP Administrators — Budget (A LOT) Of Time for Census Re-Mapping.” In addition, OFCCP has used Census data to set various forms of recruitment “Goals” for minorities and women in construction contractor jobs and for Individuals with a Disability, as well as a hiring “Benchmark” for Protected Veterans.

To find out whether any of the Carnegie Mellon researchers’ observations about the uncertainty present in the ACS data might be relevant to AAPs, the DirectEmployers (DE) WIR team contacted the study’s lead researcher, Ryan Steed. Mr. Steed is a Ph.D. candidate at Carnegie Mellon’s Heinz College of Information Systems & Public Policy. He began his email response to us with a cautionary note:

“While we are not experts on Affirmative Action Plans and we haven’t done an empirical study on this particular use case, there are a few general comments that might be useful [to federal contractors]. Note that whether our results generalize to this use case depends on lots of factors – how granular the EEO tabulations are, the magnitude of the error estimates in the EEO tabulation, etc. These factors could also cause impacts to vary among federal contractors – some may be more impacted by uncertainty than others, depending on their job groups and geographic locations.”

WIR asked, “Are any of the study’s observations applicable to availability analysis?”

Mr. Steed: “In general, uncertainty impacts all census-guided policies in some way – how great the impact depends on lots of factors. As far as I know, EEO tabulations use 5-year ACS data, which may have less existing data error than the annual poverty estimates in our study. But if federal contractors are using more granular data (e.g., MSA-level data, or data involving very small minority groups), the impacts of uncertainty could still be noticeable. This uncertainty may impede affirmative action and hurt marginalized job seekers depending on:

  1. whether contractors are accounting for uncertainty in their calculations of availability (e.g. by incorporating the EEO margins of error in their AAPs – or calculating derived margins of error when using custom estimates – and using statistical testing to compare internal [see Editorial Note, below] and external availability);
  2. how contractors interpret uncertainty and use it to inform their AAPs;
  3. the extent to which statistical uncertainty (the margins of error) and systematic uncertainty (e.g., undercounting) make it difficult to tell whether internal and external availability are substantively different; and
  4. how the availability calculation ultimately influences contractors’ placement goals and achievement of those goals.”

[Editorial Note (our thanks to Michael S. Albert for this observation): When Mr. Steed refers to “internal availability,” we think he means, in the AAP context, employee headcount or the percent utilization rates of minorities and females. However, for AAP writers, “internal availability” means the second of two factors used in the overall Availability Analysis, that is, who is available to the contractor within its own workforce to enter the job group being analyzed, usually referred to as the “internal factor.” The first factor in the overall Availability Analysis is, of course, the “external” factor, which is typically where Census Data are used.]
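To make Mr. Steed’s first point concrete, here is a minimal sketch of one way a contractor might fold a published EEO margin of error into a comparison of internal utilization (in Mr. Steed’s sense, per the Editorial Note above) against external availability. It assumes ACS margins of error are published at the 90% confidence level (z of about 1.645) and uses a simple normal-approximation z-test; all figures are hypothetical, and practitioners may prefer exact tests.

```python
# Sketch: propagate the ACS margin of error into an internal-vs-external
# availability comparison. Assumes 90%-level MOEs and a normal approximation.
import math

def availability_gap_z(p_external: float, moe_external: float,
                       incumbents: int, headcount: int) -> float:
    """Z-statistic for (internal utilization - external availability),
    with the ACS sampling uncertainty included in the denominator."""
    se_external = moe_external / 1.645  # 90%-level MOE back to a standard error
    p_internal = incumbents / headcount
    se_internal = math.sqrt(p_internal * (1 - p_internal) / headcount)
    return (p_internal - p_external) / math.sqrt(se_external**2 + se_internal**2)

# Hypothetical job group: 6 women among 40 incumbents (15%), against an
# external availability of 22% plus or minus 5 points.
z = availability_gap_z(0.22, 0.05, incumbents=6, headcount=40)
print(round(z, 2))  # about -1.1; |z| < 2, so the gap is within the combined uncertainty
```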

WIR asked, “What are the concerns of which federal contractors might need to be aware?”

Mr. Steed: “If the main decision factor is how regional availability compares to a contractor’s employees, then uncertainty in the availability estimate could make it harder to determine whether external availability is significantly different than internal availability. How this affects AAPs depends on how contractors interpret this uncertainty. If contractors present availability estimates as certain when in fact they have large margins of error, for example, then it may appear that there is a significant difference in minority representation when in fact there is not; alternatively, if external availability is quite close to internal availability, but the margins of error are small, then it may appear that there is no difference in availability when in fact there is. Another concern, not addressed in our paper, is systematic uncertainty (e.g., undercounts of minority groups) that could make it appear as if external availability is much lower than it actually is, and cause contractors to miss disparities in their AAPs. Finally, the availability/quality of uncertainty estimates may also be a concern – many kinds of uncertainty may not be quantified in the census margins of error and could be difficult to account for. Likewise, it might be technically difficult for contractors to appropriately compute margins of error, especially for derived estimates – e.g., the weighted compositions of availability described in [§ 60-2.14(g)].”
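The “derived estimates” point is worth unpacking. When a job group draws on several census occupation codes, § 60-2.14(g) contemplates a weighted composition of availability, and the margin of error of that composite must itself be derived. Below is a minimal sketch using the standard ACS approximation of combining component MOEs as if they were independent; the occupation mix and figures are hypothetical.

```python
# Sketch: derived margin of error for a weighted availability composite,
# using the ACS handbook approximation (root-sum-of-squares of weighted MOEs).
import math

def weighted_availability_with_moe(components):
    """components: list of (weight, availability_pct, moe_pct) triples.

    Returns (weighted availability, approximate derived MOE)."""
    assert abs(sum(w for w, _, _ in components) - 1.0) < 1e-9
    estimate = sum(w * p for w, p, _ in components)
    moe = math.sqrt(sum((w * m) ** 2 for w, _, m in components))
    return estimate, moe

# Hypothetical job group drawing 60% on one census code and 40% on another:
mix = [(0.6, 18.0, 4.0),   # code A: 18% availability, +/- 4 points
       (0.4, 35.0, 9.0)]   # code B: 35% availability, +/- 9 points
est, moe = weighted_availability_with_moe(mix)
print(f"{est:.1f}% +/- {moe:.1f} points")  # 24.8% +/- 4.3 points
```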

He added that there may be some Census Bureau guidance on this for the EEO tabulation.

Again, Mr. Steed cautioned that he could not speak with “much more specificity without a better understanding of the policy and how the data are being used on the ground.”

The DE WIR team spoke to three top AAP preparation experts about dealing with margins of error in the field

With Steed’s additional insights in hand, the DE WIR team reached out to three top AAP developers with particular expertise in the intricacies of the Census data used in AAPs, to better understand how the data are being used on the ground. The experts we talked to were: (1) Judy Julius, Owner/Primary Consultant of EEO Consulting, LLC; (2) J. Stanley Koper, former Senior Vice President at Gaucher Associates and former Senior Compliance Advisor with OutSolve; and (3) Michael S. Albert, an attorney and independent human resources consultant.

WIR asked, “Do you have any examples of how/why/when margins of error have adversely affected AAP Availability calculations?”

Judy: “Let’s address the elephant in the room first. What everyone wants to talk about is the [ACS’] combination of the Aircraft Pilots with Flight Attendants. It was an unfortunate combination, and I’ve talked with the folks at the Census Bureau about the impact on AAPs when they combine a code that is heavily populated by females with a code that has very few females. This has the largest impact on the airlines since most employers have a handful of pilots at most. So, the airlines could certainly use the combined data for flight attendants without making their goals unreasonable or unattainable. Fortunately, there are other sources of data for pilots. The Pilot Institute published numbers in 2021 showing that female pilots represent 9.02% of the national workforce. And USA Today published national numbers for Pilots from the US Bureau of Labor Statistics for 2020: 3.4% Black, 5.0% Latinx, and 2.2% Asian for a combined minority percentage of 10.6%. There are many other sources of data, including IPEDS for postsecondary data, EEO-1 data for broad data, NSF for Science and Engineering data, and a quick Google search, which is how I found the pilot data. But before leaving the census data, always look around for a different code first.”

“Another example of an unfortunate combination is the Laborers and Freight Movers code (9620), which I always used for labor-intensive jobs. It now includes Hand Packers (9640). I don’t want my labor-intensive jobs to include the same folks who package my Amazon deliveries, so I use Other Material Movers (9510), which does not include any female-dominated jobs.”

“Another code that can be problematic is the Metal Workers and Plastic Workers. The Machinists, Tool & Die Makers, and Welders have been combined with Molding Machine Operators, Metal and Plastic – which was 19.3% females in the 2010 tables – and Rolling Machine Operators, Metal and Plastic – which was 25.7% females. I don’t want my welding jobs compared to folks making Barbies. There’s not another code to use, and oftentimes it works out. For one client with a lot of welders, we tried to use the new data but landed at 14.2% females. They would have eaten me for lunch if I told them that was their new goal. So, I used the low end of the margin of error, and we landed at 5.4% females. And, by the way, I used the low end of the margin of error with the 2010 data as well. The 2010 data wasn’t perfect either. Once you get into rural counties, numbers can get really jumpy. As much as I love seeing women work in non-traditional jobs, we are far from having 14% female welders anywhere in the country.”
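To put numbers on Judy’s “low end of the margin of error” approach: the margin below is inferred from her quoted welder figures, not taken from a published table.

```python
# Back-of-the-envelope on the welder example, with the MOE inferred from
# Judy's quoted figures (14.2% point estimate, 5.4% goal actually used).
estimate = 14.2               # % female, combined welder code
implied_moe = estimate - 5.4  # about 8.8 percentage points
low_end = estimate - implied_moe
print(f"{low_end:.1f}% female")  # 5.4%, the figure she used as the goal
```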

“There are some other codes which are used routinely that just have small populations: Natural Science Managers, Compensation and Benefits Managers, and Financial Analysts. Sometimes data will make sense in one metro area, and make no sense in another. We’ve seen 100% females or 100% minorities. So, we will typically choose another code. But those codes did not change. They have always been hit-or-miss.”

“Most of the management codes and professional codes were untouched. And something folks never talk about is that some codes needed to be combined. Once you got into rural areas, you’d be lucky to find anyone in certain codes. Even data in metro areas were skewed in the 2010 EEO Tables. As a consultant, my job is to make sure the affirmative action goals are reasonable and attainable. So, I will use every trick in the book: find another source, use another code, or use the low end of the margin of error if needed. And I also curse like a sailor some days. That helps.”

Stan: Although “some presenters mentioned that there were margins of error involved in the 2010 Census data,” OutSolve essentially used both the 2010 and 2020 data “without using margins of error for any of our calculations.”

“Keep in mind, too, that at Gaucher we used at least a four-factor availability analysis, with three external factors (reasonable recruiting area, expanded recruiting area, and (where appropriate) educational institution data), so we had some flexibility in developing availability estimates. In addition, of course, there is the OFCCP’s general lack of interest in availability estimates. It’s only when they are searching for something to find fault with that they might dig into availability estimates.”

“[With some exceptions], at Gaucher we used aggregate Minority rather than breaking out individual groups. That’s for the OFCCP to determine.”

“Otherwise, the only concern was at the client end, since, while failure to meet a goal was not a violation of any regulation, the OFCCP considers a failure to make good faith efforts to attain goals a material violation of the Executive Order. Our use of a combination of two standard deviations and whole person worried some clients, who were concerned about the lack of goals, as well as some Compliance Officers (such that we changed the heading of the column comparing availability to utilization from ‘2 or More Standard Deviations’ to simply ‘Statistical Significance?’).”

Michael: “Margins of error from the American Community Survey are the least of a contractor’s problems when putting together an AAP.”

“The Census Bureau maintains 867 Standard Occupational Classifications (SOCs). It has assigned each of these classifications to one of 236 Census Titles for the recently released ‘2014-2018’ EEO Data Reports used in Affirmative Action plans. Unlike the Decennial Census, which is sent to every household, the American Community Survey is a limited survey of American households, asking respondents to self-identify their race, ethnicity, and sex/gender and self-report each household member’s current job title and a brief explanation of job duties. ACS then determines which of the 867 SOCs to assign the respondent’s job to and which of the 236 Census Titles it will be counted in.”

“Since not every household is surveyed and not everyone responds, the sampling requires a ‘qualification’ regarding the reliability of the data, with a statistically calculated ‘margin of error,’ not unlike margins of error reported in political polls. These margins of error only concern the size of the census territory (population) and the number of responses requested and received.”
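For readers who want the mechanics behind that polling analogy, here is a simplified sketch of how a sampling margin of error shrinks as responses grow. It assumes simple random sampling, which the ACS does not actually use (its published MOEs come from replicate weights), but the Bureau does publish MOEs at the 90% confidence level.

```python
# Simplified illustration only: margin of error for a proportion under
# simple random sampling, at the 90% confidence level the ACS uses.
import math

def moe_90(p: float, n_responses: int) -> float:
    """Approximate 90%-level margin of error for a proportion p."""
    return 1.645 * math.sqrt(p * (1 - p) / n_responses)

for n in (100, 400, 1600):
    print(n, round(100 * moe_90(0.30, n), 1), "points")
# 100 -> 7.5; 400 -> 3.8; 1600 -> 1.9: quadrupling responses halves the MOE
```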

“The margin of error doesn’t take into consideration whether ACS assigns the job to the correct Census Title or how many different job titles, with different percentages of minorities and females, are mixed together into the same Census Title. And I think that’s more significant than the margin of error.”

“Contractors make many guesses when figuring out which data to use from the ACS: what census territories (usually one or more counties) ‘represent’ their actual recruiting areas, whether to use residence or worksite reports, which census titles to use for each job group, and how to combine the data. There are so many estimates and guesses, both by ACS and each contractor, that the margin of error in the ACS report should only be used as a reminder that none of the data can be relied upon as the ‘truth.’”

“But that doesn’t matter in an AAP. Using census data is just a tool and an estimate of what the ‘availability’ of minorities and females might be, not what it really is. And the best guard against these data forcing a contractor to set goals that might not be necessary isn’t to disregard the data. It’s to do two things: keep the number of people in a job group as small as possible (which may translate into having a lot of job groups within an AAP) and use a statistical test like the Fisher’s Exact Test to determine whether any discrepancy between a contractor’s workforce and the estimated Availability is statistically significant, because that test leaves a lot of room to fall short of the availability before the shortfall is statistically significant enough to declare a goal.”
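Here is a minimal sketch of the Fisher’s Exact Test step Michael describes, using one plausible construction of the 2x2 table: the job group’s incumbents against the census pool underlying the availability figure. All counts are hypothetical, and other constructions of the table are possible.

```python
# Sketch: Fisher's Exact Test comparing a job group's incumbents against the
# (hypothetical) census pool behind the availability estimate.
from scipy.stats import fisher_exact

incumbent_female, incumbent_male = 5, 35   # contractor's job group: 12.5% female
pool_female, pool_male = 220, 780          # census pool: 22% availability

table = [[incumbent_female, incumbent_male],
         [pool_female, pool_male]]

# One-sided: are females under-represented among incumbents relative to the pool?
odds_ratio, p_value = fisher_exact(table, alternative="less")
print(round(p_value, 2))  # roughly 0.1, above 0.05: no statistically
                          # significant shortfall, so no goal would be declared
```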

WIR asked, “How do you deal with external availability calculations in AAPs when you start to sense they are NOT accurately reporting true external availability?”

Judy: “I have never used external applicant flow in lieu of census data. External applicant flow has a LOT more issues than the census data and those issues are only getting worse with the high turnover of the TA staffs. What I will do on occasion is use some internal applicant flow. Let’s say there’s a training program for Machinists and the feeder pool is the Machine Operators. Machine Operators is 80% female, but very few of them would like to be Machinists. The client keeps records of who applies internally for the trainee program, and of course 90% of them are men. So yes – we will use three years’ worth of internal data instead of using everyone in the feeder pool. The only time I use external applicant flow is as my first line of defense when a client misses their goal. If the goal is to hire 20% minority engineers and they miss the goal, we will talk about the broad recruiting that was done, and how, despite our good faith efforts, only 10% of the applicants were minorities. Then we will promise to try harder next year.”

“I’ve been using the new census data since February 2021 when it arrived at the data warehouse in Excel. The new Occupation Index will be published in 2028, and the next round of EEO tables won’t be out until 2031. So, it’s not going away or being replaced. I’ve promised my new friends at the Census Bureau that I would give them a list of codes that should not be combined next time. These are great people who simply did not have a full understanding of how the data is actually used for AAPs. They are smart and kind, have answered all of my questions, and are eager to get feedback on how to improve the tables. Any time you go to the portal you can click on the ‘was this helpful’ button and give them feedback. I encourage everyone to do that. In the meantime, 95% of the data is great and usable. So, don’t throw out the baby with the bath water. Find other codes, another source, or use the low end of the MOE for the 5% of the data which is unusable.”

“As an Affirmative Action practitioner, I don’t look at the margin of error unless the percentage of females or minorities seems, on the surface, to be unreasonable for the recruiting area that I’m using. Then I look at the total number of people in the population. If that is really small, I will look for a different code with more people. The larger the population, the better the data. If that doesn’t work, I will find another source, or use the low end of the margin of error. Fortunately, the regulations do not require federal contractors or subcontractors to establish goals for minority subgroups. I hope the OFCCP does not go down that road. Not only would it be problematic with MOEs, since the numbers will be smaller for each race/ethnicity, but there is no data for two or more races in the EEO Tables.”

“Also – we’ve known since the start that census data wasn’t perfect. That’s why we get 20% off! We have four options when determining underutilization: the any-difference rule, which I hope no one uses; the 80% rule, very popular; the whole-person rule, also very popular; and the 2-standard-deviation rule, which is popular with large companies even though it is used too often for small job groups. (I never recommend 2 Standard Deviations to my clients because their headcounts are too small, and it makes my clients’ eyes roll back in their heads. Plus, I like to use one rule for all my job groups.)”
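For reference, here is a minimal sketch of the four underutilization rules Judy lists. Exact formulations vary among practitioners; these are common versions, applied to a hypothetical job group, and they illustrate her point that the 2-standard-deviation rule rarely flags small groups.

```python
# Sketch: four common underutilization rules. Formulations vary in practice.
import math

def underutilized(availability: float, incumbents: int, headcount: int, rule: str) -> bool:
    utilization = incumbents / headcount
    expected = availability * headcount
    if rule == "any_difference":
        return utilization < availability
    if rule == "80_percent":
        return utilization < 0.8 * availability
    if rule == "whole_person":
        return expected - incumbents >= 1.0  # short by at least one whole person
    if rule == "two_sd":                     # normal approximation to the binomial
        sd = math.sqrt(headcount * availability * (1 - availability))
        return (incumbents - expected) / sd < -2
    raise ValueError(rule)

# Hypothetical job group: 3 women among 25 incumbents against 18% availability.
for rule in ("any_difference", "80_percent", "whole_person", "two_sd"):
    print(rule, underutilized(0.18, 3, 25, rule))
# The first three flag the group; two_sd does not (z is only about -0.8),
# showing why the 2 SD rule rarely declares goals for small headcounts.
```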

Stan: “With the 2010 Census data, we did have some situations where the Census data did not show anyone in particular occupational categories in the reasonable recruiting area for some clients, while at the same time, the clients employed people in those categories. If I recall correctly, in those instances we would add a percentage to reflect those individuals.”

“I continued to prepare ‘consolidated’ AAPs well past the implementation of Functional AAPs, and extended the OFCCP’s ‘campus’ approach beyond institutions of higher education to other supply and service contractors, so my attitude toward availability estimates is equally ‘relaxed.’ For most of our clients, goals were like corks floating in the ocean, disappearing one year into a trough, rising on a wave into visibility in another year, then disappearing again.”

“And if you have a small establishment, with a Job Group of a dozen executives, you determine a goal (or not) based on the entire population of the Job Group, but you certainly aren’t likely to turn over all twelve in a year. Perhaps one, maybe two. What’s a Minority goal of 12% against that number? Similarly, if you have a Job Group with a mix of three or more Census occupational classifications, you do the weighted calculation of availability, but you’re unlikely to hire within more than one of those classifications. You hire an accountant, but all those HR professionals are pushing the numbers up (or down).”

“Companies like OutSolve (and Gaucher Associates) work to prepare AAPs as efficiently as possible, in order to keep costs low. To the extent that researching and determining if, when, and where to address margins of error increases the time needed to prepare an AAP, that means extra cost.”

“Given that the OFCCP has to date paid little to no attention to the construction of availability estimates, there is little pressure to tweak those estimates to a contractor’s advantage (or to satisfy those clients who want to have goals). And where the OFCCP’s focus is on goals, and a contractor’s efforts to meet those goals through outreach and recruiting, it isn’t so much having a goal (say for a Job Group composed of Airline Pilots, given that the 2020 occupational classification includes both Pilots and Flight Attendants), as being able to stay ahead of the OFCCP in terms of appropriate recruiting sources. Contractors don’t want the agency to come up with sources they aren’t already using, or have considered and rejected for good reason.”

“We had one audit where the Regional operations director was so desperate to find fault that he identified a Service Job Group containing a medical technician title, which comprised perhaps six people out of a total of 200+ in the Job Group, and insisted that it belonged more properly in a Technician Job Group. Only this particular title was described in the EEO-1 guide as a service position (along the lines of ‘Sanitation Engineer’ for garbage handler). We have also encountered Compliance Officers who did not know how to construct a weighted average of Census data.”

“The agency’s Compliance Officers are being whipped from above, and the demand is for pay and hiring discrimination ‘scalps,’ so that’s where the focus is. And that’s where AAP preparers are spending the bulk of their time, trying to ensure that those things aren’t happening.”

Michael: “There’s no such thing as ‘true external availability.’ When I’m tentatively done with the Availability Analysis, I put the contractor’s incumbents and the proposed Availability percentages through the Fisher’s Exact Test. If the differences aren’t statistically significant, and I wouldn’t have to declare a goal, I don’t adjust anything.”

“But if any are statistically significant, I do look back at all the assumptions I’ve made and consider whether the census data are responsible for a false positive. That includes whether I chose the best matches or should consider other census titles that might better estimate the availability of minorities and females.”

“I look at the hiring and applicant percentages for the most recent year, and several years, if I have the data. If the contractor is doing a good job at recruiting minorities and females in other job groups, I’m inclined to consider merging hiring or applicant percentages, whichever is higher, into the census percentage for the outlier job group.”

“I’m mindful of some other considerations. The OFCCP isn’t conducting many audits, so the chances of any of my clients’ AAPs being audited are minimal. Declaring a goal is not a red flag during an audit, but it does require some good faith efforts during the year to meet it. If the contractor is doing ‘everything’ it can think of to hire more minorities or females in the job group and, after several years, hiring percentages of minorities or females don’t change, it may be time to reevaluate whether the Availability and goal percentages are simply too high and whether the census may be giving us a false positive.”

Following up on Judy’s and Stan’s comments about Pilots and Flight Attendants, Michael offered the following observations:

“In the 2000 EEO reports, 7.1% of Airline pilots in the US were minorities and 4.0% were females. For Flight Attendants, 27.2% were minorities and 78.7% were females.”

“In the 2006-2010 ACS reports, 9.4% of Airline pilots in the US were minorities and 4.7% were females. For Flight Attendants, 27.8% were minorities and 79.6% were females.”

“In the 2014-2018 ACS reports, Airline pilots and Flight Attendants were consolidated into the same Census Title: ‘Air Transportation Workers,’ with 22.8% minorities and 34.1% females.”
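A rough back-of-the-envelope shows why that consolidation matters. Using the 2006-2010 rates above as a proxy for the two occupations’ separate rates (the 2014-2018 reports no longer break them out), the combined 34.1% female figure implies Flight Attendants make up roughly 39% of the consolidated category, enough to pull the female percentage an airline would apply to its Pilot job group up by a factor of about seven.

```python
# Back out the implied Flight Attendant share of the consolidated category,
# assuming the 2006-2010 female rates (4.7% pilots, 79.6% flight attendants)
# still roughly hold. Solve 4.7*(1-w) + 79.6*w = 34.1 for w.
w = (34.1 - 4.7) / (79.6 - 4.7)
print(round(w, 2))  # about 0.39
```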

“Airlines most likely keep their Pilots in a separate job group from their Flight Attendants. The ACS only gives airlines one set of data to use for both job groups. If I were preparing an AAP for an airline, I wouldn’t use the ACS EEO reports for either group.”

THIS COLUMN IS MEANT TO ASSIST IN A GENERAL UNDERSTANDING OF THE CURRENT LAW AND PRACTICE RELATING TO OFCCP. IT IS NOT TO BE REGARDED AS LEGAL ADVICE. COMPANIES OR INDIVIDUALS WITH PARTICULAR QUESTIONS SHOULD SEEK ADVICE OF COUNSEL.
