PARTICIPANTS

David Neumark, John Taylor, Uschi Backes-Gellner, Eric Bettinger, Patrick Biggs, Michael Boskin, John Cochrane, Bradley Combest, Steven Davis, Andy Filardo, Shanon Fitzgerald, Bob Hall, Rick Hanushek, Robert Hodrick, Ken Judd, Matthew Kahn, Tom Kulisz, David Laidler, Ross Levine, Michael Melvin, Casey Mulligan, Vinh Nguyen, Elena Pastorino, Valerie Ramey, Alison Schrager, Richard Sousa, Tom Stephenson, Jack Tatom

ISSUES DISCUSSED

David Neumark, Hoover visiting fellow and distinguished professor of economics and codirector of the Center for Population, Inequality, and Policy at the University of California-Irvine, discussed “Help Really Wanted? The Impact of Age Stereotypes in Job Ads on Applications from Older Workers,” a paper with Ian Burn (University of Liverpool), Daniel Firoozi (Claremont McKenna College), and Daniel Ladd (Quantitative Economic Solutions, LLC).

John Taylor, the Mary and Robert Raymond Professor of Economics at Stanford University and the George P. Shultz Senior Fellow in Economics at the Hoover Institution, was the moderator.

PAPER SUMMARY

Correspondence studies have found evidence of age discrimination in callback rates for older workers, but less is known about whether job advertisements can themselves shape the age composition of the applicant pool. We construct job ads for administrative assistant, retail, and security guard jobs, using language from real job ads collected in a prior large-scale correspondence study (Neumark et al., 2019a). We modify the job-ad language to randomly vary whether the job ad includes ageist language regarding age-related stereotypes. Our main analysis relies on computational linguistics/machine learning methods to design job ads based on the semantic similarity between phrases in job ads and age-related stereotypes. In contrast to a correspondence study in which job searchers are artificial and researchers study the responses of real employers, in our research the job ads are artificial and we study the responses of real job searchers.

We find that job-ad language related to ageist stereotypes, even when the language is not blatantly or specifically age-related, deters older workers from applying for jobs. The change in the age distribution of applicants is large, with significant declines in the average and median age, the 75th percentile of the age distribution, and the share of applicants over 40. Based on these estimates and those from the correspondence study, and the fact that we use real-world ageist job-ad language, we conclude that job-ad language that deters older workers from applying for jobs can have roughly as large an impact on hiring of older workers as direct age discrimination in hiring.

To read the paper, click here
To read the slides, click here

WATCH THE SEMINAR

Topic: “Help Really Wanted? The Impact of Age Stereotypes in Job Ads on Applications from Older Workers”
Start Time: December 13, 2023, 12:00 PM PT

>> John: You have people listening from various parts of the world and you have a group here. Thank you, Steve. Thank you.

>> David Neumark: So thanks so much for being here and thanks for hosting me. It's a great place to visit and I kind of love the seminar. I gave a talk here I think last November or January, which I think was my first time seeing this group.

And it's a treat to get a lot of very different perspectives, more questions and different questions than you get with 30 labor economists in the room, which is great cuz I'm used to that. And I've had most of those questions by now. I guess, I think the email kinda set the ground rules, I'm happy to do the seminar style rather than a talk with questions at the end.

I'm gonna-

>> John: We don't allow that.

>> John: It's not an option.

>> David Neumark: But I just to warn you, I'm kind of talking about a sequence of papers, although focusing on the last one. So if we get too bogged down on the early ones, I may sort of try to blow through those cuz the seminar is not really about those, but we'll play it by ear.

I'll even take my watch off so I know what time it is. Always a good idea. Okay, I want to, before I start, really quickly, so no questions on these slides. These we can talk about later. Just sort of tell you a bit who I am, cuz you don't all know.

For some people in this room, probably best known for my work on minimum wages, but I have a much broader research agenda. Not that surprising, I suppose, but it is pretty unified, I think. After I wrote the slide, I looked at my CV and I think it applied to pretty much everything on the first four or five pages.

So that's good. But that is policies that promote employment, and in particular, that can help encourage or promote economic self-sufficiency. And it comes from a view, I just think, this isn't economics, I suppose, but maybe it is, that we're all better off if people can actually work and achieve a standard of living that we deem acceptable.

And all rich, all western societies have some standard of what's an acceptable standard of living. There's obviously other ways to achieve standards of living, but I think there's a lot of broad agreement on that. So there's a number of things that fit in this. One is, I'll start with, redistribution.

Sometimes like any western society, we might decide the market isn't delivering a satisfactory distribution of resources. So we might redistribute income. But we still have choices about how to do that. And my work on that has been to really think about which of these does more to encourage or not to discourage work, and does more to encourage human capital investment and not to discourage it.

Because if we encourage it, then people might get out of the lower incomes that we find unsatisfactory to higher incomes. And a lot of my work on minimum wages and the earned income tax credit is posed very much in that way. Minimum wages are obviously one way to say, how do we make sure people at low wages earn more?

Minimum wages have other problems, which I won't get into today; the EITC, I think, works in a very effective and opposite way. I have a pretty large research agenda on, not mainly spatial, but sort of related impediments to employment. A lot of that focuses on place-based policies, enterprise zones, now opportunity zones, and hiring credits more generally.

Other cases where there might be a reason and where we can find where it works, which is harder but the answer is not never, where subsidizing employment might actually help put more people to work. And then closer to what I'm talking about today, is reducing barriers to work.

My earliest work, and something I've continued to work on my whole career, is labor market discrimination, which obviously, if it is occurring, is a barrier to employment. And also I study the effects of policies that are meant to counter it and how well they work or don't work.

So I have a large body of literature on that. I think it's a really important question. I was driving home last night, actually, an Uber guy's driving me home about 100 miles an hour, which is a little scary, but he got me home quickly. And I was listening to your GoodFellows with Bari Weiss, which is great.

And I think it might have been you, John, who said, I don't know if it was you or somebody else, that anti-racism is racism. I don't want to attribute it to the wrong person. I do think, though, there clearly is something to that.

But there clearly is also something to the fact that there is discrimination. It may not be as structural, as systemic as people think it is. And those of us who think that we need to avoid sort of the so-called anti-racist remedies, which I realize is a very large label, I think, also need to think seriously about maintaining, and perhaps strengthening, the laws we have to make sure the playing field actually is level. You can't, on the one hand, say, in some sense, that perspective goes too far, and on the other hand say we have nothing to do.

Unless you think there really is no discrimination, and I'll be showing you a lot of evidence today on that question. I do wanna say I've worked on this topic for a long time. I, like anyone in this room, probably read Becker early in my career, and as an economist, of course, very attuned to non-discriminatory explanations.

So I spend a lot of time focusing on those as well, and really trying to come up with rigorous evidence on discrimination when I can. And my work is, it's probably been all over the map. Sometimes I find discrimination, sometimes I don't. I probably shouldn't say it depends on the context, but it probably does.

Sorry, I couldn't resist. Okay, and just next up, I have this agenda I wanna get started on, where I've been thinking a lot about the barriers to employment, not so much faced by those who, and I don't mean to say it's all demand, maybe supply as well. We spend a lot of time focusing on those who don't work much or ever.

But there's a lot of people who work and find jobs and lose them a lot. And it's people on the marginal parts of the labor market for whom the world is a lot more chaotic than our world. If we have to go to the doctor, we just go to the doctor, no one cares.

We don't even have to tell anybody. Other people don't show up, the kid gets sick, their car breaks down, they get fired. And there are interventions I've been thinking about, including experimental ones, to try to think about ways where we might, it might be low-hanging fruit, I have a question mark on that, I don't know the answer yet.

But keep those people employed more, cuz they're the ones who already want to work and are able to find work and find jobs, which may make it a lot easier to flip the switch for them than for the folks who haven't worked for a very long time. Both are important, but that may be as large, and perhaps an easier-to-access, source of increased employment.

Okay, I'm gonna skip that and just jump to the talk. Okay, so long title, help really wanted. Impact of age stereotypes in job ads on applications from older workers. This is joint work with three former students, two professors now, one a consultant. So let me jump right into the motivation first of all, and again, feel free to ask questions.

I've been interested in economics of aging and age discrimination as a part of that for a long time. And I think a core motivation is aging populations, right? We know the population is getting older here. It's getting older a lot faster in other countries. And when I talk about this in Japan, they're really interested in it, given that their population is already shrinking.

It brings about higher dependency ratios, slower output growth, some strain on, obviously, social programs related to the share retired versus the share working. And I don't think there's any question that western governments, as well as the Chinese government for that matter, are very interested in trying to boost employment among older people.

So we do a lot of supply side things. We've had Social Security reforms, I don't need to talk about that in this room. Other countries have done similar pension reforms, but there may also be demand side barriers if age discrimination is important, and that may frustrate these efforts. If we are strengthening supply side incentives and age discrimination is very important, then we may kind of be pushing on a string here.

And the real downside potentially is we look at Social Security trust fund solvency and we do the supply side reforms and we don't get as much reaction as we hope, perhaps. And then we need to do something even harsher. And for people like us having to work a little longer, a lot of us will kind of work until we can't work anymore anyways, no big deal.

But there's a lot of people for whom working at older ages is harder or even may become punitive, depending on the work they did earlier. So the notion that reducing demand-side barriers, if they're important, is complementary with these supply side efforts, I think is pretty compelling, depending on what the evidence says.

Of course, I'm not gonna focus only on retirement ages, because we have a big problem of low employment among less educated people at older ages, but before retirement, as well. And those may well carry over into retirement, because if you're out of the workforce a lot in your 40s and 50s, the chances of getting back later might be quite slim.

Okay, this paper focuses on hiring. And the first thing you might say is, we're talking about people working longer. Why is hiring part of the picture? Well, it is for two reasons. First of all, a lot of people go through multiple jobs on the way to retirement. People who work in this field call them bridge jobs, partial retirement, even use the word unretirement.

And that is simply that people are in full-time career jobs and move to less demanding, whether hours or physically demanding or scheduling or who knows what jobs, sometimes in and out for a few years on their way to sort of kind of retirement as an absorbing state. If we're really talking about extending work lives even more, that is, older than people are now, then of course that may become even more important.

Because to the extent those are driven by health challenges or you have grandchildren and you want to spend more time with them or whatever it is that leads people to maybe seek more flexible jobs for a while before they stop working altogether. There's presumably even more of that if we're gonna talk about a lot more 70-year-olds or 75-year-olds working.

The second reason I think hiring is important is because we have very strong discrimination laws in this country. I haven't done research on them in other countries, I've done a lot in the US, but I think it's fairly clear they're by far the strongest and most effective in the world.

Which doesn't mean they work perfectly, but they're very strong. But on the hiring side, discrimination laws, at least some people think, including the EEOC, don't work very well or don't work as well as they could. And there's really two reasons. Enforcement of discrimination laws in the US, very much driven by private attorneys.

Plaintiff's attorneys typically work on contingency fees; they, and especially the best lawyers, are attracted to large cases where they can get a share. And in hiring cases, the damages may not be that big, right? If you get terminated at 55, you've lost pension accumulations, maybe healthcare, a higher-wage job.

The damages there can be huge, even for a sort of moderately paid professional person. If you don't get hired, it's not that you never get hired, but you might get hired three months later. The second reason is we use class actions in the US, and that's what really makes damages sort of multiply.

When a bunch of workers get terminated and they realize it was the women or the blacks or the older people or a bunch of people think they're underpaid and they can compare notes with their coworkers, it's kind of clear where the class comes from and how that emerges.

When you don't get hired, you don't know who else didn't get hired. You don't have any interaction with them, you may have no idea who they are. So there are, as a fact, clearly fewer class action suits on the hiring side. So the law may not be as effective.

So this, as I said, is sort of the culmination of a set of papers. Every time I look at the slide, I get really tired. Only because it was a lot of work over many years. But I also often say to people, which is a nice thing to say at this stage of a career, this is some of the favorite work, my favorite work I've done, which is an exciting thing to be able to do later in your career.

So I'm gonna briefly describe the progression of these papers and go through the first really quickly, the second in somewhat more detail, cuz it lays the groundwork for the final paper that I really focus on today. So the first is a large-scale field experiment on discrimination. These are called correspondence studies.

I'll give you a few details later. I would argue this paper provides very strong evidence of discrimination against older workers, more so, older women, in hiring. That's conventional, but very large-scale and complicated. The second is completely out of the blue. I didn't even know what I was doing before I started on it.

I think a lot of us are learning these methods as we go. The idea was to use machine learning and computational linguistics methods on the ads from the correspondence study. So, we had these ads that we sent fake applications to, and we use these computational linguistics methods to characterize those ads as to whether they contain age-related stereotypes or not.

And I'll talk about how we do that later; you'll get that in some detail. But the finding is that discriminatory employers, that is, those in the correspondence study who, in the experiment, discriminated, by which I mean they called back the younger person and not the older person, were more likely to have ageist stereotypes in their job-ad language, okay, which is interesting.

It's not entirely clear what that means. And that leads to this final paper, and there's an intermediate one along the way, which is what's going on. And what we do in this paper, and I'll explain a lot more about why later on, is we run a large scale field experiment where we flip the script.

So, labor economists have, since the 70s, really been running these kinds of audit or correspondence studies. This paradigm where you create fake applicants to jobs, originally in person, now almost always online. Identical, except for membership of group X or group Y. And you test for discrimination at the hiring stage, at the callback stage.

 

>> John: That's why you keep using hiring.

>> David Neumark: Right.

>> John: Callbacks or hiring?

>> David Neumark: No, it is callbacks, and I'm gonna say that, yeah, right. Yes, well, so a correspondence study, just to fix the terms here, there are audit studies. The big famous set were those Urban Institute ones in, I wanna say, the early mid-eighties.

And there, you actually sent people out as actors, and they actually went through the job offer stage. This was pre-IRB stage, I don't think you could get those approved anymore. Although a lot of my colleagues at other universities can't even get correspondence studies approved. Correspondence studies are, the original one was, someone sent out letters, and that's why it's called the correspondence study.

And now, of course, we do it online. You're sending out fictitious applicants, there's no one to interview, and you don't wanna waste people's time, so it's a callback. But there is evidence on this, there's an ILO study where they sort of do all the stages, and most of the discrimination seems to happen at the callback stage, which I think kind of makes sense, cuz once you're in the interview pool, there's data and HR sort of gets involved.

But you're right, they are callbacks. So but anyways, we flip the script here, and instead of sending fake applicants to real ads, we put up fake job ads, and we study the applicants that come in. And the idea here is that we're gonna vary the treatment across these ads by inserting quite subtle and barely noticeable, but clearly noticed, as you'll see, phrases related to job requirements that convey sort of classical age stereotypes, and I'll tell you what they are and where they come from later. When you do that, you get, well, I'll preview the answer on the next slide, sorry. So that's sort of the progression of these papers.

There's also this quote I ran across which sort of also helps motivate this, if you just wanted to think about it in isolation, which says that despite protections by the ADEA, that's like the part of the civil rights law that protects older people, employers have gotten clever in masking what is age discrimination by using ageist phrases in job ads.

And the idea here is to test whether employers might essentially manipulate who applies or shape the applicant pool by using subtle cues in the language of the job ads.

>> John: Can you define first, in an economic sense, what you mean by age discrimination? Because old people are biologically different from young people, and their career, the option on a 30-year career trajectory, is much different for an old person than for a young person.

 

>> David Neumark: Right.

>> John: Physical abilities are different. I mean, you can try to hold some of that constant. So what do we really mean by discrimination here?

>> David Neumark: No, it's a very good question, I've obviously thought about this a lot, no surprise. So the typical paradigm for discrimination, this goes back to the Becker model, it's used in court, it underlies these correspondence studies, is people who are identically productive are treated differently.

And as you're pointing out, not everyone's just making widgets today, so what does identically productive mean in a sense? So I think it's a real issue, I think I'll say a few things, and I'll come back to more of this later. But first of all, the law is actually pretty wisely written.

When we passed the Civil Rights Act in 64, they deferred on age, the Department of Labor was commissioned to study it, and they wrote a different law. A lot of it sounds the same, but there are some differences that do explicitly recognize these things. A seniority system is allowed, taking account of the higher benefit costs of older workers is sort of non-discriminatory.

The standards are higher, and there's more recognition of exactly what you say. The Civil Rights Act, in a sense, is written as if black or white is just randomly stamped on your forehead, and there's essentially no distinction allowed. Gender is almost the same, there's a few distinctions allowed for certain jobs, and that's about it.

But it recognizes that age is quite different. I'm gonna show you, this may not be the full list of responses. So I think, first of all, we wanna be a little careful, because when you hire somebody, you don't care how long they're gonna work, you care how long they're gonna work at your firm, right?

And in fact, if you look at longitudinal data, obviously not at 70 years old, but at 50 or 55, expected job tenure isn't necessarily that different, as far as we can tell, between new hires at different ages. Cuz younger people move for all the reasons we've learned from the job shopping literature.

I'm gonna show you that part of what I find is setting in in the 40s, when I suspect these things are not that important. I'm gonna show you maybe just a slide citing some evidence that pretty careful research on productivity by age, up to around 65, doesn't really find any clear differences.

But it's definitely a real issue, and it is trickier and muddier than other divisions between people. But we can go back to more later.

>> Speaker 3: I had the very same question, but I was waiting for you to describe your design. I'm thinking it very much depends on the occupation and the industry.

I mean, mathematicians are in their prime, in their 20s, in other jobs-

>> David Neumark: Sure, no one's doing these studies for mathematicians, just to be clear.

>> Speaker 3: That was a rhetorical point. But I'm thinking of job tenure or the human capital profile or the cost of hiring. You may not be acquiring any human capital, but your productivity process, say, even as proxied by a health process, as we all know, I mean, it's different when you're a little younger, a little older.

If I were to think that otherwise identical individuals, one young and one old, are not equally productive, then even from the point of view of the empirical design of an experimental study, how do you allow for this option value difference? That's exactly what's being equalized by a forward-looking employer.

A person who is otherwise equally productive, statically speaking, but of a different age, is gonna produce, in expected present value terms, a different amount of output.

>> David Neumark: Right.

>> Speaker 3: Because I think that the older person, for a variety of reasons is more likely to drop out of the job, drop out of the labor force.

 

>> David Neumark: Well, the dropping out of the labor force doesn't matter, you only care about your firm. And that's an important point, right? I mean, clearly they're gonna work for fewer future years, but not necessarily future tenure, which is all you care about?

>> Speaker 3: From an expectation point of view, even if, so to speak, the shock is rare, you may not be seeing a lot of breakups of the relationship, it's like time-varying disaster risk.

 

>> David Neumark: Right.

>> Speaker 3: There's a probability, a risk, that an older person is subject to more than a young person.

>> David Neumark: True, okay, so lemme just say this, and then we'll move on. So I agree, these are all true, and these kind of studies focus on quite low-skilled jobs, which are the jobs most people hold.

All of this work on correspondence studies, if people aren't being careful about this, they should be. You have to say it's sort of a high internal validity, low external validity world, like most field experiments are. I think about the nature of these experiments: I'm sending out resumes and you've never heard of this person.

So if I send my resume here, if no one's heard of me, they're not gonna hire me, right? If I send my resume to McDonald's, it's not like, all right, that guy's got a reputation as a really good hamburger maker. So I think these experiments are always done in those kinds of jobs.

And I think you wanna be very careful, that's all I can say, about assuming this applies to more professional jobs. The only study I could think of, and I was surprised that it worked in a sense, Doug Kruse at Rutgers did a study of disability discrimination where he actually did kind of professional workers.

And I was very skeptical when I first heard he was doing this. But it seems to work, people get hired, even though it's a field where you think anonymous, not anonymous, but unknown resumes would get no attention.

>> John: Everywhere in economics, we have the unobserved heterogeneity problem. You're gonna do what you can about it and then somebody can complain, well, there were other reasons that you-

 

>> David Neumark: Sure, and the first study, as I will tell you in a few minutes, the reason I did it, why do another correspondence study, was prompted by exactly some of these concerns. So I will come to that later.

>> John: I once met a guy who was hiring people, this was in the recession.

There's all these 55-year-olds who were having trouble finding jobs. Why don't you hire these people? And he said, I wanna hire people who are on their way up, not who are on their way down. Is that age discrimination or is that something correlated with age?

>> David Neumark: Well, it's a good question.

At UCI, we have a distinguished professor rank, and no one that young is distinguished or eligible for it. It's kind of a good deal. And the most important thing about it is we don't need a slot to hire someone, there's sort of a university-level thing. So every fall when we first meet, the chair always says, does anyone have any candidates for distinguished professor, because it's kind of a free position for us, not for the university.

And usually people bring up names, and usually, as you would expect, they're older. And there's one guy who's a very smart guy and a good friend of mine, so I won't name him, but not a fast learner; every year he says, he's too old or she's too old.

>> David Neumark: And one response is, that's illegal, but I try, I've said that a few times and stopped.

Although it does say something interesting, I think, in that that person would never dismiss somebody out of hand cuz of their race or ethnicity or religion or gender, right? And with age, this is someone who's willing to say in front of his or her colleagues, they're too old, we shouldn't hire them, but by law, there's no difference.

 

>> John: What he means, of course, is the chance of that person making an even better contribution while here is low.

>> David Neumark: Right.

>> John: Whereas if we hire someone young, they might, right, they're even better.

>> David Neumark: Yeah, the thing I don't say is look around the table at the people we didn't hire this way and tell me they're more productive, usually not, some are, but many, many aren't.

None of these people flame out, and again, but I go back to the tenure, not academic tenure, but time. What you care about is how long they're gonna work here, and younger people move.

>> Speaker 4: So can I just ask you to back up for a second, talk about your view of what equilibrium in the labor market looks like, in the view of potential discrimination.

I mean, going back to Gary's work and others, there's always this issue of why firms, realizing people are getting underpaid because they're discriminated against, don't swoop in, reap all those productivity gains, share in them, and eliminate the problem. So obviously, there's maybe some fixed costs, or other things may be expensive to observe, there may be lack of competition in the labor market.

But what is the baseline that you think of as going on in the labor market against which these standards are, I guess should be in your view applied, but are being applied in the real world?

>> David Neumark: Well, I think I'm answering the same question, but tell me if I'm not, I think that view of Becker, that sort of competition should root out discrimination.

He's actually, if you go back and read it, he's very careful, to be clear, there are very specific assumptions under which this holds. So, first of all, if you get utility from hiring the favored group, as opposed to disutility from hiring the disfavored group, then discrimination persists in equilibrium.

Matthew Goldberg has a QJE paper, I never knew what happened to the guy afterwards, but it's a really interesting paper.

>> John: Everyone has to get that utility.

>> Speaker 5: The employer.

>> David Neumark: The employer.

>> John: And the employer has to be able to, right.

>> David Neumark: The employer gets utility.

>> John: There's a fringe of employers who will make money but don't have that.

 

>> David Neumark: Well, under perfect competition, right, these guys always lose. But as long as it's not perfect competition, if there's imperfect competition and you get disutility from hiring, you and I should still be willing to trade firms for a price and get rid of it. But if I get utility from hiring them, and I can take that out of my profits, that's no different from the old managerial perks reasons, my profits are lower because I want a bigger office or a corporate jet or whatever.

Customer discrimination, employee discrimination, don't get competed out of existence. I do a little bit of expert witness work, and in a case I worked on, where actually, amazingly enough, cuz I was a plaintiff's expert, I got the data to write a paper that I'm publishing, the defendant was a restaurant chain. It was like a healthy-image kind of restaurant, and it was clearly a customer discrimination thing that wasn't their product, and they were discriminating against older workers, and there's no competition that forces that out.

So I think there's a lot of ways to have discrimination in equilibrium as long as you don't have perfect competition. Okay, overview of the paper really quickly. So we create a bank of job ads for three occupations, and tying into the earlier discussion, it's a narrow set of fairly low-skilled occupations. We randomly vary, across job ads, these phrases related to stereotypes, which I'll document later.

We post these on job boards in 14 cities over about a year, a little less than a year and a half. And then there's no sampling, cuz we post the ads and we see what comes in, essentially. The sign doesn't surprise me; the magnitude surprises me a lot.

And I'll try to put this in perspective, and then I'll tell you why you should be more surprised by this than you might be when first reading the slide. Age stereotypes reduce the likelihood that older workers apply by a lot: on a base of an average age of about 33, it reduces the average age by around two and a half years, and the proportion over 40 falls by about half.

And then we do this, it's a back of the envelope calculation, but I think it's telling. We kind of compare what you get from this study where you see this discouragement of who applies in the first place from placing these ads versus what we found from the correspondence study, which you might call direct discrimination.

Not hiring, not calling back an older person who on paper looks the same, at least, and the magnitudes of these two effects are about the same, right? And what that says, if you take that seriously, is that, remember that quote I put up about employers getting cleverer, this use of language cues in job ads can have about as big an impact as the thing we usually measure when enforcement authorities and attorneys go around looking for discrimination and maybe prosecuting cases based on discrimination, which is, are there fewer older hires or black hires or whatever compared to the applicant pool. The effect of shaping the applicant pool can be just as big, which suggests we might be missing an important part of the picture in enforcement.

 

>> John: This is consistent with one version of Becker, where discrimination takes the form of segregation across workplaces of different types of workers and there is no effect on wages. Some firms are specializing in younger workers, some firms are specializing in hiring older workers for non-productivity reasons but back to Mike's point, that doesn't tell you what happens to compensation.

 

>> David Neumark: That's absolutely true, right? That's absolutely true about Becker, right, I mean, his key point, which is on the margin, it's the marginal employer who affects wages in some models, now, in statistical discrimination models, that's not true, right. But in the original Becker model, that's right.

>> Speaker 3: Do you have also a sense of the target demand?

I'm thinking that maybe, again, a stereotype, all due respect, you prefer to meet an older physician because you think they're an experienced doctor. But there are respectable IO and marketing studies showing that for certain classes of conspicuous consumption, buyers want a relatively younger, relatively attractive type of sales representative.

 

>> David Neumark: Right, that could be. I mean, employers are not allowed to cater to that by law, so let me move on.

>> Speaker 3: I mean, you know the position, but do you know what kind of firm it is and what kind of products the firm sells?

>> David Neumark: No, these are, I'm gonna give you more details on these jobs, the retail jobs may be relevant, that's one of them, it could be.

Okay, so I think I wanna spend a little time on this question, like, are these results surprising or interesting? Obviously, you know my answer, but, so it wouldn't be surprising if I put really blatant ageist language in job ads. Now, of the 14,000 plus ads we used for the correspondence study, only one said, must not be over 40 cuz people understand that's illegal, right.

So it wasn't things like that, but we do, for example, put into the experiment as a kind of extreme case to see if the experiment is actually gonna be informative. These sort of extreme phrases that are suggested by the AARP, things like being a digital native or things like that.

That wouldn't be surprising: if you put in phrases like that and older people don't apply, you'd say, well, okay, so what, big surprise. There's a couple reasons our evidence, I think, is much more interesting and surprising. So first of all, everything we do is constructed from real ads; we take all these real ads, and we scramble them up and sort of kinda make them uniform for the experiment.

But we're using real world phrases, and they're much more subtle. I will show you, once I explain this computational linguistics stuff, where these phrases fit in, into the distribution of kind of ageist phrases. And they're not very biased. People still perceive them that way. I mean, clearly, older people respond to them, but they don't jump out at you.

Certainly, people like us who maybe don't apply for these jobs, they're not these blatant phrases. And second reason I think it's interesting is because putting the first two papers together, we found that employers who use this language actually are much more likely to call back the younger workers than the older workers.

So they discriminate, I'll put it that way in the experiment itself. And then I'm gonna show you a lot of other evidence, and this is partly why I'm gonna cut through some questions or maybe give you too short an answer, cuz I wanna get to it. But some other evidence that I think, and I say this not just cuz I'm sitting at Hoover, but anywhere.

I say this with a lot of caution, but I have some interesting evidence, I think, that the most natural interpretation of what we see, given other evidence I'll show you, is that employers are intentionally doing this to avoid hiring older workers. They're actually using these phrases to shape the applicant pool.

 

>> John: Clarifying question.

>> David Neumark: Sure.

>> John: When you use the word discrimination, does that mean relative to a legal standard, or does it mean relative to identically productive people get treated the same?

>> David Neumark: What it means in my case is discrimination in the correspondence study. So I see two resumes, and I'll talk about some features I vary.

 

>> John: Conceptually, the two are not the same. They might be the same.

>> David Neumark: Legally, it would be the same. I mean-

>> John: There's a legal standard, okay, which is set forth in the law, and employers need to think about that for a variety of reasons. They want to abide by the law, they don't wanna subject themselves to lawsuits.

That standard may or may not coincide with the notion that people who are identical and all relevant productivity attributes get treated the same in the market.

>> David Neumark: No, no, no, I'm saying the legal and the correspondence study definition are very similar. Cuz what I was studying-

>> John: When you say discrimination, then it means relative to the legal standard.

I'm just trying to understand.

>> David Neumark: Yes, yes, yes. But, yeah, that is correct. Okay, so I think there's some policy implications here. I already said this shaping the applicant pool appears quantitatively very important.

>> Speaker 4: Are you gonna show us the questions? And I hope we're not on the conclusions and moving on.

 

>> David Neumark: No, no, no, this is the introduction.

>> Speaker 4: Okay.

>> David Neumark: That's slide ten. There's 70, but don't worry, I have some I can skip over cuz I know the slides well. So explicit language is already covered. The EEOC gives a lot of guidance, and obviously you can't say things like, over 40 need not apply.

But the Code of Federal Regulations, which is, of course, where Congress's laws get sort of interpreted and fleshed out, also refers to other phrases that don't say, over 40 need not apply, but are clearly age related: young, recent college graduate, things like that. Our evidence implies that language much more subtle than this, and you will see what I mean, can potentially act as a form of age discrimination.

And you could use that a couple of ways. EEOC could just issue stronger guidance, and I've talked to enforcement agencies in the EU and Australia so far about this stuff, and they're interested in thinking about this. There's also a big issue of targeting of discrimination enforcement. We have the EEOC and then we have the state level versions in, I think, every state.

And people complain and they decide how much resources to put into an investigation and whether to ask for data, and that is burdensome on both them and on the companies. This kind of language might be sort of a clue as to, you know, where it might be more productive or less productive to look.

Okay, so let me now plunge into very quickly on the first paper and then much more on the other two. So I think most labor economists agree that these field experiments, resume studies, correspondence studies, audit studies provide the cleanest evidence on discrimination. Now, it is a bit of looking where the light is, right?

There's a lot of labor market decisions you just can't study with this method, or at least no one has figured out how to yet. I think that's an interesting challenge. There's a lot of these on race, ethnicity and gender and a much smaller number on age. So what's the paradigm?

We already talked about this, I don't have to repeat it again, but the reason I wrote this paper and got the Sloan Foundation to fund it, I'm not sure if this is why they funded it. But the reason I wanted to spend all this time doing this is I had two reasons to be skeptical of the existing studies, and the existing studies, they're all cited in that last bullet there.

All found evidence consistent with age discrimination. There were really two things I was concerned with, I guess, on the next slide, but that's okay. I'll jump to the bottom and then jump back to the details. One is, and it pertains to a lot of the questions we were getting before in even a much simpler way, how do I make them the same on paper?

Right, I can't give a 30-year-old 40 years of experience, he hasn't been around that long. And if I give the 60-year-old ten years of experience, they look kind of weird, and you might wonder what they've been doing. In fact, before we went into the field with this correspondence study, I gave a bunch of seminars.

People thought it was a little weird, but that's okay, where I just sort of said, here's our protocol. Here's what we're gonna do. And one of the things I did in there was to provide people with some real resumes and some fake resumes we had created and see if they could tell the difference.

And some of those resumes had black or Hispanic names and big missing gaps in experience. So they were 60 year olds or 65 year olds with ten years of experience. And on more than one occasion, people said, I just assumed they were in prison. Which highlights this point that having a big missing experience gap might look odd.

The existing studies tend to do that and therefore may tend to overstate age discrimination. The second concern, this is an area I've been working in kind of separately. So Jim Heckman wrote this paper, and then with Heckman and Siegelman, I think actually the two of them were first.

That said, while these correspondence studies sound so intuitively sensible, just as in terms of rigorous evidence for what you can study with them, there's this problem that you can take the best case scenario, you did the resumes right. So, in terms of what people observe, there's no reason to think there's an expected value difference between the blacks and whites or whomever.

And let's even assume there's no reason to assume unobserved differences. So wipe out some of the things you were talking about before. Sounds like they should be perfect there, right? How can this fail? Well, they can fail if the variance of the unobservables differs. And when I first read that a long, long time ago, I said, that doesn't sound right, but it was Heckman.

And so it was right. What's the intuition? The intuition is when I design one of these studies, I sort of pick, out of the air is not a bad characterization. Kind of the level of resumes I'm gonna create, and then I just put some random variation in. Now, suppose I choose resumes that are kind of low relative to what the employer usually sees, right?

Then the low variance group is dead because there's no reason for the employer to think, there's some chance they're gonna be great. The high variance group might get some leeway. So the high variance group will be favored, and you flip it around and I send out two high quality resumes.

Then the high variance group gets penalized. So I was working on this a long time ago. This is one of these things that sat on my desk forever. I have a JHR paper which proposes a solution to this. And basically, without getting into detail, by putting in at least a little bit of a continuum in the quality of resumes, you can kind of tease out this distributional thing and solve this problem.

I sent it to Heckman one night, well, you know him well. Thirty minutes later you get a response.

>> John: All caps.

>> David Neumark: Exactly, all caps, no punctuation. You know, this is great, David, you could do better.

>> David Neumark: Which was probably always true, I suppose. Anyways, so we wanted to deal with both of these things.

So one thing we do in this study is we actually put in a lot of different kinds of resumes, some where we do the typical paradigm, which is equal experience, and some where we do what I call experience commensurate with age.

So this goes back to this question of what's discrimination? When a 55-year-old tells you, and if you talk to non-academics, they'll all tell you this, there's age discrimination. They don't mean I'm being discriminated against as a 55 year old with ten years of experience. They mean I'm being discriminated against as a 55 year old with, whatever, 30 years of experience.

So I think that may be a more meaningful standard. We did that, and we did a bunch of other things: varying whether there wasn't a health problem or there might have been a health problem, whether the person was computer savvy or wasn't computer savvy, to try to get at some statistical discrimination ideas, which is hard to do.

We spent a lot of effort doing this, the thing is huge. We sent out 40,000 applications to over 13,000 jobs. We've really ruined this for everybody else who does these studies cuz now you can't do these with 1,000 people anymore. And none of these things matter, I mean, the results are very consistent regardless.

John, you have a question?

>> John: Well, I'm still troubled. Let's even take a classic one on race or gender, which seems easy. You have to assume that the rest of the CV carries all other information. If I'm hiring for somebody to work at a polo club and two things come in and one's black, it is a fact that because of stark discrimination, black people have less exposure to polo.

And so, hey, he's likely not to know as much about polo as Jay Houston Witherspoon III, who comes in as white, and now, that's statistical discrimination. But it's perfectly Bayesian, it's nothing bad about being black. And the same thing's gonna come through in spades.

>> David Neumark: Right, so back to Steve's point, legally there's no distinction.

 

>> John: Legally?

>> David Neumark: Right.

>> John: This isn't economic, just legally that's discrimination.

>> David Neumark: That's right, so-

>> John: Same way as if I put 55 versus 35 in, all sorts of things do in fact correlate with age. So even though I don't hate old people, I know that there's a good bet that X doesn't have certain kinds of experience, unless I believe that the rest of the CV completely controls for all information, which it doesn't.

 

>> David Neumark: Right, so you will still be troubled by this question at the end of the seminar, let's just be clear. Which is fine-

>> John: There's no way around it.

>> David Neumark: It's a very hard question. Statistical discrimination is really hard to test for. A lot of these correspondence studies claim to be doing it.

I have a JEL piece I don't wanna get into here on this. And I think a lot of them aren't really testing for statistical discrimination or can't be sure they are. There's a couple papers with good tests. My favorite one is Laouénan and Rathelot, who have this Airbnb paper.

I just like it cuz I teach it, it's great. So they have Airbnb data and they have the rental price difference for black and white owners, cuz you put a picture up, most cases, in the same neighborhood. And then you have all these characteristics of the place, right?

It's cross-sectional data, so they don't follow people over time. But at a point in time when there's very few reviews, there's a big race penalty, right, they get less; and then as the reviews accumulate and you condition on them, the gap shrinks. So it's the classic sort of statistical discrimination: the information gets more reliable with more reviews, and the race gap goes away.

And there aren't many other papers that really convince me, but that's, of course, not legal.

>> John: The law says statistical discrimination's illegal, so-

>> David Neumark: Right.

>> John: If the standard is finding legal discrimination-

>> David Neumark: Right.

>> John: This is easy.

>> David Neumark: Right, and I'm gonna show you, yeah. So we do some things in the correspondence study and perhaps a little more here, but this is still not a fully resolved question, for sure.

Just a few details, we did four occupations in this study, admin assistant, retail, security, and janitors. These were chosen because they're sort of the kind of jobs you can do in these studies, because people don't, like, know the applicants. These are jobs in which there's actually a lot of low-tenure older workers.

I mean, there aren't a lot in the absolute terms, but in relative terms, these are jobs they tend to take, they're in the upper percentiles. And we do people around 30, around 50 and around 65. And I'm only gonna show you one slide of results cuz we're gonna move on.

This is just descriptive statistics, but after all the econometrics, the picture's really not any different. So these two are for women in admin, and that's roughly, call it 30, 50, and 65. For women, you see a clear downward gradient in callback rates. For men, you always go from higher for the young to lower for the old, but it's clearly not as monotonic.

Those results are a little less robust, a little more sensitive to these unobserved variance kinda things. But still, you still find evidence of these age differences, but they're not as strong. I did wanna say, I skipped one point, the reason the unobserved variance question I think is really important here is because of the human capital model, right?

The human capital model, think of the overtaking age: when we're really young, any differences in unobserved human capital investment are pretty small, they haven't accumulated, so they can't be that big. But then earnings profiles fan out, and that all accumulates. And as long as that's not all reflected on a resume, and surely it isn't, there's a good reason to expect higher variance, higher unobserved variance, for older workers.

Meant to say that, sorry. Okay, so the proposal for the correspondence study ended literally with the sentence, we will retain the text of the job ads to which we apply to see if we can detect ageist stereotypes in the ads that might predict employer behavior. I thought that was an interesting question.

At the time, I had no idea how to do this, I didn't know we were gonna string match or whatever. So, it was a good time to be doing this cuz everyone started working on this. And this gets to a little of what we talked about. So there may be some repetition here, but there's some issues we didn't cover.

So why would an employer use ageist language in job ads? The most obvious explanation, I think, is they don't want to hire older workers, but they know they might get caught if they just don't hire the applicants. So you just discourage them from applying. Could be taste or statistical; the guy just doesn't like them.

Or it could be kinda the assumptions that John and others were talking about. Now, you might say, where does the don't like older workers come from? This is not nearly as obvious as for race, let's say, and we all become older, we honor and respect older people. So I actually spent about six months reading into this in the psych literature cuz I was sorta curious.

Is there a taste discrimination or animus kind of perspective here that could apply to older workers? And the psychologists talk a lot, and it sounds weird until you think about it, or maybe it even sounds weird after you think about it. But they talk about older people reminding us of our mortality, right?

And when it's our own older person, our own mother or a grandparent, that's not the principal way we relate to them. But when it's just somebody else, we do. And I don't know if that's true or not, but they argue that it's not necessarily just stereotypes. There may be this sort of aversion, you might call it, to older people, at least those with whom we don't have a personal connection.

The second is what you guys were talking about. It's sort of innocuous, there's statistical discrimination, jobs have different requirements. People put them in the job ads because why wouldn't you? One possibility is employers hold stereotypes about whether older workers meet those requirements or not. Now, these are things in the ad, so I want to focus on those, but employers act on them even though they're not in the resumes; they just make assumptions about them.

Now, both of these are legal, as we discussed, although, and this is where I mentioned the ADEA being more nuanced than the Civil Rights Act, there is what they call a reasonable factor other than age defense. And I'm gonna read this quote, and I love it because it sounds like incomprehensible EU-speak.

What is an RFOA? It's a non-age factor that is objectively reasonable when viewed from the position of a prudent employer mindful of its responsibilities under the ADEA under like circumstances. And I've read it a lot of times. So now I know what it means, and I've read case law, but basically what it means is the employer knows they're not supposed to discriminate against older workers.

They know there's the ADEA, they're trying to comply, but they still find it necessary to use age, right? And a great example is this Hodgson versus Greyhound case. This was Greyhound bus company where they actually had a maximum hiring age of 35, and they were sued. They lost in the lower court, I can't remember all the way up, but the Supreme Court actually ruled in their favor, right?

And what they said was Greyhound had evidence, I think this was real, that it took 15 to 20 years to sort of reach peak sort of bus driving skills, but there was a tremendous amount of variation. Now, in other jobs, if I'm making products, you test a few of them.

If the quality starts to suffer, fine; it's pretty costless to do that. But these are bus drivers with people in them, right? It's not entirely clear we wanna wait till we know there's a problem, cuz then there's 20 dead people on the side of the road. And basically that's kind of what the case says, and it allowed them to actually have an explicit age, not just something related to age, right?

An explicit age criterion. And this would never, I'm pretty sure, be allowed under the Civil Rights Act with respect to race, ethnicity, or gender. Okay, so what do we do here? We start with a frankly very boring literature review of the industrial psychology literature, industrial-organizational psychology. These folks talk a lot about age stereotypes, and even better, they do a lot of meta-analyses, which makes it more efficient to get a handle on the literally hundreds of papers that have been written.

So we identify these common stereotypes. We have not gotten pushback on these from people in this field. This is the nutshell version. We scrape the text of the job ads from the experiment, these 14,000 or so job ads. And then we use these computational linguistics techniques to say how similar is the language, each phrase, in fact, in the job ad, to each of these stereotypes.

And if you think about that, then I can essentially, I'm gonna have a list of 17 stereotypes in this paper. And for each of those, I can say, let me sort of pull kind of the most stereotyped phrase in the job ad, or the phrase that is most linguistically related to that stereotype, and figure out what it is.

And now I can compare two ads. So, say both have a phrase about technology; technological knowledge, or the lack of it, is a stereotype about older workers. One phrase says, must use Excel. The other talks about social media and being a digital native, to take my extreme example. Well, the latter is gonna have a higher measure of semantic similarity with that technology stereotype.

And I'm gonna characterize all the job ads in terms of each of these stereotypes. And then ask if that index of similarity to the stereotypes actually predicts that the employer avoided calling back the older worker, okay? That's the basic idea.

>> John: Can you say a little about your regression?

So what is the difference?

>> David Neumark: Yeah, I will, let me say a couple of things about how we do this. Let me just quickly show you the stereotypes. There's 17 of them; the ones in white are unambiguously negative about older workers in the literature. The ones in orange are positive, careful, dependable, more experienced, and the ones in yellow are actually ambiguous in the literature.

Things like negative personality, warm personality, better communication skills, worse communication skills. So this is just our sort of raw material. Then we have this issue of how you characterize the job ads relative to these stereotypes, and I'll get to the regression right after that. So basically, the way to think about this, for those who haven't done or seen seminars on this: what is computational linguistics?

Well, one thing to know is that most of the people who write papers in computational linguistics work for search engines, most of them for Google, right? So you've all done a computational linguistics exercise, most likely today, when you've entered a few words into Google and searched for something and it returned websites, forgetting about the advertising part, which might prioritize some.

Aside from that, they're trying to show you the most relevant things, cuz obviously it'd be a pretty useless search engine if it didn't. How does it do this? There's a trained model. So basically, you take a big corpus, usually Wikipedia, the English-language one for people working in English, and you train a model which is basically trying to predict how near each other words are used in Wikipedia.

So you take Wikipedia, break it into every paragraph and every sentence, and you're basically doing this projection onto a large set of vectors to try to predict closeness. And essentially what's gonna happen? Here's a good example: whale and dolphin are gonna be very semantically similar, cuz you can imagine they'd be used in Wikipedia in a lot of the same sentences or paragraphs, right?

Whale and barista are not gonna be very semantically related, cuz it's very hard to imagine a place where they would be used together. And that's what semantic similarity means. People use it for a lot of things. And then there's a scaling step where you use the cosine similarity score, which scales it between minus one and one, which is arbitrary but just gives you something to use.
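A minimal sketch of that idea, in Python, using toy word vectors; the vectors and phrases here are made up for illustration, and the actual measure is built from embeddings trained on a large corpus:

    import numpy as np

    # Toy word vectors standing in for embeddings trained on a big corpus like Wikipedia.
    # All vectors and phrases here are illustrative only.
    toy_vectors = {
        "excel":      np.array([0.9, 0.1, 0.0]),
        "social":     np.array([0.2, 0.8, 0.1]),
        "media":      np.array([0.1, 0.9, 0.2]),
        "digital":    np.array([0.3, 0.7, 0.6]),
        "native":     np.array([0.2, 0.5, 0.7]),
        "technology": np.array([0.4, 0.6, 0.5]),
        "skills":     np.array([0.5, 0.4, 0.3]),
    }

    def phrase_vector(phrase):
        """Average the word vectors in a phrase (one common, simple choice)."""
        vecs = [toy_vectors[w] for w in phrase.lower().split() if w in toy_vectors]
        return np.mean(vecs, axis=0)

    def cosine_similarity(u, v):
        """Cosine similarity, bounded between -1 and 1."""
        return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

    stereotype = phrase_vector("technology skills")
    print(cosine_similarity(phrase_vector("digital native"), stereotype))   # closer to the stereotype
    print(cosine_similarity(phrase_vector("must use excel"), stereotype))   # further from it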

So just an example, here's an ad. Now, this ad wouldn't look unusual if I hadn't put the yellow highlighting on it. But you see reliable, energetic, customer friendly, detail-oriented, experience preferred, computer savvy. That sounds pretty typical. This doesn't look like a weird ad, except maybe now it does, having sat in this seminar for 45 minutes. It doesn't say over 40 need not apply.

It doesn't say we don't want blacks or anything like that. It just has things that are natural in ads but are actually related to age stereotypes. And that's both what we're exploiting and also the problem. Say it again?

>> Speaker 4: You could ask a question: did they ever see the Beatles live in person?

 

>> David Neumark: I did, I watched them with my parents when I was four, on the Ed Sullivan Show. I'll never forget it, probably my first memory. Just to give you a sense of what's going on, I'm gonna actually skip this and come back; I have a better version of this later.

So that's what I said. John asked about the regression, so basically, here's the regression we're gonna run. The outcome is a dummy, we do it different ways, but one way is that these employers got a triplet of applicants, almost always a young, a middle-aged, and an older one. So the younger one was called back, and the middle-aged or older one was not called back.

That's one definition. So that's a dummy, probit, linear probability, doesn't matter. We have these 17 stereotypes, that's what s indexes. And p95 is the 95th percentile of the semantic similarity distribution, that is, the distribution of semantic similarity scores across the phrases in that ad, so roughly the most stereotyped phrase in the ad.
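In rough notation, mine, not necessarily the paper's, the regression being described is something like:

    y_j = \alpha + \sum_{s=1}^{17} \beta_s \, p95_{s,j} + X_j'\gamma + \varepsilon_j

where y_j is that dummy for employer j (say, the younger applicant was called back and the older one was not), p95_{s,j} is the 95th percentile of the semantic-similarity scores between the phrases in ad j and stereotype s, and X_j collects controls. Run as a probit or linear probability model, a positive \beta_s would say that language close to stereotype s predicts passing over the older applicant.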

So go to my example. If one ad said digital native, the semantic similarity score for technology would be higher than an ad that said, must know how to use Excel. And the question is, when it says digital native and not Excel, is the older person less likely to get called back?

That's essentially what we're doing there, okay? And I'm going through this quickly, but that's fine, I wanna get to the next paper. And the answer is, I'll skip that, basically yes for men, not so much for women. For men, this works really well. I didn't mention that those stereotypes are grouped into health, personality, and skills.

On all three of those, you find evidence like this, and it's kind of almost always in the direction predicted by this industrial-organizational psych literature.

>> John: I may be an imbecile. What are the jobs?

>> David Neumark: Admin, retail, janitors and security.

>> John: What?

>> David Neumark: Janitors and security.

>> John: Okay.

>> Speaker 3: Sorry.

 

>> John: Who decides this? Who puts in the bins?

>> David Neumark: What, sorry, the jobs or the-

>> John: You mentioned five bins.

>> David Neumark: Sorry, so we made a decision not to name the website we take these job ads from. But on that website, they are grouped. They are grouped this way.

 

>> John: Very good that way, okay.

>> David Neumark: Yeah.

>> Speaker 3: I just don't understand. So the outcome variable, the callback, it's supposed to be a callback for scheduling an interview or follow up.

>> David Neumark: I mean, it's a positive expression; it's usually a callback for an interview. I mean, the thing I learned doing that correspondence study is, you read these papers and they sound really slick and simple.

There are so many complications that come up when you're doing this that I just had no idea about. So one is that you have to record these messages and code them up, so we did this inter-rater coding kind of thing. The other is, this is sort of a funny one.

So on the website we're using for this, you can click on a link that sort of returns an email directly linked to the ad, so then I know exactly what ad they were responding to. But some people call back by phone. We did a pilot, and we found that a lot of people call.

And they say, this is Joe, I'm calling about that job you posted. Great, now what do I do, right? How do I figure out which ad that was? So we actually figured out, I had grad students more tech-savvy than me, that we could get a lot of online telephone numbers.

And then we could link them to occupation, city, and all these different cells, really narrow it down, and then try by listening to figure out which ad it was, which we could in every case. So we actually had 360 online phone numbers, and every month I gave our department administrator receipts for 360 phone bills.

And she thought I was nuts.

>> Speaker 3: That's right, hold on. Because what I was thinking, I think it made it into the popular press as well. In the UK, there are strong anti-discrimination laws for individuals with disabilities. And people have found out, I mean, a group of researchers within the government, that the callbacks were actually meant to discourage the applicant from proceeding further and looking for-

 

>> David Neumark: That's interesting. I'd love to see that reference in those conversations, okay, interesting.

>> Speaker 3: Scheduling, getting convenient times, or making clear that the job was very burdensome physically.

>> David Neumark: Interesting, yeah. So the idea there is you sort of get them in the applicant pool, kind of like what we're encouraged to do at universities maybe, to make sure you're being fair, but then, yeah, interesting.

Okay, that's interesting. That would suggest if you did this kind of study, you'd underestimate it, maybe. Okay, for women, we don't find as much evidence of this, not as clear. Now, it may be because the kind of things people focus on with regard to older women versus older men are harder to put in ads.

You tend not to put looks-related things in ads. You tend to put computer-skills-related things. It's also possible there's this, I wanna use the word intersectional, but in a very narrow way. Intersectional claims are not allowed in most cases. And what that means is we have the civil rights law, which covers race and gender and other things.

Then we have the age law, which covers age. You can bring a suit as a black female, in which case, if you're sort of thinking about a regression analysis in support of a discrimination claim, if, let's say, blacks got fired at a higher rate than whites but black females at a much higher rate, you can effectively put in the interaction. When it's age and gender,

you can't, because they're two separate statutes.

>> John: Back to the issues that we've been discussing before, but now in this concrete context, I want to take one example: security-related jobs. So some security-related jobs involve sitting in front of a TV screen and a monitor, which doesn't require a great deal of physical agility and stamina.

Other security jobs require walking the grounds and so on. So it's easy to imagine there that there would be a form of discrimination that employers exercise for those kinds of jobs that they don't for the TV-watching, monitor-watching jobs. But that seems like an attribute that's correlated with age and is also highly correlated with ability to perform the job, a notion of productivity.

That's gonna show up in your study, as I understand it, as age-related discrimination, have I got it right?

>> David Neumark: In the correspondence study, yes. But as I said, we have sort of variants of resumes where we sort of-

>> John: This one here, some of these ads are by employers that want people who have some physical stamina, maybe even the capacity to intimidate.

 

>> David Neumark: Right, right.

>> John: To intimidate, I'm not sure.

>> David Neumark: Correct, right.

>> John: Occasionally, to be shot at, I mean, this is an important job.

>> David Neumark: So in this study, yes. In the paper, which I'm gonna start on in the very next slide, I'm not gonna get into a lot of detail.

We actually have some information which I think is useful in ruling it out. Let me just tell you what it is, and I'll get to it later. So here, of course, I'm talking about worker responses. So now I'm really studying what workers assume employers are gonna do, in a sense, right?

And the drop off in applicants starts at age 40, which is a lot younger than I would think those things are relevant. I could be wrong about that. We also do this exercise, again, using the computational linguistics, where we study the similarity between resume items and cover letter items and these stereotype phrases.

And we don't see any drop-off insofar as there are things signaled on the resume. And we also don't see that when you get the treatment, which in the physical case, for security guard, I think is lift 40 pounds versus carry a flashlight, that's what we do.

You don't get sort of the people with less physical stuff indicated on the resume, in a sense, dropping out in response to the treatment. But I'm rushing through this; I'll come back to it. Okay, all right, so now we're gonna get into this experiment. I'm gonna skip the lit review cuz time is going quickly, although it's important.

So we choose three of the 17 stereotypes. We need treatments for these stereotypes, and you can't have that many treatments and do a feasible experiment. We chose three that are commonly expressed in job ad language. You don't see something related to hard of hearing in many job ads.

So it wouldn't be very interesting; it would have to be a very strange job ad. We also chose stereotypes that in the prior study were linked to how employers behaved in the correspondence study. And then we read a lot of AARP stuff. For older workers, there's a lot of material they and others put out about things to look out for in the market when you're applying for a job as an older person, things applicants might be aware of.

And we focus on three: communication skills, physical ability, and technological skills. And we do three of the four occupations; we drop janitors. There aren't nearly as many ads for them, so it seemed like our experiment would sort of contaminate the website, essentially. We do this study in 14 cities.

These dots are proportional not to population, clearly, I mean, to some extent New York, but to how many applicants we actually got. So it's a function of both population size and how intensively this particular website is used.

>> John: Was it hard to choose 14?

>> David Neumark: Not really, what we chose, we started with 12 from the correspondence study, which we had originally chosen to have kind of different age compositions.

So we've got some old cities, like in Florida. Salt Lake City is the youngest city. We also chose cities because we wrote a paper on this where there's some variation in state laws. So we basically did those 12 and piggybacked a couple more. So just like in a correspondence study, the thing you really wanna be careful about and get right is the resumes.

Cuz those are like the applicants, right? Here you wanna really get the ads right, so we spent a lot of time on this. First of all, we took a lot of our ads and tried to boil them down to some common ones. We wanted to keep enough details about the company and the job to be realistic.

We didn't want to actually name the company. We thought legally that might be a bad idea to put up a fake ad with a real company name. But about half the ads on this website actually don't have company names. You wanna think about this website as biased towards or overly representative of small employers.

So if I go on this website and there's an ad for a Target job, it's gonna direct me to the Target website, and then I'm sorta out of the experiment. This is more like small places where you basically email the owner or whoever it is that is doing this.

And then we sort of modified the requirements; we wanted to try to not exclude people needlessly, so we have a lot of flexibility as far as that goes. The important part is getting in these treatments. So we're gonna have an ad like the one I showed you before.

And then, we took all these phrases from all the job ads, right? I mean every three-word phrase, removing what are called stop words, "of," "in," things like that. And we took these template ads, and then we wanted to insert a few-word phrase in some and not the others.

And in so doing, we wanted to trigger a high semantic similarity of that ad with that particular stereotype but, and this is the hard part, not with any of the other stereotypes. And not just the other two of the three, not with any of the 17, right? So we wanted to really just try to trigger the one stereotype.

And that was essentially just a lot of guessing to get at that. I'll show you how we sort of assess it, but that was the goal here. So here's just an ad, for example, an administrative assistant ad. And this is where the treatments go. We're gonna have one sort of control case where none of these are meant to be reflective of stereotypes.

Then one at a time, then all three, so that's five. And then we do these crazy AARP ones just to see if we can learn anything from the experiment.

>> Speaker 4: So are we not just learning how older people view themselves? I see a job with communication skill requirement.

Well, I'm not real good at talking. I'm a grumpy old guy. I'm not real good at talking to people. I won't apply to that.

>> David Neumark: You might think, yes, that is one interpretation and that's why-

>> Speaker 4: I don't know how to use these dumb computers. So I don't apply.

 

>> David Neumark: So I'm gonna show you, by doing the machine learning stuff on the resumes and the cover letters that they sent, that there's no indication that people are selecting out on anything except age. So when I have a treatment phrase that is about communication skills, the stuff on the resume doesn't sort of change.

We don't see the ones who have less of that on the resume dropping out relative to the ones who have more.

>> John: Then what you're finding is that people ignore the requirements on job ads when they-

>> David Neumark: They all apply, or they generically all apply a little less when you put any of these in, because they are demands, but they're not paying attention to the content. They are responding very much on age.

So it's striking. And you could say, well, how well does my machine learning on the phrases and the resumes and the letters really address your question? That's a perfectly fair question, but there is something we can do and we don't see any evidence that this is happening, which is surprising because it's obviously another interpretation.

 

>> Speaker 3: Can I ask you, why is it not a natural design to think about targeting, via the semantic metrics, the job-

>> David Neumark: Targeting by what? Say it again.

>> Speaker 3: Your semantic metrics. Targeting jobs on actual platforms where employers advertise, targeting jobs that seem to be advertising positions in relatively neutral, non-discriminatory language.

And see, and actually send fake applications engineered to look clearly like an older applicant.

>> David Neumark: No, well, that's not this experiment cuz this is fake ads. Not fake ad, no, I agree. I agree-

>> Speaker 3: Can you tell us why this, to you, looks like a more useful experiment than the one I had in mind?

 

>> David Neumark: I'm not saying yours is less useful, I'm just saying it's completely different. It's something you might do, having learned about this, but I'm not sure you would have thought of it before you see these results. I mean, I agree, once you start to think about the language in ads, you could probe some of these hypotheses by doing something more complicated. In the typical correspondence study, you don't do any of what you're saying.

Just like find an ad, send some broad criteria, job x, city y, send it out. And what you're suggesting is you could learn something from a different country.

>> Speaker 3: You would think that if there are discrimination issues on the employer side, you wanna have a measure, you wanna expose them to a controlled treatment and see how they respond.

 

>> David Neumark: Well, no, I'm sorry, I mean the application. Right, so the application. Okay, there's actually a really useful suggestion here, aside from a new study, which is also useful but a bigger deal, and I hadn't thought about it. So when I designed the correspondence study, we built in a lot of variation in some of these.

Think about the resume section. That's like skills, hobbies, interests, and we sort of played around with what goes in there to convey some of these things, like health or physical ability. Do you run marathons? I hadn't thought about linking those to the stereotype classification of the ads. It was assigned randomly, so I actually have the data.

I might do it a little differently, but I actually have the data, I think, to do something like what you're doing. But let me suggest we-

>> Speaker 3: Just to understand the hypothesis. So what would be the hypothesis here? If there is-

>> David Neumark: Here, it's to understand why. You said that what you saw was that employers who use stereotyped language appear to, in terms of the correspondence study, discriminate.

And the question is why? We're just trying to understand here whether a subtle ageist phrase actually shapes the applicant pool. This claim that employers are, like the quote I put up, using language to effectively discriminate by shaping who applies, does it seem to actually shape who applies?

That's what we're doing here.

>> Speaker 3: Because it could be, and then I follow, it could be that what you are actually testing, from my point of view, suppose I'm not understanding anything, is: what are the effective strategies for discriminating? Which strategies are more effective at targeting specific groups of potential applicants?

It doesn't necessarily test whether any particular employer is actively engaging in discrimination.

>> David Neumark: I agree, you gotta put the studies. I think you have to put the studies together to draw that inference. Nothing in this study alone says employers are discriminating. I agree with that. Okay, here's just some examples of phrases, I'm paying attention to the clock here.

I'm gonna skip a few slides, but just wanted to give you some examples. So, Steve, you mentioned security guards. So here what we're showing you is the control phrase and the machine learning phrase. So we try to as much as possible choose a stereotype phrase and a non-stereotype phrase that have some similarity.

So here it was: you need to carry a flashlight versus you must be able to lift 50 pounds. The technology one is easy to read: you must write patrol records in a journal notebook versus you must type patrol entries into a journal application on a computer system. And I show you these cosine similarity scores with the corresponding stereotype.

And of course, the scores are always higher in this column than in this column. That's by design, I mean, we designed them that way, okay.

>> Speaker 4: Could you have a requirement that was along the lines of this job is particularly suited for someone with a hearing loss?

>> David Neumark: Some of those stereotypes just don't fit naturally into ads.

And, I mean, you could imagine an ad like that, but they're just not common, which is kind of why we focused on these stereotypes, cuz you find a lot of references to these things, and I think.

>> Speaker 4: So you have some way of getting rid of that or?

 

>> David Neumark: Well, we're creating the ads, we can do whatever we want. So we just leave out those things. Yeah, yeah, we chose to focus on just these three, and we make sure we leave out the other ones. And I'll show you that in a second. Just to give you a sense of where these phrases are.

I'll just look at the middle graph. So this distribution here is the cosine similarity score with the physical ability phrase from all phrases in the correspondence study job ads, the 14,000 or so job ads. The two black bars are our two different controls and where they are in the distribution.

And that was like, whatever, carrying a flashlight, whatever it was. And this was the lifting 40 or 50 pounds, and where it is in the distribution. And you could see the kind of same idea here for communication skills and technology. And the important point is the controls are near the median, and the treatments are kind of at around the 75th percentile.

We didn't target that number particularly, but that's where they end up. And that's why I said we're not using really blatant phrases. These are not completely outlandish things. Also note that these are all to the right of zero, which makes sense because they're real job ads. And there's another important point here.

One thing you might say, going back to the correspondence study or going back to this study: okay, fine, I see a stereotyped ad, they don't want me, I move on. Well, there's a lot of other ads that have similar stereotyped language, and there aren't that many ads in the first place.

So this probably is actually costly for workers, although I don't actually study their job search behavior. I wanna show you this is a bit in response to Bob's question. This is a complicated graph, I apologize. Let's just look at the one that's easy to see on the bottom left here.

 

>> Speaker 3: Nice.

>> David Neumark: You know what this says? This is really hard to explain. So what we want is, now that we're all amateur epidemiologists, we want high sensitivity and high specificity. I want an ad to be perceived as stereotyped when that's my treatment ad. But I also want it to be perceived as stereotyped only on the stereotype I'm trying to trigger.

So this is the median to the 99th percentile of this cosine similarity index across the treatment ads. The second line is the control ads. And you can see that in every case, we're much higher here than here. But if I look at the other stereotypes, here's the two in the study and here's all the other ones combined.

It doesn't trigger any of those. Now, the real ads from the correspondence study actually are closer, but they're not control ads, so that's fine. And that's true in every single case. It's even true if I break up this "other" category into the, whatever it is, 14 other stereotypes.

So we really successfully, I think, wrote treatment ads that triggered the right stereotype. Okay.
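A rough sketch of that screening check in Python; the word-overlap similarity and the percentile cutoffs are stand-ins I made up, not the paper's embedding-based measure or actual thresholds:

    import numpy as np

    # Stand-in similarity: share of overlapping words between two phrases.
    def similarity(a, b):
        wa, wb = set(a.lower().split()), set(b.lower().split())
        return len(wa & wb) / max(len(wa | wb), 1)

    def percentile_of(score, dist):
        """Where a score falls in a reference distribution, in percent."""
        return 100.0 * np.mean(np.asarray(dist) <= score)

    def screen_candidate(phrase, target, stereotypes, corpus_phrases,
                         target_floor=75, other_ceiling=60):
        """Keep a candidate treatment phrase only if it sits high in the corpus
        distribution for the target stereotype and near the middle for every
        other stereotype. The cutoffs here are invented for illustration."""
        pct = {}
        for s in stereotypes:
            dist = [similarity(p, s) for p in corpus_phrases]
            pct[s] = percentile_of(similarity(phrase, s), dist)
        others_ok = all(pct[s] <= other_ceiling for s in stereotypes if s != target)
        return pct[target] >= target_floor and others_ok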

>> Speaker 4: You're gonna trigger one stereotype per ad?

>> David Neumark: Well, the treatments, we have three treatments where we just try to trigger one. We have another treatment where we do all three at once.

But the point is that we're just triggering those three. And then we have the AARP ones, which are a little vaguer. We did the AARP ones just because, if we found nothing here, you might say maybe your experiment was uninformative. So we kind of wanted to confirm that these AARP phrases actually really reduced older applications, and they did.

I'm just gonna skip this. We did an MTurk study to see if people perceive these phrases as biased, and I'll just skip it and say they do. This is just the plan for posting. The only important details are that we did a pilot and saw how long ads stayed up.

We never had two of our ads up for the same job in the same place at the same time. That's really the important thing here. And there's no p-hacking here; we specified exactly how many of these we're gonna do in advance. We did get our IRB to approve this, which many people find very hard to believe.

Their first response was, well, you have to do informed consent, which is the right question. I always characterize UCI's IRB as great because they're lawyers who figure out how to help you comply, not lawyers who just play it risk-averse to make sure the university doesn't get in trouble.

So we went back, and basically, to minimize cost, we had to inform, oops, sorry, we had to inform them within 24 hours that we had decided on a different candidate. And at the end of the study, this was nice, not till the end of the study, we had to write to all of them from the email address they used, tell them what we were doing and why we couldn't do informed consent, and give them the option to opt out, and three people did.

And one person said, this is a great study. Let me tell you how old I am, and-

>> John: Opt out means your data's gone.

>> David Neumark: It is gone, yeah, but there's really just three people. We filed a pre-analysis plan, there's a lot of details in the paper. There are some additional analyses we've put in based on feedback.

They're all clearly delineated, but you can follow all that if you really care about details. Okay, so this is just kind of interesting. The job board makes money from job ads. Everything else is free. It's not a job board, it's a website. You wanna sell your bike, it's free.

You wanna put a job ad up, you pay money. So they actually care a lot that the job ads are high quality and aren't a lot of phishing schemes and things like that. So we anticipated this, but we didn't know how bad it was gonna be. First of all, each city has a human checker.

There were two cities where we just could never get an ad up. I don't know what they were seeing, but it never happened. Then we found problems: when you try to post an ad, they wanna call you back, okay? If you use the same number a lot, they start dinging you.

If you're doing it from the same IP address, they start dinging you. So one thing we realized was we needed to get pay-as-you-go phones that randomly assign IP addresses. And I don't know why those exist, except the obvious reason is for criminals, right? That's the only reason they can exist.

And then we thought we'd use credit cards. The first thing we did was a bunch of us applied for credit cards, and we got a bunch of our colleagues to as well, but once we used them maybe a second time or a third time, they got flagged. So that's why I have a lot of credit cards at the moment.

So we started buying gift cards. And by trial and error, some worked and some didn't; Target did, and, I'm just making this up, maybe Walmart's didn't, whatever. Turns out, as I say right here, which is obvious ex post, that gift cards are really popular with money launderers and credit card thieves. If you've got a bunch of cash, or if someone stole a credit card, you buy a gift card and throw the credit card away.

So some of the gift cards are trying to trace you in case this ever comes up, and some just don't. So my grad student used to tell me, if the FBI comes to my apartment, I'm in big trouble. This is a picture of his desk in the middle of the study.

And a lot of these gift cards got burned; we couldn't use them. So my wife and I had $20,000 in gift cards that I bought off my grad students, which we used for groceries for around three years. Okay, calculating age: I won't go into the details, but people give their year of high school graduation.

They give earliest work experience. So we can approximate age pretty well, we think.
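A back-of-the-envelope version of that approximation; the assumed ages at graduation and at first job are my guesses, not necessarily the paper's exact rule:

    def approximate_age(application_year, hs_grad_year=None, first_job_year=None,
                        grad_age=18, first_job_age=18):
        """Approximate applicant age, preferring year of high school graduation and
        falling back to the year of earliest work experience."""
        if hs_grad_year is not None:
            return application_year - hs_grad_year + grad_age
        if first_job_year is not None:
            return application_year - first_job_year + first_job_age
        return None  # no age information reported

    # Someone who graduated high school in 1998 and applies in 2021 is roughly 41.
    print(approximate_age(2021, hs_grad_year=1998))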

>> Speaker 4: I'm surprised, so you didn't approach the company directly, the website directly?

>> David Neumark: They would never have let us do this.

>> Speaker 4: They would never, but that doesn't violate?

>> David Neumark: It does not violate. We made an explicit decision not to use websites where we'd violate the terms of service.

 

>> Speaker 4: Wow.

>> David Neumark: There's no fictitious person. I mean, an ad isn't really a person anyways. There's no-

>> John: That's our business.

>> David Neumark: Well, that's why they do all this checking, right? But, yeah, there's no-

>> John: Right in the terms of service, don't do the thing that we're checking to stop you from doing.

 

>> David Neumark: Right, that's true.

>> John: Which they soon will, and that'll be the end of this.

>> David Neumark: I guess, right, right.

>> Speaker 6: The IRB was totally cool with everything that you laid out?

>> David Neumark: IRBs don't have to be totally cool. They have to weigh costs and benefits. This is what you gotta remember.

There aren't any blanket prohibitions in human subjects research; it's cost versus benefits. The argument I always make is I appeal to sort of Social Security trust fund finances. If we could get older people each working two months longer, it would be worth massive amounts of money. And that tends to work, at least at UCI.

 

>> Speaker 4: At Stanford, is IRB approval required if you're using data from the national income accounts?

>> David Neumark: Yeah, okay.

>> John: Big backlog of work.

>> David Neumark: One thing we worried about, and we had nothing to do apriori, we just had to cross our fingers and close our eyes and hope for the best was, what if people manipulate age in response to the treatments, right?

So we thought, ex ante, great, we can see whether no age information is reported, and we can actually test whether that responds to the treatment. Then it turned out, we didn't think this would happen, we just didn't think of it, you get repeat applicants, right? So we get people applying to different treatments, or to controls and treatments.

So for them we can actually test whether they change the age information or they don't.

>> Speaker 3: Sorry, how do you verify the age of the applicant?

>> David Neumark: Year of high school graduation is the most common. If we don't have that, we use year of first job and then one other thing.

But year of high school graduation is the most common, and that's actually what we used in the correspondence study. So luckily, cuz we had no recourse, they aren't manipulating their age. That would obviously screw everything up quite quickly if they did, right? Okay, so finally some results. But the nice thing about experiments is the results are fairly simple.

This is just the CDF. I'll skip to the next one. For any of the treated groups versus the control groups, the treated group is the darker line. It doesn't really stand out here, and it's to the left, which means, of course, you're getting younger people in the treatment group.

This, at age 40, is fairly big. This is where I break them all up. These colors don't come out here as well as they do on my computer, they really don't, but if you look at this, the most extreme treatments, like the all-three,

that's the three machine-learning treatments, are largely tracing out the lower envelope, and the control is largely tracing out the other one. But you may not believe that cuz it's a little hard to see on that graph, so I'm gonna show you regression tables. I do wanna show you one more thing first cuz I referenced this earlier.

Let's just look down here because I can see it. That's the all-three treatment. Sorry, the dark and light are much better contrast on a computer screen. But what's going on here: if you look at around the low-35-ish to 60 range, you're getting the separation between the treatment and the control group.

And that was the point I was making earlier, that this age shift, it's not all coming out here, it can't. Well, there aren't that many people out there in the first place. But you're getting fewer people in their sort of late thirties to 60 here as well, but it's just not strong.

And more people in their kind of twenties and thirties. So whatever's happening here is happening at quite young ages. Okay, so here's some regression tables. I'm gonna skip to this one. This is just all the data. So this is all five treatments, one at a time to your question, all three and then these AARP ones.

So the first thing, and this is just to dispense with it, that's the no-age-information-reported outcome. There's nothing going on there, which is great, and those are actually pretty precise. The second thing we notice is there are 24, sorry, 20 coefficients in this table, and every single one of them is negative, which is a lot harder to say statistically.

But that certainly says things are pointing to lower ages among the treated groups. And then you see, for example, that was the number I cited at the beginning: when they get the all-three-phrase treatment, their average age is 2.5 years younger. Right here, the proportion under 40, that's about 0.12.

These are the AARP treatments. They're much bigger; they were just kind of our baseline. You notice there's stars, and daggers, and bold and italics. That's to do with all this multiple-testing stuff, which doesn't ever make much difference, and I'm not gonna get into it here. We do this a slightly different way.

We actually take the average cosine similarity score with the stereotypes and just put that on the right-hand side instead of the dummy treatments, and then we standardize them. And what this says is, the higher the measured similarity with these stereotypes from the computational linguistics, the lower the age of applicants.

And alternatively, we take the Likert score from that MTurk study where we asked real people how biased they perceive these phrases to be. These are standardized so you can compare coefficients, and you get very similar results. So, sort of three different ways of doing that, you get that result.
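A minimal sketch of that kind of applicant-level regression, on synthetic data; the data are simulated, the 2.5-year effect is just plugged in from the number quoted above, and the variable names and clustering choice are mine, not the paper's:

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    # Synthetic applicant-level records standing in for the real application data.
    rng = np.random.default_rng(0)
    n = 2000
    df = pd.DataFrame({
        "ad_id": rng.integers(0, 400, n),              # which posted ad the applicant answered
        "treat_all_three": rng.integers(0, 2, n),      # 1 if the ad carried all three stereotype phrases
    })
    # Fake ages: treated ads draw a somewhat younger pool, roughly as described in the talk.
    df["age"] = rng.normal(38, 12, n) - 2.5 * df["treat_all_three"]
    df["under40"] = (df["age"] < 40).astype(int)

    # Does treatment predict a younger applicant pool? Clustering on the ad is my choice.
    age_model = smf.ols("age ~ treat_all_three", data=df).fit(
        cov_type="cluster", cov_kwds={"groups": df["ad_id"]}
    )
    under40_model = smf.ols("under40 ~ treat_all_three", data=df).fit(
        cov_type="cluster", cov_kwds={"groups": df["ad_id"]}
    )
    print(age_model.params, under40_model.params)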

I wanna get onto the things I haven't talked about yet, cuz we only have a few minutes. This just says you get a bit of a decline, not significant, in younger applicants; we get a much bigger decline in older applicants. That goes back to the comment before that these are, in a sense, higher skill demands.

I talked about this, that the discouragement effect seems about as big as what we found in the correspondence study. And I mentioned this. Okay, so this is the stuff I wanted to get to, other explanations. So the natural interpretation, maybe, or the most obvious one, is job searchers perceive language as signaling an employer who discriminates against older workers and don't bother applying cuz it's costly to apply, and it's probably costly to get rejected.

Here are the alternatives; we've been talking about all of these. Job ads signal job requirements that older workers are less likely to have, or that workers think firms believe they're less likely to have, or that workers just don't want, any of those things. And those imply two things, if, and it's a big if, maybe a medium-sized if, the resumes actually capture worker characteristics: we should see a reflection of that.

We should see maybe some dropping off of these characteristics with age, and we should see that the treatment essentially steepens the age gradient. So when an older worker sees, must lift 50 pounds to the extent the resume has items related to physical ability, I'm gonna get kind of a more select group of older applicants, okay?

And neither of those happens. So these are just the profiles; ignore the top one. These are the three profiles. What I'm doing here is the exact same machine learning exercise: I take the phrases from the resumes and cover letters, I compute the semantic similarity with the stereotypes, physical ability, communication skills, technology skills.

I just look at it by age. These are smoothed, they wouldn't be that smooth otherwise, obviously. And then I just plot them by age. And those are about flat, I mean, maybe you see a slight downward slope, but they're not significantly sloped, and they look awfully flat to me. Obviously, out here they're noisy, cuz there aren't many 70-year-old applicants.
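A sketch of that resume-side profile check, assuming a similarity function like the one above; the data shape and the age binning are mine, for illustration only:

    import numpy as np

    def resume_stereotype_score(resume_phrases, stereotype, similarity):
        """Most stereotype-similar phrase on a resume, mirroring the job-ad measure."""
        return max(similarity(p, stereotype) for p in resume_phrases)

    def mean_score_by_age_bin(applicants, stereotype, similarity,
                              bins=(18, 30, 40, 50, 60, 75)):
        """Average resume-stereotype similarity within age bins. `applicants` is a
        list of dicts with keys 'age' and 'phrases'. A flat profile, and no gap
        between treated and control applicants, argues against selection on the
        stereotyped characteristic rather than on age."""
        out = {}
        for lo, hi in zip(bins[:-1], bins[1:]):
            scores = [resume_stereotype_score(a["phrases"], stereotype, similarity)
                      for a in applicants if lo <= a["age"] < hi]
            out[(lo, hi)] = float(np.mean(scores)) if scores else None
        return out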

 

>> John: I'm not exactly following what you did but. So in the control group, there's no age information.

>> David Neumark: That's who this is.

>> John: Right, and-

>> David Neumark: There's no age stereotypes in the ads.

>> John: No age stereotype.

>> David Neumark: And these are their applications.

>> John: Is it the case that the older people who apply to the ads with the age-stereotyped language are not statistically significantly different from the older people who apply in the control group?

 

>> David Neumark: I haven't shown you, that's coming. So this is a precursor. This is just the respondents to the control ads, so there should be no influence of language. And what I'm showing you here is I just look at, so think about you looking at resume items like I play tennis or I run or whatever, or I know how to use Excel and who knows what.

To the extent you're linking those semantically to the stereotypes, there's no change in those with age. The older workers don't look any less qualified on that dimension, okay? The next thing is what you're asking, and I'll just show you, I'll put the three graphs up there. I guess I can't get all three at once, sorry.

I'll go back to one at a time, so that's the treatment. So now I'm comparing the treated and the control. So if, when I see must lift 40 pounds, I'm older and I don't want a job like that, or I think employers believe I don't want a job like that, or I think employers believe I can't do it,

whatever the reason, more of the people without signals of physical ability on the resume should drop out, and therefore the semantic similarity for those who remain should go up. So the treatment should slope up relative to the control, right? And it doesn't, for any of the three stereotypes. So that's my best response so far to a lot of these questions, which I think are absolutely legitimate questions and ones I have lost and continue to lose sleep over.

About how to interpret this evidence and how compelling it is. Back to John's initial question, age discrimination is probably particularly challenging in this regard. Why? Because obviously, at some point as you go out on that horizontal axis, everything changes. Not at the same place for everyone, and probably not in the 40 or 50 age range where we see the action happening, but that definitely just makes this more challenging, and maybe you have to be just a little more careful.

So I don't think I need to summarize and conclude cuz you've heard everything. I'd rather stop and take questions and I haven't even glanced at the screen because it's over to my left. So maybe we can give them a chance first if we still have time. Other than that, I'm done.

 

>> Speaker 4: Older workers are less strong than younger workers.

>> David Neumark: Yes.

>> Speaker 4: I mean that's so if you have to lift 40 pounds, I mean, it's gonna drop off some people.

>> David Neumark: Right.

>> Speaker 4: You know that they don't wanna do that.

>> John: They seem to be sending in job applications anyway.

 

>> David Neumark: But also, it's starting to happen at age 40. I may have a very warped perspective cuz I lift weights every day, so. But,

>> Speaker 4: You lose 1% or something like that every year, starting at 30.

>> David Neumark: Right, I could be wrong, but I think if, on the physical-ability one, I saw all the action in the late 50s and 60s, I'd be more worried about that.

I'm a little less worried about it. On the technology skills and the communication skills, I don't see any obvious relationship to age; I don't know why that would matter. And the technology skills we put in are pretty basic; it's not like Python programming, it's like Excel, and the highest-tech one is Google Docs or Google Sheets, fortunately.

So again, we chose technology skills that are not things that, I mean, my mother, when she was 90 had a lot of trouble with email, but I'm sure 30 years earlier, if she'd been exposed to all that stuff she wouldn't have had any trouble mastering it. So our convener is gone.

 

>> Speaker 4: I'm the sub convener.

>> David Neumark: You're the convener. I think Eric-

>> Speaker 4: Eric made clear the time, he said, and you've disproved the Bob Hall theorem about how many slides one can do in a minute.

>> David Neumark: Maybe.

>> Speaker 4: So, thank you very much for a great show.

>> David Neumark: Thank you.

 
