Grade inflation refers to the phenomenon of getting an "A" despite having what used to be a "B" or "C" level of knowledge.
So, it's possible there's an incidental correlation, but there's no causation if the phenomenon is real.
If you've seen how the average kid uses the internet, it's a rather laughable suggestion... there is little overlap between social media entertainment sites and the web of knowledge. The biggest contributors to higher grades are the book-summary and answer-key sharing sites.
> Grade inflation refers to the phenomenon of getting an "A" despite having what used to be a "B" or "C" level of knowledge.
If an institution gives an A based on some objective criteria, and students who got a C turn out to be successful, then the institution's methodology will be considered flawed by the vast majority. Soon, the institution will lose its relevance.
If institutions want to stay relevant, they have to make sure their grades correlate with success in some way, even if that means handing out an A to every student!
>> Grade inflation refers to the phenomenon of getting an "A" despite having what used to be a "B" or "C" level of knowledge.
> If an institution gives an A based on some objective criteria, and students who got a C turn out to be successful business people, then the institution's methodology will be considered flawed by the vast majority. Soon, the institution will lose its relevance.
What you describe is grade inflation.
Grades aren't meant to be predictors of career success; they're meant to be indicators of past academic performance. My understanding is that once upon a time, a "C" grade meant your work was average or typical, which was perfectly respectable. An "A" or "B" meant your academic work was above the norm or outstanding. People could get C's and go on to have successful, non-academic careers (such as in business) with little comment.
With grade inflation, when "average" is an A or B, it's much harder to distinguish the truly academically talented based on their grades, since they're grouped into the same category as average performers.
> Grades aren't meant to be predictors of career success, they're meant to be indicators of past academic performance.
Isn't it both? If we had figured out the perfect academic program for all time, then it could be just a reflection of performance compared to all prior students in the program. Since we haven't, we update the programs and that might mean some aspects become easier and some harder. Or maybe overall easier, if career demands have generally gone down (which, imo, is a much more interesting discussion than whether an 'A' is really an 'A').
That's not what happens. If an A meant success at X (the way running a mile under a certain time means you could outrun a bear), it would be great, no matter what percentage of people earned it.
What's happened is that parents pressured schools to give their kids A's to get into college, then colleges set the minimum bar at A+ instead of A, and now effectively select at random from thousands of applicants, all with A+'s. We are at the point where a single A- will drop you 50 slots in class rank. It's like trying to judge the differences among Olympic sprinters instead of seeing the real spread among students.
That's why I think universities should be the ones administering high school exams. More generally, students should be judged by the institutions they're about to enter rather than the ones they're leaving.
Grading guidelines differ. Where I went, C was considered a good grade - more or less defined as the average among the students who didn't fail. Even an E means you got at least 40% of the points on the test.
This is so ignorant of the advancements in pedagogy over the past several decades as to be insulting. Part of the increase in grades, if maybe not the biggest or only factor, is that we are just hugely better at teaching than we were 30 or 40 years ago. Teaching for mastery, reversing the roles of homework and tests (in terms of relative difficulty and exposure to new information) to better facilitate self-learning, formalized tutoring and support programs: all of these increase student performance by design.
Not only that, but as far as higher education goes, how are you even trying to compare to "what used to be B or C knowledge"? What was graduate-level transistor research in the 70s is basic undergraduate coursework today, for example. Introduction to programming would have been on punch cards in the 80s; today's students are using Python. Are you surprised that they do better given the advances in tooling?
The "kids these days" argument is so, so tired, and honestly every time I see it I'm not surprised so many of the old guard (rich off of just being in the industry during the dot-com boom) think they're being targeted for "age discrimination".
I used that argument back in 4th grade to my parents: “Math is harder for me since there have been more advancements since you were kids”.
I can only speak for electrical engineering, but the basics have not changed in 100 years. Crack open a textbook from 100 years ago and you'll find the same control theory, circuit analysis, electromagnetics, etc. The difference may be vacuum tubes vs. transistors, but the analysis is the same. Many things are a lot easier to do now with computers, and I'll wager a student of 100 years ago had better analytical skills.
It’s not just grade inflation either, but degree inflation. The PhD is the new MS, etc down the line.
That's patently false, and your example is a straw man. Sure, grade-school arithmetic hasn't changed much, but I assure you university maths has.
As far as EE goes, sure, macro-electronic theory is pretty much the same, but micro-electronics didn't even exist 100 years ago, and it's now a standard required class. Digital signal processing, again, didn't even exist 100 years ago; now we teach it to sophomores. The foundations that were the same have persisted while the field has done nothing but expand; this trend is replicated across most disciplines.
The data doesn't support your assessment of declining analytical skills. The capacity for abstract thought has been monotonically increasing for the past 100 years.[0] I understand the appeal: "oh, kids these days aren't disrupting anything, my job is safe." The issue is, there's not much data to support the platitude. Globalization means you're competing with billions more people; educational achievement has to increase to match, or be left behind.
Interestingly enough, IQs are now decreasing in the Western world and have been since around the mid-90s. There were numerous studies showing this quite clearly in 2013 (when that talk was given), and now it seems every study is corroborating it. Perhaps we can charitably assume he simply felt the reversal required more research, which would explain why he gave such an ostensibly misleading talk.
Linking to another comment chain here [1] on this topic as this is something multiple people have also brought up as an argument against grade inflation. Unfortunately, it makes the inflation look even more absurd.
I think maths is relatively unique in that it is a really old subject that mostly doesn't depend on hardware. Sure, computers have moved maths forward, but there's a lot you can do without them.
High school maths is pretty much Antiquity level, though much more formal.
The first couple of uni years are mostly 17th-18th century maths.
The last two undergraduate years are 19th and early 20th century.
A few areas have changed, though, particularly statistics I think, and also computer science if you consider it part of mathematics. But have a look at a textbook from 100 years ago, and you'll feel right at home except for the style.
I think you're assuming those advances in theory are universally available in practice.
They are not. My experience as a public school student was far worse than my parents' when they went to school: class sizes double or worse, no teaching assistants, and occasionally the same textbooks my parents used. "Experienced" teachers were the worst; it often seemed like the only thing they had changed over the years was to stop corporal punishment.
My parents required unreasonable numbers of A's, so I regularly got undeserved grades just by acting like I'd make a big deal about it... teachers were too overwhelmed to pay attention even if they cared. Some of my peers had helicopter parents who called in arguing their kid deserved an A simply for studying all night, even though that "studying" was often just online RPGs.
The trend followed in college for non-major classes. Professors simply didn't care unless a student would be advancing to another class in their department; then they cared enough to fail students, but the grade curve was still logistic, not the expected "C = passing knowledge".
Even then, I still never saw pedagogical advances outside of brand new assistant professors in liberal arts classes.
I mean, YMMV applies, and instruction quality is certainly going to depend on the institution; but that's the opposite of my experience at a (well-ranked) public university.