Author: Julia Wolffe

How to better study how societies work, and all that razzmatazz

One of my favorite essays, by Sarah Constantin (one of my favorite contemporary thinkers), is Fact Posts: How and Why. Fact posts are “an exercise in original seeing and showing your reasoning, not finding the official last word on a topic or doing the best analysis in the world”, so it’s okay if you get things wrong. Starting with an empirical question or general topic (“how common are hate crimes?”):

You look for quantitative data from conventionally reliable sources.  CDC data for incidences of diseases and other health risks in the US; WHO data for global health issues; Bureau of Labor Statistics data for US employment; and so on. Published scientific journal articles, especially from reputable journals and large randomized studies.

You explicitly do not  look for opinion, even expert opinion. You avoid news, and you’re wary of think-tank white papers. You’re looking for raw information. You are taking a sola scriptura approach, for better and for worse.

And then you start letting the data show you things. 

You see things that are surprising or odd, and you note that. 

You see facts that seem to be inconsistent with each other, and you look into the data sources and methodology until you clear up the mystery.

You orient towards  the random, the unfamiliar, the things that are totally unfamiliar to your experience. One of the major exports of Germany is valves?  When was the last time I even thought about valves? Why  valves, what do you use valves in?  OK, show me a list of all the different kinds of machine parts, by percent of total exports.  

And so, you dig in a little bit, to this part of the world that you hadn’t looked at before. You cultivate the ability to spin up a lightweight sort of fannish obsessive curiosity when something seems like it might be a big deal.

And you take casual notes and impressions (though keeping track of all the numbers and their sources in your notes).

You do a little bit of arithmetic to compare things to familiar reference points. How does this source of risk compare to the risk of smoking or going horseback riding? How does the effect size of this drug compare to the effect size of psychotherapy?

You don’t really want to do statistics. You might take percents, means, standard deviations, maybe a Cohen’s d here and there, but nothing fancy. You’re just trying to figure out what’s going on.

It’s often a good idea to rank things by raw scale. What is responsible for the bulk of deaths, the bulk of money moved, etc? What is big?  Then pay attention more to things, and ask more questions about things, that are big. (Or disproportionately high-impact.)

You may find that this process gives you contrarian beliefs, but often you won’t, you’ll just have a strongly fact-based assessment of why  you believe the usual thing.

(I love that long quote. I keep coming back to it again and again.)
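The “little bit of arithmetic” step is the one that’s easiest to gloss over, so here’s a minimal, illustrative sketch of the kind of back-of-the-envelope comparison Sarah means. All the numbers below are made up; Cohen’s d is just the difference in means divided by the pooled standard deviation.

```python
import statistics

def cohens_d(group_a, group_b):
    """Difference in means divided by the pooled standard deviation."""
    na, nb = len(group_a), len(group_b)
    var_a, var_b = statistics.variance(group_a), statistics.variance(group_b)
    pooled_sd = (((na - 1) * var_a + (nb - 1) * var_b) / (na + nb - 2)) ** 0.5
    return (statistics.mean(group_a) - statistics.mean(group_b)) / pooled_sd

# Made-up symptom-improvement scores for a drug arm vs. a therapy arm.
drug_arm = [12, 15, 9, 14, 11, 13]
therapy_arm = [10, 8, 12, 9, 11, 7]
print(f"Cohen's d is roughly {cohens_d(drug_arm, therapy_arm):.2f}")

# Comparing a risk to a familiar reference point (illustrative rates only,
# expressed as deaths per 100,000 person-years).
mystery_risk = 0.9
smoking = 300.0
print(f"about {mystery_risk / smoking:.2%} of the risk attributed to smoking")
```

Nothing fancier than that: a couple of means, a standard deviation, and a ratio against a reference point you already have a feel for.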

The thing that’s relevant here for what I’ll quote next is what Sarah calls “a sense of the world that stays in place, even as you discover new facts, instead of wildly swinging around at every new stimulus”:

There’s a quality of ordinariness about fact-based beliefs. It’s not that they’re never surprising — they often are. But if you do fact-checking frequently enough, you begin to have a sense of the world overall that stays in place, even as you discover new facts, instead of swinging wildly around at every new stimulus.  For example, after doing lots and lots of reading of the biomedical literature, I have sort of a “sense of the world” of biomedical science — what sorts of things I expect to see, and what sorts of things I don’t. My “sense of the world” isn’t that the world itself  is boring — I actually believe in a world rich in discoveries and low-hanging fruit — but the sense itself  has stabilized, feels like “yeah, that’s how things are” rather than “omg what is even going on.”

In areas where I’m less familiar, I feel more like “omg what is even going on”, which sometimes motivates me to go accumulate facts.

Scott Alexander adds:

Don’t underestimate Wikipedia as a really good place to get a (usually) unbiased overview of things and links to more in-depth sources.

The warning against biased sources is well-taken, but if you’re looking into something controversial, you might have to just read the biased sources on both sides, then try to reconcile them. I’ve found it helpful to find a seemingly compelling argument, put something like “why X is wrong” or “X debunked” into Google, and see what the other side has to say about it. Then repeat until you feel like both sides are talking past each other or disagreeing on minutiae. This is important to do even with published papers!

Success often feels like realizing that a topic you thought would have one clear answer actually has a million different answers depending on how you ask the question. You start with something like “did the economy do better or worse this year?”, you find that it’s actually a thousand different questions like “did unemployment get better or worse this year?” vs. “did the stock market get better or worse this year?” and end up with things even more complicated like “did employment as measured in percentage of job-seekers finding a job within six months get better” vs. “did employment as measured in total percent of workforce working get better?”. Then finally once you’ve disentangled all that and realized that the people saying “employment is getting better” or “employment is getting worse” are using statistics about subtly different things and talking past each other, you use all of the specific things you’ve discovered to reconstruct a picture of whether, in the ways important to you, the economy really is getting better or worse.

(That last part, that people operationalize vague questions plausibly-but-differently, is well worth internalizing.)

This segues into a recent essay on The Scholar’s Stage (so recent it just appeared last month!). Let’s say you’re interested in the following kinds of questions:

What makes human society work?

Why do people do what they do?

How does culture/wealth/geography/[enter your favorite variable here] change human behavior?

What is the relationship between human behavior seen at the micro-scale and at the macro-scale?

Do ideas matter?

How much does individual choice matter?

Is it possible to live morally in human society?

Is it possible for societies as a whole to become more or less moral over time?

Let’s say you have no special background in any particular fields, and you aren’t especially mathematically inclined (if you are, I’d recommend whatever Cosma Shalizi recommends, or just his Bactra Review.) Then you, like lots of (quote) “bookish overly-intellectual American teenagers”, will be especially drawn to a certain genre of ‘soft’ SF exemplified by Banks’ Culture series, Card’s Ender’s Game, Herbert’s Dune, Heinlein’s Starship Troopers, Le Guin’s The Dispossessed, Ayn Rand’s books, Nietzsche’s works, etc. The Scholar (who has no ‘About’ page, so I don’t know a single thing about her/him) writes:

None of these books (well, maybe a few of Hermann Hesse’s…) were designed for the ‘young adult’ audience. Almost all were written before publishers considered ‘YA’ a distinct consumer demographic. Much of their attraction to the teenage mind comes from this fact. These books are adult works written for adult audiences. They are meant to be taken seriously. And these young readers do take them seriously.

These are all books with big ideas. The ideas rest at the intersection of action and thought. Foundational to all of these works is a critique of the conventional. This is quite explicit in the work of the philosophers and existentialists, who write directly of what bothers them in human life, and how humans might do better. The critique is more subtle in the science fiction novels. Here readers are presented with societies vastly different than their own, fictional utopias and dystopias that discard all of the assumptions of American middle class life. They operate on a different set of values than that taught in classrooms and living rooms of suburbia. They force readers to reassess their own values and assumptions about what makes society work. No matter what else might be packed into it, this is an underlying message behind any thoughtful work of ‘soft’ science fiction: things could be different.

You could learn this in other ways, of course. A look at the political philosophy of the Aztecs or the feuding laws of Medieval Iceland will force you to rethink your assumptions about what makes humans tick. But that is hard. In contrast, science fiction writers write with modern audiences in mind and package their material into engaging narratives. You can read them without bothering with supporting class lectures or extended footnotes. That appeals to an intellectual 16-year-old. Well-written science fiction is history and political philosophy on the cheap.

Now the issue is that if you don’t know history, all these depictions of alternate societies will sound extremely plausible, because they’re so detailed. To someone like the Scholar, who does know history and is familiar with civilizational failure modes both legible and less legible, many of these imagined societies are obviously doomed to failure.

What if you really are interested in the questions above, want your “sense of the world to stay in place” as you come across new convincingly-detailed depictions of alternate societies that contradict ones you already know, but don’t have tens of thousands of hours to spare? (This is the crux of this whole post.)

The Scholar suggests a five-step reading strategy: history+archaeology/ethnography, literature, behavioral science, political/moral philosophy, and social science. I’m attracted to their explanation of the first step:

History is the most important thing you can read. Why? Only a strong background in history can tell you when writers in other fields are full of crap. I cannot tell you the number of times I have found a political argument (or even a fairly well regarded work of social science) that reads as compelling at the 10,000 foot view but falls apart when you stack it up against concrete facts of history seen from the ground view. Humans are motivated reasoners. We bend the data to fit our theories. If you are not familiar with the data, you will not realize when it is being distorted or misused.

The data of the social sciences is history.

The problem with history is that it is too big. It is impossible to get a fine grained picture of every people and era on the planet. There is just too much of it.

My recommendation is to pick three very different historical periods that you find fascinating. They can be any three, really, but ideally they will be a bit separated from each other in space, time, and culture. For example, you might choose pre-Columbian Mesoamerica, the Abbasid empire, and revolutionary Russia. Or maybe your interests lie with Republican Rome, the Protestant reformation, and 20th century India. That all works. It does not really matter what you choose, as long as you have decent spread (at least one is ‘modern,’ at least one is ‘ancient,’ and at least one is from a non-Western civilization). The important thing is that you have a genuine interest in these societies strong enough that you could gladly read 4-6 books about each of them without getting bored.

Because that is what you should do. Read 4-6 books about each of the eras in question.

Your goal here is to build up a fairly granular knowledge of a particular time or event that can be called on to test and assess theories and narratives that will be thrown at you. “Famous scholar X proposes that y leads to z, but did y lead to z in each of the eras I am most familiar with?” You will know you have the background knowledge to do this right when you can answer questions like the following for a given era of expertise: “What are some of the biggest disagreements historians have about this era/event? What are the main sources historians or archaeologists use to try and understand the era, and how might they bias this understanding? If you had to pick one small incident or detail about the era that seems insignificant at first, but is actually a very revealing example of the way this society/event worked, what would it be?”

You don’t need PhD-level answers to these questions. Just something more insightful than you would get from the Wikipedia page.

From that point, you can broaden out to more general histories. If you read fast enough to keep reading 4-5 books on different eras, keep on doing that. More normal people will probably want to transition to broader surveys that fill in the blank spaces they have with the rest of the world. There are plenty of fine histories that cover entire countries or regions from antiquity to the present (e.g., India: A History, Japan and the Shackles of the Past). Others might follow the history of a specific topic (say, war, the environment, or the financial system) over multiple centuries (e.g., The Pursuit of Power, Ecological Imperialism, The Cash Nexus). Others might do the same thing, but restrict themselves to a slightly smaller geographical scale (e.g., Asian Military Revolution, China: An Environmental History, An Economic History of China). Global histories of entire centuries are also somewhat in vogue (I blame Hobsbawm’s series for this development). Others will be comparative histories—works of history or ethnography that line up dozens of societies (e.g., The Lifeways of Hunter-Gatherers, Understanding Early Civilizations, War in Human Civilization, Dynamics of Ancient Empires), or just a few (Islamic Gunpowder Empires, Empires of the Atlantic World, The Industrial Revolution in World History). All of these will do.

If that seems overwhelming, one way to make it easier would be to focus on one particular macro-topic that can be explored in almost every single society. I personally have a special interest in warfare and military affairs. Reading about the wars and military institutions of different societies across the history of human civilization has proved useful for learning much about the broader history of the societies involved. Something similar can be said for economic, religious, institutional, and environmental history.

The last group of history books are the ones you are likely the most eager to read. These are books like Jared Diamond’s Guns, Germs, and Steel, Francis Fukuyama’s Political Order, Ian Morris’s Why the West Rules. While methodologically these books are properly considered histories, for the purpose of this series I group them with the social sciences. They are concerned with the same questions that animate works of social science like Acemoglu and Robinson’s Why Nations Fail: The Origins of Power, Prosperity, and Poverty or the entire oeuvre of Peter Turchin. Why do some countries become wealthy while others do not? What explains the rise and fall of civilizations? Why did Western countries conquer the world instead of the other way around?

These books are fine to read and fun to contemplate, but if you start here you are doing it wrong. I have collected fifteen separate 400+ page books that try to answer the question “why did the West get rich first.” And that was seven years ago! The number of books tackling this question has only grown larger. But if that is all you read, you are in trouble. How will you know who is right and who is wrong? If you have not read widely in history and anthropology, who are you to judge? There is absolutely no point, for example, in reading one of Peter Turchin’s books if you don’t have the background knowledge needed to assess whether his models match the historical record. There is no point reading Diamond’s explanation for why China stagnated and why Europe did not if you do not know anything about Chinese or European history yourself (I am not convinced Diamond does). Grand theories of civilization should be at the bottom of your list. They are worth reading, but not before you have the foundation in facts that you need to distinguish between the good work and the ill.

So how do you find the history books worth reading? Occasionally people you can trust will put up reading lists. I have a reading list here on books to read in Chinese history. Here are Razib Khan’s recommendations on Roman history. Will Buckner has a list of valuable ethnographies over at Traditions of Conflict. Bryn Hammond has an absolutely fabulous set of reviews on just about every book ever written on the Mongols and Inner Asian nomads. Websites like Five Books are another good place to start.

But if no reading lists come to mind, there are two methods in particular I have often found useful. The first is to Google syllabi. If you are interested in the history of the Roman Republic, Google “Roman Republic syllabus” and see what pops up. Read a few courses and see what books are included. Alternatively, if you just read a book you thought was particularly good, put its title into Google and then the word “syllabus” afterwards and see what other readings college professors have paired with that book in their courses.

The other route is the more old-fashioned one: read the footnotes. A significant percentage of what I read comes not from book reviews or book lists, but from looking up and purchasing the books mentioned in the footnotes of other books I found interesting. Oftentimes the best book on a topic is not the newest one. This is how experienced academics and researchers fill up their own reading lists. What works for us will work for you.
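The syllabus trick above, like Scott’s “X debunked” trick from earlier, is just a query pattern. If you want to batch a few of them, here’s a trivially small sketch; the example claim and book title are placeholders of my own, not recommendations from either post.

```python
from urllib.parse import quote_plus

def google_url(query):
    """Build a plain Google search URL for a query string."""
    return "https://www.google.com/search?q=" + quote_plus(query)

# Scott's trick: go looking for the strongest counter-arguments.
claim = "PLACEHOLDER: some compelling argument you just read"
print(google_url(f"why {claim} is wrong"))
print(google_url(f"{claim} debunked"))

# The Scholar's trick: find course syllabi around a topic or a book you liked.
print(google_url("Roman Republic syllabus"))
book_you_liked = "PLACEHOLDER: a book you thought was particularly good"
print(google_url(f"{book_you_liked} syllabus"))
```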

(Don’t just read my quotes – if you’re interested in the Scholar’s post, go to their blog!)

Top annual citation rates by field

(More intuition-building.)

Ivan Corwin, perhaps the most cited mathematician to have earned a PhD since 2011, and impressively a probabilist (probabilists are at a severe disadvantage compared to analysts and applied mathematicians), had 9 cites in 2009, rising to 27 and 63 in 2010 and 2011, so by his PhD he’d already amassed 99 citations. Even then it took him 6 years (so his 9th year overall) to get a 500+ year (525 in 2017). Last year he had 638; this year he’s poised to surpass that. In total he has 3,049 citations.

Nicola Gigli, PhD 2008, went 15-60-100 from 2006-08, took 7 years to get a 500+ year (548 in 2015), and 10 years to get a 1K+ year (1,030 in 2018).

Alessio “the Fieldster” Figalli, PhD 2007 and Fields Medal 2018, only started getting cites the year after his PhD (per the GS chart), and took 10 years to get a 500+ year; the next year he got 604, but by that time the awards were many and included the Fields. This should give you an idea of how hard it is to get citations. Figalli is an absolute star, in demand everywhere.

Yuri “isogeometric analysis” Bazilevs, PhD 2006, perhaps the most successful applied mathematician since Emmanuel “signal reconstruction” Candes, started getting cites the year after his thesis, but he galloped out of the gate: 119-212-343-757-1,261 (only 5 years to break the 1K mark!), then 2,049 seven years in (he had almost broken the 2K mark the year before, with 1,979), and 11 years to break 3K, at 3,362 in 2017. He looks poised to break 4K this year, 13 years into the game.

Terry Tao, PhD 1996, is about as good cites-wise as a pure mathematician can get. (And “pure” is stretching it – his six most cited publications are all on signal reconstruction with Candes, who, I’m completely shocked to learn, is both way more cited and younger PhD-wise than even Terry is; he’s pretty much the Terry of applied math cites-wise.) Curiously, he seems to have started slow; Google Scholar’s chart only starts listing his 6th year onwards (197 citations), by which time he’s turned 27 and is 3 years into his full professorship at an R1 institution, the youngest in America since probably Charlie Fefferman himself. Terry takes 9 years to break 500+, by which time it’s 2005, only a year prior to his Fields. That year (2006) he almost doubles to 992, just about missing the 1K mark; he handily surpasses that in his 11th year. He passes 2K in his 12th (2,307), 3K in his 13th (3,252), 4K in his 14th (4,062), comes really close to 6K in his 16th (5,931), and has plateaued at something like 6,600+ for the last 7 years, peaking at 6,854 in 2014, being extremely consistent all that while. This seems to be about as good as pure mathematicians get.

Eric Lander, PhD 1980 at 23 as a Rhodes Scholar at Oxford (after graduating Princeton at 21 as valedictorian – holy hell he’s fast), is about as good cites-wise as a mathematician can get, period. Google Scholar only starts tracking him from 1994 onwards in its chart; 14 years in, he’s already surpassed 1K (1,353). 16 years in he passed 2K (2,345), 18 years in 3K (3,433), 19 years in 4K (4,112), 20 years in didn’t quite reach 5K (4,900+), then suddenly the year after he blows past Terry’s plateau, hitting 8,227, and he’s just getting started; 22 years in he passes 10K; 23 years in, 12K; the year after 14K; then 15K; 30 years in he’s nearing 20K with 19,599; passing it the next year; 33 years in he passes 25K; 2 years after that, 30K (31,000+); finally peaking in 2017 with 33,874, where he seems (seems!) to have plateaued – literally five times higher than Terry’s plateau, apparently a thing with these huge genomics programs. Nearing 39 years since his PhD, Eric Lander has 430,000+ citations from 992 publications and a scarcely believable h-index of 263, along with 573 papers with 10+ cites, 506 with 30+, 302 with 200+ cites, 178 with 400+ cites and a mind-boggling 109 papers with 1,000+ cites. (I don’t know if there are that many pure math papers with 1K cites period!) He seems to have coauthored 40-50 papers per year for the past several years. Eric really is an industry unto himself.

Ed Witten, PhD 1976, the consensus strongest theoretical physicist in the world for the last 30-40 years, is about as good cites-wise as a theoretical physicist can get. Interestingly, his plateau for the past 20 years is comparable to Terry’s, perhaps a tad higher, averaging mid- to high-6Ks, 5 times breaking 7K, peaking at 7,421 just last year. Like Terry, Ed’s GS chart only starts 6 years in, but by that year (1982) he already went past 500+ with 672, passing 1K handily within 8 years (1,410) and 2K equally handily within 10 (2,643) and 3K within 11 (3,349). Here he plateaued for about 8 years. He got the Fields during this time, but it didn’t help at all; in fact his cites dropped by nearly 10% the year after. And then, in 1995, Witten launched the second superstring revolution with his famous M-theory suggestion at USC, and he passed 4K for the first time (19 years in!); he’d pass 5K 2 years later, 6K the year after, and the year after that “settle” into his plateau.

Michael Nielsen, PhD 1998 at 24 and at one point “Australia’s youngest academic” (per their media), published (not wrote, published) the standard textbook(!) on quantum computing at the tender age of 26(!!); that tends to do fantastic things for your citations, and indeed “Mike & Ike”, at 37,400+ citations and counting, is one of the top ten most cited publications of all time, so he’s pretty much as good as you get for a fast start in a technical field: 2 years out he’s at 251, 3 years out 630, 4 years out he passes 1K with 1,080, 7 years out he blows past 2K with 2,325, 10 years out he flirts with 3K at 2,990, but by that time he’d already left his tenured(!) position to “be an advocate for open science” (you have no idea how much respect I have for a guy so devoted to his ideals he’s willing to “throw away” a position at the top of his field in a system he wants to change); ever since then he’s just been generally trending upward, finally passing 4K last year, 20 years out of his PhD. That’s what a “jackpot start” looks like in a technical field.

Daniel Dennett, perhaps my favorite contemporary philosopher, is about as good cites-wise as a living analytic philosopher (i.e. not Zizek) can get. His h-index is just 1 shy of a hundred, he’s published 383 works with 10+ cites and 226 with 30+ and 167 with 50+ and 99 with 100+ and 52 with 200+ and 34 with 400+ and 27 with 500+ and 18 with 1,000+ and 8 with 2,000+ and 3 with 8,000+, and he has 94,000 citations in total from a staggering 1,261 GS-listed publications (his own bibliography lists something like 600ish). Since 2013 he’s plateaued at around 4.5-5K, peaking at 5,259 citations in 2013.
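A quick aside on the numbers I keep quoting: the h-index and the “N papers with M+ cites” counts are both simple functions of a per-paper citation list. Here’s a minimal sketch of both; the citation counts below are invented for illustration, not anyone’s actual record.

```python
def h_index(per_paper_cites):
    """Largest h such that at least h papers have h or more citations each."""
    ranked = sorted(per_paper_cites, reverse=True)
    return sum(1 for rank, c in enumerate(ranked, start=1) if c >= rank)

def papers_with_at_least(per_paper_cites, threshold):
    """How many papers clear a given citation threshold."""
    return sum(1 for c in per_paper_cites if c >= threshold)

# Invented per-paper citation counts, purely for illustration.
cites = [2100, 950, 800, 410, 300, 120, 95, 60, 40, 12, 8, 3, 0]
print(h_index(cites))                    # 10
print(papers_with_at_least(cites, 100))  # 6
print(papers_with_at_least(cites, 30))   # 9
```

So when Dennett’s h-index sits one shy of a hundred, that means ninety-nine of his papers have each been cited at least ninety-nine times.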

Barry Simon, mathematical physics giant

Barry Simon, IBM Professor of Mathematics and Theoretical Physics at Caltech, is without a doubt a giant. He’s the Simon in Reed-Simon’s four-volume masterwork Methods of Modern Mathematical Physics, which has been cited over 21,000 times since it was published in the seventies; generations of physicists used his text as one of their standard references all over the world – it was so important that it was brought behind the Iron Curtain during the Cold War and translated into Russian so as to be used over there. On MathSciNet he’s second only to the French nonlinear PDE specialist Pierre-Louis Lions, who won the Fields Medal in 1994. On Google Scholar he has 75,112 citations from 755 publications, including a staggering 390 with 10+ and 289 with 30+ and 220 with 50+ and 127 with 100+ and 21 with 500+ and 10 with 1,000+ (none with 2,500+ aside from his textbook series, which means he “did it the hard way”, like Noga Alon). The Notices of the AMS talks about how “Barry Simon’s influence on our community by far transcends his approximately four hundred papers, particularly in view of 126 coauthors, 50 mentees, 31 graduate students, and about 50 postdocs mentored”.

Great statistics, all of them. But what does that look like in terms of personal anecdotes from the people he’s touched over the years?

I enjoyed this collection of reminiscences by his friends, colleagues and relatives on the occasion of his 60th birthday. Here are some of them. The most salient theme that emerges is that Barry is fast, really fast, at everything, and really good at multitasking (writing papers while listening to lectures and offering shorter proofs after their conclusion, etc.). He just has this tremendous dynamo of a brain.

Gary Chase:

As undergraduates at Harvard, several of us took the Real Variables course given by Professor Lynn H. Loomis. Loomis came to class with few or no notes and lectured fluently. One day he was writing a proof on the blackboard and paused briefly, scratching his head. He stepped back from the board and looked into the audience where he saw Barry. Loomis said “Are you thinking what I’m thinking?” and Barry said “Yes.” Loomis said “I thought so” and then went back and completed the proof. The rest of us had no idea what this thought was, since it was not verbalized by either of them! This incident was one of many that convinced us that Barry was uniquely gifted in such matters.

Yosi Avron:

At Princeton there were the lunch time seminars and the Math-Phys seminars. In the lunch time seminar, Barry was the prima ballerina. (Can you imagine Barry on his tiptoes?) He would normally either tell a new result of his or a new result of someone else. With Dyson, Lieb and Wightman in the audience, most grad students and postdocs were too terrified to expose their slowness if they were to ask an innocent question. Most of the time, nobody dared open his mouth. The notable exception was the fresh grad student from Harvard, Alan Sokal, who never had a fear of authority and was sufficiently smart and self-confident to argue with Barry.

The math phys seminars were a different business. There was an outside speaker most of the time. Wigner would usually show up and ask his typical Wignerian questions. Barry would sit in the audience and write a paper. From time to time he would look up from his notes and ask a question that would unsettle most speakers: Someone in the audience seemed to know more about what he was talking about than himself. Sometimes, at the end of the talk, Barry would go to the board and give his version of the proof, which was always slick.

E. Brian Davies:

In 1983 Derek Robinson invited me to visit him at the Mathematical Institute in Canberra for a month. As one of the inducements he mentioned that he had also invited Barry Simon for the same period. The prospect of seeing Barry, Derek and kangaroos was enough to make my decision, and in July 1983 I set off on the gruelling journey.

I was very surprised shortly before my departure to hear from Derek that he was unable to avoid a commitment to go to a conference in Japan, and would not return until ten days after my arrival. Barry arrived about the same time as myself, and I asked him a problem about heat kernels of Schrodinger operators which I had solved in one dimension by a method that could not be extended to higher dimensions. At that time I had read several of the papers on hypercontractivity, a concept that was invented specifically to solve problems in quantum field theory, in which there are an infinite number of degrees of freedom.

Barry listened to my question carefully and agreed that some progress should be possible. The very next time I saw him he told me that he had solved the problem by an improvement of the standard hypercontractive estimates that made use of the finite dimensionality. He then proceeded, without notes, to give me a lecture on the subject, explaining every step in detail, including the infinite summation procedure that allowed one to pass from L2 → Lp bounds to L2 → L∞ bounds by controlling the constants involved.

Although very impressed, I had the temerity to suggest at the end of his lecture that although he had clearly done what was needed, I did not like his solution aesthetically, and would prefer an account that depended on differential inequalities. I had the feeling that this would yield better constants and be in some sense more natural. On the next occasion Barry rewrote the entire account in this language, and we realized that this was going to have enough ramifications to occupy the entire month. By the time Derek got back we were fully committed to the project, and I now feel embarrassed that I did not spend nearly enough time talking to him. Derek was, however, the person who coined the term ultra-contractivity, which was the focus of almost all of my research over the next ten years.

Juerg Froehlich:

I first met Barry(-boy) in front of the conference building of the Les Houches summer school, back in 1970. He introduced himself like this: “Hi, I’m Barry Simon from Princeton University.” I must have replied roughly as follows: “Good afternoon,  Professor Simon. I guess I am Juerg Froehlich, a student from ETH in Zurich. How are you?”
This was to be the beginning of a wonderful friendship that has lasted to this day.

Barry was a very lively “student” at that famous Les Houches school (constructive quantum field theory and statistical mechanics). He was already pretty voluminous, physically, but also mentally, as he appeared to always know more than the lecturer. He was engaged in a friendly competition with Alain Connes, also a student at that school: The competition was about who of the two was faster in simplifying the proof of the lecturer or rendering it more elegant. I will only say that both of them performed admirably!

Barry was almost always writing some manuscripts, even during lectures. Nevertheless he appeared to always understand what was being taught to us. In the library, a new Simon preprint could be found every week – the school lasted for eight weeks. These preprints were handwritten. Some of them were co-productions with people like Graffi, Grecchi (Borel summability of the anharmonic oscillator), or the  late Raphael Hoegh-Krohn (hypercontractivity). These preprints were consumed like French croissants by the other kids (like myself), because they were very clearly written and not enormously technical and could therefore be understood by people whose level was not terribly advanced yet.

David Ruelle presented some rather qualitative lectures about statistical mechanics. (He likes to be qualitative and write only few formulae, in his lectures.) Barry asked him: “David, when are you going to start to do some serious work?” David replied: “Go to the library and read my paper about superstable potentials. This is as serious as I can get.” I imagine that the contents of that paper will appear in Barry’s second book on statistical mechanics that we are all awaiting impatiently.

Barry had wonderful qualities of all kinds! For example, he always applied for money from the NSF, not just for himself but also for Valja Bargmann and for me, and probably for further members of the club. And he apparently got the money. He was unbelievably well organized and efficient! He would always write the paper when the research was done, and he did it in five percent of the time ordinary mortals need to write such papers. I once told Barry that I always wanted to be at a department with a colleague like him, who takes care of essentially everything and doesn’t forget anything. He said: “Juerg, I only know one place where you could have a colleague like me.” (He had already moved to sunny California.)

Fritz Gesztesy:

Barry’s lightning mental agility, his extraordinary talent to strip the unnecessary clutter surrounding an argument, getting straight to its core, and his remarkable ability to see connections to related topics other than the obvious ones in question, are legendary…

The following observations will sound familiar to many of us:

Scenario #1: You joyously walk into Barry’s office to present him with a new idea, just to exit a few minutes later, your idea having been shred to pieces. “Back to the drawing board” is sometimes his comment, with a broad grin on his face. All this sounds more cruel than it is: After all, you have just been saved from going down a cul-de-sac and you can start regrouping!

Scenario #2: You proudly walk into Barry’s office to show him something new. Barry thinks for a second, then jumps up to the blackboard and explains to you in no uncertain terms what you “really had in mind.” That’s great, because at this point you realize a joint paper will eventually be written.

Scenario #3: You march into Barry’s office and this time you’re convinced you have a blockbuster at your fingertips. You start to explain to Barry, and then he says “time out” and silence fills the room. After a bit of eerie silence you realize this time you’re going to write a very nice joint paper with Barry. Of course, after a few more moments scenario #2 will be repeated, but that’s quite alright!

Scenario #4: Barry asks you into his office and explains what he was struggling with lately. (He likes to put it this way, though: “I was banging my head against the wall about this…”) After he suggests jointly working on this you return to your office with a big puzzle in your hands. Those rare instances in which you can actually do the job asked of you and complete the argument are priceless.

Barry likes to pick on me since I’m usually not afraid of computing anything, well, almost anything (while he doesn’t have the patience to do so!). So once in July of 1997, he confessed to me that he had a terrible contradiction in his long manuscript on “The classical moment problem as a self-adjoint finite difference operator” (it later appeared in Adv. Math. 137, 82–203 (1998)), but he just couldn’t find the error. So I was supposed to look at this. It was an intricate puzzle! I spent a day on it and well after midnight was sure I had found the error. So I e-mailed him what I thought was the culprit and slumped home to the apartment. Next morning I opened my e-mail and there was Barry pointing out that I was dead wrong. It was “back to the drawing board” as he still grinned during our brown bag lunch meeting that day. I was dejected! Well, I had another day before going away with my wife to Hawaii, and I was not about to let this ruin our vacation! So I frantically computed like a dog and finally saw the light: This was it! I decided to treat myself to dinner and left after e-mailing him my second attempt to find the culprit. After returning from dinner late that night, I had received quite a different message from Barry. It started out: THANK YOU, THANK YOU, … and went on like this for half a page.

Evans Harrell:

One of the rare categories in which I can compete pretty well with Barry is in the execrableness of our handwriting. I was at MIT as a postdoc when we wrote our long paper on the Stark effect. Back in those days, one actually wrote articles by hand and a secretary, using a device called a “typewriter” turned them into manuscripts, leaving blanks for formulae to be inserted. As a lowly postdoc I didn’t get first pick of the secretarial staff, and the manuscript ended up in the hands of a well-intentioned but struggling secretary who would produce about one page per day, which was usually sent back multiple times with corrections, often amusing. One day a favorite adjective of Barry’s, “operator-theoretic,” came back as “operator neurotic,” and I knew the manuscript was taking its toll on her. With lots of encouragement and little gifts she finished the manuscript after months of work, as the term ended. But she didn’t return the next term — it was doubtless the last mathematical manuscript she ever typed.

Barry has always been remarkable for his vast knowledge of mathematics, so it was many years before I can recall ever telling him a published theorem he didn’t already know. One day I saw Barry in Princeton shortly after a meeting and told him about an old inequality for PDEs, which, as I could tell from his intent look, was new to him. I said, “It seems to be useful. Do you want to see the proof?” His response was “No, that’s OK.” Then he went to the board and wrote down a flawless proof on the spot.

Ira Herbst:

Barry, Yosi Avron and I were working on magnetic fields. As everyone knows Barry is a very fast worker and he writes up his work even faster. Barry and Yosi felt we should write something and as usual I wanted to get more done first. One day the two of them arrived in my office and began trying to convince me again that we should write something up.  I protested, at which point Barry took his hand from behind his back and with a smile produced a manuscript which he had presumably written the night before.

Woody Leonhard:

Many of you know Barry from his academic work and community achievements. I have a rather, uh, different perspective. I had the distinct honor and privilege of co-writing a handful of computer books with Barry, including several Mother of All Windows books, and The Mother of All PC Books. 

I’ll never forget Barry’s squeals of delight when he found foolish inconsistencies in Windows, the way his voice would drop low – and he’d talk fast – when he was working through a particularly snarly problem, and the way he’d rub his hands with glee when a solution suddenly appeared.

Barry wrote about PCs with extraordinary clarity and wit. The Mother books became (in)famous for their casts of characters – no dry technical mumbo-jumbo here. My favorite character from the early Mother books was the eight-legged cockroach (and bug expert) known as Erwin. We gave Erwin the enviable assignment of pinpointing and explaining bugs in Windows, a task for which he was eminently qualified. 

Barry’s one of the most intensely intelligent people I’ve ever met – and delightful, in every sense of the term. Except for the puns. The puns were really, really bad.

Ed Nelson:

In the late 1960s, Barry was a graduate student in physics at Princeton and attended some courses I taught. I soon learned that I did not need to prepare with great thoroughness; it was enough to get things approximately right and Barry from where he was sitting would tell us how to get them precisely right. I miss Barry.

Barry received an ugly, uncivil letter from a mathematician complaining that he had not been given sufficient credit in a volume of Reed-Simon for his work on a certain topic. Barry responded with a dispassionate two-page letter calmly reviewing the entire history of the topic and the contribution of each person to it. He concluded the letter with a one-sentence paragraph: “I hope that you will receive this letter in the same spirit in which you sent yours.”

Shortly after moving to Caltech Barry came east for a visit. He said that someone had stolen his attaché case. When we asked whether he had lost anything of importance, he replied, “Only the paper I wrote on the flight.”

Once Barry wrote a paper on hypercontractive semigroups and when he got it back from the typist, every instance of “hypercontractive” was rendered as “hypercontraceptive”.

Peter Perry:

As an undergraduate I took Quantum Mechanics from Barry, little knowing that I would later wind up a student. At the same time, I was taking Functional Analysis from Ira Herbst. The Quantum Mechanics course met on Tuesdays and Thursdays and there was a break in the middle of the lecture. One day I had my copy of Methods of Mathematical Physics, Vol. I, Functional Analysis with me. Barry walked up to me during the break and broke into a big smile followed by mock indignation when he saw that I had a copy of Reed-Simon. “Don’t read that stuff!” he admonished me. “It’ll pollute your mind! It’s worse than comic books!” Long before Barry became a department chair he was already a master recruiter. Several years later, I had the good fortune to begin thesis work with Barry as a graduate student.

Beth Ruskai:

After he moved to Caltech, Barry invited me to spend a month there, which, I think, ended up as April, 1984. Since e-mail was not yet widely used, he phoned me in February to arrange times for us to talk before his schedule filled up. 

While I was there, I sat in on some of his lectures. One day he wrote a bound on the board and claimed it followed from the Schwarz inequality. I interrupted and said that was not true. He stopped, said “You’re right”, paused about 10 seconds and proceeded to give a different, correct proof of the desired bound. He did this so quickly that I assumed the second argument was what he’d prepared and the first was the kind of blackboard glitch I often make. Later I learned that the incorrect Schwarz claim was in his Trace Ideals book, and he’d come up with a different argument on the spot.

Yakov Sinai:

In the mid-1970’s, Barry visited Moscow. One day he went into a store to buy some eggs. He handed over a 10 ruble bill to the storekeeper and said “Eggs” in Russian; it was the only word in Russian which he knew. She asked him whether he wanted to spend all 10 rubles (a considerable amount in those days) on eggs. But this was a different phrase which Barry didn’t understand, and in reply he just smiled his charming smile. She then gave him a check for a hundred eggs.

The following day, Barry gave a seminar at the university. It was his last day since he was leaving Moscow the next day. After his talk, he distributed the eggs among the participants. Following the American tradition, undergrads and graduate students received the largest number of eggs and professors received almost nothing.

I also enjoyed this issue of the Notices of the AMS, also celebrating Barry Simon’s contributions and impact.

Evans Harrell:

The very first article in Barry Simon’s publication list, which appeared in Il Nuovo Cimento when he was a 22-year-old graduate student, was concerned with singular perturbation theory. This paper showed that a certain regularized, renormalized perturbation expansion for a two-dimensional quantum field theory model converges with a positive radius of convergence. As Barry candidly admitted in that article, in itself the result was of limited significance, but in a subject for which at that time “all the mathematically suitable results…are of a negative nature,” it announced a new, more constructive era.

To the reader familiar with Barry Simon’s works on mathematical physics of the 1970s, it is striking how many of the hallmarks of his technique are already apparent in this first article. Before entering deeply into the research, Barry first carried out a thorough and penetrating review of the entire literature on the subject. This signature of his method was something those of us who were students at Princeton in the 1970s would witness every time Barry began a new research project: Seeing him emerge from the library shared by Jadwin and Fine Halls with a mountain of books and articles, it was humbling to realize that Barry was not merely brave enough to collect all of the knowledge about the next subject he wished to study, but seemingly overnight he would absorb it in detail and carefully assess each contribution for its mathematical appropriateness. True to form, in that first article, Barry laid out which claims in the literature were established with mathematical rigor, which were plausibly to be believed, perhaps with some extra attention to assumptions, and which were frankly dubious. Finally, Barry’s own way of formulating the problem was sparse and clear, and his reasoning incisive.

Percy Deift:

The 1970s were a very special time for mathematical physics at Princeton. The list of people who participated in math-phys at Princeton University in those years as students, postdocs, junior faculty or senior faculty, or just visitors for a day or two reads like a who’s who of mathematical physics. Leading the charge were Arthur Wightman, Elliott Lieb, and Barry Simon. But there were also Eugene Wigner, Valentine Bargmann, and Ed Nelson. And in applied mathematics, there was Martin Kruskal, still flush with excitement from his seminal work on the Korteweg-de Vries equation, and across the way at the Institute were Tullio Regge and Freeman Dyson, doing wonderful things. Barry was a dynamo, challenging us with open problems, understanding every lecture instantaneously, writing paper after paper, often at the seminars themselves, all the while supervising seven or eight PhD students.

I was one of those students. I had an appointment to meet with Barry once every two weeks. I would work very hard preparing a list of questions that I did not know how to answer. Say there were ten questions; by the end of the first ten minutes in Barry’s office, the first six questions were resolved. Regarding questions seven and eight, Barry would think about them for about two or three minutes and then tell me how to do them. Regarding questions nine and ten, Barry would think about them, also for about two or three minutes, and say, “I don’t know how to do them. But if you look in such and such a book or paper, you will find the answer.” Invariably he was right. So in less than half an hour, all my questions were resolved, and as I walked out of the door there was the next student waiting his turn!

Barry is one of the most prolific mathematicians of his generation. It was in the late 1970s, around the time that we were working on nonisotropic bounds for eigenfunctions, that I got a glimpse of the speed with which Barry did things. Soon after Volker Enss introduced his seminal time-dependent ideas on spectral theory and scattering theory, a few of us went to Barry’s house in Edison, New Jersey, to discuss a potential project inspired by Enss’s work. We spent the afternoon laying out in detail a list of problems that needed to be addressed and left in the late afternoon. The next morning Barry came into the office: Not only had he solved all the problems on our list, but he had in his hand the first draft of his subsequent paper [3]! We were overwhelmed. For a young person like me, this was most discouraging. And I was doubly discouraged: Barry was younger than I was!

Barry has many fine qualities as a colleague and as a researcher, but I would like to focus on just one of them, viz., Barry’s keen sense of fairness and correct attribution of results. People in orthogonal polynomials know well Barry’s insistence on calling the recurrence coefficients for orthogonal polynomials on the circle Verblunsky coefficients, in recognition of the almost forgotten seminal work of Samuel Verblunsky. But I would like to tell a different story. In the early 1980s Barry was in Australia, where he met up with Michael Berry, who was also visiting. Berry began telling Barry about some curious and puzzling calculations he had been making in quantum adiabatic theory. Barry immediately understood that what was really going on was a matter of holonomy, and with characteristic speed he wrote and sent off a paper to Physical Review Letters, pointedly titled “Holonomy, the Quantum Adiabatic Theorem, and Berry’s phase.” In this way, a major discovery that could quite easily have become known as “Barry’s phase” was fixed in the literature as “Berry’s phase,” and justly so.

Lon Rosen:

I must confess that my first meeting with Barry was far from auspicious. In 1967 I was in my first year of doctoral studies at the Courant Institute. Feeling isolated, I was reconsidering my decision not to have chosen Princeton for graduate school. I asked a friend to arrange a lunch meeting for me with a typical student of Arthur Wightman’s. I knew little about the “typical student” who was chosen (Barry Simon), although I was familiar with his name because Barry and I had both been Putnam Fellows in the 1965 competition.

Some typical student! He practically tore my head off. Whatever I said about my interests or ideas, Barry would trump it. I’d never met anyone else with such extensive knowledge, amazing recall, and proofs at the ready. I still haven’t. Thanks to Barry, I stayed put at Courant. Fortunately, James Glimm, who was to be my terrific thesis advisor, soon joined the faculty there. I learned later that Barry had been going through a rough patch in his personal and professional life around the time we met and that the fire-breathing dragon who had me for lunch was actually a gentle prince in disguise, if I may be permitted a fairy tale metaphor.

Three years later I gave a seminar at Princeton on the subject of higher-order estimates for the P(φ)₂ model. At the conclusion of the seminar Barry showed me a clever bootstrap trick that quickly established my most difficult estimate—or at least a weaker but perfectly acceptable version of it. I was grateful and revised the published paper accordingly.

This experience was not unique to me. As many speakers know, Barry’s rapid-strike ability could be unnerving at seminars. He would sit front row centre, working on a paper, only to surface with astute observations, counterexamples, or shorter proofs. This penchant for “tricks” arises, it seems to me, from Barry’s imperative to understand everything in the simplest possible way.

Mike Reed:

When people ask, “How long did it take for you and Barry Simon to write those four volumes of Methods of Modern Mathematical Physics?” I usually say, “About ten years,” since we started in the late 1960s when Barry was a graduate student and I was a lecturer at Princeton, and we finished in the late 1970s. Writing those books took 50 percent of my research time for ten years but only 10 percent of Barry’s research time, and that wasn’t because I contributed more—far from it. The reason is that no one works faster than Barry. He instantly sees the significance of new ideas (whether in mathematics or physics), understands the technical structures necessary to bring the ideas to fruition, and immediately starts writing.

Barry’s legendary speed sometimes got him into trouble. I remember going to seminars at Princeton with Barry carrying new preprints from more senior mathematicians. As the seminar proceeded, Barry would read the preprint, absorb the idea, understand the correct machinery to prove a stronger result, and begin writing. No one is more generous than Barry at giving credit to others; he always does and did. Nevertheless, when Barry’s paper with a stronger result and a better proof would appear before the original result, the preprint’s author would sometimes have hard feelings. These feelings would usually dissipate when he or she actually met Barry and discovered how open and generous he is.

Of course, we were pleased and proud that so many colleagues and students found our books useful. We both still teach out of them and field email questions about the problems. Since we were so young when they were written, we got lots of funny remarks at conferences from mathematicians who didn’t know us, such as, “You can’t be the Simon who wrote those books; you’re too young,” and, “Hah! I always thought that Reed was Simon’s first name.”

Very highly-cited mathematicians, according to Google Scholar

(This is all an exercise in intuition-building.)

The issue with Google Scholar’s search-by-tag is that it’s both insensitive and unspecific: the vast majority of mathematicians aren’t tagged with “mathematics”, like Terry Tao, and the majority of search results aren’t mathematicians but researchers who use math, like Eric Lander, founding director of the Broad Institute of MIT and Harvard and biology professor, who’s the top researcher in the entire world in the “mathematics” tag. There’s a reason he has 430,000+ citations, over triple that of the 2nd ranked guy, and quadruple that of the first name I recognize, mathematical giant Benoit Mandelbrot – it’s to do with papers in biology just being way more cited than in math.

(Actually, upon Googling, Eric turns out to be an applied mathematician, so I’m forced to eat my hat. And a spectacularly decorated one too: IMO silver medalist, Westinghouse Science Talent Search winner at 17 for a paper on quasiperfect numbers, Princeton valedictorian, Rhodes Scholar at Oxford, MacArthur Fellow, combinatorialist and representation theorist for a while. He looked to be one of the best of his PhD class worldwide for sure. The only difference between Eric and other handful-per-year talents like him was that he decided not to spend his life in “such a monastic career”, despite liking math, so he wandered around academically for a while from managerial economics (at Harvard) to information theory to neurobiology, where he found a rich trove of unexplored problems to tackle with math; he finally retrained as a geneticist. Actually there’s another difference: Eric had that once-in-a-generation combination of technical talent and leadership ability, founding multiple research centers like Broad and Whitehead that became world leaders in their field. Holy hell. I apologize to Eric for insulting him in my ignorance, and I take back the specificity claim above. Also I digress.)

The issue with Google Scholar’s search-by-tag is its insensitivity. One way around this is to use MathSciNet. But MathSciNet requires a paid subscription, so in lieu of direct access to a database I can play with, I’m using the website Most cited mathematicians as a proxy instead, to come up with additional names to search in GS.

Some perspective first. Let’s start with a Fields medalist who got his PhD in 1995 at age 26, the Russian representation theorist Andrei Okounkov. This is a mathematician good enough to have his own MacTutor History of Math page. He was damn good right off the bat: NSF grant for a postdoc position at MSRI Berkeley, IAS visitor, Sloan Fellowship in 2000, Packard million-dollar no-strings fellowship in 2001, full professor at Princeton in 2002, EMS Prize in 2004, culminating in the Fields Medal in 2006. The EMS prize citation calls him “an extremely versatile mathematician, he found a wide array of applications of his methods” and the Fields press release talks about how “his work is difficult to classify because it touches on such a variety of areas”. Andrew Wiles: “One of his greatest strengths is his amazing versatility. He works in many different fields of mathematics and succeeds in taking results from one area and applying them in a seemingly quite different field.” This is a brilliant, brilliant man. How does he fare on Google Scholar?

(There are the usual caveats about citations being an extremely imperfect proxy for research output. Yes, yes, Goodhart’s law and all that. I am trusting that you will be charitable enough to go with the basic thrust of this whole exercise. This isn’t to put down Andrei at all! After all, I have exactly 0 citations myself, having washed out of academia because I was too dumb to hack grad school for physics.)

Andrei has 9,040 citations from 128 publications, of which 80 have 10+ cites and 47 have 47+ and 20 have 150+ and 6 have 300+ (but only 1 has more than 440), so he’s a Jacob Fox-type, just consistently damn good.

To give another perspective on how well Andrei does – and now I wish I’d done this first – Scott Aaronson, one of my favorite academics for his blog output and take-all-comers attitude and polymathic bent and expository style etc., has exactly 6,600 citations from a voluminous 238 publications, including 81 with 10+ cites (yay!) and 17 with 100+, only 2 with more than 500, and none above 700. This isn’t quite fair to Scott, actually, since he’s only 38 years old and got his PhD in 2004 (much more recently than I thought…), so the fact that his stats compare favorably to the best guys of the 2000s is itself impressive(!).

Andrei is damn good, and definitely a star. So is Scott Aaronson. But it’s fair to say they aren’t exactly giants. What does a giant look like?

Noga Alon is without a doubt a giant. Combinatorialists don’t get as many citations as analysts (let alone applied guys), so it’s pretty impressive what he does manage. Noga Alon is so big his students are themselves star advisors to today’s star young talents. It’s fun to read the CV of a giant like Alon – it’s three unformatted summary paragraphs of positions and prizes and academy memberships and editorships. (I wonder why giants even need CVs in the first place. When you’ve Transcended like Alon, everyone knows your name, learns about your foundational results in their apprenticeships in graduate school, goes to conferences in honor of your legacy, etc. What on earth is a CV even supposed to achieve?)

Noga Alon, who obtained his PhD in 1983, has 45,119 citations from 748 publications, of which a staggering 464 have 10+ cites, 306 have 30+, 215 have 50+, 96 have 100+, 19 have 300+, 3 have 1,000+, and one book, The Probabilistic Method with Joel Spencer, has nearly 7,000(!). I’d like to emphasize that he did it “the hard way” – his sole GS tag is “combinatorics”.

Whew.

You know, I follow Steven Strogatz (PhD 1986) on Twitter. I vaguely perceive him as a writer of influential textbooks and purveyor of pop math, who seems generally respected (albeit lukewarmly) by all and sundry. I was dense – those, in retrospect, are the tells of a giant. Now it needs to be said that Strogatz and Alon can’t be compared directly: the former is an applied mathematician working in complex systems and networks, a hot, highly-cited field (so he should be exactly the kind of person I’ve always wondered about – roughly what Cosma Shalizi would look like if he were just much stronger at math, or whatever), whereas the latter works in combinatorics, the citation equivalent of walking uphill both ways in the snow. But!

Steven, as it turns out, is a monster. He’s the second-most cited complex systems scientist in the world, after Boston University physicist Eugene Stanley (who also does stuff like biophysics, econophysics, and networks). Steven’s been cited a breathtaking 94,143 times in only 270 publications, of which 130 have 10+ cites, 105 have 30+ (that’s surprisingly “dense”), 39 have 200+, 23 have 400+, 8 have 2,000+(!), 3 have 8,000+(!!), and the Watts-Strogatz paper Collective dynamics of ‘small-world’ networks (1998) has an awe-inspiring 39,000 cites. (Duncan Watts, the “Watts” in that coauthorship, is the third-most cited complex systems scientist in the world, just 4k cites and change behind Steven.) That was a seminal paper in the true sense of the word – the most cited networks paper of all time, one of the top ten most cited physics papers of all time, one of the top hundred most cited papers of all time outright. His textbook Nonlinear dynamics and chaos has been cited nearly 11,500 times. I’ve never seen such a steep cites curve that wasn’t just “one jackpot and an army of duds”; Steven really does have it all.

(More than all this simplistic reduction-to-citations, he’s a teacher and expositor, one of the greats of our generation. I’m honored just to be able to retweet him.)

Speaking of Cosma (2001 PhD, also somewhat younger than I thought), I’m tremendously delighted to discover he’s on Google Scholar, since so many good guys aren’t. It’s heartening to find that perhaps my all-time favorite geek, whose blog I turn to for sheer intellectual pleasure and quote more than any other academic’s, also happens to be a research star outright, although (to be frank) he’s the jackpot type. Out of 16,071 citations from only 90 publications, nearly 11,700 (almost three-quarters!) come from two works alone: Power-law distributions in empirical data (coauthored with Clauset and Newman) and An introduction to econophysics (actually Mantegna and Stanley’s book, which Google Scholar seems to attribute to him). That answers my confusion as to why he isn’t full professor yet, I guess? He does have 46 with 10+, 31 with 31+, 21 with 60+, 14 with 100+, 7 with 200+. Oh well, there are many more ways to contribute to the marketplace of ideas than original research, eh?
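
One way to make the “jackpot type” versus “consistently damn good” distinction concrete is to ask what share of someone’s total citations comes from their top one or two works. A throwaway Python sketch, plus a sanity check of the three-quarters claim using just the two totals quoted above:

def top_k_share(cites, k=2):
    """Fraction of total citations contributed by the k most-cited works."""
    top = sorted(cites, reverse=True)[:k]
    return sum(top) / sum(cites)

# Made-up profile, just to show the function itself:
print(top_k_share([7_000, 4_700, 300, 250, 120, 80, 40]))  # ~0.94, a clear jackpot

# Sanity check using only the two totals quoted above:
print(11_700 / 16_071)  # ~0.73, i.e. almost three-quarters from two works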

Another delightful surprise: Michael Nielsen (1998 PhD at 24) is actually a giant. That’s particularly astounding for a “freestyler” (a nontraditional intellectual rather than an academic), and doubly so considering he’s so damn young (at 45 he’s in Terry Tao’s ballpark – bonus: he’s Aussie, and hence made a splash in the papers as one of Australia’s youngest academics, joining Geordie Williamson and Terry). He’s the most cited open scientist of all time, the most cited intelligence augmentation researcher of all time, and the second-most cited quantum computing scientist alive (this is misleading actually, since everyone’s citations mix fields, plus he’s a jackpot type too, but whatever). Mike has 53,035 citations from 120+ publications, of which 75 have 10+, 54 have 54+, 21 have 200+, 16 have 300+, 9 have 500+, 3 have 1,000+, and his textbook has – hold your breath – over 37,000 citations, nearly as much as Watts-Strogatz, which is mind-bending. It’s all the more surprising when you realize that the textbook he co-wrote with Isaac Chuang (also a damn young guy for such an impactful scientist, PhD 1997) was published when he was just 26 years old, and it went on to become the standard text in the (exploding!) field of quantum information theory.

We’ve been skirting the subject too long. How does Terry Tao do?

The first thing I noticed was that all of his 6 most-cited papers were co-written with the applied mathematician Emmanuel Candes; none of his papers without Candes exceeds 1,600 cites. (That should’ve clued me in to the fact that Candes is a giant.) Terry has 4 publications with 6,000+ cites, which except for the Candes collaboration would’ve been unheard of (to me); his seminal field-founding Exact signal reconstruction from highly incomplete frequency information has nearly 15,000. He has 9 with over a thousand, 12 with 500+, 20 with 300+ (where he starts to “become Noga Alon”), 31 with 200+, 55 with 150+, 89 with 100+, 217 with 30+, 304 with 10+, and about 585 publications in total. Noga Alon’s tail is much fatter – he has more 30+ papers than Terry has 10+ papers, which is amazing – perhaps mostly due to the 13 extra years between their PhDs. Terry is the most cited analyst in the entire world, the most cited PDE specialist alive, and certainly the most cited random matrix theorist ever. He is also the second-most cited combinatorialist alive, after Ron Graham himself; considering that the latter has only 0.2% more cites, it won’t be long before Terry is the most cited mathematician on the planet in all (4) of his tags – tags which aren’t even that closely related.

A few words too about the mathematical physics giant Barry Simon, because I’ve recently been reading his colleagues’ remarks, from the conference in his honor, about his scintillating mind. (To give an idea: “The body of his work during the time of his doctoral research was of such importance that he was immediately appointed to assistant professor” after his PhD at 24.) Barry has 75,112 citations from 755 publications (within 1% of Noga Alon’s count!), of which 390 have 10+ cites (2nd only to Noga), 289 have 30+, 127 have 100+, 21 have 500+, 10 have over a thousand, 3 have 2,000+, and his famous Methods of Modern Mathematical Physics textbook with Mike Reed has 21,000 citations. Barry looks like “Noga with a higher peak”, since his 10 thousand-plus-cite papers beat Noga’s 3.

What does the most celebrated theoretical physicist since Albert Einstein himself look like?

Ed Witten never hit the jackpot: his most cited paper ever, Anti de Sitter space and holography, has only 11,300 citations. But he has 19 publications with 3,000+ cites, including 9 with 8,000+. He has 33 with 2,000+, 67 with 1,000+, 101 with 600+, 111 with 500+, 210 with 200+ (although his h-index is listed at 190, which can’t both be right – 210 papers at 200+ would force an h-index of at least 200 – so there must be a bunch of duplicate entries; okay, I can’t be bothered), 302 with 50+, and slightly under 400 publications in total, giving 195,352 citations.

What does the most prolific mathematician of all time by paper count look like?

Paul Erdos died in 1996, the same year Terry Tao got his PhD, which is sad since they could’ve perhaps collaborated. (Some people like to interpret this as a symbolic “passing of the torch”. I think that’s unfair to Terry, who’s a wonderful handful-in-a-generation talent but nonetheless isn’t a literal GOAT like Erdos.) Interestingly, Google Scholar only counts citations since 1980, if I’m interpreting the graph correctly. No matter. Although his h-index is low (“only” 115), his i10-index is absolutely unparalleled: Erdos published 790 papers with 10+ cites. He had 25 with 500+, 37 with 400+, 55 with 300+, 84 with 200+, 211 with 80+, 317 with 50+, 487 with 30+. All this makes for 80,967 citations in total, from 1,657 publications, perhaps with repetition and including non-journal stuff.
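
A quick refresher, since these two numbers keep coming up: the h-index is the largest h such that h of your papers have at least h citations each, and the i10-index is simply the number of papers with 10+ citations. A minimal sketch, again on made-up data – and it’s exactly this kind of arithmetic that makes duplicate entries easy to spot (210 papers at 200+ cites can’t coexist with an h-index of 190, as with Witten above):

def h_index(cites):
    """Largest h such that h papers have at least h citations each."""
    ranked = sorted(cites, reverse=True)
    return sum(1 for rank, c in enumerate(ranked, start=1) if c >= rank)

def i10_index(cites):
    """Number of papers with at least 10 citations."""
    return sum(1 for c in cites if c >= 10)

# Made-up toy profile:
toy = [100, 50, 17, 12, 9, 4, 1]
print(h_index(toy), i10_index(toy))  # 5 4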

(to be continued)

Some promising young mathematicians by MathSciNet citations

This list of mathematicians ranked by MathSciNet citations is nice. (I have a feeling it’s maintained by Quora’s John Math, because John always signal-boosts this site and nobody else does.) Let’s go through some of them.

Apparently there are only two mathematicians with more than 100 citations who got their PhDs in 2017. One is Amir Abboud, with 131, currently at IBM:

A native of Haifa, Dr. Abboud was just 12 years old when he began studying mathematics at The Open University of Israel, and 15 when he entered the University of Haifa, where he earned his BSc, summa cum laude, in computer science in 2010. He completed his MSc (2012) summa cum laude at the Technion—Israel Institute of Technology and his PhD (2018) at Stanford University, both in computer science as well. Since finishing his doctorate, Dr. Abboud has been employed by IBM Research’s Almaden Lab in San Jose, California, where he has been examining the computational complexity of fundamental computer science problems.

Dr. Abboud has earned awards for excellence throughout his academic career and has held visiting researcher appointments at MIT, the Simons Institute at UC Berkeley, and the University of Haifa. An invited speaker at professional conferences around the world, Dr. Abboud serves on the program committee of various gatherings, including the Symposium on Foundations of Computer Science, the International Colloquium on Automata, Languages and Programming, and the Symposium on Discrete Algorithms—conferences at which his submissions became the top-cited papers.

http://www.weizmann.ac.il/WeizmannCompass/sections/new-scientists/solving-complex-problems-from-a-to-z

The other is Sebastien Vasey, currently a Benjamin Peirce Fellow at Harvard with a PhD from CMU, whose research “investigates interactions between algebra, infinite combinatorics, category theory, and large cardinals, often using ideas from model theory”. Google Scholar actually lists him as having an incredible 427 citations, with an h-index of 14 and an i10-index of 15 (wow!). He’s an absolute star: his thesis won the Sacks Prize for the best mathematical logic dissertation in the world, and what a thesis – Superstability and categoricity in abstract elementary classes, a 572-page monster. His CV lists 36 papers(!), including multiple publications with Saharon Shelah(!!), 23 of which were submitted before he defended his thesis (holy crikey).

What about 2016 PhDs?

David Kamensky tops the list, which is to be expected given he’s an applied mathematician. His 36 papers have 870 citations on Google Scholar, of which 266 made their way to JM’s list. After obtaining his PhD from UT Austin in CSEM (computational science, engineering, and mathematics) he did a bunch of engineering-related postdocs at UCSD and Brown before landing an assistant professorship this year (congratulations!). He appears to be an infrastructure builder, which is awesome and underrated:

My research is interdisciplinary, with the unifying theme of exploring the use of flexible, automation-friendly representations of geometry in physics-based computer simulations. This work aims to address the current bottleneck in computational mechanics, namely that analysts spend a substantial amount of their time on geometry manipulation, which drives up the cost of applying computer simulation to design, optimize, or predict behavior of physical systems. Eliminating the need for manual intervention by analysts will ultimately enable artificial intelligent agents to set up, execute, and learn from physics-based simulations, contributing to automation of broader tasks, such as engineering design, scientific inquiry, and technically-informed decision-making. I have worked on a number of different numerical methods and applications …

https://david-kamensky.eng.ucsd.edu/research

The second-most cited young mathematician is – also predictably, per JM’s classification – an analyst: Ciqiang Zhuo at Beijing Normal University, whom ResearchGate credits with 22 publications totaling 351 cites, of which 217 made it to the list.

The cite counts really start to balloon before that. Among 2015 PhDs nobody was more cited than the Italian analyst Maria Colombo, age 30, who “leads the Chair of Mathematical Analysis, Calculus of Variations and PDEs” (wow!) and is an assistant prof at EPFL. She already has a staggering 76 papers, with 1,072 citations on GS (492 on JM) and an i10-index of 22(!). From her CV it’s clear she’s never not been a perfect scorer: 100/100 to finish high school, 110/110 in undergrad, 70/70 in her master’s. She has so many invited lectures at all the top institutions it’s ridiculous.

Among 2014 PhDs the 31-year-old Catalan PDE specialist Xavier Ros-Oton reigns supreme. The man, now an assistant prof at ETH Zurich, has coauthored with all the world’s top analysts: Caffarelli, Figalli the Fieldster, etc. Among his 1,556 GS citations is a paper with 432 cites, published in 2012, two years before he got his PhD. He’s got 27 papers with more than 10 cites, 20 with more than 20, and 44 papers in total. (RG lists 40 papers and 1,124 cites – about 0.7 RG cites per GS cite. Now I have an idea of how RG/GS converts.) He won the Premio Vicent Caselles for the best math PhD thesis in Spain (holy crikes). He was, at 29, the youngest-ever winner of the Premio Antonio Valle award for the best under-34 researcher. He finished a 5-year undergrad program in 4 years and still ranked 1st overall.

2014 seems to have been a good year for analysts: the top 3 were all analysts. We also get the “youngest” ICM speaker here in terms of years since PhD (although he’s actually 36): algebraic geometer/combinatorialist June Huh, who (spectacularly) is a visiting professor at the IAS. He’s been a Clay Research Fellow since 2014 and an IAS Veblen Fellow since 2014 too (immediately out of the gate!), and he won the New Horizons prize in 2019. To give an idea of what “spectacular success” looks like cite-wise: he has 464 GS cites from 26 papers (including a 99-cite single-author paper, wow), of which only 144 made it to JM for some reason, and only 12 papers with 10+ cites.

We also have the “youngest” number theorist to make it to JM’s list: Michael Rassias at the IAS/ETH Zurich, who has 745 GS cites (229 JM) from a staggering 95 publications (45 papers in journals, 7 in collected volumes, and 11 books), including 3 with 100+ cites (well done!) and 22 with 10+.

2013 is the most recent year in which all of JM’s math subfields have entrants. An analyst again tops the list: nonlinear functional analyst Yisheng Song, who’s been a distinguished professor at Henan University since 2014 and has a staggering 2,041 GS citations (822 JM) from 110 publications (whew, what a workhorse), including 45 with 10+ cites over 12 years – a run that actually began 6 years before his PhD (wow).

Except JM fcked up, because Song didn’t actually top the list. Graph theorist Michal Pilipczuk, assistant professor at the University of Warsaw and an expert on parameterized complexity, second on JM’s list, has way more GS cites: a mind-blowing 2,940 from 126 publications (149 per RG), including a 921-cite monster published by Springer and 3 other 100+ papers (Song has only 1).

We also have another ICM speaker: the 31-year-old Canadian analytic number theorist Maksym Radziwill (whose name I recognize as a Tao coauthor), who’s only 2nd in number theory here but has won the SASTRA Ramanujan Prize and the New Horizons Prize, and has published 40 papers. Unfortunately Google Scholar doesn’t list him. He got his PhD under Soundararajan (another Tao frequenter) at Stanford, then immediately went to the IAS, became an assistant prof at Rutgers a year later, won the SASTRA two years after that with Matomaki (Tao again!), got a Sloan the year after, then the Coxeter-James and Banach prizes, a Caltech full professorship, and an ICM speakership the year after that, at age 30. Whew. All this, and only 159 citations on JM’s list.

The best young geometer/topologist/dynamical systems theorist is Semyon Dyatlov, assistant prof at MIT, who works in physics-adjacent stuff: scattering theory, microlocal analysis, quantum chaos, dynamical systems, and general relativity. He was a Clay fellow right after his PhD, and has 1 paper with 100+ cites. He’s been cited 1,055 times on GS (371 JM), with 27 papers at 10+.

The best young probabilist? Either Nicolas Perkowski or Joseph Neeman. Perkowski, a junior prof at Humboldt U and Heisenberg researcher at the Max Planck Institute, has 806 citations from 44 papers, including 19 papers with 10+ and 2 with 100+. Neeman has an astounding 1,735 citations from 37 papers, including an amazing 4 with 100+ (indeed 4 with 180+) and 12 with 50+.

There is no contest at all for 2012, because that was the year Peter Scholze got his PhD.

I am extremely tempted to say, knowing full well how blasphemous this sounds, that Scholze is the second coming of Grothendieck. By his mid-20s the community was already coming to him in droves. He was considered a serious candidate for the Fields as early as 2014, when he was just 26 years old, which would’ve made him younger than Serre, the youngest-ever medalist. The way Michael Harris talks about him, I’ve never seen any of the 21st-century Fields medalists get that reception. Most of these young stars approach genius. Scholze isn’t just a genius, he’s a prophet.

The one thing that fascinates me is how hard it apparently is for algebraists to rack up citations on MathSciNet. An actual once-in-a-generation talent like Scholze manages only 371 on JM, whereas the top-cited mathematician from 2012, Chris Goodrich (who unfortunately isn’t classified by subfield), gets 1,017 – in fact the most cites of anyone on the list with a 2012-or-later PhD.

(I’ve given up trying to look up the Chinese researchers. The top-cited actually-classified person is Binlin Zhang, who doesn’t appear on GS or RG. This is a recurring theme.)

I’d like to highlight the algebraic number theorist Jack Thorne, perhaps the most distinguished member of the 2012 class who isn’t Scholze. He’s probably the second-youngest ICM speaker after Radziwill, at age 31. He did his PhD with Gross and Taylor at Harvard, then was a Clay fellow, then (at 28) became a Professor at Cambridge. He got the Whitehead Prize at 30 and won the SASTRA Ramanujan Prize the year after.

The class of 2011 PhDs raises the bar for most cites for the 7th straight year. The surprise this time is that it’s a probabilist, 35-year-old Ivan Corwin at Columbia, who’s got 1,125 JM cites (Hugo Duminil-Copin is second with only 512). It’s unusual for probabilists to be this highly cited; no wonder Corwin was considered a strong contender for the 2018 Fields. He was made full professor at 33. He seems to have really taken his time, getting his bachelor’s at 22 (when most on this list are nearing the end of their PhDs) and his PhD at 27. But then the year after he became a Clay fellow, won the Young Scientist Prize, got the first Schramm Memorial postdoc fellowship at Microsoft Research, and won a $150k NSF grant; two years later he held the first Poincare Chair at the Institut Henri Poincare, won the near-million-dollar Packard Fellowship, and was invited to present his work at the ICM (at age 30, rivaling Radziwill); and in 2018, at age 34, he won another NSF grant, this time half a million. He has 3,030 citations from 77 papers, 44 of which have 10+, 14 with 50+, 5 with 100+, and 3 with an incredible 300+. Whew. Another perspective on how good Corwin is: you have to go back 7 years, to 2004, to find a probabilist more cited than him – Ivan Nourdin, with 3,726 citations from a voluminous 155 papers, including 69 with 10+ cites and 7 with 100+.

Okay, now I don’t know what to believe. Applied mathematician Michael A. Scott is listed in JM as a 2010 PhD, but his own page says 2011 (computational science, engineering, and math from UT Austin). They’re most likely the same guy, and this one is a citations monster, perhaps the biggest from the 2010s. Google Scholar lists 5,571 citations from only 59 papers, of which 34 have 10+, 15 have 100+, 11 have 200+, and a ridiculous 6 have 300+, so he’s a few-but-ripe guy.

The next most cited pure mathematician of the class, PDE specialist Mihai Mihailescu, loses out to Corwin: 1,972 citations from 97 publications, of which 40 have 10+ and 4 have 100+. That’s still pretty damn impressive of Mihai, because no other analyst does as well until you go back to 2007 and hit Figalli the Fieldster.

Notable: star combinatorialist Jacob Fox, the best in his field this decade. In high school he won the Intel Science Talent Search. At MIT he won the Morgan Prize for best undergrad researcher. He got his PhD at 26 under Benny Sudakov – another Hungarian-style combinatorialist, himself trained under the giant Noga Alon – and the same year won the Denes Konig Prize and the MIT Simons Fellowship. At age 29 he got a Sloan fellowship and the million-dollar Packard fellowship. At age 30 he was invited to speak at the ICM, and at age 31 he became a full professor at Stanford. He already has 8 PhD students, including IMO superstar Lisa Sauermann (4 golds, 1 silver – 3rd all time), who graduates this year to immediately become Szego Assistant Professor at Stanford (wtf), and 50 or so coauthors (I lost count). He has a staggering 179 papers. It’s especially impressive that despite not a single one of them breaching 100 cites, he has 5 with 70+, 13 with 50+, 18 with 40+, 29 with 30+, and 58 with 10+, so he’s just consistently damn good, giving him 2,275 citations in total. Fox has more 10+ papers than anyone else in the 2010s. Another perspective on how good Fox is: you have to go back 7 years, to 2004, to find a more cited combinatorialist – Daniel Kral at Masaryk University, with 2,769 citations from 192 papers on GS, of which 81 have 10+ cites – although even then Fox already has 13 papers with 50+ cites to Kral’s 10.

Happy to note that the most cited member of the class of 2009 handily beats anyone from the 2010s, including the applied guys. Computer scientist Daniel Lokshtanov at UCSB has a staggering 5,923 citations from 193 publications, of which a mind-blowing 101 have 10+ cites and 15 have 100+. That’s both more papers than Fox and more cites than Scott. Wow! I don’t understand why he isn’t more decorated.

But Lokshtanov isn’t actually the top-cited researcher in JM’s 2009 class; he’s second. The top, by 1,924 to Daniel’s 1,636, is the American cryptographer Craig Gentry, who’s 46 years old. His educational path took a bunch of turns: BS in math at Duke in 1995, Harvard Law JD in 1998, then a sudden pivot to a Stanford CS PhD in 2009, 11 years later. But he’s impactful indeed. Ever heard of fully homomorphic encryption? Craig constructed the first FHE scheme in his doctoral thesis, which won him the ACM Doctoral Dissertation Award and (the next year) the ACM Grace Murray Hopper Award, then in 2014 the MacArthur. Craig, unlike Daniel, was also an ICM speaker.

Another standout is the Irish combinatorialist David Conlon, an ICM speaker at age 32, who got his PhD under Tim Gowers and was made Professor at Oxford at 34. He has 1,401 citations from 80 publications on GS, with 33 papers at 10+ cites and 8 at 50+.

The 37-year-old Australian representation theorist Geordie Williamson isn’t anywhere close to being the most cited person among 2008 PhDs (he’s not on GS unfortunately, and on JM he has only 319, third among algebraists, versus 2,009 for the 2008 class leader, Italian optimal transport expert Nicola Gigli), but he’s definitely the star of the class. Among mathematicians not named Scholze he was one of the frontrunners for the 2018 Fields Medal, so it was a bit of a surprise he didn’t win. He’s the youngest living member of the FRS, at 36. He got the Clay Research Award (not a fellowship!), the Chevalley Prize, the EMS Prize, and an ECM invited speakership in 2016, the New Horizons in 2017, and was an ICM plenary speaker in 2018. Aussie media calls him “the star of Aussie academia right now” – that’s outright, not just among mathematicians. Some neat quotes about him:

Fortunately, Williamson is a good talker – jaunty and light, his sentences tripping along before ending with an upward inflection, like a little trampoline kick-out off the final syllable. He’s a little goofy. He smiles a lot; his eyes go wide. You get the sense that inside his head is a banging dinner party where all these brilliant ideas are elbowing one another to get out and roam around.

“Geordie is a world leader in his field, and his presence is much sought after at all the major mathematical centres,” says Peter Sarnak, a professor of mathematics at America’s Institute for Advanced Study, in Princeton.

Williamson is playful, creative and casually nonconformist, a slacker-esque figure with a 100-megawatt brain. “Geordie has given surprising answers to very hard questions in unexpected ways,” says Professor Jacqui Ramagge, head of the University of Sydney’s School of Mathematics and Statistics. “What is different about his work is that lots of people had tried and failed where he succeeded.” A good example of this is the Lusztig conjecture, which was, since first being posited in 1979, the most famous “open problem” in representation theory. The entire field was convinced the conjecture was true, but they didn’t know why. Colossal amounts of computing power had been dedicated to the problem; lots of exceptionally clever people had devoted their careers to it. Then, in 2013, Williamson demonstrated that the conjecture was false. “I remember being at a conference in representation theory in Shanghai at the time,” says Anthony Henderson, “When the news came through about Geordie’s discovery, all discussion immediately ceased.”

Geordie went to primary school in Moss Vale – Leigh would drive him – but he got bullied for being a “smart-arse”. He then went to the Steiner school in Bowral, half an hour away, until year 7. He found maths a breeze, and got easily bored: “I wanted to know all this stuff that wasn’t being taught.” One day, on a four-hour car trip to Bathurst, he read a 656-page physics textbook. (He was 12.)

Williamson left school in 1999, with an ATAR of 99.45 (the highest score is 99.95). He enrolled at the University of Sydney in a Bachelor of Arts degree, with a double major in science and arts. Williamson chose pure maths as his science option, but only as a backup. “What really interested me was English,” he says. “I actually loved English more than maths. I remember in first year uni, I was like, ‘This maths stuff is just bookkeeping to make a solid theory.’ English was deeply interesting in a way that maths wasn’t.”

That changed in second year, when he came across the work of Évariste Galois. Born in Paris in 1811, Galois was a flamboyant prodigy and radical republican who, at the age of 18, solved a mathematical problem that had baffled experts for 350 years. (He died at the age of 20 in a pistol duel over a lover.) “It took the next 50 years for people to figure out how he had done it,” says Williamson. “The thing I loved about it was that he had used the most phenomenally beautiful working. It was the first time that I saw that mathematics can be really, really deep.”

Nicola Gigli himself happens to be on Google Scholar, so some stats: 5,088 citations – that’s honestly outstanding, only Lokshtanov has more among non-applied guys – from 106 papers, including 44 papers with 10+ cites, 11 with 100+ and an absolute jackpot of a book, Gradient flows, with 2,333 citations (no other paper exceeds 250).

Last notable among 2008 PhDs is the 2014 ICM invited speaker and American combinatorialist/computer scientist Adam Wade Marcus. I confused him for a while with another Adam Marcus, a young, white, bespectacled database expert with nearly 4,000 citations on GS; yeesh – our Adam isn’t on GS himself. Adam’s a funny guy. He’s also a star. Following his PhD he immediately became a Gibbs Assistant Professor at Yale, and won the inaugural Denes Konig Prize in discrete math for solving the Stanley-Wilf conjecture (wow). For resolving the Kadison-Singer problem with Dan Spielman and Nikhil Srivastava (the latter class of 2010, and 2nd among combinatorialists only to Fox the superstar) he won the Polya Prize in 2014 and got invited to speak at the ICM; he was 35.

And then we come to 2007, the year of Alessio Figalli. You’ve heard all the stories: the breathtaking ease, the quickness, the breadth. How does that translate to Google Scholar? 4,067 citations from 214 papers – more papers than anyone above – an amazing 95 of which have 10+ cites, an even more amazing 25 with 50+, yet only 3 with 100+. Figalli is like Fox, just consistently damn good, except at a higher level. He’s the best analyst in the world among post-2007 PhDs. At age 30 he was invited to speak at the ICM. He got his PhD under Cedric Villani at 23, then became full professor and won the Peccot-Vimont Prize at 27, won the EMS Prize and gave the Cours Peccot at 28, and got the Stampacchia Medal at 31 for the best contribution to the calculus of variations in the preceding three years, etc.

Okay, honestly I’m losing interest since a lot of the top-cited guys here are 40ish years old so “promising & young” doesn’t quite apply anymore, and anyway guys like Scholze, Fox, Figalli are sucking the air out of the room. So I’ll just broadly survey the 2000s.

The analyst Stevo Stevic (2001) is the top non-applied guy from the 2000s, with a massive 12,014 citations from a staggering 516 papers, including 260 with 10+ cites (wow), 115 with 50+ (holy crikey), and 12 with 200+. Hm, a lot of his very top papers seem to be duplicated on GS, inflating his cites. It’s hard to figure out anything about him. Huh.

PDE specialist Luis Silvestre (2005) at the U of Chicago seems to be a star, although he was only made full prof last year. He’s got 5,088 citations including 46 with 10+ cites and 8 with 100+.

Russian computational mechanics researcher/engineer/E. Paul Sorensen Chair Yuri Bazilevs (2006) at Brown is the king of the 2000s by a huge margin, because he’s an infrastructure builder and his infrastructure, isogeometric analysis, is ridiculously impactful. The paper that introduced the term (with Tom Hughes and Austin Cottrell) has over 4,000 citations alone, and his IGA book has 2,600. The man is so impactful it’s ridiculous. An unfair 25,142 citations, 119 papers with 10+ cites, 68 with 68+ cites, 33 with 200+, etc.

Ben Green, annoyingly, isn’t on GS; he’s the top-cited number theorist of the 2000s. Neither is Manjul Bhargava, who was the surefire pick for (and winner of) the 2014 Fields.

Martin Hairer is. He has 5,585 citations from 164 papers, 84 with 10+, 41 with 41+, 14 with 100+, 4 with 300+, but none more than 415 (his prizewinning work) – very even. He’s listed as a probabilist.

Before I head to bed, I’ll close with the giant of the last 30 years: Terry Tao. A scarcely-believable 68,860 citations. 91 papers with 91+ cites. 4 papers/books with 6,000+ cites, including that 14,900-cite monster on exact signal reconstruction. 20 with 300+, 31 with 200+, 88 with 100+. 585 publications in total. Whew…