For most of my computer science PhD, the “trick” was just to get the inductive definition to work, and then to figure out how to tweak it for the next paper. Or to get enough structure that we could do an “abstract nonsense” proof[0].
As one of those who do not like the “sets at the bottom” approach, I just want to highlight why. For me, mathematics built on sets has leaky abstractions. Say I want the natural numbers: I need to choose a concrete implementation in set theory, e.g. the Von Neumann encoding, but there are multiple choices. All of them are good definitions: you get Peano arithmetic and can work with it. But the answer to the question “Is 1 a member of 3?” depends on your chosen implementation. Even though it is a weird question, it is a valid one, and its answer is not preserved under isomorphism of implementations. That is problematic, because it stays hidden in how we mostly do mathematics. Secondly, set theory is hard to formalize, and I think mathematics desperately needs to be formalized. Finally, I do not mind sets; they are great and a very useful tool. I just do not like them as the foundation. I firmly believe we should teach type-theoretic or categorical foundations in mathematics and be less dependent on sets.
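A concrete way to see the leak, sketched in Python with frozensets standing in for hereditarily finite sets (the encodings are the standard Von Neumann and Zermelo ones; the function names are mine):

```python
def von_neumann(n):
    """Von Neumann naturals: 0 = {}, succ(s) = s ∪ {s}, so n = {0, 1, ..., n-1}."""
    s = frozenset()
    for _ in range(n):
        s = s | {s}
    return s

def zermelo(n):
    """Zermelo naturals: 0 = {}, succ(s) = {s}, so n = {n-1} only."""
    s = frozenset()
    for _ in range(n):
        s = frozenset({s})
    return s

# The same question gets different answers in the two implementations:
print(von_neumann(1) in von_neumann(3))  # True:  3 = {0, 1, 2} contains 1
print(zermelo(1) in zermelo(3))          # False: 3 = {2} does not contain 1
```

Both encodings satisfy the Peano axioms, yet the membership question distinguishes them, which is exactly the sense in which the abstraction leaks.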
> Say I want the natural numbers: I need to choose a concrete implementation in set theory, e.g. the Von Neumann encoding, but there are multiple choices.
You don't need to choose a concrete implementation. If you don't want to choose a construction, you can just say something like "let (N, 0, +, *) be a structure satisfying the Peano axioms" and work from there.
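In Lean-style notation, that move can even be made explicit (a sketch; the structure and field names are mine, not from any library): postulate the axioms as a structure, and anything proved from its fields holds in every model, Von Neumann, Zermelo, or otherwise.

```lean
-- A bundled "model of the Peano axioms": any construction satisfying
-- these fields is as good as any other for what follows.
structure PeanoModel where
  N : Type
  zero : N
  succ : N → N
  succ_inj : ∀ {a b : N}, succ a = succ b → a = b
  succ_ne_zero : ∀ a : N, succ a ≠ zero
  ind : ∀ P : N → Prop, P zero → (∀ a, P a → P (succ a)) → ∀ a, P a
```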
> All of them are good definitions: you get Peano arithmetic and can work with it. But the answer to the question “Is 1 a member of 3?” depends on your chosen implementation. Even though it is a weird question, it is a valid one, and its answer is not preserved under isomorphism of implementations. That is problematic, because it stays hidden in how we mostly do mathematics.
Why is that problematic? The constructions are isomorphic under the sentences that actually matter. This kind of statement is usually called a "junk theorem", and junk theorems are a thing in type theory too; see for example this quote from an FAQ by Kevin Buzzard about why Lean defines division by zero to be zero:
> The idiomatic way to do it is to allow garbage inputs like negative numbers into your square root function, and return garbage outputs. It is in the theorems where one puts the non-negativity hypotheses.
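A small illustration of what that idiom looks like in Lean 4 (core library, no Mathlib needed):

```lean
-- Division by zero is defined, not an error; the "junk" lives in the
-- value, while the real hypotheses live in the theorems you prove.
#eval (5 : Nat) / 0   -- prints 0

example (n : Nat) : n / 0 = 0 := Nat.div_zero n
```

So a type-theoretic library accepts junk values like `5 / 0 = 0` for exactly the same engineering reasons set theory accepts `1 ∈ 3`.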
> Secondly, it is hard to formalize, and I think mathematics desperately needs to be formalized.
Is that actually true? At the very least writing out the axioms and derivation rules is easier for set theory, since it's simpler than type theory. And there has been plenty of computer-verified mathematics done in Metamath/set.mm and Isabelle/ZF, even though less has been done than in type theory. Currently the automated tools are better for type theory, but it seems likely to me that that has more to do with how much effort has been put into type theory than any major inherent advantages of it.
---
More generally, types in type theory are also constructed! The real numbers in Lean don't come from the platonic realm of forms, they are constructed as equivalence classes of cauchy sequences. And the construction involves a lot of type-theoretic machinery which I'd usually rather ignore when working with reals, much like I'd usually rather ignore the set-theoretic construction of the real numbers. And the great thing is that I can ignore them, in either foundation!
So I just don't really buy these common criticisms of set theory, which to me seem like double standards.
Sure, you can work around it most of the time, but sometimes you can't. The whole point is that isomorphism is not equality in set theory, and sometimes proofs do not transfer along isomorphisms because they refer to the implementation. I agree that it is much preferable to work with abstract structures, but that is not always what happens in practice. The natural numbers example is contrived but easy to see.
My point of view is also that I do not like the Lean approach: I would actually like no junk theorems to exist in my theory. I am much more partial to the univalent approach, and in particular to univalent implementations that compute, e.g. cubical type theory. Regarding how easy it is to formalize, you are right: lots of good work happens in set-theory-based systems. My point was rather that set-theoretic foundations themselves are very hard to formalize; ZFC plus the underlying logic is very difficult to work from directly. A pure type-theoretic foundation is much easier to get off the ground. Proving that plus commutes directly from ZFC is a nightmare.
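For contrast, here is a sketch of how cheap that is type-theoretically: commutativity of addition for an inductively defined natural number type, proved from first principles in Lean 4 (no Mathlib; the definitions are the textbook ones):

```lean
-- Naturals and addition defined from scratch; add recurses on its
-- second argument, so the "missing" equations are proved by induction.
inductive N where
  | zero
  | succ (n : N)

def add : N → N → N
  | n, .zero   => n
  | n, .succ m => .succ (add n m)

theorem zero_add (n : N) : add .zero n = n := by
  induction n with
  | zero => rfl
  | succ n ih => simp [add, ih]

theorem succ_add (n m : N) : add (.succ n) m = .succ (add n m) := by
  induction m with
  | zero => rfl
  | succ m ih => simp [add, ih]

theorem add_comm (n m : N) : add n m = add m n := by
  induction m with
  | zero => simp [add, zero_add]
  | succ m ih => simp [add, succ_add, ih]
```

Everything above is induction on the inductive definition; there is no encoding layer between the axioms and the objects.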
As noted in another reply, the natural numbers example is contrived, but illustrative. Nevertheless, if you have a set-theoretic foundation, e.g. ZF(C), at some point you need to define what you are doing in that foundation. Most mathematicians do not, and happily ignore the problem. That works until it doesn't. The whole reason Vladimir Voevodsky started work on HoTT and univalent foundations was that he believed we in fact DO need to be able to pull definitions back to the foundations and to verify mathematics all the way down.
A point that is maybe not obvious to people who have not done mathematics at a high level, or done “new” mathematics, is that you often end up changing your theorem, or at least your lemmas and definitions, while figuring out the proof. That is, you have something you want to prove, but maybe it is easier to prove something more general, or maybe your definitions need to change slightly. Anecdotally, during one project I spent perhaps a year figuring out exactly the right definition for a problem in order to be able to prove it. Of course, this was a very new thing. For well-known areas it is often straightforward, but at the frontier, both definitions and theorems often change as you proceed and understand the problem better.
A dissenting voice that might be of interest here is Peter Zeihan. He claims that the US is one of the few places that won't collapse in the coming decade. I do not claim he is right, but his analysis is, if nothing else, interesting. His book, The End of the World Is Just the Beginning, is an easy enough read.
This depends greatly on the working definition of collapse. If your mere existence becomes illegal it doesn't really matter if the economy is technically still functioning.
I guess if you are a straight white male you will probably be OK. Most of my friends are women and/or LGBTQ, and almost all of them would be considered enemies by the current administration.
The US. I am from one of the Scandinavian countries, with small kids, and I am trying to align work so we can move, at least temporarily, in a couple of years when they are able to handle themselves in English. My wife was an exchange student many years ago and kept in contact with the host family, and we've visited multiple times. The company I work for has a US branch. It's the only other country I've ever wanted to live in, and I've visited many others. It seems to me to be the most “alive” and dynamic Western country, even with its faults. Scandinavia and Western Europe are stagnant, regulated to death and slumbering. It might change, and I hope it will, but living here, I do not see it on the horizon yet.
Biochar does exactly that and is an active area of research in many places. Several ongoing projects are also showing that biochar can improve soil quality and crop yields.
I work with technology and research management at a multinational industrial company. This includes strategy and roadmapping for specific products, but also broadly following research trends and networking with universities. I did a PhD in theoretical computer science and stumbled into this by accident. It is interesting to follow research but also to see it applied, and to work with the business from the ground up. It is a lot of stakeholder management and knowledge dissemination, which I enjoy: lots of internal networking, relations and informal power. Besides the job, I am on the board of a handful of volunteer organizations and non-profits. Last year I also taught a semester, which I had missed a lot from academia. I need many different things to keep my interest, so the job works well for me.
As mentioned by another comment, this is a big reason that Vladimir Voevodsky started his Homotopy Type Theory and Univalent Foundations program. He had seen first-hand a field collapse because of a mistake in the “first lemma on the first page” of a foundational paper. Arguably, his initial work on UniMath, and the special year at IAS that ended in the HoTT book, pushed the whole formalization-of-mathematics topic forward to where it is today.
Or for the mathematically inclined: how many n x n puzzles with unique solutions exist for a given size n?
n=1 is trivial, and n=2 is small enough to enumerate, with 3^4 = 81 candidate puzzles (many of them degenerate, with no solution at all), but already n=3 is pretty bad with 3^9 ≈ 20,000 possible puzzles. I do not see an obvious path to compose solutions either and make use of some kind of structural induction.
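The brute force can at least be sketched. A minimal Python version, assuming 3 possible values per cell as above; the uniqueness check is a caller-supplied placeholder, since it depends on the puzzle's actual rules, and the point is just the 3^(n^2) blow-up:

```python
from itertools import product

def all_grids(n, symbols=3):
    """Yield every n x n grid over `symbols` cell values: symbols**(n*n) in total."""
    for cells in product(range(symbols), repeat=n * n):
        yield tuple(tuple(cells[i * n:(i + 1) * n]) for i in range(n))

def count_unique(n, has_unique_solution):
    """Count grids accepted by a caller-supplied uniqueness test
    (hypothetical: plug in a solver for the puzzle in question)."""
    return sum(1 for g in all_grids(n) if has_unique_solution(g))

print(sum(1 for _ in all_grids(2)))  # 81   (= 3^4, as noted above)
print(sum(1 for _ in all_grids(3)))  # 19683 (= 3^9, the ~20,000 above)
```

Already at n=4 this is 3^16 ≈ 43 million candidates per uniqueness check, so anything beyond tiny n needs either pruning or the structural insight the comment is asking for.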
[0]:https://ncatlab.org/nlab/show/category+theory#AbstractNonsen...