>Life is full of variable reward schemes. Probably why we evolved to be so enamoured by them.
Important to point out that every high culture produced restrictions on exactly those behaviors; gambling was a universal vice back when that concept still mattered.
America in particular had a work culture that favored, well, work and technical excellence. Now work is for suckers, thinking is for suckers, and precision isn't worth it when you can have some machine do it half-right.
"Yes, I could go for the reliable option. But taking a punt is worth a shot if the cost is low" might as well be the national slogan, from vibe coding to the Department of Defense. Even the venture capital industry, which excels at slot-machine sectors, was itself already a slot machine.
Genuinely baffled by the logistics of this. The article makes it sound like these are large numbers of people in NK or surrounding countries who rely on Google Translate, so not sophisticated spies or whatever.
Even if they get their hands on a fake American ID, these are taxable, insured jobs; they're not working at a restaurant under the table. IT companies ship out hardware, so where are these people banked, etc.?
How does this look in practice? Officially you're working with Mark Johnson, but you end up on a Zoom call with a guy who speaks broken English and connects from the other side of the world, and you're not suspicious?
we need sources for the fact that an electric motor, all other things being equal, is better than a combustion engine? If you agree that people generally value the health of their lungs, that alone is sufficient reason.
It's also quickly becoming a question of geopolitical resilience: running your transport system on dinosaur juice from regions where people blow each other up is bad, particularly if you happen to be Japanese automaker Honda.
> an electric motor, all other things being equal, is better than a combustion engine?
This is not the core argument. Motors may be superior; we can agree on that. The power source (batteries) and its environmental impact has always been the core argument. [1]
My background is global geophysical exploration, primarily for mineral resources with some dabbling in the energy domain.
For a single example, this passage:
> High demand and prices are already encouraging some producers to cut corners and violate environmental and safety regulations.
> For example, in China, dust released from graphite mines has damaged crops and polluted villages and drinking water. In Africa, some mine owners exploit child workers and skimp on protective equipment such as respirators. Small artisanal mines, where ores are extracted by hand, often flout laws.
is entirely emotive, intended to tug on feelings (which it does), but otherwise it has no bearing on the bulk of major mining that feeds the bulk of mineral processing.
The tonnes of nickel and cobalt we see largely come from big mines, big trucks, formal occupational health and safety regulations, etc.
It also commits the usual mistake of confusing "just in time" exploration results, which firm up suspected deposits with size and density estimates for the commodities of interest, with absolute limits on what is available over the cycle of time.
As demand increases, areas that are "known" (but not measured) get further technical inspection (magnetics, drill sampling, etc.) and become fresh new reserves.
Does the article you cited cost money to read? I found a description on google scholar:
> Ten years left to redesign lithium-ion batteries
> Reserves of cobalt and nickel used in electric-vehicle cells will not meet future demand. Refocus research to find new electrodes based on common elements such as iron and silicon, urge Kostiantyn Turcheniuk and colleagues.
I notice that the article was published in 2018. So I guess we only have to wait two more years to decide if it's right or not. Will we be out of cobalt and nickel by then? I'd be happy to take a bet with you, assuming you stand by the article you cited.
it's not a fact, it's an opinion, and just because you see it as truth doesn't mean it is. This is why the left/progressive crowd is so disliked by the conservatives: they phrase any argument from an inherent viewpoint that they assume is self-evident.
> This is why the left/progressive crowd is so disliked by the conservatives: they phrase any argument from an inherent viewpoint that they assume is self-evident.
the fact that a combustion vehicle inherently produces byproducts that are extremely harmful to your health, while an electric motor does not, is not an opinion; it's a medical fact you can verify yourself by breathing next to a car exhaust.
Conservatives, and I assume this means modern American conservatives, dislike this because they make French postmodernists from the 70s look like evidence-based scientists.
> Conservatives, and I assume this means modern American conservatives, dislike this because they make French postmodernists from the 70s look like evidence-based scientists.
because we don't value them at all, literally. It's a tragedy of the commons, internet pollution is like air pollution, the polluters don't pay and there's no cost associated with overusing other people's attention.
>corruption is, rather than a problem, a necessary element of society to keep things going.
There's a professor at Johns Hopkins, Yuen Yuen Ang, who wrote a whole book on the topic in China. She dubbed this "access money": corruption that "greases the wheels" and cuts through the red tape that bureaucracies, which authoritarian states are very prone to, generate. In her telling it's a net positive: it doesn't erode trust, because it stimulates growth and doesn't interfere with the lives of ordinary citizens. It's basically a hack to get things done.
The East Asian countries in particular tend to have this at the corporate level. Chaebols in South Korea and zaibatsu in Japan tend to be corrupt in that sense, but if anything it has an organizing function.
In most democratic countries corruption tends to happen at the individual level, think Indian police in some states who are famous for extorting travelers like road bandits. That's significantly more trust eroding and economically harmful. If you don't differentiate what kind of corruption you're talking about you can't really make sense of it.
my old CS prof at my uni used to say, when this question came up: "do you sign up for an astronomy course and expect them to teach you how to build a telescope?"
It's always puzzled me why people sign up for an academic education with "science" literally in the name and then complain when they get a theoretical education. It's not a tool workshop.
Well, the issue is that the majority of people study CS to become software engineers, not academics. There are only a small number of software engineering degrees at select universities, so CS is the de facto route to becoming a SWE. So it's not unreasonable that students would want a bit of practical industry education in their CS degree.
I'm actually surprised that, with as much money as there is in tech, there hasn't been more influence toward shaping curricula to be more industry relevant. Companies waste tons of money ramping up new grads and bridging the CS-to-SWE gap; surely the incentives are there for a different curriculum.
universities aren't job centers either. They don't supply you with food and shelter; you pay them money and they give you the education you want. That can mean making a lot of money if you happen to pick something private businesses value, but it can also mean reading Ulysses or the Old Testament or number theory all day.
Higher education is entirely up to you; it's not corporate pre-training. If you want that, there are literal vocational programs that are not computer science.
And I told both of my stepsons I wouldn't pay for a degree that wouldn't lead to a job, and I had them research the expected income and types of jobs they could get based on the degree they were pursuing.
>There were more wars before any type of mechanisation of warfare
yes, but they weren't comparable. With the exception of ancient Chinese wars, which are a bit of an odd case given the population sizes and the habit of sending farmers to the front until everyone starved, European pre-modern wars consisted of small armies and relatively low civilian casualty ratios.
It's mechanized warfare, and the late 20th century in particular, that saw civilian death ratios climb to 80-90% in mass bombing campaigns and urban warfare. People like to use "medieval" as an insult, but the medieval age was quite constrained compared to Gaza. And if you take the pilots out of the equation and fully automate this, that's probably only a taste of what people will do to civilian populations.
Because a picture says more than words, this is the kind of thing you can probably look forward to:
"European pre-modern wars consisted of small armies and relatively low civilian casualty ratios." I don't think the Napoleonic Wars of the early 19th century can be considered small armies. The French Empire had around 1.2 million regulars in 1813.
And this is a bait and switch: you were talking about the propensity of countries or people to go to war, now you are talking about the scale of destruction.
Cities were routinely razed and famines and disease killed scores of people in historical warfare as well - we have the accounts, we know it happened. The "difficulty" of implementing any of this was enormous given the lack of modern logistics or simple things like refrigeration to keep armies resupplied.
How does this support your argument though? World War 1 increased the level of danger and destruction of warfare, and... then we had World War 2. If the hypothesis is that making war easier leads to more wars, then no example presented shows that, because WW1 was at the time the most destructive war in history and simply set the stage for an even more destructive one.
you couldn't, because someone who sexts people for $2 an hour is always going to be engaged in wage slavery, and if that is what offends you, you could just ban it directly.
We all know that's not the point though; you're just offended by porn. If she were cleaning floors for two bucks you wouldn't care. In fact her chatter job, on account of her choosing to do it, is likely better than a lot of other work.
the problem is that nobody listened to Alan Kay, and people write dynamic code the way they'd write static code, just without the types.
I always liked Rich Hickey's point that you should program on the inside the way you program on the outside. Over the wire you don't rely on types or try to keep the entire internet in type-check harmony; it's on you to verify what you get. And that is what Alan Kay thought objects should do.
That's why I always find these complaints a bit puzzling. Yes, in a dynamic language like Ruby, Python, Clojure, or Smalltalk you can't impose global meaning, but you're not supposed to. If you have to edit countless pieces of existing code just because some sender changed, that's an indication you've ignored the principle of letting the recipient interpret the message. It shouldn't matter what someone else puts in a map, only what you take out of it, the same way you don't care if the contents of the mail truck change as long as your package is in it.
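As a minimal sketch of that "recipient interprets the message" style (the handler name and message fields here are invented for illustration, not from any real codebase):

```python
def handle_shipping(message):
    """Reads only the keys this receiver cares about; any extra
    fields a sender adds to the message are simply ignored."""
    address = message.get("address")
    if address is None:
        raise ValueError("shipping handler needs an 'address' field")
    return f"shipping to {address}"

# A sender can grow the message without breaking this receiver:
msg = {"address": "42 Main St", "priority": "high", "gift_wrap": True}
print(handle_shipping(msg))
```

The point of the sketch: the receiver validates what it takes out of the map, so senders can evolve the message freely and nothing else has to be recompiled or edited.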
That's a terrible solution because then you need a bunch of extra parsing and validation code in every recipient object. This becomes impractical once the code base grows to a certain size and ultimately defeats any possible benefit that might have initially been gained with dynamic typing.
>then you need a bunch of extra parsing and validation code in every recipient object.
that's not a big deal; when we exchange generic information across networks we parse it all the time, and in most use cases that's not an expensive operation. The gain is proper encapsulation, because the flip side of imposing meaning globally is that your entire codebase becomes one entangled ball, and as you scale a complex system, that tends to cost you more and more.
In the case of the OP, where a program "breaks" and has to be recompiled every time some signature change propagates through the entire system, that is a significant cost. Again, if you think of a large-scale computer network as an analogue to a program: what costs more, parsing an input, or rebooting and editing the entire system every time we add a field somewhere to a data structure that most consumers of that data don't care about?
this is how we got microservices, which are nothing but a way to introduce late binding and dynamism into static environments.
> when we exchange generic information across networks we parse information all the time
The goal is to do this parsing exactly once, at the system boundary, and thereafter keep the already-parsed data in a box that has "This has already been parsed and we know it's correct" written on the outside, so that nothing internal needs to worry about that again. And the absolute best kind of box is a type, because it's pretty easy to enforce that the parser function is the only piece of code in the entire system that can create a value of that type, and as soon as you do this, that entire class of problems goes away.
This idea of using types whose instances can only be created by parser functions is known as "Parse, Don't Validate", and while it's possible and useful to apply the general idea in a dynamically typed language, you only get the "we know at compile time that this problem cannot exist" guarantee if you use types.
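A minimal sketch of the pattern in Python (the type and field names are invented; note that in Python the "only the parser constructs this type" rule is a convention, not something the compiler can enforce the way a statically typed language does):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class NonEmptyName:
    """By convention, only created via parse_name(). Code downstream
    can assume the invariant holds and never re-checks it."""
    value: str

def parse_name(raw: str) -> NonEmptyName:
    # The one place in the system where the check happens.
    stripped = raw.strip()
    if not stripped:
        raise ValueError("name must be non-empty")
    return NonEmptyName(stripped)

def greet(name: NonEmptyName) -> str:
    # No validation here: the type is the proof it was already parsed.
    return f"hello, {name.value}"

print(greet(parse_name("  Ada ")))
```

Parse once at the boundary, then pass the "already checked" box around; everything after `parse_name` trusts the type rather than re-validating the raw string.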
> The goal is to do this parsing exactly once, at the system boundary
You are only parsing once at the system boundary, but under the dynamic model every receiver is its own system boundary. Like the earlier comment pointed out, microservices emerged as a way to hack Kay's actor model onto languages that don't offer the dynamism natively. Yes, you are only parsing once in each service, but you are still parsing many times when you look at the entire program as a whole. "Parse, don't validate" doesn't really change anything.
> but under the dynamic model every receiver is its own system boundary
I'm not claiming that it can't be done that way, I'm claiming that it's better not to do it that way.
You could achieve security by hiring a separate guard to stand outside each room in your office building, but it's cheaper and just as secure to hire a single guard to stand outside the entrance to the building.
>microservices emerged as a way to hack Kay's actor model onto languages that don't offer the dynamism natively
I think microservices emerged for a different reason: to make more efficient use of hardware at scale. (A monolith that does everything is in every way easier to work with.) One downside of microservices is the much-increased system boundary size they imply -- this hole in the type system forces a lot more parsing and makes it harder to reason about the effects of local changes.
> I think microservices emerged for a different reason: to make more efficient use of hardware at scale.
Same thing, no? That is exactly what Kay was talking about. That was his vision: infinite nodes, all interconnected, sending messages to each other. That is why Smalltalk was designed the way it was. While the mainstream Smalltalk implementations got stuck in a single-image model, Kay and others did try working on projects to carry the vision forward. Erlang had some success with the same essential concept.
> I'm claiming that it's better not to do it that way.
Is it fundamentally better, or is it only better because the alternative was never fully realized? For something of modern relevance, take LLMs. In your model, you have to have the hardware to run the LLM on your local machine, which for a frontier model is quite the ask. Or you can write all kinds of crazy, convoluted code to pass the work off to another machine. In Kay's world, being able to access an LLM on another machine is a feature built right into the language. Code running on another machine is the same as code running on your own machine.
I'm reminded of what you said about "Parse, don't validate" types. Like you alluded to, you can write all kinds of tests to essentially validate the same properties as the type system, but when the language gives you a type system you get all that for free, which you saw as a benefit. But now it seems you are suggesting it is actually better for the compiler to do very little and that it is best to write your own code to deal with all the things you need.
> I think microservices emerged for a different reason: to make more efficient use of hardware at scale.
Scaling different areas of an application is one thing. Being able to use different technology choices for different areas is another, even at low scale. And being able to have teams own individual areas of an application via a reasonably hard boundary is a third.