>I realized using tech in education (pre university) was a mistake.
I think we should use tech in education, but in a targeted way. It's important that children gain basic technical literacy, like how to touch type and use basic software. I suspect there is a gap in the technical literacy of lower income students, whose parents are less likely to have a computer at home.
The real problem is separating reading/writing skills from tech skills. We shouldn't stop teaching handwriting just because typing exists. And being able to read long books and essays teaches fundamental cognitive skills like attention, focus, and information processing.
That's not really "using tech" that you're describing here. You're talking about learning some basic computer skills (such as word processing, Excel, reading email, some basic website building, using a printer, and some amount of programming).
For those, obviously you need a computer, and I completely agree that those are important skills to learn... But you maybe need to spend 1h/week in the computer lab during the last 2 years of middle school on those (as has been done since the 90s in many schools around the world)
But for any other course, such as Math, English (or whichever primary language in your country), second languages, history, etc.: that's where using tech is a mistake
A bit of tech is ok, but it cannot be "everyone does their homework and reads lessons on an iPad/Chromebook"
I am pretty skeptical about the value of learning to build websites. I think it is too tempting for students to devote significant time to something that is not foundational knowledge and where they won't get any valuable feedback anyway.
It makes me think back to my writing assignments in grades 6-12. I spent considerable time making sure the word processor had the exact perfect font, spacing, and formatting, with cool headers, footers, footnotes, etc. Yet I wouldn't even bother to proofread the final text before handing it in. What a terrible waste of a captive audience that could have helped me refine my arguments and writing style, rather than wasting their time on things like careless grammatical errors.
Anyway, I do agree with the idea of incorporating Excel, and even RStudio for math and science as tools, especially if they displace Ed-tech software that adds unnecessary abstractions, or attempts to replace interaction with knowledgeable teachers. One other exception might be Anki or similar, since they might move rote memorization out of the classroom, so that more time can be spent on critical thinking.
Building websites, I agree, has little value in itself, but using it as a way to explain the basics of how the web works is, I think, pretty valuable. The web likely isn't going anywhere for a long time, and having some basic knowledge of how it works is very useful for a lot of people. I hate the idea of any more MS apps like Excel being regularly incorporated, but basic usage of something similar can definitely help build a useful tool/computer skill. Even in the early 90's we had computer labs for learning computer skills, which I think have value. But forcing tech everywhere into teaching is an issue IMO.
The beautiful thing about programming (which also makes edtech such an appealing dream to chase) is that you get immediate feedback from the computer and don't have to wait for someone whose attention is at least semi-scarce to mark your paper.
re: Anki. It's not as optimized, but you can do SRS with physical flash-cards.
* Have something like 5 bins, numbered 1-5.
* Every day, add your new cards to bin nr. 1, shuffle, and review. Correct cards go to bin nr. 2; incorrect cards stay in bin nr. 1.
* Every other day, do the same with bins nr. 1 and 2; every fourth day with bins nr. 1, 2, and 3; etc. Except incorrect cards go down one bin. More complex scheduling algorithms exist.
* In a classroom setting, the teacher can print out the flashcards and hand out a review schedule for the week (e.g. Monday: add these 10 new cards and review box 1; Tuesday: 10 new cards and review boxes 1 and 2; Wednesday: no new cards and review boxes 1 and 3; etc.)
* If you want to be super fancy, the flash card publisher can add audio-chips to the flash-cards (or each box-set plus QR code on the card).
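The box scheme above is basically the Leitner system, and the scheduling logic is simple enough to sketch in a few lines. This is only one variant, loosely matching the description above; the bin count, power-of-two intervals, and demotion rule are illustrative, not from any particular curriculum:

```python
# A minimal sketch of the Leitner scheme described above. The choice of
# 5 bins, power-of-two review intervals, and the demotion rule are
# illustrative assumptions, not from any particular curriculum.

REVIEW_INTERVALS = {1: 1, 2: 2, 3: 4, 4: 8, 5: 16}  # review bin n every 2**(n-1) days

def bins_due(day):
    """Which bins to review on a given day (day 1 = first day of the schedule)."""
    return [b for b, interval in sorted(REVIEW_INTERVALS.items())
            if (day - 1) % interval == 0]

def review(card_bin, correct):
    """Correct cards move up one bin (capped at 5); incorrect cards drop one bin."""
    return min(card_bin + 1, 5) if correct else max(card_bin - 1, 1)

print(bins_due(3))  # → [1, 2]
```

A teacher's weekly handout is then just `bins_due(day)` evaluated for Monday through Friday.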
Would it be a mistake to use Desmos in a math classroom, or 3Blue1Brown style animations, to build up visual intuition? Should we not teach basic numerical and statistical methods in Python? Should kids be forced to use physical copies of newspapers and journal articles instead of learning how to look things up in a database?
I'm all for going back to analog where it makes sense, but it seems wrongheaded to completely remove things that are relevant skills for most 21st century careers.
> Would it be a mistake to use Desmos in a math classroom, or 3Blue1Brown style animations, to build up visual intuition?
I don't think there's anything wrong with showing kids some videos every now and then. I still have fond memories of watching Bill Nye.
> Should we not teach basic numerical and statistical methods in Python?
No. Those should be done by hand, so kids can develop an intuition for it. The same way we don't allow kids learning multiplication and division to use calculators.
>> Should we not teach basic numerical and statistical methods in Python?
> No. Those should be done by hand, so kids can develop an intuition for it. The same way we don't allow kids learning multiplication and division to use calculators.
I would think that it would make sense to introduce Python in the same way that calculators, and later graphing calculators are introduced, and I believe (just based on hearing random anecdotes) that this is already the case in many places.
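As a rough illustration of what "by hand first, tools later" might look like once Python is introduced: have students implement summary statistics straight from the defining formulas before ever importing a library. The dataset and function names here are made up for the example:

```python
# An illustrative "by hand first" exercise: summary statistics implemented
# straight from the defining formulas, before any library is introduced.
# The dataset is made up for the example.

def mean(xs):
    return sum(xs) / len(xs)

def variance(xs):
    # population variance: the average squared deviation from the mean
    m = mean(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

def stdev(xs):
    return variance(xs) ** 0.5

data = [2, 4, 4, 4, 5, 5, 7, 9]
print(mean(data), stdev(data))  # → 5.0 2.0
```

Only after working at this level would a library call like `statistics.pstdev(data)` be introduced, the same way a calculator follows long division.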
I'm a big proponent of the gradual introduction of abstraction, which my early education failed at, and something Factorio and some later schooling did get right, although the intent was rarely communicated effectively.
First, learn what and why a thing exists at a sufficiently primitive level of interaction, then once students have it locked in, introduce a new layer of complexity by making the former primitive steps faster and easier to work with, using tools. It's important that each step serves a useful purpose though. For example, I don't think there's much of a case for writing actual code by hand and grading students on missing a semicolon, but there's probably a case for working out logic and pseudocode by hand.
I don't think there's a case for hand-drawing intricate diagrams and graphs, because it builds a skill and level of intimacy with the drawing aspect that's just silly, and tests someone's drawing ability rather than their understanding of the subject, but I suppose everyone has their own opinion on that.
That last one kind of crippled me in various classes. I already knew better tools and methods existed for doing weather-pattern diagrams or topographical maps, but it was so immensely tedious and time-consuming that it totally derailed me, to the point where I'd fail uni labs despite the content not being very difficult, only because the prof wanted to teach it like it was the 50s.
FWIW, calculators were banned in my school. I only started to use one in university - and there it also didn't really help with anything, as the math was already more complex.
I was allowed to use calculators when I started algebra in seventh grade.
I found that calculators didn't help all that much once you got into symbolic stuff. They were useful for the final reductions, obviously, but for algebra the lion's share of the work is symbolic and at least the relatively cheap two-line TI calculator I was using couldn't do anything symbolic.
I know that there are calculators that can do Computer Algebra System stuff, and those probably should be held off on until at least calculus.
Those are great examples. Not familiar with Desmos, but 3Blue1Brown style animations are great.
The problem is that people seem to want to go to extremes. Either go all out on doing everything in tablets or not use any technology in education at all.
It's not just work skills; it's also the better understanding that's gained from things such as the maths animations you mentioned.
> The problem is that people seem to want to go to extremes. Either go all out on doing everything in tablets or not use any technology in education at all.
I think the latter is mostly a reaction to the former. I think there is a way to use technology appropriately in theory in many cases, but the administrators making these choices are largely technically illiterate and it's too tempting for the teachers implementing them to just hand control over to the students (and give themselves a break from actually teaching).
Until most kids are about 12 - 14 years old, they're learning much more basic concepts than you're describing. I don't think anyone is trying to take intro to computer science out of high schools or preventing an advanced student younger than that from the same.
I would rather a teacher have to draw a concept on a board than have each student watch an animation on their computer. Obviously, the teacher projecting the animation should be fine, but it seems like some educators and parents can't handle that and it turns into a slippery slope back to kids using devices.
So for most classrooms full of students in grades prior to high school, the answer to your list of (presumably rhetorical) questions is "Yes."
There's an in-between point my math teacher loved using: an overhead projector. Hand-drawn transparencies that could be made beforehand or on the fly, projected large so everyone could see, without hiding the teacher behind a computer - they'd still stand at the front of the class facing the students.
>Would it be a mistake to use Desmos in a math classroom
Maybe. Back in the day I had classes where we had to learn the rough shape of a number of basic functions, which built intuition that helped. This involved drawing a lot of them by hand. Initially by calculating points and estimating, and later by being given an arbitrary function and graphing it. Using Desmos too early would've prevented building these skills.
Once the skills are built, using it doesn't seem a major negative.
I think of it like a calculator. Don't let kids learning basic arithmetic use a four-function calculator, but once you hit algebra, that's fine (though graphing calculators still aren't).
Best might be to mix it up, some with and some without, but no calculator is preferable to always calculator.
> (as it's been done since the 90s in many schools around the world)
I had computer lab in a catholic grade school in the mid-late 80's. Apple II's and the class was once a week and a mix of typing, logo turtle, and of course, The Oregon Trail.
What for? I've been writing computer programs and documentation since 1969 and I can't touch type. I've never felt enough pressure to do it. I can still type faster than I can think. When I'm writing most of my time is spent thinking not tapping the keys.
> Although [touch typing] refers to typing without using the sense of sight to find the keys ... the term is often used to refer to a specific form of touch typing that involves placing the eight fingers in a horizontal row along the middle of the keyboard (the home row) and having them reach for specific other keys.
The strict definition of touch typing reminds me of how when I was a kid, my parents would always tell me that there’s a specific way of holding chopsticks. You gotta hold the top one like a pencil, and rest the bottom one between the crook of your fingers and your ring finger, and make sure they’re the same length and the bottom one isn’t moving and you’re just using it as a base to press against.
And then I became an adult and visited China and met actual Chinese immigrants and married a native chopstick holder. And half of them don’t hold chopsticks “the real way”. Somehow it all works out. As long as you can eat a peanut with them, you pass.
As an adult I learned that there’s also a whole lot of prescriptive bullshit that basically nobody pays attention to. The strict definition of touch typing seems like one of those. If you can type without looking at the keys, you can touch type.
I will say you're far faster touch typing properly. I never fully learned it in school. I kind of half do it: my left hand is pretty religiously touch typing, but my right doesn't stay on its home row.
I just never cared to get perfect at it in school. I would get absolutely crushed on typing tests, though, by the kids who actually learned touch typing. They all had piano experience and could reach the modifiers while still holding on to the home row. I still can't really do that with my right hand; it's like my pinky doesn't reach.
I use a Dvorak keyboard, so usually outpace the touch typers. By the strict definition, it's not technically touch typing. By any colloquial definition, it absolutely is, if I looked at the keys I'd be touching the wrong letters. I just have the Dvorak layout burned into my brain so it's what I type regardless of what the keys say.
With such a strict definition the OP’s comment becomes basically meaningless. They could be referring to using index fingers only. They could be using an alternative keyboard layout. They could mostly be using left-hand only. Pretty much any WPM between 1 and 200 seems possible with the statement: “I don’t keep my fingers on home row in between key presses.”
In many cases the understanding of the term "touch typing" isn't just "typing without looking" but a very specific way of doing so.
You should be able to type without looking at your keyboard.
But the specific finger arrangement often taught as "touch typing" isn't needed for that. Some common issues:
- It's taught with your hands orthogonal to the keyboard, which is nearly guaranteed to lead to carpal tunnel syndrome if you have a typical keyboard/desk setup. Don't angle your wrists when typing.
- Pinky fingers of "average" hands already have issues reaching the right rows; with extra-small or extra-short hands they often aren't usable as intended for touch typing.
I guess this is technically correct in the same way that stenographers and highly-ergonomic alternative-layout keyboard users also don’t “touch type” according to a strict definition.
If you’re capable of typing quick enough to publicly take meeting notes, then it’s fine. But if you can’t, I could see it being professionally embarrassing in the same way that not understanding basic arithmetic could be professionally embarrassing.
That’s the kind of (in)capability we’re talking about when it comes to Gen Z. Like not knowing ctrl-c ctrl-v.
Gen Z doesn't type to store knowledge. They would rather record the lecture or the meeting. I put my phone aside and set it to record while I carefully listen to the meeting. I'm not even Gen Z. I would rather write than type
> Gen Z doesn't type to store knowledge. They would rather record the lecture or the meeting. I put my phone aside and set it to record while I carefully listen to the meeting. I'm not even Gen Z. I would rather write than type
Recordings are one of the worst ways to store knowledge for later reference, except in unusual scenarios. They're very awkward to work with. The only plus is that they're cheap and easy to make.
Trust me, I work at a company where "documentation" is often an old meeting recording (and sometimes you have to count yourself lucky to even have that).
Previously I would have agreed with you, but as of the past year or so that's out of date, thanks to automatic on-device transcription becoming good enough.
No I don't have to search for the keys. But I don't use all the fingers of my hands and I do look at the keyboard quite often. No it's not mentally exhausting, it's the thinking that is exhausting.
I don't do it for every key but without looking, even if just sort of indirectly, I will repeatedly make mistakes. I also don't use proper finger placement. But never have I felt it limiting or slowing me down. If anything I feel like it gave me a heads up on screen typing although I still way prefer a keyboard.
Fast typing is not about throughput, it's about latency. If I only needed to type fast enough to produce the 125-something lines of code I get into production per week, I would be able to work at a word a minute. Alas, that's not how that works.
I learned how to touch type in middle school with software like Mario Teaches Typing and Mavis Beacon. I peaked around 80wpm and was faster than 90% of my classmates.
A few years ago I invested in a rectilinear split keyboard, which has a slightly different layout but is much more ergonomic. Interestingly, I can now type 120wpm+.
I think touch typing is very similar to learning penmanship (and I guess cursive to an extent). If I followed the exact rules I learned about handwriting in school, I'd have much more legible handwriting, but I'd write much more slowly. Instead I use my own way, which lets me get my thoughts out quickly, albeit not as neatly as "correct" penmanship. Fortunately typing is much more lenient on this front.
In general, mastery involves taking the basic mechanics of something and making them completely automatic, freeing up cognitive resources for higher level processes. Expert pianists don't need to look down at their hands when sight reading.
IFF we interpret "touch typing" as the typically taught typing method and not just "typing without looking at the keyboard".
In general key arrangement traces back to physical limitations of type writers not ergonomics and layout choice isn't exactly ergonomic based either.
But even if it were, the biggest issue with touch typing is that it's often taught around the idea of your hands being somewhat orthogonal to your keyboard, _which they never should be_ (if you use a typical keyboard on a typical desk setup), as that leads to angling your hands/wrists, which is nearly guaranteed to cause health issues long term if you type a lot.
The simple solution is to keep your wrists straight, which means using the keyboard with your hand at an angle to its layout instead of orthogonal, which in turn inhibits perfect touch typing. But it still allows something close to it.
As keys are arranged in shifted columns this kinda works surprisingly well; an issue is that the angle differs between the left and right hand :/
Split or Alice-style keyboards can also help a bit, but I often feel many designs kinda miss the point. Many supposedly ergonomic keyboards aren't really that ergonomic, especially if your hand is too large, too small, or otherwise unusual...
Which brings us to the next point: human anatomy varies a lot. Some people just have very touch-typing-incompatible hands, like very short pinky fingers making that finger unusable for typical touch typing (even with normal hands it's a bit suboptimal, which is why some keyboards shift the outer rows down by half a row).
Anatomy might matter if you're talking about world champion speed typing. I don't think it matters for just being competent (I say this as a man with short and fat fingers)
> It's important that children gain basic technical literacy
They certainly will at home.
> I suspect there is a gap in the technical literacy of lower income students, whose parents are less likely to have a computer at home.
In which country?
I live in Mexico and even here you really need to go to the poorest families to find a home without a laptop. Even those families have multiple smartphones. Today a smartphone is not a good replacement for a laptop but maybe in a couple of years it will be.
Even if they have a computer at home, that doesn't mean they're practicing the relevant skills. Touch typing, word processing, researching a topic online, etc. are things that need deliberate practice. Based on my own experience, using a computer at home 99% of the time meant playing video games.
The following article suggests that in the United States, about 59% of lower income households have a laptop or desktop computer, compared to 92% of upper income households.
> Based on my own experience, using a computer at home 99% of the time meant playing video games.
I ended up with a non-home-row style of touch-typing just by being forced to get chat messages out quickly in StarCraft multiplayer. So it's at least possible to learn it from that, even if most don't.
Leaving aside the wealth gap factor, which I do agree is important:
When I think back to using computers as a kid, both at school (starting in 1999) and at home, I don't think it's all that black and white wrt just playing at home vs learning useful skills at school.
At some point in the early 00s my underfunded elementary school acquired a bunch of old windows 95 computers. We would have classes where we mostly did basic touch typing, MS Office etc. At home, my middle class parents had also managed to find me some old outdated clunker. And yes, most of my time at home was spent playing games, chatting with friends on msn, pirating mp3s etc.
But I'd say I learned orders of magnitude more from my frivolous activities than from whatever we did at school. At home I was learning things like: online research (into Warcraft cheat codes or quest guides for RuneScape, etc.); software troubleshooting (having to reinstall Windows because I downloaded malware on LimeWire or otherwise borked my install somehow); fast typing (from chatting with friends about whatever 12-year-olds like to discuss; probably 90% of my typing practice back then came at home, not at school, and there was no touch typing: I could type 100+ wpm on just 4 fingers by the time I was in middle school, and never actually learned to properly touch type until I had to force myself into it 5 years ago due to RSI); and English as a second language (from various forums, IRC, etc., hard to avoid back then). And I believe one of my first forays into programming was trying to get a cracked game with a broken .bat installer off TPB to work. I had a friend who got into it via Morrowind modding.
Actually, come to think of it, most of computer class was also in reality spent sneakily playing flash games and/or messing around with the computer settings just to screw with the next student/teacher to use it.
Even generalising beyond computers, I think a remarkable portion of the skills and interests that end up defining us as people, can be traced back to stuff we did trying to avoid boredom as children.
To summarise though, I do think computers have a place in school. But especially at an elementary school level, I think play should be a significant portion of their use, because play is how kids explore the world and themselves.
Just wanted to point out that you and other people who responded to you basically do agree on same points, you are just presenting it differently.
I just find it amusing/endearing that we argue with each other even when we do agree on the core issue. :D
Touch typing is not a basic tech skill, and it's also pretty useless in the long term. I expect dictation to take over very soon, as voice recognition is finally getting to be usable and commonplace.
> It's important that children gain basic technical literacy, like how to touch type and use basic software. I suspect there is a gap in the technical literacy of lower income students, whose parents are less likely to have a computer at home.
Some of us "a bit older" seem to have gone through a golden era of tech, where we actually learned that tech en masse. In a class of maybe 30 students, around 20-25 of them were able to configure dial-up modems, get on IRC (servers, ports, and channels needed to be configured), and do a bunch of other stuff our parents mostly considered "black magic" (except for a few tech enthusiasts), and the general consensus was that every generation would know more and be "better" than the previous one.
A few decades have passed.. and kids can't type anymore on a keyboard, can't print, have no idea what can be changed in the settings on their smartphone, don't know how to block ads, can't cheat in games anymore (except via pay-to-win) and have no idea where to change their instagram password.
So, now you have boomers, who can't use computers and kids, who can't use computers anymore.
Boomers are split between a demographic that is very computer literate, having worked in/around science and tech for decades, and a demo that isn't remotely literate and may not even be online.
The latter is a fairly small demo though - supposedly around a third.
The split is more by education than by age.
Kids can use computers - phones - as app appliances, but they don't understand computers.
Peak literacy is young Gen X and older millennials.
It's important to note that this isn't fixed by ad blockers. To avoid this kind of fingerprinting, you need to disable JavaScript or use a browser like Firefox which randomizes extension UUIDs.
The key phrase is "kind of thing". It certainly does matter what kinds of things we focus our attention on as a species. I think you would have to be quite cynical to think that progress in spaceflight over the past 60+ years hasn't had a positive impact.
I think this is a false dichotomy. If you're passionate about your craft, you will make a higher quality product. The real division is between those who measure the success of a project in:
- revenue/man-hour, features shipped/man-hour, etc.
- ms response time, GB/s throughput, number of bugs actually shipped to customers, etc.
People in the second camp use AI, but it's a lot more limited and targeted. And yes, you can always cut corners and ship software faster, but it's not going to be higher quality by any objective metric.
Most mathematicians don't understand the fields outside of their specialization (at a research level). Your assumption that intuition and applications are limited to hobbyists ignores the possibility of enabling mathematicians to work and collaborate more effectively at the cutting edge of multiple fields.
>The litmus test will be when, if ever, someone wins a Fields using a proof-assistant in an essential way.
You're assuming that the point of interactive theorem provers is to discover new mathematics. While that's an interesting research area, it seems like the more practical application is verifying proofs one has already discovered through other means.
Exactly this. LLMs really aren't built for discovering new mathematics, especially _interesting_ new mathematics. They're built to try the most obvious patterns. When that works, it's pretty much by definition not interesting.
What LLMs are good at is organizing concepts, filling in detail, and remembering to check corner cases. So their use should help mathematicians to get a better handle on what's terra firma and what's still exploration. Which is great. Proof by it-convinced-other-mathematicians doesn't have a flawless track record. Sometimes major theorems turn out to be wrong or wrong-as-stated. Sometimes they're right, but there's never been a complete or completely correct proof in the literature. The latter case is actually quite common, and formal proof is just what's needed.
LLMs and interactive theorem provers are vastly different. There are AI models that come up with workable formal proofs for ITPs but these aren't your usual frontier models, they're specifically trained for this task.
ITPs are far older than LLMs in general, sure, but that's a pedantic distraction. What everyone is talking about here (both the comments, and the article) are ITPs enriched with LLMs to make the "smart" proof assistants. The LLMs used in ITPs are not vastly different from the usual chatbots and coding assistants. Just a different reinforcement learning problem, no fundamental change in their architecture.
Of course, once LLMs are really good at that, they can be set loose on the entire historical math literature, all 3.5M papers worth. And then LLMs can be trained on these formalized results (the ones that turn out upon attempted formalization to have been correct.)
How good do you think AI will be at proving new results given that training set?
Math is going to change, and change massively. There's a lot of whistling past the graveyard going on from those who are frightened by this prospect.
Relevant philosophy paper: "The Vulnerable World Hypothesis" by Nick Bostrom [0].
In that paper, Bostrom floats the idea that it might be in humanity's best interest to have a strong global government with mass surveillance to prevent technological catastrophes. It's more of a thought experiment than a "we should definitely do this" kind of argument, but it's worth taking the idea seriously and thinking hard about what alternatives we have for maintaining global stability.
Cheap hypersonics don't threaten global stability, they threaten global hegemony. Which is really what I suspect irks most people afraid of them.
We've seen a shift towards cheap offensive capacity that gives middle powers or even smaller actors the capacity to hit hegemons where it hurts, very visible in Ukraine and the Middle East now. This leads to instability only temporarily until you end up in a new equilibrium where smaller players will have significantly more say and capacity to retaliate, effectively a MAD strategy on a budget for everyone.
GP's point was broader than that, it was about technological progress and the possibility of terrorist groups or mentally ill individuals getting their hands on weapons that can easily kill millions of people. That's also what the paper I linked is about.
Consider a future where individuals can relatively easily engineer a pathogen or manufacture a nuclear weapon. It's not hard to imagine how that would threaten global stability.
It has been proven that recurrent neural networks are Turing complete [0]. So for every computable function, there is a neural network that computes it. That doesn't say anything about size or efficiency, but in principle this allows neural networks to simulate a wide range of intelligent and creative behavior, including the kind of extrapolation you're talking about.
I think you cannot take the step from any Turing machine being representable as a neural network to saying anything about the prowess of learned neural networks, as opposed to specifically crafted ones.
I think a good example is calculations or counting letters: it's trivial to write Turing machines that do those correctly, so you could create neural networks that do just that. From LLMs we know that they are bad at those tasks.
> So for every computable function, there is a neural network that computes it. That doesn't say anything about size or efficiency
It also doesn't say anything about finding the desired function, rather than a different function which approximates it closely on some compact set but diverges from it outside that set. That's the trouble with extrapolation: you don't know how to compute the function you're looking for because you don't know anything about its behaviour outside of your sample.
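A concrete toy version of this: two functions that agree exactly on every sample point but diverge wildly away from the sample, so nothing learned from the sample alone can distinguish them. Both functions are contrived for illustration:

```python
# Two functions that agree on every sample point but diverge away from the
# sample. The correction term is a product over (x - s) for each sample
# point s, so it vanishes exactly on the sample and explodes off it.

SAMPLE = [0, 1, 2, 3]

def f(x):
    return 2 * x  # the "true" function

def g(x):
    correction = 1
    for s in SAMPLE:
        correction *= (x - s)  # zero whenever x is a sample point
    return 2 * x + correction

assert all(f(x) == g(x) for x in SAMPLE)
print(f(10), g(10))  # → 20 5060
```

Any fitting procedure that only sees `SAMPLE` has no basis for preferring `f` over `g`, which is exactly the extrapolation problem.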
No, but unless you find evidence to suggest we exceed the Turing computable, Turing completeness is sufficient to show that such systems are not precluded from creativity or intelligence.
I believe that quantum oracles are more powerful than Turing oracles, because quantum oracles can be constructed, from what I understand, and Turing oracles need infinite tape.
Our brains use quantum computation within each neuron [1].
The difference is quantum oracles can be constructed [1] and Turing oracle can't be [2]: "An oracle machine or o-machine is a Turing a-machine that pauses its computation at state "o" while, to complete its calculation, it "awaits the decision" of "the oracle"—an entity unspecified by Turing "apart from saying that it cannot be a machine" (Turing (1939)."
This is meaningless. A Turing machine is defined in terms of state transitions. Between those state transitions, there is a pause in computation at any point where the operation takes time. Those pauses are just not part of the definition because they are irrelevant to the computational outcome.
And given that we have no evidence that quantum oracles exceed the Turing computable, all the evidence we have suggests they are equivalent to Turing machines.
Turing machines grew out of constructive mathematics [1], where proofs are constructions of the objects or, in other words, algorithms to compute them.
Saying there is no difference between things that can be constructed (quantum oracles) and things that are merely posited and cannot be constructed (Turing oracles, which are not machines of any sort) is a direct refutation of the constructive foundations of Turing machine theory.
That's an irrelevant strawman. It tells us nothing about how to create such a system ... how to pluck it out of the infinity of TMs. It's like saying that bridges are necessarily built from atoms and adhere to the laws of physics--that's of no help to engineers trying to build a bridge.
And there's also the other side of the GP's point--Turing completeness is not necessary for creativity--not by a long shot. (In fact, humans are not Turing complete.)
No, twisting it to be about how to create such a system is the strawman.
> Turing completeness not necessary for creativity--not by a long shot.
This is by far a more extreme claim than the others in this thread. A system that is not even Turing complete is extremely limited. It's nearly impossible to construct a system that can loop and branch without it being Turing complete, for example.
>(In fact, humans are not Turing complete.)
Humans are at least trivially Turing complete - to be Turing complete, all we need to be able to do is to read and write a tape or simulation of one, and use a lookup table with 6 entries (for the proven minimal (2,3) Turing machine) to choose which steps to follow.
Maybe you mean to suggest we exceed it. There is no evidence that we can.
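To make the "read a tape and follow a lookup table" point concrete, here is a minimal simulator (my own sketch). The 3-rule machine below is purely illustrative, not the proven-minimal (2,3) machine; the point is only how mechanical the steps are:

```python
# Minimal Turing machine simulator: a tape, a head position, and a
# lookup table of (state, symbol) -> (new_symbol, move, new_state).
# The toy machine below inverts a binary string and halts on blank.
from collections import defaultdict

def run(tape, rules, state="A", halt="H", steps=1000):
    cells = defaultdict(lambda: "_", enumerate(tape))  # "_" = blank
    pos = 0
    for _ in range(steps):
        if state == halt:
            break
        sym, move, state = rules[(state, cells[pos])]
        cells[pos] = sym
        pos += 1 if move == "R" else -1
    return "".join(cells[i] for i in sorted(cells)).strip("_")

# Bit-flipping machine: one rule per (state, symbol) pair.
rules = {
    ("A", "0"): ("1", "R", "A"),
    ("A", "1"): ("0", "R", "A"),
    ("A", "_"): ("_", "R", "H"),
}
print(run("0110", rules))  # → "1001"
```

A human with pencil and paper can execute exactly this loop, which is the sense in which we can simulate any such table.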
> P.S. everything in the response is wrong ... this person has no idea what it means to be Turing complete.
I know very well what it means to be Turing complete. All the evidence so far, on the other hand, suggests you don't.
> An infinite tape. And to be Turing complete we must "simulate" that tape--the tape head is not Turing complete, the whole UTM is.
An IO port is logically equivalent to an infinite tape.
> PDAs are not "extremely limited", and we are more limited than PDAs because of our very finite nature.
You can trivially execute every step in a Turing machine, hence you are Turing equivalent. It is clear you do not understand the subject at even a basic level.
> You can trivially execute every step in a Turing machine, hence you are Turing equivalent. It is clear you do not understand the subject at even a basic level.
LOL. Such projection. Humans are provably not Turing Complete because they are guaranteed to halt.
Judging from what I read, their work is subject to ordinary hardware constraints, such as limited stack size, because the paper describes a mapping from conventional hardware circuits to continuous circuits.
As an example, I would like to ask how to parse the balanced-brackets grammar (S ::= B <EOS>; B ::= ε | BB | (B) | [B] | {B};) with that Turing-complete recurrent network, and how it would deal with precision loss even for relatively short inputs.
The paper also does not address training (i.e., the automatic search for the processors' equations given inputs and outputs).
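For reference, the balanced-brackets language in question is recognized exactly by a tiny stack machine, which is the baseline any recurrent-network encoding would have to reproduce without precision loss (my own sketch):

```python
# Exact recognizer for B ::= ε | BB | (B) | [B] | {B}.
# A pushdown stack gives an exact yes/no for inputs of any length,
# with no numerical precision issues at all.
PAIRS = {")": "(", "]": "[", "}": "{"}

def balanced(s: str) -> bool:
    stack = []
    for ch in s:
        if ch in "([{":
            stack.append(ch)
        elif ch in PAIRS:
            if not stack or stack.pop() != PAIRS[ch]:
                return False
        else:
            return False  # symbol outside the grammar's alphabet
    return not stack  # accept only if every bracket was closed

print(balanced("([]{()})"))  # → True
print(balanced("([)]"))      # → False
```

An RNN simulating this must, in effect, encode an unbounded stack in its state vector, which is exactly where finite-precision arithmetic bites.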
Is there actually a use case for graphing calculators anymore? Desmos provides a great graphing program for free in a web browser. In any professional capacity you would be using MATLAB, Mathematica, or the scientific Python ecosystem.
I mostly remember playing games on my TI-84 in high school. We used it in class maybe once or twice. None of my college classes allowed graphing calculators on tests, so ironically I had to buy a "dumb" calculator even though I owned the fancy one.
I don't think there was ever a solid use case for graphing calculators in school, at least not in my experience? The curriculum didn't make good use of them and I'm not convinced it could have. There's little value in having every kid in the classroom replicate the same plot of y = sin(x) or whatever on a tiny screen. And other than such demonstrations... what are you gonna do with it? It was never flexible or powerful enough for serious math. You weren't going to run circuit or physics simulations on a TI-89.
There are other features that can be useful - scientific notation, symbolic solver, unit conversions, etc - but graphing as such always seemed like a gimmick.
I think it's more of a not-entirely-rational appeal to parents: "if my kid has a top-notch calculator for high school / college, maybe they're gonna be better at math". And kids did not object, but in the end, mostly just sideloaded games and horsed around.
> You weren't going to run circuit or physics simulations on a TI-89.
Well, I wrote a couple of programs that were useful for quite a while; they involved electromagnetism and changing frames of reference. I was definitely able to do quite a lot of physics with my TI-89.