I didn't realize it, but my '72 Dodge has a switching power supply to convert the 12 volt system voltage to 5V for the dashboard instruments.
It's a very simple device, relying on current heating a wire to bend it to and from a contact, but it's an engineering marvel of low-cost effectiveness.
But it doesn't produce "clean" 5V; there's a jitter to it. Some electronics guys have replaced it with modern circuitry (an op-amp based regulator, I think), but it turned out clean 5V was a problem: the jitter is what unstuck the analog dials so they'd display accurately, and the clean 5V didn't do that. So they had to add more circuitry to put jitter back into the supply voltage.
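That "useful jitter" is basically dither: a small perturbation that keeps a sticky mechanism from latching onto a wrong reading. Here's a toy Python sketch of the idea; the stiction threshold, response factor, and drive signal are made-up illustrative values, not measurements from the Dodge.

    import random

    def needle_reading(target, steps=200, stiction=0.3, dither=0.0):
        """Average needle position over the last 50 updates toward `target`."""
        needle, tail = 0.0, []
        for i in range(steps):
            drive = target + random.uniform(-dither, dither)  # supply jitter
            if abs(drive - needle) > stiction:     # needle is stuck below this threshold
                needle += 0.5 * (drive - needle)   # sluggish mechanical response
            if i >= steps - 50:
                tail.append(needle)
        return sum(tail) / len(tail)

    random.seed(1)
    print("clean 5V  :", round(needle_reading(1.0, dither=0.0), 2))  # stalls short of the target
    print("jittery 5V:", round(needle_reading(1.0, dither=0.5), 2))  # hovers near the target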
I've been using a GaN-based power supply for my MacBook Pro and it's just as good as advertised. Slightly bigger than a deck of cards, yet it runs cooler than the stock unit.
Hopefully this tech finds its way into more devices as it gets cheaper. It would be neat to have GaN based inverters for electric cars.
* Make sure they are UL listed for safety, as Animats pointed out in an earlier discussion. [1] Some aren't, even from well-known brands like Anker.
* If they are multi-port, make sure you're happy with how they distribute power. Most likely you want the majority of the power going to port 1, where you stick a laptop, but some drop it to 50W or lower as soon as a cable is plugged into one of the other ports, even with no device connected.
I love the Monoprice ones. They're small, inexpensive, inconspicuous, and meet the standards I described above. They also charge all my devices reliably, unlike my (larger, I think non-GaN) Nekteck chargers that spend their time in a drawer now. Buy a 5A cable to get the full 100W out of them at 20V.
Even though the Anker brick I have does not have an Underwriters Laboratories logo on it, it does have a TUV and a CE logo. TUV is the equivalent of UL but is a testing house mostly geared towards Europe.
That being said, TUV like UL is listed as a Nationally Recognized Testing Laboratory (NRTL) in the United States (for more information see: https://www.tuv.com/usa/en/ctuvus-certification.html) and thus is just as safe.
The UL logo may be more recognizable in the US, however they too certify for both markets, with UL also having their UL EU certification program.
A TUV mark would only be accepted in North America if it has a cTUVus stamp on it.
What I wonder is whether TUV in Europe tests at just 240V if it's labelled 100V-240V, or across the labelled voltage range (though lower voltage should mean fewer problems).
I'm currently charging the Air M1 off a 20W Anker Nano, which is kind of amazing for a 30g device, although it charges very slowly. It's quite handy to drop in your bag in case you run out. I've ordered a 45W one, but they are out of stock presently.
I have a collection of RAVPower GaN chargers that are very solid so far. Been using them for about a year now.
In fact, the Apple power brick that came with my M1 MacBook Pro is still in its original packaging; I have not used it yet. The 65 Watt RAVPower one seems as if it's about half the size. More than half the mass, though.
(I have never needed to try warranty support for RAVPower things.)
It surprises me that GaN marketed power supplies cost extra. The actual GaN transistors are under 10 cents now, and by using one you can use a smaller inductor, smaller capacitor and smaller heatsinks. Those smaller components lead to a smaller plastic case and smaller circuit board.
All of those smaller things cost less, so the finished product has lower production costs.
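For the curious, the smaller passives mostly follow from the higher switching frequency that GaN's low switching losses make practical. A back-of-the-envelope sketch using the standard buck-converter ripple relation; the voltages, ripple target, and frequencies below are illustrative assumptions, not figures from any particular charger.

    def buck_inductance(v_in, v_out, f_sw, ripple_a):
        """Inductance needed to hold inductor current ripple to `ripple_a` amps."""
        duty = v_out / v_in
        return (v_in - v_out) * duty / (f_sw * ripple_a)

    v_in, v_out, ripple = 20.0, 5.0, 0.5   # volts, volts, amps (assumed operating point)
    for f_sw in (100e3, 500e3, 2e6):       # silicon-ish vs GaN-ish switching frequencies
        l_uh = buck_inductance(v_in, v_out, f_sw, ripple) * 1e6
        print(f"{f_sw / 1e3:6.0f} kHz -> {l_uh:5.1f} uH")

Same output ripple, roughly a twentieth of the inductance at 2 MHz, which is where the smaller magnetics, board, and case come from.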
The ones I've bought haven't been terribly expensive. About half the price of Apple chargers. I daresay you don't want to go too cheap with chargers as they can overheat/pack up/damage things.
Yes, it can, but that would be pricey with regards to how much current they can handle. GaN switches can't handle as much as silicon, or SiC, despite their higher efficiency.
I never heard that Apple invented switching power supplies... how common is that train of thought? I've been using them since way before this article was written
The article is neither making nor debunking that claim. It is debunking the claim that Apple revolutionized computer power supplies by borrowing the switching power supply design used in oscilloscopes. It's the second sentence of the post.
To be fair, he (and many other people) considers significant quality and design improvements "revolutionary". I tend to agree, especially when the entire industry makes garbage. For example, the touchpad. It's still one of the top reasons for me not to switch to anything else. Nothing else comes close to being as accurate and natural.
As the article explained, the Apple ][ SMPS didn't have significant quality or design improvements. It did have some novel design features, but they weren't good enough to be widely adopted by others, and Apple eventually dropped them too.
I'd say the word "revolutionary" has just become an inflationary term for "evolutionary": it's thrown around willy-nilly by everyone, PR guys, and their mother to describe any sort of improvement (or, more often, "improvement").
It really diminishes (and mostly eliminates) its value.
Etymologically, it works (you “give another turn” to the base idea, you take an existing idea and you flip it, you tighten the screw on an existing idea, you adjust course by turning an existing direction, etc.)
Compared with its personal computer contemporaries, the TRS-80 and the Commodore PET, the Apple II's use of a switching power supply was unique. I had never heard that they invented the switching power supply.
Apple's price premium meant they often could introduce the next cutting edge technology that would soon become ubiquitous when the prices came down.
In fact, switching power supplies were used long before the transistor; they used mechanical switches.
Car radios used mechanical switches to generate the anode voltage for the tubes, and in trains there were voltage converters using a rotating stream of liquid mercury.
Whenever Apple is discussed, there often seems to be a conflation of invention (coming up with something new) with innovation (applying an invention in a way that changes things.)
Apple didn't invent the personal computer, or the GUI, or Wi-Fi, or MP3 players, or smartphones, or app stores, or tablets, or smartwatches, or ARM processors – but they introduced innovative, and indeed transformative, products in those categories.
At the time, "Apple invented the switching power supply" was a notion going around in bad tech/sci reporting circles, so it deserved dismantling along with the power supplies.
> I didn't see the need to frame it by attacking Apple.
Other posters have already pointed out that this article seeks to clarify the history around Steve Jobs' (not entirely accurate) claims.
I want to focus on the fact that people find a need to protect Apple.
Apple is a 2T+ market cap corporation. It is not a friend, it is not a family member, and it is certainly not beyond reproach. It doesn't care about you -- it just wants you to spend more money on its products and services.
Don't feel bad for Apple when people call it out for bad behavior or historical inaccuracies. People should do this.
While there are people that work at Apple that legitimately care about making good products, in the macro the predominant factor is still money. It drives the whole enterprise. The very shape of Apple's solutions and good will are fit by an optimization function to obtain money.
Brand, supply chain, innovation, fierce competition, fostering loyalty, building a moat. These are the things Apple does. It's a machine that makes money selling products.
You might like Tim Cook, Steve Jobs, or many of the other product people and engineers there. That's fine. But don't form a fond bond with the company. And also realize the motivations of the leadership. They're humans -- they can do good, but they can also make mistakes and tell lies to serve their own needs.
If Apple makes products you like and enjoy, buy them, appreciate them, and leave it at that. Don't let Apple create a sense of nostalgia, closeness, or loyalty. This is artificial. The company doesn't care about you at all. It can't.
It's my subtle protest against the HN belief that people need to be warned against articles from previous years. That said, it's a bit alarming to realize I wrote the article 9 years ago.
Agree - a date is often absolutely essential to placing what you’re reading in context. I always look for one at the top of any article I read, and assume based on past experience that the lack of one is usually indicative of clickbait (although clearly not in this case).
I did a small A/B test long ago, and found that the date, if more than a couple of years old, can discourage a lot of people from even reading the article.
So maybe it should be placed at the end as a compromise?
I can definitely appreciate the conundrum - you write a great piece and you want people to see it. My 2c would just be: trust your reader. People tend to have a good sense for whether a particular piece of content is going to have an effective “use-by” date or not.
Hi, kens. This is my n-th time rereading this article, this time I noticed a small typo. Footnote 90, "More recent VMR specifications" should've been "VRM".
Lack of dates in articles annoyed me way before I started reading HN. So I don't see the problem (didn't know that was a thing on HN), but I respect your decision.
The article is debunking this claim by Steve Jobs:
"That switching power supply was as revolutionary as the Apple II logic board was," Jobs later said. "Rod doesn't get a lot of credit for this in the history books but he should. Every computer now uses switching power supplies, and they all rip off Rod Holt's design."’
Which like the GUI was ripped off from someone else:
The GUI was stolen from Xerox, and the switching power supplies were ripped off from an oscilloscope company, Hewlett-Packard. Didn't Woz work for them? Hmmmm...
Stolen...right. Xerox invited Apple's engineers to tour PARC and then Apple gave them millions of dollars of shares, which Xerox later sold for a hefty profit. IIRC they made more off the Apple stock sale than off sales of the Alto and Star.
> The GUI wasn’t stolen. It was licensed. That it was stolen is completely false.
Not only was it not stolen, several Xerox PARC people moved over to Apple since they wanted to see their ideas actually commercialized and not squandered like Xerox was doing, so a lot of it was even the same people!
Read the article. Many computer companies used them before Apple. Apple's use didn't revolutionize anything, according to the article, because they were already in common use. Did we read the same thing?
I remember reading something by Art Spiegelman, whose father was a Holocaust survivor, about how his life felt unimportant because nothing could compare to what his father went through.
But then, he rhetorically asked, if surviving the concentration camp was success, did that mean the vast majority who didn't were losers?
And I think his father said something to the effect that it was random, that there was no "survival of the fittest", so no, matter-of-factly, the dead were not the ones who failed.
Oh you're absolutely correct about that. Jobs in particular needs to be scrutinized.
But if that was the point of the article, the few paragraphs under the section "History of switching power supplies to 1977" already accomplished it.
The article reads like it was written by a power supply enthusiast (who knew?), and the author did a good job of drawing me into the history and the technology. It would have still been an interesting read without the "bookends".
To find this requires a combination of a few keystrokes and button pushes plus an "outlook" (rather than "inlook") attitude.
Instead, as with so many discussions in a so-called "CS" community, we find something like adolescents trying to BS each other by presenting their mere opinions as facts.
Come on! Please!
Long ago my research community put in a lot of work to make it easy to deal with many simple questions, but we didn't reckon with the sheer inwardness of so many end-users. A similar problem is that most people in CS have no idea what Doug Engelbart really did, yet just typing his name into Google will provide great info in just the first few hits.
How can the current community repair itself and start trying to become a real field again?
It is the paradox of the information era: society has never had such easy and immediate access to the limits of human knowledge, and at the same time, society can't agree on whether the earth is spherical, whether vaccines work, or whether demonstrably corrupt politicians are really that bad.
I know someone who was an exec at Astec in the 90s and Apple was notable for being willing to pay a premium for their power supplies (I think for better efficiency and miniaturization).
I don't remember any notable problems but Apple has a long history of doing interesting things for form factor rather than prioritizing reliability going at least as far back as the 128k.
Any suggestions on a good MacBook Pro charger? I've been trying to find a GaN-based one with all the necessary certifications, but my search hasn't been successful so far. e.g. [0]
I doubt that the old MagSafe chargers are capable of outputting the variety of voltages necessary to be usefully compliant with USB-C charging. IIRC, the old MagSafe chargers put a fairly small voltage on their output pins, and when a load was detected they turned on the full voltage and power—but there was no real data communication between the computer and the brick (just between the computer and the connector at the end of the charging cable). USB-PD however is a complicated protocol.
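For context, a USB-PD charger advertises a menu of fixed voltage/current profiles (PDOs) and the device requests one of them, whereas the old MagSafe brick only ever had its single rail to offer. A toy sketch of just that selection step follows; the advertised profiles are typical assumed examples, not pulled from any specific charger, and the real protocol layers negotiation messages, timing, and cable checks on top of this.

    def pick_pdo(pdos, needed_watts):
        """Pick the lowest-voltage advertised profile that covers the power budget."""
        usable = [(volts, amps) for volts, amps in pdos if volts * amps >= needed_watts]
        return min(usable, default=None)

    advertised = [(5.0, 3.0), (9.0, 3.0), (15.0, 3.0), (20.0, 5.0)]  # (volts, amps) profiles
    print(pick_pdo(advertised, 60))   # -> (20.0, 5.0): only the 20 V rail covers 60 W
    print(pick_pdo(advertised, 27))   # -> (9.0, 3.0)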
If only they'd invest more work into CMC and LC filters for such power supplies in wall warts. As someone who listens to shortwave and AM radio DX it's sometimes impossible to pick up even strong stations due to the swampy noise of SMPS devices.
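Some rough numbers on what even a modest LC stage could do to conducted SMPS hash in the AM/shortwave bands; this assumes an ideal second-order filter and made-up component values, so treat it as an order-of-magnitude sketch rather than a wall-wart design.

    import math

    def lc_cutoff_hz(l_henries, c_farads):
        """Resonant (corner) frequency of a second-order LC low-pass filter."""
        return 1.0 / (2.0 * math.pi * math.sqrt(l_henries * c_farads))

    def rough_attenuation_db(f_noise_hz, f_cutoff_hz):
        """Ideal second-order roll-off: roughly 40 dB per decade above the corner."""
        return max(0.0, 40.0 * math.log10(f_noise_hz / f_cutoff_hz))

    corner = lc_cutoff_hz(100e-6, 1e-6)   # 100 uH choke + 1 uF cap (assumed) -> ~16 kHz corner
    for f_noise in (150e3, 1e6, 7e6):     # SMPS fundamental and harmonics up into the SW bands
        print(f"{f_noise / 1e3:7.0f} kHz: ~{rough_attenuation_db(f_noise, corner):.0f} dB down")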
I'm not looking to argue over the meaning of "invent" or "personal computing", but I'll point out that Alan Kay wrote a 1972 paper "A Personal Computer for Children of All Ages" that pretty much describes the modern laptop or tablet. (He even mentions ad blockers!) Xerox PARC went on to implement a lot of this in the Xerox Alto.
I don't remember exactly when, but maybe around 2007, my cousin sent me a video where a man was essentially interacting with a tablet that was sitting on a table. It had an assistant that was helping him with everything - he was putting together a presentation about a forest (I believe), I think he asked it the weather, set up meetings, etc.
It was an amazing thing to watch unfold.
To this day, I can't for the life of me find that video and really wish I could.
As I was reading through that paper, it reminded me of it.
If you are doing something so well and making tons of money, why spend money on what will absolutely ruin your core business? Personal computers weren't just going to supplement the copier business, they would seriously ruin it. But there was also a chance computers would flop so why spend the money on trying to ruin your core business and not succeed? People found uses for computers that the pioneers could never imagine so why would some CEO be any more imaginative? Disruptive tech usually doesn't come from the big companies, it comes from little guys who can take the risk because they have nothing else to fall back on. The big companies only get into it when they see that there is an actual market.
Kodak gets picked on unfairly. They weren't completely blind to the digital camera revolution, they produced the first digital camera prototype but it was laughably incapable. Their problem was building a business model that took a cut from every picture snapped, from film to processing to printing. There was nothing in the digital camera model that could duplicate that revenue stream; even if they had sold every digital camera ever made, it would not have saved them. Their final undoing was overestimating the importance of printed pictures, in the end it turned out people would rather see their pictures on their screens.
What would you have done differently than Xerox management?
Probably you, and most people on HN, haven't been paying attention to them in a long time, but what happened over the last 15 years or so is that they decided to diversify out of copiers into services.
It sounds vaguely plausible, people obviously deal with documents mostly on computers and not paper these days, so why can't they help with that?
Eventually they acquired a fairly big, but not a household name, outsourcing company called ACS. I used to wonder why Google couldn't do the same stuff, but they did in fact outsource, probably because it was the data processing equivalent of cleaning toilets.
But then a notorious corporate raider, aka Carl Icahn, came in and forced management to break up the company in the name of shareholder value, isolating the copiers in one business, and the document services in another, basically ACS under a new name.
Who's right? Diversify or get back to basics? It seems sometimes it's just random. You can come up with platitudes justifying either.
This may kind of seem like normal corporate America, but it just seems a bit uncharitable to call the CEO an idiot when you look at how the people running GameStop or whatever have had things a bit easier with billions being thrown at them.
Apple did revolutionize power supplies, but not in the way that is claimed: they made it a marketing strategy to ship utter garbage, like the Apple III and the "Hindenbook", along with the most inexpensive, cost-cutting garbage known in the industry, until Dell and eMachines decided to use the same model:
Just look up "Apple PowerSupply Recall" and you will see they are just repeating their cost saving success/failure.
Same story as with hard-drive-based audio players. I had been using one (from Creative, if I remember correctly) long before the iPod materialized. All was quiet, but as soon as the iPod came along it was of course called a revolution.
At the time I bought that "boat anchor" (which was fine, as I was only using it in my car) the iPod wasn't out yet. So yes, I bought it. And when the iPod came into existence it had to be used with the most pathetic piece of software I've ever seen, so I bought a Zen instead, which in my opinion was better for my use.
> "No wireless. Less space than a nomad. Lame." - CmdrTaco
Apple is very good at maximizing "consumer satisfaction". This means giving consumer the best deal for their money considering as many of the consumer needs as feasible. Other brands ignore a lot of these needs and then they wonder why people don't buy their products when, on some details, they are much better than Apple at a lower price.
Seriously, those MP3 players were nothing like the iPod. This is a bit like people going on about smart phones before iPhone. I had a couple of those and the iPhone was just so far ahead, that we are not really talking about the same category of products.
A product isn't merely checking some boxes. Checking the "has hard drive" box doesn't make it an iPod.
In the context of the time, those mp3 players were following in the footsteps of portable CD players or walkmen, which were themselves bulky and held about 70-90 minutes of music before changing media. There were also a few low-capacity CompactFlash-based players borrowing media from the growing digital camera market, and later on some CD players that would play yellow book CDs with mp3 files.
"So far ahead" in some ways, while completely lacking in others.
When iPhone came out, the responsive UI was something I'd wanted for the past several years worth of Palm and HTC phones. But I couldn't get one because I'd have to give up things I used daily - like 3rd party apps, 3G data, turn-by-turn navigation, MMS (which was still a big deal before the prevalence of IP-based mobile IM platforms), multitasking, live streaming radio, and even "crazy" features like copy/paste and custom wallpapers/alert sounds.
iPhone did a few things very well, but it took a while to catch up on a lot of the common functionality of those older smartphones. Thankfully Apple did add those features over time. And other smartphones gained more responsive UI. Now we have more options than ever, with even the cheapest bargain device performing better than anything available in 2007.
Yeah I also didn't know that "hard drive based audio players" was a thing :)
But I remember that the iPod was awesome (compared with the other players with shitty controls and sticky buttons). I hated Apple products at the time (religion), but I was amazed by the quality and UI of the iPod and the quality of the iPad (it took me and my kids many attempts and years to actually kill it).
I don't think any other product amazed me that much since then.
The first "box" I used to check when buying things like that is that PC sees it as a hard drive to which I can copy files. That horror they called iTunes would not be let anywhere close to my computers.
While obviously Jobs’ claim was false, I will say that Apple is the only company I am aware of that manufactures power supplies which are reliably completely free of perceptible inductor whine. I have very acute high-frequency hearing and I often have to replace non-Apple USB(-C) switching power supplies with Apple ones so I don’t go crazy from the whining. Teardowns of Apple PSUs typically reveal very favorable electronic and industrial design as well.
I've, at various points in my career, grumbled about various things whining audibly (one particular motion light sensor was defective and right outside my office for a while). The trick to getting other people to believe you ("I can't hear anything... are you sure?") is to wait for a bring-your-kids-to-work day. And ask if they can hear it.
Or, perhaps, if it's bad enough, you don't even have to do that, because the kids will ask what that horrid whining noise is. We did eventually get it fixed after that, but I was quite literally the only one in the hall who could hear it.
On the topic of power supplies, though - Apple has done some impressive work in their small power supplies. The Chinesium clones are similarly sized; they just skip literally every safety feature intended to keep mains voltage out of your USB cord...
> Doesn't take "acute," just takes "not destroyed."
It depends. I played in an orchestra, and I've found that hearing is very different from person to person. I've seen people who can hear below the 20Hz band, people who can hear a wrong note in a symphony orchestra recording, and people who can perfectly tune their instruments by ear... The list goes on and on.
Our ears' equalizers are not always flat and equal. We can damage them yes, but not everyone starts from the same point.
This is called infrasonic hearing, and it is not any blessing. Those who can hear lower frequencies than most people often experience pain. It's the low frequencies coming from diesel school buses that are most painful to me, when no one else nearby can even hear the bus at all until it gets close enough. What the GGP is talking about is ultrasonic-range hearing, which often also comes with that extra detail of pain.
Gotta hand it to the kids. My 9 year old claims he can hear a whine from iPads and iPhones when their battery level is near 0%. And I thought I had good hearing...
It’s extremely frustrating, and I was always surprised when I returned mid-range USB chargers because of whine only to receive a replacement with the same problem and hundreds of reviews that failed to mention it. I’ve never had an issue with Apple chargers, and the extra cost is money well spent.
Buy genuine Apple chargers, if not for you, then for your dog.
> Apple is the only company I am aware of that manufactures power supplies which are reliably completely free of perceptible inductor whine...I often have to replace non-Apple USB(-C) switching power supplies
It's a well-studied economic fact that monopolists generally sell higher quality products, and it helps them maintain their monopoly. With their market power and the fat margins they earn, there is plenty of budget to do R&D and achieve scale benefits. Their optimum-profit product mix and price points are skewed higher. Nobody complained about IBM mainframe quality, nor Bell Telephone quality.
So, it's not testimony to Apple's prowess, it's simply a cookbook outgrowth of their product differentiation strategy.
Well, that's completely ahistorical. "Nobody ever got fired for buying IBM" was due to conservatism and idiocy on the part of managers and businesspeople, not to mention... you know... IBM's monopoly power and the advantages that went along with that. During the bulk of the minicomputer and mainframe era it had little or nothing to do with the relative quality of IBM's stuff.
People who set policy need to think of all consumers, not just the rich ones such as yourself. More working-class people could buy iPhones if they were not artificially high priced.
You too would benefit from a competitive iPhone market, there would have been an earlier introduction of large screens, cheaper memory options, perhaps getting choices that included repairability, replaceable batteries, microSD cards, etc.
Think of it this way: if Apple was broken into three Apples that compete with each other, the shareholders would not have lost anything they are entitled to: they'd each still own what they owned before, a share of each of the new companies instead of a share in the old. But the previously monopoly-level prices would drop.
This is so off-topic, but no one ever believes me when I'm complaining about a wall adapter across the room driving me mad. Glad to know I'm really not imagining things hah!
It can be an issue even if you think you understand it at the design stage. I've sold services to identify sources so boards or assemblies can be redesigned. In one case two different clocks for two ICs on different boards interfered with each other causing problematic noise. Filtering the power supplies helped, but changing how one IC was used made the single biggest contribution to reducing the noise levels.
My MacBook power supply whines like crazy, especially when under load. The one thing that drives me crazy that I rarely see talked about is PWM fan control. I think different manufacturers use different frequencies, but it's usually far more annoying than the motor/air movement noise.
I'm guessing this article is from 2011, back when Apple fans were just ending their role as being a niche that believed anything.
At this point it is - I think - a pretty common assumption that Apple just puts things together in a decent ecosystem. If Apple devices offer a new frequency range, it's because Qualcomm's radio allowed for it, which was dependent on other things further up the stream.
This is an interesting trip down memory lane. Apple still says "magical" sometimes in their keynotes, but nobody is really mystified, only occasionally glad they decided to offer something in that way, since it's more about the Apple implementation than the Apple innovation.
Not an Apple fanboy, but I don't think this is completely correct. As a huge buyer of semiconductors they can, when they want to, exert a lot of pressure on suppliers. Famously they did this with Gorilla Glass; they have done so with Intel and Qualcomm. Sometimes not so successfully (all the expense on the "liquid metal" company, for example).
It's not all sheer pressure; they do a lot of collaborative design. After all, they have one of the best semiconductor design teams (both digital and analog) around. And they are on standards bodies; they allegedly (some non-Apple people told me) contributed significantly to USB-C.
I emphasized completely because in the modern ecosystem it’s broadly true (RAM, displays etc)
Apple’s supply chains and how they can apply pressure is massively underrated.
People still have this idea that “so and so actually invented it”. Or “Apple just combined X and Y”.
But they fail to see how truck loads of money and a customer willing to pay for something and making large pre-payments can change the trajectory of a company. Or even has impact on other players.
I don't like Apple generally. And it wasn't "their" technology - but thank fuck they brought "better than 1080p" screens to the mainstream.
Throughout the 90's screen resolutions were getting better and better. Then LCDs came and 1080p stopped mainstream screen resolution improvements for at least 5 years.
Thankfully Apple got the screen resolution race going again.
Mainstream 1080p wasn't the worst of it. Just a decade or so ago it was impossible to find a non-17" Windows laptop without a 1366x768 TN screen with awful contrast and viewing angles so bad there was no good angle. Even super premium $3000 "Ultrabooks", it was true insanity.
The last Windows app I worked on, I made sure the tester always tested at that resolution because it was so ubiquitous. I even wrote a utility to set the app resolution so I could run it at that size myself.