
No rename symbol? What am I missing? It seems like a no-brainer.

References + a few extra steps will give you rename symbol. Anthropic seemingly wants to experiment with this, so it makes sense to limit the integration points.

If you do backend web development, then at 99% of software companies being very good at whatever RDBMS you use is a superpower.

It's definitely worth learning SQL very well, but you also need to learn the data structures your RDBMS uses, how queries translate into operations on that data, and what those operations look like in a query plan.

You can go surprisingly far with just that knowledge.

A great resource is https://use-the-index-luke.com/
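For example, you can ask the database for its plan directly and read whether it used an index or scanned the table. A minimal Go sketch, assuming Postgres and a hypothetical orders table (connection string is made up too):

  package main

  import (
    "database/sql"
    "fmt"
    "log"

    _ "github.com/lib/pq" // Postgres driver; any driver works
  )

  func main() {
    db, err := sql.Open("postgres", "dbname=shop sslmode=disable")
    if err != nil {
      log.Fatal(err)
    }
    // EXPLAIN ANALYZE runs the query and prints the plan the database
    // actually chose, one line per plan node.
    rows, err := db.Query("EXPLAIN ANALYZE SELECT * FROM orders WHERE customer_id = 42")
    if err != nil {
      log.Fatal(err)
    }
    defer rows.Close()
    for rows.Next() {
      var line string
      if err := rows.Scan(&line); err != nil {
        log.Fatal(err)
      }
      fmt.Println(line)
    }
  }

If you see "Seq Scan" where you expected an index, that's your cue to dig into the data structures mentioned above.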


I don’t think this is confusing to the vast majority of people writing Go.

In my experience, the average programmer isn’t even aware of the stack vs heap distinction these days. If you learned to write code in something like Python and are coming at Go from “above”, this will just work the way you expect.

If you come at Go from “below” then yeah it’s a bit weird.


Go has been my primary language for a few years now, and I’ve had to do extra work to make sure I’m avoiding the heap maybe five times. Stack and heap aren’t on my mind most of the time when designing and writing Go, even though I have a pretty good understanding of how it works. The same applies to the garbage collector. It just doesn’t matter most of the time.

That said, when it matters it matters a lot. In those times I wish it was more visible in Go code, but I would want it to not get in the way the rest of the time. But I’m ok with the status quo of hunting down my notes on escape analysis every few months and taking a few minutes to get reacquainted.
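For anyone doing the same hunt: the usual starting point is go build -gcflags=-m, which prints the compiler's escape-analysis decisions. A minimal sketch (the diagnostic text isn't a stable interface):

  package main

  func leak() *int {
    i := 7
    return &i // -gcflags=-m reports: moved to heap: i
  }

  func stay() int {
    i := 7
    return i // no report: the value is copied to the caller
  }

  func main() {
    _ = leak()
    _ = stay()
  }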

Side note: I love how you used “from above” and “from below”. It makes me feel angelic as somebody who came from above, even if Java and Ruby hardly seemed like heaven.


Why have you had to avoid the heap? Performance concerns?


For me, avoiding the heap, or rather avoiding GC, came when I was working on a backend web server in Java, and there was a default rule for our code that if GC took more than 1% of time (I don't remember the exact value) then the server got restarted.

Coming (back then) from C/C++ gamedev, I was puzzled; then I understood the mantra: it's better for the process to die fast than to be pegged by GC and stop answering clients.

Then we started looking at what made it use GC so much.

I guess it might be similar in Go - in the past I've seen some projects use a "balloon" to circumvent Go's GC heuristics: if you allocate a dummy balloon that takes half of your memory, the GC might not kick in so much. Something like this... Then again, obviously a bad solution long term.
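If I remember right, the balloon/ballast trick looked roughly like this; a sketch only, with runMyServer as a hypothetical stand-in (these days GOGC and GOMEMLIMIT are the supported knobs):

  package main

  import "runtime"

  func runMyServer() { select {} } // stand-in for the real workload

  func main() {
    // A large allocation that is never touched: it inflates the live
    // heap, so the GC's proportional trigger (GOGC) fires far less
    // often for the real allocations.
    ballast := make([]byte, 10<<30) // ~10 GiB, mostly untouched virtual memory

    runMyServer()

    runtime.KeepAlive(ballast) // keep the ballast live for the process lifetime
  }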


Garbage Collection.

The content of the stack is (always?) known at compile time; it can also be thrown away wholesale when the function returns, making stack allocations relatively cheap. These FOSDEM talks by Bryan Boreham & Sümer Cip cover it a bit:

- Optimising performance through reducing memory allocations (2018), https://archive.fosdem.org/2018/schedule/event/faster/

- Writing GC-Friendly [Go] code (2025), https://archive.fosdem.org/2025/schedule/event/fosdem-2025-5...

Speaking of GC, Go 1.26 will default to a newer one viz. Green Tea: https://go.dev/blog/greenteagc
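One GC-friendly pattern of the kind those talks cover is reusing buffers instead of allocating per call. A minimal sketch with sync.Pool:

  package main

  import (
    "bytes"
    "fmt"
    "sync"
  )

  // bufPool recycles buffers across calls, so each call doesn't leave
  // fresh garbage behind for the collector.
  var bufPool = sync.Pool{
    New: func() any { return new(bytes.Buffer) },
  }

  func render(name string) string {
    buf := bufPool.Get().(*bytes.Buffer)
    defer func() {
      buf.Reset()
      bufPool.Put(buf)
    }()
    fmt.Fprintf(buf, "hello, %s", name)
    return buf.String()
  }

  func main() {
    fmt.Println(render("world"))
  }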


Ha! I had not intended to imply that one is better than the other, but I am glad that it made you feel good :).

I also came "from above".


As someone who writes both Python and Go (and I've been using Python professionally since 2005), I remember that the scoping behaviour has changed.

Back in Python 2.1 days, there was no guarantee that a locally scoped variable would continue to exist past the end of the method. It was not guaranteed to vanish or go fully out of scope, but you could not rely on it being available afterwards. I remember this changing from 2.3 onwards (because we relied on the behaviour at work): from that point onwards you could reliably "catch" and reuse a variable after the scope it was declared in had ended, and the runtime would ensure that the "second use" maintained the reference count correctly. GC no longer got in the way or concurrently made the variable disappear from underneath you.

Then from 2008 onwards the same stability was extended to more complex data types. Again, I remember this from work code giving me headaches by making supposedly out-of-scope variables vanish into thin air, with the only difference being a .1 version gap between the work laptop (where things worked as you'd expect) and the target SoC device (where they didn't).


I don't see how this is coming at Go "from below".

Even in C, returning a pointer to a stack-allocated variable is explicitly considered undefined behavior (not illegal; explicitly undefined by the standard, and yes, that means unsafe to use). It would be one thing if the standard disallowed it.

But that's only because the memory location pointed to by the pointer becomes unknown (perhaps even immediately). Returning the variable's value itself works fine. In fact, one can return a stack-allocated struct just fine.

TLDR: I don't see what the difference is, to a C programmer, between returning a stack-allocated struct in C and a stack-allocated slice in Go. (My guess is that the C programmer thinks a stack-allocated slice in Go is a pointer to a slice, when it isn't; it's a "struct" that wraps a pointer.)
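To spell that out: a slice value is roughly a three-word struct passed by value, something like the sketch below (the real header lives unexported inside the runtime):

  package demo

  import "unsafe"

  // Returning a slice copies these three words, much like returning
  // a small struct by value in C; the backing array is not copied.
  type sliceHeader struct {
    data unsafe.Pointer // pointer to the backing array
    len  int            // elements in use
    cap  int            // capacity of the backing array
  }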


The confusion begins the moment you think Go variables get allocated on the stack, in the C sense. They don't, semantically. Stack allocation is an optimization that the Go compiler can sometimes do for you, with no semantics associated with it.

The following Go code also works perfectly well, where it would obviously be UB in C:

  package main

  import "fmt"

  func foo() *int {
    i := 7
    return &i // i escapes to the heap, so this is perfectly legal in Go
  }

  func main() {
    x := foo()
    fmt.Printf("The int was: %d", *x) // guaranteed to print 7
  }


Is that the case? I thought that it would be a copy instead of a heap allocation.

Of course the compiler could inline it or do something else, but semantically it's a copy.


A copy of what? It’s returning a pointer, so i has to be on the heap[0].

gc (the Go compiler) could create i on the stack and then copy it to the heap, but if you plug that code into godbolt you can see that it is not that dumb: it creates a heap allocation, then writes the literal directly into that.

[0] unless foo is inlined and the result does not escape the caller’s frame, in which case the allocation can be done away with.


OK, I'd agree that in that example a Go programmer would expect it to work fine and a C programmer would not, but that's not the example the writer gave. I stand by my statement that a C programmer would expect the example the writer gave to work just fine.


I think the writer had multiple relatively weird confusions, to be fair. It's most likely a case of "a little knowledge is a dangerous thing". They obviously knew something about escape analysis and Go's ability to put variables on the stack, and they likely knew as well that Go slices are essentially (fat) pointers to arrays.

As the author shows in their explanations, they thought that the backing array for the slice gets allocated on the stack, but then the slice (which contains/represents a pointer to the stack-allocated array) gets returned. This is a somewhat weird set of assumptions to make (especially given that the actual array is allocated in a different function that we don't get to see, ReadFromFile), but apparently this is how the author thought through the code.


American hegemony, and all that.


In the US they spell it as nonze.


No we don't.


Pretty positive that was a joke/bait…


It absolutely was a joke

Slightly absurdist, nonsensical humour, I’ll admit, but nonetheless a joke :-)


The best kind :)


I believe they change color to express emotion.


They change color to communicate AND to regulate body temperature AND as camouflage.

It is not a ‘myth’ that one of the use cases for their color changing is camouflage; I’m not sure what they are on about.


"Google kills Gemini cloud services" is the best one. I can't believe I haven't seen that joke until today.


10 years is way too long for Google. It will be gone in 5, replaced by 3 other AI cloud services.


You're right. How naive of me.


I mean Bard barely lasted a year. Arguably Gemini is just a rebrand of Bard, but Bard is still dead.


If you look at web traffic when making Gemini web requests, you'll see that Bard is still in the URL (so are LaMDA (pre-bard) and Assistant (pre-GenAI)):

gemini.google.com/_/BardChatUi/data/assistant.lamda.BardFrontendService/StreamGenerate


That's interesting! I wonder how many parts of their other dead projects are used in current projects.


Bing Chat suffered the same fate


Tbh I never even heard of that!


Didn't they also just shut down Vertex and Gemini APIs to launch a new unified API this week?


I can't find any news about this. That's not wholly unusual given the context. Do you have a link?


Finally that turf war ends


lol already forgot about bard like it was ancient history


The humor is hit or miss but when it hits it’s quite funny, and the misses are merely groan-worthy.

Triggered by the lighthearted tone of the prompt, I’d bet, but still quite impressive relative to most LLM-generated jokes I’ve had the misfortune to encounter.

My favorite: “Is it time to rewrite sudo in Zig?” which has a few layers to it.


> The humor is hit or miss but when it hits it’s quite funny, and the misses are merely groan-worthy.

Not sure; I thought basically every link was pretty hilarious. "FDA approves over-the-counter CRISPR for lactose intolerance" isn't even that funny on its face but for some reason it had me actually loling.


"IBM to acquire OpenAI (Rumor) (bloomberg.com)".... quick someone set up a polymarket so i can bet against it.


In my experience (I've put hundreds of billions of tokens through structured outputs over the last 18 months), I think the answer is yes, but only in edge cases.

It generally happens when the grammar is highly constrained, for example if a boolean is expected next.

If the model assigns a low probability to both true and false coming next, then the sampling strategy will pick whichever one happens to score highest. Most tokens have very similar probabilities close to 0 most of the time, and if you're picking between two of these then the result will often feel random.

It's always the result of a bad prompt, though: if you improve the prompt so that the model understands the task better, there will be a clear difference in the scores the tokens get, and the output seems less random.
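As a toy illustration (invented numbers, not a real tokenizer): when the grammar only allows "true" or "false" next, greedy sampling just takes the argmax, however thin the margin.

  package main

  import "fmt"

  func main() {
    // Hypothetical scores when the model is unsure: both allowed
    // tokens sit near zero, so the winner is essentially noise.
    allowed := map[string]float64{"true": 0.013, "false": 0.011}
    best, bestScore := "", -1.0
    for tok, p := range allowed {
      if p > bestScore {
        best, bestScore = tok, p
      }
    }
    fmt.Println(best) // "true", but only by a 0.002 margin
  }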


It's not just the prompt that matters, it's also field order (and a bunch of other things).

Imagine you're asking your model to give you a list of tasks mentioned in a meeting, along with a boolean indicating whether the task is done. If you put the boolean first, the model must decide both what the task is and whether it is done at the same time. If you put the task description first, the model can separate that work into two distinct steps.

There are more tricks like this. It's really worth thinking about which calculations you delegate to the model and which you do in code, and how you integrate the two.
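A sketch of the tasks example in Go, with hypothetical field names; generation follows field order, so the description gets produced before the model commits to the boolean:

  // Task is a hypothetical structured-output schema.
  type Task struct {
    Description string `json:"description"` // written first: the "thinking" step
    Done        bool   `json:"done"`        // decided after the description exists
  }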


It should have always worked this way. Without this feature you take the algebra out of relational algebra. That's the root of most of the composition issues in SQL.

Sadly it's a few decades too late, and it just fragments the "ecosystem" further.


The best strategy is to write a helper function in the console to click for you. Then invest heavily in the DVDs, DVD bounce rate, stimulation per bounce, and general SPS increases.

I reached several quintillion stimulation, at which point I was offered the chance to purchase "go to the beach" for 2 million. This ends the game and plays a relaxing beach video.

You too can get to the beach in just 5 minutes.


This is a beautiful comment, and I couldn't agree more.

There is only one standard of accomplishment and it's set by people like Mozart.

Accepting that is humbling, but it's required to know yourself and grow. My contributions probably won't amount to much, but Mozart (et al) have shown us what good looks like and it's fun to strive.


And thank you very much for your most welcome comment. It's nice to know one's posts are read, and that some appreciate them. :-)

