schobi's comments | Hacker News

It is a neat new idea to selectively adjust the focus distance for different regions of the scene!

- processing: while there is no post-processing, it needs scene depth information, which requires pre-computation, segmentation and depth estimation. It is not a one-shot technique, and quality depends on the computational depth estimates being good.

- no free lunch: the optical setup needs to trade away some light for this cool effect to work. Apart from the limitations of the prototype, how much loss is expected in theory? How does this compare to a regular camera setup with a smaller aperture? f/36 seems excessive as a comparison.

- resolution: what resolution has been achieved? (Maybe not the full 12 megapixels of the sensor? For practical or theoretical reasons?) And what depth range can the prototype capture? ("photo of Paris Arc de Triomphe displayed on a screen") This is suspiciously omitted.

- bokeh: what does it look like when out of focus? At the edge of an object? Introducing weird or unnatural artifacts would seriously limit acceptance.

Don't get me wrong - nice technique! But for my taste, the paper omits fundamental properties.


It even requires depth information -

While this method has no post-processing, it requires a pre-processing step to pre-capture the scene, segment it, estimate depth and compute the depth map.


I'm confused by the claims...

You try to compute something that a supercomputer can't - by not computing it? Instead, the formula is stored in a data structure. But once you need to access all the values, you still have something that does not fit in memory and needs to be computed.

I can't judge the Java side, but I suggest picking a better example of how this can be useful.
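
To illustrate what I mean - a rough Python sketch of how I read "the formula is stored in a data structure", not the actual Coderive runtime:

    # A "lazy array": store the rule, not 10^30 materialized entries
    class LazyArray:
        def __init__(self, rule):
            self.rule = rule            # the formula, kept as data

        def __getitem__(self, i):
            return self.rule(i)         # evaluated only when accessed

    arr = LazyArray(lambda i: "even" if i % 2 == 0 else "odd")
    print(arr[24000])                   # cheap: one evaluation, O(1) memory

    # But touching *all* the elements is still the full amount of compute:
    # for i in range(10**30): arr[i]    # no runtime trick makes this finish

Accessing a single element is fine; iterating over everything is where the claim breaks down for me.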


"Most languages force you to choose: either compute everything upfront (O(n) memory) or write complex lazy evaluation code. Coderive gives you declarative intent with automatic optimization. You write what you mean (for i in [0 to 1Qi]), and the runtime figures out the optimal execution strategy. This is like having a JIT compiler that understands mathematical patterns, not just bytecode."

It only computes what is needed, at the right time. See this output for example:

    Enter file path or press Enter for default [/storage/emulated/0/JavaNIDE/Programming-Language/Coderive/executables/LazyLoop.cod]:
    > Using default file: /storage/emulated/0/JavaNIDE/Programming-Language/Coderive/executables/LazyLoop.cod
    Testing timer() function:
    Timer resolution: 0.023616 ms

    Testing condition evaluation:
    2 % 2 = 0
    2 % 2 == 0 = true
    3 % 2 = 1
    3 % 2 == 0 = false
    24000 % 2 = 0.0
    24000 % 2 == 0 = true

    Conditional formula creation time: 2.657539 ms

    Results:
    arr[2] = even
    arr[3] = odd
    arr[24000] = even
    arr[24001] = odd

    === Testing 2-statement pattern optimization ===
    Pattern optimization time: 0.137 ms
    arr2[3] = 14 (should be 3*3 + 5 = 14)
    arr2[5] = 30 (should be 5*5 + 5 = 30)
    arr2[10] = 105 (should be 10*10 + 5 = 105)

    Variable substitution time: 0.064384 ms
    arr3[4] = 1 (should be 4*2 - 7 = 1)
    arr3[8] = 9 (should be 8*2 - 7 = 9)

    === Testing conditional + 2-statement ===
    Mixed optimization time: 3.253846 ms
    arr4[30] = 30 (should be 30)
    arr4[60] = 121 (should be 60*2 + 1 = 121)

    === All tests completed ===

---

From this unit test:

share LazyLoop {
    share main() {
        // Test timer() first - simplified
        outln("Testing timer() function:")
        t1 := timer()
        t2 := timer()
        outln("Timer resolution: " + (t2 - t1) + " ms")
        outln()

        arr := [0 to 1Qi]
        
        // Test the condition directly
        outln("Testing condition evaluation:")
        outln("2 % 2 = " + (2 % 2))
        outln("2 % 2 == 0 = " + (2 % 2 == 0))
        outln("3 % 2 = " + (3 % 2))
        outln("3 % 2 == 0 = " + (3 % 2 == 0))
        
        // Also test with larger numbers
        outln("24000 % 2 = " + (24K % 2))
        outln("24000 % 2 == 0 = " + (24K % 2 == 0))
        
        // Time the loop optimization
        start := timer()
        for i in [0 to 1Qi] {
            if i % 2 == 0 {
                arr[i] = "even"
            } elif i % 2 == 1 {
                arr[i] = "odd"
            }
        }
        loop_time := timer() - start
        outln("\nConditional formula creation time: " + loop_time + " ms")

        outln("\nResults:")
        outln("arr[2] = " + arr[2])
        outln("arr[3] = " + arr[3])
        outln("arr[24000] = " + arr[24K])
        outln("arr[24001] = " + arr[24001])
        
        // Test the 2-statement pattern optimization with timing
        outln("\n=== Testing 2-statement pattern optimization ===")
        
        arr2 := [0 to 1Qi]
        
        start = timer()
        for i in arr2 {
            squared := i * i
            arr2[i] = squared + 5
        }
        pattern_time := timer() - start
        outln("Pattern optimization time: " + pattern_time + " ms")
        
        outln("arr2[3] = " + arr2[3] + " (should be 3*3 + 5 = 14)")
        outln("arr2[5] = " + arr2[5] + " (should be 5*5 + 5 = 30)")
        outln("arr2[10] = " + arr2[10] + " (should be 10*10 + 5 = 105)")
        
        // Test with different variable names
        arr3 := [0 to 1Qi]
        
        start = timer()
        for i in arr3 {
            temp := i * 2
            arr3[i] = temp - 7
        }
        var_time := timer() - start
        outln("\nVariable substitution time: " + var_time + " ms")
        
        outln("arr3[4] = " + arr3[4] + " (should be 4*2 - 7 = 1)")
        outln("arr3[8] = " + arr3[8] + " (should be 8*2 - 7 = 9)")
        
        // Test that it still works with conditional
        outln("\n=== Testing conditional + 2-statement ===")
        
        arr4 := [0 to 100]
        
        start = timer()
        for i in arr4 {
            if i > 50 {
                doubled := i * 2
                arr4[i] = doubled + 1
            } else {
                arr4[i] = i
            }
        }
        mixed_time := timer() - start
        outln("Mixed optimization time: " + mixed_time + " ms")
        
        outln("arr4[30] = " + arr4[30] + " (should be 30)")
        outln("arr4[60] = " + arr4[60] + " (should be 60*2 + 1 = 121)")

        outln("\n=== All tests completed ===")
    }
}

The Linux mainline kernel just had support for GPIB added: https://hackaday.com/2025/12/16/after-decades-linux-finally-...

Linux GPIB also supports my adapter natively, with a special Linux GPIB firmware you can download from the page above. It then also supports multiple instruments connected to one adapter.

This sounds more like a Matt Parker video idea - get a bunch of people, three theodolites to measure angles accurately, and a good location, then start measuring line-of-sight angles and see how well this determines the Earth's radius.

Rough estimate: with an excellent 0.5" angular resolution and a 35 km triangle, this could work.
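
Back-of-the-envelope in Python, assuming an equilateral 35 km triangle and using the textbook Earth radius only to size the expected effect:

    import math

    side_km = 35.0
    R_km = 6371.0                          # Earth radius, only used to predict the signal
    arcsec = math.pi / (180 * 3600)        # radians per arcsecond

    area = math.sqrt(3) / 4 * side_km**2   # ~530 km^2 for an equilateral triangle
    excess = area / R_km**2                # spherical excess E = A / R^2, in radians
    print(excess / arcsec)                 # ~2.7" - how far the angle sum exceeds 180 degrees

    sigma = 0.5 * math.sqrt(3)             # three angles read to 0.5" each, combined ~0.87"
    print(math.sqrt(area / excess))        # inverting E = A / R^2 recovers R: ~6371 km
    print(sigma / (excess / arcsec))       # ~0.3 relative error on the measured excess

So the angle sum only exceeds 180 degrees by a couple of arcseconds - detectable with 0.5" instruments, but the recovered radius would still be fairly rough.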


It would be interesting to learn how this was created.

Did you buy all these colors, paint them and scan the results? Did you analyze the shop images of the bottles and classify them into hex colors? Or did you maybe just group them by the color names given in the storefront listing?

Vastly different effort, different "accuracy" - but still, each method has its use. Knowing what to expect would be nice, though.


I pulled out the RGB values from the solid color areas of the swatch images on the manufacturers' websites for each of the colors.

It's definitely just an approximation of the real-world color, but I figured that if that's the RGB value the manufacturer used, it's probably pretty close.

Then I calculate the Euclidean distance between the RGB value of the provided hex and each of the paints, and show the two closest matches from each brand.
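
Roughly like this - an illustrative Python sketch with made-up brand data, not the actual site code:

    import math

    # Hypothetical swatch data: hex values pulled per brand
    paints = {
        "BrandA": {"Sunset Red": "#c0392b", "Sky Blue": "#5dade2"},
        "BrandB": {"Crimson": "#b03a2e", "Azure": "#2e86c1"},
    }

    def to_rgb(hex_code):
        h = hex_code.lstrip("#")
        return tuple(int(h[i:i + 2], 16) for i in (0, 2, 4))

    def closest_matches(target_hex, per_brand=2):
        target = to_rgb(target_hex)
        result = {}
        for brand, colors in paints.items():
            ranked = sorted(colors.items(),
                            key=lambda kv: math.dist(target, to_rgb(kv[1])))  # Euclidean distance in RGB
            result[brand] = ranked[:per_brand]
        return result

    print(closest_matches("#c24a3d"))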


Closed-circuit television (CCTV) is a term describing video transmission that is not broadcast - traditionally with BNC cables going to a control room, monitors and recorders.

I think this software-only post is meant for IP cameras / surveillance cameras. The internet is the opposite of a closed circuit.

Maybe CCTV is used as a synonym for surveillance in some regions of the world now, but it is certainly confusing for a non-native speaker.


> I think this software-only post is meant for IP cameras / surveillance cameras. The internet is the opposite of a closed circuit.

I think in this case IP refers to the IP in TCP/IP, i.e. "the Internet Protocol", not necessarily going over or through public internet links. So as long as you stay within your own local network/WAN, wouldn't that still be CCTV? Or maybe the "closed circuit" part is more of a physical property than I read it to be?

I'm also a non-native English speaker, FWIW.


I agree with the overall discussion, but in the last link (Curtis Mobley's Ocean Optics book) the argument is broken.

Mobley plots two graphs of the solar irradiance together with a black-body fit. The only difference is supposed to be the x axis, with the plot drawn over frequency or over wavelength. This is a non-linear mapping, so a different shape is expected. But the result is that the peak irradiance sits at either 501 nm or 882 nm (when converted back). That can't be right - relabeling the x axis does not change where the maximum is.

What he presumably meant to do was to plot the solar irradiance either in W m^-2 nm^-1 or as a number of photons. With lower-energy photons (towards the red), the same irradiance level consists of more photons. This shifts the maximum of the photon count towards longer wavelengths; 880 nm sounds plausible.
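
To make the conversion concrete (quick Python sketch):

    # One photon carries E = h*c / lambda, so the same irradiance per nm
    # corresponds to more photons the longer the wavelength:
    h = 6.626e-34          # Planck constant, J s
    c = 2.998e8            # speed of light, m/s

    def photons_per_s_m2_nm(irradiance_w_m2_nm, wavelength_nm):
        wl = wavelength_nm * 1e-9
        return irradiance_w_m2_nm * wl / (h * c)

    # 1 W m^-2 nm^-1 at 400 nm vs 800 nm: twice the photon flux in the red
    print(photons_per_s_m2_nm(1.0, 400))   # ~2.0e18 photons s^-1 m^-2 nm^-1
    print(photons_per_s_m2_nm(1.0, 800))   # ~4.0e18 photons s^-1 m^-2 nm^-1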


No, see previous discussion here: https://news.ycombinator.com/item?id=44632819


So far it sounds reasonably environmentally friendly, and it's good that they have a pilot plant running.

A few questions remain unanswered, though: What can the current plant already do? It sounds like a multi-day sequential process per batch. How many batteries could that yield?

The mixed-metal product does contain nickel, manganese and cobalt - but certainly along with a lot of other stuff, and not in the exact ratio you would put into a battery, even if we were to continue with NMC batteries (LFP is more common today). It looks like a first concentration step to get at the interesting 10% of the rock. What separation process still remains? I expect a concentrate to still be much more useful than bare rock, though.

What are the overall economics? I understand that you won't need separate mining, as olivine is considered waste and has already been piled up. But is that benefit economic (cheaper)? Environmental? Or time to market (you don't need another mining permit to add capacity)?

Is this just a greener but more expensive extraction from unused olivine? Or will it soon replace all the other dirty extraction and mining? (Too good to be true?)


I would not expect this to become competitive against a low-power controller that sleeps most of the time, as in a typical wristwatch wearable.

However, the examples indicate that if you have a loop that runs over and over, the setup cost of configuring the fabric could be worth it - think of a continuous audio stream for wake-word detection, a hearing aid, or continuous signals from an EEG.

Instead of running a general-purpose CPU at 1 MHz, the fabric would be used to unroll the loop: you use (up to) 100 building blocks for the individual operations. Instead of one instruction after another, you get a pipeline that executes one operation per cycle in each building block. The compute then only needs to run at 1/100 of the clock, e.g. the 10 kHz sampling rate of the incoming data. Each tick of the clock moves data through the pipeline, one step at a time.
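
In toy numbers (my assumptions, not the vendor's):

    ops_per_sample = 100            # operations in the inner loop, one per building block
    sample_rate_hz = 10_000         # incoming data rate (audio / EEG)

    cpu_clock_hz = ops_per_sample * sample_rate_hz    # sequential CPU: 1 MHz
    fabric_clock_hz = sample_rate_hz                   # 100-stage pipeline: 10 kHz
    print(cpu_clock_hz, fabric_clock_hz, cpu_clock_hz // fabric_clock_hz)  # 1000000 10000 100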

I have no insight, but I can imagine how marketing thinks: "Let's build a 10x10 grid of building blocks; if they are all used, the clock can be 1/100... boom - claim up to 100x more efficient!" I hope their savings estimate is more elaborate than that, though...

