ted_dunning's comments | Hacker News

Sorta kinda.

TLDR: historical brine production and modern wetlands restoration.

https://en.wikipedia.org/wiki/San_Francisco_Bay_Salt_Ponds


That is merely medieval times.

In ancient times, floats were all 60 bits and there was no single precision.

See page 3-15 of this manual: https://caltss.computerhistory.org/archive/6400-cdc.pdf


I see their 60-bit float has the same size exponent (11 bits) as today's doubles. Only the mantissa was smaller, 48 bits instead of 52.
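
A quick way to feel the precision gap (a back-of-the-envelope sketch, treating the relative spacing near 1.0 as 2^-48 for the CDC coefficient and DBL_EPSILON = 2^-52 for doubles, ignoring hidden-bit details):

    #include <stdio.h>
    #include <math.h>
    #include <float.h>

    int main(void) {
        /* relative spacing of representable values just above 1.0 */
        printf("CDC 6600, 48-bit coefficient: %e\n", ldexp(1.0, -48)); /* ~3.6e-15 */
        printf("IEEE 754 double, 52-bit mantissa: %e\n", DBL_EPSILON); /* ~2.2e-16 */
        /* the gap is 2^4 = 16x, a bit over one decimal digit */
        return 0;
    }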

That written document is prehistoric.

By definition, a document that is written is historic, not prehistoric.

Prehistoric information could be preserved by an oral tradition until it is recorded in a document (like the Oral Histories at the Computer History Museum site).


Julia has full IEEE 754 rounding mode support.
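
For illustration, here's the same idea in C via <fenv.h> rather than Julia (a minimal sketch; the volatiles defeat constant folding, and some compilers also want a flag like -frounding-math):

    #include <stdio.h>
    #include <fenv.h>

    int main(void) {
        volatile double one = 1.0, three = 3.0;  /* force runtime division */

        fesetround(FE_DOWNWARD);                 /* round toward -infinity */
        printf("down: %.17g\n", one / three);

        fesetround(FE_UPWARD);                   /* round toward +infinity */
        printf("up:   %.17g\n", one / three);

        fesetround(FE_TONEAREST);                /* restore the default */
        return 0;
    }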

And none of that improves the throughput of clinical trials. It just decreases the cost of coming up with things to put into trials.

Actually, this AI compute is not very useful for physics, protein folding, or many other high-performance computing workloads.

The problem is that the connectivity required for much of AI is very different from that required for classic HPC (more emphasis on bandwidth, less on super-low-latency small-payload remote memory operations), and the numeric emphasis is very different (lots of mixed precision and lots of ridiculously small numeric resolutions like fp8, vs. almost all fp64 with some fp32).
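
For a sense of just how small fp8 is, here's a sketch decoding it (assuming the OCP FP8 E4M3 convention: bias 7, no infinities, NaN only at exponent=15 with mantissa=7):

    #include <stdio.h>
    #include <math.h>

    /* Decode one E4M3 byte: 1 sign bit, 4 exponent bits (bias 7),
       3 mantissa bits. */
    double e4m3_to_double(unsigned char b) {
        int sign = (b >> 7) & 1;
        int exp  = (b >> 3) & 0xF;
        int man  =  b       & 0x7;
        double v;
        if (exp == 0xF && man == 0x7)
            return NAN;                          /* the only NaN encodings */
        if (exp == 0)
            v = ldexp(man / 8.0, -6);            /* subnormal */
        else
            v = ldexp(1.0 + man / 8.0, exp - 7); /* normal */
        return sign ? -v : v;
    }

    int main(void) {
        printf("max finite E4M3: %g\n", e4m3_to_double(0x7E));  /* 448 */
        printf("step above 1.0:  %g\n",
               e4m3_to_double(0x39) - e4m3_to_double(0x38));    /* 0.125 */
        return 0;
    }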

The result is that essentially no AI computers reach the high end of the TOP500.

The converse is also true: classic frontier-scale supercomputers don't make the most cost-effective AI training platforms because they spend a lot of their budget on making HPC programs fast.


AlphaFold (protein folding) was trained on Google's TPUs, which are not GPUs, true, but are very close.

Flow simulation also happens on GPUs rather than CPUs, though.

El Capitan is #1 on the TOP500, and its flops ratio between CPU and GPU is nearly 1 to 100.


You should mark sarcasm as subtle as this.

This is a very good point.

Oxygen, nitrogen, CO2 and argon make up 99.94% of the atmosphere. The remaining 0.06% contains the atmosphere's ~5 ppm of helium, which makes that residual gas nearly 1% helium. That's up roughly 1,700x from the original concentration and is well above the 0.3% that is sometimes quoted as the limit for economic extraction of helium (and well below the 7% of some natural gas).
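
The arithmetic, as a quick sanity check (assuming ~5.2 ppm atmospheric helium and a 0.06% residual fraction):

    #include <stdio.h>

    int main(void) {
        double he_in_air = 5.2e-6;  /* ~5.2 ppm helium in the atmosphere */
        double residual  = 6.0e-4;  /* the 0.06% left after O2, N2, CO2, Ar */
        /* all the helium ends up in the residual stream */
        printf("He in residual gas: %.2f%%\n", 100 * he_in_air / residual); /* ~0.87% */
        printf("enrichment factor:  %.0fx\n", 1.0 / residual);              /* ~1667x */
        return 0;
    }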

Furthermore, the leftover gas is also already cold. While it's absolutely true that 85 K isn't very close to the boiling point of helium, it's a lot closer than starting at the temperature of gas at the wellhead.

The gotcha is almost certainly going to be that an ASU (air separation unit) probably doesn't liquefy most of the gas it takes in. That means that the exhaust gas will only be slightly enriched.


The article seems to be saying that this is true unless you implicitly, and somewhat invisibly, grant access via the file picker.

Wow... that would be great.

All that remains is an algorithm to reliably determine which programs do "shady shit". How do you determine that Microsoft updates have not been tampered with?

(Insincere) apologies for the snarky tone. You are making light of a very hard problem, and "deny by default until confirmed by the user" isn't a bad first approximation.


Why should the permission even persist that long? You might leave that app running for the next two years.

Shouldn't temporary access be temporary? Possibly scoped by time? Possibly scoped to a single access?
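
A hypothetical sketch of what such scoping could look like (all names here are invented for illustration, not any real OS API):

    #include <stdbool.h>
    #include <stdio.h>
    #include <time.h>

    /* Hypothetical grant record: access lapses after an absolute
       deadline or a fixed number of uses, whichever comes first. */
    struct grant {
        time_t expires_at;   /* time scope  */
        int    uses_left;    /* count scope */
    };

    bool grant_check(struct grant *g) {
        if (time(NULL) > g->expires_at) return false;
        if (g->uses_left <= 0)          return false;
        g->uses_left--;      /* consume one use */
        return true;
    }

    int main(void) {
        struct grant g = { time(NULL) + 3600, 1 };  /* 1 hour, single access */
        printf("first access:  %s\n", grant_check(&g) ? "ok" : "denied");
        printf("second access: %s\n", grant_check(&g) ? "ok" : "denied");
        return 0;
    }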


Because the app may generate more than one descriptor for it, or perform more than one read or write operation, in the normal course of usage. If I open a document and come back to it 6 hours later and click the save button, I would expect it to save the document.
