
We don’t have plans for that, but you could try to convert the Markdown source: https://github.com/iclr-blogposts/2025/blob/main/_posts/2025...


Yes, this blog post indeed inspired us to submit ours!


If you are interested in color dithering with different color difference metrics [1], I've implemented just that [2]. You can find an example comparing metrics in my docs [3].

[1]: https://juliagraphics.github.io/Colors.jl/stable/colordiffer...

[2]: https://github.com/JuliaImages/DitherPunk.jl

[3]: https://juliaimages.org/DitherPunk.jl/stable/#Dithering-with...
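To make the "different color difference metrics" idea concrete, here is a minimal pure-Python sketch (not DitherPunk.jl's actual API) of Floyd-Steinberg error diffusion where the palette lookup takes the color-difference metric as a parameter; the `redmean` approximation stands in for the perceptual metrics Colors.jl provides:

```python
# Toy sketch (not DitherPunk.jl's API): Floyd-Steinberg error diffusion
# with a pluggable color-difference metric for the palette lookup.

def euclidean(c1, c2):
    """Plain squared RGB Euclidean distance."""
    return sum((a - b) ** 2 for a, b in zip(c1, c2))

def redmean(c1, c2):
    """Cheap perceptual approximation ("redmean") of color difference."""
    rbar = (c1[0] + c2[0]) / 2
    dr, dg, db = (a - b for a, b in zip(c1, c2))
    return (2 + rbar / 256) * dr**2 + 4 * dg**2 + (2 + (255 - rbar) / 256) * db**2

def nearest(color, palette, metric):
    """Closest palette entry under the chosen metric."""
    return min(palette, key=lambda p: metric(color, p))

def dither(pixels, palette, metric=euclidean):
    """Floyd-Steinberg dithering of a 2D grid of RGB tuples."""
    h, w = len(pixels), len(pixels[0])
    img = [[list(px) for px in row] for row in pixels]
    out = [[None] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            old = img[y][x]
            new = nearest(tuple(old), palette, metric)
            out[y][x] = new
            err = [o - n for o, n in zip(old, new)]
            # Diffuse the quantization error to unvisited neighbors.
            for dx, dy, wgt in ((1, 0, 7/16), (-1, 1, 3/16), (0, 1, 5/16), (1, 1, 1/16)):
                if 0 <= x + dx < w and 0 <= y + dy < h:
                    img[y + dy][x + dx] = [
                        c + e * wgt for c, e in zip(img[y + dy][x + dx], err)
                    ]
    return out
```

Swapping `metric=redmean` (or any other distance function) changes which palette color each pixel snaps to, which is exactly the knob the linked docs explore.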


I wish dex-lang [1] had gotten more traction. It’s JAX without the limitations that come from being a Python DSL. But ML researchers apparently don’t want to touch anything that doesn’t look exactly like Python.

[1]: https://github.com/google-research/dex-lang


It's very rare that an ML project is _only_ the ML parts. A significant chunk of the engineering effort goes into data pipelines and other plumbing. Having access to a widely used general purpose language with plenty of libraries in addition to all the ML libraries is the real reason why everyone goes for Python for ML.


It seems like an experimental research language.

Julia also competes in this domain from a more practical standpoint and has fewer limitations than JAX as I understand it, but is less mature and still working on getting wider traction.


The Julia AD ecosystem is very interesting in that the community is trying to make the entire language differentiable, which is much broader in scope than what Torch and JAX are doing. But unlike Dex, Julia is not a language built from the ground up for automatic differentiation.

Shameless plug for one of my talks at JuliaCon 2024: https://www.youtube.com/live/ZKt0tiG5ajw?t=19747s. The comparison between Python and Julia starts at 5:31:44.


Ah, I had not realized I was corresponding with the author of that talk. I'd followed it back when it was happening, as I'm particularly interested in adopting AD.

Where do you feel Julia is at this point in time (compared to say, JAX or PyTorch) from a practitioner's standpoint?


When it comes to general deep learning, Julia is much less mature than the JAX ecosystem. I think deep learning will be the hardest nut for Julia to crack. The field is moving incredibly fast, and network effects are strong. Julia's strength lies in scientific computing, so I think adoption will come through novel applications of AD/ML in the sciences, rather than trying to catch up with the latest LLM developments.

I'm positive about Julia's future because the developer experience just feels so fun and productive. I always find it impressive how much a small group of self-organized volunteers has been able to achieve. Amazing things could happen if a company like Google or Meta paid a team of full-time engineers to advance the deep learning ecosystem. Fun fact: Julia strongly influenced PyTorch's recent design decisions [1].

[1]: https://dev-discuss.pytorch.org/t/where-we-are-headed-and-wh...


Dex is also missing user-authored composable program transformations, which is one of JAX's hidden superpowers.

So not quite “JAX without limitations” — but certainly without some of the limitations.


This is both its strength and its weakness. As soon as you write a jaxpr interpreter, you lose all the tooling that makes the Python interpreter so mature. For example, stack traces and debugging become black holes. If JAX made it easy to write these transformations without losing Python's benefits, it would be incredible.


Are you talking about custom VJPs/JVPs?


No, I'm talking about custom `Jaxpr` interpreters which can modify programs to do things.
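For readers unfamiliar with the idea, here is a minimal pure-Python analogue (not JAX's actual jaxpr machinery) of what "a program as data plus custom interpreters" means: the program is an explicit list of equations, and different interpreters can evaluate it or compute static properties of it:

```python
# Minimal analogue (not JAX's real jaxpr format) of a program representation
# plus two interpreters: a plain evaluator and an op-counting "transform".

# A program is a list of equations: (output var, op name, input vars).
PROG = [
    ("t1", "mul", ("x", "x")),   # t1 = x * x
    ("t2", "mul", ("t1", "x")),  # t2 = t1 * x
    ("y",  "add", ("t2", "x")),  # y  = t2 + x, i.e. y = x**3 + x
]

OPS = {"mul": lambda a, b: a * b, "add": lambda a, b: a + b}

def evaluate(prog, env):
    """Plain interpreter: run each equation in order against an environment."""
    env = dict(env)
    for out, op, args in prog:
        env[out] = OPS[op](*(env[a] for a in args))
    return env

def count_ops(prog):
    """A 'transformed' interpreter: reinterpret the same program to
    compute a static property (here, the operation count) instead of a value."""
    return sum(1 for _ in prog)
```

The point of the thread above is that once your code runs through `evaluate` rather than the regular Python interpreter, a breakpoint or stack trace inside the loop tells you about the interpreter, not about the user's program.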


It's not about the syntax, it's all the knowledge, tools, existing code, etc that make Python so attractive.


I don't doubt that, but I'm specifically talking about new languages. I've seen far more enthusiasm from ML researchers for Mojo, which doesn't even do automatic differentiation, than for Dex. And to recycle an old HN comment of mine, people are much more eager to learn a functional programming language if it looks like NumPy (I'm talking about JAX here).


Is Mojo actually getting significant uptake in research? I haven't been following closely but new tooling at that layer seems much more useful when cost-optimizing deployment.


Mojo is interesting, because I get to keep all my existing Python code and libraries for free. Then when I need to speed things up I can use Mojo syntax.


As far as I understand, you will only be able to speed up code that was previously written in pure Python. This excludes JAX, PyTorch, NumPy and any other Python package written in C/C++/Rust/Fortran.


Especially if you actually require vector-Jacobian or Jacobian-vector products instead of the full Jacobian.
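To illustrate why JVPs are cheaper than full Jacobians: forward-mode AD with dual numbers computes J @ v for f: R^n -> R^m in a single evaluation pass, never materializing the n-by-m Jacobian. A toy pure-Python implementation (not any library's actual API):

```python
# Toy forward-mode AD via dual numbers: one pass yields a Jacobian-vector
# product without ever building the full Jacobian matrix.

class Dual:
    """Number carrying a value and a directional derivative (tangent)."""
    def __init__(self, val, tan=0.0):
        self.val, self.tan = val, tan
    def __add__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val + o.val, self.tan + o.tan)
    __radd__ = __add__
    def __mul__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        # Product rule propagates the tangent.
        return Dual(self.val * o.val, self.tan * o.val + self.val * o.tan)
    __rmul__ = __mul__

def jvp(f, x, v):
    """Jacobian-vector product of f at point x along direction v."""
    duals = [Dual(xi, vi) for xi, vi in zip(x, v)]
    return [out.tan for out in f(duals)]

# f(x) = [x0 * x1, x0 + x1]; its Jacobian at (2, 3) is [[3, 2], [1, 1]].
f = lambda x: [x[0] * x[1], x[0] + x[1]]
```

Seeding `v` with basis vectors recovers Jacobian columns one at a time; reverse-mode VJPs play the symmetric role for rows, which is why you only pay for the full Jacobian when you actually ask for it.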


There is HNTerm [1], which also has an online demo [2].

[1]: https://github.com/ggerganov/hnterm

[2]: https://hnterm.ggerganov.com


Are there any plans for native autodiff systems in Mojo?


> JAX is just numpy (mostly)

Or you could say JAX is a DSL that looks like numpy to trick Python programmers into using a compiled functional programming language. ;)


I've recently started using Taichi (https://taichi-lang.org/) for numerical codes and the fact it doesn't try to trick you into thinking it's numpy is a nice "feature". ;)


Shhh, don't tell them! ;-)


This is exactly why I would recommend a Hario Switch over a V60 or Aeropress. It makes my mornings much less complicated. However, you will still have to look for filters.


Fun fact: Atari and Tengen are both named after terms from the game of Go, the former describing a stone or group with only one liberty left, i.e. in immediate danger of capture [1], the latter the center point of the board [2].

[1]: https://senseis.xmp.net/?Atari

[2]: https://senseis.xmp.net/?Tengen


And don't forget Sente Technologies, an arcade game company in the 1980s founded by ex-Atari employees. Sente is also a term from Go -- meaning to be in the strategic position where your opponent will have to respond to your attacks.

https://en.wikipedia.org/wiki/Sente_Technologies


I've been playing go for 25 years and never made the connection. Mind blown.

