Machine learning primitives in rustc -- An Opportunity

The language plugin concept looks really neat, and doable with little effort and change to the existing language. Although for me personally it doesn’t eliminate the strain caused by meshing immediate and deferred execution.

I think it’s an interesting mid-term solution to improve the ergonomics of interacting with these kinds of libraries. But I also find it interesting to explore the fully integrated angle.

I’ll try to outline the caveats from the (excellent!) article on Julia Lang linked in the initial post:

Bundled libraries optimize for specific, generally high-level approaches to machine learning. A library may provide excellent support for some feature X, but researchers frequently violate the assumptions that make that feature perform well. This hints at the need for even lower-level primitives as first-class citizens.

Building and reasoning around “deferred execution” (my terminology) is hard on both the programmer and the compiler. The best example, I think, is step-through debugging. While it’s conceivable that you could build a sophisticated language and runtime that supports this, you always end up with two different contexts: the immediate one and the deferred one. Two separate domains and languages, each with its own limitations. Some models could benefit from more generalized language constructs (loops, recursion, …), but those are hard to fit into the sometimes esoteric, specialized language.
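To make the two contexts concrete, here is a minimal Rust sketch (the `Expr` type and `eval` function are hypothetical, purely for illustration, not from any real library): immediate code computes a value right away, while deferred code only builds a description of the computation that a separate interpreter walks later. That interpreter is the second language a step-through debugger can’t see into in terms of your original source.

```rust
// A minimal sketch of deferred execution. `Expr` and `eval` are
// hypothetical illustrations, not a real library API.

#[derive(Debug)]
enum Expr {
    Const(f64),
    Add(Box<Expr>, Box<Expr>),
    Mul(Box<Expr>, Box<Expr>),
}

// The "deferred" side: a second mini-language with its own
// interpreter, separate from the host language's execution.
fn eval(e: &Expr) -> f64 {
    match e {
        Expr::Const(v) => *v,
        Expr::Add(a, b) => eval(a) + eval(b),
        Expr::Mul(a, b) => eval(a) * eval(b),
    }
}

fn main() {
    // Immediate execution: the value exists as soon as the line runs.
    let immediate = (2.0 + 3.0) * 4.0;

    // Deferred execution: this only builds a graph; nothing is
    // computed until `eval` interprets it.
    let graph = Expr::Mul(
        Box::new(Expr::Add(
            Box::new(Expr::Const(2.0)),
            Box::new(Expr::Const(3.0)),
        )),
        Box::new(Expr::Const(4.0)),
    );
    let deferred = eval(&graph);

    assert_eq!(immediate, deferred); // both evaluate to 20.0
    println!("{}", deferred);
}
```

A fully integrated language would collapse these two layers, so the compiler (and debugger) sees one program instead of a host program plus an opaque graph.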

This is partly why the article provokes you to think about a fully integrated language. And in my opinion, one that might depend on CUDA/OpenCL when generating code :). It could also be that TensorFlow becomes such a strong de facto standard that it provides a low-level virtual machine which can be targeted as a general-purpose backend.

