
Link to dev List discussion

TBD

Feature Shepherd

TBD

Problem

Currently, MXNet only supports custom operators written in higher-level languages (e.g., Python, Java/Scala) via the Custom Op interface: https://mxnet.incubator.apache.org/versions/master/tutorials/gluon/customop.html?highlight=customop. This makes it complicated to add high-performance routines written in C++ and CUDA. One solution was the MobulaOP project: https://github.com/wkcn/MobulaOP, which enabled a seamless experience for loading such high-performance C++ and CUDA routines, built on top of the Custom Op interface. That project was very successful, and we propose to integrate its concepts and design directly into MXNet. Here, however, we will implement the CustomOp support and dynamic library loader in the MXNet engine itself, so that custom high-performance ops can be leveraged from all language bindings and without the overhead of the engine using callbacks at runtime.
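
To make the target concrete, here is a minimal sketch of what a custom op compiled into a standalone shared library could look like. The struct layout and symbol name (MXTensor, myrelu_forward) are illustrative assumptions for this proposal, not a finalized ABI:

    // A minimal sketch, assuming a plain C ABI at the library boundary.
    #include <cstdint>

    // Minimal tensor view the engine could pass across the library boundary.
    struct MXTensor {
      void* data;       // raw data pointer owned by MXNet
      int64_t* shape;   // shape array
      int ndim;         // number of dimensions
      int dtype;        // element type enum (0 = float32 in this sketch)
    };

    // Forward pass: elementwise ReLU, written against the plain C view above
    // so the library needs no MXNet headers or link-time dependencies.
    extern "C" int myrelu_forward(const MXTensor* in, MXTensor* out) {
      int64_t n = 1;
      for (int i = 0; i < in->ndim; ++i) n *= in->shape[i];
      const float* x = static_cast<const float*>(in->data);
      float* y = static_cast<float*>(out->data);
      for (int64_t i = 0; i < n; ++i) y[i] = x[i] > 0.f ? x[i] : 0.f;
      return 0;  // 0 = success
    }

Because the entry point is a plain extern "C" function over a C-compatible struct, the library could be built with any C++ compiler and loaded by any language binding.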

User Experience

Similar to the ideas presented in the Bring Your Own Accelerator proposal and the user experience that MobulaOP currently provides, we want to make it easy to load operator libraries dynamically at runtime. One benefit of writing custom ops is that you do not need to recompile MXNet, so we also want to provide an easy-to-use build flow that compiles custom operators into libraries without a ton of external dependencies.
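
As a sketch of the loading mechanism, under the assumption that the engine resolves entry points with dlopen/dlsym on POSIX systems (the library path and symbol name are carried over from the sketch above):

    #include <dlfcn.h>
    #include <iostream>

    // Signature of the entry point exported by the custom op library; the
    // real argument types would be the tensor view defined by the engine.
    typedef int (*op_fn)(const void* in, void* out);

    int main() {
      // Open the user-built library at runtime -- no MXNet recompile needed.
      void* lib = dlopen("./libmyrelu.so", RTLD_LAZY | RTLD_LOCAL);
      if (!lib) { std::cerr << dlerror() << "\n"; return 1; }

      // Resolve the extern "C" symbol exported by the library.
      op_fn forward = reinterpret_cast<op_fn>(dlsym(lib, "myrelu_forward"));
      if (!forward) { std::cerr << dlerror() << "\n"; return 1; }

      // ... here the engine would register `forward` as a first-class
      // operator, making it visible to all language bindings ...
      dlclose(lib);
      return 0;
    }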

However, we will aim to balance "simplified build/limited dependencies" against "ease of writing custom operators". For example, many custom operators need to execute basic tensor operations like addition and dot products, and it would be redundant and complicated for custom op authors to have to rewrite these core routines; one possible design for this is sketched below.
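
One possible way to strike this balance (a sketch, not a committed design) is for the engine to hand each custom op a small table of function pointers to core routines, so the library reuses MXNet's tensor math without compiling or linking against MXNet; all names below (CoreOps, mylinear_forward) are assumptions:

    #include <cstdint>

    // Table of core routines the engine fills in and passes to every custom
    // op, so op authors do not have to rewrite (or link) basic tensor math.
    struct CoreOps {
      // dot(a[m,k], b[k,n]) -> out[m,n] for float32, provided by the engine
      int (*dot)(const float* a, const float* b, float* out,
                 int64_t m, int64_t k, int64_t n);
      // elementwise add over n values
      int (*add)(const float* a, const float* b, float* out, int64_t n);
    };

    // The custom op calls through the table instead of reimplementing dot.
    extern "C" int mylinear_forward(const float* x, const float* w, float* y,
                                    int64_t m, int64_t k, int64_t n,
                                    const CoreOps* core) {
      return core->dot(x, w, y, m, k, n);  // reuse the engine's dot routine
    }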

Lastly, we want custom operators to be first-class operators with access to all the capabilities that internal MXNet operators have. One example is enabling custom operators to leverage the MXNet resource manager for storage and memory.
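
As a sketch of what that could look like, the engine might pass custom ops an allocation callback backed by the resource manager, so temporary workspace comes from MXNet rather than ad-hoc malloc/cudaMalloc calls; the callback signature below is assumed for illustration:

    #include <cstdint>
    #include <cstring>

    // Engine-provided allocator, hypothetically backed by the MXNet resource
    // manager; returns engine-owned scratch space valid only for this call.
    typedef void* (*alloc_workspace_fn)(void* engine_ctx, std::size_t nbytes);

    extern "C" int mycopy_forward(const float* in, float* out, int64_t n,
                                  void* engine_ctx, alloc_workspace_fn alloc) {
      // Request scratch memory from the engine instead of calling malloc.
      float* tmp = static_cast<float*>(alloc(engine_ctx, n * sizeof(float)));
      if (!tmp) return -1;                       // allocation failed
      std::memcpy(tmp, in, n * sizeof(float));   // stage through workspace
      std::memcpy(out, tmp, n * sizeof(float));
      return 0;  // the engine reclaims tmp after the call in this sketch
    }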

Goals/Use Cases

MXNet Java Inference API#Goals

Open Questions

Proposed Approach

MXNet Java Inference API#ProposedApproach

MXNet Java Inference API#ClassDiagram

MXNet Java Inference API#SequenceDiagram

Addition of New APIs

Backward Compatibility

Performance Considerations

Test Plan

Alternative Approaches

MXNet Scala API Usability Improvement#AlternativeApproachconsidered

Technical Challenges 

MXNet Scala API Usability Improvement#TechnicalChallenges

Milestones

References
