...

// Node payload: a node is either an operator or a variable, and may
// carry recorded autograd state.
class MXNetNodeData {
  std::vector<NDArray*> inputs;
  std::vector<NDArray*> outputs;
  Autograd autograd;
  Operator* op;
  std::string name;
  [...]

 public:
  bool is_operator() const;
  bool is_variable() const;
  bool has_autograd() const;
};

// Edge payload; empty for now, left as an extension point for passes.
class EdgeData {};

// The graph is parameterized on node data, edge data, and index types.
Graph<MXNetNodeData, EdgeData, size_t, uint16_t> g;

// Traversals are provided by the graph itself, e.g. depth-first:
g.DFSVisit([](const MXNetNodeData& x) { ... });


Differentiation and optimization would be implemented as modular, flexible passes over the graph, keeping each transformation encapsulated and maintainable.
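To make this concrete, a pass could be any callable that maps a graph to a transformed graph. The sketch below assumes the Graph template from above; NodeGraph, Pass, and ApplyPasses are hypothetical names used only for illustration, not existing MXNet APIs:

#include <functional>
#include <utility>
#include <vector>

// Hypothetical pass interface (a sketch, not an existing MXNet API):
// a pass consumes a graph and returns the transformed graph, so passes
// compose by simple chaining.
using NodeGraph = Graph<MXNetNodeData, EdgeData, size_t, uint16_t>;
using Pass = std::function<NodeGraph(NodeGraph&&)>;

// Driver that threads a graph through a sequence of passes in order.
NodeGraph ApplyPasses(NodeGraph g, const std::vector<Pass>& passes) {
  for (const auto& pass : passes) {
    g = pass(std::move(g));
  }
  return g;
}

Shape inference, gradient construction, and optimizations would then each be one Pass, composed and run through the same driver.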


For example, one piece of code that would be greatly improved by this is Imperative::Backward. As it stands today, this code breaks the encapsulation of the classes it uses: it holds indices into internal graph data structures and juggles node ids, raw pointers to arrays, and variables. It is extremely complex to follow and reason about. It could instead be broken into distinct passes that augment or mutate the graph, for example running shape inference or adding backward nodes, as sketched below.
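Under the same assumptions as above, the gradient-construction step of Imperative::Backward might become a self-contained pass. AddBackwardNodesPass, AddNode, and MakeGradientNode are illustrative names, not the current MXNet implementation:

#include <utility>
#include <vector>

// Illustrative pass: collect the operator nodes that recorded autograd
// state during a traversal, then append their gradient nodes, rather
// than tracking raw indices and pointers externally.
NodeGraph AddBackwardNodesPass(NodeGraph&& g) {
  std::vector<const MXNetNodeData*> needs_grad;
  g.DFSVisit([&needs_grad](const MXNetNodeData& node) {
    if (node.is_operator() && node.has_autograd()) {
      needs_grad.push_back(&node);
    }
  });
  // For each collected node, construct its gradient node(s) and wire
  // their edges into the graph, e.g. with hypothetical helpers such as:
  //   g.AddNode(MakeGradientNode(*node));
  return std::move(g);
}

Because the pass owns the whole transformation, callers never see node ids or internal indices; they only hand a graph in and get a graph back.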

...