[Graph] change of inplace-type setting method
The method of setting the in-place type has been redefined.

In-place processing becomes complicated because a multi-out layer shares its output variables, so whether a following layer can run in-place must be considered.

To simplify the problem, the only layers that may run in-place after a multi-out layer are no-operation (no-op) layers. These no-op layers include the identity, reshape, and flatten layers.

Other layers, even if they support in-place execution, cannot run in-place when a multi-out layer precedes them.

Note that no-op layers connected to a multi-out layer share memory with it, so they take on the same properties as the multi-out layer. This is expressed as RESTRICTING in our code.

Based on these definitions, I've redesigned the method of setting the in-place type.

1. By default, each layer's in-place type is initialized as follows: if supportInPlace is true, it is initialized to NON_RESTRICTING; otherwise, to NONE.
2. However, not all layers are initialized this way. Multi-out layers and no-op layers are initialized to RESTRICTING if supportInPlace is true (a no-op layer is later changed to NON_RESTRICTING if it is not connected to a multi-out layer).
3. After initialization, the input connections are checked in network_graph.cpp to determine the final in-place type. See the source code for the details of this step.
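The three steps above can be sketched as follows. This is an illustrative, simplified model of the scheme, not the actual nntrainer implementation: the `Node` struct, its flags, and `finalizeInPlaceTypes` are hypothetical names, and the graph is assumed to be visited in topological order.

```cpp
#include <cassert>
#include <vector>

// Simplified model of the new scheme (hypothetical names, not nntrainer API).
enum class InPlaceType { NONE, RESTRICTING, NON_RESTRICTING };

struct Node {
  bool support_in_place;   // Layer::supportInPlace()
  bool is_multi_out;       // multi-out layer
  bool is_no_op;           // identity / reshape / flatten
  std::vector<int> inputs; // indices of input nodes (already visited)
  InPlaceType type = InPlaceType::NONE;
};

// Steps 1 and 2: per-layer initialization.
InPlaceType initializeInPlaceType(const Node &n) {
  if (!n.support_in_place)
    return InPlaceType::NONE;
  if (n.is_multi_out || n.is_no_op)
    return InPlaceType::RESTRICTING;
  return InPlaceType::NON_RESTRICTING;
}

// Step 3: resolve the final type from the input connections.
void finalizeInPlaceTypes(std::vector<Node> &graph) {
  for (Node &n : graph) {
    n.type = initializeInPlaceType(n);
    if (n.type == InPlaceType::NONE || n.is_multi_out)
      continue;
    bool after_restricting = false;
    for (int i : n.inputs)
      after_restricting |= (graph[i].type == InPlaceType::RESTRICTING);
    if (n.is_no_op)
      // a no-op layer keeps RESTRICTING only when fed by a multi-out layer
      // (or another RESTRICTING no-op); otherwise it is demoted
      n.type = after_restricting ? InPlaceType::RESTRICTING
                                 : InPlaceType::NON_RESTRICTING;
    else if (after_restricting)
      // an ordinary in-place layer cannot run in-place after a multi-out
      n.type = InPlaceType::NONE;
  }
}
```

For example, an ordinary in-place-capable layer fed by a multi-out node ends up NONE, while a reshape fed by the same node stays RESTRICTING because it shares that node's memory.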

**Self evaluation:**
1. Build test:   [X]Passed [ ]Failed [ ]Skipped
2. Run test:     [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: Seungbaek Hong <[email protected]>
baek2sm committed Nov 19, 2024
1 parent ae8567d commit f9d8b06
Showing 8 changed files with 169 additions and 167 deletions.
222 changes: 69 additions & 153 deletions nntrainer/graph/network_graph.cpp


16 changes: 14 additions & 2 deletions nntrainer/layers/flatten_layer.h
@@ -28,8 +28,9 @@ class FlattenLayer : public ReshapeLayer {
/**
* @brief Constructor of Flatten Layer
*/
FlattenLayer() : ReshapeLayer(), flatten_props(
props::StartDimension(), props::EndDimension()) {}
FlattenLayer() :
ReshapeLayer(),
flatten_props(props::StartDimension(), props::EndDimension()) {}

/**
* @brief Destructor of Flatten Layer
@@ -58,6 +59,17 @@ class FlattenLayer : public ReshapeLayer {
*/
void setProperty(const std::vector<std::string> &values) override;

/**
* @brief Initialize the in-place type of the layer
* @return InPlaceType
*/
  InPlaceType initializeInPlaceType() final {
    if (!supportInPlace())
      return InPlaceType::NONE;
    else
      return InPlaceType::RESTRICTING;
  }

/**
* @copydoc Layer::exportTo(Exporter &exporter, ml::train::ExportMethods
* method)
12 changes: 12 additions & 0 deletions nntrainer/layers/identity_layer.h
@@ -21,6 +21,7 @@ namespace nntrainer {

/**
* @class Identity Layer
* @brief Identity Layer
 * @note Identity layer takes multiple tensors as input and redirects them to
 * output without doing anything (or, if unavoidable, copying)
*/
@@ -73,6 +74,17 @@ class IdentityLayer final : public Layer {
*/
bool supportInPlace() const override { return true; }

/**
* @brief Initialize the in-place type of the layer
* @return InPlaceType
*/
  InPlaceType initializeInPlaceType() final {
    if (!supportInPlace())
      return InPlaceType::NONE;
    else
      return InPlaceType::RESTRICTING;
  }

/**
* @copydoc Layer::getType()
*/
28 changes: 28 additions & 0 deletions nntrainer/layers/layer_devel.h
@@ -41,6 +41,18 @@ class InitLayerContext;
class RunLayerContext;
class Exporter;

/**
* @brief Enum class for the various types of inplace modes supported by layer
*
*/
enum class InPlaceType {
NONE, /**< layer is not inplace */
RESTRICTING, /**< layer is in-place and does place restriction on layers
ahead of it to be in-place */
NON_RESTRICTING /**< layer is in-place and does NOT place restriction on the
layers ahead of it to be in-place */
};

/**
* @class Layer Base class for layers
* @brief Base class for all layers
@@ -248,6 +260,22 @@ class Layer {
*/
virtual bool supportInPlace() const { return false; }

/**
* @brief Initialize the in-place type of the layer
 * @details If the layer supports in-place execution, the default in-place
 * type is NON_RESTRICTING; however, if any of its input layers is of the
 * RESTRICTING type, it is set to NONE in network_graph.cpp.
* Layers with exceptional behavior such as No-Operation layers should
* override this function.
* @return InPlaceType
*/
  virtual InPlaceType initializeInPlaceType() {
    if (!supportInPlace())
      return InPlaceType::NONE;
    else
      return InPlaceType::NON_RESTRICTING;
  }

/**
* @brief check if this layer requires label to be passed
* @note if requireLabel() == true means, for now, that it is endpoint of a
14 changes: 14 additions & 0 deletions nntrainer/layers/layer_node.cpp
@@ -930,6 +930,20 @@ bool LayerNode::supportInPlace() const {
return layer->supportInPlace();
}

/**
* @brief Initialize the in-place type of the layer
* @return InPlaceType
*/
InPlaceType LayerNode::initializeInPlaceType() {
  inplace_type = layer->initializeInPlaceType();
  return inplace_type;
}

/**
* @brief check if this layer requires label to be passed
*/
22 changes: 10 additions & 12 deletions nntrainer/layers/layer_node.h
@@ -55,18 +55,6 @@ class Packed;
class LossScaleForMixed;
} // namespace props

/**
* @brief Enum class for the various types of inplace modes supported by layer
*
*/
enum class InPlaceType {
NONE, /**< layer is not inplace */
RESTRICTING, /**< layer is in-place and does place restriction on layers
ahead of it to be in-place */
NON_RESTRICTING /**< layer is in-place and does NOT place restriction on the
layers ahead of it to be in-place */
};

/**
* @class LayerNode class
* @brief layer node class for the graph
@@ -365,6 +353,16 @@ class LayerNode final : public ml::train::Layer, public GraphNode {
*/
bool supportInPlace() const;

/**
* @brief Initialize the in-place type of the layer
 * @details If the layer supports in-place execution, the default in-place
 * type is NON_RESTRICTING; however, if any of its input layers is of the
 * RESTRICTING type, it is set to NONE in network_graph.cpp.
* Layers with exceptional behavior such as No-Operation layers should
* override this function.
* @return InPlaceType
*/
  InPlaceType initializeInPlaceType();

/**
* @brief Notify that this layer will execute in-place
*
11 changes: 11 additions & 0 deletions nntrainer/layers/multiout_layer.h
@@ -73,6 +73,17 @@ class MultiOutLayer : public Layer {
*/
bool supportBackwarding() const override { return true; };

/**
* @brief Initialize the in-place type of the layer
* @return InPlaceType
*/
  InPlaceType initializeInPlaceType() final {
    if (!supportInPlace())
      return InPlaceType::NONE;
    else
      return InPlaceType::RESTRICTING;
  }

/**
* @copydoc Layer::supportInPlace()
*/
11 changes: 11 additions & 0 deletions nntrainer/layers/reshape_layer.h
@@ -78,6 +78,17 @@ class ReshapeLayer : public Layer {
*/
bool supportInPlace() const override { return true; }

/**
* @brief Initialize the in-place type of the layer
* @return InPlaceType
*/
  InPlaceType initializeInPlaceType() override {
    if (!supportInPlace())
      return InPlaceType::NONE;
    else
      return InPlaceType::RESTRICTING;
  }

/**
* @copydoc Layer::exportTo(Exporter &exporter, ml::train::ExportMethods
* method)
