[coverity] Fix coverity issues
This PR resolves the Coverity issues that were identified.

**Changes proposed in this PR:**
- Specify the return type of the lambda function explicitly, as sketched below
- Use a reference to avoid copying the object
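
As a minimal sketch of the lambda change (using a hypothetical `Conn` type in place of the real `Connection`, so not code from this repository), an explicit `-> const auto &` trailing return type keeps the lambda from returning the name by value:

```cpp
#include <string>

struct Conn {                       // hypothetical stand-in for Connection
  std::string name;
  const std::string &getName() const { return name; }
};

// Deduced return type is std::string, so every call copies the name.
auto copies = [](const Conn &c) { return c.getName(); };

// Explicit trailing return type deduces const std::string &: no copy.
auto refers = [](const Conn &c) -> const auto & { return c.getName(); };
```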

This fixes:
- Use of auto that causes a copy (AUTO_CAUSES_COPY), shown in the sketch just below
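
For context, Coverity reports AUTO_CAUSES_COPY when plain `auto` deduces a value type from an expression that actually yields a reference, as in this minimal sketch (hypothetical `pick` helper, not repository code):

```cpp
#include <string>

// Hypothetical helper mirroring the prefix/scope selection in the diff below.
const std::string &pick(const std::string &a, const std::string &b) {
  return a.empty() ? b : a;
}

int main() {
  std::string name = "fc0", prefix = "net";
  auto scope1 = pick(name, prefix);        // deduces std::string: silently copies
  const auto &scope2 = pick(name, prefix); // binds a const reference: no copy
}
```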

**Self-evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test:   [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: Donghyeon Jeong <[email protected]>
djeong20 committed Jan 31, 2024
1 parent e93fc05 commit d2d8f49
Showing 3 changed files with 8 additions and 6 deletions.
2 changes: 1 addition & 1 deletion nntrainer/layers/layer_context.h
@@ -217,7 +217,7 @@ class InitLayerContext {
                       bool trainable = false,
                       TensorLifespan lifespan = TensorLifespan::ITERATION_LIFESPAN,
                       bool private_ = true) {
-    auto prefix_ = private_ ? this->name : this->prefix;
+    const auto &prefix_ = private_ ? this->name : this->prefix;
     tensors_spec.emplace_back(dim, init, trainable, prefix_ + ":" + name,
                               lifespan);
     return tensors_spec.size() - 1;
9 changes: 5 additions & 4 deletions nntrainer/layers/layer_node.cpp
@@ -305,9 +305,10 @@ const std::vector<std::string> LayerNode::getInputLayers() const {
     std::get<std::vector<props::InputConnection>>(*layer_node_props);
   std::vector<std::string> names;
   names.reserve(input_connections.size());
-  std::transform(input_connections.begin(), input_connections.end(),
-                 std::back_inserter(names),
-                 [](const Connection &con) { return con.getName(); });
+  std::transform(
+    input_connections.begin(), input_connections.end(),
+    std::back_inserter(names),
+    [](const Connection &con) -> const auto & { return con.getName(); });
   return names;
 }

@@ -571,7 +572,7 @@ InitLayerContext LayerNode::finalize(const std::vector<TensorDim> &input_dims,
     layer = std::move(dlayer);
   }
 
-  auto scope = getSharedFrom().empty() ? getName() : getSharedFrom();
+  const auto &scope = getSharedFrom().empty() ? getName() : getSharedFrom();
   float max_norm = 0.0;
   if (!std::get<props::ClipGradByGlobalNorm>(*layer_node_props).empty())
     max_norm = std::get<props::ClipGradByGlobalNorm>(*layer_node_props).get();
3 changes: 2 additions & 1 deletion nntrainer/models/neuralnet.cpp
@@ -1381,7 +1381,8 @@ void NeuralNetwork::print(std::ostream &out, unsigned int flags,
   std::vector<unsigned int> column_size = {20, 20, 20, 20};
   auto print_graph_layer_info =
     [column_size](std::ostream &out, std::vector<std::string> layer_info) {
-      auto trim_string = [](std::string str, unsigned int column_width) {
+      const auto &trim_string = [](std::string str,
+                                   unsigned int column_width) {
         return str.size() < column_width ? str
                                          : str.substr(0, column_width - 1);
       };
