make runtest fail #138

Open
AliRaza21918 opened this issue Jul 14, 2018 · 4 comments
AliRaza21918 commented Jul 14, 2018

OS: Ubuntu 16.04
CPU_ONLY

I made a change at line 56 of "contrastive_loss_layer.cpp" to fix an error during make all:
Dtype dist = std::max(margin - (float)sqrt(dist_sq_.cpu_data()[i]), Dtype(0.0));
The "(float)" cast is what I added.
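For reference, the root cause is that std::max is a template that deduces a single type from both arguments, so mixing double and float fails to compile. A minimal standalone sketch of the pattern (hypothetical hinge_distance helper, not the actual Caffe code):

// Minimal sketch, not Caffe code: why an explicit cast is needed.
// std::sqrt(double) returns double; without a cast, std::max would see
// mixed (double, float) arguments when Dtype = float and fail to deduce.
#include <algorithm>
#include <cmath>

template <typename Dtype>
Dtype hinge_distance(Dtype margin, Dtype dist_sq) {
  // Cast the subtraction result back to Dtype so both std::max
  // arguments have the same type for float and double alike.
  return std::max(Dtype(margin - std::sqrt(double(dist_sq))), Dtype(0.0));
}

int main() {
  float f = hinge_distance(1.0f, 0.25f);   // Dtype = float: 0.5f
  double d = hinge_distance(1.0, 0.25);    // Dtype = double: 0.5
  return (f == 0.5f && d == 0.5) ? 0 : 1;
}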

When I then ran make runtest, the build failed with the following output:

CXX src/caffe/test/test_power_layer.cpp
In file included from src/caffe/test/test_power_layer.cpp:12:0:
./include/caffe/test/test_gradient_check_util.hpp: In instantiation of ‘void caffe::GradientChecker<Dtype>::CheckGradientSingle(caffe::Layer<Dtype>*, const std::vector<caffe::Blob<Dtype>*>&, const std::vector<caffe::Blob<Dtype>*>&, int, int, int, bool) [with Dtype = float]’:
./include/caffe/test/test_gradient_check_util.hpp:208:26:   required from ‘void caffe::GradientChecker<Dtype>::CheckGradientEltwise(caffe::Layer<Dtype>*, const std::vector<caffe::Blob<Dtype>*>&, const std::vector<caffe::Blob<Dtype>*>&) [with Dtype = float]’
src/caffe/test/test_power_layer.cpp:78:5:   required from ‘void caffe::PowerLayerTest<TypeParam>::TestBackward(caffe::PowerLayerTest<TypeParam>::Dtype, caffe::PowerLayerTest<TypeParam>::Dtype, caffe::PowerLayerTest<TypeParam>::Dtype) [with TypeParam = caffe::CPUDevice<float>; caffe::PowerLayerTest<TypeParam>::Dtype = float]’
src/caffe/test/test_power_layer.cpp:167:3:   required from ‘void caffe::PowerLayerTest_TestPowerTwoScaleHalfGradient_Test<gtest_TypeParam_>::TestBody() [with gtest_TypeParam_ = caffe::CPUDevice<float>]’
src/caffe/test/test_power_layer.cpp:170:1:   required from here
./include/caffe/test/test_gradient_check_util.hpp:167:31: error: no matching function for call to ‘max(const double&, float)’
Dtype scale = std::max(
^
In file included from /usr/include/c++/5/algorithm:61:0,
from src/caffe/test/test_power_layer.cpp:1:
/usr/include/c++/5/bits/stl_algobase.h:219:5: note: candidate: template<class _Tp> const _Tp& std::max(const _Tp&, const _Tp&)
max(const _Tp& __a, const _Tp& __b)
^
/usr/include/c++/5/bits/stl_algobase.h:219:5: note: template argument deduction/substitution failed:
In file included from src/caffe/test/test_power_layer.cpp:12:0:
./include/caffe/test/test_gradient_check_util.hpp:167:31: note: deduced conflicting types for parameter ‘const _Tp’ (‘double’ and ‘float’)
Dtype scale = std::max(
^
In file included from /usr/include/c++/5/algorithm:61:0,
from src/caffe/test/test_power_layer.cpp:1:
/usr/include/c++/5/bits/stl_algobase.h:265:5: note: candidate: template<class _Tp, class _Compare> const _Tp& std::max(const _Tp&, const _Tp&, _Compare)
max(const _Tp& __a, const _Tp& __b, _Compare __comp)
^
/usr/include/c++/5/bits/stl_algobase.h:265:5: note: template argument deduction/substitution failed:
In file included from src/caffe/test/test_power_layer.cpp:12:0:
./include/caffe/test/test_gradient_check_util.hpp:167:31: note: deduced conflicting types for parameter ‘const _Tp’ (‘double’ and ‘float’)
Dtype scale = std::max(
^
Makefile:526: recipe for target '.build_release/src/caffe/test/test_power_layer.o' failed
make: *** [.build_release/src/caffe/test/test_power_layer.o] Error 1
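For reference, the failing line mixes types: fabs here resolves to the C function that returns double, while Dtype(1.) is float once the test is instantiated with Dtype = float, so std::max sees (const double&, float) and cannot deduce a single type. A minimal sketch of the clash and one way out (hypothetical gradient_scale helper, not the test code itself):

#include <algorithm>
#include <math.h>

template <typename Dtype>
Dtype gradient_scale(Dtype computed_gradient, Dtype estimated_gradient) {
  // Broken form for Dtype = float (this is the error in the log):
  //   std::max(std::max(fabs(...), fabs(...)), Dtype(1.));
  // ::fabs returns double, so the outer call is max(double, float).
  // Casting the inner result to Dtype makes both arguments agree:
  return std::max(
      Dtype(std::max(::fabs(computed_gradient), ::fabs(estimated_gradient))),
      Dtype(1.));
}

int main() {
  // 2.0f is the larger magnitude, and it exceeds the floor of 1.
  return gradient_scale(0.5f, -2.0f) == 2.0f ? 0 : 1;
}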

AliRaza21918 (Author) commented

Could somebody please help me?

AliRaza21918 (Author) commented

@alexgkendall, could you please help me solve this problem?

vinicius-cleves commented

I solved this by changing line 168 in test_gradient_check_util.hpp to

(float)(std::max(fabs(computed_gradient), fabs(estimated_gradient))), (float)1.);
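That works because both arguments of the outer std::max are then float, so deduction succeeds; note it also narrows the value to float when the tests are instantiated with Dtype = double.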

Villjoie commented

I solved this by changing

Dtype scale = std::max(
    std::max(fabs(computed_gradient), fabs(estimated_gradient)), 1.);

to

Dtype scale = std::max(
    std::max(fabs(computed_gradient), fabs(estimated_gradient)),
    Dtype(1.));
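Another option is to pass the template argument explicitly so std::max performs no deduction at all; a minimal sketch of that approach (hypothetical checker_scale wrapper, not the actual test code):

#include <algorithm>
#include <math.h>

template <typename Dtype>
Dtype checker_scale(Dtype computed_gradient, Dtype estimated_gradient) {
  // With an explicit <Dtype>, std::max performs no deduction; the
  // double result of the inner max simply converts to Dtype.
  return std::max<Dtype>(
      std::max(fabs(computed_gradient), fabs(estimated_gradient)),
      Dtype(1.));
}

int main() {
  return checker_scale(0.5f, -2.0f) == 2.0f ? 0 : 1;
}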
