
[Lilit summer student development] Flamegraph integration #192

Open
wants to merge 18 commits into master

Conversation

oshadura
Collaborator

@eguiraud I tried, but I am confused: it works from the terminal but doesn't generate anything under ctest...

From the rootbench build directory:

 export PATH=$PWD/FlameGraph-prefix/src/FlameGraph/:$PATH
./tools/flamegraph.sh -d . -b root/tmva/tmva/ConvNetCpuBenchmarks -c -m 
.......
----------------------------------------------------------------------
Benchmark                            Time             CPU   Iterations
----------------------------------------------------------------------
BM_ConvolutionalNetwork_CPU 39306567750 ns   38953830955 ns            1
[ perf record: Woken up 12 times to write data ]
[ perf record: Captured and wrote 3,832 MB perf.data (381 samples) ]

and both PNGs (memory and CPU flame graphs) are beautifully generated!

But from ctest:

ctest -R rootbench-fixture-flamegraphcpu-ConvNetCpuBenchmarks
......
69: [ perf record: Woken up 4 times to write data ]
69: [ perf record: Captured and wrote 0,000 MB (null) ]
69: failed to open perf.data: No such file or directory  (try 'perf record' first)
69: ERROR: No stack counts found
1/1 Test #69: rootbench-fixture-flamegraph-ConvNetCpuBenchmarks ...***Failed  119.84 sec

0% tests passed, 1 tests failed out of 1

Total Test time (real) = 119.89 sec

The following tests FAILED:
	 69 - rootbench-fixture-flamegraph-ConvNetCpuBenchmarks (Failed)
Errors while running CTest

The command is the same in both cases, and I can see that the benchmark does actually run...
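
For reference, the basic CPU sampling flow that a wrapper like tools/flamegraph.sh typically builds on is sketched below. The benchmark path is the one used above, the output file names are made up, and flamegraph.pl emits SVG, so the PNGs mentioned here would come from an extra conversion step in the actual script.

    # Rough sketch of the usual perf + FlameGraph pipeline, not the actual tools/flamegraph.sh.
    # Assumes the FlameGraph scripts are on PATH, as in the export above.
    BENCH=./root/tmva/tmva/ConvNetCpuBenchmarks        # benchmark binary (path as above)

    perf record -g -- "$BENCH"                         # sample call stacks, writes ./perf.data
    perf script | stackcollapse-perf.pl > out.folded   # fold the sampled stacks
    flamegraph.pl out.folded > cpu-flamegraph.svg      # render the flame graph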

@oshadura
Collaborator Author

Rebased and readapted from #94

@eguiraud
Member

@oshadura I bet it's a "current working directory" issue. You can check the verbose output with ctest -V -R rootbench-fixture-flamegraphcpu-ConvNetCpuBenchmarks and also check, after the test fails, in which directory the perf.data file was created.
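
Concretely, that check could look like the following (the test name is copied from the log above):

    # From the rootbench build directory: run just this test with verbose output,
    # so the exact command line and the directory it runs in are visible
    ctest -V -R rootbench-fixture-flamegraphcpu-ConvNetCpuBenchmarks

    # After the failure, look for a stray perf.data written elsewhere in the build tree
    find . -name 'perf.data*'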

@vgvassilev
Member

Looks like the behavior of perf record is different when it is called from ctest. Let's aim to merge the script first, and then the rest.
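
If the working-directory theory holds, one way to make the script independent of wherever ctest launches it is to pin perf's input and output to an absolute path anchored at the script itself. A sketch under that assumption, not how tools/flamegraph.sh is actually structured:

    #!/usr/bin/env bash
    # Sketch: anchor all artifacts to the script's own directory instead of the caller's cwd
    OUT_DIR="$(cd "$(dirname "$0")" && pwd)/flamegraph-out"   # hypothetical output directory
    mkdir -p "$OUT_DIR"
    BENCH="$1"                                                # benchmark binary, e.g. whatever -b resolves to

    perf record -g -o "$OUT_DIR/perf.data" -- "$BENCH"
    perf script -i "$OUT_DIR/perf.data" | stackcollapse-perf.pl > "$OUT_DIR/out.folded"
    flamegraph.pl "$OUT_DIR/out.folded" > "$OUT_DIR/cpu-flamegraph.svg"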
