I hit a "stack level too deep" error while running jekyll in the jekyll-docker container (envygeeks/jekyll-docker#212), which uses musl as its libc. The maintainers there said the bug belongs here. Assuming that rb-inotify owns the offending thread, I think you should be able to set the stack size at thread creation time.
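For context, this is straightforward at the pthread level: a thread's stack size can be requested explicitly when the thread is created, which sidesteps the libc default. Below is a minimal C sketch of that mechanism only (the worker function and the 8 MiB figure are arbitrary illustrations, not anything rb-inotify does today); whether MRI exposes an equivalent per-thread knob to a pure-Ruby gem is a separate question.

/* Sketch of setting a thread's stack size explicitly at creation time.
 * Illustrative only; compile with: cc stack_demo.c -pthread */
#include <pthread.h>
#include <stdio.h>

static void *worker(void *arg) {
    /* deep recursion / watcher setup would happen here */
    return NULL;
}

int main(void) {
    pthread_attr_t attr;
    pthread_t tid;

    pthread_attr_init(&attr);
    /* Request an 8 MiB stack up front instead of relying on the libc default. */
    pthread_attr_setstacksize(&attr, 8 * 1024 * 1024);

    if (pthread_create(&tid, &attr, worker, NULL) != 0) {
        perror("pthread_create");
        return 1;
    }
    pthread_join(tid, NULL);
    pthread_attr_destroy(&attr);
    return 0;
}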
Description
We use jekyll/jekyll:pages for local testing of our project's documentation. Recently, local builds started failing with a stack overflow.
It reproduces with the latest image:
$ docker pull jekyll/jekyll:pages
pages: Pulling from jekyll/jekyll
...
Digest: sha256:0c0c3585c91bbb07328d1f32a443d2ce2a92e3ddf5e0ce5d385dd91c4d96125a
Status: Downloaded newer image for jekyll/jekyll:pages
I tried setting the ulimit and Ruby stack size:
docker run ... --ulimit stack=64000000 -e RUBY_THREAD_VM_STACK_SIZE=64000000 ...
but it didn't seem to help.
I noticed that the image was compiled with musl as its libc and found this in the musl docs: https://wiki.musl-libc.org/functional-differences-from-glibc.html#Thread_stack_size
It sounds like musl uses an unusually small default thread stack, and it is not configurable via ulimit or an environment variable.
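If it helps confirm that diagnosis, a small C program can print the stack size a default-created thread actually receives. This is only a sketch, assuming a C toolchain is available inside the musl-based image (compile with cc check_stack.c -pthread); the reported value should stay small regardless of ulimit -s.

/* Print the stack size of a thread created with default attributes.
 * On musl the default is a small fixed value (roughly 128 KiB in recent
 * versions) and is not affected by ulimit -s. */
#define _GNU_SOURCE
#include <pthread.h>
#include <stdio.h>

static void *report(void *arg) {
    pthread_attr_t attr;
    size_t stack_size = 0;

    /* pthread_getattr_np fills in the attributes of the running thread. */
    if (pthread_getattr_np(pthread_self(), &attr) == 0) {
        pthread_attr_getstacksize(&attr, &stack_size);
        printf("worker thread stack size: %zu bytes\n", stack_size);
        pthread_attr_destroy(&attr);
    }
    return NULL;
}

int main(void) {
    pthread_t tid;
    pthread_create(&tid, NULL, report, NULL);  /* default attributes */
    pthread_join(tid, NULL);
    return 0;
}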
Steps
This seems to reproduce on most (but not all) of my colleagues' machines; it should be as simple as doing this:
git clone https://github.com/projectcalico/calico.git
cd calico
make serve
I was testing on an Ubuntu 17.10 VM.
Output
$ make serve
docker run --rm -ti -e JEKYLL_UID=`id -u` -p 4000:4000 -v $PWD:/srv/jekyll jekyll/jekyll:pages jekyll serve --incremental --config _config.yml
ruby 2.5.0p0 (2017-12-25 revision 61468) [x86_64-linux-musl]
Configuration file: _config.yml
Source: /srv/jekyll
Destination: /srv/jekyll/_site
Incremental build: enabled
Generating...
GitHub Metadata: No GitHub API authentication could be found. Some fields may be missing or have incorrect data.
done in 39.923 seconds.
Traceback (most recent call last):
186: from /usr/gem/bin/jekyll:23:in `<main>'
185: from /usr/gem/bin/jekyll:23:in `load'
184: from /usr/gem/gems/jekyll-3.6.2/exe/jekyll:15:in `<top (required)>'
183: from /usr/gem/gems/mercenary-0.3.6/lib/mercenary.rb:19:in `program'
182: from /usr/gem/gems/mercenary-0.3.6/lib/mercenary/program.rb:42:in `go'
181: from /usr/gem/gems/mercenary-0.3.6/lib/mercenary/command.rb:220:in `execute'
180: from /usr/gem/gems/mercenary-0.3.6/lib/mercenary/command.rb:220:in `each'
179: from /usr/gem/gems/mercenary-0.3.6/lib/mercenary/command.rb:220:in `block in execute'
... 174 levels...
4: from /usr/gem/gems/rb-inotify-0.9.10/lib/rb-inotify/notifier.rb:190:in `new'
3: from /usr/gem/gems/rb-inotify-0.9.10/lib/rb-inotify/watcher.rb:67:in `initialize'
2: from /usr/gem/gems/rb-inotify-0.9.10/lib/rb-inotify/native/flags.rb:79:in `to_mask'
1: from /usr/gem/gems/rb-inotify-0.9.10/lib/rb-inotify/native/flags.rb:79:in `inject'
/usr/gem/gems/rb-inotify-0.9.10/lib/rb-inotify/native/flags.rb:79:in `each': stack level too deep (SystemStackError)
Makefile:27: recipe for target 'serve' failed
make: *** [serve] Error 1
Expected
No crash; the page should be served as normal.