
textsampler.lua:38: bad argument #3 #192

Open
ThorJonsson opened this issue Mar 20, 2016 · 0 comments
Hi! I'm getting the following error when running the recurrentlanguagemodel code.

```
FileLogger: log will be written to /home/user/save/comp:1458431781:1/log
==> epoch # 1 for optimizer :
 [======================================== 4000/4000 ==================================>]  Tot: 12s48ms | Step: 4ms
==> example speed = 216.87828023852 examples/s
 [======================================== 24000/24000 ================================>]  Tot: 2m49s | Step: 7ms
/home/user/torch/install/bin/luajit: ...2/torch/install/share/lua/5.1/dp/sampler/textsampler.lua:38: bad argument #3 to 'narrow' (out of range at /home/user/torch/pkg/torch/lib/TH/generic/THTensor.c:351)
stack traceback:
        [C]: in function 'narrow'
        ...2/torch/install/share/lua/5.1/dp/sampler/textsampler.lua:38: in function 'sampler'
        ...torch/install/share/lua/5.1/dp/propagator/propagator.lua:117: in function 'propagateEpoch'
        ...torch/install/share/lua/5.1/dp/propagator/experiment.lua:116: in function 'run'
        lstm_lm_example.lua:360: in main chunk
        [C]: in function 'dofile'
        ...hj92/torch/install/lib/luarocks/rocks/trepl/scm-1/bin/th:145: in main chunk
        [C]: at 0x00405d70
```

I can get around this error by setting `trainOnly = true`.
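For context on what this error means: torch's `Tensor:narrow(dim, start, size)` throws "bad argument #3" whenever `start + size - 1` runs past the end of the tensor along `dim`, which suggests the sampler's slice offset exceeds the length of the validation text at textsampler.lua:38 (the training epoch completes fine, so only the validation pass trips it). Here is a minimal, hypothetical sketch of the failure mode, not the actual TextSampler code:

```lua
require 'torch'

-- stand-in for a validation-text tensor of 100 token ids
local text = torch.range(1, 100)

-- in range: elements 1..20
local ok1 = pcall(function() return text:narrow(1, 1, 20) end)

-- out of range: 95 + 10 - 1 = 104 > 100, raises
-- "bad argument #3 to 'narrow' (out of range ...)"
local ok2 = pcall(function() return text:narrow(1, 95, 10) end)

print(ok1, ok2)  -- true  false
```

Setting `trainOnly = true` skips the validation/test propagators entirely, which is consistent with it hiding rather than fixing the offset problem.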

This is the model:

```
{
   accUpdate : false
   batchSize : 64
   bidirectional : true
   cuda : false
   cutoffNorm : -1
   dataPath : "/home/user/data"
   dataset : "PennTreeBank"
   decayFactor : 0.001
   dropout : true
   dropoutProb : 0.5
   evalSize : 100
   forestGaterSize : "{}"
   hiddenSize : {200,200}
   learningRate : 0.1
   lrDecay : "linear"
   lstm : true
   maxEpoch : 400
   maxOutNorm : 2
   maxTries : 30
   maxWait : 4
   minLR : 1e-05
   momentum : 0
   progress : true
   rho : 20
   saturateEpoch : 300
   schedule : {}
   silent : false
   small : false
   softmaxforest : false
   softmaxtree : false
   testFile : "test.txt"
   tiny : false
   trainEpochSize : 4000
   trainFile : "train.txt"
   trainOnly : false
   uniform : -1
   useDevice : 1
   validEpochSize : 24000
   validFile : "valid.txt"
   xpPath : ""
   zeroFirst : false
}
```
Warning : the Perplexity of a bidirectional RNN/LSTM isn't necessarily mathematically valid as it uses P(x_t|x_{\neq t}) instead of P(x_t|x_{<t}), which is used for unidirectional RNN/LSTMs. You can however still use predictions to measure pseudo-likelihood.
Language Model :

```
nn.Sequential {
  [input -> (1) -> (2) -> (3) -> (4) -> (5) -> output]
  (1): nn.LookupTable
  (2): nn.Dropout(0.5, busy)
  (3): nn.SplitTable
  (4): nn.BiSequencerLM {
    (  fwd  ): nn.Sequential {
      |      [input -> (1) -> (2) -> (3) -> (4) -> output]
      |      (1): nn.Sequencer @ nn.FastLSTM(200 -> 200)
      |      (2): nn.Sequencer @ nn.Recursor @ nn.Dropout(0.5, busy)
      |      (3): nn.Sequencer @ nn.FastLSTM(200 -> 200)
      |      (4): nn.Sequencer @ nn.Recursor @ nn.Dropout(0.5, busy)
      |    }
    (  bwd  ): nn.Sequential {
      |      [input -> (1) -> (2) -> (3) -> output]
      |      (1): nn.ReverseTable
      |      (2): nn.Sequential {
      |        [input -> (1) -> (2) -> (3) -> (4) -> output]
      |        (1): nn.Sequencer @ nn.FastLSTM(200 -> 200)
      |        (2): nn.Sequencer @ nn.Recursor @ nn.Dropout(0.5, busy)
      |        (3): nn.Sequencer @ nn.FastLSTM(200 -> 200)
      |        (4): nn.Sequencer @ nn.Recursor @ nn.Dropout(0.5, busy)
      |      }
      |      (3): nn.ReverseTable
      |    }
    ( merge ): nn.Sequential {
      |      [input -> (1) -> (2) -> output]
      |      (1): nn.ZipTable
      |      (2): nn.Sequencer @ nn.Recursor @ nn.JoinTable
      |    }
  }
  (5): nn.Sequencer @ nn.Recursor @ nn.Sequential {
    [input -> (1) -> (2) -> output]
    (1): nn.Linear(400 -> 9663)
    (2): nn.LogSoftMax
  }
}
```

I would be very thankful for any help you can provide to solve this issue.
