requirements to run the code #4
How did you preprocess the raw Laval HDR dataset to make training runnable? Thanks!
Once you have access to the depth and HDR data from the Laval HDR dataset, you can use the function depth_for_anchors_calc in my forked code: https://github.com/weberhen/Illumination-Estimation/blob/main/RegressionNetwork/data.py Please let me know if you find any bugs :)
Thank you for your reply. Yes, I have access to the depth and HDR data. Where is the script that calls the depth_for_anchors_calc function? Or could you kindly tell me which scripts I need to run in order to train the model from scratch, supposing I only have the raw depth and HDR data? Thanks!
The code I used to preprocess the dataset is in this old commit; I erased it by mistake, but you can still use it here: https://github.com/weberhen/Illumination-Estimation/blob/5960738cdd7184c3cd897a47840db6b647d013ac/RegressionNetwork/representation/distribution_representation.py This code creates the pkl files needed during training: it takes the GT panorama and the depth and creates a pickle file with that information. Once you have the pkl files, all you need to do is run train.py from the RegressionNetwork folder of my forked code: https://github.com/weberhen/Illumination-Estimation But to be honest, I'm not sure it's working: it's on epoch 38 and has been outputting the same prediction since epoch 3.
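As an illustration of the preprocessing output described above (one pickle per panorama bundling the GT HDR panorama with its depth map), here is a minimal sketch. The key names, shapes, and file naming are assumptions for illustration, not the repository's actual format:

```python
# Hypothetical sketch: bundle a GT HDR panorama and its depth map into
# one pickle file, the general shape of what the preprocessing produces.
# Keys ("hdr", "depth") and array shapes are assumptions, not the repo's format.
import pickle
import numpy as np

def save_training_sample(pano_hdr, depth, out_path):
    """Write one training sample (panorama + depth) as a pickle file."""
    sample = {"hdr": pano_hdr.astype(np.float32),
              "depth": depth.astype(np.float32)}
    with open(out_path, "wb") as f:
        pickle.dump(sample, f)

def load_training_sample(path):
    """Read a sample back; returns the dict written above."""
    with open(path, "rb") as f:
        return pickle.load(f)

pano = np.random.rand(256, 512, 3)   # placeholder equirectangular HDR panorama
depth = np.random.rand(256, 512)     # placeholder per-pixel depth map
save_training_sample(pano, depth, "sample_0001.pkl")
restored = load_training_sample("sample_0001.pkl")
assert np.allclose(restored["hdr"], pano.astype(np.float32))
```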
EMLight needs the image-warping operation from Gardner's work to warp the raw HDR panoramas and produce the input images, but I am not sure whether the warping code has been released here. How do you sample images and warp the panoramas? Thanks in advance.
There is a trick to the training: I first overfit the model on a small dataset, then train it on the full dataset.
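The overfit-first trick above can be sketched with a generic PyTorch loop: run many epochs on a tiny subset as a sanity check that the loss can collapse, then continue on the full data. The model, data, and hyperparameters below are stand-ins, not the fork's actual train.py:

```python
# Hypothetical sketch of "overfit a small subset first, then train on all".
# Model, dataset, and hyperparameters are illustrative stand-ins only.
import torch
from torch.utils.data import DataLoader, Subset, TensorDataset

torch.manual_seed(0)
x = torch.randn(256, 8)
y = x.sum(dim=1, keepdim=True)        # a linear target the model can fit exactly
full_ds = TensorDataset(x, y)
small_ds = Subset(full_ds, range(8))  # tiny subset to overfit first

model = torch.nn.Linear(8, 1)
opt = torch.optim.Adam(model.parameters(), lr=0.05)
loss_fn = torch.nn.MSELoss()

def run_epochs(dataset, epochs):
    loader = DataLoader(dataset, batch_size=8, shuffle=True)
    loss = None
    for _ in range(epochs):
        for xb, yb in loader:
            opt.zero_grad()
            loss = loss_fn(model(xb), yb)
            loss.backward()
            opt.step()
    return loss.item()

overfit_loss = run_epochs(small_ds, epochs=200)  # sanity check: loss should drop sharply
full_loss = run_epochs(full_ds, epochs=5)        # then train on the full dataset
assert overfit_loss < 0.5
```

If the model cannot drive the loss down even on a handful of samples, something is wrong with the data pipeline or loss before any full-scale training is attempted.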
Hello @fnzhan, I work with Prof. Jean-François Lalonde, one of the creators of the dataset. You can share the trained model; the only restriction is that it may be used for research purposes only :)
Hi, would you mind sharing the code that generates the pkl files, or the dataset preprocessing code? Looking forward to your reply!
Hi, would you mind sharing how you process the depth and HDR data from the Laval HDR dataset? Looking forward to your reply!
Hello, that link may no longer be accessible. How do you process the original Laval dataset? If you can tell me, I will be very grateful!
Hi! You can try my fork: https://github.com/weberhen/Illumination-Estimation-1 |
Hello! I am trying to follow your code to crop an FOV from the HDR panoramas using 'gen_hdr_crops.py', but I found that your code uses some modules like 'envmap' and 'ezexr' that I am not familiar with, and I wonder how to pip install them. After googling, I guess these modules come from 'skylibs' (https://github.com/soravux/skylibs), but I still get errors after 'pip install --upgrade skylibs'. Could you share your requirements.txt with me? Thanks!
Hi @jxl0131! It is indeed skylibs. I just tested 'pip install --upgrade skylibs' and it worked. If you cannot install it, I suggest asking the original developer, since that will be the easiest way to get gen_hdr_crops.py working. Good luck!
Thanks! My environment is fine now. I am following your edited GenProjector code, and I found that in your 'Illumination-Estimation-1/GenProjector/data.py' file, in the function '__getitem__', you read envmap_exr from the '/crop' directory. I don't understand this; I think it should be a directory containing the full panorama images rather than crops. Looking forward to your reply!
Hi! I'm sorry about that: the folder is called 'crop' but it actually contained envmaps; I can't recall why I named it that way. You can see that the code goes on to create the actual crop from that envmap, so just rename the 'crop' folder to match your dataset structure and it should work.
Haha, I understand now! Thanks!
Hi! EMLight's author tonemaps the crops and envmaps in his 'data.py' code, which I think converts the crops from HDR to LDR. So why do you also call 'genLDRimage(..)' in your 'gen_hdr_crops.py' (a repeated operation)? Hoping for your reply!
Hi! Re-exposing is not the same as tonemapping: re-exposing simply maps the range of a given HDR image to another range, which makes the images more similar to one another. I do that because the dataset has some pretty dark HDR images and some super bright ones, so I apply this (invertible) operation to bring them into a similar range. That is what this script is for: generating (re-exposed) HDR crops :)
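To illustrate the distinction, here is a minimal sketch of an invertible re-exposure (this is not the repository's genLDRimage; the percentile and target values are assumptions):

```python
# Illustrative sketch: re-exposing an HDR image is an invertible linear
# rescale of its range, unlike tonemapping, which is typically non-linear
# and clips/compresses. The percentile and target are assumed values.
import numpy as np

def reexpose(hdr, target=0.8, percentile=90):
    """Scale an HDR image so its `percentile` intensity maps to `target`."""
    scale = target / np.percentile(hdr, percentile)
    return hdr * scale, scale   # keep `scale` so the operation can be undone

def undo_reexpose(hdr_scaled, scale):
    """Invert reexpose() exactly, recovering the original radiance values."""
    return hdr_scaled / scale

dark = np.random.rand(8, 8, 3) * 1e-3    # very dark HDR image
bright = np.random.rand(8, 8, 3) * 1e3   # very bright HDR image
d2, s_d = reexpose(dark)
b2, s_b = reexpose(bright)
# After re-exposure both images occupy a similar range...
assert abs(np.percentile(d2, 90) - np.percentile(b2, 90)) < 1e-6
# ...and the originals can be recovered, since the operation is invertible.
assert np.allclose(undo_reexpose(d2, s_d), dark)
```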
Hi again!
Could you please share your running environment so I can try to run the code with little to no changes?
I'm facing some silly problems, like conversions between NumPy and torch, which I suspect happen because you use a different PyTorch version than mine; otherwise you would hit the same problem as me. One example of an error I'm getting:
```
$ Illumination-Estimation/RegressionNetwork/train.py
0 optim: 0.001
Traceback (most recent call last):
  File "Illumination-Estimation/RegressionNetwork/train.py", line 82, in <module>
    dist_emloss = GMLoss(dist_pred, dist_gt, depth_gt).sum() * 1000.0
  File "/miniconda3/envs/pt110/lib/python3.7/site-packages/torch/nn/modules/module.py", line 493, in __call__
    result = self.forward(*input, **kwargs)
  File "/Illumination-Estimation/RegressionNetwork/gmloss/samples_loss.py", line 43, in forward
    scaling=self.scaling, geometry=geometry)
  File "/Illumination-Estimation/RegressionNetwork/gmloss/samples_loss.py", line 72, in sinkhorn_tensorized
    self.distance = distance(batchsize=B, geometry=geometry)
  File "/Illumination-Estimation/RegressionNetwork/gmloss/utils.py", line 79, in __init__
    anchors = geometric_points(self.N, geometry)
  File "/Illumination-Estimation/RegressionNetwork/gmloss/utils.py", line 70, in geometric_points
    points[:, 0] = radius * np.cos(theta)
TypeError: mul(): argument 'other' (position 1) must be Tensor, not numpy.ndarray
```
Thanks!
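For reference, this TypeError means a torch Tensor is being multiplied by a NumPy array, which newer PyTorch versions reject. A hedged sketch of the likely fix (variable names mirror the traceback; the surrounding code is an assumption):

```python
# Hedged sketch of the fix for the TypeError above: convert the NumPy
# result to a Tensor before mixing it with torch operands (or compute it
# in torch entirely). `radius` and `theta` here are illustrative stand-ins.
import numpy as np
import torch

radius = torch.ones(10)                 # stand-in for the real radius tensor
theta = np.linspace(0, 2 * np.pi, 10)   # NumPy array, as in the traceback

# Fails on some PyTorch versions:
#   points[:, 0] = radius * np.cos(theta)
# Option 1: convert the NumPy result to a Tensor first.
x = radius * torch.from_numpy(np.cos(theta)).float()
# Option 2: avoid NumPy entirely and stay in torch.
x_alt = radius * torch.cos(torch.from_numpy(theta).float())

assert torch.allclose(x, x_alt, atol=1e-6)
```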