add robots.txt #2

Open · wants to merge 1 commit into master

Conversation

boney-bun

fix #1

@boney-bun (Author)

@cchristelis the disallowed folders can be added to robots.txt.
I tried adding the script to the Dockerfile, but it seems this repo will be further customized according to the needs of each project, so I made a shell script instead.
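
For illustration, the script could look something like this; the web-root path and the disallowed folder are placeholders, not the actual values from this repo:

    #!/bin/sh
    # hypothetical sketch of copy_robots.sh: write robots.txt into the
    # nginx web root; NGINX_ROOT and the Disallow entry are assumptions
    NGINX_ROOT="${NGINX_ROOT:-/usr/share/nginx/html}"

    cat > "$NGINX_ROOT/robots.txt" <<'EOF'
    User-agent: *
    Disallow: /private/
    EOF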

What do you think?

@cchristelis self-requested a review on March 8, 2018 at 06:59

@cchristelis left a comment


Hi @boney-bun, I think you should also add an entry in the nginx configs to serve the robots file. Also, where are you running copy_robots.sh?

@cchristelis

@boney-bun, so we should have this running even when we don't change this image? Either way, we need to add instructions of some sort to this repo to explain what we did here.

@boney-bun (Author)

Thanks for the feedback, @cchristelis.

> I think you should also add an entry in the nginx configs to serve the robots file.

Yes, I agree with you.
Previously, I added these lines locally:

    location /robots.txt {
        root $root_location;
    }

But the challenge is how to determine the root_location. If we run copy_robots.sh, we can obtain the root location, but I don't think the script can be run from the conf file.
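
For example, if we assume the default web root of the official nginx image, the location block could be hard-coded like this (the path is an assumption for this repo):

    location = /robots.txt {
        # assuming the default web root of the official nginx image;
        # adjust if the project overrides it
        root /usr/share/nginx/html;
    }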

I tried running the original image locally, but it failed to start. That's why I think this repo may need further modification based on the needs of a project.

> Also, where are you running copy_robots.sh?

copy_robots.sh is supposed to be executed once nginx has started, so it copies robots.txt into nginx's root location.
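
For example, a container entrypoint along these lines could wire it up (the file names are hypothetical); copying just before exec'ing nginx avoids having to hook into an already-running server:

    #!/bin/sh
    # hypothetical entrypoint: put robots.txt in place, then start
    # nginx in the foreground as the container's main process
    /copy_robots.sh
    exec nginx -g 'daemon off;'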

> Either way, we need to add instructions of some sort to this repo to explain what we did here.

I agree. Let me try to write some explanations, and you can comment on them later.

@cchristelis

Have a look at how to copy files into containers during setup; that way you can determine the location.
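
For instance, a single line in the Dockerfile would bake the file in at build time (the destination assumes the default nginx web root):

    # copy robots.txt into the image at build time
    COPY robots.txt /usr/share/nginx/html/robots.txt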
