Crawler doesn't work unless I use a production URL #298
Replies: 19 comments 3 replies
-
Also experiencing this issue, using Laravel 5.8.
-
I'm experiencing the same issue, though regardless of what URL I use, my sitemap file is blank.
Here is a snapshot of my composer file:
Resulting file contents:
-
Same on one of my production websites, while another website works perfectly, and I can't understand the difference between the two sites...
-
I have the same issue. Did anyone find an alternative?
-
Has anyone made progress on this already? I'm still investigating from time to time, but haven't found anything yet.
-
I'm having the same problem.
-
I added links manually. There was no other way to do it.
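For anyone landing here: adding URLs by hand with this package skips the crawler entirely. A minimal sketch (the routes below are placeholders, not from the comment above):

```php
use Spatie\Sitemap\Sitemap;
use Spatie\Sitemap\Tags\Url;

// Build the sitemap by hand instead of crawling.
// Replace these placeholder routes with your own.
Sitemap::create()
    ->add(Url::create('/'))
    ->add(Url::create('/about'))
    ->add(Url::create('/contact'))
    ->writeToFile(public_path('sitemap.xml'));
```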
-
Same problem here.
-
Same problem here, Laravel 7.
-
Works for me if I use '/' in the path,
OR
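If I'm reading that comment right, the first variant would look roughly like this (a sketch; the output path is my assumption):

```php
use Spatie\Sitemap\SitemapGenerator;

// Passing '/' instead of a full production URL, per the comment above.
SitemapGenerator::create('/')
    ->writeToFile(public_path('sitemap.xml'));
```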
-
I have the same issue.
-
This worked for me too. So what you could try is using something like ngrok (which I use myself) to expose your local or testing environment to the web, and point the crawler at that URL. This is how I got it to work.
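To make that concrete, here's a rough sketch of the ngrok approach (the subdomain below is made up; use whatever URL ngrok actually prints for you):

```php
// First, expose the local app in a terminal:
//   ngrok http 8000
//
// Then point the generator at the public URL ngrok prints.
use Spatie\Sitemap\SitemapGenerator;

SitemapGenerator::create('https://random-subdomain.ngrok.io') // hypothetical URL
    ->writeToFile(public_path('sitemap.xml'));
```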
-
I was using valet and I had this issue, I switched to If you aren't using valet, you can use
-
Same issue; it doesn't work if I get the URL string from a function:
The $URL content is the same in both cases.
-
I have the same issue with Laravel Valet. It doesn't seem to pick up the .test sites at all. Using
Given that it's generating on a different URL to the live site, a handy feature would be the ability to automate the find-and-replace as part of the process. That would let us schedule the creation and update a static site.
-
Hello everyone. I had the same problem, so I decided to add the links manually. There are thousands of links in my project, and I keep them in a separate database table for SEO purposes. My database table is as follows.
I created the artisan command as explained in the readme section. Below is my code in the command.
I hope it helps you too.
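The table and command aren't shown above, but a command along these lines would do it (a sketch; the seo_links table and its url column are my assumptions based on the description):

```php
use Illuminate\Console\Command;
use Illuminate\Support\Facades\DB;
use Spatie\Sitemap\Sitemap;
use Spatie\Sitemap\Tags\Url;

class GenerateSitemap extends Command
{
    protected $signature = 'sitemap:generate';
    protected $description = 'Build the sitemap from links stored in the database';

    public function handle(): void
    {
        $sitemap = Sitemap::create();

        // Hypothetical table/column names; adjust to your schema.
        // Chunking keeps memory flat when there are thousands of links.
        DB::table('seo_links')->orderBy('id')->chunk(500, function ($links) use ($sitemap) {
            foreach ($links as $link) {
                $sitemap->add(Url::create($link->url));
            }
        });

        $sitemap->writeToFile(public_path('sitemap.xml'));
    }
}
```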
-
I have the same issue; none of your solutions work for me :( Laravel 8
-
@freekmurze Facing the same issue.
-
Facing the same issue here with Laravel 10.33. I solved it by adding the following to the 'guzzle_options' in 'config/sitemap.php': SitemapGenerator::create(config("app.url"))
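The exact options aren't shown above, but for context: 'guzzle_options' in the package's config/sitemap.php is passed straight to Guzzle. A common tweak for local HTTPS setups is disabling certificate verification; whether that's what this commenter added is my assumption, not confirmed:

```php
// config/sitemap.php (other keys omitted)
use GuzzleHttp\RequestOptions;

return [
    'guzzle_options' => [
        RequestOptions::COOKIES => true,
        RequestOptions::CONNECT_TIMEOUT => 10,
        RequestOptions::TIMEOUT => 10,
        RequestOptions::ALLOW_REDIRECTS => false,
        // Assumption: accept self-signed certs on local .test domains.
        RequestOptions::VERIFY => false,
    ],
];
```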
-
Hello!
In my code, I have a scheduled command to generate a new sitemap daily. It looks extremely simple:
You'll notice one thing though: I hard-coded the production environment's URL. When I do SitemapGenerator::create('/') or even SitemapGenerator::create(config('app.url')), the sitemap isn't generated at all.
It bothers me because I need to run tests to make sure the sitemap is correctly generated. My test would be something like "create 10 fake posts and make sure the sitemap contains their URLs".
I always make sure the local URL is accessible. Despite that, even if I hard-code https://example.com.test, the crawler won't work.
I just can't figure out why this is happening. It worked perfectly before Laravel 7 (though I have no idea if that's really related).
Thanks in advance for your help.
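For completeness, the kind of command described here probably looks something like this (my reconstruction, not the author's actual code; the hard-coded URL is a placeholder):

```php
use Illuminate\Console\Command;
use Spatie\Sitemap\SitemapGenerator;

class GenerateSitemap extends Command
{
    protected $signature = 'sitemap:generate';
    protected $description = 'Crawl the site and write public/sitemap.xml';

    public function handle(): void
    {
        // Hard-coded production URL: the workaround described above.
        // Replacing it with '/' or config('app.url') reportedly yields no sitemap.
        SitemapGenerator::create('https://example.com')
            ->writeToFile(public_path('sitemap.xml'));
    }
}
```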