Save files & folders in Google Drive, instead of 1 unreadable JSON file with everything #54
Comments
The point was storage, that's it. It could have been base64 in any kind of format.
You can still use your fav editor, and then paste it back into RWeb. I don't understand how "synchronize it with the browser" would work... You mean save it back into RWeb's file on Drive? That would require copy & paste too, I think. Or is that a plugin for your fav editor? You have a very good point about folders and files being more readable, and neater. http://dabblet.com/ does this in a Gist. I like that. In RWeb's case, that would be very slow, because it'd have to download ALL the files in ALL the folders. In the curent set-up it's just 1 file. (In the current set-up it needs 2 more Drive queries to find that file, it's not perfect, but still fewer downloads.) If you can find a way where having 150 sites will not require 150 downloads, and uploads for saving, I'm all for this method. For now this is the most efficient (except for #55) method, which is very important IMO, because it's done at least every 30 mins. |
I see... Then it's OK. Sorry for the disturbance. I just thought it would be useful in case someone wants to fix custom CSS or JS code directly in files.
No disturbance. It's still a good idea. I'd love to see this work. I don't have time to improve Drive sync though. If you do, that'd be great. And maybe I'll look into it some time.
There might be a way to do this relatively efficiently: Changes (the Drive Changes API). I'd have to create an RWeb directory, a directory per website, and a file per property, and then watch the root directory. Every subsequent sync (every 30 min), I'd download only the changed files. That might even be more efficient than the current approach (download everything every 30 min).

The directory per website would have to be named independently of its content though, so it can't be the hosts/domains you see in the UI. It could be the UUID, but that's not readable. And how to combine this with #39? I still like the idea. I'll look into it some more soon.
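The "download only the changed files" step above can be sketched as a pure function. This is not RWeb's actual code; the change-record shape `{ fileId, removed, file: { parents } }` matches what Drive v3 `changes.list` returns, but the flat set of known RWeb folder IDs is a simplifying assumption:

```javascript
// Sketch: given one page of Drive Changes results, pick the file IDs
// RWeb would need to re-download. Deletions and files outside the
// watched RWeb folders are skipped.
function filesToSync(changes, rwebFolderIds) {
  return changes
    .filter(c => !c.removed) // deletions need no download
    .filter(c => c.file && c.file.parents &&
                 c.file.parents.some(p => rwebFolderIds.has(p)))
    .map(c => c.fileId);
}

// Example: two changes inside RWeb folders, one elsewhere, one deletion.
const rwebFolders = new Set(['folder-rweb', 'folder-site-a']);
const changes = [
  { fileId: 'f1', removed: false, file: { parents: ['folder-site-a'] } },
  { fileId: 'f2', removed: false, file: { parents: ['unrelated'] } },
  { fileId: 'f3', removed: true },
  { fileId: 'f4', removed: false, file: { parents: ['folder-rweb'] } },
];
console.log(filesToSync(changes, rwebFolders)); // → ['f1', 'f4']
```

In a real sync you'd fetch pages with `changes.list`, carry the returned `newStartPageToken` over to the next 30-minute run, and only then issue one download per ID this filter returns.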
No special permissions needed. Conversion from 1 file to many files is tricky and slow. |
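The tricky one-file-to-many conversion could start from something like the sketch below. The blob shape `{ [uuid]: { host, css, js } }` is an assumption for illustration, not RWeb's real format; the slow part (one Drive upload per emitted file) is left out:

```javascript
// Sketch: split a single settings blob into (path, content) pairs,
// one folder per site and one file per property.
function explodeSettings(blob) {
  const files = [];
  for (const [uuid, site] of Object.entries(blob)) {
    // Folder name must be content-independent (see discussion above),
    // so use the UUID rather than the host.
    for (const prop of ['host', 'css', 'js']) {
      if (site[prop] != null) {
        files.push({ path: `RWeb/${uuid}/${prop}`, content: String(site[prop]) });
      }
    }
  }
  return files;
}

const blob = {
  'a1b2': { host: 'example.com', css: 'body { color: red }' },
};
console.log(explodeSettings(blob));
// → [ { path: 'RWeb/a1b2/host', content: 'example.com' },
//     { path: 'RWeb/a1b2/css', content: 'body { color: red }' } ]
```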
I just installed the plugin and it works like a charm. One improvement that comes to mind: it would be better if it synced to Google Drive not a JSON file (what's the point of it if it's not human-readable?) but a folder with separate CSS and JS files. That way we would be able to use our favorite editors to modify the code and then synchronize it with the browser.