
Backend-API #4

Open
torwag opened this issue May 6, 2020 · 6 comments

torwag commented May 6, 2020

The server might always contain just the raw data. Still, it would be nice to have a backend hook: whenever the server reports a data change, post-processing is triggered, e.g. to create a set of readable files and sync them with another service (Nextcloud, borg-backup, git, rsync, etc.) via rclone or similar tools. This would let users employ dedicated tools for other tasks (sync, backup, notification, etc.) without bloating the server itself.
For this, a set of events such as on_newfile, on_changedfile, and on_removedfile, each able to trigger a user-supplied script, would be helpful.
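Such hooks don't exist yet; a minimal sketch of what the consumer side could look like, assuming rmfakecloud invoked a user-supplied script as `hook.sh <event> <document-name> <file-path>` (the interface, event names, and paths here are hypothetical):

```shell
#!/bin/sh
# Hypothetical hook script: rmfakecloud would call it as
#   hook.sh <event> <document-name> <file-path>
# (this calling convention is an assumption, not an existing feature)

handle_event() {
  event="$1"; name="$2"; path="$3"
  case "$event" in
    on_newfile|on_changedfile)
      # e.g. re-export here and hand the file off to a sync tool
      echo "sync: $name ($path)"
      ;;
    on_removedfile)
      echo "remove: $name"
      ;;
    *)
      echo "ignored event: $event" >&2
      return 1
      ;;
  esac
}

# demo invocation
handle_event on_newfile notes /data/notes.pdf
```

The script only dispatches on the event name; what each branch actually does (borg, git, rsync, ...) is up to the user.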

ddvk (Owner) commented Sep 9, 2020

in the meantime, rmapi can upload/manage files (the address can be set via env variables)
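For example (`RMAPI_HOST` is the environment variable rmapi reads to target a self-hosted cloud; the URL, folder, and file names below are placeholders):

```shell
# point rmapi at the self-hosted rmfakecloud instance
export RMAPI_HOST=https://my.rmfakecloud.example   # placeholder URL

rmapi ls                     # list documents on the fake cloud
rmapi mkdir incoming         # create a folder
rmapi put scan.pdf incoming  # upload a local PDF into that folder
```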

torwag (Author) commented Sep 10, 2020

So we could have a Dockerfile which contains rmfakecloud and rmapi, defines an import and a backup folder, and runs rmapi whenever a file arrives in the import folder (by whatever method) to put it into rmfakecloud and hence onto the rM. Furthermore, it could periodically export all data from rmfakecloud into the backup folder, where <name_your_favourite_sync_backup_tool> could pick it up.
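A rough sketch of the import side, assuming `inotifywait` (from inotify-tools) is available in the container and rmapi is already configured via `RMAPI_HOST`; the paths and the extension filter are arbitrary choices:

```shell
#!/bin/sh
# Watch an import folder and push new documents to rmfakecloud via rmapi.
IMPORT_DIR=/data/import

# rM-compatible input formats; extend as needed
should_import() {
  case "$1" in
    *.pdf|*.epub) return 0 ;;
    *)            return 1 ;;
  esac
}

if [ "${1:-}" = "watch" ]; then
  inotifywait -m -e close_write --format '%w%f' "$IMPORT_DIR" \
  | while read -r file; do
      if should_import "$file"; then
        # upload, and only delete the original if the upload succeeded
        rmapi put "$file" && rm -- "$file"
      fi
    done
fi
```

The backup direction could then be a cron job that runs `rmapi mget /` into the backup folder.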

torwag (Author) commented Sep 10, 2020

I have something very similar running (without the need for a fake cloud, as the data is just normal files) using rclone and Nextcloud for my ebook reader. In the case of the rM, a folder on Nextcloud serves as the entry folder, e.g. "rM_incoming": all files in there get synced and deleted afterwards. Another folder, e.g. "rm_backup", contains the entire content of the rM, in case the device gets lost or you want access to the documents outside the rM universe.
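The rclone side of such a setup could look roughly like this (the remote name `nextcloud:` and the local paths are assumptions matching the folders described above):

```shell
# pull files from the Nextcloud entry folder and delete them there
rclone move nextcloud:rM_incoming /data/import

# mirror the exported rM content into the backup folder on Nextcloud
rclone sync /data/export nextcloud:rm_backup
```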

ddvk (Owner) commented Sep 10, 2020

I got confused :> those are 2 different things, I think:

  1. uploading data to rmfake, should trigger a script with the Document Name and file path?
  2. having a drop folder and using rmapi to push stuff to the server

torwag (Author) commented Sep 10, 2020

My thought was that:
a) observe a folder and upload new content to rmfake via rmapi, deleting the original file in that folder afterwards,
and
b) rmapi could periodically export all documents from rmfake into (another) folder, ideally only transferring new or changed content.

What happens to those folders, i.e. how data gets into the input folder and how the output folder gets used, would be out of scope.

Example scenario:
A) Set up an FTP server pointing at the incoming folder, and configure a scanner to scan to FTP as PDF. Every scan ends up in the incoming folder -> rmapi -> rmfake -> rM.
B) The output folder is observed by rclone, which points to a Nextcloud server. Different folders there are shared with different coworkers. Thus, on the rM you have a folder "John" and one "Administration", and whenever you copy a file into one of those folders on your rM or change it (e.g. sign it), it ends up in the corresponding Nextcloud share, accessible by your colleagues.
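Scenario B's export step could be a periodic (e.g. cron-driven) job along these lines; again just a sketch, where `rmapi mget` recursively downloads a remote folder and the paths and remote name are assumptions:

```shell
#!/bin/sh
# Periodically export everything from rmfakecloud, then let rclone
# push the result to Nextcloud, where per-folder shares take over.
EXPORT_DIR=/data/export

mkdir -p "$EXPORT_DIR"
cd "$EXPORT_DIR" || exit 1
rmapi mget /                                   # download the whole document tree
rclone sync "$EXPORT_DIR" nextcloud:rM_backup  # mirror it to the share
```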

ghost commented Mar 12, 2023

ERROR: 2023/03/12 06:45:17 auth.go:93: failed to create a new device token

This happens while trying to register with the current version of rmapi.
