From aae74a4aa9baca2859ea3693c246488084a28b7f Mon Sep 17 00:00:00 2001
From: Jonas Baumann
Date: Sat, 7 Sep 2024 14:37:33 +0200
Subject: [PATCH] Update readme.

---
 README.md | 57 +++++++++++++++++++++++++++++++++++++++++--------------
 1 file changed, 43 insertions(+), 14 deletions(-)

diff --git a/README.md b/README.md
index 618d8ef..d4fe05d 100644
--- a/README.md
+++ b/README.md
@@ -2,7 +2,41 @@
 This project was done as part of the GLAMHack24, the Swiss Open Cultural Data Hackathon. See the [project description](https://hack.glam.opendata.ch/project/211).
 
-## Load csv data with curl
+## Goal
+
+With the [photos accessible on the Zentralgut website](https://zentralgut.ch/glasplatten_zug/),
+the Bibliothek Zug aims to bring the collection to a broader audience through the development of a mobile app. The app notifies users when they are near a location where one of the georeferenced photos from the "Glasplattensammlung Zug" was taken, offering a unique and immersive way to explore the region's history. It provides a look into the past, allowing users to compare historical life and architecture with the present day, and gives a distinctive perspective on Zug's evolution over time.
+
+## Demo
+
+The demo installation can be accessed at https://photos.histify.app/
+
+## Solution
+
+This project provides a mobile web app (PWA) that allows you to experience the photos while walking through the city of Zug.
+The app is built so that it can be reused for other datasets and hosted by anyone.
+
+It consists of these components:
+
+- `frontend` - the web application
+- `api` - a small API for uploading and retrieving the data
+- `elasticsearch` - an Elasticsearch installation hosting the data
+
+## Try it out
+
+The `compose-prod.yaml` provides a setup with which you can try out the application.
+To try it out on a local machine, you need `git` and `docker` installed.
+Follow these steps:
+
+- clone the git repository
+- run `docker compose -f compose-prod.yml up -d`
+- visit http://localhost:8000/api/docs and upload a dataset
+  - you need to prepare your dataset as a CSV similar to `data/bibzug_glasplatten_adapted.csv`
+  - you need to authenticate with a bearer token, which is `dataimporttoken` in this setup
+  - you can also do it with curl; see below
+- visit http://localhost:8000 to test the app
+
+## Load data with curl
 
 Sample data is in the `data` folder
 
@@ -16,19 +50,14 @@ curl -X 'POST' \
   -F 'file=@bibzug_glasplatten_adapted.csv;type=text/csv'
 ```
 
+To do this, you need an API key (in the setup above, the bearer token `dataimporttoken`).
 
-## Set up local directories for docker-compose and elastic kiban
+## Development environment
 
-Do this in app root folder :
-```
-mkdir esdata01
-mkdir esdata02
-mkdir kibanadata
-chmod g+rwx esdata01
-chmod g+rwx esdata02
-chmod g+rwx kibanadata
-sudo chgrp 0 esdata01
-sudo chgrp 0 esdata02
-sudo chgrp 0 kibanadata
-```
\ No newline at end of file
+You can start the development environment in Docker containers:
+
+- clone the git repository
+- run `docker compose build`
+- run `docker compose up -d`
+- visit http://localhost:8000
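+
+## Dataset format and upload (illustrative sketches)
+
+The import expects the CSV layout used by `data/bibzug_glasplatten_adapted.csv`. The snippet below is only an illustrative sketch: the column names in it are assumptions, not the actual schema, so use the sample file as the authoritative template.
+
+```sh
+# Illustrative sketch only: these column names are placeholders, not the real schema.
+# Copy the header of data/bibzug_glasplatten_adapted.csv for an actual import.
+cat > my_dataset.csv <<'EOF'
+id,title,year,latitude,longitude,image_url
+1,"Sample photo near the old town of Zug",1900,47.1662,8.5155,https://example.org/photo1.jpg
+EOF
+```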
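+
+For reference, a complete upload call could look like the sketch below. The endpoint path `/api/upload` is an assumption made for illustration; the actual path and parameters are listed in the interactive docs at http://localhost:8000/api/docs.
+
+```sh
+# Sketch of a dataset upload against the local compose setup (run from the data folder).
+# '/api/upload' is an assumed endpoint path - check http://localhost:8000/api/docs for the real one.
+# 'dataimporttoken' is the bearer token used in the compose-prod setup described above.
+curl -X 'POST' \
+  'http://localhost:8000/api/upload' \
+  -H 'accept: application/json' \
+  -H 'Authorization: Bearer dataimporttoken' \
+  -H 'Content-Type: multipart/form-data' \
+  -F 'file=@bibzug_glasplatten_adapted.csv;type=text/csv'
+```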