Commit aae74a4: Update readme.
jone committed Sep 7, 2024 (parent: 0c84bf4); README.md changed with 43 additions and 14 deletions.

This project was done as part of the GLAMHack24, the Swiss Open Cultural Data Hackathon. See the [project description](https://hack.glam.opendata.ch/project/211).

## Goal

With the [photos accessible on the Zentralgut website](https://zentralgut.ch/glasplatten_zug/), the Bibliothek Zug aims to bring the collection to a broader audience through a mobile app. The app notifies users when they are near a location where one of the georeferenced photos from the "Glasplattensammlung Zug" was taken, offering an immersive way to explore the region's history. By letting users compare historical life and architecture with the present day, it provides a distinctive perspective on Zug's evolution over time.

## Demo

The demo installation can be accessed at https://photos.histify.app/

## Solution

This project provides a mobile web app (PWA) that lets users experience the photos while walking through the city of Zug.
The app is built so that it can be reused for other datasets and hosted by anyone.

It consists of these components:

- `frontend` - the web application
- `api` - a small API for retrieving and uploading the data
- `elasticsearch` - an Elasticsearch installation hosting the data
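
A minimal sketch of how these three components could be wired together in Compose (service names follow the list above; the image version, port mapping, and environment variable names are assumptions, and the actual `compose-prod.yml` in the repository is authoritative):

```
# Hypothetical sketch only; the repository's compose-prod.yml is authoritative.
services:
  frontend:
    build: ./frontend
    ports:
      - "8000:8000"   # assumed mapping: app and API docs are reached via port 8000
  api:
    build: ./api
    environment:
      # Variable names are assumptions, not taken from the repository.
      - ELASTICSEARCH_URL=http://elasticsearch:9200
      - DATA_IMPORT_TOKEN=dataimporttoken   # bearer token used in this setup
    depends_on:
      - elasticsearch
  elasticsearch:
    image: docker.elastic.co/elasticsearch/elasticsearch:8.14.3
    environment:
      - discovery.type=single-node
```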

## Try it out

The `compose-prod.yml` provides a setup with which you can try out the application.
To try it out on a local machine, you need `git` and `docker` installed.
Follow these steps:

- clone the git repository
- run `docker compose -f compose-prod.yml up -d`
- visit http://localhost:8000/api/docs and upload a dataset
  - you need to prepare your dataset in a CSV similar to `data/bibzug_glasplatten_adapted.csv`
  - you need to authenticate with a bearer token, which is `dataimporttoken` in this setup
  - you can also do it with curl; see below
- visit http://localhost:8000 for testing the app
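
The exact column layout of `data/bibzug_glasplatten_adapted.csv` is not reproduced here. A georeferenced photo collection like this one would plausibly need at least an identifier, a title, a date, coordinates, and an image URL; the column names and the sample row below are hypothetical, not taken from the repository:

```
id,title,year,latitude,longitude,image_url
1,Example photo title,1905,47.1662,8.5154,https://example.org/photos/1.jpg
```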

## Load data with curl

Sample data is in the `data` folder.

```
curl -X 'POST' \
  ...
  -F 'file=@bibzug_glasplatten_adapted.csv;type=text/csv'
```
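
Combined with the bearer token mentioned above, the full command takes roughly this shape. The exact endpoint path is not shown in this excerpt, so the path below is a placeholder:

```
# The endpoint path is a placeholder; look it up at http://localhost:8000/api/docs.
curl -X 'POST' \
  'http://localhost:8000/api/<upload-endpoint>' \
  -H 'Authorization: Bearer dataimporttoken' \
  -F 'file=@bibzug_glasplatten_adapted.csv;type=text/csv'
```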

## Development environment

You can start the development environment in Docker containers:

- clone the git repository
- run `docker compose build`
- run `docker compose up -d`
- visit http://localhost:8000
