This application is a prototype. It may contain errors and small bugs. If you notice something, you can report an issue. Thank you!
Geo data is required for a large number of projects, and a wealth of knowledge about Berlin lies dormant in the city's open datasets. Can AI help to find and understand this data more quickly, evaluate its relevance for a project, and perhaps even spark completely new ideas? GeoExplorer was developed to explore these questions. It is part of a feasibility study on open data and AI carried out by the Open Data Informationsstelle Berlin.
The prototype tool GeoExplorer is designed to help you find and quickly understand geo data. Thanks to AI support, the tool searches for suitable or closely related geo datasets based on your query. You can also delve deeper into each dataset description and have an AI explain the content to you.
The open geo data is stored in the Berlin geo data infrastructure, which is operated by the Senate Department for Urban Development, Building and Housing (SenSBW). GeoExplorer only accesses the metadata. It is not an alternative to Berlin's other open data portals, such as the FIS-Broker or the Berlin Open Data Portal, but is intended to offer an entry point for users who are not yet familiar with geo data from the Berlin administration.
Other, non-spatial data from the Open Data Portal has not yet been taken into account for the Explorer, as it generally comes with significantly poorer metadata.
For each dataset, metadata was automatically scraped (collected) from Berlin's Open Data Portal and Berlin's Geo Data Portal (FIS-Broker).
Metadata is data that describes a dataset, e.g. the attributes that a dataset has or the descriptive text written by a human.
Afterwards, a so-called embedding was created for each individual metadata set and written to a database. Each embedding contains a special vector based on the content of the metadata.
A vector is like a kind of multidimensional coordinate that locates the content of the metadata in the logic of the AI. Example: the vectors for the terms “dog” and “cat” would be located closer to each other than to the vector for “car”, because both are animals.
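The embedding step described above can be illustrated with a short TypeScript sketch. The table, column, model and environment variable names are illustrative assumptions, not taken from the actual GeoExplorer code.

```ts
// Sketch of the indexing step. All names here are illustrative assumptions.
import OpenAI from "openai";
import { createClient } from "@supabase/supabase-js";

const openai = new OpenAI({ apiKey: process.env.OPENAI_API_KEY });
const supabase = createClient(process.env.SUPABASE_URL!, process.env.SUPABASE_KEY!);

async function indexMetadata(id: string, metadataText: string) {
  // Turn the metadata text into a vector (a long array of numbers).
  const response = await openai.embeddings.create({
    model: "text-embedding-ada-002",
    input: metadataText,
  });
  const embedding = response.data[0].embedding;

  // Store the vector alongside the metadata so it can be compared with queries later.
  await supabase.from("embeddings").insert({ id, content: metadataText, embedding });
}
```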
When you enter a search query, a vector is created for it and compared with the existing vectors in the database. If the vectors are sufficiently close to each other, the corresponding datasets are displayed in the search results.
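A minimal sketch of this query step, reusing the `openai` and `supabase` clients from the sketch above. The RPC name `match_embeddings` and its parameters follow Supabase's documented pgvector pattern and are assumptions, not the app's actual implementation.

```ts
// Sketch of the query step; names and thresholds are assumptions.
async function searchDatasets(query: string) {
  // Embed the search query with the same model used for the metadata.
  const response = await openai.embeddings.create({
    model: "text-embedding-ada-002",
    input: query,
  });
  const queryEmbedding = response.data[0].embedding;

  // Ask the database for the stored vectors that lie closest to the query vector.
  const { data, error } = await supabase.rpc("match_embeddings", {
    query_embedding: queryEmbedding,
    match_threshold: 0.78, // "a certain proximity": only sufficiently similar results
    match_count: 10, // how many datasets to return
  });
  if (error) throw error;
  return data;
}
```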
This website is a Next.js app configured with:
- TypeScript
- Linting with ESLint
- Formatting with Prettier
Basic Next.js app
This project is a Next.js app which requires you to have Node.js installed.
Clone the repository to your local machine:
git clone git@github.com:technologiestiftung/odis-geoexplorer
Move into the repository folder:
cd odis-geoexplorer
Make sure you use the Node.js version specified in `.nvmrc`. Find out which Node version you're currently on with:
node --version
If this version differs from the one specified in `.nvmrc`, please install the required version, either manually or using a tool such as nvm, which allows switching to the correct version via:
nvm use
With the correct Node version, install the dependencies. NOTE: We use pnpm here, not npm!
pnpm install
The app queries data from the Supabase DB API and OpenAI. You will need to provide connection details in your environment. In this repository you can find a file `.env.example`. Duplicate this file and name it `.env`. In `.env` you must enter the connection details suggested in `.env.example`. If you do not know how to obtain the necessary details, please ask a repository maintainer for access.
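As a purely hypothetical illustration (the real variable names are the ones listed in `.env.example`), a small start-up check like the following can confirm that everything needed is in place:

```ts
// Hypothetical variable names – the actual keys are defined in .env.example.
const requiredEnv = ["SUPABASE_URL", "SUPABASE_KEY", "OPENAI_API_KEY"];

for (const name of requiredEnv) {
  if (!process.env[name]) {
    throw new Error(
      `Missing environment variable ${name}. Copy .env.example to .env and fill in the value.`
    );
  }
}
```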
You are now ready to start a local development server on http://localhost:3000 via:
pnpm dev
You can explore 1,662 datasets (as of 3 July 2024). WMS datasets that also exist as WFS (except aerial images) have not been included in the search.
You can find more information about the data in this GitHub repo.
The embeddings are hosted on Supabase - a service that allows you to host a PostgreSQL database and query it via an API.
New features, fixes, etc. should always be developed on a separate branch:
- In your local repository, checkout the `main` branch.
- Run `git checkout -b <name-of-your-branch>` to create a new branch (ideally following Conventional Commits guidelines).
- Make your changes.
- Push your changes to the remote: `git push -u origin HEAD`
- Open a pull request.
You can commit using the `npm run cm` command to ensure your commits follow our conventions.
The app is deployed to the cloud with Netlify.
We use Matomo for website analytics. Matomo respects users' privacy; page visits are tracked anonymously.
In the production environment, a `NEXT_PUBLIC_MATOMO_URL` and a `NEXT_PUBLIC_MATOMO_SITE_ID` are configured for this purpose.
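As a rough sketch of how these two values could feed the standard Matomo tracking snippet (this is an assumption about the wiring, not the app's actual analytics code):

```ts
// Sketch only: wires NEXT_PUBLIC_MATOMO_URL and NEXT_PUBLIC_MATOMO_SITE_ID into
// Matomo's standard JavaScript tracker. Not taken from the GeoExplorer source.
declare global {
  interface Window {
    _paq?: unknown[][];
  }
}

export function initMatomo() {
  const url = process.env.NEXT_PUBLIC_MATOMO_URL;
  const siteId = process.env.NEXT_PUBLIC_MATOMO_SITE_ID;
  if (!url || !siteId) return; // tracking only runs when both values are configured

  window._paq = window._paq || [];
  window._paq.push(["trackPageView"]);
  window._paq.push(["enableLinkTracking"]);
  window._paq.push(["setTrackerUrl", `${url}/matomo.php`]);
  window._paq.push(["setSiteId", siteId]);

  const script = document.createElement("script");
  script.async = true;
  script.src = `${url}/matomo.js`;
  document.head.appendChild(script);
}
```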
Before you create a pull request, write an issue so we can discuss your changes.
Thanks goes to these wonderful people (emoji key):
Hans Hack 💻 🖋 🔣 📖 📆
alsino 💻 🖋 🔣
Lisa-Stubert 🖋 📆
anna 🎨
Klemens 🖋 📆
This project follows the all-contributors specification. Contributions of any kind welcome!
Texts and content available as CC BY.