Update routes for specific dates #638

Open · wants to merge 56 commits into base: master
Changes from 19 commits
Commits (56)
614bfe3
renamed save_routes method to save_new_routes
Brian-Lee Apr 22, 2020
857fe04
fixed naming error with routeconfig.new_save_routes
Brian-Lee Apr 22, 2020
c87b43e
added save_old_routes method
Brian-Lee Apr 22, 2020
3817c06
called save_old_routes after calling save_new_routes - doesnt save pr…
Brian-Lee Apr 22, 2020
22cb4c3
some progress making a versioned cache dir for old route
Brian-Lee Apr 23, 2020
e56fa1e
eliminated unecessary use_versioning variable
Brian-Lee Apr 23, 2020
28b1931
consolidated save_old_routes and save_new_routes into save_routes
Brian-Lee Apr 23, 2020
cb55856
framework for executing a scrape saving routes normally followed by a…
Brian-Lee Apr 23, 2020
33a8e53
removed 'notdated' from non-archived routes JSON files
Brian-Lee Apr 23, 2020
690e0cc
removed unused method download_gtfs_data and 'dated' from filenames
Brian-Lee Apr 23, 2020
0c9fba6
put in a more realistic date for the archive date version for the sin…
Brian-Lee Apr 23, 2020
26f1772
moved imports to the top
Brian-Lee Apr 23, 2020
9998299
added reminder comment to properly get archived GTFS data
Brian-Lee Apr 23, 2020
ee2ebe5
can add multiple archive urls to archive routes
Brian-Lee Apr 23, 2020
50812c4
pulling archive urls from a list
Brian-Lee Apr 23, 2020
5f7112f
make url from date and loop through archiving urls for archiving routes
Brian-Lee Apr 23, 2020
652a459
use transitfeeds api to get old routes to version by date and cache -…
Brian-Lee Apr 24, 2020
36565da
eliminate unecessary param archiving_old and other cleanup
Brian-Lee Apr 24, 2020
7ac1b10
some cleanup
Brian-Lee Apr 24, 2020
6ebe655
passed archiving_date instead of current date per reviewer suggestion
Brian-Lee Apr 27, 2020
ff9500b
remove unecessary archive_date
Brian-Lee Apr 27, 2020
e618a1f
framework to take archiving_date argument
Brian-Lee Apr 27, 2020
aa0b423
added some comments
Brian-Lee Apr 27, 2020
d4f13f8
changed archived_date to gtfs_date
Brian-Lee Apr 27, 2020
6e0c3b3
combined GtfsScraper calls for both cases
Brian-Lee Apr 27, 2020
24f56e9
eliminated variable d
Brian-Lee Apr 27, 2020
bb7dbf4
added backwards date search if gtfs_date doesnt match exact zipfile date
Brian-Lee Apr 27, 2020
4d568e4
date suffix now matches actual date found and used
Brian-Lee Apr 27, 2020
d5a7cbc
removed duplicative checking for dated gtfs zipfile
Brian-Lee Apr 27, 2020
0bf3cf2
pass gtfs_path to scraper instead of gtfs_date
Brian-Lee Apr 27, 2020
e496466
some cleanup
Brian-Lee Apr 27, 2020
93b90b7
fixed bug where save_routes.py broken without gtsf_date argument
Brian-Lee Apr 27, 2020
af19055
added a comment
Brian-Lee Apr 27, 2020
483a0b2
combined duplicative lines
Brian-Lee Apr 30, 2020
2084742
changed command line argument gtfs_date to date
Brian-Lee Apr 30, 2020
385e209
changed the method of finding most recent gtfs zip
Brian-Lee Apr 30, 2020
8711770
Merge branch 'master' of https://github.com/trynmaps/metrics-mvp into…
Brian-Lee May 7, 2020
f35f14a
combined two identical lines into one
Brian-Lee May 7, 2020
e219b42
reduced if-else to just if
Brian-Lee May 7, 2020
198a3ae
eliminated unecessary else keyword
Brian-Lee May 7, 2020
8fcbb5b
removed outdated comments
Brian-Lee May 7, 2020
46c9927
removed unecessary assignment of save_to_s3
Brian-Lee May 7, 2020
d4f8b72
changed
Brian-Lee May 7, 2020
2a4c807
setting starting date more appropriately to date argument
Brian-Lee May 7, 2020
a5f5a62
chained two lines into one
Brian-Lee May 21, 2020
4d24dfe
eliminated unecessary import shutil
Brian-Lee May 21, 2020
fdba9b7
changed all occurrences of version_date to gtfs_date
Brian-Lee May 21, 2020
d2be8d4
changed one missed version_date to gtfs_date and removed unecessary i…
Brian-Lee May 21, 2020
78e5386
removed unecessary imports
Brian-Lee May 21, 2020
ed4c4d7
improved the comment
Brian-Lee May 21, 2020
3b30c68
removed unecessary parameter from save_routes method
Brian-Lee May 21, 2020
7b81c1a
simplified vars - removed date_to_use
Brian-Lee May 21, 2020
8a7443c
added error msg for dated gtfs file not found and moved code into new…
Brian-Lee May 22, 2020
de0b556
corrected inconsistency -YYYY-MM-DD vs _YYYY-MM-DD in routes path
Brian-Lee May 22, 2020
5639610
fixed introduced bug resetting gtfs_path and gtfs_date outside of ELSE
Brian-Lee May 22, 2020
db1bacc
conditionally load gtfs data from the cache-dir or gtfs_path
Brian-Lee May 22, 2020
39 changes: 27 additions & 12 deletions backend/models/gtfs.py
@@ -8,6 +8,8 @@
import gzip
import hashlib
import zipfile
import shutil
import os

from . import config, util, nextbus, routeconfig, timetables

@@ -50,27 +52,40 @@ def get_stop_geometry(stop_xy, shape_lines_xy, shape_cumulative_dist, start_index):
'offset': int(best_offset) # distance in meters between this stop and the closest line segment of shape
}

def download_gtfs_data(agency: config.Agency, gtfs_cache_dir):
gtfs_url = agency.gtfs_url
def download_gtfs_data(agency: config.Agency, gtfs_cache_dir, archiving_url=None):
cache_dir = Path(gtfs_cache_dir)
zip_path = f'{util.get_data_dir()}/gtfs-{agency.id}.zip'
if archiving_url == None:
gtfs_url = agency.gtfs_url
else:
gtfs_url = archiving_url

# need to delete existing zip file and directory in order
# to reuse for the archiving passes
if cache_dir.exists():
shutil.rmtree(cache_dir)
print('removed',cache_dir)
os.remove(zip_path)
print('removed',zip_path)

if gtfs_url is None:
raise Exception(f'agency {agency.id} does not have gtfs_url in config')

cache_dir = Path(gtfs_cache_dir)

if not cache_dir.exists():
print(f'downloading gtfs data from {gtfs_url}')
r = requests.get(gtfs_url)

if r.status_code != 200:
raise Exception(f"Error fetching {gtfs_url}: HTTP {r.status_code}: {r.text}")

zip_path = f'{util.get_data_dir()}/gtfs-{agency.id}.zip'

with open(zip_path, 'wb') as f:
f.write(r.content)

with zipfile.ZipFile(zip_path, 'r') as zip_ref:
zip_ref.extractall(gtfs_cache_dir)



def is_subsequence(smaller, bigger):
smaller_len = len(smaller)
bigger_len = len(bigger)
@@ -108,15 +123,14 @@ def contains_excluded_stop(shape_stop_ids, excluded_stop_ids):
return False

class GtfsScraper:
def __init__(self, agency: config.Agency):
def __init__(self, agency: config.Agency, archiving_url=None):
self.agency = agency
self.agency_id = agency_id = agency.id
gtfs_cache_dir = f'{util.get_data_dir()}/gtfs-{agency_id}'

download_gtfs_data(agency, gtfs_cache_dir)
download_gtfs_data(agency, gtfs_cache_dir, archiving_url=archiving_url)

self.feed = ptg.load_geo_feed(gtfs_cache_dir, {})

self.errors = []
self.stop_times_by_trip = None
self.stops_df = None
@@ -261,7 +275,8 @@ def save_timetables(self, save_to_s3=False, skip_existing=False):
agency_id = self.agency_id

dates_map = self.get_services_by_date()

##bri## print('here\n\n\n',dates_map)
##bri## exit()
#
# Typically, many dates have identical scheduled timetables (with times relative to midnight on that date).
# Instead of storing redundant timetables for each date, store one timetable per route for each unique set of service_ids.
@@ -1058,7 +1073,7 @@ def get_sort_key(route_data):
return route_data['title']
return sorted(routes_data, key=get_sort_key)

def save_routes(self, save_to_s3, d):
def save_routes(self, save_to_s3, d, version_date=None):
agency = self.agency
agency_id = agency.id
routes_df = self.get_gtfs_routes()
@@ -1078,4 +1093,4 @@ def save_routes(self, save_to_s3, d):

routes = [routeconfig.RouteConfig(agency_id, route_data) for route_data in routes_data]

routeconfig.save_routes(agency_id, routes, save_to_s3=save_to_s3)
routeconfig.save_routes(agency_id, routes, save_to_s3=save_to_s3, version_date=version_date)
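
For reference, a minimal usage sketch of the new archiving_url parameter introduced in the gtfs.py changes above. The agency lookup via config.get_agency, the archive URL, and the dates are assumptions for illustration only, not part of this diff:

from datetime import date
from models import config, gtfs

agency = config.get_agency('sf-muni')  # hypothetical agency id

# Normal pass: download the agency's current GTFS feed and save the routes.
scraper = gtfs.GtfsScraper(agency)
scraper.save_routes(False, date.today())

# Archiving pass: download a dated feed from an archive URL instead,
# then save its routes under a versioned cache path for that date.
archive_url = 'https://transitfeeds.com/p/sfmta/60/20200415/download'  # hypothetical
archived = gtfs.GtfsScraper(agency, archiving_url=archive_url)
archived.save_routes(False, date.today(), version_date='2020-04-15')
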
23 changes: 18 additions & 5 deletions backend/models/routeconfig.py
@@ -1,4 +1,5 @@
import re, os, time, requests, json, boto3, gzip
from pathlib import Path
from . import util, config

DefaultVersion = 'v3a'
@@ -121,8 +122,17 @@ def get_directions_for_stop(self, stop_id):
for s in direction['stops'] if s == stop_id
]

def get_cache_path(agency_id, version=DefaultVersion):
return f'{util.get_data_dir()}/routes_{version}_{agency_id}.json'
def get_cache_path(agency_id, version=DefaultVersion, version_date=None):
# version_date is for saving old versions of routes
# It has nothing to do with version=DefaultVersion
if version_date == None:
return f'{util.get_data_dir()}/routes_{version}_{agency_id}.json'
else:
return f'{util.get_data_dir()}/routes_{version}_{agency_id}_{version_date}/routes_{version}_{agency_id}_{version_date}.json'


##bri##return f"{util.get_data_dir()}/datekeys_{version}_{agency_id}/datekeys_{version}_{agency_id}.json"


def get_s3_path(agency_id, version=DefaultVersion):
return f'routes/{version}/routes_{version}_{agency_id}.json.gz'
@@ -179,14 +189,17 @@ def get_route_config(agency_id, route_id, version=DefaultVersion):
return route
return None

def save_routes(agency_id, routes, save_to_s3=False):
def save_routes(agency_id, routes, save_to_s3=False, version_date=None):
data_str = json.dumps({
'version': DefaultVersion,
'routes': [route.data for route in routes]
}, separators=(',', ':'))

cache_path = get_cache_path(agency_id)

cache_path = get_cache_path(agency_id, version_date=version_date)
cache_dir = Path(cache_path).parent
if not cache_dir.exists():
cache_dir.mkdir(parents = True, exist_ok = True)

with open(cache_path, "w") as f:
f.write(data_str)

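
To illustrate the updated get_cache_path behavior in routeconfig.py, a small sketch of the paths it now returns. The agency id and date are hypothetical, and <data_dir> stands for whatever util.get_data_dir() returns:

from models import routeconfig

# Without a version_date the path is unchanged:
routeconfig.get_cache_path('sf-muni')
# -> '<data_dir>/routes_v3a_sf-muni.json'

# With a version_date, the routes JSON is written inside a dated subdirectory,
# which save_routes now creates with mkdir(parents=True, exist_ok=True):
routeconfig.get_cache_path('sf-muni', version_date='2020-04-15')
# -> '<data_dir>/routes_v3a_sf-muni_2020-04-15/routes_v3a_sf-muni_2020-04-15.json'
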
35 changes: 31 additions & 4 deletions backend/save_routes.py
@@ -2,6 +2,8 @@
from compute_stats import compute_stats_for_dates
import argparse
from datetime import date
import requests
from secrets import transitfeeds_api_key # you may have to create this

# Downloads and parses the GTFS specification
# and saves the configuration for all routes to S3.
@@ -34,7 +36,7 @@
#
# Currently the script just overwrites the one S3 path, but this process could be extended in the future to
# store different paths for different dates, to allow fetching historical data for route configurations.
#
# UPDATE: We are now saving some older routes in versioned directories in metrics-mvp/backend/data

if __name__ == '__main__':
parser = argparse.ArgumentParser(description='Save route configuration from GTFS and possibly Nextbus API')
@@ -52,12 +54,38 @@

save_to_s3 = args.s3
d = date.today()

archive_date = date.today()
errors = []

limit = '10'
urls_feed = 'https://api.transitfeeds.com/v1/getFeedVersions?key=' + transitfeeds_api_key + '&feed=sfmta%2F60&page=1&limit=' + limit + '&err=1&warn=1'

response = requests.get(urls_feed)
data = response.json()
archiving_urls = []
archiving_dates = []
for i in range(len(data['results']['versions'])):
archiving_urls.append(data['results']['versions'][i]['url'])
archiving_dates.append(archiving_urls[i].split('/')[6])
archiving_dates[i] = archiving_dates[i][:4]+'-'+archiving_dates[i][4:6]+'-'+archiving_dates[i][6:]

for agency in agencies:
scraper = gtfs.GtfsScraper(agency)
scraper = gtfs.GtfsScraper(agency, archiving_url=None)
scraper.save_routes(save_to_s3, d)
errors += scraper.errors
'''
use https://transitfeeds.com/api/swagger/
to get old routes
and cache them in date versioned folders

'''

while(len(archiving_dates) > 0):
archiving_date = archiving_dates.pop()
archiving_url = archiving_urls.pop()
scraper_archiving = gtfs.GtfsScraper(agency, archiving_url=archiving_url)
scraper_archiving.save_routes(False, d, version_date=archiving_date)
errors += scraper_archiving.errors

if args.timetables:
timetables_updated = scraper.save_timetables(save_to_s3=save_to_s3, skip_existing=True)
@@ -66,7 +94,6 @@
dates = sorted(scraper.get_services_by_date().keys())
compute_stats_for_dates(dates, agency, scheduled=True, save_to_s3=save_to_s3)

errors += scraper.errors

if errors:
raise Exception("\n".join(errors))
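
The date handling in the save_routes.py changes above relies on the feed date being embedded in each transitfeeds feed-version URL. A small sketch of that conversion, plus the local secrets module the script imports; the URL and key values are hypothetical:

# backend/secrets.py (create locally, not checked in) -- assumed contents:
# transitfeeds_api_key = 'your-api-key-here'

# URL-to-date conversion as done in save_routes.py:
url = 'https://transitfeeds.com/p/sfmta/60/20200415/download'  # hypothetical feed-version URL
raw = url.split('/')[6]                                # '20200415'
gtfs_date = raw[:4] + '-' + raw[4:6] + '-' + raw[6:]  # '2020-04-15'
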