diff --git a/pytides_examples/Tide_Module_Examples_pytides.ipynb b/pytides_examples/Tidetools_Module_Examples.ipynb
similarity index 62%
rename from pytides_examples/Tide_Module_Examples_pytides.ipynb
rename to pytides_examples/Tidetools_Module_Examples.ipynb
index 2e908c5..c7a5230 100644
--- a/pytides_examples/Tide_Module_Examples_pytides.ipynb
+++ b/pytides_examples/Tidetools_Module_Examples.ipynb
@@ -2,19 +2,21 @@
"cells": [
{
"cell_type": "markdown",
+ "id": "incident-incident",
"metadata": {},
"source": [
- "# Example of Tide Prediction For One Date Instance\n",
+ "# Tidetools Module Functionalities\n",
"\n",
- "- In this example, method used to predict tide is adapated from Pytides\n",
+ "- In these examples, methods used to predict tide are adapated from Pytides\n",
"- This implementation will only work for known NOAA gauge stations\n",
- "- Harmonic Constituents data is scraped from NOAA. \n",
+ "- Harmonic Constituents data is fetched from NOAA. \n",
"\n",
"Adapted by @rjleveque from notebook of @socoyjonathan to test local pytides and new tidetools.py."
]
},
{
"cell_type": "markdown",
+ "id": "renewable-penetration",
"metadata": {},
"source": [
"## Tide Prediction Module Functions"
@@ -22,44 +24,50 @@
},
{
"cell_type": "markdown",
+ "id": "domestic-escape",
"metadata": {},
"source": [
- " - retrieve_constituents - retrieves harmonic constituents from NOAA gauge station\n",
+ " - fetch_harcon - Fetches harmonic constituents for CO-OPS station.\n",
+ " - make_pytides_model - Fetches harmonic constituents for station\n",
" - fetch_noaa_tide_data()
- retrieves datetimes, water levels and tide predictions at given NOAA tide station from NOAA's API\n",
- " \n",
- " - datetimes - prepares a collection of datetimes from beginning to end dates if needed\n",
- " \n",
+ " - fetch_datums - Fetch datums for CO-OPS station.\n",
" - predict_tide()
- predicts tide for desired NOAA station given station ID, start date and end date for prediction \n",
- " - datum_value - retrieves datum value for desired datum reference, utilized by predict_tide to obtain MTL value\n",
+ " - datetimes - prepares a collection of datetimes from beginning to end dates as needed\n",
" - detide()
- detides observed water levels with predicted tide\n",
- " - surge()
- predicts surge at NOAA gauge station provided station ID, start date, end date, and landfall date, best for a Clawpack Simulation!"
+ " - surge()
- predicts surge at NOAA gauge station provided station ID, start date, end date, and landfall date! "
]
},
{
"cell_type": "code",
"execution_count": null,
+ "id": "oriental-psychology",
"metadata": {},
"outputs": [],
"source": [
+ "from clawpack.geoclaw import tidetools\n",
"import matplotlib.pyplot as plt\n",
- "import datetime\n",
- "\n",
- "#from clawpack.geoclaw import tidetools\n",
- "\n",
- "# Eventually move tidetools.py to geoclaw\n",
- "# For development purposes, this is temporarily in this repo:\n",
- "\n",
- "import os, sys\n",
- "pathstr = os.path.abspath('..')\n",
- "if pathstr not in sys.path:\n",
- " sys.path.insert(0,pathstr)\n",
- "import tidetools\n",
- "\n",
- "print('Using tidetools from: %s' % tidetools.__file__)"
+ "import datetime"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "id": "60cdf85c",
+ "metadata": {},
+ "source": [
+ "*******************************************************************************************************************"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "id": "a103cb9a",
+ "metadata": {},
+ "source": [
+ "# Example of Tide Prediction For One Date Instance"
]
},
{
"cell_type": "markdown",
+ "id": "julian-thesaurus",
"metadata": {},
"source": [
"### **** Station Information ****"
@@ -67,6 +75,7 @@
},
{
"cell_type": "markdown",
+ "id": "after-minister",
"metadata": {},
"source": [
"Locate NOAA station ID. NOAA gauge stations home: https://tidesandcurrents.noaa.gov/
\n",
@@ -76,6 +85,7 @@
{
"cell_type": "code",
"execution_count": null,
+ "id": "sought-buyer",
"metadata": {},
"outputs": [],
"source": [
@@ -89,6 +99,7 @@
},
{
"cell_type": "markdown",
+ "id": "reported-mustang",
"metadata": {},
"source": [
"### Tide Prediction"
@@ -96,9 +107,10 @@
},
{
"cell_type": "markdown",
+ "id": "acceptable-chest",
"metadata": {},
"source": [
- "Prediction of tide at specified location (station ID) and specified time (GMT) implemented below by calling predict_tide()
method with the following arguments: station_id, beg_prediction_date, end_prediction_date. Note: datum argument is optional\n",
+ "Prediction of tide at specified location (station ID) and specified time (GMT) implemented below by calling predict_tide()
method with the following arguments: station_id, beg_prediction_date, end_prediction_date. Note: datum, time zone and units arguments are optional\n",
"\n",
"
\n",
"\n",
@@ -108,17 +120,19 @@
{
"cell_type": "code",
"execution_count": null,
+ "id": "complex-essay",
"metadata": {},
"outputs": [],
"source": [
"#NOAA Data Scraping Implementation \n",
- "height = tidetools.predict_tide(station_id, prediction_date, prediction_date, datum='MTL')\n",
+ "height = tidetools.predict_tide(station_id, prediction_date, prediction_date, datum)\n",
"times = tidetools.datetimes(prediction_date, prediction_date) # in meters\n",
"print(height[0], \"meters\")"
]
},
{
"cell_type": "markdown",
+ "id": "allied-thompson",
"metadata": {},
"source": [
"*******************************************************************************************************************"
@@ -126,6 +140,7 @@
},
{
"cell_type": "markdown",
+ "id": "infectious-defensive",
"metadata": {},
"source": [
"# Example of Tide Prediction In A Date Interval "
@@ -133,6 +148,7 @@
},
{
"cell_type": "markdown",
+ "id": "bacterial-chase",
"metadata": {},
"source": [
"### Station Information "
@@ -140,6 +156,7 @@
},
{
"cell_type": "markdown",
+ "id": "efficient-marathon",
"metadata": {},
"source": [
"Fill in station ID, a beginning date and an end date for tide prediction below"
@@ -148,6 +165,7 @@
{
"cell_type": "code",
"execution_count": null,
+ "id": "exciting-andrews",
"metadata": {},
"outputs": [],
"source": [
@@ -159,12 +177,13 @@
"beg_date = datetime.datetime(2005, 8, 26, hour=0)\n",
"end_date = datetime.datetime(2005, 8, 31, hour=0)\n",
"\n",
- "#Predict tide with arguments set as: (station_id, beg_prediction_date, end_prediction_date)\n",
- "predicted_tide = tidetools.predict_tide(station_id, beg_date, end_date, datum='MTL')"
+ "#Predict tide with arguments set as: (station_id, beg_pred_date, end_pred_date)\n",
+ "predicted_tide = tidetools.predict_tide(station_id, beg_date, end_date, datum)"
]
},
{
"cell_type": "markdown",
+ "id": "rapid-russell",
"metadata": {},
"source": [
"### Tide Predictions\n",
@@ -174,22 +193,24 @@
{
"cell_type": "code",
"execution_count": null,
+ "id": "unauthorized-marsh",
"metadata": {},
"outputs": [],
"source": [
- "#Method datetimes() makes a range of datetimes given arguments: (beg_prediction_date, end_prediction_date)\n",
+ "#Method datetimes() makes a range of datetimes given args: (beg_pred_date, end_pred_date)\n",
"times = tidetools.datetimes(beg_date, end_date)\n",
"\n",
"plt.figure(figsize=(20,10))\n",
"plt.plot(times, predicted_tide, \"-\", label=\"Tide Prediction\")\n",
"plt.xlabel('Hours since ' + str(beg_date) + ' (GMT)')\n",
"plt.ylabel('Meters'), plt.margins(x=0), plt.legend(loc = 'best')\n",
- "plt.title('My Prediction for Station {}'.format(station_id))\n",
+ "plt.title('Pytide Tide Prediction for Station {}'.format(station_id))\n",
"plt.show()"
]
},
{
"cell_type": "markdown",
+ "id": "macro-fossil",
"metadata": {},
"source": [
"*******************************************************************************************************************"
@@ -197,14 +218,24 @@
},
{
"cell_type": "markdown",
+ "id": "understanding-occurrence",
+ "metadata": {},
+ "source": [
+ "# Example Comparing Pytides vs NOAA Tide Prediction In A Date Interval "
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "id": "22f2867f",
"metadata": {},
"source": [
- "# Example Comparing NOAA vs Our Tide Prediction In A Date Interval "
+ "In this example, we implement the Pytides submodule to predict the tide from Hurricane Katrina in 2005 on Grand Isle, LA (NOAA gauge station: 8761724) between start date, 08-26-2005, and end date, 08-31-2005. NOAA's predicted tide and observed water levels are also displayed for comparison."
]
},
{
"cell_type": "code",
"execution_count": null,
+ "id": "heated-glossary",
"metadata": {},
"outputs": [],
"source": [
@@ -217,54 +248,51 @@
"end_date = datetime.datetime(2005, 8, 31)\n",
"\n",
"#Predict Tide \n",
- "predicted_tide = tidetools.predict_tide(station_id, beg_date, end_date, datum='MTL')"
+ "predicted_tide = tidetools.predict_tide(station_id, beg_date, end_date, datum)"
]
},
{
"cell_type": "markdown",
+ "id": "brief-madonna",
"metadata": {},
"source": [
"- Calling function fetch_noaa_tide_data()
with arguments set as (station_id, beg_prediction_date, end_prediction_date) retrieves datetimes, water levels and tide predictions for the specified NOAA station in the date interval provided from NOAA's API\n",
- "- Data is scraped in Metric units, GMT timezone, MTL datum and 6 min intervals. These arguments are optional in fetch_noaa_tide_data()
."
- ]
- },
- {
- "cell_type": "markdown",
- "metadata": {},
- "source": [
- "RJL: I'd suggest breaking the line in the next cell so it fits better when viewed in a narrow browser window. There are also a couple other places below where long lines could be split."
+ "- Data is fetched in Metric units, GMT timezone, MTL datum and 6 min intervals. These arguments are optional in fetch_noaa_tide_data()
."
]
},
{
"cell_type": "code",
"execution_count": null,
+ "id": "minimal-dancing",
"metadata": {},
"outputs": [],
"source": [
"#Retrieve NOAA Tide Data\n",
"times, NOAA_observed_water_lvl, NOAA_predicted_tide = \\\n",
- " tidetools.fetch_noaa_tide_data(station_id, beg_date, end_date, datum='MTL')"
+ " tidetools.fetch_noaa_tide_data(station_id, beg_date, end_date)"
]
},
{
"cell_type": "code",
"execution_count": null,
+ "id": "verified-minimum",
"metadata": {},
"outputs": [],
"source": [
"#Plot Comparisons\n",
"plt.figure(figsize=(20,10))\n",
- "plt.plot(times, predicted_tide, \"-\", label=\"Our Tide Prediction\")\n",
+ "plt.plot(times, predicted_tide, \"-\", label=\"Pytides Prediction\")\n",
"plt.plot(times, NOAA_predicted_tide, \"-\", label=\"NOAA Tide Prediction\")\n",
"plt.plot(times, NOAA_observed_water_lvl, \"-\", label=\"NOAA Water Level Observation\")\n",
"plt.xlabel('Hours since ' + str(beg_date) + ' (GMT)')\n",
"plt.ylabel('Metres'), plt.margins(x=0), plt.legend(loc = 'best')\n",
- "plt.title('Comparison of Our Prediction vs NOAA prediction for Station {}'.format(station_id))\n",
+ "plt.title('Comparison of Pytides Tide Prediction vs NOAA Prediction for Station {}'.format(station_id))\n",
"plt.show()"
]
},
{
"cell_type": "markdown",
+ "id": "acquired-attitude",
"metadata": {},
"source": [
"*******************************************************************************************************************"
@@ -272,20 +300,23 @@
},
{
"cell_type": "markdown",
+ "id": "loaded-porter",
"metadata": {},
"source": [
- "RJL: I think you need some explanation either here or at the start of this example of what you are showing, i.e. that you have chose and date and location where there was a bit storm surge (from what event?) so the reader knows what they are looking at. And below you might comment that the reason for detiding the observations might be to compare the surge alone with GeoClaw simulation results, for example."
+ "# Example Detiding and Capturing A Surge for a Gauge Station "
]
},
{
"cell_type": "markdown",
+ "id": "61648635",
"metadata": {},
"source": [
- "# Example Detiding and Capturing A Surge for a Gauge Station "
+ "In this example, we detide the observed water levels for Hurricane Katrina in 2005 on Grand Isle, LA (NOAA gauge station: 8761724) between start date, 08-26-2005, and end date, 08-31-2005, using Pytides predicted tide. With this method, one may compare the storm surge from Pytides prediction with GeoClaw simulation results as shown in the next example."
]
},
{
"cell_type": "markdown",
+ "id": "golden-equivalent",
"metadata": {},
"source": [
"- Calling predict_tide()
method with arguments set as: (station_id, beg_prediction_date, end_prediction_date) outputs predicted tide\n",
@@ -296,6 +327,7 @@
{
"cell_type": "code",
"execution_count": null,
+ "id": "perceived-variety",
"metadata": {},
"outputs": [],
"source": [
@@ -309,13 +341,13 @@
"\n",
"predicted_tide = tidetools.predict_tide(station_id, beg_date, end_date)\n",
"times, NOAA_observed_water_lvl, NOAA_predicted_tide = \\\n",
- " tidetools.fetch_noaa_tide_data(station_id, beg_date, end_date, datum='MTL')\n",
+ " tidetools.fetch_noaa_tide_data(station_id, beg_date, end_date, datum)\n",
"\n",
"surge = tidetools.detide(NOAA_observed_water_lvl, predicted_tide)\n",
"\n",
"#Plot Comparisons\n",
"plt.figure(figsize=(20,10))\n",
- "plt.plot(times, surge, \"-\", label=\"Our Surge Prediction\")\n",
+ "plt.plot(times, surge, \"-\", label=\"Pytides Storm Surge Prediction\")\n",
"plt.xlabel('Hours since ' + str(beg_date) + ' (GMT)')\n",
"plt.ylabel('Metres'), plt.margins(x=0), plt.legend(loc = 'best')\n",
"plt.title('Detided Water Level for Station {}'.format(station_id))\n",
@@ -324,6 +356,7 @@
},
{
"cell_type": "markdown",
+ "id": "continued-fruit",
"metadata": {},
"source": [
"*******************************************************************************************************************"
@@ -331,29 +364,26 @@
},
{
"cell_type": "markdown",
+ "id": "useful-invasion",
"metadata": {},
"source": [
- "# Example for Clawpack Storm Surge Implementation"
- ]
- },
- {
- "cell_type": "markdown",
- "metadata": {},
- "source": [
- "RJL: It's not clear what you mean here by \"storm surge implementation\". It seems like you are talking about comparing the NOAA gauges results with the surge simulated by GeoClaw, but I don't think you are showing any GeoClaw results in this notebook. This might not be clear to a reader, and when you label some curves with e.g. \"our detided prediction\" they might think that's the GeoClaw simulated surge, whereas I think you mean the NOAA observations minus the predicted tide that you computed from the constituents using the new `tide.py` (as opposed to the NOAA predictions downloaded from the webpage)."
+ "# Example Implementation of Pytides Tide Prediction on Clawpack"
]
},
{
"cell_type": "markdown",
+ "id": "completed-awareness",
"metadata": {},
"source": [
- "- Code below works best if placed in gauge_afteraxes( ) in setplot.py for a storm simulation.\n",
- "- Calling surge()
method with arguments set as: (station_id, beginning_date, end_date, landfall_date) will output storm surge from NOAA observed water levels and predicted tide."
+ "- Code below may be utilized to compare NOAA guage results with Clawpack storm surge simulations. \n",
+ "- Code below works best if placed in gauge_afteraxes( ) in setplot.py for comparison.\n",
+ "- Calling surge()
method with arguments set as: (station_id, beginning_date, end_date, landfall_date) will output predicted storm surge from NOAA observed water levels minus Pytides submodule predicted tide."
]
},
{
"cell_type": "code",
"execution_count": null,
+ "id": "classified-speaker",
"metadata": {},
"outputs": [],
"source": [
@@ -367,11 +397,12 @@
"\n",
"# Surge Prediction\n",
"times, surge = tidetools.surge(station_id, beg_date, end_date, landfall_date)\n",
- "plt.plot(times, surge, color=\"b\", label=\"Our Prediction\")"
+ "plt.plot(times, surge, color=\"b\", label=\"Pytides Prediction\")"
]
},
{
"cell_type": "markdown",
+ "id": "growing-labor",
"metadata": {},
"source": [
"*******************************************************************************************************************"
@@ -379,6 +410,7 @@
},
{
"cell_type": "markdown",
+ "id": "polished-sector",
"metadata": {},
"source": [
"# Example Iterating Through A Library Of Stations And Date Intervals"
@@ -386,24 +418,36 @@
},
{
"cell_type": "markdown",
+ "id": "552f02ab",
"metadata": {},
"source": [
- "RJL: Again you might add some more discussion here, saying that these particular stations and dates correspond to particular surge events (which is sort of clear from the comments such as `#katrina` in the cell below, but could be explained more, perhaps with links to NOAA pages or elsewhere describing these events?)\n",
+ "In this example, we provide tide and storm surge predictions for five different storm events utilizing Pytides submodule and data from NOAA. \n",
+ "\n",
+ "1) First storm event considered is [Hurricane Katrina](https://www.nhc.noaa.gov/data/tcr/AL122005_Katrina.pdf) on NOAA station [Grand Isle, LA](https://tidesandcurrents.noaa.gov/stationhome.html?id=8761724) with a storm surge within the given dates. \n",
+ "\n",
+ "2) Second storm event considered is [Hurricane Michael](https://www.nhc.noaa.gov/data/tcr/AL142018_Michael.pdf) on NOAA station [Pilots Station East, SW Pass, LA](https://tidesandcurrents.noaa.gov/stationhome.html?id=8760922) with a storm surge within the given dates. \n",
"\n",
- "RJL: Also, in a lot of figures below it looks like the observations were pretty far from the predicted tides even before the big surge arrived, is that because the storm was already having a significant effect (and hence this deviation from predictions should be preserved in the de-tided version that you will compare to GeoClaw results, since it should also be seen in the simulations presumably) or is this because the predictions are poor for some other reason? It would be valuable to explain this a bit more. In the case of tsunamis, often nothing unusual happens until the first big wave arrives, but surge is probably different."
+ "3) Third storm event considered is [Hurricane Matthew](https://www.nhc.noaa.gov/data/tcr/AL142016_Matthew.pdf) on NOAA station [Wilmington, NC](https://tidesandcurrents.noaa.gov/stationhome.html?id=8658120) with a storm surge within the given dates. \n",
+ "\n",
+ "5) Fourth storm event considered is [Hurricane Irma](https://www.nhc.noaa.gov/data/tcr/AL112017_Irma.pdf) on NOAA station [Vaca Key, Florida Bay, FL](https://tidesandcurrents.noaa.gov/stationhome.html?id=8723970) with a storm surge within the given dates. \n",
+ "\n",
+ "4) Fifth storm event considered is [Hurricane Dorian](https://www.nhc.noaa.gov/data/tcr/AL052019_Dorian.pdf) on NOAA station [Trident Pier, Port Canaveral, FL](https://tidesandcurrents.noaa.gov/stationhome.html?id=8721604) with a storm surge within the given dates. "
]
},
{
"cell_type": "code",
"execution_count": null,
- "metadata": {},
+ "id": "correct-twist",
+ "metadata": {
+ "scrolled": false
+ },
"outputs": [],
"source": [
"station_dict = {'8761724': ('Grand Isle, LA', (2005, 8, 26), (2005, 8, 31), (2005, 8, 29, 11, 10)), #katrina\n",
" '8760922': ('Pilots Station East, SW Pass, LA', (2005, 8, 26), (2005, 8, 31), (2005, 8, 29, 11)), #michael\n",
" '8658120': ('Wilmington, NC', (2016, 10, 6, 12), (2016, 10, 9, 12), (2016, 10, 8, 12)), #matthew\n",
- " '8721604': ('Trident Pier, Port Canaveral, FL', (2019, 8, 24), (2019, 9, 9), (2019, 9, 4, 12)), #dorian\n",
- " '8723970': ('Vaca Key, Florida Bay, FL', (2017, 9, 6, 13), (2017, 9, 12, 13), (2017, 9, 10, 13)) #irma\n",
+ " '8723970': ('Vaca Key, Florida Bay, FL', (2017, 9, 6, 13), (2017, 9, 12, 13), (2017, 9, 10, 13)), #irma\n",
+ " '8721604': ('Trident Pier, Port Canaveral, FL', (2019, 8, 24), (2019, 9, 9), (2019, 9, 4, 12)) #dorian\n",
" }\n",
"\n",
"for (key, value) in station_dict.items():\n",
@@ -417,7 +461,7 @@
" predicted_tide = tidetools.predict_tide(station_id, beg_date, end_date) \n",
" \n",
" times, NOAA_observed_water_lvl, NOAA_predicted_tide = \\\n",
- " tidetools.fetch_noaa_tide_data(station_id, beg_date, end_date, datum='MTL')\n",
+ " tidetools.fetch_noaa_tide_data(station_id, beg_date, end_date)\n",
"\n",
" #Detide Water Level\n",
" surge = tidetools.detide(NOAA_observed_water_lvl, predicted_tide)\n",
@@ -425,37 +469,37 @@
" \n",
" #Plot Comparisons\n",
" plt.figure(figsize=(20,10))\n",
- " plt.plot(times, predicted_tide, \"-\", label=\"Our Tide Prediction\")\n",
+ " plt.plot(times, predicted_tide, \"-\", label=\"Pytides Tide Prediction\")\n",
" plt.plot(times, NOAA_predicted_tide, \"-\", label=\"NOAA Tide Prediction\")\n",
" plt.plot(times, NOAA_observed_water_lvl, \"-\", label=\"NOAA Water Level Observation\")\n",
" plt.xlabel('Hours since ' + str(beg_date) + ' (GMT)')\n",
" plt.ylabel('Metres'), plt.margins(x=0), plt.legend(loc = 'best')\n",
- " plt.title('Comparison of Our Prediction vs NOAA prediction for Station {}, {}'.format(station_id, station_name))\n",
+ " plt.title('Comparison of Pytides Prediction vs NOAA Prediction for Station {}, {}'\\\n",
+ " .format(station_id, station_name))\n",
" plt.show()\n",
" \n",
" #Detided Water Level Comparison\n",
" plt.figure(figsize=(20,10))\n",
- " plt.plot(times, surge, \"-\", label=\"Our Detided Prediction\")\n",
- " plt.plot(times, NOAA_surge, \"-\", label=\"NOAA's Detided Prediction\")\n",
+ " plt.plot(times, surge, \"-\", label=\"Pytides Detided Prediction\")\n",
+ " plt.plot(times, NOAA_surge, \"-\", label=\"NOAA Detided Prediction\")\n",
" plt.xlabel('Hours since ' + str(beg_date) + ' (GMT)')\n",
" plt.ylabel('Metres'), plt.margins(x=0), plt.legend(loc = 'best')\n",
- " plt.title('Detided Water Level Comparison of Our Prediction vs NOAA prediction for Station {}, {}'.format(station_id, station_name))\n",
- " plt.show()\n",
- " \n",
- " \n",
- " #### Clawpack Implementation (in setplot.py) ####\n",
- " times, surge = tidetools.surge(station_id, beg_date, end_date, landfall_date)\n",
- " plt.plot(times, surge, color=\"b\", label=\"Our Surge Prediction\")\n",
- " \n",
- " "
+ " plt.title('Detided Water Level Comparison of Pytides Prediction vs NOAA Prediction for Station {}, {}'\\\n",
+ " .format(station_id, station_name))\n",
+ " plt.show()"
]
},
{
- "cell_type": "code",
- "execution_count": null,
+ "cell_type": "markdown",
+ "id": "91da5eb6",
"metadata": {},
- "outputs": [],
- "source": []
+ "source": [
+ "From the example above, one may observe that the NOAA water level observations before the storm surge deviate significantly from both Pytides and NOAA's predicted tide levels. Many factors may contribute to this significant discrepency in observed water levels and predicted tide levels way before the storm surge event. \n",
+ "\n",
+ "This discrepency may be most likely due to the effects of the storm event already affecting the water levels days prior to the event in the vicinity of the specified location. These effects may include changes in waves and wind setup, ocean and river currents, temperature of the ocean water, barometric pressure, and relative sea level changes during the time interval leading up to the event as presented by [NOAA] (https://www.nhc.noaa.gov/surge/surge_intro.pdf).\n",
+ "\n",
+ "However, other factor may contribute to this discrepency such as the difference in the actual tide level taking place and predicted tide level. Since a tide is the alternating rise and fall of the oceans with respect to the land, mainly produced by the gravitational attraction of the moon and sun but also due to non-astronomical factors such as the configuration of the coastline, local depth of the water and ocean-floor topography, the range of the tides, the times of arrival of the tides, and the time interval between high and low water will deviate from the harmonic analysis utilized by Pytides and NOAA as presented by [Paul Schureman](https://tidesandcurrents.noaa.gov/publications/SpecialPubNo98.pdf)."
+ ]
}
],
"metadata": {
@@ -474,7 +518,7 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
- "version": "3.6.10"
+ "version": "3.9.12"
}
},
"nbformat": 4,
diff --git a/pytides_examples/pytides_tools.py b/pytides_examples/pytides_tools.py
deleted file mode 100644
index 79bb89b..0000000
--- a/pytides_examples/pytides_tools.py
+++ /dev/null
@@ -1,51 +0,0 @@
-"""
-Functions to facilitate using PyTides.
-"""
-
-import sys, os
-
-# Temporary fix to import local submodule version of pytides:
-CLAW = os.environ['CLAW'] # path to Clawpack files
-pathstr = os.path.join(CLAW, 'tidal-examples/pytides')
-assert os.path.isdir(pathstr), '*** clawpack/tidal-examples/pytides ***'
-print('pytides_tools is using Pytides from: %s' % pathstr)
-if pathstr not in sys.path:
- sys.path.insert(0,pathstr)
-
-from pytides.tide import Tide
-import numpy as np
-
-def new_tide_instance_from_existing(constit_list,existing_tide_instance):
- """
- constit_list is the list of constituents to be used in the
- new_tide_instance.
- The values of the amplitudes and phases for each of them is to be
- pulled from an existing_tide_instance. If no such constituent is in
- the existing_tide_instance, an error message is printed.
- """
- existing_constits = existing_tide_instance.model['constituent']
- existing_amps = existing_tide_instance.model['amplitude']
- existing_phases = existing_tide_instance.model['phase']
- len_existing = len(existing_constits)
- new_model = np.zeros(len(constit_list), dtype = Tide.dtype)
- new_constits=[]; new_amps=[]; new_phases=[];
- for ic in constit_list:
- success = False
- for j in range(len_existing):
- ie = existing_constits[j]
- if (ie.name == ic.name): #grab it
- success = True
- new_constits.append(ie)
- new_amps.append(existing_amps[j])
- new_phases.append(existing_phases[j])
- if (success == False):
- print ('Did not find consituent name: ',ic.name,\
- 'in existing tide instance')
- new_model['constituent'] = new_constits
- new_model['amplitude'] = new_amps
- new_model['phase'] = new_phases
-
- # The new_model is now complete, so make a tide instance called
- #called new_tide from it.
- new_tide = Tide(model = new_model, radians = False)
- return new_tide
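The deleted `new_tide_instance_from_existing` helper pulls amplitudes and phases for a requested subset of constituents out of an existing tide model. Its matching logic can be sketched without Pytides, using plain constituent names and a dict (the `Tide.dtype` structured-array details are omitted, and the constituent values below are hypothetical):

```python
def subset_model(wanted, constituents, amplitudes, phases):
    """Select (amplitude, phase) pairs for the named constituents,
    mirroring the matching loop in the deleted helper."""
    table = {name: (amp, ph)
             for name, amp, ph in zip(constituents, amplitudes, phases)}
    for name in wanted:
        if name not in table:
            # The deleted helper prints a warning in this case.
            print('Did not find constituent name:', name)
    return {name: table[name] for name in wanted if name in table}

# Hypothetical data: M2, S2, K1 amplitudes (metres) and phases (degrees).
model = subset_model(['M2', 'K1'],
                     ['M2', 'S2', 'K1'],
                     [0.50, 0.12, 0.08],
                     [110.0, 135.0, 95.0])
```

The real helper additionally packs the selected values into a `Tide.dtype` structured array and constructs a new `Tide(model=..., radians=False)` instance from it.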
diff --git a/tide_predictions/Boundary_Conditions.ipynb b/tide_predictions/Boundary_Conditions.ipynb
deleted file mode 100644
index b34cef6..0000000
--- a/tide_predictions/Boundary_Conditions.ipynb
+++ /dev/null
@@ -1,468 +0,0 @@
-{
- "cells": [
- {
- "cell_type": "markdown",
- "id": "070f5263",
- "metadata": {
- "scrolled": false
- },
- "source": [
- "# Example of Tide Prediction For One Date Instance\n",
- "\n",
- "- In this example, method used to predict tide is adapated from Pytides\n",
- "- This implementation will only work for known NOAA gauge stations\n",
- "- Harmonic Constituents data is scraped from NOAA. "
- ]
- },
- {
- "cell_type": "code",
- "execution_count": null,
- "id": "f1656e3a",
- "metadata": {},
- "outputs": [],
- "source": [
- "import matplotlib.pyplot as plt\n",
- "import datetime\n",
- "import noaa_scraper as noaa "
- ]
- },
- {
- "cell_type": "markdown",
- "id": "1cd0bf32",
- "metadata": {},
- "source": [
- "### **** Station Information ****"
- ]
- },
- {
- "cell_type": "markdown",
- "id": "33307057",
- "metadata": {},
- "source": [
- "Locate NOAA station ID. NOAA gauge stations home: https://tidesandcurrents.noaa.gov/
\n",
- "Fill in station ID and date for tide prediction!"
- ]
- },
- {
- "cell_type": "code",
- "execution_count": null,
- "id": "3c76d8a9",
- "metadata": {},
- "outputs": [],
- "source": [
- "#Station Information\n",
- "stationID = '8518750'\n",
- "\n",
- "#Date of prediction (YEAR, MTH, DAY, HR)\n",
- "prediction_date = datetime.datetime(2017,10,5,0)"
- ]
- },
- {
- "cell_type": "markdown",
- "id": "444bf7f2",
- "metadata": {},
- "source": [
- "### Tide Prediction"
- ]
- },
- {
- "cell_type": "markdown",
- "id": "67132515",
- "metadata": {},
- "source": [
- "Prediction of tide at specified location (station ID) and specified time (GMT) implemented below by calling predict_tide( ) method with the following arguments: stationID, beg_prediction_date, end_prediction_date.
\n",
- "\n",
- "To predict tide at an instant, set beg_prediction_date and end_prediction_date in predict_tide( ) method to the same date!"
- ]
- },
- {
- "cell_type": "code",
- "execution_count": null,
- "id": "fa3eebea",
- "metadata": {},
- "outputs": [],
- "source": [
- "#NOAA Data Scraping Implementation \n",
- "height = noaa.predict_tide(stationID, prediction_date, prediction_date)\n",
- "print(height[0])"
- ]
- },
- {
- "cell_type": "markdown",
- "id": "e775a8ff",
- "metadata": {},
- "source": [
- "*******************************************************************************************************************"
- ]
- },
- {
- "cell_type": "markdown",
- "id": "d4be333b",
- "metadata": {},
- "source": [
- "# Example of Tide Prediction In A Date Interval "
- ]
- },
- {
- "cell_type": "markdown",
- "id": "f64a6167",
- "metadata": {},
- "source": [
- "### Station Information "
- ]
- },
- {
- "cell_type": "markdown",
- "id": "b2b2c5c8",
- "metadata": {},
- "source": [
- "Fill in station ID, a beginning date and an end date for tide prediction below"
- ]
- },
- {
- "cell_type": "code",
- "execution_count": null,
- "id": "6af4f3b1",
- "metadata": {},
- "outputs": [],
- "source": [
- "#Station Information\n",
- "stationID = '8518750'\n",
- "\n",
- "#Beginning and End Dates \n",
- "beg_date = datetime.datetime(2017,10,1,0,0)\n",
- "end_date = datetime.datetime(2017,10,5,0,0)\n",
- "\n",
- "#Predict tide with arguments set as: (stationID, beg_prediction_date, end_prediction_date)\n",
- "predicted_tide = noaa.predict_tide(stationID, beg_date, end_date)"
- ]
- },
- {
- "cell_type": "markdown",
- "id": "5fdc6fa2",
- "metadata": {},
- "source": [
- "### Tide Predictions\n",
- "Plot results in a time series plot"
- ]
- },
- {
- "cell_type": "code",
- "execution_count": null,
- "id": "b07d1fa6",
- "metadata": {},
- "outputs": [],
- "source": [
- "#Method datetimes() makes a range of datetimes given arguments: (beg_prediction_date, end_prediction_date)\n",
- "times = noaa.datetimes(beg_date, end_date)\n",
- "\n",
- "plt.figure(figsize=(20,10))\n",
- "plt.plot(times, predicted_tide, \"-\", label=\"My Prediction\")\n",
- "plt.xlabel('Hours since ' + str(beg_date) + ' (GMT)')\n",
- "plt.ylabel('Metres'), plt.margins(x=0), plt.legend(loc = 'best')\n",
- "plt.title('My Prediction for Station {}'.format(stationID))\n",
- "plt.show()"
- ]
- },
- {
- "cell_type": "markdown",
- "id": "6e68cdb9",
- "metadata": {},
- "source": [
- "*******************************************************************************************************************"
- ]
- },
- {
- "cell_type": "markdown",
- "id": "1d65a5a1",
- "metadata": {},
- "source": [
- "# Example Comparing NOAA vs Our Tide Prediction In A Date Interval "
- ]
- },
- {
- "cell_type": "code",
- "execution_count": null,
- "id": "bb42cecd",
- "metadata": {
- "scrolled": false
- },
- "outputs": [],
- "source": [
- "#Station Information\n",
- "stationID = '8518750'\n",
- "\n",
- "#Beginning and End Dates \n",
- "beg_date = datetime.datetime(2017,10,1,0,0)\n",
- "end_date = datetime.datetime(2017,10,2,0,0)\n",
- "\n",
- "#Predict Tide \n",
- "predicted_tide = noaa.predict_tide(stationID, beg_date, end_date)"
- ]
- },
- {
- "cell_type": "markdown",
- "id": "0bcba7b5",
- "metadata": {},
- "source": [
- "- Calling function retrieve_water_levels( ) with arguments set as: (stationID, beg_prediction_date, end_prediction_date) retrieves and downloads NOAA's datetimes and observed water level data for the specified NOAA station in the date interval provided\n",
- "- The function retrieve_predicted_tide( ) arguments set as: (stationID, beg_prediction_date, end_prediction_date) retrieves and downloads NOAA's predicted tide data for the specified NOAA station. \n",
- "- Data is scraped in Metric units, GMT timezone, MSL datum and 6 min interval. "
- ]
- },
- {
- "cell_type": "code",
- "execution_count": null,
- "id": "558239c0",
- "metadata": {
- "scrolled": false
- },
- "outputs": [],
- "source": [
- "times, NOAA_observed_water_lvl = noaa.retrieve_water_levels(stationID, beg_date, end_date)\n",
- "NOAA_predicted_tide = noaa.retrieve_predicted_tide(stationID, beg_date, end_date)"
- ]
- },
- {
- "cell_type": "code",
- "execution_count": null,
- "id": "c91f7d0d",
- "metadata": {},
- "outputs": [],
- "source": [
- "#Plot Comparisons\n",
- "plt.figure(figsize=(20,10))\n",
- "plt.plot(times, predicted_tide, \"-\", label=\"My Prediction\")\n",
- "plt.plot(times, NOAA_predicted_tide, \"-\", label=\"NOAA Prediction\")\n",
- "plt.plot(times, NOAA_observed_water_lvl, \"-\", label=\"NOAA Observation\")\n",
- "plt.xlabel('Hours since ' + str(beg_date) + ' (GMT)')\n",
- "plt.ylabel('Metres'), plt.margins(x=0), plt.legend(loc = 'best')\n",
- "plt.title('Comparison of Our Prediction vs NOAA prediction for Station {}'.format(stationID))\n",
- "plt.show()"
- ]
- },
- {
- "cell_type": "markdown",
- "id": "edabf72c",
- "metadata": {},
- "source": [
- "*******************************************************************************************************************"
- ]
- },
- {
- "cell_type": "markdown",
- "id": "25f9fca1",
- "metadata": {},
- "source": [
- "# Example Detiding and Capturing A Surge for a Gauge Station "
- ]
- },
- {
- "cell_type": "markdown",
- "id": "ff60fcac",
- "metadata": {},
- "source": [
- "- Calling the predict_tide( ) method with arguments (stationID, beg_prediction_date, end_prediction_date) will output the predicted tide at the specified location and time interval\n",
- "- Calling the retrieve_water_levels( ) method with arguments (stationID, beg_prediction_date, end_prediction_date) will output datetimes and observed water levels at the specified location and time interval\n",
- "- Calling the detide( ) method with arguments (NOAA observed water level, predicted tide) will output the detided water level. "
- ]
- },
- {
- "cell_type": "code",
- "execution_count": null,
- "id": "2e12ce1c",
- "metadata": {},
- "outputs": [],
- "source": [
- "#Station Information\n",
- "stationID = '8518750'\n",
- "\n",
- "#Beginning and End Dates \n",
- "beg_date = datetime.datetime(2017,10,1,0,0)\n",
- "end_date = datetime.datetime(2017,10,5,0,0)\n",
- "\n",
- "predicted_tide = noaa.predict_tide(stationID, beg_date, end_date)\n",
- "times, NOAA_observed_water_lvl = noaa.retrieve_water_levels(stationID, beg_date, end_date)\n",
- "\n",
- "surge = noaa.detide(NOAA_observed_water_lvl, predicted_tide)\n",
- "\n",
- "#Plot Comparisons\n",
- "plt.figure(figsize=(20,10))\n",
- "plt.plot(times, surge, \"-\", label=\"Detided Water Level\")\n",
- "plt.xlabel('Hours since ' + str(beg_date) + ' (GMT)')\n",
- "plt.ylabel('Metres'), plt.margins(x=0), plt.legend(loc = 'best')\n",
- "plt.title('Detided Water Level for Station {}'.format(stationID))\n",
- "plt.show()"
- ]
- },
- {
- "cell_type": "markdown",
- "id": "5361dc29",
- "metadata": {},
- "source": [
- "*******************************************************************************************************************"
- ]
- },
- {
- "cell_type": "markdown",
- "id": "48fd6775",
- "metadata": {},
- "source": [
- "# Example for Clawpack Storm Surge Implementation"
- ]
- },
- {
- "cell_type": "markdown",
- "id": "44b5db31",
- "metadata": {},
- "source": [
- "- Uncomment the line below to use Clawpack's pytides module located under geoclaw\n",
- "- The code below works best if placed in gauge_afteraxes( ) in setplot.py\n",
- "- Calling the surge( ) method with arguments (stationID, beginning_date, end_date, landfall_date) will output the observed surge from NOAA."
- ]
- },
- {
- "cell_type": "code",
- "execution_count": null,
- "id": "15e31fba",
- "metadata": {},
- "outputs": [],
- "source": [
- "#import clawpack.geoclaw.pytides.noaa_scraper as noaa"
- ]
- },
- {
- "cell_type": "code",
- "execution_count": null,
- "id": "1483072e",
- "metadata": {},
- "outputs": [],
- "source": [
- "#Station Information\n",
- "stationID = '8518750'\n",
- "\n",
- "#Beginning and End Dates \n",
- "beg_date = datetime.datetime(2017,10,1,0,0)\n",
- "end_date = datetime.datetime(2017,10,5,0,0)\n",
- "landfall_date = datetime.datetime(2017,10,3,12,0)\n",
- "\n",
- "times, surge = noaa.surge(stationID, beg_date, end_date, landfall_date)\n",
- "plt.plot(times, surge, color=\"b\", label=\"My Prediction\")"
- ]
- },
- {
- "cell_type": "markdown",
- "id": "d364a6c2",
- "metadata": {},
- "source": [
- "*******************************************************************************************************************"
- ]
- },
- {
- "cell_type": "markdown",
- "id": "b5b0d89f",
- "metadata": {},
- "source": [
- "# Example Iterating Through A Library Of Stations And Date Intervals"
- ]
- },
- {
- "cell_type": "code",
- "execution_count": null,
- "id": "b3cdcd65",
- "metadata": {},
- "outputs": [],
- "source": [
- "#import clawpack.geoclaw.pytides.noaa_scraper as noaa\n",
- "from datetime import datetime"
- ]
- },
- {
- "cell_type": "code",
- "execution_count": null,
- "id": "d26bd732",
- "metadata": {
- "scrolled": false
- },
- "outputs": [],
- "source": [
- "\n",
- "station_dict = {'8518750':(datetime(2017,10,1,0), datetime(2017,10,5,0), datetime(2017,10,3,0)),\n",
- " '8540433':(datetime(2017,9,1,0), datetime(2017,9,10,12), datetime(2017,9,5,6)),\n",
- " '8531680':(datetime(2020,10,1,0), datetime(2020,10,10,0), datetime(2020,10,6,0)),\n",
- " '8722956':(datetime(2019,8,1,0), datetime(2019,8,12,0), datetime(2019,8,6,12)),\n",
- " '8658120':(datetime(2018,11,1,0), datetime(2018,11,8,0), datetime(2018,11,3,18)),\n",
- " '8516945':(datetime(2013,1,1,0), datetime(2013,1,6,0), datetime(2013,1,3,12))}\n",
- "\n",
- "for (key, value) in station_dict.items():\n",
- " stationID = key\n",
- " beg_date = value[0]\n",
- " end_date = value[1]\n",
- " landfall_date = value[2]\n",
- " \n",
- " #NOAA Data Scraping Implementation\n",
- " predicted_tide = noaa.predict_tide(stationID, beg_date, end_date) \n",
- " times, NOAA_observed_water_lvl = noaa.retrieve_water_levels(stationID, beg_date, end_date)\n",
- " NOAA_predicted_tide = noaa.retrieve_predicted_tide(stationID, beg_date, end_date)\n",
- " \n",
- " #Detide Water Level\n",
- " surge = noaa.detide(NOAA_observed_water_lvl, predicted_tide)\n",
- " NOAA_surge = noaa.detide(NOAA_observed_water_lvl, NOAA_predicted_tide)\n",
- " \n",
- " #Plot Comparisons\n",
- " plt.figure(figsize=(20,10))\n",
- " plt.plot(times, predicted_tide, \"-\", label=\"My Prediction\")\n",
- " plt.plot(times, NOAA_predicted_tide, \"-\", label=\"NOAA Prediction\")\n",
- " plt.plot(times, NOAA_observed_water_lvl, \"-\", label=\"NOAA Observation\")\n",
- " plt.xlabel('Hours since ' + str(beg_date) + ' (GMT)')\n",
- " plt.ylabel('Metres'), plt.margins(x=0), plt.legend(loc = 'best')\n",
- " plt.title('Comparison of Our Prediction vs NOAA prediction for Station {}'.format(stationID))\n",
- " plt.show()\n",
- " \n",
- " #Detided Water Level Comparison\n",
- " plt.figure(figsize=(20,10))\n",
- " plt.plot(times, surge, \"-\", label=\"Our Detided Prediction\")\n",
- " plt.plot(times, NOAA_surge, \"-\", label=\"NOAA's Detided Prediction\")\n",
- " plt.xlabel('Hours since ' + str(beg_date) + ' (GMT)')\n",
- " plt.ylabel('Metres'), plt.margins(x=0), plt.legend(loc = 'best')\n",
- " plt.title('Detided Water Level Comparison of Our Prediction vs NOAA prediction for Station {}'.format(stationID))\n",
- " plt.show()\n",
- " \n",
- " #Clawpack Implementation\n",
- " times, surge = noaa.surge(stationID, beg_date, end_date, landfall_date)\n",
- " plt.plot(times, surge, color=\"b\", label=\"My Prediction\")\n",
- " \n",
- " "
- ]
- },
- {
- "cell_type": "code",
- "execution_count": null,
- "id": "dd4972d2",
- "metadata": {},
- "outputs": [],
- "source": []
- }
- ],
- "metadata": {
- "kernelspec": {
- "display_name": "Python 3",
- "language": "python",
- "name": "python3"
- },
- "language_info": {
- "codemirror_mode": {
- "name": "ipython",
- "version": 3
- },
- "file_extension": ".py",
- "mimetype": "text/x-python",
- "name": "python",
- "nbconvert_exporter": "python",
- "pygments_lexer": "ipython3",
- "version": "3.9.6"
- }
- },
- "nbformat": 4,
- "nbformat_minor": 5
-}
diff --git a/tide_predictions/README.md b/tide_predictions/README.md
deleted file mode 100644
index 799f3c5..0000000
--- a/tide_predictions/README.md
+++ /dev/null
@@ -1,24 +0,0 @@
-# Tide Prediction
-
-This directory contains code to predict tide for NOAA gauge stations when given a station ID and date(s) of prediction.
-
-The method used to predict tides in this example is adapted from Pytides, a Python package developed by Sam Cox. The method used is that of harmonic constituents, in particular as presented by P. Schureman in Special Publication 98.
-
-Harmonic Constituents data is scraped from NOAA's gauge station harmonic constituents page.
-
-# Requirements:
-
-* NumPy
-* SciPy
-* Matplotlib
-* Pandas
-* Requests
-* lxml
-
-To install dependencies, run `pip install -r requirements.txt`.
-
-# Usage
-Use [Boundary_Conditions.ipynb](Boundary_Conditions.ipynb) to modify the tide prediction examples and develop your own tide predictions. For a Clawpack implementation, place the code in gauge_afteraxes( ) in setplot.py and call the surge( ) method with arguments (stationID, beginning_date, end_date, landfall_date) to obtain NOAA's observed storm surge.
-
-Modify NOAA station ID and date(s) of prediction (UTC) to make your own example.
-
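As a sketch of the usage described above (the station ID and dates are illustrative; the surge( ) call itself needs network access to NOAA, so it is shown commented out):

```python
import datetime

# Illustrative station and storm window (The Battery, NY)
stationID = '8518750'
beg_date = datetime.datetime(2017, 10, 1, 0, 0)
end_date = datetime.datetime(2017, 10, 5, 0, 0)
landfall_date = datetime.datetime(2017, 10, 3, 12, 0)

# Sanity-check the window before requesting data
assert beg_date < landfall_date < end_date
duration_days = (end_date - beg_date) / datetime.timedelta(days=1)
print(duration_days)  # 4.0

# With network access, the observed surge is then obtained with:
# times, surge = noaa.surge(stationID, beg_date, end_date, landfall_date)
```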
-# Acknowledgments
-This code was modified from [Pytides](https://github.com/sam-cox/pytides) to work with Python 3 and to more easily predict NOAA gauge station tides and storm surges. Calling the surge( ) method in setplot.py with a station ID, a beginning date, an end date, and a landfall date compares Clawpack's storm surge output to NOAA's observed storm surge.
diff --git a/tide_predictions/astro.py b/tide_predictions/astro.py
deleted file mode 100644
index a15764d..0000000
--- a/tide_predictions/astro.py
+++ /dev/null
@@ -1,209 +0,0 @@
-from collections import namedtuple
-import numpy as np
-d2r, r2d = np.pi/180.0, 180.0/np.pi
-
-# Most of this is based around Meeus's Astronomical Algorithms, since it
-# presents reasonably good approximations of all the quantities we require in a
-# clear fashion. Reluctant to go all out and use VSOP87 unless it can be shown
-# to make a significant difference to the resulting accuracy of harmonic
-# analysis.
-
-#Convert a sexagesimal angle into decimal degrees
-def s2d(degrees, arcmins = 0, arcsecs = 0, mas = 0, muas = 0):
- return (
- degrees
- + (arcmins / 60.0)
- + (arcsecs / (60.0*60.0))
- + (mas / (60.0*60.0*1e3))
- + (muas / (60.0*60.0*1e6))
- )
-
-#Evaluate a polynomial at argument
-def polynomial(coefficients, argument):
- return sum([c * (argument ** i) for i,c in enumerate(coefficients)])
-
-#Evaluate the first derivative of a polynomial at argument
-def d_polynomial(coefficients, argument):
- return sum([c * i * (argument ** (i-1)) for i,c in enumerate(coefficients)])
-
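A quick numeric check of these helpers (a self-contained sketch mirroring the definitions above):

```python
def s2d(degrees, arcmins=0, arcsecs=0, mas=0, muas=0):
    # Sexagesimal angle -> decimal degrees, as defined above
    return degrees + arcmins/60.0 + arcsecs/3600.0 + mas/3.6e6 + muas/3.6e9

def polynomial(coefficients, argument):
    # Evaluate sum of c_i * x**i
    return sum(c * argument**i for i, c in enumerate(coefficients))

def d_polynomial(coefficients, argument):
    # First derivative: sum of c_i * i * x**(i-1)
    return sum(c * i * argument**(i-1) for i, c in enumerate(coefficients))

# 23 deg 26' 21.448" is the J2000 mean obliquity
assert abs(s2d(23, 26, 21.448) - 23.43929111) < 1e-8
# p(x) = 1 + 2x + 3x^2 at x = 2, and its derivative 2 + 6x
assert polynomial((1, 2, 3), 2) == 17
assert d_polynomial((1, 2, 3), 2) == 14
```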
-#Meeus formula 11.1
-def T(t):
- return (JD(t) - 2451545.0)/36525
-
-#Meeus formula 7.1
-def JD(t):
- Y, M = t.year, t.month
- D = (
- t.day
- + t.hour / (24.0)
- + t.minute / (24.0*60.0)
- + t.second / (24.0*60.0*60.0)
- + t.microsecond / (24.0 * 60.0 * 60.0 * 1e6)
- )
- if M <= 2:
- Y = Y - 1
- M = M + 12
- A = np.floor(Y / 100.0)
- B = 2 - A + np.floor(A / 4.0)
- return np.floor(365.25*(Y+4716)) + np.floor(30.6001*(M+1)) + D + B - 1524.5
-
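A sanity check on the Julian Date formula above: the J2000.0 epoch (2000 January 1, 12:00) should map to exactly 2451545.0. A standalone sketch mirroring Meeus formula 7.1:

```python
import datetime
import math

def julian_date(t):
    # Meeus formula 7.1, mirroring JD() above
    Y, M = t.year, t.month
    D = t.day + t.hour/24.0 + t.minute/1440.0 + t.second/86400.0
    if M <= 2:
        Y, M = Y - 1, M + 12
    A = math.floor(Y / 100.0)
    B = 2 - A + math.floor(A / 4.0)
    return math.floor(365.25*(Y + 4716)) + math.floor(30.6001*(M + 1)) + D + B - 1524.5

# J2000.0 epoch
assert julian_date(datetime.datetime(2000, 1, 1, 12, 0)) == 2451545.0
# T (Meeus 11.1) is then Julian centuries since J2000
assert (julian_date(datetime.datetime(2000, 1, 1, 12, 0)) - 2451545.0) / 36525 == 0.0
```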
-#Meeus formula 21.3
-terrestrial_obliquity_coefficients = (
- s2d(23,26,21.448),
- -s2d(0,0,4680.93),
- -s2d(0,0,1.55),
- s2d(0,0,1999.25),
- -s2d(0,0,51.38),
- -s2d(0,0,249.67),
- -s2d(0,0,39.05),
- s2d(0,0,7.12),
- s2d(0,0,27.87),
- s2d(0,0,5.79),
- s2d(0,0,2.45)
-)
-
-#Adjust these coefficients for parameter T rather than U
-terrestrial_obliquity_coefficients = [
- c * (1e-2) ** i for i,c in enumerate(terrestrial_obliquity_coefficients)
-]
-
-#Not entirely sure about this interpretation, but this is the difference
-#between Meeus formulae 24.2 and 24.3 and seems to work
-solar_perigee_coefficients = (
- 280.46645 - 357.52910,
- 36000.76932 - 35999.05030,
- 0.0003032 + 0.0001559,
- 0.00000048
-)
-
-#Meeus formula 24.2
-solar_longitude_coefficients = (
- 280.46645,
- 36000.76983,
- 0.0003032
-)
-
-#This value is taken from JPL Horizon and is essentially constant
-lunar_inclination_coefficients = (
- 5.145,
-)
-
-#Meeus formula 45.1
-lunar_longitude_coefficients = (
- 218.3164591,
- 481267.88134236,
- -0.0013268,
- 1/538841.0,
- -1/65194000.0
-)
-
-#Meeus formula 45.7
-lunar_node_coefficients = (
- 125.0445550,
- -1934.1361849,
- 0.0020762,
- 1/467410.0,
- -1/60616000.0
-)
-
-#Meeus, unnumbered formula directly preceded by 45.7
-lunar_perigee_coefficients = (
- 83.3532430,
- 4069.0137111,
- -0.0103238,
- -1/80053.0,
- 1/18999000.0
-)
-
-#Now follow some useful auxiliary values, we won't need their speed.
-#See notes on Table 6 in Schureman for I, nu, xi, nu', 2nu''
-def _I(N, i, omega):
- N, i, omega = d2r * N, d2r*i, d2r*omega
- cosI = np.cos(i)*np.cos(omega)-np.sin(i)*np.sin(omega)*np.cos(N)
- return r2d*np.arccos(cosI)
-
-def _xi(N, i, omega):
- N, i, omega = d2r * N, d2r*i, d2r*omega
- e1 = np.cos(0.5*(omega-i))/np.cos(0.5*(omega+i)) * np.tan(0.5*N)
- e2 = np.sin(0.5*(omega-i))/np.sin(0.5*(omega+i)) * np.tan(0.5*N)
- e1, e2 = np.arctan(e1), np.arctan(e2)
- e1, e2 = e1 - 0.5*N, e2 - 0.5*N
- return -(e1 + e2)*r2d
-
-def _nu(N, i, omega):
- N, i, omega = d2r * N, d2r*i, d2r*omega
- e1 = np.cos(0.5*(omega-i))/np.cos(0.5*(omega+i)) * np.tan(0.5*N)
- e2 = np.sin(0.5*(omega-i))/np.sin(0.5*(omega+i)) * np.tan(0.5*N)
- e1, e2 = np.arctan(e1), np.arctan(e2)
- e1, e2 = e1 - 0.5*N, e2 - 0.5*N
- return (e1 - e2)*r2d
-
-#Schureman equation 224
-#Can we be more precise than B "the solar coefficient" = 0.1681?
-def _nup(N, i, omega):
- I = d2r * _I(N, i, omega)
- nu = d2r * _nu(N, i, omega)
- return r2d * np.arctan(np.sin(2*I)*np.sin(nu)/(np.sin(2*I)*np.cos(nu)+0.3347))
-
-#Schureman equation 232
-def _nupp(N, i, omega):
- I = d2r * _I(N, i, omega)
- nu = d2r * _nu(N, i, omega)
- tan2nupp = (np.sin(I)**2*np.sin(2*nu))/(np.sin(I)**2*np.cos(2*nu)+0.0727)
- return r2d * 0.5 * np.arctan(tan2nupp)
-
-AstronomicalParameter = namedtuple('AstronomicalParameter', ['value', 'speed'])
-
-def astro(t):
- a = {}
- #We can use polynomial fits from Meeus to obtain good approximations to
- #some astronomical values (and therefore speeds).
- polynomials = {
- 's': lunar_longitude_coefficients,
- 'h': solar_longitude_coefficients,
- 'p': lunar_perigee_coefficients,
- 'N': lunar_node_coefficients,
- 'pp': solar_perigee_coefficients,
- '90': (90.0,),
- 'omega': terrestrial_obliquity_coefficients,
- 'i': lunar_inclination_coefficients
- }
- #Polynomials are in T, that is Julian Centuries; we want our speeds to be
- #in the more convenient unit of degrees per hour.
- dT_dHour = 1 / (24 * 365.25 * 100)
- for name, coefficients in polynomials.items():
- a[name] = AstronomicalParameter(
- np.mod(polynomial(coefficients, T(t)), 360.0),
- d_polynomial(coefficients, T(t)) * dT_dHour
- )
-
- #Some other parameters defined by Schureman which are dependent on the
- #parameters N, i, omega for use in node factor calculations. We don't need
- #their speeds.
- args = list(each.value for each in [a['N'], a['i'], a['omega']])
- for name, function in {
- 'I': _I,
- 'xi': _xi,
- 'nu': _nu,
- 'nup': _nup,
- 'nupp': _nupp
- }.items():
- a[name] = AstronomicalParameter(np.mod(function(*args), 360.0), None)
-
- #We don't work directly with the T (hours) parameter, instead our spanning
- #set for equilibrium arguments #is given by T+h-s, s, h, p, N, pp, 90.
- #This is in line with convention.
- hour = AstronomicalParameter((JD(t) - np.floor(JD(t))) * 360.0, 15.0)
- a['T+h-s'] = AstronomicalParameter(
- hour.value + a['h'].value - a['s'].value,
- hour.speed + a['h'].speed - a['s'].speed
- )
- #It is convenient to calculate Schureman's P here since several node
- #factors need it, although it could be argued that these
- #(along with I, xi, nu etc) belong somewhere else.
- a['P'] = AstronomicalParameter(
- np.mod(a['p'].value -a['xi'].value,360.0),
- None
- )
- return a
-
diff --git a/tide_predictions/constituent.py b/tide_predictions/constituent.py
deleted file mode 100644
index 9fe1fa2..0000000
--- a/tide_predictions/constituent.py
+++ /dev/null
@@ -1,160 +0,0 @@
-import string
-import operator as op
-from functools import reduce
-import numpy as np
-import nodal_corrections as nc
-
-class BaseConstituent(object):
- xdo_int = {
- 'A': 1, 'B': 2, 'C': 3, 'D': 4, 'E': 5, 'F': 6, 'G': 7, 'H': 8, 'I': 9,
- 'J': 10, 'K': 11, 'L': 12, 'M': 13, 'N': 14, 'O': 15, 'P': 16, 'Q': 17,
- 'R': -8, 'S': -7, 'T': -6, 'U': -5, 'V': -4, 'W': -3, 'X': -2, 'Y': -1,
- 'Z': 0
- }
-
- int_xdo = {v:k for k, v in xdo_int.items()}
-
- def __init__(self, name, xdo='', coefficients=[], u=nc.u_zero, f=nc.f_unity):
- if xdo == '':
- self.coefficients = np.array(coefficients)
- else:
- self.coefficients = np.array(self.xdo_to_coefficients(xdo))
- self.name = name
- self.u = u
- self.f = f
-
- def xdo_to_coefficients(self, xdo):
- return [self.xdo_int[l.upper()] for l in xdo if l in string.ascii_letters]
-
- def coefficients_to_xdo(self, coefficients):
- return ''.join([self.int_xdo[c] for c in coefficients])
-
- def V(self, astro):
- return np.dot(self.coefficients, self.astro_values(astro))
-
- def xdo(self):
- return self.coefficients_to_xdo(self.coefficients)
-
- def speed(self, a):
- return np.dot(self.coefficients, self.astro_speeds(a))
-
- def astro_xdo(self, a):
- return [a['T+h-s'], a['s'], a['h'], a['p'], a['N'], a['pp'], a['90']]
-
- def astro_speeds(self, a):
- return np.array([each.speed for each in self.astro_xdo(a)])
-
- def astro_values(self, a):
- return np.array([each.value for each in self.astro_xdo(a)])
-
- #Consider two out of phase constituents which travel at the same speed to
- #be identical
- def __eq__(self, c):
- return np.all(self.coefficients[:-1] == c.coefficients[:-1])
-
- def __hash__(self):
- return hash(tuple(self.coefficients[:-1]))
-
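The XDO letter code above packs the seven coefficients (for T+h-s, s, h, p, N, pp, 90) into letters centred on 'Z' = 0. A minimal standalone sketch of the decoding, checked against the M2 and O1 codes used below:

```python
import string

# Letter -> signed integer coefficient, as in BaseConstituent.xdo_int
xdo_int = {
    'A': 1, 'B': 2, 'C': 3, 'D': 4, 'E': 5, 'F': 6, 'G': 7, 'H': 8, 'I': 9,
    'J': 10, 'K': 11, 'L': 12, 'M': 13, 'N': 14, 'O': 15, 'P': 16, 'Q': 17,
    'R': -8, 'S': -7, 'T': -6, 'U': -5, 'V': -4, 'W': -3, 'X': -2, 'Y': -1,
    'Z': 0,
}

def xdo_to_coefficients(xdo):
    # Spaces are skipped; each letter is one coefficient
    return [xdo_int[l.upper()] for l in xdo if l in string.ascii_letters]

# M2: twice the (T+h-s) rate, all other arguments zero
assert xdo_to_coefficients('B ZZZ ZZZ') == [2, 0, 0, 0, 0, 0, 0]
# O1 as defined below
assert xdo_to_coefficients('A YZZ ZZA') == [1, -1, 0, 0, 0, 0, 1]
```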
-class CompoundConstituent(BaseConstituent):
-
- def __init__(self, members = [], **kwargs):
- self.members = members
-
- if 'u' not in kwargs:
- kwargs['u'] = self.u
- if 'f' not in kwargs:
- kwargs['f'] = self.f
-
- super(CompoundConstituent,self).__init__(**kwargs)
-
- self.coefficients = reduce(op.add,[c.coefficients * n for (c,n) in members])
-
- def speed(self, a):
- return reduce(op.add, [n * c.speed(a) for (c,n) in self.members])
-
- def V(self, a):
- return reduce(op.add, [n * c.V(a) for (c,n) in self.members])
-
- def u(self, a):
- return reduce(op.add, [n * c.u(a) for (c,n) in self.members])
-
- def f(self, a):
- return reduce(op.mul, [c.f(a) ** abs(n) for (c,n) in self.members])
-
-###### Base Constituents
-#Long Term
-_Z0 = BaseConstituent(name = 'Z0', xdo = 'Z ZZZ ZZZ', u = nc.u_zero, f = nc.f_unity)
-_Sa = BaseConstituent(name = 'Sa', xdo = 'Z ZAZ ZZZ', u = nc.u_zero, f = nc.f_unity)
-_Ssa = BaseConstituent(name = 'Ssa', xdo = 'Z ZBZ ZZZ', u = nc.u_zero, f = nc.f_unity)
-_Mm = BaseConstituent(name = 'Mm', xdo = 'Z AZY ZZZ', u = nc.u_zero, f = nc.f_Mm)
-_Mf = BaseConstituent(name = 'Mf', xdo = 'Z BZZ ZZZ', u = nc.u_Mf, f = nc.f_Mf)
-
-#Diurnals
-_Q1 = BaseConstituent(name = 'Q1', xdo = 'A XZA ZZA', u = nc.u_O1, f = nc.f_O1)
-_O1 = BaseConstituent(name = 'O1', xdo = 'A YZZ ZZA', u = nc.u_O1, f = nc.f_O1)
-_K1 = BaseConstituent(name = 'K1', xdo = 'A AZZ ZZY', u = nc.u_K1, f = nc.f_K1)
-_J1 = BaseConstituent(name = 'J1', xdo = 'A BZY ZZY', u = nc.u_J1, f = nc.f_J1)
-
-#M1 is a tricky business for reasons of convention, rather than theory. The
-#reasons for this are best summarised by Schureman paragraphs 126, 127 and in
-#the comments found in congen_input.txt of xtides, so I won't go over all this
-#again here.
-
-_M1 = BaseConstituent(name = 'M1', xdo = 'A ZZZ ZZA', u = nc.u_M1, f = nc.f_M1)
-_P1 = BaseConstituent(name = 'P1', xdo = 'A AXZ ZZA', u = nc.u_zero, f = nc.f_unity)
-_S1 = BaseConstituent(name = 'S1', xdo = 'A AYZ ZZZ', u = nc.u_zero, f = nc.f_unity)
-_OO1 = BaseConstituent(name = 'OO1', xdo = 'A CZZ ZZY', u = nc.u_OO1, f = nc.f_OO1)
-
-#Semi-Diurnals
-_2N2 = BaseConstituent(name = '2N2', xdo = 'B XZB ZZZ', u = nc.u_M2, f = nc.f_M2)
-_N2 = BaseConstituent(name = 'N2', xdo = 'B YZA ZZZ', u = nc.u_M2, f = nc.f_M2)
-_nu2 = BaseConstituent(name = 'nu2', xdo = 'B YBY ZZZ', u = nc.u_M2, f = nc.f_M2)
-_M2 = BaseConstituent(name = 'M2', xdo = 'B ZZZ ZZZ', u = nc.u_M2, f = nc.f_M2)
-_lambda2 = BaseConstituent(name = 'lambda2', xdo = 'B AXA ZZB', u = nc.u_M2, f = nc.f_M2)
-_L2 = BaseConstituent(name = 'L2', xdo = 'B AZY ZZB', u = nc.u_L2, f = nc.f_L2)
-_T2 = BaseConstituent(name = 'T2', xdo = 'B BWZ ZAZ', u = nc.u_zero, f = nc.f_unity)
-_S2 = BaseConstituent(name = 'S2', xdo = 'B BXZ ZZZ', u = nc.u_zero, f = nc.f_unity)
-_R2 = BaseConstituent(name = 'R2', xdo = 'B BYZ ZYB', u = nc.u_zero, f = nc.f_unity)
-_K2 = BaseConstituent(name = 'K2', xdo = 'B BZZ ZZZ', u = nc.u_K2, f = nc.f_K2)
-
-#Third-Diurnals
-_M3 = BaseConstituent(name = 'M3', xdo = 'C ZZZ ZZZ', u = lambda a: nc.u_Modd(a,3), f = lambda a: nc.f_Modd(a,3))
-
-###### Compound Constituents
-#Long Term
-_MSF = CompoundConstituent(name = 'MSF', members = [(_S2, 1), (_M2, -1)])
-
-#Diurnal
-_2Q1 = CompoundConstituent(name = '2Q1', members = [(_N2, 1), (_J1, -1)])
-_rho1 = CompoundConstituent(name = 'rho1', members = [(_nu2, 1), (_K1, -1)])
-
-#Semi-Diurnal
-
-_mu2 = CompoundConstituent(name = 'mu2', members = [(_M2, 2), (_S2, -1)]) #2MS2
-_2SM2 = CompoundConstituent(name = '2SM2', members = [(_S2, 2), (_M2, -1)])
-
-#Third-Diurnal
-_2MK3 = CompoundConstituent(name = '2MK3', members = [(_M2, 1), (_O1, 1)])
-_MK3 = CompoundConstituent(name = 'MK3', members = [(_M2, 1), (_K1, 1)])
-
-#Quarter-Diurnal
-_MN4 = CompoundConstituent(name = 'MN4', members = [(_M2, 1), (_N2, 1)])
-_M4 = CompoundConstituent(name = 'M4', members = [(_M2, 2)])
-_MS4 = CompoundConstituent(name = 'MS4', members = [(_M2, 1), (_S2, 1)])
-_S4 = CompoundConstituent(name = 'S4', members = [(_S2, 2)])
-
-#Sixth-Diurnal
-_M6 = CompoundConstituent(name = 'M6', members = [(_M2, 3)])
-_S6 = CompoundConstituent(name = 'S6', members = [(_S2, 3)])
-
-#Eighth-Diurnals
-_M8 = CompoundConstituent(name = 'M8', members = [(_M2, 4)])
-
-
-noaa = [
- _M2, _S2, _N2, _K1, _M4, _O1, _M6, _MK3, _S4, _MN4, _nu2, _S6, _mu2,
- _2N2, _OO1, _lambda2, _S1, _M1, _J1, _Mm, _Ssa, _Sa, _MSF, _Mf,
- _rho1, _Q1, _T2, _R2, _2Q1, _P1, _2SM2, _M3, _L2, _2MK3, _K2,
- _M8, _MS4
-]
-
diff --git a/tide_predictions/noaa_scraper.py b/tide_predictions/noaa_scraper.py
deleted file mode 100644
index 8f5fea0..0000000
--- a/tide_predictions/noaa_scraper.py
+++ /dev/null
@@ -1,138 +0,0 @@
-import datetime
-import requests
-import lxml.html as lh
-import pandas as pd
-import json
-
-from tide import Tide
-import constituent as cons
-import numpy as np
-
-def retrieve_constituents(stationID):
- #Forms URL
- url = 'https://tidesandcurrents.noaa.gov/harcon.html?unit=0&timezone=0&id={}'.format(stationID)
-
- #Requests URL
- page = requests.get(url)
- doc = lh.fromstring(page.content)
- tr_elements = doc.xpath('//tr')
- col = [((t.text_content(),[])) for t in tr_elements[0]]
- for j in range(1, len(tr_elements)):
- T, i = tr_elements[j], 0
- for t in T.iterchildren():
- col[i][1].append(t.text_content())
- i+=1
-
- #Appends data to csv file
- component_dict = {title:column for (title,column) in col}
- component_array = pd.DataFrame(component_dict)
- component_array.to_csv('{}_constituents.csv'.format(stationID), index=False)
-
- return component_dict
-
-
-def retrieve_water_levels(*args):
- #NOAA api
- api = 'https://api.tidesandcurrents.noaa.gov/api/prod/datagetter?begin_date={}'\
- '&end_date={}&'.format(args[1].strftime("%Y%m%d %H:%M"), args[2].strftime("%Y%m%d %H:%M"))
-
- #NOAA observed data
- obs_url = 'station={}&product=water_level&datum=MTL&units=metric&time_zone=gmt'\
- '&application=ports_screen&format=json'.format(args[0])
-
- obs_data_page = requests.get(api + obs_url)
- obs_data = obs_data_page.json()['data']
- obs_heights = [float(d['v']) for d in obs_data]
-
- NOAA_times = [datetime.datetime.strptime(d['t'], '%Y-%m-%d %H:%M') for d in obs_data]
-
- component_dict = {'Datetimes': NOAA_times, 'Observed Heights': obs_heights}
- component_array = pd.DataFrame(component_dict)
- component_array.to_csv('{}_NOAA_water_levels.csv'.format(args[0]), index=False)
-
- return NOAA_times, obs_heights
-
-
-def retrieve_predicted_tide(*args):
- #NOAA api
- api = 'https://api.tidesandcurrents.noaa.gov/api/prod/datagetter?begin_date={}'\
- '&end_date={}&'.format(args[1].strftime("%Y%m%d %H:%M"), args[2].strftime("%Y%m%d %H:%M"))
-
- #NOAA predicted data
- pred_url = 'station={}&product=predictions&datum=MTL&units=metric&time_zone=gmt'\
- '&application=ports_screen&format=json'.format(args[0])
-
- pred_data_page = requests.get(api + pred_url)
- pred_data = pred_data_page.json()['predictions']
- pred_heights = [float(d['v']) for d in pred_data]
-
- return pred_heights
-
-
-def datum_value(stationID, datum):
- #Scrapes MTL/MSL Datum Value
- datum_url = 'https://api.tidesandcurrents.noaa.gov/api/prod/datagetter?&station={}&product=datums'\
- '&units=metric&time_zone=gmt&application=ports_screen&format=json'.format(stationID)
- page_data = requests.get(datum_url)
- data = page_data.json()['datums']
- datum_value = [d['v'] for d in data if d['n'] == datum]
-
- return float(datum_value[0])
-
-
-def predict_tide(*args):
- #These are the NOAA constituents, in the order presented on their website.
- constituents = [c for c in cons.noaa if c != cons._Z0]
- noaa_values = retrieve_constituents(args[0])
- noaa_amplitudes = [float(amplitude) for amplitude in noaa_values['Amplitude']]
- noaa_phases = [float(phases) for phases in noaa_values['Phase']]
-
- #We can add a constant offset to shift the datum (here the MSL - MTL difference):
- MTL = datum_value(args[0], 'MTL')
- MSL = datum_value(args[0], 'MSL')
- offset = MSL - MTL
- constituents.append(cons._Z0)
- noaa_phases.append(0)
- noaa_amplitudes.append(offset)
-
- #Build the model
- assert(len(constituents) == len(noaa_phases) == len(noaa_amplitudes))
- model = np.zeros(len(constituents), dtype = Tide.dtype)
- model['constituent'] = constituents
- model['amplitude'] = noaa_amplitudes
- model['phase'] = noaa_phases
- tide = Tide(model = model, radians = False)
-
- #Time Calculations
- delta = (args[2]-args[1])/datetime.timedelta(hours=1) + .1
- times = Tide._times(args[1], np.arange(0, delta, .1))
-
- #Height Calculations
- heights_arrays = [tide.at([times[i]]) for i in range(len(times))]
- pytide_heights = [val for sublist in heights_arrays for val in sublist]
-
- return pytide_heights
-
-
-def datetimes(*args):
- #Time Calculations
- delta = (args[1]-args[0])/datetime.timedelta(hours=1) + .1
- times = Tide._times(args[0], np.arange(0, delta, .1))
-
- return times
-
-
-def detide(*args):
- # NOAA observed water level - predicted tide
- return [(args[0][i] - args[1][i]) for i in range(len(args[0]))]
-
-
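detide( ) above is elementwise subtraction of predicted tide from observed water level; a numpy version of the same operation can be sketched (and validated) as:

```python
import numpy as np

def detide(observed, predicted):
    # Surge = observed water level minus predicted tide, elementwise
    return (np.asarray(observed) - np.asarray(predicted)).tolist()

observed  = [2.10, 2.35, 2.05]   # illustrative water levels (m)
predicted = [1.90, 2.00, 2.10]   # illustrative tide prediction (m)
surge = detide(observed, predicted)
assert np.allclose(surge, [0.20, 0.35, -0.05])
```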
-def surge(*args):
- #Surge Implementation
- predicted_tide = predict_tide(args[0], args[1], args[2])
- NOAA_times, NOAA_observed_water_level = retrieve_water_levels(args[0], args[1], args[2])
- surge = detide(NOAA_observed_water_level, predicted_tide)
- times = [(time-args[3])/datetime.timedelta(days=1) for time in NOAA_times]
-
- return times, surge
-
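The last step of surge( ) converts timestamps to days relative to landfall, which is the x-axis Clawpack gauge plots use. A standalone sketch of that conversion:

```python
import datetime

landfall = datetime.datetime(2017, 10, 3, 12, 0)
NOAA_times = [
    datetime.datetime(2017, 10, 3, 0, 0),
    datetime.datetime(2017, 10, 3, 12, 0),
    datetime.datetime(2017, 10, 4, 0, 0),
]

# Days relative to landfall: negative before, zero at, positive after
times = [(t - landfall) / datetime.timedelta(days=1) for t in NOAA_times]
assert times == [-0.5, 0.0, 0.5]
```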
diff --git a/tide_predictions/nodal_corrections.py b/tide_predictions/nodal_corrections.py
deleted file mode 100644
index 170f0d6..0000000
--- a/tide_predictions/nodal_corrections.py
+++ /dev/null
@@ -1,141 +0,0 @@
-
-import numpy as np
-d2r, r2d = np.pi/180.0, 180.0/np.pi
-
-#The following functions take a dictionary of astronomical values (in degrees)
-#and return dimensionless scale factors for constituent amplitudes.
-
-def f_unity(a):
- return 1.0
-
-#Schureman equations 73, 65
-def f_Mm(a):
- omega = d2r*a['omega'].value
- i = d2r*a['i'].value
- I = d2r*a['I'].value
- mean = (2/3.0 - np.sin(omega)**2)*(1 - 3/2.0 * np.sin(i)**2)
- return (2/3.0 - np.sin(I)**2) / mean
-
-#Schureman equations 74, 66
-def f_Mf(a):
- omega = d2r*a['omega'].value
- i = d2r*a['i'].value
- I = d2r*a['I'].value
- mean = np.sin(omega)**2 * np.cos(0.5*i)**4
- return np.sin(I)**2 / mean
-
-#Schureman equations 75, 67
-def f_O1(a):
- omega = d2r*a['omega'].value
- i = d2r*a['i'].value
- I = d2r*a['I'].value
- mean = np.sin(omega) * np.cos(0.5*omega)**2 * np.cos(0.5*i)**4
- return (np.sin(I) * np.cos(0.5*I)**2) / mean
-
-#Schureman equations 76, 68
-def f_J1(a):
- omega = d2r*a['omega'].value
- i = d2r*a['i'].value
- I = d2r*a['I'].value
- mean = np.sin(2*omega) * (1-3/2.0 * np.sin(i)**2)
- return np.sin(2*I) / mean
-
-#Schureman equations 77, 69
-def f_OO1(a):
- omega = d2r*a['omega'].value
- i = d2r*a['i'].value
- I = d2r*a['I'].value
- mean = np.sin(omega) * np.sin(0.5*omega)**2 * np.cos(0.5*i)**4
- return np.sin(I) * np.sin(0.5*I)**2 / mean
-
-#Schureman equations 78, 70
-def f_M2(a):
- omega = d2r*a['omega'].value
- i = d2r*a['i'].value
- I = d2r*a['I'].value
- mean = np.cos(0.5*omega)**4 * np.cos(0.5*i)**4
- return np.cos(0.5*I)**4 / mean
-
-#Schureman equations 227, 226, 68
-#Should probably eventually include the derivations of the magic numbers (0.5023 etc).
-def f_K1(a):
- omega = d2r*a['omega'].value
- i = d2r*a['i'].value
- I = d2r*a['I'].value
- nu = d2r*a['nu'].value
- sin2Icosnu_mean = np.sin(2*omega) * (1-3/2.0 * np.sin(i)**2)
- mean = 0.5023*sin2Icosnu_mean + 0.1681
- return (0.2523*np.sin(2*I)**2 + 0.1689*np.sin(2*I)*np.cos(nu)+0.0283)**(0.5) / mean
-
-#Schureman equations 215, 213, 204
-#It can be (and has been) confirmed that the exponent for R_a reads 1/2 via Schureman Table 7
-def f_L2(a):
- P = d2r*a['P'].value
- I = d2r*a['I'].value
- R_a_inv = (1 - 12*np.tan(0.5*I)**2 * np.cos(2*P)+36*np.tan(0.5*I)**4)**(0.5)
- return f_M2(a) * R_a_inv
-
-#Schureman equations 235, 234, 71
-#Again, magic numbers
-def f_K2(a):
- omega = d2r*a['omega'].value
- i = d2r*a['i'].value
- I = d2r*a['I'].value
- nu = d2r*a['nu'].value
- sinsqIcos2nu_mean = np.sin(omega)**2 * (1-3/2.0 * np.sin(i)**2)
- mean = 0.5023*sinsqIcos2nu_mean + 0.0365
- return (0.2533*np.sin(I)**4 + 0.0367*np.sin(I)**2 *np.cos(2*nu)+0.0013)**(0.5) / mean
-
-#Schureman equations 206, 207, 195
-def f_M1(a):
- P = d2r*a['P'].value
- I = d2r*a['I'].value
- Q_a_inv = (0.25 + 1.5*np.cos(I)*np.cos(2*P)*np.cos(0.5*I)**(-0.5) + 2.25*np.cos(I)**2 * np.cos(0.5*I)**(-4))**(0.5)
- return f_O1(a) * Q_a_inv
-
-#See e.g. Schureman equation 149
-def f_Modd(a, n):
- return f_M2(a) ** (n / 2.0)
-
-#Node factors u, see Table 2 of Schureman.
-
-def u_zero(a):
- return 0.0
-
-def u_Mf(a):
- return -2.0 * a['xi'].value
-
-def u_O1(a):
- return 2.0 * a['xi'].value - a['nu'].value
-
-def u_J1(a):
- return -a['nu'].value
-
-def u_OO1(a):
- return -2.0 * a['xi'].value - a['nu'].value
-
-def u_M2(a):
- return 2.0 * a['xi'].value - 2.0 * a['nu'].value
-
-def u_K1(a):
- return -a['nup'].value
-
-#Schureman 214
-def u_L2(a):
- I = d2r*a['I'].value
- P = d2r*a['P'].value
- R = r2d*np.arctan(np.sin(2*P)/(1/6.0 * np.tan(0.5*I) **(-2) -np.cos(2*P)))
- return 2.0 * a['xi'].value - 2.0 * a['nu'].value - R
-
-def u_K2(a):
- return -2.0 * a['nupp'].value
-
-#Schureman 202
-def u_M1(a):
- I = d2r*a['I'].value
- P = d2r*a['P'].value
- Q = r2d*np.arctan((5*np.cos(I)-1)/(7*np.cos(I)+1)*np.tan(P))
- return a['xi'].value - a['nu'].value + Q
-
-def u_Modd(a, n):
- return n/2.0 * u_M2(a)
diff --git a/tide_predictions/requirements.txt b/tide_predictions/requirements.txt
deleted file mode 100644
index e5e854b..0000000
--- a/tide_predictions/requirements.txt
+++ /dev/null
@@ -1,7 +0,0 @@
-###### Requirements ######
-numpy
-scipy
-matplotlib
-pandas
-requests
-lxml
\ No newline at end of file
diff --git a/tide_predictions/tide.py b/tide_predictions/tide.py
deleted file mode 100644
index 7e4e110..0000000
--- a/tide_predictions/tide.py
+++ /dev/null
@@ -1,406 +0,0 @@
-from collections.abc import Iterable
-from collections import OrderedDict
-from itertools import takewhile, count
-try:
- from itertools import izip, ifilter
-except ImportError: #Python3
- izip = zip
- ifilter = filter
-from datetime import datetime, timedelta
-import numpy as np
-from scipy.optimize import leastsq, fsolve
-from astro import astro
-import constituent
-
-d2r, r2d = np.pi/180.0, 180.0/np.pi
-
-class Tide(object):
- dtype = np.dtype([
- ('constituent', object),
- ('amplitude', float),
- ('phase', float)])
-
- def __init__(
- self,
- constituents = None,
- amplitudes = None,
- phases = None,
- model = None,
- radians = False
- ):
- """
- Initialise a tidal model. Provide constituents, amplitudes and phases OR a model.
- Arguments:
- constituents -- list of constituents used in the model.
- amplitudes -- list of amplitudes corresponding to constituents
- phases -- list of phases corresponding to constituents
- model -- an ndarray of type Tide.dtype representing the constituents, amplitudes and phases.
- radians -- boolean representing whether phases are in radians (default False)
- """
- if None not in [constituents, amplitudes, phases]:
- if len(constituents) == len(amplitudes) == len(phases):
- model = np.zeros(len(phases), dtype=Tide.dtype)
- model['constituent'] = np.array(constituents)
- model['amplitude'] = np.array(amplitudes)
- model['phase'] = np.array(phases)
- else:
- raise ValueError("Constituents, amplitudes and phases should all be arrays of equal length.")
- elif model is not None:
- if not model.dtype == Tide.dtype:
- raise ValueError("Model must be a numpy array with dtype == Tide.dtype")
- else:
- raise ValueError("Must be initialised with constituents, amplitudes and phases; or a model.")
- if radians:
- model['phase'] = r2d*model['phase']
- self.model = model[:]
- self.normalize()
-
- def prepare(self, *args, **kwargs):
- return Tide._prepare(self.model['constituent'], *args, **kwargs)
-
- @staticmethod
- def _prepare(constituents, t0, t = None, radians = True):
- """
- Return constituent speed and equilibrium argument at a given time, and constituent node factors at given times.
- Arguments:
- constituents -- list of constituents to prepare
- t0 -- time at which to evaluate speed and equilibrium argument for each constituent
- t -- list of times at which to evaluate node factors for each constituent (default: t0)
- radians -- whether to return the angular arguments in radians or degrees (default: True)
- """
- #The equilibrium argument is constant and taken at the beginning of the
- #time series (t0). The speed of the equilibrium argument changes very
- #slowly, so again we take it to be constant over any length of data. The
- #node factors change more rapidly.
- if isinstance(t0, Iterable):
- t0 = t0[0]
- if t is None:
- t = [t0]
- if not isinstance(t, Iterable):
- t = [t]
- a0 = astro(t0)
- a = [astro(t_i) for t_i in t]
-
- #For convenience give u, V0 (but not speed!) in [0, 360)
- V0 = np.array([c.V(a0) for c in constituents])[:, np.newaxis]
- speed = np.array([c.speed(a0) for c in constituents])[:, np.newaxis]
- u = [np.mod(np.array([c.u(a_i) for c in constituents])[:, np.newaxis], 360.0)
- for a_i in a]
- f = [np.mod(np.array([c.f(a_i) for c in constituents])[:, np.newaxis], 360.0)
- for a_i in a]
-
- if radians:
- speed = d2r*speed
- V0 = d2r*V0
- u = [d2r*each for each in u]
- return speed, u, f, V0
-
- def at(self, t):
- """
- Return the modelled tidal height at given times.
- Arguments:
- t -- array of times at which to evaluate the tidal height
- """
- t0 = t[0]
- hours = self._hours(t0, t)
- partition = 240.0
- t = self._partition(hours, partition)
- times = self._times(t0, [(i + 0.5)*partition for i in range(len(t))])
- speed, u, f, V0 = self.prepare(t0, times, radians = True)
- H = self.model['amplitude'][:, np.newaxis]
- p = d2r*self.model['phase'][:, np.newaxis]
-
- return np.concatenate([
- Tide._tidal_series(t_i, H, p, speed, u_i, f_i, V0)
- for t_i, u_i, f_i in izip(t, u, f)
- ])
-
- def highs(self, *args):
- """
- Generator yielding only the high tides.
- Arguments:
- see Tide.extrema()
- """
- for t in ifilter(lambda e: e[2] == 'H', self.extrema(*args)):
- yield t
-
- def lows(self, *args):
- """
- Generator yielding only the low tides.
- Arguments:
- see Tide.extrema()
- """
- for t in ifilter(lambda e: e[2] == 'L', self.extrema(*args)):
- yield t
-
- def form_number(self):
- """
- Returns the model's form number, a helpful heuristic for classifying tides.
- """
- k1, o1, m2, s2 = (
- np.extract(self.model['constituent'] == c, self.model['amplitude'])
- for c in [constituent._K1, constituent._O1, constituent._M2, constituent._S2]
- )
- return (k1+o1)/(m2+s2)
-
- def classify(self):
- """
- Classify the tide according to its form number
- """
- form = self.form_number()
- if 0 <= form <= 0.25:
- return 'semidiurnal'
- elif 0.25 < form <= 1.5:
- return 'mixed (semidiurnal)'
- elif 1.5 < form <= 3.0:
- return 'mixed (diurnal)'
- else:
- return 'diurnal'
-
- def extrema(self, t0, t1 = None, partition = 2400.0):
- """
- A generator for high and low tides.
- Arguments:
- t0 -- time after which extrema are sought
- t1 -- optional time before which extrema are sought (if not given, the generator is infinite)
- partition -- number of hours for which we consider the node factors to be constant (default: 2400.0)
- """
- if t1:
- #yield from in python 3.4
- for e in takewhile(lambda t: t[0] < t1, self.extrema(t0)):
- yield e
- else:
- #We assume that extrema are separated by at least delta hours
- delta = np.amin([
- 90.0 / c.speed(astro(t0)) for c in self.model['constituent']
- if not c.speed(astro(t0)) == 0
- ])
- #We search for stationary points from offset hours before t0 to
- #ensure we find any which might occur very soon after t0.
- offset = 24.0
- partitions = (
- Tide._times(t0, i*partition) for i in count()), (Tide._times(t0, i*partition) for i in count(1)
- )
-
- #We'll overestimate to be on the safe side;
- #values outside (start,end) won't get yielded.
- interval_count = int(np.ceil((partition + offset) / delta)) + 1
- amplitude = self.model['amplitude'][:, np.newaxis]
- phase = d2r*self.model['phase'][:, np.newaxis]
-
- for start, end in izip(*partitions):
- speed, [u], [f], V0 = self.prepare(start, Tide._times(start, 0.5*partition))
- #These derivatives don't include the time dependence of u or f,
- #but these change slowly.
- def d(t):
- return np.sum(-speed*amplitude*f*np.sin(speed*t + (V0 + u) - phase), axis=0)
- def d2(t):
- return np.sum(-speed**2.0 * amplitude*f*np.cos(speed*t + (V0 + u) - phase), axis=0)
- #We'll overestimate to be on the safe side;
- #values outside (start,end) won't get yielded.
- intervals = (
- delta * i -offset for i in range(interval_count)), (delta*(i+1) - offset for i in range(interval_count)
- )
- for a, b in izip(*intervals):
- if d(a)*d(b) < 0:
- extrema = fsolve(d, (a + b) / 2.0, fprime = d2)[0]
- time = Tide._times(start, extrema)
- [height] = self.at([time])
- hilo = 'H' if d2(extrema) < 0 else 'L'
- if start < time < end:
- yield (time, height, hilo)
-
- @staticmethod
- def _hours(t0, t):
- """
- Return the hourly offset(s) of a (list of) time from a given time.
- Arguments:
- t0 -- time from which offsets are sought
- t -- times to find hourly offsets from t0.
- """
- if not isinstance(t, Iterable):
- return Tide._hours(t0, [t])[0]
- elif isinstance(t[0], datetime):
- return np.array([(ti-t0).total_seconds() / 3600.0 for ti in t])
- else:
- return t
-
- @staticmethod
- def _partition(hours, partition = 3600.0):
- """
- Partition a sorted list of numbers (or in this case hours).
- Arguments:
- hours -- sorted ndarray of hours.
- partition -- maximum partition length (default: 3600.0)
- """
- partition = float(partition)
- relative = hours - hours[0]
- total_partitions = np.ceil(relative[-1] / partition + 10*np.finfo(np.float).eps).astype('int')
- return [hours[np.floor(np.divide(relative, partition)) == i] for i in range(total_partitions)]
-
- @staticmethod
- def _times(t0, hours):
- """
- Return a (list of) datetime(s) given an initial time and an (list of) hourly offset(s).
- Arguments:
- t0 -- initial time
- hours -- hourly offsets from t0
- """
- if not isinstance(hours, Iterable):
- return Tide._times(t0, [hours])[0]
- elif not isinstance(hours[0], datetime):
- return np.array([t0 + timedelta(hours=h) for h in hours])
- else:
- return np.array(hours)
-
- @staticmethod
- def _tidal_series(t, amplitude, phase, speed, u, f, V0):
- return np.sum(amplitude*f*np.cos(speed*t + (V0 + u) - phase), axis=0)
-
- def normalize(self):
- """
- Adapt self.model so that amplitudes are positive and phases are in [0,360) as per convention
- """
- for i, (_, amplitude, phase) in enumerate(self.model):
- if amplitude < 0:
- self.model['amplitude'][i] = -amplitude
- self.model['phase'][i] = phase + 180.0
- self.model['phase'][i] = np.mod(self.model['phase'][i], 360.0)
-
- @classmethod
- def decompose(
- cls,
- heights,
- t = None,
- t0 = None,
- interval = None,
- constituents = constituent.noaa,
- initial = None,
- n_period = 2,
- callback = None,
- full_output = False
- ):
- """
- Return an instance of Tide which has been fitted to a series of tidal observations.
- Arguments:
- It is not necessary to provide t0 or interval if t is provided.
- heights -- ndarray of tidal observation heights
- t -- ndarray of tidal observation times
- t0 -- datetime representing the time at which heights[0] was recorded
- interval -- hourly interval between readings
- constituents -- list of constituents to use in the fit (default: constituent.noaa)
- initial -- optional Tide instance to use as first guess for least squares solver
- n_period -- only include constituents which complete at least this many periods (default: 2)
- callback -- optional function to be called at each iteration of the solver
- full_output -- whether to return the output of scipy's leastsq solver (default: False)
- """
- if t is not None:
- if isinstance(t[0], datetime):
- hours = Tide._hours(t[0], t)
- t0 = t[0]
- elif t0 is not None:
- hours = t
- else:
- raise ValueError("t can be an array of datetimes, or an array "
- "of hours since t0 in which case t0 must be "
- "specified.")
- elif None not in [t0, interval]:
- hours = np.arange(len(heights)) * interval
- else:
- raise ValueError("Must provide t(datetimes), or t(hours) and "
- "t0(datetime), or interval(hours) and t0(datetime) "
- "so that each height can be identified with an "
- "instant in time.")
-
- #Remove duplicate constituents (those which travel at exactly the same
- #speed, irrespective of phase)
- constituents = list(OrderedDict.fromkeys(constituents))
-
- #No need for least squares to find the mean water level constituent z0,
- #work relative to mean
- constituents = [c for c in constituents if not c == constituent._Z0]
- z0 = np.mean(heights)
- heights = heights - z0
-
- #Only analyse frequencies which complete at least n_period cycles over
- #the data period.
- constituents = [
- c for c in constituents
- if 360.0 * n_period < hours[-1] * c.speed(astro(t0))
- ]
- n = len(constituents)
-
- sort = np.argsort(hours)
- hours = hours[sort]
- heights = heights[sort]
-
- #We partition our time/height data into intervals over which we consider
- #the values of u and f to assume a constant value (that is, their true
- #value at the midpoint of the interval). Constituent
- #speeds change much more slowly than the node factors, so we will
- #consider these constant and equal to their speed at t0, regardless of
- #the length of the time series.
-
- partition = 240.0
-
- t = Tide._partition(hours, partition)
- times = Tide._times(t0, [(i + 0.5)*partition for i in range(len(t))])
-
- speed, u, f, V0 = Tide._prepare(constituents, t0, times, radians = True)
-
- #Residual to be minimised by variation of parameters (amplitudes, phases)
- def residual(hp):
- H, p = hp[:n, np.newaxis], hp[n:, np.newaxis]
- s = np.concatenate([
- Tide._tidal_series(t_i, H, p, speed, u_i, f_i, V0)
- for t_i, u_i, f_i in izip(t, u, f)
- ])
- res = heights - s
- if callback:
- callback(res)
- return res
-
- #Analytic Jacobian of the residual - this makes solving significantly
- #faster than just using gradient approximation, especially with many
- #measurements / constituents.
- def D_residual(hp):
- H, p = hp[:n, np.newaxis], hp[n:, np.newaxis]
- ds_dH = np.concatenate([
- f_i*np.cos(speed*t_i+u_i+V0-p)
- for t_i, u_i, f_i in izip(t, u, f)],
- axis = 1)
-
- ds_dp = np.concatenate([
- H*f_i*np.sin(speed*t_i+u_i+V0-p)
- for t_i, u_i, f_i in izip(t, u, f)],
- axis = 1)
-
- return np.append(-ds_dH, -ds_dp, axis=0)
-
- #Initial guess for solver, haven't done any analysis on this since the
- #solver seems to converge well regardless of the initial guess We do
- #however scale the initial amplitude guess with some measure of the
- #variation
- amplitudes = np.ones(n) * (np.sqrt(np.dot(heights, heights)) / len(heights))
- phases = np.ones(n)
-
- if initial:
- for (c0, amplitude, phase) in initial.model:
- for i, c in enumerate(constituents):
- if c0 == c:
- amplitudes[i] = amplitude
- phases[i] = d2r*phase
-
- initial = np.append(amplitudes, phases)
-
- lsq = leastsq(residual, initial, Dfun=D_residual, col_deriv=True, ftol=1e-7)
-
- model = np.zeros(1+n, dtype=cls.dtype)
- model[0] = (constituent._Z0, z0, 0)
- model[1:]['constituent'] = constituents[:]
- model[1:]['amplitude'] = lsq[0][:n]
- model[1:]['phase'] = lsq[0][n:]
-
- if full_output:
- return cls(model = model, radians = True), lsq
- return cls(model = model, radians = True)
diff --git a/tidetools.py b/tidetools.py
index 424af66..167e0cc 100644
--- a/tidetools.py
+++ b/tidetools.py
@@ -1,191 +1,180 @@
#!/usr/bin/env python
"""
-EVENTUALLY MOVE THIS TO GEOCLAW!
-
-GeoClaw util Module `$CLAW/geoclaw/src/python/geoclaw/tidetools.py`
+GeoClaw tidetools Module `$CLAW/geoclaw/src/python/geoclaw/tidetools.py`
Module provides tide prediction functions.
:Functions:
- - retrieve_constituents - retrieves harmonic constituents from NOAA gauge station
- - retrieve_water_levels - retrieves observed water levels from NOAA's API
- - retrieve_predicted_tide - retrieves predicted tide from NOAA's API
- - datum_value - retrieves datum value for desired datum reference
- - predict_tide - predicts tide with Pytides
- - datetimes - prepares a collection of datetimes from beginning to end dates
- - detide - detides observed water levels
- - surge - predicts surge at NOAA gauge station
+ - fetch_harcon - fetches harmonic constituents for a CO-OPS station
+ - make_pytides_model - builds amplitude and phase lists for a Pytides model
+ - predict_tide - predicts tide at a NOAA station with Pytides
+ - fetch_noaa_tide_data - retrieves water levels and tide predictions from NOAA's API
+ - fetch_datums - fetches datums for a CO-OPS station
+ - datetimes - prepares a collection of datetimes from begin to end dates
+ - detide - detides observed water levels
+ - surge - predicts surge at a NOAA gauge station
"""
from __future__ import absolute_import
from __future__ import print_function
-from collections.abc import Iterable
-from collections import OrderedDict, namedtuple
-from scipy.optimize import leastsq, fsolve
-from itertools import takewhile, count
-from datetime import datetime, timedelta
-from functools import reduce
+
from six.moves.urllib.parse import urlencode
from six.moves.urllib.request import urlopen
+import numpy, pandas
+import datetime
+import io, os, os.path, sys
+import json
-try:
- from itertools import izip, ifilter
-except ImportError: #Python3
- izip = zip
- ifilter = filter
-
-try:
- import requests
- import json
- import string
- import lxml.html as lh
- import pandas as pd
- import operator as op
- import numpy as np
- import io
- import os
- import os.path
-except ImportError as e:
- print(e)
-
-# Here's what we'd like to do: (?)
-#from clawpack.pytides.tide import Tide
-#import clawpack.pytides.constituent as cons
-#
+from clawpack.pytides.tide import Tide
+from clawpack.pytides.constituent import noaa, _Z0
# For now, hardwire in the path...
-import sys, os
CLAW = os.environ['CLAW'] # path to Clawpack files
-pathstr = os.path.join(CLAW, 'tidal-examples/pytides')
-assert os.path.isdir(pathstr), '*** Need clawpack/tidal-examples/pytides ***'
+pathstr = os.path.join(CLAW, 'pytides')
+assert os.path.isdir(pathstr), '*** Need clawpack/pytides ***'
print('Using Pytides from: %s' % pathstr)
if pathstr not in sys.path:
sys.path.insert(0,pathstr)
-from pytides.tide import Tide
+noaa_api = 'https://tidesandcurrents.noaa.gov/api/datagetter'
+url_base = 'https://api.tidesandcurrents.noaa.gov/mdapi/prod/webapi/stations'
+url_base_web = 'https://tidesandcurrents.noaa.gov/stationhome.html'
+######################## Tide Prediction Functions ########################
+def fetch_harcon(station, units='meters', verbose=False):
-d2r, r2d = np.pi/180.0, 180.0/np.pi
+ """
+ Fetch harmonic constituents for CO-OPS station.
+ Convert units of the amplitudes if required.
+ If verbose is True, print out the info and constituents.
+
+ Returns:
+ harcon = list of dictionaries, one for each constituent, with keys:
+ 'number', 'name', 'description', 'amplitude',
+ 'phase_GMT', 'phase_local', 'speed'
+ To print the constituent names:
+ print([h['name'] for h in harcon])
+ amplitudes have been converted to specified units.
+ harcon_info = dictionary with keys:
+ 'units', 'HarmonicConstituents', 'self'
+ harcon_info['HarmonicConstituents'] is the list from which
+ harcon was created, after possibly changing units.
+ """
+
+ assert units in ['meters','feet'], '*** unrecognized units: %s' % units
+
+ try:
+ station_id = str(int(station)) # make sure it's a string of an int
+ except (ValueError, TypeError):
+ raise ValueError('Station cannot be converted to int')
-NOAA_API_URL = 'https://tidesandcurrents.noaa.gov/api/datagetter'
-NOAA_home = 'https://tidesandcurrents.noaa.gov/harcon.html'
+ url_harcon = '%s/%s/harcon.json' % (url_base, station_id)
-######################## Tide Prediction Functions ########################
+ with urlopen(url_harcon) as response:
+ text = response.read().decode('utf-8')
+
+ # convert to a dictionary:
+ p = json.loads(text)
+ hc = p['HarmonicConstituents'] # list of dictionaries
-def retrieve_constituents(station, time_zone='GMT', units='meters', cache_dir=None,
- verbose=True):
-
- """Fetch constituent data for given NOAA tide station.
- By default, retrieved data is cached in the geoclaw scratch directory
- located at:
- $CLAW/geoclaw/scratch
+ noaa_units = p['units'] # probably 'feet'
+
+ if noaa_units == 'feet' and units == 'meters':
+ for h in hc:
+ h['amplitude'] = h['amplitude'] * 0.3048
+
+ if noaa_units == 'meters' and units == 'feet':
+ for h in hc:
+ h['amplitude'] = h['amplitude'] / 0.3048
+
+ if verbose:
+ print('Harmonic constituent info for station %s' % station_id)
+ print('Should agree with info at \n %s?id=%s' \
+ % (url_base_web,station_id))
+ for k in p.keys():
+ if k != 'HarmonicConstituents':
+ print('%s: %s' % (k.rjust(20), p[k]))
+
+ print('Harmonic Constituents will be returned in units = %s:' % units)
+
+ print(' Name amplitude (%s) phase (GMT) speed' % units)
+ for h in hc:
+ print('%s: %11.3f %20.3f %18.6f' % (h['name'].rjust(5),h['amplitude'],
+ h['phase_GMT'],h['speed']))
+
+ harcon = hc
+ harcon_info = p # also return full dictionary
+
+ return harcon, harcon_info
+
+
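The feet-to-meters handling in `fetch_harcon` can be exercised offline; a minimal sketch with a hand-made constituent list (the amplitude values and the `convert_amplitudes` helper are hypothetical illustrations, not part of tidetools or NOAA data):

```python
# Offline sketch of fetch_harcon's unit conversion, applied to a
# made-up two-constituent list (not real NOAA output).
FT_TO_M = 0.3048

def convert_amplitudes(hc, noaa_units, units):
    """Convert constituent amplitudes between feet and meters in place."""
    if noaa_units == 'feet' and units == 'meters':
        for h in hc:
            h['amplitude'] = h['amplitude'] * FT_TO_M
    elif noaa_units == 'meters' and units == 'feet':
        for h in hc:
            h['amplitude'] = h['amplitude'] / FT_TO_M
    return hc

sample = [{'name': 'M2', 'amplitude': 3.25, 'phase_GMT': 10.4, 'speed': 28.984104},
          {'name': 'S2', 'amplitude': 0.87, 'phase_GMT': 36.8, 'speed': 30.0}]
converted = convert_amplitudes(sample, noaa_units='feet', units='meters')
print('%.4f' % converted[0]['amplitude'])   # prints 0.9906
```

Note the conversion mutates the list in place, as the real function does.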
+def make_pytides_model(station):
+ """
+ Fetch harmonic constituents for station and return lists of 37
+ amplitudes (meters) and phases (GMT) as required for a pytides model.
+ """
+
+ print('Fetching harmonic constituents...')
+ harcon, harcon_info = fetch_harcon(station, units='meters', verbose=False)
+
+ numbers = list(range(1,38)) # assume standard 37 constituents
+ harcon_numbers = [h['number'] for h in harcon]
+ # make sure there are the expected number and in the right order:
+ assert harcon_numbers == numbers, '*** unexpected harcon_numbers = %s' \
+ % harcon_numbers
+
+ amplitudes = [h['amplitude'] for h in harcon]
+ phases = [h['phase_GMT'] for h in harcon]
+
+ return amplitudes, phases
+
+
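The ordering check that `make_pytides_model` performs before building its amplitude/phase lists can be sketched with a hypothetical 3-constituent list (the real list has NOAA's standard 37 constituents):

```python
# Offline sketch of make_pytides_model's validation and extraction,
# using made-up constituent records.
def extract_model_lists(harcon, expected_numbers):
    """Check constituents arrive in the expected order, then split out
    the amplitude and phase lists a Pytides model needs."""
    harcon_numbers = [h['number'] for h in harcon]
    assert harcon_numbers == expected_numbers, \
        '*** unexpected harcon_numbers = %s' % harcon_numbers
    amplitudes = [h['amplitude'] for h in harcon]
    phases = [h['phase_GMT'] for h in harcon]
    return amplitudes, phases

harcon = [{'number': 1, 'amplitude': 0.99, 'phase_GMT': 10.4},
          {'number': 2, 'amplitude': 0.27, 'phase_GMT': 36.8},
          {'number': 3, 'amplitude': 0.21, 'phase_GMT': 8.1}]
amplitudes, phases = extract_model_lists(harcon, [1, 2, 3])
print(amplitudes)   # [0.99, 0.27, 0.21]
```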
+def predict_tide(station, begin_date, end_date, datum='MTL', time_zone='GMT', units='meters'):
+ """Fetch datum value for given NOAA tide station.
:Required Arguments:
- station (string): 7 character station ID
+ - station (string): 7 character station ID
+ - begin_date (datetime): start of date/time range of prediction
+ - end_date (datetime): end of date/time range of prediction
:Optional Arguments:
+ - datum (string): MTL for tide prediction
- time_zone (string): see NOAA API documentation for possible values
- units (string): see NOAA API documentation for possible values
- - cache_dir (string): alternative directory to use for caching data
- - verbose (bool): whether to output informational messages
:Returns:
- - constituent_dict (dictionary): dictionary of tidal constituents for NOAA gauge station
+ - heights (list of float): tide heights at 6-minute intervals
"""
-
- def units_num(units):
- if (units == 'meters'):
- return 0
- elif (time_zone == 'feet'):
- return 1
+ #These are the NOAA constituents, in the order presented on NOAA's website.
+ constituents = [c for c in noaa if c != _Z0]
+ amplitudes, phases = make_pytides_model(station)
+
+ #We can add a constant offset - set to MTL
+ datums = fetch_datums(station)
+ desired_datum = [float(d['value']) for d in datums[0] if d['name'] == datum][0]
+ MSL = [float(d['value']) for d in datums[0] if d['name'] == 'MSL'][0]
+ offset = MSL - desired_datum
+ constituents.append(_Z0)
+ phases.append(0)
+ amplitudes.append(offset)
+
+ #Build the model
+ assert(len(constituents) == len(phases) == len(amplitudes))
+ model = numpy.zeros(len(constituents), dtype = Tide.dtype)
+ model['constituent'] = constituents
+ model['amplitude'] = amplitudes
+ model['phase'] = phases
+ tide = Tide(model = model, radians = False)
- def time_zone_num(time_zone):
- if (time_zone == 'GMT'):
- return 0
- elif (time_zone == 'Local'):
- return 1
-
- def get_noaa_params(station, time_zone, units):
- noaa_params = {
- 'unit': units_num(units),
- 'timezone': time_zone_num(time_zone),
- 'id': station
- }
- return noaa_params
+ #Time Calculations
+ delta = (end_date-begin_date)/datetime.timedelta(hours=1) + .1
+ times = Tide._times(begin_date, numpy.arange(0, delta, .1))
- # use geoclaw scratch directory for caching by default
- if cache_dir is None:
- if 'CLAW' not in os.environ:
- raise ValueError('CLAW environment variable not set')
- claw_dir = os.environ['CLAW']
- cache_dir = os.path.join(claw_dir, 'geoclaw', 'scratch') #### cache_dir
+ #Height Calculations
+ heights_arrays = [tide.at([times[i]]) for i in range(len(times))]
+ heights = [val for sublist in heights_arrays for val in sublist]
- def get_cache_path(station, time_zone, units):
- filename = '{}_{}_constituents'.format(time_zone, units)
- abs_cache_dir = os.path.abspath(cache_dir)
- return os.path.join(abs_cache_dir, 'constituents', station, filename)
-
- def save_to_cache(cache_path, data):
- # make parent directories if they do not exist
- parent_dir = os.path.dirname(cache_path)
- if not os.path.exists(parent_dir):
- os.makedirs(parent_dir)
-
- component_array = pd.DataFrame(data)
- component_array.to_csv(cache_path, index=False)
-
- def parse(cache_path):
- # read data into structured array, skipping header row if present
- data = pd.read_csv(cache_path)
- component_array = pd.DataFrame(data)
- component_dict = component_array.to_dict(orient='list')
- return component_dict
-
- #Requests URL
- def fetch_data(station, time_zone, units):
- noaa_params = get_noaa_params(station, time_zone, units)
- cache_path = get_cache_path(station, time_zone, units)
-
- # use cached data if available
- if os.path.exists(cache_path):
- if verbose:
- print('Using cached constituent data for station {}'.format(station))
- return parse(cache_path)
-
-
- # otherwise, retrieve data from NOAA and cache it
- if verbose:
- print('Fetching constituent data from NOAA for station {}'.format(station))
-
- #Forms URL
- url = '{}?{}'.format(NOAA_home, urlencode(noaa_params))
-
- page = requests.get(url)
- doc = lh.fromstring(page.content)
- tr_elements = doc.xpath('//tr')
- col = [((t.text_content(),[])) for t in tr_elements[0]]
- for j in range(1, len(tr_elements)):
- T, i = tr_elements[j], 0
- for t in T.iterchildren():
- col[i][1].append(t.text_content())
- i+=1
-
- constituent_dict = {title:column for (title,column) in col}
-
- # if there were no errors, then cache response
- save_to_cache(cache_path, constituent_dict)
-
- return constituent_dict
-
-
- try:
- constituents = fetch_data(station, time_zone, units)
-
- except:
- print('*** Fetching NOAA Constituents failed, returning None')
- constituents = None
-
- return constituents
+ return heights
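The 0.1-hour (6-minute) prediction grid built inside `predict_tide` can be checked in isolation; a sketch using only `datetime` and `numpy`, with the station lookup and Pytides evaluation omitted:

```python
import datetime
import numpy

def prediction_times(begin_date, end_date):
    # Mirror predict_tide's grid: hourly offsets from 0 through the full
    # range (plus a small pad) in steps of 0.1 hour = 6 minutes.
    delta = (end_date - begin_date) / datetime.timedelta(hours=1) + .1
    hours = numpy.arange(0, delta, .1)
    return [begin_date + datetime.timedelta(hours=float(h)) for h in hours]

t0 = datetime.datetime(2020, 1, 1, 0, 0)
t1 = datetime.datetime(2020, 1, 1, 2, 0)
times = prediction_times(t0, t1)
# First point is t0, spacing is 6 minutes, and the grid reaches t1.
```

The `.1` pad guards against floating-point truncation of `numpy.arange` at the end of the range.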
def fetch_noaa_tide_data(station, begin_date, end_date, datum='MTL', time_zone='GMT', units='metric', cache_dir=None, verbose=True):
@@ -234,7 +223,8 @@ def fetch(product, expected_header, col_idx, col_types):
if verbose:
print('Fetching {} data from NOAA for station {}'.format(
product, station))
- full_url = '{}?{}'.format(NOAA_API_URL, urlencode(noaa_params))
+ full_url = '{}?{}'.format(noaa_api, urlencode(noaa_params))
+
with urlopen(full_url) as response:
text = response.read().decode('utf-8')
with io.StringIO(text) as data:
@@ -284,7 +274,7 @@ def save_to_cache(cache_path, data):
def parse(data, col_idx, col_types, header):
# read data into structured array, skipping header row if present
- a = np.genfromtxt(data, usecols=col_idx, dtype=col_types,
+ a = numpy.genfromtxt(data, usecols=col_idx, dtype=col_types,
skip_header=int(header), delimiter=',',
missing_values='')
@@ -319,150 +309,11 @@ def parse(data, col_idx, col_types, header):
# ensure that date/time ranges are the same
if (date_time is not None) and (date_time2 is not None):
- if not np.array_equal(date_time, date_time2):
+ if not numpy.array_equal(date_time, date_time2):
raise ValueError('Received data for different times')
return date_time, water_level, prediction
-
-def datum_value(station, datum, time_zone='GMT', units='metric'):
- """Fetch datum value for given NOAA tide station.
- :Required Arguments:
- - station (string): 7 character station ID
- - datum (string): MSL, MTL
- :Optional Arguments:
- - time_zone (string): see NOAA API documentation for possible values
- - units (string): see NOAA API documentation for possible values
- :Returns:
- - datum_value (float): value for requested datum reference
- """
- def get_noaa_params(station, time_zone, units):
- noaa_params = {
- 'product': 'datums',
- 'units': units,
- 'time_zone': time_zone,
- 'station': station,
- 'application': 'Clawpack',
- 'format':'json'
- }
- return noaa_params
-
- #Scrapes MTL/MSL Datum Value
- def get_datum(station, time_zone, units):
- noaa_params = get_noaa_params(station, time_zone, units)
- url = '{}?{}'.format(NOAA_API_URL, urlencode(noaa_params))
- page_data = requests.get(url)
- data = page_data.json()['datums']
- datum_value = [d['v'] for d in data if d['n'] == datum]
- return datum_value
-
- try:
- datum_value = float(get_datum(station, time_zone, units)[0])
- except:
- print('*** Fetching datum value failed, returning None')
- datum_value = None
-
- return datum_value
-
-
-def predict_tide(station, begin_date, end_date, datum='MTL', time_zone='GMT', units='meters'):
- """Fetch datum value for given NOAA tide station.
- :Required Arguments:
- - station (string): 7 character station ID
- - begin_date (datetime): start of date/time range of prediction
- - end_date (datetime): end of date/time range of prediction
- :Optional Arguments:
- - datum (string): MTL for tide prediction
- - time_zone (string): see NOAA API documentation for possible values
- - units (string): see NOAA API documentation for possible values
- :Returns:
- - heights (float): tide heights
- """
- #These are the NOAA constituents, in the order presented on NOAA's website.
- from pytides.constituent import noaa, _Z0
-
- constituents = [c for c in noaa if c != _Z0]
- noaa_values = retrieve_constituents(station, time_zone, units)
- noaa_amplitudes = [float(amplitude) for amplitude in noaa_values['Amplitude']]
- noaa_phases = [float(phases) for phases in noaa_values['Phase']]
-
- #We can add a constant offset - set to MTL
- # MTL = datum_value(args[0], 'MTL')
- desired_datum = datum_value(station, datum)
- MSL = datum_value(station, 'MSL')
- offset = MSL - desired_datum
- constituents.append(_Z0)
- noaa_phases.append(0)
- noaa_amplitudes.append(offset)
-
- #Build the model
- assert(len(constituents) == len(noaa_phases) == len(noaa_amplitudes))
- model = np.zeros(len(constituents), dtype = Tide.dtype)
- model['constituent'] = constituents
- model['amplitude'] = noaa_amplitudes
- model['phase'] = noaa_phases
- tide = Tide(model = model, radians = False)
-
- #Time Calculations
- delta = (end_date-begin_date)/timedelta(hours=1) + .1
- times = Tide._times(begin_date, np.arange(0, delta, .1))
-
- #Height Calculations
- heights_arrays = [tide.at([times[i]]) for i in range(len(times))]
- heights = [val for sublist in heights_arrays for val in sublist]
-
- return heights
-
-
-def datetimes(begin_date, end_date):
- #Time Calculations
- delta = (end_date-begin_date)/timedelta(hours=1) + .1
- times = Tide._times(begin_date, np.arange(0, delta, .1))
- return times
-
-
-def detide(NOAA_observed_water_level, predicted_tide):
- # NOAA observed water level - predicted tide
- return [(NOAA_observed_water_level[i] - predicted_tide[i]) for i in range(len(NOAA_observed_water_level))]
-
-#Surge Implementation
-def surge(station, beg_date, end_date, landfall_date):
- """Fetch datum value for given NOAA tide station.
- :Required Arguments:
- - station (string): 7 character station ID
- - begin_date (datetime): start of date/time range of prediction
- - end_date (datetime): end of date/time range of prediction
- - landfall_date (datetime): approximate time of landfall for reference
- :Optional Arguments:
- - datum (string): MTL for tide prediction and retrieval
- - time_zone (string): see NOAA API documentation for possible values
- :Returns:
- - times (float): times with landfall event as reference
- - surge (float): surge heights
- """
- predicted_tide = predict_tide(station, beg_date, end_date)
- NOAA_times, NOAA_observed_water_level, NOAA_predicted_tide = fetch_noaa_tide_data(station, beg_date, end_date)
-
- #detides NOAA observed water levels with predicted tide
- surge = detide(NOAA_observed_water_level, predicted_tide)
- #modifies NOAA times to datetimes
- times = [((pd.to_datetime(time).to_pydatetime())-landfall_date)/timedelta(days=1) for time in NOAA_times]
-
- return times, surge
-
-
-
-# ======================
-# From fetch_noaa_gauge_info.py
-
-
-url_base = 'https://api.tidesandcurrents.noaa.gov/mdapi/prod/webapi/stations'
-
-url_base_web = 'https://tidesandcurrents.noaa.gov/stationhome.html'
-
-station = '9441102' # Westport
-
-# Datums:
def fetch_datums(station, units='meters', verbose=False):
@@ -480,8 +331,8 @@ def fetch_datums(station, units='meters', verbose=False):
datums_info = dictionary with these keys:
'accepted', 'superseded', 'epoch', 'units', 'OrthometricDatum',
'datums', 'LAT', 'LATdate', 'LATtime', 'HAT', 'HATdate',
- 'HATtime', 'min', 'mindate', 'mintime', 'max', 'maxdate',
- 'maxtime', 'disclaimers', 'DatumAnalysisPeriod', 'NGSLink',
+ 'HATtime', 'min', 'mindate', 'mintime', 'max', 'maxdate',
+ 'maxtime', 'disclaimers', 'DatumAnalysisPeriod', 'NGSLink',
'ctrlStation', 'self'
datums_info['datums'] is the list from which
datums was created, after possibly changing units.
@@ -517,9 +368,9 @@ def fetch_datums(station, units='meters', verbose=False):
print('Datums info for station %s' % station_id)
print('Should agree with info at \n %s?id=%s' \
% (url_base_web,station_id))
- for k in p.keys():
+ for k in p.keys():
if k != 'datums':
- print('%s: %s' % (k.rjust(20), p[k]))
+ print('%s: %s' % (k.rjust(20), p[k]))
print('Datums will be returned in units = %s:' % units)
for d in datums:
print('%s: %11.3f %s' % (d['name'].rjust(7),d['value'],units))
@@ -528,88 +379,76 @@ def fetch_datums(station, units='meters', verbose=False):
return datums, datums_info
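`predict_tide` builds its constant Z0 offset from the `fetch_datums` output; the extraction can be sketched offline with made-up datum values (not real station data):

```python
# Offline sketch of the datum-offset extraction predict_tide performs
# on the fetch_datums list; values below are hypothetical.
def datum_offset(datums, datum='MTL'):
    """Return MSL minus the requested datum: the constant Z0 amplitude
    appended to the Pytides model."""
    desired = [float(d['value']) for d in datums if d['name'] == datum][0]
    msl = [float(d['value']) for d in datums if d['name'] == 'MSL'][0]
    return msl - desired

datums = [{'name': 'MHW', 'value': 2.134},
          {'name': 'MSL', 'value': 1.372},
          {'name': 'MTL', 'value': 1.340}]
offset = datum_offset(datums, 'MTL')
print('%.3f' % offset)   # prints 0.032
```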
-def fetch_harcon(station, units='meters', verbose=False):
+def datetimes(begin_date, end_date):
+ #Time Calculations
+ delta = (end_date-begin_date)/datetime.timedelta(hours=1) + .1
+ times = Tide._times(begin_date, numpy.arange(0, delta, .1))
+ return times
- """
- Fetch harmonic constituents for CO-OPS station.
- Convert units of the amplitudes if required.
- If verbose is True, print out the info and constinuents.
-
- Returns:
- harcon = list of dictionaries, one for each constituent, with keys:
- 'number', 'name', 'description', 'amplitude',
- 'phase_GMT', 'phase_local', 'speed'
- To print the constituent names:
- print([h['name'] for h in harcon])
- amplitudes have been converted to specified units.
- harcon_info = dictionary with keys:
- 'units', 'HarmonicConstiuents', 'self'
- harcon_info['HarmonicConstiuents'] is the list from which
- harcon was created, after possibly changing units.
- """
-
- assert units in ['meters','feet'], '*** unrecognized units: %s' % units
-
- try:
- station_id = str(int(station)) # make sure it's a string of an int
- except:
- raise ValueError('Station cannot be converted to int')
- url_harcon = '%s/%s/harcon.json' % (url_base, station_id)
+def detide(NOAA_observed_water_level, predicted_tide):
+ # NOAA observed water level - predicted tide
+ return [(NOAA_observed_water_level[i] - predicted_tide[i]) for i in range(len(NOAA_observed_water_level))]
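`detide` is an elementwise subtraction of the predicted tide from the observed water level. A self-contained check of that behavior (the sample values are made up for illustration):

```python
def detide(observed, predicted):
    # surge residual = observed water level minus predicted tide, elementwise
    return [observed[i] - predicted[i] for i in range(len(observed))]

# made-up observed levels and a constant predicted tide
surge_vals = detide([2.0, 2.5, 3.0], [1.5, 1.5, 1.5])
# → [0.5, 1.0, 1.5]
```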
- with urlopen(url_harcon) as response:
- text = response.read().decode('utf-8')
-
- # convert to a dictionary:
- p = json.loads(text)
- hc = p['HarmonicConstituents'] # list of dictionaries
+#Surge Implementation
+def surge(station, beg_date, end_date, landfall_date):
+    """Compute storm surge at given NOAA tide station by detiding
+    observed water levels with the locally predicted tide.
+    :Required Arguments:
+      - station (string): 7 character station ID
+      - beg_date (datetime): start of date/time range of prediction
+      - end_date (datetime): end of date/time range of prediction
+      - landfall_date (datetime): approximate time of landfall for reference
+    :Returns:
+      - times (list of float): times in days relative to landfall_date
+      - surge (list of float): surge heights (observed minus predicted)
+    """
+ predicted_tide = predict_tide(station, beg_date, end_date)
+ NOAA_times, NOAA_observed_water_level, NOAA_predicted_tide = fetch_noaa_tide_data(station, beg_date, end_date)
- noaa_units = p['units'] # probably 'feet'
+    # detide NOAA observed water levels using the locally predicted tide
+    surge = detide(NOAA_observed_water_level, predicted_tide)
+    # convert NOAA time stamps to days relative to landfall_date
+    times = [((pandas.to_datetime(time).to_pydatetime())-landfall_date)/datetime.timedelta(days=1) for time in NOAA_times]
- if noaa_units == 'feet' and units == 'meters':
- for h in hc:
- h['amplitude'] = h['amplitude'] * 0.3048
-
- if noaa_units == 'meters' and units == 'feet':
- for h in hc:
- h['amplitude'] = h['amplitude'] / 0.3048
+ return times, surge
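The times returned by `surge` are expressed as signed days relative to landfall. A self-contained sketch of just that conversion (`days_from_landfall` is a hypothetical helper; the pandas parsing and NOAA fetch are omitted):

```python
import datetime

def days_from_landfall(times, landfall_date):
    # hypothetical helper mirroring the conversion inside surge():
    # express each timestamp as signed days relative to landfall
    day = datetime.timedelta(days=1)
    return [(t - landfall_date) / day for t in times]

landfall = datetime.datetime(2021, 10, 1, 12, 0)
ts = [landfall - datetime.timedelta(hours=12),
      landfall,
      landfall + datetime.timedelta(days=2)]
rel = days_from_landfall(ts, landfall)
# → [-0.5, 0.0, 2.0]
```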
- if verbose:
- print('Harmonic constituent info for station %s' % station_id)
- print('Should agree with info at \n %s?id=%s' \
- % (url_base_web,station_id))
- for k in p.keys():
- if k != 'HarmonicConstituents':
- print('%s: %s' % (k.rjust(20), p[k]))
-
- print('Harmonic Constituents will be returned in units = %s:' % units)
- print(' Name amplitude (%s) phase (GMT) speed' % units)
- for h in hc:
- print('%s: %11.3f %20.3f %18.6f' % (h['name'].rjust(5),h['amplitude'],
- h['phase_GMT'],h['speed']))
-
- harcon = hc
- harcon_info = p # also return full dictionary
-
- return harcon, harcon_info
-
-def make_pytides_model(station):
+def new_tide_instance_from_existing(constit_list,existing_tide_instance):
"""
- Fetch harmonic constituents for station and return lists of 37
- amplitudes (meters) and phases (GMT) as required for a pytides model.
+ constit_list is the list of constituents to be used in the
+ new_tide_instance.
+ The values of the amplitudes and phases for each of them is to be
+ pulled from an existing_tide_instance. If no such constituent is in
+ the existing_tide_instance, an error message is printed.
"""
-
- print('Fetching harmonic constituents...')
- harcon, harcon_info = fetch_harcon(station, units='meters', verbose=False)
-
- numbers = list(range(1,38)) # assume standard 37 constituents
- harcon_numbers = [h['number'] for h in harcon]
- # make sure there are the expected number and in the right order:
- assert harcon_numbers == numbers, '*** unexpected harcon_numbers = %s' \
- % harcon_numbers
-
- amplitudes = [h['amplitude'] for h in harcon]
- phases = [h['phase_GMT'] for h in harcon]
-
- return amplitudes, phases
\ No newline at end of file
+ existing_constits = existing_tide_instance.model['constituent']
+ existing_amps = existing_tide_instance.model['amplitude']
+ existing_phases = existing_tide_instance.model['phase']
+ len_existing = len(existing_constits)
+ new_model = numpy.zeros(len(constit_list), dtype = Tide.dtype)
+ new_constits=[]; new_amps=[]; new_phases=[];
+ for ic in constit_list:
+ success = False
+ for j in range(len_existing):
+ ie = existing_constits[j]
+ if (ie.name == ic.name): #grab it
+ success = True
+ new_constits.append(ie)
+ new_amps.append(existing_amps[j])
+ new_phases.append(existing_phases[j])
+        if not success:
+            print('Did not find constituent name: ', ic.name,
+                  'in existing tide instance')
+ new_model['constituent'] = new_constits
+ new_model['amplitude'] = new_amps
+ new_model['phase'] = new_phases
+
+    # The new_model is now complete, so make a tide instance
+    # called new_tide from it.
+ new_tide = Tide(model = new_model, radians = False)
+ return new_tide
+
+
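The matching logic in `new_tide_instance_from_existing` can be illustrated without pytides. Below is a dict-based stand-in (`subset_model` and its inputs are hypothetical, not the structured `Tide.dtype` array the real function builds): keep requested constituents that exist, and report any that are missing.

```python
def subset_model(names, existing):
    # existing: dict mapping constituent name -> (amplitude, phase);
    # keep only the requested names, warning on any that are absent
    kept = {}
    for name in names:
        if name in existing:
            kept[name] = existing[name]
        else:
            print('Did not find constituent name:', name)
    return kept

# made-up amplitudes/phases; 'Q9' is deliberately absent
model = subset_model(['M2', 'S2', 'Q9'],
                     {'M2': (1.2, 30.0), 'S2': (0.4, 60.0)})
```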