
Run fix_typos.sh on repo to catch spelling errors #1082

Merged · 6 commits · Aug 30, 2023
2 changes: 1 addition & 1 deletion Dockerfile
@@ -10,7 +10,7 @@ LABEL org.opencontainers.image.source="https://github.com/insarlab/MintPy"
LABEL org.opencontainers.image.documentation="https://mintpy.readthedocs.io/en/latest/"
LABEL org.opencontainers.image.licenses="GPL-3.0-or-later"

# Dynamic lables to define at build time via `docker build --label`
# Dynamic labels to define at build time via `docker build --label`
# LABEL org.opencontainers.image.created=""
# LABEL org.opencontainers.image.version=""
# LABEL org.opencontainers.image.revision=""
4 changes: 2 additions & 2 deletions docs/CONTRIBUTING.md
@@ -36,7 +36,7 @@ is a great starting point if you are new to version control.
- `origin`, which refers to your personal fork

+ Setting up [`pre-commit`](https://pre-commit.com/) within `MintPy` directory:
- Run `pre-commit install` to set up the git hook scripts, so that `pre-commit` will run automatically on `git commit`. If the `No .pre-commit-config.yaml file was found` error occurrs, update your local MintPy to the latest upstream version to have this config file.
- Run `pre-commit install` to set up the git hook scripts, so that `pre-commit` will run automatically on `git commit`. If the `No .pre-commit-config.yaml file was found` error occurs, update your local MintPy to the latest upstream version to have this config file.


#### 2. Develop your contribution: ####
@@ -56,7 +56,7 @@ is a great starting point if you are new to version control.
git checkout -b seasonal_fitting
```

+ Work on your idea, run tests and commit locally (`git add` and `git commit`) and/or to your fork on GitHub as you progress (`git push` in command line or [GitHub Desktop](https://desktop.github.com/) with graphical user interface). Use a clear commit message describing the motivation of a change, the nature of a bug for bug fixes or some details on what an enchancement does.
+ Work on your idea, run tests and commit locally (`git add` and `git commit`) and/or to your fork on GitHub as you progress (`git push` in command line or [GitHub Desktop](https://desktop.github.com/) with graphical user interface). Use a clear commit message describing the motivation of a change, the nature of a bug for bug fixes or some details on what an enhancement does.

+ Run the [overall test](./CONTRIBUTING.md#testing) locally.

4 changes: 2 additions & 2 deletions docs/FAQs.md
@@ -6,7 +6,7 @@ For line-of-sight (LOS) phase in the unit of radians, i.e. 'unwrapPhase' dataset

For LOS displacement (velocity) in the unit of meters (m/yr), i.e. 'timeseries' dataset in `timeseries.h5` file, positive value represents motion toward the satellite (uplift for pure vertical motion).

### 2. How to prepare the input for MintPy if I am using currently un-supported InSAR softwares?
### 2. How to prepare the input for MintPy if I am using currently un-supported InSAR software?

The input of MintPy routine workflow (`smallbaselineApp.py`) is a stack of unwrapped interferograms. For "stack", we mean all the interferograms (unwrapped phase and spatial coherence) and geometries (DEM, incidence angle, etc.) have the same spatial extent and same spatial resolution, either in geo-coordinates or radar (range-doppler) coordinates. The input has 2 components: data and attributes.

@@ -39,7 +39,7 @@ For dataset in geo-coordinates [recommended]:

For dataset in radar-coordinates, the extra lookup table file(s) is required (_e.g._ lat/lon.rdr for `ISCE-2`, sim_\*.UTM_TO_RDC for `Gamma`, geo_\*.trans for `ROI_PAC`).

All the files above should be in the same spatial extent and same spatial resolution (except for the lookup table in geo-coordinates from Gamma/ROI_PAC). If they are not (e.g. different row/column number, different spatial extent in terms of SNWE, different spatial resolution, etc.), the easiest way is to geocode them with the same ouput spatial extent and same output spatial resolution.
All the files above should be in the same spatial extent and same spatial resolution (except for the lookup table in geo-coordinates from Gamma/ROI_PAC). If they are not (e.g. different row/column number, different spatial extent in terms of SNWE, different spatial resolution, etc.), the easiest way is to geocode them with the same output spatial extent and same output spatial resolution.

MintPy read data files via `mintpy.utils.readfile.read()`. It supports the following two types of file formats:

2 changes: 1 addition & 1 deletion docs/api/data_structure.md
@@ -116,7 +116,7 @@ coordinates : RADAR
Start Date: 20141213
End Date: 20180619
Number of acquisitions : 98
Std. of acquisition times : 0.99 yeras
Std. of acquisition times : 0.99 years
----------------------
List of dates:
['20141213', '20141225', '20150307', '20150319', '20150331', '20150412', '20150424', '20150506', '20150518', '20150530', '20150611', '20150623', '20150717', '20150729', '20150822', '20150903', '20150915', '20150927', '20151009', '20151021', '20151102', '20151114', '20151126', '20151208', '20151220', '20160101', '20160113', '20160125', '20160206', '20160218', '20160301', '20160406', '20160418', '20160430', '20160512', '20160524', '20160605', '20160629', '20160711', '20160723', '20160804', '20160816', '20160828', '20160909', '20160921', '20161003', '20161015', '20161027', '20161108', '20161120', '20161202', '20161214', '20161226', '20170107', '20170119', '20170131', '20170212', '20170224', '20170308', '20170320', '20170401', '20170413', '20170425', '20170507', '20170519', '20170531', '20170612', '20170624', '20170706', '20170718', '20170730', '20170811', '20170823', '20170904', '20170916', '20170928', '20171010', '20171022', '20171103', '20171115', '20171127', '20171209', '20171221', '20180102', '20180114', '20180126', '20180207', '20180219', '20180303', '20180315', '20180327', '20180408', '20180420', '20180502', '20180514', '20180526', '20180607', '20180619']
2 changes: 1 addition & 1 deletion docs/api/doc_generation.md
@@ -1,6 +1,6 @@
We use [Doxygen](http://www.doxygen.nl/) to generate the API documentation automatically.

+ Install Doxygen following [link](http://www.doxygen.nl/download.html) if you have not already doen so.
+ Install Doxygen following [link](http://www.doxygen.nl/download.html) if you have not already done so.

+ Run doxygen command with `MintPy/docs/Doxyfile` to generate the API documentation in html and latex format (to `$MINTPY_HOME/docs/api_docs` by default).

2 changes: 1 addition & 1 deletion docs/dask.md
@@ -106,7 +106,7 @@ smallbaselineApp.py smallbaselineApp.cfg

#### 2.3 Configuration parameters in `~/.config/dask/mintpy.yaml` ####

We provide a brief description below for the most commonly used configurations of dask-jobqueue for MintPy. Users are recommended to check [Dask-Jobqueue](https://jobqueue.dask.org/en/latest/configuration-setup.html) for more detailed and comprehensive documentaion.
We provide a brief description below for the most commonly used configurations of dask-jobqueue for MintPy. Users are recommended to check [Dask-Jobqueue](https://jobqueue.dask.org/en/latest/configuration-setup.html) for more detailed and comprehensive documentation.

+ **name:** Name of the worker job as it will appear to the job scheduler. Any values are perfectly fine.

2 changes: 1 addition & 1 deletion docs/docker.md
@@ -45,7 +45,7 @@ docker run -it -v </path/to/data/dir>:/home/mambauser/data ghcr.io/insarlab/mint
docker run -it -v </path/to/data/dir>:/home/mambauser/data ghcr.io/insarlab/mintpy:latest smallbaselineApp.py /home/mambauser/data/FernandinaSenDT128/mintpy/FernandinaSenDT128.txt
```

Or run the following to launch the Jupyter Lab server, then copy and paste the printed `http://localhost:8888/lab?token=` url in a brower.
Or run the following to launch the Jupyter Lab server, then copy and paste the printed `http://localhost:8888/lab?token=` url in a browser.

```shell
# to launch a Jupyter Notebook frontend, replace "lab" with "notebook" in the command below
2 changes: 1 addition & 1 deletion docs/google_earth.md
@@ -26,7 +26,7 @@ save_kmz_timeseries.py embeds a [dygraphs](http://dygraphs.com) javascript for i

The script also use the [regions KML feature](https://developers.google.com/kml/documentation/regions) to support very large datasets without sacrificing resolution. It divides the data matrix into regionalized boxes, nests them using network links so that Google Earth could load them in a "smart" way.

**Alert: for very large datasets, the default settings are not generic due to the various computer memories, data sizes and different prefered details. The user is highly recommended to read the following to understand how the regions feature works and adjust parameters accordingly.**
**Alert: for very large datasets, the default settings are not generic due to the various computer memories, data sizes and different preferred details. The user is highly recommended to read the following to understand how the regions feature works and adjust parameters accordingly.**

1. Level of Detail (LOD)

2 changes: 1 addition & 1 deletion docs/hdfeos5.md
@@ -86,7 +86,7 @@ E.g. S1_IW12_128_0593_0597_20141213_20170928.he5

### 4. Web Viewer ###

HDF-EOS5 file format is used as the input of the University of Miami's web viewer for InSAR time-series products. Below is a screenshot of the web viewer for the dataset on Kuju volcano from ALOS-1 acending track 422.
HDF-EOS5 file format is used as the input of the University of Miami's web viewer for InSAR time-series products. Below is a screenshot of the web viewer for the dataset on Kuju volcano from ALOS-1 ascending track 422.

<p align="center"><b>http://insarmaps.miami.edu</b><br></p>

2 changes: 1 addition & 1 deletion docs/installation.md
@@ -224,7 +224,7 @@ We recommend setting the following environment variables, e.g. in your <code>~/.

```bash
export VRT_SHARED_SOURCE=0 # do not share dataset while using GDAL VRT in a multi-threading environment
export HDF5_DISABLE_VERSION_CHECK=2 # supress the HDF5 version warning message (0 for abort; 1/2 for printout/suppress warning message)
export HDF5_DISABLE_VERSION_CHECK=2 # suppress the HDF5 version warning message (0 for abort; 1/2 for printout/suppress warning message)
export HDF5_USE_FILE_LOCKING=FALSE # request that HDF5 file locks should NOT be used
```

79 changes: 79 additions & 0 deletions scripts/fix_typos.sh
@@ -0,0 +1,79 @@
#!/bin/sh
# -*- coding: utf-8 -*-
###############################################################################
# $Id$
#
# Project: GDAL
# Purpose: (Interactive) script to identify and fix typos
# Author: Even Rouault <even.rouault at spatialys.com>
#
###############################################################################
# Copyright (c) 2016, Even Rouault <even.rouault at spatialys.com>
#
# Permission is hereby granted, free of charge, to any person obtaining a
# copy of this software and associated documentation files (the "Software"),
# to deal in the Software without restriction, including without limitation
# the rights to use, copy, modify, merge, publish, distribute, sublicense,
# and/or sell copies of the Software, and to permit persons to whom the
# Software is furnished to do so, subject to the following conditions:
#
# The above copyright notice and this permission notice shall be included
# in all copies or substantial portions of the Software.
#
# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS
# OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
# FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL
# THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
# LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING
# FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER
# DEALINGS IN THE SOFTWARE.
###############################################################################

set -eu

SCRIPT_DIR=$(dirname "$0")
case $SCRIPT_DIR in
"/"*)
;;
".")
SCRIPT_DIR=$(pwd)
;;
*)
SCRIPT_DIR=$(pwd)"/"$(dirname "$0")
;;
esac
GDAL_ROOT=$SCRIPT_DIR/..
cd "$GDAL_ROOT"

if ! test -d fix_typos; then
# Get our fork of codespell that adds --words-white-list and full filename support for -S option
mkdir fix_typos
(cd fix_typos
git clone https://github.com/rouault/codespell
(cd codespell && git checkout gdal_improvements)
# Aggregate base dictionary + QGIS one + Debian Lintian one
curl https://raw.githubusercontent.com/qgis/QGIS/master/scripts/spell_check/spelling.dat | sed "s/:/->/" | sed "s/:%//" | grep -v "colour->" | grep -v "colours->" > qgis.txt
curl https://salsa.debian.org/lintian/lintian/-/raw/master/data/spelling/corrections | grep "||" | grep -v "#" | sed "s/||/->/" > debian.txt
cat codespell/data/dictionary.txt qgis.txt debian.txt | awk 'NF' > gdal_dict.txt
echo "difered->deferred" >> gdal_dict.txt
echo "differed->deferred" >> gdal_dict.txt
grep -v 404 < gdal_dict.txt > gdal_dict.txt.tmp
mv gdal_dict.txt.tmp gdal_dict.txt
)
fi

EXCLUDED_FILES="*/.svn*,*/.git/*,configure,config.log,config.status,config.guess,config.sub,*/autom4te.cache/*,*.ai,*.svg"
AUTHORIZED_LIST="$AUTHORIZED_LIST,te" # gdalwarp switch
AUTHORIZED_LIST="$AUTHORIZED_LIST,LaTeX,BibTeX"
AUTHORIZED_LIST="$AUTHORIZED_LIST,ALOS,Alos"
AUTHORIZED_LIST="$AUTHORIZED_LIST,lon,Lon,LON"
# New Mintpy ones
AUTHORIZED_LIST="$AUTHORIZED_LIST,alos,ALOS,alosStack"
AUTHORIZED_LIST="$AUTHORIZED_LIST,NED"
AUTHORIZED_LIST="$AUTHORIZED_LIST,waterMask,watermask"
AUTHORIZED_LIST="$AUTHORIZED_LIST,smallbaselineApp"
AUTHORIZED_LIST="$AUTHORIZED_LIST,Nealy" # Author in reference

python fix_typos/codespell/codespell.py -w -i 3 -q 2 -S "$EXCLUDED_FILES,./autotest/*,./build*/*" \
--words-white-list="$AUTHORIZED_LIST" \
-D ./fix_typos/gdal_dict.txt .
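
A minimal usage sketch, assuming the script is run as committed at `scripts/fix_typos.sh`, with git, curl, and python on the PATH, plus network access on the first run (it clones the codespell fork and builds `fix_typos/gdal_dict.txt`):

```shell
# The script resolves its own location and cd's to the repository root,
# so it can be launched from any working directory inside the clone.
# It appends to $AUTHORIZED_LIST before first assigning it while 'set -u'
# is active, so pre-seeding the variable (an assumption, not part of the
# PR's documented usage) avoids an unbound-variable abort.
AUTHORIZED_LIST="" sh scripts/fix_typos.sh
```

With the flags at the end of the script, codespell writes fixes in place (`-w`), prompts interactively for ambiguous entries (`-i 3`), and runs with reduced verbosity (`-q 2`).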
4 changes: 2 additions & 2 deletions src/mintpy/asc_desc2horz_vert.py
@@ -78,7 +78,7 @@ def get_design_matrix4horz_vert(los_inc_angle, los_az_angle, horz_az_angle=-90):
+ dV * cos(inc_angle)
with dH_perp = 0.0
This could be easily modified to support multiple view geometry
(e.g. two adjcent tracks from asc & desc) to resolve 3D
(e.g. two adjacent tracks from asc & desc) to resolve 3D

Parameters: los_inc_angle - 1D np.ndarray in size of (num_file), LOS incidence angle in degree.
los_az_angle - 1D np.ndarray in size of (num_file), LOS azimuth angle in degree.
@@ -155,7 +155,7 @@ def run_asc_desc2horz_vert(inps):
Returns: inps.outfile - str(s) output file(s)
"""

## 1. calculate the overlaping area in lat/lon
## 1. calculate the overlapping area in lat/lon
atr_list = [readfile.read_attribute(fname, datasetName=inps.ds_name) for fname in inps.file]
S, N, W, E = get_overlap_lalo(atr_list)
lat_step = float(atr_list[0]['Y_STEP'])
2 changes: 1 addition & 1 deletion src/mintpy/cli/closure_phase_bias.py
@@ -23,7 +23,7 @@
# Notebook tutorial:
# https://nbviewer.org/github/insarlab/MintPy-tutorial/blob/main/applications/closure_phase_bias.ipynb

# create mask for areas suseptible to biases
# create mask for areas susceptible to biases
closure_phase_bias.py -i inputs/ifgramStack.h5 --nl 5 -a mask
closure_phase_bias.py -i inputs/ifgramStack.h5 --nl 20 -a mask --num-sigma 2.5

2 changes: 1 addition & 1 deletion src/mintpy/cli/dem_error.py
@@ -41,7 +41,7 @@ def create_parser(subparsers=None):
parser = arg_utils.create_argument_parser(
name, synopsis=synopsis, description=synopsis, epilog=epilog, subparsers=subparsers)

parser.add_argument('ts_file', help='Time-series HDF5 file to be corrrected.')
parser.add_argument('ts_file', help='Time-series HDF5 file to be corrected.')
parser.add_argument('-g', '--geometry', dest='geom_file',
help='geometry file including datasets:\n'+
'incidence angle\n'+
2 changes: 1 addition & 1 deletion src/mintpy/cli/dem_gsi.py
@@ -20,7 +20,7 @@
"""

NOTE = """DEHM: Digital Ellipsoidal Height Model
yyxx.dehm with yy and xx indicating the coordinates of the upper left corner of the firt pixel.
yyxx.dehm with yy and xx indicating the coordinates of the upper left corner of the first pixel.
where latitude = (yy + 1) / 1.5, longitude = xx + 100
"""

2 changes: 1 addition & 1 deletion src/mintpy/cli/diff.py
@@ -58,7 +58,7 @@ def cmd_line_parse(iargs=None):
ftype = readfile.read_attribute(inps.file1)['FILE_TYPE']
if ftype in ['timeseries', 'ifgramStack', '.unw']:
if len(inps.file2) > 1:
raise SystemExit(f'ERROR: ONLY ONE file2 is inputed for {ftype} type!')
raise SystemExit(f'ERROR: ONLY ONE file2 is inputted for {ftype} type!')

# check: --output (output file is required for number of files >=2)
if not inps.out_file:
2 changes: 1 addition & 1 deletion src/mintpy/cli/generate_mask.py
@@ -40,7 +40,7 @@
# common mask file of pixels in all connected components / with non-zero unwrapped phase
generate_mask.py ifgramStack.h5 --nonzero -o maskConnComp.h5 --update

# interative polygon selection of region of interest
# interactive polygon selection of region of interest
# useful for custom mask generation in unwrap error correction with bridging
generate_mask.py waterMask.h5 -m 0.5 --roipoly
generate_mask.py azOff.h5 --roipoly --view-cmd "-v -0.1 0.1"
2 changes: 1 addition & 1 deletion src/mintpy/cli/geocode.py
@@ -70,7 +70,7 @@ def create_parser(subparsers=None):
out = parser.add_argument_group('grid in geo-coordinates')
out.add_argument('-b', '--bbox', dest='SNWE', type=float, nargs=4, metavar=('S', 'N', 'W', 'E'),
help='Bounding box for the area of interest.\n'
'using coordinates of the uppler left corner of the first pixel\n'
'using coordinates of the upper left corner of the first pixel\n'
' and the lower right corner of the last pixel\n'
"for radar2geo, it's the output spatial extent\n"
"for geo2radar, it's the input spatial extent")
6 changes: 3 additions & 3 deletions src/mintpy/cli/ifgram_inversion.py
@@ -80,7 +80,7 @@ def create_parser(subparsers=None):
help=('Enable inversion with minimum-norm deformation phase,'
' instead of the default minimum-norm deformation velocity.'))
#solver.add_argument('--norm', dest='residualNorm', default='L2', choices=['L1', 'L2'],
# help='Optimization mehtod, L1 or L2 norm. (default: %(default)s).')
# help='Optimization method, L1 or L2 norm. (default: %(default)s).')

# uncertainty propagation
parser.add_argument('--calc-cov', dest='calcCov', action='store_true',
@@ -97,9 +97,9 @@
help='minimum redundancy of interferograms for every SAR acquisition. (default: %(default)s).')
# for offset ONLY
#mask.add_argument('--mask-min-snr', dest='maskMinSNR', type=float, default=10.0,
# help='minimum SNR to diable/ignore the threshold-based masking [for offset only].')
# help='minimum SNR to disable/ignore the threshold-based masking [for offset only].')
#mask.add_argument('--mask-min-area-size', dest='maskMinAreaSize', type=float, default=16.0,
# help='minimum area size to diable/ignore the threshold-based masking [for offset only]')
# help='minimum area size to disable/ignore the threshold-based masking [for offset only]')

# computing
parser = arg_utils.add_memory_argument(parser)
2 changes: 1 addition & 1 deletion src/mintpy/cli/local_oscilator_drift.py
@@ -26,7 +26,7 @@
"""

def create_parser(subparsers=None):
synopsis = 'Local Oscilator Drift (LOD) correction of Envisat'
synopsis = 'Local Oscillator Drift (LOD) correction of Envisat'
epilog = REFERENCE + '\n' + TEMPLATE + '\n' + EXAMPLE
name = __name__.split('.')[-1]
parser = create_argument_parser(
2 changes: 1 addition & 1 deletion src/mintpy/cli/modify_network.py
@@ -115,7 +115,7 @@ def cmd_line_parse(iargs=None):
if not os.path.isfile(inps.maskFile):
inps.maskFile = None

# check: --exclude-ifg-index option (convert input index to continous index list)
# check: --exclude-ifg-index option (convert input index to continuous index list)
inps.excludeIfgIndex = read_input_index_list(inps.excludeIfgIndex, stackFile=inps.file)

# check: -t / --template option
2 changes: 1 addition & 1 deletion src/mintpy/cli/plate_motion.py
@@ -37,7 +37,7 @@
"""

EXAMPLE = """example:
# Use build-in plate motion model of Table 1 from Altamimi et al. (2017)
# Use built-in plate motion model of Table 1 from Altamimi et al. (2017)
plate_motion.py -g inputs/geometryGeo.h5 --plate Arabia
plate_motion.py -g inputs/geometryRadar.h5 --plate Eurasia
