Add rolling-statistics to Projects #32

Open: wants to merge 70 commits into base: master

Commits (70)
05b07d2
setting -2 for endIndex default to the length of the unique donors
ed-nykaza Oct 18, 2018
77ef21d
add missing argument to checkIndex
ed-nykaza Oct 18, 2018
67527e4
Add rolling-statistics to Projects
jameno Oct 19, 2018
1002079
add batch processing for local time estimate with multicores
ed-nykaza Oct 23, 2018
0a29e7c
create a placeholder when a download is in progress
ed-nykaza Oct 23, 2018
cf0679a
run on multiple cores
ed-nykaza Oct 24, 2018
59a0804
add the processing time to the script
ed-nykaza Oct 24, 2018
4ff76a1
check track of run time
ed-nykaza Oct 24, 2018
526dbed
keep track of run time
ed-nykaza Oct 24, 2018
6d647f8
make anonymizing optional and change input to a string
ed-nykaza Oct 24, 2018
737ffb6
sort columns by alpha order, then <>.<>, and then fields w embedded json
ed-nykaza Oct 24, 2018
fafc768
keep track of processing time and clean up code and output
ed-nykaza Oct 24, 2018
230c93b
Merge pull request #33 from tidepool-org/etn/opt-for-aws
Oct 25, 2018
5e90969
initial commit
ed-nykaza Oct 26, 2018
97dc327
Initial commit of node-data-tools
Oct 29, 2018
5da5d25
Refactor processing into functions
Oct 29, 2018
4c931d7
Refactor xlsxStreamWriter into function
Oct 29, 2018
58806a0
merge on "id" since "id" is unique per each row of data
ed-nykaza Oct 29, 2018
d08970f
Merge pull request #34 from tidepool-org/etn/fix-scheduleName-merge-bug
Oct 29, 2018
c5691a6
Further refactoring to export a library
Oct 30, 2018
de8dafd
Shouldn't commit local settings
Oct 30, 2018
91aa777
Count affected records during processing, not writing
Oct 31, 2018
24af3a0
Code cleanup
Oct 31, 2018
7b2a031
Prepare for npm publishing, add command line utility
Nov 1, 2018
e5309cc
Give node-data-tools its own .gitignore
Nov 1, 2018
8d00240
Don't ship .eslintrc in npm module
Nov 1, 2018
6fcf59f
Add command pathway to cli
Nov 1, 2018
563769f
Update README to match CLI changes
Nov 1, 2018
edd17b7
Fix bug introduced in earlier refactor
Nov 13, 2018
2037bc5
initial test for clean.py
Nov 18, 2018
4b9c6d6
updated duplicates and added test for round time
Nov 19, 2018
3313a20
pytest packages should work from conda install
Nov 19, 2018
d7677da
Merge pull request #35 from tidepool-org/pazaan/add-node-tools
pazaan Nov 20, 2018
0c9b883
pytest packages should work from conda install
Nov 28, 2018
2037368
Locks event-stream to 3.3.4
Nov 28, 2018
33260b0
updated readme with testing details
Nov 28, 2018
2f1646c
Merge pull request #40 from tidepool-org/pazaan/lock-event-stream
pazaan Nov 28, 2018
189d78c
Changes after PR review
Nov 28, 2018
039bf08
Merge pull request #37 from rpwils/master
Nov 28, 2018
2031d21
Merge pull request #41 from tidepool-org/pazaan/lock-event-stream
pazaan Nov 28, 2018
b30d74d
initial loop report parser
Jan 25, 2019
7388693
example (WIP) using parser
ed-nykaza Jan 25, 2019
2eb7162
additional parsing
Jan 26, 2019
5fa04d1
Merge remote-tracking branch 'origin/rpw/loop-report-parser' into rpw…
Jan 26, 2019
d167462
fixed error in carb store and check for section in dictionary prior t…
Jan 26, 2019
9ef2fab
refactored to return one dictionary and ability to parse multiple fil…
Jan 26, 2019
5e4ceb4
added basal profile
Jan 27, 2019
d1c70b3
updated with premeal, workout and suspendthresholdunit
Jan 27, 2019
289d187
minor updates on naming
Jan 27, 2019
ceb9c4c
updated insulin_sensitivity_factor_schedule to match the output style…
Jan 27, 2019
fd649f1
fixed the insulin_sensitivity_factor_timeZone bug
Jan 28, 2019
957d6f7
converted carb_ratio_schedule to a list of dicts
Jan 28, 2019
7becfbd
removed commented code
Jan 28, 2019
bf66861
refactor example and save json
ed-nykaza Jan 28, 2019
b719e60
updated strings to floats
Jan 28, 2019
eaa9acb
test files
Jan 29, 2019
2445216
fixed error in carb
Jan 29, 2019
f159f50
test loop report
Jan 29, 2019
4c22140
added check for file and directory
Jan 29, 2019
946316f
added invalid directory test
Jan 29, 2019
3f8add6
updated error messages
Jan 29, 2019
f06e26e
update example to export csv file
ed-nykaza Jan 30, 2019
066904b
include index in output and the same file name for input and output
ed-nykaza Jan 30, 2019
cf148d4
minor refactoring
Feb 2, 2019
931dc40
Merge pull request #46 from tidepool-org/rpw/loop-report-parser
Feb 2, 2019
427bb5a
Updated Rolling Statistics Code Framework
jameno Apr 5, 2019
306f9c9
Merge remote-tracking branch 'origin/jm/rolling-statistics' into jm/r…
jameno Apr 5, 2019
4bf7c92
Update README
jameno Apr 5, 2019
bd5b3c7
Update README
jameno Apr 5, 2019
21e4d37
Update README
jameno Apr 5, 2019
14 changes: 14 additions & 0 deletions .gitignore
@@ -1,11 +1,15 @@
# Environments
.env

# Jupyter Notebook
.ipynb_checkpoints

# R Environment
.RHistory

# Python
__pycache__

# Distribution / packaging
build
dist
@@ -17,3 +21,13 @@ work-record-archive
export
internal
data

# Test
htmlcov
.pytest_cache



projects/rolling-statistics/results/

*.html
31 changes: 31 additions & 0 deletions README.md
@@ -9,6 +9,7 @@ You are welcome to install the full Anaconda installer, but will only need
Miniconda to get started.

## Getting started
### Project Setup
1. Install [Miniconda](https://conda.io/miniconda.html) for your platform.
1. In a terminal, navigate to the data-analytics directory where the environment.yml
is located.
@@ -24,3 +25,33 @@ In Bash run `source activate tidepool-analytics`, or in the Anaconda Prompt
run `conda activate tidepool-analytics` to start the environment.

Run `deactivate` to stop the environment.

## Testing
This project uses the [pytest](https://docs.pytest.org/en/latest/) testing framework.

After following the project setup instructions, including creating and activating the
virtual environment, you can run the tests from Bash:

``` bash
# Run the tests
pytest
```

## Running Tests with Test Coverage
This project uses [pytest-cov](https://pytest-cov.readthedocs.io/en/latest/) to run the tests and
produce a code coverage report.

To produce a basic test coverage report, run the following from within the virtual environment
created during project setup. The coverage summary is printed directly in the terminal.
``` bash
# Run the tests with coverage
pytest --cov
```

To produce a detailed test coverage report, run the following command from within the virtual
environment created during project setup. This creates an `htmlcov` directory containing an
`index.html` page with coverage details.
``` bash
# Run the tests with coverage and produce an HTML report
pytest --cov --cov-report html
```
3 changes: 3 additions & 0 deletions environment.yml
@@ -14,6 +14,9 @@ dependencies:
  - plotly
  - r
  - r-essentials
  - pytest
  - pytest-cov

  - pip:
    - python-dotenv
    - -e git+https://github.com/tidepool-org/data-analytics#egg=tidals\&subdirectory=tidepool-analysis-tools
16 changes: 16 additions & 0 deletions node-data-tools/.eslintrc
@@ -0,0 +1,16 @@
{
  "extends": "airbnb",
  "parser": "babel-eslint",
  "plugins": ["lodash"],
  "parserOptions": {
    "ecmaVersion": 6
  },
  "rules": {
    "no-plusplus": ["error", {
      "allowForLoopAfterthoughts": true
    }]
  },
  "settings": {
    "lodash": 3
  }
}
8 changes: 8 additions & 0 deletions node-data-tools/.gitignore
@@ -0,0 +1,8 @@
# General
.DS_Store
# Node
node_modules
example-data
test
.eslintcache
yarn.lock
6 changes: 6 additions & 0 deletions node-data-tools/.npmignore
@@ -0,0 +1,6 @@
node_modules
example-data
test
.eslintcache
.eslintrc
yarn.lock
88 changes: 88 additions & 0 deletions node-data-tools/README.md
@@ -0,0 +1,88 @@
# @tidepool/data-tools

Node streams to work with [Tidepool](https://tidepool.org) API data

## Installation

``` bash
$ npm install @tidepool/data-tools
```

## Example

``` js
import TidepoolDataTools from '@tidepool/data-tools';

// Set up the processing stream
const processingStream = readStream
  .pipe(TidepoolDataTools.jsonParser())
  .pipe(TidepoolDataTools.tidepoolProcessor());

// Now attach multiple parallel output streams
processingStream
  .pipe(TidepoolDataTools.xlsxStreamWriter(xlsxStream));

processingStream
  .pipe(someOtherOutputStream);
```

## Methods

### TidepoolDataTools.jsonParser()

Convenience function to return a `JSONStream.parse('*')`.
See also the [JSONStream NPM Module](https://www.npmjs.com/package/JSONStream).

### TidepoolDataTools.tidepoolProcessor()

Returns a `through` stream that processes the JSON object data according to the config.

### TidepoolDataTools.xlsxStreamWriter(outputStream)

Writes an xlsx Workbook to `outputStream` with different Worksheets according to the config.
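
Putting the methods together, a minimal end-to-end sketch might look like the following. The piping pattern mirrors the Example section above; the `fs` streams and file paths are assumptions for illustration.

``` js
import fs from 'fs';
import TidepoolDataTools from '@tidepool/data-tools';

// Hypothetical input and output paths -- adjust to your own data.
const readStream = fs.createReadStream('./example-data/tidepool-data.json');
const xlsxStream = fs.createWriteStream('./example-data/export/tidepool-data.xlsx');

// Parse the JSON data, process each record according to the config,
// then write the result to an xlsx workbook.
const processingStream = readStream
  .pipe(TidepoolDataTools.jsonParser())
  .pipe(TidepoolDataTools.tidepoolProcessor());

processingStream
  .pipe(TidepoolDataTools.xlsxStreamWriter(xlsxStream));
```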

## Command Line Usage

`@tidepool/data-tools` includes a command-line tool.
<!--
[`npx`](https://ghub.io/npx):

```sh
npx flat foo.json
```

Or install globally:
-->

Install globally:

```sh
$ npm i -g @tidepool/data-tools && tidepool-data-tools --help
Usage: tidepool-data-tools [options] [command]

Options:
  -V, --version  output the version number
  -h, --help     output usage information

Commands:
  convert [options]  Convert data between different formats

$ tidepool-data-tools convert --help
Usage: tidepool-data-tools [options]

Options:
  -V, --version                            output the version number
  -i, --input-tidepool-data <file>         csv, xlsx, or json file that contains Tidepool data
  -c, --config <file>                      a JSON file that contains the field export configuration
  --salt <salt>                            salt used in the hashing algorithm (default: "no salt specified")
  -o, --output-data-path <path>            the path where the data is exported (default: "./example-data/export")
  -f, --output-format <format>             the path where the data is exported (default: [])
  --start-date [date]                      filter data by startDate and EndDate
  --end-date [date]                        filter data by startDate and EndDate
  --merge-wizard-data                      option to merge wizard data with bolus data. Default is true
  --filterByDatesExceptUploadsAndSettings  upload and settings data can occur before and after start and end dates, so include ALL upload and settings data in export
  -h, --help                               output usage information
```

## License
Licensed under the BSD-2-Clause License
8 changes: 8 additions & 0 deletions node-data-tools/cli.sh
@@ -0,0 +1,8 @@
#!/bin/sh
# -*- coding: utf-8, tab-width: 2 -*-
# Find a working Node.js binary ("nodejs" or "node").
# Note: the brace expansion and `exec -a` below assume a bash-like /bin/sh.
for PROG in node{js,}; do
  </dev/null "$PROG" 2>/dev/null && break
done

# Re-exec as that binary with the esm loader, forwarding all arguments.
exec -a "$0" "$PROG" -r esm -- index.js "$@"; exit $?

138 changes: 138 additions & 0 deletions node-data-tools/config.json
@@ -0,0 +1,138 @@
{
  "basal": {
    "time": {},
    "deviceTime": {},
    "type": {},
    "deliveryType": {},
    "duration": {},
    "expectedDuration": {},
    "rate": {},
    "percent": {},
    "scheduleName": {},
    "suppressed.scheduleName": {},
    "suppressed.deliveryType": {},
    "suppressed.type": {},
    "suppressed.rate": {},
    "payload": {"stringify": true},
    "annotations": {"stringify": true},
    "deviceId": {},
    "id": {},
    "uploadId": {}
  },
  "cbg": {
    "time": {},
    "deviceTime": {},
    "type": {},
    "value": {},
    "units": {},
    "payload": {"stringify": true},
    "annotations": {"stringify": true},
    "deviceId": {},
    "id": {},
    "uploadId": {}
  },
  "pumpSettings": {
    "time": {},
    "deviceTime": {},
    "type": {},
    "insulinSensitivity": {"stringify": true},
    "activeSchedule": {},
    "carbRatio": {"stringify": true},
    "bgTarget": {},
    "units.carb": {},
    "units.bg": {},
    "basalSchedules": {"stringify": true},
    "payload": {"stringify": true},
    "annotations": {"stringify": true},
    "deviceId": {},
    "id": {},
    "uploadId": {}
  },
  "smbg": {
    "time": {},
    "deviceTime": {},
    "type": {},
    "subType": {},
    "value": {},
    "units": {},
    "payload": {"stringify": true},
    "annotations": {"stringify": true},
    "deviceId": {},
    "id": {},
    "uploadId": {}
  },
  "wizard": {
    "time": {},
    "deviceTime": {},
    "type": {},
    "bgInput": {},
    "bgTarget": {"stringify": true},
    "insulinSensitivity": {},
    "carbInput": {},
    "insulinCarbRatio": {},
    "insulinOnBoard": {},
    "bolus": {},
    "units": {},
    "recommended.carb": {},
    "recommended.correction": {},
    "recommended.net": {},
    "payload": {"stringify": true},
    "annotations": {"stringify": true},
    "deviceId": {},
    "id": {},
    "uploadId": {}
  },
  "bolus": {
    "time": {},
    "deviceTime": {},
    "type": {},
    "subType": {},
    "normal": {},
    "expectedNormal": {},
    "payload": {"stringify": true},
    "annotations": {"stringify": true},
    "deviceId": {},
    "id": {},
    "uploadId": {}
  },
  "deviceEvent": {
    "time": {},
    "deviceTime": {},
    "type": {},
    "subType": {},
    "value": {},
    "units": {},
    "duration": {},
    "change.from": {},
    "change.to": {},
    "change.agent": {},
    "status": {},
    "primeTarget": {},
    "volume": {},
    "reason.suspended": {},
    "reason.resumed": {},
    "payload": {"stringify": true},
    "annotations": {"stringify": true},
    "deviceId": {},
    "id": {},
    "uploadId": {}
  },
  "upload": {
    "time": {},
    "timezone": {},
    "timeProcessing": {},
    "type": {},
    "computerTime": {},
    "version": {},
    "deviceSerialNumber": {},
    "deviceModel": {},
    "deviceManufacturers": {"stringify": true},
    "deviceTags": {"stringify": true},
    "payload": {"stringify": true},
    "annotations": {"stringify": true},
    "deviceId": {},
    "id": {},
    "byUser": {},
    "uploadId": {}
  }
}
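
The config above maps each Tidepool data type to the columns exported for that type; dotted names such as `suppressed.rate` address nested properties, and fields marked `"stringify": true` are written as serialized JSON rather than expanded. Below is a minimal sketch of how such a mapping could be applied to a single record; it illustrates the role of the config only, is not the library's actual implementation, and the `exportRow` helper and example record are hypothetical.

``` js
// Illustration only: flatten one record using a per-type field config
// shaped like the config.json above (excerpt shown for the "cbg" type).
const config = {
  cbg: {
    time: {},
    value: {},
    units: {},
    payload: { stringify: true },
    id: {},
    uploadId: {},
  },
};

function exportRow(record) {
  const fields = config[record.type] || {};
  const row = {};
  Object.keys(fields).forEach((field) => {
    // Dotted field names address nested properties, e.g. "suppressed.rate".
    const value = field
      .split('.')
      .reduce((obj, key) => (obj == null ? undefined : obj[key]), record);
    // Fields flagged with "stringify" are serialized to JSON text.
    row[field] = fields[field].stringify && value !== undefined ? JSON.stringify(value) : value;
  });
  return row;
}

// Hypothetical cbg record:
console.log(exportRow({
  type: 'cbg',
  time: '2019-02-02T12:00:00Z',
  value: 6.2,
  units: 'mmol/L',
  payload: { trend: 'flat' },
  id: 'abc123',
  uploadId: 'up456',
}));
```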