Review the code used to create the analysis data #33
Comments
@edambo I think you will take a first pass at this, correct? |
I'm not sure I understand @mbcann01. Are you asking if I worked on these already? I think I did, unless you noticed something I missed. |
Hi @edambo ! I haven't checked yet. We just talked about it on Monday, so I didn't assume that you'd already done it. If so, great! |
@mbcann01 Ah, yes. I think I did this already if you mean deleting the files. |
@edambo What about the other tasks listed above? Did you happen to do either of them? |
@mbcann01 Yes, I think they were done before the holidays. I changed the style as you requested, but it's possible I missed something. Let me know if this is the case. |
@edambo , I started looking through the code. In data_01_aps_investigations_import.qmd line 26, the code to import the data looks like this:
aps_inv <- read_csv("../data/filemaker_pro_exports/aps_investigations_import.csv")
However, that produces an error on my computer because that is not the file path to the data. Of course, I can just change the file path in the code to match the file path on my computer, but I want it to run on your computer too. What is the path to this file on your computer? |
That is very odd because this path works on my system. I'm not sure why it's not working. I just ran it and there were no errors.
Ebie
|
I haven't changed that specifically, but I've made other changes to the data management files. I was waiting to finish working on the codebook files to pull everything together, but I will go ahead and send a pull request now.
|
2024-03-15 Left off at:
Copy and paste for commits:
|
Part of #33
- Use the `here` package to facilitate file import and export.
- Checked for overlap with `qaqc/data_01_aps_recode_factors.Rmd`. There was nothing in the QAQC file that wasn't also in the data import file. Deleted the QAQC file.
- Added two carriage returns before level one headings.
Part of #33
- Use the `here` package to facilitate file import and export.
- Made headings more consistent.
- Checked for overlap with qaqc/check_consenting_participants.qmd. There was nothing in the QAQC file that wasn't also in the data import file. Deleted the QAQC file.
- Added two carriage returns before level one headings.
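As a rough illustration of the `here` change named in these commits (and the fix for the file path problem discussed earlier in the thread), here is a minimal sketch, assuming the `data/filemaker_pro_exports/` folder sits at the project root:

```r
library(readr)
library(here)

# here() builds the path from the project root rather than from the working
# directory of whoever knits the file, so the same line runs on any
# collaborator's machine.
aps_inv <- read_csv(
  here("data", "filemaker_pro_exports", "aps_investigations_import.csv")
)
```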
2024-03-21 Left off at:
Copy and paste for commits:
|
2024-03-22 Left off at:
Copy and paste for commits:
|
2024-04-04 Left off at:
Reviewing
Copy and paste for commits:
|
2024-04-05 Reviewing
Note: I got an error saying NAs were introduced by coercion. It had to do with the way "Don't know" was written. Just run When you are done reviewing Then, go back to reviewing |
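For reference, a minimal sketch of the kind of coercion that triggers that warning, using a toy column where "Don't know" is mixed in with numeric codes (the column name and the cleanup step are illustrative, not the repository's actual code):

```r
library(dplyr)

# Toy data: numeric codes stored as text, with a "Don't know" response mixed in.
responses <- tibble(report_freq = c("1", "2", "Don't know", "4"))

# as.numeric() silently turns "Don't know" into NA and raises the warning
# "NAs introduced by coercion".
responses %>% mutate(report_freq_num = as.numeric(report_freq))

# Converting the known non-numeric responses to NA first makes the intent
# explicit and avoids the warning.
responses %>%
  mutate(
    report_freq_num = if_else(report_freq == "Don't know", NA_character_, report_freq),
    report_freq_num = as.numeric(report_freq_num)
  )
```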
Part of #33 While reviewing this file, I discovered that there is an NA value for MedStar ID in the self_report_import.rds data frame. I need to remove that row and then come back to this file.
Part of #33
- Use the `here` package to facilitate file import and export.
- Figure out if the row with an NA value for MedStar ID needs to be removed or not.
#33 I created this code while reviewing `data_06_self_report_import.qmd`. As I'm writing this, it didn't seem like the payoff of changing all of the code was worth the effort. However, I want to save the code -- at least for now -- in case I change my mind or think I will find it useful in some other context. So, I moved it to a new file: exploratory/recoding_factoring_relocating.qmd
#33 Using recoding_factoring_relocating.R to make the code easier to read.
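To make the idea concrete, here is one plausible shape for that kind of recode-factor-relocate helper; the signature and behavior are guesses for illustration only, not the repository's actual code:

```r
# Hypothetical helper: recode a numeric column to a labelled factor and place
# the factor version immediately after the original column.
recode_factor_relocate <- function(df, col, codes, labels) {
  new_col <- paste0(col, "_f")
  # Map the numeric codes to a labelled factor version of the column.
  df[[new_col]] <- factor(df[[col]], levels = codes, labels = labels)
  # Relocate the factor version next to the original.
  dplyr::relocate(df, dplyr::all_of(new_col), .after = dplyr::all_of(col))
}

# Example with toy data and a made-up column name.
df <- dplyr::tibble(report_aps_2cat = c(1, 0, 1))
recode_factor_relocate(df, "report_aps_2cat", codes = c(1, 0), labels = c("Yes", "No"))
```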
2024-04-09 Reviewing
When you are done reviewing Then, go back to reviewing
|
2024-04-10, 2024-04-11 Reviewing
Copy and paste for commits:
|
#33 Use the double-colon method (`package::function()`) instead.
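For context, a tiny sketch of the double-colon style; the file path is the one from the import discussion above, while the selected column name is a made-up example:

```r
# No library() calls are needed; each call names its package explicitly,
# which makes dependencies obvious and avoids masking conflicts.
aps_inv <- readr::read_csv(
  here::here("data", "filemaker_pro_exports", "aps_investigations_import.csv")
)
# `medstar_id` is an illustrative column name, not necessarily the real one.
aps_ids <- dplyr::select(aps_inv, medstar_id)
```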
#33 Delete this file when Brad has finished reviewing the repo.
2024-04-11
|
2024-04-16
|
Part of #33
- Removed the tidyselect code being used to select columns inside of across() for recoding factors. Now, we use explicit column names instead so that the code is easier to reason about.
- Started using the functions in recoding_factoring_relocating.R and nums_to_na.R to clean and transform categorical variables.
- Changed coding for all "Yes/No" columns from "1/2" to "1/0".
- Spot check the factor code.
- Finish the recode_factor_relocate function (totally optional).
- Use the `here` package to facilitate file import and export.
- Made headings more consistent.
- Checked for overlap with qaqc/data_01_self_report_recode_factors.Rmd. After a review, I concluded that we are safe to delete the QAQC file.
Part of #33
- Started using the functions in recoding_factoring_relocating.R and nums_to_na.R to clean and transform categorical variables.
- Changed coding for all "Yes/No" columns from "1/2" to "1/0".
- Spot check the factor code.
- Use the `here` package to facilitate file import and export.
- Made headings more consistent.
- Checked for overlap with qaqc/data_01_aps_recode_factors.Rmd. After a review, I concluded that we are safe to delete the QAQC file.
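A minimal sketch of the recoding pattern described in the two commits above, using hypothetical column names and toy data; the project's real logic lives in recoding_factoring_relocating.R and nums_to_na.R, so this only illustrates the explicit-column-name style inside `across()` and the 1/2 to 1/0 change:

```r
library(dplyr)

# Toy stand-in for the self-report data; the column names are hypothetical.
self_rep <- tibble(
  report_aps    = c(1, 2, 2, 1),
  report_police = c(2, 1, 2, 2)
)

self_rep <- self_rep %>%
  mutate(
    # Columns are named explicitly inside across() (no tidyselect helpers),
    # so it is obvious which variables are being recoded.
    across(
      c(report_aps, report_police),
      # Recode 1/2 ("Yes"/"No") to 1/0; anything else becomes NA.
      ~ case_when(.x == 1 ~ 1L, .x == 2 ~ 0L),
      .names = "{.col}_2cat"
    ),
    # Labelled factor versions of the recoded 1/0 columns.
    across(
      c(report_aps_2cat, report_police_2cat),
      ~ factor(.x, levels = c(1, 0), labels = c("Yes", "No")),
      .names = "{.col}_f"
    )
  )
```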
Part of #33
- Checked MedStar IDs for participants who did not give consent to participate.
- Removed records from the follow-up interview survey data sets for MedStar IDs that did not have a consent document on file.
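A hedged sketch of how that consent check might look, assuming a data frame of signed consent documents and a follow-up data frame that share a `medstar_id` key (the data frames, column names, and values here are all toy examples):

```r
library(dplyr)

# Toy stand-ins for the consent log and one follow-up interview data set.
consent   <- tibble(medstar_id = c("A001", "A002"))
follow_up <- tibble(medstar_id = c("A001", "A002", "A003"), score = c(10, 12, 9))

# Keep only follow-up records whose MedStar ID has a consent document on file.
follow_up_consented <- semi_join(follow_up, consent, by = "medstar_id")

# Records dropped because no consent document was found.
dropped <- anti_join(follow_up, consent, by = "medstar_id")
```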
#33 Change the hyphen to an underscore in the file name.
2024-04-26
Copy and paste for commits:
|
Did this while working on #33 Move unit tests for nums_to_na.R and recoding_factoring_relocating.R to the tests folder. This removes the problem of the test data hanging around in the global environment and is ultimately a more sustainable solution.
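A small sketch of what one of those relocated tests might look like if the project uses `testthat`; the file locations and the `nums_to_na()` signature shown here are assumptions, not the repository's actual interface:

```r
# tests/testthat/test-nums_to_na.R  (assumed layout)
library(testthat)

# Assumed location of the helper; adjust to wherever nums_to_na.R actually lives.
source(here::here("R", "nums_to_na.R"))

test_that("nums_to_na() turns the specified codes into NA", {
  # Hypothetical behavior: codes 7 and 9 ("Don't know"/"Refused") become NA.
  x <- c(1, 2, 7, 9)
  expect_equal(nums_to_na(x, na_values = c(7, 9)), c(1, 2, NA, NA))
})
```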
Overview
Several GRAs have worked hard to create an analysis data frame from the separate files exported by FileMaker Pro as part of the DETECT follow-up interviews. I need to review them all for correctness and stylistic consistency. Additionally, I need to create some instructions for using and updating the files in the future.
Links
Tasks
Use the `here` package to facilitate file import and export.