Submission 454, Petitpierre/di Lenardo/Rappo #18
Conversation
Co-Authored-By: Rémi Petitpierre <[email protected]>
Caution: Review failed. The pull request is closed.

Walkthrough

This update introduces a manuscript project using Quarto, centering on the automation of vectorization for historical cadastral plans in Switzerland. It includes an index file that details the research methodology and findings, a bibliography to support the study, and a requirements file for necessary dependencies. These additions enhance the project's structure and accessibility, facilitating efficient research and analysis of land ownership dynamics over time.

Changes
Actionable comments posted: 0
Outside diff range, codebase verification and nitpick comments (1)
submissions/454/index.qmd (1)
Lines 105-125: Consider removing unused imports. The `tqdm` import is used, but `json` is not utilized in the provided code. Consider removing it if it's not needed.

```diff
-import shapefile, tqdm, cv2, json
+import shapefile, tqdm, cv2
```
Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Files ignored due to path filters (10)
- submissions/454/images/center_2020.svg is excluded by `!**/*.svg`
- submissions/454/images/center_berney_v2.svg is excluded by `!**/*.svg`
- submissions/454/images/center_melotte_v2.svg is excluded by `!**/*.svg`
- submissions/454/images/center_renove.svg is excluded by `!**/*.svg`
- submissions/454/images/owners_chart.svg is excluded by `!**/*.svg`
- submissions/454/images/persistence_1727_1831_light.svg is excluded by `!**/*.svg`
- submissions/454/images/persistence_1831_1888_light.svg is excluded by `!**/*.svg`
- submissions/454/images/persistence_1888_2020_light.svg is excluded by `!**/*.svg`
- submissions/454/images/prelaz_renove.svg is excluded by `!**/*.svg`
- submissions/454/images/segmentation.svg is excluded by `!**/*.svg`
Files selected for processing (4)
- submissions/454/_quarto.yml (1 hunks)
- submissions/454/index.qmd (1 hunks)
- submissions/454/references.bib (1 hunks)
- submissions/454/requirements.txt (1 hunks)
Files skipped from review due to trivial changes (3)
- submissions/454/_quarto.yml
- submissions/454/references.bib
- submissions/454/requirements.txt
Additional comments not posted (17)
submissions/454/index.qmd (17)
Lines 1-34: Metadata is well-structured and complete. The metadata section includes all necessary information such as submission ID, title, authors, keywords, and bibliography. Ensure that the ORCID IDs and email addresses are accurate.
Lines 36-100: Narrative sections are clear and well-organized. The narrative provides a comprehensive overview of the research, data collection, and analysis, and is well supported by references and figures. Ensure that all references are correctly formatted and accessible.
Lines 129-159: Function `createIndex` handles invalid geometries well. The function correctly skips null or empty geometries and attempts to fix invalid ones. It returns an updated GeoDataFrame and index. Ensure that the handling of invalid geometries aligns with the overall data processing strategy.
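For readers without the diff at hand, a minimal sketch of such a helper, assuming GeoPandas/Shapely and a `buffer(0)` repair (the name and details are illustrative, not the submission's actual code):

```python
import geopandas as gpd

def create_index_sketch(gdf: gpd.GeoDataFrame):
    """Drop null/empty geometries, repair invalid ones, and return the
    cleaned GeoDataFrame together with its spatial index."""
    gdf = gdf[gdf.geometry.notna() & ~gdf.geometry.is_empty].copy()
    # buffer(0) is a common trick to repair self-intersecting polygons
    gdf["geometry"] = gdf.geometry.apply(lambda g: g if g.is_valid else g.buffer(0))
    return gdf, gdf.sindex
```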
Lines 161-189: Function `matchParcels` efficiently identifies geometry matches. The function uses bounding box overlap to identify potential matches and calculates overlap ratios. Ensure that the overlap threshold aligns with the research objectives.
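A rough sketch of bounding-box candidate search followed by overlap-ratio filtering; the threshold, the normalisation by the smaller area, and all names are assumptions:

```python
def match_parcels_sketch(gdf_a, gdf_b, sindex_b, threshold=0.5):
    """Map each position in gdf_a to positions in gdf_b whose bounding boxes
    overlap and whose area-overlap ratio passes the threshold."""
    matches = {}
    for i, geom_a in enumerate(gdf_a.geometry):
        candidates = list(sindex_b.intersection(geom_a.bounds))  # bbox query
        kept = []
        for j in candidates:
            geom_b = gdf_b.geometry.iloc[j]
            smaller = min(geom_a.area, geom_b.area)
            if smaller == 0:
                continue
            ratio = geom_a.intersection(geom_b).area / smaller
            if ratio >= threshold:
                kept.append(j)
        if kept:
            matches[i] = kept
    return matches
```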
Lines 191-203: Function `reverseMatches` correctly reverses matching relations. The function builds a reverse mapping of matches efficiently. Ensure that this reverse mapping is used appropriately in subsequent analysis.
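The reversal itself can be as simple as the following (names are illustrative):

```python
def reverse_matches_sketch(matches):
    """Invert a {source: [targets]} mapping into {target: [sources]}."""
    reverse = {}
    for src, targets in matches.items():
        for tgt in targets:
            reverse.setdefault(tgt, []).append(src)
    return reverse
```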
Lines 205-215: Function `create_polygons` handles invalid geometries appropriately. The function attempts to fix invalid geometries and filters out small polygons. Ensure that the area threshold aligns with the research objectives.
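A hedged sketch of that pattern with pyshp/Shapely; the minimum-area value and the `buffer(0)` repair are assumptions:

```python
from shapely.geometry import shape

def create_polygons_sketch(shapes, min_area=1.0):
    """Turn shapefile shapes into Shapely polygons, repair invalid ones,
    and drop anything smaller than an assumed area threshold."""
    polygons = []
    for s in shapes:
        geom = shape(s.__geo_interface__)  # pyshp shapes expose __geo_interface__
        if not geom.is_valid:
            geom = geom.buffer(0)
        if not geom.is_empty and geom.area >= min_area:
            polygons.append(geom)
    return polygons
```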
Lines 217-223: Function `overlap_percentage` handles exceptions effectively. The function calculates overlap percentages and handles exceptions gracefully. Ensure that the exception handling aligns with the overall error management strategy.
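A minimal version of such a helper might be (the return value on error is a guess):

```python
def overlap_percentage_sketch(poly_a, poly_b):
    """Fraction of poly_a's area covered by poly_b; 0.0 on topology errors."""
    try:
        if poly_a.area == 0:
            return 0.0
        return poly_a.intersection(poly_b).area / poly_a.area
    except Exception:
        # invalid geometries can raise GEOS topology exceptions
        return 0.0
```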
Lines 225-243: Function `calculate_angles` accurately computes turning functions. The function calculates angles and normalizes the turning function. Ensure that the calculations align with the intended geometric analysis.
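One common way to build a turning function, sketched with NumPy; this illustrates the general technique, not the submission's exact implementation:

```python
import numpy as np

def turning_function_sketch(coords):
    """Cumulative turning angle along a closed ring, parameterised by
    arc length normalised to [0, 1]."""
    pts = np.asarray(coords, dtype=float)
    if np.allclose(pts[0], pts[-1]):
        pts = pts[:-1]                       # drop the duplicated closing vertex
    edges = np.roll(pts, -1, axis=0) - pts   # edge vectors around the ring
    lengths = np.hypot(edges[:, 0], edges[:, 1])
    headings = np.arctan2(edges[:, 1], edges[:, 0])
    turns = np.diff(headings, append=headings[:1])
    turns = (turns + np.pi) % (2 * np.pi) - np.pi    # wrap turns into (-pi, pi]
    arc = np.cumsum(lengths) / lengths.sum()         # normalised arc length
    return arc, np.cumsum(turns)
```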
Lines 246-278: Function `compare_polygons` effectively compares polygon shapes. The function computes the minimum area between turning functions for comparison. Ensure that the comparison method aligns with the research objectives.
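The "minimum area between turning functions" can be approximated by resampling both functions onto a shared grid and minimising the mean absolute difference over cyclic shifts; a brute-force sketch (the grid size and the omission of a rotation offset are simplifications):

```python
import numpy as np

def compare_turning_sketch(arc_a, theta_a, arc_b, theta_b, n=256):
    """Distance between two turning functions; smaller means more similar shapes."""
    grid = np.linspace(0.0, 1.0, n, endpoint=False)
    f_a = np.interp(grid, arc_a, theta_a)
    f_b = np.interp(grid, arc_b, theta_b)
    best = np.inf
    for shift in range(n):                       # try every starting-point shift
        diff = np.abs(f_a - np.roll(f_b, shift))
        best = min(best, float(diff.mean()))     # mean |difference| ~ area between curves
    return best
```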
Lines 280-314: Function `create_image` generates images from polygon data. The function processes pixels and assigns intensity based on turning scores. Ensure that the image generation aligns with the visualization needs of the project.
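Assuming the polygons are simple and already in pixel coordinates, and that the turning scores lie in [0, 1], a rasterisation sketch with OpenCV could look like this (purely illustrative):

```python
import numpy as np
import cv2

def create_image_sketch(polygons, scores, shape=(1000, 1000)):
    """Shade each polygon in a greyscale image by its turning score."""
    img = np.zeros(shape, dtype=np.uint8)
    for poly, score in zip(polygons, scores):
        pts = np.asarray(poly.exterior.coords, dtype=np.int32).reshape(-1, 1, 2)
        cv2.fillPoly(img, [pts], int(255 * score))
    return img
```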
Lines 316-353: Function `computeDynamics` effectively computes fusion dynamics. The function processes one-to-one and many-to-one matches to compute dynamics. Ensure that the methodology aligns with the research objectives.
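A simplified classification of match cardinalities; the labels are illustrative and the submission's own terminology may differ:

```python
def compute_dynamics_sketch(matches, reverse_matches):
    """Label each source parcel by how it maps onto the later layer:
    1-to-1 = persistence, many-to-1 = fusion, 1-to-many = division."""
    dynamics = {}
    for src, targets in matches.items():
        if len(targets) > 1:
            dynamics[src] = "division"
        else:
            sources = reverse_matches.get(targets[0], [])
            dynamics[src] = "persistence" if len(sources) == 1 else "fusion"
    return dynamics
```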
Lines 355-375: Function `convert_multipolygon_to_polygon` handles conversion appropriately. The function fills gaps and shrinks polygons back after conversion. Ensure that the conversion process aligns with the data processing requirements.
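"Fill gaps and shrink back" is typically a dilate-then-erode buffer; a sketch in which the gap width and the largest-part fallback are assumptions:

```python
from shapely.geometry import MultiPolygon, Polygon

def multipolygon_to_polygon_sketch(geom, gap=1.0):
    """Close small gaps between parts of a MultiPolygon by buffering out
    and back, ideally leaving a single Polygon."""
    if isinstance(geom, Polygon):
        return geom
    merged = geom.buffer(gap).buffer(-gap)   # fill gaps, then shrink back
    if isinstance(merged, MultiPolygon):
        # if gaps were too wide to close, keep the largest part
        merged = max(merged.geoms, key=lambda p: p.area)
    return merged
```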
Lines 377-407: Function `compute_legend` generates a color palette and legend. The function calculates colors based on weights and generates a legend image. Ensure that the color scheme aligns with the visualization goals.
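A hedged sketch of weight-to-colour mapping plus a gradient legend strip, assuming Matplotlib colormaps are available; the colormap name and strip dimensions are placeholders:

```python
import numpy as np
import matplotlib.pyplot as plt

def compute_legend_sketch(weights, cmap_name="viridis", height=20):
    """Map weights to RGB colours and build a horizontal legend gradient."""
    w = np.asarray(weights, dtype=float)
    norm = (w - w.min()) / (w.max() - w.min() + 1e-9)
    cmap = plt.get_cmap(cmap_name)
    colors = (cmap(norm)[:, :3] * 255).astype(np.uint8)              # one RGB per weight
    ramp = cmap(np.linspace(0.0, 1.0, 256))[:, :3]
    legend = (np.tile(ramp, (height, 1, 1)) * 255).astype(np.uint8)  # height x 256 x 3
    return colors, legend
```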
Lines 411-418: Data loading and filtering are correctly implemented. The code loads shapefiles and filters data based on specified classes. Ensure that the file paths and filtering criteria align with the data processing needs.
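An equivalent loading-and-filtering step could be written with GeoPandas as follows; the path, variable name, and class values are placeholders (the submission itself appears to read the data with pyshp):

```python
import geopandas as gpd

# placeholder path and attribute values, not the submission's actual ones
parcels_1888 = gpd.read_file("data/cadastre_1888.shp")
parcels_1888 = parcels_1888[parcels_1888["class"].isin(["parcel", "building"])]
```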
Lines 422-430: Geometry matching is efficiently implemented. The code creates indices and matches geometries between datasets. Ensure that the matching logic aligns with the research objectives.
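Chaining the sketches from the earlier comments gives an idea of this step; `parcels_1888` and `parcels_2020` are hypothetical GeoDataFrames and the threshold is an assumption:

```python
gdf_old, sindex_old = create_index_sketch(parcels_1888)
gdf_new, sindex_new = create_index_sketch(parcels_2020)

# match the older parcels against the newer layer's spatial index
matches = match_parcels_sketch(gdf_old, gdf_new, sindex_new, threshold=0.5)
reverse = reverse_matches_sketch(matches)
```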
Lines 434-443: Persistence detection and data preparation are accurate. The code computes dynamics and prepares data for visualization. Ensure that the calculations align with the research objectives.
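Continuing the same illustrative chain, the dynamics step might then be summarised as:

```python
from collections import Counter

dynamics = compute_dynamics_sketch(matches, reverse)
# e.g. how many parcels persisted, fused, or were divided between the two layers
print(Counter(dynamics.values()))
```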
Lines 447-469: Persistence plot generation is correctly implemented. The code generates persistence plots and saves images in different modes. Ensure that the visualization aligns with the project's goals.
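The export step might resemble the following; the light/inverted "modes" are a guess based only on the `_light` suffix in the committed SVG file names, the scores are placeholders, and simple polygons in pixel coordinates are assumed:

```python
import cv2

scores = [0.8] * len(gdf_old)     # placeholder turning scores in [0, 1]
img = create_image_sketch(list(gdf_old.geometry), scores, shape=(2000, 2000))
cv2.imwrite("persistence_light.png", img)        # light variant
cv2.imwrite("persistence_dark.png", 255 - img)   # inverted variant
```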
@mtwente Thanks for adding submission 454. I am not sure whether the inline code is the best solution for our audience and for the later PDF compilation. We should discuss this tomorrow.
@RPetitpierre @lucasrappo Thank you very much for your submission. We decided to comment out your code section for the moment. Please provide us with the missing shape files in the
We really appreciate that you provided code, and we would love to render it on our platform as a superb example of portable code. Best
Dear Moritz @maehr, Thank you for your message and sorry for the late reply. I was on holiday with no access to my email until Monday. I also found a minor but nasty bug in the code, which took me a bit of time to fix. Dynamic figure computation is not well suited in this case, since the code takes a bit over an hour to run. Instead, I suggest using a smaller data sample covering just one neighborhood, which already helps greatly in understanding the methodology and provides an exemplary result. I updated the code according to your suggestions. I also updated the other files (requirements, data, images) to match the generated sample result. Don't hesitate to write to me if you need anything else from my side, and thank you for all the work on organizing this conference. I am looking forward to it! Best, Remi
Thank you very much. It doesn't matter if it takes long to run; we could try the whole data set. Quarto allows for freezing to save compute. @mtwente Can you please take a look at the update?
Pull request
Proposed changes
Co-authored-by: Rémi Petitpierre [email protected]
Co-authored-by: Isabella di Lenardo [email protected]
Co-authored-by: Lucas Rappo [email protected]
Types of changes
Checklist
Summary by CodeRabbit

New Features

Chores
- Added a `requirements.txt` file listing essential libraries for project dependencies, improving setup and environment management.