
release: v0.7.0 #81

Merged 17 commits into main from release/v0-7-0 on Dec 18, 2024
Conversation

@Autoparallel (Contributor) commented on Dec 6, 2024

Here is a layout:
[layout diagram]


TODOs:

  • update README
  • fix all warnings

* wip: better HTTP

- Trying to reduce file size and constraints substantially
- Also need to make this so it doesn't matter if there is padding around headers / etc.

* WIP: improving HTTP digesting

* WIP: http rewrite

* WIP: almost working no-header test

* WIP: working start/body

* working tests!

* cleanup

* Update masker.circom

* Update CHANGELOG.md
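The HTTP rewrite above aims to make extraction independent of padding around headers. As a toy sketch of that goal outside the circuit, assuming 0x00 padding bytes as is common for fixed-size circuit inputs (the function name and approach here are hypothetical, not from the repo):

```typescript
// Toy sketch: look up a header in a zero-padded plaintext buffer, so the
// amount or position of the padding does not affect extraction.
// (Assumption: padding bytes are 0x00; a real circuit would enforce this
// with constraints rather than by filtering.)
function findHeader(padded: Uint8Array, name: string): string | null {
  // Strip padding bytes, then scan CRLF-delimited lines for the header name.
  const text = new TextDecoder().decode(padded.filter((b) => b !== 0));
  for (const line of text.split("\r\n")) {
    const idx = line.indexOf(":");
    if (idx > 0 && line.slice(0, idx).toLowerCase() === name.toLowerCase()) {
      return line.slice(idx + 1).trim();
    }
  }
  return null;
}
```

The point is only that lookup is keyed on the header name, never on byte offsets, so shifting the message inside the padded buffer changes nothing.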
@Autoparallel (Contributor, Author) commented:
The tests will still pass, but the NIVC chain is not functional with the last commit from #82.

I will outline the changes needed to get a consistent NIVC path after I also get the new JSON circuit in. I do not want to repair the old JSON circuits, as they will soon be technical debt.

@Autoparallel Autoparallel added the performance ⚡️, feature ✨, priority high 🔥, do not merge 🚫, and release 🚀 labels on Dec 7, 2024
* feat: hash based JSON verification

* WIP: save

* resetting for clearer approach

* good save state

* feat: working hash version

Though this will be too expensive, the idea works!

* WIP: need to clear after comma

* WIP: good progress

* WIP: getting keys also now

* feat: (mostly?) working tree hasher

* seems to be correct for spotify

* perf: first optimization

* wip: brain hurty

left a note to myself

* fix: tree hasher seems correct now

* TODO: note to self

* cleanup from rebase

* cleanup

* WIP: seems to monomial correctly

* rename

* add in value to eval at

* WIP: start looking for matches

* made some fixes

* it may be working!

* now i can write tests!

* more tests

* more JSON hasher tests

* cleanup
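The commits above describe hash-based JSON verification built on a tree hasher. As a rough illustration of the tree-hashing idea only — this is not the circuit: a SNARK circuit would use a field-friendly hash such as Poseidon, and sha256 plus key sorting here are illustrative assumptions:

```typescript
import { createHash } from "crypto";

// Toy stand-in for the in-circuit hash.
function h(parts: string[]): string {
  return createHash("sha256").update(parts.join("|")).digest("hex");
}

// Hash a JSON value as a tree: leaves hash their primitive value, objects
// hash their (key, childHash) pairs (sorted for order independence), and
// arrays hash (index, childHash) pairs.
function treeHash(value: unknown): string {
  if (value !== null && typeof value === "object") {
    if (Array.isArray(value)) {
      return h(["arr", ...value.map((v, i) => h([String(i), treeHash(v)]))]);
    }
    const entries = Object.entries(value as Record<string, unknown>)
      .sort(([a], [b]) => a.localeCompare(b));
    return h(["obj", ...entries.map(([k, v]) => h([k, treeHash(v)]))]);
  }
  return h(["leaf", JSON.stringify(value)]);
}

// Two documents with the same structure and values hash identically,
// regardless of key order or whitespace in the original bytes.
const a = treeHash(JSON.parse('{"artist":"x","plays":3}'));
const b = treeHash(JSON.parse('{ "plays": 3, "artist": "x" }'));
console.log(a === b); // true
```

The digest depends only on the parsed tree, which is what lets verification ignore formatting differences in the raw response bytes.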
@Autoparallel (Contributor, Author) commented:
For the JSON circuit we have some warnings from build-circuit like:

main header: JSONExtraction_20
[warning] signal #1069 is not set
[warning] signal #1070 is not set
[warning] signal #1071 is not set
[warning] signal #1072 is not set
...

* feat: `PolynomialDigest`

* WIP: working to get through NIVC

* feat: HTTP circuit digesting

* feat: ChaCha circuit digesting

* feat: JSON circuit digesting

* fix: `JSONExtraction`

* IT WORKS

* feat: TS init digest

* feat: separate sequence/value
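A `PolynomialDigest` in this style commits to a byte string by evaluating it as a polynomial at a challenge point. A minimal sketch of that evaluation, assuming the BN254 scalar field — the modulus and the exact byte encoding are assumptions, not read from the circuit:

```typescript
// BN254 scalar field modulus (assumption: the circuits operate over this field).
const P = 21888242871839275222246405745257275088548364400416034343698204186575808495617n;

// Horner-style evaluation of data as a polynomial at point x:
// digest = data[0] + data[1]*x + data[2]*x^2 + ... (mod P)
function polynomialDigest(data: Uint8Array, x: bigint): bigint {
  let acc = 0n;
  for (let i = data.length - 1; i >= 0; i--) {
    acc = (acc * x + BigInt(data[i])) % P;
  }
  return acc;
}

const bytes = new TextEncoder().encode("HTTP/1.1 200 OK");
console.log(polynomialDigest(bytes, 7n));
```

Evaluating at a verifier-chosen point makes two different byte strings collide only with negligible probability, which is what lets the ChaCha, HTTP, and JSON circuits agree on the same data by comparing field elements.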
@Autoparallel Autoparallel marked this pull request as ready for review December 13, 2024 13:08
@Autoparallel (Contributor, Author) commented:
Oops, I didn't rebase 64d102b, but it was just a one-line change in package.json.

@Autoparallel (Contributor, Author) commented:
> For the JSON circuit we have some warnings from build-circuit like:
>
> main header: JSONExtraction_20
> [warning] signal #1069 is not set
> [warning] signal #1070 is not set
> [warning] signal #1071 is not set
> [warning] signal #1072 is not set
> ...

This is done.

@Autoparallel (Contributor, Author) commented:
This is good to go now, in my opinion!

@lonerapier (Collaborator) commented:
The JSON circuit is currently ~500K constraints for 1024 B of data. How are we going to fold this for sizes > 1 KB? I think the constraints will increase exponentially for larger sizes.

@Autoparallel (Contributor, Author) commented:
> The JSON circuit is currently ~500K constraints for 1024 B of data. How are we going to fold this for sizes > 1 KB? I think the constraints will increase exponentially for larger sizes.

We can fold 1 KB at a time 😄

I have a plan for it.
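Folding 1 KB at a time works if the digest composes across chunks: with a polynomial digest, chunk i's partial digest can be shifted by x^(1024·i) and added into a running accumulator, so per-step work stays constant and the total grows linearly, not exponentially, with data size. A hedged sketch of that composition (the field modulus and chunking scheme are assumptions, not taken from the circuits):

```typescript
// BN254 scalar field modulus (assumption).
const P = 21888242871839275222246405745257275088548364400416034343698204186575808495617n;

// Modular exponentiation by squaring.
function powMod(base: bigint, exp: bigint, m: bigint): bigint {
  let r = 1n, b = base % m, e = exp;
  while (e > 0n) {
    if (e & 1n) r = (r * b) % m;
    b = (b * b) % m;
    e >>= 1n;
  }
  return r;
}

// Polynomial digest of one chunk: sum of chunk[j] * x^j (mod P), via Horner.
function chunkDigest(chunk: Uint8Array, x: bigint): bigint {
  let acc = 0n;
  for (let i = chunk.length - 1; i >= 0; i--) acc = (acc * x + BigInt(chunk[i])) % P;
  return acc;
}

// Fold fixed-size chunks: each step adds its digest shifted by x^(chunkSize * i),
// so the accumulator after the last step equals the digest of the whole input.
function foldedDigest(data: Uint8Array, x: bigint, chunkSize = 1024): bigint {
  let acc = 0n;
  for (let i = 0; i * chunkSize < data.length; i++) {
    const chunk = data.slice(i * chunkSize, (i + 1) * chunkSize);
    const shift = powMod(x, BigInt(i * chunkSize), P);
    acc = (acc + chunkDigest(chunk, x) * shift) % P;
  }
  return acc;
}

const demo = new TextEncoder().encode('{"status":"ok"}'.repeat(300));
console.log(foldedDigest(demo, 7n) === chunkDigest(demo, 7n)); // true
```

Because each fold step only handles one fixed-size chunk, the per-step constraint count is the fixed ~1 KB cost, regardless of how many chunks the full payload has.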

Autoparallel and others added 2 commits December 17, 2024 06:00
* fix: `zeroed_data` for `data_digest` in `http_verification`

* add test for 1024
@0xJepsen (Contributor) left a comment:

I am excited for these three fold circuits! Your diagram is really good too, thanks for adding that. Maybe it could go in the README somewhere?

@Autoparallel (Contributor, Author) commented:
Yeah we should update the README anyway. Let me make that a task.

@Autoparallel Autoparallel merged commit 7457da0 into main Dec 18, 2024
3 checks passed
@Autoparallel Autoparallel deleted the release/v0-7-0 branch December 18, 2024 21:29