I believe that if you remove the strip and foo replacements, processing of that same file will be considerably faster. The only thing that occurs to me is to enable the foo and strip handlers only with an option like -x, for extended macro replacement features?
Oh, and also: if you don't remove the @define xxx entries from the original file, the calls to regex.sub(xxx) are quite heavy. I'll come up with a few ideas later in the day on how we can speed this up, or have an option like -x for extended macro replacements (i.e. date, time) and another, such as --deldefines, for stripping the defines.
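For what it's worth, one way to cut down the per-macro regex.sub() cost would be to collapse all @define substitutions into a single compiled pattern and do one pass over the file. This is just a sketch of the idea, not jsmacro's actual code; the function name and the dict-based interface are illustrative:

```python
import re

def expand_defines(text, defines):
    """Replace all macro names in one pass.

    defines: dict mapping macro name -> replacement string.
    Illustrative only -- jsmacro's real substitution logic may differ.
    """
    if not defines:
        return text
    # One compiled alternation of all names, instead of one re.sub per macro.
    # \b word boundaries avoid rewriting substrings of longer identifiers.
    pattern = re.compile(
        r"\b(" + "|".join(re.escape(name) for name in defines) + r")\b"
    )
    return pattern.sub(lambda m: defines[m.group(1)], text)

print(expand_defines("alert(DEBUG_LEVEL);", {"DEBUG_LEVEL": "3"}))
# -> alert(3);
```

The win is that the scan over the source happens once regardless of how many defines there are, rather than once per define.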
My pull request #12 adds an exclude feature for exactly this reason. I had some large minified files, all under a js/vendor directory, that took many seconds to parse, but I didn't need them parsed at all because they contained nothing jsmacro needed to process. Sure, it's a workaround, but one that works well for me in practice. I wouldn't turn down parsing speed increases, of course!
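The exclude idea above can be sketched roughly as a glob filter applied while walking the source tree. The function name, the flag semantics, and the default pattern here are assumptions for illustration; the actual matching rules in PR #12 may differ:

```python
import fnmatch
import os

def iter_source_files(root, excludes=("*/vendor/*",)):
    """Yield .js files under root, skipping paths that match any exclude glob.

    Illustrative sketch of an exclude filter; not the PR's actual code.
    """
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            if not name.endswith(".js"):
                continue
            path = os.path.join(dirpath, name)
            # fnmatch's "*" matches across path separators, so "*/vendor/*"
            # skips anything under a vendor directory at any depth.
            if any(fnmatch.fnmatch(path, pat) for pat in excludes):
                continue
            yield path
```

With a pattern like `*/vendor/*`, the large minified vendor files are never opened, so their parse cost disappears entirely.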
This, of course, needs a test case and a definition of "slow". A quick look at crunching a yui-min.js file took:
I am surprised to see any one file taking over a second.
(For what it's worth, python2.6 is normally the fastest for me at running the test suite; and python3.2 is the slowest.)