Expose fetch_and_sort_papers() parameters in literature_review() #15
Feel free to assign to me @eimenhmdt, though I'm sure you can handle that... :P
Thanks for the suggestions! What do you mean by keyword combinations? Regarding the top_n papers, we need to be careful not to create a context that's too large, given the token limitations of GPT-3.5. What are your thoughts on this?
I was thinking that if users wanted to, they could pass their own keyword combinations. That's true! Have you any idea what that context looks like at the moment in terms of number of papers? If I get time today I'll try running the algorithm with quite a few and see. We would probably want a limit anyway, so a user doesn't accidentally choose 1000 top papers or something and burn through their tokens.
Re keywords: I think that's a good idea. Many researchers probably already know very well which keyword combinations they want to use. If you want, it would be really cool if you could build this. :) Re context: it depends on the model you use. Currently, AutoResearcher uses GPT-3.5 Turbo by default, but it's also possible to use GPT-4. If we want to extract useful and sufficient information from each paper, I think 20–25 papers (with GPT-4, 2x this number) should be the maximum in the context. But I could also imagine a refined algorithm that circumvents these limitations, i.e. adding additional steps, using a vector store, etc.
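The limit discussed above could be enforced with a per-model cap. This is only an illustrative sketch: the function name `clamp_top_n`, the dictionary, and the exact cap values (25 for GPT-3.5 Turbo, roughly double for GPT-4, as suggested in the comment) are assumptions, not part of AutoResearcher's actual code.

```python
# Hypothetical guard against a user requesting, say, 1000 top papers.
# The cap values follow the rough numbers from the discussion above
# and are assumptions, not measured limits.
MAX_TOP_N = {
    "gpt-3.5-turbo": 25,  # ~20-25 papers fit comfortably in context
    "gpt-4": 50,          # roughly 2x the GPT-3.5 budget
}

def clamp_top_n(requested: int, model: str = "gpt-3.5-turbo") -> int:
    """Clamp the user-supplied top_n to the model's context budget."""
    cap = MAX_TOP_N.get(model, 25)  # default to the conservative cap
    return min(requested, cap)

print(clamp_top_n(1000))           # -> 25
print(clamp_top_n(30, "gpt-4"))    # -> 30
```

A refinement like the one mentioned (extra summarization steps or a vector store) would relax this cap, since the full text of every paper would no longer need to fit in a single prompt.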
Interesting! Sure, I've got my PhD viva in a few days, but I'll tackle it after that.
Oh, cool. Best of luck for your viva!!
I think it would be great to be able to configure the following in the literature_review() function:
- year_range
- top_n papers
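The proposal above amounts to passing the fetch parameters through instead of hard-coding them. The sketch below is hypothetical: the signatures, defaults, and the in-memory paper list are illustrative stand-ins, not AutoResearcher's real `fetch_and_sort_papers()` / `literature_review()` implementations.

```python
# Illustrative pass-through of year_range and top_n; all names and data
# here are assumptions, not AutoResearcher's actual API.
def fetch_and_sort_papers(query, year_range=None, top_n=20):
    # Dummy corpus standing in for a real paper-search result.
    papers = [
        {"title": "A", "year": 2019, "citations": 120},
        {"title": "B", "year": 2022, "citations": 45},
        {"title": "C", "year": 2023, "citations": 300},
    ]
    if year_range is not None:
        lo, hi = year_range
        papers = [p for p in papers if lo <= p["year"] <= hi]
    # Sort by citation count, most-cited first, then truncate.
    papers.sort(key=lambda p: p["citations"], reverse=True)
    return papers[:top_n]

def literature_review(research_question, year_range=None, top_n=20):
    # Expose the underlying fetch parameters to the caller
    # instead of fixing them inside the function.
    return fetch_and_sort_papers(
        research_question, year_range=year_range, top_n=top_n
    )

print(literature_review("q", year_range=(2022, 2023), top_n=1))
# -> [{'title': 'C', 'year': 2023, 'citations': 300}]
```

Keeping the defaults on both functions means existing callers of literature_review() are unaffected, while users who know their field can narrow the year range or adjust the paper count.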