
[PROD-32252] Get hey Working with CSV #1

Open · wants to merge 2 commits into master
Conversation

nsohn (Collaborator)

@nsohn nsohn commented May 17, 2021

Forked the hey repo and am trying to get it working with a CSV, so that we can pass in a CSV of URLs to hit.

This is my first time working in Go, so there's definitely room for improvement. As of now, I think this will only work with CSVs, which isn't great. But before fixing that, I wanted to get some feedback on whether this is what we wanted.

One thing I haven't been able to make work is getting the results into a CSV. Once I get that going, we'll need to change the way this works to produce one CSV with all of the results rather than a CSV per URL.

To test this I have been running go run hey.go -file_name /Users/noamsohn/Desktop/shorten.csv

@nsohn nsohn changed the title Update hey.go [PROD-32252] Get hey Working with CSV May 17, 2021
@nsohn nsohn requested review from stegegn and marcel808 May 17, 2021 22:03

@ravitch ravitch left a comment


So I have to ask - this could be accomplished with a shell script calling hey for each URL, right? Without maintaining a fork?

record, err := r.Read()
nurl := record[0]
// ignore any strings that aren't formatted correctly
if strings.Contains(nurl, ".com") == false {

If you're trying to check that it's a valid URL, use url.Parse. But you could always just let it fail when it tries to connect.

}
r := csv.NewReader(csvfile)
// TODO remove this after i'm done testing
break_early := true
Done testing?

// TODO remove this after i'm done testing
break_early := true
for {
record, err := r.Read()

Need to check err before accessing record, which could be empty.
And make sure you handle EOF so you exit gracefully.

log.Fatal(err)
}
generateRequestAndGetResponse(nurl, method, username, password, *hostHeader, header, dur, bodyAll, num, conc, rateLimit, proxyURL)
if break_early {

Wouldn't bother passing in hostHeader (or any other globally declared flags)

break_early := true
for {
record, err := r.Read()
nurl := record[0]

nurl? Why not just url?


func generateRequestAndGetResponse(url string, method string, username string, password string, hostHeader string, header http.Header, dur time.Duration, bodyAll []byte, num int, conc int, rateLimit float64, proxyURL *gourl.URL) {


This seems like it would repeat w.Init/w.Run for each CSV line: the approach should probably keep the initialization and start of workers as the main single execution (e.g. like worker.py in GKEmail).
I've put together an alternative hack in rakyll#245 that spins off a reader goroutine that feeds the workers urls through a shared channel.
