Judge the error rate of the different filters: we know that they are blocking a portion of the URLs that have been submitted, but is this in line with the filter settings? We could do this by taking a random sample of blocked URLs (say 100) and getting people to check whether they should have been blocked (or not) and under what conditions.
Possibly similar to the behaviour of the Herdict system - but instead of humans judging whether the site has been blocked, I think @mattroweshow was suggesting they judge whether the site should have been blocked, and if so under what category.
That's correct: I envisaged that we would manually choose a subset of the URLs that have been blocked and then see (i) whether they should have been blocked, and (ii) if so, under what category and filter setting. That way we could get some idea of how 'well' the filters are performing, and validate the statistics that we produce.
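The sampling step above could be sketched roughly as follows. This is a minimal illustration, not part of the project's codebase: the function names, the column names, and the CSV output format are all assumptions chosen for the example. It draws a reproducible random sample of blocked URLs and produces a review sheet with empty columns for reviewers to fill in by hand.

```python
import csv
import random

def make_review_sheet(blocked_urls, sample_size=100, seed=0):
    """Draw a reproducible random sample of blocked URLs and return
    rows for a manual-review spreadsheet (hypothetical helper)."""
    rng = random.Random(seed)  # fixed seed so the sample can be re-drawn
    sample = rng.sample(blocked_urls, min(sample_size, len(blocked_urls)))
    # Reviewers fill in 'should_block' and 'category' by hand.
    return [{"url": u, "should_block": "", "category": ""} for u in sample]

def write_review_sheet(rows, path):
    """Write the review rows out as a CSV for annotators."""
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(
            f, fieldnames=["url", "should_block", "category"]
        )
        writer.writeheader()
        writer.writerows(rows)
```

Once the sheet comes back, the agreement between the reviewers' `should_block` judgements and the filters' actual decisions gives the per-filter error rate that the first comment asks about.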