EA Elicitation Service #17
This is one of the ideas I'm most excited about (starting with a 2-tiered system: <3 vs. thumbs-up). While you're at it, I'd propose eliciting short explanations from people about the reasoning behind their answers, then writing up a summary of the patterns.
Yep, I think this would be preferable, if it's easy/possible to get people to do. Otherwise, we could have a system that polls people, and then we reach out to some of them directly.
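To make the idea concrete, here's a minimal sketch of what the two-tiered reaction plus short-explanation poll could look like as a data model. All names here are hypothetical, not part of any existing system:

```python
from dataclasses import dataclass, field
from enum import Enum
from collections import Counter


class Reaction(Enum):
    HEART = "<3"        # stronger endorsement
    THUMBS_UP = "+1"    # weaker endorsement


@dataclass
class Response:
    respondent: str
    reaction: Reaction
    explanation: str = ""   # optional short reasoning behind the answer


@dataclass
class Poll:
    question: str
    responses: list[Response] = field(default_factory=list)

    def add(self, respondent: str, reaction: Reaction, explanation: str = "") -> None:
        self.responses.append(Response(respondent, reaction, explanation))

    def tally(self) -> Counter:
        """Counts per reaction tier."""
        return Counter(r.reaction for r in self.responses)

    def explanations(self) -> list[str]:
        """Raw material for a write-up summarizing patterns in people's reasoning."""
        return [r.explanation for r in self.responses if r.explanation]


# Hypothetical usage
poll = Poll("Should we run an EA elicitation service?")
poll.add("alice", Reaction.HEART, "Better consensus-building seems high value.")
poll.add("bob", Reaction.THUMBS_UP)
print(poll.tally())          # counts per tier
print(poll.explanations())   # short explanations to summarize
```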
The "survey experts" step kind of bugs me. To point as to why, it seems to me that not that many people really dig in fairly deep into a topic. I'd expect experts' shallow patterns to be better than the counterfactual, but I'd expect it to be more valuable for someone to dig in deep, and I'd prefer the second as a way of building consensus. |
I'd note that "survey experts" could come with augmentations. For example, we could hire a research assistant or two to spend more time doing deeper research. The "survey" could also assume a lot of time spent. Like, the experts are expected to spend 2 hours per question, if we want. |
I think we could get pretty far by:
If any of these parameters seem particularly exciting/promising, we could then turn them into forecasting questions.
For example, I had a list of some "Very different stances on AGI", some of which could be turned into cruxes and surveyed.
https://forum.effectivealtruism.org/posts/SZFDtA4pjZzepdacv/13-very-different-stances-on-agi