Our 2016 end-of-year fundraiser is complete!
Thanks to the 62 donors who supported us during this fundraiser! We raised a total of
We came within about 2% of our first funding target, which means things like more and better workshops and instructor training in the coming year. We'll announce the programs as they are scheduled; meanwhile, we're focused on the ongoing instructor training that's leading up to our first-ever 6-workshop sprint, beginning Feb 8.
CFAR’s mission is to increase the odds of a long-term positive future for humanity via targeted rationality training. This mission is new, and is one of several changes we’re very excited about; we’ll be posting more about them as we go!
A short explanation of our mission:
- We care a lot about AI Safety efforts in particular, and about otherwise increasing the odds that humanity reaches the stars.
- Also, we believe such efforts are bottlenecked more by our collective epistemology than by the number of people who verbally endorse or act on "AI Safety", or any other "spreadable viewpoint" disconnected from its derivation.
- Our aim is therefore to find ways of improving both individual thinking skill and the modes of thinking and social fabric that allow people to think together, and to do this among the relatively small sets of people tackling existential risk.
$250,000 – More specialized workshops and instructor training. This will let us, for example:
- Make our instructor training free, allowing higher-quality instructor trainees (who can later go out into the world and spread rationality -- we hope);
- Run two more specialized workshops that are free to participants, have high-caliber targeting, and are actually backchained from the goal of reducing existential risk; and
- Develop higher-caliber alumni workshops with the goal of running them regularly.
(This is one example of the sort of “basket of goods” we could buy with $250k; actual goods may vary.)
$650,000 – Year-round targeted workshops; all workshops free. This will let us:
- Make all programs free.
- Have all programs backchained from a specific purpose (direct training for a narrow population; higher-variance training experiments; or building a stronger practicing rationality community).
- If we get a permanent venue, run ~30 (free!) multi-day trainings per year, large and small, both for targeted populations in the world and for alumni building a rationality practice.
$850,000 – Staff expansion and more scholarships. This will let us:
- Hire new staff for targeted outreach and community enrichment
- Fund flight scholarships for selected participants, to increase program selectivity and impact.
Related posts:
- Dec 3: CFAR’s new focus, and AI Safety
- Case studies of CFAR’s impact to date
If you have more questions, you can email them to firstname.lastname@example.org.