76 Comments

This is amazing! Very cool that it worked to fund real projects that were interesting to read about.

Hopefully the first of many rounds of impact certs.

Congrats Austin, Rachel, and Scott!


Overall, I share Scott's view that this round went fine but was not a resounding success (I might have rated it a 6/10 fwiw), and have many thoughts about how to make our future rounds better. Chief among them: open up trading to nonaccredited investors via mana/charity dollars; do more frequent retro evaluations; provide more feedback and support for grantees.

Beyond the large ACX Grants round at the end of the year, we plan on a few other experiments with certs including one sponsored by Manifold specifically for Manifold community projects, and perhaps an essay contest with ChinaTalk backed by impact certs. If you'd like to sponsor an impact cert-based contest for your own org (for-profit or charity), reach out to austin@manifund.org!

Shoutout to the other 3 judges (Nathan Young, Marcus Abramovitch, and Drethelin) for taking the time to evaluate the retro grants. My own scores and retrospective notes are here (I'll likely clean them up and publish soon): https://manifoldmarkets.notion.site/Austin-s-Judging-Scores-54ff88af184441f9be684838c56112c5?pvs=4

And of course, a huge thanks to Scott for believing in impact markets, working with us to figure out how they could be set up, promoting this effort and funding the whole thing! None of this would have happened without Scott pushing for it~


I still struggle to understand how to balance the impact / earnings point here.

It seems very unlikely that these markets will offer returns better than (or even comparable to) simple capitalistic markets if I wanted to make money.

If I wanted to donate to charity / have a direct impact, my marginal tax rate is c.50% so I can essentially double my money by donating directly to a charity (this isn't quite doubling, as the value of the services funded by my taxes is a lot higher than zero, albeit a lot lower than e.g. malaria treatment).

This is, I acknowledge, a general critique of impact investing.

This seems exclusively useful for (and maybe this is in fact the entire point, albeit I see the potential scale here as small):

1) getting something off the ground that wouldn't reasonably qualify as a charity but which you think is socially useful; or

2) harnessing investor time/expertise to assess projects, offloading that work from grantmakers.
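(On the tax point above: for concreteness, the "double my money" arithmetic is roughly the following. This is a sketch assuming an ordinary tax deduction at a 50% marginal rate; the exact mechanics vary by jurisdiction.)

```python
# Sketch of the tax-relief arithmetic, assuming a fully deductible donation
# at a 50% marginal rate (an assumption; details differ by country).
marginal_rate = 0.50
donation = 100                                  # what the charity receives
out_of_pocket = donation * (1 - marginal_rate)  # cost after the deduction
multiplier = donation / out_of_pocket           # charity value per unit of net cost
print(out_of_pocket, multiplier)                # 50.0 2.0 -- "essentially double my money"
# The commenter's caveat: the forgone tax would have funded public services
# worth more than zero, so the effective multiplier is somewhat below 2x.
```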


It seems like the requirement for the founder to set the price is a difficult one, and it led directly or indirectly to some of the failures here. Notably, Max ended up selling 100% of his project at what was (in hindsight) too low a price, and the Base Rate Times didn't raise any money because the founder overestimated what investors/funders would be willing to pay.

In practice, the way VC markets work is closer to a fixed VC/founder equity ratio, with the amount of equity sold typically close to 10% per round across a wide variety of businesses. The _price_ of that equity is set by some combination of market demand for the equity and the expected costs of the startup over the next ~2 years, until another round of fundraising is needed.

This is a result of some apparently-irrational behaviors, such as founders being reluctant to raise valuation even in the face of high investor demand, and investors being reluctant to buy "too much" equity for fear of reducing the incentives on the founder to succeed.

I wonder whether similar effects would be useful in an impact market. For example, every participant could be required to sell 75% of the equity in their project and keep 25% as "incentive", with the opening auction serving solely to set the price for that 75%, which the founder then has a binary choice to take or leave.
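As a sketch of what that could look like, here is a toy uniform-price opening auction for a fixed 75% tranche. The clearing rule, names, and numbers are all my own illustration of the commenter's proposal, not how the Manifund round actually worked:

```python
# Toy opening auction for a fixed-equity tranche (hypothetical mechanism).
# Each bidder submits the highest project valuation they'd accept plus a
# dollar budget; the auction finds the highest valuation the market bears.

EQUITY_FOR_SALE = 0.75  # founder keeps the other 25% as "incentive"

def clearing_valuation(bids):
    """Highest valuation V at which demand covers the 75% tranche.

    bids: list of (max_valuation, dollars) tuples. At valuation V the
    tranche costs 0.75 * V, and only bidders with max_valuation >= V buy.
    """
    for v in sorted({max_v for max_v, _ in bids}, reverse=True):
        demand = sum(dollars for max_v, dollars in bids if max_v >= v)
        if demand >= EQUITY_FOR_SALE * v:
            return v
    return None  # no valuation clears

bids = [(10_000, 3_000), (8_000, 2_500), (6_000, 2_000)]
v = clearing_valuation(bids)
if v is not None:
    # The founder's only remaining choice is binary: take this price or walk away.
    print(f"Clearing valuation ${v:,}; founder raises ${EQUITY_FOR_SALE * v:,.0f} for 75%")
```

(A real version would also need a pro-rata rule for allocating the tranche among the winning bidders.)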


> the fact that we overall disagreed with funding decisions is pretty damning.

I feel there is an obvious solution here: have the funders write up a short initial impression on each project. This reduces risk for investors and lets them focus on the value they're actually bringing: predicting outcomes rather than predicting what judges will think about outcomes.

Like, much waste could have been avoided if something like this had been written *beforehand*:

> Polymarket - We have trouble figuring out why this would be better than the many other real money prediction markets that happen on Polymarket all the time, and we feel like we already have good data on how real-money markets compare to play-money ones. This is something we don't really want.

Presto, $8k not wasted.

[ Edit: you might respond "but this would discourage projects where we don't see the benefits" - and the answer is that this is only true if investors think you wouldn't see the benefits after the fact - in which case you, almost definitionally, don't want it funded anyway. ]


I think the only way to make impact markets work would be to explicitly define in advance the criteria that will be used to evaluate the results (e.g. prize money for inventing X, or at minimum for being best at doing Y); otherwise you're just doing normal charitable giving with a delay. Also, I doubt a market will have significant impact (better than if you just gave the money on day 1) with less than a 5-year time horizon.


It seems undesirable for the final retroactive funding decision to have any sort of "future growth" component. The investors are already unicorn hunting, so the final decision should be based on concrete facts as much as possible. Otherwise, there'll be compounding forecasting error, plus the game becomes "predict which of these the judges will predict will grow the most", optimizing for hype over substance.


I was project creator for #15, the interpretable AI forecasting thing, AMA!


> Investors spent $33,220 to produce what we judged as $21,125 - $31,125 in charitable value, making the market inefficient. I don’t know exactly how to think about this: I wouldn’t want us to pay the investors much more than they invested, or I’d feel like we were being ripped off.

This is the mistake that leads people to support communism.

Your point of view is wrong and harmful. What you _should_ want is to end up paying the investors much, much more than they invested. You already decided what the project output is worth to you; you should see that it is a better world if $21,000 of charitable value costs $1,000 of monetary value to produce than if it costs $20,500. But your stated preference is for the second of those worlds.

(In slightly more detail, your position almost makes sense in light of your stated commitment to spend no less than $20,000. In that scenario, an outcome you might want to ward off is that someone signs up, invests no money (good!), produces no results (bad!), and then you pay him $20,000, which is a lot more than he invested and is also a waste.

Except that you don't actually avoid that outcome with your rule. Where one person can sign up for free money for no effort, so can tens of thousands of others. When 40,000 people sign up for this deal, you end up paying each of them at most 50 cents, which is not much more than they invested and is therefore OK with you. The waste is still there, it's just as bad as ever, but your rule didn't see it.)


I applaud the experiment.

Next time around I'd like to see a less self-licking selection of projects, though. While I can understand the logic behind the idea that the most effectively altruistic thing you can do with your money is to raise awareness of effective altruism so that more people donate more money which we can use to further raise awareness of effective altruism... it still makes me uneasy.

It would be nice to see another experiment done with a more limited scope to exclude these sorts of self-licking projects.


I remember that when I was reading the descriptions of the projects, a lot of them seemed to (implicitly) go along the lines of "I was going to do this good thing anyway, but now that these markets are open, I might as well get paid by telling people it requires $5000."


Ah, I'm flattered by the shout-outs! In case you weren't aware, here's my impact markets villain origin story: https://www.astralcodexten.com/p/impact-markets-the-annoying-details/comment/7755694

Overall I feel this was a win. I'm glad the organizers feel like there was room for improvement, but as an investor, I was really glad I participated. I don't give almost anything to charity normally, but the format of this as a market, where I had to exercise my personal best judgement as to returns, act as an advisor to projects when desired, and potentially even make money, helped pull me in. I was very OK with not making any money---indeed, when I saw the unadjusted table first while scrolling through this article, my thought was "Oh well! At least I have some validation that I know how to pick more-impactful projects".

OPTIC seems like a huge success to me, and it was indeed great connecting with Saul and the team. The fact that they were proactive about reaching out to their investors for advice and feedback was a really good sign; I would encourage more projects to do that. (I think if I were a real VC-style impact investor, I would myself reach out and schedule meetings with my investees. But I'm a busy guy. So big kudos to them for taking up that part of the investor-investee relationship!) As part of that relationship, I was also happy that I was able to connect them with my coworker at Google who maintains our internal prediction market; he was able to take a train up to Boston and give a talk there: https://www.youtube.com/watch?v=pYYuMLQjpWY. This kind of incentives-aligned networking with your investor seems good!

My main criteria for picking were:

* Would Scott like this? In particular, this led me to the Kelly-optimal bets tool, and the superforecaster predictions. The latter, in particular, seemed to play well with a lot of the themes we see on ACX, comparing superforecasters with prediction markets and other prediction technologies. I was also interested in https://manifund.org/projects/predicting-replication-in-academic-economics from that aspect, but I wasn't fully convinced by the answers to my queries in the comments there.

* Would this have a big impact if successful? From this perspective OPTIC and Crystal Ballin' seemed high-leverage. The risk here is whether the work actually gets done; I tried to suss this out with Crystal Ballin' but it seems to have only partially panned out. Still, I remain hopeful for them, albeit not on the original timescale.

If people are interested in what kind of projects I investigated, even if I didn't invest, check out https://manifund.org/Domenic for my comment history.

The biggest areas for improvement, I feel, are in communication, both around the whole structure and for newbie investors/investees in particular.

Because you're dealing with something novel, it helps if the rules are really clear. Who exactly are the judges? It would be good to know their backgrounds and preferences ahead of time. Are they supposed to be judging actual realized impact at the 6-month mark, or future potential impact, or...? What exactly is the deadline? What exactly is the funding pot? ("Between $20,000 and $40,000" is a huge range!)

And for newbie investees/investors, I think there are just some basic market mechanics that could stand to be better signposted. E.g., the fact that we collectively bet $45k on a $20k-$40k pot should have generated big warning signs or something. The fact that a project asking for $20k (between 50% and 100% of the total pot) was unlikely to get funded should have been clearer. (Both of these could be addressed in the UI, ideally! Otherwise, putting it in some sort of pre-reading would work.) You could also try to help investors understand better what sort of things to watch out for, e.g. impact/tractability/neglectedness (seems like this would have caught the Polymarket case), investee track record (what many of my questions were focused on), over-ambitious vague projects, etc.

Finally, I told Austin this a while ago, but... where are my actual impact CERTIFICATES!? I want something pretty to print out and brag about to my friends! I want something where selling it back to Scott feels like parting with a small piece of my charitable soul! I want something where I stare at my 66.12% share for OPTIC, maybe in pie-chart form, and think... how could I get the other 33.88%?


I know that this sort of post is less 'sexy' and seems to draw less appreciation than most others, so I just wanted to say that I think this is a fascinating and potentially very useful experiment, and I hope it keeps going and you keep posting about it here.


As one of the investors (invested $2k, now worth $1489), and someone who has raised a total of $85M from VCs in the real markets, I think you are way too pessimistic about this outcome. I feel like it was a huge success (9/10).

- you got $31k in value created for $31k without having to take any risk yourselves. Some projects did better than you expected! And you get to pay them more! Some did worse and you didn't have to pay them at all! Incredible! (Aside: this benefit to you is why it's ok for investors to make big returns; taking on risk and investing in a wide field is a highly valuable activity.)

- "we are skeptical of investors choices" is not a failure, it's a success! Think about how much investment goes into stupid startups that you are skeptical of; that's just how it goes. Sometimes though, you are wrong, and it's good that other people have different opinions, and we get Airbnb. As long as all the good stuff gets funded, it doesn't matter how much bad stuff other people waste money on. Relatedly...

- I would argue everything happened correctly with Base Rate Times -- the founder stated that they would not do the project unless they got a certain grant (valuation) that neither we nor you would have funded at that level (and which, in the end, it turned out not to be worth). The failure here was the founder's pricing of their idea, stated preferences vs revealed preferences (i.e. they did it anyway), or a misunderstanding of how to correctly keep 100% equity but still participate in the round. In a real open impact market they would have gotten $6k at the end anyways, and it's only because of the constraints of this simulated market (the requirement to raise money at all) that they didn't. Plenty of startups are bootstrapped and then exit, and this is good.

- the real VC market has the exact same dynamic (actually spookily similar) of not only a power law for startup exits, but also a power law for VC returns -- most lose money, some roughly break even, but the top ones make insane returns. Everyone knows this and wants to participate anyways because _they_ think they can outperform the crowd.

- I don't think you should worry about overall investor returns at all in fact, because a) we were considering this charity in the first place and willingly overspent given that (I aimed to invest at EV rather than aiming to actually make a return, and didn't pay attention to total amount invested by others at all -- reasonable given the bidding structure), and b) the market will adjust naturally based on payouts, especially as the big losers one round forgo the next round. This is all natural and correct. If you keep paying out at the end, investors will find a balance where they do on average make money.

- Even overpaying for securities you and Austin especially like is reasonable! This happens all the time IRL for exits; it only takes one entity thinking it's worth it to buy a company at a valuation no one else believes in. IRL the top five buyers don't pool their money and take the median valuation -- the highest valuer just buys it. You two are not being irrational to shell out more for things you think are worth it.

- You're selling yourself short on the market structure. This was actually quite investor friendly. Usually only the highest-paying investor gets to participate in a startup, they have to wait 5 years for any payout, and they have no idea if in 5 years the startup will IPO into a bear market. It is weird for the total funding not to reflect the actual value creation, but actually you did adjust for this by having a floating total in the 20k-40k range, and I think you did a good job valuing stuff. In a real market the dynamic would be similar I'd think, where charities can buy impact certificates OR make grants, and do more of one vs the other depending on how good the certificates are (thereby adjusting the total end payout). (I agree that the round fails for long term projects like the golf ball one, though; that maybe needs 5 years.)

The one and only reason this wasn't a perfect 10 imo is that BRT/AI treaty didn't get an award at the end, because they were ineligible. In the real market, bootstrapped companies are just as eligible for exits.


As one of the judges, I'll just say that it's really stressful to have to try and put financial values on all these things. I felt pretty dumb.

But somewhat gratifying that there are patterns across the funding. Phew.


The cynic in me can't help but feel conflicted about the only project that the judges valued as a standout success being "educate impressionable young people on our values". (A less charitable verb would be "indoctrinate"). Pretty much every talking point about wokeness in academia being a liberal brainwashing camp could be used here with very little modification.

Like, people already accuse EA of being a cult that spends more money on "paying middle class intellectuals to think about how to do charity" than actual charity, and having "gain new adherents to the philosophy" be your highest ROI in a fundraising contest is... not great optics? Especially when the goal is to sell the "impact" to corporations for pure-optics-reasons.

I don't even disagree with the ROI assessment ($300 is really low, and $10k isn't unreasonable), but if the goal is to sell EA benefits to non-EAs, again from a purely optics position (which matters more to non-EAs than actual outcomes), that seems to miss the mark.


> thinking in these terms makes all education/movement-building grants vastly better than all other types of grants.

I have an (experimental! warning! careful in taking this at face value!) paper exploring this, which concludes that, in some cases, you should do an "asymptotic ponzi" and invest asymptotically everything into movement building:

- Paper: https://philiptrammell.com/static/Labor__Capital__and_Patience_in_the_Optimal_Growth_of_Social_Movements.pdf Could be food for thought.

- short explanation: https://forum.effectivealtruism.org/posts/FXPaccMDPaEZNyyre/a-model-of-patient-spending-and-movement-building


I read barely half of it and it seemed pretty byzantine to me. Any barbarians at the gates yet?


Personally: I'd be itchy calling these charitable.

Would like to see some that benefit people who aren't well-off. I guess I can understand if people want to invest in gigabrained crypto applications that could replace the dollar!!! and not apps that make third-worlders' lives easier, but you have to admit we're snorting clouds over here.


"This was a hard grant to value. One way to value it is something like: suppose that he keeps doing this x 3 years, and 5% of his students become long-term committed rationalists/EAs. That’s 9 new committed rationalist/EAs. Suppose half of those would have counterfactually found our community anyway; that’s about 4 new ones. Suppose of those four, one takes the GWWC pledge to donate 10% of lifetime income, and another goes into direct work and has a good career in some useful institution. Each of those people could plausibly generate $100K in charitable value. So maybe we should value this at $100K."

This is the whole problem with the approach - you're just making up numbers, or picking them out of a hat. Not only do you not have a clue whether these numbers are right, they're not really testable either (unless you add decades-long tracking of the participants, and even then their truthfulness might be in doubt). The value might as well be $0 - in fact, this might be the likelier number. The whole evaluation is empirically empty.
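For reference, the quoted estimate has roughly this shape as a calculation. The per-year class size below is back-derived from the quoted "9 new" figure and the rest comes from the quote itself; the point being made is that the result moves linearly with every one of these unverifiable inputs:

```python
# Structure of the Fermi estimate quoted above. Every parameter is an
# assumption (the class size is inferred from the "9 new" figure); the
# final number scales linearly in each of them.
def club_value(students_per_year=60,       # inferred so that 5% over 3 years = 9
               years=3,
               conversion_rate=0.05,       # become committed rationalists/EAs
               counterfactual_share=0.5,   # would have found the community anyway
               high_impact_share=0.5,      # GWWC pledge or impactful direct work
               value_per_person=100_000):
    new_people = students_per_year * years * conversion_rate
    truly_new = new_people * (1 - counterfactual_share)
    high_impact = truly_new * high_impact_share
    return high_impact * value_per_person

print(club_value())                        # ~$225,000 with these guesses
print(club_value(conversion_rate=0.01))    # ~$45,000 if conversion is 1% instead
```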


Given population numbers and their concomitant impact on the sustainability of the earth for numberless future human beings, wouldn't it be a valid longtermist goal to bring human numbers into alignment with sustainability? They seem to be pinning their hopes on the assumption that we can just tech our way out of this, which I suppose would have huge appeal to the "tech bros".


[I went to UMD for undergrad, as did a different Matt G; I wouldn't be surprised if there are more of us floating around the rat/EA sphere.]

UMD is a party school--but it's also the flagship university for the state of Maryland, gets a lot of top students through scholarships and being well-ranked for lots of programs, has a bunch of investment in undergraduate research, and so on; I am not surprised that one could put together a solid EA group there.


I feel like the median here is potentially awkward. Roughly speaking: six months ago, a bunch of people were asking "what will this be worth in six months?" and each project sold certificates to the person who gave the highest answer. And today, five people are asking "what's this worth today?" and the investors are selling certificates at the median answer of that group. If the five people were buying individually (and could afford to buy everything they wanted), 9/18 of the projects would have made money. (Five projects had 1 of the 5 judges valuing them above total equity, three had 2/5, one had 5/5, and nine had 0/5.)

I assume "a group of people decide among themselves what to invest in using some way to aggregate their opinions" also happens a lot in the stock market and VC, so I don't think this is a bad way to do things. But it stood out to me.


Noob questions: I couldn’t find a clear explanation. 1. What does having an “impact certificate” mean? Is it about the impact of the entire project for its entire duration, potentially indefinitely, or just for a specific part of the project? 2. Can the project owner issue new shares after certain milestones are met?
