
Storytelling Bounty - Ethical practices around the proceeds of AI

Organization: foresight-org

Deadline: about 2 years ago

Status: ENDED

Reward: 261.87 USD

INSTRUCTIONS

Bounty concept

In the Existential Hope podcast, we invite scientists to speak about long-termism. Each month, we release an episode in which we interview a visionary scientist about the science and technology that can accelerate humanity towards desirable outcomes. One question we always ask is for the scientist to give an example of a potential eucatastrophe. The term "eucatastrophe" was coined by J.R.R. Tolkien, who described it as "the sudden happy turn in a story which pierces you with a joy that brings tears."

In a paper published by the Future of Humanity Institute, Owen Cotton-Barratt and Toby Ord use Tolkien's term to suggest that "an existential eucatastrophe is an event which causes there to be much more expected value after the event than before", i.e. the opposite of a catastrophe.

Telling stories can help make what seems abstract become real and clear in our minds. We have therefore created a bounty based on this prompt. Use such an event as a storytelling device to paint a picture of a day in a life where it happens. How would this make us feel? How would we react? What hope can people draw from it?

An example of a eucatastrophe, by Anna Yelizarova (Future of Life Institute)

"An event where we collectively agree on more ethical practices around the proceeds of AI, so that all people can have their needs met by society." To me, a eucatastrophe is not just something very good happening out of the blue. It's like in storytelling: everything is on the brink of collapse, and things are about to end very badly for all the characters we care deeply about. Then suddenly a knight in shining armor arrives with an army and saves us. I think that is what Tolkien was referring to as a eucatastrophe.

Thinking about how this storytelling device would look in relation to AI: a lot of people are fed up with their needs not being met by society, frictions are building up, and perhaps some agreement is even breached. The eucatastrophe is then an event that addresses the needs of the people who are complaining.

The paper "The Windfall Clause" puts forward the idea of an agreement under which all companies building AGI commit in advance that, if they build an AI that accrues more than a certain percentage of global GDP, the windfall will be reallocated to a trust and spending decisions will be made communally.

The eucatastrophe, then, would be an event where we collectively agree that enough is enough; where we break out of this paradigm and become bigger people in a scenario where nobody would expect it; where we are pleasantly surprised, put a stop to the machinery, and agree on more ethical practices around the proceeds of AI.

Bounty prompt

Describe a day in the life when humanity collectively agrees on more ethical practices around the proceeds of AI, so that all people can have their needs met by society.

Submit your response below for a chance to be rewarded 0.15 ETH.