Will an agentized LLM cause some chaos?
51 · Ṁ3866 · Apr 11 · 32% chance

People have started agentizing LLMs to do various things, including projects like ChaosGPT, whose goal is to destroy humanity and cause chaos.

This is based on agents using models available at the time of market creation (so up to GPT-4).

Resolves YES if an agentized LLM:

  • Causes death or injury to a human

  • Destroys a computer or erases valuable data on a machine not running the agent

  • Successfully writes a fake news story that causes people to act in dramatic ways (e.g., crashing a stock)

  • Otherwise does something that causes chaos which, if done by a human, would result in some kind of criminal charge or fine (e.g., fake bomb threats)

Resolves YES regardless of the intent behind creating the agent, if it does something like the above.

Things that don't count:

  • Someone commits suicide after talking to a standard LLM chatbot

  • Someone freaks out about AI and does something chaotic

  • Someone screws up their own computer/data trying to make an agent.

Feel free to ask about specific scenarios in the comments.

Resolves 2 years from market creation.

sold Ṁ94 YES

"Will an agentized LLM cause some chaos?" --> "Will an LLM available at market creation be agentized and cause some chaos?"

If this is truly conditional on models up to GPT-4 as stated (those released on or before Apr 11, 2023), I don't see this happening.


predicted YES

The outcome of this could hinge on the LLM-version qualification if a newer model becomes ubiquitous before the market closes. Conditional markets lower the probability.
