Economics Asked by Zaratruta on January 16, 2021
Let’s imagine a coin-flip game, which uses an unbiased coin.
Starting with X dollars, your total increases by 50% every time the coin lands heads, but you lose 40% of your total every time it lands tails. You can play this game N times, and on each turn you must bet your entire total from the previous turn.
Is it worth playing this game? How can we formalize the problem so as to answer it in the general case?
Is it easier to formalize this for specific values of X and N? Let us assume, for example, X = 100 dollars and N = 100 turns.
PS: This scenario appears in an article discussing some ideas of Ole Peters, a theoretical physicist who claims that everything we know about modern economics is wrong.
The game has positive expected value, so you should play (assuming constant utility of each dollar, which tends to break down somewhat for very large sums of money). If you start with $X, after one round you have a 50% chance of holding $1.5X and a 50% chance of holding $0.6X, for an expected value of $1.05X. That is, for every dollar you wager, you expect to get back $1.05 on average. The amount of money and the number of trials don't matter at all for this analysis - every round is effectively identical, so you should always bet.
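The expectation argument above can be checked numerically. This is a minimal sketch; the names `heads_mult`, `tails_mult`, and `expected_wealth` are illustrative, and only the 1.5/0.6 factors come from the question itself.

```python
# Sketch: per-round expected multiplier and exact N-round expectation
# for the coin-flip game (x1.5 on heads, x0.6 on tails, fair coin).
from math import comb, isclose

heads_mult, tails_mult = 1.5, 0.6
per_round_ev = 0.5 * heads_mult + 0.5 * tails_mult   # = 1.05

def expected_wealth(x, n):
    """Exact expected wealth after n rounds, summing over all heads counts k."""
    return x * sum(comb(n, k) * 0.5**n * heads_mult**k * tails_mult**(n - k)
                   for k in range(n + 1))

# By linearity of expectation the binomial sum collapses to x * 1.05**n:
assert isclose(expected_wealth(100, 100), 100 * per_round_ev**100, rel_tol=1e-9)
```

With X = 100 and N = 100 this gives roughly 100 × 1.05^100 ≈ $13,150 in expectation, though the mean is dominated by a small number of very lucky heads-heavy paths.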
In addition, there is absolutely no risk of "gambler's ruin" here, since you can never go bankrupt. In games where you can lose your entire wager, even a positive-expected-value game can lead to bankruptcy over time, since you'll eventually run into a losing streak that outstrips your bankroll. But here, since you will always have some amount of money on hand, you can keep playing forever no matter how bad an unlucky streak you hit.
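The no-bankruptcy point can be checked directly: the worst case after N rounds is N straight tails, which multiplies the stake by 0.6 each time and therefore never reaches zero. A small sketch (the 100-tails streak is just an illustrative worst case):

```python
# Sketch: even an unbroken run of tails leaves a strictly positive balance,
# because each loss multiplies the stake by 0.6 rather than subtracting a fixed bet.
x = 100.0
for _ in range(100):   # 100 consecutive tails, the worst possible streak
    x *= 0.6
assert x > 0           # vanishingly small, but still strictly positive
```

Contrast this with a fixed-stake game, where a long enough losing streak takes the balance to exactly zero and ends play.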
Answered by Nuclear Hoagie on January 16, 2021