I'm reading about the history of risk (not the game)

A hypothetical, apropos of a conversation I had this weekend about why backing up your pass in the game of craps is such a good bet.

Let's say you are offered a bet that is (mechanically) precisely the same as a coin flip, except that one side only has a 1 in 1000 chance of coming up. You're offered a chance to bet one dollar on that side, with a $X payoff if you win. You are also guaranteed that the other person will offer you the bet as many times as you have a dollar to bet. As it happens, you currently have $1,000. If you are risk-neutral and don't value the time spent playing this game at all (your opportunity cost is nil), and your only goal is to walk away from the transaction with more money than you started with, what is the lowest value of $X (in whole dollars) for which you should accept this bet? Or is there no value for which you accept? This is not a "trick question."
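If you want to play with the numbers rather than answer from first principles, here's a quick Monte Carlo sketch. It makes two assumptions the setup doesn't spell out: a win returns your $1 stake plus the $X payoff, and your strategy is to keep betting until you're either ahead of your starting $1,000 or broke. It doesn't assert the answer; it just estimates how often that strategy walks away ahead for a given X.

```python
import random

def play_once(x, bankroll=1000, p=1 / 1000, max_rounds=1_000_000):
    """Bet $1 repeatedly; stop once ahead of the starting stake or broke.

    Assumes a win pays back the $1 stake plus an $x payoff -- that
    payoff convention is an assumption, not stated in the puzzle.
    """
    start = bankroll
    rounds = 0
    while 0 < bankroll <= start and rounds < max_rounds:
        bankroll -= 1                 # stake one dollar
        if random.random() < p:       # the 1-in-1000 side comes up
            bankroll += 1 + x         # stake returned plus payoff
        rounds += 1
    return bankroll

def prob_walk_away_ahead(x, trials=5000):
    """Estimate the chance of ending with more money than you started."""
    wins = sum(play_once(x) > 1000 for _ in range(trials))
    return wins / trials
```

For example, `prob_walk_away_ahead(1000)` should land somewhere around 1 - (999/1000)^1000, roughly 0.63, since with X = 1000 a single win always puts you ahead of where you started, and the only way to lose is to go broke before the rare side ever comes up.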

If you start thinking about this and think it's really simple, you're right. But the person who brought it up this weekend brought it up to prove the opposite of the point it actually shows. Also, it's possible my reasoning on this is wrong; it's happened before.