You're just stating things without making any argument. I am aware of consumer surplus, thank you very much.
You said that I value something at 60 cents and the seller values it at 40 cents, and that therefore I will buy it at 50 cents. Not only is your simple linear model absurdly reductionist, it's just ignorant. If a grocer is willing to sell at 40 cents, I would never pay a 25% markup when I wouldn't have to, your assertion that I would notwithstanding. I would just pay the 40 cents, and therefore the only objective measurement of what I valued it at would come out at 40 cents. You stating that I actually value it at 60 cents is just you trying to fit a scenario to your fictional model.
Are you dumb? I just made up the numbers. I never said anything was linear, and even though the grocer is willing to sell at 40 cents, he doesn't, because at that price he couldn't keep up with demand, so he raises the price until he can.
Maybe the real life numbers are 45, 50 & 55.
Maybe they're 34, 37 & 50.
Maybe you really fucking love apples and it's 34, 37 & 5000 in which case go nuts, buy up the whole orchard, and keep all the doctors away for all time.
Maybe those numbers aren't even close, because what does an apple even cost these days? I don't know exactly.
But the point holds. There is a price at which the transaction happens. Somewhere below that price is the producer's willingness to sell; the difference between those two numbers is the producer surplus. Somewhere above the price is the consumer's willingness to pay; the difference between that number and the price is the consumer surplus.
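If it helps to see that arithmetic spelled out, here's a minimal sketch in TypeScript using the made-up 40/50/60-cent numbers from above (the variable names are just illustrative, not any standard API):

```typescript
// Toy surplus arithmetic for a single apple sale, in cents.
// These numbers are the made-up ones from the thread, not real apple prices.
const willingnessToSell = 40; // lowest price the grocer would accept
const willingnessToPay = 60;  // highest price the buyer would go to
const price = 50;             // price the transaction actually happens at

const producerSurplus = price - willingnessToSell; // 10 cents to the grocer
const consumerSurplus = willingnessToPay - price;  // 10 cents to the buyer

console.log(`producer surplus: ${producerSurplus} cents`);
console.log(`consumer surplus: ${consumerSurplus} cents`);
```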
Go find any economics 101 textbook in the known world and you'll see a downward-sloping demand curve and an upward-sloping supply curve. They intersect at a market price. Then you get some shaded-in areas for the producer and consumer surplus.
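And for anyone who wants that picture as numbers rather than a graph, here's a toy sketch of the intersection, assuming straight-line curves purely to keep the arithmetic trivial (again, nobody is claiming real-world curves are linear, and the coefficients are invented):

```typescript
// Toy linear supply/demand curves, prices in cents. Coefficients are invented
// purely for illustration; real curves need not be linear.
const demand = (p: number) => 100 - p; // downward-sloping: quantity demanded falls as price rises
const supply = (p: number) => p;       // upward-sloping: quantity supplied rises with price

// Equilibrium where demand(p) = supply(p): 100 - p = p, so p* = 50 cents.
const marketPrice = 50;
const marketQuantity = demand(marketPrice); // 50, and supply(50) agrees

// The shaded-in areas from the Econ 101 graph, computed as triangle areas:
const chokePrice = 100;      // price where quantity demanded hits zero
const reservationPrice = 0;  // price where quantity supplied hits zero
const consumerSurplus = 0.5 * (chokePrice - marketPrice) * marketQuantity;       // 1250
const producerSurplus = 0.5 * (marketPrice - reservationPrice) * marketQuantity; // 1250

console.log({ marketPrice, marketQuantity, consumerSurplus, producerSurplus });
```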
If you don't know what one of these graphs looks like then you aren't qualified to participate in any discussion ever about what things should cost. Just shut up and go back to writing shitty software using the worst programming language in general use today.
Supply/demand curves are literally 4th-grade curriculum where I come from. So yeah, I mastered that shit 20+ years ago. Thanks for assuming, though.
You've gone off the rails and are now confusing 'cost' with 'worth' or 'value'. We can only determine a 'value' or 'worth' when an actual transaction occurs. You can't just extrapolate out on a model curve and expect it to hold in the real world, no matter what your theory tells you.
Bottom line is, if a grocer is currently asking 40 cents for an apple, I will transact at that price no matter how much higher I 'value' that apple. I wouldn't pay 60 cents for an apple that a grocer is selling for 40 cents. I wouldn't pay 50 cents for it either.
BTW, ECMAScript is poised to take over the world. Better strap in.
In reality, in most transactions both parties profit, because there is no fixed concept of "worth."
The apple that you bought from your grocer for 50 cents was valued at 40 cents by him and 60 cents by you.