Employee Turns Gaming Network Into Private Bitcoin Mine (wired.com)
99 points by rheide on May 1, 2013 | hide | past | favorite | 70 comments


Could this be a source of income for indie game developers? While users play the (free/cheap/torrented) game, the user's graphics card spends its spare cycles mining for the game developers. No obnoxious ads, no paying $60 for a game you only play a few times.


This works oddly well in that the amount you 'pay' for the game is directly proportional to how long you play it.


When you mine your GPU/CPU is running at full load, producing lots of noise and heat. It also maxes out power consumption on your rig and ups your electricity bill. I don't see many gamers willing to do this.


You can control how much effort you put into the mining, and nothing says the mining has to impact game play. Every game has a lot of fluff screens that could be used to put a few cycles towards mining.


Oh, so the argument was to run mining while the game is running? Hm, considering a high-end card running 24/7 hardly produces any ROI, you would need boatloads of players and a lot of fluff screens :)


My original idea was to run mining while the game doesn't require the full use of the graphics card - i.e. rendering at 60fps requires 40%, bitcoin mining could use another 40%. If the scene is more taxing and requires 70% of the GPU, mining could drop down to 10%.

I've got no idea how effective this would be, but it would be an interesting experiment for an indie game dev.
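As a toy sketch of the load-balancing idea above: give mining whatever headroom is left under a total GPU budget. The 80% budget cap is a made-up number chosen to match the parent's figures, not anything a real engine exposes.

```python
def mining_share(render_load, gpu_budget=0.8):
    """Give mining whatever GPU headroom rendering leaves free.

    render_load and the return value are fractions of total GPU capacity;
    the 0.8 budget is a hypothetical cap leaving some slack for spikes.
    """
    return max(0.0, gpu_budget - render_load)

# Matches the parent's numbers: rendering at 40% leaves ~40% for mining,
# while a 70% scene leaves only ~10%.
print(mining_share(0.4))
print(mining_share(0.7))
```

In practice the game would have to sample GPU utilization each frame and re-throttle the miner, which is exactly the overhead the replies below are skeptical about.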


Usually players want games to run as fast as possible; developers don't manually hold back framerates to do other stuff on the side. I don't think this would work very well.


This might sound nice in theory, but having had some experience optimizing mining rigs myself, I can say your proposal does not work well. I would not allow mining on my gear: it puts far more stress on the hardware than regular gaming does, in addition to drawing more power, shortening the lifespan of components, and most likely crashing the graphics driver.


Could be a great step up from freemium applications, without needing ads.


Once ASICs become mainstream, GPU mining will no longer be profitable.


The Avalon ASIC can perform 68,000 Mhash/s which is ~110x more powerful than a 7970 GPU (at ~600 Mhash/s) at around 3x the cost.

It's obviously better value for an individual to buy an ASIC than a bunch of GPUs, but I think you could still earn a few coins if you've got thousands of gamers with spare GPU cycles.
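Back-of-the-envelope on the numbers above, taking the ~3x cost figure at face value:

```python
# Figures quoted in the comment above: Avalon ASIC vs. a Radeon 7970.
asic_hashrate = 68_000   # Mhash/s
gpu_hashrate = 600       # Mhash/s
cost_ratio = 3           # ASIC costs roughly 3x the GPU

speedup = asic_hashrate / gpu_hashrate   # ~113x raw speed
value_ratio = speedup / cost_ratio       # ~38x the hashes per dollar
print(f"{speedup:.0f}x faster, {value_ratio:.0f}x the hashes per dollar")
```

Which is why the "spare gamer cycles" angle only works when someone else is paying for both the hardware and the electricity.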


If you can buy one.

The problem with ASICs is that, once they get under the profit curve, they have no other usage. I can very reliably expect a 7970 to be useful for a few years at least. An Avalon, if I manage to get it, may or may not pay for itself before difficulty makes it unprofitable to run. And then it's completely useless.

Lots of hypotheses and conjecture.


Could you repurpose them for password cracking?


Nope.

FPGAs yes, ASICs no.

They are just machines for iteratively calculating sha-256( sha-256( something ) ), and they are hard-wired to do that.
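For reference, the fixed function those chips bake into silicon is just the double SHA-256 at the heart of Bitcoin's proof of work; in software it's a two-liner (Python's hashlib shown here, with a placeholder input):

```python
import hashlib

def double_sha256(data: bytes) -> bytes:
    """The hash Bitcoin miners compute: sha256(sha256(data))."""
    return hashlib.sha256(hashlib.sha256(data).digest()).digest()

# Miners grind through nonces looking for a digest below the target;
# an ASIC runs exactly this loop in hardware and nothing else.
digest = double_sha256(b"block header bytes go here")
print(digest.hex())
```

That hard-wiring is exactly why it can't be repointed at generic password hashes the way an FPGA bitstream can be swapped out.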


It depends on who is paying for the power, though; as far as I know there are still botnets doing CPU/GPU mining on a large scale.


GPU miners may migrate to Litecoin then for that reason. Going to be an interesting next few years in the cryptocurrency space.


GPU miners have much less of an advantage over CPU miners when mining Litecoin; it was designed to be memory-intensive as well as CPU-intensive (scrypt).


Right, but whether a CPU miner is better for LTC or not is only relevant when you're choosing what kind of miner to buy.

For those who already have GPU miners they bought for BTC, which are now becoming obsolete next to FPGA and ASIC miners, the pertinent comparison is how to get the best ROI on their GPU miner: continue with BTC, switch to LTC, or just shut it off.


This could be a good way to get some minor extra cash out of hobby projects. ~$5000 might not be much for an investor, but for a student/hobby project it's clearly a nice bonus.


coinlab.com does exactly that: they developed GPU mining code that game developers can integrate into their games to earn revenue from their players.


Just wait until WebCL is widely deployed and any webpage could use your GPU to mine bitcoins!


This could also be used as a mechanism for micropayments.

Want to read this article? Help us calculate 100 gigahashes on your graphics card by simply clicking OK!


I suppose it's better than ads-only. When there are fewer ads, each ad becomes more valuable and fewer need to be shown. Perhaps it'll one day be possible to run without Adblock Plus and not have three flashing 300x300 blocks on your screen at any given time. Also, mining keeps a cryptocurrency's network secure. It sounds really good to me ^^


Well, it's not so good if you're trying to use a laptop on battery power.

Having said that, I wouldn't mind websites automatically charging me fractions of a cent for reading their content.


Interesting idea, "charging fractions of cents". Usually when people talk about covering the costs, ideas like monthly subscriptions and donations are mentioned, but I haven't heard about very small automatic fees.

Then again, not all content is worth reading. It should be proportional to how long you are on the site, or you should be able to revoke it manually or something. Anyhow, an interesting idea to think about and toy with.


And then WebCL will become disabled by default in sane browsers.

Stealth mining is akin to theft (of electricity); I wonder how it will be classified legally.


How exactly is this different from sending unwanted JavaScript with ads and tracking code? CPU cycles and bandwidth are not exactly free, and both cost electricity.

Not that it matters. Current legal thinking regarding consent is very simple: so long as the user stays on the site, implied consent is given to whatever the EULA says (so long as the EULA page is linked in tiny text in a corner somewhere). What the EU's cookie law showed is that there isn't enough political power at the layers below elected politicians to enforce a strict consent requirement on the web.


> How exactly is this different from sending unwanted JavaScript with ads and tracking code? CPU cycles and bandwidth are not exactly free, and both cost electricity.

Scale is the difference.

The bit of bandwidth required for a small image and the CPU time to render it and run any associated little bits of script, is not all that significant compared to what is going on to transfer and render the content that has actually been requested.

Running someone's GPU (and/or CPU) at 100% for a time is going to be significant, though: not only does it consume noticeable amounts of electricity directly, it also generates heat, which could reduce the life expectancy of the user's device, and it will certainly reduce the battery life until the next charge of a laptop or other portable device, which could be quite an inconvenience.

Of course that sort of thing already happens with Flash, which is one of the reasons I have it disabled by default (or simply not installed) on all my devices, with badly written animations consuming as much CPU time as is available - though that is at least just ineptitude rather than someone deliberately abusing my kit for their own gain.

Having said all that: if sites were honest about using processing power for things like that and I could choose between ad-free but power consuming and normal sites, I'd probably pick the former on desktop and the less *PU intensive option on mobile devices.

Perhaps it is something places like Reddit could consider in their continuing effort to keep the site relatively un-ad-encumbered.


> The bit of bandwidth required for a small image and the CPU time to render it and run any associated little bits of script, is not all that significant compared to what is going on to transfer and render the content that has actually been requested.

I distinctly remember people running tests on news sites. What they found was several MB of bandwidth for each few lines of actual content. Ads were around 2500 times bigger in bandwidth, and I would not even try to imagine how much JavaScript run-time is spent on ads/tracking versus the actual interface.

Sure, some sites are better and less intrusive, but is that the norm?


Heat and sound. The difference is between me working on my laptop in a lecture, and me having to shut down my laptop because the fan is going crazy...



yesterday's discussion: https://news.ycombinator.com/item?id=5636233

EDIT: wow has wired always been this bad? 16 trackers/ads according to ghostery, 20 according to disconnect


As are all sites owned by Conde Nast.


For years.


This is ridiculous: it wasn't an employee, it was the CEO, who dropped hints that they were doing it all through April. I shouldn't expect better from Wired, I guess.

This being an entrepreneur tech site, it looks like there's a market for non-terrible companies providing video game-related software and services to niche multiplayer communities.


Is this a viable revenue stream for free app developers? Would it ever make it past the Apple review process?


Something we'd actually (half jokingly) thought about a few weeks back.

Pretty sure you'd need a huge number of users to make it worthwhile, though it would be interesting for someone to do the numbers.

It could definitely get past the Apple review process, afaik it doesn't violate any developer guidelines (even though it feels wrong).


Could you imagine the reviews about draining battery life?


Absolutely.

I'd be really interested in seeing whether the energy cost is lower than the average long-term output. I'd be perfectly fine with leaving my iPhone plugged in all night if it meant it could generate me $x/night, where $x is greater than the electricity hit I take.


I'd be surprised if it were worth it. Even with fairly efficient GPU mining rigs it's getting close.


Not for long. Once the 50 GHash/s ASIC monsters come online, using anything else to mine will be unprofitable.


Bitcoin miners running on iPhones / iPads won't make any money, you need a decent GPU these days.


Sure, but could the work not be distributed amongst many users? How many concurrent users would it take for it to be useful? Would likely demolish battery life and data caps, but I think you could run only when batt > 50% and wifi is on.


It would have minimal effect on data usage, as you only ever need to report back when you've found something, and you only need to send a 32-byte precomputed seed every ~10 minutes or so.

It would, however, absolutely demolish the battery. Hash rate is pretty much directly proportional to power draw, and modern smartphones are power-limited. So, if you mine at full tilt, the smartphone will drain the battery literally as fast as it can, and get really hot doing so. You can limit the hashing speed, but this will only spread the work over time. It won't change the fact that to get any good hashrate out of the system, you basically have to consume the whole battery.

However, all of this is pointless, because you'd need millions, possibly tens of millions, of concurrent users to match a single modern $1000 mining rig. Computing SHA hashes is a workload that fits special-purpose hardware really well; general-purpose hardware simply cannot compete. And now that ASICs are driving up the difficulty, you'd be making much less than $1 of profit for every $1 of electricity consumed.


According to bitcoin.it [1], the iPhone 3G's CPU (ARM1176JZ(F)-S) can achieve 1.19 kH/s.

When I plug that into Alloscomp's bitcoin calculator [2], it says that at today's exchange rate you could expect a return of about $0.02 per month per phone (assuming they're on 24/7).

[1] https://en.bitcoin.it/wiki/Mining_hardware_comparison#ARM

[2] http://www.alloscomp.com/bitcoin/calculator
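A calculator like that is doing roughly this expected-value computation. This is a sketch; the difficulty and price figures passed in at the bottom are illustrative placeholders, not the actual 2013 values, so plug in current numbers to reproduce the calculator's figure.

```python
def expected_usd_per_month(hashrate_hs, difficulty, btc_price_usd,
                           block_reward_btc=25.0):
    """Expected value of mining over a month at a given hashrate.

    Each hash 'wins' a block with probability 1 / (difficulty * 2**32),
    so expected BTC = total hashes * reward / (difficulty * 2**32).
    """
    seconds_per_month = 30 * 24 * 3600
    hashes = hashrate_hs * seconds_per_month
    expected_btc = hashes * block_reward_btc / (difficulty * 2 ** 32)
    return expected_btc * btc_price_usd

# 1.19 kH/s from the wiki entry above; difficulty and price are guesses.
print(expected_usd_per_month(1190, difficulty=10_000_000, btc_price_usd=120))
```

The key takeaway is that expected revenue scales linearly with hashrate and inversely with difficulty, which is why ASICs coming online crushes everyone else's returns.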


If you had one billion devices you would make around 30 million per month. One billion devices is totally unfeasible, though (especially one billion on 24/7), and you could definitely make more money some other way with that many devices.


Like pushing ads to the phones.


Would it be viable to rent a (GPU) cluster for mining? $3700 in 14 days seems doable...


If only this option had been available back in the Gator browser-bar heyday.


I smell bullshit. Sounds like the company tried it, got caught, and blamed it on a (fictitious?) rogue employee.


The "rogue employee" is actually one of the cofounders. When people caught him, he lied about it and said it only ran for two days and only got 2BTC. Once people dug deeper, he admitted that it actually went on for 14 days and was for 30BTC. He's making it seem like no big deal and treating it as a joke. He's talking down to anyone who says otherwise.


> In just a couple of weeks, he netted close to BTC30, or about $3,700.

You honestly think the company would risk something like that for such a small amount of money?


I think they acknowledged they were willing to risk it by considering it in the first place.


That's what I found to be strange too. Why was it ever considered?


"the company considered" has a quite wide range of possible interpretations. It could mean anything from "a few developers bullshitted about it in the lunch room", through to "the board oversaw prototype development and approved a pilot deployment to measure public response" - which are _clearly_ very different scenarios. Unfortunately, while it's easy to distinguish those two meanings at a large company, as you try to distinguish the intent at smaller and smaller companies, the differences become much harder to judge - in the degenerate case of a two-person company, "two dev's bullshitting" and "board oversight and approval" are exactly the same thing...


Customers with a low cost of electricity and a hot GPU could have used it to offset the cost of their subscription. They would not have needed to learn anything about Bitcoin or manage their own financial account.


Hmmm, I wonder if you could somehow plot bitcoin mining rates against location and temperature/climate? I wonder if there are people who selectively turn on their mining boxen to use as room heaters, effectively getting the computation (and bitcoins) "for free" since they'd be consuming the power for heating anyway?


Thoughts like this are why I love this site.


Yes, it's not unheard of.


Instead of donating to charity, they should have given it to the gamers who had their computers illegally used. They do not realize the criminal and civil actions that can arise from this.


Are you sure there are grounds for a criminal case? When you install and run a piece of software on your computer, you grant it a fairly wide license to run. Very little actual harm was done (I can't think of anything beyond power consumption and perhaps an infinitesimal amount of wear and tear), and certainly much less than a bug that causes the computer to crash. If crashing the user's computer could be grounds for a criminal case, it would have very far-reaching consequences.


If crashing the user's computer could be grounds for a criminal case, it would have very far-reaching consequences.

It's not "crashing the user's computer". The guy intentionally added hidden functionality to his company's software that he knew would cause an extremely high load on the client. It's the difference between a mistake and deliberate sabotage.


I'm aware it didn't crash the computer, it's an analogy.

Also, "sabotage is a deliberate action aimed at weakening another entity through subversion, obstruction, disruption, or destruction". Clearly not what's going on here, and it's this kind of hyperbole I'm arguing against.


Supposedly some people's video cards overheated and were destroyed.

I would seriously consider participating in a class-action lawsuit if this happened to me.


If a piece of hardware accepts a valid API software command that will destroy it, the hardware is faulty, not the software.


It made $3700 and 14,000 users were affected. I doubt any of the users would care about the ~25-30 cents they might get from this...


$3700 with the matching donation would make it 50-60 cents with your estimate. With a group lawsuit, the payout would be much greater.


The payout to the users would be greater, or the payout from the company to the lawyers would be greater? The latter I agree with; the former I doubt, given the paltry amounts that most class-action suits return.


Especially considering some people have said their GPUs burned out during that time (even though lpkane said it wasn't optimized to use the GPU). His response when somebody asked what they should do about the RMA: "lol, I wouldn't tell them you were mining". He did offer everyone a free month of premium (a $7 value).


They're providing a premium account to the affected users.



