Why isn’t CPU time more valuable?

Here’s something I find puzzling: why isn’t CPU time more valuable?

I first thought about this when I was working for MD Anderson Cancer Center, maybe around 2002. Our research in adaptive clinical trial methods required bursts of CPU time. We might need hundreds of hours of CPU time for a simulation, then nothing while we figured out what to do next, then another few hundred hours to run a modification.

We were always looking for CPU resources, so we installed Condor to take advantage of idle PCs, something like the SETI@home or GIMPS projects. Then we sometimes had CPU power to spare. What could we do between simulations that was worthwhile but not urgent? We didn’t come up with anything.

Fast forward to 2019. You can rent CPU time from Amazon for about 2.5 cents per hour. To put it another way, it’s about 300 times cheaper per hour to rent a CPU than to hire a minimum wage employee in the US. Surely it should be possible to think of something for a computer to do that produces more than 2.5 cents per CPU hour of value. But is it?
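
For what it’s worth, here’s the back-of-the-envelope arithmetic behind that comparison, assuming the 2019 US federal minimum wage of $7.25 per hour and the 2.5 cents per CPU-hour quoted above:

```python
# Back-of-the-envelope check of the "300 times cheaper" claim.
# Assumes the 2019 US federal minimum wage of $7.25/hour; the 2.5 cents
# per CPU-hour is the rough EC2 price quoted above.
cpu_cost_per_hour = 0.025   # dollars per CPU-hour
wage_per_hour = 7.25        # dollars per hour of minimum wage labor

ratio = wage_per_hour / cpu_cost_per_hour
print(f"An hour of minimum wage labor costs about {ratio:.0f}x a CPU-hour")  # ~290x
```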

Well, there’s cryptocurrency mining. How profitable is that? The answer depends on many factors: which currency you’re mining and its value at the moment, what equipment you’re using, what you’re paying for electricity, etc. I did a quick search, and one person said he sees a 30 to 50% return on investment. I suspect that’s high, but we’ll suppose for the sake of argument there’s a 50% ROI [1]. That means you can make a profit of 30 cents per CPU day.
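
Here’s the arithmetic behind that figure, taking ROI to mean profit as a fraction of what the CPU time costs to rent:

```python
# Arithmetic behind the 30 cents/day figure, taking the (generous) 50% ROI
# to mean profit as a fraction of the CPU rental cost.
cpu_cost_per_hour = 0.025                # dollars per CPU-hour (AWS, 2019)
cost_per_day = cpu_cost_per_hour * 24    # $0.60 per CPU-day
profit_per_day = 0.50 * cost_per_day     # $0.30 per CPU-day
print(f"Profit per CPU-day at 50% ROI: ${profit_per_day:.2f}")
```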

Can we not think of anything for a CPU to do for a day that returns more than 30 cents profit?! That’s mind-boggling for someone who can remember when access to CPU power was a bottleneck.

Sometimes computer time is very valuable. But the value of surplus computer time is negligible. I suppose it all has to do with bottlenecks. As soon as CPU time isn’t the bottleneck, its value plummets.

Update: According to the latest episode of the Security Now podcast, it has become unprofitable for hackers to steal CPU cycles in your browser for crypto mining, primarily because of a change in Monero. Even free cycles aren’t worth using for mining! Mining is only profitable on custom hardware.

***

[1] I imagine this person isn’t renting time from Amazon. He probably has his own hardware that he can run less expensively. But that means his profit margins are so thin that it would not be profitable to rent CPUs at 2.5 cents an hour.

9 thoughts on “Why isn’t CPU time more valuable?”

  1. “What could we do between simulations that was worthwhile but not urgent? We didn’t come up with anything.”
    In a scientific context, the typical answer is to pool your resources with more people, so that bursts of usage aren’t too correlated between groups of users and at any given time someone will use the available cycles. This is the purpose of the Open Science Grid, and of similar grids in other regions. Amazon’s prices sound cheap, but most science projects (and computing cluster administrators) I’ve talked to say they are too expensive for the levels of computing needed compared to the budgets available.

  2. Most of the value in the world comes from turning data into something useful or just physically doing something. So “the network is the computer” isn’t wrong. If your $0.025/hr node were put to work processing about 10 GB/hour of streaming data, the bandwidth would cost roughly 100x more than the computing power. That ignores a lot of complexity, but it’s a good heuristic. From that point of view, compute could almost be a loss leader to help sell bandwidth.

  3. The human brain has the equivalent of at least 30 TFLOPS of computing power [1]. Getting the same computing power from CPUs at $6.3 per TFLOPS-hour on AWS would cost about $189/hr [2]. So renting a human as a “general AI computer” is more than 27 times less expensive. Also, don’t forget that humans come with powerful, high-precision mobile actuators and unmatched sensor arrays. (A quick sketch of this arithmetic appears after the comments below.)

    So in conclusion, if you have $1M lying around, you can almost always find a more profitable endeavor by renting humans than by renting the same amount of compute capacity in the cloud. The price per GFLOPS is falling, however, at about 10X every 13±3 years, so in 20-30 years things might be different.

    [1] https://aiimpacts.org/brain-performance-in-flops/

    [2] https://aiimpacts.org/recent-trend-in-the-cost-of-computing/

  4. That’s similar to the thought I have when I pour too much salt out of the container and throw the extra in the sink instead of trying to get it back into the box.

  5. Lack of investment. The areas that receive decent funding make use of additional hardware – finance, engineering, computer gaming.

    The uses of CPU capacity beyond crypto are science, art, research, investigation, curiosity. Unfortunately we do not place enough value on those things as a society – we spend what we have to, largely, and prioritise physical need and profit.

  6. I believe it’s just supply and demand. The demand for on-demand computing resources is high enough that there is an over-supply of left-over CPUs. Microeconomics suggests that the price of something in a market will tend towards its marginal cost. As there is sufficient spare CPU time for all non-critical tasks (no one is installing more capacity for that market) and CPU time is more or less an undifferentiated commodity, the price gets driven down to the network, transaction, and power costs. The price will likely jump significantly when/if the non-real-time tasks outweigh the real-time tasks and the first CPU is bought just to handle them.

  7. I understand what you’re saying about supply and demand. What’s surprising is that the demand is what it is. Why can’t we think of anything for these machines to do that’s worth more than pennies per hour? Why can’t I think of anything? It seems like a huge breakdown of creativity.
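
As a rough check of the arithmetic in the third comment above, here is a sketch using the commenter’s figures (about 30 TFLOPS for a brain, $6.3 per TFLOPS-hour on AWS); the $7.25/hour US federal minimum wage on the human side is an added assumption:

```python
# Sketch of the third comment's arithmetic: a brain at ~30 TFLOPS, cloud
# compute at ~$6.3 per TFLOPS-hour, and a human at the $7.25/hour US federal
# minimum wage (the wage is an assumption, not the commenter's figure).
brain_tflops = 30.0
dollars_per_tflops_hour = 6.3
wage_per_hour = 7.25

cloud_cost_per_hour = brain_tflops * dollars_per_tflops_hour   # ~$189/hr
ratio = cloud_cost_per_hour / wage_per_hour                    # ~26x

print(f"Brain-equivalent cloud compute: ${cloud_cost_per_hour:.0f}/hr")
print(f"Roughly {ratio:.0f}x the hourly cost of a minimum wage worker")
```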
