Against large blocks

Larger blocks create too much risk to Bitcoin's decentralization
First, it should be understood that decentralization is the most important property of Bitcoin.

Larger blocks reduce decentralization via the following mechanisms:


 * Higher block propagation latency favors large miners
 * Large blocks make it too costly to run a full node

Because of those two effects and because decentralization is so critical, we should in general be conservative about raising the block size.

We should be especially hesitant to raise the block size now, because mining is already too centralized. We don't have any margin of safety there.

Counterarguments

 * Make sure to see the counterarguments on the sub-pages linked to above.
 * It's unlikely we would permanently lose the opportunity for a decentralized currency even if Bitcoin becomes corrupted. As long as the technology for decentralized currencies exists, and as long as there is some market demand for one, it's likely that if Bitcoin is corrupted, some more decentralized currency will step in to fill the needs no longer met by Bitcoin.
 * If Bitcoin ever got to a dangerous level of centralization, we could always create some hard or soft fork to dial back the block size.
 * Some would claim that we're already at such a dangerous level.
 * It's true that decentralization is the key property of Bitcoin, but we should still be willing to sometimes sacrifice some amount of decentralization for other benefits.

We need a fee market eventually, so it's good for one to develop now
Matt Corallo writes: "Long-term incentive compatibility requires that there be some fee pressure, and that blocks be relatively consistently full or very nearly full. What we see today are transactions enjoying next-block confirmations with nearly zero pressure to include any fee at all (though many do because it makes wallet code simpler)."

Peter Todd summarizes this paper: "In short, without either a fixed blocksize or fixed fee per transaction Bitcoin will not survive as there is no viable way to pay for PoW security. The latter option - fixed fee per transaction - is non-trivial to implement in a way that's actually meaningful - it's easy to give miners "kickbacks" - leaving us with a fixed blocksize."

Jorge Timon writes: "I agree that it's hard to predict that future, but having some competition for block space would actually help us get more data on a similar situation to be able to predict that future better. What you want to avoid at all cost (the block size actually being used), I see as the best opportunity we have to look into the future."

Mark Friedenbach writes: "It is a simple fact that the block size cannot increase so much as to cover every single use by every single person in the world, so there is no getting around the reality that we will have to transition into an economy where at least one side has to pay up for a transaction to get confirmation, at all. We are going to have to deal with this issue whether it is now at 1MB or later at 20MB. And frankly, it'll be much easier to do now."

Mark later writes: "We must be careful to use the block size limit now to get infrastructure to support a world with full blocks -- it's not that hard -- while still having a little room to grow fast if things unexpectedly break."

Ensuring Bitcoin's long-term security is a tough problem. We have some vague understanding that transaction fees will eventually need to pay for security, and we're also pretty sure that a fee market will work. But our fee market right now looks nothing like it'll need to in the future. What if there are problems with how we're imagining things will work? Maybe the more experience we get with fee markets between now and when we need them, the better we'll be able to fine-tune the fee market in the future to actually pay for security.

Counterarguments

 * We'll probably need a fee market with lots of users eventually, perhaps in over 20 years if the exchange rate grows faster than the block halving schedule. The best chance of achieving that is to do things to increase the number of users we'll have in 20 years. Making Bitcoin harder for users to use now mainly hurts our long term growth trajectory and reduces our options in the future. Bitcoin's security depends on its usage winning a race against its reward halving schedule.
 * The lower the block size, the more likely it is that a sudden increase in demand would result in high fees. High fees have many negative effects.
 * Fee markets aren't that complicated. It's simple supply and demand. There's no need to get such a huge head start when we won't need one for so long and doing so has immediate costs.
 * Gavin writes that we already have a functioning fee market, although only if we consider transactions that need to confirm quickly. If a transaction carries just the minimum fee above the spam limit, it will get confirmed eventually. It's unclear how current wallets and full nodes would handle lots of transactions that would never confirm.
 * The far future is uncertain. We might not need a fee market for over 20 years. It's a mistake to try to plan so far ahead when we don't need to, especially when the plan involves incurring costs now to gain an uncertain benefit in 20 years. Gavin makes this argument here.
 * Peter R. has published an analysis showing that a fee market would develop even in the absence of a block size limit.
 * A seemingly severe criticism of this analysis is that it didn't take into account that a miner's orphan rate depends on its hash power. It'd be good to see the analysis redone with this change.
 * Long term security can't be funded via transaction fees
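The "simple supply and demand" point can be made concrete with a toy model of block construction. Everything below is illustrative: the transaction sizes, fees, and block capacity are made-up numbers, though real miners do prioritize transactions roughly by feerate.

```python
# Toy fee-market model: miners fill scarce block space with the
# highest-feerate transactions first. All numbers are made up.

def fill_block(transactions, max_block_bytes):
    """Greedily select transactions by feerate (fee per byte)."""
    by_feerate = sorted(transactions,
                        key=lambda tx: tx["fee"] / tx["size"],
                        reverse=True)
    selected, used = [], 0
    for tx in by_feerate:
        if used + tx["size"] <= max_block_bytes:
            selected.append(tx)
            used += tx["size"]
    return selected

# Hypothetical mempool: sizes in bytes, fees in satoshis.
mempool = [
    {"size": 250, "fee": 5000},   # 20 sat/byte
    {"size": 500, "fee": 5000},   # 10 sat/byte
    {"size": 250, "fee": 500},    #  2 sat/byte
    {"size": 400, "fee": 400},    #  1 sat/byte
]

# With only 800 bytes of space, just the two highest-feerate
# transactions make it in; the cheap ones must wait or raise fees.
block = fill_block(mempool, max_block_bytes=800)
print([tx["fee"] / tx["size"] for tx in block])   # [20.0, 10.0]
```

The feerate of the last transaction that fits acts as the market-clearing price: anyone who wants in sooner has to bid above it, which is the "pressure" the quotes above are talking about.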

Having high fees now will encourage the development of more scalable solutions sooner
Matt Corallo writes: "This allows the well-funded Bitcoin ecosystem to continue building systems which rely on transactions moving quickly into blocks while pretending these systems scale. Thus, instead of working on technologies which bring Bitcoin's trustlessness to systems which scale beyond a blockchain's necessarily slow and (compared to updating numbers in a database) expensive settlement, the ecosystem as a whole continues to focus on building centralized platforms and advocate for changes to Bitcoin which allow them to maintain the status quo" ...and: "Yes, I am arguing that by increasing the blocksize the incentives to actually make Bitcoin scale go away. Even if amazing technologies get built, no one will have any reason to use them."

Pieter Wuille writes: "Call me cynical, but without actual pressure to work on [things like Lightning], I doubt much will change. Increasing the size of blocks now will simply make it cheap enough to continue business as usual for a while - while forcing a massive cost increase (and not just a monetary one) on the entire ecosystem."

Wladimir writes: "A mounting fee pressure, resulting in a true fee market where transactions compete to get into blocks, results in urgency to develop decentralized off-chain solutions. I'm afraid increasing the block size will kick this can down the road and let people (and the large Bitcoin companies) relax, until it's again time for a block chain increase, and then they'll rally Gavin again, never resulting in a smart, sustainable solution but eternal awkward discussions like this."

Counterarguments
 * As long as people expect that there will be high enough fees in the future to make layer 2 networks worthwhile, we don't need to create high fees now.
 * Blockstream has hired Rusty Russell to implement Lightning, and there are at least a few other implementations of layer 2 networks being worked on (here, here, here). The active work on this should somewhat reduce the fear that it won't get done.
 * Gavin writes that even if block space becomes plentiful, there are still a lot of reasons for people to want these layer 2 networks.
 * The high fees that would be needed to encourage faster development of layer 2 infrastructure could have many other very bad effects. They'd also place us at risk of even higher fees if demand increased faster than we expected, magnifying the negative effects of high fees.

The Bitcoin network scales as O(N^2), which is not scalable
This discussion assumes familiarity with Big O notation. Let's look at where the quadratic growth comes from.

Let n = the total number of full nodes, u = the total number of Bitcoin users, and t = the average number of transactions per user.

The per-node cost is then O(u*t), and the network-wide cost is O(n*u*t).

Assume some constant fraction of users operate full nodes, say n = 0.001*u. Then the network-wide cost is O(0.001*u*u*t) = O(u^2), assuming t is capped and doesn't keep growing with u. Because the total cost of the network is quadratic in the number of users, the network doesn't scale.
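The derivation above can be sanity-checked numerically. All parameter values below (transactions per user, node fraction) are illustrative assumptions, not measurements of the real network:

```python
# Sketch of the scaling argument. All parameter values are
# illustrative assumptions, not measurements of the real network.

def costs(u, t=2.0, node_fraction=0.001):
    """Return (per-node cost, network-wide cost) in transactions
    processed per day, for u users making t transactions each."""
    n = node_fraction * u          # full nodes: a fixed fraction of users
    per_node = u * t               # every node validates every transaction
    network = n * per_node         # total work summed over all nodes
    return per_node, network

# Doubling the user base doubles the per-node cost but quadruples
# the network-wide cost: linear vs quadratic growth.
p1, w1 = costs(1_000_000)
p2, w2 = costs(2_000_000)
print(p2 / p1, w2 / w1)   # 2.0 4.0
```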

Adam Back has a post on this here.

Counterarguments

 * What is important is the per-node cost, not the total network cost. The per-node cost is not quadratic. Looking at the cost of the entire network exaggerates the scaling difficulty, because the n^2 amount of work is split over n nodes. So if O(u*t) is something each individual node can handle, then it doesn't matter that the total amount of work the entire network does scales quadratically.
 * For example, consider a hypothetical network with N nodes, where each node has some unique identifier. Suppose the operation of this network requires each node to compute, every day, a hash of every other node's identifier plus some time-based data. Then this network is O(N^2) overall, with each node being O(N), like Bitcoin. Yet this hypothetical network does scale, because even if every person in the world were on the network, computing 7 billion hashes per day on each node is very cheap. So a network having O(N^2) total cost does not imply that it won't scale; deeper analysis is needed.
 * n is not necessarily a constant fraction of u. As u increases, n might grow logarithmically, or n might max out at around 100,000. As Bitcoin grows, each new user is less and less likely to run a full node because they're more of a mainstream user. As long as any given user can run a full node if they want to, and as long as there are enough full nodes that a new user can find an honest one, no important decentralization is lost if n doesn't scale linearly with u.
 * Big-O notation conveys performance only very approximately. In real life, the constant factors matter a lot. We can perform more detailed calculations, given knowledge of the practical limits of u, t, and n. If we do a detailed analysis and find that Bitcoin can grow much bigger without hitting scaling issues, what does it matter if things would break at values of u, t, and n that we'll never encounter?
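As a sketch of such a "constant factors" calculation, the per-node load implied by O(u*t) can be computed directly. Every figure below is an assumption chosen for illustration, not a measurement:

```python
# Back-of-the-envelope per-node load under assumed values.
# Every figure here is a made-up illustration, not a measurement.

AVG_TX_BYTES = 250        # assumed average transaction size
users = 10_000_000        # assumed user count
tx_per_user_per_day = 2   # assumed average activity

tx_per_day = users * tx_per_user_per_day
bytes_per_day = tx_per_day * AVG_TX_BYTES
mb_per_day = bytes_per_day / 1_000_000

# Per-node cost is O(u*t) regardless of how many nodes exist,
# so this is the load each full node carries: here, 20 million
# transactions and about 5 GB of data per day.
print(tx_per_day, mb_per_day)   # 20000000 5000.0
```

Whether a number like 5 GB/day is acceptable depends entirely on the hardware and bandwidth one assumes a hobbyist node has, which is exactly where the real disagreement lies.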

A larger block size will make it easier for governments to regulate miners and full node operators
If mining operations become so large that only huge professionally run organizations can mine profitably, then these organizations can more easily be identified and controlled by governments.

If the block size ever got large enough that only huge professional organizations could run full nodes, they could also be regulated more easily.

Counterarguments

 * None of the current block size proposals require anything more than a pretty good broadband connection and a decent modern hard drive, so the idea of full nodes being regulated with current proposals seems far-fetched.
 * Gavin's proposal does result in block size growth that reaches 8 GB in 2036, so if technology improves much more slowly than Gavin expects, we could end up in this situation. If that started happening, though, we could always reduce the max block size.
 * It's not clear that any current proposed increase would give a large professional mining operation a significant advantage over a hobbyist miner with a good broadband connection.
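For reference, the 8 GB figure mentioned above follows from the commonly cited parameters of Gavin's proposal: an 8 MB cap that doubles every two years starting in 2016. Treat these parameters as assumptions of this sketch:

```python
# Block size cap under a doubling schedule: start at 8 MB and
# double every two years. Parameters follow the commonly cited
# form of Gavin's proposal; treat them as assumptions here.

def cap_mb(year, start_year=2016, start_mb=8, doubling_years=2):
    doublings = (year - start_year) // doubling_years
    return start_mb * 2 ** doublings

for year in (2016, 2020, 2026, 2036):
    print(year, cap_mb(year), "MB")

# By 2036 the cap reaches 8 * 2**10 = 8192 MB, i.e. 8 GB:
# ten doublings over twenty years.
```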

Downloading large blocks via Tor will cause nodes to reveal themselves
Gavin quotes the argument as being: "More transactions makes it more difficult to keep Bitcoin activity private from oppressive governments."

Presumably it's easy to hide small amounts of traffic using Tor without revealing yourself, but the risk of revealing yourself grows as your rate of data transfer increases. So running a full node over Tor will become riskier as the block size increases.

Counterarguments

 * Gavin argues that you could just pay for a machine outside of your oppressive country to run a full node on, and use Tor to connect to it. The traffic between your node and the rest of the Bitcoin network won't be over Tor, so whoever you're trying to conceal yourself from won't be suspicious of you.
 * I vaguely recall Peter Todd countering that hosted machines are notoriously insecure so you open yourself up to more hacking risk in this situation. If someone can send me an original source where this argument is made I'd appreciate it. It'd also be good to get hard data on how secure hosting providers are.

Almost all technical experts favor waiting
Greg Maxwell writes: "Committers to Bitcoin Core who are pushing for immediate deployment of the 20MB increase: Gavin. Opposed/Concerned: Myself, Pieter, Jeff, Wladimir. Gavin stands more or less alone in the Bitcoin Core technical community on this. That there are many people at my company who are concerned about the impact on the survivability of Bitcoin as a decentralized system is a product of the fact that these concerns are the overwhelming majority in the technical community, and my company was created specifically to fund technical infrastructure work, and so employs a lot of technical people out of this space.

[I'm using committers in that list as a fixed set and because the OP highlights them. If I instead e.g. take all the people with substantial activity in the last year, there are a couple of people whose positions are unknown; even if I assume they support Gavin's recent push, the result remains: people concerned about that proposal are the overwhelming majority]"

Since this was written, Gavin has changed his proposal to initially increase to 8 MB instead of 20 MB. Jeff Garzik has moved closer to Gavin's position, proposing his own hard fork increase to 2 MB with subsequent increases, depending on votes by miners, up to a limit of 32 MB. Jeff's proposal is still more conservative than Gavin's. Still, among the people who have contributed the most to Bitcoin Core, most do not regard it as urgent to increase the block size.

It is true that many people in industry favor a block size increase, including wallet developers and exchange developers, as well as miners. Some of these people may be technical, but their expertise with regard to security and decentralized systems is on average lower than that of most of the people who contribute a lot to Bitcoin Core or who hang out in the #bitcoin-wizards IRC channel.

Counterarguments

 * Those opposing a block size increase tend to come from a particular culture of cypherpunks. These people often place an extremely high value on security and trustlessness. Their opposition to increasing the block size might not reflect a real disagreement about the trade-off involved (and therefore not be a real technical disagreement), but might simply reflect a different opinion on where on the decentralization / usage spectrum we should aim.
 * Meni Rosenfeld, an extremely technical and well respected person in the Bitcoin community, has recently come out as preferring 8 MB blocks over 1 MB blocks in the near term. So now the list of very technical high profile people supporting larger blocks soon includes at least Gavin, Mike Hearn, Jeff Garzik, and Meni Rosenfeld.

Fewer node operators means fewer people need to consent to future changes
Peter Todd writes: "Even a relatively small increase to 20MB will greatly reduce the number of people who can participate fully in Bitcoin, creating an environment where the next increase requires the consent of an even smaller portion of the Bitcoin ecosystem. Where does that stop? What's the proposed mechanism that'll create an incentive and social consensus to not just 'kick the can down the road'(3) and further centralize but actually scale up Bitcoin the hard way?"

The important point is that miners and those running full nodes are really the only people that need to consent to changes in Bitcoin.

Counterarguments

 * Ultimately all power flows from the users of the system. The mechanism is just indirect. Users provide demand which drives the value of the coin. If users don't want to use Bitcoin because they feel the requirements of running a node make it too centralized, then the price will go to zero and these full nodes will just be helping to maintain a useless network. If users stick with an old fork of the chain with a smaller block size, coins on that fork will have value whereas coins on the new fork won't, so miners will mine on the old fork, exchanges will only accept coins on the old fork, etc.

Larger blocks would still not allow all use cases. You always have to exclude something. So what's the difference between 1 MB and 20 MB?
See Peter Todd here, and Pieter Wuille here and here.

Whether blocks are 1 MB or 20 MB doesn't fundamentally change anything, because demand for completely free transactions will be nearly infinite. We have to set a limit somewhere. We'll lose some use cases with 1 MB, and we'll lose almost as many at 20 MB. Maybe fees will be a bit higher at 1 MB, but the difference is not a big deal.

Counterarguments

 * High fees have many negative effects. We shouldn't underestimate them.
 * An increase in demand at a lower block size is more likely to result in very high fees than at a higher block size.

We could raise the block size quickly in an emergency if we needed to
Greg Maxwell mentions here the possibility of hardforking with a short lead time. See also comments from Luke-jr in that thread.

If there were a reliable mechanism for increasing the block size on short notice, in response to bad effects from having too small a block size, it would reduce the urgency of hard forking soon.

Counterarguments

 * Those opposed to a near-term hard fork have spent very little effort arguing for their ability to do a fast hard fork when needed, so it's not clear that they really believe it would be that easy or would go smoothly.
 * It seems like there is no drawback to at least having a concrete plan of how such a "fast" hard fork would be rolled out and what conditions would trigger it. No effort seems to have been devoted to this so far, which is concerning if this is really being put forth as an argument for waiting.