The TNG Tech Manual gives a very simple reason for "mainframe" computing: 24th-century computers are simply massive pieces of technology, and can't be miniaturized further.
You can't build a pocket-sized steam engine and expect it to perform efficiently. Similarly, you supposedly can't build a 24th-century computer, complete with an FTL field generator that makes it perform, unless it's past a certain threshold size. Installing fifty smaller units means every one of them fails to perform, and the sum total is vastly inferior to the one mainframe.
Is this "true" in the "actual" fictional universe? It might well be, and that's all we need to know.
...The Intrepid does more computing than the Galaxy? We see no signs of that. Sure, the newer (?) ship may process more fictioquads or fantasmabits, but unless that actually translates to observably better computing, it just goes to show that the newer (?) ship is inferior at computing, squandering its resources. That's a feature of the real world, too: home and business computing plateau every now and then, and possibly take the occasional downturn, even as processors grow consistently more powerful.
Timo Saloniemi