Is there a theoretical limit to how fast CPUs can go at the moment?
You can hit a limit with a specific material such as silicon (once features shrink toward atomic scale, leakage, heat density and quantum tunnelling make further gains impractical), at which point you'd need to transition to new materials.
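To put rough numbers on the physical side of that limit, here's a back-of-envelope sketch (the clock speed and gate length are my own illustrative assumptions, not figures from any datasheet): at a few GHz, even light only covers a few centimetres per clock cycle, and on-chip signals are slower still, while transistor features are already only a few dozen silicon lattice cells across.

```python
# Back-of-envelope numbers for why clock speeds and feature sizes
# can't keep scaling forever on silicon (illustrative assumptions only).

C = 3.0e8              # speed of light in vacuum, m/s
CLOCK_HZ = 5.0e9       # assumed high-end clock today, ~5 GHz
SI_LATTICE_NM = 0.543  # silicon lattice constant, ~0.543 nm
GATE_LENGTH_NM = 16    # assumed rough physical gate length on a modern node

# Distance light travels in one clock cycle; real on-chip signals are slower.
cm_per_cycle = (C / CLOCK_HZ) * 100
print(f"Light travels ~{cm_per_cycle:.1f} cm per cycle at {CLOCK_HZ / 1e9:.0f} GHz")

# How many silicon unit cells fit across a transistor gate.
cells_across_gate = GATE_LENGTH_NM / SI_LATTICE_NM
print(f"A ~{GATE_LENGTH_NM} nm gate spans only ~{cells_across_gate:.0f} lattice cells")
```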
Arguably, we should have started using different materials in chip designs a while ago, such as synthetic diamond, carbon nanotubes and graphene (at least in the parts of the chip where they make sense), to build proverbial hybrids... but markets don't work like that.
They tend to focus on 'cost efficiency' and on existing production/manufacturing. Worse, many prefer to wait until new materials can be made with existing methodology, which is mainly driven by money rather than scientific or technical breakthroughs. Nothing is really stopping large businesses from, say, harvesting existing production facilities for their raw materials, decomposing them into base elements and using those to build newer, more suitable facilities... but they don't think like that, because 'cost' is again the limiting factor. So if transitioning to new materials would require redesigning production facilities, it will only be done if it's cost-effective (monetarily affordable) and profitable; otherwise, it COULD have been done from a resource and scientific/technical point of view a while ago.
Hybrid computer chips, or a complete move away from silicon as a material (and from how computer chips currently work), might be next.
For example, graphene technically has no natural 'off' switch (no bandgap), so you need to engineer one (which was actually demonstrated a LONG time ago), but to be fair, how we LOOK at computer chip operation might also need to change.
If we keep trying to apply old methodologies to new materials, as opposed to designing chips around the natural properties of those materials, it will take AGES to get anywhere. (Well, not anymore, really: you can put a supercomputer with an AI or adaptive algorithm on the research and have it done in a fraction of the time it would take whole teams of specialists.)
As for HDDs reaching their limits... I agree, and to be fair, I actually HATE how much they keep dragging out platter HDDs.
IBM already developed ideas for holographic storage in the '90s... but if you want something 'less exotic', then SSDs are more likely to take over from HDDs when it comes to storage density and reliability. Cost is the issue there: a 4TB SSD, for example, can cost as much as a 16TB HDD, which is absurd (SSD prices should really have beaten HDDs by now).
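To make that price gap concrete, here's a quick cost-per-terabyte comparison (the prices are assumptions for illustration only, not quotes from any retailer):

```python
# Rough $/TB comparison between an SSD and an HDD at the sizes mentioned above.
# Prices are assumed for illustration; check current listings for real numbers.

ssd_price, ssd_tb = 300.0, 4    # assumed: ~$300 for a 4 TB SSD
hdd_price, hdd_tb = 300.0, 16   # assumed: ~$300 for a 16 TB HDD

print(f"SSD: ${ssd_price / ssd_tb:.2f}/TB")
print(f"HDD: ${hdd_price / hdd_tb:.2f}/TB")
print(f"SSD costs ~{(ssd_price / ssd_tb) / (hdd_price / hdd_tb):.0f}x more per TB")
```

With those assumed prices the SSD works out to roughly four times the cost per terabyte, which is the gap being complained about above.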