Now correct me if I'm wrong, but 720p is still called HD, isn't it?
When I first started following the development of HDTV, the proposed American analog variant was going to be 1125 lines. Once the framing pulses (the vertical blanking interval) are removed, that cuts the active picture down to 1080 lines, the number now recognized as "full" HD (1920 x 1080).
I don't know where the "720" frame (1280 x 720) came from; I think I read somewhere that it was developed at Microsoft, but I can't find anything on that now. (Microsoft "jumped the gun" on MPEG-4 by introducing its own variation before the official specs were finalized, so maybe 720 was something similar.) The advantage of 720 is that roughly two such "HD" channels will fit into the same bandwidth as one 1080 channel.
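Just to sanity-check that bandwidth claim with raw pixel counts (a rough back-of-the-envelope sketch in Python; real channel bandwidth also depends on frame rate, chroma subsampling, and compression, which this ignores):

    # Rough pixel-count comparison only; actual broadcast bandwidth also depends
    # on frame rate, chroma subsampling, and compression, ignored here.
    pixels_720 = 1280 * 720     # 921,600 pixels per frame
    pixels_1080 = 1920 * 1080   # 2,073,600 pixels per frame

    print(f"720 frame:  {pixels_720:,} pixels")
    print(f"1080 frame: {pixels_1080:,} pixels")
    print(f"ratio:      {pixels_1080 / pixels_720:.2f}")  # ~2.25, so roughly two 720 channels per 1080 channel

So on pixel count alone, a 1080 frame is about 2.25 times a 720 frame, which is where the "two channels in the space of one" idea comes from.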
The 720 variation of HDTV is still called "HD" in all the marketing literature, but it's obviously a smaller picture. Once upon a time anamorphic NTSC was also called "high definition." So marketing buzzwords can be slippery.
JVC used to run a marketing pitch that 720p was actually better "HD" than 1080i. If I remember correctly, the argument was that 1080i really only delivers 540 lines at any instant, since each interlaced field carries half the picture. As noted above in this thread, interlacing was thrown into the mix to compensate for slower electronics, so in that regard I would agree with JVC. Only now we have 1080p cameras and monitors, and even higher resolutions.
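Here's the gist of that argument in numbers (assuming the usual 60 Hz US field/frame rates; just an illustrative sketch):

    # 1080i60 sends two interlaced fields per frame: each field carries 540 lines,
    # refreshed 60 times a second. 720p60 sends complete 720-line frames at 60 Hz.
    lines_per_1080i_field = 1080 // 2   # 540 lines in each interlaced field
    lines_per_720p_frame = 720          # full frame every refresh
    refreshes_per_second = 60           # same nominal 60 Hz rate for both

    print("1080i lines per refresh:", lines_per_1080i_field)  # 540
    print("720p  lines per refresh:", lines_per_720p_frame)   # 720
    # At any given instant, 720p puts more complete lines on screen than 1080i,
    # which was the core of the JVC pitch.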
Personally, I'd rather see HDRI (high dynamic range image) become common. People who've seen such monitors tell me it's like looking out a window at a sunny afternoon. ("Need sunglasses!") Deep shadows and blazing whites.