• Welcome! The TrekBBS is the number one place to chat about Star Trek with like-minded fans.
    If you are not already a member then please register an account and join in the discussion!

The internet? Bah, it's a fad

Data transmission rates right now are actually limited by the speed at which data can be written to disk, too. We could theoretically max out the internet connection on every single modern computer, but the hard drives literally wouldn't be able to keep up with the transmission. Of course, not many places in the world offer that kind of transmission speed (IIRC, anything over 3 Gb/s is pointless for that reason), so I wonder what 15 years from now will be like.
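To put rough numbers on that (the figures below are assumed, era-typical values, not from the article): link speeds are quoted in bits per second while disk throughput is quoted in bytes per second, so a 3 Gb/s link delivers about 375 MB/s. A quick sanity check:

```python
# Back-of-the-envelope check: can a hard drive keep up with a fast link?
# All figures are assumed, era-typical numbers, not measurements.
link_gbps = 3.0                        # hypothetical 3 Gb/s connection
link_mb_per_s = link_gbps * 1000 / 8   # bits -> bytes: 375 MB/s of incoming data
hdd_mb_per_s = 120                     # assumed sustained write speed of a consumer HDD

print(f"link delivers {link_mb_per_s:.0f} MB/s, disk writes {hdd_mb_per_s} MB/s")
if hdd_mb_per_s < link_mb_per_s:
    print("the disk, not the connection, is the bottleneck")
```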
 
Then again, the biggest difference that could have affected his opinion is that everyone back then was on dialup.
The speed of the internet was one of the issues, but all of the technology of the time wasn't up to the task of the modern internet. Just think about storage: back in '95 most of us were using hard drives measured in hundreds of megabytes, maybe gigabytes, while now we've moved into the terabyte era. Storing everything digitally was a big enough feat back then; transmitting it to millions of people was another concern entirely.

Even so, it's naive of him to think that these things wouldn't be improved upon. Just compare computers of the 1990s to computers of the 70s and you can see the insane achievements that had been made. It only makes sense that 10-20 years down the line we'd make even more improvements to the technology.
 
Then again, the biggest difference that could have affected his opinion is that everyone back then was on dialup.
The speed of the internet was one of the issues, but all of the technology of the time wasn't up to the task of the modern internet. Just think about storage: back in '95 most of us were using hard drives measured in hundreds of megabytes, maybe gigabytes, while now we've moved into the terabyte era. Storing everything digitally was a big enough feat back then; transmitting it to millions of people was another concern entirely.

Last year, I read that 20 hours of video were being uploaded to YouTube every minute, so every minute, 1,200 minutes of footage were being added to their servers. And that rate increases with time; they could be over 30 hours per minute by now. That's mind-boggling to think about now, let alone 15 years ago. And that's only one website (albeit one of the biggest).
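The arithmetic behind the quoted figure checks out: 20 hours is 1,200 minutes, so at that rate (scaling up is my own extrapolation, not from the post) YouTube would be adding roughly 28,800 hours of footage per day:

```python
# Converting the quoted YouTube upload rate into other units.
hours_per_minute = 20
minutes_of_footage_per_minute = hours_per_minute * 60   # 1,200 minutes per minute
hours_per_day = hours_per_minute * 60 * 24              # hours uploaded per day

print(minutes_of_footage_per_minute, hours_per_day)
```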


Yeah, that's so true too. I think that every 5 years or so, we hit an innovation that really improves upon things. One year it was Google. Then came YouTube, which really made it easy to find old videos. Kind of like a video archive, really. Another was Wikipedia. That's what I think the real Internet 2.0 is all about. Not front ends like fancy websites or social networks, as they'd like you to think, but information and media retrieval. That's the biggest difference between then and now. Imagine if you could go back in time and tell yourself that you'd be streaming movies, bringing your whole bookshelf to the beach, and carrying a cell phone resembling a communicator and an iPad like a PADD. You'd most likely get a blank stare in return.

I read Bill Gates's book, The Road Ahead, in the '90s. It was an interesting read. I found an article a while back that described many things he got right. It was written in 2006, but it fits with the timeline the first article was written in:

http://www.bit-tech.net/bits/2006/02/08/road_ahead_billgates/
 
As wrong as he was about the future of the Internet, if I had my druthers I'd redesign the whole thing from the ground up. It grew too chaotically, with nobody thinking about how to make the different parts work together. Too many programming languages that do the same thing (some days I have to write code in four or five different languages - how ridiculous is that? Can VB really do anything that C# can't do?). Too many bits and pieces that need to talk to each other but weren't designed to do so.

A few months ago, I was tasked with writing an application that would open an email account, go through the Inbox looking for messages indicating that an outgoing email had bounced, parse the headers to determine the reason, and move the email into an appropriate folder, depending on the reason for the bounce. It took a ridiculous amount of time because there's no standard way for those headers to be formatted. Sometimes they're in attached files, sometimes they're in the body of the bounce message, and sometimes there just doesn't seem to be any way of determining the reason. The codes themselves can be formatted in a couple of different ways. If there's a simpler way of handling the problem, it's not documented anywhere. (Don't get me started on Microsoft's lousy documentation, when it even exists - they love to obfuscate things.) Drove me up the wall.
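For what it's worth, here's a minimal sketch of the standardized half of that job, using only Python's stdlib email module: RFC 3464 bounces carry a message/delivery-status part with a Status field (e.g. 5.1.1). The folder names and routing rules here are made up for illustration, and, as described above, plenty of real bounces don't follow this format at all:

```python
import email

def bounce_status(raw_message):
    """Return the DSN status code (e.g. '5.1.1') from an RFC 3464 bounce, or None."""
    msg = email.message_from_string(raw_message)
    for part in msg.walk():
        if part.get_content_type() == "message/delivery-status":
            # The payload of a delivery-status part is a list of header blocks:
            # one per-message block, then one block per recipient.
            for block in part.get_payload():
                status = block.get("Status")
                if status:
                    return status
    return None  # non-standard bounce: this is where the real pain starts

def target_folder(status):
    # Hypothetical routing: 5.x.x is a permanent failure, 4.x.x is transient.
    if status is None:
        return "Bounces/Unknown"
    if status.startswith("5"):
        return "Bounces/Permanent"
    if status.startswith("4"):
        return "Bounces/Transient"
    return "Bounces/Other"
```

This only covers senders that emit compliant delivery-status parts; the bounces embedded in attached files or free-form message bodies still need their own ad-hoc parsing.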

We won't even talk about the security issues. A ground-up redesign of the whole thing would start by plugging all the holes that viruses are written to exploit. Sure, there would probably be vulnerabilities that would be missed, but a good foundation would make it a lot harder for hackers to get in and wreak havoc.
 
You're welcome, Owain! :)

As wrong as he was about the future of the Internet, if I had my druthers I'd redesign the whole thing from the ground up. It grew too chaotically, with nobody thinking about how to make the different parts work together. Too many programming languages that do the same thing (some days I have to write code in four or five different languages - how ridiculous is that? Can VB really do anything that C# can't do?). Too many bits and pieces that need to talk to each other but weren't designed to do so.

A few months ago, I was tasked with writing an application that would open an email account, go through the Inbox looking for messages indicating that an outgoing email had bounced, parse the headers to determine the reason, and move the email into an appropriate folder, depending on the reason for the bounce. It took a ridiculous amount of time because there's no standard way for those headers to be formatted. Sometimes they're in attached files, sometimes they're in the body of the bounce message, and sometimes there just doesn't seem to be any way of determining the reason. The codes themselves can be formatted in a couple of different ways. If there's a simpler way of handling the problem, it's not documented anywhere. (Don't get me started on Microsoft's lousy documentation, when it even exists - they love to obfuscate things.) Drove me up the wall.

We won't even talk about the security issues. A ground-up redesign of the whole thing would start by plugging all the holes that viruses are written to exploit. Sure, there would probably be vulnerabilities that would be missed, but a good foundation would make it a lot harder for hackers to get in and wreak havoc.


Agreed, there. It's way too easy for someone to crack into web servers and bring a business to its knees while people scramble to get things back in order.
 
As wrong as he was about the future of the Internet, if I had my druthers I'd redesign the whole thing from the ground up. It grew too chaotically, with nobody thinking about how to make the different parts work together. Too many programming languages that do the same thing (some days I have to write code in four or five different languages - how ridiculous is that? Can VB really do anything that C# can't do?). Too many bits and pieces that need to talk to each other but weren't designed to do so.

A few months ago, I was tasked with writing an application that would open an email account, go through the Inbox looking for messages indicating that an outgoing email had bounced, parse the headers to determine the reason, and move the email into an appropriate folder, depending on the reason for the bounce. It took a ridiculous amount of time because there's no standard way for those headers to be formatted. Sometimes they're in attached files, sometimes they're in the body of the bounce message, and sometimes there just doesn't seem to be any way of determining the reason. The codes themselves can be formatted in a couple of different ways. If there's a simpler way of handling the problem, it's not documented anywhere. (Don't get me started on Microsoft's lousy documentation, when it even exists - they love to obfuscate things.) Drove me up the wall.

We won't even talk about the security issues. A ground-up redesign of the whole thing would start by plugging all the holes that viruses are written to exploit. Sure, there would probably be vulnerabilities that would be missed, but a good foundation would make it a lot harder for hackers to get in and wreak havoc.

I think this is a very wrong-headed way to look at it. The plethora of tools and technologies available isn't the problem, nor is it a lack of standards. Email headers are standardized. If you want to get to the meat of the problem it's that companies such as Microsoft decided not to comply with standards, went their own way, and everyone else had to be compatible because Microsoft's market clout allowed them to gain wide penetration. (Microsoft is hardly alone in this, they are just the most obvious target.)

If anything, I think the argument is for more standardization as well as concerted efforts to punish entities that don't comply with those standards. HTML and especially JavaScript were clusterfucks for years because of non-standard implementation wars between browsers.

What language is running on the backend is really irrelevant so long as it outputs standard HTML/XML/CSS/etc.

Security is a whole other ball of wax. The Internet is insecure because it was designed that way. Individual servers can be secure but the network itself is fundamentally open and trusting. It's why DDoS attacks still work despite being so simple and well-known: they exploit the basic open nature of the Internet.

I am against any rearchitecting of the Internet in the name of "security." That is nothing more than a cover for massive government intrusion and control.
 
That said, I still have my doubts that it will replace physical media entirely any time soon. I love streaming video and the ease of access to almost anything I want, but nothing beats having a copy you can pop into a DVD/Blu-ray player, or a book you can read with no power or no internet connection.

Who knows? In 15 years I might be as wrong as him and no one will use physical media for anything but legal or back-up purposes.

The day physical media is relegated to that role, (pardon the vulgarity) we are fucked as a society when something big, man-made or natural, takes down the internet.
 
I agree, but there are a lot of people even here who think that's our future: that physical media will be as dated and worthless, or at least as archaic, as any old physical format is today (8-track, cassette tape, floppies, etc.).
 
The author must be pretty dumb. I used the Internet back then extensively. No, it wasn't as useful as it is today, but it was still pretty useful. You could find lots of good information, and it wasn't as hard to find as he implies. Just like today, there were reputable and not-so-reputable sites. You could easily and safely purchase all sorts of things online. There was email and IM (PowWow was an early one that I used).

The author claims that he'd been online for 20 years. He should've noticed the trends then.

Mr Awe
 
Then again, the biggest difference that could have affected his opinion is that everyone back then was on dialup.
The speed of the internet was one of the issues, but all of the technology of the time wasn't up to the task of the modern internet. Just think about storage: back in '95 most of us were using hard drives measured in hundreds of megabytes, maybe gigabytes, while now we've moved into the terabyte era. Storing everything digitally was a big enough feat back then; transmitting it to millions of people was another concern entirely.

Last year, I read that 20 hours of video were being uploaded to YouTube every minute, so every minute, 1,200 minutes of footage were being added to their servers. And that rate increases with time; they could be over 30 hours per minute by now. That's mind-boggling to think about now, let alone 15 years ago. And that's only one website (albeit one of the biggest).

Anyone familiar with Moore's law should have been aware that all of those issues would soon go by the wayside. If you assume things double every 2 years, then over 14 years computers would improve by a factor of 128. That'll solve a lot of problems, including storage issues!
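The doubling arithmetic spelled out (the 2-year doubling period is the common rough statement of Moore's law): 14 years is 7 doublings, and 2^7 = 128.

```python
# Moore's-law style doubling over the 14 years the post refers to.
years = 14
doubling_period_years = 2
doublings = years // doubling_period_years   # 7 doublings
improvement_factor = 2 ** doublings

print(improvement_factor)  # 128
```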

Mr Awe
 
To be fair, it's an article written 15 years ago, when many things that we take for granted today were in their infancy or not even invented yet.

15 years is a lot of development time for something as fast-developing as media and information technology. Some of what he said is true and some of it isn't (mainly the stuff that couldn't be done technologically then and is normal today, like Wi-Fi networks making it indeed possible to take your laptop to the beach and get on the internet from there).

Who knows what technology will look like in another 15 years, and what we'll take for granted then that's just appearing now.

Maybe laptops will be a disappearing technology by then and iPad-like devices will be the norm.
 
To be fair, it's an article written 15 years ago, when many things that we take for granted today were in their infancy or not even invented yet.

15 years is a lot of development time for something as fast-developing as media and information technology. Some of what he said is true and some of it isn't (mainly the stuff that couldn't be done technologically then and is normal today, like Wi-Fi networks making it indeed possible to take your laptop to the beach and get on the internet from there).

Who knows what technology will look like in another 15 years, and what we'll take for granted then that's just appearing now.

Maybe laptops will be a disappearing technology by then and iPad-like devices will be the norm.

I brought that up a little in the "defunct technology" thread. Desktop computers are moving out of the consumer mainstream, replaced by laptops. Laptops themselves are being crowded out by netbooks, iPads, and the like. I think the computer market will look very different in 10 years.
 
I think this is a very wrong-headed way to look at it. The plethora of tools and technologies available isn't the problem, nor is it a lack of standards. Email headers are standardized. If you want to get to the meat of the problem it's that companies such as Microsoft decided not to comply with standards, went their own way, and everyone else had to be compatible because Microsoft's market clout allowed them to gain wide penetration. (Microsoft is hardly alone in this, they are just the most obvious target.)

Okay, point taken. I shouldn't have said that they aren't standardized. I don't have the site bookmarked on my laptop - it's something I use at work, and right now I'm on a train just west of Kingston, Ontario (yes, a train :p) - but I know there is a standard for how those headers are supposed to be formatted and what the various error numbers mean. But when one company is using Microsoft Exchange Server to handle its email, another is using a different system, and a third organization uses yet another, somewhere along the way one of them (probably Exchange Server) decides to write out its own messages, making the standardized ones irrelevant. And finding the documentation to process those non-standard messages is next to impossible. So yes, it would be lovely to be able to take to task the companies that ignore the standards. Unfortunately, as you point out, M$ is the elephant next to the mice that are everything else, and they get to do whatever they want.

If anything, I think the argument is for more standardization as well as concerted efforts to punish entities that don't comply with those standards.

I'd love to be able to do that - but how? 99% of users don't care about that, and will continue to buy software that is incompatible with everything else just because it's from M$. Unless M$ decides to open up their APIs for the benefit of those of us developing software that needs to interface with theirs - and we all know that the Toronto Maple Leafs will win the Stanley Cup before that happens - we developers are stuck.

HTML and especially JavaScript were clusterfucks for years because of non-standard implementation wars between browsers.

Hell, they still are, though it's better than it used to be. A few months ago I was sending out an email blast to a client's customers. I looked at it in Yahoo Mail - perfect. Gmail - perfect. Firefox, IE, Chrome, Safari, Outlook 2003 - perfect. Outlook 2007? Looked like crap. It took us two days to redo the client's HTML so that it would look decent (and it still wasn't perfect).

Security is a whole other ball of wax. The Internet is insecure because it was designed that way. Individual servers can be secure but the network itself is fundamentally open and trusting. It's why DDoS attacks still work despite being so simple and well-known: they exploit the basic open nature of the Internet.

I am against any rearchitecting of the Internet in the name of "security." That is nothing more than a cover for massive government intrusion and control.

Well, if you want to get technical, the Internet was "designed" so that a nuclear attack wouldn't bring down all communications between government installations - it grew organically on top of ARPANET, Usenet, etc. But what I was referring to was holes like those that allow people to hack into networks using nothing more complex than JavaScript - and I've witnessed it with my own eyes. Stuff like that shouldn't be possible, and maybe one can argue that the designers of those networks are to blame, but as a developer, I can't think of a single reason why you would want to design your system so that it was.
 
To be fair it's an article written 15 years ago when many things that we take for granted today are in their infancy or not even invented yet.

I guess the point is that a lot of us foresaw this stuff coming, easily. I used the internet back in the 90s and you could see things developing. The trends were there.

It would take a noticeable lack of imagination to miss that. Indeed, the fact that he could easily point out what was missing back then should have indicated to him that others had noticed the same things and would be working on them.

Mr Awe
 
It's been around in some form or another since before that, dating back to the 1960s, I think.
 
It's been around in some form or another since before that, dating back to the 1960s, I think.

OH, yeah! Part of one of my classes this semester dealt with the history of the Internet.

ARPANET was the seed of the whole shebang!
 