I am firefox.

You're using Firefox, aren't you?
How wide is your HTTP pipelining, JAllen?
4 pipelining requests sounds okay.
After a bit of reading, it may be caused by looking for search results that don't exist. I've noticed something strange on Google myself these past few months: if I start browsing through the hundreds of pages of search results, the number of result pages mysteriously shrinks as I go on. There aren't as many as it makes out there are. If you try to jump to the later ones that don't exist, it thinks you're causing trouble.
This is what happens when your computer is filled with viruses from watching those "special" movies.
IT'S A FAKE!!!!
What's a fake?
J.
I always set mine to 8. 4 pipelining requests sounds okay.
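(For anyone following along, these are the old about:config prefs we're talking about, quoted from memory so double-check the names; newer Firefox versions removed HTTP pipelining entirely, so treat this as a historical sketch rather than a recommendation.)

```
// Historical Firefox prefs (user.js / about:config); quoted from memory.
// Pipelining was later removed from Firefox altogether.
user_pref("network.http.pipelining", true);           // pipeline requests on normal connections
user_pref("network.http.proxy.pipelining", true);     // and through proxies
user_pref("network.http.pipelining.maxrequests", 8);  // how many requests to queue per connection
```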
I wonder why it numbers them like that if they don't exist?
I see. Thanks for the info. In the last week or so I believe a number of Google spiders have been roaming the Net and latching onto some sites, including the RavesceneBBS and another BBS which I administer (it happened there first, possibly because I was linking to it in my signature and posting here a lot, hence multiple instances of that link), causing a sudden jump in the number of users online at any one time (our BBSes are very small in popularity, so this is noticeable).
http://en.wikipedia.org/wiki/Google_search#Error_messages
Actually, they weren't Google spiders. I did a trace on a couple I caught; they originated from Bandcon, a content delivery network. It appears to generate a swarm of page-reading entities whenever a web page is clicked on.
I wonder why it numbers them like that if they don't exist?
My guess is optimisation. An imperfect hash table suggests more results than there actually are, although it is a quick and easy way of filtering the results down to a manageable size.
Each of these pages might be entries in a concordance SQL database, but it doesn't need to look through all of those results unless it's necessary. Since most people are served by the first results page, preparing hundreds of results would be a waste of effort. It just prepares the first page, and it's only when you visit the later pages that Google is forced to sort through the rest. Only then does it discover the false positives from the hash function, and there are fewer results than initially suggested.
That's my guess anyway.
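To illustrate the idea (a toy sketch of my own, nothing to do with how Google is actually built): a cheap hash-based filter can only over-count matches, and the over-count is only discovered when the later pages force each candidate to be checked properly.

```python
import hashlib

BITS = 32  # deliberately tiny so false positives show up

def positions(word):
    # Two hash-derived bit positions per word.
    h = hashlib.md5(word.encode()).digest()
    return [h[0] % BITS, h[1] % BITS]

def make_filter(words):
    # Bloom-style bitmask summarising which words a document *might* contain.
    mask = 0
    for w in words:
        for p in positions(w):
            mask |= 1 << p
    return mask

def maybe_contains(mask, word):
    return all(mask & (1 << p) for p in positions(word))

# 500 fake documents, each holding a handful of words; every 7th one really
# contains the query term.
docs = []
for i in range(500):
    words = [f"word{i}-{j}" for j in range(8)]
    if i % 7 == 0:
        words.append("rave")
    docs.append(words)

filters = [make_filter(ws) for ws in docs]

# Cheap count (what the first results page could report): anything whose
# filter might match the query term.
estimated = sum(maybe_contains(f, "rave") for f in filters)

# Expensive check, done lazily as you page through: look at the actual words.
actual = sum("rave" in ws for ws in docs)

print(f"estimated results: {estimated}")
print(f"actual results:    {actual}")
```

Run it and the estimated figure comes out well above the actual one, which would explain the page count shrinking as you browse deeper.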
Then why did I just cut your bandwidth down so much?!

^ That would be fine. I don't look at porn.
J.