I was going to post this in Miscellaneous, but since it concerns a technical issue, I think it's appropriate here.
It seems that Photobucket's image-recognition software is extremely sensitive to anything that even remotely resembles an exposed female breast. Now, I understand that with 100 million registered users and nearly 5 million images uploaded every day, the job of detecting pictures that violate the rules obviously has to be done by machines. But how long will it take for image-recognition technology to equal the discerning capabilities of the human eye and the human brain? Will it ever reach that level? I get a bit frustrated when Photobucket keeps deleting my uploaded images for “terms of service violations,” even though there's no nudity and absolutely nothing offensive in my pictures. (Well, okay, one of my images did have a naughty word in it.)
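For anyone curious why these systems err so far on the side of deletion, here's a minimal sketch (not Photobucket's actual system, just an illustration with made-up scores and thresholds) of how threshold-based moderation produces false positives:

```python
# A minimal sketch of automated moderation: a classifier assigns each image
# a "nudity score," and the site deletes anything above a cutoff threshold.
# All names, scores, and thresholds here are hypothetical.

def moderate(images, threshold=0.3):
    """Flag any image whose nudity score meets or exceeds the threshold."""
    flagged = []
    for name, nudity_score in images:
        if nudity_score >= threshold:
            flagged.append(name)
    return flagged

# Hypothetical scores a model might assign: skin tones and curved shapes
# push the score up even when there is no actual nudity.
uploads = [
    ("beach_sunset.jpg", 0.10),
    ("portrait_closeup.jpg", 0.35),   # harmless, but scores above a cautious cutoff
    ("classical_statue.jpg", 0.55),   # also harmless, also flagged
    ("actual_violation.jpg", 0.90),
]

# A cautious (low) threshold catches the real violation but also deletes two
# harmless pictures; raising it reduces false positives but risks misses.
print(moderate(uploads, threshold=0.3))  # ['portrait_closeup.jpg', 'classical_statue.jpg', 'actual_violation.jpg']
print(moderate(uploads, threshold=0.7))  # ['actual_violation.jpg']
```

The point being: with that volume of uploads, a site presumably tunes the cutoff low so violations rarely slip through, and our harmless pictures are the collateral damage.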
Anyone else have similar issues with Photobucket and other image-hosting sites? Are some sites better than others in this regard?