Saturday, 6 April 2013

Deep Web


The deep Web is the part of the Internet that conventional search engines cannot index, and that is consequently inaccessible to most users.
It should not be confused with the dark Internet, computers that can no longer be reached via the Internet, or with the distributed file-sharing network Darknet, which could be classified as a smaller part of the deep Web.

Estimates based on extrapolations from a study done at the University of California, Berkeley in 2001
suggest that the deep Web contains about 7,500 terabytes of data. More precise estimates are available for the number of resources in the deep Web: He et al. detected around 300,000 deep web sites in the entire Web in 2004, and, according to Shestakov, around 14,000 deep web sites existed in the Russian part of the Web in 2006.

"It would be a site that's possibly reasonably designed, but they didn't bother to register it with any of the search engines. So, no one can find them! You're hidden. I call that the invisible Web."
Bergman cited a January 1996 article by Frank Garcia

Resources that make up the deep Web fall into several categories:

-Dynamic content
-Unlinked content
-Private Web
-Contextual Web
-Limited access content
-Scripted content
-Non-HTML/text content
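
The first category, dynamic content, is the easiest to illustrate: result pages are generated on demand in response to a submitted query, so they never exist as static documents that a link-following crawler could discover. The sketch below is a toy model; the page names and database entries are invented for illustration.

```python
# A toy database-backed site. Result pages exist only as responses to
# queries, not as static, linked documents a crawler could follow to.
DATABASE = {
    "tor": "Tor is an anonymity network.",
    "i2p": "I2P is an anonymous overlay network.",
}

def homepage_links():
    """The only statically linked page is the search form itself."""
    return ["/search-form"]

def query(term):
    """Dynamic content: generated on demand from the database."""
    return DATABASE.get(term.lower(), "No results.")

# A crawler that only follows links sees one page and zero records:
print(homepage_links())   # ['/search-form']
# A person submitting the form reaches content invisible to the crawler:
print(query("Tor"))       # 'Tor is an anonymity network.'
```

Nothing in the database is reachable by following links alone, which is exactly why such content stays out of conventional search indexes.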

At present, the Internet is functionally divided into two areas:

·     The surface Web contains 1% of the information content of the Web. Search engines crawl the Web to extract and index text from HTML (HyperText Markup Language) documents on websites, then make this information searchable through keywords and directories.
·     The deep Web contains 99% of the information content of the Web. Most of this information is contained in databases and is not indexed by search engines, for both technical and business reasons. It can be searched by keywords only through the query engine located on the specific website of each database.
In their 2001 white paper, 'The Deep Web: Surfacing Hidden Value,' BrightPlanet noted that the deep Web was growing much more quickly than the surface Web and that the quality of its content was significantly higher than that of the vast majority of surface Web content. Although some of the content is not open to the general public, BrightPlanet estimated that 95% of the deep Web can be accessed through specialized search.

LexiBot, developed by BrightPlanet, is one of several specialized tools for searching the deep Web.


