|Monthly Web Site Failure Rate|
|Monthly Failure Rates by Server Type|
By modifying the above graph's percentage to reflect the delta between the market share a web server currently enjoys and the percentage of failed sites running that server, we highlight whether a disproportionate number of sites of a given server type are failing.
For example, if a particular server currently enjoys a market share of 50%, but only 40% of failed sites are of that server type, then that server will have a value of -20% on the graph, since the server type is losing sites at a rate 20% lower than expected. Conversely, if the market share of a server is 10%, but 12% of all failed sites are of this server type, then that server will have a value of 20%, since it is losing sites 20% faster than expected. By calculating values in this way, we can directly compare values between different server types, regardless of their current market share.
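The calculation described above can be sketched in a few lines of Python (the function name and example figures are ours, chosen to match the worked examples in the text):

```python
def relative_deviation(market_share, observed_share):
    """Deviation of a server type's share of some event (failed sites,
    new sites, etc.) from its overall market share, expressed as a
    percentage of that market share. Negative means the server type is
    underrepresented among the observed sites; positive means it is
    overrepresented."""
    return (observed_share - market_share) / market_share * 100.0

# The two examples from the text:
# 50% market share, 40% of failed sites -> -20%
# 10% market share, 12% of failed sites -> +20%
print(relative_deviation(50.0, 40.0))
print(relative_deviation(10.0, 12.0))
```

Dividing by the market share is what makes the values comparable across server types: a 2-point shortfall matters far more to a server holding 10% of the market than to one holding 50%.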
If we stretch our imagination and assume that a site that is more likely to survive is operated by a smart web site administrator, then the following chart illustrates the technologies the smart money is betting on. (Hint: the smart money is betting on servers with a negative deviation.)
|Monthly Web Site Growth|
By measuring the rate at which we find sites we've never known about in any one month, and knowing the sample size of our data sets, we can make a first-order approximation of how large the web is: we estimate how many sites we would find if we crawled all of the web in one month, rather than only the approximately 10% that we currently crawl.
As a result of how we crawl the web, our surveys only report on what we call the "Active" Web. That is to say, we only include sites that were important enough to be referenced by another site. This means that parked domains, personal web sites not referenced anywhere, and the like are not included in our survey. Our argument is that if we can't find a site, then it really isn't part of the "Active" Web.
The following graph depicts what we feel is a reasonably accurate estimate
of the size of the active web over time:
|New Web Sites by Server Type|
By modifying the above graph's percentage to reflect the delta between the market share a web server currently enjoys and the percentage of new web sites using that server, we highlight whether a server type is doing better or worse than in the past at acquiring new sites.
For example, if a particular server currently enjoys a market share of 50%, but only 40% of new sites found are of that server type, then that server will have a value of -20% on the graph, since the server type is underperforming its expected percentage by 20%. Conversely, if the market share of a server is 10%, but 12% of all new sites are of this server type, then that server will have a value of 20%, since it is overperforming its expected value by 20%. This mechanism provides a common basis of comparison among all servers, regardless of their current market share.