Website
A website is a basic unit of information on the Internet.

A website is a single information resource on the Internet. It is usually realized as a set of web pages devoted to a specific topic. The pages are written in special markup languages and may contain various types of data: applications, audio and video files, images, etc. The site is hosted on a hardware server and can be found on the Internet by a unique name thanks to DNS. Access to a website is usually provided via HTTP (HyperText Transfer Protocol) or its secure version, HTTPS. This protocol delivers information from the server to a client, who usually views the pages with a special application called a browser.

  • Uptime
  • Website Hosting
  • DNS
  • HTTP
  • Website Monitoring
Have you checked your website speed recently? If not, you should!

We’re happy to announce that we've finally released the “Response time check” tool to diagnose poor website performance. Speed is one of the most important aspects of a website's operation, as it affects not only Google rankings but also your visitor conversions. In this article:

  • Which components make up page load time?

  • Website speed optimization

  • How to measure your website response time with HostTracker?

 

A recent study shows that visitors aren’t willing to wait more than three seconds for a page to load. And truly, we all know how waiting for a slow-loading website can feel like an eternity when you’re trying to get some vital information. A slow website can cost your business real money. What really matters is that taking a few simple steps to optimize website performance can make a big difference. So, let’s look at some common causes of a slow website and how you can speed things up.

Which components make up page load time?

  1. DNS lookup time - the time it takes to map a domain name to its corresponding IP address.
  2. Connect Time (TCP) - how long it takes to establish a connection to the web server.
  3. First Byte Time (TTFB) - the interval between the server receiving the HTTP request and sending the first byte of the response back.
  4. Download Time (Content Time) - the time span between the start and end of content download.

Note: If you’re using an SSL certificate for a secure connection, you’ll need to spend extra time on authentication, which means additional time for establishing a link between your web server and the browser.

All these components together make up the page response time. So it's quite possible that your website is slow because of a problem with one of the five things mentioned above (the four components plus SSL authentication).
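The phases above can be measured directly. Below is a minimal sketch using only Python's standard library, not HostTracker's implementation; the host name is a placeholder, and the SSL handshake time from the note is folded into TTFB here:

```python
import socket
import ssl
import time

def phase_report(t0, t_dns, t_connect, t_ttfb, t_done):
    # Split the total load time into the phases listed above.
    return {
        "dns": t_dns - t0,
        "connect": t_connect - t_dns,
        "ttfb": t_ttfb - t_connect,      # includes the SSL handshake noted above
        "download": t_done - t_ttfb,
    }

def measure(host):
    t0 = time.monotonic()
    ip = socket.gethostbyname(host)                        # 1. DNS lookup
    t_dns = time.monotonic()
    raw = socket.create_connection((ip, 443), timeout=10)  # 2. TCP connect
    t_connect = time.monotonic()
    ctx = ssl.create_default_context()
    with ctx.wrap_socket(raw, server_hostname=host) as conn:   # SSL handshake
        request = f"GET / HTTP/1.1\r\nHost: {host}\r\nConnection: close\r\n\r\n"
        conn.sendall(request.encode())
        conn.recv(1)                                       # 3. first byte (TTFB)
        t_ttfb = time.monotonic()
        while conn.recv(65536):                            # 4. content download
            pass
        t_done = time.monotonic()
    return phase_report(t0, t_dns, t_connect, t_ttfb, t_done)

if __name__ == "__main__":
    for phase, seconds in measure("example.com").items():  # placeholder host
        print(f"{phase:10s}{seconds:.3f}s")
```

Real monitoring tools measure each phase separately like this, which makes it easy to see whether DNS, the network, or the server itself is the bottleneck.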

Website speed optimization

Website loading speed depends on several factors, and each of them can hold things up. Let's look at the most common reasons why your website may be slow.

When something is wrong with the website itself:

  1. Third-Party Objects. Third-party plugins hosted on the page can slow down your website. Even though the most capable services, such as Google Analytics, load their scripts asynchronously to minimize the impact on performance, each third-party plugin still adds to the total number of requests being sent and therefore delays loading. Although website speed is affected by many things, it is primarily determined by the number of HTTP requests your website makes. So the golden rule of optimization is: the less weight a website has to carry, the faster it works.

  2. Media From Other Sources. The more external media files on the page, the longer your website takes to load. Such bulky content not only has a negative impact on page speed but is also one of the surest ways to make visitors leave. How can this be avoided? First, use other people's media in moderation; second, use appropriate graphics file formats; third, rely on fast local storage. If you do even a couple of these things, you'll see significant improvements in no time!

  3. Bulky Code / Inefficient SQL. Inefficient code or unoptimized database queries can seriously degrade your website's performance. Consider code optimization (trimming scripts, HTML, CSS, etc.) or database optimization (adding indexes, rewriting queries, modifying the schema). Problems with code are often the culprit dragging your website's performance down.
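To make the database side concrete, here is a small sketch using Python's built-in sqlite3 module; the table and column names are invented for illustration. Adding an index on the column you filter by turns a full-table scan into a direct lookup:

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE pages (id INTEGER PRIMARY KEY, slug TEXT, views INTEGER)")
db.executemany("INSERT INTO pages (slug, views) VALUES (?, ?)",
               [(f"post-{i}", i % 100) for i in range(10_000)])

query = "SELECT * FROM pages WHERE slug = 'post-42'"

# Without an index, SQLite has to scan the whole table for this query.
plan_before = db.execute("EXPLAIN QUERY PLAN " + query).fetchall()

# After adding an index on the filtered column, it can look the row up directly.
db.execute("CREATE INDEX idx_pages_slug ON pages (slug)")
plan_after = db.execute("EXPLAIN QUERY PLAN " + query).fetchall()

print(plan_before[-1][-1])  # a full scan of "pages"
print(plan_after[-1][-1])   # a search using idx_pages_slug
```

On a table of this size the difference is milliseconds, but on production tables with millions of rows the same change can cut query time by orders of magnitude.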

When your hosting is killing your speed:

  1. DNS. Your website's fate depends on the DNS server you choose. The faster your DNS server, the quicker your page can be reached and its content delivered.

  2. The Data Center Location. Do not neglect geography. It's important to ensure that your visitors are hitting the nearest data center. Understanding the time spent transmitting information gives you a better awareness of the user experience you're providing, because it takes time for data to travel. If the site is a global resource, it is recommended to use a CDN (Content Delivery Network): a network of globally distributed web servers used to deliver website content to local end users as fast as possible. Essentially, a CDN hosts your files across the whole server network and delivers them from the closest location. It's worth noting that in recent years the popularity of cloud hosting has skyrocketed. No surprise, as it costs less, provides more, and offers nearly unlimited flexibility.

  3. Choosing the Wrong Web Hosting Service. Sometimes the biggest problem with your website's performance is simply that it requires more resources than your web host can provide. Consider searching for a web hosting company that best suits your needs. It should go without saying: choosing a good hosting company is key to high website performance.

How to measure your website response time with HostTracker?

In the Response Time Check window, enter your URL and task name, and specify the Timeout value.

Note: Every time your speed value exceeds this threshold, you will receive a notification.
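The threshold logic itself is simple; here is a hypothetical sketch (the function and parameter names are our own, not HostTracker's API):

```python
def needs_notification(response_time_s: float, timeout_s: float) -> bool:
    # Fire a notification whenever the measured speed value exceeds the threshold.
    return response_time_s > timeout_s

# Example: with a 3-second timeout, only the slow samples trigger alerts.
samples = (0.8, 2.9, 4.2, 3.1)
alerts = [t for t in samples if needs_notification(t, 3.0)]
print(alerts)  # [4.2, 3.1]
```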

Armed with this tool, you'll always be up to date on how well your website loads: website statistics and a history of events are always available in a convenient format (see the picture above).

Hope you enjoyed this article! Remember, the hardest part of optimization is often simply getting started.

HostTracker under Azure

Those who are actively involved with the Web should know HostTracker, a company from Ukraine that has been running one of the leading global web monitoring services since 2004. Its goal is to monitor site health and accessibility in near real time. Using its alert messaging system, HostTracker helps reduce downtime, improve quality of service for users, quickly localize problems, and more.

Architecturally, HostTracker consists of a server-based hub, acting as both data collector and control center, and a series of software agents launched in various regions, typically on equipment operated by major providers, hosters and affiliates. The geographically distributed architecture provides overall system reliability and also allows collecting data on access speed, bandwidth and other key performance characteristics at the regional level, a critically important feature for international business.

The first version of HostTracker, which is still functioning and serving tens of thousands of customers, was Linux-based. Today it runs on nine control servers, organized in two data centers on a colocation basis, plus a few dozen agents. Considering that the ultimate objective of web monitoring is to increase the uptime of clients' web resources (95% of HostTracker customers were able to raise theirs to 99%), performance and accessibility of the service itself are not just critical but fundamental parameters that influence the whole business. Ideally, HostTracker should demonstrate accessibility close to 100%. However, the service's extensive growth made this hard to achieve.

HostTracker faced constantly increasing network traffic, a problem for the seamless operation of the service. The inability to add new control servers on the fly and the difficulty of maintaining heterogeneous hardware of varying ages were further limiting factors. Moreover, the desire to extend the service with wider protocol and network service support ran into obstacles. “Unfortunately, for Linux there was a limited choice of ready-to-use solutions and libraries, while inventing something completely new was difficult,” says Artem Prisyazhnyuk, HostTracker's director. “We had the idea of replacing our technology stack with a more sophisticated one, and after taking a closer look at the .NET platform and its potential in terms of scalability and network support, I realized it was exactly what we had been looking for.”

It was clear that migrating to a completely different platform would be a complex task: the project stretched over three years. However, it proved a blessing in disguise: during this period cloud computing emerged, which seemed an ideal tool both for solving the scalability problem and for setting aside one's own infrastructure entirely. Besides, the PaaS model removed most of the administration effort and allowed controlling the application as a self-contained entity, to the point of complete automation; thus Windows Azure in fact had no alternatives.

As a result, the second version of HostTracker, whose commercial operation started in May 2012, already runs on Windows Azure. Its central component is implemented as a Web Role backed by SQL Azure Database; it provides the external portal, analytics and report generation, and control of the monitoring applications. The latter run as Worker Role instances, which also use SQL Azure Database to store their data and provide service scalability depending on network load. The agents function as before, and transferring them to Windows Azure is under consideration.

HostTracker now monitors specific ports using the HTTP/HTTPS and ICMP protocols, supporting various methods (HEAD/POST/GET), and more.
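As an illustration of such a check, here is a sketch using Python's standard library (not HostTracker's actual agent code; the URL below is a placeholder):

```python
import urllib.request

def check(url: str, method: str = "HEAD", timeout: float = 10.0):
    # Probe a URL with the given HTTP method; return (is_up, status_or_error).
    req = urllib.request.Request(url, method=method)
    try:
        with urllib.request.urlopen(req, timeout=timeout) as resp:
            return resp.status < 400, resp.status
    except Exception as exc:
        return False, type(exc).__name__

# A host that cannot be resolved counts as down.
print(check("http://unreachable.invalid/", timeout=2))
```

A real agent would run such probes on a schedule from several regions and report the results back to the central hub.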
[Figure: HostTracker instant check]
Alarm reporting is available via email, SMS and instant messages. Customers can receive reports with statistics about the resources being monitored and their performance. Setting up monitoring for five sites takes as little as six minutes, while the average response time in case of failure is a couple of minutes, with another 1-3 minutes to inform the customer about the problem. Using this service, anyone can check any site, including its accessibility from various regions.

As a result, while the transfer to the .NET platform itself gave HostTracker the potential to modernize, optimize the application architecture and implement new internal functions, the migration to the cloud made it possible to drop less important though time-consuming activities such as administering the solution and, above all, to reach the necessary performance indicators. Microsoft declares 99.9% accessibility for all basic Windows Azure services and guarantees monthly refunds should this indicator fall lower. This creates firm ground for operating services like HostTracker, since accessibility is the most critical parameter for such applications. Using the cloud infrastructure also provides better protection for the service: unauthorized access to the application and many types of attacks are effectively excluded, while data safety is ensured by triple replication.

HostTracker gained another advantage from abandoning its own infrastructure. The service's performance characteristics are also rather critical, for they directly affect the operation of the failure reporting system. In this respect, Windows Azure is a virtually inexhaustible source of computing power. This means that by starting additional monitoring instances in time, HostTracker's operating parameters can be kept at the necessary level. Moreover, the cloud environment is exactly what is needed to make this process almost fully automatic, eliminating the need for direct control.
