Database Monitoring
Database monitoring - a check of database access and regular execution of a specified query.

The database monitoring feature allows you to run a query during every check and react appropriately to the result. It is also possible to check only whether a connection to the database can be established, by leaving the query field empty. To set up the monitoring, fill in the connection data: DB server address, port, database name, and the login and password of the user used for the connection. We strongly recommend creating a new user with limited rights, but do not forget to grant it enough rights to perform the intended actions. It is also necessary to add the HostTracker server addresses to the whitelists of your firewall or other blocking software to allow access. The addresses are permanent and are listed on the same form.
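
For illustration, a minimal connectivity-only check might look like the sketch below. It assumes a MySQL database and the third-party pymysql driver; the host name, port and credentials are placeholders, not HostTracker's actual implementation.

    # Connectivity-only check: no query, just verify that the database accepts
    # a connection from a dedicated, limited-rights monitoring user.
    import pymysql

    def check_db_connection(host, port, database, user, password, timeout=10):
        """Return True if a connection can be established, False otherwise."""
        try:
            conn = pymysql.connect(
                host=host,
                port=port,
                user=user,              # a dedicated user with limited rights
                password=password,
                database=database,
                connect_timeout=timeout,
            )
            conn.close()
            return True
        except pymysql.MySQLError:
            return False

    if __name__ == "__main__":
        print(check_db_connection("db.example.com", 3306, "shop", "monitor", "secret"))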


The query can be arbitrary: SELECT, UPDATE, DELETE, INSERT, execution of stored procedures (for example, as a scheduler), result comparisons, logical operations. The only restriction is execution time: it must not take longer than 30 seconds, otherwise a timeout error is reported.

It is recommended to write queries that return the necessary value in the first row of the first column, so that this result can be analysed. For UPDATE, INSERT and DELETE queries, the number of affected rows is analysed. The resulting value can be analysed in different ways by comparing it with a preset: equal / not equal / higher / lower / in range. If the condition is not satisfied, the database cannot be reached, or the query times out, an error is reported.
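
To make the comparison rules concrete, here is a hedged sketch of how such a result could be evaluated against a preset. It reuses a pymysql connection like the one in the sketch above; the function and condition names are illustrative and mirror the presets listed in the text, not HostTracker's own code.

    def evaluate_check(conn, query, condition, expected, upper=None):
        """Run the check query and compare its result with the preset."""
        with conn.cursor() as cur:
            cur.execute(query)          # must finish within the 30-second limit
            if cur.description:         # SELECT-like query: first row, first column
                row = cur.fetchone()
                value = row[0] if row else None
            else:                       # UPDATE/INSERT/DELETE: affected row count
                value = cur.rowcount

        if condition == "equal":
            return value == expected
        if condition == "not equal":
            return value != expected
        if condition == "higher":
            return value > expected
        if condition == "lower":
            return value < expected
        if condition == "in range":
            return expected <= value <= upper
        raise ValueError("unknown condition: " + condition)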

Shellshock vulnerability online check

In light of the recently discovered Shellshock vulnerability, HostTracker has created a tool for testing it.

Check your server for vulnerability

How does it work?

It is developed for a Linux server with a web server installed on it. The algorithm is very simple. We sequentially generate four HTTP requests:

  • 1. An ordinary request.
  • 2. A request that tries, using the vulnerability, to set a "harmful" cookie which causes a 2-second delay in the response to our special HTTP request.
  • 3. A request that tries, using the vulnerability, to set a "harmful" cookie which causes a 4-second delay in the response to our special HTTP request.
  • 4. The same as #3.

How to understand the result?

We compare the response times of all four requests. Three situations are possible (a minimal sketch of the timing comparison follows this list):

  • 1. Vulnerability found. We can affirm this if the difference in response time is about 2 seconds between the request without a cookie and the request with the 2-second-delay cookie, and likewise between the requests with the 2-second and 4-second-delay cookies. This means our requests were able to use the vulnerability and set those cookies.
  • 2. Vulnerability not found. All the requests have about the same response time. The cookies were most likely not set because there is no vulnerability.
  • 3. Uncertain situation. If the response times vary widely and do not match the delays set by the cookies, we cannot say for sure. This can happen when the server is under high load. To check for this, we use two requests with the same cookie (#3 and #4). If the response time for these two identical checks differs, we conclude that the response time is not determined by the cookies, or at least not only by them, so in this case our method cannot detect the vulnerability.
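
For illustration only, here is a minimal Python sketch of this timing comparison, assuming the third-party requests library. It collapses the cookie-based steps into a single timing probe per request and carries the payload in the User-Agent header (an assumption; HostTracker's tool works through a cookie). The URL and thresholds are placeholders. Run it only against servers you administer and are authorized to test.

    import time
    import requests

    PAYLOAD = "() {{ :; }}; sleep {delay}"     # classic Shellshock probe

    def timed_request(url, delay=None):
        headers = {}
        if delay is not None:
            # A vulnerable CGI script passes this header to bash as an
            # environment variable, so the sleep actually executes.
            headers["User-Agent"] = PAYLOAD.format(delay=delay)
        start = time.monotonic()
        requests.get(url, headers=headers, timeout=30)
        return time.monotonic() - start

    def check(url):
        t_plain = timed_request(url)            # 1: ordinary request
        t_2s    = timed_request(url, delay=2)   # 2: 2-second payload
        t_4s_a  = timed_request(url, delay=4)   # 3: 4-second payload
        t_4s_b  = timed_request(url, delay=4)   # 4: same as #3

        if abs(t_4s_a - t_4s_b) > 1:            # same payload, different timing
            return "uncertain (server load?)"
        if t_2s - t_plain > 1.5 and t_4s_a - t_2s > 1.5:
            return "vulnerability found"
        return "vulnerability not found"

    if __name__ == "__main__":
        print(check("http://www.example.com/cgi-bin/status"))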

Safety of checks

Our test cannot damage your server. The only risk is the appearance of one extra cookie, which is used only for our requests and cannot affect the normal workflow of your site.

Improve Site Uptime

It is rather strange, but instead of going down outright, web services usually become so overloaded with user requests that they slow down and eventually stop responding. Nowadays it is extremely important to provide secure, high-quality web hosting that is available at any time, as e-business keeps gaining popularity. Website owners demand perfect service, 100% uptime and quality assurance. Numerous techniques may be used to make access to a website smooth, thereby increasing uptime.

Here we offer twelve methods for improving performance and increasing uptime. They require both software and hardware optimization. Many software features can be improved through better general coding standards applied by the website manager, while the company providing the hosting services needs to improve the hardware constantly.
You can measure the availability of your website using the HostTracker monitoring service.

Software Optimization
The following are the first six ways to improve uptime:

  • Split the databases
  • Separate the read and write databases
  • Use popular content caching more often and improve its quality
  • Optimize static content
  • Ensure compressed delivery of content
  • Optimize the content management system


To begin with, splitting the databases – horizontally, vertically, or a combination of both – is essential and makes connections more reliable. It is also good to separate the read and write databases, which enables a master/slave setup. These actions help you extend the database infrastructure for future growth.
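
As a rough illustration of read/write separation at the application level, the sketch below routes writes to a master and reads to a replica. It assumes pymysql; the host names, credentials and routing rule are placeholders rather than a recommendation for any particular database.

    import pymysql

    master  = pymysql.connect(host="db-master.example.com",  user="app",
                              password="secret", database="shop")
    replica = pymysql.connect(host="db-replica.example.com", user="app",
                              password="secret", database="shop")

    def run(query, params=None):
        """Send writes to the master and reads to the read-only replica."""
        is_write = query.lstrip().split(None, 1)[0].upper() in {"INSERT", "UPDATE", "DELETE"}
        conn = master if is_write else replica
        with conn.cursor() as cur:
            cur.execute(query, params)
            if is_write:
                conn.commit()
                return cur.rowcount
            return cur.fetchall()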

Second, when you configure the system to cache popular content better and use that cache more often, your site will scale more easily as the number of users grows. Web caching is no different from caching on a computer: popular content is stored in a separate container, allowing users much quicker access to the information.
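
As a minimal illustration of the idea, the sketch below caches rendered pages in the application process using only the Python standard library; a production setup would more likely rely on a dedicated cache such as memcached or Redis, but the principle is the same.

    from functools import lru_cache

    def fetch_and_render(slug):
        # Placeholder for the expensive part: database queries plus rendering.
        return "<html>page for %s</html>" % slug

    @lru_cache(maxsize=1024)        # keep the 1024 most requested pages in memory
    def render_page(slug):
        return fetch_and_render(slug)

    print(render_page("home"))      # first call does the work
    print(render_page("home"))      # second call is served from the cache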

Optimizing static content is one more way to speed up access to pages and files. One way to do this is to compress images as much as possible while, of course, preserving their quality. It is also worth checking whether the web server can deliver compressed content; this does not apply to the images themselves, since they are already compressed files. Take care to get all of these settings right from the beginning.
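
One quick way to verify compressed delivery is to request a static file and inspect the Content-Encoding response header. The sketch below assumes the third-party requests library; the URL is a placeholder.

    import requests

    resp = requests.get(
        "https://www.example.com/styles/main.css",
        headers={"Accept-Encoding": "gzip, br"},     # advertise compression support
    )
    # "gzip" or "br" means the server compressed the response; an absent
    # header means the file was sent uncompressed.
    print(resp.headers.get("Content-Encoding", "not compressed"))
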
One more useful step is to improve the content management system so that it makes fewer database calls for each page request. As with any kind of connection, the less data has to be sent back and forth, the easier the connection is to maintain. The number of database calls per page should therefore be as low as possible, ensuring that users can access the content at the greatest possible speed.

Software and Hardware Optimization
The next six methods for increasing uptime are:

  • Use content delivery networks
  • Use emerging standards, such as HTML5
  • Improve the programming techniques
  • Add the content “expires” headers
  • Lessen the number of HTTP requests
  • Use Ethernet connections, allowing for more speed


Content delivery networks allow you to handle larger amounts of media while improving the performance of the site. They are designed to direct traffic over private networks. Such services handle large media files by routing traffic along the edge of the Internet rather than directly, averting extreme overloading. Because content delivery networks take on the large files, the servers are relieved, providing fast, high-quality connections.

Emerging standards such as HTML5 include mechanisms that improve websites through advanced programming techniques aimed at web and Internet communications. Such standards do not guarantee 100% that a website using them will never go down, but the built-in mechanisms of the code act as a fallback when it happens. You should also apply improved programming practices when dealing with heavy loads and traffic spikes.

"Expires" content headers make automatically downloaded files cacheable for visitors. By adding these headers to your content, you prevent the browser from re-downloading the same files over and over and eliminate pointless HTTP requests on repeat page views. As with reducing the number of database calls, reducing the number of HTTP requests keeps the connection fast and not overloaded.
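
As a rough illustration, here is a tiny WSGI handler that attaches Cache-Control and Expires headers to a static response, using only the Python standard library; the one-week lifetime and the asset body are placeholders.

    from datetime import datetime, timedelta, timezone
    from wsgiref.handlers import format_date_time
    from wsgiref.simple_server import make_server

    def app(environ, start_response):
        body = b"/* static asset body */"
        expires = datetime.now(timezone.utc) + timedelta(days=7)
        headers = [
            ("Content-Type", "text/css"),
            ("Cache-Control", "public, max-age=604800"),          # one week, in seconds
            ("Expires", format_date_time(expires.timestamp())),   # RFC 1123 date
        ]
        start_response("200 OK", headers)
        return [body]

    if __name__ == "__main__":
        make_server("", 8000, app).serve_forever()
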
The last method is to increase Ethernet connection speeds. This allows the server to cope with larger files and unexpected traffic spikes. Many hosting providers consider it an excellent investment.

Speed improvements and downtime reduction make customers happier. Many of these methods for improving web server or website uptime can be implemented in a few minor steps. Such tactics, including database rearrangement, software optimization, caching improvements, content compression, the use of content delivery networks and content management systems, better programming practices and hardware upgrades, will improve both web host and website uptime, and your business along with it.