Web Hosting - Managing Disk Space

Few things are less exciting than managing disk space, which always seems to be in too short a supply. But few things are more important to the health and well-being of your site.

The most obvious aspect of managing disk space is having enough of it. If you have only a few dozen web pages, that's not an issue. But as the amount of information (web pages, database content and more) grows, the quantity of free space shrinks. That matters for two reasons, both tied to how drives store data.

All permanent information on a computer is stored on hard drives; temporary information is often stored only in memory. The two components are completely separate, though they are sometimes confused with one another. As the amount of free space on the hard drive decreases, several effects occur. Here's one way to picture them.

Imagine you have a table of a certain area and you lay playing cards out on it. At first you lay them out in order: the 2 next to the 3, then the 4, and so on. Then you pick up one or two cards from the middle and discard them, and add some more. Pretty soon the layout looks random. Now cover the cards with a big opaque sheet of paper. You still want the cards to appear in order when shown to someone. A special robot could be designed to always pick up the cards from underneath the sheet in order. Or it could slide a hole in the sheet over the cards to display them in the correct order (2, 3, 4, ...), no matter what order they are really in. That's similar to how the operating system always shows you information in a sensible way, even though it's actually stored in scattered pieces.

Why should you care? Real files are stored in pieces scattered around the drive wherever there is room for them. The more free space there is, the quicker the operating system can find a place to store a new piece. That means deleting the junk you no longer need (and freeing up more space) actually makes the system run quicker: it creates space you might need, and it lets the operating system store files for you faster.

But there's a second effect. As you delete old files or change them, the pieces get more and more scattered. It takes the 'robot' longer and longer to fetch or display the 'cards' in order. Existing files are fetched and assembled 'on the fly' (say, when you request a graphical page or a list of names), and it takes longer to put a web page together when its pieces are widely scattered.

So the other aspect of managing disk space is keeping the pieces of your files more or less in order. A utility that does that is called a 'de-fragger', or de-fragmentation program. You can ask a system administrator to run it, or, if you have the authority, run it yourself. That keeps the 'cards' in order and allows quicker access to them.

Managing disk space, then, chiefly involves three things: (1) keeping enough space to store what you need to store, (2) keeping enough free space to make new file storage quick, and (3) making old file retrieval fast by keeping things orderly. When only a few files are involved, the benefit isn't worth the effort. But as the number and size of files grow, to thousands of files or several gigabytes of data, the effect becomes noticeable, and keeping things organized makes a significant difference in performance.

Much of this can be automated using utilities. Some will delete files in a certain folder that are older than a certain date; a minimal sketch of that idea follows.
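The sketch below shows one way such a cleanup utility could work, assuming a Unix-like host. The folder path and the 30-day threshold are purely illustrative; substitute your own site's scratch directory and retention policy.

```python
import time
from pathlib import Path

# Illustrative values -- adjust to your own site's layout and policy.
TARGET_DIR = Path("/var/www/tmp")   # hypothetical scratch folder
MAX_AGE_DAYS = 30                   # anything older than this gets deleted

def delete_old_files(directory: Path, max_age_days: int) -> None:
    """Remove regular files whose last-modified time is past the cutoff."""
    cutoff = time.time() - max_age_days * 86400  # 86400 seconds per day
    for entry in directory.iterdir():
        if entry.is_file() and entry.stat().st_mtime < cutoff:
            print(f"Deleting {entry}")
            entry.unlink()

if __name__ == "__main__":
    delete_old_files(TARGET_DIR, MAX_AGE_DAYS)
```

Run something like this on a schedule, and the 'junk' never piles up in the first place.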
A de-fragger, likewise, can be set to run automatically during times of light usage, or quietly in the background at all times; one way a script might approximate 'light usage' is sketched below. Discuss the options with your system administrator and help him or her do the job better by keeping your house in order. You'll benefit from a better-performing web site.
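This sketch assumes a Linux host with an ext4 filesystem, where e4defrag is the usual defragmentation tool; the directory, the load threshold, and the decision to gate on the 15-minute load average are all illustrative assumptions. On shared hosting, you would normally leave this to the administrator rather than run it yourself.

```python
import os
import subprocess

# Illustrative threshold: only defragment when the 15-minute load
# average suggests the server is quiet.
LOAD_THRESHOLD = 0.5
TARGET_PATH = "/var/www"  # hypothetical directory to defragment

def defrag_if_idle() -> None:
    """Run e4defrag only when recent load is below the threshold."""
    load_15min = os.getloadavg()[2]  # (1, 5, 15)-minute load averages
    if load_15min < LOAD_THRESHOLD:
        subprocess.run(["e4defrag", TARGET_PATH], check=True)
    else:
        print(f"Server busy (load {load_15min:.2f}); skipping defrag")

if __name__ == "__main__":
    defrag_if_idle()
```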

Web Hosting - Bandwidth and Server Load: What's That?

Two key performance metrics will impact every web site owner sooner or later: bandwidth and server load.

Bandwidth is the amount of network capacity available, and the term actually covers two different things. 'Bandwidth' can mean the measure of network capacity for web traffic back and forth at a given moment. Or it is sometimes used to mean the amount of transfer allowed over some interval, such as one month. Both are important. As files are transferred, emails sent and received, and web pages accessed, network bandwidth is being used. If you want to send water through a pipe, you have to have a pipe; pipes vary in size, and the amount of water going through one at any time can also vary.

Total monthly bandwidth is a cap that hosting companies place on sites in order to share a limited resource fairly. (As a rough illustration, a site serving 10,000 page views a month at 500 KB per page transfers about 5 GB.) Companies monitor sites in order to keep one site from accidentally or deliberately consuming all the network capacity. Similar considerations apply to instantaneous bandwidth, though companies usually have such large network 'pipes' that heavy use by one user is much less often a problem.

Server load is a more generic concept. In more technical discussions, it often refers solely to CPU utilization. The CPU (central processing unit) is the component in a computer that processes instructions from programs: directing how memory is used, moving files from one place to another, and more. Every function you perform consumes some CPU, and its role is so central (hence the name) that the term has come to be used as a synonym for the computer itself. People point to the case and say 'that is the CPU'. But the computer also has memory, disk drive(s) and several other components required to do its job. In more general usage, server load refers to the combined use of all those components. Disk drives are busy fetching files, in pieces, which are then assembled in memory and presented on the monitor, all under instructions managed by the CPU. Memory capacity is limited, and it's often the case that not all programs can use as much as they need at the same time; special operating system routines control who gets how much, when, and for how long, sharing the total 'pool' among competing processes. So how 'loaded' the server is at any given time, or over time, is a matter of how heavily used any one, or all, of these components are.

Why should you care? Because every web site owner will sooner or later want to understand why a server has become slow or unresponsive, and to optimize their use of it. When you share a server with other sites, which is extremely common, the traffic those other sites receive creates load on the server that can affect your site. There's a limited amount you can do to influence that situation. But if you're aware of it, you can ask the company to move you to a less heavily loaded server; or, if another site (into which you generally have no visibility) is misbehaving, it may be possible to get it moved or banned. When you have a dedicated server, you have much more control over load issues: you can optimize your own site's HTML pages and programs, tune a database, and carry out other activities that maximize throughput. Your users will see that as quicker page loads and a more enjoyable experience. A simple way to peek at these numbers yourself is sketched below.
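If you have shell access to a Unix-like server, a one-shot snapshot of the components described above takes only the Python standard library. The /proc/meminfo parsing is Linux-specific, and the '/' mount point is just an example; on a real host you would point it at whatever filesystem holds your site.

```python
import os
import shutil

def load_snapshot() -> None:
    """Print a one-shot view of CPU load, memory, and disk usage."""
    # CPU: load averages over the last 1, 5, and 15 minutes.
    one, five, fifteen = os.getloadavg()
    print(f"Load average: {one:.2f} {five:.2f} {fifteen:.2f}")

    # Memory: parse /proc/meminfo (Linux-specific).
    meminfo = {}
    with open("/proc/meminfo") as f:
        for line in f:
            key, value = line.split(":")
            meminfo[key] = int(value.split()[0])  # values are in kB
    total_mb = meminfo["MemTotal"] // 1024
    avail_mb = meminfo["MemAvailable"] // 1024
    print(f"Memory: {avail_mb} MB available of {total_mb} MB")

    # Disk: usage of the root filesystem ('/' is just an example).
    usage = shutil.disk_usage("/")
    pct = usage.used / usage.total * 100
    print(f"Disk: {pct:.0f}% used of {usage.total // 2**30} GB")

if __name__ == "__main__":
    load_snapshot()
```

A load average persistently at or above the number of CPUs is the classic sign that requests are queueing rather than being served immediately.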

Web Hosting - Sharing A Server: Things To Think About

You can often get a substantial discount on web hosting fees by sharing a server with other sites. Or you may have multiple sites of your own on the same system. But just as sharing a house has benefits and drawbacks, so does sharing a server.

The first consideration is availability. Shared servers get rebooted more often than stand-alone systems, and that can happen for multiple reasons. Another site's software may cause a problem, or make a change that requires a reboot. While that's less common on Unix-based systems than on Windows, it still happens. Be prepared for more scheduled and unplanned outages when you share a server.

Load is the next, and more obvious, issue. A single pickup truck can only haul so much weight; if the truck is already half-loaded with someone else's rocks, it won't haul yours as easily. Most web sites are fairly static: a reader hits a page, then spends some time skimming it before loading another. During that time the server has capacity to satisfy other requests without affecting you. All the shared resources (CPU, memory, disks, network and other components) can easily handle multiple users, up to a point.

But all servers have inherent capacity limits. The component that processes software instructions (the CPU) can only do so much. Most large servers have more than one (some as many as 16), but there are still limits to what they can do. The more requests they receive, the busier they are, and at a certain point your software request (such as accessing a web page) has to wait a bit.

Memory on a server works in a similar way. It's a shared resource, and there is only so much of it. As it gets used up, the system lets one process use some, then another, in turn. Sharing that resource causes delays, and the more requests there are, the longer the delays. You may experience that as waiting for a page to appear in the browser or for a file to download.

Bottlenecks can also appear in places outside, but connected to, the server itself. Network components are shared among multiple users along with everything else, and as with the others, the more requests there are (and the longer they tie things up), the longer the delays you notice.

The only way to get an objective look at whether a server and the connected network have enough capacity is to measure and test. All systems can report how much of each resource is being used, and most can compile that information into some form of statistical report; a minimal sketch of the idea follows. Reviewing that data allows a rational assessment of how much capacity is in use and how much remains. It also lets a knowledgeable person project how much more sharing is possible, and at what level of impact. Request that information and, if necessary, get help interpreting it. Then you can make a cost-benefit decision based on fact.
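Here is a minimal sketch of the measure-and-test idea, again assuming a Unix-like server: it samples the load average at fixed intervals and summarizes the result, so a quiet period can be compared against a busy one. The sample count and interval are illustrative; real monitoring tools collect far more, but the principle is the same.

```python
import os
import statistics
import time

SAMPLES = 60          # illustrative: one hour of data
INTERVAL_SECONDS = 60

def sample_load(samples: int, interval: int) -> list[float]:
    """Record the 1-minute load average at fixed intervals."""
    readings = []
    for _ in range(samples):
        readings.append(os.getloadavg()[0])
        time.sleep(interval)
    return readings

if __name__ == "__main__":
    data = sample_load(SAMPLES, INTERVAL_SECONDS)
    print(f"mean load: {statistics.mean(data):.2f}")
    print(f"peak load: {max(data):.2f}")
    # Compare the peak against the number of CPUs: sustained load at
    # or above the CPU count means requests are queueing.
    print(f"cpu count: {os.cpu_count()}")
```

If the mean sits well below the CPU count and the peaks are short, there is headroom for more sharing; if the peaks are sustained, it may be time to ask about a move.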