Choosing the right Data Centre for your needs is important, yet for many people it's hard to tell the difference between one facility and the next, or to establish a facility's true calibre.
Hopefully these 10 points will offer you some foresight and help you make an informed decision.
- Location, Location, Location
- Power
- Connectivity - Independence
- Connectivity - Quality
- Knowledge and Experience
- Tier I, II, III and IV
- Security
- The Basics
- Stability
- Facilities
1. Location, Location, Location
The importance of location can't be stressed enough. The whole purpose of colocating (putting your servers into a data centre) is that you'd like them operational 24x7, so make sure your choice of Data Centre is mindfully located. Many are at flood risk or in high-crime areas; don't assume it's safe or protected. Check for yourself:
- Transport links
- Power links
- Communication links (within 300m of main road and rail networks is best)
- Environmental risks (e.g. flood, falling trees)
- Industrial risks (e.g. nearby hazardous industry)
- Flight paths
- Nearby road and rail (e.g. a road or rail crash)
- Local crime (Would you leave a laptop in your car there?)
All small risks, but why compromise?
2. Power
Data Centres need lots of reliable power. A UPS can provide limited protection for short outages, and you'd expect a generator for longer ones.
However, other important factors are often overlooked. Do they have enough fuel on site? How often are these systems tested and checked?
Many generators can only run at full load for four-hour periods, or rely on refuelling contracts, which offer no help when there's 30cm of snow on the ground.
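As a rough sanity check when asking about on-site fuel, you can estimate runtime from tank size and the generator's consumption at load. The figures below are purely illustrative, not from any specific facility, and the linear-scaling assumption is a simplification of the real consumption curve in a generator's data sheet:

```python
def generator_runtime_hours(tank_litres: float,
                            litres_per_hour_full_load: float,
                            load_fraction: float = 1.0) -> float:
    """Estimate generator runtime from on-site fuel.

    Assumes consumption scales roughly linearly with load, which is
    a simplification; real figures come from the generator data sheet.
    """
    if not 0 < load_fraction <= 1:
        raise ValueError("load_fraction must be in (0, 1]")
    return tank_litres / (litres_per_hour_full_load * load_fraction)

# Illustrative: a 2,000-litre tank feeding a set that burns 100 l/h
# at full load lasts 20 hours flat out, or 40 hours at half load.
print(generator_runtime_hours(2000, 100))       # 20.0
print(generator_runtime_hours(2000, 100, 0.5))  # 40.0
```

If the answer you get back is measured in hours rather than days, ask about the refuelling arrangements too.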
3. Connectivity - Independence
By choosing a Carrier Neutral facility, if your IP needs exceed what your provider can offer (in capacity, quality or reliability), you can choose another. You can also opt to install your own connectivity. This means you can migrate between ISPs without physically moving your servers, or connect your office to the Data Centre.
4. Connectivity - Quality
If you're not in a carrier neutral facility you won't have a choice; however, it's still important to ensure your provider has truly diverse connections. Many facilities use multiple BT tails to connect to the regional Tier 1 POP, so the BT tails become the single point of failure. If they do have diverse connections, ensure they're diversely routed.
5. Knowledge and Experience
Many companies have set up data centres in recent years looking to profit, but even with an unlimited budget, only those with both knowledge and experience can offer a truly reliable service. Having 2(N+1) is great, but only if it's understood, maintained and operated correctly. Ensure your chosen facility has knowledgeable on-site staff.
6. Tier I, II, III, IV
The industry-recognised Tiers describe the weakest element in a data centre.
- Tier I - no redundancy, maybe not even a generator!
- Tier II - some redundancy, but a single failure could affect the IT service; faults and maintenance would mean an impact
- Tier III - almost full redundancy; any component can be safely removed for maintenance, but a single failure could still impact the IT load
- Tier IV - full redundancy with continuous cooling (even while the generator starts); no single point of failure at all
The Uptime Institute created the Tiers and charges to assess Data Centres; many Data Centres opt not to be assessed but quote their Tier based on the published standards. We encourage you to ask questions yourself and understand how a significant component failure would impact the IT load.
7. Security
How secure is the site? Look at physical security as a whole: location, CCTV, staffing, external lighting, passing traffic, access control and local crime.
8. The Basics
Do they have the basics?
- Dry Fire Suppression
- 24x7 On site Staff
- 24x7 Access (no charge)
- 24x7 reboots (no charge)
- 24x7 IT staff
- ISO9001 and ISO27001
- Access control
9. Stability
Are they stable? We're talking finances. How long have they been trading? How busy are they? Data centres cost millions to build and aren't profitable while they're empty. Be careful when considering a new supplier in a new facility.
10. Facilities
Hopefully you'll only have to visit your chosen Data Centre every 2-3 years; however, if you do have to pull an all-nighter or spend 4-5 days there, you'll want to consider:
- Food, vending machine or cafe
- Build room
- Showers and Toilets
- Local Hotel
- Secure parking
- Free tea and coffee
- Rest area / break room
Colocation and Data Centre Glossary
If you're new to the world of 24x7 IT, or just don't know your FM200 from your R407C, we've created a guide to help you turn confusion into knowledge. Additions and amendments welcomed.
- Server Size
- UPS
- Run Time and Autonomy
- Generator
- PUE
- Close Control
- Floor Loading
- FM200
- VESDA
- Access Control
- Biometric Security
- N+1
- N+N
- S+S
- Single Point of Failure
- Hot Aisle
- Cold Aisle
- Hot or Cold Aisle Containment
- Carrier Neutral
- Shared Racks
- Service Contention
- Dedicated Servers
Server Size
Most data centres have racks designed to take high-density 19" rack-mounted computer systems. We sell space by the cell, offering 47, 22, 15 and 11 "U" cells. All server manufacturers specifically design servers for racks; these are very deep, 19" wide and slim, typically under 44mm high for a typical system with up to 3 hard disks. We can accommodate any size of hardware, be it floor-mounted, a laptop or a desktop case, but be wary of the air flow; often a 2U case will offer more flexibility and use less power than a 1U equivalent. 1U = 1 Rack Unit: 44.45mm high, 482.6mm wide and typically 900mm deep.
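The "U" arithmetic is simple enough to sketch. This uses the standard 44.45mm (1.75 inch) rack unit from the EIA-310 rack specification and the cell sizes mentioned above:

```python
RACK_UNIT_MM = 44.45  # 1U = 1.75 inches, per the EIA-310 rack standard

def cell_height_mm(units: int) -> float:
    """Internal mounting height of a cell of the given U count."""
    return units * RACK_UNIT_MM

# Mounting space for each of the cell sizes mentioned above
for cell in (47, 22, 15, 11):
    print(f"{cell}U cell: {cell_height_mm(cell):.0f}mm of mounting space")
```

A full 47U rack therefore offers just over two metres of mounting space.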
You must ensure your hardware has sufficient air flow to work in a data centre environment; rack-mount servers are designed to push air not only out of their own case, but also out of the rack itself. A desktop PC with normal fans is designed to be quiet; in a DC, sound levels are of no concern. We recommend high-speed temperature-controlled fans and rack-mountable hardware.
UPS
Uninterruptible Power Supply. Provides power to a mission-critical load in the event of a mains power failure. The UPS itself is powered by a fuel cell, battery or rotary energy store. Fuel cells are expensive, batteries are heavy and large, and rotary energy stores typically only allow for a 3-second outage. Data Centre UPSs also clean the power supply to the load. ServerHouse use batteries as they're scalable, reliable and offer a long protection period (20 minutes).
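A back-of-envelope autonomy estimate divides usable stored energy by the load. The figures here are illustrative only, and real battery runtime falls non-linearly at high discharge rates, so treat this as a sketch rather than a sizing tool:

```python
def ups_runtime_minutes(battery_wh: float, load_w: float,
                        efficiency: float = 0.9) -> float:
    """Rough UPS autonomy: usable stored energy over load.

    efficiency approximates inverter and conversion losses;
    real runtime curves come from the UPS manufacturer's tables.
    """
    return battery_wh * efficiency / load_w * 60

# Illustrative: 40kWh of battery behind a 100kW IT load
print(round(ups_runtime_minutes(40_000, 100_000)))  # ~22 minutes
```

The point of the calculation is only to check that the quoted autonomy comfortably covers the generator start-up time.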
Run Time and Autonomy
Used to describe how long a UPS or Generator can run without interruption
Generator
Engine with a large alternator attached to generate electrical current; in our case, used to power the entire building (lift, lights, computers, cooling, kettle and heating) in the event of a power outage lasting more than 20 seconds.
PUE
Power Usage Effectiveness, a ratio indicating a data centre's efficiency, calculated from the total amount of power required to service the load. A PUE of 2 would be 1kW of computing load plus 1kW of cooling/lighting. The lower the PUE, the more efficient the facility. Design PUE and operational PUE can often vary; ServerHouse has an average PUE of 1.32.
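The ratio itself is just total facility power divided by the IT load, as a quick sketch shows:

```python
def pue(total_facility_kw: float, it_load_kw: float) -> float:
    """Power Usage Effectiveness: total facility power over IT load."""
    return total_facility_kw / it_load_kw

# A facility drawing 2kW in total to deliver 1kW of IT load
print(pue(2.0, 1.0))   # 2.0 - one kW of overhead per kW of IT
# A facility drawing 1.32kW in total to deliver 1kW of IT load
print(pue(1.32, 1.0))  # 1.32 - only 0.32kW of overhead per kW of IT
```

When comparing facilities, ask whether a quoted figure is the design PUE or a measured operational average.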
Close Control
Air conditioning system which controls both the temperature and humidity. Domestic and commercial air conditioning systems only control the temperature, and often reduce the humidity. Low humidity can create static, and static is a risk to servers. ServerHouse has close control air conditioning, which ensures the humidity is well within safe levels.
Floor Loading
The maximum load the floor can support, normally measured in kN per square metre (kilonewtons per square metre).
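To check a loaded rack against a floor rating, convert its mass to a force and divide by its footprint. The numbers below are illustrative, and point loads under feet and castors are a separate question for a structural engineer:

```python
G = 9.81  # gravitational acceleration, m/s^2

def floor_load_kn_per_m2(mass_kg: float, footprint_m2: float) -> float:
    """Uniform load a rack of the given mass imposes on its footprint."""
    return mass_kg * G / 1000 / footprint_m2

# Illustrative: an 800kg loaded rack on a 0.6m x 1.2m footprint
load = floor_load_kn_per_m2(800, 0.6 * 1.2)
print(f"{load:.1f} kN/m2")  # ~10.9 kN/m2
```

Compare the result with the quoted floor rating before filling a rack with heavy kit such as UPS batteries.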
FM200
A gaseous fire suppressant which, once released, extinguishes a fire by both cooling it and interfering with its chemical reaction. It's safe to breathe, which means data centre operations can continue, even in the event of a fire.
VESDA
Very Early Smoke Detection Apparatus. Most smoke detectors rely on smoke rising to them in order to detect a fire, which means a fire could be taking hold while the smoke is still rising. VESDA actively draws air across a laser detector to reduce the time it takes to detect smoke.
Access Control
Access control systems both control access (much like a key would) and record it: if you try a door you're not allowed through, it's recorded; if you go through a door, it's recorded. And unlike a stolen key, a stolen credential can simply be disabled.
Biometric Security
Security managed by a biometric system: your fingerprint, palm scan, iris etc. are recorded and used to prove your identity.
N+1
Where N is the requirement (what the data centre needs) and +1 is the spare capacity. This may be in the form of an oversized unit, or a hot or cold spare. But be careful: an oversized unit doesn't always provide redundancy, only extra capacity. Ensure you ask any data centre to explain their N+1.
N+N
Where N is the requirement and +N is a separate and equally sized backup system. For example, two generators, each able to support the full load.
S+S
Two totally independent systems. Far superior to N+N: N+N can exist with multiple single points of failure (e.g. two UPSs both fed from the same supply), whereas S+S indicates there are none.
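The difference between capacity and redundancy can be made concrete: a set of units is only redundant if the load is still covered after losing the single largest unit. The unit sizes below are illustrative:

```python
def survives_single_failure(unit_kw: list[float], load_kw: float) -> bool:
    """True if the load is still covered after losing the largest unit."""
    if not unit_kw:
        return False
    return sum(unit_kw) - max(unit_kw) >= load_kw

# Three 500kW units on a 1,000kW load: genuine N+1, any one can fail
print(survives_single_failure([500, 500, 500], 1000))  # True
# One oversized 1,500kW unit: extra capacity, but no redundancy at all
print(survives_single_failure([1500], 1000))           # False
```

This is exactly the question to put to a facility quoting N+1: which single unit can fail while the load stays up?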
Single point of Failure
A component in any system which, should it fail, will break the entire system. No single point of failure means any component can break without affecting the overall system's performance.
Hot Aisle
The aisle between two rows of racks that is intentionally hot, as the servers exhaust their air into it.
Cold Aisle
The aisle between two rows of racks that is intentionally cold, supplying chilled air to the server intakes.
Hot or Cold Aisle Containment
Building a sealed environment around the hot or cold aisle to improve cooling efficiency, by forcing as much air as possible through the servers before it returns to the air conditioning unit.
Carrier Neutral
Many colocation providers also own the data centre and place limitations on connectivity and telecoms services. ServerHouse is carrier neutral: we don't restrict or limit who you get your connectivity from, or how. This means you can have diversity, better pricing, and change providers without moving your servers.
Shared Racks
A shared rack is a 47U rack used by multiple companies, typically to save money: by sharing space, your provider saves money. However, there are problems with this.
The first is the power supply: should another server in the rack malfunction and overload the circuit breaker, all the customers in that rack will lose power. The second is idle hands: when other occupants slide their servers in and out, or reach into the rack, there's a chance they may unintentionally disconnect your server. For these reasons ServerHouse does not provide shared rack space; we do, however, provide 11U rack cells so you can manage costs without compromising on security or reliability.
Service Contention
When your service is contended, you may be sharing your bandwidth with between 10 and 50 other clients. We don't believe in service contention, so for every Mb of IP we sell we ensure there is capacity in our network to support it. In addition, we monitor our bandwidth usage at one-minute intervals to ensure every link runs at no more than 20% utilisation.
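A utilisation ceiling like the one described above is easy to monitor: flag any one-minute sample where throughput exceeds the threshold fraction of link capacity. The sample data below is made up purely for illustration:

```python
def over_threshold(samples_mbps: list[float], link_mbps: float,
                   threshold: float = 0.20) -> list[int]:
    """Indices of one-minute samples exceeding the utilisation threshold."""
    limit = link_mbps * threshold
    return [i for i, s in enumerate(samples_mbps) if s > limit]

# Illustrative: a 1Gb/s link sampled each minute; the 20% limit is 200Mb/s
print(over_threshold([120, 180, 250, 90], 1000))  # [2] - minute 2 breached it
```

A breached sample is a prompt to add capacity, well before users notice congestion.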
Dedicated Servers
A dedicated server is an item of hardware supplied by server hosting companies to simplify the colocation process: typically, the hosting company purchases the server on the client's behalf and maintains its hardware for the duration of the contract. ServerHouse doesn't offer this service because it's expensive for the client: while set-up costs are lower, after 6-8 months it's more expensive, and after 2-3 years the client ends up paying a huge premium for outdated hardware. We find customers get better value and service by providing the hardware themselves.
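That cost argument can be sketched as a simple cumulative comparison. The prices below are hypothetical, chosen purely to show the break-even calculation:

```python
def break_even_month(colo_setup: float, colo_monthly: float,
                     dedicated_setup: float, dedicated_monthly: float,
                     horizon: int = 60):
    """First month at which colocation's cumulative cost drops below
    a dedicated server's, or None if it never does within the horizon."""
    for month in range(1, horizon + 1):
        colo = colo_setup + colo_monthly * month
        dedicated = dedicated_setup + dedicated_monthly * month
        if colo < dedicated:
            return month
    return None

# Hypothetical: buy a £1,500 server and pay £80/month to colocate it,
# versus nothing up front but £300/month for a dedicated server.
print(break_even_month(1500, 80, 0, 300))  # 7 - colo is cheaper from month 7
```

Run your own quotes through the same comparison before committing to a multi-year term.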