DataHive – a World Class Data Centre

Delivering the utmost in data centre value…

DataHive provides the most efficient and cost-effective way to manage your IT solutions. Reduce costs and increase performance in our highly secure, world-class data centre, and deliver significant savings to your infrastructure budget. We offer everything from a single 1U colocation space up to a complete private suite in our fully managed data centre.

Data centre space is AVAILABLE NOW! DataHive offers secure, affordable data centre space featuring colocation, managed services, virtualization, and private cloud computing. Our data centre is located in downtown Calgary, Alberta, Canada — one of the most geographically safe locales in the world.

Whatever your data centre, colocation, virtualization or managed services requirements, DataHive can provide you with a secure, cost-effective data centre space solution that enables you to concentrate on your core business, without the responsibility and daily issues that come with running a data centre.

Our space can be easily tailored to suit your individual needs: private, self-contained secure data suites, shared data suites, caged data suites, and colocation starting from a single 1U.

We all want more for less so don’t delay. Contact us today. We have quality space and facilities waiting for your equipment and systems right now.

DataHive can also provide managed and technical services, private cloud computing and virtualization and offer assistance in setting up network security, backup, business continuity and disaster recovery. We have solutions to fit all budgets.

Whatever your industry, size or location let us know your requirements and we will find you a solution to suit your needs and your budget.

Historic Floods of 2013

When I left DataHive on June 20 to go home, the weather was rainy but I did not anticipate the severity of what was to come in the next 12 to 24 hours. I suppose nobody did.

Less than three hours after I’d left the downtown core, I got a call from friends who live in Bowness asking if they could stay with us for a day or so as they were being evacuated. That was my first clue that things were going to get hairy.

By the time I turned on the morning news on Friday at 6AM, it was clear we were going to be in for big trouble. All transit into the core was suspended, and Mayor Nenshi was requesting that everybody stay at home. A mere 6 hours later, downtown was being evacuated. Ultimately, approximately 100,000 residents and the entire core were being moved out of the way of record high waters. It was alarming to watch the devastating effects of the flood as they unfolded live on TV.

I was constantly monitoring our systems, knowing that if the water reached our building and/or the City of Calgary elected to cut the power to our part of the city (as they were doing throughout the flood zone), we’d have to think and act very quickly. The network was holding steady. Our providers were keeping the traffic flowing, which didn’t really surprise me. As long as we were able to maintain power to our routers, I was fairly certain that the internet would work as it was designed to. Traffic would be able to find at least one workable route.

Power was the real concern. We have a UPS in place, and we have 72 hours of fuel for the generator, meaning the immediate impact of the grid being cut would be zero. My concern was what would happen after 72 hours, or if the flood waters reached the room where the generator is kept. The authorities certainly were not going to let a loaded fuel truck (or any traffic, for that matter) into the red zone. If they cut the power on Friday afternoon, we’d be down by Monday afternoon. The best-case scenario was to orchestrate an organized, planned shutdown of the data centre and then wait it out as the disaster was dealt with.
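The Friday-to-Monday arithmetic above is just the 72-hour fuel runtime counted forward from a grid cut. As a minimal sketch, here is that calculation in Python; the 72-hour figure comes from the text, while the specific cut-over time is a hypothetical example:

```python
from datetime import datetime, timedelta

# Roughly 72 hours of generator fuel on site (figure from the text).
FUEL_RUNTIME_HOURS = 72

def outage_deadline(grid_cut_at: datetime) -> datetime:
    """Return the latest moment the generator can keep the facility up
    before the on-site fuel is exhausted."""
    return grid_cut_at + timedelta(hours=FUEL_RUNTIME_HOURS)

# Hypothetical grid cut: Friday, June 21, 2013 at 3 PM.
cut = datetime(2013, 6, 21, 15, 0)
deadline = outage_deadline(cut)
print(deadline.strftime("%A %I:%M %p"))  # Monday 03:00 PM
```

A cut on Friday afternoon therefore exhausts the fuel by Monday afternoon, which is exactly the window that made an organized shutdown the fallback plan.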

The logical thing to do was to contact our clients and let them know that – as of that moment – DataHive was not being affected. We said we’d keep them advised as the situation changed, and that we had a plan in place to deal with any eventuality. And then…


The water kept coming. The Bow River rose to 3 or 4 times its usual flow, but 840 7th Avenue SW was a little oasis in the middle of an unprecedented natural disaster. We didn’t lose grid power for even a second. The generator did not fire up.

On Saturday, I was able to ride my bike across the Louise Bridge and get into our building to prepare a ¼ rack for a new client who had been flooded out of their existing location. Needless to say, they were pretty impressed that DataHive was able to help them out, given the state of emergency that the rest of downtown was in.

The river didn’t crest until sometime late Sunday, so we weren’t out of the woods yet, but things were looking good for our data centre. Monday morning came, and all staff were able to make their way downtown and put in a full day’s work. Much of the downtown core would not be so fortunate. Most businesses were closed until at least Wednesday, many had no choice but to stay closed for the entire week, and places like the library are still closed as far as I know.

I can’t really explain how we got so lucky. One short block to the north of us (closer to the river), utility power was cut. A few blocks to the south, power was cut. East into downtown…cut.

In a way, it doesn’t really matter why. What matters is that we now know DataHive is situated in a part of downtown Calgary that has proven to be extremely reliable, even in the face of a multi-billion-dollar natural disaster that took many data centres off-line.

Here’s hoping that our clients and friends all made it through relatively unscathed, and that we never see something like this again. I can confidently speak for everybody at DataHive when I say “we’re very grateful”.

- Iain

DataHive’s Data Centre Untouched by Calgary Flood

“Not one single piece of equipment in our data centre was affected by this ‘flood of the century’. There was no down time, no data loss, no impact upon our clients,” stated Marjorie Zingle, CEO of DataHive.

Many businesses and buildings in Calgary’s downtown core have had to be shut down — some for a few days, some for a much longer period of time.

Ms Zingle stated that this phenomenal flood has not impacted DataHive’s data centre in any way. Power and connectivity have been maintained throughout the crisis, with its generator on standby and staff members closely monitoring the situation.

“In light of the extent of the disaster, DataHive is contributing to the Red Cross,” said Ms Zingle.

As well as remaining open for business, DataHive has been able to deliver its well-known, prompt, and individualized customer service. They responded immediately to urgent requests for temporary server storage. They were able, without any delays or complications, to install and set up new client equipment and temporary storage the day after the flooding began.

DataHive is a fully redundant, N+1 data centre located in downtown Calgary. This carrier-neutral facility and its network are optimized to provide security, reliability, redundancy, performance and scalability.

The Bots are Coming! The Bots are Coming!

The Check Point 2013 Security Report surveyed almost 900 businesses in 62 countries to determine the major security risks companies are exposed to daily. The research showed, shockingly, that 63% of enterprises are infected with bots, and that more than half are infected with new malware at least once per day.

Cybercrime is ever evolving, and has reached unprecedented levels. Cybercriminals are constantly changing their techniques, and are threatening organizations of all sizes.

The top threats include botnets and risky Web applications. The surge in Web 2.0 applications has provided cybercriminals with brand new ways of infiltrating corporate networks. The survey showed 91% of enterprises use applications with potential security risks. Over half of the enterprises surveyed had experienced a data loss incident.

Being aware of all activity taking place on their networks is a major component for enterprises to develop a strong security blueprint. Security threats are constantly evolving, and enterprises must do what they can to minimize their risks.

Data Breaches Hurt Business Investments

Data breaches and hacker attacks are a major concern for consumers whose personal information has been put at risk. They are also a major concern for investors.

A recent study by Zogby Analytics shows that companies that have been a target of at least one cyberattack are viewed by possible investors with skepticism. A whopping 78.1% of investors surveyed indicated they would be very unlikely to invest in such a business.

As well, 68.7% indicated they would avoid investing in a company that has had one or more data breaches.

The study also found that more than half of investors were more concerned with the theft of customer data than corporate data.

Companies that are ill-equipped to battle cyberattacks will likely find they will suffer a significant detrimental effect on their brand when they are hit. If they don’t react to a breach with a well-thought-out plan, the detrimental effect will be even more devastating.

Hackers attack Twitter, Pinterest and Tumblr

If in doubt, change your password. Customer support tool Zendesk was hacked earlier in February, affecting Zendesk’s clients Twitter, Pinterest and Tumblr.

“We believe that the hacker downloaded email addresses of users who contacted those three customers for support, as well as support email subject lines,” explained Zendesk on their blog.

Pinterest responded by urging users to take measures to protect their accounts by using strong passwords and not sharing their passwords.

Tumblr warned its customers to review any e-mails they had shared with Zendesk to make sure no account information could be used by the hackers.

TLS protocol open to attack

UK scientists warn that the TLS protocol that provides security for online banking, credit card data and Facebook has “major weaknesses” which may lead to the interception of sensitive personal data.

The Transport Layer Security (TLS) protocol is used by millions of people daily. TLS secures online banking and credit card purchases made while shopping online.

Many corporate email systems use it, as do several huge entities including Facebook and Google.

The Information Security Group at Royal Holloway University found that a so-called ‘Man-in-the-Middle’ attack can be launched against TLS to intercept sensitive personal data.

“While these attacks do not pose a significant threat to ordinary users in its current form, attacks only get better with time. Given TLS’s extremely widespread use, it is crucial to tackle this issue now,” said the Information Security Group’s Professor Kenny Paterson. “Luckily we have discovered a number of countermeasures that can be used. We have been working with a number of companies and organizations, including Google, Oracle and OpenSSL, to test their systems against attack and put the appropriate defences in place.”

Internal Threats to your Data

A portable hard drive that contained personal information of more than half a million Canadians “disappeared” from the Gatineau office of Human Resources and Skills Development Canada.

The 583,000 Canadians whose information was “lost” were Canada Student Loans Program borrowers. The information included their names, social insurance numbers, birthdays, contact information and loan balances. It also contained information about the borrowers’ parents, siblings and spouses, which could conceivably increase the number of people impacted to two million plus.

Whether the “loss” was malicious or inadvertent, the Canadian government now joins many private businesses in learning the hard way that a major threat to data comes from within.

There are a number of factors that make internal data compromise difficult to address.

• Unstructured data lacks adequate controls: unstructured data is stored outside business applications and can be viewed outside the core business systems. This causes security issues because audit trail controls no longer apply.

• Most enterprises have huge volumes of unstructured data: often the excessive amount of unstructured data is due to poor organization of unnecessary and outdated files.

• Sensitive data is not readily identifiable: due to large volumes of data it is difficult to identify the small subset of sensitive data that needs to be safeguarded.

• It’s easy for data to travel: with staff email and access to the Internet, with file-sharing sites, with tiny data storage devices, the transfer of data out of the organization is extremely simple.

To address the unstructured data leakage risk, enterprises must have strong data governance and management controls in place. These controls are necessary to reduce the volume of unstructured data and to identify and control the most sensitive information.
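One practical first step toward identifying the small subset of sensitive data mentioned above is a simple pattern scan over unstructured files. The sketch below is illustrative only, assuming Canadian social insurance numbers (nine digits, often written 123-456-789) are the pattern of interest; real tooling would need more patterns and would have to cope with false positives such as phone numbers:

```python
import re
from pathlib import Path

# Illustrative pattern: SIN-like strings, e.g. 123-456-789 or 123456789.
SIN_PATTERN = re.compile(r"\b\d{3}[- ]?\d{3}[- ]?\d{3}\b")

def scan_for_sins(root: str) -> dict[str, int]:
    """Walk a directory tree and count SIN-like strings per text file,
    returning {path: match_count} for files that contain at least one."""
    hits = {}
    for path in Path(root).rglob("*.txt"):
        try:
            matches = SIN_PATTERN.findall(path.read_text(errors="ignore"))
        except OSError:
            continue  # unreadable file; skip it
        if matches:
            hits[str(path)] = len(matches)
    return hits
```

A scan like this only narrows the search; the flagged files still need human review, and the results are only as good as the patterns supplied.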

Common Cloud Computing Missteps

Moving to the cloud can, for many SMBs, seem like a great bargain. They save on expenses such as updating hardware, power, maintenance, and floor space. But moving to the cloud can have its pitfalls as well as benefits, and these pitfalls may bring their own costs.

The pitfalls of moving to the cloud can include:

1. Overlooking the company’s specialized needs. Most cloud offerings are designed to be generic, to fit the general needs of the masses. If a company has unique requirements, such as compliance with regulations, it may face additional charges.

2. Overlooking variable demands. If high-demand periods aren’t addressed up front, a company can face expensive up-charges when those periods occur.

3. Overlooking software license provisions. Many companies assume they can move their apps to the cloud for free. That assumption is likely wrong. Many software licenses prohibit the transfer of apps to a cloud environment, while others allow such a transfer only for a fee.

4. Getting locked in to a specific cloud solution. Amazon, as well as some other cloud-services providers, has a proprietary application programming interface (API). This type of interface forces clients to customize their applications. After the expense and effort put into creating those customizations, it may be too expensive and difficult to switch to another cloud-services provider.

Data Security Tips

Time for New Year’s resolutions, and resolving to protect your data is a great start to the year. Here are some tips to help you in your mission to keep your data safe.

  • Have a breach response plan prepared and tested.
  • Understand cloud service-level agreements so you can push for meaningful information on failover and disaster recovery practices.
  • Educate management on phishing, spear phishing and social engineering.
  • Educate staff to recognize applications and mobile devices that collect or transmit data and to communicate the risks to information security management personnel.
  • Check periodically whether you and your business associates are in compliance with all privacy and security requirements.

No doubt you can come up with your own resolutions for the New Year, but please consider these tips if you are serious about preventing data breaches.

The Private Cloud Approach

The private cloud approach is designed for a single enterprise. That is not to say the business can’t be a large and complicated enterprise, with multiple sites and data centres. It definitely can be.

Cloud attributes can be applied in a private setting to improve the coordination of corporate data centres. Private clouds will not have the same reach and scale as public clouds. However, private clouds are a more efficient option than traditional deployment approaches. They essentially enable the enterprise’s IT department to provide standardized services to company users.

A combination of public cloud and private cloud is more common than solely private cloud solutions. This combination permits differing levels of control over company data.

The combination of public and private cloud works well with phased deployment. This approach is becoming more and more accessible to SMBs. It offers flexibility and customization as well as a level of standardization that can minimize costs and speed deployment time.