Recently we moved zoomshare into a new home. Alas, the move didn't go quite as planned. What follows is an "insider's" story of the move, what went wrong and how, ultimately, zoomshare came back to life.
On any given day, the traffic zoomshare generates during the early morning hours is significantly less than during the afternoon peak in the United States. In the early morning, between midnight and 5am Central Time, zoomshare's more than 750,000 sites receive approximately 125,000 visits in total. During peak hours, between noon and 5pm, zoomshare sees about four to five times as many visitors, some 550,000.
Obviously, we scheduled an overnight maintenance window to move the necessary zoomshare equipment. Given that, I left the office early Tuesday to rest up. My plan was to skip dinner, roll into bed and get a quick nap in before meeting up with kree10 and peenworm.
As time marched toward midnight with nary a sheep in sight to count, I rolled out of bed and started to get dressed. I should have known then something was up. After watching semis whoosh by at a highway oasis while downing a burger, a soda and America's favorite french fries, I met up with kree10 and peenworm to start.
What exactly is involved in moving servers from one colocation facility to another? Well, since a computer, server or otherwise, requires a connection to the Internet, the first step before moving is to set up service at the new location. In the case of servers, this also means being assigned new static IP addresses so the computers making up the system can be found. More importantly, the new IP addresses need to be associated with the existing domain name, zoomshare.com. From a technical point of view, then, the move means configuring the servers to route traffic on their new home network by assigning them their new IP addresses, as well as configuring DNS to associate the new addresses with the existing zoomshare domain(s).
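That last step, waiting for DNS to catch up with the new addresses, is easy to check from a script. Here's a minimal sketch in Python; the IP address shown is a placeholder from the documentation range, not zoomshare's actual address:

```python
import socket

def dns_points_at(hostname, expected_ip):
    """Return True once DNS for `hostname` resolves to `expected_ip`."""
    try:
        return socket.gethostbyname(hostname) == expected_ip
    except socket.gaierror:
        # Hostname doesn't resolve at all (yet)
        return False

# After updating the A records, poll until propagation completes, e.g.:
#   while not dns_points_at("zoomshare.com", "203.0.113.10"):
#       time.sleep(60)
```

Until every caching resolver's TTL on the old records expires, some visitors will still be routed to the old address, which is one reason moves like this happen overnight.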
In the physical world, this means mounting and unmounting computers and network equipment in a server cabinet, running network and power cables and of course lifting and shuttling equipment between locations.
It sounds simple, but one has to realize that the "application" that is zoomshare is a complex system of hardware and software. On the hardware side, zoomshare currently accounts for at least half a dozen pieces of equipment: servers, switches, routers and a firewall, spread across two different networks, one public and one private. On the software side, the zoomshare application code depends on several instances of web server software, a database engine, a mail transfer agent, the operating system and a heck of a lot of custom code that's distributed between the various servers and, of course, must be able to communicate at different network layers.
Minor Troubles and Tribulations
Alas, colocation facilities tend to be very utilitarian, since the business model for providers of these facilities is to maximize space, power and bandwidth for housing computers. This means they tend to be cold, noisy and cramped. Moreover, since my main job on zoomshare is as a software engineer and not as a system administrator, it didn't take long after meeting up with kree10 and peenworm to feel like the proverbial third wheel: not much to do and nowhere to be but in the way. Soon enough, however, peenworm had the servers we needed to take with us offline, and with kree10's van loaded up we moved on up and started reconfiguring servers for their new home.
Each new home has its own little quirks and idiosyncrasies. The same goes for colocation facilities. Each facility has its own take on how things should work and run, procedures and rules to follow and work by. For example, the new home of zoomshare has a loading dock that can be used to deliver equipment, whether by hand or by freight.
As time goes by, one learns how to navigate the little quirks of a new home. They can become reassuring where originally they were unsettling. For a colocation facility, this might mean the difference between how a procedure is written and how it is actually followed. Alas, it doesn't help, and it's certainly not reassuring, when you are just beginning to learn a facility's procedures and something goes amiss. In the case of the loading dock, it was a new security guard on staff who threw me off as I tried to get access to the dock from the inside while kree10, van and equipment waited in the subzero cold outside.
After a bit of wrangling for access to the dock and a handcart to load the equipment on, kree10 and I met up with peenworm, who had already started preparing the new server cabinet for zoomshare.
To me it seemed downhill from here on out. Sure, nothing as complex as this goes quite as planned, but we seemed to have navigated the bumps in the road without going bust. Moreover, most of the physical work, and with it most of my helpfulness, came to an end as each server was added to the cabinet, secured as much as possible, wired and powered up.
We had already prepared the new facility as much as possible a week beforehand, and now that we had the remaining servers mounted and wired, all that was left was for peenworm to reconfigure everything and verify it was online. With that, zoomshare would be back to life.
In all, it took us a bit longer than planned to get all the equipment up and running. Everything should have been running by 5am, just as traffic to zoomshare for the day would start to increase. We had figured the actual move would take us until 3, or 5 in case anything went amiss. By 6 we had everything wrapped up and the "little problems" solved. Judging by the wireless network, public access to zoomshare was snappy and running faster than ever. All was done with only a handful of issues too minor to mention. What's a little misunderstanding about a handcart between friends? After all, all's well that ends well.
Or was it? Find out in "On Two Hours Sleep"