The Federal CIO’s guide to partnering with Quest Software for Data Center Consolidation, Part III
Note 1: the bulk of this blog post was done on an Apple iPad - I point this out not because of a fascination with the iPad, but because of the fact that such long documents were not readily possible from a mobile platform only a few years ago. That still amazes me.
Note 2: this is a very rough, stream of consciousness blog entry. Grammar, spelling and other writing errors should be ignored. If you want a nice, clean "white paper" type of document, please contact me offline, and I'll get you something cold and clinical.
------------------
Preparation and migration
--
Preparing for the move, beyond the simple assessment, seems a no-brainer. But a migration? Really? Right before a move? And I say "yes." First, let's be clear and qualify that "migration" (to me) means a cutover to another platform. This could be email, user directory, database, etc., and it may be to a new version of some current software, but it's definitely getting off the current version. The idea is to make the software, and the version of that software, as current as possible before you actually move to a new location/environment. The only thing this excludes is the move from physical to virtual. That comes later, and I'll explain why then.
There are several reasons for you to consider doing a migration before consolidating environments. First, and foremost, the migration is going to happen sooner or later, and aren't you better off doing the migration in a comfortable and stable environment instead of your new one? Plus, the migration may actually shake some things out and make the consolidation easier. For example, if you use QMM to migrate mailboxes from an old version of Exchange to 2010, or use QMM for AD to restructure your Active Directory environments, you may actually find that there are many users, mailboxes, groups and other objects that could be deleted/abandoned.
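To make that "shake things out" idea concrete, here's a minimal sketch of flagging abandoned objects by last-logon age. The object list, field names and cutoff are all invented for illustration; a real pre-migration report from a tool like QMM would be much richer.

```python
from datetime import datetime, timedelta

# Hypothetical export of directory objects; "last_logon" is an ISO date
# string, or None if the object has never been used.
objects = [
    {"name": "jsmith",   "last_logon": "2011-03-01"},
    {"name": "old_svc",  "last_logon": "2008-06-15"},
    {"name": "template", "last_logon": None},
]

def find_stale(objects, as_of, max_age_days=365):
    """Return names of objects unused for more than max_age_days --
    candidates to delete or abandon before the consolidation move."""
    cutoff = as_of - timedelta(days=max_age_days)
    stale = []
    for obj in objects:
        logon = obj["last_logon"]
        if logon is None or datetime.strptime(logon, "%Y-%m-%d") < cutoff:
            stale.append(obj["name"])
    return stale

print(find_stale(objects, datetime(2011, 4, 1)))
```

Every stale object you drop here is one less object to migrate, license and troubleshoot in the new environment.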
The same goes for databases. If you're running an old version of Oracle for your databases, it's time to cut over and see what features and benefits you get. And we even have a tool that lets you replicate data between mixed database versions while you make this sort of move, so it's not a jarring, "all at once" process. That tool, BTW, is SharePlex, and the mixed-version replication is pretty cool.
But why else should you do the migration before starting the actual move? Well, frankly, because support is easier. Sure, you can migrate a Windows 2003 server, or an Oracle 9i database into your new environment, but if there's a problem, what will the vendor tell your team? Most likely, they'll tell you to upgrade to the latest version.
It's not widely discussed, but the reality is that most software companies want you on the latest and greatest version of their software when you call support. It's usually the version their developers are currently using as the basis for the next release. It's the version that support is working with most, and the one they have set up locally to recreate your problems. One or two versions back is often OK, but if you're running something more than two or three years old, I think you're asking for trouble. Get to the latest versions however you can, because you don't want to consolidate and move old software around.
Another reason is your staff's personal development. I've run IT groups in the past, and the most common question was always, "when are we going to get to the latest version of X?" where X was some database, operating system or programming language. If you are the CIO at a federal agency, your staff knows that data center consolidation is coming, in some form or fashion. You want them ready and energized for the task. Letting them get to the latest version of whatever software they work with will excite them, and you'll have a happy crew moving into the consolidation.
Now, I did mention that cutting over to a virtual environment should be last. The reason is that this is really the same as a hardware move. No matter what hardware your environment is using, something is sure to change, and your software may react adversely. Plus, if you couple that with a version upgrade, you are changing a lot out from under your user base as well as your IT staff. The idea is to minimize risk, and changing everything at once just doesn't cut it. So do the "migration" first, get it settled, and then cut over hardware (which can be P2P or P2V).
And if you're contemplating a P2V move, you should definitely check out vConverter. Not only does it work with P2V, but also V2V (you may want to switch hypervisors, or try out multiple hypervisors with the same source machine), and even V2P in case you absolutely have to back out. It helps even if you just want to switch hardware, using the move to virtual as a stepping stone.
Finally, if you upgrade, migrate and virtualize in a single move, how do you know where you got performance gains or losses? If you read my last post on this topic, you'll know I propose baselining before starting. The only way to do that is to start with a known point, but then make incremental moves so you can collect more information on what impact each part of the upgrade, migration and consolidation has on your environment.
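The incremental approach can be sketched very simply: capture a baseline at each phase and diff consecutive phases, so a gain or loss is attributed to one step rather than the whole project. The phase names, metric names and numbers below are illustrative, not from any real tool.

```python
# Hypothetical baselines captured at each phase of the project.
baselines = {
    "start":        {"avg_response_ms": 220, "cpu_pct": 55},
    "post-upgrade": {"avg_response_ms": 180, "cpu_pct": 60},
    "post-p2v":     {"avg_response_ms": 195, "cpu_pct": 48},
}

def phase_deltas(baselines, order):
    """Change in each metric between consecutive phases."""
    deltas = {}
    for prev, curr in zip(order, order[1:]):
        deltas[f"{prev} -> {curr}"] = {
            m: baselines[curr][m] - baselines[prev][m]
            for m in baselines[prev]
        }
    return deltas

for step, change in phase_deltas(
        baselines, ["start", "post-upgrade", "post-p2v"]).items():
    print(step, change)
```

If you had jumped straight from "start" to "post-p2v," you'd see a net response-time improvement and never know the virtualization step actually cost you some of the upgrade's gain.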
The Federal CIO’s guide to partnering with Quest Software for Data Center Consolidation, Part II
------------------
Initial assessment and baselining
--
Let's dive right in and get started.
To begin, an assessment needs to be performed to determine all the things that will be part of the consolidation. Presumably, this has already started as an initial plan is due to be submitted to the OMB for review and budgeting. However, everyone knows adjustments can and will be made. So I'd suggest you do an assessment assuming every item will be questioned.
There are lots of ways to survey what you have, but looking to Quest to help with that may not be something you thought to do. Well, you should. The reason is that we have a lot of tools, and lots of tools to help with your entire environment. From the operating system, to the databases and file servers, all the way to app and web servers as well as desktops. And while we're not in the inventory management business, we can certainly hold our own if you need a list.
"What kind of lists can you provide," you ask? For starters, we can get you users and desktops. Nowadays, most users are in Active Directory, and most of their desktops and laptops are joined to AD. So you could use something as simple as Quest Reporter to pull a list of those objects out of AD. Following the 80/20 rule, that should give you a good ballpark of your end-user environment. Need something a little more accurate? Then you'll need to do more work, but you'll get more results. You can go with something like Desktop Authority to get a good idea of what is actually out at the desktop level, or you can fall back to your AD event logs and monitor login events over some time period with Quest Change Auditor for AD. In both cases, the products are sure to give you a lot more benefits beyond the initial assessment. And both Change Auditor and Reporter give you a good feel for your Windows server environment as well.
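Once you have an export of those AD objects, even a few lines of scripting turn it into assessment numbers. This sketch assumes a hypothetical CSV export with a `name` and `objectClass` column (a reporting tool can produce something similar; the column names and data here are made up).

```python
import csv
import io
from collections import Counter

# Stand-in for a real exported file of directory objects.
export = io.StringIO(
    """name,objectClass
jsmith,user
WKS-014,computer
finance,group
mjones,user
"""
)

# Count objects by class: a rough ballpark of the end-user environment.
counts = Counter(row["objectClass"] for row in csv.DictReader(export))
print(dict(counts))
```

The same pattern extends to counting by OU, site or operating system, which is usually what the OMB-level plan actually asks for.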
But the assessment is more than just a 'survey.' You cannot just make a nice clean inventory of everything you are responsible for, and leave it at that. It is critical to know -how- those systems are performing. In other words, you need to set a baseline, and you probably need to do it in two ways. The first way is through some measurements and metrics. Quest's Foglight platform is fantastic for end to end monitoring, and it can serve double duty by providing those initial statistics up and down your entire app stack.
Foglight can also provide those initial lists I mentioned above. You need RAM, CPU and disk numbers off your servers? We can get those to you, and help with some capacity planning as well. And if you run Foglight long enough, you'll have some very good trending and usage data to use beyond the consolidation effort.
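For a sense of what a minimal capacity baseline looks like, here's a sketch using only the Python standard library: CPU count and disk usage for one host. A monitoring platform adds RAM, trending and per-process detail; this is just the shape of the data you'd be collecting.

```python
import json
import os
import shutil
from datetime import datetime, timezone

def capacity_snapshot(path="/"):
    """Capture a timestamped capacity baseline for one filesystem,
    using only the standard library."""
    usage = shutil.disk_usage(path)
    return {
        "taken_at": datetime.now(timezone.utc).isoformat(),
        "cpu_count": os.cpu_count(),
        "disk_total_gb": round(usage.total / 1024**3, 1),
        "disk_free_gb": round(usage.free / 1024**3, 1),
    }

print(json.dumps(capacity_snapshot(), indent=2))
```

Store one of these per server per day and you already have crude trending data to sanity-check whatever the monitoring tools report later.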
The second baseline to check is subjective: the users' perception of the current systems. This wouldn't involve any Quest product; instead, simply put together a quick, five-minute survey of what the users think of the apps they use. There are many free and paid sites out there that can run such a survey for you, but I'd really encourage you to get this initial feedback. And if the results start to look grim, or you're surprised by them, check out Quest End User Monitor to walk through the apps and see what the users are complaining about.
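Tallying that survey is trivial once the responses are exported. The app names and ratings below are invented; the point is just that an average score per app is enough to spot which systems users already consider grim before the consolidation changes them.

```python
from statistics import mean

# Illustrative responses: each user rates an app from 1 (poor) to 5 (great).
responses = [
    ("email", 4), ("email", 2), ("timesheet", 1),
    ("timesheet", 2), ("email", 5),
]

def summarize(responses):
    """Average rating per app, rounded to two decimal places."""
    by_app = {}
    for app, score in responses:
        by_app.setdefault(app, []).append(score)
    return {app: round(mean(scores), 2) for app, scores in by_app.items()}

print(summarize(responses))
```

Re-run the same survey after the move and you have a subjective before/after pair to set alongside the objective metrics.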
That's really it on the baseline side. We can help with that initial assessment as well as provide initial metrics for how your environment is functioning. Can we provide complete coverage of your environment? Probably not, but the tools you'd use from us would continue to provide value beyond the assessment rather than being a throwaway once the assessment is complete. And wouldn't it be nice to be in a new environment but with a familiar toolset? I think your IT staff would say, "yes."