Sunday, January 11, 2015

Create and upload a Windows Server VHD to Azure

This article shows you how to upload a virtual hard disk (VHD) that contains an operating system so you can use it as an image to create virtual machines. For more information about disks and images in Microsoft Azure, see About Disks and Images in Azure.

Note: When you create a virtual machine, you can customize the operating system settings to facilitate running your application. The configuration that you set is stored on disk for that virtual machine. For instructions, see How to Create a Custom Virtual Machine.

Prerequisites

This article assumes that you have the following items:
An Azure subscription - If you don't have one, you can create a free trial account in just a couple of minutes. For details, see Create an Azure account.
Microsoft Azure PowerShell - You have the Microsoft Azure PowerShell module installed. To download the module, see Microsoft Azure Downloads. A tutorial to install and configure PowerShell with your Azure Subscription can be found here.
  • The Add-AzureVHD cmdlet, which is part of the Microsoft Azure PowerShell module. You'll use this cmdlet to upload the VHD.
A supported Windows operating system stored in a .vhd file - You have installed a supported Windows Server operating system to a virtual hard disk. Multiple tools exist to create .vhd files. You can use a virtualization solution such as Hyper-V to create the .vhd file and install the operating system. For instructions, see Install the Hyper-V Role and Configure a Virtual Machine.
Important: The VHDX format is not supported in Microsoft Azure. You can convert the disk to VHD format using Hyper-V Manager or the Convert-VHD cmdlet. A tutorial on this can be found here.
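If your disk is currently in VHDX format, the Convert-VHD cmdlet (part of the Hyper-V PowerShell module on Server 2012 or later) can convert it from PowerShell; the file paths below are placeholders. Azure also expects a fixed-size disk, which -VHDType handles at the same time:

```powershell
# Convert a VHDX to the VHD format Azure requires.
# Paths are examples - substitute your own. Requires the Hyper-V module.
Convert-VHD -Path "D:\VMs\MyServer.vhdx" `
            -DestinationPath "D:\VMs\MyServer.vhd" `
            -VHDType Fixed
```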
Windows Server operating system media. This task requires an .iso file that contains the Windows Server operating system. The following Windows Server versions are supported:
OS                       SKU            Service Pack   Architecture
Windows Server 2012 R2   All editions   N/A            x64
Windows Server 2012      All editions   N/A            x64
Windows Server 2008 R2   All editions   SP1            x64
This task includes the following steps:

Step 1: Prepare the image to be uploaded

Before the image can be uploaded to Azure, it must be generalized by using the Sysprep command. For more information about using Sysprep, see How to Use Sysprep: An Introduction.
In the virtual machine that you just created, complete the following procedure:
  1. Log in to the operating system.
  2. Open a Command Prompt window as an administrator. Change the directory to %windir%\system32\sysprep, and then run sysprep.exe.
    Open Command Prompt window
  3. The System Preparation Tool dialog box appears.
    Start Sysprep
  4. In the System Preparation Tool, select Enter System Out-of-Box Experience (OOBE) and make sure that Generalize is checked.
  5. In Shutdown Options, select Shutdown.
  6. Click OK.
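The dialog steps above can also be done in one pass from an elevated prompt; these are the standard Sysprep switches for OOBE, Generalize, and Shutdown:

```powershell
# Run from an elevated prompt inside the VM. The switches correspond to
# the dialog choices above: OOBE, Generalize checked, and Shutdown.
& "$env:windir\System32\Sysprep\sysprep.exe" /oobe /generalize /shutdown
```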

Step 2: Create a storage account in Azure

A storage account represents the highest level of the namespace for accessing the storage services and is associated with your Azure subscription. You need a storage account in Azure to upload a .vhd file to Azure that can be used for creating a virtual machine. You can use the Azure Management Portal to create a storage account.
  1. Sign in to the Azure Management Portal.
  2. On the command bar, click New.
  3. Click Storage Account, and then click Quick Create.
    Quick create a storage account
  4. Fill out the fields as follows:
  5. Under URL, type a subdomain name to use in the URL for the storage account. The entry can contain from 3-24 lowercase letters and numbers. This name becomes the host name within the URL that is used to address Blob, Queue, or Table resources for the subscription.
  6. Choose the location or affinity group for the storage account. By specifying an affinity group, you can co-locate your cloud services in the same data center with your storage.
  7. Decide whether to use geo-replication for the storage account. Geo-replication is turned on by default. This option replicates your data to a secondary location, at no cost to you, so that your storage fails over to a secondary location if a major failure occurs that can't be handled in the primary location. The secondary location is assigned automatically, and can't be changed. If legal requirements or organizational policy requires tighter control over the location of your cloud-based storage, you can turn off geo-replication. However, be aware that if you later turn on geo-replication, you will be charged a one-time data transfer fee to replicate your existing data to the secondary location. Storage services without geo-replication are offered at a discount. More details on managing geo-replication of Storage accounts can be found here: Create, manage, or delete a storage account.
    Enter storage account details
  8. Click Create Storage Account.
    The account now appears under Storage Accounts.
    Storage account successfully created
  9. Next, create a container for your uploaded VHDs. Click on the Storage account name and then click on Containers.
    Storage account detail
  10. Click Create a Container.
    Storage account detail
  11. Type a Name for your container and select the Access Policy.
    Container name
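If you prefer to stay in PowerShell, the storage account and container can also be created with the Azure module's service-management cmdlets. A sketch with placeholder values; the account name, subscription name, location, and container name below are all examples:

```powershell
# Create the storage account (name must be 3-24 lowercase letters/numbers).
New-AzureStorageAccount -StorageAccountName "mystorageaccount" -Location "East US"

# Make it the current storage account for this subscription so the
# storage cmdlets know where to operate.
Set-AzureSubscription -SubscriptionName "MySubscription" `
                      -CurrentStorageAccountName "mystorageaccount"

# Create a private container to hold the uploaded VHDs.
New-AzureStorageContainer -Name "vhds" -Permission Off
```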

Step 3: Prepare the connection to Microsoft Azure

Before you can upload a .vhd file, you need to establish a secure connection between your computer and your subscription in Microsoft Azure. You can use the Microsoft Azure Active Directory method or the certificate method to do this.

Use the Microsoft Azure AD method

  1. Open the Microsoft Azure PowerShell console, as instructed in How to: Install Microsoft Azure PowerShell.
  2. Type the following command:
    Add-AzureAccount
    This command opens a sign-in window so you can sign in with your work or school account.
    PowerShell Window
  3. Microsoft Azure authenticates and saves the credential information, and then closes the window.

Use the certificate method

  1. Open a Microsoft Azure PowerShell window.
  2. Type: Get-AzurePublishSettingsFile.
  3. A browser window opens and prompts you to download a .publishsettings file. It contains information and a certificate for your Microsoft Azure subscription.
    Browser download page
  4. Save the .publishsettings file.
  5. Type: Import-AzurePublishSettingsFile <PathToFile>
    Where <PathToFile> is the full path to the .publishsettings file.
    For more information, see Get Started with Microsoft Azure Cmdlets
    For more information on installing and configuring PowerShell, see How to install and configure Microsoft Azure PowerShell
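With either method, you can confirm the connection works before moving on; the subscription name below is a placeholder:

```powershell
# List the subscriptions the module now knows about.
Get-AzureSubscription

# If you have more than one, pick the one to use for the upload.
Select-AzureSubscription -SubscriptionName "MySubscription"
```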

Step 4: Upload the .vhd file

When you upload the .vhd file, you can place it anywhere within your blob storage. In the following command examples, BlobStorageURL is the URL for the storage account that you created in Step 2, YourImagesFolder is the container within blob storage where you want to store your images, VHDName is the label that appears in the Management Portal to identify the virtual hard disk, and PathToVHDFile is the full path and name of the .vhd file.
  1. From the Microsoft Azure PowerShell window you used in the previous step, type:
    Add-AzureVhd -Destination "<BlobStorageURL>/<YourImagesFolder>/<VHDName>.vhd" -LocalFilePath <PathToVHDFile>
    PowerShell Add-AzureVHD
    For more information about the Add-AzureVhd cmdlet, see Add-AzureVhd.
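Filled in with example values (the storage account, container, and local path below are placeholders), the upload command looks like this:

```powershell
# Upload the generalized VHD to the container created in Step 2.
# -NumberOfUploaderThreads is optional; raise it on fast connections.
Add-AzureVhd -Destination "https://mystorageaccount.blob.core.windows.net/vhds/MyServer.vhd" `
             -LocalFilePath "D:\VMs\MyServer.vhd" `
             -NumberOfUploaderThreads 8
```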

Add the Image to Your List of Custom Images

After you upload the .vhd, you add it as an image to the list of custom images associated with your subscription.
  1. From the Management Portal, under All Items, click Virtual Machines.
  2. Under Virtual Machines, click Images.
  3. Then click Create an Image.
    PowerShell Add-AzureVHD
  4. In Create an image from a VHD, do the following:
    • Specify name
    • Specify description
    • To specify the URL of your VHD click the folder button to launch the below dialog box
    Select VHD
    • Select the storage account your VHD is in and click Open. This returns you to the Create an image from a VHD window.
    • After you return to the Create an image from a VHD window, select the Operating System Family.
    • Check I have run Sysprep on the virtual machine associated with this VHD to acknowledge that you generalized the operating system in Step 1, and then click OK.
    Add Image
  5. Optional: You can also use the Azure PowerShell Add-AzureVMImage cmdlet to add your VHD as an image.
    From the Microsoft Azure PowerShell window, type:
    Add-AzureVMImage -ImageName <Your Image's Name> -MediaLocation <location of the VHD> -OS <Type of the OS on the VHD>
    PowerShell Add-AzureVMImage
  6. After you complete the previous steps, the new image is listed when you choose the Images tab.
    custom image
    When you create a new virtual machine, you can now use this new image. Choose My Images to show the new image. For instructions, see Create a Virtual Machine Running Windows Server.
    create VM from custom image
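For reference, the Add-AzureVMImage alternative from step 5, filled in with example values (the image name, blob URL, and label are placeholders; -OS is Windows for this scenario):

```powershell
# Register the uploaded VHD as an OS image in your subscription.
Add-AzureVMImage -ImageName "WS2012R2-Custom" `
                 -MediaLocation "https://mystorageaccount.blob.core.windows.net/vhds/MyServer.vhd" `
                 -OS Windows `
                 -Label "Windows Server 2012 R2 custom image"
```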

How to Capture a Windows Virtual Machine to Use as a Template

This article shows you how to capture an Azure virtual machine running Windows so you can use it like a template to create other virtual machines. This template includes the OS disk and any data disks attached to the virtual machine. It doesn't include networking configuration, so you'll need to configure that when you create the other virtual machines that use the template.

Azure treats this template as an image and stores it under My Images. This is also where any images you've uploaded are stored. For more information about images, see About Virtual Machine Images in Azure.

Before You Begin

These steps assume that you've already created an Azure virtual machine and configured the operating system, including attaching any data disks. If you haven't done this yet, see How to Create a Custom Virtual Machine.

Capture the Virtual Machine

  1. Connect to the virtual machine by clicking Connect on the command bar. For details, see How to Log on to a Virtual Machine Running Windows Server.
  2. Open a Command Prompt window as an administrator.
  3. Change the directory to %windir%\system32\sysprep, and then run sysprep.exe.
  4. The System Preparation Tool dialog box appears. Do the following:
    • In System Cleanup Action, select Enter System Out-of-Box Experience (OOBE) and make sure that Generalize is checked. For more information about using Sysprep, see How to Use Sysprep: An Introduction.
    • In Shutdown Options, select Shutdown.
    • Click OK.
    Run Sysprep
  5. Sysprep shuts down the virtual machine, which changes the status of the virtual machine in the Management Portal to Stopped.
  6. Click Virtual Machines, and then select the virtual machine you want to capture.
  7. On the command bar, click Capture.
    Capture virtual machine
    The Capture the Virtual Machine dialog box appears.
  8. In Image Name, type a name for the new image.
  9. Before you add a Windows Server image to your set of custom images, it must be generalized by running Sysprep as instructed in the previous steps. Click I have run Sysprep on the virtual machine to indicate that you have done this.
  10. Click the check mark to capture the image. When you capture an image of a generalized virtual machine, the virtual machine is deleted.
    The new image is now available under Images.
    Image capture successful
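The portal capture in steps 6 through 10 can also be scripted. Depending on your Azure PowerShell module version the parameters vary slightly, but a sketch with placeholder service, VM, and image names looks roughly like this:

```powershell
# Capture the stopped, sysprepped VM as a reusable image.
# Note: capturing a generalized VM deletes the VM, just as in the portal.
Save-AzureVMImage -ServiceName "MyCloudService" `
                  -Name "MyVM" `
                  -ImageName "MyVMTemplate" `
                  -OSState Generalized
```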

Next Steps

The image is ready to be used as a template to create virtual machines. To do this, you'll create a custom virtual machine by using the From Gallery method and select the image you just created. For instructions, see How to Create a Custom Virtual Machine.

Saturday, January 10, 2015

SysAdmin Tools

Below is a list of tools I found to be extremely helpful over the years. I will go into more detail for a few of them, with specific examples of how I used them in the real world.

– Microsoft Assessment and Planning Toolkit: If you are planning a major project around server consolidation, cloud migration, VDI, or SAN implementation, be sure to give MAP a look. It will give you all the information you need to ensure your new environment is designed to handle the current workload.
– Dropbox: If you haven’t heard about Dropbox yet, take a few minutes to create a free Dropbox account. It’s a great cloud-based storage app that gives you access to your data across multiple computers or mobile devices.
– TreeSize: Have you ever had a drive fill up but have no idea what folder is taking up all the space? TreeSize will give you an explorer view of all the folders, sub folders and files for a volume and sort them by the amount of space being used. This is a free program. The paid version allows you to scan network folders as well.
– Ninite: We all know how much time goes into configuring a new PC: installing Windows Updates, Office, and third-party apps like Adobe Reader, Flash, Shockwave, etc., each one manually. Ninite lets you build a custom installer that includes a BUNCH of third-party apps, then downloads and installs them all with a click of a button.
– Angry IP Scan: It's really not that angry, but it is extremely helpful when you want to see what is actually on your network. This is a simple IP scanner that returns the host name if available, along with an alive or dead status.
– WinMTR: WinMTR is a combination of ping and traceroute and is extremely helpful in isolating connectivity issues, specifically packet loss along the path to the destination. Download it free here.
– iPerf: Ever wonder what your real throughput was like from your PC to a server on your LAN or remote location? Check out iPerf. It’s like a locally hosted speed test that has both a client and a server component.
– CatTools: Managing the configs of a bunch of routers and switches can be a daunting task. How do you back these up in the event of a failure? CatTools automates the process for most firewalls and switches.
– ViceVersa Pro: This is a great little program that can be used for data backup, data migration, and more. It allows you to keep files in sync between multiple locations as well as maintain NTFS permissions.
– LogMeIn: If you are looking for a solid remote access method, look no further. This is by far the best one available today, with clients for all mobile devices as well as PC and Mac.
– Kaseya: If you are responsible for managing more than 10 systems, Kaseya is a great way to manage everything from inventory and licensing to automating tasks across the organization.
– Pingdom: This is a great SaaS-based app that lets you monitor specific ports, services, etc. from multiple geographic locations. Unlike other simple port-test services, Pingdom can check for certain content on a site, for example “Error 500”. Check it out.
– NetWrix Password Notifier: Do your users always forget to change their password before it expires? Check out this sweet app that will send your users direct emails reminding them to change their password until it happens! You can customize the email to include instructions, or even personalize it to the user. http://www.netwrix.com/password_change_reminder.html
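For the iPerf entry above, a quick throughput test just needs the tool on both ends; the host name and test duration below are examples:

```powershell
# On the server (the machine you are testing throughput TO):
.\iperf.exe -s

# On the client, run a 30-second test against that server:
.\iperf.exe -c server01.contoso.local -t 30
```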

The role of the managed IT service provider in a cloud world

A-Glance-At-How-The-Cloud-Is-Changing-The-IT-World

By now, you’ve probably heard the term “cloud computing” (SaaS, PaaS, IaaS, etc.). Depending on who you ask, cloud computing can mean many different things. It can mean “moving your server infrastructure from your on-premise server closet (or the corner of your office) to a highly secure, geo-redundant datacenter where you pay someone to host the underlying hardware for you”, or it can mean a “pay as you go service where you only pay for the computing resources you consume, giving you the ability to scale up or scale down as needed, minimizing your capital expense.” Whatever your definition of cloud is, it is here to stay and is changing the way we think about IT. So what does this mean for today’s managed IT service provider?
Traditional managed IT service providers built their business model around supporting clients on a per workstation, per server pricing model. This worked great for years (and still does in some cases); however, with a lot of clients moving applications and servers to the cloud, and more and more MSPs being formed every day, our way of thinking needs to change a bit. What happens when you go into a client that has zero servers on-premise, and the employees all use personal devices connecting to multiple SaaS-based applications? Or the client that gives each employee a tablet, smart phone, laptop, and virtual desktop? Whatever the case may be, the point is that traditional per server, per workstation pricing models may need to change. Granted, all these devices and services still need to be managed in one way or another, but the managed IT provider needs to find a way to differentiate themselves from every other help desk out there offering monitoring services with Kaseya or N-able as a backend. So how do you do that?
1.) Embrace the cloud: All too often, MSPs that are threatened by the cloud tell their customers to avoid moving services to the cloud in order to keep that “per server revenue” coming in. This is unfortunate because they are shielding their clients from some of the most innovative technologies available, at a fraction of the cost of maintaining them on-premise. At my firm, we encourage our clients to move as many services to the cloud as make sense for their business. This includes, but is not limited to, hosted Exchange, Microsoft Lync, SharePoint, CRM, disaster recovery, and more.
2.) Customer Service: Part of the fear of moving to any cloud provider is the lack of service one may receive. Instead of calling the IT department for an issue with email, the employee now has to call into a giant helpdesk to report the issue, and maybe they will get a call back within 48 hours. We truly focus on being a customer service organization first and foremost and an IT provider second. With the variety of cloud services a single company might use, IT now has to become a central resource that employees can contact when they are having issues, one that knows how to support each application used in the environment.
3.) Cloud Integration: Integration of cloud applications is extremely important. Our role as an IT help desk has changed into that of a cloud integrator. We need to be able to take a client's on-premise Active Directory and federate it with a service like Office 365 to offer seamless authentication, as well as extend the client's private cloud infrastructure to the public cloud for elasticity or even disaster recovery. At Project Leadership Associates, we specialize in such cloud integrations, connecting many of the cloud offerings clients use today to work seamlessly together. Our clients enjoy the benefits of Office 365's subscription-based licensing model, with the flexibility of integrating a hybrid Lync-based PBX system both on premise and in the cloud.
4.) Become a Strategic Partner: We want our clients to look at us as a strategic partner instead of just a vendor. Anyone can provide basic helpdesk support; however, not every company can provide true CIO-level, strategic advice on how to best use technology for the business's needs. For all our managed services clients, we host quarterly business reviews where together we establish a 3-year technology plan, and prioritize items according to the client's growth plans and business requirements. In addition, we assist with IT budgeting, and make recommendations on ways to leverage technology to make employees more efficient.
Bottom line: even though IT is shifting to the cloud, I believe there will always be a need for managed IT service providers to help connect these providers together. The need for supporting servers and workstations will still be there; however, the true focus should be on customer service, building long-term partnerships, and integrating with cloud services.

What’s your disaster recovery plan?

disaster-recovery-plan

Disaster recovery is often an area that doesn’t get as much planning and attention as it should. Most IT departments and smaller outsourcing firms are so busy constantly putting out fires that the process of actually testing whether the data can be recovered almost never happens. Another problem I often see is that the business owner usually has a completely different set of RPOs (recovery point objectives) and RTOs (recovery time objectives) in mind than IT has. For most companies, the days of doing one backup at the end of the business day are over. Staff expect their data to always be accessible, and when something happens to it, they expect to get a recent copy back fast. So what does all this mean, and what should you do as a business owner to make sure your company's most valuable asset (your data) is protected and recoverable when you need it most?
1.) What are your recovery point objectives?
Ask yourself this question: if I am working on a budget plan for the year and I accidentally delete the data, what is an acceptable point in time to have that file recovered? Am I OK with last night's data? Do I need a copy from 8 hours ago? Or do I need the file from 1 hour ago because I was actively working on it all day? This is defined as your RPO.
2.) What are your recovery time objectives?
Another question to ask yourself is, “If you had a complete system failure, how long can your systems be down before you start losing money?” If a natural disaster caused power to be lost for an extended period of time, do you need to be up and running in a remote location, and if so, how fast? This is defined as your RTO.
These are two very important questions that your IT partner should be asking you. The answers you provide will help IT design the appropriate DR plan to meet your RPOs and RTOs. Now that you have these two objectives defined, what can you do to make sure they are actually being met, and most importantly, tested and documented? Below are some great technologies you could implement to help you achieve both low RTOs and frequent RPOs.
Demand for RPOs as frequent as every 15 minutes:
    – Volume Shadow Copy:
This is a free, built-in feature of Windows Server, and can be enabled on each volume on a schedule that meets your needs.
    – Dell AppAssure Replay:
Replay has the ability to take snapshots of your mission critical applications as often as every 15 minutes, which can then be replicated to an off-site server in a remote location. Replay has a unique feature for applications like SQL and Exchange where Replay will test to see if the databases can be mounted, and notify you if there are any issues.
    Demand for RTOs of 5 minutes or less:
    – Hyper-V Replicas:
With the release of Server 2012, Hyper-V now has virtual machine replication (Hyper-V replicas) built in! These VMs can be replicated to a warm standby server on-site as often as every 5 minutes, or sent off-site to the cloud. This is a great low cost option to be up and running quickly in the event of a failure.
    – NeverFail:
If you need true high availability with near 0 downtime, using HA software like NeverFail will be your best option. NeverFail will make sure your data is always available, and can automatically be failed over to a remote server.
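The Hyper-V Replicas option above can be enabled per VM with the Server 2012 Hyper-V cmdlets. A sketch with placeholder names; the VM name, replica server, storage path, and the Kerberos-over-port-80 authentication choice are all examples:

```powershell
# On the replica (DR) server, allow it to receive replication first:
Set-VMReplicationServer -ReplicationEnabled $true `
                        -AllowedAuthenticationType Kerberos `
                        -ReplicationAllowedFromAnyServer $true `
                        -DefaultStorageLocation "D:\Replicas"

# On the primary server, enable replication for a VM and kick off the
# initial copy; changes then ship on Hyper-V's replication cycle.
Enable-VMReplication -VMName "SQL01" `
                     -ReplicaServerName "dr-host.contoso.local" `
                     -ReplicaServerPort 80 `
                     -AuthenticationType Kerberos
Start-VMInitialReplication -VMName "SQL01"
```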
Great! I have my RPOs and RTOs defined, and I have selected the software vendor I am going to use. How often should I actually test this, and how do I know it's being done?
Each business should have a documented disaster recovery plan that outlines all of the backup solutions in place, how often they run, and a procedure to recover the data for each solution. In my managed services practice, we receive daily notifications that let us know whether the backups have run successfully. This is a great starting point; however, we want to make sure the data is actually recoverable. Each week we run through a manual check where we validate the ability to recover a file. We also like to do a complete DR test at least quarterly, where we bring the DR site online and have a few employees verify they can access their data. Each check is documented in a managed services ticket and sent to the client contact with the results of the test for their records. If the recovery process is not up-to-date in the DR plan, the engineer is responsible for updating the plan with the most recent recovery steps.
I hope this helps you think more about your disaster recovery requirements, as well as give you some ideas of ways to accomplish just about any RPO or RTO. If you need assistance creating a DR plan, or would like assistance validating an existing DR plan, please contact me to schedule a consultation!

Is your data safe in the cloud?

Image

We all have heard that the cloud is supposed to save us time and money, but are we sacrificing data security and protection by migrating our servers to the cloud? The short answer is: it depends. There seems to be a new hosting provider on the market every day promising low prices, the highest levels of uptime, and secure data protection, all in a public cloud, hybrid cloud, or private cloud configuration. Before you get all caught up in the marketing department's attempt to promise you the world, you really need to do some homework before you trust your company's data to some guy in a basement. So what do you need to look for?
How long has the hosting company been in business? This is important. Often you will see startup hosting companies that have only been in business for 6 months to a year looking to build a name for themselves by offering really low prices. While this is great, it's EXTREMELY difficult for small companies to remain competitive with companies like Navisite, Rackspace, Ubistor, and Amazon, just to name a few. These big players purchase servers by the pallet and truckload, and can offer you the best possible pricing on hardware. The last thing you want is to trust your data to a small organization that may go out of business in the next 12 months.
What type of infrastructure is your data hosted on? Just because someone says your data is hosted “in the cloud” doesn't mean they have built out an extravagant datacenter with redundant servers, multiple backup and disaster recovery solutions, subject matter experts, etc. It's a scary thought, but I have seen some IT providers take a single, underpowered server running VMware or Hyper-V, with no backup or disaster recovery solution, and market it as a cloud offering. While their prices are EXTREMELY low, you certainly get what you pay for.
Instead of just looking at the bottom line, look at what the company has to offer. For example, choosing a company that can provide me with the results of an SSAE 16 audit, and has had all its internal controls (backup solutions, infrastructure redundancy, security, policies and procedures, etc.) validated and published by a third-party firm, will certainly make me feel more comfortable than a piece of marketing material. It's also important to note that anyone can put a server in a datacenter that is SSAE 16 certified. Do not confuse a datacenter certification with a service organization being certified.
How is my data protected from a datacenter outage? Another assumption organizations make is that if they migrate their servers to the cloud, they will never experience an outage. While most cloud providers have geo-redundant datacenters with redundant power, internet, servers, etc., there are rare instances where a datacenter may experience an outage. Most cloud providers will offer you the ability to replicate your servers to another of their datacenters, though at a cost. If you need datacenter-level availability, be sure to ask your hosting provider what your options are.
If you are in the market to migrate your servers to the cloud, I would leverage a cloud hosting provider that has been in business for a while, has multiple datacenters, and preferably holds SSAE 16 certifications along with clearly defined service level agreements. At Project Leadership Associates, we have partnered with some of the best names in the cloud hosting industry, and have validated the infrastructure and service offerings of each before recommending them to our clients. We position ourselves as a vendor-agnostic cloud advisor and can recommend the best solution for your specific needs.
