Our virtual systems have changed the way that we think about security. In this video, you’ll learn about virtualization technologies and some of the more important security considerations in a virtual environment.
Virtualization is the concept where you take one physical device and, on that one physical computer, run many different operating systems, or many instances of different operating systems. These are all separate, independent operating systems that have their own virtual CPUs, their own memory, and their own network configurations. But they’re all really running on one single physical device. On our desktops, we can have host-based virtualization, where you might be running a Windows desktop, a Linux desktop, or a Mac OS X desktop.
But then you might also have virtual systems running on that desktop, so that you’re running a Windows or a Linux operating system in separate windows at the same time on your computer. There’s also enterprise-level virtualization, where you have a very large server with a lot of memory, a lot of hard drive space, and a lot of CPUs associated with it. And you can spin up many different servers, sometimes hundreds of them, all running on this one physical platform. This idea of virtualization has been around for a very long time.
The mainframes that IBM created in 1967 were already running virtual systems. We’ve simply taken that idea through the years and honed it to run on PC architectures. Here’s a screen that gives you a good idea of what host-based virtualization looks like. This is a Mac OS X desktop. This is my desktop. And on my Mac OS X desktop I have a window running my browser.
So that browser is running in the native Mac OS. I also have a Windows system running, and this Windows environment is a self-contained unit. It happens to be running in a window here, but I could also make it run full screen, so my display would look just as if I were running on a Windows device. And in fact, you really are running Windows inside of this virtual system, and I can run any application I’d like. I’ve also got Linux running at the same time on my system. So this is how virtualization can take the idea of using multiple physical devices and combine them all together, so that you’re running all of these operating systems on one single physical device.
To bridge this gap between the physical world and the virtual world, you need some specialized software called a hypervisor. This might also be referred to as a virtual machine manager. It’s in charge of keeping track of all of the CPUs and memory in use, and making sure that the virtual platforms are able to use the proper resources that they’re getting from the physical world. Many of these host-based systems will require a CPU in your computer that supports virtualization.
There’s specialized hardware in these CPUs (Intel calls it VT-x, AMD calls it AMD-V) that allows a much more efficient way of virtualizing your hardware across all of these different operating systems. If your CPU does not have this virtualization capability in hardware, it may still allow you to run the virtualization software, but the performance is not going to be as good as it would be with that specialized capability built into the CPU. This hypervisor is going to be responsible for sharing the resources between the physical and the virtual systems.
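On a Linux host, one quick way to see whether your CPU advertises these hardware extensions is to look for the relevant feature flags in /proc/cpuinfo. This is a minimal sketch, assuming a Linux system where that file exists:

```shell
# Check /proc/cpuinfo for hardware virtualization feature flags:
#   vmx = Intel VT-x, svm = AMD-V
if grep -qE 'vmx|svm' /proc/cpuinfo; then
    echo "CPU advertises hardware virtualization support"
else
    echo "no vmx/svm flag found (or the flag is hidden, e.g. inside a VM)"
fi
```

Note that some hypervisors and BIOS settings hide these flags from guests, so a missing flag doesn’t always mean the physical CPU lacks the feature.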
So it will manage the sharing of CPU and memory. It’ll make sure the networking pieces are all separated out. And it will make sure that there is security between all of these separate operating systems. Having all of these different virtual systems and operating systems running at the same time certainly provides you with additional functionality on your computer. But it also provides you with some very nice security features. Each one of those virtual worlds is called a guest.
And each one of those guests lives in its own file, a virtual disk image where everything is self-contained. If you wanted to grab that single file, pick it up, and move it to another system, you could run that virtual system on the new computer. That makes it very portable, and it also makes it very secure, because everything is self-contained in that single file. Since it is a single file, you can do versioning of this system, so you might occasionally take snapshots of your operating system.
And if you happen to install a bad piece of software or you get infected with malware, you can simply roll back to a previous snapshot, and your system is now in the same state it was in when you originally took that snapshot. You can store multiple snapshots, so every time you want to make a major change to your system, you simply take a snapshot first, and it’s very easy to roll back to a previous version. This is especially useful when you’re making a very big change to the operating system: you might be upgrading to a new service pack, or installing new hardware into that virtual world. If that update breaks something, it’s very easy to roll back to a previous version using the snapshot function.
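As one concrete example, VirtualBox exposes this workflow through its VBoxManage command-line tool. This is a hedged sketch, not the only way to do it: the VM name below is hypothetical, and the commands are guarded so they only run if that VM actually exists on your host:

```shell
VM="win10-test"   # hypothetical VM name; substitute one of your own

if command -v VBoxManage >/dev/null 2>&1 \
   && VBoxManage showvminfo "$VM" >/dev/null 2>&1; then
    # Take a snapshot before a big change (e.g., a service pack install)
    VBoxManage snapshot "$VM" take "pre-update" --description "before service pack"
    # ...install the update, test it...
    # If the update broke something, roll back to the snapshot
    # (the VM must be powered off before restoring)
    VBoxManage snapshot "$VM" restore "pre-update"
else
    echo "VBoxManage or VM '$VM' not available; commands shown for illustration"
fi
```

Other hypervisors (VMware, Hyper-V, KVM) have equivalent snapshot or checkpoint commands in their own tooling.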
Another nice security feature of these snapshots is that you have a way to go back in time to see when something may have changed. If the bad guy took advantage of a vulnerability and exploited your system, you could go back through previous snapshots to see exactly when that occurred. That can help you pin down, from a time perspective, the date when this particular vulnerability was first exploited. In organizations that are using this virtualization capability in their data center, there are a number of additional functions for performance.
One of these is called elasticity, which allows an organization to quickly stand up more systems and roll out more capacity when they need it, and then pull that capacity back and run fewer systems when they don’t. If there’s a certain time of day, a certain period of the quarter, or maybe a holiday season when you need more computing resources, it’s very easy to simply click on a particular image and deploy more systems to cover that excess load.
This can also be orchestrated. That means you’re able to set up automated processes so that every time you deploy a new system, clicking that button not only deploys the new server, it also loads the proper software for that server and then changes firewall rules to allow access to the new server. That’s just one example of how this orchestration works behind the scenes, so that you can easily deploy a server along with all of the support systems that go around it.
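As a toy sketch of that orchestration idea, here’s what the automated steps might look like chained together in a script. Every name is hypothetical, and the real provisioning and firewall calls are replaced with placeholder echo statements:

```shell
deploy_server() {
    name="$1"
    # Step 1: deploy the new virtual server
    # (placeholder for your hypervisor's or cloud provider's API call)
    echo "deploying VM '$name' from golden image"
    # Step 2: load the proper software for that server
    echo "installing web server package on '$name'"
    # Step 3: change firewall rules so clients can reach the new server
    echo "adding firewall rule: allow tcp/443 -> '$name'"
}

deploy_server "web-07"
```

Real orchestration platforms wrap these same steps, provision, configure, open access, into a single repeatable action.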
You’ve also got the ability to do this across data centers. If one data center is very busy but you have some excess capacity in another data center, the orchestration software allows you to simply drag a server over to the other data center. You’re now using those available resources, and your orchestration ensures that all of those support services will work properly once the server is running in the new data center. You can take advantage of some of these virtualization features in your security work.
If you need a new machine, you don’t have to purchase new hardware. You simply spin up a new virtual system running on your desktop or in your virtual environment. This is a very fast way to stand up a system to use for port scanning or vulnerability testing. You can perform all the testing that you need, and when you’re done, you simply remove that machine and allocate those resources to other virtual systems in your environment. You might also use virtual systems for testing software.
You can create a sandbox. Before you run software on a machine that’s out on your production network, you might want to spin up a virtual machine, run that software, and see if anything detrimental happens in the virtual environment. Once you’re comfortable with how the software runs in the virtual world, you can then deploy it to other systems in your environment, knowing that you’ve already tested it and confirmed that it isn’t something malicious. You might even find specialized software that runs on every person’s computer and acts as a virtual sandbox.
When an executable runs for the first time, that software runs it inside a virtual sandbox on that computer. That’s obviously a more advanced use of virtualization, but it’s one that has given us more capabilities in the security world to keep all of our systems safe.
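One way to sketch that throwaway sandbox workflow with VirtualBox is to clone a clean template VM, test inside the clone, and then delete it entirely. The template name here is hypothetical, and the commands are guarded so they only execute if that VM exists on your host:

```shell
TEMPLATE="golden-image"   # hypothetical clean VM to clone from

if command -v VBoxManage >/dev/null 2>&1 \
   && VBoxManage showvminfo "$TEMPLATE" >/dev/null 2>&1; then
    # Clone a disposable sandbox from the clean template
    VBoxManage clonevm "$TEMPLATE" --name "sandbox-01" --register
    VBoxManage startvm "sandbox-01" --type headless
    # ...run and observe the suspect software inside the sandbox...
    # When testing is done, power off and delete the sandbox entirely
    VBoxManage controlvm "sandbox-01" poweroff
    VBoxManage unregistervm "sandbox-01" --delete
else
    echo "VBoxManage or template VM not available; commands shown for illustration"
fi
```

Because the sandbox is deleted afterwards, anything malicious the software did stays contained in that disposable guest.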
Category: CompTIA Security+ SY0-401