Shortly after completing a deployment of virtualized servers, St. Vincent’s Hospital’s IT staff concluded that desktop virtualization technology was pretty much a no-brainer, said Kane Edupuganti, the hospital’s director of IT operations and communications.
The medical facility has about 5,000 desktops, many of them five or six years old and running outdated operating systems, at 42 sites across the five New York City boroughs. “Every time a desktop died, our engineers had to open it up, figure out what was wrong, replace the part, or load the software onto a new machine,” Edupuganti said.
His team considered and rejected a thin client system because the team would still have to maintain an OS and moving parts on those 5,000 distributed endpoints, Edupuganti said. The team ultimately chose Pano Logic Inc.’s Pano, a virtual zero-client device that is essentially “a dumb terminal with a mouse, with no moving parts.”
Now, when a device dies or needs repairs, the desktop engineer takes it offline and installs a new box, which is registered on the server and booted up in minutes. Power use per desktop went from 150 W to just 5 W, Edupuganti reported. In addition, physicians and nurses can access their desktop resources anytime, anywhere, at work or at home.
First, however, Edupuganti’s team had to deal with virtual-desktop infrastructure issues. Because Pano zero-client devices depend on a server host for all their processing, they require a minimum dedicated bandwidth of 10 Mbps to avoid network latency problems. Fortunately, St. Vincent’s Hospital had upgraded its metropolitan area network to Ethernet Virtual Private Line links, which would provide that, Edupuganti said. Storage availability was also an issue, because virtual client devices have no local hard drives.
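The figures cited in the story make the infrastructure math easy to rough out. The sketch below is illustrative only: it plugs in the 5,000-desktop count, the 10 Mbps-per-client minimum and the 150 W-to-5 W power numbers reported above, and assumes the worst case of every client active at once.

```python
# Back-of-the-envelope sizing from the figures cited in the story.
# Illustrative only; a real deployment must measure actual traffic.

DESKTOPS = 5000                  # St. Vincent's endpoint count
MBPS_PER_CLIENT = 10             # minimum dedicated bandwidth per zero client
OLD_WATTS, NEW_WATTS = 150, 5    # per-desktop power draw before/after

def aggregate_bandwidth_gbps(clients: int, mbps_each: float) -> float:
    """Worst-case aggregate bandwidth if every client is active at once."""
    return clients * mbps_each / 1000

def power_savings_kw(clients: int, old_w: float, new_w: float) -> float:
    """Total power saved across all endpoints, in kilowatts."""
    return clients * (old_w - new_w) / 1000

print(aggregate_bandwidth_gbps(DESKTOPS, MBPS_PER_CLIENT))  # 50.0 (Gbps)
print(power_savings_kw(DESKTOPS, OLD_WATTS, NEW_WATTS))     # 725.0 (kW)
```

The worst-case aggregate is rarely reached in practice, which is why display protocols that compress traffic matter so much in these deployments.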
During the next three to five years, most health care providers will exploit some form of virtualization technology, according to Barry Runyon, a research vice president at Gartner Inc. As the St. Vincent’s Hospital case illustrates, the rewards can be great. Centralized desktop and software provisioning means it’s easier to deploy and configure new desktops and software updates, he said.
Virtualization technology also makes security administration both easier and more effective -- when the user logs off, all local data disappears, leaving unauthorized users with nothing to access, Runyon said. All software updates and patches are managed centrally, and employees have no way of downloading dubious programs or malware from the Web.
A successful deployment of desktop virtualization technology nevertheless requires as much up-front planning and IT head-sweat as deploying server virtualization, industry experts warn. A companion tip examines what it takes to set up a virtual server environment. What follows are some strategies and tips from industry experts and IT managers who have been involved with desktop virtualization.
Choosing desktop virtualization hardware: Thin, thinner, zero?
Desktop virtualization technology typically refers to a hosted virtual desktop setup in which desktop “images,” including the OS environment and data files, are centrally maintained on a host server and served up on demand when a user device logs on.
Desktop virtualization technology vendors include Citrix Systems Inc., Dell Inc., Hewlett-Packard Co., IBM, Microsoft, MokaFive Inc., NEC Corp., Pano Logic, Sentillion Inc. (recently purchased by Microsoft), Symantec Corp. and VMware Inc. Products differ widely in terms of the OS and hardware platforms they support, and, at least as important, the extent of their dependency on a host server.
At one end of the scale are zero client devices, such as Pano Logic’s Pano. These are essentially dumb terminals, with no resident software and no local processing power or memory.
One step up are thin clients; these offer sufficient local resources for users to do some work offline but lack persistent data storage. When a connection is lost, the software automatically caches the data and reestablishes the session when the connection is restored.
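The cache-and-reconnect behavior described above can be sketched conceptually. The class below is a hypothetical illustration of the pattern, not any vendor’s actual client code: writes go straight to the host while connected, queue locally during an outage, and flush in order when the session is reestablished.

```python
# Conceptual sketch of a thin client's cache-and-resync loop.
# Hypothetical illustration only -- not any vendor's implementation.

class ThinClientSession:
    def __init__(self):
        self.connected = True
        self.pending = []        # writes cached locally while offline

    def write(self, record, send_to_server):
        """Send a write to the host if connected; otherwise cache it."""
        if self.connected:
            send_to_server(record)
        else:
            self.pending.append(record)

    def on_disconnect(self):
        self.connected = False

    def on_reconnect(self, send_to_server):
        """Reestablish the session and flush cached writes in order."""
        self.connected = True
        while self.pending:
            send_to_server(self.pending.pop(0))
```

Because thin clients lack persistent storage, the `pending` cache here would live only in memory and be discarded once flushed, consistent with the security wipe behavior described later in the article.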
Oklahoma Arthritis Center, in Edmond, Okla., plans to use VMware View 4.0’s offline connection option because its care providers need constant desktop availability to fill out or access patient information, said Chris Nelson, the center’s director of IT.
Some thin clients have both local processing and persistent storage capabilities. Mosaic, a nonprofit care provider in Omaha, Neb., for intellectually challenged adults and children, chose this option partly to minimize network traffic.
“We had 40 agencies operating in 14 states, each with its own localized Windows network and data,” said Thomas Keown, Mosaic’s data storage and security administrator. Now, all agency employees use MediaLogic’s NoMachine NX thin clients to run most applications on Mosaic’s server farm, also in Omaha. They can upload and work offline on data that is routinely cleaned out within 24 hours for security reasons. NoMachine NX uses a compression algorithm to minimize network traffic. In addition, users can access the Internet and check their email locally. Bandwidth use is only about 30 Kbps per user, Keown said.
Shoppers also need to consider the type of desktop virtualization hardware a virtual client supports. Mosaic, for example, could use NoMachine NX on 80% of its legacy machines, Keown said. “And we can buy refurbished Linux desktops with more functionality than we need for about $120 each.”
In contrast, many of St. Vincent’s Hospital’s PCs were obsolete power hogs that took 10 minutes to boot up and “were ready for the scrap heap,” Edupuganti said. Replacing these clunkers with Pano Logic’s small, cheap, energy-conserving Pano devices was thus a win-win, he said.
Deployment challenges of desktop virtualization technology
Virtualized desktops are like infants that cannot cut loose from the umbilical cord that attaches them to the data center. This dependency can put a heavy demand on corporate computing, storage and network resources. “A hosted virtual desktop often means a forklift upgrade in the infrastructure,” said Gartner’s Runyon.
Given the inevitable growth of application and user capacity requirements, performance monitoring has to be ongoing to maintain acceptable service levels, industry sources warn.
“We hadn’t anticipated the memory usage we would need on servers, with 80 virtual desktop users concurrent on [the] system,” said Mosaic’s Keown. Response time became so slow that end users, including Keown himself, couldn’t get work done. He brought in a specialist to upgrade the IBM blades, then told his IT director that the project couldn’t go forward unless at least one spare server was available to plug in as needed. Mosaic now has 500 end users and capacity for 600.
Desktop virtualization technology also needs constant access to the corporate storage area network, because the client has no local memory store. St. Vincent’s Hospital’s Edupuganti found this out the hard way, when budget constraints forced him to put off installing a backup SAN: “Our single SAN did take a hit, and we were sitting ducks -- all the desktops went out.” The hospital now has a secondary SAN, he said.
On the bright side, as the industry matures, vendors are introducing more resource-conserving features. One such feature is provisioning a virtual client image on demand, rather than maintaining a persistent image for each user on the server. This saves memory space, which is important for organizations that have thousands of desktop clients. It also allows client images to be tailored to individual user needs.
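The storage difference between persistent per-user images and on-demand provisioning can be sketched in a few lines. The image sizes below are assumptions chosen for illustration, not figures from any vendor or from the organizations in this story.

```python
# Conceptual comparison: persistent per-user images vs. on-demand
# provisioning from one shared golden image. Sizes are hypothetical.

GOLDEN_IMAGE_GB = 20.0    # one shared base OS image (assumed size)
PER_USER_DELTA_GB = 2.0   # per-user settings/data layered on top (assumed)

def persistent_storage_gb(users: int) -> float:
    """Every user keeps a full private image on the server."""
    return users * GOLDEN_IMAGE_GB

def on_demand_storage_gb(users: int) -> float:
    """One golden image, plus a small per-user customization delta."""
    return GOLDEN_IMAGE_GB + users * PER_USER_DELTA_GB

print(persistent_storage_gb(5000))  # 100000.0 (GB)
print(on_demand_storage_gb(5000))   # 10020.0 (GB)
```

Under these assumed numbers, a 5,000-seat shop cuts server-side image storage by roughly an order of magnitude, which is the trade-off the paragraph above describes; the per-user delta is also what lets images be tailored to individual users.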
On the network side, meanwhile, “you need enterprise class switches that support VLANs [virtual local area networks] so you can configure your SAN onto two different switches,” Oklahoma Arthritis Center’s Nelson advised. This is both for redundancy, in case one SAN fails, and also “to segment iSCSI traffic so it doesn’t flood the main network,” he said.
Health care providers should choose desktop virtualization technology that supports PC-over-IP, Nelson said, because this remote display protocol greatly improves performance when high-resolution graphical images or medical records are being transmitted.
Getting end users to embrace the virtualized client
Whatever virtualized client you choose, count on end users to resist the idea of giving up their PC autonomy, industry sources warn. “Our biggest hurdle was convincing employees to give up their hard drives, which for some, meant giving up their computers,” Mosaic’s Keown said.
That’s why involving business users up front in a deployment of desktop virtualization technology is so crucial. It can not only lower their resistance but also provide worthwhile feedback.
For instance, Mosaic’s senior leadership proposed the idea of desktop virtualization as a collaborative tool, Keown reported. “They used to have to drag flash drives around to get or leave documents. They said, ‘Hey, I work with five, six people in the state, can we set up folders to share stuff?’”
Mosaic’s IT department is now in “responsive mode,” inviting all business users to come up with ideas, Keown said. “I can’t anticipate how they’ll want to work, but I can see flashbulbs going off in people’s heads.”
Elisabeth Horwitt is a contributing writer based in Waban, Mass. Let us know what you think about the story; email email@example.com.