Peer-to-peer potential rediscovered
By Susan Breidenbach
(IDG) -- Peer-to-peer has been generating lots of attention. Like PCs in the 1980s and the Web in the 1990s, industry watchers say it is one of those disruptive technologies that will turn much of computing upside-down.
This doesn't resonate well with IT professionals who have unpleasant memories of certain early LAN technologies. They call peer-to-peer insecure, unscalable and unmanageable.
But peer-to-peer technologies are slipping in the back door, and they are a lot less visible than PCs were. Cost isn't as much of a barrier, either. With industry giants such as Intel and Sun backing major peer-to-peer initiatives, it's time to reevaluate peer-to-peer.
"If you are supporting a group of employees who are collaborating with partners on a project, and they go from four people talking once a week and sharing five files to 12 people talking three times a day and sharing 500 files scattered across four companies, [peer-to-peer] technologies can make your job -- and theirs -- a lot easier," advises Andy Oram, editor of the book Peer-to-Peer.
This isn't Windows for Workgroups
Of course, peer-to-peer isn't a new concept. IP routing is peer-to-peer, as is the rest of the Internet's original foundation. But the sudden commercialization of the Internet in the mid-1990s imposed a client/server superstructure.
If peer-to-peer wouldn't scale in its previous incarnations, why is it ready now? Directory technologies, such as Lightweight Directory Access Protocol (LDAP), weren't around in the 1980s, and computing power, network bandwidth and storage capacity are thousands of times what they were. It took too much effort to set up and manage peer-to-peer connections, and the resources just didn't exist.
Now Sun is pushing peer-to-peer with Project JXTA, an open source initiative to develop standards. The goal is to let existing and future computing platforms of all types and sizes interact as peers.
"JXTA is the turning point that will make enterprise [peer-to-peer] possible," says Leon Guzenda, CTO for Objectivity, a Mountain View, Calif., developer of distributed database technology that uses peer-to-peer. "Sun has brought together what you need to build a kernel for [peer-to-peer], and they have partners to build the layers above it. Right now it's quite primitive, like an early version of the Unix kernel, but it's very powerful."
Peer-to-peer is a set of concepts rather than a specific technology. Industry analyst Mike Neuenschwander of The Burton Group defines it this way: A movement to network distributed content and resources that are valuable in aggregate, but must remain in the custody of their various owners.
"Business-to-business interactions come to mind here, because one IT department doesn't own all the resources," Neuenschwander says. "But if you can bring content to a centralized place, it is better to do so."
Pure peer-to-peer applications are relatively rare, and their utility in a corporate environment is still suspect. Peer-to-peer products targeting the corporate market tend to use a hybrid approach with some sort of central authority.
Endeavors Technology in Irvine, Calif., uses "brokered peer-to-peer." A registration server functions as a gatekeeper for people entering and leaving a particular peer-to-peer community. But once individuals have joined that community, all communication takes place directly between peers.
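The brokered model can be sketched in a few lines of Python. This is an illustration of the pattern only, not Endeavors' product: a broker admits each newcomer and hands back the current membership, after which messages travel peer to peer with no broker involvement.

```python
class Broker:
    """Gatekeeper for a peer community (illustrative sketch).
    Peers join and leave through the broker, but once admitted
    they exchange messages directly with one another."""
    def __init__(self):
        self.members = {}                 # name -> Peer

    def join(self, peer):
        others = dict(self.members)       # peers already in the community
        self.members[peer.name] = peer
        return others                     # contacts for direct messaging

    def leave(self, peer):
        self.members.pop(peer.name, None)

class Peer:
    def __init__(self, name, broker):
        self.name = name
        self.inbox = []
        self.contacts = broker.join(self) # register once, up front

    def send(self, peer_name, msg):
        # Direct delivery: the broker is not involved after joining.
        self.contacts[peer_name].inbox.append((self.name, msg))
```

The broker sees joins and leaves, so it can enforce admission policy, but it never sits in the data path.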
Creating virtual supercomputers
The most dramatic peer-to-peer technology involves reclaiming unused computing cycles on desktop computers and harnessing them into a virtual supercomputer. This platform can run huge applications that are "pleasingly parallelizable" because they can be broken into small pieces and run on separate machines.
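The "pleasingly parallelizable" pattern amounts to cutting a large job into independent pieces, running the pieces on separate machines, and combining the partial results. A minimal sketch, with a local thread pool standing in for remote desktop peers (the chunking and recombining logic is the point, not the transport):

```python
from concurrent.futures import ThreadPoolExecutor

def chunk(data, n):
    """Split a job into n roughly equal, independent pieces."""
    size = (len(data) + n - 1) // n
    return [data[i:i + size] for i in range(0, len(data), size)]

def work(piece):
    # Stand-in for a compute-heavy task. Each piece needs no data
    # from any other piece, which is what makes the whole job
    # "pleasingly parallelizable".
    return sum(x * x for x in piece)

def run_parallel(data, workers=4):
    with ThreadPoolExecutor(max_workers=workers) as pool:
        partials = pool.map(work, chunk(data, workers))
    return sum(partials)   # combine the partial results
```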
According to the Omni Consulting Group, about 47 percent of computing cycles in the average company go unused, and that figure includes heavily used servers and hosts. When desktops alone are considered, the percentage is much higher.
Some early grid computing initiatives have been philanthropic, getting people to donate computing cycles for medical research and the SETI@home project. The latter claims the title of world's largest distributed supercomputer, with more than 25 teraflops of processing power.
However, Charlie Catlett, a senior fellow at Argonne National Laboratory and chair of the Global Grid Forum (GGF), says businesses are taking notice. "Commercial enterprises now account for about 20 percent of the attendees at the GGF meetings," he says.
Commercial application of grid computing has been going on for quite some time. Manufacturers such as Pratt & Whitney and Boeing have used grids of computers instead of wind tunnels to simulate and analyze the flow of wind over a structure. Similarly, distributed workstations are being used to conduct seismic analysis, crash simulations and risk analysis.
"If a computational task can be done in a short time, [peer-to-peer] doesn't make sense," says Paul Kirschner, a senior project analyst with United Technologies Research Center in Hartford, Conn. "If it will take hours, [peer-to-peer] might be a fit."
Stephen Elbert, director of applications for Entropia, says the Human Genome Project depends on grid computing. Genome sequencing that used to take years can now be completed in weeks or days.
The San Diego company's Entropia 3000 software takes a large application, splits it into small tasks and distributes them to participating PCs. The server monitors their progress and can reassign tasks. The tasks run entirely in the background, and a sandboxing technology isolates this activity and prevents the server from seeing the PC user's files. "Bioinformatics is one industry that really needs grid computing," Elbert says. "They are crunching much larger data sets, and doing in-silico rather than in vitro or in vivo research."
Three years ago, J.P. Morgan started evaluating grid computing for compute-intense risk-management systems, but could not find a ready-made product that fit the bill. The investment bank -- now J.P. Morgan Chase -- built a homegrown technology instead.
"When you can add a second 1-GHz CPU to a desktop for $600 instead of getting an additional CPU on an enterprise server for $15,000, that looks pretty good," says Steven Neiman, who heads up high-performance computing for J.P. Morgan Chase.
The tradeoff is that peer-to-peer raises maintenance costs. To use peer-to-peer, the financial firm had to modify some of its desktop management policies and educate users. There was an up-front development cost of about $2,000 per desktop, along with an annual maintenance cost of about $600 per desktop.
However, with the recent emergence of products from such vendors as DataSynapse, Entropia and United Devices, J.P. Morgan Chase would like to migrate from its proprietary technology to a vendor-supported turnkey platform that could be used to expand peer-to-peer operations.
Neiman says cost alone isn't what makes peer-to-peer compelling. Longer term, he says he hopes Internet-based peer-to-peer will enable naturally distributed applications that can gather competitive market intelligence. Such programs would go out to the Web and mine hundreds of sites. The information could be crunched and distilled on distributed peer-to-peer nodes before it is brought back through the firewall, effectively creating an edge server network in reverse.
Content distribution and file sharing
Most of the files in today's companies are on PCs, not servers, and peer-to-peer can let you see all these storage assets as one big distributed file space. A workgroup member might even be able to find the sketch of an idea you've just begun on your PDA.
The FedStats Interagency Task Force uses NextPage's NXT3 to syndicate content across some 60 portals maintained individually by agencies of the federal government. NXT3 uses XML for messaging among the servers and for executing search requests across them. The content remains distributed, but users see it as a single source.
"XML tagging, combined with HTML and the [peer-to-peer] architecture of NXT3, enables you to pull fragments of distributed content together on the fly and create new documents," says task force member Brand Niemann, a computer scientist with the Environmental Protection Agency.
The proof-of-concept pilot, which used only public information that resided outside agency firewalls, helped overcome initial resistance. "Once people saw what it could do, they couldn't get it fast enough," Niemann reports.
Intel uses peer-to-peer to streamline the distribution of computer-based training materials to employees. The IT department didn't want people to download huge multimedia files from a central server, so programmers built an application called Share and Learn and deployed it on every desktop. When a user clicks on one of the listed courses, the application searches for the courseware locally and then gradually widens the search. Once a user downloads the material, the application knows the closest place to find it.
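The search-then-cache behavior described above can be sketched as a lookup that checks the requesting machine first, then progressively wider groups of peers, copying whatever it finds into the local cache so the next request stays close. This is a sketch of the idea under assumed names, not Intel's Share and Learn code.

```python
class CachePeer:
    """A peer with a local file cache (hypothetical stand-in)."""
    def __init__(self, files=None):
        self.cache = dict(files or {})

def fetch(name, peer, rings):
    """Look for a file locally, then search progressively wider
    groups of peers; `rings` is a list of peer groups ordered
    near-to-far. Found files are cached locally, so later
    requests are satisfied nearby instead of at the origin."""
    if name in peer.cache:
        return peer.cache[name]
    for ring in rings:                  # gradually widen the search
        for neighbor in ring:
            if name in neighbor.cache:
                peer.cache[name] = neighbor.cache[name]  # cache locally
                return peer.cache[name]
    raise FileNotFoundError(name)
```

Each download seeds another nearby copy, which is what shifts load off the central server and the wide-area links.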
"By effectively caching the files on the machines of the people who first download them, we have reduced the burden on our network substantially," says Bob Knighten, a peer-to-peer evangelist at Intel's Microprocessor Research Laboratory in Beaverton, Ore.
Collaboration is the peer-to-peer category that really brings power to the people. It is transforming the Web into a much more personalized environment, in which you can share information on your own terms.
"Two brains are better than one, and [peer-to-peer] enables real-time knowledge sharing," says Vijay Srinivasan, president of Global eTech, a San Jose system integrator building business applications with Endeavors' peer-to-peer technology. "[Peer-to-peer] applications will make it easier for people to do their work. In contrast, workflow can be more complicated to support with a server-based architecture, and consequently it gets restricted."
For the past four years, government contractor Syntek Technologies in Arlington, Va., has worked on a Defense Advanced Research Projects Agency program aimed at improving government decision-making.
"We found that decisions are better when people make them collectively in small groups," says Greg Mack, vice president of IT and internetworking at Syntek. "Also, it's important to get the right people involved, and all the right people aren't always present. So we needed a way to do collaboration among distributed people on distributed systems."
Last fall, Mack and his team discovered Groove, the brainchild of Lotus Notes inventor Ray Ozzie. They became beta version users of the software and have been pleased with it. The whole environment is on each local system, so people can work offline. Changes are automatically cached, and updates are made in both directions when an individual reconnects.
"With products like Groove, you can collaborate with people outside your company, and you don't have to figure out who is going to set up a resource group," The Burton Group's Neuenschwander says. "The IT people don't have to set up servers and punch holes in firewalls."
Peer-to-peer facilitates ad hoc collaboration around a context and can enable secondary e-commerce activities to develop around e-business communities. For example, an automobile manufacturer can reduce the inventory of parts at its distribution centers by encouraging parts arbitrage among dealers. This reduces inventory and shipping costs and alleviates distribution bottlenecks.
Security has been a major barrier to peer-to-peer adoption. The different platforms being used within a company and across an extranet all have different security systems, and it is hard to get them to interoperate. People end up using the lowest-common-denominator features. The peer-to-peer community is trying to adapt existing security standards such as Kerberos and X.509 certificates.
"In grid computing, what you are really trying to do is take separate resources and build loose federations, often on the fly," says Marty Humphrey, co-chair of the GGF's Security Working Group. "Kerberos is more of a centralized technology, and doesn't scale well across a distributed environment."
Similarly, Secure Sockets Layer (SSL) and X.509 don't allow for single sign-on or delegation. The GGF's Security Working Group has proposed a standard for X.509 proxy credentials that would let one peer "impersonate" another by delegating identity remotely as part of the SSL protocol.
Collaborative peer-to-peer products, such as Groove and Endeavors' Magi 2.0, cater to corporate environments by incorporating strong encryption and authentication technologies. The products basically implement a public-key infrastructure that is used automatically in ordinary exchanges between peers.
"When we have conversations that are sensitive, we now use Groove instead of e-mail," Syntek's Mack says. "It is much more secure than instant messaging."
Ironically, peer-to-peer environments have some inherent resilience to attack that client/server architectures do not. When information is distributed, there is no convenient point of attack for intruders. Similarly, peer-to-peer platforms are inherently fault-tolerant because a single system going down has little or no impact.
Peer-to-peer technologies can also be used to improve security in e-business environments by providing fine-grained access controls. "We need a more lateral approach to security," says Andrew Grimshaw, founder and CTO of Avaki, a peer-to-peer developer in Cambridge, Mass. "It opens up the network, but in a very constrained way. You are controlling things at the software layer rather than at the network layer."
Nevertheless, the prospect of peer-to-peer connections across organizations tends to give network professionals nightmares about corporate espionage and new ways to spread viruses.
"When you go through a firewall to a public network or extended private network, management issues go up an order of magnitude and security and privacy issues go up several orders of magnitude," says Terry Retter, director of strategic technology services at the PricewaterhouseCoopers Global Technology Centre in Menlo Park, Calif.
"People need to stop thinking about protecting computers and start thinking about protecting information," Retter says. "That shift hasn't happened yet. But think ahead to when we have several billion handheld devices accessing all kinds of things. You have to firewall almost at the object level. We're not there yet, behaviorally or technically."
Managing the environment
Questions about peer-to-peer manageability must be weighed, especially as people, processes and computing power move to the edge and beyond it.
"IT professionals are in denial about what is going on among the desktops," Neuenschwander says. "They are not really in control of content and of what employees do with it. Peer-to-peer technology could give you better control or at least better knowledge of what is happening on the desktop."
JXTA includes peer-monitoring hooks that will enable the management of a peer node. People can build visualization tools that show how much traffic a particular node is getting. With such information, a network manager can decide to increase or throttle bandwidth on various nodes, or implement a different level of security.
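The kind of data such monitoring hooks would feed a management tool can be sketched with a simple per-node traffic counter. This is an illustrative sketch of the idea, not JXTA's actual API: totals per node let a manager rank the busiest peers and flag candidates for throttling or extra scrutiny.

```python
from collections import defaultdict

class TrafficMonitor:
    """Per-node traffic counters of the sort a JXTA-style
    monitoring hook could feed (illustrative sketch)."""
    def __init__(self):
        self.traffic = defaultdict(int)   # node -> total bytes seen

    def record(self, node, nbytes):
        self.traffic[node] += nbytes

    def hottest(self, n=3):
        # Nodes ranked by traffic, busiest first.
        return sorted(self.traffic, key=self.traffic.get, reverse=True)[:n]

    def over_limit(self, limit):
        # Candidates for throttling or a different security level.
        return [node for node, b in self.traffic.items() if b > limit]
```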
Meanwhile, companies are already finding it costs less to administer laptops in the field on a peer-to-peer basis.
"Say there are mobile users in the field who need new virus signatures or other software updates," says Frank Bernhard, managing principal in charge of Omni Consulting Group's supply chain and telecommunications practice. "I need a way to go out and scan these computers and see what's on them. I can set up [peer-to-peer] relationships and distribute fixes and patches more efficiently than I could over the general network. It enables customization to each desktop rather than a general download to everyone."
Administrative overhead is one of the gating factors for peer-to-peer applications; they must be easy to deploy and use. If you opt for password-based security, make sure the product integrates directly with your company's LDAP implementation. Also look for tools that support legacy data with minimal fuss and monitoring capabilities that can collect statistics about usage.
When peer-to-peer happens, it will happen because of a killer application, and your users will figure out a way to implement it themselves. It's time to seize the initiative so you can stay on top of the situation.
Today's peer-to-peer efforts are being compared with the advent of Mosaic eight years ago. It is very early, and many things companies are developing will never fly. But some of them will, and they may be very disruptive.
According to Neuenschwander, technological objections to peer-to-peer are often just a smoke screen for philosophical differences. Peer-to-peer enables "me-centric" computing, in which the view is lateral or bottom-up rather than hierarchical and top-down. It is a cultural movement that attempts to align computing with the needs of the individual who is using the software, and it is meeting with a lot of resistance from the powers that be.
"Maybe [peer-to-peer] has to come in through the back door, like PCs did," concludes Garry Allen, a consultant in Kingston, Ontario, who has 20 years of experience with custom programming and system design. "Any IT professional that is not looking seriously at [peer-to-peer] is going to become a dinosaur."