Teaching Old IT Admins New Tricks

“These new systems would be perfect, if it weren’t for those damn users.”

The statement made me wince. But Jerry is a frustrated glass-house IT guy. As the VP of IT, he must incorporate current technology offerings to maximize competitiveness. It’s not only the users giving Jerry fits, though, it’s the flexibility of the client/server approach.

In the past, corporate approval came with a cost/benefit analysis. Jerry often backed into a series of cost-saving numbers to win support. Once signed off, he gathered information from the planned users and then, without their meddling, built them a solution. No user reviews. No requirements changed. No problems.

With client/server, though, his approach isn’t working. Things change too fast. Today’s solution can’t be quantified in the traditional cost/benefit analysis. The big wins aren’t just cost savings but better ways to get the results. Jerry was looking for insight because his credibility was fading fast.

I often see this “legacy garbage,” I told him. Many IT veterans don’t easily change, and until they do, their new solutions have a significant fail rate. Until they embrace the new methods, everybody loses: Users have a system that doesn’t meet their current needs, the corporation doesn’t know if it will get a positive return, and IT faces yet another underwhelming project.

But today, many IT professionals leverage the technology to their benefit. They can be flexible to user demands and still be in full control of the situation. With that explanation I got Jerry’s attention, but his expression was skeptical.

This approach allows the project team to determine the benefits of the new system before, during, and after the development. Commitment starts at the inception of the business solution. Rather than using the traditional snapshot cost/benefit analysis, we need to move to a more dynamic model.

Start by reviewing the corporate strategies and goals. Using an iterative refinement approach, frame and qualify a model that identifies the crucial characteristics of our solution. Once the solution’s critical constructs are known, we can at any point determine how well our system is helping the business.

However, building a successful BIM (Business Impact Model) requires solid knowledge as well as careful thought and work during each aspect of solution development. While this model lends itself well to iterative design, it can be used in a traditional waterfall approach. He was more interested.

Planning is critical

The most crucial point is during the planning phase. In this initial step, we identify the solution’s impacts, risks, and current use cases. Leveraging this information, the team applies insight, experience, and user hopes to define desirable use cases. Once these use cases are grouped into subsystems, stating the requirements, risks, and business impacts of our proposed solution is direct and quantifiable. By tying them to corporate initiatives, we can determine and assess the business impact of any changes in our project’s plans or focus.

During development, we can now easily address those untimely “damn user demands.” We can now at least acknowledge their requests. Jerry smiled. We were bonding.

Rather than trying to enforce rigid requirements that were developed before the project truly began, we can now embrace users’ suggestions as we prototype. We just inject their suggestions into our BIM and quantify the value of the impact. Armed with this information, we determine if the proposed enhancement makes our cut list.
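The column describes the BIM only in the abstract, but the mechanics it implies can be sketched in a few lines of code. Everything below — the class names, the dollar figures, and the risk-discounting rule — is a hypothetical illustration, not a prescribed implementation:

```python
from dataclasses import dataclass, field

@dataclass
class Impact:
    """A quantified business impact tied to a corporate initiative."""
    description: str
    initiative: str      # the corporate goal this impact supports
    annual_value: float  # estimated dollar value per year
    risk: float          # 0.0 (safe) to 1.0 (likely never to materialize)

    def expected_value(self) -> float:
        # Discount the raw value by the chance the impact never materializes.
        return self.annual_value * (1.0 - self.risk)

@dataclass
class BusinessImpactModel:
    impacts: list = field(default_factory=list)

    def total_expected_value(self) -> float:
        # Overall benefit the model currently predicts for the solution.
        return sum(i.expected_value() for i in self.impacts)

    def makes_cut_list(self, proposal: Impact, cost: float) -> bool:
        """Inject a user suggestion and ask whether its expected
        value outweighs its implementation cost."""
        return proposal.expected_value() > cost

# The planned system already carries one quantified impact.
bim = BusinessImpactModel()
bim.impacts.append(Impact("Faster order entry", "Reduce cycle time", 120_000, 0.2))

# A mid-project user suggestion: quantify it, then decide.
suggestion = Impact("Drag-and-drop scheduling", "Improve service levels", 40_000, 0.5)
print(bim.makes_cut_list(suggestion, cost=15_000))  # prints True: 20,000 > 15,000
```

The point of the sketch is that the decision is mechanical once the estimates exist: the debate with users shifts from “whether we allow changes” to “what value and risk we assign this change,” which is exactly the flexibility the column argues for.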

Also during the design process, we determine how best to collect the base measures underlying the identified impacts and risks. Easy access to these measures will let us accurately recompute our BIM after deployment.

After incorporating these information hooks into our solution, we are able to determine the production system’s vital stats. Since our BIM will now be using timely data instead of old estimates, we can get a precise reading as to how well our solution is helping the business. With direct ties to the corporate goals and quantifiable financial impact, we can provide some very solid information.

A welcome solution

Jerry was beaming. Like many IT people, he hates it when financial and business execs ask him to quantify the benefits of a deployed system. Anything that helps him fend off those nagging inquiries is welcome. A flexible approach that yields this information as a byproduct is a godsend.

Liking what he heard, Jerry was ready to tackle his next trouble spot. Grinning, he leaned forward and said, “Now about client/server benchmarks …”

Client/server deployment requires one piece of hardware you probably won’t find in the computer store: a new hat rack.

The challenges of setting strategic directions and policies, establishing standard platforms, and rolling out applications that share data have caused many companies to redeploy their information technology staffs.

What has emerged in the largest companies is a collaborative environment with some centralized control. Think of it not as a glass house, but a glass condominium. Corporate IT can’t afford to be an isolated tower of power.

At Chase Manhattan Bank in New York, for example, CIO Craig Goldman leads a “gang of 60” technologists in establishing global computing standards. Each technologist represents a business unit that budgets for, implements, and supports initiatives based on the standards.

The team approach eliminates an ivory tower. “They get in the boat and row with me,” said Goldman of his technologists. “We don’t have people who don’t have a sense of what it’s like in real life.”

However, collaboration does not mean eternal flexibility. “Once we’ve agreed on direction and standards, that becomes the law,” he said.

Like Chase, many other large organizations are adopting a split structure, where corporate IT establishes standards and a global infrastructure that departmental IT groups maintain, according to Richard Buchanan and John McCarthy, analysts at Forrester Research Inc., in Cambridge, Mass. (see chart, Page 23).

“Large-scale client/server requires a high order of planning, design, and construction specialization,” Buchanan and McCarthy wrote in a recent report. “Similarly, constructing a skyscraper requires greater sophistication than building a woodshed.”

If corporate IT wears a fedora, then departmental IT wears a hard hat.

Pendulum swing

The pendulum is swinging back from decentralized freedom to more central IT control as computing environments become more complex.

“As information becomes more global, and more distributed, it stands to reason that we can’t continue to support islands of automation insulated from the rest of the organization,” said John Daly, senior industry analyst for Summit Strategies Inc., a research firm in Boston.

“Somebody has got to be in control,” echoed Charles Vetters, CEO of Micro Technology, an Aurora, Colo., consultancy with about 8,000 client/server installations under its belt. “Companies are saying, ‘We need to be smaller, smarter, closer together. We don’t want five systems out there that don’t talk to each other.’”

Creating a smaller organization is often the result of implementing client/server — another reason to redeploy staff. “The five-year plan may not guarantee a reduction of resources, but it’s usually part of the cost analysis,” Vetters said. That corporate IT calls the shots is purely practical. “They’re the ones with access to the five-year plan, and maybe they don’t want to talk about it.”

Yet there are variations on the IT split. Harris Methodist Health System, a $750 million health-care services firm based in Fort Worth, Texas, has a single IT group that sets standards and deploys applications for its various businesses: a health-maintenance organization, 10 hospitals, an insurance company, a home-care company, and an international health-care subsidiary.

Somewhat like Chase, Harris hires user-group coordinators from each of the businesses to help IT establish standards and assigns customer-support representatives to the groups to deploy applications. User-group members meet for 18 hours each month to identify business initiatives and technology solutions that meet company standards.

What happens when somebody has an idea for an application that can only be deployed with non-standard technology?

“We do grant exceptions, but they must be approved by IT,” said Larry Blevins, Harris senior vice president and CIO. They must also be well-documented and represent a temporary cost of doing business, rather than a permanent addition to the firm’s technology overhead, which could mean looming support costs down the road, he said.

In general, according to Vetters, departmental efforts that stray from the corporate plan have to be paid for and supported by the departmental organization. With that caveat, “You get few renegades spending bucks on experiments.”

Just how much departmental IT organizations can do depends on how entrepreneurial or anarchic the corporate organization is, according to Summit Strategies’ Daly.

A large insurance or retail organization like Wal-Mart, with a need to share data across the entire organization, would have more centralized control. The other extreme might be found at a company like 3M Corp., which fosters a spirit of entrepreneurism, Daly said.

Whether corporate IT is separate from or grouped with departmental IT (see related story, Page 21), a distinction between business and technology initiatives no longer exists. Across the board, companies are deciding that business initiatives drive applications, and not vice versa.

Small and midsized firms have come up with some creative ways to structure IT — and still meet the toughest challenges of client/server deployment.

Most organizations with revenues under $1 billion, in fact, can effectively handle trouble spots such as systems management, multivendor complexity, and security with a single IT staff, according to Forrester Research Inc., of Cambridge, Mass. (see chart, Page 24).

If you don’t get IT, you don’t get it

RealCom Office Communications, a 10-year-old telecommunications provider with $78 million in sales, has a distributed approach to systems management.

The company supports about 60 remote users in 15 branch offices, according to Eric Nelson, vice president of IT at RealCom, in Chantilly, Va. General managers at each remote site report to Nelson, who coordinates development efforts.

The company also maintains a distributed approach to processing. Each branch has enough bandwidth to do local transaction processing fairly easily, Nelson said. At headquarters in Atlanta and in Chantilly, RealCom is moving to a data warehouse to support back-office and decision-support systems that don’t require real-time processing.

For security, RealCom requires branches to dial in to hubs in Atlanta or Chantilly to get anywhere else on the LAN. And it has tackled multivendor complexity by limiting itself to two suppliers: Novell Inc. and Microsoft Corp.

Spanning the network

Like other small firms, WorldSpan, a privately held developer of travel-related software and services, is struggling with systems management.

“That is an area of lively discussion here,” said Paul Halstead, vice president of distributed systems for the Atlanta company, which is trying to decide whether to keep systems management in one group or to split it into areas of database administration, capacity planning, and application development.

WorldSpan’s IT, which Halstead described as a “single, integrated organization,” already distributes responsibility for security among departments. “The network services group will provide access [to data] based on security approval by the appropriate department heads,” he said.

When Judy Estrin decides it’s time for startup No. 3, she needn’t worry about funding. All she has to do is look to Philip Greer. He’s a honcho at Weiss, Peck & Greer Venture Partners, which made some $57 million on Estrin’s first two startups — Bridge Communications and Network Computing Devices. If Estrin comes calling again, he says, only half-joking, “My response would be: ‘How much do you want?’”

Estrin, 39, who resigned as president and CEO of NCD just three weeks ago, doesn’t know what’s next. Not yet anyway. She wants to “decompress” from her 70-hour workweeks. Plus, she and husband Bill Carrico, NCD’s former chairman, feel better suited to running something smaller than $144 million NCD. But don’t expect the duo to stay out of the startup mode for long. Estrin figures she and Carrico will be back within a year.

Keep an eye on them. They’ve shown amazing prescience. When they founded Bridge in 1981, overall sales of internetworking devices totaled less than $50 million annually. Today, internetworking is a $3 billion-a-year business, says Lee Doyle of International Data Corp. And the X hardware and software market, in which NCD was again an early arrival, grew from almost nothing in 1988 to $1.5 billion today, according to Greg Blatnik of the X Business Group, in Fremont, Calif.

How does Estrin come by such a good eye? It may be in her genes. Her parents are both professors of computer science at UCLA. But she also credits her 44-year-old husband, someone she also calls “mentor.” Says Estrin: “Both of us are very keyed in with technology and where it’s going.” Each has an electrical engineering degree. Between them, they represent 40 years of high-tech experience.

Estrin, who’s been married to Carrico for seven years, says it’s unlikely she’d ever do a startup without him. “We complement each other, and we enjoy working together,” she says. They’re so inseparable a friend has taken to calling them “The Bill & Judy Show.”

They met at Zilog in 1979. Estrin was engineering manager, Carrico a business unit manager. Two years later, Carrico got the bug to do a startup and convinced Estrin, by then at Ungermann-Bass, to join him. It took them about six months to write a business plan and get venture funding.

Estrin says she’s drawn to “building markets,” which requires a lot of long hours. “Sometimes we wonder why we pick things that are so hard,” she says with a laugh. At Bridge, she not only had to create a company, she had to educate the market about how to tie networks together. “She spent more time on the road with customers than Bill or myself, and that enabled us to never miss a beat or a trend,” says Eric Benhamou, a Bridge co-founder and now CEO of 3Com Corp. 3Com acquired Bridge in 1987, but it’s Bridge’s basic strategy that prevails. Benhamou, who became the company’s chief in 1990, shifted it out of network operating systems and servers to concentrate on internetworking products. “The new strategy that has worked astonishingly well is similar to that of Bridge, just broader,” he says.

Estrin and Carrico left 3Com after nine months in a disagreement about the company’s direction. They planned to take six months off, but a friend asked them to hear a pitch from six people at a new company called Network Computing Devices. It didn’t take them long to recognize the virtue in the company’s plan to make something brainier than a dumb terminal but much cheaper than a workstation for use on LANs. They spent just one day calling industry people — including Estrin’s sister, Deborah, a computer science professor at USC — to confirm the X market’s promise.

Promising as the market may have been, Estrin, once again, had to work hard to nurture it. “An awful lot of education had to be done for X,” says IDC director Eileen O’Brien. “Judy appeared on practically every single panel, did hundreds of press interviews, and spent an awful lot of time with folks explaining X.” O’Brien dubbed her the “mother of X Windows.” Estrin was more than a market evangelist, however. She was instrumental in capturing paying customers for NCD. “She sold some of the biggest accounts we have,” says NCD director Greer.

Carrico passed Estrin the mantle of president and CEO in October 1993. Together at the top they broadened NCD’s software line to include E-mail and gateways. Now, as she passes the torch, she leaves behind some big challenges for her successor, Ed Marinaro, an NCD director and 25-year industry veteran. While NCD remains the largest X terminal vendor, Hewlett-Packard is gaining fast. And Sun Microsystems jumped into the market last year. Competition plus the PC price war are eroding NCD’s gross margins, which fell from a hefty 40 percent in 1992 to 32 percent today. It lost $9.2 million in the first quarter — due to the purchase of E-mail developer Z-Code Software Corp. And it reported a meager $182,000 second-quarter profit on sales of $41 million. Meanwhile, its stock price is sitting at $3.75 — down from its 1992 IPO price of $12.

But Estrin doesn’t see all this as departing on a down note. “Being at the top of one’s game doesn’t mean that your stock price is at its highest or your earnings are at their highest,” she says. NCD, she insists, is actually in better shape than it was a year ago — when its stock price was higher — because it’s well-positioned to make a stronger push in software.

Whatever happens at NCD, the fact remains that Estrin and Carrico worked their startup magic again. Estrin says she honestly doesn’t know what she’ll pull out of her sleeve next. “I’m purposefully not looking [for startup ideas],” she says. “I don’t want to get excited about something.” Whatever she decides on, you can be sure investors will be close by. Says Benhamou: “If I have an opportunity to invest in whatever they do next, I will.”
