
Forum

The ICANN Framework Better Than Going to Court

As a practicing ICANN domain name dispute arbitrator, I found Michael Froomkin’s article "The Collision of Trademarks, Domain Names, and Due Process in Cyberspace" (Feb. 2001, p. 91) a bit too critical.

First of all, it should be noted that about 50% of the respondents choose not to reply, which certainly helps explain the relatively high percentage of findings in favor of the complainant. In all the cases I have handled where there was a default by the respondent, it was quite clear the respondent was in fact guilty of cybersquatting, that is, of obvious and gross misuse of someone else’s trademark.

As I understand it, the ICANN framework was meant to provide a fast and inexpensive method to resolve disputes that would take much longer to hash out in court. If a party is not happy with the admittedly "quick and dirty" ICANN process, it can always appeal through the regular court system. Thus the ICANN framework results in faster, less expensive resolution of disputes, at no loss in final quality, since the courts always have the last word.

It is true, as Froomkin states, that the ICANN framework has shifted the average cost of disputes from the trademark owners (who, before ICANN, had to go to court) to the respondents (who, after ICANN, have to go to court if they are not happy with ICANN decisions). But this shift can be considered undesirable only if it is thought the only recourse against obvious and flagrant cybersquatting and domain name piracy should be the conventional court procedures protecting trademarks.

A majority of the legal and IT community appears to believe that alternatives to court procedures are appropriate, and the ICANN process, despite its imperfections, is not, on the whole, a bad alternative.

Richard Hill
Geneva, Switzerland


Drop the Adjective

Digital Library.

Doesn’t it strike you, too? That sort of "gee whiz" quality of the phrase (see special section on "Digital Libraries," May 2001). So new, so modern, so—digital. And so much more, so much better, than a plain old library (sneer), which even the school geek wouldn’t be seen in now.

But a distinction without a difference, perhaps? I mean, the old brick-and-mortar library in my town may not be electronically integrated, but it has CDs in it; they’re digital, aren’t they? It even contains some Internet terminals. Ah, but I suppose a digital library isn’t allowed to have those clunky old books.

It reminds me of "transistor radio" or maybe even "motor car." And, like them, after a few years, no one will be caught dead including the adjective "digital" (except ironically). Heard it in science fiction lately?

Can we declare this irritating phase over now, please? Lose the "digital." It’s a library.

George Ellingham
London, England


Keep E-Journals Affordable

I’d like to add my support to the "Viewpoint" expressed by Krzysztof Apt ("One More Revolution to Make: Free Scientific Publishing," May 2001, p. 25). The academic community is alarmed by recent, substantial price increases by commercial publishers.

I don’t think access has to be free, as long as there is a commitment to the cost being affordable. This should be an easy commitment for a learned society such as ACM, which already has an excellent record of producing print journals its members can afford.

I have some knowledge of the economics of e-journals through my editorship of an e-journal published by the London Mathematical Society. It is possible to produce an affordable refereed journal of high typographic standard.

ACM can’t and shouldn’t shoulder the entire burden, but with its prominent position, it could play an exemplary role by launching some new e-journals.

Lawrence C. Paulson
Cambridge, England


Needs and Adoption of Process

Thank you for the column "Software Process Improvement in the Small" ("Thinking Objectively," Apr. 2001, p. 105). This is one of the best introductions to why a process is needed, as well as to the risks of process adoption. I like very much, and subscribe to, the authors’ conclusions, including: "The discussion to avoid is the one in which formal processes are compared out of the context of the particular group’s particular circumstances. We believe that abstract side-by-side comparisons of process models are essentially meaningless."

I intend to use it (quoting the original, of course).

Adriano Comai
Torino, Italy


Give Security Accreditation a Chance

Bruce Schneier argues against a "Cyber Underwriters Laboratory," primarily in response to the business plans of the recently formed Center for Internet Security ("Inside Risks," Apr. 2001, p. 128). While I generally agree with Schneier’s opinions about the difficulty of rating security products, I worry that readers will think any kind of security accreditation capability is a bad idea.

As Schneier often writes, "Security is a process, not a product." There are many processes that constitute an organization’s information security posture. They range from physical security (guns, guards, gates), to personnel security (required background checks for sensitive positions), to operations security (regular updating of antiviral signature files). Surely a collection of generally accepted processes could form the basis for a security process accreditation model. A successful model would be used by industry as the basis for measurement and improvement, as well as third-party accreditation (for instance, a cyber UL).

There are many pitfalls in implementing an effective security process accreditation model. But that doesn’t mean we shouldn’t try to create one.

Bill Brykczynski
Fairfax, VA


Keeping Out Cookies

I do not let cookies into my system, and I tell others to do the same. Up to now, I have had difficulty explaining why this should be the case in terms of invasion of privacy and basic system protection. Now I can cite Hal Berghel’s informative column "Caustic Cookies" ("Digital Village," May 2001, p. 19) to set them straight.

I am amazed at the chutzpah of browser designers assuming they have the right to enter my computer and deposit "Web guano" (Berghel’s term) on my hard drive. I liken it to putting my calling card into your wallet or handbag every time we meet. You would call the cops if I tried to do that.

Saul I. Gass
College Park, MD


The Joy of Coding

Henry Lieberman and Christopher Fry ("Will Software Ever Work?," Mar. 2001, p. 122) suggest users ought to be able to fix software bugs. They hint at a possible mechanism, implying artificial intelligence, but give no details. One suggestion they do make is to provide a better coding language, since C and C++ are not suitable for debugging, at least by users.

We believe no new coding language can ever provide the desired results. On the contrary, users should rarely write or interact with code at all. Most of the problems in large-scale software arise from the impenetrability of the code, particularly the control flow of that code. Real-world software has to take account of all the errors that may be encountered, so even software for seemingly simple tasks becomes complex and eventually unmanageable.

For more than 10 years, an alternative software design approach (F. Wagner, "VFSM Executable Specification," in Proceedings of the IEEE International Conference on Computer Systems and Software Engineering, 1992) has been used in a wide variety of projects. It formalizes the control flow into a hierarchy of finite state machines and isolates all numerical calculations (including input-output routines) into separate, small, easily tested code modules. The control flow is prepared in a formal manner, by means of state-transition tables that can call on numerical or other procedures on entry to, or on exit from, each state. This control-flow specification is then transformed into a special format, but "code" is not generated; the specification is interpreted by a runtime "executor" program, which in a sense understands the specification and fulfills the intentions of the system designer.
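The general shape of such a design can be sketched as follows. This is a minimal illustration of a table-driven state machine with a generic executor, not the authors' VFSM tooling; every name in it (DOOR_SPEC, run, the action routines) is hypothetical.

```python
# Minimal sketch of a table-driven state machine "executor" (illustrative only).

# Small, separately testable action routines (the isolated "code modules").
def start_motor():
    print("motor on")

def stop_motor():
    print("motor off")

# The control flow lives in a declarative specification:
# state -> {"entry": action, "exit": action, "on": {event: next_state}}
DOOR_SPEC = {
    "Closed":  {"entry": stop_motor,  "on": {"open_cmd": "Opening"}},
    "Opening": {"entry": start_motor, "on": {"limit_hit": "Open"}},
    "Open":    {"entry": stop_motor,  "on": {"close_cmd": "Closing"}},
    "Closing": {"entry": start_motor, "on": {"limit_hit": "Closed"}},
}

def run(spec, state, events):
    """Generic executor: interprets the specification; no code is generated."""
    for event in events:
        target = spec[state]["on"].get(event)
        if target is None:
            continue                      # event not relevant in this state
        exit_action = spec[state].get("exit")
        if exit_action:
            exit_action()                 # action on leaving the old state
        state = target
        entry_action = spec[state].get("entry")
        if entry_action:
            entry_action()                # action on entering the new state
    return state

if __name__ == "__main__":
    final = run(DOOR_SPEC, "Closed",
                ["open_cmd", "limit_hit", "close_cmd", "limit_hit"])
    print("final state:", final)
```

The point of the structure is that changing the system's behavior means editing the declarative table; the executor itself never changes.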

The latest versions of this technology include a real-time database that facilitates the communications between many state machines, supports distributed systems, and formalizes access to input-output services, even for the most complex control-system applications.

When these techniques are fully supported, they turn software development into something resembling engineering design. Such software can be developed by engineers in close touch with the needs of the user, perhaps even by the users themselves. Although some conventional coding is still required, it does not demand the high level of skill that conventional methods do. When, perhaps in the testing phase, users need to change the operation of a part of the system, they are obliged to alter the formal specification and can make the change with a high degree of confidence. Typically, when a large project is implemented this way, up to three months of bug hunting is replaced by about a week of tuning up.

These techniques have been applied to semiconductor production lines, complex measuring instruments, and telecommunications systems (A.R. Flora-Holmquist et al., "The Virtual State-Machine Design and Implementation Paradigm," Bell Labs Technical Journal, Winter 1997). We assert they ought to be used in all cases where human safety is an issue. Their foundation, the executor program module, has never had to be altered in its 12 years of use.

The resulting software is very reliable in use and needs little maintenance. A number of available products can generate code from a formal specification in state-diagram or transition table format, but we believe such products do not provide a really good solution. Their users know they will be able to work with the resulting code and therefore do not feel obliged to create a 100% accurate specification at the design stage.

We find it difficult to spread our message, as the academic world, having tried and failed in the past, is convinced what we do is impossible. Moreover, and this may be the real problem, programmers enjoy coding.

Ferdinand Wagner and Peter Wolstenholme
Ilbhofen, Germany

