Practice

Economic analysis of microcomputer hardware

An economic analysis of the hardware characteristics of the personal computer segment of the microcomputer market indicates that the variables that most affect cost are those describing the bundle of attributes offered by the micros. Interestingly, certain technological characteristics that were important explanatory variables for cost in larger computers were not found to be significant for micros.
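The abstract does not spell out its method, but attribute-bundle cost analysis of this kind is typically done with a hedonic regression, in which price is modeled as a function of a machine's characteristics. Below is a minimal sketch in Python; the attribute columns and all numbers are invented for illustration and are not the article's data.

```python
# Hedged sketch of a hedonic price regression: price modeled as a linear
# function of a machine's attribute bundle. All numbers are invented.
import numpy as np

# Columns: RAM (KB), disk capacity (KB), bundled-software flag
attributes = np.array([
    [64,   360, 0],
    [128,  360, 1],
    [256,  720, 1],
    [512, 1200, 1],
    [640, 1200, 0],
], dtype=float)
prices = np.array([1500.0, 2100.0, 2900.0, 4200.0, 4400.0])

# Add an intercept column and fit by ordinary least squares.
X = np.column_stack([np.ones(len(prices)), attributes])
coef, *_ = np.linalg.lstsq(X, prices, rcond=None)

for name, c in zip(["intercept", "RAM (KB)", "disk (KB)", "software"], coef):
    print(f"{name:>10}: {c:8.2f}")  # implicit price of each attribute
```

The fitted coefficients are read as implicit prices: roughly how much one more unit of each attribute adds to a machine's cost, holding the others fixed.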
Research and Advances

The 1988–89 Taulbee survey report

This report describes the results of a survey of the Forsythe list of computing departments, completed in December 1989. The survey concerns the production and employment of Ph.D.s who graduated in 1988-89 and the faculty of Ph.D.-granting computing departments during the academic year 1989-90. All 129 Computer Science (CS) departments (117 U.S. and 12 Canadian) participated. In addition, 29 of 32 departments offering the Ph.D. in Computer Engineering (CE) were included. Throughout this report, CE statistics are reported separately so that comparisons with previous years can be made for CS, but the intention is to merge all statistics for CS and CE in a few more years.

Some highlights from the survey are: The 129 CS departments produced 625 Ph.D.s, an increase of 8 percent over the previous year; 336 were Americans, 35 Canadians, and 248 (40 percent) foreign (6 were unknown). Of the 625, 309 (49 percent) stayed in academia, 181 (29 percent) went to industry, 24 (4 percent) to government, and 56 (9 percent) overseas; 7 were self-employed; and 9 were unemployed (39 were unknown). A total of 1,215 students passed their Ph.D. qualifying exam in CS departments, an increase of 9 percent over 1987-88. No Afro-Americans, 6 Hispanics, and 87 women (14 percent) received Ph.D.s this year. The 129 CS departments have 2,550 faculty members, an increase of 123, or almost 1 per department. There are 938 assistant, 718 associate, and 894 full professors. The increase came entirely in the associate professor range. The 129 CS departments reported hiring 204 faculty and losing 161 (to retirement, death, other universities, graduate school, and non-academic positions). Only 9 assistant professors in the 158 CS and CE departments are Afro-American, 24 Hispanic, and 103 (9 percent) female. Only 2 associate professors are Afro-American, 8 Hispanic, and 74 (8 percent) female. Only 5 full professors are Afro-American, 8 Hispanic, and 33 (3 percent) female.

The growth in Ph.D. production to 625 is less than what was expected (650-700). Still, a growth of almost 50 Ph.D.s is substantial, and it will mean an easier time for departments that are trying to hire and a harder time for the new Ph.D.s. There is still a large market. The new Ph.D.s, however, cannot all expect to be placed in the older, established departments, and more will take positions in the newer departments and in the non-Ph.D.-granting departments. Growth of Ph.D. production seems to have slowed enough that overproduction does not seem to be a problem in the near future. There will not be enough retirements, however, to offset new Ph.D. production for ten years. (In the 158 departments, 22 faculty members died or retired last year.) We believe that many of the new Ph.D.s would benefit from a year or two as a postdoc, and perhaps it is time for the NSF to institute such a program in computer science and engineering.

The percentage of CS Ph.D.s awarded to foreign students remained about the same at 40 percent. In CE, the percentage was much higher, at 65 percent. The field continues to be far too young, a problem that only time is solving. CS continues to have more assistant professors than full professors, which puts an added burden on the older people, but there was substantial growth this year in the number of associate professors (as assistant professors were promoted). But the ratio of assistant to full professors in CS has not changed appreciably in four years.
As we have mentioned in previous Taulbee Reports, no other field, as far as we know, has this problem. In fact, most scientific fields are 80 to 90 percent tenured in many universities. In CS, this problem is more severe in the newer and lower-ranked departments. The top 24 departments now have 223 assistant, 176 associate, and 290 full professors. The CE departments have far more full professors than assistant professors, mainly because many are older EE departments offering CE degrees.

As we have indicated, Afro-Americans and Hispanics simply are not entering computer science and engineering. It is high time that we did something about it, and we hope the CRB will take the lead in introducing programs to encourage more participation from these minorities.

There was a slight growth in the percentage of female Ph.D.s in CS, from 10 to 14 percent. Still, there are far too few women in our field, and our record of retention of women in the faculty is abysmal. There are only 33 female full professors in the 158 CS and CE Ph.D.-granting departments! Again, we hope the CRB will help introduce programs to encourage more women to enter computing and to remain in academia over the years. The signs are that the NSF is interested in this problem as well.
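The placement figures quoted above can be checked mechanically. The snippet below only recomputes the percentages; the counts themselves are taken verbatim from the report.

```python
# Recompute the placement percentages quoted in the survey summary.
# Counts are taken directly from the report text.
total = 625
placements = {
    "academia": 309, "industry": 181, "government": 24,
    "overseas": 56, "self-employed": 7, "unemployed": 9, "unknown": 39,
}
assert sum(placements.values()) == total  # the categories account for all 625
for sector, n in placements.items():
    print(f"{sector:>14}: {n:3d} ({n / total:4.0%})")
```

The recomputed shares (49, 29, 4, and 9 percent for the first four categories) agree with the figures in the report.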
Opinion

From Washington: EC directives aim for market harmony

The 1992 unification plans for the 12-nation European Community (EC) have surely been among the most dissected blueprints of the year. Politicians ponder trade agreements, economists refigure potential revenues, and media attention is unrelenting. Absorbing it all are those U.S. high-tech industries hoping to make a lasting impression on what the economists, politicians and media all predict will be a $4 trillion market of over 300 million consumers.

Harmony has become the slogan adopted for the project. The EC Commission has spent the past five years identifying and implementing a program of close to 300 directives and regulations that would allow for the free movement of consumer products within the community. The goal of these directives, in addition to promoting European commerce and fair business competition, is to eliminate possible physical, technical or fiscal trade barriers. The result, they hope, is one harmonious marketplace.

Watching every note of this orchestration has been the U.S. Department of Commerce (DoC). Working with an internal program that includes the participation of senior officials from the International Trade Administration, the DoC has examined over 185 of the adopted or proposed directives issued by the EC Commission. Moreover, it has consulted with trade associations and industry representatives to explore how these directives relate to current U.S. business practices and to determine how they might affect future EC business dealings.

The DoC has published the results of its analyses in a three-volume series that examines directives for a rainbow of products and businesses. The recently released EC 1992: A Commerce Department Analysis of European Directives, Volume 3 features the Department's final roundup of EC directives—primarily those stipulating technical requirements, nine of which pertain to computers and telecommunications.

Much attention has focused on how the high-tech industries, particularly computers, software, telecommunications, and information technology, will be affected by EC standards, certification and testing issues. (See Communications, April 1990 and July 1990.) According to Myles Denny-Brown, an international economist and coordinator for EC 92 activities for information technology industries at the DoC, the high-tech industry regards EC market potential with cautious optimism. “I believe (industry) feels there is the possibility for some real liberalization there,” he says. “But there is also the possibility of some restrictiveness.”

Denny-Brown points out that standards and procurement issues are particularly important to building a competitive environment that would allow market growth to really take off the way it should (see sidebar).
Opinion

Legally speaking: should program algorithms be patented?

In the Legally Speaking column last May [6], we reported on a survey conducted at last year's ACM-sponsored Conference on Computer-Human Interaction in Austin, Tex. Among the issues about which the survey inquired was whether respondents thought patent protection should be available for various aspects of computer programs. The 667 respondents overwhelmingly supported copyright protection for source and object code, although they strongly opposed copyright or patent protection for “look and feel” and most other aspects of programs. Algorithms were the only aspect of programs for which there was more than a small minority of support for patent protection. Nevertheless, more than half of the respondents opposed either copyright or patent protection for algorithms. However, nearly 40 percent of the respondents regarded algorithms as appropriately protected by patents. (Another eight percent would have copyright law protect them.)

We should not be surprised that these survey findings reflect division within the technical community about patents as a form of protection for this important kind of computer program innovation. A number of prominent computer professionals who have written or spoken about patent protection for algorithms or other innovative aspects of programs have either opposed or expressed reservations about this form of protection for software [2, 4, 5].

This division of opinion, of course, has not stopped many firms and some individuals from seeking patent protection for algorithms or other software innovations [8]. Although the Refac Technology patent infringement lawsuit against Lotus and other spreadsheet producers may be in some jeopardy, it and other software patent lawsuits have increased awareness of the new availability of software patents. This situation, in turn, has generated some heated discussion over whether this form of legal protection will be in the industry's (and society's) long-term best interests.

The aim of this column is to acquaint readers with the legal debate on patent protection for algorithms and other computer program innovations, an issue which seems to be as divisive among lawyers as it is among those in the computer field [3, 9].
Practice

Agenda: a personal information manager

The free-form, evolving, personal information that people deal with in the course of their daily activities requires more flexible data structures and data management systems than tabular data structures provide. A tool for managing personal information must conveniently handle free textual data; allow structure to evolve gracefully as the database grows; represent unnormalized data; and support data entry through database views. We have designed a new type of database that serves these needs—an “item/category” database—and realized this design in a commercial personal computer software product named “Agenda.”
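The abstract does not describe Agenda's internals, so the following is only a speculative sketch of what an item/category store might look like: items are free text with no fixed schema, categories are created as needed, and a view is simply the set of items filed under a category. The class and method names are invented for illustration, not Agenda's.

```python
# Speculative sketch of an "item/category" store in the spirit the abstract
# describes: free-text items, a lazily evolving set of categories, and
# views defined over category membership.
from collections import defaultdict

class ItemCategoryDB:
    def __init__(self):
        self.items = []                      # free-form text, no fixed schema
        self.categories = defaultdict(set)   # category name -> item indices

    def add_item(self, text, *cats):
        idx = len(self.items)
        self.items.append(text)
        for cat in cats:                     # categories are created lazily
            self.categories[cat].add(idx)
        return idx

    def view(self, cat):
        """A 'view' is just the items currently filed under a category."""
        return [self.items[i] for i in sorted(self.categories[cat])]

db = ItemCategoryDB()
db.add_item("Call Lotus about the Q3 demo", "calls", "work")
db.add_item("Lunch with Dan on Tuesday", "appointments")
print(db.view("calls"))
```

Note how structure accretes instead of being declared up front: an item can be filed under new categories at any time, which is the "graceful evolution" the abstract asks of such a tool.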
Opinion

Personal computing: simple complexity and COMDEX

One of today's emerging paradigms is the view that complex behavior or form can emerge from the interaction of relatively simple components, if you have enough of them and they have enough time to do whatever they do. The emergent behavior or form might seem systematic or chaotic. Some examples are neural nets, cellular automata, fractals, electronic mail networks, market economies, whirlpools, and snowflakes. Years ago, similar systems were often called self-organizing, and they were found in models of memory, pattern recognition, multilevel stores, and libraries. The area languished, however, awaiting the development of theory and powerful hardware. Personal workstations played an important role in facilitating experimentation, and mass-market personal computers are now up to the task.
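Of the examples listed, a one-dimensional cellular automaton is perhaps the easiest to reproduce on a personal computer. The sketch below uses Wolfram's rule 110 (my choice of rule, not the column's): each cell updates from just its own state and its two neighbors, yet the printed history is strikingly intricate.

```python
# A one-dimensional cellular automaton (rule 110): each cell looks only at
# itself and its two neighbors, yet the global pattern is intricate.
RULE = 110
WIDTH, STEPS = 64, 24

cells = [0] * WIDTH
cells[WIDTH // 2] = 1                        # a single seed cell

for _ in range(STEPS):
    print("".join("#" if c else "." for c in cells))
    # Each new cell is the RULE bit indexed by the 3-cell neighborhood,
    # read as a binary number (left*4 + center*2 + right), with wraparound.
    cells = [
        (RULE >> (cells[(i - 1) % WIDTH] * 4
                  + cells[i] * 2
                  + cells[(i + 1) % WIDTH])) & 1
        for i in range(WIDTH)
    ]
```

Twenty-odd lines and a trivial local rule are enough to produce behavior that looks neither fully systematic nor fully chaotic, which is exactly the point of the paradigm.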
Research and Advances

The politics of standards and the EC

European legislation and power struggles in the standards arena are sparking fear of technical barriers to trade and prompting the American standards community to reevaluate its infrastructure. The National Institute of Standards and Technology may step up its role in order to negotiate at a governmental level with the EC.
Opinion

From Washington: budget FY 1991: the numbers tell the R&D story

This is the time of year when talk turns to fiscal budgets. In Washington, however, such banter typically involves astronomical sums of money.

When President Bush released his proposed budget for FY 1991 last January, the reaction from the scientific community was mixed. Many observed that seldom have research and development (R&D) projects been given as prominent a place in a federal budget. Other industry watchers, while less enthusiastic, had to agree that in many respects R&D fared better under this year's budget than last year's.

However, understanding the details of the budget is far more important than reviewing its broad outlines. For that reason, the American Association for the Advancement of Science (AAAS) calls members of the scientific community to Washington each spring to dissect, denounce and defend the government's R&D funding plans for the next fiscal year. The Colloquium on Science and Technology Policy centered on the analytical findings of the AAAS's Research and Development FY 1991 report. The three-day conference was peppered with high-ranking White House officials who either defined the specific branches of the government's R&D interests or discussed the possible implications the budget poses for future projects.

There is an overall 7 percent budget increase for R&D, with a 12 percent increase in nondefense R&D programs and an 8 percent increase in basic research. In the area of computer science and engineering, DARPA, NSF, and the ONR remain the largest sources of government funds for R&D. Federal support in computer science is divided into two basic categories: defense and civilian. More than 60 percent of total federal R&D expenditures in computer science and engineering are supported by the defense sector. Moreover, federal R&D activities are conducted in government and nonuniversity labs as well as in universities. The majority of the funding for computer science research supports activities outside of universities.

D. Allan Bromley, director of the Office of Science and Technology Policy (OSTP), explains the thinking behind the President's budget: to prioritize funding requests, the Office of Management and Budget follows three basic guidelines: (1) programs that address national needs and national security concerns; (2) basic research projects, particularly university-based, individual and small-group research; and (3) adequate funding for the nation's scientific infrastructure and facilities. Bromley points out that one of the primary avenues OSTP will emphasize this year is high-performance computing—a dynamic technology for industrial, research and national security applications. Of first concern will be the development of hardware to enhance mainframes and address the parallelism needed to make TERAOP computers perform trillions of operations per second. The next phase will be software development, followed by the construction of a fiber-optic network.

Bromley, who also serves as assistant to the President for Science and Technology, calls the FY 1991 budget an excellent one for R&D. However, he is quick to add that there are problems with those numbers. One of the most serious involves the funding rate for grants at the NSF and the National Institutes of Health (NIH). Despite a decade of funding increases, the money available for new, young investigators is very tight.
Indeed, the scientific community is partly to blame, he says. “We argued for multiyear grants and contracts to cut down on the amount of paperwork required to do research,” recalls the OSTP director. “Both NSF and NIH have responded to those requests, and in the process they built substantial ‘outyear mortgages’ for themselves.”
Practice

Typographic style is more than cosmetic

There is disagreement about the role and importance of typographic style (source code formatting and commenting) in program comprehension. Results from experiments and opinions in programming style books are mixed. This article presents principles of typographic style consistent and compatible with the results of program comprehension studies. Four experiments demonstrate that the typographic style principles embodied in the book format significantly aid program comprehension and reduce maintenance effort. —Authors' Abstract
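As a concrete, if simplified, illustration of what “typographic style” means here, compare the same function written with and without consistent layout and commenting. The specific “book format” conventions the experiments tested are defined in the article and are not reproduced below.

```python
# The same function twice; only the typography differs.

# Before: legal, but visually undifferentiated and uncommented.
def rng(xs):
 lo=xs[0];hi=xs[0]
 for x in xs:
  if x<lo:lo=x
  if x>hi:hi=x
 return hi-lo

# After: consistent indentation, spacing, and a statement of intent.
def value_range(xs):
    """Return the spread (max - min) of a non-empty sequence."""
    lo = hi = xs[0]
    for x in xs:
        lo = min(lo, x)
        hi = max(hi, x)
    return hi - lo

print(rng([3, 1, 4, 1, 5]), value_range([3, 1, 4, 1, 5]))  # 4 4
```

Both versions compute the same result; the article's claim is that differences of this kind measurably affect how quickly readers comprehend and maintain the code.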
Practice

Building bilingual microcomputer systems

In the Arab world, the need for bilingual microcomputer systems is ever increasing. In addition to the ability to process the Arabic and English scripts, an ideal system should support the use of existing applications with Arabic data and access to system facilities through Arabic interfaces. The Integrated Arabic System (IAS) was developed to study the feasibility of building such systems using existing microcomputers and software solutions.
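The IAS predates today's Unicode-based systems, so the following is only a modern analogue of one problem such a system must solve: telling Arabic runs apart from other text in mixed input. It uses the Python standard library's bidirectional categories ('AL' for Arabic letters, 'AN' for Arabic-Indic digits); the function name and run representation are my own.

```python
# Split mixed text into runs of Arabic vs. other characters, using the
# standard library's Unicode bidirectional categories.
import unicodedata

def script_runs(text):
    """Return a list of (kind, substring) runs, kind in {'arabic', 'other'}."""
    runs, current, kind = [], "", None
    for ch in text:
        k = "arabic" if unicodedata.bidirectional(ch) in ("AL", "AN") else "other"
        if k != kind and current:            # script boundary: close the run
            runs.append((kind, current))
            current = ""
        current, kind = current + ch, k
    if current:
        runs.append((kind, current))
    return runs

print(script_runs("Total: مجموع 42"))
# [('other', 'Total: '), ('arabic', 'مجموع'), ('other', ' 42')]
```

Run segmentation of this sort is the first step before a bilingual system can apply right-to-left layout and contextual letter shaping to the Arabic portions only.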
Practice

Six-digit coding method

The Six-Digit Coding Method (SDCM) is a new coding method for Chinese characters. It is based on the structural analysis of Chinese characters. We recently developed this method and have successfully used it to code 11,100 characters, including the simplified, traditional, and variant forms found in Xin Hua Dictionary [7]. This article illustrates the basic principles, features, and some viewpoints concerning the method.
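The actual code assignments come from SDCM's structural analysis and are not given in the abstract, so the sketch below shows only the mechanical shape of a fixed-length code input method; the six-digit codes in the table are invented placeholders, not real SDCM codes.

```python
# Shape of a fixed-length structural-code input method. The codes below are
# invented placeholders; real assignments come from SDCM's structural
# analysis of each character.
CODE_TABLE = {
    "112034": "汉",
    "220415": "字",
    "301122": "输",
}

def lookup(code):
    """A six-digit code identifies exactly one character."""
    assert len(code) == 6 and code.isdigit(), "SDCM codes are six digits"
    return CODE_TABLE.get(code, "<unknown>")

print(lookup("112034"), lookup("220415"))
```

A fixed six-digit code per character means input needs no disambiguation step, which is one practical appeal of coding an 11,100-character repertoire this way.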
Research and Advances

The European Community and information technology

The world has watched Eastern Europe erupt into such political turmoil that historians are expected to call this period the Revolutions of 1989. Economic evolution was also underway as the Continent progressed toward a single European market. The goal—a market without national borders or barriers to the movement of goods, services, capital and people—was first outlined over 30 years ago by the 12 countries which became members of the Common Market. In the mid-1980s, the effort was renewed when these same countries approved an ambitious plan outlining hundreds of legislative directives and policies that would harmonize and re-regulate the national rules of the member states. The measures are drafted by the European Commission, voted on by the Council of Ministers, amended if necessary, and then assigned budgets by the Parliament. They include competition law, labor law, product regulation and standardization, taxation and subsidies, and quota and tariff guidelines. In 1987, the Single European Act created a timetable for the passage of legislation with a formal deadline for the removal of barriers by December 31, 1992, hence the term Europe '92 (EC '92). But many have described EC '92 as a process that will continue throughout the 1990s. The ouster of communist leaderships throughout Eastern Europe, however, has raised unexpected questions about the participation of the Eastern countries, and this could alter or delay the process.

Nevertheless, the changes have begun and are taking place during the Information Revolution. It is therefore natural to ask what impact EC '92 will have on the computer industry. Inevitably, several of the directives and policies relate primarily, and many secondarily, to information technology. Table 2 lists the policies in effect and those being proposed. In the following pages, Communications presents several points of view regarding the impact of EC '92 on the information technology market in Europe.

As of July 1988, the European information systems market was estimated at $90 billion by Datamation magazine and is expected by many to be the fastest growing market this decade. But during the last ten years, European-based computer companies have had difficulty keeping pace with American and Japanese firms. In 1988, European companies managed only a 20 percent market share on their own turf, according to market researcher International Data Corporation. Not much had changed since 1982, when their market share was 21 percent. As reported in the Wall Street Journal last January, European computer companies have been hindered by a lack of economies of scale, a narrow focus on national markets, and difficulty in keeping pace with Japanese and U.S. product innovations. But the occasion for the Journal article was the news that Germany's Siemens AG was merging with the ailing Nixdorf Computer AG. The result would possibly be the largest computer company based in Europe, and the sixth or seventh largest in the world. And in October of 1989, France's Groupe Bull announced the purchase of Zenith Electronics Corporation's personal computer unit. Bull claimed that it would become the sixth largest information service company in the world. Such restructurings have been predicted with the approach of EC '92, as corporate strategies begin to take into account directives and trade rules regarding the computer and telecommunications industries.
Smaller European and American computer companies are anticipating battle with giants like IBM and DEC, which have long-established European divisions or subsidiaries. IBM has been the leader in mainframes, minicomputers, and personal computers, but it is expected that all computer companies, European-based or not, will face greater competition in Europe. The Netherlands' NV Philips, the largest European semiconductor and consumer electronics company, says it has been preparing for EC '92 since the 1970s. And North American Philips Chairman Gerrit Jeelof has claimed company credit for initiating the 1987 European Act. In a speech delivered at a Business Week and Foreign Policy Association seminar last May, Jeelof said that while American companies had forsaken consumer electronics, Philips and France's Thomson have held their own against the Japanese. But he indicated that American dominance of the European semiconductor market was a major impetus for EC '92. Jeelof said:

“. . . because of the lack of European strength in the field of computers, the integrated circuits business in Europe is dominated by Americans. Europe consumes about 34 percent of all ICs in the world and only 18 percent are made in Europe by European companies. The rest are made by American companies or are imported. It is not a surprise then that in 1984 we at Philips took the initiative to stimulate a more unified European market. At the time, we called it Europe 1990. Brussels thought that 1990 was a bit too early and made it 1992. But it has been the electronics industry in Europe, together with other major companies, that has been pushing for Europe 1992. Why did we want it? We wanted a more homogeneous total market in Europe and, based on that, we wanted to become more competitive. The process is on its way and obviously we see some reactions. If you take action, you get reaction.”

One reaction has been concern on the part of non-European companies and their governments that the EC is creating a protectionist environment, a “Fortress Europe.” As walls between nations are coming down, some fear that other, more impenetrable ones are going up on the Continent's edges. Jeelof argues against this perception in another speech, “Europe 1992—Fraternity or Fortress,” reprinted in this issue in its entirety.

Communications also presents an analysis of several trade rules relating to semiconductors in “The Semiconductor Market in the European Community: Implications of Recent Rules and Regulations,” by Roger Chiarodo and Judee Mussehl, both analysts in the Department of Commerce Office of Microelectronics and Instruments. The authors outline the consequences of Europe's Rules of Origin, anti-dumping measures that are supposed to prevent companies from using assembly operations in an importing country to circumvent duty on imported products. In the United States, if the difference between the value of parts or components from the dumping country and the value of the final product is small, then duty will be placed on those parts or components used in U.S. assembly operations. By contrast, the EC rule says that if the value of parts or components exceeds 60 percent of the value of all parts and materials, then duty will be placed on those parts and materials upon assembly in Europe. Since 1968, origin has also been determined according to “the last substantial process or operation” resulting in the manufacture of a new product.
In the case of printed circuit boards, some countries interpreted this as assembly and testing, while others thought it meant diffusion. In 1982, the EC began harmonizing these interpretations, and as of 1989, the last substantial operation was considered to be diffusion: the selective introduction of chemical dopants on a semiconductor substrate. As a result, American and Japanese semiconductor manufacturers have spent millions building foundries on European soil.

To reveal the Japanese interpretation of such changes, Japanese Commerce Minister Eiichi Ono, with the Japanese Embassy in Washington, D.C., expresses his country's impressions of EC '92 in this issue. In his speech, “Japan's View of EC '92,” delivered at an Armed Forces Communications and Electronics Association (AFCEA) conference on Europe '92, Ono states that while the EC's intentions might not be protectionist, they could become so upon implementation. His discussion focuses on semiconductors and technology transfer issues.

Although not a formal directive, in July 1988 the European Council decided to promote an internal information services market (the last “L” document in Table 2). To present the reasoning and objectives behind this initiative, we reprint the Communication from the Commission to the Council of Ministers, “The Establishment at Community Level of a Policy and a Plan of Priority Actions for the Development of an Information Services Market,” and the resulting July 1988 “Council Decision” itself. Funds allocated for 1989 and 1990 are approximately $36 million, $23 million of which was slated for a pilot/demonstration program called IMPACT, for Information Market Policy Actions. This may seem a pittance in comparison to the programs of other governments, but this Decision and other EC legislation are the first steps toward an EC industrial policy.

Recognizing that Europe's non-profit organizations and the public sector play a very important role in providing database services, in contrast to the U.S., where the private sector is now seeding the production of such database services, IMPACT has prepared guidelines to help the public sector cooperate with the private sector in marketing information. These guidelines would also allow private information providers to use public data and add value to it to create commercial products. IMPACT is providing incentives to accelerate innovative services for users by paying 25 percent of a project's cost. After the first call for proposals, 16 of 167 projects proposed by teams composed of 300 organizations were funded. American-based companies can apply for funds if they are registered in Europe. Unlike the U.S., the EC allows registration regardless of who owns a company's capital.

Projects funded are to develop databases that would be accessible to all members of the Community, either on CD-ROM or eventually on a digital network, an ISDN for all Europe, as planned by the fifth recommendation listed in Table 2. One project in the works is a library of pharmaceutical patents on CD-ROM that will enable users to locate digitized documents. Users will also have direct access to on-line hosts for all kinds of patents. A tourism information database and a multimedia image bank of atlases are other pilot projects chosen, and another project will provide information on standards. Eventually, audiotext might be used to retrieve data by telephone instead of a computer terminal.
When the initial projects have been completed, the Commission will inform the marketplace about the results of the implementation. Plans for a five-year follow-up program, IMPACT-2, are also under discussion. These projects depend to some extent on the implementation and passage of directives or the success of larger and better-funded projects. On-line access to databases depends on the recommendation for an ISDN as well as on the standardization directive for information technology and telecommunications. The certification, quality assurance, and conformity assessment issues involved in that directive are too numerous and important to just touch on here and will be covered in a later issue of Communications.

To make these databases accessible not only technically but also linguistically, the EC has funded two automatic language translation projects called Systran and Eurotra. Systran is also the name of the American company in La Jolla, CA, known for its pioneering work in translation. In conjunction with the EC, Systran Translation Systems, Inc., has completed a translation system for 24 language pairs (English-French and French-English, for example, are two language pairs) for the translation of IMPACT-funded databases. The system resides on an EC mainframe; there will be on-line access by subscription; and it will also be available on IBM PS/2s modified to run MS-DOS. It is already on France's widespread Minitel videotext network.

As this practical, market-oriented approach to technology implementation is beginning, Europe's cooperative research effort, ESPRIT, is also starting to transfer its results. Last year, the second phase, ESPRIT II, set up a special office for technology transfer. Its mission is to ensure the exploitation, for the benefit of European industry, of the fruits of the $1.5 billion ESPRIT I program that began in 1984, as well as of the current $3.2 billion program (funded through 1992). The EC contributes half of the total cost, which is matched by consortia composed of university and industry researchers from more than one country. About 40 percent of ESPRIT II's funds will be devoted to computer-related technologies.

Every November, ESPRIT holds a week-long conference. Last year, for the first time, it devoted a day to technology transfer. Several successful technology transfers have occurred, either from one member of the program to another or out of the program to a member of industry that had not participated in the research. An electronic scanner that detects and eradicates faults on chips, for example, was developed by a consortium and the patents licensed by a small company. This automatic design validation scanner was co-developed by CSELT of Italy, British Telecom, CNET (a telecommunications company in France), IMAG of France, and Trinity College, Dublin. The company that will bring it to market is ICT GmbH, a relatively small German company. It seems that in Europe, as in the United States, small companies and spin-offs, like those found in Silicon Valley here, are better at running quickly with innovative ideas, says an EC administrator.

Another technology transfer success is the Supernode computer. This hardware and software parallel processing project resulted in an unexpected product from transputer research. The Royal Signals and Radar Establishment, Inmos, Telmat, and Thorn EMI, all of the UK, APTOR of France, and Southampton University and the University of Grenoble all participated in the research, and now Inmos has put the product on the market.
Three companies and two universities participated in developing the Dragon Project (for Distribution and Reusability of Ada Real-time Applications through Graceful On-line Operations). This was an effort to provide effective support for software reuse in real-time, distributed, and dynamically reconfigurable systems. The researchers say they have resolved the problems of distribution in real-time performance and are now developing a library and classification scheme. One of the companies, TXT of Milan, will bring it to market.

Several other software projects are also ready for market. One is Meteor, which is aimed at integrating a formal approach to industrial software development, particularly in telecommunications. The participants have defined several languages, called ASF, COLD, ERAE, PLUSS, and PSF, for requirements engineering and algebraic methods. Another project is QUICK, the design and experimentation of a knowledge-based system development tool kit for real-time process control applications. The tool kit consists of a general system architecture, a set of building modules, support tools for construction, and a knowledge-based system analysis of design methodology. The tool kit will also contain a rule-based component based on fuzzy logic. During the next two years, more attention and funds will be indirectly devoted to technology transfer, and the intention to transfer is also likely to be one of the guides in evaluating project proposals.

Some industry experts maintain that high technology and the flow of information made the upheaval in Eastern Europe inevitable. Leonard R. Sussman, author of Power, the Press, and the Technology of Freedom: The Coming Age of ISDN (Freedom House, 1990), predicted that technology and globally linked networks would result in the breakdown of censorious and suppressive political systems. He says the massive underground information flow due to books, copiers, software, hardware, and fax machines, in Poland for example, indicates that technology can mobilize society. Knowing that computers are essential to an industrial society, he says, Gorbachev faced a dilemma as decentralized computers loosened the government's control over the people running them. Glasnost evolved out of that dilemma, says Sussman.

Last fall, a general draft trade and economic cooperation accord was signed by the European Commission and the Soviet Union. And both American and Western European business interests are calling for the Coordinating Committee on Multilateral Export Controls (COCOM) to relax high technology export rules for the Eastern Bloc and the Soviet Union. The passage of that proposal could allow huge computer and telecommunications markets to open up. And perhaps the Revolutions of 1989 will reveal themselves to have been revolutions in communication and the flow of information, due in part to high technology and the hunger for it.
Research and Advances

Japan’s view of EC ’92

Delivered at the Armed Forces Communications and Electronics Association's EC '92 Symposium in December 1989, this speech focuses on the European semiconductor industry, where Japan ranks as the second largest exporter, and on the technology transfer of dual-use products from Japan.
