Educational Requirements Must Keep Pace to Enable Technology Expansion

Parent Category: 2020 HFE

By William Cave and Vladimir Gelnovatch

Organizations that depend upon technology in a competitive environment, e.g., those building chips, satellites, rockets, etc., must improve their technology to remain competitive. In a fair market, the winners are generally the most competitive. To win, they must expand their technology to provide improvements for their buyers and beat the competition.

Occasionally, to meet expectations, the required leap is big and expansion stalls - until someone sees a new approach that makes the leap. Invariably, this requires substantial knowledge of the underlying factors affecting the existing technology, as well as a more general understanding of approaches to dealing with the problem. It also requires being able to handle the truth about the shortcomings of the current technology.

But this is what real science is all about - seeking the truth about the physics and mathematics underlying the technology. As stated by Lord Kelvin and quoted by Anselmo and Ledgard, [1], “When you can measure what you are speaking about, … you know something about it; but when you cannot measure it, … your knowledge is of a meager and unsatisfactory kind …” This implies taking careful measurements of the current approach and exposing its shortcomings.

In some technologies, it is difficult to get people to define - let alone take - measures. In others, measures are taken on a regular schedule. In the latter case, the technology is likely to expand rapidly. Rapid technology expansion leads to the next problem, keeping up with the expansion.

In complex engineering areas, e.g., aeronautical, mechanical and electrical, technology expansion depends upon special people who understand the principles required to expand the current technology. This implies familiarity with measures and tests that demonstrate potential changes to the underlying technology. This goes beyond understanding principles that underlie the current technology.

Those who have been directly involved in expanding a technology to its current state are typically scarce. When they retire, they are hard to replace. Without people behind them - ready to take over - it is likely that further expansion will not occur until others are brought up to speed. And it is not unusual for such technology to remain unchanged for very long periods, or even deteriorate - see the microwave radio example below. Most important, this depends heavily on continuous measurement just to ensure the desired properties do not deteriorate.

Expanding Education Requirements

As indicated above, being able to expand current technology requires knowledge beyond the current technology. In many high technology areas, this knowledge is scarce - even among those involved with the original technology expansion. When the expansion is due to a major leap, the knowledge requirement expands accordingly. This leads to the dilemma: How does one keep up with the educational requirements necessary for further expansion?

Computer technology is one of the most rapidly expanding technologies today, and has been for many decades. Yet, as witnessed by the turn to parallel processing to satisfy the need for speed, growth in single-processor performance is clearly leveling off. When observing these changes, one sees the need for improved measures and corresponding testing. The lack of such measures and testing has plagued the software field for many decades, [2] and [3]. And because software requirements drive those of hardware, it is becoming true in hardware. A recent article, [4], quoted statements by Hennessy and Patterson, 2017 Turing Award winners, that an equivalent of the Von Neumann Instruction Set Architecture (ISA) is needed for parallel processors; see also [5] and [6]. Like many other experts, they stated the need for a totally new approach to software; see the ‘Resistance’ papers, [2]. Here, the need for expanded education is obvious and critical in both technologies.

The field of chip design provides an example of meeting educational requirements. Upon receiving a PhD, either in Electrical Engineering or Computer Science, graduates are not prepared to contribute directly to complex computer chip design. The level of complexity one must deal with takes a considerable amount of time to learn - just to understand what a knowledgeable designer would consider basic. Chip designers are brought up to speed through on-the-job training. Even professors take sabbaticals with major corporations to learn basics that cannot be taught in a lab, in a classroom, or over the internet. The most significant underlying principle is understanding the testing that must be performed to meet design constraints, especially under conditions that cause variations in component parameters. This has been referred to as “Worst Case Design.”
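The flavor of worst-case design can be illustrated with a short sketch. The circuit, component values, and tolerances below are invented for illustration - a 10 V source driving a two-resistor divider - not an example from actual chip design practice; the point is that corner analysis bounds the output over all component-parameter variations:

```python
import itertools
import random

# Hypothetical example circuit: a 10 V source driving a two-resistor
# voltage divider. Nominal values and tolerances are assumptions.
VIN = 10.0
R1_NOM, R2_NOM = 1000.0, 2000.0   # ohms
TOL = 0.05                        # +/-5% component tolerance

def vout(r1, r2):
    """Divider output voltage."""
    return VIN * r2 / (r1 + r2)

# Corner ("worst case") analysis: evaluate every tolerance extreme.
corners = [
    vout(R1_NOM * (1 + s1 * TOL), R2_NOM * (1 + s2 * TOL))
    for s1, s2 in itertools.product((-1, 1), repeat=2)
]
wc_low, wc_high = min(corners), max(corners)

# Monte Carlo check: random parameter draws should fall inside the
# corner bounds for this monotonic circuit.
random.seed(0)
samples = [
    vout(R1_NOM * (1 + random.uniform(-TOL, TOL)),
         R2_NOM * (1 + random.uniform(-TOL, TOL)))
    for _ in range(10_000)
]

print(f"nominal    : {vout(R1_NOM, R2_NOM):.3f} V")
print(f"worst case : [{wc_low:.3f}, {wc_high:.3f}] V")
print(f"monte carlo: [{min(samples):.3f}, {max(samples):.3f}] V")
```

Because the divider output is monotonic in each resistor, the Monte Carlo samples must fall inside the corner bounds; in circuits without that property, corner analysis alone can miss the true extremes - which is why the testing itself must be designed with understanding.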

Application to Microwave Technology

Recent articles describe military radios being jammed by simple jammers from lesser countries; see [7] and [8]. These jammers do not match the real threats to our communication systems - threats that were defeated by radios designed in the 1970s and 1980s. How could this be?

The software radio was introduced at the turn of the century and presented as a totally new and revolutionary technology. It was supported by major investments in large radio companies doing business with the US DoD. During the next decade, radios were built to minimize the use of non-software approaches. Unfortunately, their designers did not understand key underlying principles, such as the Receiver Operating Characteristic (ROC), a measure of a radio’s ability to overcome jamming. These principles are based on microwave theories that provide the ability to overcome complex jamming threats from advanced nations. Lack of this basic knowledge has opened the door to simple threats. Software was interfaced to simple microwave chips and shown to work. But the ROC was not used to test against simple - let alone complex - threats.
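The ROC itself is straightforward to illustrate. The sketch below is a generic textbook model - detecting a constant signal in Gaussian noise, with jamming modeled crudely as added noise power - not the design method of the radios discussed; all parameter values are assumptions:

```python
import random

def roc_point(threshold, signal, noise_sigma, n=20_000, seed=1):
    """Estimate (false-alarm prob., detection prob.) at one threshold
    for a constant signal in zero-mean Gaussian noise."""
    rng = random.Random(seed)
    false_alarms = sum(rng.gauss(0.0, noise_sigma) > threshold for _ in range(n))
    detections = sum(rng.gauss(signal, noise_sigma) > threshold for _ in range(n))
    return false_alarms / n, detections / n

SIGNAL = 1.0
thresholds = [i * 0.1 for i in range(-20, 31)]

# Sweep the threshold to trace the ROC, clear vs. jammed conditions.
clear = [roc_point(t, SIGNAL, noise_sigma=0.5) for t in thresholds]
jammed = [roc_point(t, SIGNAL, noise_sigma=1.5) for t in thresholds]  # jammer raises noise

def pd_at_pfa(curve, pfa_target=0.10):
    """Best detection probability at or below a target false-alarm rate."""
    return max(pd for pfa, pd in curve if pfa <= pfa_target)

print(f"Pd at 10% Pfa, clear : {pd_at_pfa(clear):.2f}")
print(f"Pd at 10% Pfa, jammed: {pd_at_pfa(jammed):.2f}")
```

Sweeping the threshold traces out the full curve of detection probability versus false-alarm probability; testing against a jamming threat amounts to verifying that the curve remains acceptable as the threat raises the noise floor - precisely the measurement the article argues was skipped.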

The Bottom Line

Of all the principles that must be followed to support expanding technology, the most important is defining the measures to be met and the testing that must be performed to produce the desired measured outcomes. To achieve such technology expansion, one must have people who understand the underlying principles well enough to define both the expanded measures and the corresponding tests of those measures. Unfortunately, as technology expands, it has become difficult for the educational environment to keep up with such high levels of expansion. This issue must be addressed, at least to the extent that the underlying problem is better understood.


[1] Donald Anselmo and Henry Ledgard, “Measuring Productivity in The Software Industry,” Communications of the ACM, vol. 46. no.11, Nov. 2003.

[2] Resistance References.

[3] Adam Barr, “Why Do Good Engineers Write Bad Software,” IEEE Software, Jan. 2019.

[4] M. Martinez, “Nobel Prize for Computing,” based on the Hennessy-Patterson Turing Lecture, International Symposium on Computer Architecture (June 4, 2017), May 30, 2018.

[5] Henry Ledgard, The Cave ASA Versus the Von Neumann ISA, University of Toledo, Toledo, OH, March 2019.

[6] W. C. Cave, The Application Space Architecture (ASA) for Parallel Processors, Visual Software International, Spring Lake, NJ, March 2019.

[7] George I. Seffers, “Army Mounts IW Comeback,” The CYBEREDGE, May 1, 2018.

[8] Lucas Tomlinson, “‘Adversaries’ jamming Air Force gunships in Syria, Special Ops general says,” Fox News, March 2019.


William Cave

BS Electrical Engineering (computer option), 1960, Penn State University

MS Electrical Engineering/Computer Science, 1963, New York University

PhD Fellowship - Electrical Engineering, 1965-66, Polytechnic Institute of Brooklyn

PG Courses - EE - Optimal and Stochastic Control Theory, 1967-68, Stevens Inst. of Tech.


Worked on one of the first digital computers - PENNSTAC, Penn State, 1958-1960

Worked on the U.S. Army BASICPAC, First transistor computer, built by Philco, 1960-1962

Project Leader, U.S. Army MINIPAC Computer, One of first transistor computers, 1962-1965

Chairman & CEO: Optimal Systems Research, Inc., 1967-1974

Chairman & CEO: Prediction Systems, Inc., 1974-present

U.S. Representative, NATO Panel XIII, Data Communications, Brussels, Belgium., 1976-1978

Chairman, U.S. DoD Panel on Software Life-Cycle Management, 1976-1978

Chairman & CEO: Visual Software International, Inc., 2004-present

Author of papers and books for professional societies and publishers


Vladimir Gelnovatch

U.S. Army 102d Signal Bn, 1957-1959, Hohenstadt Radio Station

BS Electrical Engineering (Honors) 1963, Monmouth University, NJ

MS Electrical Engineering (Honors) 1966, New York University, NY

PhD Electrical Engineering Fellowship, 1967, New York University, NY (Dissertation: “DEMON” - An Optimization Algorithm for Microwave Networks)

Various positions, Electronic Devices/Technology Lab, Army R&D Labs, NJ, 1963-1997; retired as Lab Director

Visiting Professor of Electrical Engineering at the University of Virginia

General Technical Services, Wall, NJ, Program Manager/ Sr. Engineer, 1997-2005

Expert Consultant, TriQuint Semiconductor Corp, Dallas, TX, 2005-2008

Expert Consultant, Prediction Systems, Inc., Spring Lake, NJ

Various Army & U.S. Awards for Outstanding Performance, 1972-1997

Institute of Electrical & Electronics Engineers (IEEE) Fellow Award, 1981

President, IEEE Microwave Theory & Techniques Society, 1989

Associate Editor, Microwave Journal, 1975-1995