Terry Bollinger
  • McLean, Virginia, United States

MITRE, Emerging Tech, Department Member
  • I retired early from helping the US Federal government define, acquire, and oversee millions of dollars of research f...
The paper describes a Prolog-based prototype expert system that was implemented by the Network Operations Branch of the NASA Goddard Space Flight Center. The purpose of the prototype was to test whether a small, inexpensive computer system could be used to host a load shedding "advisor," a system which would monitor major physical environment parameters in a computer facility, then recommend appropriate operator responses whenever a serious condition was detected. The resulting prototype performed significantly better than was originally anticipated, due primarily to efficiency gains achieved by replacing a purely rule-based design methodology with a hybrid approach that combined procedural, entity-relationship, and rule-based methods.
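As a rough, hypothetical sketch of the hybrid approach described above (procedural monitoring of environment readings feeding a small set of recommendation rules), the Python fragment below is illustrative only; the parameter names, thresholds, and advisories are invented for illustration and are not taken from the NASA prototype, which was written in Prolog.

    # Hypothetical load-shedding "advisor" sketch: procedural monitoring feeds
    # simple rules that recommend operator actions. Thresholds are illustrative only.
    READINGS = {"room_temp_c": 31.5, "humidity_pct": 72.0, "ups_load_pct": 91.0}

    RULES = [
        (lambda r: r["room_temp_c"] > 30.0, "Shed non-critical compute jobs and check the chillers."),
        (lambda r: r["ups_load_pct"] > 90.0, "Shift load off the overloaded UPS circuit."),
        (lambda r: r["humidity_pct"] > 70.0, "Inspect the HVAC humidity controls."),
    ]

    def advise(readings):
        """Return an operator recommendation for every rule whose condition fires."""
        return [action for condition, action in RULES if condition(readings)]

    if __name__ == "__main__":
        for recommendation in advise(READINGS):
            print("ADVISORY:", recommendation)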
Progress brings new dangers: Powerful home computers, inexpensive high-speed Internet access, telecommuting, and software flaws have combined to create a soft underbelly in the defenses of government and corporate networks, and there is evidence that this weakness is being exploited. As of early 2005, many home and small business systems contain uninvited programs that capture keystrokes and passwords, hand control over to hostile users, and lull their owners into a false sense of security by undermining their virus and spyware checkers. The goal of such hardcore spyware programs is to help attackers profit at your expense by stealing your data, resources, money, or identity. Home PCs and laptops used in telecommuting are tempting targets for hardcore spyware since they provide easy access to data that otherwise can be found only in well-guarded enterprise networks. This article describes defensive strategies for reducing your level of risk from hardcore spyware.
When someone attempts to sneak in an observation on an entangled set of particles in the here-and-now, the quantum result looks just as if a record of that transgression was captured, sent back in time to the original generation of the entangled particles, and then rebroadcast for everyone in the future to see.
This paper defines quantum observation as conversion of tiny (attojoule or less) packets of energy and momentum into statistically irreversible Boltzmann heat. Under this definition, quantum waves never “collapse” but instead grow so fractally convoluted in one region of space that their entangled properties in remote regions of space become statistically irreversible and persistent over time. The implication is that far from being a rare phenomenon that depends on such hard-to-define phenomena as human awareness, quantum observation via atto-scale creation of Boltzmann heat is the most common of all quantum phenomena and the underlying basis for the very existence of “classical” matter and objects.
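For a sense of scale (a back-of-the-envelope comparison, not a calculation from the paper), an attojoule can be compared with the characteristic thermal energy kT at room temperature:

    # Compare one attojoule with thermal energy kT at room temperature (illustrative only).
    k_B = 1.380649e-23     # Boltzmann constant, J/K
    T = 300.0              # room temperature, K
    attojoule = 1e-18      # 1 aJ in joules

    kT = k_B * T
    print(f"kT at 300 K = {kT:.3e} J")            # ~4.14e-21 J
    print(f"1 aJ / kT   = {attojoule / kT:.0f}")  # roughly 240 thermal quanta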
This informal but well-referenced description of an afterimage experiment called Ghost Tap provides persuasive and easily reproducible evidence that the visual cortex plays a significant role in certain classes of long-duration visual afterimages. Subjects of the experiment literally cannot discern the difference between the afterimage and reality, and are easily startled when physical motion in the room no longer matches the persuasive afterimages they are perceiving. Anecdotal examples of less extreme versions of the same effect suggest that the Ghost Tap effect has, over centuries, intentionally and unintentionally helped persuade people of the existence of nominally “supernatural” effects that are just persuasive long-duration afterimages. While this description is informal, the easy reproducibility of the Ghost Tap makes it a good candidate for more precise and quantitative studies. One theory why Ghost Tap exists is that it is part of load reduction and spe...
In sharp contrast to the fully virtualized, faster-than-human learning speeds of AlphaGo Zero, the learning speed of AlphaFold2 remains firmly attached to and limited by human experimental time.
The author argues that Richard Feynman’s path integral approach to Quantum Electrodynamics (QED) can be re-interpreted as a way to network predictive computations, and that this style of networked predictive computation may provide insights into how to interpret sensor data faster and more intelligently.
This 2002 study of where and how free and open source software was being used at that time in the U.S. Department of Defense was instrumental both in establishing its value to research, engineering, and security and in leading to DoD policies that enabled broader use of open source software. These policies in turn helped enable broader federal, private sector, and international use and support of open source software. If you currently have a smartphone, many of its capabilities would never have been possible if this study had not helped put into place U.S. federal policies that in turn encouraged broader use of open source software by the private sector.
3. Principles of Organization: This explains the first and most important method chosen to break the subject matter into smaller sections, using four principles of software construction. The subject matter proper appears in section 5.
4. Styles of Construction: This explains a second and less important method chosen to break down the subject matter in each part of section 5 into even smaller subsections, using three styles/methods of software construction.
The articles in IEEE Software are the result of hard work by many people. We deeply appreciate the efforts of everyone who reviewed the many articles submitted to Software last year. The peer review process helps maintain the magazine’s revered quality. All of us in the software development community owe gratitude to people who participate in this crucial service. If you would like to contribute as a reviewer, visit www.computer.org/web/peer-review/magazines to find out how to get involved. —Software’s editorial board and staff
For anyone trying to understand both the basics and the full range of options available when making a DOI metadata submission to Crossref, this linked table of XML element and attribute descriptions gives one small publisher’s best understanding of the most recent version of Crossref’s metadata submission elements and attributes. As of April 2021, the most recent version of Crossref XML files is 4.4.2. This table provides definitions for the six Crossref XML Schema Definition (xsd) files that include the most commonly used description elements of a DOI submission: crossref4.4.2.xsd, common4.4.2.xsd, fundref.xsd, AccessIndicators.xsd, clinicaltrials.xsd, and relations.xsd. The table also includes a brief description of the main features of the externally defined jats:abstract (JATS) element. This table focuses not on XML syntax but on the intent and structure of the elements from a small publisher perspective. This table is one small publisher’s interpretation of Crossref XML and is ...
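As a rough orientation only, a minimal journal-article deposit skeleton built from the most commonly used elements might look like the sketch below. The element names shown are the commonly used ones from the schema files listed above, but this snippet is an unvalidated sketch with placeholder values, not a verified 4.4.2 example; check every element against the actual xsd files before submitting.

    # Unvalidated sketch of a minimal Crossref journal-article deposit skeleton.
    # Element names follow common usage of the deposit schema; all values are placeholders.
    DEPOSIT_SKELETON = """\
    <doi_batch version="4.4.2" xmlns="http://www.crossref.org/schema/4.4.2">
      <head>
        <doi_batch_id>example-batch-001</doi_batch_id>
        <timestamp>20210401000000</timestamp>
        <depositor>
          <depositor_name>Example Small Publisher</depositor_name>
          <email_address>doi@example.org</email_address>
        </depositor>
        <registrant>Example Small Publisher</registrant>
      </head>
      <body>
        <journal>
          <journal_metadata>
            <full_title>Example Journal</full_title>
          </journal_metadata>
          <journal_article publication_type="full_text">
            <titles><title>Example Article Title</title></titles>
            <publication_date media_type="online"><year>2021</year></publication_date>
            <doi_data>
              <doi>10.99999/example.0001</doi>
              <resource>https://example.org/articles/0001</resource>
            </doi_data>
          </journal_article>
        </journal>
      </body>
    </doi_batch>
    """
    print(DEPOSIT_SKELETON)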
Interoperability is the ability to use resources from diverse origins as if they had been designed as parts of a single system. Over time, individual interoperability problems tend to disappear as the resources involved literally become part of one system through integration and standardization, but the overall problem of interoperability itself never disappears. Instead, it simply moves up to a new level of complexity that accepts earlier integrations as a given. Interoperability is especially critical for military systems, where international politics can lead to abrupt realignments where yesterday's foe becomes today's coalition partner. This report on interoperability has five sections. The first section is an introduction to the interoperability problem, and the second section describes fundamental interoperability concepts and develops a terminology for describing interoperability issues and needs. The second section also addresses the vital issue of interoperability...
In 1980, Russian mathematician Yuri Manin published Computable and Uncomputable. On pages 14 and 15 of his introduction, Manin suggests that “Molecular biology furnishes examples of the behavior of natural (not engineered by humans) systems which we have to describe in terms initially devised for discrete automata.” Manin then describes the remarkable energy efficiency of naturally occurring biomolecular processes such as DNA replication. He proposes modeling such behaviors in terms of unitary rotations in a finite-dimensional Hilbert space. The decomposition of such systems then corresponds to the tensor product decomposition of the state space, that is, to quantum entanglement. Manin’s initial focus on biological molecules as examples of highly energy-efficient quantum automata is unique among quantum computing’s founding figures since both he and other early leaders quickly moved to the then-new and exciting concept of von Neumann automata. The von Neumann formalism reinterpreted...
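In standard notation (not taken from Manin's text), the tensor-product decomposition mentioned above can be made concrete with a small numerical check: a two-subsystem state is a product state exactly when its coefficient matrix has Schmidt rank 1, and is entangled otherwise. The sketch below assumes NumPy and uses a generic two-qubit example.

    # Tensor-product structure and entanglement, illustrated in standard terms.
    # A two-subsystem state lives in H_A (x) H_B; it factors exactly when its
    # coefficient matrix has Schmidt rank 1.
    import numpy as np

    product_state = np.kron([1, 0], [0, 1])                    # |0>_A (x) |1>_B
    entangled     = (np.kron([1, 0], [0, 1]) +
                     np.kron([0, 1], [1, 0])) / np.sqrt(2)     # (|01> + |10>)/sqrt(2)

    for name, state in [("product", product_state), ("entangled", entangled)]:
        schmidt_rank = np.linalg.matrix_rank(state.reshape(2, 2))
        print(f"{name}: Schmidt rank = {schmidt_rank}")        # 1 = factorable, 2 = entangled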
A physics theory predicts precise experimental results for some set of naturally occurring phenomena. Consequently, every well-formed physics theory is equivalent to a computer program that uses input descriptions of specific experimental setups to generate the outputs expected from those setups. The Kolmogorov complexity (or Kolmogorov minimum) of such a computer program is the program that uses the smallest number of bits to represent the largest possible set of such input-output data pairs accurately. The principle of concise prediction asserts that the theory whose program length is shortest for a given set of experimental inputs and results is the one most likely to lead to deeper insights and new physics.
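As a toy illustration of the concise-prediction idea (not the paper's own procedure), candidate "theories" can be scored by the length of a program that reproduces a set of input-output pairs, preferring the shortest accurate one. Everything below, including the candidate functions and the data, is hypothetical.

    # Toy "concise prediction" scorer: among programs that reproduce the data,
    # prefer the one with the shortest source. Purely illustrative.
    import inspect

    data = [(x, x * x) for x in range(10)]       # hypothetical experimental input-output pairs

    def theory_a(x):                             # a short candidate "theory"
        return x * x

    def theory_b(x):                             # a longer candidate that also fits
        total = 0
        for _ in range(x):
            total += x
        return total

    def score(theory, pairs):
        """Return source length in bytes if the theory reproduces all pairs, else None."""
        if all(theory(x) == y for x, y in pairs):
            return len(inspect.getsource(theory).encode())
        return None

    best = min((t for t in (theory_a, theory_b) if score(t, data) is not None),
               key=lambda t: score(t, data))
    print("Preferred theory:", best.__name__)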
The physics of photon momentum is mathematically straightforward but can produce surprisingly counterintuitive outcomes. A few simple calculations show how a single photon of green light can, in principle, impart two locomotive engines’ worth of momentum without violating energy conservation. The calculation is one example of why quantum mechanics needs better accounting of linear momentum.
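The paper's locomotive argument is not reproduced here, but the basic arithmetic it builds on is just E = hc/λ and p = h/λ. For a single green photon (assuming λ ≈ 530 nm):

    # Energy and momentum of a single green photon (wavelength assumed ~530 nm; illustrative).
    h = 6.62607015e-34     # Planck constant, J*s
    c = 2.99792458e8       # speed of light, m/s
    wavelength = 530e-9    # green light, m

    energy = h * c / wavelength    # ~3.75e-19 J
    momentum = h / wavelength      # ~1.25e-27 kg*m/s
    print(f"E = {energy:.3e} J, p = {momentum:.3e} kg*m/s")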
Mendeleev's pragmatic, table-based approach to capturing patterns among the chemical elements allowed his table to endure even after quantum theory uncovered the simplicity that created those patterns. While predictively powerful, the Standard Model's early extensive use of mathematical symmetries likely added a level of noise that to this day is hiding, rather than clarifying, the deeper foundations of physics.
Quantum erasure experiments push the boundary between the quantum and classical worlds by letting delayed events influence the state of previously recorded and potentially widely distributed classical information. The only significant restriction on such unsettling violations of forward-only causality is that the distribution of forward-dependent information cannot cross out of the light cone boundaries of the event in the past, a feature that ensures no violations of causality — no rewriting of anyone else's recorded histories — can occur. The erasure interpretation of this conundrum requires rewriting of information recorded and distributed in the past, which would itself be a violation of causality. The quantum predestination interpretation removes the causal rewriting issue. However, quantum predestination requires detailed coordination of inputs from outside of the forward-dependent event's light cone, thus massively violating the same limit that prevents causality viol...
In terms of leveraging the total power of quantum computing, the prevalent current (2020) model of designing quantum computation devices to follow the von Neumann model of abstraction is highly unlikely to be making use of the full range of computational assistance possible at the atomic and molecular level. This is particularly the case for molecular modeling, where using computational models that more directly leverage the quantum effects of one set of molecules to estimate the behavior of some other set of molecules would remove the bottleneck of insisting that modeling first be converted to the virtual binary or digital format of quantum von Neumann machines. It is argued that even though this possibility of “fighting molecular quantum dynamics with molecular quantum dynamics” was recognized by early quantum computing founders such as Yuri Manin and Richard Feynman, the idea was quickly overlooked in favor of the more computer-compatible model that later developed into qubits an...
The Academic Indifference Ratio (AIR) of a programming language is the ratio of the total lines of code (TLOC) for the language over the total lines of academic text (TLAT) written about the language. A large AIR means the language is heavily used but only lightly studied. COBOL and Visual Basic are near the head of the line for AIR metrics, so one might say that academia treats them as AIR-head languages that are too mundane for serious study. This situation is unfortunate since, for example, heavily used languages such as Visual Basic end up pushing the boundaries of automation in areas such as ease-of-use and increased safety through self-correcting logic. It's time to stop ignoring these woolly mammoths in the room and instead start viewing them as resources and inspirations for papers on how humans and computers work best together.
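Stated as a formula, AIR = TLOC / TLAT. The sketch below simply computes that ratio for a couple of entirely hypothetical line counts; the figures are invented for illustration, not measurements.

    # Academic Indifference Ratio: total lines of code divided by total lines of
    # academic text about the language. All figures below are hypothetical.
    def air(total_lines_of_code, total_lines_of_academic_text):
        return total_lines_of_code / total_lines_of_academic_text

    hypothetical = {"COBOL": (2.0e11, 1.0e6), "Visual Basic": (5.0e10, 5.0e5)}
    for language, (tloc, tlat) in hypothetical.items():
        print(f"{language}: AIR ~ {air(tloc, tlat):.0f}")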
As indicated by the name "quantum erasure," the most common interpretation of certain classes of delayed choice quantum experiments is that they, in some fashion, erase or undo past decisions. Unfortunately, this interpretation cannot be correct since the past decisions were already classically and irreversibly captured as recorded information or datums. A datum is information that, through temporal entanglement, constrains future events. The correct interpretation of such experiments is stranger than erasure: Recordings made early in such quantum experiments predestine choices made later, even when those later choices are made through arbitrarily complex and often human-scale classical processes. Since this process of quantum predestination occurs only within the future light cone of datum creation, another (possibly) less radical way to interpret such experiments is that time is multiscale, granular, and impossible to define outside of the quantum state of the entities involved. The continuum time abstraction is not compatible with this view.
This paper provides a reference copy of one particular and highly informal comment in a multiweek Academia.edu discussion of the paper Randomness in Relational Quantum Mechanics by Gary Gordon. The other main participants in this particular thread of the discussion were Doug Marman, Conrad Dale Johnson, Ruth Kastner, and the author. In this comment, the author argues that the only self-consistent approach to reconciling Feynman path integrals with Maxwell’s experimentally well-proven theory of electromagnetic wave pressure is introducing a new spin-0 particle, the vacuum or space phonon (sonon), that conveys linear momentum. The path histories of QED become the always-expanding structure of the sonon field, which, like a bubble, becomes increasingly unstable as it expands. The collection of all sonon fields around well-defined bundles of conserved quantum properties creates xyz space by defining the complete set of relational information for those entities. Spacetime in the sonon mo...
This report documents the results of a study by The MITRE Corporation on the use of free and open-source software (FOSS) in the U.S. Department of Defense (DoD). FOSS gives users the right to run, copy, distribute, study, change, and improve it as they see fit, without asking permission or making fiscal payments to any external group or person. The study showed that FOSS provides substantial benefits to DoD security, infrastructure support, software development, and research. Given the openness of its source code, the finding that FOSS profoundly benefits security was both counterintuitive and instructive. Banning FOSS in DoD would remove access to exceptionally well-verified infrastructure components such as OpenBSD and robust network and software analysis tools needed to detect and respond to cyber-attacks. Finally, losing the hands-on accessibility of FOSS source code would reduce DoD’s ability to respond rapidly to cyberattacks. In short, banning FOSS would have imme...
This is an early version of the Software Engineering Body of Knowledge (SWEBOK) chapter on Software Construction, meaning the creation of artifacts (programs) that enable computers to implement specific desired behaviors. The article describes construction in terms of its dependence on the verbal, mathematical, and visual capabilities of the human brain.
Andre Adelsbach Adnan Agbaria Gail-Joon Ahn Ammar Alkassar Jim Alves-Foss Lorenzo Alvisi Yair Amir Nirwan Ansari Jean Arlat Dmitri Asonov James Aspnes Vijay Atluri Paul C. Attie Oscar Au ... Michael Backes Saurabh Bagchi Iris Bahar Roberto Baldoni David Banks Claudio Basile Luigi Benedicenti Vincent Berk Elisa Bertino Konstantin Beznosov Edoardo Biagioni Ernst Biersack Matthew Bishop Rakesh Bobba Terry B. Bollinger William Bolosky Andrea Bondavalli Dan Boneh Jeremy Bryans Peter Buchholz Levente Buttyán ... Christian Cachin Jan Camenisch Roy ...
Abel, Timothy C., Alcon; Abumara, Humam, Motorola; Adrion, W. Richards, University of Massachusetts, Amherst; Agarwal, Ramesh K., McDonnell Douglas Research; Agrawal, Dharma D., North Carolina State University; Akiyama, Yoshihiro, IBM Japan; Albrecht, Allan J.; Allen, Alastair R., University of Aberdeen; Allen, Wayne P.; Almy, Tom A., Tektronix; Alvarez, Angel, Ciudad Universitaria; Alverson, Gail A., University of Washington; Ambler, Allen L., University of Kansas; Ammann, Paul E., George Mason University; Anderson, Peter G., Rochester Institute of Technology ...
The articles appearing in IEEE Software are the result of hard work by many people. We deeply appreciate the efforts of everyone who participated in our peer review process last year. Authors often tell us how much they value the reviewers' comments and suggestions. Their expertise, care, and attention help maintain Software's quality. All of us in the software development community owe them a heartfelt “thank you.” Readers who would like to contribute to our community by reviewing papers next year can visit www.computer.org/ ...
Alain Abran, Université du Québec à Montréal; Carl Adams, University of Portsmouth; Michael Ainsworth, Praxis Critical Systems; Mehmet Aksit, University of Twente; Roger Alexander, Colorado State University; Julia Allen, Carnegie Mellon University; Rajeev Alur, University of Pennsylvania; Javier Andrade-Garda, University of A Corunna; Annie I. Anton, Georgia Institute of Technology; Mark Ardis, Rose-Hulman Institute of Technology; Felix Bachmann, Software Engineering Institute; Donald Bagert, Rose-Hulman Institute of Technology; Akhilesh Bajaj, Carnegie Mellon ...
IDA Document D-754 summarizes the Reuse in Practice Workshop which was held at the Software Engineering Institute. The objective of this workshop was to assess the current state of the practice of software reuse and provide recommendations to the research and user communities to enhance software reuse. The workshop focused on four areas of software reuse: domain analysis, implementation, environments, and management. Position papers from several of the attendees are included as part of the document.
Since the early days of quantum theory, the concept of wave function collapse has been looked upon as mathematically unquantifiable, observer-dependent, non-local, or simply inelegant. Consequently, modern interpretations of quantum theory often try to avoid or make irrelevant the need for wave collapse. This is ironic, since experimental quantum physics requires some variant of wave collapse wherever quantum phenomena interact with the classical universe of the observer. The paper "Quantum-Inspired Simulative Data Interpretation: A Proposed Research Strategy" (MITRE Public release 10-3164) proposes a pragmatic view in which wave function collapses are treated as real phenomena that occur in pairs. Paired collapses occur when two wave packets exchange real (vs. virtual) momentum-carrying force particles such as photons. To minimize reversibility, such pairs must be separated by a relativistically time-like interval. The resulting Wave Packet Network (WPN) model resembles a ...
Books are like people: They get more interesting when you let them interact a bit. In this pair of book reviews, I look at two very different books that nonetheless share a common theme of wondering what science has in store for us in the near future. The first, A New Kind of Science by Stephen Wolfram, is a nonfiction work intended to provide a new basis for all of science. The other, Prey by Michael Crichton, is a technology-based fantasy. After the reviews, I discuss some of the interesting threads of fact and fantasy that weave their ways through both books.
In this debate-style pair of articles, Bill Curtis from SEI argues that proper attention to process maturity in an organization is the key to accelerating its ability to produce quality software. Terry Bollinger, a former Associate Editor-in-Chief for IEEE Software, takes the perspective that only an organization that has deep and up-to-date skills in the technologies most relevant to the problem area will be able to produce trustworthy, high-quality software quickly.
A software development manager who temporarily defected to the programmers' trenches gained valuable insights into bottom-up software engineering.
The academic indifference metric AI=LOC/LIA of a computer language is the ratio of the total lines of code written in that language to the total number of lines written about that language in academic articles. Languages such as COBOL and Visual Basic have AI values that are both anomalously high and remarkably persistent over time. The author argues that the existence of such anomalous AI values indicates either an inexplicably broad use of weak programming languages, an incomplete academic understanding of the deeper nature of software development, or both.
Linux (pronounced LIN-ucks) is the common name for any well-defined collection of system and application software (a distribution) that uses the UNIX-like Linux kernel for its basic operating system services, such as process control, memory allocation, device management, file access, and networking. To qualify as a Linux distribution, such a collection of software must also consist entirely or almost entirely of freely redistributable, public-source software, which is variously known as free, open, or sometimes libre software. Linux is most commonly used on Intel x86/Pentium-based personal computers (PCs), but it has also been ported to a large and growing number of non-x86 computers of diverse ages, sizes, and capabilities. Linux is notable for having achieved a level of public awareness and stock market notice second only to Microsoft Windows. This article discusses the history of Linux, the design of the Linux kernel, Linux development, the role of Linux distributions, and possible future directions of this intriguing operating system. (The non-free encyclopedia article is available by online subscription at: http://onlinelibrary.wiley.com/doi/10.1002/0471028959.sof184/full )
This paper describes the results to date of a research effort to apply a pattern approach to the problem of addressing information assurance (IA) in enterprise-level information engineering. IA is not effectively included in Enterprise Architectures today, largely because there is no compendium of knowledge immediately useful to enterprise engineers who are not IA specialists. The goal of this research
