Chapter 5. A Bit of Science and Philosophy

Universal Memory Dump: A Definition

Applying a mathematical definition of a memory dump (Volume 1, page 501) to natural systems, we can introduce a Universal Memory Dump as a snapshot of observables describing the system. As with software memory dump analysis, we need a suitable reader and a set of Universal Symbol Files as semantic mappings, or NDB (Nature Data Base) files.

Therefore we have these two categories of universal memory dumps:

  • Natural Memory Dumps

  • Software Memory Dumps

The Source of Intuition about Infinite

What is the source of our intuition about ∞, powers of ∞, and even an ∞ number of powers? We conjecture that the underlying structure of our Universe (or at least of a universe as a model of the Universe) is Infinite Memory, which, together with perceived processes as limits and the Time Arrow as a bundle of sequences of memory pointers, provides the basis for our intuition about the infinite.

Geometrical Debugging

Most (if not all) debugging is arithmetical. Here we would like to introduce a new kind of debugging and troubleshooting approach that interprets observables as objects in their own spaces, for example, the space of possible GUI forms. These spaces are not necessarily the rational-valued spaces of simulation output or the discrete arithmetic spaces of memory locations and values.

This geometrical approach applies modelling and systems theory to debugging and troubleshooting by treating them as mappings (or functions in the case of one-to-one or many-to-one mappings) from the space of all possible software environment states (SE) to the space(s) of observables. Here we have a family of mappings to different spaces:

fi: SE → SOi

Some observables can be found to be fixed, like the list of components, and the number of mappings can be reduced (i < j):

fj: SEa,b,c,d,... → SOj

In every system and its environment we have something fixed as parameters (a, b, c, d, ...): this could be the list of components as a high-level "genotype", or it could be just specific code (a low-level "genotype"), specific data or a hardware specification. The whole family of mappings becomes parameterized. If we want, we can reduce the number of mappings even more by treating them as many-valued (one-to-many or many-to-many) if several observables belong to the same kind of space.

Let's illustrate this by an analogy with the modelling of a natural system. The system to be modelled is a falling ball together with its environment (Earth). The system obviously has some internal structure (an abstract space of states, E) but we don't know it. Fortunately, we can observe some measurable values like the ball's position at any time (Q). So we have these mappings for balls with different masses:

fm: E → Q

We also find that for any individual ball its mass doesn't change so we abstract it as a parameter:

f: Em → Q
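The falling-ball mapping can be sketched in code, purely as an illustration (assuming ideal free fall with g ≈ 9.81 m/s² and no air resistance; all names here are hypothetical):

```python
G = 9.81  # gravitational acceleration, m/s^2

def make_mapping(mass_kg, initial_height_m):
    """f: E_m -> Q. The mass is abstracted as a fixed parameter; the
    returned function maps the (hidden) state, reduced here to the
    time t, to the observable position of the ball."""
    def position(t_s):
        # Ball position above ground; it cannot fall below the ground.
        return max(initial_height_m - 0.5 * G * t_s * t_s, 0.0)
    return position

f_light = make_mapping(mass_kg=1.0, initial_height_m=10.0)
f_heavy = make_mapping(mass_kg=5.0, initial_height_m=10.0)
# In ideal free fall the observable trajectory doesn't depend on the
# mass parameter, which is exactly why it can be abstracted away:
# f_light(1.0) == f_heavy(1.0)
```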

The same modelling approach can be applied to a software system, be it an application or a service running inside an operating system, or a software system itself running inside hardware. The case of a pure software system abstracted from hardware is simple. In such a case, the SE space could theoretically be the space of abstract memory dumps (Volume 1, page 501). In practice, we deal with the space of observables (universal memory dumps, page 295) that approximate SE and with spaces of software "phenotypes", observable behaviour such as a distorted GUI, measured values of memory and CPU consumption, or disk I/O throughput.

Riemann Programming Language

Named after Bernhard Riemann[8], this programming language gives software defects first-class status as alternative branches of computation, comparable to multivalued functions[9] and Riemann surfaces[10]. Bugs become first-class constructs. This is reflected in the language's syntax, semantics and pragmatics.
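No syntax for such a language is defined in this section; purely as an illustrative sketch (all names hypothetical, not part of any existing language), a computation whose defects are first-class alternative branches, by analogy with a multivalued square root, could look like this:

```python
from dataclasses import dataclass
from typing import List, Union

@dataclass
class Value:
    v: float          # a normal branch of the computation

@dataclass
class Defect:
    reason: str       # a bug as a first-class construct, not an exception

Branch = Union[Value, Defect]

def mv_sqrt(x: float) -> List[Branch]:
    """Return all branches of the computation, like the two sheets of
    the Riemann surface of sqrt; a negative input contributes a Defect
    branch instead of aborting the computation."""
    if x < 0:
        return [Defect("square root of a negative real")]
    r = x ** 0.5
    return [Value(r), Value(-r)]

branches = mv_sqrt(4.0)  # → [Value(v=2.0), Value(v=-2.0)]
```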

Is Memory Dump Analysis a Science?

Based on John Moore's eight criteria of science, we can consider Memory Dump Analysis (MDA) a science:

  1. MDA is based on data (memory dumps) collected in the field or in a repro / test environment.

  2. Data (memory dumps) are collected to answer troubleshooting, debugging, forensics or intelligence questions. Observations in memory dumps are made to support or refute answers to these questions.

  3. Analysis of data (via memory dump analyzers, debuggers and log analyzers) is done objectively.

  4. Troubleshooting, debugging or forensics hypotheses are developed; they are consistent with observations and compatible with the general conceptual framework of computer memory.

  5. Troubleshooting, debugging or forensics hypotheses are tested and several comparable competing ones may be developed at any one time.

  6. Generalizations are made that are valid universally within the domain of MDA.

  7. The facts are confirmed independently.

  8. Previously puzzling facts are explained.

It is also interesting to generalize the domain of MDA to empirical data collection via so-called universal memory dumps (page 295).

My Dangerous Idea: Parameterized Science

Here's my own dangerous idea in return: in the future, all sciences, engineering and technology will ultimately be fused and concerned with universal memory dumps (page 295) of empirical data, where an appropriate set of symbol files will be used for every science as we know it today; we call these files science files. The set of science files can be considered a parameter, hence the name of this idea. In other words, there will be one Science of memory dump analysis and many sciences. All sciences will finally be unified.

Now the question is: would it be also possible to discover new sciences by finding a suitable set of science files corresponding to a collected dump of empirical data?

Unique Events and Historical Narratives

Sometimes a problem like a crash or a hang never happens again: a so-called unique computational event, like the extinction of the dinosaurs if we apply biological metaphors. The science of biology copes with such events by constructing historical narratives and multiple probabilistic explanations with cross-examination of data. The same is true for memory dump analysis, where we construct possible explanations based on evidence and collected supporting data. As Ernst Mayr[11] pointed out, we try to answer both questions: "How?" and "Why?" Usually the answer to the first question is very simple and straightforward, like a NULL pointer access (proximate, functional causation), and the answer to the second question is provided by testing various possible historical narratives (ultimate or evolutionary causation), possibly involving an animate agent (a human user of the system).

Notes on Memoidealism

We start our discussion with Urstoff (Ger.), the primitive, primordial and basic element of everything, and the relation of memoidealism to the Ionian school[12]. In memoidealism, technically speaking, Memory serves the role of Urstoff as the permanent primary element behind the process of state transition changes. In contrast, the Ionians considered Urstoff to be of a material nature, for example, air (Anaximenes[13]), fire (Heraclitus[14]) or water (Thales[15]). This abstraction (abstract materialism) of material elements parallels the memory abstraction in memoidealism. Another parallel is the unity of science and philosophy.

Memoidealism is characterized by the unity of philosophy and (computer) science. It has deep roots in practical memory (dump) analysis. The interpretation of observations as memory snapshots (universal memory dumps) leads to the declaration of Memory as the One (or the First Principle), like Water in Thales' practical scientific philosophy. We also observe that processes are memory snapshots as well (through their observational data). We try to understand the plurality of experiences through the unity of memory (the so-called Unity in Difference).

An indeterminate, infinite Urstoff, out of which emerges the plurality of worlds that come and go, is the foundation of Anaximander's philosophy. In memoidealism, Memory is indeterminate in the sense that it doesn't represent a determinate material substance. It is actually infinite too. The crucial feature of the memoidealistic notion of memory is the fact that the plurality that comes into existence doesn't perish. It is saved. In some sense, Memory is the apeiron of memoidealism.

The Urstoff of Anaximenes is Air, much like Memory in memoidealism. How do concrete objects develop from invisible Air? Through the processes of condensation and rarefaction: quality arises from quantity (a reduction process). The eternity of Urstoff is one of the main features of the Milesian philosophers and of memory religion[16]. No additional worlds are possible in their philosophies. They are "materialists" because of their material Urstoff. Memoidealists are "idealists" because of their ideal notion of Memory.

A Copernican Revolution in Debugging

A number of Copernican revolutions have occurred or been announced in various branches of various sciences. Now it's our turn to say that action-based "earth-centric" debugging is replaced by memory (dump) analysis as the "heliocentric" foundation of debugging. Even in live debugging we have memory snapshots and differential memory analysis. A trace in trace-based debugging is another example of a universal memory dump. Therefore, memory (dump) analysis comes first.

On Subjectivity of Software Defects

If we adopt the model-based definition of software defects (Volume 1, page 511), we can easily see that any change to an underlying model can surface new unanticipated defects and hide known ones. New and evolving disciplines like software security engineering can change our views about solid code and create defects by introducing non-functional constraints on models. Another aspect of this is the interaction of a human debugger with code: the very act of reading code can create defects. However, the latter effect is controversial and belongs to the evolving quantum theory of software defects (bugtanglement, Volume 2, page 367).

Memory Field Theories of Memuonics

Here we recall memuons[17], the indivisible entities of memory. Their study is the domain of the new science called memuonics[18]. According to the so-called memophysical principle[19], we have a particle interpretation of memuons. This is called classical memuonics, with a classical memory field theory where memuons are "quanta" of memory. We can also "quantize" memory fields and get quantum memory field theories where memuons are created and annihilated.

Software Trace: A Mathematical Definition

What is a software trace from a mathematical standpoint? Before any software writes its trace data, it assembles it in memory. Therefore, generally, a software trace is a linearly ordered sequence of specifically prepared memory fragments (trace statements):

(ts₁, ts₂, ..., tsₙ)

where every tsᵢ is a sequence of bits, bytes or other discrete units (see the definition of a memory dump, Volume 1, page 501):

(s₁₁, s₁₂, ..., s₁ₖ, s₂₁, s₂₂, ..., s₂ₗ, ..., sₙ₁, sₙ₂, ..., sₙₘ)

These trace statements can also be minidumps: selected regions of memory space. In the limit, if every tsᵢ is a full memory snapshot saved at an instant of time (tᵢ), we have a sequence of memory dumps:

(mₜ₁, mₜ₂, ..., mₜₙ)
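A minimal sketch of this definition (class and method names are hypothetical): a trace is a linearly ordered sequence of byte fragments, and concatenating them recovers the flat sequence of discrete units:

```python
from typing import List

TraceStatement = bytes  # a specifically prepared memory fragment

class SoftwareTrace:
    """A linearly ordered sequence (ts1, ts2, ..., tsn) of trace
    statements, assembled in memory before being written out."""
    def __init__(self) -> None:
        self.statements: List[TraceStatement] = []

    def emit(self, fragment: TraceStatement) -> None:
        # The order of emission defines the linear order of the trace.
        self.statements.append(fragment)

    def units(self) -> bytes:
        # The flattened sequence (s11, ..., s1k, s21, ..., snm) of units.
        return b"".join(self.statements)

trace = SoftwareTrace()
trace.emit(b"\x01\x02")        # ts1: two units
trace.emit(b"\x03\x04\x05")    # ts2: three units
```

In the limit, each emitted fragment could be a full memory snapshot, giving the sequence of memory dumps described above.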

As with memory dump analysis, we need symbol files to interpret saved memory fragments unless they were already interpreted during their construction. For example, traces written according to the ETW (Event Tracing for Windows) specification need TMF (Trace Message Format) files for their interpretation and viewing. Usually these files are generated from PDB files, and therefore we have this correspondence:

memory dump file -> software trace file

PDB file -> TMF file

Quantum Memory Dumps

Quantum computation[20], quantum memory and quantum information[21] are hot topics at the time of this writing. Unfortunately, quantum mechanics forbids perfect (ideal) memory dumps due to the so-called no-cloning theorem[22]. Still, it is possible to get inconsistent (imperfect) memory dumps, and perfect ones can be made from quantum computer simulators. The analysis of quantum memory snapshots is the domain of Quantum Memoretics (Volume 2, page 347).

Chemistry of Virtual Memory

We can use a nice basic chemical formula representation for processes in memory. In this nomenclature, the class of modules developed by a particular vendor constitutes an element. For example, M is for Microsoft modules, C is for Citrix modules, etc. Individual modules of a particular element are similar to "atoms", and their count is denoted by a number in subscript. For example, the net.exe command running in a typical Citrix terminal services environment has the following loaded modules, where the Citrix modules are the hook and injection modules (wdmaudhook through CtxSbxHook) and the rest are Microsoft modules:

0:000> lm1m
net
wdmaudhook
tzhook
twnhook
scardhook
mmhook
mfaphook
cxinjime
CtxSbxHook
MPR
NETAPI32
Secur32
USER32
msvcrt
GDI32
RPCRT4
kernel32
ADVAPI32
MSVCR71
ntdll

Therefore, the formula is: M₁₁C₈.

We put the element of the main process module first in such formulae.

The formula for the IE process from the case study on page 268 is M₁₂₆A₅U, where A is for Adobe modules and U is for an unknown module that needs identification; see the Unknown Component pattern (Volume 1, page 367).
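As a sketch, the construction of such a formula can be automated from a module list; the vendor table and function names below are hypothetical illustrations, not part of any debugger or tool:

```python
from collections import Counter

def process_formula(main_module, modules, vendor_table):
    """Build a chemical-style formula (e.g. M3C2) from loaded module
    names. Unrecognized modules map to element U (unknown component)."""
    element = lambda m: vendor_table.get(m.lower(), "U")
    counts = Counter(element(m) for m in modules)
    # The element of the main process module comes first in the formula.
    first = element(main_module)
    order = [first] + sorted(e for e in counts if e != first)
    # Counts are printed inline here; the book renders them as subscripts.
    return "".join(f"{e}{counts[e]}" for e in order)

# Hypothetical vendor table: M for Microsoft, C for Citrix.
table = {"net": "M", "user32": "M", "ntdll": "M",
         "mfaphook": "C", "ctxsbxhook": "C"}
modules = ["net", "USER32", "ntdll", "mfaphook", "CtxSbxHook"]
formula = process_formula("net", modules, table)  # → "M3C2"
```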

These formulas can be useful to highlight various hooksware components (Volume 2, page 63) and to distinguish memory dumps generated after eliminating modules for troubleshooting and debugging purposes. They also form the basis for one of many classificatory schemes for the purposes of micro- and macro-taxonomy of software, discussed in the forthcoming book:

The Variety of Software: The Richness of Computation (ISBN: 978-1906717544)

In the next volume we also discuss structural formulas, similar to the ones used in organic chemistry.



[8] http://en.wikipedia.org/wiki/Bernhard_Riemann

[9] http://en.wikipedia.org/wiki/Multi-valued_function

[10] http://en.wikipedia.org/wiki/Riemann_surface

[11] http://en.wikipedia.org/wiki/Ernst_Mayr

[12] http://en.wikipedia.org/wiki/Ionian_school

[13] http://en.wikipedia.org/wiki/Anaximenes_of_Miletus

[14] http://en.wikipedia.org/wiki/Heraclitus

[15] http://en.wikipedia.org/wiki/Thales

[16] http://www.dumpanalysis.org/blog/index.php/2008/12/20/memorianicprophecy-0m1/

[17] The notion of memuons first appeared in the philosophy of memoidealism (Volume 2, page 349)

[18] Please don't confuse memuonics with memiotics. The latter is computer memory semiotics (Volume 2, page 350)

[19] Memophysical principle: theories of a memory-based universe need to take into account the current mainstream sciences, including physics.

[20] http://en.wikipedia.org/wiki/Quantum_computation

[21] http://en.wikipedia.org/wiki/Quantum_information

[22] http://en.wikipedia.org/wiki/No_cloning_theorem
