How encapsulation does not prevent knowledge sharing

http://avancier.website

 

Despite encapsulation, a client may know something of the state data maintained by a server.

Not by direct access to it, but through the exchange of messages, and the specification of operation contracts.

Knowing something

All animals must know something of the world they live in.

An earthworm knows enough to recognise another worm of the same type – for mating purposes.

Certainly, a worm can recognise other members of the worm set, though it may not remember them, and surely cannot count them.

 

Even 3,500 million years ago, the earliest organisms knew enough not to eat themselves.

They could recognise their own substance and distinguish it from chemicals in their environment.

 

Later, through evolution, animals developed ever more sophisticated ways of knowing the world.

By about 700 million years ago, jellyfish had nerve nets that enabled them to sense things in the world and manipulate them.

In a nerve net, intermediate neurons monitor messages from sensory neurons and react by sending messages to direct motor neurons.

Thus, animals evolved to process transient sensations – each being an encoded description of a reality.

 

By about 550 million years ago, some animals had a central hindbrain to monitor and control homeostatic state variables.

An internal information feedback loop connected that hindbrain to the organs and motors of the body.

The hindbrain had to sense the body's state variables and send messages to direct actions that maintain those variables.

 

About 250 million years ago, the paleo-mammalian brain evolved to manage more complex emotional, sexual and fighting behaviors.

A wider information feedback loop was needed to connect that higher brain to the external world.

The higher brain had to sense the current state of food items, friends and enemies in the world, and direct appropriate actions.

 

In short

Animals must know the real world well enough to manipulate it and predict its behavior.

If they could not find food and avoid danger, they would soon die.

A staggering achievement of humankind was the invention of machines that can store knowledge.

Software systems must know something of the world around them, since they are designed to monitor and direct entities and events in that world.

Sharing knowledge

Humans not only remember knowledge (in memory); they also share knowledge (via messages).

They describe their knowledge of the world to each other - well enough.

When you talk to me, I get to know something of what you know (not how you store knowledge in your brain).

If humans could not share knowledge thus, human society would soon collapse.

I can test what you tell me against my experience, and in other ways: empirical, logical and social.

And then reasonably conclude I do know something of what you know - well enough.

 

Sadly, human communication is imperfect, and misunderstandings occur.

But software systems also remember knowledge (in memory) and share knowledge (via messages).

And they are designed to share knowledge perfectly.

E.g. A sensor object tells a motor object that a room's air temperature is too low; there is no room for misunderstanding.

The motor acts according to that shared knowledge.

The motor need not directly access the sensor's state data, but one way or another it needs to know the state of the sensor.
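
As a rough sketch of that arrangement (the class and method names below are invented for illustration, not drawn from any particular system), a sensor might hide its reading behind a query operation, and a motor might act on the answer:

class TemperatureSensor {
    private double currentReading;                 // encapsulated state data
    private final double targetTemperature = 20.0;

    void record(double reading) { this.currentReading = reading; }

    // The sensor reports a fact it knows, without exposing how it stores it.
    boolean roomIsTooCold() { return currentReading < targetTemperature; }
}

class HeaterMotor {
    // The motor acts on knowledge received through a message,
    // not by reading the sensor's private fields.
    void respondTo(TemperatureSensor sensor) {
        if (sensor.roomIsTooCold()) {
            switchOn();
        }
    }

    private void switchOn() { System.out.println("Heater switched on"); }
}

Whatever the sensor's internal representation of its reading, the motor now knows the one fact it needs: the room is too cold.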

Encapsulating knowledge storage

Hiding how a system, person or object stores information does not mean hiding what it knows.

People share knowledge via emails stored on an email server.

We don’t know or care how our emails are stored (in binary digits).

But our ongoing discussion depends on sharing the knowledge that is encoded in that store.

 

Encapsulating a system behind an interface is a principle of modular design, much older than OO thinking.

In the 1960s, an interface was understood to hide what is inside the system: its internal operations and state data.

In the 1970s, Parnas proposed principles for modular system design.

He recommended encapsulating the operations that write (encode) and read (decode) the elements of a persistent data structure.

Similarly, in the 1980s, OO theory presumed a client cannot directly access the state of a server.

Still, a client object may request a server object to report facts that it detects or knows.

Why? So that the client can act according to the knowledge it has acquired.
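
The sketch below illustrates that principle (the CustomerLedger class and its operations are invented for illustration): the internal data structure is hidden, in the spirit of Parnas, yet a client still acquires knowledge of it through the write and read operations.

import java.util.HashMap;
import java.util.Map;

class CustomerLedger {
    // Internal representation: hidden from clients, free to change.
    private final Map<String, Double> balancesByCurrency = new HashMap<>();

    // Write (encode) operation.
    void credit(String currency, double amount) {
        balancesByCurrency.merge(currency, amount, Double::sum);
    }

    // Read (decode) operation: reports a fact the object knows,
    // without revealing how that fact is stored.
    double balanceIn(String currency) {
        return balancesByCurrency.getOrDefault(currency, 0.0);
    }
}

A client calling balanceIn("GBP") learns something the ledger knows, while remaining ignorant of the map (or file, or table) behind it.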

Exposing the logical structure of state data in operation contracts

In business operations, human and computer actors need to know the state of entities recorded in business databases.

The business rules for an operation can be defined as preconditions and postconditions referring to the state of the server system that performs the operation.

E.g.

·         a precondition of a sale operation is this: Debt + Sale Value < Credit Limit.

·         a postcondition of the sale operation is this: Debt (after) = Debt (before) + Sale Value.

 

A client may need to check that a precondition is met, then direct the server to establish a postcondition.

Thus, a client may need to know something of the server’s state data, both its structure and its value.
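
The sale example above might be sketched in code as follows (the class, field and method names are invented); the server enforces the contract, and the client may query the precondition before directing the operation:

class CustomerCreditAccount {
    private double debt;
    private final double creditLimit;

    CustomerCreditAccount(double creditLimit) { this.creditLimit = creditLimit; }

    // The client can check the precondition via a query,
    // without reading the state data directly.
    boolean canAccept(double saleValue) {
        return debt + saleValue < creditLimit;
    }

    // Precondition:  Debt + Sale Value < Credit Limit
    // Postcondition: Debt (after) = Debt (before) + Sale Value
    void recordSale(double saleValue) {
        if (!canAccept(saleValue)) {
            throw new IllegalStateException("Precondition violated: sale would breach the credit limit");
        }
        debt = debt + saleValue;
    }
}

Note that the contract exposes the logical structure of the state data (Debt, Credit Limit) without exposing its physical representation.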

 

In short, despite encapsulation, a client may know something of the state data maintained by a server.

Not by direct access to it, but through the exchange of messages, and the specification of operation contracts.