Thursday, July 20, 2006

Information, Knowledge, Wisdom

How do these three puzzle pieces fit together?




How do we tell them apart? How do we acquire them? How do we apply them?



GreenSmile said...

========== or is it ==========
It's both, sometimes in the same place at the same time, but often one is absent, so neither survives nor is seen.

I pulled out the bits and data to add to your hierarchy right away, because that same layering has occurred to me at different times.

Anonymous said...

ah, semantics. Here's my take:

Information: the basic stuff, like greensmile suggests; multiple bits.

Data: information from a certain set.

Knowledge: useful, retained information.

Wisdom: knowledge that applies to more than one type of information, and is valuable to share with others.

I'd like to think that the last of these applies only to sentient species... but something tells me I'd be wrong.

Anonymous said...

Hi, I suggest you visit:
The relation IPK
The IPK relation ( Knowledge) is more formal, better integrated, and should be clearer.
The data and wisdom concepts belong to two other independent perspectives: the first relates to the processing of everything, the second to utility (also of everything).
The IPK multi-layer ontology is "computational" and is used for the modeling and simulation of an Abstract Intelligent Agent; that is, it applies to human symbolic thinking too.

References: TOGA Metatheory, Adam Maria Gadomski (see Google search).
- 6 Feb. 2006

George said...

I've been playing around with something I call the Noetic Hierarchy. From the top:
Wisdom: moral judgment based on tacit knowledge (experience).
Understanding: capacity to interpret and explain - needed for anticipatory decisions.
Conceptual knowledge: learned explicit encoded concepts - basis of language.
Procedural knowledge: learned implicit and tacit - basis of skills.
Phyletic knowledge: inherited - basis of what cognitive psychologists call folk-knowledge, and affect.

For information I like: news of a difference that makes a difference (attributed to Gregory Bateson). Shannon information follows along this line in some interpretations: messages that have a lower a priori expectancy (on the part of the receiver) convey more information. It's a measure of surprise. In this sense, knowledge is the inverse of information: the more you know, the less you are surprised!
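The "measure of surprise" reading can be made concrete with Shannon's self-information, I(x) = -log2 p(x): the less probable a message, the more bits of information its arrival carries. A minimal sketch (the helper name `surprisal_bits` is my own, not from the post):

```python
import math

def surprisal_bits(p):
    """Shannon self-information of an event with probability p, in bits."""
    if not 0 < p <= 1:
        raise ValueError("probability must be in (0, 1]")
    return -math.log2(p)

# A rare (surprising) message carries more information than an expected one.
print(surprisal_bits(0.5))   # fair coin flip: 1.0 bit
print(surprisal_bits(0.01))  # a 1-in-100 event: ~6.64 bits
```

Note how a certain event (p = 1) yields zero bits, matching the comment's point: what you already know cannot surprise you.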

In the other direction, informational messages, once received, are transformed in the brain into knowledge.