WhatIsKnowledge


0. Preamble

Perhaps knowledge is "useful information". Useful means, for example, that knowledge lets us do things we otherwise couldn't do, provides an advantage of some kind, or helps in a number of situations. People know how to use a roadmap to find their way. People know how to solve a mathematical equation. This implies that knowledge has to model reality to a certain extent. It also means that knowledge is an abstraction. It's usually more than pure information like "the temperature at YZ airport at DATE/TIME was X degrees". Knowledge is about understanding.



1. TOC

1. TOC
2. Interesting insights at:
2.1. kurzweilAi.net
3. Various views
3.1. My knowledge is ... -- HansWobbe
3.2. Knowledge is ...
3.3. Data, Inclusion & Expert Systems
3.4. wikiPedia view & comments
3.5. Knowledge models & comments

It may be effective to (re)structure this page to have an expanded section for "Knowledge models", given that at least two are beginning to coalesce out of the material that is accumulating and that several more exist. Of course, that depends on our Objectives for this page, which I seem to recall was a ShallowPage stub just over a year ago and has since grown as a few people have occasionally added content. Please add any suggestions regarding an Objective for this page that may help guide its development.


2. Interesting insights at:

2.1. kurzweilAi.net

Interesting insights are available at kurzweilAi.net.


3. Various views

After a great deal of thought and discussion over many years, I find the simple statement "Knowledge is what I know" the most appealing. Considerable elaboration is (of course) possible.

Useful perspectives may result from considering ...

-- HansWobbe

3.1. My knowledge is ... -- HansWobbe

"My knowledge is the current information I base my decisions on." -- HansWobbe.

3.2. Knowledge is ...


this "Knowledge is ..." section of this page was reviewed by HansWobbe 061217.

Knowledge is power.

Knowledge gives us the power to initiate Actions with greater confidence, because there is less uncertainty regarding the expected consequences of the act.

Knowledge of our past experiences is what gives us an appreciation of the relationship between Cause and Effect, one of the most fundamental of human cravings. This is a foundation stone for our religious beliefs, our efforts to anticipate and plan for future events, and probably the most important investment decision an individual can make.


3.3. Data, Inclusion & Expert Systems


3.4. wikiPedia view & comments

From Wikipedia, the free encyclopedia. (WikiPedia:Knowledge)

Knowledge is the awareness and understanding of facts, truths or information gained in the form of experience or learning. Knowledge is an appreciation of the possession of interconnected details which, in isolation, are of lesser value.

Knowledge is a term with many meanings depending on context, but is (as a rule) closely related to such concepts as meaning, information, instruction, communication, representation, learning and mental stimulus.

Knowledge is distinct from simple information. Both knowledge and information consist of true statements, but knowledge is information that has a purpose or use. Philosophers would describe this as information associated with intentionality. The study of knowledge is called epistemology.

A common definition of knowledge is that it consists of justified true belief. This definition derives from Plato's Theaetetus. It is considered to set out necessary, but not sufficient, conditions for some statement to count as knowledge.

What constitutes knowledge, certainty and truth are controversial issues. These issues are debated by philosophers, social scientists, and historians. Ludwig Wittgenstein wrote "On Certainty" - aphorisms on these concepts - exploring relationships between knowledge and certainty. A thread of his concern has become an entire field, the philosophy of action.

Note:


3.5. Knowledge models & comments

I have a quantifiable KnowledgeModel?. No doubt someone else has thought of this before, but this is what I think knowledge is:

A structure that, once interpreted by an agent, reduces the entropy of an intended action.

(Another way of writing "entropy of an intended action" is "the probability an intended action is optimal".)

Critically, knowledge is tied to action. The action does not need to ever happen, mind you. The point is that if the action occurred, the knowledge would be useful in predicting its success.

This can be operationalized in many relevant ways; e.g.

Critically, it's important that this structure is correct. That is, it isn't knowledge if there was no disaster to be averted. For instance, if a politician causes his country to invade another to, say, seek weapons of mass destruction that aren't there, you might say (in polite circles) that the politician was unknowledgeable.

However, as you might guess, there are an infinite number of possible structures that might be interpreted, so it is not a given which structure might be followed. Agents have to choose which ones to follow and which ones to ignore. To do this, we qualify structures:

Knowledge has an efficiency defined as the amount of entropy reduced for a decision.

If, once interpreted, knowledge reduces the entropy to zero, we might call that knowledge a fact. If knowledge only tends to reduce the entropy by some amount in some cases, we might call it a heuristic. This model even accepts as knowledge concepts that have no impact on a decision, which we might (in less polite circles) call bullshit. Also, the model is not limited to right and wrong, which is a futile dichotomy best left to the timeless, static ideal universe of Plato's Forms where none of us have mortgages to pay. What is right and wrong, as we know, depends on the context, which means the decision at hand; we can instead distinguish between worse, better, and optimal.
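
A minimal sketch of how this efficiency might be computed, assuming a decision is modelled as a probability distribution over its options; the route-to-the-airport scenario, the function names and all numbers are illustrative inventions, not part of the model stated above:

    import math

    def entropy(probs):
        """Shannon entropy, in bits, of a probability distribution."""
        return -sum(p * math.log2(p) for p in probs if p > 0)

    def efficiency(before, after):
        """Efficiency of a piece of knowledge: entropy reduced for a decision."""
        return entropy(before) - entropy(after)

    # Decision: which of four routes to take to the airport.
    prior   = [0.25, 0.25, 0.25, 0.25]   # no knowledge: 2.0 bits of uncertainty
    roadmap = [1.00, 0.00, 0.00, 0.00]   # a "fact": singles out one route
    rumour  = [0.55, 0.15, 0.15, 0.15]   # a "heuristic": shifts the odds somewhat
    gossip  = [0.25, 0.25, 0.25, 0.25]   # no effect on the decision at all

    print(efficiency(prior, roadmap))    # 2.0 bits   -> a fact
    print(efficiency(prior, rumour))     # ~0.29 bits -> a heuristic
    print(efficiency(prior, gossip))     # 0.0 bits   -> "bullshit", in the terms above

The same calculation works for any number of options; what changes from case to case is only the distribution that the interpreted structure leaves behind.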

The interesting thing about this is that we must gain knowledge about knowledge; we must learn which knowledge has been effective in the world. We have to answer the question, "how efficient is this structure at predicting success of this action?" To do this, we have methods, from our day-to-day learning through trial and error all the way to the Scientific Method. We attach arguments to knowledge to reduce the entropy that the knowledge is indeed correct (cf. the SocialConstructionOfScience). However, those arguments are not necessary; they are superfluous to the knowledge itself. Once we have determined that something is a fact, we can forget the arguments that led us to that conclusion.

Moreover, it's irrelevant if we even bother to determine how efficient some knowledge is. As long as we interpret it in our decision making and it is efficient, our decisions will be correct--regardless of whether or not we know how efficient it is. Evolution (e.g. Natural, FreeMarket?) works like this. It sends many individuals into the world, who take actions according to some strategy that represents some knowledge--which may be simply embodied as that individual (the individual is both the structure and the agent). If they are successful to some degree, they reproduce that strategy in the population, ignorant of how efficient that strategy was. Only on an aggregate, statistical basis across the population (god's eye view) can one get a sense of which strategy is better or worse.

In a more directed way, we can either learn how efficient knowledge is by trying it and assessing the outcome (feedback), or we can form a theory, which is itself knowledge, to predict how efficient a given structure might be if executed. The purpose of theory forming is to reduce the entropy of deciding which knowledge to execute in the real world, which helps lower the risk of wasting resources. The purpose of the agent is to ground and verify the ideal in the real, as well as ascribe meaning (i.e. effectiveness at making choices in the real world).

Finally, we consider someone (an agent) to be knowledgeable if he or she makes or assists us in making good decisions. Someone who makes bad decisions all the time doesn't know what they are doing. The reason it is so important to be knowledgeable is that if you make good decisions, putting you in charge of decisions lowers their risk. Once you are put in charge of decisions, you gain power. We could further define competency as following a process that resulted in desired outcomes, and then define a process as an interpreted structure. We can also define ability as the efficiency of an agent at interpreting a structure.

With this model, it becomes possible to quantify what is knowledge and what isn't, as well as devise methods that measure, in an information-theoretical sense (i.e. entropy), how efficient that knowledge is, if at all. Moreover, this model unifies the Scientific notion of knowledge with notions from other human and non-human endeavours. Finally, as rafts of theoretical mumbo jumbo can remain classified as knowledge despite being bullshit, it preserves the rather pointless exercise of scholarly academia, such as this posting. ;) -- SunirShah

This sounds pretty complicated and I don't quite understand how the valid idea of "knowledge having some probability" is helped by the term entropy, which really nobody understands or can work with. How should this make knowledge quantifiable? Maybe consider CommunityWiki:InformationKnowledgeAndWisdom -- HelmutLeitner

Complicated doesn't mean wrong. Actually, that's a pretty simple explanation. In InformationTheory, entropy is the quantifiable measure of chaos or randomness, or inversely, the level of information. A low entropy means more order in the system. A high entropy means low predictability. A high probability of one choice amongst a small number of choices implies strong predictability. Try WikiPedia:Information_entropy. The reason why probability matters is that knowledge of the future is indefinite. If we take actions purposefully, i.e. with some expectation, then we decide those actions based on our understanding of how the world will be at the beginning, end, and throughout the duration of that action. If that understanding is indefinite, we can only assign a (belief) probability.
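
A minimal numeric illustration of that definition, assuming the standard Shannon formula; the coin examples are illustrative only and not part of the discussion:

    import math

    def H(probs):
        """Shannon entropy in bits: H = -sum(p * log2(p))."""
        return -sum(p * math.log2(p) for p in probs if p > 0)

    print(H([0.5, 0.5]))   # fair coin: 1.0 bit, the least predictable two-way choice
    print(H([0.9, 0.1]))   # loaded coin: ~0.47 bits, far more predictable
    print(H([1.0]))        # certainty: 0.0 bits, nothing left to learn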

re: CommunityWiki, that's a folk explanation of whatever those terms mean to those people, but it is very difficult to use the conclusions of that discussion in any practical circumstance, so I'm not sure what I'm supposed to make of it. -- SunirShah

In the spectrum from data to wisdom we seem to be pretty free to assign words. I suggested "data - information - knowledge - wisdom" in a section of the cw page and defined knowledge as "activated information" that is needed "to answer questions". This is compatible with the idea of an agent. Connecting "knowledge" and "decision" seems arbitrary because you can have and transfer knowledge without any related decisions.

The term entropy is very interesting in physics or thermodynamics because, according to "dH = dE(T) - T * dS(T)", the stable state of a physical system depends on the distribution of energy in the microstates of the system. So all this is about connecting energy and probability, and the concept is actually used to predict reality. But there is no concept of energy in information science. So all that is done is to capture the "hype word" entropy and use it at a low level as a kind of measure for the amount of information or the amount of redundancy, which makes little sense. There is no physics of information, so probably this starts a kind of metaphysics of information science. -- HelmutLeitner
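
For reference, the relation quoted above most closely resembles the free-energy expression; the standard textbook forms are written out below only to make the contrast explicit, and are not drawn from the discussion itself:

    F = E - TS, \qquad dF = dE - T\,dS \quad (\text{free energy, at constant } T)
    S = -k_B \sum_i p_i \ln p_i \qquad (\text{Gibbs entropy over microstates})
    H = -\sum_i p_i \log_2 p_i \qquad (\text{Shannon entropy over messages})

The last two sums have the same shape, which is the formal parallel behind using the word entropy in information theory; the energy term in the first line is what the objection above points to as missing.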

There are several critical physics of information, which I'll summarize here very poorly and briefly. The first one, as described by ClaudeShannon?, is what I am talking about here. Please see the Wikipedia article I linked to above, which explains entropy. Also, the field of physics called InformationThermodynamics? shows more concretely that InformationTheory has a grounding in physics, particularly the notion of entropy.

Finally, as a bonus, special relativity and quantum mechanics have additional theories that matter, such as the maximum speed of information and what 'information' means. Did you know that some particles can travel faster than the speed of light, as long as the total information does not? -- SunirShah

Any information needs a physical carrier, so in a way the laws of physics apply. A book is a physical object and follows the law of gravity. If the book contains knowledge, does this mean that knowledge follows the law of gravitation? No, it doesn't make a difference to physics whether a book contains knowledge or random data. It needs an interpreting mind to understand the difference. But does it make sense to discuss this? I have a few years of experience in simulating entropy effects in molecular systems, but you have no reason to believe me. You have your CS readings, but I won't believe that they use the term entropy in a correct physical analogy. The Viennese physicist Anton Zeilinger, who is doing these "beaming" experiments, is imho doing "science entertainment". He is successful in the media and with politicians, but will never be able to go from a photon to an object, not even to a small molecule that isn't trivial. -- HelmutLeitner

Maybe I am misunderstanding your objection. First, have you given any of the underlying theorems a FirstReading? It seems your opinions are based on personal taste or disgust rather than something based in logic, empiricism, or the material at hand. How does your opinion motivate changes to my statement? i.e. what am I supposed to do in reaction? I am not about to start fighting against mathematics and quantum mechanics. I am not going to abandon the word entropy just because you don't know Shannon's theorems. I have sufficiently, in my opinion, supported my choice of the word, which you are ignoring with this "science entertainment" sideline.

"Science entertainment" is indeed really off base. If you disagree with Zeilinger's conclusions (not that I defend them, as I don't know Zeilinger's work), there is a proper and powerful and only acceptable way to counter them, which is through the ScientificMethod?. Of course, we don't have the ability to do these experiments, but that is just as relevant as knocking Zeilinger's experiments down on a political/emotional level. (yes, yes, cf. Latour, SocialConstructionOfScience) -- SunirShah

Sunir, you are probably right to criticize me. The situation is unclear. Where do we go with this page? If you are producing an article for university, I would hate to interfere with your work. If we go into the depths of information science and physics, we should ask whether this is relevant for MeatballWiki and where it fits in.

Basically this started when you claimed that knowledge can be quantified. I deeply doubt this and would like you to prove this claim with at least a simple example. For example, how much knowledge is in "2+2=4" and "The capital of Canada is Rome"? (in case we agree that these are not pieces of knowledge in themselves but pieces of information - then let us imagine some agent or actor giving answers to corresponding questions). If you don't like these examples, choose your own, but give me an example of a piece of knowledge and a number that quantifies it. -- HelmutLeitner


I haven't decided whether I want to use this idea yet in a paper. I am sitting here analyzing a major failure of running a wiki inside a course on constructivism in distance education, and trying to understand how the students and instructors viewed the wiki. They spent most of the course talking about how they were creating knowledge in a DiscussionForum? format simply by discussing the readings, and then they tried to import this workflow wholesale into the wiki, which was a disaster. However, I contend that they haven't created much knowledge yet--particularly in a constructivist sense, as they have not tested their suppositions in the real world through real experience, nor were they integrating (i.e. summing) individual PrivateInformation? (i.e. parts) into CollectiveIntelligence (i.e. the whole greater than the sum of the parts).

Now, my PhilosophyOfEducation is biased towards constructivism (a la JohnDewey), and my philosophy of software development is biased against most exercises in the name of 'ScientificManagement?' and other ImposedRationality, towards ParticipatoryDesign and FairSoftware (which has incidentally left me somewhat banished from software development at the moment). What is interesting about constructivism is that it is quite overtly based on the underlying principles of the ScientificMethod? (e.g. cf. David Kolb's experiential learning theory), and what I like about participatory design (or properly CooperativeDesign?) is that it is also highly empirically grounded (even if it is still overly political). I can and might argue that ParticipatoryDesign's techniques as evolved fit into Latour's framework outlined in Science in Action (my new personal bible, as you know).

So, why my highly refined theory of knowledge? Well, if I can strongly argue for this model of actionable and useful knowledge, I can argue against people using the term knowledge for what basically is a lot of people talking past each other or to each other, rather than people focusing on integrating, abstracting, and validating ideas. If you like, it's the same argument I've been making for years about why blogging is sadness and wikis are light.

For 2+2=4, or Canada's capital is Rome, we can assess how effective they are by basing decisions on them and seeing if we receive favourable outcomes. If we do, then we are more likely to believe those assertions are factual. If we don't, we may lose faith in those assertions to the point where we no longer consider them knowledge. To put hard numbers to this, a long time ago I wrote a fuzzy logic backwards chaining inference engine (truly, that is what it is called) that attempted to follow this process, but of course that is a naive approach. I don't know if I really am claiming there are hard numbers to assign to a fact. Perhaps empirical is a better word than quantifiable? Or operationalizable? I just want to get away from folk definitions to something pragmatic. -- SunirShah
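
A rough sketch of the feedback loop just described, assuming nothing more than a success/failure count per assertion (Laplace's rule of succession); the class, the assertion strings, and the outcome counts are illustrative inventions, not the inference engine mentioned above:

    class Assertion:
        """Track how well decisions based on an assertion have worked out."""
        def __init__(self, statement):
            self.statement = statement
            self.successes = 0
            self.failures = 0

        def record(self, favourable):
            """Record the outcome of one decision that relied on this assertion."""
            if favourable:
                self.successes += 1
            else:
                self.failures += 1

        def belief(self):
            """Expected reliability under a uniform prior: (s + 1) / (s + f + 2)."""
            return (self.successes + 1) / (self.successes + self.failures + 2)

    arithmetic = Assertion("2 + 2 = 4")
    geography  = Assertion("The capital of Canada is Rome")

    for _ in range(20):
        arithmetic.record(favourable=True)    # decisions based on it keep working out
        geography.record(favourable=False)    # decisions based on it keep going wrong

    print(arithmetic.belief())   # ~0.95 and rising toward 1: treated as a fact
    print(geography.belief())    # ~0.05 and falling toward 0: dropped as knowledge

This puts a number on how much faith we place in an assertion, without claiming that the number is the knowledge itself.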

Constructivism is an interesting topic. Typically I fight "philosophical constructivism", which says that there is no reality we can rely on, that everything is just a construction in our minds. If you follow that, there is no "in life", no real empirical experience possible. E. g. if a rainmaker performed archaic rituals until it rained, his experience was positive and reinforced his beliefs; it worked in his life.

On the other hand I very much like "educational constructivism", which says that learning means extending known patterns by adding new features and forming new concepts in a natural, individual, stepwise unfolding process. This is very much in sync with ChristopherAlexander. One can think about how children learn language or how we are able to tell the difference between "cat", "dog" or "tiger".

It's weird that "philosophical constructivism" and "educational constructivism" can live in the same brains, e. g. Glasersfeld. It seems that he needs the radical philosophy to create a space of freedom to develop his constructivist biological learning theory, which basically sees the brain as a pattern recognition and extraction engine. For others, like Maturana, it seems that "radical constructivism" is a political statement supporting "communist" worldviews, in the sense that there is "no truth" and that the human mind/consciousness/perception is a "free construction".

The paradoxical thing about this is that there is no constructivist learning method (according to experts I have listened to), and teachers typically don't want their students to construct "their own constructions of mathematics"; they just strive to build a more convincing and inevitable "pyramid of true knowledge" more seamlessly and effectively. Of course "true" in this context doesn't mean "philosophically true" but a pragmatic "true is what the teacher says" or "true is what's in the books".

What connects all the considerations and arguments on this page is the concept of "context". There is no knowledge without a context of language and culture, basically a knowledge environment. I think any idea of operationalization is just deferring this problem of recursive definition of knowledge (of language). Perhaps this could be solved by a kind of bootstrapping from a fundamental ontology. -- HelmutLeitner

