As I mentioned in my earlier post, the intent is to express metaphysical concepts, ideas, and beliefs in formal notation, connected by simple arithmetic operators grounded in stated logic. We can then test the logic by comparing it against other logic statements and improve the accuracy of the content.
This is a very high-level approach and will not initially get into detailed analysis. I do not know where this will lead, but I hope to gain some basic insights that can clarify the subject and provide new ideas and methods for the study of knowledge, ancient history, and languages in a computational framework built on a unified database of vocabulary. I hope open-minded and expert researchers in these fields can find some utility in such an approach and pursue it. IMHO, as a bookkeeper, I can only come up with seed ideas and no more.
I will begin with cladking's last post and apply the following technique:
1. Identify key concept words and separate out their definition and exposition.
2. Assign notations to the concept words.
3. Identify key statements that link the concept words.
4. Determine the arithmetical relationship between the concept words.
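To make the four steps concrete, here is a minimal sketch in Python of what one entry in a unified vocabulary database might look like. The `Concept` and `Relation` structures are my own hypothetical illustration of the technique, not an established framework:

```python
# A minimal sketch of the four-step technique.
# The data structures (Concept, Relation) are hypothetical.

from dataclasses import dataclass

@dataclass(frozen=True)
class Concept:
    word: str             # key concept word from the text (step 1)
    symbol: str           # assigned notation (step 2)
    definition: str = ""  # separated definition/exposition

@dataclass(frozen=True)
class Relation:
    left: str       # symbol of the first concept
    op: str         # arithmetical/logical operator (step 4)
    right: str      # symbol of the second concept
    statement: str  # key linking statement from the text (step 3)

# Steps 1 and 2: concepts and their notations
knowledge = Concept("knowledge", "k")
memory = Concept("memory", "m")

# Steps 3 and 4: the linking statement and its relationship
r1 = Relation("k", "∝", "m", "Knowledge is proportional to memory")

print(f"({r1.statement}) → {r1.left} {r1.op} {r1.right}")
```

In this form each extracted statement becomes a small record that can be stored, compared, and queried alongside others.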
Quote (extract 1)
Author: cladking ()
Date: June 16, 2016 02:25AM
"All knowledge is simple. Manipulating knowledge can become a little complex if you're a rocket scientist. Following complex formulae can be complex for a cosmologist. Even for the toughest occupations it's mostly just rote and memory. Much of all thought is just rote even if it seems difficult to ourselves or others. We have habits of thought that greatly speed the process and one can live virtually his entire life staying within these confines. Complex equations are understood in simpler terms that are habit. A brain surgeon's mind can drift in the trickiest part of the operation and he'll do it perfectly."
Knowledge k | memory m
Knowledge is proportional to memory
k ∝ m (1)
"Knowledge is bits and pieces of information; if I throw my weight forward at the top of my trajectory on a swing I'll go higher. A female redwing blackbird is a mottled brown with a speck of red on her wing. They are visceral or learned and some have millions of them. The total amount in humans is very high due to complex language. But most all questions no one knows the answer. Despite the vast knowledge of humans in aggregate we can hardly scratch the surface of everything there is to know just on the surface of this one tiny planet. We know a few "laws" of nature."
Information i
Knowledge is bits and pieces of information
k ∝ i (2)
"Ancient people thought a lot differently and this is where comparing knowledge gets trickier. Since language defines how an individual thinks and processes information it simply stands to reason that the knowledge gained by ancient man was different than that gained by modern man. This knowledge in the former case is uncovered through logic and observation. Naturally they learned different sorts of thing than we know. It's of no value to people today to know that when a monkey makes a shrill shreik followed by a yip that it means a pair of lions are hunting. Modern people in a similar situation will know how to load their 30: 06. But the ancient language and its users accumulated millions of bits of fact from how to find the best honey to how to make a pair of sandals. We learn how to start a car and steer it and which brand of honey is least adulterated."
Ancient knowledge ka | Modern knowledge km | Ancient language la | Modern language lm
language defines how an individual thinks and processes information
Knowledge gained by ancient man was different than that gained by modern man
ka ∝ la (3)
km ∝ lm (4)
ka ≠ km (5)
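Relations (1) through (5) can also be collected and checked mechanically, which is the spirit of testing one logic statement against another. The following is a hedged sketch of such a check (my own minimal illustration, not a logic engine): it verifies that every symbol used in a relation has a declared concept word, and notes in a comment where the derivation needs an extra assumption:

```python
# Hedged sketch: store relations (1)-(5) and run a basic sanity check
# that every symbol used in a relation has a declared concept word.
# An illustration of "testing the logic", not a theorem prover.

concepts = {
    "k": "knowledge", "m": "memory", "i": "information",
    "ka": "ancient knowledge", "km": "modern knowledge",
    "la": "ancient language", "lm": "modern language",
}

relations = [
    ("k",  "∝", "m"),   # (1) knowledge is proportional to memory
    ("k",  "∝", "i"),   # (2) knowledge is bits and pieces of information
    ("ka", "∝", "la"),  # (3) ancient knowledge tracks ancient language
    ("km", "∝", "lm"),  # (4) modern knowledge tracks modern language
    ("ka", "≠", "km"),  # (5) ancient and modern knowledge differ
]

# Every symbol appearing in a relation must be a declared concept.
undefined = [s for left, _, right in relations
             for s in (left, right) if s not in concepts]
assert not undefined, f"undefined symbols: {undefined}"

# Note the reasoning gap such a check exposes: (3) and (4) suggest (5)
# only if we also assume la ≠ lm -- proportionality alone does not
# force ka ≠ km.
print("relations stored:", len(relations))
```

Even this crude check is useful: it forces every symbol to be defined before it is used, and it makes hidden assumptions (here, la ≠ lm) explicit.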
This concludes the basic first part, and I hope it is critically reviewed for rationale and utility. All feedback, comments, and suggestions are welcome, positive or negative.
I will post the remaining extract after evaluating the feedback and required changes based on logic.
I must mention here that the inspiration for such an approach was first planted in my mind some years ago by a chance (?) encounter with a basic reference, "A Primer for Logic and Proof" by Holly P. Hirst and Jeffry L. Hirst. I dedicate this OP to them.