Class Algebra

Posted by Wonky 11 years, 3 months ago to Philosophy

Hoping to find a purely rational forum in which to attempt to integrate algebra with the Objectivist notion of conceptualization (classification) via measurement omission.



  • Posted by 11 years, 3 months ago
    Aha... Found this old bit of jabbering to copy/paste.



    Class Algebra for AI

    Class Algebra does not acknowledge subtraction or division as valid logical operations, nor does it recognize sums and products as independent of their constituents. Formally, union and separation are the valid operations underlying all equalities. While a sum may appear to obscure its constituents, we "know" that those constituents remain intact in some form. A summation process may not be reversible, but its transactional history remains intact whether remembered or not. Likewise, separation operations do not cause the portion of a sum that is no longer the focus of an equation to disappear from existence. The transactional history can always be used to reconstitute the original sum as it stood before a given operation separated it.
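    A minimal sketch of this idea in Python (the class and method names are my own invention, not any established formalism): a sum keeps its constituents and a transaction log, so union is never destructive and any separation can be undone by replaying the history.

    ```python
    # Hypothetical sketch: a Sum that records its constituents and a
    # transaction log, so union is non-destructive and separation can
    # always be undone by replaying the history.

    class Sum:
        def __init__(self, label):
            self.label = label
            self.parts = []      # constituents remain intact inside the sum
            self.history = []    # transactional history of every operation

        def union(self, part):
            self.parts.append(part)
            self.history.append(("union", part))
            return self

        def separate(self, part):
            # separation removes focus, not existence: the log keeps the part
            self.parts.remove(part)
            self.history.append(("separate", part))
            return self

        def reconstitute(self):
            # replay only the unions to recover the original state of the sum
            return [part for op, part in self.history if op == "union"]

    fruit = Sum("fruit").union("apple").union("orange")
    fruit.separate("apple")
    print(fruit.parts)           # ['orange']
    print(fruit.reconstitute())  # ['apple', 'orange']
    ```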

    A+B=C
    A(X)+B(Y)=C(Z)

    Class axis
    Z may be any of
    1. An instance or entity
    2. A class or concept
    3. A method or process

    1. Instance/Entity
    If Z is an instance, the equality should be considered a definition or an identification. Those A(apples) and those B(oranges) are C[that](instance of a collection of fruit).

    In this case, C is always 1, for no reference to [that] may indicate more or less than the singular instance to which it refers.

    When A is 1, X is an instance. When A is greater than 1, X is a collection of similar instances whose differentiation is not significant in the context of the definition/identification. The same rules apply to B and Y.

    While X and Y may be the same kind of thing, it is not normal to refer to them separately unless they differ in some way that is significant to the definition/identification. "That 1 (left leg) and that 1 (right leg) are 1 (your legs)" is not an uncommon formulation, but "those 2 (legs) are 1 (your legs)" is a simpler one.

    No algebraic movement of variables from side to side is permitted, but A(X) and B(Y) may be transposed.

    Instance Definition/Identification Laws
    C=1
    A>=1
    B>=1
    X<>Y
    A(X)<>C(Z)-B(Y)
    B(Y)<>C(Z)-A(X)
    A(X)+B(Y)=B(Y)+A(X)=C(Z)
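    Those laws can be written as a small validity check (a sketch; the function and its arguments are my own invention):

    ```python
    # Sketch: validate an instance definition A(X) + B(Y) = C(Z)
    # against the laws above. All names here are illustrative.

    def valid_instance_definition(a, x, b, y, c, z):
        return (
            c == 1        # C = 1: [that] refers to exactly one instance
            and a >= 1    # A >= 1
            and b >= 1    # B >= 1
            and x != y    # X <> Y: constituents differ significantly
        )

    # "Those 2 apples and those 3 oranges are that collection of fruit."
    print(valid_instance_definition(2, "apple", 3, "orange", 1, "fruit collection"))  # True
    # C > 1 violates the law that [that] is singular.
    print(valid_instance_definition(2, "apple", 3, "orange", 2, "fruit collection"))  # False
    ```

    Note that swapping the (a, x) and (b, y) arguments leaves the result unchanged, which is the transposition law A(X)+B(Y)=B(Y)+A(X).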


    2. Class/Concept
    If Z is a class or concept, again C is always 1, and Z is the resultant class or concept arrived at by combining A(X) and B(Y).

    The formulation...
    "an (a=1) animal (x) with (+) (implied 2) wings (y) is (=) a (c=1) bird (z)"
    ...is a linguistic summation serving as a definition/identification of a subclass of animals.

    Again, standard algebraic treatment of the equality is invalid. It is nonsensical to say that because "an animal with wings is a bird", so too "a bird without wings is an animal". The latter formulation does nothing to refine, clarify, or identify a class or concept.

    Further, one of x or y is typically a pre-existing class, while the other is an attribute or feature being applied to the pre-existing class in an attempt to clearly identify the resultant subclass z. It is atypical to state that "wings with an animal are a bird", but linguistically common to revise the summation words to other forms such as "wings on an animal make it a bird". These are not precisely logically equivalent, but with respect to class algebra version 1, we'll treat them as such.

    Class/Concept Definition Laws
    C=1
    A>=1
    B>=1
    X<>Y
    A(X)<>C(Z)-B(Y)
    B(Y)<>C(Z)-A(X)
    A(X)+B(Y)=B(Y)+A(X)=C(Z)

    3. Method/Process
    A method or process is a special form of concept in which sequence is vital to the formulation.

    The formulation...
    "(a=1) turn on the shower faucet (x) then (+) (b=1) stand in the falling water (y) to (=) take a (c=1) shower (z)"
    ...demonstrates how sequence is vital to method/process definition. One cannot stand in the falling water until the shower has been turned on.

    Method/Process Definition Laws
    C=1
    A>=1
    B>=1
    X<>Y
    A(X)<>C(Z)-B(Y)
    B(Y)<>C(Z)-A(X)
    B(Y)+A(X)<>C(Z)
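    The sequence law is the one difference from the earlier cases, and it can be sketched directly (the step data is illustrative):

    ```python
    # Sketch: a method/process definition is order-sensitive, so the same
    # steps in a different sequence do not equal the same process.

    shower = [("turn on faucet", 1), ("stand in water", 1)]   # A(X) then B(Y)
    reversed_steps = list(reversed(shower))                   # B(Y) then A(X)

    def same_process(p, q):
        # for methods, equality requires the identical sequence of steps
        return p == q

    print(same_process(shower, shower))          # True
    print(same_process(shower, reversed_steps))  # False: B(Y)+A(X) <> C(Z)
    ```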


    --------------
    c=a-b
    c(z)=a(x)-b(y)

    Specialized Separation as Definition
    Again, Class Algebra does not recognize subtraction as a valid logical operation. In cases of abnormality or deviation from a norm, however, instances of classes may be identified by the lack of a feature that is normal to the class to which they belong.

    (c=1) Charlie (z) is (=) a (a=1) boy without (-) (implied b=2) ears (y)

    This form of definition is useful insofar as it can be used to rapidly identify the major class to which an instance belongs, and then describe a slight deviation from a normal instance of that major class to create a subclass which is easily identifiable.

    "Which of these plates is chipped?"
    "Do you see that bald man (man with no hair) over there?"

    Abnormality equations can actually be instructive if little is known about the normal class.

    "Normal boys have 2 ears."
    "Normal plates are not chipped."
    "Normal men are not bald."

    Because abnormality equations are just as common as typical definition equations, a good AI system will learn as much, or more, about classes from abnormality as it will from definition.
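    A sketch of how an abnormality equation both defines a subclass and teaches the norm (the data here is my own illustration):

    ```python
    # Sketch: an "abnormality equation" identifies a subclass by a missing
    # or deviant feature, and at the same time teaches the norm of the
    # major class to a learner that did not already know it.

    normal = {"boy": {"ears": 2}, "plate": {"chipped": False}}

    def define_by_deviation(major_class, feature, value):
        # "Charlie is a boy without ears": copy the norm, deviate on one feature
        subclass = dict(normal[major_class])
        subclass[feature] = value
        return subclass

    charlie = define_by_deviation("boy", "ears", 0)
    print(charlie)                 # {'ears': 0}
    print(normal["boy"]["ears"])   # 2 -- the abnormality implies the norm
    ```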

  • Posted by 11 years, 3 months ago
    Well, I botched that. Thought the note was an introduction to the thread.

    At any rate, some basics... The only valid concept (class) operations are union and separation. Mathematical operations are a convenience that can easily drop context and transactional history.

    Example 1:
    (1 apple + 1 orange = 2 fruits) is true
    (2 fruits - 1 apple = 1 orange) may or may not be true

    This suggests that a fundamental law of what I like to call "class algebra" is: a union equation requires that the items being unified are instances of or subclasses of the result.
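    That proposed law can be sketched as a check against a small class tree (the tree data is my own example):

    ```python
    # Sketch: a union equation is valid only if every constituent is an
    # instance or subclass of the result. The tiny class tree here is
    # illustrative example data.

    parents = {"apple": "fruit", "orange": "fruit", "hammer": "tool"}

    def is_subclass_of(item, result):
        # walk up the class tree looking for the result
        while item in parents:
            item = parents[item]
            if item == result:
                return True
        return False

    def valid_union(items, result):
        return all(is_subclass_of(i, result) for i in items)

    print(valid_union(["apple", "orange"], "fruit"))   # True: 1 apple + 1 orange = 2 fruits
    print(valid_union(["apple", "hammer"], "fruit"))   # False: a hammer is not a fruit
    ```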
    • Posted by 11 years, 3 months ago
      It also demonstrates the context dropping inherent in the algebraic movement of arguments from one side of an equation to the other.

      1[A]+1[B]=2[C] (A, B, and C are all classes, subclasses, or instances)

      Assuming a class (concept) is greater than any of its subclasses or instances, we know that:
      C>=A
      C>=B

      Movement of the arguments in this form of equation obscures all but the greatest class, subclass, or instance (in this case, C) unless A, B, and C are explicitly tracked throughout each operation.
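      A sketch of what that explicit tracking might look like (the representation is my own; quantities are tagged with their class so nothing is obscured when a term moves):

      ```python
      # Sketch: moving arguments across an equation drops context unless
      # every operand is tracked. Tagging each quantity with its class
      # keeps A, B, and C explicit through the movement.

      eq = {"left": [(1, "A"), (1, "B")], "right": [(2, "C")]}

      def move(eq, term):
          # algebraic movement: remove a term from one side, negate it on the other
          eq["left"].remove(term)
          eq["right"].append((-term[0], term[1]))
          return eq

      moved = move(eq, (1, "A"))
      print(moved["left"])    # [(1, 'B')]
      print(moved["right"])   # [(2, 'C'), (-1, 'A')] -- A survives only because it is tagged
      ```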
      • Posted by LetsShrug 11 years, 3 months ago
        Thanks Quentin.
        • Posted by 11 years, 3 months ago
          It amazes me how so many intelligent people (myself included) commonly use words like: all, none, always, never, everybody, nobody, etc.

          How do we so often fail to see logical errors (our own or others')?
        • Posted by 11 years, 3 months ago
          I've read Leonard Peikoff's "Objectivism" several times now, and while I'm satisfied that the epistemology of objectivism is spot on, I feel like there is much more to add to the idea that measurement omission is the key to concept formation. I'm a bit obsessed with the idea that a simple set of rules could be added to this foundation and taught to individuals (myself, for instance) to help them quickly identify contradictions and logical fallacies.
          • Posted by Rozar 11 years, 3 months ago
            I think this is a brilliant line of thought that fits in with the Objectivist philosophy; however, be careful not to call it a success without extensive testing. Math is an easy thing to make a mistake in.
            • Posted by 11 years, 3 months ago
              Oh my, I'm nowhere near calling any of this a success. Actually, I'm hoping that this might prompt a lively and logical dialogue to refine or redirect the effort. Thanks for your feedback!
  • -1
    Posted by $ MikeMarotta 11 years, 3 months ago
    Thanks for the efforts. I have been reading this over and over, considering and reconsidering. I had a class in symbolic logic back in 2006 and got an A, but there was the day that I spent five hours on a homework problem... Just to say, I agree with this:
    "Formally, union and separation are the valid operations underlying all equalities."

    That would be integration and differentiation (epistemological, not mathematical).

    When you say "I feel like there is much more to add to the idea that measurement omission is the key to concept formation." you identified an interesting problem: what (ELSE) is concept formation? The only difference between atomic isotopes is indeed a measurement, so when we say "Carbon-14" that is a concept, a very complex concept, actually, but for which measurement is critical to the definition.

    You say also: "I'm a bit obsessed with the idea that a simple set of rules could be added to this foundation and taught to individuals (myself, for instance) to help them quickly identify contradictions and logical fallacies."

    Just the actual rules of Boolean algebra would be a great beginning for 99.9% of humanity (including many here). I mean, you could use "predicate calculus" as well to reduce terms to symbols for easy manipulation without the encumbrance of emotionalisms like patriotism, family values, tradition, hope for change, etc. But basically, just knowing actual formal logic is a serious step in the right direction.

    What I fear is that you are looking for an automatic or quasi-automatic way - some kind of algorithm or heuristic - to always know the truth. Nothing like that exists or can. For one thing, truth is contextual. When you remove context you risk removing meaning.

    For me, these problems are LINGUISTIC. I speak a couple of languages and knew a few more and am pretty handy with almost anything human or computer. If you want to know the rules of thought, you need to know how people actually think. Language reveals that.

    Just for instance - In English, we have singular and plural only. But Russian (which Ayn Rand knew) has the dual, also. 1 2 Many. That is how people thought for thousands of years and still think today, but in English, we lost the dual - you have to search for it: pair of shoes; brace of pheasants; wedding couple - and so we make generalizations about pluralities all the time.

    As you noted, we say most, some, all, none, etc. with seeming abandon.
    • Posted by 11 years, 3 months ago
      Thank you for your feedback!

      As a CIO for a small company, I manage, manipulate, and present a fair amount of data and have become all too familiar with double-entry accounting. There is something ineffably elegant about a balance sheet offset against a profit/loss statement. The scientist in me can't help but attach the concept of a general ledger to the concept of conservation of energy - nothing lost or gained, everything nets out to a single constant, 0 or X. The sum of a ledger account, however, is quite different from the transactions that contributed to that sum, just as the conclusion or summary of a paragraph is quite different from the sentences it contained.

      "To know the exact meaning of the concepts one is using, one must know their correct definitions, one must be able to retrace the specific (logical, not chronological) steps by which they were formed, and one must be able to demonstrate their connection to their base in perceptual reality." - Introduction to Objectivist Epistemology

      I've been a programmer for as long as I can remember despite my mechanical engineering education, and the driving force behind this endeavor is... well... artificial intelligence (to be honest). Programming languages are definitely more rigid and logical than human languages, and as they've evolved, their simplicity has also evolved. Without going into the complexities, a language like C is vastly more complicated than a language like JavaScript, even though the basic syntax looks similar. As I read through the epistemology of Objectivism, I can see concepts as similar to JavaScript classes. Instances of those classes can be iterated through, named, sorted, sub-classed for attribute or function assignment, etc.

      Indeed, the concept of element subsumes the concept of carbon as the concept of carbon subsumes carbon-14. This is called polymorphism in object oriented programming. "Element" is a class with attributes such as group, period, proton count, neutron count, electron arrangement, halogen/noble gas/metal. "Carbon" is an instance and a subclass, inheriting all of the attributes of "Element" with specific values assigned. Likewise, "Carbon-14" is a subclass of "Carbon" inheriting all values from the super-class while simply changing the value of neutron count.
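      That inheritance chain can be sketched directly (shown here in Python rather than JavaScript for brevity; the attribute choices are illustrative, not a chemistry reference):

      ```python
      # Sketch of the Element -> Carbon -> Carbon-14 chain described above:
      # each subclass inherits everything and changes only one measurement.

      class Element:
          protons = None
          neutrons = None

      class Carbon(Element):
          protons = 6
          neutrons = 6      # the common isotope, carbon-12

      class Carbon14(Carbon):
          neutrons = 8      # only the neutron count changes

      print(Carbon14.protons)   # 6, inherited unchanged from Carbon
      print(Carbon14.neutrons)  # 8, overridden in the subclass
      ```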

      I am indeed looking for "an automatic or quasi-automatic way - some kind of algorithm or heuristic - to always know the truth". Or at least for a way to teach computers to dissect linguistic equalities and inequalities and know which questions to ask in order to integrate new concepts into a concept database.

      Consider a brand new program beginning an iterative learning process by interpreting the statement "apples and oranges are fruits". It is pre-programmed to understand words that serve as equality and inequality operations, such as "is", "are", "isn't", "is not", "aren't", "are not", "were", "have", etc. It is also pre-programmed to recognize normal plurality and to test for metaphors. The program suddenly (apparently magically) asks you what differentiates apples from oranges. The Q&A goes on through many iterations. The program might consult an online dictionary or Google to answer its own questions.
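      A toy sketch of that first interpretive step (the word list and parsing are deliberately naive and entirely my own invention):

      ```python
      # Sketch: interpret a statement like "apples and oranges are fruits"
      # using a fixed list of equality words and naive plural stripping.

      EQUALITY_WORDS = {"is", "are", "were"}

      def singular(word):
          # naive plurality handling: strip a trailing "s"
          return word[:-1] if word.endswith("s") else word

      def interpret(statement):
          words = statement.lower().split()
          op = next(w for w in words if w in EQUALITY_WORDS)
          i = words.index(op)
          left = [singular(w) for w in words[:i] if w != "and"]
          right = singular(words[i + 1])
          return {term: right for term in left}   # each term is a subclass of right

      print(interpret("apples and oranges are fruits"))
      # {'apple': 'fruit', 'orange': 'fruit'}
      ```

      From this point the program would know which questions to ask (e.g. what differentiates the two new subclasses of "fruit") to integrate the concepts into its database.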

      The issues of context, source, credibility, etc. are resolved over time as the program interacts with more individuals and measures the reliability of each input.

      Fantasy... perhaps. Whether to bother with writing said program for public consumption isn't really something I've considered (much). I'm more interested in recognizing more of the subtle ways in which we come to (consciously and/or unconsciously) learn new concepts through our own iterative process of Q&A. Computers may never have all of the perceptual inputs that we, as humans, have to validate the knowledge gained through the process, and as such, we may never trust any artificial knowledge they may gain. That doesn't mean they couldn't be used as a tool by which to understand more about our own learning process(es)...
      • -1
        Posted by $ MikeMarotta 11 years, 3 months ago
        You might enjoy "Epistemology and Computer Science Object-Oriented Programming and Objectivist Epistemology: Parallels and Implications" by Adam Reed in _The Journal of Ayn Rand Studies_ 4, no. 2 (Spring 2003): 251–84. You can find it online and download the PDF.
        http://web.augsburg.edu/~crockett/210/21...

        You will find quite a few programmers here. For one thing, engineers in general are drawn to Rand, of course. Also, unlike other disciplines, programming is not government regulated. (I had a college instructor in physics with master's degrees in math and physics who left the state because they would not let him take the engineering licensing examination without a bachelor's in engineering. With computers, you do not run into that.)

        Also, with computers, talk as they will about fuzzy logic and such, ultimately, it is Either-Or: on or off; yes or no; true or false; 0 or 1.

        I got into programming when a buddy from General Motors in a transportation management class suggested that computers will be everywhere someday. Ten years later, I wrote my first user manual. In 2007 and 2008, I completed my college and university requirements in computer literacy with classes in Access and Java. Last year, I took a community ed class in Ruby and Rails. But you can probably find a dozen real programmers here.
