Relation pm#intentional_relation__intentionalrelation (cognitive_agent,?)  relations between an agent and one or more entities, where the relation requires that the agent has awareness of the entity
  exclusion:  attributive_relation  mereological_relation  temporal_relation  object_relation
  supertype:  relation_playing_a_special_role  this type makes it possible to categorize relations according to their roles; this is a traditional but quite subjective way of categorizing relations
  instance of:  intentional_relation_type
  subtype:  prefers__prefer (cognitive_agent,formula,formula)  the cognitive_agent prefers the state of affairs expressed by the 1st formula over the state of affairs expressed by the 2nd formula, all other things being equal
  subtype:  in_scope_of_interest (cognitive_agent,?)  the 2nd argument is within the scope of interest of the agent;  the interest indicated can be either positive or negative
  subtype:  propositional_attitude_relation (cognitive_agent,formula)  intentional_relations where the agent has awareness of a proposition
     subtype:  desires (cognitive_agent,formula)  the agent wants to bring about the state of affairs expressed by the formula (which may already be true); desires is distinguished from wants only in that the former is a propositional_attitude, while wants is an object_attitude
     subtype:  considers__consider (cognitive_agent,formula)  the agent considers or wonders about the truth of the proposition expressed by the formula
     subtype:  believes (cognitive_agent,formula)  the agent believes the proposition expressed by the formula
     subtype:  knows__know (cognitive_agent,formula)  the agent knows the proposition expressed by the formula; knows entails conscious awareness, so this predicate cannot be used to express tacit, subconscious, or unconscious knowledge
  subtype:  object_attitude_relation (cognitive_agent,physical)  intentional_relations where the agent has awareness of an instance of sumo#physical
     subtype:  needs__need (cognitive_agent,physical)  the 2nd argument is physically required for the continued existence of the cognitive agent
     subtype:  wants__want (cognitive_agent,physical)  the agent believes that the 2nd argument will satisfy one of its goals; what is wanted may or may not be already possessed by the agent
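
For illustration, here is a minimal sketch of how a few of these relations could be asserted, written in a SUO-KIF-like notation; the agent John and the objects and propositions used (Rex, Car1, Water1, likes, owns) are hypothetical, and SUMO's own predicate names use camelCase (e.g. inScopeOfInterest) rather than the lowercase names listed above:

  (believes John (likes Mary Bill))                  ; propositional attitude: John believes "Mary likes Bill"
  (knows John (instance Rex Dog))                    ; John consciously knows that Rex is a dog
  (desires John (owns John Car1))                    ; John wants this state of affairs to hold
  (prefers John (owns John Car1) (owns John Car2))   ; John prefers the 1st state of affairs over the 2nd
  (wants John Car1)                                  ; object attitude: John wants the physical object Car1 itself
  (needs John Water1)                                ; object attitude: Water1 is physically required for John's continued existence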

