ANN representation: ANNs are mostly taught on AI courses because of their motivation from brain studies and the fact that they are used in AI tasks.
ID3 algorithm: The calculation of information gain is the most difficult part of this algorithm. ID3 performs a search whereby, at each node, it chooses the attribute with the highest information gain.
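The information-gain calculation at the heart of ID3 can be sketched in Python. The dict-based example format and the "parents visiting"/weekend-activity attribute names below are illustrative assumptions, not taken from the source:

```python
from collections import Counter
from math import log2

def entropy(labels):
    """Shannon entropy of a list of class labels."""
    counts = Counter(labels)
    total = len(labels)
    return -sum((c / total) * log2(c / total) for c in counts.values())

def information_gain(examples, labels, attribute):
    """Entropy reduction from splitting `examples` on `attribute`.
    Each example is a dict mapping attribute names to values."""
    base = entropy(labels)
    total = len(examples)
    # Partition the labels by the value the attribute takes.
    partitions = {}
    for ex, lab in zip(examples, labels):
        partitions.setdefault(ex[attribute], []).append(lab)
    # Weighted entropy of the partitions (the "remainder").
    remainder = sum(len(part) / total * entropy(part)
                    for part in partitions.values())
    return base - remainder
```

For example, if the "parents visiting" attribute perfectly separates "cinema" from "tennis" examples, the gain equals the full base entropy of 1 bit, which is why that node ends up at the top of the tree.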
Basic idea: In the above decision tree, it is significant that the "parents visiting" node came at the top of the tree.
Specifying the problem: Next we look at how you mentally constructed your decision tree when deciding what to do at the weekend.
Reading decision trees: We can see that there is a justified link between decision tree representations and logical representations.
Decision tree learning: As specified in the last lecture, the representation scheme we choose determines the form of our learned solution.
Variable or compound expression - unification algorithm: Some things to note regarding this method: (i) if we are trying to match a variable against a compound expression, we must check that the variable does not itself occur within that expression (the occurs check).
Function name or connective symbol: If we write op(x) to signify the operator symbol of a compound expression x, then predicate names, function names, and connective symbols can all be treated uniformly.
Unification algorithm: Notice, for instance, that to unify two sentences we must find a substitution that makes the two sentences the same.
Example of unification: Now assume instead that we had these two sentences: knows(john, x) → hates(john, x) and knows(jack, mary).
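A minimal unification sketch in Python makes this example concrete. The term encoding here (tuples for compound terms, '?'-prefixed strings for variables, plain strings for constants) is an assumption for illustration:

```python
def unify(x, y, subst=None):
    """Try to unify terms x and y; return a substitution dict or None.
    Variables are strings starting with '?'; compound terms are tuples
    such as ('knows', 'john', '?x'); constants are plain strings."""
    if subst is None:
        subst = {}
    if x == y:
        return subst
    if isinstance(x, str) and x.startswith('?'):
        return _unify_var(x, y, subst)
    if isinstance(y, str) and y.startswith('?'):
        return _unify_var(y, x, subst)
    if isinstance(x, tuple) and isinstance(y, tuple) and len(x) == len(y):
        for xi, yi in zip(x, y):
            subst = unify(xi, yi, subst)
            if subst is None:          # a sub-term failed to unify
                return None
        return subst
    return None

def _unify_var(var, term, subst):
    if var in subst:                   # variable already bound: follow it
        return unify(subst[var], term, subst)
    if _occurs(var, term, subst):      # occurs check: reject ?x = f(?x)
        return None
    new = dict(subst)
    new[var] = term
    return new

def _occurs(var, term, subst):
    if var == term:
        return True
    if isinstance(term, str) and term in subst:
        return _occurs(var, subst[term], subst)
    if isinstance(term, tuple):
        return any(_occurs(var, t, subst) for t in term)
    return False
```

With this encoding, knows(john, ?x) fails to unify with knows(jack, mary) because john and jack clash, whereas knows(john, ?x) unifies with knows(john, mary) under the substitution {?x: mary}.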
Unification: As said above, the rules of inference for propositional logic detailed in the last lecture can also be used in first-order logic, but unification is required to match sentences containing variables.
Implicative normal form: The sentence is now in CNF. Further simplification can take place by removing duplicate literals and dropping any clause that contains both a literal and its negation (such a clause is always true).
Eight-stage process - conjunctive normal forms: The following eight-stage process converts any sentence into CNF: (1) eliminate implication and equivalence connectives by rewriting them in terms of ¬, ∧, and ∨; the remaining stages move negation inwards, handle quantifiers, and distribute disjunction over conjunction.
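A propositional-only sketch of three of these stages (eliminate implications, move negation inwards, distribute ∨ over ∧) might look like this. The nested-tuple formula encoding is an assumption, and the first-order stages (standardising variables, Skolemisation, dropping quantifiers) are omitted:

```python
def eliminate_implications(s):
    """Stage 1: rewrite A -> B as ~A | B throughout."""
    if isinstance(s, str):
        return s
    op, *args = s
    args = [eliminate_implications(a) for a in args]
    if op == '->':
        return ('|', ('~', args[0]), args[1])
    return (op, *args)

def move_not_inwards(s):
    """Push ~ down to atoms using double negation and De Morgan."""
    if isinstance(s, str):
        return s
    op, *args = s
    if op == '~':
        a = args[0]
        if isinstance(a, str):
            return s                       # negated atom: done
        iop, *iargs = a
        if iop == '~':                     # ~~A  ==>  A
            return move_not_inwards(iargs[0])
        if iop == '&':                     # ~(A & B) ==> ~A | ~B
            return ('|', *[move_not_inwards(('~', x)) for x in iargs])
        if iop == '|':                     # ~(A | B) ==> ~A & ~B
            return ('&', *[move_not_inwards(('~', x)) for x in iargs])
    return (op, *[move_not_inwards(a) for a in args])

def distribute_or(s):
    """Distribute | over & so only a conjunction of clauses remains."""
    if isinstance(s, str) or s[0] == '~':
        return s
    op, a, b = s[0], distribute_or(s[1]), distribute_or(s[2])
    if op == '|':
        if isinstance(a, tuple) and a[0] == '&':   # (X&Y)|Z ==> (X|Z)&(Y|Z)
            return ('&', distribute_or(('|', a[1], b)),
                         distribute_or(('|', a[2], b)))
        if isinstance(b, tuple) and b[0] == '&':
            return ('&', distribute_or(('|', a, b[1])),
                         distribute_or(('|', a, b[2])))
    return (op, a, b)

def to_cnf(s):
    return distribute_or(move_not_inwards(eliminate_implications(s)))
```

For example, p -> (q & r) becomes (~p | q) & (~p | r).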
Conjunctive normal forms: For the resolution rule to resolve two sentences, they must both be in a normalised format called conjunctive normal form (CNF).
Propositional versions of resolution: So far we've only looked at propositional versions of resolution; in first-order logic we must also deal with quantified variables, which requires unification.
Binary resolution: We saw unit resolution, a propositional inference rule, in the previous lecture: from A ∨ B and ¬B we can infer A. Binary resolution generalises this by resolving any pair of complementary literals.
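As a sketch, binary resolution over propositional clauses can be written as below; encoding clauses as frozensets of string literals ('p' or '~p') is an assumed representation for illustration:

```python
def resolve(clause1, clause2):
    """Binary resolution on propositional clauses.
    Clauses are frozensets of literals; a literal is 'p' or '~p'.
    Returns the set of all possible resolvents."""
    resolvents = set()
    for lit in clause1:
        # Complement of 'p' is '~p' and vice versa.
        complement = lit[1:] if lit.startswith('~') else '~' + lit
        if complement in clause2:
            # Merge the two clauses, dropping the resolved pair.
            resolvents.add(frozenset((clause1 - {lit}) |
                                     (clause2 - {complement})))
    return resolvents
```

Resolving {A, B} with {¬B} yields {A}, which reproduces the unit-resolution example above as a special case.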
Drawbacks to resolution theorem proving: The underlining here identifies some drawbacks to resolution theorem proving: it only works for true theorems, so if the conjecture is not a theorem the search may never terminate.
Resolution method: A minor miracle occurred in 1965, when Alan Robinson published his resolution method, which uses a generalised version of the resolution rule of inference.
Proof by contradiction: Forward chaining and backward chaining both have drawbacks, but another approach is to prove theorems by contradiction: assume the negation of the theorem and show that this leads to a contradiction.
Backward chaining: Generally, given that we are only interested in constructing the path, we can set our initial state to be the theorem statement and search backwards towards the axioms.
Forward chaining: Suppose we have a set of axioms that we know are true statements about the world; we set each of these to be an initial state of the search.
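For propositional Horn clauses, forward chaining can be sketched as below; the (premises, conclusion) rule encoding is an assumption for illustration:

```python
def forward_chain(rules, facts, goal):
    """Forward chaining for propositional Horn clauses.
    `rules` is a list of (premises, conclusion) pairs, e.g.
    (['a', 'b'], 'c'); `facts` is a set of atoms known to be true.
    Repeatedly fire rules whose premises all hold until the goal
    is derived or no new fact can be added."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if conclusion not in facts and all(p in facts for p in premises):
                facts.add(conclusion)     # new fact derived
                changed = True
                if conclusion == goal:
                    return True
    return goal in facts
```

Starting from the axioms and firing rules until nothing new is derivable mirrors the search described in the entry: the axioms are the initial states and each rule application is a state transition.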
Chains of inference: We now look at how to get an agent to prove a given theorem using various search strategies, as noted in previous lectures.
Existential introduction: If we have a sentence A and a variable v that does not occur in A, then for any ground term g that occurs in A, we can infer the sentence obtained by replacing g with v and existentially quantifying over v.
Existential elimination: Given a sentence A with an existentially quantified variable v, for a constant symbol k that does not appear elsewhere in the knowledge base, we can substitute k for v and drop the quantifier.
Universal elimination: For any sentence A containing a universally quantified variable v, and for any ground term g, we can substitute g for v and drop the quantifier.