Stable Model Implementation of Layer Supported Models by Program Transformation
Luís Moniz Pereira, Alexandre Pinto
Centre for Artificial Intelligence - CENTRIA, Universidade Nova de Lisboa
INAP'09, 5-7 November, Évora, Portugal
Sunday, November 8, 2009
Summary
Motivation
• For practical applications, availability of a top-down query-driven procedure is most convenient for ease of use and efficient computation of answers, when employing Logic Programs as knowledge bases
• A 2-valued semantics for Normal Logic Programs (NLPs) allowing top-down query-solving is highly desirable, but the Stable Models semantics (SM) does not permit it, for lack of the "relevance" property
• To overcome this limitation we introduce a 2-valued semantics for NLPs — the Layer Supported Models semantics (LSM) — that conservatively extends the SM semantics, enjoys relevance and cumulativity, guarantees model existence, and respects the Well-Founded Model (WFM); we implement it as a meta-interpreter via a program transformation
Outline
• Motivation
• Layering vs. Stratification
• Layer Supported Models
• Properties of LSM semantics
• Program Transformation: LSM to SM
• Conclusions and Future Work
Motivation (1/3)
• NLPs are commonly used for KRR
• Stable Models fails to give semantics to all NLPs
Ex:
beach ← not mountain
mountain ← not travel
travel ← not beach, passport_ok
passport_ok ← not expired_passport
expired_passport ← not passport_ok
has no SMs, though the lower (even) loop by itself has two: {pass_ok} {exp_pass}
Motivation (2/3) • SM fails to give semantics to Odd Loops Over Negation — OLONs • OLONs can appear by themselves, by combining NLPs, or by updating an NLP • OLONs are not Integrity Constraints ‒ ICs — they express distinct KRR concerns
Motivation (3/3)
• SM does not allow top-down query-solving due to lack of Relevance
• SM needs full grounding to be computed: a major drawback; another is that whole models are computed
• There is a need for a semantics which:
  • Extends the SM semantics
  • Guarantees model existence for any NLP
  • Enjoys Relevance: truth depends only on the call-graph
  • Respects the WFM, like the SM semantics does
Layering vs. Stratification (1/3)
• The classical notion of Stratification is atom-based => misses structural info
• Stratification does not cover loops
• A more general notion tackles the symmetry of loops, and stratifies rules, not atoms: Layering
Layering vs. Stratification (2/3)
Layer 2: Odd Loop
beach ← not mountain
mountain ← not travel
travel ← not beach, passport_ok
Layer 1: Even Loop
passport_ok ← not expired_passport
expired_passport ← not passport_ok
• This program has no Stratification
• Layering covers loops: all rules of a loop are in the same layer
• Non-loop dependencies are mapped into different layers ‒ e.g. rules for travel and passport_ok
Layering vs. Stratification (3/3)
Another example, with the layer numbers of its least layering (the least layering of a program is easily seen to be unique):
c ← not d, not y, not a   Layer 3
d ← not c                 Layer 3
y ← not x                 Layer 2
b ← not x                 Layer 2
x ← not x                 Layer 1
b                         Layer 1
a ← falsum                Layer 0
• Atom a has no rules, so its created unique rule a ← falsum is placed in Layer 0
• Layering is rule-based => captures all structural info
• Rules for the same head ‒ e.g. the rules for b ‒ may belong to different layers
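The grouping behind layering can be approximated by computing strongly connected components over the rule dependency graph. The sketch below is a simplification under stated assumptions: it ignores the paper's exact treatment of negation and facts and simply bumps the layer across every inter-SCC dependency, so the absolute layer numbers differ from the slide's; only the relative structure (loops share a layer, dependents sit above) is preserved.

```python
from collections import defaultdict

def rule_layers(rules):
    """Assign a relative layer to each rule (index) of a program given
    as (head, body_atoms) pairs; rules in the same loop share a layer."""
    defs = defaultdict(list)                      # atom -> its rule indices
    for i, (head, _) in enumerate(rules):
        defs[head].append(i)
    succ = {i: sorted({j for a in body for j in defs[a]})
            for i, (_, body) in enumerate(rules)}
    # Tarjan's SCC algorithm (recursive; fine for small programs)
    index, low, on, stack, sccs = {}, {}, set(), [], []
    def visit(v):
        index[v] = low[v] = len(index)
        stack.append(v); on.add(v)
        for w in succ[v]:
            if w not in index:
                visit(w)
                low[v] = min(low[v], low[w])
            elif w in on:
                low[v] = min(low[v], index[w])
        if low[v] == index[v]:
            scc = set()
            while True:
                w = stack.pop(); on.discard(w); scc.add(w)
                if w == v:
                    break
            sccs.append(scc)
    for v in succ:
        if v not in index:
            visit(v)
    comp = {v: k for k, scc in enumerate(sccs) for v in scc}
    layer = {}
    for k, scc in enumerate(sccs):   # SCCs are emitted successors-first
        deps = {comp[w] for v in scc for w in succ[v]} - {k}
        layer[k] = max((layer[d] + 1 for d in deps), default=0)
    return {i: layer[comp[i]] for i in range(len(rules))}

# The slide's example; negation is ignored here (a simplification)
rules = [
    ("c", ["d", "y", "a"]), ("d", ["c"]),
    ("y", ["x"]), ("b", ["x"]),
    ("x", ["x"]), ("b", []),
    ("a", ["falsum"]),
]
print(rule_layers(rules))
```

Note how the c/d loop lands in one shared layer, and how the two rules for b land in different layers, exactly the two structural points the slide makes.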
Layer Supported Models (1/5)
beach ← not mountain
mountain ← not travel
travel ← not beach, passport_ok
LS models: {p_ok, b, m} {p_ok, m, t} {p_ok, t, b} {e_p, m}
passport_ok ← not expired_passport
expired_passport ← not passport_ok
LS models: {p_ok} {e_p}
• Full models are combinations of the individual layers' minimal models, but ...
• ... only of those minimal models that respect the WFM of the rules in the layers below
Layer Supported Models (2/5)
Another example ‒ a Layered Unsupported Loop:
c ← not a      Layer 2
a ← c, not b   Layer 2
b              Layer 1
• The only rule for b is in the first layer; since it is a fact, b is always true in the WFM. Knowing this, the body of the rule for a is false (both classically and layered); since it is the only rule for a, a is false and, consequently, c is true in the WFM
• {a,b} is a Minimal Model, but does not respect the WFM of Layer 1: in Layer 2, a is false as a consequence of the WFM of Layer 1
• {b,c} is the single LSModel (also an SModel)
• LM semantics, which does not exact layered support, would instead give two models: LM1 = {b,a} and LM2 = {b,c}
Layer Supported Models (3/5)
From an intuitive operational perspective:
1. Find the (unique) Layering of P: O(|N|+|E|) ‒ Tarjan's SCC algorithm complexity
2. LSM := {}
3. For each layer repeat (starting from the bottom)
  3.1. Compute any one Minimal Model MM for the layer
  3.2. LSM := LSM ∪ MM
  3.3. Reduce the whole program by the MM
  3.4. Go on to the layer above
4. LSM is a Layer Supported Model of P
Layer Supported Models (4/5)
• Any MM for the bottom Layer enforces the CWA, and so does the WFM
• i.e. any MM for Layer 0 has all the true atoms and none of the false atoms of the WFM
• Program reduction by MM: for every a ∈ MM and b ∉ MM
  • Delete rules with not a in the body
  • Delete rules with b in the body
  • Delete a from bodies of rules
  • Delete not b from bodies of rules
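The four deletions above can be sketched directly. The rule encoding and the function name are illustrative choices of ours, not from the presentation:

```python
def reduce_by(rules, mm, atoms):
    """Reduce a program by a minimal model MM of the layer below.
    `atoms` are all the atoms decided by that layer, so `b` ranges
    over the atoms of the layer that are not in MM."""
    false_atoms = atoms - mm
    out = []
    for head, pos, neg in rules:
        if neg & mm:            # delete rules with `not a` in the body
            continue
        if pos & false_atoms:   # delete rules with b in the body
            continue
        # delete a from bodies; delete `not b` from bodies
        out.append((head, pos - mm, neg - false_atoms))
    return out

# The travel program's upper layer, reduced by MM = {passport_ok}
upper = [
    ("beach",    frozenset(),                frozenset({"mountain"})),
    ("mountain", frozenset(),                frozenset({"travel"})),
    ("travel",   frozenset({"passport_ok"}), frozenset({"beach"})),
]
print(reduce_by(upper, {"passport_ok"}, {"passport_ok", "expired_passport"}))
```

After the reduction, the positive dependency of travel on passport_ok disappears, leaving a pure three-rule odd loop for the layer above to solve.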
Layer Supported Models (5/5)
• LSM semantics also copes with unbound term-length programs
Ex (a typical case of an infinitely ground program, presented by François Fages [12]):
p(X) ← p(s(X))
p(X) ← not p(s(X))
Ground (layered) version, assuming the only constant is 0 (zero):
p(0) ← p(s(0))
p(0) ← not p(s(0))
p(s(0)) ← p(s(s(0)))
p(s(0)) ← not p(s(s(0)))
...
• This program is OLON-free, yet it has no Stable Models: no rule is classically supported
• It has a unique LSM = {p(0), p(s(0)), p(s(s(0))), ...} ‒ in non-ground form, {p(X)}
Properties of the LSM Semantics
• Conservative Extension of Stable Models — every SM model is a LSM model
• Guarantee of Model Existence — all NLPs have models
• Relevance
  - Strict call-graph top-down querying is sound — no need to compute whole models
  - Grounding by need — for the relevant call-graph only
• Cumulativity — can add LSM atoms as facts; not so in SM
• LSM models are the minimal models that respect the Well-Founded Model
Program Transformation Summary • We have a transformation TR, from one propositional NLP to another, whose LSMs are the Stable Models of the transform • These can be computed by extant Stable Model implementations, TR providing a tool for immediate use of LSM and its applications
Program Transformation (1/4)
• TR used for top-down query-solving meta-interpreter implementation
• Based on: XSB-Prolog + XASP — XSB plus its SModels interface
• Top-down traversal of rules for each literal in the query
• Identification of OLONs via the ancestors list
• "Solving" of OLONs in the residue's SCCs
Program Transformation (2/4)
• To produce P′ from P we need a means to detect OLONs; the mechanism we employ is a variant of Tarjan's Strongly Connected Component (SCC) algorithm [34]: OLONs are just SCCs that happen to have an odd number of default negations along their edges
• When an OLON is detected, new rules are added to the program which make sure the atoms in the OLON have "stable" rules that do not depend on any OLON ‒ an "OLON-solving" mechanism
• Trivial OLONs, i.e. with length 1 like the Introduction's (a ← not a, X), are "solved" simply by removing the not a from the body of the rule
• In general, an OLON has the form ‒ with n odd:
R1 = λ1 ← not λ2, Δ1
R2 = λ2 ← not λ3, Δ2
...
Rn = λn ← not λ1, Δn
where all the λi are atoms, and the Δj are arbitrary conjunctions of literals, referred to as "contexts"
• Any λi true alone in some model suffices to satisfy two rules of the OLON: one by rendering the head true (λi ← not λi+1, Δi) and the other by rendering the body false (λi−1 ← not λi, Δi−1); since n is odd, every minimal model of the OLON has (n+1)/2 of its atoms, e.g. M = {λ1, λ2, λ4, λ6, ..., λn−1}
Program Transformation (3/4)
"Solving" the OLON consists in adding the rules:
R′1 = λ1 ← Δ1, Δ2, ..., Δn, not λ3, not λ5, not λ7, ..., not λn
R′2 = λ2 ← Δ1, Δ2, ..., Δn, not λ1, not λ4, not λ6, ..., not λn−1
R′3 = λ3 ← Δ1, Δ2, ..., Δn, not λ2, not λ5, not λ7, ..., not λn
...
R′n = λn ← Δ1, Δ2, ..., Δn, not λ2, not λ4, not λ6, ..., not λn−1
• One rule may take part in several OLONs: TR considers all context combinations of the OLONs each rule partakes in
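The pattern of the R′i rules can be generated mechanically: on our reading of the slide, each R′i keeps the atoms of the minimal model that makes λi true and negates the rest. A hedged sketch (indices and naming are ours):

```python
def olon_solving_negs(n, i):
    """For an OLON λ1 ← not λ2, Δ1; ...; λn ← not λ1, Δn (n odd),
    return the indices j such that R'_i contains `not λ_j`:
    the complement of the minimal model containing λ_i."""
    assert n % 2 == 1
    model = {i, i % n + 1}     # λ_i and its cyclic successor λ_{i+1}
    j = i % n + 1
    for _ in range((n - 3) // 2):
        j = (j + 1) % n + 1    # advance by 2, cyclically in 1..n
        model.add(j)
    return sorted(set(range(1, n + 1)) - model)

def solved_rules(n):
    """Render the R'_i rules, writing λ as l and Δ as D."""
    ctx = ", ".join(f"D{k}" for k in range(1, n + 1))
    return [f"l{i} <- {ctx}, " +
            ", ".join(f"not l{j}" for j in olon_solving_negs(n, i))
            for i in range(1, n + 1)]

for r in solved_rules(3):
    print(r)
```

For n = 3 this yields l1 ← D1, D2, D3, not l3 (and cyclically for l2, l3), matching the displayed schema with λ3 as the only negated atom of R′1.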
Program Transformation (4/4)
• New program = Original program + TR's generated rules +
  lsmQuery ← user's query (a conjunction of literals)
  ← not lsmQuery
• New program is sent to Smodels
• SMs of New program = LSMs of Original program (under the query)
Example
Original program:
a ⟵ not a, b
b ⟵ c
c ⟵ not b, not a
Query: a

Meta-interpreter trace (Current atom | Current rule | Ancestors | Context):
a | a ⟵ not a, b     | []      | [] → [not a, b] → [b]
b | b ⟵ c            | [a]     | [c]
b | c ⟵ not b, not a | [a,b,c] | [not b, not a] → [not a]  ⇒ new rule for b to be added: b ⟵ not a
a | b ⟵ not a        | []      | [not a] → []              ⇒ new rule for a to be added: a

Final program sent to Smodels:
a ⟵ not a, b
b ⟵ c
c ⟵ not b, not a
b ⟵ not a
a
Complexity of the Transformation
• The highest number of models possible under LSM, #LSMs, is larger than #SMs: for a loop of N nodes, in general,
  #LSMs / #SMs = 3^(N/3) / 2^(N/2) = 3^(N·[1/3 − (1/2)·log₃ 2])
• So, yes, in the worst case, the transformation is of exponential complexity (it does perform a query-relevant unfolding)
• The good news:
1. The transformation is applied to query-relevant literals only, not to the whole program
2. It can be applied incrementally, with no duplicate processing of each atom
Implementation: XSB-Prolog + its XASP interface (XSB Answer Set Programming) to Smodels, one of the most efficient implementations of SMs
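The counts behind that ratio can be checked by brute force for small loops: a single odd loop of length 3 has 3 minimal models (its LSMs, for such a one-layer program) and no stable models, while an even loop of length 2 has 2 stable models. A small sketch, with our own encoding of loop programs:

```python
from itertools import combinations

def loop_rules(n):
    """λ1 ← not λ2; ...; λn ← not λ1, as (head, negated_atom) pairs."""
    return [(i, i % n + 1) for i in range(1, n + 1)]

def classical_models(n):
    """All classical models: each rule h ← not b needs h ∈ M or b ∈ M."""
    rules, ms = loop_rules(n), []
    for k in range(n + 1):
        for s in combinations(range(1, n + 1), k):
            m = set(s)
            if all(h in m or b in m for h, b in rules):
                ms.append(m)
    return ms

def minimal(models):
    return [m for m in models if not any(o < m for o in models)]

def stable(n):
    """Stable models via the Gelfond-Lifschitz reduct (facts only here)."""
    rules = loop_rules(n)
    return [m for m in classical_models(n)
            if {h for h, b in rules if b not in m} == m]

print(len(minimal(classical_models(3))), len(stable(3)))  # odd loop:  3, 0
print(len(minimal(classical_models(2))), len(stable(2)))  # even loop: 2, 2
```

With disjoint loops the counts multiply, giving 3^(N/3) minimal models for N nodes arranged in odd loops of 3, versus 2^(N/2) stable models for even loops of 2, which is where the displayed ratio comes from.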
Conclusions
• LSM semantics properties: model existence, relevance, cumulativity, respects WFM, extends SM
• LSM meta-interpreter plus program transformation: TR + XSB-Prolog-XASP
• LSM meta-interpreter allows top-down queries – suitable for 'by need' abductive queries
Future Work (1/2)
• Having defined a more general 2-valued semantics for NLPs, much remains to be done: properties, complexity, extensions, comparisons, implementations, applications
• The applications afforded by LSM are all those of SM, plus those where OLONs are employed for problem-domain representation
• Model existence is essential where knowledge sources are diverse — semantic web, updates
Future Work (2/2)
• XSB engine-level efficient implementation of LSM semantics
• Explore a wider scope of applications w.r.t. ASP, namely the combination with abduction, updates, and constructive negation
• Define the Well-Founded Layer Supported Model; an incidental topic is its relation to the O-semantics
• The concepts and techniques introduced can be adopted by other logic programming systems
Thank you for your attention!