Michele Volpato / Alnos / Commits

Commit 1b365d1e authored Oct 27, 2015 by Michele Volpato

Merge branch 'development' into 'master'

Development: keep master up to date. See merge request !1

Parents: be37e8f3, 2c5ad38c
Changed files: 21
CHANGELOG.md

@@ -3,8 +3,17 @@ All notable changes to this project will be documented in this file.

This project adheres to [Semantic Versioning](http://semver.org/).

## [Unreleased][unreleased]

## [v0.2.0] - 2015-10-27
### Added
- This CHANGELOG file.
- Validity of suspension automata.
- Testing algorithms
- Counterexample handling
- Algorithms for double sets in the table

### Changed
- Moved old algorithm for learning to `oraclelearning` and `oracleobservationtable`

## [v0.1.0] - 2015-09-29
### Added

...

@@ -16,4 +25,4 @@ This project adheres to [Semantic Versioning](http://semver.org/).

- Simple Examples

[unreleased]: https://gitlab.science.ru.nl/mvolpato/active-learning-nondeterministic-systems/compare/v0.1.0...HEAD
[v0.1.0]: https://gitlab.science.ru.nl/mvolpato/active-learning-nondeterministic-systems/compare/f7f05033cf5e002a45a67632e60b311892ca0850...v0.1.0
README.md
...

@@ -4,12 +4,17 @@ The active-learning-nondeterministic-systems is an implementation of an

adaptation of [L*](http://www.cs.berkeley.edu/~dawnsong/teaching/s10/papers/angluin87.pdf) to
nondeterministic systems. The code is based on these scientific papers:

* `[1]` [Active Learning of Nondeterministic Systems from an ioco Perspective](http://link.springer.com/chapter/10.1007%2F978-3-662-45234-9_16)
* `[2]` [Approximate Active Learning of Nondeterministic Input Output Transition Systems](http://www.italia.cs.ru.nl/html/papers/VT15.pdf)

The goal is to construct a model of a system for model-based testing,
simulation, or model checking.

### Python version
The project is coded in Python3 and tested using Python3.4.

## Included Libraries
[NumPy](https://github.com/numpy/numpy)
...

@@ -43,14 +48,15 @@ teacher = YourOwnAdapterTeacher()

```
oracle = YourOwnAdapterOracle()
underModel, overModel = LearningAlgorithm(teacher, oracle, maxLoops=10,
                                          tablePreciseness=10000,
                                          modelPreciseness=0.1, tester=tester)
```
where `underModel` and `overModel` are the under- and over-approximations
of your system, respectively, `maxLoops` is the limit on learning loops
when the learned models are no longer changing, `tablePreciseness` and
`modelPreciseness` are the levels of preciseness you would like to reach
before stopping, and `tester` is a testing algorithm.

The learning process stops when either the learned model does not change for
`maxLoops` loops, or when both preciseness levels are met.
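The stop condition described here can be sketched as a tiny predicate (hypothetical helper names, not part of this library):

```python
def should_stop(loops_without_change, max_loops, table_ok, model_ok):
    # Stop when the learned models have not changed for max_loops loops...
    if loops_without_change >= max_loops:
        return True
    # ...or when both preciseness targets have been reached.
    return table_ok and model_ok
```

Here `table_ok` and `model_ok` stand for "the table/model preciseness target is met"; the real implementation compares against `tablePreciseness` and `modelPreciseness`.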
...

@@ -62,4 +68,4 @@ checking [my contact details](https://gitlab.science.ru.nl/u/mvolpato).

## License
See [LICENSE](./LICENSE)
examples/learn.py

@@ -9,8 +9,9 @@ from systems.implementations import InputOutputLTS

from teachers.ltsoracles import InputOutputPowerOracle
import logging
import helpers.bisimulation as bi
from testing.randomtesting import RandomTester

logging.basicConfig(level=logging.INFO)
logger = logging.getLogger(__name__)

inputs = set(['a', 'b'])

...

@@ -33,20 +34,31 @@ I1.addTransition(4,'y',2)

I1.addTransition(4,'b',4)
I1.addTransition(4,'a',0)
I1.addTransition(4,'x',0)
I1.addTransition(1,'a',0)
I1.addTransition(3,'b',1)
I1.addTransition(3,'a',0)
I1.addTransition(4,'b',2)
I1.addTransition(2,'a',3)
I1.addTransition(4,'y',0)
I1.makeInputEnabled()

T1 = InputOutputTeacher(I1)
O1 = InputOutputPowerOracle(I1)
tester = RandomTester(T1, 10000, 20)

currentdir = os.path.dirname(os.path.abspath(
    inspect.getfile(inspect.currentframe())))
path = os.path.join(currentdir, "dotFiles")

print("Starting learning...")
# change printPath=None to printPath=path for dot files
L2 = LearningAlgorithm(T1, O1, printPath=None, maxLoops=4,
                       tablePreciseness=10000, logger=logger, tester=tester)
minus, plus = L2.run()
print("Models learned. Check language equivalence...")

...
helpers/bisimulation.py

def bisimilar(system1, system2, startState1=0, startState2=0):
    # starting from given states
    state1 = (startState1,)
    state2 = (startState2,)
    past = set()
    wait = set()
    trace = ()
    wait.add((state1, state2, trace))
    while wait:
        current = wait.pop()
        past.add((current[0], current[1]))
        system1Labels = system1.getInputs().union(system1.getOutputs())
        system1Labels.add(system1.getQuiescence())
        system2Labels = system2.getInputs().union(system2.getOutputs())
        system2Labels.add(system2.getQuiescence())
        enabledLabels_1 = set()
        for state in current[0]:
            enabledLabels_1 = enabledLabels_1.union(system1.outputs(state))

...
helpers/traces.py (new file, mode 0 → 100644)
# Functions useful for handling traces

# Given a trace, flatten it so that consecutive repetitions of 'label'
# collapse into a single occurrence
def flatten(trace, label):
    if trace is None:
        return None
    if len(trace) == 0:
        return trace
    # If trace does not contain at least 2 quiescence symbols, return trace
    if trace.count(label) < 2:
        return trace
    # Temporary list constructed while checking current trace
    finalTrace = [trace[0]]
    for action in trace[1:]:
        if action != label:
            finalTrace.append(action)
        elif finalTrace[-1] != label:
            finalTrace.append(action)
    return tuple(finalTrace)
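A quick sanity check of the collapsing behaviour (a self-contained copy of `flatten` above, with `'d'` standing in for the quiescence label):

```python
def flatten(trace, label):
    # Collapse back-to-back repetitions of `label` into one occurrence.
    if trace is None:
        return None
    if len(trace) == 0 or trace.count(label) < 2:
        return trace
    finalTrace = [trace[0]]
    for action in trace[1:]:
        if action != label or finalTrace[-1] != label:
            finalTrace.append(action)
    return tuple(finalTrace)

print(flatten(('d', 'd', 'a', 'd', 'd', 'b'), 'd'))  # → ('d', 'a', 'd', 'b')
```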
# Given a trace, returns δ(σ) as the smallest set s.t. σ ∈ δ(σ) and
# σ1·δ·σ2 ∈ δ(σ) => σ1·σ2 ∈ δ(σ)
def removeLabelsInCombination(trace, label):
    newTraces = set()
    if label in trace:
        index = trace.index(label)
        withQ = trace[:(index + 1)]
        withoutQ = trace[:index]
        rest = trace[(index + 1):]
        for subtrace in removeLabelsInCombination(rest, label):
            newTraces.add(withQ + subtrace)
            newTraces.add(withoutQ + subtrace)
    else:
        newTraces.add(trace)
    return newTraces
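For a trace with two δ symbols, this closure contains every combination of kept and dropped δs. A self-contained copy of the function above (with `'d'` as δ):

```python
def removeLabelsInCombination(trace, label):
    # Smallest set containing `trace` and closed under dropping `label`s.
    newTraces = set()
    if label in trace:
        index = trace.index(label)
        withQ, withoutQ = trace[:index + 1], trace[:index]
        for subtrace in removeLabelsInCombination(trace[index + 1:], label):
            newTraces.add(withQ + subtrace)
            newTraces.add(withoutQ + subtrace)
    else:
        newTraces.add(trace)
    return newTraces

print(sorted(removeLabelsInCombination(('a', 'd', 'b', 'd'), 'd')))
# → [('a', 'b'), ('a', 'b', 'd'), ('a', 'd', 'b'), ('a', 'd', 'b', 'd')]
```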
# Given a set of traces, return a set with all prefixes of all traces,
# including those traces themselves
def getPrefixes(traces):
    prefixes = set()
    for trace in traces:
        # If I already encountered this trace, skip it
        if trace in prefixes:
            continue
        for pos in range(len(trace)):
            prefixes.add(trace[:pos])
    return traces.union(prefixes)
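A small check (self-contained copy of `getPrefixes` above): the result contains the empty trace, every proper prefix, and the traces themselves:

```python
def getPrefixes(traces):
    # Collect every proper prefix, then union with the full traces.
    prefixes = set()
    for trace in traces:
        if trace in prefixes:
            continue
        for pos in range(len(trace)):
            prefixes.add(trace[:pos])
    return traces.union(prefixes)

print(sorted(getPrefixes({('a', 'b')})))  # → [(), ('a',), ('a', 'b')]
```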
# simple trie structure for traces
def make_trie(traces):
    trie = {}
    for trace in traces:
        if trace == ():
            trace = (u"\u03B5",)
        temp_trie = trie
        for label in trace:
            temp_trie = temp_trie.setdefault(label, {})
        temp_trie = temp_trie.setdefault('_end_', '_end_')
    return trie

# if a trace is in trie
def in_trie(trie, trace):
    if trace == ():
        trace = (u"\u03B5",)
    temp_trie = trie
    for label in trace:
        if label not in temp_trie:
            return False
        temp_trie = temp_trie[label]
    return '_end_' in temp_trie

# remove a trace from trie
def remove_from_trie(trie, trace, depth=0):
    if trace == ():
        trace = (u"\u03B5",)
    if len(trace) == depth + 1:
        if '_end_' in trie[trace[depth]]:
            del trie[trace[depth]]['_end_']
        # check whether longer traces and sibling branches remain
        if len(trie[trace[depth]]) > 0 and len(trie) > 1:
            # longer traces through this node and siblings both remain
            return False
        elif len(trie) > 1:
            # only sibling branches remain: drop this node
            del trie[trace[depth]]
            return False
        elif len(trie[trace[depth]]) > 0:
            # only longer traces through this node remain
            return False
        else:
            return True
    else:
        temp_trie = trie
        # Recursively climb up to delete.
        if remove_from_trie(temp_trie[trace[depth]], trace, depth + 1):
            if temp_trie:
                del temp_trie[trace[depth]]
            return not temp_trie
        else:
            return False
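The three trie helpers fit together as below. These are self-contained copies of the functions above (the ε special case for the empty trace is omitted for brevity); removing one trace leaves its sibling intact:

```python
def make_trie(traces):
    # Nested dicts; the '_end_' key marks a complete trace.
    trie = {}
    for trace in traces:
        node = trie
        for label in trace:
            node = node.setdefault(label, {})
        node.setdefault('_end_', '_end_')
    return trie

def in_trie(trie, trace):
    node = trie
    for label in trace:
        if label not in node:
            return False
        node = node[label]
    return '_end_' in node

def remove_from_trie(trie, trace, depth=0):
    # Returns True when the caller should delete the now-empty branch.
    if len(trace) == depth + 1:
        node = trie[trace[depth]]
        if '_end_' in node:
            del node['_end_']
        if len(node) > 0 and len(trie) > 1:
            return False  # longer traces and siblings both remain
        elif len(trie) > 1:
            del trie[trace[depth]]
            return False  # only sibling branches remain
        elif len(node) > 0:
            return False  # only longer traces through this node remain
        return True
    if remove_from_trie(trie[trace[depth]], trace, depth + 1):
        if trie:
            del trie[trace[depth]]
        return not trie
    return False

trie = make_trie({('a', 'x'), ('a', 'y')})
remove_from_trie(trie, ('a', 'x'))
print(in_trie(trie, ('a', 'x')), in_trie(trie, ('a', 'y')))  # → False True
```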
learning/Deprecated/oraclelearning.py (new file, mode 0 → 100644)
# Learning for OLD algorithm (set and boolean)
# Deprecated
from .oracleobservationtable import Table
import random
from systems.implementations import SuspensionAutomaton
import os, inspect
import helpers.graphhelper as gh
import logging
import helpers.traces as th


class LearningAlgorithm:

    def __init__(self, teacher, oracle, tester, tablePreciseness=1000,
                 modelPreciseness=0.1, closeStrategy=None, printPath=None,
                 maxLoops=10, logger=None):
        self._logger = logger or logging.getLogger(__name__)
        self._teacher = teacher
        self._oracle = oracle
        self.tester = tester
        self._tablePreciseness = tablePreciseness
        self._modelPreciseness = modelPreciseness
        self._table = Table(self._teacher.getInputAlphabet().copy(),
                            self._teacher.getOutputAlphabet().copy(),
                            self._teacher.getQuiescence(),
                            closeStrategy, logger=self._logger)
        # Maximum number of loops with no effect on hPlus model
        self._noEffectLimit = maxLoops
        # Current number of loops
        self._currentLoop = 0
        outputs = self._teacher.getOutputAlphabet()
        self._hMinus = SuspensionAutomaton(
            1, self._teacher.getInputAlphabet().copy(),
            self._teacher.getOutputAlphabet().copy(),
            self._teacher.getQuiescence())
        self._hPlus = SuspensionAutomaton(
            1, self._teacher.getInputAlphabet().copy(),
            self._teacher.getOutputAlphabet().copy(),
            self._teacher.getQuiescence(), chaos=True)
        self._printPath = printPath

    # this update uses a realistic teacher. If I need an output to happen I
    # cannot force it to happen.
    def updateTable(self):
        # First, try to avoid impossible traces: ask observation query
        for trace in self._table.getObservableTraces():
            observedOutputs = self._table.getOutputs(trace)
            observation = self._oracle.observation(trace, observedOutputs)
            if observation:
                self._table.updateEntry(trace, observation=observation)

        # For all traces for which we did not observe all possible outputs
        oTraces = self._table.getObservableTraces()
        trie = th.make_trie(oTraces)
        # Until we tried K times with no results
        K = len(oTraces) * 150  # TODO: should not be hardcoded
        found = 0
        tries = 0
        while tries < K:
            tries += 1
            oTraces = self._table.getObservableTraces()
            trie = th.make_trie(oTraces)
            subtrie = trie
            # if no trace is observable (best scenario)
            if len(oTraces) == 0:
                break
            observations = {}  # Dictionary with obtained outputs
            consecutiveInputs = ()  # keep track of last inputs sequence
            currentTrace = ()  # keep track of current trace
            i = 0
            # We build a trace until we either
            # 1 - observe an output that makes the trace not a prefix
            # 2 - there is no continuation of that trace in prefixes
            # We stop when we observed at least an output for each observable
            while len(oTraces) > len(observations.keys()):  # and i < K:
                i += 1
                # check if trie contains no traces (but still has a child)
                children = trie.keys()
                hasTrace = False
                for child in children:
                    if trie[child] != {}:
                        hasTrace = True
                if not hasTrace:
                    break
                # if currentTrace is observable and we did not process it
                # already, we ask an output and we add the result to
                # observations[currentTrace]
                if (currentTrace in oTraces
                        and currentTrace not in observations.keys()):
                    # there might be some inputs waiting to be processed
                    if consecutiveInputs != ():
                        output = self._processInputs(consecutiveInputs)
                        # reset the inputs, because we processed them
                        consecutiveInputs = ()
                        if output is None:
                            # SUT not input enabled: reset
                            currentTrace = ()
                            subtrie = trie
                            self._teacher.reset()
                            continue
                    else:
                        # no input to process, ask an output
                        output = self._teacher.output()
                    # we have an output for currentTrace, add it to observations
                    # this is the first output we observe for currentTrace
                    observations[currentTrace] = set([output])
                    # remove currentTrace from trie
                    th.remove_from_trie(trie, currentTrace)
                    # if that output is not a valid continuation
                    if output not in subtrie.keys():
                        # reset the process
                        currentTrace = ()
                        subtrie = trie
                        self._teacher.reset()
                        continue
                    # navigate trie
                    subtrie = subtrie[output]
                    currentTrace = currentTrace + (output,)
                else:
                    # currentTrace not observable, or already observed
                    # get an input from subtries
                    children = subtrie.keys()
                    inputChildren = [x for x in children
                                     if x in self._teacher.getInputAlphabet()]
                    if len(inputChildren) > 0:
                        # process this input, add it to consecutiveInputs
                        # and navigate subtrie
                        input = random.sample(inputChildren, 1)[0]
                        consecutiveInputs = consecutiveInputs + (input,)
                        subtrie = subtrie[input]
                        currentTrace = currentTrace + (input,)
                        continue
                    else:
                        # no inputs available, wait for output
                        # there might be some inputs waiting to be processed
                        if consecutiveInputs != ():
                            output = self._processInputs(consecutiveInputs)
                            # reset the inputs, because we processed them
                            consecutiveInputs = ()
                            if output is None:
                                # SUT not input enabled: reset
                                currentTrace = ()
                                subtrie = trie
                                self._teacher.reset()
                                continue
                        else:
                            # no input to process, ask an output
                            output = self._teacher.output()
                        # we have an output for currentTrace,
                        # if currentTrace is in oTraces add it to observations
                        if currentTrace in oTraces:
                            observations[currentTrace].add(output)
                            # remove currentTrace from trie
                            if th.in_trie(trie, currentTrace):
                                th.remove_from_trie(trie, currentTrace)
                        # if that output is not a valid continuation
                        if output not in subtrie.keys():
                            # reset the process
                            currentTrace = ()
                            subtrie = trie
                            self._teacher.reset()
                            continue
                        # navigate trie
                        subtrie = subtrie[output]
                        currentTrace = currentTrace + (output,)
            # end while loop
            # observations contains observed outputs
            found += len(observations.keys())
            for trace in observations.keys():
                # Only if trace is a prefix in S, then
                # add trace + output to row (S cdot L_delta)
                if self._table.isInS(trace):
                    for output in observations[trace]:
                        self._table.addOneLetterExtension(trace, output)
                # Update set of outputs for traces where deltas are removed
                for deltaTrace in self._table.getDeltaTraces(trace):
                    for output in observations[trace]:
                        self._table.updateEntry(deltaTrace, output=output)
                for output in observations[trace]:
                    self._table.updateEntry(trace, output=output)

        # Observation query
        # ask observation query for all entries because I could have added
        # some 'impossible' traces
        for trace in self._table.getObservableTraces():
            observedOutputs = self._table.getOutputs(trace)
            observation = self._oracle.observation(trace, observedOutputs)
            if observation:
                self._table.updateEntry(trace, observation=observation)

    # # this update function uses teacher.process(trace)
    # # in case a InputOutputTeacher is used, outputs in trace are forced to happen
    # # this is not realistic, but still useful at the moment.
    # def oldUpdateTable(self):
    #     temp = 0
    #     tot = 0
    #     for c in range(200):
    #         for trace in self._table.getObservableTraces():
    #             observedOutputs = self._table.getOutputs(trace)
    #             output = self._teacher.process(trace)
    #             for i in range(10):
    #                 # try again if retrieving output is unsuccesful
    #                 if output != None:
    #                     break
    #                 output = self._teacher.process(trace)
    #                 tot += 1
    #             if output != None:
    #                 # Only if trace is a prefix in S, then
    #                 # add trace + output to row (S cdot L_delta)
    #                 if self._table.isInS(trace):
    #                     self._table.addOneLetterExtension(trace, output)
    #
    #                 # Update set of outputs for traces where deltas are removed
    #                 for deltaTrace in self._table.getDeltaTraces(trace):
    #                     self._table.updateEntry(deltaTrace, output)
    #
    #                 # Add this output to the set of outputs observed after trace
    #                 observedOutputs.add(output)
    #             else:
    #                 temp += 1
    #
    #             observation = self._oracle.observation(trace, observedOutputs)
    #             self._table.updateEntry(trace, output, observation)

    def _processInputs(self, consecutiveInputs):
        if consecutiveInputs != ():
            output = self._teacher.oneOutput(consecutiveInputs)
            if output is None:
                # SUT did not accept an input.
                self._logger.warning("SUT did not accept input in "
                                     + str(consecutiveInputs))
                return None
            return output
        return self._teacher.output()

    def stabilizeTable(self):
        # Keep closing the table and making it consistent until nothing changes
        closingRows = self._table.isNotGloballyClosed()
        consistentCheck = self._table.isNotGloballyConsistent()
        while closingRows or consistentCheck:
            while closingRows:
                self._logger.info("Closing table")
                self._logger.debug(closingRows)
                self._table.promote(closingRows)
                # After promoting, one should check whether some one-letter
                # extensions should also be added
                if self._table.addOneLetterExtensions(closingRows):
                    self._logger.info("something changed")
                    if self._logger.isEnabledFor(logging.DEBUG):
                        self._table.printTable(prefix="_c_")
                self.updateTable()
                closingRows = self._table.isNotGloballyClosed()
            consistentCheck = self._table.isNotGloballyConsistent()
            # Table is closed, check for consistency
            if consistentCheck:
                self._logger.info("Consistency check")
                self._logger.debug(consistentCheck)
                if self._table.addColumn(consistentCheck, force=True):
                    self._logger.info("something changed")
                    if self._logger.isEnabledFor(logging.DEBUG):
                        self._table.printTable(prefix="_i_")
                # TODO: is an update needed here? in theory, when I encounter
                # an inconsistency, by adding a column, the interesting row
                # will immediately make the table not closed, no need of
                # update, right?
                # self.updateTable()
                closingRows = self._table.isNotGloballyClosed()
                consistentCheck = self._table.isNotGloballyConsistent()

    def getHypothesis(self, chaos=False):
        # If table is not closed, ERROR
        if self._table.isNotGloballyClosed():
            self._logger.error("Tried to get hypotheses with table not "
                               "closed or not consistent")
            return None, None
        # Get equivalence classes
        rows = self._table.getEquivalenceClasses(chaos)