1 Introduction
The notion of an automaton (Kleene Reference Kleene, Shannon and McCarthy1956), as the de facto mathematical abstraction of a computational process over a discrete state space, is constantly revisited to capture different sorts of computational behaviours in the most varied contexts, either prescribed in a program or discovered in Nature. Already in 1997, Milner (Reference Milner2006) emphasised that from being a prescription for how to do something – in Turing’s terms a ‘list of instructions’, software becomes much more akin to a description of behaviour, not only programmed on a computer, but occurring by hap or design inside or outside it. Over time, different kinds of automata were proposed to generate (or recognise, depending on the perspective) such behaviours (or the languages that express them). Regular expressions, as a basic notation to express languages and behaviours, were first axiomatised by Kozen (Reference Kozen and Rovan1990) as Kleene algebras, which are, basically, partially ordered semirings endowed with a closure operator. Several interpretations and variants of this structure are documented in the literature (Hoare et al. Reference Hoare, Möller, Struth and Wehrman2011; Jipsen and Andrew Moshier Reference Jipsen and Andrew Moshier2016; Kozen Reference Kozen1997; Kozen and Mamouras; McIver et al. Reference McIver, Cohen, Morgan and Schmidt2006; McIver et al. Reference McIver, Rabehaja, Struth, Bortolussi and Wiklicky2013; Qiao et al. Reference Qiao, Wu, Wang and Gao2008; Thiemann Reference Thiemann2016).
This paper was born out of a challenge: having previously worked with the Fuzzy Arden Syntax (FAS) (Gomes et al. Reference Gomes, Madeira and Barbosa2021), a fuzzy, imperative language used for medical diagnosis and the prescription of medical procedures, our aim was to introduce a specific kind of automata, and corresponding languages, able to express the behaviour of the underlying fuzzy systems.
Two specific ingredients have to be taken into consideration. The first is vagueness, or uncertainty, a notion that underlies the interpretation of both variables and predicates in FAS programs. The second is simultaneity, i.e. a form of parallel execution which is not captured by non-deterministic interleaving of elementary steps, as in typical models of concurrency. Consider, for illustration purposes, the following program.
The program adjusts the dose of medicine to be administered to a patient depending on her temperature. The variable Fever_condition is a function assigning, to each real value of the temperature measured, a value (e.g. within the range [0,1]) recording how close that temperature is to a ‘fever condition’. In a scenario where the predicate Temperature is in Fever_condition and its negation have values greater than 0, let us say $0.4$ and $0.6$, respectively, the program executes both the then and the else blocks, weighted by the value associated to each of them. In practice, this results in a multiplication of the values taken, in each case, by variable medicine. Intuitively, the values $0.4$ and $0.6$ mean that Temperature has probably not reached the limit of a fever condition but is close to it.
Summing up, the intended semantics of a conditional statement in FAS does not reduce to a non-deterministic, or even to a probabilistic, choice (McIver et al. Reference McIver, Rabehaja, Struth, Bortolussi and Wiklicky2013). Instead, it corresponds to a form of parallel execution in which all branches run simultaneously, with (possibly) different weights associated to the evaluation of each condition. Therefore, as this small program illustrates, vagueness and simultaneity are the two ingredients our framework needs to deal with.
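As a minimal illustration of this behaviour, consider the following sketch (not FAS, and with all names assumed): both branches are executed and tagged with the truth degree of their guard, the negation being modelled here as the complement $1-$degree, matching the $0.4$/$0.6$ reading above.

```python
# A toy sketch (not FAS) of the weighted conditional discussed above:
# both branches are executed "in parallel", each tagged with the truth
# degree of its guard; negation is modelled here as 1 - degree.

def fuzzy_if(guard_degree, then_value, else_value):
    """Return the outcome of every branch with a strictly positive weight."""
    outcomes = {"then": (then_value, guard_degree),
                "else": (else_value, 1.0 - guard_degree)}
    return {branch: (value, weight)
            for branch, (value, weight) in outcomes.items() if weight > 0}

# Temperature 'is in' Fever_condition with degree 0.4, so its negation has 0.6:
print(fuzzy_if(0.4, then_value=5, else_value=0))
# {'then': (5, 0.4), 'else': (0, 0.6)}
```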
Vagueness can be captured by fuzzy finite-state automata (FFA), a structure introduced in the 1960s in Wee and Fu (Reference Wee and Fu1969) to give a formal semantics to the uncertainty and vagueness inherent in several computational systems. Different variants of this idea, e.g. incorporating fuzziness into either states or transitions, or both, are well documented in the literature (Doostfatemeh and Kremer Reference Doostfatemeh and Kremer2005; Li and Pedrycz Reference Li and Pedrycz2005; Liu et al. Reference Liu, Wang, Barbosa and Sun2021; Mateescu et al. Reference Mateescu, Salomaa, Salomaa and Yu1995). The corresponding fuzzy languages (Lee and Zadeh Reference Lee and Zadeh1969; Zadeh Reference Zadeh1996) are recognised by this class of automata only up to a certain membership degree. Applications are transversal to several domains, as reported in Lin and Ying (Reference Lin and Ying2002), Mordeson and Malik (2002), Pedrycz and Gacek (Reference Pedrycz and Gacek2001), Ying (Reference Ying2002).
On the other hand, simultaneity was suitably formalised in what Milner called the ‘synchronous version of CCS’ – the SCCS calculus (Milner Reference Milner1983), a variant of CCS (Milner Reference Milner1980) where arbitrary actions are allowed to execute synchronously. This very same idea of synchronous evolution appears in the work of C. Prisacariu on synchronous Kleene algebra (Prisacariu Reference Prisacariu2010). Models for such structures are given in terms of sets of synchronous strings and finite automata accepting them. These structures found application, for instance, in variants of deontic logic to formalise contract languages (Segerberg Reference Segerberg1982; von Wright Reference von Wright1968) and of Hoare logic to reason about parallel synchronous programs with shared variables (Prisacariu Reference Prisacariu2010).
The aim of this paper is to formalise the behaviour of this class of systems. $\mathcal{H}$-automata are introduced as a variant of fuzzy transition automata in the spirit of Mateescu et al. (Reference Mateescu, Salomaa, Salomaa and Yu1995), where transitions take ‘truth’ values in a complete Heyting algebra $\mathcal{H}$, and a suitable synchronous product construction is defined. The paper proceeds by generalising synchronous sets (Prisacariu Reference Prisacariu2010) into a notion of an $\mathcal{H}$-synchronous language, defined as a word valuation function over $\mathcal{H}$. Some preliminary results in this direction appeared in the authors’ conference paper (Gomes et al. Reference Gomes, Madeira and Barbosa2020). However, the formal framework has now been completely redefined in a much more general setting – note, for example, that the explicit introduction of $\mathcal{H}$-valued guards in the language, as suggested in that preliminary work, becomes redundant, i.e. implicit in the relevant mathematical structure and, thus, in the proposed language semantics.
As a main result, it is shown that, for any complete Heyting algebra $\mathcal{H}$, the set of $\mathcal{H}$-synchronous languages equipped with suitable language operators, as proposed here, defines a synchronous Kleene algebra. Moreover, each of its actions can generate an $\mathcal{H}$-automaton accepting precisely the $\mathcal{H}$-synchronous language that constitutes its interpretation. As in the classical, well-known case, a regular expression can be obtained from an $\mathcal{H}$-automaton by a standard state elimination procedure (Hopcroft et al. Reference Hopcroft, Motwani and Ullman2003). The procedure results in an $\mathcal{H}$-automaton with a single transition from the initial to the final state, labelled by an action $\alpha$ whose interpretation is precisely the language recognised by that $\mathcal{H}$-automaton.
This paper is organised as follows. The remainder of this section sums up related work and some preliminaries to the paper’s contribution. Section 2 introduces $\mathcal{H}$-synchronous languages and defines a number of operators over them, proving that, in this way, $\mathcal{H}$-synchronous languages form a synchronous Kleene algebra. Section 3 studies $\mathcal{H}$-automata, including their synchronous product. A few examples of FAS programs involving conditionals are interpreted in this framework. Then, a Kleene theorem for $\mathcal{H}$-automata and $\mathcal{H}$-synchronous languages is proved in Section 4. Finally, Section 5 concludes and enumerates some topics for future research.
1.1 Related work
The construction of finite fuzzy automata with membership degrees taken in a lattice-ordered monoid $\mathcal{L}$ is studied in Li and Pedrycz (Reference Li and Pedrycz2005), in a context analogous to the one considered here, based on the concept of an $\mathcal{L}$-fuzzy regular expression. These are defined as regular expressions over an alphabet X extended with multiplication by a scalar $\lambda\in\mathcal{L}$, which resorts to the monoid multiplication. It is precisely this scalar that assigns the weight to a transition in the automaton. In our approach, on the other hand, automata are built using standard regular expressions instead of fuzzy regular expressions. Regular expressions are then interpreted as a sort of weighted languages (i.e. functions with values on a complete Heyting algebra) accepted by an automaton with weighted transitions.
Most of the results presented in the context of fuzzy languages are obtained using either the real interval [0,1] or a generic residuated lattice to model the (possibly) many-valued membership degrees. Such is the case of Mateescu et al. (Reference Mateescu, Salomaa, Salomaa and Yu1995). However, one of the main results of this paper, Theorem 1, relies on properties provided by a specific characterisation of the underlying lattice structure. In particular, the operator ‘$;$’ has to be idempotent and commutative. The definition of an $\mathcal{H}$-automaton proposed here differs from the one in Mateescu et al. (Reference Mateescu, Salomaa, Salomaa and Yu1995) with respect to the underlying semantic structure, which is here assumed to be a complete Heyting algebra.
Probabilistic automata (Rabin Reference Rabin1963), another approach to handle uncertainty, weigh transitions by elements of a probability distribution. An equivalent of Kleene’s theorem for this family of automata is presented by Bollig et al. (Reference Bollig, Gastin, Monmege, Zeitoun, Chakraborty and Mukund2012), considering (probabilistic) strings with probabilistic choice, guarded choice, concatenation and the star operator. Extensive surveys on this class of automata are documented in Vidal et al. (Reference Vidal, Thollard, de la Higuera, Casacuberta and Carrasco2005a,b). In the approach proposed here, however, uncertainty can be measured in an arbitrary, either discrete or continuous, domain, depending on the relevant application scenario. This is captured by a complete Heyting algebra introduced as a parameter in the model.
Figure 1 summarises some systems from the literature, highlighting the difference of the approach taken in this paper.
Moreover, as we summarised above, the notion of weight can take different meanings. Figure 2 summarises some of these different approaches.
1.2 Preliminaries
The notion of a synchronous Kleene algebra (SKA) plays, in the automata construction introduced by Prisacariu (Reference Prisacariu2010), a role similar to the one played by Kleene algebras in the classical case (Broda et al. Reference Broda, Machiavelo, Moreira, Reis, Gasieniec and Wolter2013). Actually, SKA:
• extends Kleene algebra with a synchronous operator to model the synchronous execution of actions;
• has an interpretation over synchronous languages, which extend regular languages with actions corresponding to the synchronous execution of other actions;
• induces the construction of a class of finite automata, which accept precisely the languages that define the interpretation of SKA actions.
The relevant definitions are recalled below.
Definition 1. (Kleene algebra). A Kleene algebra $(A,+,\cdot,^*,\mathbf{0},\mathbf{1})$ is an idempotent semiring with a unary operator ‘$^*$’ satisfying axioms (1)–(13) in Table 1. Partial order $\leq$ is induced by ‘$+$’ as $\alpha\leq\beta\Leftrightarrow \alpha+\beta=\beta$.
Well-known examples of Kleene algebras include the algebra of binary relations over a set, the set of all languages over an alphabet, and the $(min,+)$-algebra, also known as the tropical algebra, defined over the non-negative reals extended with a $+\infty$ constant as $(\mathbb{R}_+\cup\{+\infty\},\min,+,{}^*,+\infty,0)$, where $x^*=0$ for every $x$.
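For concreteness, a minimal executable sketch of the tropical algebra is given below; the encoding, floats extended with float('inf') and $x^*=0$, is an assumption of the sketch rather than a prescribed implementation.

```python
# Sketch of the (min,+) (tropical) Kleene algebra: Kleene '+' is min,
# Kleene '.' is numeric addition, the constants 0 and 1 are +infinity and 0,
# and x* = 0 for every non-negative x.
INF = float("inf")

def k_plus(x, y):   # choice
    return min(x, y)

def k_dot(x, y):    # composition
    return x + y

def k_star(x):      # iteration: infimum over n of n*x, which is 0 for x >= 0
    return 0.0

print(k_plus(3.0, INF) == 3.0)   # INF is the identity of '+'
print(k_dot(2.0, 0.0) == 2.0)    # 0 is the identity of '.'
print(k_star(5.0))               # 0.0
# Note: the induced order (x <= y iff x + y = y) is the reverse numeric order.
```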
Extending this definition with a multiplication ‘$\times$’ to capture the synchronous execution of actions leads to the notion of a synchronous Kleene algebra (SKA), introduced in Prisacariu (Reference Prisacariu2010).
Definition 2. (SKA). Let B be a set of labels. A SKA is a Kleene algebra extended with an operator ‘$\times$’, i.e. a tuple $(A,+,\cdot,\times,{}^*,\mathbf{0},\mathbf{1},B)$,
where $B\subset A$, satisfying the axioms in Table 1.
Following a common practice, we write a b, rather than $a\cdot b$, for a, b $\in B$. Note that axiom (18) applies only to elements of B, rather than to arbitrary actions in A. This comes from the fact that such a property, while intuitive for atomic actions, is not so, or even desirable, for an arbitrary action in A. Consider, for example, action $(a+b)\times(a+b)$, whose execution may result in $a\times b$ by choosing a from the first operand and b from the second. However, by the axiomatisation above, we have
Moreover, axiom (21) provides an exchange-like rule to describe the interaction between elements in $B^\times$ and A. The restriction to actions in $B^\times$ relates to the synchrony model, describing the parallelism of sequences of actions by concatenating small synchronous steps.
We call synchronous regular expressions the terms of a SKA, i.e. the terms generated by the grammar
\[\alpha \;::=\; a \mid \mathbf{0} \mid \mathbf{1} \mid \alpha+\alpha \mid \alpha\cdot\alpha \mid \alpha\times\alpha \mid \alpha^*\]
where a is an atomic action, the atomic actions constituting the set B. Actions $\alpha_\times$, $\beta_\times$ are built from B only with the operator ‘$\times$’, constituting the set $B^\times$ (e.g. a, $a\times b\in B^{\times}$, but $a+b$, $a\times b+c$, $\mathbf{0}$, $\mathbf{1}\notin B^{\times}$). The set of synchronous regular expressions will be denoted by Sreg.
If synchronous execution of actions is captured as above, vagueness, on the other hand, requires the consideration of transitions weighted by elements of a complete Heyting algebra.
Definition 3. (Complete Heyting algebra). A Heyting algebra is a bounded distributive lattice
with join ‘$\vee$’ and meet ‘$\wedge$’ operators, least ‘0’ and greatest ‘1’ elements, equipped with a binary operator ‘$\to$’ that is right adjoint to ‘$\wedge$’. Some axioms are listed in Table 2.
$\mathcal{H}$ is a complete Heyting algebra (CHA) iff it is complete as a lattice, therefore entailing the existence of arbitrary suprema. The usual precedence of the operators, with ‘$^*$’ having the highest precedence, then ‘$;$’, ‘$\times$’, and finally ‘$+$’, will be assumed.
Let us briefly revisit some properties of this structure that will later be used in proofs. Completeness ensures that all suprema exist when characterising the operators ‘$\cdot$’, ‘$\times$’ and ‘$^*$’ on $\mathcal{H}$-synchronous languages as (possibly) infinite sums. Let us denote by $\bigvee$, $\bigwedge$ the distributed versions of the associative operators ‘$\vee$’ and ‘$\wedge$’, respectively, and by I a (possibly infinite) index set. Axiom (25) ensures that the meet distributes, on both sides, over arbitrary suprema, i.e.
\[x\wedge \Big(\bigvee_{i\in I} y_i\Big) \;=\; \bigvee_{i\in I}\,(x\wedge y_i) \qquad\text{and}\qquad \Big(\bigvee_{i\in I} y_i\Big)\wedge x \;=\; \bigvee_{i\in I}\,(y_i\wedge x)\]
Instances of a complete Heyting algebra are enumerated in the following examples.
Example 1.2 ($\mathbf{2}$, the Boolean algebra) A first example is the well-known binary structure $\mathbf{2} = (\{\top,\bot\},\vee,\wedge,\bot,\top,\to)$,
with the standard interpretation of Boolean connectives.
Example 1.3 A second example is the three-valued Gödel chain, which introduces an explicit denotation u for ‘unknown’ (or ‘undefined’). $\mathbf{3} = (\{\top, u, \bot\},\vee,\wedge,\bot,\top,\to)$, where join and meet are taken with respect to the linear order $\bot\leq u\leq \top$, and $x\to y=\top$ if $x\leq y$ and $x\to y=y$ otherwise.
Example 1.4 (Gödel algebra) Another example is given by the standard Gödel algebra $\mathbf{G} = ([0,1],\max,\min,0,1,\to)$, where $x\to y=1$ if $x\leq y$ and $x\to y=y$ otherwise.
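Since the Gödel algebra underlies most of the examples below, a minimal computational sketch may help; it assumes plain floating-point values in [0,1] and checks the residuation (adjunction) property on a small grid.

```python
# Sketch of the standard Goedel algebra G = ([0,1], max, min, 0, 1, ->).

def join(x, y):          # lattice join
    return max(x, y)

def meet(x, y):          # lattice meet
    return min(x, y)

def implies(x, y):       # Goedel residuum, right adjoint to meet
    return 1.0 if x <= y else y

# Adjunction check: meet(x, z) <= y  iff  z <= implies(x, y)
grid = [i / 4 for i in range(5)]
assert all((meet(x, z) <= y) == (z <= implies(x, y))
           for x in grid for y in grid for z in grid)
```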
2 $\mathcal{H}$-Synchronous Languages
In order to capture both synchronous execution and vagueness in transitions, their interpretation is made over synchronous languages with embedded weights. The latter are taken, as explained above, from a complete Heyting algebra $\mathcal{H}$. In a sense, this generalises the work of Prisacariu (Reference Prisacariu2010) which considers non-weighted, but synchronous languages. A number of operators over these languages, referred to as $\mathcal{H}$-synchronous languages, are introduced below, structuring this domain as a SKA, parametric on the set of weights.
Definition 4. ($\mathcal{H}$-synchronous languages) Let B be a set of symbols and $\mathcal{H}$ a complete Heyting algebra over a carrier H. $\mathcal{H}$-synchronous actions are pairs associating a non-empty set of symbols in B to a weight in H. Formally, $\Sigma= \mathcal{P}_{ne}(B) \times H\setminus\{\mathbf{0}\}$, where $\mathcal{P}_{ne}(X)$ denotes the non-empty powerset of X. For each action, functions ${{b}: {\Sigma} \longrightarrow {\mathcal{P}_{ne}(B)}}$ and ${{h}: {\Sigma} \longrightarrow {H\setminus\{\mathbf{0}\}}}$, denote the corresponding projections. $\mathcal{H}$-synchronous words are elements of $\Sigma^*$. $\mathcal{H}$-synchronous languages are sets of such words, i.e. elements of $\mathcal{P}(\Sigma^*)$.
The weight of a word is computed by
\[hs(x_1 x_2\cdots x_n)\;=\;h(x_1)\wedge h(x_2)\wedge \cdots \wedge h(x_n).\]
Clearly, $hs(\varepsilon) = \mathbf{1}$, for $\varepsilon$ the empty string.
As an illustration, consider a finite set of labels $B=\{a,b\}$ and take the Gödel algebra $\mathbf{G}$, from Example 1.4, as a domain for weights. Thus, representing a sequence by the juxtaposition of its elements, $hs ((\{a\},0.6)\, (\{a,b\},0.5)) = 0.6\wedge 0.5 =0.5$. In this way, one may turn a language $\mathcal{L} \in \mathcal{P}(\Sigma^*)$ into a function from synchronous words to weights through a translation function
such that
Of course, t is not injective; thus, the two characterisations of a language, i.e. as an element of $\mathcal{P}(\Sigma^*)$ or of $H^{\mathcal{P}(B)^*}$, are not isomorphic. Function hs is particularly relevant to state, as we will do later, that an automaton recognises a word w if it does so with a weight equal to or greater than hs(w).
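A small executable sketch of hs over the Gödel algebra follows; the encoding of actions as (label set, weight) pairs is an assumption of the sketch.

```python
# hs of a word is the meet (min, in the Goedel algebra) of its action weights;
# hs of the empty word is 1.
from functools import reduce

def hs(word):
    return reduce(lambda acc, action: min(acc, action[1]), word, 1.0)

w = ((frozenset({"a"}), 0.6), (frozenset({"a", "b"}), 0.5))
print(hs(w))   # 0.5, as in the example above
print(hs(()))  # 1.0
```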
The standard operators from regular language theory can be defined over $\mathcal{H}$-synchronous languages, as follows.
Definition 5. The following operations are defined over $\mathcal{H}$-synchronous languages $\mathcal{L}$, $\mathcal{L}_1$, $\mathcal{L}_2$, for any complete Heyting algebra $\mathcal{H}$:
• $\varnothing = \emptyset$ (the empty language)
• $\mathsf{1} = \{\varepsilon\}$ (the language containing only the empty string)
• $\mathcal{L}_1 + \mathcal{L}_2 = \mathcal{L}_1 \cup \mathcal{L}_2$
• $\mathcal{L}_1 \cdot \mathcal{L}_2 = \{uv \mid u\in \mathcal{L}_1, v\in \mathcal{L}_2\}$
• $\mathcal{L}_1 \times \mathcal{L}_2 = \{u\times v \mid u\in \mathcal{L}_1, v\in \mathcal{L}_2\}$, where $u\times v$ is defined by
- $u\times \varepsilon=u=\varepsilon\times u$
- $u\times v=(b(x)\cup b(y), h(x)\wedge h(y) )(u'\times v')$, where $u=xu'$ and $v=yv'$
• $\mathcal{L}^*$ is the least fixed point of equation $X = \mathsf{1} + \mathcal{L} \cdot X$.
With respect to the product of languages, note that $\{a,b\}\in \mathcal{L}_1\times \mathcal{L}_2$ may correspond to any of the following situations: $\{a\}\in \mathcal{L}_1$ and $\{b\}\in \mathcal{L}_2$, $\{b\}\in \mathcal{L}_1$ and $\{a\}\in \mathcal{L}_2$, or, finally, $\{a,b\}$ belongs just to one of the languages, and $\varepsilon$ to the other. Note also that if $a\in \mathcal{L}_1$ and $bc\in \mathcal{L}_2$, then $\{a,b\}c \in \mathcal{L}_1\times \mathcal{L}_2$.
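The clause-by-clause definition of $u\times v$ can be sketched as follows (Gödel weights, with the tuple-based encoding assumed), reproducing the observation that $a\in \mathcal{L}_1$ and $bc\in \mathcal{L}_2$ yield $\{a,b\}c$.

```python
# Synchronous product of two H-synchronous words: merge actions position by
# position, joining label sets and taking the meet (min) of weights; a word
# multiplied by the empty word is left unchanged.

def word_product(u, v):
    if not u:
        return v
    if not v:
        return u
    (bx, hx), (by, hy) = u[0], v[0]
    return ((bx | by, min(hx, hy)),) + word_product(u[1:], v[1:])

u = ((frozenset({"a"}), 0.6),)
v = ((frozenset({"b"}), 0.5), (frozenset({"c"}), 0.9))
print(word_product(u, v))
# ((frozenset({'a', 'b'}), 0.5), (frozenset({'c'}), 0.9))  i.e. {a,b}c
```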
Definition 6. (Atomic languages). Let B be a set of symbols, $\mathcal{H}$ a complete Heyting algebra over a carrier H, and $\Sigma= \mathcal{P}_{ne}(B) \times H\setminus\{\mathbf{0}\}$ a set of synchronous actions. The set of atomic actions of $\Sigma$ is given by $\Sigma_0=\{a\in \Sigma \mid |b(a)|=1\}$. For any atomic action $a\in \Sigma_0$, the language $\mathcal{L}_a =\{a\}$ is called an atomic language. We denote by $\mathcal{B}_\Sigma$ the class of atomic languages of $\Sigma$ and by $\mathcal{B}^\times_\Sigma$ the $\times$-closure of $\mathcal{B}_\Sigma$. We also use $\mathcal{L}_\Sigma$ to denote the class of the languages of $\Sigma$.
Theorem 1. Let B be a set of symbols, $\mathcal{H}$ a complete Heyting algebra over a carrier H, and $\Sigma= \mathcal{P}_{ne}(B) \times H\setminus\{\mathbf{0}\}$ a set of synchronous actions. The structure $(\mathcal{L}_\Sigma,+,\cdot,\times,{}^*,\varnothing,\mathsf{1},\mathcal{B}_\Sigma)$
defines a SKA.
Proof. We detail the verification of axioms (1), (18), (20) and (21), making repeated use of Definition 5. The remaining cases follow a similar argument. For axiom (1) observe:
Regarding axiom (18), consider the atomic language $\mathcal{L}_a$. We have
For axiom (20),
For axiom (21), consider the $\Sigma$-languages $\mathcal{L}_1^\times$, $\mathcal{L}_2^\times\in \mathcal{B}^\times$. Then,
Similarly to the homomorphism used to interpret Sreg as synchronous sets (Prisacariu Reference Prisacariu2010), we define a map to interpret actions $\alpha\in \mathrm{Sreg}$ as $\mathcal{H}$-synchronous languages.
Definition 7. (Sreg-interpretation). The function $I:\mathrm{Sreg}\to \mathcal{P}(\Sigma^*)$, called a Sreg-interpretation, is defined as follows:
• $I(a)=\mathcal{L}_{a}$, for $a\in {B\times H}$
• $I(\mathbf{0})=\varnothing$
• $I(\mathbf{1})=\mathsf{1}$
• $I(\alpha+\beta)=I(\alpha)\cup I(\beta)$
• $I(\alpha\cdot\beta)=I(\alpha)\cdot I(\beta)$
• $I(\alpha\times\beta)=I(\alpha)\times I(\beta)$
• $I(\alpha^*)=I(\alpha)^*$
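To make the interpretation concrete, the following sketch (Gödel weights, all encodings assumed) computes $I(a\cdot(b\times c))$ for atomic actions a, b and c carrying weights $0.6$, $0.5$ and $0.9$, using the language operations of Definition 5.

```python
# Interpretation of a small Sreg term through the language operations of
# Definition 5: concatenation and synchronous product.

def word_product(u, v):
    if not u: return v
    if not v: return u
    (bx, hx), (by, hy) = u[0], v[0]
    return ((bx | by, min(hx, hy)),) + word_product(u[1:], v[1:])

def lang_cat(L1, L2):    # L1 . L2
    return {u + v for u in L1 for v in L2}

def lang_sync(L1, L2):   # L1 x L2
    return {word_product(u, v) for u in L1 for v in L2}

a = {((frozenset({"a"}), 0.6),)}
b = {((frozenset({"b"}), 0.5),)}
c = {((frozenset({"c"}), 0.9),)}
print(lang_cat(a, lang_sync(b, c)))
# {((frozenset({'a'}), 0.6), (frozenset({'b', 'c'}), 0.5))}
```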
3 $\mathcal{H}$-Automata
This section presents the automata construction for $\mathcal{H}$-synchronous languages. First, we define a class of automata on top of a complete Heyting algebra $\mathcal{H}$, referred to as $\mathcal{H}$-automata. An appropriate notion of a synchronous product for these automata is then presented.
Definition 8. ($\mathcal{H}$-Automata). Let $\mathcal{H}$ be a complete Heyting algebra and B a set of symbols. An $\mathcal{H}$-automaton is a tuple $\mathcal{M}=(S,\Sigma,s_0,F,\delta)$
where:
• S is a finite set of states;
• $\Sigma= \mathcal{P}_{ne}(B) \times H$ is the input alphabet;
• $s_0 \in S$ is the initial state;
• $F\subseteq S$ is the set of final states;
• $\delta:S\times \Sigma\times S\to H$ is the transition function.
Intuitively, $\delta(s_1,x,s_2)$, for $x\in \Sigma$, can be interpreted as the truth degree of ‘input x causing a transition from $s_1$ to $s_2$’. In a graphical representation of a $\mathcal{H}$-automaton, the weight of a transition from $s_1$ to $s_2$ caused by an action a is represented explicitly as follows:
The transition function can be inductively extended to sequences $\Sigma^*$ by defining $\delta^*:S\times\Sigma^*\times S\to H$ such that, for any $s_1, s_2\in S$,
and, for any $s_1,s_2\in S, w\in \Sigma^*$ and $x\in \Sigma$,
Clearly, for any states $s_1, s_2\in S$ and any word $w\in \Sigma^*$, $\delta^*(s_1,w,s_2)$ can be interpreted as the truth degree of ‘word w causes a transition from $s_1$ to $s_2$’.
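A sketch of this inductive extension is given below, instantiated over the Gödel algebra; the concrete clauses, the empty word relating a state only to itself and intermediate states resolved by a join of meets, are an assumption in the spirit of the description above.

```python
# delta*(s1, w, s2): truth degree of 'word w causes a transition from s1 to s2'.

def delta_star(delta, states, s1, word, s2):
    if not word:                       # empty word: stay in place
        return 1.0 if s1 == s2 else 0.0
    x, rest = word[0], word[1:]
    return max((min(delta(s1, x, s), delta_star(delta, states, s, rest, s2))
                for s in states), default=0.0)

# A two-state automaton with a single transition s0 --a/0.7--> s1:
states = {"s0", "s1"}
def delta(s, x, t):
    return 0.7 if (s, x, t) == ("s0", "a", "s1") else 0.0

print(delta_star(delta, states, "s0", ("a",), "s1"))  # 0.7
print(delta_star(delta, states, "s0", (), "s0"))      # 1.0
```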
A recognising function for a particular automaton $\mathcal{M}$ succeeds in recognising a word if, for each label $x\in \Sigma$ appearing in the word, the weight associated to the corresponding transition $\delta(s_1,x,s_2)$ is such that $h(x)\leq \delta(s_1,x,s_2)$. Formally,
Definition 9. (Recognising function). Let $\mathcal{H}$ be a complete Heyting algebra and $\mathcal{M}=(S,\Sigma,s_0,F,\delta)$ an $\mathcal{H}$-automaton. The recognising function for $\mathcal{M}$, $\rho_{\mathcal{M}}:S\times\Sigma^*\times S\to H$, is recursively defined by
and
Definition 10. Let $\mathcal{H}$ be a complete Heyting algebra, $\mathcal{M}=(S,\Sigma,s_0,F,\delta)$ be an $\mathcal{H}$-automaton, and $\rho_{\mathcal{M}}$ a recognising function for $\mathcal{M}$. The $\mathcal{H}$-synchronous language recognised by $\mathcal{M}$ is defined as follows:
Theorem 2. Let $\mathcal{H}$ be a complete Heyting algebra, $\mathcal{M}=(S,\Sigma,s_0,F,\delta)$ be an $\mathcal{H}$-automaton, and $\rho_{\mathcal{M}}$ a recognising function for $\mathcal{M}$ and $w\in \Sigma^*$. Then,
Proof. First observe that from the definitions of $\rho_{\mathcal{M}}$ and hs, for any $s,s'\in S$ and $w\in \Sigma^*$, either $\rho_{\mathcal{M}}(s,w,s')=0$ or $\rho_{\mathcal{M}}(s,w,s')\geq hs(w)$. For the converse direction, since $\rho_{\mathcal{M}}(s,w,s')\geq hs(w)$, we have $\rho_{\mathcal{M}}(s,w,s')\geq 0$ and hence $w\in \mathcal{L}(\mathcal{M})$.
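The following self-contained sketch illustrates one plausible recognising function, under the assumption, suggested by the informal description above and by Theorem 2, that a word is recognised between two states exactly when each action weight is dominated by the weight of a matching transition; weights along a path are combined by the meet and alternative paths by the join (Gödel algebra assumed).

```python
from functools import reduce

def hs(word):
    return reduce(lambda acc, action: min(acc, action[1]), word, 1.0)

def rho(delta, states, s, word, t):
    """Recognising degree of word between states s and t."""
    if not word:
        return 1.0 if s == t else 0.0
    (labels, weight), rest = word[0], word[1:]
    return max((min(weight, rho(delta, states, v, rest, t))
                for v in states if weight <= delta(s, labels, v)), default=0.0)

# One transition labelled with {'a'} and weight 0.7:
states = {"s0", "s1"}
def delta(s, labels, t):
    return 0.7 if (s, labels, t) == ("s0", frozenset({"a"}), "s1") else 0.0

w = ((frozenset({"a"}), 0.6),)                      # 0.6 <= 0.7: recognised
print(rho(delta, states, "s0", w, "s1") >= hs(w))   # True
```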
We end this section by defining and exemplifying a notion of synchronous product of $\mathcal{H}$-automata, which is the automata counterpart of synchronous composition in Sreg. It is based on the parallel product of labelled transition systems with shared actions. Formally,
Definition 11. (Synchronous product of $\mathcal{H}$-automata). Let $\mathcal{M}_\alpha=(S^\alpha,{\Sigma^{\alpha}},s^\alpha_0,F^\alpha,\delta^\alpha)$ and $\mathcal{M}_\beta=(S^\beta,{\Sigma^{\beta}},s^\beta_0,F^\beta,\delta^\beta)$ be two $\mathcal{H}$-automata. Let ${\Sigma^{\alpha\times\beta}}={\Sigma^{\alpha}}\cup{\Sigma^{\beta}}\cup({\Sigma^{\alpha}}\times{\Sigma^{\beta}})$, with
The synchronous product of $\mathcal{M}_\alpha$ and $\mathcal{M}_\beta$ is the $\mathcal{H}$-automaton
whose transition function
is defined by, for any $p\in \Sigma^{\alpha\times\beta}$,
As discussed in the Introduction, $\mathcal{H}$-automata provide a suitable semantic structure for FAS programs. Let us illustrate such a potential through the discussion of two concrete examples.
Example 3.1. Our first example, already mentioned in the introduction as Example 1.1, is that of a conditional in FAS which involves the simultaneous execution of its branches. An intuitive metaphor for this behaviour is that of a pipe, as depicted in Figure 3. The liquid, represented by blue arrows, reaches a point where it flows through both channels in parallel (capturing simultaneity), with different volumes going through each channel, represented by the different thicknesses of the arrows (representing the different truth degrees that model vagueness).
The execution of this program, which involves the multiplication of the values assigned to variable medicine in different branches, may lead to two distinct outcomes:
(A) The branches remain separated, and further instructions are executed in parallel. The information from the different branches is taken into account by the user;
(B) The information is combined, which results in a single (crisp) output.
Option (B) enforces the consolidation of multiple variable values, which is achieved through the instruction aggregate, as in the program:
This behaviour can be modelled by the synchronous product of the automata, assuming, for illustration purposes, that the weights of the two branches are $0.58$ and $0.50$, respectively. We also assume that weights are taken from the Gödel algebra (Example 1.4).
Since we are assuming that the truth degrees associated to the evaluation of both branches are strictly positive, actions $\texttt{medicine:=5}$ and $\texttt{medicine:=0}$ run in parallel. Formally, the two automata are combined through ‘$\times$’ giving rise to
where $x_1 = \big(\{\texttt{medicine:=5}, \texttt{medicine:=0}\}, \delta\big((s_1,s_3),\{ \texttt{medicine:=5}, \texttt{medicine:=0}\},(s_2,s_4)\big)\big)$ and $x_2 = (s_2,s_4)$.
The truth degree associated to the aggregated variable medicine after execution depends on the truth degrees of both branches of the conditional, which correspond to the second projections of the actions $(\texttt{medicine:=5}, \delta^{\texttt{medicine:=5}}(s_1,\texttt{medicine:=5},s_2))$ and $(\texttt{medicine:=0}, \delta^{\texttt{medicine:=0}}(s_3,\texttt{medicine:=0},s_4))$. Choosing the minimum function as the aggregation operator leads to the following computation:
\[0.58 \wedge 0.50 \;=\; \min\{0.58,\,0.50\} \;=\; 0.50.\]
Example 3.2. As a second example, consider an excerpt of a FAS program representing a control system intended to adjust the peak inspiratory pressure (PIP) of a patient depending on her levels of $O_2$ and $CO_2$ after a cardiac surgery (de Bruin et al. Reference de Bruin, Schuh, Rappelsberger, Adlassnig, Kacprzyk, Szmidt, Zadrozny, Atanassov and Krawczak2018).
The set of labels in this example is $B=\{\texttt{PIP:=5}, \texttt{PIP:=2}, \texttt{PIP:=0}\}$. Suppose that the truth degrees of the predicates in each of the branches of the conditional are $0.4$, $0.2$ and $0.6$, respectively, and again assume the Gödel algebra as the domain for weights. The three branches of the conditional are modelled by the automata below
Again, the values of the three predicates are strictly positive and thus the three branches of the conditional are executed in parallel, corresponding to action $\texttt{PIP:=5}\times \texttt{PIP:=2}\times \texttt{PIP:=0}$. Operator ‘$\times$’ being associative, such behaviour is modelled by the synchronous product of the three automata above, yielding
An intermediate step is represented in Figure 4, with $s'=(s_1,s_3)$, $s''=(s_2,s_4)$, and $(x_1,x_2)$ abbreviating $\big(\{\texttt{PIP:=5},\texttt{PIP:=2}\},\delta^{\texttt{PIP:=5}\times \texttt{PIP:=2}}\big((s_1,s_3),\{\texttt{PIP:=5},\texttt{PIP:=2}\},(s_2,s_4)\big)\big)$. Similarly, $(x_3,x_4)$ abbreviates
$\big(\{\texttt{PIP:=5},\texttt{PIP:=2},\texttt{PIP:=0}\},\delta^{\texttt{PIP:=5}\times \texttt{PIP:=2}\times \texttt{PIP:=0}}\big((s',s_5),\{\texttt{PIP:=5},\texttt{PIP:=2},\texttt{PIP:=0}\},(s'',s_6)\big)\big)$
The truth degree corresponding to the combined values taken by variable PIP depends on three other truth degrees: the second projections of $(\texttt{PIP:=5}, \delta^{\texttt{PIP:=5}}(s_1,\texttt{PIP:=5},s_2))$, $(\texttt{PIP:=2}, \delta^{\texttt{PIP:=2}}(s_3,\texttt{PIP:=2},s_4))$ and $(\texttt{PIP:=0}, \delta^{\texttt{PIP:=0}}(s_5,\texttt{PIP:=0},s_6))$. It is computed as follows:
\[0.4 \wedge 0.2 \wedge 0.6 \;=\; \min\{0.4,\,0.2,\,0.6\} \;=\; 0.2.\]
4 A Kleene Theorem for $\mathcal{H}$-Synchronous Languages
This section establishes a Kleene theorem for $\mathcal{H}$-automata and $\mathcal{H}$-synchronous languages. Proceeding in this direction, however, entails showing that, as in the classical case, the introduction of non-determinism and of transitions labelled by the empty string does not compromise the expressiveness of finite $\mathcal{H}$-automata. Such is the aim of the following subsection.
4.1 $\mathcal{H}$-Automata with $\varepsilon$-moves
In standard finite automata theory, it is well known that the introduction of non-determinism and the presence of $\varepsilon$-moves, i.e. spontaneous transitions labelled by the empty word, do not change the expressiveness of finite automata: given a non-deterministic automaton with $\varepsilon$-moves, there is a standard procedure to build a finite deterministic automaton recognising exactly the same language (see e.g. Hopcroft and Ullman Reference Hopcroft and Ullman1979).
This subsection develops an analogous result for $\mathcal{H}$-automata. First, we notice that non-determinism is inherent in the very definition of $\mathcal{H}$-automata. For example, the non-deterministic transition $\delta(s,a)=\{w,v\}$ can be represented in an $\mathcal{H}$-automaton by $\delta(s,a,w)=1$ and $\delta(s,a,v)=1$. Of course, it is also easy to characterise the class of finite deterministic automata as the subclass of $\mathcal{H}$-automata such that, for each $s,v,w \in S$ and for any symbol a, if $\delta(s,a,v)=1=\delta(s,a,w)$ then $v=w$. This clarified, let us consider the effect of $\varepsilon$-moves.
Definition 12. ($\mathcal{H}$-Automata with $\varepsilon$-moves). Let $\mathcal{H}$ be a complete Heyting algebra and B a set of symbols. An $\mathcal{H}$-automaton with $\varepsilon$-moves, an $\varepsilon\mathcal{H}$-automaton for short, is a tuple $\mathcal{E}=(S,\Gamma,s_0,F,\delta)$
where
• S is a finite set of states;
• $\Gamma\subseteq \mathcal{P}(B) \times H$ is such that, for any $a\in \Gamma$, if $b(a)=\emptyset$ then $h(a)=1$ (by a slight abuse of notation, the empty set of symbols will be represented by $\varepsilon$, originating transitions $(\varepsilon,1)$);
• $s_0 \in S$ is the initial state;
• $F\subseteq S$ is the set of final states;
• $\delta:S\times \Gamma\times S\to H$ is the transition function such that
- for any $s\in S$, $\delta(s,\varepsilon,s)=1$;
- for any $s,s'\in S$, $\delta(s,\varepsilon,s')=1$ or $\delta(s,\varepsilon,s')=0$.
Definition 13. The language recognised by an $\varepsilon\mathcal{H}$-automaton $\mathcal{E}=(S,\Gamma,s_0,F,\delta)$ is given by
where
with
for any $a\in \Gamma\setminus\{(\varepsilon,1)\}$.
Definition 14. Let $\mathcal{E}=(S,\Gamma,s_0,F,\delta)$ be an $\varepsilon\mathcal{H}$-automaton with $\Gamma\subseteq \mathcal{P}(B) \times H$. The $\varepsilon$-closure of $\mathcal{E}$ is the $\mathcal{H}$-automaton $\hat{\mathcal{E}}=(\hat{S},\Sigma,\hat{s}_0,\hat{F},\hat{\delta})$
where
• $\hat{S}=\{\hat{v} \mid v\in S\}$, where $\hat{v}=\{w \mid \delta^*(v,\varepsilon, w)=1\}$
• $\Sigma=\Gamma \setminus \{(\varepsilon,1)\}$
• $\hat{F}=\{P\in \hat{S}\mid P\cap F \neq \emptyset\}$
• for any $\hat{s},\hat{v}\in \hat{S}$ and $a\in\Sigma$, $\hat{\delta}(\hat{s},a,\hat{v})=\bigvee_{s\in\hat{s},v\in\hat{v}} \delta^{\varepsilon}(s,a,v)$, where
\[\delta^{\varepsilon}(s,a,v)=\bigvee_{s_1,s_2\in S} \big(\delta^*(s,\varepsilon^*,s_1) \wedge \delta(s_1,a,s_2)\wedge \delta^*(s_2,\varepsilon^*,v)\big)\]
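Operationally, the formula above says: move by $\varepsilon$-transitions, take one labelled transition, and move by $\varepsilon$-transitions again, resolving all choices by the join. The sketch below (Gödel weights, encodings assumed) computes $\delta^{\varepsilon}$ and $\hat{\delta}$ directly from that formula.

```python
# delta_star_eps(s, t) is expected to return 1.0 if t is epsilon-reachable
# from s and 0.0 otherwise (see Definition 12).

def delta_eps(delta, delta_star_eps, states, s, a, v):
    return max((min(delta_star_eps(s, s1), delta(s1, a, s2), delta_star_eps(s2, v))
                for s1 in states for s2 in states), default=0.0)

def delta_hat(delta, delta_star_eps, states, s_hat, a, v_hat):
    return max((delta_eps(delta, delta_star_eps, states, s, a, v)
                for s in s_hat for v in v_hat), default=0.0)

# Example: s0 --eps--> s1 --a/0.7--> s2
states = {"s0", "s1", "s2"}
def delta(s, a, t):
    return 0.7 if (s, a, t) == ("s1", "a", "s2") else 0.0
def delta_star_eps(s, t):
    return 1.0 if s == t or (s, t) == ("s0", "s1") else 0.0

print(delta_hat(delta, delta_star_eps, states, {"s0", "s1"}, "a", {"s2"}))  # 0.7
```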
Theorem 3. Let $\mathcal{E}=(S,\Gamma,s_0,F,\delta)$ be an $\varepsilon\mathcal{H}$-automaton. Then $\mathcal{L}^{\varepsilon}(\mathcal{E})=\mathcal{L}(\hat{\mathcal{E}})$.
Proof. First, observe that, for any $a\in \Gamma \setminus \{(\varepsilon,1)\}$ and for all $s,v\in S$,
since
Then, the result follows by induction on the structure of words. For a basic word $a\in \Gamma\setminus\{(\varepsilon,1)\}$,
For composed words $aw\in (\Gamma\setminus\{(\varepsilon,1)\})^*$,
4.2 The theorem
The setting is now ready to establish a Kleene theorem for $\mathcal{H}$-automata and $\mathcal{H}$-synchronous languages. Thus, for any synchronous regular expression $\alpha\in \mathrm{Sreg}$, we will provide a method to build an $\varepsilon\mathcal{H}$-automaton (translatable into an $\mathcal{H}$-automaton, as discussed above) $\mathcal{M}_\alpha$ such that $I(\alpha) = \mathcal{L}(\mathcal{M}_\alpha)$.
For regular expressions built from atomic actions $a\in\Sigma= \mathcal{P}_{ne}(B) \times H$ without resorting to the synchronous product operator, the construction follows the classical recipe, as presented e.g. in Hopcroft and Ullman (Reference Hopcroft and Ullman1979). This is then extended to synchronous regular expressions, by generalising a construction in Prisacariu (Reference Prisacariu2010) for the synchronous operator ‘$\times$’.
Theorem 4. For any $\alpha\in \mathrm{Sreg}$, there exists an $\mathcal{H}$-automaton $\mathcal{M}_\alpha$ such that $I(\alpha)=\mathcal{L}(\mathcal{M}_\alpha)$.
Proof. The automata corresponding to $a\in \Sigma$, $\mathbf{0}$ and $\mathbf{1}$, denoted, respectively, by $\mathcal{M}_a$, $\mathcal{M}_\mathbf{0}$ and $\mathcal{M}_\mathbf{1}$, are depicted in Figure 5. From Definitions 10 and 7, observe that $I(a)=\mathcal{L}_a=\mathcal{L}_{\mathcal{M}_a}$, $I(\mathbf{0})=\{\}=\varnothing=\mathcal{L}_{\mathcal{M}_{\mathbf{0}}}$ and that $I(\mathbf{1}) = \{\varepsilon\}=\mathcal{L}_{\mathcal{M}_{\mathbf{1}}}$. Then, assuming there exist automata for arbitrary regular actions $\alpha$ and $\beta$, we inductively build an $\varepsilon\mathcal{H}$-automaton for the Sreg expressions $\alpha+\beta$, $\alpha\cdot\beta$, $\alpha^*$ and $\alpha\times\beta$. The resulting $\varepsilon\mathcal{H}$-automata, denoted by $\mathcal{E}_{\alpha+\beta}$, $\mathcal{E}_{\alpha\cdot\beta}$, $\mathcal{E}_{\alpha^*}$ and $\mathcal{E}_{\alpha \times \beta}$, are depicted in Figures 6, 7, 8 and 9, respectively. Clearly, Definition 13 entails $I(\alpha+\beta)=\mathcal{L}^\varepsilon(\mathcal{E}_{\alpha+\beta})$, $I(\alpha\cdot\beta)=\mathcal{L}^\varepsilon(\mathcal{E}_{\alpha\cdot\beta})$, $I(\alpha^*)=\mathcal{L}^\varepsilon(\mathcal{E}_{\alpha^*})$ and $I(\alpha\times\beta)=\mathcal{L}^\varepsilon(\mathcal{E}_{\alpha\times\beta})$. Then, by Theorem 3, we conclude that $I(\alpha+\beta)=\mathcal{L}(\hat{\mathcal{E}}_{\alpha+\beta})$, $I(\alpha\cdot\beta)=\mathcal{L}(\hat{\mathcal{E}}_{\alpha\cdot\beta})$, $I(\alpha^*)=\mathcal{L}(\hat{\mathcal{E}}_{\alpha^*})$ and $I(\alpha\times\beta)=\mathcal{L}(\hat{\mathcal{E}}_{\alpha\times\beta})$.
5 Conclusions
The paper introduced a new class of automata, and corresponding languages, able to capture both vagueness, through transitions weighted over a complete Heyting algebra, and synchronous execution, through a specific product operator. The work was motivated by the quest for a suitable semantic structure for FAS programs.
To model other situations, for example, in the face of a requirement to compute the number of steps involved in an execution, or the resources consumed by a computational process, exploring other structures to parametrise the construction would be a possibility. The tropical semiring $(\mathbb{R}_+\cup\{\infty\},\min,+,\infty,0)$, with $x\to y=\max\{y-x,0\}$ for all $x,y\in \mathbb{R}_+\cup\{\infty\}$, would be worth considering, although it fails idempotency and, therefore, the hypotheses of Theorem 1.
Finally, a detailed comparison with other possible semantic structures is in order. Probabilistic concurrent Kleene algebra (PCKA), introduced in McIver et al. (Reference McIver, Rabehaja, Struth, Bortolussi and Wiklicky2013), is an obvious choice. Such an approach embodies two distinct operators: the concurrency operator ‘$||$’, from concurrent Kleene algebra of Hoare et al. (Reference Hoare, Möller, Struth and Wehrman2011), to describe the parallel execution of two crisp actions, and a probabilistic choice operator ‘$\oplus$’, to capture uncertainty in the execution of actions.
For reasoning about concurrent programs with some form of uncertainty, PCKA can model Jones’ rely/guarantee style calculus with probabilistic behaviour, resorting to a probabilistic event structure semantics (McIver et al. Reference McIver, Rabehaja and Struth2016). On the other hand, SKA encodes reasoning in the style of the Owicki and Gries (Reference Owicki and Gries1976) calculus. A possible direction for future work will investigate whether and how this can be extended to the weighted case.
Conflicts of interest
The authors declare none.