diff --git a/README.md b/README.md
index 939db16b..7a6a5050 100644
--- a/README.md
+++ b/README.md
@@ -17,7 +17,7 @@ While it is ready to play with, it might not ready for production depending on y
Installation
============
-In this version, WAFL is built to run as a two-part system.
+In this version, WAFL is a two-part system.
Both can be installed on the same machine.
![The two parts of WAFL](images/two-parts.png)
@@ -65,29 +65,26 @@ Running WAFL
This document contains a few examples of how to use the `wafl` CLI.
There are four modes in which to run the system
-![The two parts of WAFL](images/wafl-commands.png)
-
-
-## wafl run-audio
+## $ wafl run-audio
This is the main mode of operation. It will run the system in a loop, waiting for the user to speak a command.
The activation word is the name defined in config.json.
The default name is "computer", but you can change it to whatever you want.
-## wafl run-server
+## $ wafl run-server
It runs a local web server that listens for HTTP requests on port 8889.
The server will act as a chatbot, executing commands and returning the result as defined in the rules.
-## wafl run-cli
+## $ wafl run-cli
This command works like the run-server command, but it listens for commands on the command line.
It does not run a web server and is useful for testing purposes.
-## wafl run-tests
+## $ wafl run-tests
This command will run all the tests defined in the file testcases.txt.
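Since `wafl run-server` exposes the chatbot over HTTP on port 8889 (per the text above), it can be queried from a short script. This is only a sketch: the `/query` path and the JSON payload shape are assumptions for illustration, not documented API details — check the server's actual interface before relying on them.

```python
import json
import urllib.request

def wafl_server_url(host="localhost", port=8889, path="/query"):
    # Port 8889 comes from the documentation above; the "/query"
    # path is a placeholder assumption.
    return f"http://{host}:{port}{path}"

def ask(text, host="localhost", port=8889):
    # Send one chat message to a running `wafl run-server` instance
    # and return the raw response body.
    payload = json.dumps({"query": text}).encode("utf-8")
    request = urllib.request.Request(
        wafl_server_url(host, port),
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(request) as response:
        return response.read().decode("utf-8")
```

For quick experiments without HTTP, `wafl run-cli` offers the same behavior directly on the command line.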
diff --git a/changelog.txt b/changelog.txt
new file mode 100644
index 00000000..a2f15855
--- /dev/null
+++ b/changelog.txt
@@ -0,0 +1,5 @@
+### v0.0.70
+
+* Added the RETRIEVE keyword to get a list of relevant items from the knowledge base
+* Added on-the-spot rule generation
+* Added mapping of lists onto queries
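The idea behind RETRIEVE — ranking stored facts by relevance to a query — can be sketched as follows. This is not WAFL's implementation (which is not shown in this changelog): a real system would likely use sentence embeddings, while this sketch uses plain word overlap to stay dependency-free.

```python
def retrieve(query, knowledge_base, top_k=3):
    # Score each stored fact by word overlap with the query and
    # return the top_k best matches.  Word overlap stands in for
    # the embedding similarity a real retriever would use.
    query_words = set(query.lower().split())
    scored = sorted(
        knowledge_base,
        key=lambda fact: len(query_words & set(fact.lower().split())),
        reverse=True,
    )
    return scored[:top_k]

facts = [
    "The kitchen light is controlled by switch 3",
    "The user's favourite colour is blue",
    "The thermostat is set to 21 degrees",
]
print(retrieve("which switch controls the kitchen light", facts, top_k=1))
# -> ['The kitchen light is controlled by switch 3']
```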
diff --git a/documentation/build/doctrees/chitchat.doctree b/documentation/build/doctrees/chitchat.doctree
deleted file mode 100644
index 056234f9..00000000
Binary files a/documentation/build/doctrees/chitchat.doctree and /dev/null differ
diff --git a/documentation/build/doctrees/directory_structure.doctree b/documentation/build/doctrees/directory_structure.doctree
index d79bfa68..6f17a3b2 100644
Binary files a/documentation/build/doctrees/directory_structure.doctree and b/documentation/build/doctrees/directory_structure.doctree differ
diff --git a/documentation/build/doctrees/environment.pickle b/documentation/build/doctrees/environment.pickle
index f31eef6e..f93f325a 100644
Binary files a/documentation/build/doctrees/environment.pickle and b/documentation/build/doctrees/environment.pickle differ
diff --git a/documentation/build/doctrees/examples.doctree b/documentation/build/doctrees/examples.doctree
index 63605e65..bd9579a5 100644
Binary files a/documentation/build/doctrees/examples.doctree and b/documentation/build/doctrees/examples.doctree differ
diff --git a/documentation/build/doctrees/index.doctree b/documentation/build/doctrees/index.doctree
index dd36c87a..6a397c8e 100644
Binary files a/documentation/build/doctrees/index.doctree and b/documentation/build/doctrees/index.doctree differ
diff --git a/documentation/build/doctrees/installation.doctree b/documentation/build/doctrees/installation.doctree
index ed580c5d..4eabba05 100644
Binary files a/documentation/build/doctrees/installation.doctree and b/documentation/build/doctrees/installation.doctree differ
diff --git a/documentation/build/doctrees/introduction.doctree b/documentation/build/doctrees/introduction.doctree
index c6f608af..35b82c28 100644
Binary files a/documentation/build/doctrees/introduction.doctree and b/documentation/build/doctrees/introduction.doctree differ
diff --git a/documentation/build/doctrees/license.doctree b/documentation/build/doctrees/license.doctree
index d1fd64b9..a9ef0d9c 100644
Binary files a/documentation/build/doctrees/license.doctree and b/documentation/build/doctrees/license.doctree differ
diff --git a/documentation/build/doctrees/rules.doctree b/documentation/build/doctrees/rules.doctree
index db30d473..ad3a0350 100644
Binary files a/documentation/build/doctrees/rules.doctree and b/documentation/build/doctrees/rules.doctree differ
diff --git a/documentation/build/doctrees/rules_and_backtracking.doctree b/documentation/build/doctrees/rules_and_backtracking.doctree
index 2e6ef0ac..220b51a9 100644
Binary files a/documentation/build/doctrees/rules_and_backtracking.doctree and b/documentation/build/doctrees/rules_and_backtracking.doctree differ
diff --git a/documentation/build/doctrees/running_WAFL.doctree b/documentation/build/doctrees/running_WAFL.doctree
index 8d838fcd..ad095642 100644
Binary files a/documentation/build/doctrees/running_WAFL.doctree and b/documentation/build/doctrees/running_WAFL.doctree differ
diff --git a/documentation/build/doctrees/wafl_init.doctree b/documentation/build/doctrees/wafl_init.doctree
index 28a3be05..f4eb3481 100644
Binary files a/documentation/build/doctrees/wafl_init.doctree and b/documentation/build/doctrees/wafl_init.doctree differ
diff --git a/documentation/build/html/_images/wafl-commands.png b/documentation/build/html/_images/wafl-commands.png
deleted file mode 100644
index 35ef3ef6..00000000
Binary files a/documentation/build/html/_images/wafl-commands.png and /dev/null differ
diff --git a/documentation/build/html/_sources/chitchat.rst.txt b/documentation/build/html/_sources/chitchat.rst.txt
deleted file mode 100644
index 88a75170..00000000
--- a/documentation/build/html/_sources/chitchat.rst.txt
+++ /dev/null
@@ -1,8 +0,0 @@
-Chitchat
-======================
-
-WAFL will improvise all conversations unless a rule is triggered.
-This is possible thanks to the LLM it is connected to.
-
-Aspirationally, the system will one day be able to generate its own rules according to the context of the conversation.
-This is not possible yet, but it is not too far off in the future.
diff --git a/documentation/build/html/_sources/directory_structure.rst.txt b/documentation/build/html/_sources/directory_structure.rst.txt
index 40c9b870..37167e75 100644
--- a/documentation/build/html/_sources/directory_structure.rst.txt
+++ b/documentation/build/html/_sources/directory_structure.rst.txt
@@ -39,5 +39,5 @@ Only the rules and facts that are included will be part of the inference.
For example, the keyword `#using facts` within greetings/ (2) will not include the folder above it.
Inference in a subfolder is limited to the rules and facts that are part of that folder or below it.
-For more information, you can have a look at the (still early) project in
+For a more complete example, you can have a look at the (still early) project in
`wafl_home `_.
\ No newline at end of file
diff --git a/documentation/build/html/_sources/examples.rst.txt b/documentation/build/html/_sources/examples.rst.txt
index 09a47008..802b35dc 100644
--- a/documentation/build/html/_sources/examples.rst.txt
+++ b/documentation/build/html/_sources/examples.rst.txt
@@ -5,7 +5,4 @@ Examples
:maxdepth: 2
wafl_init
- chitchat
- rules
- rules_and_backtracking
directory_structure
\ No newline at end of file
diff --git a/documentation/build/html/_sources/index.rst.txt b/documentation/build/html/_sources/index.rst.txt
index b2d5fe7a..d2f4b8bd 100644
--- a/documentation/build/html/_sources/index.rst.txt
+++ b/documentation/build/html/_sources/index.rst.txt
@@ -3,8 +3,8 @@
You can adapt this file completely to your liking, but it should at least
contain the root `toctree` directive.
-Welcome to WAFL's documentation!
-================================
+Welcome to WAFL's 0.0.45 documentation!
+=======================================
.. toctree::
:maxdepth: 3
@@ -13,6 +13,8 @@ Welcome to WAFL's documentation!
introduction
installation
running_WAFL
+ query_processing_pipeline
+ rules
examples
license
diff --git a/documentation/build/html/_sources/installation.rst.txt b/documentation/build/html/_sources/installation.rst.txt
index dadf8de7..2eb6156d 100644
--- a/documentation/build/html/_sources/installation.rst.txt
+++ b/documentation/build/html/_sources/installation.rst.txt
@@ -9,21 +9,21 @@ Both can be installed on the same machine.
:align: center
Interface side
----------
+--------------
The first part is local to your machine and needs to have access to a microphone and speaker.
To install it, run the following commands:
.. code-block:: bash
- sudo apt-get install portaudio19-dev ffmpeg
- pip install wafl
+ $ sudo apt-get install portaudio19-dev ffmpeg
+ $ pip install wafl
After installing the requirements, you can initialize the interface by running the following command:
.. code-block:: bash
- wafl init
+ $ wafl init
which creates a `config.json` file that you can edit to change the default settings.
A standard rule file is also created as `wafl.rules`.
@@ -33,13 +33,16 @@ Please see the examples in the following chapters.
LLM side (needs a GPU)
----------------------
-The second part is a server that runs on a machine with a public IP address.
-This last machine will need to have a GPU to run the Large Language Model at a convenient speed.
-This part can be run using a docker image by running the script
+The second part is a server that runs on a machine accessible from the interface side.
+The initial configuration is for a local deployment of language models.
+No action is needed to run WAFL if you want to run it as a local instance.
+
+However, a multi-user setup will benefit from a dedicated server.
+In this case, a docker image can be used:
.. code-block:: bash
- docker run -p8080:8080 --env NVIDIA_DISABLE_REQUIRE=1 --gpus all fractalego/wafl-llm:latest
+ $ docker run -p8080:8080 --env NVIDIA_DISABLE_REQUIRE=1 --gpus all fractalego/wafl-llm:latest
The interface side has a `config.json` file that needs to be filled with the IP address of the LLM side.
diff --git a/documentation/build/html/_sources/introduction.rst.txt b/documentation/build/html/_sources/introduction.rst.txt
index 683ba6a0..a1a6c0e9 100644
--- a/documentation/build/html/_sources/introduction.rst.txt
+++ b/documentation/build/html/_sources/introduction.rst.txt
@@ -1,8 +1,10 @@
Introduction
============
-WAFL is a framework for home assistants.
-It is designed to combine Large Language Models and rules to create a predictable behavior.
+WAFL is a framework for personal agents.
+It integrates Large Language Models, speech recognition, and text-to-speech.
+
+This framework combines Large Language Models and rules to create predictable behavior.
Specifically, instead of organising the work of an LLM into a chain of thoughts,
WAFL intends to organise its behavior into inference trees.
diff --git a/documentation/build/html/_sources/rules.rst.txt b/documentation/build/html/_sources/rules.rst.txt
index 2def6baa..86b1dba4 100644
--- a/documentation/build/html/_sources/rules.rst.txt
+++ b/documentation/build/html/_sources/rules.rst.txt
@@ -1,181 +1,11 @@
Rules
=====
-The file rules.wafl contains the rules used by the system.
-Each rule is in the following format
+Examples
+========
-.. code-block:: text
-
- Trigger condition
- action 1
- action 2
- action 3
- ...
-
-Notice that the trigger condition has no indentation, while the actions are indented by any number spaces.
-Each action returns a true or false value.
-If that value is false, the rule stops executing and the next rule is triggered.
-A demo of the rules can be found in the repository `wafl_home `_.
-
-A rule ends when the next rule is encountered (a new trigger condition is found).
-The trigger condition can be a single fact. For example
-
-.. code-block:: text
-
- This bot's name is "computer"
-
-There are two actors in the system: "the user" and "the bot".
-One simple rule example can be
-
-.. code-block:: text
-
- The user asks what is this bot's name
- SAY Hello, my name is Computer
-
-The rule above will be triggered when the user asks what is this bot's name.
-There are 7 types of actions:
-**SAY**,
-**REMEMBER**,
-**asking a question**,
-**generate a text**,
-**triggering of another rule**,
-**code execution**,
-**entailment**.
-
-
-SAY command
------------
-
-This command will make the bot say something.
-For example the rule above will make the bot say "Hello, my name is computer".
-
-REMEMBER command
-----------------
-
-This command will make the bot remember something.
-for example the rule below will make the bot remember the user's name.
-
-.. code-block:: text
-
- The user says their name is John
- REMEMBER The user's name is John
-
-Asking a question
------------------
-
-Typing a question (with or without question mark) will return a variable.
-This variable can be used later in the rule
-For example the rule below will make the bot ask the user's name.
-
-.. code-block:: text
-
- The user says their name
- name = what is the user's name?
- REMEMBER The user's name is {name}
-
-Yes/No questions return a truth condition.
-For example by using the rule below, the bot will ask the user if they want to remember their name.
-
-.. code-block:: text
-
- The user says their name
- name = what is the user's name?
- Do you want to remember the user's name?
- REMEMBER The user's name is {name}
-
-If the user says "no", the rule will stop executing and the REMEMBER command will never be used
-
-
-Generate a text
-----------------
-
-A text can be generated in a similar fashion as when asking questions
-
-.. code-block:: text
-
- The user says their name
- name = what is the user's name?
- italian_name = the italian version of {name} is
- SAY The italian version of {name} is {italian_name}
-
-The text will be generated by the line "the italian version of {name}" according to the LLM model.
-The only difference with asking question is that the text on the right hand side of `=` is a statement
-and not a question.
-
-Triggering of another rule
---------------------------
-
-A rule can trigger another rule as follows
-
-.. code-block:: text
-
- The user says their name
- name = what is the user's name?
- the name if the user is {name}
-
- The name of the user is John
- SAY Hello John!
-
-In this case the second rule is triggered if the user says their name is John.
-
-Code execution
---------------
-
-The code execution is done by using the python syntax.
-A function defined in the file `functions.py` can be called from the rule.
-
-
-For example, the file `rules.wafl` contains the following rule
-
-.. code-block:: text
-
- The user says their name
- name = what is the user's name?
- greet({name})
-
-
-and the file `functions.py` contains the following function
-
-.. code-block:: python
-
- def greet(name):
- print("Hello", name)
-
-When the user says their name, the bot will greet the user by calling the function greet with the user's name as argument.
-However print() does not activate the SAY command.
-From the `functions.py` file, a rule can be triggered by using the syntax `"% ... %"`
-
-.. code-block:: python
-
- def greet(name):
- "% SAY Hello %"
- f"% SAY your name is {name} %"
-
-The first line will make the bot say "Hello". The second line will make the bot say "your name is John" if the user's name is John.
-
-The syntax `"% ... %"`, can be used to trigger a rule, to generate a text, to ask a question, to remember something, or any other action available in the rules file.
-For example the prior function can be written as follows
-
-.. code-block:: python
-
- def greet(name):
- "% SAY Hello %"
- "% SAY your name is {name} %"
- date = "% what is the date today? %"
- "% SAY today is {date} %"
- while "% Do you want to continue? %":
- "% SAY I am happy to continue %"
-
-Entailment
-----------
-
-The entailment is done by using the :- operator. if RHS entails LHS, then LSH :- RHS is true, otherwise it is false.
-For example the rule below will stop at the second line if the user's name is not John.
-
-.. code-block:: text
-
- The user says their name
- name = what is the user's name?
- The user's name is John :- The user's name is {name}
- SAY Your name is John!
+.. toctree::
+ :maxdepth: 2
+ writing_the_rules
+ rules_and_backtracking
\ No newline at end of file
diff --git a/documentation/build/html/_sources/running_WAFL.rst.txt b/documentation/build/html/_sources/running_WAFL.rst.txt
index fb26b50e..7cfc03c0 100644
--- a/documentation/build/html/_sources/running_WAFL.rst.txt
+++ b/documentation/build/html/_sources/running_WAFL.rst.txt
@@ -3,36 +3,30 @@ Running WAFL
This document contains a few examples of how to use the `wafl` CLI.
There are four modes in which to run the system
+$ wafl run-audio
+----------------
-.. image:: _static/wafl-commands.png
- :alt: Basic CLI commands
- :align: center
-
-
-wafl run-audio
---------------
-
-This is the main mode of operation. It will run the system in a loop, waiting for the user to speak a command.
+It will run the system in a loop, waiting for the user to speak a command.
The activation word is the name defined in config.json.
The default name is "computer", but you can change it to whatever you want.
-wafl run-server
----------------
+$ wafl run-server
+-----------------
It runs a local web server that listens for HTTP requests on port 8889.
The server will act as a chatbot, executing commands and returning the result as defined in the rules.
-wafl run-cli
-------------
+$ wafl run-cli
+--------------
This command works like the run-server command, but it listens for commands on the command line.
It does not run a web server and is useful for testing purposes.
-wafl run-tests
---------------
+$ wafl run-tests
+----------------
This command will run all the tests defined in the file testcases.txt.
diff --git a/documentation/build/html/_sources/wafl_init.rst.txt b/documentation/build/html/_sources/wafl_init.rst.txt
index 469574d1..070713d9 100644
--- a/documentation/build/html/_sources/wafl_init.rst.txt
+++ b/documentation/build/html/_sources/wafl_init.rst.txt
@@ -1,7 +1,7 @@
-Running wafl init
------------------
+Initialization
+--------------
-What does this command do?
+This command initialises WAFL's work environment.
.. code-block:: bash
@@ -24,22 +24,28 @@ It creates a set of files that can be used to the interface side of WAFL.
.. code-block:: text
{
- "allow_interruptions": true, # interruptions are allowed in the conversation
- "waking_up_word": "computer", # the word that wakes up the LLM and the name of the chatbot
- "waking_up_sound": true, # A sound is generated when the LLM is woken up
- "deactivate_sound": true, # A sound is generated when the LLM is woken up
- "listener_model": "openai/whisper-tiny.en", # The model used for speech recognition.
- # Only whisper models are supported
- "listener_hotword_logp": -8, # The threshold log probability of the hotword
- "listener_volume_threshold": 0.6, # The threshold volume when listening
- "listener_silence_timeout": 0.7, # The silence timeout when listening
- "model_host": "127.0.0.1", # The host of the LLM
- "model_port": 8080 # The port of the LLM
+ "allow_interruptions": true,
+ "waking_up_word": "computer",
+ "waking_up_sound": true,
+ "deactivate_sound": true,
+ "improvise_tasks": true,
+ ...
}
+These settings regulate the following:
+
+ * "allow_interruptions" allows the user to create rules with the highest priority.
+ For example, the user might want a rule to be triggered in the middle of a conversation.
+
+ * "waking_up_word" is the name of the bot, used to wake up the system in the "run-audio" mode.
+
+ * "waking_up_sound" and "deactivate_sound" are played to signal the system is up or is back to idle.
+
+ * "improvise_tasks" allows the system to create its own rules to accomplish a goal.
+
- The rules.wafl file contains the rules that guide the chatbot.
-The rules are written in the WAFL language, with a trigger condition on top and a list of actions below.
+ The rules are written in the WAFL language, with a trigger condition on top and a list of actions below.
.. code-block:: text
@@ -58,4 +64,4 @@ This rule is activated when the user says "bring yourself online", and the actio
- `start_llm.sh` is a script that starts the LLM locally.
It simply starts a docker container with the LLM image.
-- The `testcases.txt` file contains the test cases that can be used to test the LLM.
\ No newline at end of file
+- The `testcases.txt` file contains the test cases that can be used to test the LLM.
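A small sanity check can confirm a config dict carries the settings documented above. The key set below lists only the keys shown in this document; the generated `config.json` contains more (note the `...` in the snippet above).

```python
EXPECTED_KEYS = {
    "allow_interruptions",
    "waking_up_word",
    "waking_up_sound",
    "deactivate_sound",
    "improvise_tasks",
}

def missing_settings(config):
    # Return the documented settings absent from a config dict,
    # sorted for stable output.
    return sorted(EXPECTED_KEYS - set(config))

print(missing_settings({"waking_up_word": "computer", "allow_interruptions": True}))
# -> ['deactivate_sound', 'improvise_tasks', 'waking_up_sound']
```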
diff --git a/documentation/build/html/_static/pygments.css b/documentation/build/html/_static/pygments.css
index 08bec689..84ab3030 100644
--- a/documentation/build/html/_static/pygments.css
+++ b/documentation/build/html/_static/pygments.css
@@ -17,6 +17,7 @@ span.linenos.special { color: #000000; background-color: #ffffc0; padding-left:
.highlight .cs { color: #3D7B7B; font-style: italic } /* Comment.Special */
.highlight .gd { color: #A00000 } /* Generic.Deleted */
.highlight .ge { font-style: italic } /* Generic.Emph */
+.highlight .ges { font-weight: bold; font-style: italic } /* Generic.EmphStrong */
.highlight .gr { color: #E40000 } /* Generic.Error */
.highlight .gh { color: #000080; font-weight: bold } /* Generic.Heading */
.highlight .gi { color: #008400 } /* Generic.Inserted */
diff --git a/documentation/build/html/_static/wafl-commands.png b/documentation/build/html/_static/wafl-commands.png
deleted file mode 100644
index 35ef3ef6..00000000
Binary files a/documentation/build/html/_static/wafl-commands.png and /dev/null differ
diff --git a/documentation/build/html/chitchat.html b/documentation/build/html/chitchat.html
deleted file mode 100644
index 61dd5d8e..00000000
--- a/documentation/build/html/chitchat.html
+++ /dev/null
@@ -1,123 +0,0 @@
-
-
-
-
-
-
- Chitchat — WAFL documentation
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
WAFL will improvise all conversations unless a rule is triggered.
-This is possible thanks to the LLM it is connected to.
-
Aspirationally, the system will one day be able to generate its own rules according to the context of the conversation.
-This is not possible yet, but it is not too far off in the future.
-
-
-
-
-
-
-
-
-
-
-
-
-
-
\ No newline at end of file
diff --git a/documentation/build/html/directory_structure.html b/documentation/build/html/directory_structure.html
index 6674597f..57674d11 100644
--- a/documentation/build/html/directory_structure.html
+++ b/documentation/build/html/directory_structure.html
@@ -18,7 +18,7 @@
-
+
@@ -45,11 +45,11 @@