<?xml version="1.0"?>
<feed xmlns="http://www.w3.org/2005/Atom" xml:lang="en">
		<id>http://wiki.wisevoice.ai/api.php?action=feedcontributions&amp;feedformat=atom&amp;user=Admin</id>
		<title>Wisevoice Wiki - User contributions [en]</title>
		<link rel="self" type="application/atom+xml" href="http://wiki.wisevoice.ai/api.php?action=feedcontributions&amp;feedformat=atom&amp;user=Admin"/>
		<link rel="alternate" type="text/html" href="http://wiki.wisevoice.ai/index.php/Special:Contributions/Admin"/>
		<updated>2026-04-24T05:17:26Z</updated>
		<subtitle>User contributions</subtitle>
		<generator>MediaWiki 1.29.0</generator>

	<entry>
		<id>http://wiki.wisevoice.ai/index.php?title=Scripts&amp;diff=108</id>
		<title>Scripts</title>
		<link rel="alternate" type="text/html" href="http://wiki.wisevoice.ai/index.php?title=Scripts&amp;diff=108"/>
				<updated>2020-02-10T09:53:43Z</updated>
		
		<summary type="html">&lt;p&gt;Admin: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Scripts are components used to do custom processing on data from the user's session.&lt;br /&gt;
&lt;br /&gt;
Scripts use the JavaScript programming language.&lt;br /&gt;
&lt;br /&gt;
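A minimal sketch of such a script, reading and writing session variables via ''this.userData'' (the &amp;quot;age&amp;quot; and &amp;quot;is_adult&amp;quot; keys are hypothetical, used only for illustration):

```javascript
// Illustrative sketch only: "age" and "is_adult" are hypothetical keys.
// A stand-in object emulates the script context described on this page.
var ctx = { userData: { "age": "42" } };
function script() {
  var age = Number(this.userData["age"]);   // read a session variable
  this.userData["is_adult"] = age >= 18;    // write a derived value back
}
script.call(ctx);
```

The stand-in ''ctx'' object only emulates the script context; inside a real Script component ''this'' is provided by the platform.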
You can read and edit any of the variables in the user's session by referencing them with ''this.userData[&amp;quot;key&amp;quot;]''.&lt;/div&gt;</summary>
		<author><name>Admin</name></author>	</entry>

	<entry>
		<id>http://wiki.wisevoice.ai/index.php?title=Conversation_Flows&amp;diff=107</id>
		<title>Conversation Flows</title>
		<link rel="alternate" type="text/html" href="http://wiki.wisevoice.ai/index.php?title=Conversation_Flows&amp;diff=107"/>
				<updated>2020-01-22T17:55:59Z</updated>
		
		<summary type="html">&lt;p&gt;Admin: /* Connections */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;br /&gt;
Dialogs based on conversation flows are guided dialogs. With conversation flows you create a dialog tree where you define all the branches of the conversation. Here you can also connect different services and scripts to match business logic requirements.&lt;br /&gt;
&lt;br /&gt;
[[File:Demo_canvas.png|800px|link=http://www.wiki.wisevoice.ai]]&lt;br /&gt;
&lt;br /&gt;
Conversation flows are triggered either by an event or by another flow.&lt;br /&gt;
Flows are built using six building blocks. Read more about each building block on its dedicated page:&lt;br /&gt;
*'''Events'''&lt;br /&gt;
*'''Intents'''&lt;br /&gt;
*'''Messages'''&lt;br /&gt;
*'''Entities'''&lt;br /&gt;
*'''Scripts'''&lt;br /&gt;
*'''Services'''&lt;br /&gt;
&lt;br /&gt;
In the example above we have a flow that is triggered by the Start Event. The Start Event is the most common type of event and is triggered when the user opens the chat or calls in.&lt;br /&gt;
The Start Event is usually connected to a message node that contains the welcome message. This means that when the user opens the chat they will instantly receive the welcome message.&lt;br /&gt;
After sending the welcome message, the virtual agent waits for user input. On receiving the input it detects the intent and continues on the matching branch. If the agent cannot detect the intent it selects the &amp;quot;default fallback&amp;quot; intent, which is configured as a fallback; it then sends the user the message from the &amp;quot;mesaj fallback&amp;quot; node and waits for user input again. The &amp;quot;mesaj fallback&amp;quot; node is connected to all the intents from the previous state, creating a loop.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
The topics below describe the mechanics of the conversation flows.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Stop conditions ==&lt;br /&gt;
&lt;br /&gt;
In more complex scenarios that also contain scripts and services, the agent might pass through multiple nodes in the flow before generating the response for the user.&lt;br /&gt;
The agent stops and returns a message in three situations:&lt;br /&gt;
# It reached the end of the flow&lt;br /&gt;
# It reached a node that is followed by intents, thus user input is required&lt;br /&gt;
# It reached an entity node that requires user input&lt;br /&gt;
&lt;br /&gt;
== Component library ==&lt;br /&gt;
&lt;br /&gt;
Every node on the canvas holds a reference to a component. The node receives the name and the configuration of the component, but it is only a reference to it. This means that deleting a node does not delete the component, and that the same component can be referenced on different branches of the conversation.&lt;br /&gt;
All components can be browsed in the component library.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Connections ==&lt;br /&gt;
&lt;br /&gt;
The default way of adding new nodes to the flow is by using the &amp;quot;Add new connection&amp;quot; button on an existing node. This will create a node connected to the parent node.&lt;br /&gt;
When creating nodes from an existing node you can't create event nodes, as events only exist as entry points. There is a dedicated button on the left side of the screen for creating events.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
If you do not want to create a new node but to connect to an existing one, you can use the connections tab.&lt;br /&gt;
In the detail view of a node you can find all the information about the component and a connections tab. From the connections tab you can create connections to existing nodes by pressing &amp;quot;Add connection&amp;quot;.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
When more than one destination node is connected to the same source node, you can use the connections tab to define conditions for the different branches.&lt;br /&gt;
&lt;br /&gt;
If the destination node is an intent node, then the condition will be implicit. The flow will continue on that branch only when matching that intent.&lt;br /&gt;
For destination nodes that are not intents (messages, entities, scripts, services) you can configure custom conditions based on different session variables, such as entities extracted from the user or the current time of day (e.g. if age &amp;lt; 20 then go to message1; else go to message2).&lt;br /&gt;
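A hedged sketch of how such a custom condition could behave (the node names ''message1'' and ''message2'' come from the example above; the evaluation code itself is illustrative, not the engine's implementation):

```javascript
// Illustrative only: mirrors the "age below 20 goes to message1,
// otherwise message2" example from the text above.
var session = { "age": 17 };
function nextNode(s) {
  if (Number(s["age"]) >= 20) {
    return "message2";   // branch for age 20 and above
  }
  return "message1";     // branch for age below 20
}
```
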
&lt;br /&gt;
==Retraining==&lt;br /&gt;
&lt;br /&gt;
After making a change to a flow, press the 'Retrain' button to update the model.&lt;br /&gt;
If you updated only one component, you can press the 'Retrain' button inside the detail view of that component.&lt;br /&gt;
To retrain the whole flow press the 'Retrain' button in the canvas.&lt;br /&gt;
&lt;br /&gt;
The retrain time grows with the size of the flow and the number of entities inside the flow. You cannot start a second retrain while the first one is still in progress.&lt;/div&gt;</summary>
		<author><name>Admin</name></author>	</entry>

	<entry>
		<id>http://wiki.wisevoice.ai/index.php?title=Wisevoice_Builder&amp;diff=106</id>
		<title>Wisevoice Builder</title>
		<link rel="alternate" type="text/html" href="http://wiki.wisevoice.ai/index.php?title=Wisevoice_Builder&amp;diff=106"/>
				<updated>2020-01-22T17:53:29Z</updated>
		
		<summary type="html">&lt;p&gt;Admin: /* Dialog definition */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;The Wisevoice Builder provides access to the development of virtual conversational agents.&lt;br /&gt;
&lt;br /&gt;
A project defines an individual virtual agent. The project configuration consists of two main areas:&lt;br /&gt;
*Integration configurations&lt;br /&gt;
*Dialog definition&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Integration configurations ==&lt;br /&gt;
&lt;br /&gt;
There are three types of integrations:&lt;br /&gt;
*Communication channels, such as SIP Callcenter, Web Chat, Facebook, SMS, Telegram and an API that can be consumed by custom clients&lt;br /&gt;
*External service integrations, such as UiPath and APIs&lt;br /&gt;
*Customer service integrations, such as Zendesk or Salesforce&lt;br /&gt;
&lt;br /&gt;
== Dialog definition ==&lt;br /&gt;
&lt;br /&gt;
There are two ways of defining a dialog:&lt;br /&gt;
*'''The Q&amp;amp;A module''' offers a simple way of defining question and answer pairs. The content defined as Q&amp;amp;A is accessible to the chat user at any point in the conversation regardless of the context. ''(coming soon)''&lt;br /&gt;
*'''Conversation Flows''' are visual maps of dialog trees used to define more complex scenarios. A project can contain multiple conversation flows that are triggered in different situations.&lt;/div&gt;</summary>
		<author><name>Admin</name></author>	</entry>

	<entry>
		<id>http://wiki.wisevoice.ai/index.php?title=Wisevoice_Builder&amp;diff=105</id>
		<title>Wisevoice Builder</title>
		<link rel="alternate" type="text/html" href="http://wiki.wisevoice.ai/index.php?title=Wisevoice_Builder&amp;diff=105"/>
				<updated>2020-01-22T17:52:47Z</updated>
		
		<summary type="html">&lt;p&gt;Admin: /* Integration  configurations */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;The Wisevoice Builder provides access to the development of virtual conversational agents.&lt;br /&gt;
&lt;br /&gt;
A project defines an individual virtual agent. The project configuration consists of two main areas:&lt;br /&gt;
*Integration configurations&lt;br /&gt;
*Dialog definition&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Integration configurations ==&lt;br /&gt;
&lt;br /&gt;
There are three types of integrations:&lt;br /&gt;
*Communication channels, such as SIP Callcenter, Web Chat, Facebook, SMS, Telegram and an API that can be consumed by custom clients&lt;br /&gt;
*External service integrations, such as UiPath and APIs&lt;br /&gt;
*Customer service integrations, such as Zendesk or Salesforce&lt;br /&gt;
&lt;br /&gt;
== Dialog definition ==&lt;br /&gt;
&lt;br /&gt;
There are two ways of defining a dialog:&lt;br /&gt;
*'''The Q&amp;amp;A module''' offers a simple way of defining question and answer pairs. The content defined as Q&amp;amp;A is accessible to the chat user at any point in the conversation regardless of the context. ''(coming soon)''&lt;br /&gt;
*'''Conversation Flows''' are visual maps of dialog trees used to define more complex scenarios. A project can contain multiple conversation flows that are triggered in different situations. &lt;br /&gt;
&lt;br /&gt;
For example, a medical clinic might have the content of its project structured in the following way:&lt;br /&gt;
*A Q&amp;amp;A collection that contains answers to general questions like the location of the clinic and working hours.&lt;br /&gt;
*One conversation flow for creating new appointments&lt;br /&gt;
*A second conversation flow for canceling appointments&lt;br /&gt;
*A third conversation flow for routing between the first two. This flow is triggered by inbound calls from patients, asks for the call reason and triggers either the new appointment or the cancellation flow&lt;br /&gt;
*A fourth conversation flow for confirming appointments. This flow is triggered by API calls from the clinic's business logic server and generates outbound calls to the patients asking for confirmation.&lt;/div&gt;</summary>
		<author><name>Admin</name></author>	</entry>

	<entry>
		<id>http://wiki.wisevoice.ai/index.php?title=Conversation_Flows&amp;diff=104</id>
		<title>Conversation Flows</title>
		<link rel="alternate" type="text/html" href="http://wiki.wisevoice.ai/index.php?title=Conversation_Flows&amp;diff=104"/>
				<updated>2019-11-26T14:15:56Z</updated>
		
		<summary type="html">&lt;p&gt;Admin: /* Stop conditions */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;br /&gt;
Dialogs based on conversation flows are guided dialogs. With conversation flows you create a dialog tree where you define all the branches of the conversation. Here you can also connect different services and scripts to match business logic requirements.&lt;br /&gt;
&lt;br /&gt;
[[File:Demo_canvas.png|800px|link=http://www.wiki.wisevoice.ai]]&lt;br /&gt;
&lt;br /&gt;
Conversation flows are triggered either by an event or by another flow.&lt;br /&gt;
Flows are built using six building blocks. Read more about each building block on its dedicated page:&lt;br /&gt;
*'''Events'''&lt;br /&gt;
*'''Intents'''&lt;br /&gt;
*'''Messages'''&lt;br /&gt;
*'''Entities'''&lt;br /&gt;
*'''Scripts'''&lt;br /&gt;
*'''Services'''&lt;br /&gt;
&lt;br /&gt;
In the example above we have a flow that is triggered by the Start Event. The Start Event is the most common type of event and is triggered when the user opens the chat or calls in.&lt;br /&gt;
The Start Event is usually connected to a message node that contains the welcome message. This means that when the user opens the chat they will instantly receive the welcome message.&lt;br /&gt;
After sending the welcome message, the virtual agent waits for user input. On receiving the input it detects the intent and continues on the matching branch. If the agent cannot detect the intent it selects the &amp;quot;default fallback&amp;quot; intent, which is configured as a fallback; it then sends the user the message from the &amp;quot;mesaj fallback&amp;quot; node and waits for user input again. The &amp;quot;mesaj fallback&amp;quot; node is connected to all the intents from the previous state, creating a loop.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
The topics below describe the mechanics of the conversation flows.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Stop conditions ==&lt;br /&gt;
&lt;br /&gt;
In more complex scenarios that also contain scripts and services, the agent might pass through multiple nodes in the flow before generating the response for the user.&lt;br /&gt;
The agent stops and returns a message in three situations:&lt;br /&gt;
# It reached the end of the flow&lt;br /&gt;
# It reached a node that is followed by intents, thus user input is required&lt;br /&gt;
# It reached an entity node that requires user input&lt;br /&gt;
&lt;br /&gt;
== Component library ==&lt;br /&gt;
&lt;br /&gt;
Every node on the canvas holds a reference to a component. The node receives the name and the configuration of the component, but it is only a reference to it. This means that deleting a node does not delete the component, and that the same component can be referenced on different branches of the conversation.&lt;br /&gt;
All components can be browsed in the component library.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Connections ==&lt;br /&gt;
&lt;br /&gt;
The default way of adding new nodes to the flow is by using the &amp;quot;Add new connection&amp;quot; button on an existing node. This will create a node connected to the parent node.&lt;br /&gt;
When creating nodes from an existing node you can't create event nodes, as events only exist as entry points. There is a dedicated button on the left side of the screen for creating events.&lt;br /&gt;
&lt;br /&gt;
If you do not want to create a new node but to connect to an existing one, you can use the connections tab.&lt;br /&gt;
In the detail view of a node you can find all the information about the component and a connections tab. From the connections tab you can create connections to existing nodes by pressing &amp;quot;Add connection&amp;quot;.&lt;br /&gt;
&lt;br /&gt;
When more than one destination node is connected to the same source node, you can use the connections tab to define conditions for the different branches.&lt;br /&gt;
If the destination node is an intent node, then the condition will be implicit. The flow will continue on that branch only when matching that intent.&lt;br /&gt;
For destination nodes that are not intents (messages, entities, scripts, services) you can configure custom conditions based on different session variables, such as entities extracted from the user or the current time of day (e.g. if age &amp;lt; 20 then go to message1; else go to message2).&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==Retraining==&lt;br /&gt;
&lt;br /&gt;
After making a change to a flow, press the 'Retrain' button to update the model.&lt;br /&gt;
If you updated only one component, you can press the 'Retrain' button inside the detail view of that component.&lt;br /&gt;
To retrain the whole flow press the 'Retrain' button in the canvas.&lt;br /&gt;
&lt;br /&gt;
The retrain time grows with the size of the flow and the number of entities inside the flow. You cannot start a second retrain while the first one is still in progress.&lt;/div&gt;</summary>
		<author><name>Admin</name></author>	</entry>

	<entry>
		<id>http://wiki.wisevoice.ai/index.php?title=Annotations&amp;diff=103</id>
		<title>Annotations</title>
		<link rel="alternate" type="text/html" href="http://wiki.wisevoice.ai/index.php?title=Annotations&amp;diff=103"/>
				<updated>2019-11-19T12:57:37Z</updated>
		
		<summary type="html">&lt;p&gt;Admin: Created page with &amp;quot;''coming soon''&amp;quot;&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;''coming soon''&lt;/div&gt;</summary>
		<author><name>Admin</name></author>	</entry>

	<entry>
		<id>http://wiki.wisevoice.ai/index.php?title=Wisevoice_Builder&amp;diff=102</id>
		<title>Wisevoice Builder</title>
		<link rel="alternate" type="text/html" href="http://wiki.wisevoice.ai/index.php?title=Wisevoice_Builder&amp;diff=102"/>
				<updated>2019-11-19T12:54:10Z</updated>
		
		<summary type="html">&lt;p&gt;Admin: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;The Wisevoice Builder provides access to the development of virtual conversational agents.&lt;br /&gt;
&lt;br /&gt;
A project defines an individual virtual agent. The project configuration consists of two main areas:&lt;br /&gt;
*Integration configurations&lt;br /&gt;
*Dialog definition&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Integration configurations ==&lt;br /&gt;
&lt;br /&gt;
There are three types of integrations:&lt;br /&gt;
*Communication channels, such as SIP Callcenter, Web Chat, Facebook, SMS, Telegram and an API that can be consumed by custom clients&lt;br /&gt;
*External service integrations, such as UiPath and APIs&lt;br /&gt;
*Customer service integrations ''(coming soon)''&lt;br /&gt;
&lt;br /&gt;
== Dialog definition ==&lt;br /&gt;
&lt;br /&gt;
There are two ways of defining a dialog:&lt;br /&gt;
*'''The Q&amp;amp;A module''' offers a simple way of defining question and answer pairs. The content defined as Q&amp;amp;A is accessible to the chat user at any point in the conversation regardless of the context. ''(coming soon)''&lt;br /&gt;
*'''Conversation Flows''' are visual maps of dialog trees used to define more complex scenarios. A project can contain multiple conversation flows that are triggered in different situations. &lt;br /&gt;
&lt;br /&gt;
For example, a medical clinic might have the content of its project structured in the following way:&lt;br /&gt;
*A Q&amp;amp;A collection that contains answers to general questions like the location of the clinic and working hours.&lt;br /&gt;
*One conversation flow for creating new appointments&lt;br /&gt;
*A second conversation flow for canceling appointments&lt;br /&gt;
*A third conversation flow for routing between the first two. This flow is triggered by inbound calls from patients, asks for the call reason and triggers either the new appointment or the cancellation flow&lt;br /&gt;
*A fourth conversation flow for confirming appointments. This flow is triggered by API calls from the clinic's business logic server and generates outbound calls to the patients asking for confirmation.&lt;/div&gt;</summary>
		<author><name>Admin</name></author>	</entry>

	<entry>
		<id>http://wiki.wisevoice.ai/index.php?title=MediaWiki:Sidebar&amp;diff=101</id>
		<title>MediaWiki:Sidebar</title>
		<link rel="alternate" type="text/html" href="http://wiki.wisevoice.ai/index.php?title=MediaWiki:Sidebar&amp;diff=101"/>
				<updated>2019-11-19T11:18:34Z</updated>
		
		<summary type="html">&lt;p&gt;Admin: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;br /&gt;
* navigation&lt;br /&gt;
*Configuration&lt;br /&gt;
** Wisevoice Builder|Wisevoice Builder&lt;br /&gt;
** Conversation Flows|Conversation flows&lt;br /&gt;
**Events|Events&lt;br /&gt;
**Intents|Intents&lt;br /&gt;
**Messages|Messages&lt;br /&gt;
**Entities|Entities&lt;br /&gt;
**Services|Services&lt;br /&gt;
**Scripts|Scripts&lt;br /&gt;
**Entity types|Entity types&lt;br /&gt;
*Concepts&lt;br /&gt;
**Session data management|Session data management&lt;br /&gt;
* SEARCH&lt;br /&gt;
* TOOLBOX&lt;br /&gt;
* LANGUAGES&lt;/div&gt;</summary>
		<author><name>Admin</name></author>	</entry>

	<entry>
		<id>http://wiki.wisevoice.ai/index.php?title=MediaWiki:Sidebar&amp;diff=100</id>
		<title>MediaWiki:Sidebar</title>
		<link rel="alternate" type="text/html" href="http://wiki.wisevoice.ai/index.php?title=MediaWiki:Sidebar&amp;diff=100"/>
				<updated>2019-11-19T11:18:15Z</updated>
		
		<summary type="html">&lt;p&gt;Admin: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;br /&gt;
* navigation&lt;br /&gt;
*Set up&lt;br /&gt;
** Wisevoice Builder|Wisevoice Builder&lt;br /&gt;
** Conversation Flows|Conversation flows&lt;br /&gt;
**Events|Events&lt;br /&gt;
**Intents|Intents&lt;br /&gt;
**Messages|Messages&lt;br /&gt;
**Entities|Entities&lt;br /&gt;
**Services|Services&lt;br /&gt;
**Scripts|Scripts&lt;br /&gt;
**Entity types|Entity types&lt;br /&gt;
*Concepts&lt;br /&gt;
**Session data management|Session data management&lt;br /&gt;
* SEARCH&lt;br /&gt;
* TOOLBOX&lt;br /&gt;
* LANGUAGES&lt;/div&gt;</summary>
		<author><name>Admin</name></author>	</entry>

	<entry>
		<id>http://wiki.wisevoice.ai/index.php?title=MediaWiki:Sidebar&amp;diff=99</id>
		<title>MediaWiki:Sidebar</title>
		<link rel="alternate" type="text/html" href="http://wiki.wisevoice.ai/index.php?title=MediaWiki:Sidebar&amp;diff=99"/>
				<updated>2019-11-19T11:17:39Z</updated>
		
		<summary type="html">&lt;p&gt;Admin: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;br /&gt;
* navigation&lt;br /&gt;
*Setup&lt;br /&gt;
** Wisevoice Builder|Wisevoice Builder&lt;br /&gt;
** Conversation Flows|Conversation flows&lt;br /&gt;
**Events|Events&lt;br /&gt;
**Intents|Intents&lt;br /&gt;
**Messages|Messages&lt;br /&gt;
**Entities|Entities&lt;br /&gt;
**Services|Services&lt;br /&gt;
**Scripts|Scripts&lt;br /&gt;
**Entity types|Entity types&lt;br /&gt;
*Concepts&lt;br /&gt;
**Session data management|Session data management&lt;br /&gt;
* SEARCH&lt;br /&gt;
* TOOLBOX&lt;br /&gt;
* LANGUAGES&lt;/div&gt;</summary>
		<author><name>Admin</name></author>	</entry>

	<entry>
		<id>http://wiki.wisevoice.ai/index.php?title=MediaWiki:Sidebar&amp;diff=98</id>
		<title>MediaWiki:Sidebar</title>
		<link rel="alternate" type="text/html" href="http://wiki.wisevoice.ai/index.php?title=MediaWiki:Sidebar&amp;diff=98"/>
				<updated>2019-11-19T11:16:49Z</updated>
		
		<summary type="html">&lt;p&gt;Admin: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;br /&gt;
* navigation&lt;br /&gt;
** Wisevoice Builder|Wisevoice Builder&lt;br /&gt;
** Conversation Flows|Conversation flows&lt;br /&gt;
**Events|Events&lt;br /&gt;
**Intents|Intents&lt;br /&gt;
**Messages|Messages&lt;br /&gt;
**Entities|Entities&lt;br /&gt;
**Services|Services&lt;br /&gt;
**Scripts|Scripts&lt;br /&gt;
**Entity types|Entity types&lt;br /&gt;
*Concepts&lt;br /&gt;
**Session data management|Session data management&lt;br /&gt;
* SEARCH&lt;br /&gt;
* TOOLBOX&lt;br /&gt;
* LANGUAGES&lt;/div&gt;</summary>
		<author><name>Admin</name></author>	</entry>

	<entry>
		<id>http://wiki.wisevoice.ai/index.php?title=MediaWiki:Sidebar&amp;diff=97</id>
		<title>MediaWiki:Sidebar</title>
		<link rel="alternate" type="text/html" href="http://wiki.wisevoice.ai/index.php?title=MediaWiki:Sidebar&amp;diff=97"/>
				<updated>2019-11-19T11:15:48Z</updated>
		
		<summary type="html">&lt;p&gt;Admin: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;br /&gt;
* navigation&lt;br /&gt;
** Wisevoice Builder|Wisevoice Builder&lt;br /&gt;
** Conversation Flows|Conversation flows&lt;br /&gt;
***Events|Events&lt;br /&gt;
***Intents|Intents&lt;br /&gt;
***Messages|Messages&lt;br /&gt;
***Entities|Entities&lt;br /&gt;
***Services|Services&lt;br /&gt;
***Scripts|Scripts&lt;br /&gt;
***Entity types|Entity types&lt;br /&gt;
***Session data management|Session data management&lt;br /&gt;
*test&lt;br /&gt;
* SEARCH&lt;br /&gt;
* TOOLBOX&lt;br /&gt;
* LANGUAGES&lt;/div&gt;</summary>
		<author><name>Admin</name></author>	</entry>

	<entry>
		<id>http://wiki.wisevoice.ai/index.php?title=MediaWiki:Sidebar&amp;diff=96</id>
		<title>MediaWiki:Sidebar</title>
		<link rel="alternate" type="text/html" href="http://wiki.wisevoice.ai/index.php?title=MediaWiki:Sidebar&amp;diff=96"/>
				<updated>2019-11-19T11:11:37Z</updated>
		
		<summary type="html">&lt;p&gt;Admin: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;br /&gt;
* navigation&lt;br /&gt;
** Wisevoice Builder|Wisevoice Builder&lt;br /&gt;
** Conversation Flows|Conversation flows&lt;br /&gt;
***Events|Events&lt;br /&gt;
***Intents|Intents&lt;br /&gt;
***Messages|Messages&lt;br /&gt;
***Entities|Entities&lt;br /&gt;
***Services|Services&lt;br /&gt;
***Scripts|Scripts&lt;br /&gt;
***Entity types|Entity types&lt;br /&gt;
***Session data management|Session data management&lt;br /&gt;
* SEARCH&lt;br /&gt;
* TOOLBOX&lt;br /&gt;
* LANGUAGES&lt;/div&gt;</summary>
		<author><name>Admin</name></author>	</entry>

	<entry>
		<id>http://wiki.wisevoice.ai/index.php?title=Session_data_management&amp;diff=95</id>
		<title>Session data management</title>
		<link rel="alternate" type="text/html" href="http://wiki.wisevoice.ai/index.php?title=Session_data_management&amp;diff=95"/>
				<updated>2019-11-19T11:10:50Z</updated>
		
		<summary type="html">&lt;p&gt;Admin: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Each Wisevoice conversation has an associated session that contains a key-value map of the user's data.&lt;br /&gt;
&lt;br /&gt;
Data can be inserted in the session in the following ways:&lt;br /&gt;
*Entities extracted from user input are stored in the session using the entity name as a key.&lt;br /&gt;
*When triggering a REST or UiPath event you can also pass a JSON object. All the key-value pairs from the JSON object will be injected into the session.&lt;br /&gt;
*When a Service is triggered it can return a synchronous response in the form of a JSON object. All the key-value pairs from the JSON object will be injected into the session.&lt;br /&gt;
*Values can be added or overwritten by Script components.&lt;br /&gt;
*The following system values will be inserted automatically:&lt;br /&gt;
**channel_name - the name of the channel (web, sip, facebook, etc.)&lt;br /&gt;
**sip_phone_number - for callcenter conversations the caller's phone number is stored in the session&lt;br /&gt;
**navigation_history - a list of all the &amp;quot;visited&amp;quot; nodes from the flow&lt;br /&gt;
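As a sketch, the injection of key-value pairs described above could look like this (the ''payload'' keys are invented for illustration; only ''channel_name'' is a real system key):

```javascript
// Illustrative only: merge the key-value pairs of a JSON payload
// into the session map, as described for Services and events above.
var session = { "channel_name": "web" };
var payload = { "order_id": "1234", "status": "confirmed" };
Object.keys(payload).forEach(function (key) {
  session[key] = payload[key];   // each pair is injected into the session
});
```
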
&lt;br /&gt;
&lt;br /&gt;
Session data can be used in the following ways:&lt;br /&gt;
*Use the session data in any message or text by using the {{&amp;lt;key name&amp;gt;}} syntax. This is available for Message texts, Entity prompts and Service parameters.&lt;br /&gt;
*Do custom data processing in Script components by referencing values with this.user_data.key&lt;br /&gt;
*Configure conditions in the connections tab based on session data&lt;/div&gt;</summary>
		<author><name>Admin</name></author>	</entry>

	<entry>
		<id>http://wiki.wisevoice.ai/index.php?title=Session_data_management&amp;diff=94</id>
		<title>Session data management</title>
		<link rel="alternate" type="text/html" href="http://wiki.wisevoice.ai/index.php?title=Session_data_management&amp;diff=94"/>
				<updated>2019-11-19T11:03:36Z</updated>
		
		<summary type="html">&lt;p&gt;Admin: Created page with &amp;quot;Each Wisevoice conversation has an associated session that contains a key-value map of user's data.  Data can be inserted in the session in a few ways: *Entities extracted fro...&amp;quot;&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Each Wisevoice conversation has an associated session that contains a key-value map of the user's data.&lt;br /&gt;
&lt;br /&gt;
Data can be inserted in the session in a few ways:&lt;br /&gt;
*Entities extracted from user input are stored in the session using the entity name as a key.&lt;br /&gt;
*When triggering a REST or UiPath event you can also pass a JSON object. All the key-value pairs from the JSON object will be injected into the session.&lt;br /&gt;
*When a Service is triggered it can return a synchronous response in the form of a JSON object. All the key-value pairs from the JSON object will be injected into the session.&lt;br /&gt;
*Values can be added or overwritten by Script components.&lt;br /&gt;
*The following system values will be inserted automatically:&lt;br /&gt;
**channel_name - the name of the channel (web, sip, facebook, etc.)&lt;br /&gt;
**sip_phone_number - for callcenter conversations the caller's phone number is stored in the session&lt;br /&gt;
**navigation_history - a list of all the &amp;quot;visited&amp;quot; nodes from the flow&lt;/div&gt;</summary>
		<author><name>Admin</name></author>	</entry>

	<entry>
		<id>http://wiki.wisevoice.ai/index.php?title=Services&amp;diff=93</id>
		<title>Services</title>
		<link rel="alternate" type="text/html" href="http://wiki.wisevoice.ai/index.php?title=Services&amp;diff=93"/>
				<updated>2019-11-19T09:18:15Z</updated>
		
		<summary type="html">&lt;p&gt;Admin: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Service components handle calls to external business logic. They can be used to send data to and receive data from the business logic services.&lt;br /&gt;
&lt;br /&gt;
Two integration types are supported at the moment: REST and UiPath.&lt;br /&gt;
&lt;br /&gt;
==REST Services==&lt;br /&gt;
&lt;br /&gt;
[[File:service1.png|600px]]&lt;br /&gt;
&lt;br /&gt;
REST services allow you to configure the URL, headers and body of the request. When configuring the service fields you can insert values from the user's session using the {{&amp;lt;entity name&amp;gt;}} syntax.&lt;br /&gt;
&lt;br /&gt;
Services also support synchronous answers. To inject data into the user's session the API should return a JSON object with the keys and values you want to inject.&lt;br /&gt;
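A hedged sketch of the placeholder substitution described above (the URL and the ''city'' key are invented for illustration; the real engine's implementation may differ):

```javascript
// Illustrative only: replace {{key}} placeholders in a configured
// service field with values from the user's session.
var session = { "city": "Cluj" };
function render(template, s) {
  return template.replace(/{{(\w+)}}/g, function (match, key) {
    return s[key];
  });
}
var url = render("https://api.example.com/weather?city={{city}}", session);
```
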
&lt;br /&gt;
==UiPath Services==&lt;br /&gt;
&lt;br /&gt;
[[File:service2.png|600px]]&lt;br /&gt;
&lt;br /&gt;
To configure a UiPath service you need to have an active UiPath integration.&lt;br /&gt;
&lt;br /&gt;
When configuring the UiPath integration you will fill in the UiPath Orchestrator credentials.&lt;br /&gt;
&lt;br /&gt;
When configuring the UiPath service you can select which job you want to trigger, the robot that will execute the job and the JSON parameters that will be sent to the job.&lt;br /&gt;
&lt;br /&gt;
When generating the UiPath call, the request engine will also inject a token in the JSON parameters. The token allows the UiPath job to make a callback after finishing the processing.&lt;br /&gt;
&lt;br /&gt;
[[File:service3.png|600px]]&lt;/div&gt;</summary>
		<author><name>Admin</name></author>	</entry>

	<entry>
		<id>http://wiki.wisevoice.ai/index.php?title=Services&amp;diff=92</id>
		<title>Services</title>
		<link rel="alternate" type="text/html" href="http://wiki.wisevoice.ai/index.php?title=Services&amp;diff=92"/>
				<updated>2019-11-19T09:17:38Z</updated>
		
		<summary type="html">&lt;p&gt;Admin: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Service components handle calls to external business logic. They can be used to send data to and receive data from the business logic services.&lt;br /&gt;
&lt;br /&gt;
Two integration types are supported at the moment: REST and UiPath.&lt;br /&gt;
&lt;br /&gt;
[[File:service1.png|600px]]&lt;br /&gt;
&lt;br /&gt;
REST services allow you to configure the URL, headers and body of the request. When configuring the service fields you can insert values from the user's session using the {{&amp;lt;entity name&amp;gt;}} syntax.&lt;br /&gt;
&lt;br /&gt;
Services also support synchronous answers. To inject data into the user's session the API should return a JSON object with the keys and values you want to inject.&lt;br /&gt;
&lt;br /&gt;
[[File:service2.png|600px]]&lt;br /&gt;
&lt;br /&gt;
To configure a UiPath service you need to have an active UiPath integration.&lt;br /&gt;
&lt;br /&gt;
When configuring the UiPath integration you will fill in the UiPath Orchestrator credentials.&lt;br /&gt;
&lt;br /&gt;
When configuring the UiPath service you can select which job you want to trigger, the robot that will execute the job and the JSON parameters that will be sent to the job.&lt;br /&gt;
&lt;br /&gt;
When generating the UiPath call, the request engine will also inject a token in the JSON parameters. The token allows the UiPath job to make a callback after finishing the processing.&lt;br /&gt;
&lt;br /&gt;
[[File:service3.png|600px]]&lt;/div&gt;</summary>
		<author><name>Admin</name></author>	</entry>

	<entry>
		<id>http://wiki.wisevoice.ai/index.php?title=Services&amp;diff=91</id>
		<title>Services</title>
		<link rel="alternate" type="text/html" href="http://wiki.wisevoice.ai/index.php?title=Services&amp;diff=91"/>
				<updated>2019-11-19T09:09:25Z</updated>
		
		<summary type="html">&lt;p&gt;Admin: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Service components handle calls to external business logic. They can be used to send data to and receive data from business logic services.&lt;br /&gt;
&lt;br /&gt;
Two integration types are supported at the moment: REST and UiPath.&lt;br /&gt;
&lt;br /&gt;
[[File:service1.png|600px]]&lt;br /&gt;
&lt;br /&gt;
REST services allow you to configure the URL, headers and body of the request. When configuring the service fields you can insert values from the user's session using the {{&amp;lt;entity name&amp;gt;}} syntax.&lt;br /&gt;
&lt;br /&gt;
[[File:service2.png|600px]]&lt;br /&gt;
&lt;br /&gt;
To configure a UiPath service you need to have an active UiPath integration.&lt;br /&gt;
&lt;br /&gt;
When configuring the UiPath integration you will fill in the UiPath Orchestrator credentials.&lt;br /&gt;
&lt;br /&gt;
When configuring the UiPath service you can select which job you want to trigger, the robot that will execute it, and the JSON parameters that will be sent to the job.&lt;br /&gt;
&lt;br /&gt;
When generating the UiPath call, the request engine will also inject a token into the JSON parameters. The token allows the UiPath job to make a callback once processing finishes.&lt;br /&gt;
&lt;br /&gt;
[[File:service3.png|600px]]&lt;br /&gt;
&lt;br /&gt;
Services also support synchronous answers.&lt;/div&gt;</summary>
		<author><name>Admin</name></author>	</entry>

	<entry>
		<id>http://wiki.wisevoice.ai/index.php?title=Entity_types&amp;diff=90</id>
		<title>Entity types</title>
		<link rel="alternate" type="text/html" href="http://wiki.wisevoice.ai/index.php?title=Entity_types&amp;diff=90"/>
				<updated>2019-11-19T08:59:15Z</updated>
		
		<summary type="html">&lt;p&gt;Admin: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Entity types define what a valid entity looks like.&lt;br /&gt;
&lt;br /&gt;
There are system types supported out of the box, such as Date, Number and Free text, and you can also define custom entity types.&lt;br /&gt;
&lt;br /&gt;
[[File:entity_type1.png|600px]]&lt;br /&gt;
&lt;br /&gt;
Entity types can be defined either as a value list or as a regular expression.&lt;br /&gt;
&lt;br /&gt;
==Defining with value lists==&lt;br /&gt;
&lt;br /&gt;
When defining a value list you can also choose the appropriate matching sensitivity, ranging from exact matching to wildcard matching.&lt;br /&gt;
&lt;br /&gt;
For each value you can define a list of synonyms. If one of the synonyms is detected it will be automatically replaced with the base value when saving the entity in the user session. This helps keep NLP tasks out of the business logic pipeline.&lt;br /&gt;
&lt;br /&gt;
Value lists can also be imported as CSV files or refreshed in real time from a URL.&lt;br /&gt;
&lt;br /&gt;
When marking entities in training examples in the Intent, Entity or Annotation views, new values will be automatically inserted into the value list corresponding to the entity type.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Defining with regular expressions ==&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Here is a RegEx example for defining a phone number entity type:&lt;br /&gt;
&lt;br /&gt;
''0 ?[73] ?(\d ?){n}''&lt;br /&gt;
&lt;br /&gt;
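A quick way to try the pattern, taking n = 8 purely for illustration (a 10-digit number starting with 07 or 03, with optional spaces between digits):

```javascript
// The pattern above with n = 8: leading 0, then 7 or 3, then eight more
// digits, with optional single spaces between them.
var phonePattern = /^0 ?[73] ?(\d ?){8}$/;

console.log(phonePattern.test("0741 234 567")); // true
console.log(phonePattern.test("12345"));        // false
```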
Read more about [https://developer.mozilla.org/en-US/docs/Web/JavaScript/Guide/Regular_Expressions Regular Expressions] here.&lt;/div&gt;</summary>
		<author><name>Admin</name></author>	</entry>

	<entry>
		<id>http://wiki.wisevoice.ai/index.php?title=Entity_types&amp;diff=89</id>
		<title>Entity types</title>
		<link rel="alternate" type="text/html" href="http://wiki.wisevoice.ai/index.php?title=Entity_types&amp;diff=89"/>
				<updated>2019-11-19T08:56:26Z</updated>
		
		<summary type="html">&lt;p&gt;Admin: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Entity types define what a valid entity looks like.&lt;br /&gt;
&lt;br /&gt;
There are system types supported out of the box, such as Date, Number and Free text, and you can also define custom entity types.&lt;br /&gt;
&lt;br /&gt;
[[File:entity_type1.png|600px]]&lt;br /&gt;
&lt;br /&gt;
Entity types can be defined either as a value list or as a regular expression.&lt;br /&gt;
&lt;br /&gt;
When defining a value list you can also choose the appropriate matching sensitivity, ranging from exact matching to wildcard matching.&lt;br /&gt;
&lt;br /&gt;
For each value you can define a list of synonyms. If one of the synonyms is detected it will be automatically replaced with the base value when saving the entity in the user session. This helps keep NLP tasks out of the business logic pipeline.&lt;br /&gt;
&lt;br /&gt;
Value lists can also be imported as CSV files or refreshed in real time from a URL.&lt;br /&gt;
&lt;br /&gt;
Here is a RegEx example for defining a phone number entity type:&lt;br /&gt;
&lt;br /&gt;
''0 ?[73] ?(\d ?){n}''&lt;br /&gt;
&lt;br /&gt;
Read more about [https://developer.mozilla.org/en-US/docs/Web/JavaScript/Guide/Regular_Expressions Regular Expressions] here.&lt;/div&gt;</summary>
		<author><name>Admin</name></author>	</entry>

	<entry>
		<id>http://wiki.wisevoice.ai/index.php?title=Entity_types&amp;diff=88</id>
		<title>Entity types</title>
		<link rel="alternate" type="text/html" href="http://wiki.wisevoice.ai/index.php?title=Entity_types&amp;diff=88"/>
				<updated>2019-11-19T08:55:58Z</updated>
		
		<summary type="html">&lt;p&gt;Admin: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Entity types define what a valid entity looks like.&lt;br /&gt;
&lt;br /&gt;
There are system types supported out of the box, such as Date, Number and Free text, and you can also define custom entity types.&lt;br /&gt;
&lt;br /&gt;
[[File:entity_type1.png|600px]]&lt;br /&gt;
&lt;br /&gt;
Entity types can be defined either as a value list or as a regular expression.&lt;br /&gt;
&lt;br /&gt;
When defining a value list you can also choose the appropriate matching sensitivity, ranging from exact matching to wildcard matching.&lt;br /&gt;
&lt;br /&gt;
For each value you can define a list of synonyms. If one of the synonyms is detected it will be automatically replaced with the base value when saving the entity in the user session. This helps keep NLP tasks out of the business logic pipeline.&lt;br /&gt;
&lt;br /&gt;
Value lists can also be imported as CSV files or refreshed in real time from a URL.&lt;br /&gt;
&lt;br /&gt;
Here is a RegEx example for defining a phone number entity type:&lt;br /&gt;
&lt;br /&gt;
0 ?[73] ?(\d ?){n}&lt;br /&gt;
&lt;br /&gt;
Read more about [https://developer.mozilla.org/en-US/docs/Web/JavaScript/Guide/Regular_Expressions Regular Expressions] here.&lt;/div&gt;</summary>
		<author><name>Admin</name></author>	</entry>

	<entry>
		<id>http://wiki.wisevoice.ai/index.php?title=Entity_types&amp;diff=87</id>
		<title>Entity types</title>
		<link rel="alternate" type="text/html" href="http://wiki.wisevoice.ai/index.php?title=Entity_types&amp;diff=87"/>
				<updated>2019-11-19T08:51:47Z</updated>
		
		<summary type="html">&lt;p&gt;Admin: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Entity types define what a valid entity looks like.&lt;br /&gt;
&lt;br /&gt;
There are system types supported out of the box, such as Date, Number and Free text, and you can also define custom entity types.&lt;br /&gt;
&lt;br /&gt;
[[File:entity_type1.png|600px]]&lt;br /&gt;
&lt;br /&gt;
Entity types can be defined either as a value list or as a regular expression.&lt;br /&gt;
&lt;br /&gt;
When defining a value list you can also choose the appropriate matching sensitivity, ranging from exact matching to wildcard matching.&lt;br /&gt;
&lt;br /&gt;
For each value you can define a list of synonyms. If one of the synonyms is detected it will be automatically replaced with the base value when saving the entity in the user session. This helps keep NLP tasks out of the business logic pipeline.&lt;br /&gt;
&lt;br /&gt;
Value lists can also be imported as CSV files or refreshed in real time from a URL.&lt;/div&gt;</summary>
		<author><name>Admin</name></author>	</entry>

	<entry>
		<id>http://wiki.wisevoice.ai/index.php?title=Entity_types&amp;diff=86</id>
		<title>Entity types</title>
		<link rel="alternate" type="text/html" href="http://wiki.wisevoice.ai/index.php?title=Entity_types&amp;diff=86"/>
				<updated>2019-11-19T08:51:32Z</updated>
		
		<summary type="html">&lt;p&gt;Admin: Created page with &amp;quot;Entity types define how a valid entity looks like.  There are system types supported out of the box like Date, Number and Free text and you can also define custom entity types...&amp;quot;&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Entity types define what a valid entity looks like.&lt;br /&gt;
&lt;br /&gt;
There are system types supported out of the box, such as Date, Number and Free text, and you can also define custom entity types.&lt;br /&gt;
&lt;br /&gt;
[[File:entity_type1.png]]&lt;br /&gt;
&lt;br /&gt;
Entity types can be defined either as a value list or as a regular expression.&lt;br /&gt;
&lt;br /&gt;
When defining a value list you can also choose the appropriate matching sensitivity, ranging from exact matching to wildcard matching.&lt;br /&gt;
&lt;br /&gt;
For each value you can define a list of synonyms. If one of the synonyms is detected it will be automatically replaced with the base value when saving the entity in the user session. This helps keep NLP tasks out of the business logic pipeline.&lt;br /&gt;
&lt;br /&gt;
Value lists can also be imported as CSV files or refreshed in real time from a URL.&lt;/div&gt;</summary>
		<author><name>Admin</name></author>	</entry>

	<entry>
		<id>http://wiki.wisevoice.ai/index.php?title=File:Entity_type1.png&amp;diff=85</id>
		<title>File:Entity type1.png</title>
		<link rel="alternate" type="text/html" href="http://wiki.wisevoice.ai/index.php?title=File:Entity_type1.png&amp;diff=85"/>
				<updated>2019-11-19T08:43:15Z</updated>
		
		<summary type="html">&lt;p&gt;Admin: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Admin</name></author>	</entry>

	<entry>
		<id>http://wiki.wisevoice.ai/index.php?title=Scripts&amp;diff=84</id>
		<title>Scripts</title>
		<link rel="alternate" type="text/html" href="http://wiki.wisevoice.ai/index.php?title=Scripts&amp;diff=84"/>
				<updated>2019-11-19T08:40:28Z</updated>
		
		<summary type="html">&lt;p&gt;Admin: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Scripts are components used to do custom processing on data from the user's session.&lt;br /&gt;
&lt;br /&gt;
Scripts use the JavaScript programming language.&lt;br /&gt;
&lt;br /&gt;
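For example, a script can read and edit session variables through ''this.user_data''. A minimal sketch (the phone and phone_valid keys here are hypothetical, not platform-defined):

```javascript
// Hypothetical script body: normalizes a phone number kept in the session.
// The keys "phone" and "phone_valid" are illustrative examples only.
function runScript(session) {
  // strip everything that is not a digit
  var digits = String(session.user_data.phone).replace(/[^0-9]/g, "");
  session.user_data.phone = digits;
  session.user_data.phone_valid = digits.length === 10;
  return session;
}

var demo = { user_data: { phone: "07 41 234 567" } };
runScript(demo);
// demo.user_data.phone is now "0741234567" and demo.user_data.phone_valid is true
```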
You can read and edit any of the variables in the user's session by referencing them with ''this.user_data.key''&lt;/div&gt;</summary>
		<author><name>Admin</name></author>	</entry>

	<entry>
		<id>http://wiki.wisevoice.ai/index.php?title=Services&amp;diff=83</id>
		<title>Services</title>
		<link rel="alternate" type="text/html" href="http://wiki.wisevoice.ai/index.php?title=Services&amp;diff=83"/>
				<updated>2019-11-19T08:37:05Z</updated>
		
		<summary type="html">&lt;p&gt;Admin: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Service components handle calls to external business logic. They can be used to send data to and receive data from business logic services.&lt;br /&gt;
&lt;br /&gt;
Two integration types are supported at the moment: REST and UiPath.&lt;br /&gt;
&lt;br /&gt;
[[File:service1.png|600px]]&lt;br /&gt;
&lt;br /&gt;
REST services allow you to configure the URL, headers and body of the request. When configuring the service fields you can insert values from the user's session using the {{&amp;lt;entity name&amp;gt;}} syntax.&lt;br /&gt;
&lt;br /&gt;
[[File:service2.png|600px]]&lt;br /&gt;
&lt;br /&gt;
To configure a UiPath service you need to have an active UiPath integration.&lt;br /&gt;
&lt;br /&gt;
When configuring the UiPath integration you will fill in the UiPath Orchestrator credentials.&lt;br /&gt;
&lt;br /&gt;
When configuring the UiPath service you can select which job you want to trigger, the robot that will execute it, and the JSON parameters that will be sent to the job.&lt;br /&gt;
&lt;br /&gt;
When generating the UiPath call, the request engine will also inject a token into the JSON parameters. The token allows the UiPath job to make a callback once processing finishes.&lt;br /&gt;
&lt;br /&gt;
[[File:service3.png|600px]]&lt;/div&gt;</summary>
		<author><name>Admin</name></author>	</entry>

	<entry>
		<id>http://wiki.wisevoice.ai/index.php?title=File:Service3.png&amp;diff=82</id>
		<title>File:Service3.png</title>
		<link rel="alternate" type="text/html" href="http://wiki.wisevoice.ai/index.php?title=File:Service3.png&amp;diff=82"/>
				<updated>2019-11-19T08:36:47Z</updated>
		
		<summary type="html">&lt;p&gt;Admin: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Admin</name></author>	</entry>

	<entry>
		<id>http://wiki.wisevoice.ai/index.php?title=Services&amp;diff=81</id>
		<title>Services</title>
		<link rel="alternate" type="text/html" href="http://wiki.wisevoice.ai/index.php?title=Services&amp;diff=81"/>
				<updated>2019-11-19T08:35:19Z</updated>
		
		<summary type="html">&lt;p&gt;Admin: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Service components handle calls to external business logic. They can be used to send data to and receive data from business logic services.&lt;br /&gt;
&lt;br /&gt;
Two integration types are supported at the moment: REST and UiPath.&lt;br /&gt;
&lt;br /&gt;
[[File:service1.png|600px]]&lt;br /&gt;
&lt;br /&gt;
REST services allow you to configure the URL, headers and body of the request. When configuring the service fields you can insert values from the user's session using the {{&amp;lt;entity name&amp;gt;}} syntax.&lt;br /&gt;
&lt;br /&gt;
[[File:service2.png|600px]]&lt;br /&gt;
&lt;br /&gt;
To configure a UiPath service you need to have an active UiPath integration.&lt;br /&gt;
&lt;br /&gt;
When configuring the UiPath integration you will fill in the UiPath Orchestrator credentials.&lt;br /&gt;
&lt;br /&gt;
When configuring the UiPath service you can select which job you want to trigger, the robot that will execute it, and the JSON parameters that will be sent to the job.&lt;br /&gt;
&lt;br /&gt;
When generating the UiPath call, the request engine will also inject a token into the JSON parameters. The token allows the UiPath job to make a callback once processing finishes.&lt;/div&gt;</summary>
		<author><name>Admin</name></author>	</entry>

	<entry>
		<id>http://wiki.wisevoice.ai/index.php?title=Services&amp;diff=80</id>
		<title>Services</title>
		<link rel="alternate" type="text/html" href="http://wiki.wisevoice.ai/index.php?title=Services&amp;diff=80"/>
				<updated>2019-11-19T08:07:59Z</updated>
		
		<summary type="html">&lt;p&gt;Admin: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Service components handle calls to external business logic. They can be used to send data to and receive data from business logic services.&lt;br /&gt;
&lt;br /&gt;
Two integration types are supported at the moment: REST and UiPath.&lt;br /&gt;
&lt;br /&gt;
[[File:service1.png|600px]]&lt;br /&gt;
&lt;br /&gt;
REST services&lt;/div&gt;</summary>
		<author><name>Admin</name></author>	</entry>

	<entry>
		<id>http://wiki.wisevoice.ai/index.php?title=Services&amp;diff=79</id>
		<title>Services</title>
		<link rel="alternate" type="text/html" href="http://wiki.wisevoice.ai/index.php?title=Services&amp;diff=79"/>
				<updated>2019-11-19T08:07:41Z</updated>
		
		<summary type="html">&lt;p&gt;Admin: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Service components handle calls to external business logic. They can be used to send data to and receive data from business logic services.&lt;br /&gt;
&lt;br /&gt;
Two integration types are supported at the moment: REST and UiPath.&lt;br /&gt;
&lt;br /&gt;
[[File:service2.png|600px]]&lt;br /&gt;
&lt;br /&gt;
REST services&lt;/div&gt;</summary>
		<author><name>Admin</name></author>	</entry>

	<entry>
		<id>http://wiki.wisevoice.ai/index.php?title=File:Service2.png&amp;diff=78</id>
		<title>File:Service2.png</title>
		<link rel="alternate" type="text/html" href="http://wiki.wisevoice.ai/index.php?title=File:Service2.png&amp;diff=78"/>
				<updated>2019-11-19T08:00:59Z</updated>
		
		<summary type="html">&lt;p&gt;Admin: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Admin</name></author>	</entry>

	<entry>
		<id>http://wiki.wisevoice.ai/index.php?title=File:Service1.png&amp;diff=77</id>
		<title>File:Service1.png</title>
		<link rel="alternate" type="text/html" href="http://wiki.wisevoice.ai/index.php?title=File:Service1.png&amp;diff=77"/>
				<updated>2019-11-19T08:00:38Z</updated>
		
		<summary type="html">&lt;p&gt;Admin: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Admin</name></author>	</entry>

	<entry>
		<id>http://wiki.wisevoice.ai/index.php?title=Conversation_Flows&amp;diff=76</id>
		<title>Conversation Flows</title>
		<link rel="alternate" type="text/html" href="http://wiki.wisevoice.ai/index.php?title=Conversation_Flows&amp;diff=76"/>
				<updated>2019-11-18T16:13:14Z</updated>
		
		<summary type="html">&lt;p&gt;Admin: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;br /&gt;
Dialogs based on conversation flows are guided dialogs. With conversation flows you create a dialog tree where you define all the branches of the conversation. Here you can also connect different services and scripts to match business logic requirements.&lt;br /&gt;
&lt;br /&gt;
[[File:Demo_canvas.png|800px|link=http://www.wiki.wisevoice.ai]]&lt;br /&gt;
&lt;br /&gt;
Conversation flows are triggered either by an event or by another flow.&lt;br /&gt;
Flows are built using six building blocks. Read more about each building block on its dedicated page:&lt;br /&gt;
*'''Events'''&lt;br /&gt;
*'''Intents'''&lt;br /&gt;
*'''Messages'''&lt;br /&gt;
*'''Entities'''&lt;br /&gt;
*'''Scripts'''&lt;br /&gt;
*'''Services'''&lt;br /&gt;
&lt;br /&gt;
In the example above we have a flow that is triggered by the Start Event. The Start Event is the most common type of event and is triggered when the user opens the chat or calls in.&lt;br /&gt;
The Start Event is usually connected to a message node that contains the welcome message. This means that when users open the chat they will instantly receive the welcome message.&lt;br /&gt;
After sending the welcome message, the virtual agent waits for user input. When it receives input it detects the intent and continues on that branch. If, for example, the agent couldn't detect the intent, it selects the &amp;quot;default fallback&amp;quot; intent, which is configured as a fallback; it then sends the user the message from the &amp;quot;mesaj fallback&amp;quot; node and waits for user input again. The &amp;quot;mesaj fallback&amp;quot; node is connected to all the intents from the previous state, creating a loop.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
The topics below describe the mechanics of conversation flows.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Stop conditions ==&lt;br /&gt;
&lt;br /&gt;
In more complex scenarios that also contain scripts and services the agent might pass through multiple nodes in the flow before generating the response for the user.&lt;br /&gt;
The agent stops and returns a message in three situations:&lt;br /&gt;
1. It reached the end of the flow&lt;br /&gt;
2. It reached a node that is followed by intents, thus user input is required&lt;br /&gt;
3. It reached an entity node that requires user input&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Component library ==&lt;br /&gt;
&lt;br /&gt;
Every node on the canvas holds a reference to a component. The node receives the name and the configuration of the component, but it is only a reference to it. This means that deleting a node does not delete the component, and that the same component can be referenced on different branches of the conversation.&lt;br /&gt;
All components can be browsed in the component library.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Connections ==&lt;br /&gt;
&lt;br /&gt;
The default way of adding new nodes to the flow is by using the &amp;quot;Add new connection&amp;quot; button on an existing node. This will create a node connected to the parent node.&lt;br /&gt;
When creating nodes from an existing node you can't create event nodes, as events only exist as entry points. There is a dedicated button on the left side of the screen for creating events.&lt;br /&gt;
&lt;br /&gt;
If you do not want to create a new node but to connect to an existing one, you can use the connections tab.&lt;br /&gt;
In the detail view of a node you can find all the information about the component and a connections tab. From the connections tab you can create connections to existing nodes by pressing &amp;quot;Add connection&amp;quot;.&lt;br /&gt;
&lt;br /&gt;
When more than one destination node is connected to the same source node, you can use the connections tab to define conditions for the different branches.&lt;br /&gt;
If the destination node is an intent node, the condition is implicit: the flow will continue on that branch only when that intent is matched.&lt;br /&gt;
For destination nodes that are not intents (messages, entities, scripts, services) you can configure custom conditions based on different session variables, such as entities extracted from the user or the current time of day (e.g. if age &amp;lt; 20 then go to message1; else go to message2).&lt;br /&gt;
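The age example can be sketched as plain JavaScript; chooseBranch and the session shape are illustrative only, not a platform API:

```javascript
// Illustrative sketch of a branch condition on a session variable:
// sessions where age is 20 or more go to message2, the rest to message1.
function chooseBranch(session) {
  if (session.age >= 20) {
    return "message2";
  }
  return "message1";
}

chooseBranch({ age: 18 }); // "message1"
```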
&lt;br /&gt;
&lt;br /&gt;
==Retraining==&lt;br /&gt;
&lt;br /&gt;
After doing a change to a flow press the 'Retrain' button to update the model.&lt;br /&gt;
If you only update one component you can press the 'Retrain' button inside the detail view of that component.&lt;br /&gt;
To retrain the whole flow press the 'Retrain' button in the canvas.&lt;br /&gt;
&lt;br /&gt;
The retrain time grows with the size of the flow and the number of entities inside the flow. You cannot start a second retrain while the first one is still in progress.&lt;/div&gt;</summary>
		<author><name>Admin</name></author>	</entry>

	<entry>
		<id>http://wiki.wisevoice.ai/index.php?title=MediaWiki:Sidebar&amp;diff=75</id>
		<title>MediaWiki:Sidebar</title>
		<link rel="alternate" type="text/html" href="http://wiki.wisevoice.ai/index.php?title=MediaWiki:Sidebar&amp;diff=75"/>
				<updated>2019-11-18T16:12:39Z</updated>
		
		<summary type="html">&lt;p&gt;Admin: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;br /&gt;
* navigation&lt;br /&gt;
** Wisevoice Builder|Wisevoice Builder&lt;br /&gt;
** Conversation Flows|Conversation flows&lt;br /&gt;
***Events|Events&lt;br /&gt;
***Intents|Intents&lt;br /&gt;
***Messages|Messages&lt;br /&gt;
***Entities|Entities&lt;br /&gt;
***Services|Services&lt;br /&gt;
***Scripts|Scripts&lt;br /&gt;
***Entity types|Entity types&lt;br /&gt;
* SEARCH&lt;br /&gt;
* TOOLBOX&lt;br /&gt;
* LANGUAGES&lt;/div&gt;</summary>
		<author><name>Admin</name></author>	</entry>

	<entry>
		<id>http://wiki.wisevoice.ai/index.php?title=MediaWiki:Sidebar&amp;diff=74</id>
		<title>MediaWiki:Sidebar</title>
		<link rel="alternate" type="text/html" href="http://wiki.wisevoice.ai/index.php?title=MediaWiki:Sidebar&amp;diff=74"/>
				<updated>2019-11-18T16:11:11Z</updated>
		
		<summary type="html">&lt;p&gt;Admin: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;br /&gt;
* navigation&lt;br /&gt;
** Wisevoice Builder|Wisevoice Builder&lt;br /&gt;
** Conversation flows|Conversation flows&lt;br /&gt;
***Events|Events&lt;br /&gt;
***Intents|Intents&lt;br /&gt;
***Messages|Messages&lt;br /&gt;
***Entities|Entities&lt;br /&gt;
***Services|Services&lt;br /&gt;
***Scripts|Scripts&lt;br /&gt;
***Entity types|Entity types&lt;br /&gt;
* SEARCH&lt;br /&gt;
* TOOLBOX&lt;br /&gt;
* LANGUAGES&lt;/div&gt;</summary>
		<author><name>Admin</name></author>	</entry>

	<entry>
		<id>http://wiki.wisevoice.ai/index.php?title=Entities&amp;diff=73</id>
		<title>Entities</title>
		<link rel="alternate" type="text/html" href="http://wiki.wisevoice.ai/index.php?title=Entities&amp;diff=73"/>
				<updated>2019-11-18T16:10:19Z</updated>
		
		<summary type="html">&lt;p&gt;Admin: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Entities are variables that can be extracted from the user input.&lt;br /&gt;
&lt;br /&gt;
The entity component is used to configure how the entity is going to be extracted and validated.&lt;br /&gt;
&lt;br /&gt;
[[File:Entity.png|500px]]&lt;br /&gt;
&lt;br /&gt;
The entity name is important because it will be used as a key when saving the extracted entity in the user's session.&lt;br /&gt;
&lt;br /&gt;
The entity type defines the format and rules a valid entity must match. &lt;br /&gt;
There are some system types supported out of the box, such as Date, Number and Free text and you can define your own custom types. Read more about creating new [[Entity Types]].&lt;br /&gt;
&lt;br /&gt;
When the flow reaches an Entity node it will first check whether the entity is already present in the user's session. If the session already contains the entity key, or if the entity is not configured as &amp;quot;Required&amp;quot;, the flow continues; otherwise the agent will send the prompt message. &lt;br /&gt;
&lt;br /&gt;
The prompt message can be configured in a similar manner as a regular [[Messages|Message]].&lt;br /&gt;
&lt;br /&gt;
After sending the prompt message the agent waits for user input. It will use the expression examples and the constraints of the entity type to attempt to extract the entity from the user input.&lt;br /&gt;
If no entity is extracted, the flow will continue with the error prompt and stay in a loop until receiving a valid entity.&lt;br /&gt;
If an entity was extracted, the flow will continue with the validation phase if it is enabled, or otherwise move on to the next node.&lt;br /&gt;
&lt;br /&gt;
If user validation is enabled the agent will send the validation prompt, asking the user to validate or update the extracted entity. This behavior is particularly useful for call center conversations.&lt;br /&gt;
User validation will trigger even if the entity was already present in the user session from previous steps.&lt;br /&gt;
&lt;br /&gt;
'''Interactions with other components'''&lt;br /&gt;
&lt;br /&gt;
Entities, like all other context variables, can be used in messages and services using the {{&amp;lt;entity name&amp;gt;}} syntax, e.g. &amp;quot;Salut, {{nume}}!&amp;quot;&lt;br /&gt;
They can also be used in scripts and connection conditions.&lt;br /&gt;
&lt;br /&gt;
After creating an entity you can also mark its occurrences in training expressions from Intents. Just select the word or group of words and a popup showing the available entities will appear.&lt;br /&gt;
This allows extracting entities together with the intent from a single user input. If this happens the entity node will be skipped.&lt;/div&gt;</summary>
		<author><name>Admin</name></author>	</entry>

	<entry>
		<id>http://wiki.wisevoice.ai/index.php?title=Entities&amp;diff=72</id>
		<title>Entities</title>
		<link rel="alternate" type="text/html" href="http://wiki.wisevoice.ai/index.php?title=Entities&amp;diff=72"/>
				<updated>2019-11-18T15:54:41Z</updated>
		
		<summary type="html">&lt;p&gt;Admin: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Entities are variables that can be extracted from the user input.&lt;br /&gt;
&lt;br /&gt;
The entity component is used to configure how the entity is going to be extracted and validated.&lt;br /&gt;
&lt;br /&gt;
[[File:Entity.png|500px]]&lt;br /&gt;
&lt;br /&gt;
The entity name is important because it will be used as a key when saving the extracted entity in the user's session.&lt;br /&gt;
&lt;br /&gt;
The entity type defines the format and rules a valid entity must match. &lt;br /&gt;
There are some system types supported out of the box, such as Date, Number and Free text and you can define your own custom types. Read more about creating new [[Entity Types]].&lt;br /&gt;
&lt;br /&gt;
When the flow reaches an Entity node it will first check whether the entity is already present in the user's session. If it is, the flow continues; otherwise the agent will send the prompt message. The prompt message can be configured in a similar manner as a regular [[Messages|Message]].&lt;/div&gt;</summary>
		<author><name>Admin</name></author>	</entry>

	<entry>
		<id>http://wiki.wisevoice.ai/index.php?title=Entities&amp;diff=71</id>
		<title>Entities</title>
		<link rel="alternate" type="text/html" href="http://wiki.wisevoice.ai/index.php?title=Entities&amp;diff=71"/>
				<updated>2019-11-18T15:54:28Z</updated>
		
		<summary type="html">&lt;p&gt;Admin: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Entities are variables that can be extracted from the user input.&lt;br /&gt;
&lt;br /&gt;
The entity component is used to configure how the entity is going to be extracted and validated.&lt;br /&gt;
&lt;br /&gt;
[[File:Entity.png|500px]]&lt;br /&gt;
&lt;br /&gt;
The entity name is important because it will be used as a key when saving the extracted entity in the user's session.&lt;br /&gt;
&lt;br /&gt;
The entity type defines the format and rules a valid entity must match. &lt;br /&gt;
There are some system types supported out of the box, such as Date, Number and Free text and you can define your own custom types. Read more about creating new [[Entity Types]].&lt;br /&gt;
&lt;br /&gt;
When the flow reaches an Entity node it will first check whether the entity is already present in the user's session. If it is, the flow continues; otherwise the agent will send the prompt message. The prompt message can be configured in a similar manner as a regular [[Messages|Message]].&lt;/div&gt;</summary>
		<author><name>Admin</name></author>	</entry>

	<entry>
		<id>http://wiki.wisevoice.ai/index.php?title=Entities&amp;diff=70</id>
		<title>Entities</title>
		<link rel="alternate" type="text/html" href="http://wiki.wisevoice.ai/index.php?title=Entities&amp;diff=70"/>
				<updated>2019-11-18T15:54:17Z</updated>
		
		<summary type="html">&lt;p&gt;Admin: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Entities are variables that can be extracted from the user input.&lt;br /&gt;
&lt;br /&gt;
The entity component is used to configure how the entity is going to be extracted and validated.&lt;br /&gt;
&lt;br /&gt;
[[File:Entity.png|500px]]&lt;br /&gt;
&lt;br /&gt;
The entity name is important because it will be used as a key when saving the extracted entity in the user's session.&lt;br /&gt;
&lt;br /&gt;
The entity type defines the format and rules a valid entity must match. &lt;br /&gt;
There are some system types supported out of the box, such as Date, Number and Free text and you can define your own custom types. Read more about creating new [[Entity Types]].&lt;br /&gt;
&lt;br /&gt;
When the flow reaches an Entity node it will first check whether the entity is already present in the user's session. If it is, the flow continues; otherwise the agent will send the prompt message. The prompt message can be configured in a similar manner as a regular [[Messages|Message]].&lt;/div&gt;</summary>
		<author><name>Admin</name></author>	</entry>

	<entry>
		<id>http://wiki.wisevoice.ai/index.php?title=Entity_Types&amp;diff=69</id>
		<title>Entity Types</title>
		<link rel="alternate" type="text/html" href="http://wiki.wisevoice.ai/index.php?title=Entity_Types&amp;diff=69"/>
				<updated>2019-11-18T15:47:15Z</updated>
		
		<summary type="html">&lt;p&gt;Admin: Created page with &amp;quot;''Coming soon''&amp;quot;&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;''Coming soon''&lt;/div&gt;</summary>
		<author><name>Admin</name></author>	</entry>

	<entry>
		<id>http://wiki.wisevoice.ai/index.php?title=Entities&amp;diff=68</id>
		<title>Entities</title>
		<link rel="alternate" type="text/html" href="http://wiki.wisevoice.ai/index.php?title=Entities&amp;diff=68"/>
				<updated>2019-11-18T15:46:07Z</updated>
		
		<summary type="html">&lt;p&gt;Admin: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Entities are variables that can be extracted from the user input.&lt;br /&gt;
&lt;br /&gt;
The entity component is used to configure how the entity is extracted and validated.&lt;br /&gt;
&lt;br /&gt;
[[File:Entity.png|500px]]&lt;br /&gt;
&lt;br /&gt;
The entity name is important because, besides being helpful during configuration, it is also used as the key when saving the extracted entity in the user's session.&lt;br /&gt;
&lt;br /&gt;
The entity type defines the format and rules a valid entity must match. &lt;br /&gt;
Several system types are supported out of the box, such as Date, Number, and Free text, and you can also define your own custom types. Read more about creating new [[Entity Types]].&lt;/div&gt;</summary>
		<author><name>Admin</name></author>	</entry>

	<entry>
		<id>http://wiki.wisevoice.ai/index.php?title=Entities&amp;diff=67</id>
		<title>Entities</title>
		<link rel="alternate" type="text/html" href="http://wiki.wisevoice.ai/index.php?title=Entities&amp;diff=67"/>
				<updated>2019-11-18T15:45:10Z</updated>
		
		<summary type="html">&lt;p&gt;Admin: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Entities are variables that can be extracted from the user input.&lt;br /&gt;
&lt;br /&gt;
The entity component is used to configure how the entity is extracted and validated.&lt;br /&gt;
&lt;br /&gt;
[[File:Entity.png|500px]]&lt;br /&gt;
&lt;br /&gt;
The entity name is important because, besides being helpful during configuration, it is also used as the key when saving the extracted entity in the user's session.&lt;br /&gt;
&lt;br /&gt;
The entity type defines the format and rules a valid entity must match. &lt;br /&gt;
Several system types are supported out of the box, such as Date, Number, and Free text, and you can also define your own custom types.&lt;br /&gt;
Read more about defining [[Entity Types]].&lt;/div&gt;</summary>
		<author><name>Admin</name></author>	</entry>

	<entry>
		<id>http://wiki.wisevoice.ai/index.php?title=Entities&amp;diff=66</id>
		<title>Entities</title>
		<link rel="alternate" type="text/html" href="http://wiki.wisevoice.ai/index.php?title=Entities&amp;diff=66"/>
				<updated>2019-11-18T15:35:10Z</updated>
		
		<summary type="html">&lt;p&gt;Admin: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Entities are variables that can be extracted from the user input.&lt;br /&gt;
&lt;br /&gt;
[[File:Entity.png|500px]]&lt;/div&gt;</summary>
		<author><name>Admin</name></author>	</entry>

	<entry>
		<id>http://wiki.wisevoice.ai/index.php?title=Entities&amp;diff=65</id>
		<title>Entities</title>
		<link rel="alternate" type="text/html" href="http://wiki.wisevoice.ai/index.php?title=Entities&amp;diff=65"/>
				<updated>2019-11-18T15:34:14Z</updated>
		
		<summary type="html">&lt;p&gt;Admin: Created page with &amp;quot;Entities are variables extracted from the user input.  500px&amp;quot;&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Entities are variables extracted from the user input.&lt;br /&gt;
&lt;br /&gt;
[[File:Entity.png|500px]]&lt;/div&gt;</summary>
		<author><name>Admin</name></author>	</entry>

	<entry>
		<id>http://wiki.wisevoice.ai/index.php?title=File:Entity.png&amp;diff=64</id>
		<title>File:Entity.png</title>
		<link rel="alternate" type="text/html" href="http://wiki.wisevoice.ai/index.php?title=File:Entity.png&amp;diff=64"/>
				<updated>2019-11-18T15:33:43Z</updated>
		
		<summary type="html">&lt;p&gt;Admin: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Admin</name></author>	</entry>

	<entry>
		<id>http://wiki.wisevoice.ai/index.php?title=Messages&amp;diff=63</id>
		<title>Messages</title>
		<link rel="alternate" type="text/html" href="http://wiki.wisevoice.ai/index.php?title=Messages&amp;diff=63"/>
				<updated>2019-11-18T15:33:19Z</updated>
		
		<summary type="html">&lt;p&gt;Admin: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Messages are components used to configure the virtual agent's replies.&lt;br /&gt;
&lt;br /&gt;
[[File:message.png|600px]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Messages contain text that is used as the main response. If the channel is callcenter, the text will be vocalized.&lt;br /&gt;
&lt;br /&gt;
Each message can also have a media URL that is sent together with the text. Configure the media URL in the bottom-right corner of each text variant.&lt;br /&gt;
&lt;br /&gt;
By pressing the &amp;quot;Add new&amp;quot; button you can create multiple text variants, from which the agent will select one at random.&lt;br /&gt;
&lt;br /&gt;
To insert user session data, such as extracted entities, into a message, use the syntax {{&amp;lt;entity name&amp;gt;}}. E.g. &amp;quot;Hello {{nume}}!&amp;quot;&lt;br /&gt;
&lt;br /&gt;
For text-based channels the agent will also send a list of suggestions that will be displayed above the keyboard.&lt;br /&gt;
[[File:message3.png|600px]] &lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
The action list is useful for the call center client and other custom clients. For example, when implementing a native-app custom client using the API integration, you can send custom actions to the client that trigger specific business logic, such as turning on the flash.&lt;br /&gt;
&lt;br /&gt;
For callcenter integrations there are three supported actions:&lt;br /&gt;
*Stop&lt;br /&gt;
*Redirect:&amp;lt;phone_number&amp;gt;&lt;br /&gt;
*Hold&lt;br /&gt;
&lt;br /&gt;
Other channel-specific settings are available in the tab dedicated to that channel.&lt;br /&gt;
&lt;br /&gt;
For example, the Callcenter channel has extra settings for handling silences with no user input.&lt;br /&gt;
&lt;br /&gt;
Checking the Template list checkbox will cause the suggestions to be displayed vertically on Facebook.&lt;br /&gt;
''Other message templates are coming soon.''&lt;/div&gt;</summary>
		<author><name>Admin</name></author>	</entry>

	<entry>
		<id>http://wiki.wisevoice.ai/index.php?title=Services&amp;diff=62</id>
		<title>Services</title>
		<link rel="alternate" type="text/html" href="http://wiki.wisevoice.ai/index.php?title=Services&amp;diff=62"/>
				<updated>2019-11-18T15:28:19Z</updated>
		
		<summary type="html">&lt;p&gt;Admin: Created page with &amp;quot;''coming soon''&amp;quot;&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;''coming soon''&lt;/div&gt;</summary>
		<author><name>Admin</name></author>	</entry>

	<entry>
		<id>http://wiki.wisevoice.ai/index.php?title=Scripts&amp;diff=61</id>
		<title>Scripts</title>
		<link rel="alternate" type="text/html" href="http://wiki.wisevoice.ai/index.php?title=Scripts&amp;diff=61"/>
				<updated>2019-11-18T15:28:02Z</updated>
		
		<summary type="html">&lt;p&gt;Admin: Created page with &amp;quot;''coming soon''&amp;quot;&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;''coming soon''&lt;/div&gt;</summary>
		<author><name>Admin</name></author>	</entry>

	<entry>
		<id>http://wiki.wisevoice.ai/index.php?title=MediaWiki:Sidebar&amp;diff=60</id>
		<title>MediaWiki:Sidebar</title>
		<link rel="alternate" type="text/html" href="http://wiki.wisevoice.ai/index.php?title=MediaWiki:Sidebar&amp;diff=60"/>
				<updated>2019-11-18T15:18:44Z</updated>
		
		<summary type="html">&lt;p&gt;Admin: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;br /&gt;
* navigation&lt;br /&gt;
** Wisevoice Builder|Wisevoice Builder&lt;br /&gt;
** Conversation flows|Conversation flows&lt;br /&gt;
***Events|Events&lt;br /&gt;
***Intents|Intents&lt;br /&gt;
***Messages|Messages&lt;br /&gt;
***Entities|Entities&lt;br /&gt;
***Services|Services&lt;br /&gt;
***Scripts|Scripts&lt;br /&gt;
* SEARCH&lt;br /&gt;
* TOOLBOX&lt;br /&gt;
* LANGUAGES&lt;/div&gt;</summary>
		<author><name>Admin</name></author>	</entry>

	<entry>
		<id>http://wiki.wisevoice.ai/index.php?title=Messages&amp;diff=59</id>
		<title>Messages</title>
		<link rel="alternate" type="text/html" href="http://wiki.wisevoice.ai/index.php?title=Messages&amp;diff=59"/>
				<updated>2019-11-18T15:02:00Z</updated>
		
		<summary type="html">&lt;p&gt;Admin: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Messages are components used to configure the virtual agent's replies.&lt;br /&gt;
&lt;br /&gt;
[[File:message.png|600px]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Messages contain text that is used as the main response. If the channel is callcenter, the text will be vocalized.&lt;br /&gt;
&lt;br /&gt;
Each message can also have a media URL that is sent together with the text.&lt;br /&gt;
&lt;br /&gt;
By pressing the &amp;quot;Add new&amp;quot; button you can create multiple text variants, from which the agent will select one at random.&lt;br /&gt;
&lt;br /&gt;
For text-based channels the agent will also send a list of suggestions that will be displayed above the keyboard.&lt;br /&gt;
[[File:message3.png|600px]] &lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
The action list is useful for the call center client and other custom clients. For example, when implementing a native-app custom client using the API integration, you can send custom actions to the client that trigger specific business logic, such as turning on the flash.&lt;br /&gt;
&lt;br /&gt;
For callcenter integrations there are three supported actions:&lt;br /&gt;
*Stop&lt;br /&gt;
*Redirect:&amp;lt;phone_number&amp;gt;&lt;br /&gt;
*Hold&lt;br /&gt;
&lt;br /&gt;
Other channel-specific settings are available in the tab dedicated to that channel.&lt;br /&gt;
&lt;br /&gt;
For example, the Callcenter channel has extra settings for handling silences with no user input.&lt;br /&gt;
&lt;br /&gt;
Checking the Template list checkbox will cause the suggestions to be displayed vertically on Facebook.&lt;br /&gt;
''Other message templates are coming soon.''&lt;/div&gt;</summary>
		<author><name>Admin</name></author>	</entry>

	</feed>