Configuring Conversational Intelligence (with LLM enabled)
Settings related to Generative AI can be configured in the Settings section of the Train tab, found in the left panel of the Engati platform.
Navigate to Train > Settings > User Experience.
When "Enable FAQ Responses" is activated, users can ask contextual questions and receive relevant answers directly from the LLM (Large Language Model). This feature leverages Generative AI capabilities to generate responses for trained FAQs, enhancing the accuracy and depth of answers.
By default, this feature is enabled from the UI, ensuring that FAQ responses are readily available and dynamically generated for user queries.
When enabled, this feature presents additional related questions after each response is generated, encouraging further exploration. Among these follow-up questions, nudges are prioritized over suggestions to guide users effectively.
Suggestions are drawn from the uploaded document and tailored to the user’s query. By generating relevant questions for each document topic, this feature keeps conversations flowing, helping users explore information seamlessly and get the most out of each response.
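To illustrate the general idea of drawing query-relevant follow-up questions from document content, here is a minimal sketch. It is not Engati's actual suggestion engine (which is not public); it ranks hypothetical candidate questions by simple word overlap with the user's query, and all function and variable names here are illustrative assumptions.

```python
import re

def suggest_follow_ups(query, candidate_questions, top_n=2):
    """Rank candidate questions (e.g. drawn from a document) by word
    overlap with the user's query and return the top few. Illustrative
    only; a real system would use semantic similarity, not word overlap."""
    query_words = set(re.findall(r"[a-z]+", query.lower()))

    def overlap(question):
        return len(query_words & set(re.findall(r"[a-z]+", question.lower())))

    ranked = sorted(candidate_questions, key=overlap, reverse=True)
    return ranked[:top_n]

# Hypothetical candidates that could come from an uploaded document
candidates = [
    "How do I upload a document?",
    "What file formats are supported for upload?",
    "How do I reset my password?",
]
print(suggest_follow_ups("upload a PDF document", candidates))
```

Questions sharing the most vocabulary with the query surface first, which is the rough intuition behind keeping follow-ups relevant to what the user just asked.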
This feature shortens document and reference URL links on non-website channels, making reference source URLs more concise and easier to share.
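Conceptually, link shortening maps a long URL to a short, stable token. The sketch below shows one common approach (a truncated hash of the URL); this is purely illustrative and the shortener domain and function name are assumptions, not Engati's implementation.

```python
import hashlib

def shorten(url, base="https://s.example.com/"):
    """Map a long URL to a short, stable link using a truncated SHA-256
    hash as the token. The same input always yields the same short link."""
    token = hashlib.sha256(url.encode("utf-8")).hexdigest()[:8]
    return base + token

long_url = "https://docs.example.com/knowledge-base/articles/2024/how-to-configure"
print(shorten(long_url))  # stable short link for the same input
```

A real service would also store the token-to-URL mapping so the short link can redirect back to the original source.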
We hope this documentation has given you a clearer understanding of how to make the most of the Generative AI settings on our platform.
If you face any issues or have queries, please reach out to us at [email protected].