AI Chat Assistant
Purpose
This app provides an easy way to add a chat interface that lets end users interact with your project.
Installation
Simply add the ai applications to your INSTALLED_APPS
and point the AI_CHAT_WORKFLOW_CLASS setting at your workflow class (as a dotted module path) in your settings:
INSTALLED_APPS = [
    ...
    'django_spire.ai',
    'django_spire.ai.chat',
    ...
]

# This is the class and module that will handle the chat interactions
AI_CHAT_WORKFLOW_CLASS = 'example.ai.chat.intelligence.chat_workflow.ChatWorkflow'
You also need to add django_spire's URLs to your project's urls.py:
from django.urls import path, include

urlpatterns = [
    path('django_spire/', include('django_spire.urls', namespace='django_spire')),
]
Warning
A properly configured Dandy installation is required. See the Dandy documentation for more information.
Usage
Your AI chat needs a workflow class; every message is sent to this class to be processed.
The method you provide must take three arguments:
request - the request object: django.core.handlers.wsgi.WSGIRequest
user_input - the message from the user: str
message_history - the message history of the chat: dandy.llm.MessageHistory
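At its simplest, the workflow can forward the user's message straight to an LlmBot and return the resulting intel. Below is a minimal sketch built only from the classes already used in this documentation; the class name MinimalChatWorkflow is just an example:

from dandy.llm import LlmBot, MessageHistory
from dandy.workflow import BaseWorkflow
from django.core.handlers.wsgi import WSGIRequest

from django_spire.ai.chat.messages import BaseMessageIntel, DefaultMessageIntel


class MinimalChatWorkflow(BaseWorkflow):
    @classmethod
    def process(
            cls,
            request: WSGIRequest,
            user_input: str,
            message_history: MessageHistory | None = None
    ) -> BaseMessageIntel:
        # Pass the user's message to the LlmBot and return its structured intel.
        return LlmBot.process(
            prompt=user_input,
            intel_class=DefaultMessageIntel,
            message_history=message_history,
        )

A fuller workflow can route messages to different intel classes based on the detected intent, as in the example below: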
from dandy.recorder import recorder_to_html_file
from dandy.workflow import BaseWorkflow
from dandy.llm import LlmBot, MessageHistory
from django.core.handlers.wsgi import WSGIRequest

from django_spire.ai.chat.messages import DefaultMessageIntel, BaseMessageIntel

from test_project.apps.ai.chat.intelligence.llm_maps import IntentLlmMap
from test_project.apps.ai.chat.intelligence.prompts import test_project_company_prompt
from test_project.apps.ai.chat.messages import ClownFlyingDistanceMessageIntel, PirateMessageIntel


class ChatWorkflow(BaseWorkflow):
    @classmethod
    @recorder_to_html_file('chat_workflow')
    def process(
            cls,
            request: WSGIRequest,
            user_input: str,
            message_history: MessageHistory | None = None
    ) -> BaseMessageIntel:
        # Classify the user's message into a single intent.
        intents = IntentLlmMap.process(user_input, max_return_values=1)

        if intents[0] == 'clowns':
            response = LlmBot.process(
                prompt=user_input,
                intel_class=ClownFlyingDistanceMessageIntel,
                message_history=message_history,
            )
        elif intents[0] == 'pirates':
            response = LlmBot.process(
                prompt=user_input,
                intel_class=PirateMessageIntel,
                message_history=message_history,
            )
        else:
            # Fall back to the default message intel with the company prompt
            # appended to the system prompt.
            response = LlmBot.process(
                prompt=user_input,
                intel_class=DefaultMessageIntel,
                message_history=message_history,
                postfix_system_prompt=test_project_company_prompt()
            )

        return response
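The class configured in AI_CHAT_WORKFLOW_CLASS is the one django_spire hands each chat message to. Conceptually, the dispatch works roughly like the sketch below; this is illustrative only (the handle_chat_message helper is hypothetical) and not django_spire's actual view code:

from django.conf import settings
from django.core.handlers.wsgi import WSGIRequest
from django.utils.module_loading import import_string

from dandy.llm import MessageHistory


def handle_chat_message(
        request: WSGIRequest,
        user_input: str,
        message_history: MessageHistory | None = None,
):
    # Hypothetical helper for illustration: resolve the dotted path from
    # settings to the workflow class, then hand the message off to it.
    workflow_class = import_string(settings.AI_CHAT_WORKFLOW_CLASS)

    return workflow_class.process(
        request=request,
        user_input=user_input,
        message_history=message_history,
    )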
Once this is set up, you can add the chat card that django_spire provides to your templates.
Tip
Since this application routes every message through a single processing point, make full use of Dandy here. A central workflow lets you direct users from one entry point to different areas of information.