<key>NSMicrophoneUsageDescription</key>
<string>We need access to your microphone to enable voice conversations with the AI agent.</string>
Add the following to your Podfile, since we depend on permission_handler to manage permissions and audio_session to manage audio sessions.
post_install do |installer|
  installer.pods_project.targets.each do |target|
    target.build_configurations.each do |config|
      config.build_settings['GCC_PREPROCESSOR_DEFINITIONS'] ||= [
        '$(inherited)',

        # audio_session settings
        'AUDIO_SESSION_MICROPHONE=0',

        # For microphone access
        'PERMISSION_MICROPHONE=1',
      ]
    end
  end
end
Due to an issue where the ONNX Runtime is stripped by Xcode when archiving, you need to follow these steps in Xcode for the voice activity detector (VAD) to work on iOS builds:
Under “Targets”, choose “Runner” (or your project’s name)
final agent = Agent(
  // Replace with your agent ID from PlayAI
  agentId: 'your-agent-id-here',

  // Customize your agent's behavior
  prompt: 'You are a helpful assistant who speaks in a friendly, casual tone.',

  // Define actions the agent can take in your app
  actions: [
    AgentAction(
      name: 'show_weather',
      triggerInstructions: 'Trigger this when the user asks about weather.',
      argumentSchema: {
        'city': AgentActionParameter(
          type: 'string',
          description: 'The city to show weather for',
        ),
      },
      callback: (data) async {
        final city = data['city'] as String;
        // In a real app, you would fetch weather data here
        return 'Weather data fetched for $city!';
      },
    ),
  ],

  // Configure callbacks to respond to agent events
  callbackConfig: AgentCallbackConfig(
    // Get user speech transcript
    onUserTranscript: (text) {
      setState(() => _messages.add(ChatMessage(text, isUser: true)));
    },
    // Get agent speech transcript
    onAgentTranscript: (text) {
      setState(() => _messages.add(ChatMessage(text, isUser: false)));
    },
    // Handle any errors
    onError: (error, isFatal) {
      ScaffoldMessenger.of(context).showSnackBar(
        SnackBar(content: Text('Error: $error')),
      );
    },
  ),
);
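The transcript callbacks above assume a simple ChatMessage model and a _messages list living in your widget's State; the SDK does not provide these. A minimal sketch of what they might look like (the names are taken from the snippet above, not from the SDK):

// Hypothetical chat model matching the callbacks above; define whatever suits your UI.
class ChatMessage {
  ChatMessage(this.text, {required this.isUser});

  final String text; // the transcript text
  final bool isUser; // true for user speech, false for agent speech
}

// Inside your State class:
// final List<ChatMessage> _messages = [];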
One of the most exciting features of the PlayAI Agents SDK is the ability to define custom actions that allow the agent to interact with your app.
AgentAction(
  name: 'open_settings',
  triggerInstructions: 'Trigger this when the user asks to open settings',
  argumentSchema: {
    'section': AgentActionParameter(
      type: 'string',
      description: 'The settings section to open',
    ),
  },
  callback: (data) async {
    final section = data['section'] as String;
    // Navigate to settings section in your app
    return 'Opened $section settings';
  },
)
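To make the action available, pass it in the actions list when constructing the agent, as shown earlier. A brief sketch, assuming the AgentAction above has been assigned to a variable named openSettingsAction (a name introduced here for illustration):

// Register the custom action when constructing the agent.
final agent = Agent(
  agentId: 'your-agent-id-here',
  prompt: 'You are a helpful in-app assistant.',
  actions: [openSettingsAction],
);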
Send contextual information to the agent during a conversation to inform it of changes in your app.
// When user navigates to a new screen
void _onNavigate(String routeName) {
  agent.sendDeveloperMessage(
    'User navigated to $routeName screen. You can now discuss the content on this page.',
  );
}

// When relevant data changes
void _onCartUpdated(List<Product> products) {
  agent.sendDeveloperMessage(
    'User\'s cart has been updated, now containing: ${products.map((p) => p.name).join(", ")}.',
  );
}
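Rather than calling a helper like _onNavigate from every screen, you can forward route changes automatically with Flutter's NavigatorObserver. A minimal sketch, assuming agent is the Agent instance created earlier; only sendDeveloperMessage comes from the SDK here, the observer itself is plain Flutter:

import 'package:flutter/widgets.dart';
// ...plus the PlayAI Agents SDK import that provides Agent.

// Forwards named route changes to the agent as developer messages.
class AgentRouteObserver extends NavigatorObserver {
  AgentRouteObserver(this.agent);

  final Agent agent;

  @override
  void didPush(Route<dynamic> route, Route<dynamic>? previousRoute) {
    super.didPush(route, previousRoute);
    final name = route.settings.name;
    if (name != null) {
      agent.sendDeveloperMessage(
        'User navigated to $name screen. You can now discuss the content on this page.',
      );
    }
  }
}

// Register it on your app:
// MaterialApp(navigatorObservers: [AgentRouteObserver(agent)], ...)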