Flutter SDK
The Flutter SDK (currently published as jarvis_sdk, will be renamed to vocall_sdk) provides
full voice and chat integration for Flutter applications across Web, Android, and iOS.
Installation
Add the dependency to your pubspec.yaml:
dependencies:
jarvis_sdk: ^0.1.0
Note: The package will be renamed from jarvis_sdk to vocall_sdk in an upcoming release. The API surface will remain the same.
Creating a Client
final client = JarvisClient(
serverUrl: 'wss://your-server.example.com',
token: 'your-auth-token',
);
Building a Manifest
Use the ManifestBuilder DSL to declare your application screens and fields:
final manifest = ManifestBuilder()
.screen('dashboard', label: 'Dashboard')
.field('search', type: FieldType.text, label: 'Search')
.action('refresh', label: 'Refresh Data')
.done()
.screen('form', label: 'Entry Form')
.field('name', type: FieldType.text, label: 'Full Name')
.field('email', type: FieldType.text, label: 'Email')
.action('submit', label: 'Submit')
.done()
.build();
Field Registry
Register your UI controllers so the engine can read and write field values:
// Text fields with TextEditingController
final nameCtrl = TextEditingController();
client.fieldRegistry.registerField('form', 'name', nameCtrl);
// Actions
client.fieldRegistry.registerAction('form', 'submit', () {
submitForm();
});
// Navigation callback (navigatorKey is a GlobalKey<NavigatorState>
// passed to your MaterialApp)
client.fieldRegistry.onNavigate = (screenId) {
  navigatorKey.currentState?.pushNamed('/$screenId');
};
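In practice, registration belongs in a widget's lifecycle: register controllers when the screen is created and dispose them when it goes away. A sketch using only the registry calls shown above (`EntryFormScreen` and `_submit` are illustrative names, not part of the SDK):

```dart
import 'package:flutter/material.dart';
import 'package:jarvis_sdk/jarvis_sdk.dart';

class EntryFormScreen extends StatefulWidget {
  const EntryFormScreen({super.key, required this.client});
  final JarvisClient client;

  @override
  State<EntryFormScreen> createState() => _EntryFormScreenState();
}

class _EntryFormScreenState extends State<EntryFormScreen> {
  final _nameCtrl = TextEditingController();
  final _emailCtrl = TextEditingController();

  @override
  void initState() {
    super.initState();
    // Register controllers so the engine can read and fill fields by voice.
    widget.client.fieldRegistry
      ..registerField('form', 'name', _nameCtrl)
      ..registerField('form', 'email', _emailCtrl)
      ..registerAction('form', 'submit', _submit);
  }

  void _submit() {
    // Validate and send the form.
  }

  @override
  void dispose() {
    _nameCtrl.dispose();
    _emailCtrl.dispose();
    super.dispose();
  }

  @override
  Widget build(BuildContext context) {
    return Column(children: [
      TextField(controller: _nameCtrl),
      TextField(controller: _emailCtrl),
      ElevatedButton(onPressed: _submit, child: const Text('Submit')),
    ]);
  }
}
```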
Overlay Widget
Wrap your app with JarvisOverlay to get the floating chat and voice UI:
JarvisOverlay(
client: client,
child: MaterialApp(
home: DashboardScreen(),
),
)
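The overlay must sit above MaterialApp so the floating UI can draw over every route. If you drive navigation from the engine via onNavigate, pass a navigator key so routes can be pushed without a BuildContext. A sketch (the screen classes and route names are illustrative):

```dart
final navigatorKey = GlobalKey<NavigatorState>();

JarvisOverlay(
  client: client,
  child: MaterialApp(
    navigatorKey: navigatorKey, // lets onNavigate push routes from outside the tree
    home: const DashboardScreen(),
    routes: {
      '/dashboard': (_) => const DashboardScreen(),
      '/form': (_) => const EntryFormScreen(),
    },
  ),
)
```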
Voice Modes
The SDK supports two voice interaction patterns:
Always-listening mode -- the microphone stays open and continuously streams audio:
await client.startAlwaysListening();
// User speaks freely...
await client.stopAlwaysListening();
Click-to-talk mode -- the user holds or toggles recording:
await client.startRecording();
// User speaks...
await client.stopRecording();
Interrupt -- stop the assistant mid-speech:
client.interrupt();
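Click-to-talk maps naturally onto a press-and-hold gesture. A minimal sketch using only the startRecording/stopRecording calls above (`PushToTalkButton` is an illustrative name; error handling omitted):

```dart
import 'package:flutter/material.dart';
import 'package:jarvis_sdk/jarvis_sdk.dart';

/// Hold to record, release (or cancel the gesture) to stop.
class PushToTalkButton extends StatelessWidget {
  const PushToTalkButton({super.key, required this.client});
  final JarvisClient client;

  @override
  Widget build(BuildContext context) {
    return GestureDetector(
      onTapDown: (_) => client.startRecording(),
      onTapUp: (_) => client.stopRecording(),
      onTapCancel: () => client.stopRecording(),
      child: const CircleAvatar(radius: 28, child: Icon(Icons.mic)),
    );
  }
}
```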
Client States
IDLE ──connect()──> CONNECTING ──ws open──> CONNECTED
 ^                                              │
 │                                        manifest sent
 │                                              │
 │                                              v
 └──disconnect()── READY <──manifest_ack── WAITING
                     │
             voice/chat active
                     │
                     v
                PROCESSING ──response──> READY
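To reflect these states in your UI, subscribe to state changes. The sketch below assumes the client exposes a broadcast stream of states; the `stateStream` getter and `ClientState` enum names are assumptions, so check the SDK's API for the actual members:

```dart
import 'package:flutter/material.dart';
import 'package:jarvis_sdk/jarvis_sdk.dart';

/// A status dot that recolors as the client moves through its states.
/// Assumed API: client.stateStream emits ClientState values.
Widget buildStatusDot(JarvisClient client) {
  return StreamBuilder<ClientState>(
    stream: client.stateStream,
    builder: (context, snapshot) {
      final color = switch (snapshot.data) {
        ClientState.ready => Colors.green,
        ClientState.processing => Colors.amber,
        ClientState.connecting ||
        ClientState.connected ||
        ClientState.waiting => Colors.blue,
        _ => Colors.grey, // IDLE, or no state received yet
      };
      return Icon(Icons.circle, size: 10, color: color);
    },
  );
}
```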
Platform Support
| Platform | Chat | Voice | Field Binding | Overlay |
|----------|------|-------|---------------|---------|
| Web      | Yes  | Yes   | Yes           | Yes     |
| Android  | Yes  | No    | Yes           | Yes     |
| iOS      | Yes  | No    | Yes           | Yes     |
Voice requires the Web Audio API, which is only available in web browsers. Android and iOS are limited to chat-based interactions for now.
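Because voice is web-only for now, gate the voice controls on platform at build time. kIsWeb from package:flutter/foundation.dart is the standard check; the helper below is an illustrative sketch:

```dart
import 'package:flutter/foundation.dart' show kIsWeb;
import 'package:flutter/material.dart';
import 'package:jarvis_sdk/jarvis_sdk.dart';

/// Chat works on every platform, so its entry point is unconditional;
/// the mic button appears only where voice is supported (web).
Widget buildAssistantControls(JarvisClient client) {
  return Row(mainAxisSize: MainAxisSize.min, children: [
    IconButton(
      icon: const Icon(Icons.chat),
      onPressed: () {
        // Open the chat panel.
      },
    ),
    if (kIsWeb)
      IconButton(
        icon: const Icon(Icons.mic),
        onPressed: client.startRecording,
      ),
  ]);
}
```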
Architecture
┌─────────────────────────────────────┐
│             Flutter App             │
│ ┌───────────────┐  ┌──────────────┐ │
│ │ JarvisOverlay │  │ Your Screens │ │
│ └───────┬───────┘  └──────┬───────┘ │
│         │                 │         │
│ ┌───────┴─────────────────┴───────┐ │
│ │          JarvisClient           │ │
│ │┌─────────────┐ ┌───────────────┐│ │
│ ││FieldRegistry│ │ManifestBuilder││ │
│ │└─────────────┘ └───────────────┘│ │
│ └────────────────┬────────────────┘ │
│                  │ WebSocket        │
└──────────────────┼──────────────────┘
                   │
         ┌─────────┴─────────┐
         │   Vocall Engine   │
         │  /connect (WSS)   │
         └───────────────────┘
Quick Start
import 'package:flutter/material.dart';
import 'package:jarvis_sdk/jarvis_sdk.dart';
void main() {
final client = JarvisClient(
serverUrl: 'wss://engine.example.com',
token: 'tok_abc123',
);
final manifest = ManifestBuilder()
.screen('home', label: 'Home')
.field('query', type: FieldType.text, label: 'Search')
.done()
.build();
client.connect(manifest: manifest);
runApp(
JarvisOverlay(
client: client,
child: const MyApp(),
),
);
}