Ever wondered how to integrate AI capabilities into your mobile app? Today, we’ll build a fully functional AI chat app from scratch using React Native and OpenAI’s API.
What We’re Building
Our chat app will feature:
- Real-time AI conversations using GPT-4
- Message persistence with AsyncStorage
- Typing indicators and smooth animations
- Custom message bubbles with markdown support
- Voice input integration (bonus feature)
Prerequisites
Before we dive in, make sure you have:
# Node.js and npm (the React Native CLI runs on demand via npx, so no global install is needed)
# OpenAI API key (get yours at platform.openai.com)
# iOS/Android development environment set up (Xcode and/or Android Studio)
Project Setup
Let’s start by creating our React Native project:
npx react-native init AIChat
cd AIChat
# Install dependencies
npm install @react-native-async-storage/async-storage
npm install react-native-vector-icons
npm install openai
npm install react-native-markdown-display
npm install react-native-reanimated
# iOS only: link the native modules
cd ios && pod install && cd ..
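One extra setup step: react-native-reanimated needs its Babel plugin registered as the last entry in babel.config.js (the preset name below is the React Native CLI default and may differ with your React Native version):

// babel.config.js
module.exports = {
  presets: ['module:metro-react-native-babel-preset'],
  plugins: ['react-native-reanimated/plugin'], // must be listed last
};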
Building the Core Chat Interface
1. Message Component
First, let’s create our message bubble component:
// components/MessageBubble.js
import React from 'react';
import { View, Text, StyleSheet } from 'react-native';
import Markdown from 'react-native-markdown-display';
const MessageBubble = ({ message, isUser }) => {
return (
<View style={[
styles.messageBubble,
isUser ? styles.userMessage : styles.aiMessage
]}>
{isUser ? (
<Text style={styles.messageText}>{message.content}</Text>
) : (
<Markdown style={markdownStyles}>{message.content}</Markdown>
)}
<Text style={styles.timestamp}>
{new Date(message.timestamp).toLocaleTimeString([], {
hour: '2-digit',
minute: '2-digit'
})}
</Text>
</View>
);
};
const styles = StyleSheet.create({
messageBubble: {
maxWidth: '80%',
padding: 12,
marginVertical: 4,
borderRadius: 18,
},
userMessage: {
backgroundColor: '#4CC5DC',
alignSelf: 'flex-end',
marginRight: 16,
},
aiMessage: {
backgroundColor: '#2A2D3A',
alignSelf: 'flex-start',
marginLeft: 16,
},
messageText: {
color: '#FFFFFF',
fontSize: 16,
},
timestamp: {
color: 'rgba(255, 255, 255, 0.7)',
fontSize: 12,
marginTop: 4,
alignSelf: 'flex-end',
},
});
const markdownStyles = {
body: { color: '#FFFFFF' },
code_inline: { backgroundColor: '#1A1D26', color: '#4CC5DC' },
code_block: { backgroundColor: '#1A1D26', color: '#FFFFFF' },
};
export default MessageBubble;
2. OpenAI Service Integration
Create a service to handle OpenAI API calls:
// services/openaiService.js
import OpenAI from 'openai';
const openai = new OpenAI({
apiKey: 'your-openai-api-key-here', // Never ship a real key in the app bundle; see Deployment Considerations below
});
export const sendMessageToAI = async (messages) => {
try {
const response = await openai.chat.completions.create({
model: 'gpt-4',
messages: [
{
role: 'system',
content: 'You are a helpful AI assistant. Keep responses concise and helpful.'
},
...messages.map(msg => ({
role: msg.isUser ? 'user' : 'assistant',
content: msg.content
}))
],
max_tokens: 500,
temperature: 0.7,
});
return response.choices[0].message.content;
} catch (error) {
console.error('OpenAI API Error:', error);
throw error; // Rethrow the original error so callers can inspect its status (see Error Handling below)
}
};
3. Main Chat Screen
Now let’s build the main chat interface:
// screens/ChatScreen.js
import React, { useState, useEffect, useRef } from 'react';
import {
View,
Text,
TextInput,
TouchableOpacity,
FlatList,
StyleSheet,
KeyboardAvoidingView,
Platform,
Alert,
} from 'react-native';
import AsyncStorage from '@react-native-async-storage/async-storage';
import MessageBubble from '../components/MessageBubble';
import { sendMessageToAI } from '../services/openaiService';
const ChatScreen = () => {
const [messages, setMessages] = useState([]);
const [inputText, setInputText] = useState('');
const [isLoading, setIsLoading] = useState(false);
const flatListRef = useRef(null);
useEffect(() => {
loadMessages();
}, []);
const loadMessages = async () => {
try {
const savedMessages = await AsyncStorage.getItem('chatMessages');
if (savedMessages) {
setMessages(JSON.parse(savedMessages));
}
} catch (error) {
console.error('Failed to load messages:', error);
}
};
const saveMessages = async (newMessages) => {
try {
await AsyncStorage.setItem('chatMessages', JSON.stringify(newMessages));
} catch (error) {
console.error('Failed to save messages:', error);
}
};
const sendMessage = async () => {
if (!inputText.trim()) return;
const userMessage = {
id: Date.now(),
content: inputText.trim(),
isUser: true,
timestamp: new Date().toISOString(),
};
const updatedMessages = [...messages, userMessage];
setMessages(updatedMessages);
setInputText('');
setIsLoading(true);
try {
const aiResponse = await sendMessageToAI(updatedMessages);
const aiMessage = {
id: Date.now() + 1,
content: aiResponse,
isUser: false,
timestamp: new Date().toISOString(),
};
const finalMessages = [...updatedMessages, aiMessage];
setMessages(finalMessages);
await saveMessages(finalMessages);
} catch (error) {
Alert.alert('Error', 'Failed to get AI response. Please try again.');
} finally {
setIsLoading(false);
}
};
const renderMessage = ({ item }) => (
<MessageBubble message={item} isUser={item.isUser} />
);
return (
<KeyboardAvoidingView
style={styles.container}
behavior={Platform.OS === 'ios' ? 'padding' : 'height'}
>
<View style={styles.header}>
<Text style={styles.headerTitle}>AI Chat Assistant</Text>
</View>
<FlatList
ref={flatListRef}
data={messages}
renderItem={renderMessage}
keyExtractor={(item) => item.id.toString()}
style={styles.messagesList}
onContentSizeChange={() => flatListRef.current?.scrollToEnd()}
/>
{isLoading && (
<View style={styles.typingIndicator}>
<Text style={styles.typingText}>AI is typing...</Text>
</View>
)}
<View style={styles.inputContainer}>
<TextInput
style={styles.textInput}
value={inputText}
onChangeText={setInputText}
placeholder="Type your message..."
placeholderTextColor="#8B8B8B"
multiline
maxLength={500}
/>
<TouchableOpacity
style={[styles.sendButton, !inputText.trim() && styles.sendButtonDisabled]}
onPress={sendMessage}
disabled={!inputText.trim() || isLoading}
>
<Text style={styles.sendButtonText}>Send</Text>
</TouchableOpacity>
</View>
</KeyboardAvoidingView>
);
};
const styles = StyleSheet.create({
container: {
flex: 1,
backgroundColor: '#161B22',
},
header: {
padding: 16,
backgroundColor: '#21262D',
borderBottomWidth: 1,
borderBottomColor: '#30363D',
},
headerTitle: {
color: '#FFFFFF',
fontSize: 18,
fontWeight: 'bold',
textAlign: 'center',
},
messagesList: {
flex: 1,
paddingVertical: 8,
},
typingIndicator: {
padding: 16,
alignItems: 'center',
},
typingText: {
color: '#8B8B8B',
fontStyle: 'italic',
},
inputContainer: {
flexDirection: 'row',
padding: 16,
backgroundColor: '#21262D',
alignItems: 'flex-end',
},
textInput: {
flex: 1,
borderWidth: 1,
borderColor: '#30363D',
borderRadius: 20,
paddingHorizontal: 16,
paddingVertical: 12,
marginRight: 8,
backgroundColor: '#0D1117',
color: '#FFFFFF',
maxHeight: 100,
},
sendButton: {
backgroundColor: '#4CC5DC',
paddingHorizontal: 20,
paddingVertical: 12,
borderRadius: 20,
},
sendButtonDisabled: {
backgroundColor: '#30363D',
},
sendButtonText: {
color: '#FFFFFF',
fontWeight: 'bold',
},
});
export default ChatScreen;
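The feature list promises smooth animations, and react-native-reanimated is installed but not used yet. Here is a minimal sketch of an animated typing indicator you could drop in place of the static "AI is typing..." text; the component name and styling are illustrative, not part of the code above.

// components/TypingIndicator.js (sketch)
import React, { useEffect } from 'react';
import { StyleSheet } from 'react-native';
import Animated, {
  useSharedValue,
  useAnimatedStyle,
  withRepeat,
  withTiming,
} from 'react-native-reanimated';

const TypingIndicator = () => {
  const opacity = useSharedValue(0.3);

  useEffect(() => {
    // Loop the opacity between 0.3 and 1 for a gentle pulse
    opacity.value = withRepeat(withTiming(1, { duration: 600 }), -1, true);
  }, [opacity]);

  const animatedStyle = useAnimatedStyle(() => ({ opacity: opacity.value }));

  return (
    <Animated.Text style={[styles.text, animatedStyle]}>
      AI is typing...
    </Animated.Text>
  );
};

const styles = StyleSheet.create({
  text: {
    color: '#8B8B8B',
    fontStyle: 'italic',
    padding: 16,
    textAlign: 'center',
  },
});

export default TypingIndicator;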
Advanced Features
Message Persistence Strategy
Our app uses AsyncStorage to persist chat history:
// utils/messageStorage.js
export const MESSAGE_STORAGE_KEY = 'ai_chat_messages';
export const MAX_STORED_MESSAGES = 100;
export const cleanupOldMessages = (messages) => {
if (messages.length > MAX_STORED_MESSAGES) {
return messages.slice(-MAX_STORED_MESSAGES);
}
return messages;
};
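Two small wiring notes: the ChatScreen above hard-codes the 'chatMessages' string, so it is worth swapping in MESSAGE_STORAGE_KEY everywhere (loadMessages included), and cleanupOldMessages should run before each write. A sketch of the updated saveMessages:

// screens/ChatScreen.js (sketch)
import AsyncStorage from '@react-native-async-storage/async-storage';
import { MESSAGE_STORAGE_KEY, cleanupOldMessages } from '../utils/messageStorage';

const saveMessages = async (newMessages) => {
  try {
    // Trim the history to MAX_STORED_MESSAGES before persisting it
    const trimmed = cleanupOldMessages(newMessages);
    await AsyncStorage.setItem(MESSAGE_STORAGE_KEY, JSON.stringify(trimmed));
  } catch (error) {
    console.error('Failed to save messages:', error);
  }
};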
Adding Voice Input (Bonus)
Integrate speech-to-text for voice messages:
npm install @react-native-voice/voice
// Add to ChatScreen.js
import Voice from '@react-native-voice/voice';

// Put the recognized speech into the input field and clean up on unmount
useEffect(() => {
  Voice.onSpeechResults = (event) => setInputText(event.value?.[0] ?? '');
  return () => {
    Voice.destroy().then(Voice.removeAllListeners);
  };
}, []);

const startVoiceRecording = async () => {
  try {
    await Voice.start('en-US');
  } catch (error) {
    console.error('Voice recording error:', error);
  }
};
Performance Optimizations
1. Message Virtualization
For large chat histories, provide getItemLayout to the FlatList so it can compute scroll offsets without measuring every row; this only pays off when row heights are reasonably predictable. A sketch follows below.
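A minimal sketch, assuming a roughly fixed row height (ROW_HEIGHT is a made-up value; measure your real bubbles, or fall back to tuning windowSize and initialNumToRender when heights vary a lot):

// screens/ChatScreen.js (sketch)
const ROW_HEIGHT = 72; // hypothetical average bubble height

<FlatList
  ref={flatListRef}
  data={messages}
  renderItem={renderMessage}
  keyExtractor={(item) => item.id.toString()}
  // Lets FlatList jump to any offset without rendering intermediate rows
  getItemLayout={(data, index) => ({
    length: ROW_HEIGHT,
    offset: ROW_HEIGHT * index,
    index,
  })}
  initialNumToRender={15}
  windowSize={7}
/>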
2. Message Size Optimization
const optimizeMessage = (message) => {
// Truncate very long messages before storing them to keep persisted history small
return {
...message,
content: message.content.length > 1000
? message.content.substring(0, 1000) + '...'
: message.content
};
};
3. API Rate Limiting
const rateLimiter = {
lastRequest: 0,
minInterval: 1000, // 1 second between requests
canMakeRequest() {
const now = Date.now();
if (now - this.lastRequest < this.minInterval) {
return false;
}
this.lastRequest = now;
return true;
}
};
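To actually enforce the limit, guard sendMessage with the limiter before calling the API (sketch):

// In sendMessage, before hitting the API (sketch)
if (!rateLimiter.canMakeRequest()) {
  Alert.alert('Slow down', 'Please wait a moment between messages.');
  return;
}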
Testing Your App
Unit Tests
// __tests__/MessageBubble.test.js
import React from 'react';
import { render } from '@testing-library/react-native';
import MessageBubble from '../components/MessageBubble';
test('renders user message correctly', () => {
const message = {
content: 'Hello, world!',
timestamp: new Date().toISOString(),
};
const { getByText } = render(
<MessageBubble message={message} isUser={true} />
);
expect(getByText('Hello, world!')).toBeTruthy();
});
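Pure helpers are even easier to cover. A sketch for cleanupOldMessages (the test file name is illustrative):

// __tests__/messageStorage.test.js (sketch)
import { cleanupOldMessages, MAX_STORED_MESSAGES } from '../utils/messageStorage';

test('keeps only the most recent messages', () => {
  const messages = Array.from({ length: 150 }, (_, i) => ({ id: i }));
  const cleaned = cleanupOldMessages(messages);
  expect(cleaned).toHaveLength(MAX_STORED_MESSAGES);
  expect(cleaned[0].id).toBe(150 - MAX_STORED_MESSAGES);
});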
Deployment Considerations
Environment Variables
// config/environment.js
export const OPENAI_API_KEY = __DEV__
? 'your-dev-key-here'
: 'your-production-key-here';
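Either way, a key switched on __DEV__ still ships inside the app bundle. A step up is loading it from a .env file kept out of version control, for example with react-native-config (a sketch; the variable name is an assumption). For real secrecy, keep the key on your own backend and proxy the OpenAI calls through it.

// config/environment.js (sketch using react-native-config)
import Config from 'react-native-config';

// OPENAI_API_KEY comes from a .env file excluded from git
export const OPENAI_API_KEY = Config.OPENAI_API_KEY;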
Error Handling
const handleAPIError = (error) => {
// The openai v4 SDK exposes the HTTP status directly on the error object
const status = error.status ?? error.response?.status;
if (status === 429) {
return 'Rate limit exceeded. Please try again in a moment.';
} else if (status === 401) {
return 'Authentication failed. Please check your API key.';
}
return 'Something went wrong. Please try again.';
};
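With sendMessageToAI rethrowing the original error (see the service above), the chat screen can surface the friendly message. A sketch:

// screens/ChatScreen.js (sketch)
const requestAIReply = async (history) => {
  try {
    return await sendMessageToAI(history);
  } catch (error) {
    Alert.alert('Error', handleAPIError(error));
    return null;
  }
};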
What’s Next?
Now that you have a working AI chat app, consider adding:
- Custom AI Personalities: Different AI characters with unique personalities
- Image Analysis: Send photos to OpenAI’s vision API
- Voice Responses: Text-to-speech for AI messages
- Chat History Search: Find specific conversations quickly
- Dark/Light Theme Toggle: User preference customization
Key Takeaways
Building AI-powered mobile apps is more accessible than ever:
- Start Simple: Focus on core functionality first
- Handle Errors Gracefully: API calls can fail, so plan for it
- Optimize Performance: Large chat histories need careful handling
- Secure API Keys: Never hardcode production keys
- Test Thoroughly: AI responses can be unpredictable
Ready to enhance your app further? Check out the OpenAI API documentation for advanced features. Share your AI chat creations with us on Twitter @CuratorsHub!