AI in Mobile Apps: When and How Should You Integrate SwiftUI with OpenAI API?

Integration with large language models is currently one of the most exciting uses of AI in mobile apps.
Intelligent features are appearing in mobile apps more and more often: they answer users’ questions and assist with daily tasks.
Discover when it is beneficial to integrate the OpenAI API with a mobile app using SwiftUI, and learn how to easily create a chatbot.
Why should you connect your mobile app with AI?
AI adoption in companies is becoming more frequent. According to a PwC report, as many as 79% of surveyed senior executives state that AI solutions are already being used in their companies. Two-thirds of them report an improvement in efficiency as a result.
Artificial intelligence is also becoming more accessible to mobile app users. They increasingly expect apps to “understand” them, which is exactly what machine learning algorithms and language models such as GPT-4.1 or o4-mini make possible. AI also helps users solve their everyday problems more effectively.
Thanks to AI, apps can better predict user needs, automate routine tasks, assist in decision-making, and provide personalized content.
They offer access to a wide range of solutions—from automatic text recognition and product recommendations to natural conversations with the user. As a result, the app can engage the user more deeply and provide greater satisfaction with the product.
Benefits of integrating apps with the GPT model
The widely known ChatGPT is a chatbot based on an advanced language model, built on the GPT (Generative Pre-trained Transformer) architecture. It was developed by OpenAI. It works by predicting the next words in a sentence, creating coherent and (usually) logical responses.
Among other things, it can hold conversations, answer questions, translate texts, summarize documents, and generate ideas. This is possible because it was trained on an enormous amount of content. Thanks to this, it can be useful in educational, business, and entertainment apps alike.
What are the benefits of integrating AI with a mobile app?
- Time savings: The project can be completed faster by using AI to automate processes such as registration, bug reporting, and obtaining support.
- Lower customer service and technical support costs.
- Increased engagement and UX personalization: Customers are more likely to return to an app when they feel it addresses their needs. AI can help predict these needs and develop solutions that users will like.
- Multilingual support: Thanks to AI, you can quickly provide information or publish content in different languages.
- Better communication: With the help of AI models, you can create a chat interface, streamline app navigation, and improve the process of providing information. A 24/7 assistant can answer user questions without requiring your team’s involvement.
Examples of AI chat usage in apps
- E-commerce: A chatbot shopping assistant suggests products, answers questions about delivery, and processes complaints.
- Education: An interactive tutor answers course questions and helps with studying.
- Health: Reminders, symptom monitoring, and answers to common patient questions.
- Financial services: An AI advisor explains how banking and investment products work in simple terms.
What model should you choose?
When choosing a language model, the newest one is not always the best choice. To find the right one for your project, you should consider your priorities and intended use. Think about which aspects are key for you: cost, performance, response time, or quality.
Example no. 1:
GPT-3.5-turbo model
- Pros: very low cost; quality good enough for simple chatbots, FAQs, and customer support
- Cons: poorer understanding of context, lower quality in longer conversations
Example no. 2:
GPT-4o
The GPT-4o model is an option that provides high-quality conversations (e.g., for AI therapists or advanced assistants).
- Pros: better nuance recognition, more natural and faster responses
- Cons: higher cost
You can also use a mixed approach, where the model is switched dynamically. For example, use gpt-3.5-turbo by default and switch to gpt-4o for more difficult queries or for users of a premium version, as in the sketch below.
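A minimal sketch of such dynamic switching (the character threshold and the isPremiumUser flag are illustrative assumptions, not part of the OpenAI API):
// A minimal sketch of dynamic model selection.
// The 500-character threshold and the isPremiumUser flag are example heuristics.
func modelName(for prompt: String, isPremiumUser: Bool) -> String {
    if isPremiumUser || prompt.count > 500 {
        return "gpt-4o"          // stronger model for premium users or complex queries
    }
    return "gpt-3.5-turbo"       // cheaper default
}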
Keep in mind: AI solutions are not infallible. When you ask a question, the model analyzes sequences of words and tries to generate the most probable continuation, but it doesn’t understand their true meaning; it only operates on linguistic patterns.
That’s why, when it lacks the right information, it sometimes “guesses” the answer and presents fabricated facts. In AI jargon, this phenomenon is called hallucination. You should do what you can to protect users from incorrect answers, for example by instructing the model to admit uncertainty, as in the sketch below.
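One simple mitigation is a system message that tells the model to admit uncertainty instead of guessing; a sketch (the wording is only an example and does not eliminate hallucinations, userQuestion is assumed to come from the UI):
// Example system message asking the model to admit uncertainty.
let messages: [[String: String]] = [
    ["role": "system",
     "content": "Answer only based on verified information. If you are not sure, say that you don't know."],
    ["role": "user", "content": userQuestion]
]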
SwiftUI as a modern tool for creating UIs
SwiftUI is Apple’s framework for building user interfaces. Thanks to its declarative approach, it produces clean, efficient code that is easy to test and maintain. SwiftUI also integrates very well with asynchronous network operations, which is crucial for communicating with AI APIs.
SwiftUI also offers a rich ecosystem of development tools, integration with Combine and Swift Concurrency, and support for modern design patterns (e.g., MVVM). All of this makes SwiftUI an excellent choice for apps that use artificial intelligence.
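As a rough illustration of that pairing, a view can trigger an asynchronous call with Swift Concurrency when it appears; a minimal sketch (askGPT here is a placeholder for any async networking function, not an existing API):
import SwiftUI

// Placeholder async function; in a real app this would call the OpenAI API.
func askGPT(_ prompt: String) async throws -> String {
    return "Sample response to: \(prompt)"
}

struct AskView: View {
    @State private var answer = ""

    var body: some View {
        Text(answer)
            .task {
                // Runs asynchronously when the view appears.
                answer = (try? await askGPT("Hello")) ?? "No response"
            }
    }
}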
Check out how to integrate SwiftUI with GPT.
Configuring a SwiftUI project with GPT integration
Environment setup
Before you start writing code, you need to prepare the app to work with an external API:
a) Imports
Make sure you have the right imports in your file:
import SwiftUI
import Foundation
b) Network permissions
For the app to connect to the Internet and the OpenAI server, you need to add an entry to the Info.plist file. The simplest way (for testing purposes) is this:
<key>NSAppTransportSecurity</key>
<dict>
    <key>NSAllowsArbitraryLoads</key>
    <true/>
</dict>
In production, it is recommended to limit the exceptions to the api.openai.com domain only, for example:
<key>NSAppTransportSecurity</key>
<dict>
    <key>NSExceptionDomains</key>
    <dict>
        <key>api.openai.com</key>
        <dict>
            <key>NSExceptionAllowsInsecureHTTPLoads</key>
            <false/>
            <key>NSIncludesSubdomains</key>
            <true/>
            <key>NSRequiresCertificateTransparency</key>
            <false/>
        </dict>
    </dict>
</dict>
Building a chat interface in SwiftUI
This stage involves creating the visual part of the app—a chat view where the user can type messages and the app responds to them.
The code below creates a basic text chat:
struct ChatMessage: Identifiable {
    let id = UUID()
    let content: String
    let isUser: Bool
}

struct ChatView: View {
    @State private var messages: [ChatMessage] = []
    @State private var inputText: String = ""

    var body: some View {
        VStack {
            // Conversation history
            ScrollView {
                ForEach(messages) { message in
                    HStack {
                        if message.isUser {
                            Spacer()
                            Text(message.content)
                                .padding()
                                .background(Color.blue)
                                .cornerRadius(8)
                                .foregroundColor(.white)
                        } else {
                            Text(message.content)
                                .padding()
                                .background(Color.gray.opacity(0.2))
                                .cornerRadius(8)
                            Spacer()
                        }
                    }
                    .padding(.horizontal)
                }
            }
            // Input field and send button
            HStack {
                TextField("Type a message...", text: $inputText)
                Button("Send") {
                    sendMessage()
                }
            }
            .padding()
        }
    }

    func sendMessage() {
        // Append the user's message and clear the input field.
        let userMessage = ChatMessage(content: inputText, isUser: true)
        messages.append(userMessage)
        inputText = ""

        // Ask the model and append its answer on the main thread.
        fetchResponse(prompt: userMessage.content) { response in
            DispatchQueue.main.async {
                let botMessage = ChatMessage(content: response, isUser: false)
                messages.append(botMessage)
            }
        }
    }
}
Connecting to the OpenAI API
This stage is the heart of the app’s logic.
You create a fetchResponse function that sends a request to OpenAI and then receives a response. At this point, the app connects to the artificial intelligence server and forwards the response to the user.
Here is a sample function that sends a request:
func fetchResponse(prompt: String, completion: @escaping (String) -> Void) {
    let url = URL(string: "https://api.openai.com/v1/chat/completions")!
    var request = URLRequest(url: url)
    request.httpMethod = "POST"
    request.addValue("Bearer YOUR_API_KEY", forHTTPHeaderField: "Authorization")
    request.addValue("application/json", forHTTPHeaderField: "Content-Type")

    let body: [String: Any] = [
        "model": "gpt-3.5-turbo",
        "messages": [["role": "user", "content": prompt]],
    ]
    request.httpBody = try? JSONSerialization.data(withJSONObject: body)

    URLSession.shared.dataTask(with: request) { data, response, error in
        // Decode the JSON response and extract the generated text.
        guard let data = data,
              let json = try? JSONSerialization.jsonObject(with: data) as? [String: Any],
              let choices = json["choices"] as? [[String: Any]],
              let message = choices.first?["message"] as? [String: Any],
              let content = message["content"] as? String
        else {
            completion("No response from the model.")
            return
        }
        completion(content.trimmingCharacters(in: .whitespacesAndNewlines))
    }.resume()
}
The fetchResponse(prompt:completion:) function sends a text query to the GPT language model via the OpenAI API and returns the generated response as text.
The request is authorized with your API key in the following snippet:
request.addValue("Bearer YOUR_API_KEY", forHTTPHeaderField: "Authorization")
This enables the integration of artificial intelligence functionality into SwiftUI apps, for example to create a smart user assistant.
Keep in mind: For security reasons, you shouldn’t keep the API key directly in the app’s code. You can keep it in a separate .plist file, store it in the Keychain, or use a proxy API server to hide the key from the client app.
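For example, here is a minimal sketch of loading the key from a bundled plist (the Secrets.plist file name and OpenAIAPIKey key are illustrative; keep such a file out of version control):
// Reads the API key from a separate Secrets.plist file bundled with the app.
func loadAPIKey() -> String? {
    guard let url = Bundle.main.url(forResource: "Secrets", withExtension: "plist"),
          let data = try? Data(contentsOf: url),
          let plist = try? PropertyListSerialization.propertyList(from: data, format: nil),
          let dict = plist as? [String: Any]
    else { return nil }
    return dict["OpenAIAPIKey"] as? String
}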
Signature:
func fetchResponse(prompt: String, completion: @escaping (String) -> Void)
Parameters:
- prompt (String): the content of the question or message that the user wants to send in the chat.
- completion (@escaping (String) -> Void): a closure that runs asynchronously after a response is received from the API; it returns the model’s text response or an error message.
Creating a URL and an HTTP request
let url = URL(string: "https://api.openai.com/v1/chat/completions")!
var request = URLRequest(url: url)
request.httpMethod = "POST"
Adding headers
request.addValue("Bearer TWÓJ_KLUCZ_API", forHTTPHeaderField: "Authorization")
request.addValue("application/json", forHTTPHeaderField: "Content-Type")
Authorization
: API key received from OpenAI
Content-Type
: data type (JSON)
Preparing a request
let body: [String: Any] = [
    "model": "gpt-3.5-turbo",
    "messages": [["role": "user", "content": prompt]],
]
request.httpBody = try? JSONSerialization.data(withJSONObject: body)
- model: specifies the model version (here: gpt-3.5-turbo)
- messages: an array with the user’s message; the model works contextually
Sending a request and response handling
URLSession.shared.dataTask(with: request) { data, response, error in
    // Decode the JSON response
}.resume()
The function:
- Creates an asynchronous request
- Reads the response in JSON format
- Extracts the generated content from the first element of the choices array
Success and error handling
guard let content = ... else {
    completion("No response from the model.")
    return
}
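If you prefer type-safe parsing, the same response can also be decoded with Codable instead of JSONSerialization; a sketch (the struct names are illustrative, the fields mirror the chat completions response):
// Decodable mirror of the relevant part of the chat completions response.
struct ChatCompletionResponse: Decodable {
    struct Choice: Decodable {
        struct Message: Decodable {
            let content: String
        }
        let message: Message
    }
    let choices: [Choice]
}

// Inside the dataTask closure:
// if let decoded = try? JSONDecoder().decode(ChatCompletionResponse.self, from: data) {
//     completion(decoded.choices.first?.message.content ?? "No response from the model.")
// }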
Continuous app development
The possibilities of AI integration don’t end with a simple chat. For example, you can add conversation context management or use alternative models and APIs. All of this lets you expand the app’s functionality and tailor it to the needs of users and the business.
Conversation context management
Initially, the chatbot only responds to one question at a time and doesn’t remember previous messages. It’s a bit like talking to someone who forgets what was just said.
To have the conversation remembered, you can add new messages to the messages array and send the entire conversation context with each query. This way, the app remembers previous questions and responses, and then sends the entire chat history to the AI model every time the user types something.
As a result, the AI better understands what the user is asking about because it already knows the previous topics and doesn’t have to start from scratch each time. This makes the conversation more natural.
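A minimal sketch of this idea, reusing the ChatMessage type and messages array from ChatView above:
// Converts the local chat history into the messages array expected by the API.
func buildAPIMessages(from messages: [ChatMessage]) -> [[String: String]] {
    messages.map { message in
        ["role": message.isUser ? "user" : "assistant",
         "content": message.content]
    }
}

// In the request body, send the whole history instead of a single prompt:
// "messages": buildAPIMessages(from: messages)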
Other models and APIs
In the future, you can add solutions such as:
- Speech recognition (Speech API)
- Image generation (DALL·E; see the sketch below this list)
- Integration with a database like Firestore or CoreData.
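For instance, image generation uses a different endpoint but follows the same request pattern as the chat call above; a rough sketch (fields per the OpenAI Images API, simplified and without error handling):
// A rough sketch of a DALL·E image-generation request.
func fetchImageURL(prompt: String, completion: @escaping (String?) -> Void) {
    let url = URL(string: "https://api.openai.com/v1/images/generations")!
    var request = URLRequest(url: url)
    request.httpMethod = "POST"
    request.addValue("Bearer YOUR_API_KEY", forHTTPHeaderField: "Authorization")
    request.addValue("application/json", forHTTPHeaderField: "Content-Type")
    let body: [String: Any] = ["prompt": prompt, "n": 1, "size": "1024x1024"]
    request.httpBody = try? JSONSerialization.data(withJSONObject: body)

    URLSession.shared.dataTask(with: request) { data, _, _ in
        // The response contains a "data" array with URLs of the generated images.
        guard let data = data,
              let json = try? JSONSerialization.jsonObject(with: data) as? [String: Any],
              let images = json["data"] as? [[String: Any]],
              let urlString = images.first?["url"] as? String
        else { completion(nil); return }
        completion(urlString)
    }.resume()
}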
Summary
Integrating artificial intelligence, including GPT models, into mobile apps built with SwiftUI opens up new possibilities for personalization and automation. It allows for streamlining processes, reducing costs, and increasing user engagement.
At the same time, however, it’s important to remember the challenges related to AI accuracy.
Would you like to learn how to implement AI solutions in your app? Contact us to discuss options tailored to your needs.