iOS 18.2: Everything You Can Do With ChatGPT Integration
In iOS 18.2, Apple introduced the integration of ChatGPT with Apple Intelligence to extend the iPhone’s AI capabilities in a variety of ways. When enabled, Siri can leverage ChatGPT to conduct complex queries about photos and files, and the integration can also be extended to writing tools for generating text and images, while Visual Intelligence can help identify objects and places using the iPhone’s camera.
You don’t need a ChatGPT account to get started, but connecting a free or paid ChatGPT account can unlock additional features and more frequent access to premium capabilities. This guide explains how to set up and take advantage of these new features. Note that you need an iPhone 15 Pro, iPhone 15 Pro Max, or any iPhone 16 model to use Apple Intelligence in iOS 18.2.
Set up ChatGPT in iOS 18.2
If you have Apple Intelligence enabled, setting up the optional ChatGPT integration takes just a few steps, and you don’t even need a ChatGPT account to get started. You can always start with a basic setup and upgrade to a connected account if needed (although you may never need to – more on that below).
- Open Settings.
- Tap Apple Intelligence & Siri.
- Under “Extensions,” tap ChatGPT.
- Turn on the switch next to Use ChatGPT.
If you have a ChatGPT account (free or paid), you can choose Sign In and enter your account credentials from the same screen. One advantage of doing this is that it preserves your chat history, so you can return to it later in the ChatGPT app or on the website.
Keep in mind that you don’t need an account to use ChatGPT with Siri. It’s completely free, but you may eventually hit OpenAI’s daily limit on premium features, which use the latest GPT-4o model. Once those limits are reached, the system switches to a basic mode until 24 hours have passed. While it isn’t confirmed, the basic mode may use OpenAI’s GPT-4o mini model, which can handle the most common requests faster, though the responses you get may be less detailed. In our testing, however, there wasn’t much difference between the two in everyday iPhone use.
When the ChatGPT extension is enabled, Siri automatically determines when to use ChatGPT to better respond to your queries. However, you can control whether Siri asks you before sending anything to ChatGPT by toggling the switch next to Confirm ChatGPT Requests in the ChatGPT extension settings menu. Note that Siri will always ask for permission before sending files to ChatGPT.
Apple, ChatGPT, and your privacy
Apple says that when you use the ChatGPT extension without logging in, only your request and any attachments (such as files or photos) are sent to ChatGPT to process your query. OpenAI does not receive any information associated with your Apple account, your IP address remains hidden, and only your approximate location is shared.
OpenAI does not store your query or its responses, nor is your data used to enhance or train their models. Your ChatGPT account settings and OpenAI’s data privacy policy will only apply if you choose to log in.
Siri integration
The combination of Siri and ChatGPT noticeably expands the voice assistant’s capabilities beyond what you’re used to. This integration is ideal for complex queries involving problem solving, writing help, detailed explanations, and step-by-step instructions. You’ll find that responses are more detailed and context-aware than standard Siri functionality.
Siri will analyze each request to decide whether a ChatGPT response is required, but you can insist that ChatGPT be used simply by starting your Siri query with “Ask ChatGPT.” This opens up even more of the integration’s utility. For example, you can ask ChatGPT to generate images based on prompts, and it will use DALL·E to do the heavy lifting. Even better, the results are often better than what Apple’s Image Playground produces. You can save the resulting image using the Save button in the upper-right corner of the output card.
Pro tip: If you invoke ChatGPT while composing a message and ask it to generate an image, it will even place the image directly in the text field, ready to be shared in the conversation.
You can also ask ChatGPT a question about something on your screen, and Siri will proactively send a screenshot of it or, if the content is lengthy, the entire content as a file.
You can use the Copy button in the upper-right corner of the scrollable response window to copy the output to the clipboard.
Or, you can save a helpful reply by calling Siri and saying “save this to my notes,” which gives you a searchable archive of the conversation in Notes that’s retained after the conversation ends. This feature is especially useful if you are not logged into ChatGPT and cannot return to your account to view chat history.
ChatGPT can be used with Siri, but it’s also integrated into Writing Tools and Visual Intelligence. With Writing Tools, ChatGPT can generate text, and with Visual Intelligence, it can answer questions about what the camera is seeing. We discuss these specific integrations below.
ChatGPT and writing tools
With the arrival of ChatGPT integration in iOS 18.2, Writing Tools gains a new Compose option. This lets you describe what you want to write, and ChatGPT will create it for you.
But the Compose option doesn’t limit you to text prompts. If you look at the text input field, you’ll see a + button. Tapping it brings up the option to upload files or images from your iPhone to ChatGPT so the chatbot can refer to them when responding to your prompt. Once a response is generated, you’ll also see further suggested queries from ChatGPT in the Compose panel.
Compose functionality isn’t limited to generating text from files or images stored on your iPhone. For example, if you’re in a note, you can select the text you want ChatGPT to use, or you can have it reference all of the text in the note. You’ll also see these options in a tappable menu above the Compose input field.
ChatGPT’s writing options work virtually anywhere on the iPhone where you can access Writing Tools, such as Notes, Messages, and Safari, as well as in third-party apps that support the Apple Intelligence feature set.
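For developers, supporting Writing Tools in a third-party app is largely automatic when the app uses the system text views. The snippet below is a minimal sketch, assuming iOS 18’s UIKit API (UITextView’s writingToolsBehavior property); the view controller and its setup are illustrative, not a definitive implementation.

```swift
import UIKit

// Minimal sketch of how a third-party app can opt in to Writing Tools.
// Assumes iOS 18+ and UIKit: system text views get Writing Tools support
// by default, and writingToolsBehavior only tunes how much of the
// experience is offered inside the view.
final class NoteEditorViewController: UIViewController {
    private let textView = UITextView()

    override func viewDidLoad() {
        super.viewDidLoad()
        textView.frame = view.bounds
        textView.autoresizingMask = [.flexibleWidth, .flexibleHeight]
        textView.font = .preferredFont(forTextStyle: .body)
        view.addSubview(textView)

        if #available(iOS 18.0, *) {
            // .complete allows the full inline rewrite experience,
            // .limited keeps suggestions in the overlay panel,
            // and .none opts the view out entirely.
            textView.writingToolsBehavior = .complete
        }
    }
}
```

In practice, an app that sticks to standard UITextView or SwiftUI TextEditor controls inherits Writing Tools (and with it the ChatGPT-backed Compose option) without extra work; the behavior setting mainly matters for apps that want to restrict or disable it.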
ChatGPT and visual intelligence
Visual Intelligence is an iPhone 16 feature that uses the Camera Control button on the lower-right side of the device. Press and hold it to enter Visual Intelligence mode, where the camera can be used to identify things around you.
For example, if you point your camera at an object or take a photo of it, you can tap the Ask button and ChatGPT will analyze the contents of the viewfinder to identify what it sees. If the output doesn’t answer your question, you can follow up by typing in the ChatGPT input field. This is great for getting more information about just about anything around your home or when you’re out and about.
Future chatbot extension for iOS
There are reports that Apple Intelligence will be integrated with artificial intelligence chatbots such as Google’s Gemini and Anthropic’s Claude in the future, but no official announcement has been made yet. Still, Apple software chief Craig Federighi said in June that he’d like to see Gemini integration happen at some point.
According to Bloomberg’s Mark Gurman, Apple may delay the integration of Google Gemini until next year to give OpenAI an exclusivity window, especially considering that Apple is not paying for the technology. However, it’s unclear whether Gemini’s arrival will coincide with an iOS 18 update in the spring, or whether it will be part of the iOS 19 release cycle later next year.