I turned the Notes app on my iPhone into a ChatGPT-powered memory bank

A single button tap is all it takes to save a snapshot, summarize it, save the URL, and save it as a visual note.

iPhone 17 Pro in a hand. Nadeem Sarwar / Digital Trends

Saving memories is usually a hassle. URLs go into bookmarks or get copied and pasted into a dedicated app. Then there are photos of posters, or the dozens of screenshots we take, which pile up like a cluttered mess in the Photos app.

Imagine a system where you press the iPhone’s Action Button. It takes a screenshot, writes a brief summary of the on-screen content, adds hashtags for quick search, and automatically saves all the information in an app of your choice. 


Sounds terrific, right? Well, the iPhone won’t do that. Android devices, such as the OnePlus 15, can. And that leaves you with a tedious process where you have to save, move, and manage important memories. 

What’s important is not always easy to find

I have over two thousand screenshots saved on my iPhone. Most of them capture content I wanted to revisit later: random articles, a cool meme, course material, job listings, a music video on YouTube, and more. I want to be able to find all that information again.

It’s easier said than done. The search system built within the Photos app is simply not smart enough. I recently saved the screenshot of a fantastic photo series documenting the local dance traditions of Shillong, hoping to visit the place in the near future. 

Between saving that information and the weeks of work that followed, I took dozens of fresh screenshots for work duties. When I finally got some breathing space to explore the nearby village, I couldn’t find the original screengrab. 

I didn’t have the heart to scroll through a long gallery, and the search system didn’t help with contextual text-based queries. I was disappointed and somewhat furious. My iPhone 17 Pro can run AI models locally, but it doesn’t offer a system that can save important stuff with some useful analysis attached.

You see, on the OnePlus 15, you get a feature called Mind Space. With the press of a button, a snapshot of your phone’s screen is saved. The onboard AI analyzes it, writes a short summary, saves the page URL, and can even create quick to-dos from what it sees.

All of these memories are saved with a headline and a preview of the screenshot, each as a dedicated card in the app. It’s like an AI-powered memory vault, one that is now also integrated with Gemini. The iPhone doesn’t offer any such convenience, despite ChatGPT being baked into the heart of Apple Intelligence.

Thankfully, the Shortcuts app offered a respite. And even though it took a few minutes and some trial and error, I was able to replicate the same functionality as Mind Space on OnePlus phones and its equivalent on Nothing smartphones.

How do I get it done? 

My target was to create a quick and clear task flow, which looks like this: 

Button press > Screenshot saving > Analyze content > Write summary, save URL, create tags > Save to Notes. 
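That flow can be modeled as a small pipeline. Here is a minimal Python sketch of the idea, where `summarize`, `extract_tags`, and the `notes` list are hypothetical stand-ins for the AI model and the Notes app; the real shortcut performs these steps inside iOS:

```python
import re

def summarize(text: str) -> str:
    # Stand-in for the AI model: use the first sentence as a "summary".
    return text.split(". ")[0].strip().rstrip(".") + "."

def extract_tags(text: str, limit: int = 3) -> list[str]:
    # Stand-in for AI tagging: hashtag a few distinctive (longer) words.
    words = {w.lower() for w in re.findall(r"[A-Za-z]{5,}", text)}
    return ["#" + w for w in sorted(words, key=len, reverse=True)[:limit]]

notes: list[dict] = []  # stand-in for the Notes app

def save_memory(screen_text: str, url: str) -> dict:
    # Button press -> screenshot text -> analyze -> summary/tags/URL -> save.
    note = {
        "summary": summarize(screen_text),
        "tags": extract_tags(screen_text),
        "url": url,
    }
    notes.append(note)
    return note

note = save_memory(
    "A photo series documents local dance traditions of Shillong. "
    "The photographer spent a year in the region.",
    "https://example.com/shillong-dance",  # hypothetical URL
)
print(note["summary"])
```

Each card in Notes then carries the summary, the hashtags for search, and the source URL, which is exactly what makes the screenshot findable weeks later.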

To get it working, I headed over to the Shortcuts app on my iPhone and sketched out the chain of actions. After deciding the exact sequence of commands, I searched for the “Use Model” action and selected the AI model for the analysis and text-generation work.

When picking the model, you can choose between Apple’s Private Cloud Compute, on-device Apple Intelligence, and ChatGPT. For maximum speed, on-device works best. I, however, went with ChatGPT because it has been updated to the smarter GPT-5.1 model with better reasoning capabilities.

Next, I described the requirements in the “Use Model” field in natural language. Finally, I selected the built-in Notes app as the destination for saving these memories. You can also pick any other app of your choice.

Once the shortcut was ready and I had given it a name, I moved on to the physical trigger. Since I barely use the Action Button on my iPhone, I headed over to the Settings app and configured it to trigger the shortcut of my choice.

That’s the end of it. Now, every time I need to save a nugget of information, I simply long-press the Action Button, and the on-page information (along with a summary and the URL) is saved to the Notes app. All of it happens in the background, without jumping back and forth between apps.

But what if you already use the Action Button for something else? Well, there’s a workaround for that, as well. You can park the same shortcut in the Control Center and trigger it from there. But you will have to make a small modification.

Specifically, you will need to build a one- or two-second delay between tapping the shortcut button in the Control Center and returning to the app where you want to take a snapshot. The delay is crucial.

If the shortcut takes a screenshot the instant you trigger it, the saved image will be of the Control Center itself. With a delay, you have time to swipe the Control Center away and return to the app or page you want to capture. Alternatively, you can use Siri to trigger the shortcut with a voice command.
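The timing trick boils down to inserting a wait before the capture step. A tiny Python sketch of that logic, with a hypothetical `capture` callable standing in for the real screenshot action:

```python
import time

def capture_after_delay(capture, delay_s: float = 1.5):
    # Wait so the Control Center can be dismissed before the screenshot fires.
    time.sleep(delay_s)
    return capture()

start = time.monotonic()
# Stand-in capture; a short delay keeps this demo quick.
shot = capture_after_delay(lambda: "screenshot-of-foreground-app", delay_s=0.2)
elapsed = time.monotonic() - start
print(shot)
```

In the actual shortcut, the equivalent is a “Wait” action placed before “Take Screenshot,” which is why a one- or two-second value is enough.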

Overall, thanks to a little Shortcuts and AI plumbing, you can save yourself the hassle of taking random screenshots and losing them in a crowded gallery. More importantly, you can customize the whole flow of saving the information, or add more steps to it, to fit your needs.

Nadeem Sarwar
Nadeem is the Managing Editor at Digital Trends.