We're just a few days away from Apple's WWDC 2024 keynote, and we expect the event to focus heavily on software. It all starts at 10am PT / 1pm ET / 6pm BST on June 10 (3am AEST, June 11) and luckily the special event will be live-streamed so you can watch.
While TechRadar will be on the ground, I polled some of my colleagues ahead of the developer conference to get an idea of what we want Apple to talk about and even reveal. These are the most frequently requested items, and they aren't tied to a single platform or product.
Read on for five things we expect Apple to announce at WWDC, from a smarter Siri to a more advanced iPadOS, AI photo editing, and handy iPhone features, all of which Apple's Tim Cook will likely show off on stage.
5. A smarter keyboard for iPhone
Alongside the long-awaited treasure trove of AI features, covering everything from summarizing notes to letting Siri control apps, I expect we'll see some improvements to the keyboard on the iPhone and, why not, the iPad as well.
The keyboard already suggests words and phrases above it that take context into account. However, incorporating AI that understands the message, what has already been said, or even what you were doing when the keyboard appeared could make it even more practical. Considering smart replies and suggested responses are rumored for Messages and Mail, building this additional intelligence into the keyboard itself would seem to go hand in hand.
Incorporating a more formal grammar, spelling, and punctuation checking system would also help keep messages a little cleaner, and it could save you from having to edit yourself or rewrite something entirely.
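For context, iOS already ships a basic on-device building block for this in UIKit's UITextChecker. Below is a minimal sketch (our own illustration, not anything Apple has announced) of flagging misspelled words in a draft message; an AI-powered keyboard would presumably go much further with grammar checks and rewrite suggestions.

```swift
import UIKit

/// Minimal sketch: flag misspelled words in a draft message using UIKit's
/// built-in UITextChecker. Any smarter, AI-driven rewrite feature is our
/// speculation — this only shows what's available to apps today.
func misspelledRanges(in text: String, language: String = "en_US") -> [NSRange] {
    let checker = UITextChecker()
    var ranges: [NSRange] = []
    var searchStart = 0
    let fullLength = (text as NSString).length

    while searchStart < fullLength {
        let misspelled = checker.rangeOfMisspelledWord(
            in: text,
            range: NSRange(location: 0, length: fullLength),
            startingAt: searchStart,
            wrap: false,
            language: language
        )
        guard misspelled.location != NSNotFound else { break }
        ranges.append(misspelled)
        searchStart = misspelled.location + misspelled.length
    }
    return ranges
}

// Example: "teh" and "recieve" would be flagged for correction.
let draft = "I will teh recieve the files tomorrow"
print(misspelledRanges(in: draft))
```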
4. AI photo editing in the Photos app
Between third-party apps like Snapseed and Pixelmator on iPhone, iPad, and Mac, and competing phones like Google's Pixel with Magic Eraser and the Galaxy AI suite on Samsung phones, it's time for Apple to step up the editing game in Photos.
Whether on iPhone, iPad, or Mac, I'd love to be able to intelligently erase a person or object from the background of an image. Some sort of super button that goes beyond simply altering white balance or contrast to integrate smart crops and other intelligent photography tools could make the whole process much simpler and give Apple a way to show off its generative AI skills.
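As a rough illustration of what's already possible on-device, Apple's Vision framework can generate a person-segmentation mask. A hypothetical magic-erase feature could start from a mask like this, though the inpainting step that actually removes the subject isn't a public API today.

```swift
import Vision
import CoreVideo

// Sketch only: produce a person-segmentation mask with Vision.
// A real "erase" feature would still need a (currently non-public)
// inpainting step to fill in the background behind the mask.
func personMask(for image: CGImage) throws -> CVPixelBuffer? {
    let request = VNGeneratePersonSegmentationRequest()
    request.qualityLevel = .balanced
    request.outputPixelFormat = kCVPixelFormatType_OneComponent8

    let handler = VNImageRequestHandler(cgImage: image, options: [:])
    try handler.perform([request])

    // The first observation holds the grayscale mask (person = bright).
    return request.results?.first?.pixelBuffer
}
```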
3. Smarter battery management on iPhone
When you still have a lot to do and your iPhone's battery is running low, you've probably dug into Settings for Low Power Mode. But similar to how Focus modes (basically customized versions of Do Not Disturb depending on what you're doing) can turn on automatically, Apple should add some intelligence to battery management.
So if you're away from home and the percentage hits 50% or less, why not automatically turn on Low Power Mode, or at least send a notification to turn it on, to help your iPhone make it through the rest of the day?
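Since there's no public way for third-party apps to flip Low Power Mode on, the nudge-first approach is the realistic one. Here's a minimal Swift sketch of watching the battery level and firing a local notification at an assumed 50% threshold; the class name, threshold, and wording are ours, and it presumes notification permission has already been granted.

```swift
import UIKit
import UserNotifications

// Hypothetical helper: watch the battery level and nudge the user toward
// Low Power Mode. Apps can't enable Low Power Mode themselves, so a
// notification is the best a non-system app can do today.
final class BatteryNudger {
    private let threshold: Float = 0.5  // 50%, matching the example above

    func start() {
        UIDevice.current.isBatteryMonitoringEnabled = true
        NotificationCenter.default.addObserver(
            self,
            selector: #selector(batteryLevelChanged),
            name: UIDevice.batteryLevelDidChangeNotification,
            object: nil
        )
    }

    @objc private func batteryLevelChanged() {
        let level = UIDevice.current.batteryLevel  // -1 if unknown
        guard level >= 0, level <= threshold,
              !ProcessInfo.processInfo.isLowPowerModeEnabled else { return }

        let content = UNMutableNotificationContent()
        content.title = "Battery at \(Int(level * 100))%"
        content.body = "Consider turning on Low Power Mode to get through the day."
        let request = UNNotificationRequest(identifier: "low-battery-nudge",
                                            content: content,
                                            trigger: nil)
        UNUserNotificationCenter.current().add(request)
    }
}
```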
2. A more advanced iPadOS
We've heard a lot of reports and rumors about iOS 18: summaries of notes, emails, and web pages, along with a Siri that can control apps and a custom emoji creator. Many of these are likely coming to the iPad courtesy of iPadOS 18, but considering Apple just launched the iPad Pro with the M4 chip and the new iPad Air, pro users of Apple's tablets are still eager for more features.
One idea is a further expansion of Stage Manager, an advanced multitasking experience that lets you place and use multiple windowed apps on the same screen. It's also available on macOS, and when the iPad is docked with the Magic Keyboard, it already feels more laptop-like. It would be nice if using the iPad with the Magic Keyboard unlocked a more desktop-like mode, with more freedom to place open apps and even icons or widgets on the home screen. The key, however, is to keep it touch-first and let users extend those controls with a trackpad, keyboard, or Apple Pencil.
1. A smarter and more intuitive Siri
Like everyone else, I want Siri to do more and be more useful on every device where I can access the virtual assistant. The idea of a Siri that can control specific features of an app and even chain them together is amazing and could be really useful. You wouldn't need to dig through multiple apps to copy a photo, put it into Files, and then share it with a colleague; it could all be a single voice request. This would also be an opportunity for Apple to make its in-house large language model technology more flexible, as it would allow Siri to understand what is on the screen, what was there, and what could appear next; it's about being aware of the context.
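Some of the plumbing for this arguably already exists in Apple's App Intents framework, which lets apps expose individual actions to Siri and Shortcuts. Here's a hedged sketch of what one such action might look like; the intent name, parameter, and dialog are hypothetical, and a smarter Siri would presumably chain actions like this together on its own.

```swift
import AppIntents

// Hypothetical intent for illustration: an action an app could expose
// so Siri (or Shortcuts) can trigger it by voice. The names here are
// ours, not part of any announced Apple feature.
struct SaveToFilesIntent: AppIntent {
    static var title: LocalizedStringResource = "Save Photo to Files"
    static var description = IntentDescription("Copies the selected photo into a folder in Files.")

    @Parameter(title: "Folder Name")
    var folderName: String

    func perform() async throws -> some IntentResult & ProvidesDialog {
        // In a real app, the photo would be copied into the chosen folder here.
        return .result(dialog: "Saved the photo to \(folderName).")
    }
}
```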
The answer seems to be to give Siri the equivalent of a new brain or, rather, to integrate what makes AI chatbots so interesting: a whole new large language model to revamp Apple's virtual assistant. It could be built in-house, but rumors also point to Apple partnering with OpenAI, which could inject some ChatGPT into Siri; for now, it's all speculation. It also raises the question of processing on the device versus sending the request to the cloud; the latter may raise privacy concerns and could also extend the time needed to provide a response.
In the end, I hope Siri will be more useful on the go with requests or queries, but also smarter at home, on Apple TV, on the wrist with an Apple Watch, and even on the Mac.