
Apple Intelligence is still not ready to wow you

The Verge: https://www.theverge.com/2024/10/28/24279804/apple-intelligence-ios-18-1-siri-ai

Apple Intelligence is here, but it still has a lot to learn

Essentially, there are two Apple Intelligences: the one that’s here now and the one we might see in the future. Even in today’s launch announcement, Apple is busy teasing features that haven’t arrived yet. A few of the tools that are here are designed to help you find the signal amid the noise. That’s the theory, anyway.

Other features do what’s expected, but they lack a human touch. I didn’t send any of the AI-suggested replies in my messages, even though they conveyed the right sentiments. If I’m going to take the time to respond to a text, I might as well write “That’s tough” myself rather than have AI do it, you know? That’s part of the point of texting someone. I also prompted Photos to create a memory of moments with my kid, which it did, but it gave the result the eerily impersonal title “Joyous Moments with Child.”

There is, of course, an upgraded Siri. It looks different, but you don’t need to use it for long to realize it’s mostly the same as before. It handles natural language better and includes more product knowledge to help you find settings on your iPhone, but that’s about it right now. In the near future, Apple will bring new features to Siri, such as a ChatGPT extension due by the end of the year. The big stuff, like contextual awareness and the ability to take action across apps, is further out.

Source: Apple Intelligence is here, but it still has a lot to learn

How useful is a Magic Eraser rival, or a summary that flags a “medical emergency” in your inbox? A closer look at Apple’s new tools

Over in Photos, you’ll find the new Clean Up tool among your editing options. It’s designed to remove objects from a scene in a few seconds: you can tap an object the tool highlights or select one yourself. It runs on-device, so you only have to wait a few moments before the selected object (mostly) disappears.

Clean Up does a good-enough job, especially on smaller objects in the background. Sometimes it’s better than Google’s older Magic Eraser tool, but only occasionally, and it’s not as good as Magic Editor. That tool runs in the cloud, so it’s a bit of an apples-to-oranges comparison, but still. Clean Up’s results are pretty similar to what I get from Magic Eraser, which isn’t much of an argument for upgrading.

The notification summaries seem a little more promising to me; at the very least, they’re sometimes pretty funny. But they also surfaced a bit of important information in a string of texts from a friend, and had I not seen that summary when glancing at my phone, I might not have read those messages until much later. That was helpful.

In the Mail app, AI summaries appear where the first line of an email would normally show up when you’re viewing your inbox; there’s also an option to summarize individual emails. Perhaps it’s a reflection of how useless email has become, but I didn’t find either feature very useful. Email already comes with a summary that works fine for me: the subject line. At least that’s true of most emails I get; they’re usually short and to the point. Maybe Tim Cook saves himself a lot of time by summarizing long emails, but I could live without a summary of every message the DNC sends me.

Apple uses artificial intelligence to summarize notifications so you can see what’s going on at a glance. You can also summarize long emails and use a new focus mode that filters out unnecessary distractions. After a week of using them, I don’t feel like I’ve saved much time or energy.

One time, a summary of my work emails mentioned a “medical emergency,” so I checked my inbox to see what was up. It turned out someone said they were responding a day late due to a medical emergency but that they were fine. It wasn’t an important work email (and I’m glad they were okay), but the summary sent me to my inbox when I didn’t need to go. More often than not, I found myself clicking into notifications just to see what Apple Intelligence was highlighting.

Summaries are a core part of Apple Intelligence; you can use them to get an overview of web pages and even your notifications. If you have a stack of messages from a group chat, the summary will highlight the important things that were said, and you can tap in to see the full details. I haven’t gotten much use out of this, though, since my summaries are often garbled.

Elsewhere, you’ll see the option to send Smart Replies in Messages and Mail: quick AI-generated responses based on the context of the conversation, like “Thank you” or “Sounds good.” They can be helpful, but it’s hard to get excited about a feature email services like Gmail have offered for years.

You can type to Siri now, though this is technically not new: it was previously an accessibility setting, which Apple has now baked into the main experience, finally catching up to Alexa and Google Assistant, which have offered typed queries by default for years. The new design is nice, and Siri is better at comprehending a query even if you trip up while asking it. But beyond the new coat of paint, it still feels much the same as before, and you might come away a bit let down.

So what can you do right now? Writing Tools lets you rewrite, proofread, or summarize text just about anywhere in the operating system; Rewrite, for example, can shift a sentence’s tone from casual to professional. Too bad it’s nearly impossible to remember the feature exists, because it only appears when you highlight text. Writing Tools might fare better with a dedicated button on the virtual keyboard.
