Apple Intelligence

@dan Would love to see the text and image generators added to RW products.

We had a hunch Apple would integrate AI into the OS. It’s the Steve Jobs “Dropbox is a feature, not a product” story all over again, this time with AI.

Using the system-level APIs will make it easier and more seamless to integrate into RapidWeaver. There’s lots to dig into to see how this actually works and what’s possible, but today’s announcement is great news!


So true…

Now the question is what Apple will expose to you, and when, so you can implement it in Elements. And, besides the obvious stuff, to what creative uses can you put Apple Intelligence and App Intents?

These are good questions. I have been building AI prompts and have learned a lot from my studies. Most of my teachers have been crap, repeating the same old stuff, but a couple have been very useful.

My point is: will I be able to customize the prompts enough to get the same or better results? How easy, difficult, or automated will prompt building be? What granular controls will I have for creating text and images? These are things I hope Apple thought of. Without them, it will all sound like an AI generated the results.

By now, I’m sure most of us recognize the garbage-in, garbage-out results most people get. I have turned 180 degrees from the early days of using AI. I can now get results as good as or better than my own original work, and WAY better researched, in seconds, all day long. So I hope Apple’s implementation is a good one. :slight_smile:

1 Like

Based on the keynote alone, I’d say mostly text and image creation for now. And of course the writing coach functionality, as that works in any Mac application that accepts text input.

But who knows - Apple hinted at more integration options to be rolled out in the coming year, so maybe in the near future you can simply ask Siri to build a website in RapidWeaver for you.

On the iPhone, if a task is too complex for the on-device neural engine to handle, it sends it off to Apple’s cloud servers. If even they are incapable of handling it, a prompt will pop up asking you if you agree to send the query off to ChatGPT instead. Still anonymous, as Apple’s server will make the request for you and obfuscate your identity (including your IP).

So, in short, anything that ChatGPT can do, Apple Intelligence can do. I assume the same is true for Apple Intelligence on the Mac.

You will need an Apple Silicon-based Mac for this (although even the entry-level M1 will do). Apple Intelligence will not be present in macOS if you’re running it on an Intel-based machine.


1 Like

This ties into this post: AI integration in RW Elements

The Keynote yesterday was probably one of the more interesting ones I’ve seen in recent years. I like Apple’s approach to AI. As usual they are a bit late to the party, but when they do show up they do it with style, substance, and polish. :dancer::man_dancing:

IMHO the most useful Apple Intelligence features for RapidWeaver would be Writing Tools and Image Generation.

While I wouldn’t personally want to talk to Siri to build a website, I can imagine it would be an incredible accessibility feature for people who are paralyzed or have other motor function disabilities that prevent them from using a mouse or keyboard.

Looking forward to seeing how Apple Intelligence progresses.

On the pure text/image side I agree with you, but I mostly see that as bling for the masses, however useful it is. I think the really interesting stuff starts to happen when developers start using Siri and Apple Intelligence with App Intents: the ability for your text/voice to actually do things on your device. Some simple examples:

“Republish the entire website.”
“Publish the Homepage and the new Product A landing page.”
“Change the website theme to Lander Pro.”
“Change the website theme to the same one I used on the website I started last week.”
“Ignore the Master Style for this page and change the background color to blue.”

I could go on and on, obviously.
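For the curious, commands like these map pretty naturally onto Apple’s App Intents framework. Here’s a hedged sketch of how the first example could be modeled — note that `PublishingEngine` and `republishAll()` are invented for illustration; RapidWeaver’s actual internal API is not public, and nothing here is confirmed for Elements:

```swift
import AppIntents

// Hypothetical sketch: modeling "Republish the entire website" as an App Intent.
// The publishing API below is made up for illustration purposes only.
struct RepublishWebsiteIntent: AppIntent {
    static var title: LocalizedStringResource = "Republish Website"
    static var description = IntentDescription("Republishes the entire website.")

    func perform() async throws -> some IntentResult & ProvidesDialog {
        // A real implementation would call into the app's publishing engine here.
        try await PublishingEngine.shared.republishAll()
        return .result(dialog: "Your website has been republished.")
    }
}
```

Once an app ships intents like this, Siri (and Shortcuts) can discover and invoke them, parameters and all.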

1 Like

Perhaps not an entire site, but imagine this:

“Siri, add a grid of images here that resize automatically when a user hovers over them with their mouse.”

Or, perhaps even better:

“Siri, make this link so that the document always opens in a new window or tab instead of the user’s browser downloading it.”


Exactly! App Intents are the key, but they’re a ways off for now. Not only does Apple need to release them; app publishers need to integrate them too.

So all the things you and I just typed as examples need to be implemented in RapidWeaver before Siri can do anything with them.

At least, that’s how I understand it.
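That matches how the framework works: the app both implements the action and declares the phrases Siri should recognize for it. A minimal sketch, assuming a hypothetical `ChangeThemeIntent` (the intent and any theme-switching API are invented here; this is not a confirmed Elements feature):

```swift
import AppIntents

// Hypothetical intent for "Change the website theme to X".
// The theme-switching logic is a placeholder; no real RapidWeaver API is implied.
struct ChangeThemeIntent: AppIntent {
    static var title: LocalizedStringResource = "Change Website Theme"

    @Parameter(title: "Theme Name")
    var themeName: String

    func perform() async throws -> some IntentResult {
        // The app itself must implement the actual behavior here,
        // e.g. look up the theme by name and apply it to the project.
        return .result()
    }
}

// Declaring App Shortcuts tells Siri which spoken phrases map to the intent.
struct RWShortcuts: AppShortcutsProvider {
    static var appShortcuts: [AppShortcut] {
        AppShortcut(
            intent: ChangeThemeIntent(),
            phrases: ["Change the website theme in \(.applicationName)"]
        )
    }
}
```

So yes: until the app ships something like this, Siri has nothing to hook into, no matter how smart Apple Intelligence gets.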


We’ll see how things land at GM. Right now I’m sure there will be many iterations through the macOS beta process. I only have one machine while working abroad, so I’ll wait before giving the OS a try; there have been times in the past when testing betas messed up my machine.

I do have a second machine that I can run the beta on, but that’s an Intel-based machine, so no Apple Intelligence :frowning:

My primary machine is an M1, but I’m not installing a beta on that (this is the machine I earn my income on after all).

So I’ll let others try it out this time :smiley:


My Mac mini is an Intel machine as well. I need to relegate it to office web server duty and buy a new Mac mini for trashing with beta builds. Being on the road as much as I am, though, I’ll have to get it online: either host it in the data centre and remote in, or use No-IP with a subdomain and run it from the home office. At least now I have an 800 Mbps fibre line, so a few users should be fine.

Yeah I definitely can see its potential, just wondering how many people would use it. I feel like a lot of people still don’t like “talking” to their computer, but I don’t have any stats to back that up, just a feeling.

Of course, if Apple improves Siri so that it’s not a complete dumpster :fire: and it can actually understand context (getting Siri to understand and respond correctly to what people are asking would be a great start), then perhaps people would use it more. I’m really looking forward to seeing what Siri will be able to do with all these AI advancements.

Also, I like that they’re adding the feature to AirPods Pro 2 where you can answer Siri by nodding yes or shaking no. It makes it a little less weird, since you don’t have to randomly shout out yes or no in public.

Hmm, I spend a lot of time asking myself questions and then answering them. At this point I just tell people I’m talking to “Siri”. I hope Siri responds audibly with more context so that at least people start believing me. Right now people think I’ve gone mad in the office, lol.

1 Like

It’s a generational thing.

Me? I grew up in the 80s. Everybody talked to everything on TV, from Matt Trakker’s computer in M.A.S.K., via the Enterprise-D’s ship computer, to Knight Rider’s KITT. Not to mention the Droids in Star Wars.

I guess my generation was conditioned to talk to inanimate objects and expect them to talk back…



Don’t forget Twiki (yeah I know I’m older) :robot:

1 Like

You don’t know how happy I was when everyone started using AirPods as I was no longer the only one wandering around talking to himself.

I think we’re from the same generation, I remember my 80’s morning cartoons (Saturdays were the best) and good old TNG. :slightly_smiling_face:

Talking to computers has been a common sci-fi trope since well before our generation, though. It’s been in the public consciousness for a long while, yet I can’t recall a time I’ve ever come across someone in public verbally communicating with their phone. Take translation apps, for example: they’ve come miles in their accuracy and ability to support fluid two-way communication in real time, yet when walking the streets in foreign countries I still see people typing into their phone and then showing the translation to the other person on the screen, or simply using the tried and true “point to the thing you want and make hand and head gestures”.

Voice commands seem more used in the home, for example “Alexa play pop playlist” or “Hey Google search for movies with Harrison Ford”, etc. Or maybe in the car for getting directions or replying to texts.

Anyway, these are all personal observations. Indeed it could be generational, or even regional. I do believe that with the advancements in AI, people are going to move in the direction where talking to their tech becomes natural.

1 Like

Haha yeah exactly this, I still feel weird talking to tech in public, can’t shake the whole “Why’s that person talking to themselves” mentality lol. AirPods definitely help reduce the “That person’s crazy” vibe. :crazy_face:

1 Like