Project Details
YEAR 2023
About
CONCEPT for a cross-platform iOS, macOS, and visionOS integration of an LLM-based Apple AI assistant.
Scope
Discovery
→ Examine existing AI solutions like Google Bard, Bing Chat, and Perplexity.ai for UX and usability
→ Examine existing Apple Spotlight and Siri features & functionality to identify redundancy
→ Identify specific potential use cases for this technology
Design
→ Replicate iOS, macOS, and visionOS using official Apple UI Figma kits
→ Build on existing Apple UI standards and iterate on Chat concept integrations for each platform
Development
→ Built exclusively in Figma
→ High-fidelity mockups of standalone interface elements & features added to current Apple devices
Challenges & Reasoning
Would Apple Chat be a more capable technology introduced alongside Siri, or would Siri remain, with Apple Chat's enhanced LLM-powered features simply folded into it? Either way, how do we reconcile Apple Chat with existing features like Spotlight Search on macOS or Search on iOS and iPadOS?
New Macs already have a dedicated key (F4) for Spotlight Search, so it was logical to start building here. The UI is already excellent and serves its purpose well (centralized, accessible system-wide at all times, results from multiple sources, etc.), but its naming and its overlap with other search surfaces are confusing.
Similarly, iPhones and iPads have their own version of Search (just Search, not Spotlight Search), reached by swiping down on the Home Screen, with Siri Suggestions thrown in for good measure and essentially the same functionality as Spotlight Search on macOS.
There are now so many places to search for things in Apple’s interfaces that the purpose of Spotlight/Search has become far less clear. And while search was getting cluttered, Siri was neglected, with no significant updates or new functionality.
You can find things with Spotlight/Search, but you can’t really ask questions to learn more. You can find a Calendar event or message or note, but you can’t make one… you need to use Siri for that. With both Siri and Spotlight/Search, the interaction always ends with finding the information and navigating out to the source. I’m imagining that more computing can happen inside “search,” with conversation as the means of interaction instead of clicks or gestures.
You type to Spotlight, but you talk to Siri? Unless you use Type to Siri, which is already an established Accessibility feature. Typing to Siri feels like a better Siri, but a worse Spotlight/Search. Siri can’t do what Spotlight/Search does, and Spotlight can’t do what Siri does. The issue is that they’re separate.
This was my thinking behind Apple Chat. There should be one clean, clear, and obvious way to search for and do anything and everything on your phone: local file lookup, Internet search, or turning on the flashlight. Maybe Apple Search or Unified Search as a name instead?
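To make that unification concrete, here is a minimal Swift sketch of the idea: one entry point that classifies a request as find, ask, or act, and handles all three in place so the interaction can continue as a conversation instead of ending at a search result. Every type and name here is hypothetical and purely illustrative, not an Apple API; the trivial keyword classifier stands in for whatever on-device model would actually make this decision.

```swift
import Foundation

// Hypothetical: the three kinds of work a unified entry point would
// absorb from Spotlight/Search and Siri.
enum ChatIntent {
    case find(query: String)    // Spotlight/Search today: locate a file, note, or event
    case ask(question: String)  // LLM territory: answer a question, allow follow-ups
    case act(command: String)   // Siri today: make an event, turn on the flashlight
}

// Stand-in classifier: a real system would use an on-device model;
// a trivial keyword check is enough to illustrate the routing.
func classify(_ input: String) -> ChatIntent {
    let lowered = input.lowercased()
    if lowered.hasPrefix("turn on") || lowered.hasPrefix("create") || lowered.hasPrefix("set") {
        return .act(command: input)
    } else if lowered.hasSuffix("?") {
        return .ask(question: input)
    }
    return .find(query: input)
}

// One surface handles all three, so the interaction doesn't end with
// navigating out to a source.
func handle(_ input: String) -> String {
    switch classify(input) {
    case .find(let query):
        return "Searching files, apps, and the web for \"\(query)\""
    case .ask(let question):
        return "Answering \"\(question)\", with follow-ups kept in context"
    case .act(let command):
        return "Performing \"\(command)\" and confirming inline"
    }
}

print(handle("quarterly report"))           // routed as find
print(handle("what's on my calendar?"))     // routed as ask
print(handle("create a note about lunch"))  // routed as act
```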
Future Considerations
If I were to revisit this project, I would draft up some versions of this as a rebranded, enhanced-capability Siri. The approach and UI/UX would be similar, but instead of introducing Apple Chat, every AI-powered/contextual search feature would be cleaned up and branded as a capability of Siri.
I would also be interested in incorporating design cues that allow for voice input as well as text when interfacing with the system, not only for accessibility, but also to reinforce the iterative and conversational benefits of the technology.
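As a purely illustrative sketch of that idea (every type and name below is hypothetical, not an Apple API), typed and spoken input can normalize into a single message type, so both modalities feed the same conversation pipeline and voice becomes a first-class peer of the text field rather than a separate mode:

```swift
import Foundation

// Hypothetical: both modalities collapse into one message type, so the
// conversation loop never needs to know how the words arrived.
enum InputModality { case typed, spoken }

struct UserMessage {
    let text: String
    let modality: InputModality
}

// Stand-in for the conversation engine; a real system would hand this
// to the assistant and stream a reply into the same thread.
func submit(_ message: UserMessage) {
    print("(\(message.modality)) → \(message.text)")
}

// Typed path: a text field commits a string directly.
submit(UserMessage(text: "Move my 3pm to Friday", modality: .typed))

// Spoken path: a speech recognizer (e.g. Apple's Speech framework)
// would produce a transcription that enters the same pipeline.
let transcription = "Move my 3pm to Friday" // stand-in for recognizer output
submit(UserMessage(text: transcription, modality: .spoken))
```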