Hi there, I'm Nick
I design and build humane interfaces -- mostly in Swift (iOS & macOS), sometimes in Rust and Python.
My work bridges HCI research, developer tools, and new ways of working with AI.
San Francisco
Open Source Tools
- Stitch -- founding engineer of this open-source tool for designers
- Chroma-Swift -- Swift package for Chroma's on-device database engine
- TiktokenSwift -- Swift bindings for OpenAI's tiktoken via UniFFI
- Roboflow Swift SDK -- first SDK for running Roboflow-trained models on iOS
- AudioKit -- helped launch this open-source audio synthesis/analysis framework
Prototypes
- Visual iMessage -- what if Siri could describe images in a thread?
- Diffusion Demo -- SwiftUI interfaces for Inception Labs' model
- ASL Classifier -- detecting ASL signs on-device with CoreML
Earlier Work
- Emulating Touché -- open-source capacitive sensing with plants & water
- O Soli Mio -- radar-powered gestural interfaces for music
- Whistlr -- contact sharing over audio on iOS
- Push-to-Talk Chat -- lightweight audio chat app
- Plus experiments with BLE sensors, CoreML sound recognition, LED control, and more (archive)
Publications
- O Soli Mio: Exploring Millimeter Wave Radar for Musical Interaction (NIME 2017)
- Investigation of the use of Multi-Touch Gestures in Music Interaction (MSc Thesis, University of York)
Get In Touch
Send me a note at nicholasarner (at) gmail (dot) com, or find me on Twitter.