Categories
Education, Portfolio, Technology

Swift Student Challenge

A few days ago, Apple announced the winners of their Swift Student Challenge. I had applied and used my “taking a test” tactic, which was to hit ‘submit’ and then promptly erase the whole thing from my brain. (What’s done is done, and I feel silly worrying about something I have no control over.)

So when I got the email that “my status was updated” it was a bit of a surprise.

And when I clicked through the link (because, of course, they can’t just say it in the email; you have to sign in) I was in for more of a surprise.

My submission had been accepted. I’m one of 350 students around the world whose work sufficiently impressed the judges at Apple.

Screenshot from Apple Developer website. It reads: Congratulations! Your submission has been selected for a WWDC20 Swift Student Challenge award. You'll receive an exclusive WWDC20 jacket and pin set at the mailing address you provided on your submission form. You'll also be able to download pre-release software, request lab appointments, and connect with Apple engineers over WWDC20 content on the forums. In addition, one year of individual membership in the Apple Developer Program will be assigned free of charge to eligible accounts of recipients who have reached the age of majority in their region. For details, see the WWDC20 Swift Student Challenge Terms and Conditions.
Neat!

Now, throughout the whole process of applying, I was my usual secretive self. I think two people knew that I was applying at all, much less what I was working on. Since it’s over with, though, it’s time for the unveiling.

What I made

I wanted to bring back a concept I’ve played with before: cellular automata. A few days before the competition was announced, I’d seen a video that really caught my interest.

Well hey, I thought, I’ve got some code for running cellular automata. I want to learn Swift Playgrounds. And I’ve been having fun with SwiftUI. Let’s combine those things, shall we?

The first big change was a visual history; when a cell dies, I don’t want it to just go out, I want it to fade slowly, leaving behind a trail of where the automata have spread.

The second was rewriting all the visuals in SwiftUI, which was a fun project. Animation timings took me a bit to get right, as did figuring out how to do an automated ‘update n times a second’ in Combine. The biggest issue I had, actually, was performance – I had to do some fun little tricks to get it to run smoothly. (Note the .drawingGroup() here – that made a big difference.)
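
For a taste of the approach, here’s a rough sketch, with plain Conway’s Life standing in for the playground’s actual grid code (the real version also tracks that fading history): a Combine Timer publisher drives the updates, and .drawingGroup() does the performance heavy lifting.

import SwiftUI
import Combine

// A sketch, not the playground code: Conway's Life on a wrapping grid.
struct AutomataView: View {
    static let size = 40
    @State private var cells: [Bool] = (0..<(size * size)).map { _ in Bool.random() }
    // The "update n times a second" part: four generations per second, here.
    private let ticker = Timer.publish(every: 0.25, on: .main, in: .common)
        .autoconnect()

    var body: some View {
        VStack(spacing: 1) {
            ForEach(0..<Self.size, id: \.self) { row in
                HStack(spacing: 1) {
                    ForEach(0..<Self.size, id: \.self) { col in
                        Rectangle()
                            .fill(cells[row * Self.size + col] ? Color.primary : Color.gray.opacity(0.15))
                    }
                }
            }
        }
        .drawingGroup() // rasterize the grid into a single layer; the big performance win
        .onReceive(ticker) { _ in
            cells = Self.step(cells) // advance one generation per tick
        }
    }

    // One generation of Conway's Life; the edges wrap around.
    static func step(_ cells: [Bool]) -> [Bool] {
        let n = size
        return cells.indices.map { i in
            let row = i / n
            let col = i % n
            var neighbors = 0
            for dr in -1...1 {
                for dc in -1...1 where !(dr == 0 && dc == 0) {
                    let r = (row + dr + n) % n
                    let c = (col + dc + n) % n
                    if cells[r * n + c] { neighbors += 1 }
                }
            }
            return neighbors == 3 || (cells[i] && neighbors == 2)
        }
    }
}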

And third, I didn’t want it to just be “here’s some code, look how pretty,” I wanted to actually use the Playground format to show some cool stuff. This turned out to be the most frustrating part of the whole thing – the Swift Playgrounds app doesn’t actually support creating a PlaygroundBook, and the Xcode template wasn’t supported in the then-current version of Xcode.

But the end result? Oh, I’m quite happy with it. PlaygroundBooks are cool once you get past how un-documented they are. You can, to borrow a Jupyter turn of phrase, mix code and prose in a lovely, interactive way.

Screenshot of the 'Grid' page of the playground book.  The full text is at https://github.com/grey280/SwiftLife/blob/master/Swift%20Student%20Submission.playgroundbook/Contents/Chapters/Chapter1.playgroundchapter/Pages/Grid.playgroundpage/main.swift
Don’t worry, the real version (and some videos) are below.

Doing the actual writing was pretty fun. This is a concept I’ve spent a lot of time learning about, just because it captured my interest, and I wanted to share that in a fun way.

Overall, I’m quite happy with the result. If you’d like to see more, I’ve made recordings of the ‘randomized grid’ and ‘Wolfram rule’ pages running, and the actual playground is available on GitHub.

Categories
Review

“The One Device,” or, “I’m amazed this man didn’t get arrested”

Brian Merchant
It’s rather fitting that I’m writing this review on my iPhone. Parts of the book were written on an iPhone, I suspect, and the author mentioned that a good deal of the interview recordings and photos were made on his iPhone.
Structurally, the book is interesting — there are two through lines, and they’ve got the same writing style but different feels. The more story-like one is the historical aspect, going from the beginning of the project through to the keynote where Steve Jobs introduced the world to the iPhone. And it’s a story, for sure: there’s a narrative to it, characters being introduced, politics and inventions, failures and triumphs. It’s the best telling of the story I’ve read so far, though admittedly I don’t think I’ve actually sat down to read the full story before.1
The other part is more of the ‘now’ aspect, which explores the impact of the iPhone as a product, focusing on the manufacturing process. The author tells how he… makes his way into the Foxconn plant where iPhones are assembled; predictably gets hacked immediately after arriving at a hacker convention; goes on a claustrophobic tour of a tin mine; under-details an agoraphobic tour of the salt flats that produce most of the lithium used in the iPhone’s battery; and makes a few other stops along the way.
All told, it’s an interesting read. Some of the historical context was new to me—the history of ARM was inspiring, for example—and while I already knew a lot of things—photos of those lithium flats are pretty striking—I’m glad I took the time to read it. If you’re at all interested in the history, I can recommend the book.


  1. Creative Selection is on my list to read, so I’ll get there eventually. 
Categories
Programming

SwiftUI’s Picker

I’m very excited about SwiftUI, and have been using what little free time I have to do some tinkering with it. I started during the beta period, which was fun in between bouts of frustration; a lovely side effect was that some of the knowledge I picked up is… entirely wrong. One thing that caught me was the implementation details of the Picker type.
Based on the rather rough state of the SwiftUI documentation for Picker and ForEach,1 I’d assumed that combining the right binding with a .tag(_:) on the items would work:

Form {
    Picker(selection: $selectedItemID, label: Text("Choose Something")) {
        ForEach(items){
            Text($0.label).tag($0.value)
        }
    }
    Text("You've selected item \(selectedItemID)!")
}

For reference, the models I’m referring to throughout are pretty simple:

struct CustomModel {
    let value: Int
    let label: String
}

Now this looks like it’s working in simple cases. However, I was trying to interact with a web API, so that items array looked something like this:

var items: [CustomModel] = [
    CustomModel(value: 7, label: "First"),
    CustomModel(value: 3, label: "Second"),
    CustomModel(value: 1, label: "Third")
]

If you tapped “Second” in the picker that SwiftUI generated, however, the text wouldn’t read “You’ve selected item 3!” like it should; it would be “You’ve selected item 1!”
A bit more tinkering revealed that, instead of pulling the value from the .tag(_:) on there, it was just using… the index in the ForEach.2
After some frustrated Googling, utterly despairing of Apple’s documentation, and a lot of StackOverflow searches, I finally figured out the solution:

Form {
    Picker(selection: $selectedItemID, label: Text("Choose Something")) {
        ForEach(items, id: \.value){
            Text($0.label).tag($0.value)
        }
    }
    Text("You've selected item \(selectedItemID)!")
}

Quite frankly, I don’t have a good explanation of what’s going on here; last time I was tinkering with Pickers, the .tag(_:) provided SwiftUI with the information it needed to do the binding. (When I’ve got more time, I’d like to do another test — now that I’ve got the id keypath, do I even need the tag?)
I’d love a good explanation of what all the id keypath gets used for, and where else it might be necessary, but alas:


  1. It’s a bit unfair for me to link to No Overview Available when referring to SwiftUI; the coverage is low, but the problem isn’t so much that as the fact that ‘documentation coverage’ just doesn’t work as a metric for something like SwiftUI. The tutorials are a start, and a good sign that Apple was at least trying to rethink their approach to documentation, but they’re not nearly complete enough. 
  2. Zero-based index, of course, which seemed obvious to me, but got me a “???” response when I was complaining about this issue to a non-programmer friend. 
Categories
Technology

iOS Notification Routing

The other day I was thinking about the way iOS handles notifications; the new Do Not Disturb stuff in iOS 12 is a good start, but it’s still rather lacking. It’s a fun thought exercise: say you’re Jony Ive or whoever, and you’re setting out to redesign the way that notifications work, from a user standpoint.1 How do you make something that offers advanced users more power… but doesn’t confuse the heck out of the majority of your user base?
After a while dancing around the problem, I came to the conclusion that you don’t.23
Instead, imagine something more along the lines of Safari Content Blockers: by default, the existing system stays in place, but there’s an API that developers can use to implement notification routing, and users can download and install those routing apps as they so desire.4
Obviously, this would have some serious privacy implications — an app that can see all your notifications? But hey, we’re Jony Ive, and Apple has absolute control over the App Store. New policy: Notification routing apps can’t touch the network.5 And, to prevent any conflict of interest stuff, let’s just say that the routing apps aren’t allowed to post notifications at all.
Alright, we’ve hand-waved our way past deciding to do this, so let’s take a look at how to do it, shall we?
Let’s start with the way notifications currently work. From UNNotificationContent we can grab the properties of a notification:
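
A quick listing, using the buildable UNMutableNotificationContent subclass to stand in for a received notification; these are all real, existing properties:

import UserNotifications

let content = UNMutableNotificationContent()
content.title              // String
content.subtitle           // String
content.body               // String
content.badge              // NSNumber?
content.sound              // UNNotificationSound?
content.categoryIdentifier // String
content.threadIdentifier   // String; powers the iOS 12 notification stacks
content.userInfo           // [AnyHashable: Any]
content.attachments        // [UNNotificationAttachment]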


For proper routing, we’ll probably want to know which app sent the notification, so let’s add the bundle ID in there, and we’ll also give ourselves a way to see whether it’s a remote notification or a local one.
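
Something like this, say; the names are placeholders of my own, and a real implementation would have the system fill these in rather than stubbing them out:

import UserNotifications

// Hypothetical additions, sketched as an extension; not real API.
extension UNNotificationContent {
    /// Bundle identifier of the app that posted the notification.
    var bundleID: String { return "" /* filled in by the system */ }
    /// Whether this arrived as a push notification or was scheduled locally.
    var isRemote: Bool { return false /* filled in by the system */ }
}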

Alright, seems nice enough.6
Next up, what options do we want to have available?
1. Should the notification make a sound?7
2. Should the notification vibrate the phone?
3. Should the notification pop up an alert, banner, or not at all?
4. If the user has an Apple Watch, should the notification go to the Watch, or just the phone?
5. Should the notifications show up on the lock screen, or just notification center?
6. Finally, a new addition, borrowing a bit from Android: which group of notifications should the notification go into?8

Alright, that should be enough to work with, let’s write some code.
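
One possible shape for that, with every name invented for the occasion; the fields map one-to-one onto the questions above:

// A sketch of the routing decision object.
struct NotificationRoutingDecision {
    enum Presentation {
        case alert, banner, none
    }

    var playsSound: Bool           // 1. make a sound?
    var vibrates: Bool             // 2. vibrate the phone?
    var presentation: Presentation // 3. alert, banner, or nothing at all?
    var deliversToWatch: Bool      // 4. send it along to the Watch?
    var showsOnLockScreen: Bool    // 5. lock screen, or just notification center?
    var group: String?             // 6. which group to file it under, if any
}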


Not a complex object, really, and still communicating a lot of information. I decided to make the ‘group’ aspect an optional string — define your own groupings as you’d like, and the system would put notifications together when the string matches; the string itself could be the notification heading.9
And with that designed, the actual routing could just be handled by a single function that an application provides:
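
Sticking with the invented names from the sketch above, the whole contract could be as small as a single protocol:

import UserNotifications

// The entire surface a routing app would implement: one notification in,
// one decision out.
protocol NotificationRouter {
    func route(_ notification: UNNotificationContent) -> NotificationRoutingDecision
}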

And with that, I’d be free to make my horrifying spaghetti-graph system for routing notifications, and the rest of the world could make actually sensible systems for it.
Thoughts? There’s a comment box below, I’d love feedback.


  1. I haven’t done much work with the UserNotification framework, so I’m not going to be commenting on that at all. 
  2. I spent a while mentally sketching out a graph-based system, somewhere between Shortcuts and the pseudo-cable-routing stuff out of Max/MSP, but realized pretty quickly that that’d be incredibly confusing to anyone other than me, and would also look very out of place in the Settings app. 
  3. As a side concept, imagine that but implemented in ARKit. “Now where did I put the input from Messages? Oh, shoot, it’s in the other room.” 
  4. Unlike Safari Content Blockers, though, I think this system would work best as a “select one” system, instead of an “as many as you like, they work together!” thing. Mostly because the logistics of multiple routing engines put you back in the original mess of trying to design data-flow diagrams, and users don’t want to do that. Usually. 
  5. I’d call this less of an ‘App Store’ policy and more of a specific entitlement type; if you use the ‘NotificationRouting’ entitlement in your app, any attempt to access the network immediately kills the application. 
  6. Of course, those last two additions wouldn’t be things that you’d be able to set while building a UNNotificationContent object yourself, so we should probably be writing this as our own class; UNUserNotification, perhaps? 
  7. We’ll assume that setting notification sounds is handled somewhere else in the system, not by our new routing setup. 
  8. This would be at a higher level than iOS 12’s new grouped notifications (the stacks), more like the notification channels in Android: categories like ‘Work’, ‘Family’, ‘Health’, and so on. 
  9. Since we’re Jony Ive, and everything has to be beautiful, we’re presumably running it through some sort of text normalization filter so people don’t have stuff going under the heading “WOrk”. 
Categories
Collection

“Personalized Hey Siri”

Apple Machine Learning Journal:

In addition to the speaker vectors, we also store on the phone the “Hey Siri” portion of their corresponding utterance waveforms. When improved transforms are deployed via an over-the-air update, each user profile can then be rebuilt using the stored audio.

The most Apple-like way to continuously improve that I can think of. More interesting, though, is this bit later on:

The network is trained using the speech vector as an input and the corresponding 1-hot vector for each speaker as a target.

To date, ‘personalized Hey Siri’ has meant “the system is trained to recognize only one voice.” That quote, though, sounds like they’re working on multiple-user support, which, with the HomePod, they really should be.
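
If ‘1-hot’ is an unfamiliar term: it’s a vector of zeros with a single 1 marking the correct answer, one slot per speaker. A minimal sketch, with a made-up household:

let speakers = ["Alice", "Bob", "Carol"] // hypothetical household members
let speakerIndex = 1                     // this utterance came from Bob
let target: [Float] = speakers.indices.map { $0 == speakerIndex ? 1 : 0 }
// target == [0.0, 1.0, 0.0]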

Categories
Technology

Tidbits from Apple’s Machine Learning Journal

A short while ago, Apple launched a journal on machine learning; the general consensus on why they did it is that AI researchers want their work to be public, although as some have pointed out, the articles don’t have a byline. Still, getting the work out at all, even if unattributed, is an improvement over their normal secrecy.
They’ve recently published a few new articles, and I figured I’d grab some interesting tidbits to share.
In one, they talked about their use of deep neural networks to power the speech recognition used by Siri; when expanding to new languages, they’ve been able to decrease training time by transferring the trained networks from existing languages’ recognition systems to the new ones.1 Probably my favorite part, though, is this throwaway line:

While we wondered about the role of the linguistic relationship between the source language and the target language, we were unable to draw conclusions.

I’d love to see an entire paper exploring that; hopefully that’ll show up eventually. You can read the full article here.
Another discusses the reverse – the use of machine learning technology for audio synthesis, specifically the voices of Siri. Google has done something similar,2 but as Apple mentions, it’s pretty computationally expensive to do it that way, and they can’t exactly roll out a version of Siri that burns through 2% of your iPhone’s battery every time it has to talk. So, rather than generate the entirety of the audio on-device, the Apple team went with a hybrid approach – traditional speech synthesis, based on playing back chunks of audio recordings, but using machine learning techniques to better select which chunks to play based, basically, on how good they’ll sound when they’re stitched together. The end of the article includes a table of audio samples comparing the Siri voices in iOS 9, 10, and 11; it’s a cool little example to play with.
The last of the three new articles discusses the method by which Siri (or the dictation system) knows to change “twenty seventeen” into “2017,” and the various other differences between spoken and written forms of languages. It’s an interesting look under the hood of some of iOS’ technology, but mostly it just made me wonder about the labelling system that powers the ‘tap a date in a text message to create a calendar event’ type stuff – that part, specifically, is fairly easy pattern recognition, but the system also does a remarkable job of tagging artist names3 and other things. The names of musical groups are a bigger problem, but the one whose workings I wonder about most is map lookups – I noticed recently that the names of local restaurants were being linked to their Maps info sheets, and that has to be doing some kind of on-device search, because I doubt Apple has a master list of every restaurant in the world that’s getting loaded onto every iOS device.
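
For the date half, at least, there’s a public API, NSDataDetector; whether Siri’s tagging shares that machinery under the hood is pure speculation on my part, but it’s easy to play with:

import Foundation

// Pulling a date out of a message, playground-style.
let message = "Want to grab lunch next Tuesday at noon?"
let detector = try! NSDataDetector(types: NSTextCheckingResult.CheckingType.date.rawValue)
let range = NSRange(message.startIndex..., in: message)
for match in detector.matches(in: message, options: [], range: range) {
    if let date = match.date {
        print("Found a date: \(date)")
    }
}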
As a whole, it’s very cool to see Apple publishing some of their internal research, especially considering that all three of these were about technologies they’re actually using.


  1. The part in question was specific to narrowband audio, what you get via Bluetooth rather than from the device’s onboard microphones, but as they mention, it’s harder to get sample data for Bluetooth microphones than for iPhone microphones. 
  2. Entertainingly, the Google post is much better designed than the Apple one; Apple’s is good-looking for a scientific journal article, but Google’s includes some nice animated demonstrations of what they’re talking about that makes it more accessible to the general public. 
  3. Which it opens, oh-so-helpfully, in Apple Music, rather than iTunes these days. 
Categories
Collection

WWDC Wishlists

Six Colors pretty well covered what I’d like to see in the next version of macOS:1

I’d like to see an entirely new and simplified version of iTunes for Mac, perhaps multiple apps. iTunes can become the hub for Apple’s media sales, as it is on iOS. A new Music app will need to support Apple Music as well as local music files. And as for syncing, updating and configuring iOS devices, let’s move all of that to a new iOS Sync app that’s completely separate. Break up the iTunes monopoly—it’s way past time.


  1. I’m still not sure if I like ‘macOS’ or ‘MacOS’ better, but ‘OS X’ just doesn’t feel right anymore. It’s been around as a brand for too long. Plus, now that Windows 10 is out, there’s room for verbal confusion: did they mean ‘OS X’ or ‘O.S. 10’?