Snap is sending Spectacles 2021 to selected creators. As far as I know, it's typically not a paid partnership. You can apply to get a pair of glasses here /Link/.
I'd been keen to get them ever since they were released. I sent emails and applied multiple times until one day I got in :) I'm not sure what worked for me. The only advice I can give is to start building and sharing work to get noticed. Anything in the same spirit works: AR lenses for Snapchat or VR interactions for Quest. The main idea is to show that you can, and want to, produce spectacular stuff.
I won't write yet another Spectacles review, but I do want to share my insights, as I learned a lot along the way.
Instead of going for a full-blown version of future glasses, Snap built a comfortable, friendly device to wear. At least for as long as the battery and heat buildup allow. For example, I wore them while traveling on a train, and no one even looked at me. I would definitely get more attention with, say, a Magic Leap. Even turning the glasses on takes seconds, not minutes.
The second huge plus is the creation pipeline. Yes, they're locked into Snap's ecosystem, but Lens Studio is outstanding. After switching back to Unity, I badly missed hitting the "Send to Devices" button and having it just work.
Needless to say, I wanted to learn the Snap ecosystem, but also to test concepts I could apply to other platforms.
And, of course, there was a pragmatic motive 🤷
As an independent contractor, I can't just play with new hardware all the time; I need real projects to pay my bills. So I wanted the exposure from these experiments to lead me to new, compelling contracts.
As usual, I work on side projects like this on weekends or in my free time, so I try to keep the scope as lean as possible. Looking back, I noticed that I almost automatically apply the same framework to all my experiments: I pick one technical challenge and one use case to test, and try to avoid everything else. I'll show how this plays out in the examples below.
I was acquainted with Lens Studio, as I had explored it a lot while working for Verizon Media on their tool for creating AR experiences. But I had very little hands-on experience. To get an idea of how things work, I wanted to complete the entire development cycle as soon as possible. I took an old lens my wife Inna had built for Snapchat last year and tried to make it work on Spectacles. I had the pipeline figured out in no time and went outside to test it in the wild.
I found out that having objects just fly around isn't the best approach, as it's easy to lose them. And when an object moves out of the FOV, it only emphasizes the FOV's boundaries. In later prototypes, I always tried to keep the visuals small and local.
Tech challenge: completing the full build-and-deploy cycle to Spectacles.
Use case: porting an existing Snapchat lens to the glasses.
Huge thanks to Snap for preparing the templates. I was able to build the next experiment literally the next day.
Here's a good example of cutting corners: for the dots on the guitar that show where to press, I just made a video on a black background, looped it, changed the blend mode to Lighten, and attached it to the marker (the hookup is sketched below). I didn't even try to build the whole experience; my goal was just to test how it felt.
In the end, the tracking wasn't confident enough, and overall the experience didn't feel good enough to keep working on.
Tech challenge: marker tracking.
Use case: a guitar-learning aid that shows where to press.
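For the curious, here's roughly how that marker hookup could be wired in Lens Studio. This is a minimal sketch, not the original lens's code: the input names are mine, and it assumes the video texture is set to loop and autoplay in the editor, so the script only toggles visibility via MarkerTrackingComponent's onMarkerFound/onMarkerLost callbacks.

```javascript
// Hypothetical helper: show the looping "finger dots" overlay only while
// the guitar marker is tracked. Wire both inputs in the Inspector.
//@input Component.MarkerTrackingComponent marker
//@input SceneObject dotsOverlay

// Hidden until the marker is found.
script.dotsOverlay.enabled = false;

script.marker.onMarkerFound = function () {
    script.dotsOverlay.enabled = true;
};

script.marker.onMarkerLost = function () {
    script.dotsOverlay.enabled = false;
};
```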
Then I had to travel to Berlin. The day before, I had seen a tutorial about Dynamic Texts, a simple but interesting feature that lets you easily surface user info, like location and weather, as text. So I was looking for a use case for it.
While on the train, I thought it would be cool to see the cities I was passing through and other semi-useful info. I pulled out my laptop and had the lens ready in less than five minutes. I had a hard time connecting my Spectacles to the train Wi-Fi, so I recorded the video using Snapchat on mobile. Later, I released a similar lens for Spectacles.
Tech challenge: the Dynamic Texts feature.
Use case: a travel overlay with the current city, weather, and other info.
Back home, before going for a run with my dog, I decided to test how well Spectacles track movement in world space. White snow cover is supposed to render this useless. I didn't build anything and just used Snap's default running app. It worked unexpectedly well. The tinted glasses were comfortable to wear outside. The battery, of course, didn't last long, and I didn't have the charging case, which holds four extra charges, with me.
Tech challenge: world tracking outdoors, over snow.
Use case: a running companion.
This exploration was technically a descendant of the guitar demo. I wanted to find a way to localize AR info and keep testing markers. The use case is simple: based on the recognized marker, show the banknote's value in crypto.
Marker tracking worked reasonably well. I even built functionality to calculate the combined value of all visible banknotes, but later found out that only a single marker is supported.
Also, the data wasn't dynamic, so the lens I published had an accurate conversion rate only on the day I created it :)
Tech challenge: localizing AR info with marker tracking.
Use case: showing a banknote's value in crypto.
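A minimal sketch of that conversion logic, with names and numbers of my choosing rather than the published lens's code. The hard-coded price is the whole problem: it's a snapshot, which is why the lens was only accurate on day one.

```javascript
// Hypothetical sketch: one marker per banknote denomination, a static rate.
//@input Component.MarkerTrackingComponent marker
//@input Component.Text label
//@input float banknoteValueUsd = 20.0

// BTC price (USD) frozen on the day the lens was built, hence the
// conversion rate going stale immediately after publishing.
var BTC_PRICE_USD = 57000.0;

script.marker.onMarkerFound = function () {
    var btc = script.banknoteValueUsd / BTC_PRICE_USD;
    script.label.text = "$" + script.banknoteValueUsd + " = " + btc.toFixed(6) + " BTC";
};

script.marker.onMarkerLost = function () {
    script.label.text = "";
};
```

The fix for the stale rate only arrived later, when Snap shipped external APIs at Lens Fest (more on that below).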
After discovering how well world tracking works, I wanted to keep testing it. The obvious use cases are related to sports. Adam did a remarkable series of explorations in this direction, so I had a hard time coming up with something new :)
I ran various technical tests. While measuring the vertical position of the headset in world space, I noticed that it's very accurate. So I built an app to measure vertical jump height, and we went to a trampoline park to stress test it.
It was a success: technically, the glasses worked faultlessly, the use case was interesting, and the glasses were no barrier to movement at all.
Tech challenge: accuracy of vertical world-space tracking.
Use case: measuring vertical jump height.
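The core of the measurement can be tiny. Here's a minimal sketch under my own assumptions: the names are mine, and it assumes the camera object's world-space position tracks the headset and that world units are Lens Studio's centimeters.

```javascript
// Hypothetical jump-height tracker: remember standing head height on the
// first frame, then report the highest point reached above it.
//@input SceneObject cameraObject
//@input Component.Text heightLabel

var baselineY = null;     // standing head height, captured on first frame
var peakY = -Infinity;    // highest head position seen so far

var updateEvent = script.createEvent("UpdateEvent");
updateEvent.bind(function () {
    var y = script.cameraObject.getTransform().getWorldPosition().y;

    if (baselineY === null) {
        baselineY = y;
    }

    if (y > peakY) {
        peakY = y;
        // Assuming centimeter world units, the difference reads directly as cm.
        var jumpCm = Math.max(0, peakY - baselineY);
        script.heightLabel.text = "Jump: " + jumpCm.toFixed(1) + " cm";
    }
});
```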
I've been thinking a lot about input. I always feel that voice has too high a margin of error. There is hand tracking, but it's not as good as on Quest, and I had played with hand demos too much anyway. The trackpad on the side of the glasses is treated more like a temporary workaround, but I wanted to see if I could squeeze a little more out of it. After building a quick tech demo, I found that its accuracy is quite good. So I decided to create a numpad for entering numbers (for unlocking the device, say). I'm sure there are better ways to build an unlock mechanism for glasses, but it's a perfect use case for testing the limits of the trackpad's accuracy.

And it worked okay-ish. I could enter the numbers I wanted with unexpectedly high accuracy. A couple of interesting details (sketched in code below):

Since I can't know where my finger will land on the tiny trackpad before I touch it, I show the cursor only while the pad is touched and activate the button on release. It worked nicely.

The trackpad has two parts and a couple of system gestures. I used only the bottom 50% of the pad to avoid triggering them accidentally. It was still more than enough.
Tech challenge: the trackpad's accuracy limits.
Use case: a numpad for entering numbers (like unlocking the device).
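Here's roughly how the numpad mapping worked, as a sketch rather than the actual lens code. It assumes the Spectacles touchpad surfaces as standard Lens Studio touch events with normalized 0..1 coordinates; the grid layout and names are illustrative.

```javascript
// Hypothetical numpad: the cursor appears only while touching, and the key
// commits on release, because you can't aim before your finger lands.
//@input Component.Text display

// 3x4 grid mirroring a phone numpad.
var KEYS = [
    ["1", "2", "3"],
    ["4", "5", "6"],
    ["7", "8", "9"],
    ["<", "0", "OK"]
];

var lastTouch = null;

function keyAt(pos) {
    // Only the bottom half of the pad is used, to stay clear of the
    // system gestures on the upper part.
    if (pos.y < 0.5) { return null; }
    var col = Math.min(2, Math.floor(pos.x * 3));
    var row = Math.min(3, Math.floor((pos.y - 0.5) * 8)); // 4 rows over half the pad
    return KEYS[row][col];
}

script.createEvent("TouchStartEvent").bind(function (eventData) {
    lastTouch = eventData.getTouchPosition();
    // Show the cursor over keyAt(lastTouch) here.
});

script.createEvent("TouchMoveEvent").bind(function (eventData) {
    lastTouch = eventData.getTouchPosition();
    // Move the cursor; nothing activates until release.
});

script.createEvent("TouchEndEvent").bind(function () {
    if (!lastTouch) { return; }
    var key = keyAt(lastTouch);
    lastTouch = null;
    if (key && key !== "<" && key !== "OK") {
        script.display.text += key; // append the digit to the entered code
    }
});
```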
Later, I decided to take it a bit further and, as a joke, build a full QWERTY keyboard. I've got to say, it wasn't as bad as I expected; I could still type anything I wanted.
At Lens Fest, Snap released a couple of external APIs. The crypto API was as if created for my previous crypto demo. So I quickly figured out how it works and, over the weekend, made one more "useful" public lens: a converter of USD banknotes to crypto.
Tech challenge: Snap's new crypto API.
Use case: converting USD banknotes to crypto at a live rate.
7 prototypes built
5 published lenses
11 videos recorded and shared on social media
>700k impressions on Twitter on one of the videos
>450k views on LinkedIn on one of the videos
>100 people heard my pitch about how cool Spectacles are
<1 hour spent getting started with Lens Studio
>10 hours spent waiting for the glasses to cool down
∞ fun during creation
And most importantly, did I achieve my main objective?
I even overachieved. After seeing my work, some great companies contacted me online, which led to multiple exciting gigs, which grew into long-term partnerships I couldn't have imagined before, and which kept me so busy that I only got around to publishing this article almost a year after I started drafting it 😅