Huge AI Sign

The Problem

Huge seems to love eggshell white walls. Big, tall, plain white walls. So as we were preparing to move into our new space on a very visible ground floor, I had an idea.

I wanted to create a large sign to welcome clients and passersby that was a physical piece but had a digital spin. I quickly sketched up an example of what I wanted to build, and our managing director said *"Yuuuuuuuusss!"* So this is how I spent the next 30 days of my life...

The Design

I started the journey in Illustrator, carefully laying out and kerning each character. When typesetting something of this size, any minor layout oversight would become very apparent once the sign was installed.

Once the type was set I moved into Fusion 360, the super amazing product design suite from Autodesk. Importing the vector work from Illustrator kept me confident that the letterforms were tight. The next step was to add a mounting point to every letter, which ensured that each character, once machined, would be spaced exactly as it was in Illustrator.

Then the entire project was converted from Fusion 360 back to vector, mounting points and all, and printed out on a large format printer. This created a template for mounting the sign on the wall and ensured it was completely straight and level.

The Build

Back in the Fusion 360 project, I created the toolpaths to cut each and every letter and to mark each mount point for the hardware to be installed. The letters were machined out on a large format CNC machine, one letter at a time.

Once all of the letters were cut, I hand-sanded them and applied three coats of matte polyurethane.

Installation of the 1000 individual LEDs was no small task. It took roughly a week to affix the LEDs, log the location of each, and wire and solder the power and signal lines. Each LED was logged into an image file by its index to render graphics correctly. (More on that later!)

Light 'em up

The brain of the sign is a Raspberry Pi Zero W, a single-board computer smaller than a credit card, with plenty of horsepower to render the graphics for the sign.

I created a Node.js app on the Pi that renders graphics at 60 frames per second and converts the images into a linear series of color values to send to the LEDs. While affixing each LED to the back of the letters, I created an image in Photoshop where each pixel's color value corresponds to the physical index of that LED in the strip. When the Node.js app starts, it builds a function that maps a regular Cartesian grid image into the linear series of color values sent to the LED strip. Like magic, it can display full-color images, even without a full grid of pixels! I promptly loaded up a classic Snoop music video to give it a test run. Then I had some Gin and Juice to celebrate.
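To make that mapping concrete, here's a minimal sketch of the index-map idea. The helper names, the two-channel index encoding, and the dimensions are illustrative assumptions, not the actual project code:

```js
// Build a lookup table once at startup: for each LED index, remember which
// (x, y) pixel of the rendered frame it should sample. mapPixels is the RGBA
// data of the Photoshop index-map image (assumed encoding: index packed into
// the red and green channels, transparent pixels mean "no LED here").
function buildLedMap(mapPixels, width, height, ledCount) {
  const map = new Array(ledCount).fill(null);
  for (let y = 0; y < height; y++) {
    for (let x = 0; x < width; x++) {
      const i = (y * width + x) * 4; // RGBA stride
      const a = mapPixels[i + 3];
      if (a === 0) continue;
      const ledIndex = mapPixels[i] * 256 + mapPixels[i + 1];
      if (ledIndex < ledCount) map[ledIndex] = { x, y };
    }
  }
  return map;
}

// Per frame: sample the rendered canvas at each LED's pixel and flatten the
// colors into the linear order the strip expects.
function frameToStrip(framePixels, width, ledMap) {
  return ledMap.map((pos) => {
    if (!pos) return 0x000000; // unmapped LEDs stay dark
    const i = (pos.y * width + pos.x) * 4;
    return (framePixels[i] << 16) | (framePixels[i + 1] << 8) | framePixels[i + 2];
  });
}
```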

This video shows the first test of lighting up all the LEDs.

It's in the details

The graphics that are displayed are generated with an HTML canvas in Node.js. Those images are animated and then mapped out to the LED strip at 60 frames per second. The system supports regular canvas animation as well as videos. We created a number of Huge-branded videos and a bunch of client-themed videos as well.
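Here's a rough sketch of what such a render loop could look like, assuming the node-canvas package stands in for the HTML canvas on the Pi. The dimensions, the gradient animation, and the sendToStrip / frameToStrip / ledMap names are placeholders borrowed from the mapping sketch above:

```js
const { createCanvas } = require('canvas');

const WIDTH = 120;   // assumed virtual resolution of the sign
const HEIGHT = 40;
const canvas = createCanvas(WIDTH, HEIGHT);
const ctx = canvas.getContext('2d');

let t = 0;
function renderFrame() {
  // Any normal canvas drawing works here; a moving gradient as a stand-in.
  const grad = ctx.createLinearGradient(0, 0, WIDTH, 0);
  grad.addColorStop(0, `hsl(${t % 360}, 100%, 50%)`);
  grad.addColorStop(1, `hsl(${(t + 180) % 360}, 100%, 50%)`);
  ctx.fillStyle = grad;
  ctx.fillRect(0, 0, WIDTH, HEIGHT);
  t += 2;

  // Flatten the frame into strip order and hand it to the LED driver
  // (sendToStrip is a hypothetical hook for whatever driver is in use).
  const pixels = ctx.getImageData(0, 0, WIDTH, HEIGHT).data;
  sendToStrip(frameToStrip(pixels, WIDTH, ledMap));
}

// Roughly 60 fps; setInterval is a simple approximation of real frame timing.
setInterval(renderFrame, 1000 / 60);
```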

To round out the build I designed and 3D printed a Huge-themed case for the 800-watt power supply and the Raspberry Pi brain. All of the wires were fed through the wall to give it a nice clean look.

Hey Huge Sign!

The only interface to control the sign is Dialogflow, a chatbot API from Google. This allowed me to easily integrate controls for the sign with a Google Home device sitting next to it. You could yell "Hey Huge Sign, it's St. Patrick's Day!" and like magic it would display a video of the Irish flag. "Hey Huge Sign, Jeep is here!" and you get some off-road vibes!
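As a minimal sketch of how a Dialogflow fulfillment webhook could map those phrases to videos, here's one possible version. The Express server, the intent names, and the playVideo() hook are all illustrative assumptions, not the project's actual setup:

```js
const express = require('express');
const app = express();
app.use(express.json());

// Map Dialogflow intents to the videos the sign should play (names are made up).
const intentToVideo = {
  'st-patricks-day': 'irish-flag.mp4',
  'jeep-visit': 'offroad-vibes.mp4',
};

app.post('/dialogflow', (req, res) => {
  // Dialogflow v2 fulfillment requests carry the matched intent here.
  const intent = req.body.queryResult.intent.displayName;
  const video = intentToVideo[intent];
  if (video) playVideo(video); // hypothetical hook into the render loop
  res.json({ fulfillmentText: video ? 'You got it!' : "I don't know that one yet." });
});

app.listen(8080);
```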

Augmented Reality

I had put a lot of work into the rendering system but wasn't really putting it to use. So I had an idea to build an Augmented Reality app that would allow you to fire paintballs at the sign and 'paint' it different colors. The application was taught to recognize and track the actual sign and lock its 3D camera onto it. Once it got a lock you could fire paintballs at it.

The application would render the paintballs and track where on the physical sign each one would hit. When it detected a collision, it would send that location and color value to the sign, and the sign would paint it on top of whatever other program it was running. After a couple of minutes of inactivity, the paint would fade away.
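Here's a small sketch of how the paint overlay could work on the sign side, assuming the AR app reports each hit as normalized sign coordinates plus a color. The names, splat size, and fade timings below are illustrative only:

```js
const splats = [];
let lastHitAt = 0;
const FADE_DELAY_MS = 2 * 60 * 1000; // start fading after two idle minutes
const FADE_DURATION_MS = 10 * 1000;  // then fade out over ten seconds

// Called when the AR app reports a hit: x, y in 0..1, color as '#rrggbb'.
function addSplat(x, y, color) {
  splats.push({ x, y, color });
  lastHitAt = Date.now();
}

// Called from the render loop after the normal program has drawn its frame,
// so the paint sits on top of whatever else is playing.
function drawPaint(ctx, width, height) {
  const idle = Date.now() - lastHitAt - FADE_DELAY_MS;
  const alpha = idle <= 0 ? 1 : Math.max(0, 1 - idle / FADE_DURATION_MS);
  if (alpha === 0) { splats.length = 0; return; }

  ctx.globalAlpha = alpha;
  for (const s of splats) {
    ctx.fillStyle = s.color;
    ctx.beginPath();
    ctx.arc(s.x * width, s.y * height, 4, 0, Math.PI * 2);
    ctx.fill();
  }
  ctx.globalAlpha = 1;
}
```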

It's like graffiti art without all the running away from police!

Done and Done.

This wrap-up video was created by the very talented & super strong man Shaun Strack.