Champaign-Urbana Community Fab Lab
University of Illinois at Urbana-Champaign

Seth Sawant Iteration Assignment

This week’s assignment was to either build upon a previous week’s project or to start one over from scratch. Because we’re soon starting the final project, I decided this would be the perfect opportunity to combine two projects into one to create a larger and more ambitious design. I also wanted to make something functional, and something I think would be hard to find commercially, so with those things in mind I brainstormed various ideas. The idea I settled on was a temperature display for my apartment, as one of the first pieces of information I need in the morning is the outside temperature so I can dress accordingly. At the same time, I wanted this to be something I would want to display in my apartment, so the means of displaying the temperature will be a mechanical display similar to an odometer or a mechanical counter. For the iteration project portion, I tackled just the electronics aspect of the piece, namely acquiring the temperature data and driving a motor based on it, leaving the mechanical (and more complex) part for my final project.

The DHT22 temperature and humidity sensor I used.

For the backbone of this project, I decided to use Adafruit’s Feather HUZZAH microcontroller, an Internet of Things focused board with built-in WiFi and useful power-saving features. The chip can be programmed numerous ways; I started by trying MicroPython, but got frustrated by the unreliable programming workflow, which involved connecting to the HUZZAH over WiFi and using a web-based terminal to update the code. Instead, I used the Arduino IDE, which was less effort but required a little setup to install some special board packages.

For acquiring the temperature data, I used two methods: one was a temperature sensor to get the ambient temperature of the room, and the second was pulling weather data from the internet to get the outside temperature. The thinking behind this is that for my final design, I want the display to periodically switch between the inside and outside temperature to highlight the cool transition between digits that mechanical displays have. For the ambient temperature I used another piece of Adafruit kit, a DHT22 temperature and humidity sensor, which comes with an Arduino library that makes getting readings from the device easy. Although I didn’t use the humidity readings in this project, I’m thinking of adding the ability in the final project to alternate between showing the temperature and the humidity. To get the outside temperature from the internet, I used a weather API called OpenWeather, which gives basic weather information for free, and used the ESP8266 HTTP client library to send the API requests. Finally, to display the temperature data I made a simple dial out of cardboard and a servo to act as a stand-in for the final, more complex display. Although the servo worked in this case, I realized that because servos can’t rotate more than one full turn it probably won’t work for my final design; I’m considering other options like a stepper motor or a regular DC motor, which can rotate indefinitely in one direction.

 

#include <ESP8266WiFi.h>
#include <ESP8266HTTPClient.h>
#include <ArduinoJson.h>
#include <Servo.h>
#include <Adafruit_Sensor.h>
#include <DHT.h>
#include <DHT_U.h>

#define WIFI_TIMEOUT 5

const char* ssid = "NETWORK_NAME";
const char* password = "NETWORK_PASSWORD";

HTTPClient http; //Declare an object of class HTTPClient

double outside_temp_k = 273.15;
double outside_temp_c = 0;
double outside_temp_f = 32;

double inside_temp_c = 0;

Servo dial;
DHT_Unified dht(14, DHT22);
sensor_t sensor;

void setup() {
  dial.attach(2); // attach servo object to pin 2 of the Feather
  dial.write(0);
  dht.begin();
  Serial.begin(115200);
  WiFi.begin(ssid, password);

  int timeout = 0;
  Serial.print("Connecting");
  while (WiFi.status() != WL_CONNECTED && timeout < WIFI_TIMEOUT) {
    delay(1000);
    Serial.print(".");
    timeout++;
  }
}

void loop() {
  dht.temperature().getSensor(&sensor);
  if (WiFi.status() == WL_CONNECTED) { // check WiFi connection status
    http.begin("http://api.openweathermap.org/data/2.5/weather?id=4887158&appid=API_KEY"); // specify request destination
    int httpCode = http.GET();
    if (httpCode > 0) { // check the returned code
      String payload = http.getString(); // get the response payload
      DynamicJsonDocument doc(1024);
      DeserializationError error = deserializeJson(doc, payload);
      if (!error) {
        outside_temp_k = doc["main"]["temp"];          // temperature in kelvin
        outside_temp_c = outside_temp_k - 273.15;      // convert to Celsius
        outside_temp_f = (outside_temp_c * 1.8) + 32;  // convert to Fahrenheit
      }
    } else {
      Serial.println("Error contacting OpenWeather API!");
    }
    http.end(); // close connection
    delay(2500);
  } else {
    Serial.println("No network connection!");
  }

  dial.write((int)outside_temp_f); // intended: map a -20 to 120 range onto the servo's 0 to 180 range

  sensors_event_t event;
  dht.temperature().getEvent(&event);
  if (isnan(event.temperature)) {
    Serial.println(F("Error reading temperature!"));
  } else {
    if ((int)event.temperature != (int)inside_temp_c) {
      inside_temp_c = event.temperature;
      dial.write((int)inside_temp_c);
    }
  }

  // ESP.deepSleep(2000000); connect GPIO pin 16 to RST for this to work
  delay(500);
}
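
One wrinkle in the sketch above: the comment on `dial.write((int)outside_temp_f)` mentions a -20 to 120 → 0 to 180 range, but the Fahrenheit value is currently written to the servo directly. A minimal sketch of the intended scaling, using the same integer arithmetic as the Arduino core’s `map()` and `constrain()` (the range endpoints are taken from that comment; this is my reconstruction, not code from the original project):

```cpp
// Clamp a reading into the displayable range, like Arduino's constrain().
long constrainTemp(long t, long lo, long hi) {
    if (t < lo) return lo;
    if (t > hi) return hi;
    return t;
}

// Scale -20..120 degrees F onto the servo's 0..180 degree sweep, using the
// same integer arithmetic as Arduino's map(t, -20, 120, 0, 180).
long tempToAngle(long tempF) {
    long t = constrainTemp(tempF, -20, 120);
    return (t + 20) * 180 / 140;
}
```

In the sketch this would let `dial.write(tempToAngle((long)outside_temp_f));` keep sub-freezing readings from asking the servo for a negative angle.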


Arduino Iteration Adventures

My iteration project was built as a stepping stone from my newly acquired Arduino skills to my final project. I built a box with a map of my local area on the front, and a number of LEDs that lit up at 13 different bus stop locations to indicate where certain buses were.

Before we dive into the process, here’s the final result:

The three lines near my house are green, gold, and silver, as pictured above. 

 

My previous project for Arduino was a strip of NeoPixel LEDs that could be controlled using a joystick:

Arduino Adventures

 

This iteration project began with a sketch of some ideas I was considering:

Option (A) was the one I decided to construct, with the integrated LEDs. Option (B) would have involved a sort of clock/timer with rotating servos to indicate approximately when the next bus would arrive. Options (C) and (D) involved using LCD screens, which were a little too advanced for what I was looking to work with for this project.

To design the front of the box, I found a site called Snazzy Maps that can restyle the look of Google Maps to be whatever I wanted. So I took a screenshot of my local area with just the roads visible and converted it to an SVG for the engraving process. I then found a different online tool to construct the SVG files for my wood-engraved box.

The initial SVG for my local area

The final SVG for the front-facing part of my box, with holes designating bus stops

 

 

I was initially looking to use individual LED bulbs for each bus stop, but a multicolor LED would mean 4 pins on each bulb, and 13 bulbs in total… so I decided to change my approach and use NeoPixels. Since I was primarily concerned with the accuracy of the bus stops on the map, I hadn’t considered the trouble I might have attaching NeoPixels to the board. This resulted in a bit of a haphazard setup, with the NeoPixel strip contorted to line up with each hole in the board:

 

I then went over to the Makerspace and learned how to solder! I tried a few YouTube tutorials to help me along, but it was still quite a time-consuming process. I asked Niel about how he goes about soldering, and he showed me a technique that involved applying solder to each component separately, then bringing them together and heating to attach them securely. This made the process much easier, and I was able to successfully attach all of the NeoPixel strips together. In particular, I made sure the data lines were all heading in the same direction to avoid trouble later down the line.

 

My first attempt at soldering NeoPixels

First set of LEDs attached, and they work!

The completed soldering job, with Clear tape in place to secure the pixels onto the board

 

Now that the LEDs were set up, I began programming the device! Since I would need to connect to the CUMTD servers to get bus data, I decided to use a WiFi-enabled Arduino. I installed Postman on my laptop so I could test my API calls to MTD, then implemented them in the Arduino code once I felt confident.
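
The full sketch is linked below; its core can be sketched as a lookup from stop number to pixel index plus a color per route. Because the strip had to snake across the board, pixel order on the strip doesn’t match stop order on the map. Everything here is a hypothetical placeholder (the table values and color codes are made up, not the real layout):

```cpp
#include <cstdint>

const int NUM_STOPS = 13;

// The strip was contorted to reach each hole, so electrical pixel order
// differs from map order; this (made-up) table translates between them.
const uint8_t STOP_TO_PIXEL[NUM_STOPS] = {0, 3, 1, 2, 5, 4, 6, 9, 7, 8, 12, 10, 11};

// Hypothetical route colors packed 0xRRGGBB, echoing the Green, Gold,
// and Silver lines near my house.
enum Route : uint32_t {
    ROUTE_NONE   = 0x000000, // no bus at this stop: pixel off
    ROUTE_GREEN  = 0x00FF00,
    ROUTE_GOLD   = 0xFFD700,
    ROUTE_SILVER = 0xC0C0C0,
};

// Which pixel on the strip corresponds to a stop on the map.
uint8_t pixelForStop(int stop) {
    return STOP_TO_PIXEL[stop];
}
```

In an Adafruit_NeoPixel sketch, each frame of bus data would then feed `strip.setPixelColor(pixelForStop(stop), color)` followed by `strip.show()`.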

 

I ran into some trouble when making API calls through the university wifi, as they likely have security measures in place. With that said, the project is working successfully at my apartment! Here’s the code I used, which utilized a mix of NeoPixels, HTTP requests, and WIFI connectivity code:

https://pastebin.com/SNyX4EuR 

One issue I faced with the implementation was that random LEDs would light up in colors I hadn’t even programmed in, as seen here:

My best guess was that the problem came from noise and power fluctuations on the LEDs’ data signal. After adding a resistor and a capacitor, I was able to mitigate the issue.

 

Here’s a video of the final product in action:

 

 

 


Iteration Assignment

For my iteration assignment I decided to revisit the Arduino input/output project. Previously I had used a current sensor to measure the current through an LED to determine the right resistor needed for a particular brightness.

I had used those sensors in the past, so I wanted to try a project with a sensor I had never used before. One area I had been interested in was biometrics; I had seen some of the cheaper pulse and muscle sensors on the market and figured I could explore those. The sensor I chose was the MyoWare muscle sensor. My plan was to use it to create a sort of feedback mechanism to help combat stress by detecting whether the wearer is clenching their hands, which can be a sign of stress. The feedback would be provided by a group of vibrating motors: a slight innervation of the muscle fibers would cause a “low” reading and activate a single motor, a medium innervation would activate two, and a strong innervation would activate all three.

The setup was fairly straightforward; the sensor only has three pins to connect: Vcc, Gnd, and Signal. The sensor outputs an analog signal, so it can be tested with the example code included with the Arduino IDE.

 

No clench, numbers between ~450-460

Light clench, numbers ~470-480

Medium clench, numbers ~490-500

Hard Clench, numbers read ~520-530
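
The four bands above map naturally onto the three-motor feedback scheme with simple thresholds. A sketch of that mapping, with cutoffs placed between the observed bands (the exact cutoff values are my assumption, and given the calibration drift described below they would need retuning each session):

```cpp
// Map a raw MyoWare analog reading (0-1023 from analogRead) to the number
// of vibration motors to switch on. Cutoffs sit between the observed bands
// (~450-460 rest, ~470-480 light, ~490-500 medium, ~520-530 hard); the
// exact values are assumptions, not taken from the original sketch.
int motorsForReading(int reading) {
    if (reading >= 510) return 3;  // hard clench: all three motors
    if (reading >= 485) return 2;  // medium clench: two motors
    if (reading >= 465) return 1;  // light clench: one motor
    return 0;                      // at rest: no feedback
}
```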

After I prototyped it on a breadboard, I decided to try and solder a protoboard together for it and also use an Arduino Micro in the hopes that I would get to actually make it a wearable device.

 

The first thing I noticed was that between applications of the sensors, the numbers had a tendency to change, so the sensor had to be calibrated before each use. Then after a few uses, the sensor’s ability to detect kept worsening. I bought a pack of 50 electrode pads and used nearly all of them placing and replacing the electrodes to try and get a reading. Eventually, the sensor stopped working and I had to halt all progress on the project.

Before that happened, I was able to at least get a video of the setup working as intended.

 

Reflecting on this project was actually fairly frustrating. The sensitivity and erratic nature of the sensor meant I had a hard time getting reproducible results. Furthermore, the sensors seem to be very picky about where on the muscle bundle you place the electrodes. I am hoping this is just due to the sensor being defective; however, if I were to use biometrics again, I may need one of the more stable sensors, such as the armband.

Failure aside, it was definitely good to have worked with a different kind of sensor, one that senses muscle action potentials to control electronics. Wearing the sensor around the lab or my workshop did kind of make me feel a slight bit bionic.


Iteration Assignment

Ideation: For this week, I wanted to improve my sewing skills, because while I had a lot of fun during sewing week, my technique had abundant room for improvement. So I went back to the Google Drive folder of patterns and picked a plushie. I made bags last time, a drawstring one and one with a zipper (which was more difficult than the drawstring one). So I decided to challenge myself and work with stretchier fabrics like felt and plush, which was a jump from working with pure cotton on the bags. The plushie was also more of a sewing challenge because it used a variety of stitches (zigzag, straight, and basting). While I am not iterating on the exact bags I made that week, I am taking my sewing skills and techniques further by creating something more difficult and considerably different than last time.

Images of previous project:

Construction: Unfortunately, I didn’t take many pictures of my build process, but I can describe the process in detail. I first cut out the pattern, and there were a lot of pieces. Then, I started sewing together the face. After that, I went ahead and put the sides and bottom together, which brought the squares to 3D. The places I had difficulty were the legs, because the fabric had to be squished a little to get lined up on both sides, so it came out a little frayed. 

Reflection: This project was actually fun to make. The hard part was understanding some of the directions, which the lab assistants helped to explain and show. The good part was that I didn’t have to start over on any of the pieces this time. I learned what y-seams, basting, and gathering stitches are. So I feel like a more sophisticated sewer (not sure if that is the word for someone who sews, yikes) than when I first began. Of course, my favorite part was the stuffing. The end result came out pretty good! I was impressed with my abilities, and now I have a cute little plushie to put on my bed.


Iteration Project – Record Streamer – Isaac Iyengar

Intro:

The goal of this project was to extend my nametag project from the beginning of the semester. That project was a simple press-fit box of stained plywood, with black acrylic hot-glued on top. The black acrylic was cut and rastered to resemble a standard vinyl record.

I started collecting records once I was gifted a record player from a friend, and it helped me explore a lot of old Jazz and Soul music and collect some of my favorite Hip-Hop albums as well. I like listening to music on my record player and even just seeing the motion of the record, however, I also use Spotify a lot, so I wanted to have a record player that would instead play Spotify.

I wanted to iterate on this project by making a motorized version that would spin the record and also function as a music streaming box. The Raspberry Pi fits this use case well; essentially it’s a smart speaker in the form of a record player.

Here’s a link to the original project: http://cucfablab.org/isaac-iyengar-nametag/

Rotating Record:

Unfortunately, the Raspberry Pi was not able to drive the DC motor on its own, so I used an Arduino instead, which received bytes of data from the Raspberry Pi over serial indicating whether to spin the record or halt. I used a DC motor following the circuit schematic shown below. The Arduino code simply waits for spin or stop signals sent via serial and writes to the motor accordingly.
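
That serial protocol boils down to a tiny decision function. The command bytes and PWM duty below are illustrative assumptions, not the values from the original sketch; on the Arduino the return value would go straight to `analogWrite()` on the motor pin:

```cpp
// Decide the motor PWM duty (0-255) from a command byte received over
// Serial from the Raspberry Pi. 'S' = spin, 'H' = halt; unknown bytes
// leave the motor in its current state. The byte values and the 200/255
// spin duty are assumptions for illustration.
int speedForCommand(char cmd, int currentSpeed) {
    switch (cmd) {
        case 'S': return 200;          // spin the record
        case 'H': return 0;            // halt
        default:  return currentSpeed; // ignore anything else
    }
}
```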

 


Completed Circuit for DC Motor Control:

Music Streaming:

I explored using the Spotify Web API for this since it allowed for a lot of different functionality with the music such as receiving the current song playing, album art, and various playback functions. However, the Web API for my use case required authentication for every single API call, which is impossible to do in a headless setup of the Pi. Instead, I opted to use Raspotify which allows the Pi to function as a connectable device from Spotify Connect. This essentially made the device a connectable speaker. This worked perfectly and audio would seamlessly play through the speaker connected to the Raspberry Pi. 

 

Below is a demo video from before the project was in the enclosure, and before synchronization between the music and the motion of the record was set up.

 

Enclosure:

To design the enclosure I used an online press-fit box generator and cut out holes for the speaker, cords, and ventilation. I spray-painted the plywood black and etched a pattern into the design to expose the bare wood where the laser rastered. This created a really cool white-on-black look for the record case. I also attached a clear laser-cut acrylic record to the DC motor so you could see the record spinning. The enclosure also has a slot that currently holds a computer speaker taken from a computer monitor.

Below is a photo of the finished laser cut job.

Speaker inside enclosure:

 

Improvements to Current Design + Reflection:

This implementation requires WiFi, which isn’t great to use in a headless RPi setup since this would require SSH from another laptop to modify the file that specifies which network the device connects to. This works fine from my room, so I’m going to modify this design to work as a Bluetooth device. 

Spotify Connect is fairly limited as a means of streaming music. I’d like to incorporate an LCD display into the record player that shows the current song playing, and push buttons that give the user playback control. One design feature I really want to investigate is incorporating some sort of accelerometer into the record to detect when a user places their hands on the record or spins it, and have this be the means of playback control. For example, a user could stop the movement of the record with their hand and this would pause the playback, similar to an actual record player. There are services like Mopidy that I’ll need to investigate some more, which allow for much better streaming of music and integration with SoundCloud, Last.fm, and other streaming services, all in a really nice UI that I could display on an LCD touch screen.

 

Mopidy Front End:

One improvement I really want to do is to take apart a record player, and put the hardware inside this enclosure and control it through the Pi. This would be a really cool improvement since this project would then be able to stream music and play records as well. 

Overall though, given the time constraints for this project, I’m really pleased with the results. It’s definitely a significant improvement from the original record player nametag project. I was successful in playing music through the Pi, while rotating the record. My favorite part about this project is definitely the enclosure which turned out really nicely with the black etched wood. 

 

Here’s the Final Product:


Iteration Project: Papercut Lightbox

For this iteration project, I wanted to redo my papercut lightbox from Assignment 3: Paper Circuits. The first project can be found here: 

The final product wasn’t too bad, but there were definitely some issues and snags I would’ve liked to have done better. Some of these included:

  • I originally attempted to put layers of styrofoam in between the layers, but even after laboring with a hot kitchen knife and the wrong kind of foam for an hour, the foam would simply crumble. Therefore I had to use pieces of cardboard instead.
  • There was no frame around the layers so you could see all of the inside, and it looked unfinished and messy. 
  • As the project was supposed to be focused on the paper circuits and LEDs, there were only three lights within the backdrop. The three LEDs showed through as discrete light points instead of the collective diffused ambience I wanted.

Because I had these distinct points of improvement, I thought this project would be a good choice to iterate on. My improved plan was to create the lightbox with papercut layers as before, but with the right kind of foam, a wooden frame, and a string of lights around the backdrop border. 

I brought my proposal up to the instructors, and they gave me the helpful suggestions to use Arduino NeoPixels instead of fairy lights, and Duralar sheets to diffuse the light more than regular paper. Also since the previous type of foam I used was extremely difficult to cut and quickly crumbled, the instructors gave me foam core to use this time around. 

For the paper art, I chose to recreate the artwork of one of my favorite albums. This part took a lot of thinking through the layout of the art and breaking up the components into paper layers. After some time I managed to reduce everything to ten layers, which I then outlined in dark pen to better distinguish what to cut out. I got several sheets of plain white cardstock and added a .5 inch border around the image. To trace the art onto each sheet of cardstock, I came up with a makeshift tracing lightbox using my phone flashlight and my empty sock drawer lol.

The foam core ended up working perfectly; it was MUCH easier to cut, and the cuts were very clean. The thin bars were firm and easy to work with. I reinforced each layer with .5 x .5 inch squares on the corners and 3 inch bars along the sides, as seen on the right.

 

 

 

I cut each of the paper layers by hand using a thin box cutter; this part definitely took the longest (an all-nighter!) and was the most labor-intensive. However, after everything was cut out and finished, it worked out pretty well! The right image is the final product after assembling and gluing all the layers together.

Next were the NeoPixels. I had never soldered before, so I got some help from my ECE roommate, who showed me how to use a soldering iron and solder. We successfully soldered the three wires to the end of the NeoPixel strip. However, when I wrote and uploaded the code to make the strip shine white, nothing happened. After talking to an instructor I found it was because I had soldered the wires to the wrong end of the strip; they were supposed to be joined at the DI end, following the arrows, whereas I had soldered them to the DO end. (Another great learning experience from this project.) After removing the old solder and trying again, I finally got the NeoPixels to properly light up.

Lastly, the wooden frame. Another instructor suggested the Box Maker site to me, where you simply input your desired dimensions and the site exports an SVG of the design for a press-fit box. I opened the SVG in good old Inkscape and, as I learned during the first week of class, I edited the layout to fit on a 15 x 30 inch wooden board and for the laser to cut straight through the wood. Uploading the file to the printer, cutting the pieces out, and assembling them was a quick process, and easily enough I had my frame. 

 

I finally had all of my parts, and it was pretty quick to set everything up. I cut out a huge square for the front frame, and a small hole in the back panel to feed the Arduino wires through. I’m not really sure why the lights ended up being orange instead of white even though the code had values for white, but that wasn’t too much of an issue.
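
One possible culprit for white coming out orange (an assumption on my part; I never diagnosed it) is voltage sag: a NeoPixel strip at full white draws roughly 60 mA per pixel, and an underpowered strip dims the blue channel first, tinting white toward warm orange. A quick way to stay inside a power budget is to cap the strip’s brightness, sketched here with the 60 mA rule of thumb from Adafruit’s NeoPixel guide:

```cpp
// Worst-case current draw for a strip at a given brightness, using the
// ~60 mA-per-pixel-at-full-white rule of thumb.
int currentMilliamps(int numPixels, int brightness) {
    return numPixels * 60 * brightness / 255;
}

// Largest brightness (0-255) that keeps the strip under a supply budget;
// the result could feed Adafruit_NeoPixel's strip.setBrightness().
int maxBrightness(int numPixels, int budgetMilliamps) {
    long b = static_cast<long>(budgetMilliamps) * 255 / (numPixels * 60L);
    return b > 255 ? 255 : static_cast<int>(b);
}
```

Calling `strip.setBrightness(maxBrightness(pixelCount, supplyBudget))` once in `setup()` before `strip.show()` would keep full-white frames from browning out the supply.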

 

I didn’t have a hammer to completely press the sides together, plus it was 1 in the morning so I didn’t want to make too much noise. So the box sides are a little loose in the pictures, but otherwise here’s my final product!

 

Reflection

Overall I’m pretty happy with how this turned out, considering how much work I put into it. The only nitpicky things I would fix: gluing paper layers directly on top of each other (e.g. the lion and the man) makes certain pieces look darker, which I wasn’t expecting, and the sun in the top middle is rather faint and not as pronounced as I would like, so perhaps there’s a solution for that. Outside the scope of this project, I would like to paint the frame white and keep it.

Even though this project took a lot of work and time, because it was something I really wanted to do, none of what I put in felt like a lot at all. I had a lot of fun doing this and seeing how much the iteration project improved on the original, and I also got to learn the new skill of soldering.

 

 



Iteration Project – Justin Franklin

In the makerspace class this week we were directed to go back and redo a project in a new way, or improve upon it somehow. I thought about this for a little while, because I didn’t know what I wanted to do at first. I decided to go back and revisit my patch cable holder that I had 3D designed and printed a few weeks earlier.

I began by returning to the files I had made in Tinkercad. One problem with my first iteration was that it was too rough and not symmetrical; it was also very simple. Although I had reasons for initially designing it this way, I wanted to return and refine the details to make it better. I also thought attaching a light would add a new dimension to it. I began by recreating the original design, this time making it more visually appealing by making it symmetrical. I then added some supports for the prongs so that it was sturdier overall.

I then had the idea to create clips on the back that clip onto the edge of a table or the like. With my first version, I didn’t have any way of attaching it to anything, and this had become a frustration; I tried attaching Velcro tape to the back and sticking it to the side of cabinets, but it never seemed to work. With the clips, I wouldn’t really have to worry about this, and I could freely clip it wherever. I decided to make them very simple, hoping they would latch onto something rather than actually clip, because I wasn’t sure how I was going to design that with the material being somewhat brittle.

Once I had my refined design, I sent it to the MakerLab at the Gies College of Business to print overnight, so I could pick it up the next morning, check it out, attempt to iterate on it, and see exactly what I would need to do to make the clips better. I felt pretty good at this stage and waited until the next day, when I received an email that it would be ready to pick up at 2pm. Upon arrival, I found that what had been printed was a reprint of my first version, not my new redesign; apparently I had accidentally uploaded the incorrect file, and I would have to try again, so I did. The second time, I arrived and it was not printed at all! The project file kept failing every time they tried to print, and they couldn’t tell me why. So I re-uploaded the file to the printer and tried a third time. This time it had printed, but an employee had accidentally broken it while trying to remove the supports. I was getting frustrated at this point; even setting aside the accidental break, the print itself looked very weird, with all sorts of gaps and defects. I had to have it printed one last time, and by now it was already Thursday, the night before I had to present this project, with nothing done, because I had expected to have it printed much sooner in the week. Luckily, the last print worked, and I rushed to the Fab Lab to work out how to create a circuit with a button and light for my cable holder.

Unfortunately, this print was also pretty bad, but at least all the pieces were there, so I had to work with it. I also didn’t have time at this point to redo the clips, so I just went on to the next thing: the light. I began by making a very basic circuit lighting an LED with a breadboard, and quickly learned that a single LED would not do the job. I asked Emelie about it and she suggested NeoPixels. I wasn’t a fan of this idea because I thought introducing an Arduino device would make things unnecessarily complicated. Instead I wanted to use a small lightbulb, but since there were none at the Fab Lab, I had to go the NeoPixel route. I began by tinkering with some of the code, and quickly stumbled across some test code for NeoPixel strips. At first this wasn’t working at all. After talking to Brandon, I found that the code was fine, but the NeoPixel strips really needed to be soldered first to have a secure, usable connection.

This part was tricky, because although I had soldered before, I really didn’t have a ton of experience. Eventually I asked for some soldering tips and got a ton of great advice from Neil. Soon enough, after some effort, I managed to get everything soldered and my NeoPixels began to work. I had originally planned to use a coat hanger as the bendable ‘post’ for my light, but I realized the wires I had used were fairly sturdy by themselves. I also originally planned to cover them with rubber tubing, but settled on a straw and then wrapped electrical tape around them.

Below is the final product, which I was ultimately pretty pleased with. I didn’t get to work on this as much as I had wanted, because of the printing errors I kept running into, but I think I will continue to refine it. One thing I especially want to do is incorporate clips with springs in them for better options when hanging it onto something. I’m glad I got to work with more Arduino stuff, even though I hadn’t planned to use any at first. I also learned a lot about soldering; I feel more comfortable with that skill now.


Christian Amato Iteration Assignment

For this assignment, I was inspired by Emilie’s idea to make a lit-up acrylic sign in layers using NeoPixels and a 3D-printed base. I decided I wanted to go one step further and add an ultrasonic sensor to the front of the base to light up the sign based on how far away you are.

 

 

This was my nametag, to begin with, and from there I decided I wanted to display the information in sequence, with my hobbies at the front, my information in the middle and my name at the back. 

 

 

I also kept the same original bear, as that’s my favorite animal, but I made it much larger, approximately 4.5 inches, to stay within the footprint of the 3D-printed base.

 

 

 

 

This was the overall design I created through Tinkercad with my measurements to scale. 

 

 

But some of my measurements were not correct and were actually in millimeters instead of centimeters, which left me with holes too small to hold the acrylic, and a cavity too narrow to house the ultrasonic sensor and the Arduino.

 

 

As you can tell, the dimensions for the acrylic cut-outs were way too small which could have been a big problem if I had not had access to a Dremel to widen the holes.

 

Thankfully, even with scaling issues, the Neopixel cavity was still wide enough to house the pixels and the perfect length to carry four pixels without any hanging off the edges. This took 8.5 hours to print. 

 

 

 

 

At first, I decided I wanted to drill out the holes for the NeoPixels, but the combination of drill bit and plastic made it extremely difficult to make consistent cuts without falling into a previous hole.

 

 

I have used a Dremel in the past and really did not want to use one, but it was my only option at that point. The Dremel’s friction heated the plastic so hot that I was able to rip out the melted strands with needle-nose pliers. It was arduous and honestly a huge waste of my time, because it could so easily have been avoided with proper measurements.

By widening the holes for the acrylic, it also allowed for larger amounts of light to illuminate the acrylic pieces, which was overall a great thing. 

Next, I used a soldering iron to solder the NeoPixels in series. I thought it would be fairly easy to do, but soldering such small contacts for the first time took a ton of trial and error.

After a failed first attempt where I wired them from the wrong side of the data-direction arrow, Brandon generously helped me rewire them from the front side.

From there, I drilled two holes in the front and wired up the ultrasonic sensor along with the NeoPixels; as you can see from the picture, the housing was very snug.

This is how it looked without any illumination. 

I think it could have looked much cleaner without the hot glue, but it was necessary to keep it sturdy. 

Here is the video of the NeoPixels in action without the ultrasonic sensor. I thought this looked a lot better than the plain colors I used with the ultrasonic sensor.



Then I decided to use the ultrasonic sensor to trigger the lights.



REFLECTION

This project really was a culmination of all the skills I have developed over this course. I used the knowledge from the 3D printing unit to create a 3D model in Tinkercad that served as the base for the entire project. From there, I expanded my knowledge of the Arduino and the code that would end up controlling the entire device. I used the laser cutter again, and I learned what NeoPixels are and the practically unlimited potential they have for different uses. I honed my soldering skills and learned about the dangers of lead exposure. I used the drill and the Dremel to enlarge the housing to fit the larger components. I think this project was great because it allowed me to use a wide array of tools, and in the end it actually functioned.

What went well?

I think the large array of NeoPixel test sketches that Adafruit provides helped the overall aesthetic of the design. It was also a happy surprise that I happened to be printing in white plastic: the NeoPixels lit up the honeycomb pattern on the inside beautifully. I thought it really looked great in the end. The soldering took a lot of patience but was worth it when it worked on the second try. Brandon was very helpful in showing me how to properly solder, and in explaining the importance of cleaning the solder tip so the solder flows more smoothly.

What could I have done differently?

I think the 3D printing of the base was the hardest part of the project. From the beginning, the dimensions were not right, and it forced me to rip the print apart and use a Dremel to accommodate the larger components. I got really lucky that the base cavities were somehow exactly the right length for the Arduino, and that the NeoPixels were exactly the right width to fit in them. But as you can see, the base is not flat at all. It’s curved on all sides, and I noticed this after only 10 minutes of printing. I should have used something to stick the plastic down to the print bed so that the base would have come out nice and flat. My number one takeaway from this project is to measure twice and cut once. Because of this, I bought a digital caliper for use in my future projects!

Arduino Code:

// Turning NeoPixels on and off using an HC-SR04 ping sensor

/*

This sketch reads a HC-SR04 ultrasonic rangefinder and returns the

distance to the closest object in range. To do this, it sends a pulse

to the sensor to initiate a reading, then listens for a pulse

to return. The length of the returning pulse is proportional to

the distance of the object from the sensor.


The Arduino then takes this information and illuminates a strip of
NeoPixels based on the distance of the object from the sensor.

This code was developed partially from Ping))) code found in the public domain
written by David A. Mellis, and adapted to the HC-SR04 by Tautvidas Sipavicius,
while other portions were written by Charles Gantt and Curtis Gauger from
http://www.themakersworkbench.com.
*/

//Tell the Arduino IDE to include the FastLED library
#include <FastLED.h>

//Setup the variables for the HC-SR04
const int trigPin = 8;
const int echoPin = 7;

//Setup the variables for the NeoPixel Strip
#define NUM_LEDS 12 // How many leds in your strip?
#define DATA_PIN 6 // What pin is the NeoPixel’s data line connected to?
CRGB leds[NUM_LEDS]; // Define the array of leds

void setup() {
// initialize serial communication:
Serial.begin(9600);
FastLED.addLeds<NEOPIXEL, DATA_PIN>(leds, NUM_LEDS);
}

void loop()
{
// establish variables for duration of the ping,
// and the distance result in inches and centimeters:
long duration, inches, cm;

// The sensor is triggered by a HIGH pulse of 10 or more microseconds.
// Give a short LOW pulse beforehand to ensure a clean HIGH pulse:
pinMode(trigPin, OUTPUT);
digitalWrite(trigPin, LOW);
delayMicroseconds(2);
digitalWrite(trigPin, HIGH);
delayMicroseconds(10);
digitalWrite(trigPin, LOW);

// Read the signal from the sensor: a HIGH pulse whose
// duration is the time (in microseconds) from the sending
// of the ping to the reception of its echo off of an object.
pinMode(echoPin, INPUT);
duration = pulseIn(echoPin, HIGH);

// convert the time into a distance
inches = microsecondsToInches(duration);
cm = microsecondsToCentimeters(duration);

Serial.print(inches);
Serial.print("in, ");
Serial.print(cm);
Serial.print("cm");
Serial.println();

// Light the strip blue when an object is within 20 inches, otherwise turn it off
if (inches <= 20) {
fill_solid(&(leds[0]), NUM_LEDS, CRGB::Blue);
FastLED.show();
}
else {
fill_solid(&(leds[0]), NUM_LEDS, CRGB::Black);
FastLED.show();
}

delay(100);
}

long microsecondsToInches(long microseconds)
{
// According to Parallax’s datasheet for the PING))), there are
// 73.746 microseconds per inch (i.e. sound travels at 1130 feet per
// second). This gives the distance travelled by the ping, outbound
// and return, so we divide by 2 to get the distance of the obstacle.
// See: http://www.parallax.com/dl/docs/prod/acc/28015-PING-v1.3.pdf
return microseconds / 74 / 2;
}

long microsecondsToCentimeters(long microseconds)
{
// The speed of sound is 340 m/s or 29 microseconds per centimeter.
// The ping travels out and back, so to find the distance of the
// object we take half of the distance travelled.
return microseconds / 29 / 2;
}

ARDUINO ADAFRUIT CODE

// NeoPixel test program showing use of the WHITE channel for RGBW
// pixels only (won’t look correct on regular RGB NeoPixel strips).

#include <Adafruit_NeoPixel.h>
#ifdef __AVR__
#include <avr/power.h> // Required for 16 MHz Adafruit Trinket
#endif

// Which pin on the Arduino is connected to the NeoPixels?
// On a Trinket or Gemma we suggest changing this to 1:
#define LED_PIN 6

// How many NeoPixels are attached to the Arduino?
#define LED_COUNT 12

// NeoPixel brightness, 0 (min) to 255 (max)
#define BRIGHTNESS 50

// Declare our NeoPixel strip object:
Adafruit_NeoPixel strip(LED_COUNT, LED_PIN, NEO_GRBW + NEO_KHZ800);
// Argument 1 = Number of pixels in NeoPixel strip
// Argument 2 = Arduino pin number (most are valid)
// Argument 3 = Pixel type flags, add together as needed:
// NEO_KHZ800 800 KHz bitstream (most NeoPixel products w/WS2812 LEDs)
// NEO_KHZ400 400 KHz (classic ‘v1’ (not v2) FLORA pixels, WS2811 drivers)
// NEO_GRB Pixels are wired for GRB bitstream (most NeoPixel products)
// NEO_RGB Pixels are wired for RGB bitstream (v1 FLORA pixels, not v2)
// NEO_RGBW Pixels are wired for RGBW bitstream (NeoPixel RGBW products)

void setup() {
// These lines are specifically to support the Adafruit Trinket 5V 16 MHz.
// Any other board, you can remove this part (but no harm leaving it):
#if defined(__AVR_ATtiny85__) && (F_CPU == 16000000)
clock_prescale_set(clock_div_1);
#endif
// END of Trinket-specific code.

strip.begin(); // INITIALIZE NeoPixel strip object (REQUIRED)
strip.show(); // Turn OFF all pixels ASAP
strip.setBrightness(BRIGHTNESS); // Set brightness to about 1/5 (max = 255)
}

void loop() {
// Fill along the length of the strip in various colors…
colorWipe(strip.Color(255, 0, 0) , 50); // Red
colorWipe(strip.Color( 0, 255, 0) , 50); // Green
colorWipe(strip.Color( 0, 0, 255) , 50); // Blue
colorWipe(strip.Color( 0, 0, 0, 255), 50); // True white (not RGB white)

whiteOverRainbow(75, 5);

pulseWhite(5);

rainbowFade2White(3, 3, 1);
}

// Fill strip pixels one after another with a color. Strip is NOT cleared
// first; anything there will be covered pixel by pixel. Pass in color
// (as a single ‘packed’ 32-bit value, which you can get by calling
// strip.Color(red, green, blue) as shown in the loop() function above),
// and a delay time (in milliseconds) between pixels.
void colorWipe(uint32_t color, int wait) {
for(int i=0; i<strip.numPixels(); i++) { // For each pixel in strip…
strip.setPixelColor(i, color); // Set pixel’s color (in RAM)
strip.show(); // Update strip to match
delay(wait); // Pause for a moment
}
}

void whiteOverRainbow(int whiteSpeed, int whiteLength) {

if(whiteLength >= strip.numPixels()) whiteLength = strip.numPixels() - 1;

int head = whiteLength - 1;
int tail = 0;
int loops = 3;
int loopNum = 0;
uint32_t lastTime = millis();
uint32_t firstPixelHue = 0;

for(;;) { // Repeat forever (or until a ‘break’ or ‘return’)
for(int i=0; i<strip.numPixels(); i++) { // For each pixel in strip…
if(((i >= tail) && (i <= head)) || // If between head & tail…
((tail > head) && ((i >= tail) || (i <= head)))) {
strip.setPixelColor(i, strip.Color(0, 0, 0, 255)); // Set white
} else { // else set rainbow
int pixelHue = firstPixelHue + (i * 65536L / strip.numPixels());
strip.setPixelColor(i, strip.gamma32(strip.ColorHSV(pixelHue)));
}
}

strip.show(); // Update strip with new contents
// There’s no delay here, it just runs full-tilt until the timer and
// counter combination below runs out.

firstPixelHue += 40; // Advance just a little along the color wheel

if((millis() - lastTime) > whiteSpeed) { // Time to update head/tail?
if(++head >= strip.numPixels()) { // Advance head, wrap around
head = 0;
if(++loopNum >= loops) return;
}
if(++tail >= strip.numPixels()) { // Advance tail, wrap around
tail = 0;
}
lastTime = millis(); // Save time of last movement
}
}
}

void pulseWhite(uint8_t wait) {
for(int j=0; j<256; j++) { // Ramp up from 0 to 255
// Fill entire strip with white at gamma-corrected brightness level ‘j’:
strip.fill(strip.Color(0, 0, 0, strip.gamma8(j)));
strip.show();
delay(wait);
}

for(int j=255; j>=0; j--) { // Ramp down from 255 to 0
strip.fill(strip.Color(0, 0, 0, strip.gamma8(j)));
strip.show();
delay(wait);
}
}

void rainbowFade2White(int wait, int rainbowLoops, int whiteLoops) {
int fadeVal=0, fadeMax=100;

// Hue of first pixel runs ‘rainbowLoops’ complete loops through the color
// wheel. Color wheel has a range of 65536 but it’s OK if we roll over, so
// just count from 0 to rainbowLoops*65536, using steps of 256 so we
// advance around the wheel at a decent clip.
for(uint32_t firstPixelHue = 0; firstPixelHue < rainbowLoops*65536;
firstPixelHue += 256) {

for(int i=0; i<strip.numPixels(); i++) { // For each pixel in strip…

// Offset pixel hue by an amount to make one full revolution of the
// color wheel (range of 65536) along the length of the strip
// (strip.numPixels() steps):
uint32_t pixelHue = firstPixelHue + (i * 65536L / strip.numPixels());

// strip.ColorHSV() can take 1 or 3 arguments: a hue (0 to 65535) or
// optionally add saturation and value (brightness) (each 0 to 255).
// Here we’re using just the three-argument variant, though the
// second value (saturation) is a constant 255.
strip.setPixelColor(i, strip.gamma32(strip.ColorHSV(pixelHue, 255,
255 * fadeVal / fadeMax)));
}

strip.show();
delay(wait);

if(firstPixelHue < 65536) { // First loop,
if(fadeVal < fadeMax) fadeVal++; // fade in
} else if(firstPixelHue >= ((rainbowLoops-1) * 65536)) { // Last loop,
if(fadeVal > 0) fadeVal--; // fade out
} else {
fadeVal = fadeMax; // Interim loop, make sure fade is at max
}
}

for(int k=0; k<whiteLoops; k++) {
for(int j=0; j<256; j++) { // Ramp up 0 to 255
// Fill entire strip with white at gamma-corrected brightness level ‘j’:
strip.fill(strip.Color(0, 0, 0, strip.gamma8(j)));
strip.show();
}
delay(1000); // Pause 1 second
for(int j=255; j>=0; j--) { // Ramp down 255 to 0
strip.fill(strip.Color(0, 0, 0, strip.gamma8(j)));
strip.show();
}
}

delay(500); // Pause 1/2 second
}


Iteration Project – Joy Shi

For my iteration project, I decided to redo the first assignment: using the laser cutter to make a name tag. My original name tag was nice and simple, though a little small. For my iteration, I wanted to make my name out of acrylic, as it looks cooler than plywood. I noticed that when rastering on acrylic, the engraving comes out white, and I wanted something with more color. My original idea was to use the Silhouette cutter to make a sticker that I would then place on top of the acrylic. While looking for inspiration on Google, I came across someone using an LED light and a coin battery to light up acrylic. The effect of the light was really cool, and it seemed simple enough for me to do. When I proposed my idea to Maxx, she thought it was too simple. She suggested I instead use NeoPixel strips as the base and have the light shine up through the rastered acrylic to give it color, instead of placing a sticker on top. After she showed me a few examples that were in the FabLab, I decided this was a much better idea than what I had originally thought of doing.

Original Name Tag

Since I already had a general idea of what I wanted to put on my name tag, the designing part of the name tag was relatively easy to do. Since I wanted this name tag to be more personalized than just having my name on it, I decided to add my Bitmoji with a coffee cup since coffee plays such an essential part in my life!

I decided to print out a prototype version of the name tag on plywood, just so I could see if I liked how the design turned out and how big it was. Right when I was about to print, Maxx noticed that the name tag was a tad too big. After adjusting the size, I was ready to print! The final prototype turned out pretty nice, and I really liked the size and design of it.

Sketch of the design

Prototype

I went back and made a few tweaks to the design, since there was a bit of blank space on the prototype. After adjusting, I decided to laser cut my design onto acrylic. Halfway through the print, I realized that I had forgotten to adjust the size of the design!! So in the end, the design was slightly larger than I anticipated, but it still turned out really nice.

After that fiasco, I went to search for NeoPixel strips. Sadly, the only NeoPixel strips they had didn’t have the wires soldered on. After a brief tutorial, I was soldering the wires onto the NeoPixel strips myself. The process was pretty cool and would have been really easy if my hands weren’t so shaky. With the wires connected to the NeoPixel strips, it was time to test them out. Since I’m not that good at coding for the Arduino, I just found code on Google. With the code loaded and the wires connected, I tested out the NeoPixel strips.

With the acrylic design ready and the NeoPixel strips working, it was time to figure out how to make the stand I was going to put the acrylic on. Google had many ideas, from using a box to 3D printing to using the laser cutter. Since I was already using the laser cutter, it seemed only reasonable to continue with it. The stand was relatively easy to design: drawing it out in Inkscape was mostly rectangles cut into different shapes. I was especially careful with the measurements of the slit where I was going to place the acrylic. The lasering of the stand was also straightforward. However, when I attempted to put the acrylic into the slit I had made, it wouldn’t fit!! I was slightly disappointed, since I had paid specific attention to make sure this wouldn’t happen:( Thankfully it was only slightly off, so I used some sandpaper to remove the excess. Surprisingly, this took a while, since the pieces were small and I didn’t want to break anything. After finally getting all the pieces to fit through the slit, I worked on placing the NeoPixel strips. With the help of some double-sided tape, I placed the NeoPixel strips on the base of the stand. I then glued the top part of the stand, with the acrylic in it, on top of the NeoPixel strips.

Using sandpaper to make the acrylic fit:(

Finally, with all the slits lining up with the acrylic

In the end, I really liked how the whole project turned out, even though now it’s more like a nameplate than a name tag!!!

Code used:

https://forum.arduino.cc/index.php?topic=282769.0

Inspiration used for base: 

https://www.xstron.com/how-to-make-custom-dimension-wood-led-base-for-acrylic-plate


Daniel Shin – Assignment 8: Iteration

For the iteration project, I decided to iterate upon the sculpture I made in the 3D printing session. One difficulty I had with the 3D printing project was preserving all the small details of a complicated object. Therefore, I decided to simplify the 3D-printed object and combine it with other techniques, such as laser cutting and LED lighting, to improve the visuals of the final product.

The planning process included the basic idea of how each part was going to be put together to function as one. I also had to put effort into getting the sizes of the parts right, since the 3D-printed objects had to fit perfectly with the laser-cut ones.

While I was still in the workshop, I made the template for the box according to the planned sizes within Inkscape and laser cut them.

Once I got home, I focused on getting all the digital work done so I could finish everything once I got back to the workshop. I took the 3D model from my previous project and made it much simpler. For the front design of the box, I found an image of a horn on the internet and combined it with a helmet design and patterns that I drew myself in Photoshop. Once the design was done, I imported it into Inkscape so it was ready for laser cutting.

The first thing I did when I got back to the workshop was to start running the 3d printing and the laser cutting process. All processes went smoothly without an issue, so I was able to focus on the LED building right away.

While I was scavenging the workshop to find a battery, I found this battery holder that comes with a switch and cables. I was wondering how I was going to make the switch, copper tape circuit, and a battery holder before I found this. Discovering this solved everything, and all I had to do was connect the cables with the LED, and it worked perfectly.

Once the 3D printing was done, I sanded the outcome and glued the parts together. I didn’t have to color the sculpture because the sculpture was going to function as a silhouette. Then I placed the sculpture within the box and glued the box together.

I made three more patterns of the horn design to add more visual aspects to the box. Once I glued them to the box, I was done with the project.

This is how it looks in the dark.

Reflection

Overall, I didn’t face many difficulties throughout this project. I was comfortable using all the tools I used in the process, and the thorough planning made the process easy. The only problem I was worried about was getting the electric circuit working for the LED, but it was resolved easier than I expected, thanks to finding the battery holder I used. If I were to do this project again, I would probably make the whole thing slightly larger.


Iteration Project – Scott Kim

Iterating on the first Arduino project.

The project was something that could definitely work well in theory, but ran into a lot of issues and shortcomings that I didn’t initially expect. The original idea was to be able to type out words without the use of fingers or fine motor skills. It would be used in a way like this:

  1. Once the ultrasonic sensor detects your hand, it keeps measuring the distance and uses that distance to select a letter.
  2. Once the sensor detects a sudden change in distance (meaning you moved your hand away from the sensing area), it selects and types the letter corresponding to the most recent distance.

But it had a lot of problems.

This seemed like a good idea in theory, but a lot of its shortcomings can be summarized through this:

(sorry for the dark video, the screen was just too bright in comparison)

Digging into what might be the problem for it being inconsistent, I printed out what the sensor measured as distance and got this:

Despite its errors, it did work well enough for a short demonstration and video for the assignment, but it was nowhere near what I had imagined. The whole thing needed a better frame to make the hardware more reliable, and an entire software rewrite to accommodate the small errors. Even with that, I found some oversimplifications in my original description of the project.

So immediately, the distance measurements are unreliable. I saw three sources of this, and three solutions to them that I would try to implement first:

  1. When you’re really far out (the letter Z is ~50cm away from the sensor) you can’t really tell if the sensor is facing in your hand’s direction.
    • I need to make some solid frame that holds the sensor perpendicular to some ruler/surface
  2. “A sudden change in distance” is really vague and hard to quantify. Especially with an unreliable sensor like this.
    • Once I get the frame, have something at the end so I can tell exactly when a hand is/isn’t on the sensor path.
  3. “Hands” are oddly shaped and aren’t always detected too well.
    • Use a 90 degree reflector instead

First, making more reliable hardware

I knew that about 2 cm per letter was a good compromise: it doesn’t take too much space, but you also don’t have to be too precise. So, I started preparing a ~55cm long scaffold to hold the sensor and display. I created a cutting pattern for an open box using Boxes.py, and added a few more cutouts of my own for the sensor, screen, and wires.

The pieces at the top will make up the open box, while the 4 bottom pieces in the cut pattern will be used to solve problem #3 and create the 90 degree reflector.

Once cut, I started putting together the pieces!

My proudest part of this, really.

The ultrasonic sensor was an amazing fit, with 10/10 positioning of the wiring hole. I was honestly way too happy about getting this one bit so perfect 🙂


Downhill from there

The display might look like an almost perfect fit, but the cut hole was just barely too small because of a tiny part I didn’t account for. I amended this later.

The 90 degree reflector! I heard that an angled piece like this reflects light and sound back toward the source more reliably, so I thought it would help a lot.

I made the mistake of cutting a little too big a piece because I didn’t correctly tolerance it when measuring, cutting, and Pythagorean Theorem-ing.  

And then finally, it’s all coming together! Some last things to fix though.

Measure once, cut twice

Now as I mentioned before, the screen was not fitting perfectly, and the reflector was cut too big. I quickly fixed those issues.

The oversized reflector was marked, cut, then put back:

And then for the screen, this tiiiny part of the screen/wires were getting caught in the wood and would not let it fit.

And so, a bit of careful carving out the wood later, I managed to make some room. (There’s a pretty guilty-looking utility knife in the background)

After some hot gluing, taping, and writing the letters on the board as a guide, it came together pretty well! From a hardware perspective, it had everything to make the ultrasonic sensor as reliable as possible, addressing all of the problems above.

Actually writing good code this time

It was now time for the software rework.

After toying with the old version of the code, I saw several issues that I should address similarly:

Issue #1: No matter how reliable everything is, I’ll get the occasional outlier from the ultrasonic sensor.

The first method used the statistical mean of ~30 samples, but outliers affected that a lot.

Instead of using the mean, I now use the median. That way, huge outliers won’t have any effect as long as they’re not the majority.

Issue #2: As I move my hand away to “select” a letter, I might accidentally select an adjacent letter instead.

For example, say the user wants to select ‘h’, which is between 18cm and 20cm. The sensor might return a series of numbers like this:

  • 19.3cm (h)
  • 18.8cm (h)
  • 19.1cm (h)
  • 21.2cm (i)
  • 57.8cm (hand completely moved away)

Once the sensor realizes the hand completely moved away, it needs to select the letter that was most present in the last few seconds (which is ‘h’), not just the most recent measurement (which would be ‘i’).

I solved this by keeping track of the ~7 most recent measurements, and then taking the median value once it realizes the hand moved away.

I set a delay at the end so that the user gets some moment to see that the letter they want has been selected.

Issue #3: There’s no good way to tell if you’re in the ‘border’ between two letters.

If you take a look at the past video or images, you can see it displays a “Distance”. This was meant to serve the above purpose, but it only really works for me because I know roughly where the letters are supposed to land. And even then, it’s not very useful, because of course I don’t have the thresholds memorized.

I had all this screen space to work with, so I thought I could put a pretty intuitive indicator of how “centered” you are on the letter! It would work like this:

If you’re currently selected on the letter ‘c’, and perfectly centered, the display would show something like this:

  • b          c          d

Otherwise, if you’re selected on the letter ‘c’ but a little over to the left:

  • b   c                 d

Or if you’re a little over to the right:

  • b                 c   d

Hopefully that would indicate that you should move your hand a little over to the right to get a more precise selection of the letter ‘c’. 

And with that, we’re done!

I’ll let the video speak for itself 🙂

Most of the trial-and-error in this project was really in the programming. Seeing what seems like would intuitively or theoretically work, trying to make it work, and not being sure if my method is bad or if the program is buggy. There was one hour where everything was working fine, but it would inexplicably type “z” no matter what. I was completely sure I was writing the correct letter to the LCD, and then I realized I was just overwriting the correct letter with “z” just a couple milliseconds later because of a bad if statement 🙁

It also took a lot of tinkering with how many samples I should take for a reliable median, how many samples I should “remember” for the selection, and how much space to give per letter. If these numbers were too big, the system felt sluggish, but too small led to more mistakes.

Then there was the reorganizing and commenting of everything, because holy moly, the original code I wrote was awful.

When I saw how well it was working in the end, I was genuinely so happy with its consistency and reliability. The typing, the “Ready” notice, as well as the visual guide and feedback for each letter was all exactly how I imagined it to be. (Almost) everything fits together perfectly, and there’s so little about it that feels hacked together or improvised. I really can’t express just how “complete” this project feels, really having made no compromises in terms of quality, accuracy, or usability. 

One thing I considered was laser cutting the letters onto the board, but I thought it would be better to hand-write them on so that I can fiddle with the letter spacing and locations in the code. Next time though!


Iteration project: Sean White

For my iteration project, I decided to iterate on the game I made with the LCD. I wanted to use a screen that offered more pixels and had an easier interface to work with in Arduino. The main issue with the LCD screen I originally used was that it had very few pixels and everything was broken into blocks. The new OLED screen I used is 128×64 pixels, all in one continuous area. The coding interface was also much easier, since the Adafruit libraries allow drawing bitmaps at x and y coordinates. This made it much easier to code the moving blocks.

A few of the problems I ran into, like last time, had to do with making sure the walls spawned in a way that was possible to avoid. I spent a while trying to figure out all of the conditions for the walls, but in the end I used the same method as before, which was brute-forcing and checking the different options for when a wall spawned. Unfortunately, I wasn’t able to take a picture of this, but this is an image of what it looked like:

I had originally had each box move 16 pixels at a time, but that looked very choppy. As a result, I decided to increase how often the screen refreshed, since this OLED screen could handle a faster refresh rate. This made the game look much smoother compared to the original LCD.

To improve upon this, I decided to make a housing for it as well as use a joystick instead, since that is more intuitive to control. I also added a scoring system based on the total time the player survived, and I made it so that every five seconds the walls would speed up a little bit. This made it more challenging over time.

I at first wanted to design this like an arcade system, but once I received the screen, I realized it was too small for something like that. Instead, I created a box using boxes.py to hold the Arduino, and I made cutouts for the screen and the joystick. Overall, I am happy with how it turned out, but if I were to redo it, I would make the case more ergonomic to hold. I would do this by using an Arduino Nano to make the overall form factor smaller and by using smaller wires. I would also solder the wires together.

This was the original lcd project:

This was the new improved project:
