Pursuits of Design and Robotics

The documentation of concept development and general processes



Sketch Helper: Precedents, Ideas, Goals.

A couple of projects have been really inspiring to me as drawing aids. The first, Sketch it, Make it (now called Zotebook), came from a Carnegie Mellon Computational Design student years ago. The app allows you to draw quickly on your tablet and offers corrections and perfections as you go.

What is particularly nice about this application is that its output can go straight to a 3D printer or laser cutter to create quick models that were drawn up in minutes. It is a very useful and accurate way to 'sketch' prototypes and have them realized immediately.

Another precedent to look at is the NeoLucida, which incidentally was also started by someone who was at CMU, Pablo Garcia. The NeoLucida is essentially a camera lucida with some electronics involved. It allows the user to trace the reality seen through the lucida as it appears projected, in a way, onto the paper.

This project is great for its unobtrusiveness in the analog process. It does not limit the user; it can only provide visual suggestion. I think this could help a lot in shaping how my application will aid the drawing process.

People today rely quite heavily on the aid of technology, and in my opinion it is costing them the touch of the hand: the personal quality of mistakes, imperfections, and personality. This feels especially true in renderings of architecture. The goal is realism, but in achieving it, every image looks to me as though it could have been made by the same person; nothing really stands out. I think it's valuable to add personal flair to an image through the style of the drawing, through the personal touch.

Even when that is not the goal, when the drawer simply wants to quickly output a sketch of an idea, a technologically produced image tends to come out looking very complete. The common observation is that a completed, accurate drawing looks resolved, which makes the conceptual idea it communicates uninviting for clients or peers to critique. This is a real issue in the world of HCI, for example, when it comes to sketching user interfaces and testing them with surveys: the respondent will feel inclined to say less if the tested product looks more complete or finalized.

Essentially, then, what feels necessary but missing in the technologically aided drawing world is something that provides the efficiency that drew people to digital drawing in the first place, while maintaining the hand-drawn sketch feeling that only the old-fashioned pen-and-paper method can provide. My idea is to make that happen using the Equill pen combined with another technology, which I'm currently considering to be either a screen or a projector.

This is what I had up until a short while ago, using the Surface Pro 3 and its pen with the tip replaced by a piece of graphite. As you can see in the second video, there are plenty of comments to be made on how it can be improved.

In addition to Rohan's comments in the video, I've also come to the conclusion that there should be a small space designated for construction drawing. As you saw, Rohan even drew a smaller version of what he was about to draw. In perspective drawing it is common to first draw a floor plan along with the viewpoint and viewing angle. I could use that basic plan, and possibly an additional section, to automatically construct the room for the user.
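The plan-and-section construction described above boils down to a perspective divide. Here is a minimal, hypothetical sketch of the idea (not code from the actual application), assuming the viewpoint sits at the origin looking straight down the depth axis:

```cpp
#include <cassert>
#include <cmath>

// Hypothetical sketch: a corner sits on the floor plan at lateral
// offset x and depth z in front of the viewpoint, with a height y
// taken from a section. Project it onto a picture plane d units
// from the eye.
struct Point2 {
    double x;
    double y;
};

Point2 projectToPicturePlane(double x, double y, double z, double d) {
    // Similar triangles: each screen coordinate is the world
    // coordinate scaled by d / depth.
    return { x * d / z, y * d / z };
}
```

Walking each corner of the floor plan through this, once at floor height and once at ceiling height, would give the perspective outline of the room; a corner 2 units to the right, 1 unit up, and 4 units deep lands at (1, 0.5) on a picture plane 2 units out.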

Because that could be complicated and not quite intuitive for a beginner, I'm also considering offering a default scaffolding that can be navigated, with the view selected before the drawing process begins. This option allows building off of the scaffold, which could inspire ideas and be an easier starting point than nothing at all.

I think both can be done. I am currently tapping into the Equill pen SDK and trying to figure out how to take advantage of all of its features. I am hoping to move completely away from the Surface pen, which is obviously limited to its own hardware, while the Equill can work on multiple platforms.




Smart 3D Pen Research

“FreeD” by Amit Zoran. 2013


Amit references:

“Haptic Intelligentsia” by Joong Han Lee. 2012


Amit uses magnets for 3D tracking. I found a website that appears to be a DIY guide. Maybe I can work on this for another project if the Leap works.

I was worried it wouldn't, so I did a lot of research on 3D magnet tracking, which is still an option; maybe I'll save it for another project. I'll post some links so people can see what I was thinking about, if you're interested.

DIY Magnet Tracker Sites
1 2 3 4 5

Understanding the limitations of the pen. Make sure it can work with how I want to use it.

THE LEAP WORKS!

Pen over Leap



Independent Study

My independent study is going to be a continuation of my project for the embedded wearable technology in Ali Momeni’s Hybrid Instruments course.

I want to make my cat bowl smart enough to determine which cat is approaching it, whether it should let the cat eat, how much, and how many times throughout the day. The bowl will relay information to me through a phone application. With the app I can see how many times each of my cats tried to eat today, which cat interrupted which cat from eating, and how much the cats have eaten so far today, and I can control everything.

I found a few precedent examples of people who have thought about this, but I don’t think anybody did it as well as I will do it.

http://gatefeeder.com/

Gatefeeder is decently designed; the door is clever in not needing a servo, since the cat opens it. The inset depth of the door is presumably there to prevent a second cat from pushing its head in and keeping the door open for itself. I could consider a design that accomplishes similar results. Gatefeeder does a lot of what I want, except that it doesn't differentiate between cats and it doesn't intelligently learn about the cats' behavior. In fact, there are no cat bowls that actually monitor the feeding; they only open and close.

http://petsweekly.com/en/all-about-cats/cat-health/cat-behavior/663-feeding-individual-diets-in-multi-cat-households

This website shows some other examples of cat feeders, mainly SureFlap and Meow Space. Both use the idea of a cat door to a confined space to allow certain cats in to eat. Again, these lack control over how much is eaten by different cats who have access to the same bowl.

The precedents are there, and it's reassuring to see that there are no data-interpreting bowls like the one I intend to make, but the existing bowls are similar enough to help me understand the best actuation methods. RFID was the right way to go, yay!

I have ordered an RFID reader component and an antenna for it that allows a range of up to 3 feet. This way, if one cat tries to sneak up on the one eating, the bowl will just shut down and they'll both have to give up on eating at the same time. I also got tags that should be perfect for the collar.



Sing To Me Physical Process

The first model of the flower was too stiff for the servo to actuate. Also, the box was ugly. I did really like the idea of wood, though, because of the warm ambiance it gives off. I thought the ribbons might look nice, but they look terrible and desperate. The flower was too big to get much movement out of a small servo, and I hadn't considered until seeing this that the string should not run out in the open like that all the way down to the base. There should be some kind of protection.

I redid it to make it look a bit more planned and smooth. See the final product in the final post for this.

Elevation view, closed.

No hardware could fit down here. I did not plan well.

Flower Opened

Flower Closed



Sing To Me Code

Just a brief on the code before showing the gold. HAH! Rhymes. That didn't really make sense, since they refer to the same thing... anyway.

With the FFFT library I can use the microphone to pick up sound across the spectrum and store each frequency bin's value in an array. From the tutorial, this array is size 64. I decided to reduce it to half that size by summing each adjacent pair of numbers, which also allows for a higher tolerance of sound input.

The next step was to figure out the prominent note, which was a simple find-the-max-value loop with some smoothing thrown in there to, well, smooth things out.
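For anyone who wants the two steps above in isolation from the Arduino plumbing, here is a minimal standalone C++ sketch of the same pipeline; the names are illustrative, though the sizes and the skipped low bins mirror the full code below:

```cpp
#include <cassert>
#include <cstdint>

// Standalone sketch (not the Arduino code itself): fold a 64-bin
// spectrum down to 32 bins by summing adjacent pairs, keep a moving
// average over the last few frames, then take the loudest bin.
const int kBins = 32;         // 64 FFT bins folded in half
const int kSmoothFrames = 5;

uint16_t readings[kSmoothFrames][kBins] = {};
uint32_t total[kBins] = {};
int frameIndex = 0;

// Fold one 64-bin spectrum and feed it into the moving average.
void addFrame(const uint16_t spectrum[64], uint16_t average[kBins]) {
    for (int i = 0; i < kBins; i++) {
        total[i] -= readings[frameIndex][i];   // drop the oldest frame
        readings[frameIndex][i] = spectrum[2 * i] + spectrum[2 * i + 1];
        total[i] += readings[frameIndex][i];   // add the newest frame
        average[i] = total[i] / kSmoothFrames;
    }
    frameIndex = (frameIndex + 1) % kSmoothFrames;
}

// Index of the loudest smoothed bin, skipping the noisy low bins
// just as getNote() does.
int loudestBin(const uint16_t average[kBins]) {
    int best = 4;
    for (int i = 4; i < kBins; i++)
        if (average[i] > average[best]) best = i;
    return best;
}
```

The circular buffer means each new frame replaces the oldest one in the running total, so the average never has to be recomputed from scratch.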

For the buttons, I have one to set a new password and one to test new input against the password. This works by recording the loudest note for 30 frames and saving it to an array. The array saved when setting the password is then compared to the new input. There is an error tolerance to accommodate offsets in timing or being slightly off-key. If you are too far off in timing or too off-key, you get a point, and after 5 points out of the 30 frames you got the password wrong. This makes the input pretty picky, considering how sketchy the microphone seems to be with ambient noise and the like.
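The point-based matching rule can be summarized in a few lines. This is a simplified standalone version, assuming the same constants as the sketch; the real code interleaves this with the button and servo logic:

```cpp
#include <cassert>
#include <cstdlib>

// Simplified illustration of the matching rule: compare a recorded
// 30-frame melody to a new attempt, frame by frame. A frame more than
// 2 bins off (timing or pitch) earns a point; fewer than 5 points over
// the 30 frames counts as a successful password entry.
const int kFrames = 30;
const int kBinTolerance = 2;
const int kMaxPoints = 5;

bool passwordMatches(const int recorded[kFrames], const int attempt[kFrames]) {
    int points = 0;
    for (int i = 0; i < kFrames; i++)
        if (std::abs(attempt[i] - recorded[i]) > kBinTolerance)
            points++;
    return points < kMaxPoints;
}
```

With a tolerance of 2 bins, a note that lands in a neighboring frequency bin still passes, while a badly timed or badly pitched frame counts toward the 5-point limit.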

I could have done more with the buttons. I wanted a light to turn on to indicate that a password is being set and to stay on once it is set, and a light for the new input as well, which would blink a couple of times to celebrate your correctness. I also wanted protection against inputting a password before setting one, which currently means that you automatically get it right. There are a few other simple situations I could have delved into to enrich the experience and make it more fluid, easier to understand, and more functional. But hey, at least writing about it shows awareness.

 

Here is my very, very messy code. Stick THAT in your Arduino and run it.

#include <stdint.h>
#include <ffft.h>
#include <Servo.h>

//SOUND STUFF

volatile byte position = 0;
volatile long zero = 0;

int16_t capture[FFT_N]; /* Wave capturing buffer */
complex_t bfly_buff[FFT_N]; /* FFT buffer */
uint16_t spektrum[FFT_N/2]; /* Spectrum output buffer */

//AUDIO INPUT
#define IR_AUDIO 0 // ADC channel to cap

//SOUND AVG

const int fftNum = int(FFT_N / 2);

//SOUND SMOOTHING
const int smoothFrames = 5;
int sIndex = 0;
uint16_t readings[smoothFrames][fftNum/2];
uint16_t average[fftNum/2];
uint16_t total[fftNum/2];

//INPUT RECORDER
const int totalFrames = 30;
int thisFrame = 0;
int indexRec[totalFrames];
bool recorded = false;

//INPUT PASSWORD
int passFrame = 0;
int indexPass[totalFrames];
bool passworded = false;
int worked = 0;

//BUTTON STUFF
//int ledPin1 = 12; // choose the pin for the LED
//int ledPin2 = 13;
int butPin1 = 7; // choose the input pin (for a pushbutton)
int butPin2 = 6;
int butVal1 = 0; // variable for reading the pin status
int butVal2 = 0;

//SERVO
Servo box;
int boxPint = 13;

void setup()
{
Serial.begin(9600);
adcInit();
adcCalb();

box.attach(boxPint);
pinMode(13, OUTPUT);
Serial.println(fftNum);
box.write(180);

}

void loop()
{

//SOUND STUFF
if (position == FFT_N)
{
fft_input(capture, bfly_buff);
fft_execute(bfly_buff);
fft_output(bfly_buff, spektrum);

position = 0;
}

int ind = 0;

// //SMOOTHER
for(int i = 0; i < fftNum/2; i++)
{

total[i] = total[i] - readings[sIndex][i];
readings[sIndex][i] = spektrum[ind] + spektrum[ind + 1];
total[i] += readings[sIndex][i];

average[i] = total[i]/smoothFrames;

ind += 2;
}

//FOR DEBUGGING INPUT WITH SMOOTHING
// Serial.println("THIS IS A SPEKTRUM");
// Serial.print("[");
//
// for (byte i = 0; i < fftNum/2; i++){
// Serial.print(average[i]);
// Serial.print(",");
// }
// Serial.println("]");

sIndex ++;
if (sIndex >= smoothFrames) sIndex = 0;
int thisNote = getNote(average);
Serial.println(thisNote);

//BUTTON STUFF

// if (recorded) digitalWrite(ledPin1, HIGH); // turn LED ON

butVal1 = digitalRead(butPin1); // read input value

if (butVal1 == HIGH) { // check if the input is HIGH (button released)
// digitalWrite(ledPin1, LOW); // turn LED OFF
}
else {
// digitalWrite(ledPin1, HIGH); // turn LED ON
//RECORDER
if(!recorded)
{
if (thisFrame < totalFrames){

Serial.println("RECORDING!");
indexRec[thisFrame] = thisNote;
thisFrame ++;
}
else{
Serial.println("DONE RECORDING!");
recorded = true;
delay(5000);
}
}
}

butVal2 = digitalRead(butPin2); // read input value

// if (passworded) digitalWrite(ledPin2, HIGH);

if (butVal2 == HIGH) { // check if the input is HIGH (button released)
// digitalWrite(ledPin2, LOW); // turn LED OFF
}
else {
// digitalWrite(ledPin2, HIGH); // turn LED ON

//PASSWORD

if(!passworded)
{
Serial.println("INPUT PASSWORD");
indexPass[passFrame] = thisNote;
passFrame ++;

if (passFrame >= thisFrame){
Serial.println("DONE");
passworded = true;
}
}
if(passworded)
{
Serial.println("PASSWORD ENTERED");
for(int i = 0; i < thisFrame; i++)
{
//FOR DIRECT COMPARISON OF INPUTS
// Serial.println("RECORDER:");
// Serial.println(indexRec[i]);
// Serial.println("PASSWORD:");
// Serial.println(indexPass[i]);

if(abs(indexPass[i] - indexRec[i]) > 2)
{
Serial.println("THIS WAS WRONG!");
worked ++;
}
}
Serial.println("DID IT WORK?");
if (worked < 5) {

box.write(0);
delay(1000);
passworded = false;
passFrame = 0;

}
Serial.println(worked < 5);
delay(3000);
worked = 0; // reset the error count for the next attempt
passFrame = 0;
}
}

}

// free running ADC fills capture buffer
ISR(ADC_vect)
{
if (position >= FFT_N)
return;

capture[position] = ADC + zero;
if (capture[position] == -1 || capture[position] == 1)
capture[position] = 0;

position++;
}
void adcInit(){
/* REFS0 : VCC use as a ref, IR_AUDIO : channel selection, ADEN : ADC Enable, ADSC : ADC Start, ADATE : ADC Auto Trigger Enable, ADIE : ADC Interrupt Enable, ADPS : ADC Prescaler */
// free running ADC mode, f = ( 16MHz / prescaler ) / 13 cycles per conversion
ADMUX = _BV(REFS0) | IR_AUDIO; // | _BV(ADLAR);
// ADCSRA = _BV(ADSC) | _BV(ADEN) | _BV(ADATE) | _BV(ADIE) | _BV(ADPS2) | _BV(ADPS1) //prescaler 64 : 19231 Hz – 300Hz per 64 divisions
ADCSRA = _BV(ADSC) | _BV(ADEN) | _BV(ADATE) | _BV(ADIE) | _BV(ADPS2) | _BV(ADPS1) | _BV(ADPS0); // prescaler 128 : 9615 Hz – 150 Hz per 64 divisions, better for most music
sei();
}
void adcCalb(){
Serial.println("Start to calc zero");
long midl = 0;
// take 2 measurements over 2 seconds
// there must be NO SIGNAL on the ADC input!
for (byte i = 0; i < 2; i++)
{
position = 0;
delay(100);
midl += capture[0];
delay(900);
}
zero = -midl/2;
Serial.println("Done.");
}

int getNote(uint16_t thisInput[])
{
int index = 0;
uint16_t note = 0;

for(int i = 4; i < fftNum/2; i++)
{
if (thisInput[i] > note)
{
note = thisInput[i];
index = i;
}
}
return index;
}

 



“Sing To Me” by Yeliz Karadayi

The MTI Blog has failed me, or I have failed it. I could not log in this night.

Flower from Yeliz Karadayi on Vimeo.

 

There is a common myth that plants will grow faster when you sing to them often. This flower guarantees you its blossom, that is, if you are able to sing it the right song.

Well, luckily you get to pick that song. You can be the special person who makes this flower blossom. You can put it next to other flowers to get that instant blossom gratification from your beautiful singing efforts while simultaneously testing out that myth... I think my basil plant grew faster this week?
Elevation

On my window sill please!

On my window sill please!



Inspiration

“Ishin-den-shin” by Yuri Suzuki, et al. Disney, Pittsburgh (2013)

This project is an incredible manifestation of the technological capabilities we have reached today. The ability to use your own body to contain a hidden message and, by a simple touch, deliver it to another takes interactivity to another level of true immersion with the technology. The whole system seems very low-tech, but of course that is because the actual technology at work is unseen, adding a layer of complexity that goes completely unnoticed, except for when you touch someone's ear!


“A Drifting Up” by Syed Reza Ali (2009)

A Drifting Up is a project I was shown over the summer: a flocking particle system that responds to sound. Its response creates particle sub-systems within the larger whole. It is a complicated system that yields an interesting visual effect, but taking something like this a step further and manifesting it physically somehow could be even more intriguing.


“Acoustic Barcodes” by Chris Harrison, et al. Presented @ Cambridge (2012)
The thoughtfulness of this project is appealing as an object-identification device. Though it is difficult to imagine a realistic application for this, it is intriguing to picture a world physically marked with identification that you can hear, feel, and see. The requirement of a haptic interaction with an object in order to get feedback from the audio effect is the function that truly entices me.

Interact with one thing, produce a resulting sound from it, then that sound is heard. The indirect relationship between the “input” object and the direct input to the program is where I find inspiration.


“Mogee” by Bruno Zamborlin, et al. (2012)
This fourth precedent is being posted because Ishin-den-shin was already used and I wanted to be sure to have three new ones. This technology can turn anything into an instrument by picking up on the way the object being used as an instrument is touched. Like Ishin-den-shin, this project takes something seemingly low-tech, makes it do something unexpected, and truly immerses users in the experience by involving them directly.


More