Pursuits of Design and Robotics

The documentation of concept development and general processes



Feedback Loop


This describes the process of a person using the haptic feedback device to print. The idea is that by controlling both the digital and the physical output, the user has a level of control and live feedback while manufacturing, something I will now refer to as design manufacturing.

The perspective of the artist, designer, or ‘manufacturer’ is very important here. The same model could be distributed a hundred thousand times, and with this process no two printed models will be the same. There is a sense of ownership here. Ownership, control, personalization. Craft. Skill. Technique.

However, this concept is not new, as we have seen from Haptic Intelligentsia and FreeD. It is also not necessarily the part that excites me most. What excites me is the notion that the user has control not just over how he outputs the print, but over what the print is, and the ability to change it and distribute that change. The notion of changing the original, the digital original, redistributing it, someone else changing it again. And again. Like a game of telephone played with traced models. The sense of collaboration, direct and indirect: direct collaboration with the haptic device, indirect collaboration with everyone who has affected this model.

And what about other inputs? Can the manufacturer collaborate with other mediums during the printing process? Can sound affect the haptics? Can light? Or perhaps something even more meaningful, like words, affect the manner in which you collaborate?

Perhaps then it is valuable to have the movements of the user recorded back into the digital space to be shared again, to be passed on and changed each time it is accessed, not unlike the recollection of a memory, which is slightly altered each time it is recalled.


Guided Hand

Guided Hand will use the Haptic Phantom Touch’s ability to snap to virtual points in physical space and to the surface boundaries of virtual models. This snapping allows the hand to be physically guided. I will fix a 3D pen onto the Phantom so that the guided hand can 3D print with higher efficiency, drawing on virtual boundaries that do not physically exist. This project will explore the capabilities and limitations of the proposed system and conclude with a completed setup that allows for quick and accurate sculptures or prototypes.

This sort of ‘augmented virtual/physical’ process places limits on the print by guiding the hand, allowing the designer to focus more on the design and worry less about accuracy. It takes advantage of digital accuracy while maintaining enough freedom to make way for the valuable hand-craft process. This could potentially open up an entirely new method of working in the design phase. Additionally, it allows multiple designs to be output directly from the same virtual model, so that an artist can iterate through many ideas while maintaining consistency.



Sketch Aid


Sketch Aid is an application designed to teach architecture and design students the meaning of developing a design in plan and section. Students often limit their design process to 3D modeling, which they then cut into sections to create their drawings. With this method there is no understanding of how to actually develop in section and plan, and thus of the relationships across the progression of a building. With this in mind, Sketch Aid challenges the user’s understanding of these drawing strategies and pushes the user to develop their skills in working this way. The videos show the initial struggle turning into a more fluid workflow as users spend longer with the tool and get the hang of the process.



Precedents For Thesis: Guided Hand

Here is a link to all my precedents in a Google Spreadsheet.

My favorite projects for reference are the ones that implement virtual/digital intelligence and physically manifest themselves as an aid to the fabrication process. Christian Fiebig, Amit Zoran, and Joong Han Lee all did projects where there is a very real integration of machine intelligence and accuracy with the design and creation process. I aim to keep their approaches in mind as I pursue my thesis, Guided Hand.



Guided Hand: Machine Aid for the Analog Process

Using a Haptic Phantom Touch device to guide the hand as it uses
a 3D Printing Pen, allowing for more efficient hand 3D printing.

FRFAF Request Justification

The majority of the money will go toward the Haptic Phantom Touch, which costs around $2,500. The rest would go toward purchasing the 3D printing pen and the equipment required to make a moving table-bed not unlike those found in a MakerBot, with the additional capability of rotation. This would require four motors, a rubber-footed belt, material for the table itself (the bed and the feet), and some nuts and bolts for assembly.

Description

Guided Hand will use the Haptic Phantom Touch’s ability to snap to virtual points in physical space and to the surface boundaries of virtual models. This snapping allows the hand to be physically guided. I will fix a 3D pen onto the Phantom so that the guided hand can 3D print with higher efficiency, drawing on virtual boundaries that do not physically exist. This project will explore the capabilities and limitations of the proposed system and conclude with a completed setup that allows for quick and accurate sculptures or prototypes.
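The snapping behavior can be sketched in code. This is purely an illustration and not the Phantom’s actual API: the guide acts like a virtual spring that pulls the pen tip toward the nearest point on the target surface (here, a cylinder standing in for a simple virtual model).

```python
import math

def snap_to_cylinder(tip, radius):
    """Nearest point to the pen tip on the surface of an infinite
    cylinder of the given radius, centered on the Z axis."""
    x, y, z = tip
    d = math.hypot(x, y)
    if d == 0:
        return (radius, 0.0, z)  # tip on the axis: any direction works
    s = radius / d
    return (x * s, y * s, z)

def guiding_force(tip, radius, k=200.0):
    """Spring force pulling the tip toward the surface (stiffness k).
    A real haptic loop would send this to the device at ~1 kHz."""
    sx, sy, sz = snap_to_cylinder(tip, radius)
    return (k * (sx - tip[0]), k * (sy - tip[1]), k * (sz - tip[2]))
```

A tip resting on the surface feels no force; the further it drifts, the harder the spring pulls it back, which is what makes the boundary feel like a physical guide.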

Abstract

This MTID thesis research explores the use of haptic augmentation in developing new artistic tools for sculpture. The 3D pen is currently more of a gimmick, an arts-and-crafts toy to be played with and then forgotten. By adding a layer of accuracy and efficiency while maintaining some level of freedom, the 3D pen has the potential to be used more seriously for fabrication, sculpture, and design. This could change the creation process entirely.

Narrative

1) Get Haptic Phantom Touch

2) Ensure Virtual Guiding Capabilities

3) Affix 3D Printing Pen

4) Prove Concept

5) Build Moving Table/Bed

6) Control Table and Virtual Model Simultaneously

7) Reconfirm Concept with Multiple Outputs

The project will contribute to the field by creating the opportunity to guide the hand through 3D space in the same way a ruler guides the hand in drawing. By setting physical limits on movement, snapping to points, and constraining physical movement to virtual surfaces, anyone can use this system to physically 3D print upon a sort of virtual scaffold. This allows for a design process where fabrication is instant and comes directly out of the virtual model. Unlike current systems, where the entire model must be printed, then observed, then changed virtually, then printed again, this system allows the user to change the print by hand as he prints, or even change the virtual model as he prints, so that the entire process is more symbiotic.

Relevance to FRFAF

This sort of ‘augmented virtual/physical’ process places limits on the print by guiding the hand, allowing the designer to focus more on the design and worry less about accuracy. It takes advantage of digital accuracy while maintaining enough freedom to make way for the valuable hand-craft process. This could potentially open up an entirely new method of working in the design phase. Additionally, it allows multiple designs to be output directly from the same virtual model, so that an artist can iterate through many ideas while maintaining consistency.

Process

This project will be broken down into sub-parts. The first step will be to acquire the Phantom and test its capabilities and limits. I will start by getting it to snap to a basic virtual model, such as a cylinder. I will then affix a 3D printing pen to the Touch and make sure it matches up accurately. After making proper adjustments and ensuring optimal performance, I will build a table that moves in the X, Y, and Z directions and rotates. The table will be connected to a 3D mouse, which will move the virtual model to match the movement of the physical table. With this system the user can use the left hand to control the position of the table and the right hand to 3D print on a guiding invisible ‘scaffold’. This could allow the table to turn as the pen guides the hand along the synchronously rotating virtual model’s surface. Using the table allows larger models to be built. It also allows artists to have a hand-crafted effect on their prototypes and sculptures, with physically designed output traced over virtual surfaces at high accuracy, down to the millimeter.
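The table-to-model synchronization described above can be sketched as a simple coordinate transform. This is an illustrative assumption about the bookkeeping, not the actual control code: as the table translates and rotates, every model point is mapped through the table’s pose so that the virtual scaffold stays locked to the physical print.

```python
import math

def table_to_world(point, table_pose):
    """Map a point fixed to the table (and to the virtual model riding on it)
    into world coordinates, given the table's pose (tx, ty, tz, theta),
    where theta is the table's rotation about the vertical Z axis.
    The haptic guide is then evaluated against these world coordinates."""
    x, y, z = point
    tx, ty, tz, theta = table_pose
    c, s = math.cos(theta), math.sin(theta)
    return (c * x - s * y + tx,
            s * x + c * y + ty,
            z + tz)
```

Re-running this mapping every control cycle is what would let the left hand move the table while the right hand keeps printing against a surface that, from the pen’s point of view, never drifts.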

Biography

I graduated with a B.Arch from Carnegie Mellon in May 2014. I knew before graduating that I did not want to be an architect anymore, but I did take on a great interest in computational design and fabrication processes. They seemed limited to me, very disjointed and cumbersome. I appreciate the direct output of a pen, and the way the artist can change a drawing in response to it as he draws. I saw a lot of digital tools that made the process more efficient, so students become very dependent on digital tools for their accuracy and efficiency. Why isn’t there a way to accomplish both? My father started a company that sells software that determines machine fabrication efficiency in real time, as the machine is doing the fabricating, and corrects errors on the go. Can we not do the same with hand-crafted fabrication? This project is meant to answer that question, and the answer is yes, we can.

Expected Project Outcomes

The audience is any designer, artist, or rapid prototyper. They will experience it first-hand at my first demo, which I hope to hold in a gallery or some other type of open house. Additionally, I plan to publish this project, although I haven’t yet looked into which publications exactly. Through this project I plan to grow my knowledge of machine aid for the analog process. Publishing it will showcase my capabilities and lend me some validity as I enter the field of design manufacturing.



Sketch Helper: Precedents, Ideas, Goals.

There are a couple of projects that were really inspiring to me as drawing aids. The first, Sketch it, Make it (now called Zotebook), comes from a Carnegie Mellon Computational Design student from years ago. The app allows you to draw quickly on your tablet and offers corrections and perfections as you go.

What is particularly nice about this application is that its output can go straight to a 3D printer or laser cutter to create quick models that were drawn up in minutes. It is a very useful and accurate way to ‘sketch’ prototypes and have them realized immediately.

Another precedent to look at is the NeoLucida, which incidentally was also started by someone who was at CMU, Pablo Garcia. The NeoLucida is essentially a camera lucida with some electronics involved. It allows the user to trace the reality seen through the lucida as it is projected, in a way, onto the paper.

This project is great for its unobtrusiveness in the analog process. It does not limit the user; it can only provide visual suggestion. This could inform how I think about the way my application will aid the drawing process.

People today rely quite heavily on the aid of technology, and in my opinion it is costing people the touch of the hand: the personal quality of mistakes, imperfections, and personality. This feels especially true in architectural renderings; the goal is to be realistic, but in being so, every image looks to me as if it could have been made by the same person. Nothing really stands out. I think it’s valuable to add personal flair to an image, in the style of the drawing, through the personal touch.

Even when this is not the case, when the drawer is simply trying to quickly output a sketch for an idea, a technologically produced image tends to come out looking very complete. The common thought is that completed, accurate drawings look resolved, and thus make the conceptual idea being communicated uninviting to critique by clients or peers. This is a real issue in the world of HCI, for example, when it comes to sketching user interfaces and testing them with surveys: respondents feel inclined to say less if the tested product looks more complete or finalized.

Essentially, then, what seems to be missing in the technologically aided drawing world is something that provides the efficiency that drew people into digital drawing, while maintaining the hand-drawn sketch feeling that can only be provided by the old-fashioned pen-and-paper method. My idea is to make that happen using the Equill pen combined with another technology, which I’m currently considering to be either a screen or a projector.

This is what I had up until a short while ago, using the Surface Pro 3 and its pen with the tip replaced by a piece of graphite. As you can see in the second video, there are plenty of comments to be made on how it can be improved.

In addition to Rohan’s comments in the video, I’ve also come to the conclusion that there should be a small space designated for construction drawing. As you saw, Rohan even drew a smaller version of what he was about to draw. It is common in perspective drawing to first draw a floor plan along with the viewpoint and viewing angle. I could use that basic plan, and possibly an additional section, to automatically construct the room for the user.
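To make the construction idea concrete, here is a minimal sketch of the projection such a feature would rest on, under axis conventions and a function name of my own choosing: given an eye position taken from the plan and a picture plane a fixed distance in front of it, each corner of the room projects to a point on the page, and the construction lines simply connect those points.

```python
def project_to_page(point, eye, plane_dist):
    """Perspective-project a 3D point onto a picture plane sitting
    plane_dist in front of the eye. The viewer looks down the +Y axis;
    X is horizontal on the page and Z is vertical.
    Returns (u, v) page coordinates."""
    ex, ey, ez = eye
    x, y, z = point
    depth = y - ey
    if depth <= 0:
        raise ValueError("point is behind the viewer")
    t = plane_dist / depth  # similar triangles: scale by plane_dist / depth
    return (ex + (x - ex) * t, ez + (z - ez) * t)
```

A corner twice as far from the eye as the picture plane lands halfway out from the center of vision, which is exactly the similar-triangles reasoning behind hand-drawn construction lines.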

Because that could be complicated and not quite intuitive for a beginner, I’m also considering an option for a default scaffolding that can be navigated, with the view selected before beginning the drawing process. This option allows the user to build off of the scaffold, which could inspire ideas and be easier to start from than a blank page.

I think both can be done. I am currently tapping into the Equill Pen SDK and trying to figure out how to take advantage of all of its features. I am hoping to move completely away from the Surface pen, because it is limited to the Surface itself, while the Equill can work on multiple platforms.



Sketch Helper

Pen with a graphite tip replacing the plastic one it comes with


SketchHelper is an app that aids the analog perspective-sketching process. The idea is that the artist can use the application to generate on-the-go construction lines. Currently the design of the interface is static and not technically optimal; there is much left to be done, and for this reason the project will be continued next semester with surveys, user testing, and design upgrades based on the feedback.

SketchHelper speaks to the growing world of intelligent machine aids in analog processes. The interest in these kinds of process designs is to take the quick, intuitive methods of making that people respond to in the living world, in real time, and to incorporate efficiency and insight, allowing the designer to create things, and in ways, that might not have been possible before. By doing the tedious work for the designer and intelligently responding to the designer’s gestural habits, this application will become embedded in the processes designers are already familiar with. I envision the work as analogous to a calculator in the math world: there to make math easier, not to replace the person’s thought process.

The first video shows the app in use at high speed. The second video shows someone newly introduced to the application providing what I considered to be extremely insightful feedback; it communicates the direction I intend to go as I continue this project.