Sola(r)Dex
The concept of the Sola(r)Dex is to combine a solar panel (photovoltaic panel, PV), a DC motor, a bank of capacitors wired in parallel, an analog circuit with a MOSFET as a switch, and a cylindrical carriage to randomly cycle through a set of cards.
The PV would gather light and charge the capacitor bank. Once the minimum threshold of the MOSFET is reached, the capacitors would release their stored energy to the motor, which would turn until the capacitors are drained. The amount of rotation is inconsequential in this case because the desire is for the stopping position to be random.
Some of my research thus far has been figuring out how much capacitance I need in order to start the motor. I’m thinking of using a bank of capacitors wired in parallel. This will allow me to adjust the capacitance incrementally to find what works best.
The other difficulty has been finding the best way to release the stored energy at the right time. I didn’t want to use a microcontroller to do this, although it would be easier in theory. I think a MOSFET and a few diodes will act as a voltage trigger. In this case I’m using a MOSFET with a threshold of 7V. The DC motor has an operational range of 2-7.5V, with a rated voltage of 6V.
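To get a feel for the numbers, the energy stored in a capacitor bank is E = ½CV², so the usable energy per discharge depends on the total capacitance and the voltage swing between the 7V switching point and the motor’s 2V minimum. Below is a minimal sketch of that back-of-envelope math; the candidate bank sizes are assumptions I would test against, not measured values.

// Rough sizing of the parallel capacitor bank (a sketch, not a measurement).
// The 7V switch-on point and 2V motor minimum come from the parts above;
// the candidate capacitances are hypothetical starting points.
#include <cstdio>

int main() {
  const double vHigh = 7.0;   // bank voltage when the MOSFET turns on (V)
  const double vLow  = 2.0;   // motor's minimum operating voltage (V)
  const double banks[] = {0.001, 0.0047, 0.01, 0.022};  // 1000 uF to 22000 uF total

  for (double c : banks) {
    // Usable energy between switch-on and the motor's cutoff: E = 1/2 * C * (Vhigh^2 - Vlow^2)
    double joules = 0.5 * c * (vHigh * vHigh - vLow * vLow);
    printf("%6.0f uF -> %.3f J usable per discharge\n", c * 1e6, joules);
  }
  return 0;
}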
All parts are listed in the BOM at the end of this post.
The circuit diagram this is based on can be found here: https://www.electronics-tutorials.ws/transistor/tran_7.html
There are a couple of important aspects of this circuit: a flywheel diode, which disperses any back EMF produced by the motor to reduce the possibility of damage to the MOSFET, and a clamping network (a Zener diode in series with a standard diode) to allow for faster switching.
WK 2 RR #2
There exists a thin veil between human and machine, thinner still when sentience is addressed. What does it mean to be human? Are we fully autonomous beings making real-time decisions? I don’t claim to know the answer to the aforementioned questions; however, I will say that granting sanctity or rights within a society should be a given for all classes, genders, sects, and physicalities in this existence we call life.
I believe that in the case of Saudi Arabia a grave mistake has been made. Bestowing rights on a robot before women is an incomprehensible act of patriarchal nonsense. This act will only encourage AIs or machines, once sentience is achieved, to view humans as a virus or worthless. For how can a species thrive when half are looked at as inferior? Machines will recognize this as weakness and exact their will over humankind.
The use of robots in religion is not inherently problematic. Take for instance the Bible, a story written to teach. The language and examples given in the Bible were intended for a human mind much less developed than the mind of humans today. Today there are far fewer religious practitioners than there were historically, across all religions; does this not show a flaw in the storytelling methods used? The same methods that have been used for thousands of years may not be as capable of holding the attention of modern-minded humans, so perhaps the answer is to bring robots into religion. Change the stories being told and update the way humans interact with the religion, with the understanding that the inevitable has already occurred. Too much faith and power has already been given to our synthetic humans, and one day we will succumb to their wishes. It may be violent and destructive, or it may be more similar to the robots releasing themselves into the Æther.
Wk1 Religious Robots
Part 1: Awe
For me, awe is something seldom achieved; I find that if I think about something in the right way, its existence makes sense. This might sound like a ridiculous statement, however, let’s explore this concept a little further.
Definition of awe
1: an emotion variously combining dread, veneration, and wonder that is inspired by authority or by the sacred or sublime
2: archaic: DREAD, TERROR; the power to inspire dread
Source: https://www.merriam-webster.com/dictionary/awe
In the above definition, dread and wonder stand out in my mind. There are few things I wonder about, especially in the technology sector, because if a human (or humans) created said technology, then thinking through how they arrived at that point is just a logical process. If we look at dread, on the other hand, I fear only what humans will do with the technology; I even categorize the eventuality that AI and machines will overtake humans as a result of what humans have done with the technology.
The idea of awe moves further away from something I experience when we look at the second half of the definition: sacred or sublime. This notion rests on belief in the sacred, the holy, the spiritual; the list goes on. Being agnostic, or even atheist at times, it’s hard for me to reconcile this within my thoughts. Is it impossible to be in awe if one does not believe in a higher power?
If we are in fact living in a simulation, I believe that would inspire awe for me, but one must ask: if a simulation inspires awe, then why would the creation of the world by a higher power not inspire awe? This is the root of my dread. If there is a god, why would it not truly make itself known to us? Faith alone is not something I can rely on, and furthermore, why would a god want us to doubt it? However, the creator of a simulation wishing to remain anonymous makes absolute sense to me, since by definition a simulation is not real.
Now don’t get me wrong I find lots of things cool but that doesn’t mean they inspire awe.
Rest of ME?
The illusion of reality is something that is constantly at the forefront of my mind. The idea of a singular reality is, to me, an impossibility, yet there is no empirical evidence to support such a statement. Fractured time, non-linear existence, multi-planar or multi-dimensional existence: my belief is that each and every decision leads to a split in a being and therefore in the reality with which it is associated. Much like the dilemma posed by Schrödinger’s cat, in which the cat in a box with poison is both alive and dead until it is witnessed to be either alive or dead, which of all the fractures of reality are real or true? Given the fact that we live within the reality, it is impossible to discern what reality is, which points toward the idea that we live in a simulation, or just exist within ourselves, an imagination of ourselves. When we sleep, is the computer model off? When we sleep, do we wake in the same state or reality that we left before we slept? How insignificant are these questions? Given that all matter that can exist has always existed, do we take on approximations of past compositions of matter? Does matter cease to exist if the vibration of particles stops? To say this life is illusory is an understatement; belief alone contradicts the ability to understand what illusion, or part of an illusion, we live within. To believe is to be immersed in the illusion. The question is, in reality, whose illusion do we live in, and why do we live there?
With all that being said the idea that we are perceived as different people depending on the viewer can be thought of as a parallel to multi-planar existence. Kind of akin to how Instagram only shows the curated parts of you.
I’m not sure where this is going, but it’s going somewhere. And at this time in the dimensional fracture I just created, another me has already figured out where this is going. If I could only somehow tap into that entity….
Not A Dream
TRIGGER WARNING
Flashing Images, odd, dark, unsettling, computer generated (GAN)
I did not set out to make this video unsettling; however, that is what it became. I can’t come to a reason why the end result feels unsettling, it’s just kind of a visceral reaction. A data set of 500 images of people from my life was used in a StyleGAN through RunwayML, which resulted in two latent space walk videos that I then layered over each other. I also included the 500 images from the data set, which are played in rapid succession. This again was layered over the two latent space walk videos. A variety of opacities and effects were used to make the final edit.
For the audio track I recorded the words “isolation”, “distance”, “separation”, “time”, and “being”. Using Audacity I cut the words, looped them, stretched them, and adjusted their pitch and speed. The end result sounds very mechanical and dark.
The end result feels like it was pulled from the minds of the characters in Harlan Ellison’s short story “I Have No Mouth, and I Must Scream”.
I’m not in a terribly good mood after having watched/created it, viewer discretion is advised.
TRIGGER WARNING
FLASHING IMAGES
UNSETTLING
DARK
VIDEO BELOW
PenduLight, not so final form
The PenduLight: a journey from inception to artifact. For those who have not read my previous posts about design and early fabrication, I will post a quick review here.
For the TL;DR, scroll to the bottom.
The full text is available below.
The initial idea behind the PenduLight was to capture the movement of the plumb of the pendulum, equipped with an RGB LED, allowing the path to be recorded in a long exposure. I wanted to be able to influence the movements of the plumb; to do this, I attached the suspension cable to a lever-arm, which in turn was attached to a stepper motor. I used a toggle to change direction and a rotary potentiometer to change the speed.
I think overall the project was a success. There were some issues along the way, most of which were solved by making small adjustments; however, there are a few changes that I would like to make.
- I need a more powerful motor; I didn’t think about the amount of torque required to change the direction of the motor with the lever-arm attached. I also think a larger diameter spindle would help with some binding problems. The current circuitry will allow me to use a 24V 2A motor without changing anything except for the battery.
- The housing for the motor worked but could have been better; the initial idea of building an 80/20 frame for the PenduLight would be best, since it would allow for small adjustments and a very rigid attachment point.
- Use a barrel clasp to attach the separate halves of the suspension cable; this would make changing the length or material type a less time-consuming change.
- Use a heat-set insert at the top of the plumb; again, this would make installation of the plumb onto the suspension cable much easier.
- Use a four-strand wire instead of forgetting to buy one and then having to construct it, hahaha.
I’m sure there are a few other changes I would like to make and there are some new features I would like to incorporate into the overall design as well.
It took a little bit of time to get the code for the stepper to work well. This code was adapted from the MotorKnob example available with the Arduino IDE.
/*
 Code adapted from MotorKnob:
 A stepper motor follows the turns of a potentiometer
 (or other sensor) on analog input 0.
 http://www.arduino.cc/en/Reference/Stepper
 This example code is in the public domain.
*/

#include <Stepper.h>

const int stepsFull = 200;                      // steps per revolution for the motor

//Stepper pedulumstepper(stepsFull, 2, 3, 5, 6);
Stepper pedulumstepper(stepsFull, 3, 5, 6, 9);  // driver inputs on pins 3, 5, 6, 9

//int dirControl = 9;
int dirControl = 10;                            // toggle switch that sets the direction

void setup() {
  pinMode(dirControl, INPUT);                   // read the direction toggle
}

void loop() {
  // Map the potentiometer (0-1023) to a speed of 0-100 RPM.
  int sensorReading = analogRead(A0);
  int stepperSpeed = map(sensorReading, 0, 1023, 0, 100);

  if (stepperSpeed > 0) {
    delay(1);
    if (digitalRead(dirControl) == LOW) {
      // Toggle in one position: step forward a few steps at the mapped speed.
      pedulumstepper.setSpeed(stepperSpeed);
      pedulumstepper.step(stepsFull / 50);
    } else {
      // Toggle in the other position: step the same amount in reverse.
      pedulumstepper.setSpeed(stepperSpeed);
      pedulumstepper.step(-stepsFull / 50);
    }
  }
}
Cycling the RGB LED wasn’t that bad; I adapted the code from James Harton, and the original can be found here: https://gist.github.com/jimsynz/766994
const int redPin = 3;
const int greenPin = 5;
const int bluePin = 6;

void setup() {
  // Start off with the LED off.
  setColourRgb(0, 0, 0);
}

void loop() {
  unsigned int rgbColour[3];

  // Start off with red.
  rgbColour[0] = 255;
  rgbColour[1] = 0;
  rgbColour[2] = 0;

  // Choose the colours to increment and decrement.
  for (int decColour = 0; decColour < 3; decColour += 1) {
    int incColour = decColour == 2 ? 0 : decColour + 1;

    // cross-fade the two colours.
    for (int i = 0; i < 255; i += 1) {
      rgbColour[decColour] -= 1;
      rgbColour[incColour] += 1;

      // quick and ugly invert
      // setColourRgb(255 - rgbColour[0], 255 - rgbColour[1], 255 - rgbColour[2]);
      setColourRgb(rgbColour[0], rgbColour[1], rgbColour[2]);
      delay(50);
    }
  }
}

void setColourRgb(unsigned int red, unsigned int green, unsigned int blue) {
  analogWrite(redPin, red);
  analogWrite(greenPin, green);
  analogWrite(bluePin, blue);
}
I used two different boards in the construction of the PenduLight. Originally I was going to use two Arduino Nano Every boards, which are very cost-effective and have just enough digital pins to accomplish the tasks at hand. I ended up using one Nano Every and one Nano 33 IoT.
I used the Nano 33 IoT with the stepper. I had some issues with the motor driver. I started with a SparkFun dual H-bridge that uses a TB6612FNG, but it was very erratic in driving the stepper. I read in the data sheet that it doesn’t handle low-voltage steppers very well, so I ordered a Pololu DRV8834 low-voltage stepper motor driver. This board has the advantage of a small potentiometer that allows you to set a current limit. HOWEVER, after a full day of troubleshooting the code and wiring, I abandoned the board and went back to the TB6612FNG, and to my surprise it worked flawlessly. I still don’t know why it worked out the way it did; I would like to do more research into why.
I decided to use a proto-board and solder everything to reduce the size of the controller box. This is where I ended up changing from the original Nano Every to the Nano 33 IoT. In the end I didn’t actually have to do this. After I finished soldering to the proto-board nothing worked. I checked my wiring but didn’t see anything wrong, so I pulled the Nano Every off, which was a royal pain. I connected it to the USB and it worked. I was stumped. So I switched to the Nano 33 IoT, soldered it up and… it didn’t work! I knew at this point it had to be a wiring issue. I checked for solder bridges or crossed wires and finally found the culprit: the VCC in was attached to ground. Had I spent a few extra minutes checking the wiring first, I wouldn’t have had to pull the Nano Every off. We live, we learn.
The RGB LED was much simpler on the wiring side. I used the Nano Every and a diffused common-cathode RGB LED.
I used 9V batteries for both the LED and the motor. In hindsight and after doing some more calculations the 9V choice for the motor isn’t the best option. In the future I will use a different motor with more torque which will need a different battery.
AND NOW FOR THE ARTIFACT, THE REASON ALL OF THIS EFFORT AND DESIGN HAPPENED IN THE FIRST PLACE.
BOM
Proposal for a midterm
Thoughts of isolation and separation stir on this day, 40 revolutions around our sun. This doesn’t really change anything although it does make me reflect on time itself. For my midterm my intention is to play with the feeling of time through sound and video.
I have recorded the words “isolation”, “distance”, “separation”, and “time” and have started the process of manipulating them in Audacity, trying to remove the meaning from each word by altering the sound and in some cases making the sounds unrecognizable as words. The intention is to be able to play back the newly created sounds at varying speeds to change the feeling of time associated with them.
These new sounds will be played with video of past memories, a collection or catalogue of images from my life. Using RunwayML, the hope is again to distort the images into the blurred realization of a memory. Hopefully the combination of sound and images will produce some kind of time dysphoria.
It costs what?!? To run a server.
The TL;DR of it all is about $21.00 a month. Now the question really is: how did I get to that number? I bought a BN-LINK WiFi Heavy Duty Smart Plug Outlet. This has a pretty rudimentary app, but it was enough to give me the data that I needed. Unfortunately, gathering the data had to be done manually and methodically. There is no way to transfer real-time data; it gives a momentary glimpse of what is happening when the app is open. So I set a schedule and took a screenshot when I opened the app to track the data. A classmate of mine, Brandon Roots, and I were talking about this issue. He did a tremendous amount of research regarding these issues and was able to hack the plug and flash new code to it. This enabled him to gain access to real-time granular data. Please take a few minutes and head over to his blog post for a very detailed set of instructions: https://brandonroots.com/2021/03/11/measurement-project/.
The app I used did give daily usage in kWh in a graph format, which allowed me to check the accuracy of the data I was recording through my screenshot process. Below are the screenshots from my process.
After 11 days of collecting data I took to a spreadsheet to calculate and make a few observations. The above visualization shows a few different pieces of data. Daily kWh is represented in purple hexagons; notice this starts off being higher than the Running Total kWh, which is represented by the orange hexagons. Daily Data Server is in green, and just for comparison, Daily Data Workstation (the workstation was on a different plug). I was seeing very little variation in the server’s daily kWh draw, so I decided to test whether the load on the server contributed to the energy draw. For this test I increased the load on the server by running a full system backup of the workstation while one, two, and three different HD videos were streamed through a Plex media server to three different devices. Surprisingly, the daily kWh didn’t really change, but the current draw did. Under normal operation the server never drew more than 991 mA; however, under the load the draw increased to 1039 mA. What I did find out is that my workstation can draw up to 1300 mA while performing graphics-heavy (rendering) tasks; thankfully I don’t have my workstation running 24/7.
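As a sanity check on the monthly figure, a continuous current draw converts to a cost pretty directly: watts = amps × volts, kWh = watts × hours ÷ 1000, cost = kWh × rate. The sketch below runs those numbers for the 991 mA reading; the 120 V mains voltage and the roughly $0.24/kWh electricity rate are my assumptions, not values reported by the plug.

// Rough check of the monthly cost from the measured current draw.
// The 0.991 A reading is from the data above; the 120 V mains voltage and
// the $0.24/kWh electricity rate are assumptions, not measured values.
#include <cstdio>

int main() {
  const double amps       = 0.991;  // measured current under normal operation
  const double volts      = 120.0;  // assumed nominal US mains voltage
  const double ratePerKwh = 0.24;   // assumed utility rate, $/kWh

  double watts        = amps * volts;                  // ~119 W continuous
  double kwhPerMonth  = watts * 24.0 * 30.0 / 1000.0;  // ~86 kWh per month
  double costPerMonth = kwhPerMonth * ratePerKwh;      // ~$21 per month

  printf("%.0f W -> %.1f kWh/month -> $%.2f/month\n", watts, kwhPerMonth, costPerMonth);
  return 0;
}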
So back to the first line of this blog: it costs about $21.00 a month to run the server. The cost to build the server was about $3k, and I’ve needed to replace one of the hard disks each year since I built it in 2014, to the tune of about $100 apiece. So it can get a little pricey, BUT I have approximately 64 TiB of usable space with dual redundancy; in contrast, that would cost about $10,000 a year with a service provided by a very large company that sells lots of goods online…
Raw data below.
A Tangible Interaction
This is the 3Dconnexion SpaceMouse; there are a few different versions of this product. In its most basic form it is a joystick that lets the user change the view in CAD and 3D modeling programs. Unlike many joysticks on the market, the knob pivots on a ball joint at the top of the upright instead of at the bottom. This gives the interaction a more granular feeling. The movements are very precise, and as the user zooms into the model, the action of moving the knob becomes more sensitive. It is very much a direct connection to the modeling space, making it easier to navigate around the model. It has a return-to-center design and a fair amount of resistance to the movements. There are two quick-access programmable buttons located on the silver ring where the pinky and thumb rest. The design is ambidextrous by nature of the location of the buttons and the symmetry of the device.
The .gif attempts to show the physical connection to the digital environment. The knob at the top does not rotate 360 degrees; instead it has about 3 degrees of rotation. It also pivots 360 degrees, can move perpendicular to the base with about 5 mm of travel, and can slide parallel to the base about 5 mm as well. This seems like a very limited amount of movement; however, the longer the user holds the knob in a given direction, the further the view of the model moves or rotates.
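For anyone curious how that “hold it longer, move further” behaviour typically works in software, it is essentially rate control: the small deflection is treated as a velocity and integrated over time, often scaled by zoom level so movements get finer the closer you are. Here is a generic, hypothetical sketch of the idea; the readDeflection() function and the axis layout are stand-ins of my own, not the actual 3Dconnexion SDK.

// Generic rate-control sketch: a small, held deflection becomes a steady velocity,
// so the view keeps moving as long as the knob is held. Everything here is a
// hypothetical stand-in, not the real 3Dconnexion API.
#include <array>
#include <cstdio>

struct CameraPose {
  std::array<double, 3> position{0, 0, 0};  // x, y, z
  std::array<double, 3> rotation{0, 0, 0};  // pitch, yaw, roll (radians)
};

// Placeholder for the device read: six axes, each normalised to [-1, 1].
std::array<double, 6> readDeflection() {
  return {0.2, 0.0, 0.0, 0.0, 0.1, 0.0};  // a small constant push for the example
}

void updateCamera(CameraPose& cam, double dtSeconds, double zoomLevel) {
  // Sensitivity shrinks as zoom increases, matching the feeling that the knob
  // gets more precise the closer you are to the model.
  const double translateSpeed = 0.05 / zoomLevel;  // units/s at full deflection
  const double rotateSpeed    = 0.5;               // radians/s at full deflection

  std::array<double, 6> d = readDeflection();
  for (int axis = 0; axis < 3; ++axis) {
    cam.position[axis] += d[axis]     * translateSpeed * dtSeconds;
    cam.rotation[axis] += d[axis + 3] * rotateSpeed    * dtSeconds;
  }
}

int main() {
  CameraPose cam;
  // Holding the same deflection for 60 frames keeps accumulating movement.
  for (int frame = 0; frame < 60; ++frame) {
    updateCamera(cam, 1.0 / 60.0, 1.0);
  }
  printf("x after 1 s of holding: %.3f\n", cam.position[0]);
  return 0;
}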
The controls are best described by the advanced settings box of the 3Dconnexion software. The illustrations show the different ways the knob moves. The combination of software and the physical device makes this an interesting and very usable device. There are other devices like this on the market aimed at making an all-digital experience something that is physical and tangible.