Saturday, February 15, 2025

6.2: Project updates

The two electronic components I've purchased for the project are this GPS module:

https://www.amazon.com/gp/product/B084MK8BS2/ref=sw_img_1?smid=A2QTZX14X1D97I&psc=1

and this microcontroller: https://www.amazon.com/dp/B0B94JZ2YF?ref=ppx_yo2ov_dt_b_fed_asin_title&th=1

I'm thinking I want to use something like ComfyUI (an AI image/video generation program that runs locally) to generate the visuals projected onto the lampshade. ComfyUI uses flow-based programming like Touch Designer does, so I found the interface easy to navigate without much of a learning curve.

I'll want to use APIs to get weather, lighting, and location data that will drive the generation of the visuals, but for now, these are examples of some images I generated with one of the pre-downloaded safetensors checkpoints.
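
As a rough sketch of what the weather piece might look like (this assumes the free Open-Meteo forecast API, which needs no key, plus hard-coded coordinates that the GPS module would eventually supply):

# Rough sketch: fetch current weather for a fixed lat/lon from the free
# Open-Meteo API. The coordinates are placeholders that the GPS module
# would eventually provide.

import json
import urllib.request

LAT, LON = 42.36, -71.06  # placeholder coordinates

url = (
    "https://api.open-meteo.com/v1/forecast"
    f"?latitude={LAT}&longitude={LON}&current_weather=true"
)

with urllib.request.urlopen(url) as resp:
    data = json.loads(resp.read())

current = data["current_weather"]
print(current["temperature"], current["weathercode"], current["is_day"])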

For the purposes of this project, I'll want to figure out how (if it's not too steep an undertaking) to create my own image library for the AI to draw from when generating images. I don't want my projections to read as AI images, so narrowing down the aesthetic the generator draws from will be important.

ChatGPT is my initial stop for gathering information on the steps required to turn geographic location data into a generated image, and it suggested some starter code for having API data drive ComfyUI.
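
A minimal sketch of that kind of script (this assumes ComfyUI is running its default local server on port 8188 and that a workflow has been saved in ComfyUI's API format as workflow_api.json; the node id "6" for the prompt text is a placeholder that depends on the exported workflow):

# Minimal sketch: queue a generation on a locally running ComfyUI server by
# POSTing a workflow (exported in ComfyUI's API format) to its /prompt
# endpoint. The filename and node id "6" are assumptions tied to a specific
# exported workflow.

import json
import urllib.request

COMFYUI_URL = "http://127.0.0.1:8188"

def queue_prompt(workflow):
    # POST the workflow; ComfyUI queues it and returns a prompt_id.
    data = json.dumps({"prompt": workflow}).encode("utf-8")
    req = urllib.request.Request(
        f"{COMFYUI_URL}/prompt",
        data=data,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())

# Load the exported workflow and swap in a prompt built from live conditions
# (hard-coded here as a stand-in for real API data).
with open("workflow_api.json") as f:
    workflow = json.load(f)

weather_description = "overcast, light rain, early evening"  # stand-in value
workflow["6"]["inputs"]["text"] = f"soft lamplight scene, {weather_description}"

print(queue_prompt(workflow))

The same script could then poll ComfyUI's /history endpoint for the finished image, but queuing the prompt is the core of the interaction.
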
 What I need to solve next is how ComfyUI can interact with Touch Designer and the electronics I've ordered. 

