Tangentially related: A hobby I found immensely rewarding as of late is to ask a generative model to draw something pretty and then try to make the object in real life.
I have previously asked dall-e to draw me steampunk watch-like gizmos, selected the prettiest and then I went and carved it out of wax. And right now I'm 3d modelling a copper owl jewellery design dreamed up by Midjourney.
It is a really exciting exercise because it makes me think about what I like about the image, what the "essence" of it is. It also stretches my maker muscles, because when I design things for myself from scratch I always keep in mind the constraints of my tools and techniques. But of course the AI has no such compunctions. It just draws something pretty, manufacturability be damned.
I've found it very rewarding as a source of fantastical ideas. I like that generating items in Midjourney can break me out of one line of thinking, even if that's because its result is absurd.
You should mention that this is your project by the way. You're pretending like it's not by writing they instead of we, while on your own website you state that you're co-founder of oio.
Wouldn't a more promising approach be to go through NeRFs first? So first figure out what you want the model to look like from various angles, and then try to make a textured 3d model out of it.
I don't understand: if you have a 3D model, why do these GIFs need to be so low-res and aliased? The AI isn't generating the GIFs, so why can't they properly render the previews in higher quality? The Magic3D models don't look that much more detailed; they're just rendered properly...
I actually thought this initially, but I also believe that, given where we are going (in the 3D space at least), it makes sense to open-source something that isn't commercially resellable (thinking of all the people who just repackage OpenAI APIs) and let the community iterate on it and make it better, perhaps?
Tip: If you're loading these .glbs in blender, add an "Attribute" node in the shader editor, put "Col" in the name field, and connect "Color" to "Base Color" in the material to see the colors.
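For anyone curious where those colors come from: the .ply files store them as per-vertex red/green/blue properties, which is what Blender's "Col" attribute exposes. A minimal sketch that reads them with nothing but the stdlib (the two-vertex file below is a made-up example, not actual model output):

```python
# Read per-vertex colors from a tiny ASCII .ply.
# PLY_TEXT is a hypothetical two-vertex file for illustration only.

PLY_TEXT = """ply
format ascii 1.0
element vertex 2
property float x
property float y
property float z
property uchar red
property uchar green
property uchar blue
end_header
0.0 0.0 0.0 255 0 0
1.0 0.0 0.0 0 255 0
"""

def vertex_colors(ply_text):
    lines = ply_text.strip().splitlines()
    n_verts = 0
    props = []
    i = 0
    # Walk the header, collecting the per-vertex property names in order.
    while lines[i] != "end_header":
        parts = lines[i].split()
        if parts[:2] == ["element", "vertex"]:
            n_verts = int(parts[2])
        elif parts[0] == "property":
            props.append(parts[-1])
        i += 1
    colors = []
    for line in lines[i + 1 : i + 1 + n_verts]:
        vals = dict(zip(props, line.split()))
        colors.append((int(vals["red"]), int(vals["green"]), int(vals["blue"])))
    return colors

print(vertex_colors(PLY_TEXT))  # [(255, 0, 0), (0, 255, 0)]
```

A real parser should handle binary PLY and arbitrary property layouts; this only covers the simple ASCII case.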
I'm curious: how much more difficult is generative 3D?
We are barely able to generate 2D 1080p images without artefacts using diffusion methods. How much more complex is the jump to production-grade 3D?
As someone creating re-usable 3D product models on Polymock @ https://polymock.com the biggest issue for me is that you want your models to be editable. 3D modellers want to post-process and add tweaks to existing 3D models. Current AI generators lack this layering structure.
What I actually want is a 3D generator for textures/materials. It should be a much simpler task for current AI.
At the moment, the state of 3D generators is slowly ripening for indie game developers. I'm sure that in the near future, the models will be high-enough quality to generate a basic character set.
Others have mentioned it, but I think NVIDIA has an edge in 3D synthesis
& rendering. OpenAI is arguably the leader in consumer text transformers - I wonder what the goal is for releasing this POC (I don't think they will allocate any serious resources here)
Seems extremely similar to the Point-e work done a few months ago.
I would imagine that just doing Vietoris-Rips on the Point-e output would get close to this solution as well, although that would be a bit too logic-based for the ML world we live in now.
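For anyone unfamiliar, the 1-skeleton of a Vietoris-Rips complex just connects every pair of points within some epsilon of each other, which is one crude way to recover connectivity from a point cloud. A toy sketch of that step (the square point cloud and epsilon values are made-up examples):

```python
# Build the 1-skeleton of a Vietoris-Rips complex: an edge for
# every pair of points at distance <= eps.
from itertools import combinations
from math import dist

def rips_edges(points, eps):
    return [(i, j)
            for (i, p), (j, q) in combinations(enumerate(points), 2)
            if dist(p, q) <= eps]

square = [(0, 0), (1, 0), (0, 1), (1, 1)]
# With eps = 1.0 only the four sides connect, not the diagonals.
print(rips_edges(square, 1.0))  # [(0, 1), (0, 2), (1, 3), (2, 3)]
```

Going from these edges to an actual surface mesh (filling in triangles, choosing a good epsilon for a noisy point cloud) is the hard part, which is presumably why learned approaches are used instead.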
Every time I click Samples.md and then click back, GitHub tells me "Access to this site has been restricted." and times me out for a while. Anybody have the same thing or know why?
> How many more steps have to occur between these models and me printing them?
Shouldn't be too hard. Based on the code of the examples it seems it can output polygonal meshes in ply format. Either your slicer natively supports that, or you can use something (for example blender) to convert the ply to a format your slicer supports.
What can go wrong is if the model "looks okay" but has some subtle geometric issue (like a missing face, which makes it non-manifold or non-watertight). Nowadays many slicers have tools to heal such problems, but there can be dragons that way.
If that is representative, it is very nice, clean geometry. I sliced it with PrusaSlicer with no problem, and I could be printing it within a minute of downloading it if I needed a 3D printed cactus in my life.
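If you want to check for the non-manifold issue mentioned above before slicing: in a watertight, manifold triangle mesh every edge is shared by exactly two faces, which is cheap to verify from the face list alone. A sketch (the tetrahedron is a made-up example, not generated output):

```python
# Sanity-check a triangle mesh before slicing: every edge in a
# watertight, manifold mesh appears in exactly two faces.
from collections import Counter

def is_watertight(faces):
    edges = Counter()
    for a, b, c in faces:
        for u, v in ((a, b), (b, c), (c, a)):
            # Normalize edge orientation so (u, v) == (v, u).
            edges[(min(u, v), max(u, v))] += 1
    return all(count == 2 for count in edges.values())

tetra = [(0, 1, 2), (0, 1, 3), (0, 2, 3), (1, 2, 3)]
print(is_watertight(tetra))       # True: closed surface
print(is_watertight(tetra[:-1]))  # False: removing a face leaves boundary edges
```

This catches holes and edges shared by three or more faces, but not self-intersections, which need a proper mesh-repair tool.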
> I just got a 3D printer but I'm a total noob.
How noob do you mean? If this would be literally your first print, I would recommend first printing a sample file that came with your printer (just to make sure everything is okay with the printer itself). But as a second print you can totally try to print one of these, of course. Just don't give up the hobby if something goes wrong :) Something always goes wrong.
I successfully printed a hollow cube yesterday, my first ever print, and failed on about 6 attempts at a "mario ring" file. It's PETG and I'm just fiddling with settings, getting a lot of stringing, etc.
Anyway, very new, but I'm prepared for the learning curve. I'm a good debugger, and I don't give up easily, so hopefully that'll carry through to the printing.
I got into 3D printing from zero knowledge a couple of months ago, and I found PETG an absolute nightmare to deal with compared to PLA.
Mind you, I only bought filament from Prusa and AtomicFilament, two very respectable manufacturers. I had zero issues with their PLA, and it looked beautiful. With PETG, no matter how much I tried adjusting live-Z levels or playing in PrusaSlicer with print speed, temperature, layer height, etc., it would all mess up at some point. Out of 5 prints of the same object, I finally managed to get one great print out of PETG. But when I tried printing another one, I would start having various issues again and would have to adjust things again in hopes of eventually getting it right. I just went back to PLA and have had zero issues at all since then.
Even though I am aware of PLA disadvantages compared to PETG, I simply realized that for my purposes, they didn't matter nearly as much as consistency and ease of good prints.
I don't want to deny your experience but I want to provide a counterpoint. I actually find PETG easier and more reliable to print with than PLA, aside from a little bit more stringing. It's especially good for large objects because the warping is very minimal, and it has more strength and flexibility than PLA, and better inter-layer bonding strength, so I don't have to worry as much about brittle failures or print orientation. I use a Prusa MK3S+, with Prusa's textured sheet. After setting the first-layer Z height properly and cleaning the bed with an alcohol pad, it works flawlessly basically every time. I've used both Prusa's PETG and 3rd party filaments with equal success.
After many years of falling out of love with 3D printing, PETG helped me fall back in love with it.
Thanks for providing a detailed counterpoint, gave me some ideas. Next thing I will try is a textured sheet.
For additional context, my main issues with PETG were prints detaching from the printing bed midway through (I fixed that one by adding brims to my prints plus using printing glue on the bed, and a textured sheet would help with that even more), and, more major, the nozzle picking up strings, with the material eventually sticking and accumulating to the point of the nozzle getting clogged on the outside and unusable (recovering from which required aborting the print and doing the 10-minute-long cleaning ritual with heating up, rubbing alcohol, brushes, etc.).
This might sound like an ad, but the textured sheet (https://www.prusa3d.com/product/double-sided-textured-pei-po...) is somehow a miracle when it comes to printing PETG. I can't get PETG to work on a smooth bed either, but with the textured sheet, it works every time. All it takes is to keep the bed clean with an alcohol wipe, and warm it up to ~85 C during the print. Not only does it hold very well during printing, it practically slides right off the bed when the print is finished and the bed cools down.
PETG does tend to curl and stick to the nozzle, that's true. I find the main culprit is if it oozes while the printer is warming up. While I'm waiting for the printer to reach temperature and the print to start, I make sure to remove any filament that might be dribbling out of the nozzle using pliers or a wire brush until either the print starts or the nozzle pressure decreases enough for the oozing to stop. As long as the nozzle is clean when the print starts, it shouldn't have an opportunity to gather up more material during the print. On the other hand, when 3D printing, getting off to an imperfect start will always tend to snowball toward a terrible end.
Yep, I actually ordered that exact same textured sheet you linked, and it got delivered a couple weeks ago, but I have yet to try it. I still want to make PETG work for me, as the prints I actually managed to finish with it were great.
> While I'm waiting for the printer to reach temperature and the print to start, I make sure to remove any filament that might be dribbling out the nozzle using pliers or a wire brush until either the print starts or the nozzle pressure decreases enough for the oozing to stop.
Yeeeeppp, been doing that exact same thing, and it was working quite well. But not going to lie, I felt a bit destroyed on the inside when I still got the nozzle-clogging issue about 10 hours into a 16hr print I had going. I wrote it off as a one-off, but I have been hesitant to try another 16hr-long print with PETG since then (until I at least get the textured sheet set up).
Thanks for your comments in this thread btw, they definitely highlighted quite a lot of issues I had and potential workarounds.
> I don't want to deny your experience but I want to provide a counterpoint.
I heard that PETG is more sensitive to humidity from the air than PLA. So maybe that is the difference? Maybe you live in a drier climate and the person you are responding to has more moisture in the air?
Could have been, but they just replied that they live in the pacific northwest, which is, incidentally, where I live as well. So that shouldn't be a factor.
I'll check out PLA if I can't get PETG going properly. I also have some TPU to play with, but haven't yet.
Thanks!
I'm also going to re-run the cube to see if the same settings work a second time, prompted by your comment. I've really only had failures with the one model.
I'm also pretty new, and in my experience, when it gets tricky is support. Quality models for 3d printers are made to print without supports, but a lot of these normal models have a lot of weird details and geometries that require supports. And then you gotta fiddle with a bunch of settings to find the right parameters for it.
I'm not sure what format it's generating these models in, but if they're STL, you should be able to drop them into a slicer program (like PrusaSlicer, Cura, etc.) and generate the gcode there (the 3D printer movement instructions). Then load that gcode into your printer (via SD card, wifi, whatever it supports) and go!
If these aren't STL, but are some format Blender can open, then Blender can turn them into STL. It looks from the description like it's compatible with Blender.
From the jupyter notebook in the repo [0], the output is in .ply or polygon file format which is used by some 3D scanners and can be converted to STL supported by most 3D printers.
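For the curious, the conversion step itself is simple enough to sketch by hand for a toy ASCII .ply with only positions and triangular faces (real model output has extra per-vertex properties like colors, which this ignores; the single-triangle file below is made up):

```python
# Hedged sketch: convert a minimal ASCII .ply (xyz vertices +
# triangle faces only) into ASCII STL, which most slicers read.

def ply_to_stl(ply_text, name="model"):
    lines = ply_text.strip().splitlines()
    n_verts = n_faces = 0
    i = 0
    # Parse just the element counts from the header.
    while lines[i] != "end_header":
        parts = lines[i].split()
        if parts[:2] == ["element", "vertex"]:
            n_verts = int(parts[2])
        elif parts[:2] == ["element", "face"]:
            n_faces = int(parts[2])
        i += 1
    body = lines[i + 1:]
    verts = [tuple(map(float, l.split()[:3])) for l in body[:n_verts]]
    out = [f"solid {name}"]
    for l in body[n_verts : n_verts + n_faces]:
        idx = list(map(int, l.split()))[1:4]  # first number is the vertex count
        out.append("  facet normal 0 0 0")    # slicers recompute normals anyway
        out.append("    outer loop")
        for j in idx:
            x, y, z = verts[j]
            out.append(f"      vertex {x} {y} {z}")
        out.append("    endloop")
        out.append("  endfacet")
    out.append(f"endsolid {name}")
    return "\n".join(out)

# Hypothetical one-triangle PLY for illustration.
PLY = """ply
format ascii 1.0
element vertex 3
property float x
property float y
property float z
element face 1
property list uchar int vertex_indices
end_header
0 0 0
1 0 0
0 1 0
3 0 1 2
"""
stl = ply_to_stl(PLY)
print(stl.splitlines()[0])  # solid model
```

In practice you'd just let Blender or a mesh library do this, since they handle binary PLY, quads, and non-trivial property layouts.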
Ube has been an increasingly trendy/popular flavor for all sorts of dessert food in America; this is perhaps most evident if you visit Hawaii, but it can also be found in pretty much any bubble tea shop as a flavor option. ubaehawaii makes an ube crinkle cookie that is as delicious as it is purple. I’m not affiliated or anything, just a fan of good cookies (and ube!)