Photography
Another "how the sausage was made" post
On another platform's photography group dedicated to aircraft, someone recently asserted that one or more of my posts were fake photos. I'm unsure whether the basis of the complaint was that they were too perfect, i.e., AI-generated, or that they were screenshots from a video game, as I suspect was the case with the Brazilian AF photo. I quickly assuaged their fears by posting the original RAW (Nikon .NEF) and all ended well. I thought I would share these, as some might find the sausage interesting.
My workflow starts in PS Camera Raw, where I have found one of the most important steps is Dehaze. A RAW file contains everything you can see and a lot you can't. This is where I accentuate or de-emphasize colors, bring up shadows, tone down highlights, and sometimes de-noise.
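For anyone curious what those adjustments amount to under the hood, here is a very rough sketch in Python using the open-source rawpy library. This is not Adobe's pipeline; the .NEF filename and every adjustment amount are made up, and the Dehaze step (a more involved local-contrast operation) is not attempted here.

```python
# Rough analogue of the Camera Raw steps described above: demosaic the RAW,
# lift shadows, pull highlights, and nudge saturation. All values are
# illustrative; "DSC_0001.NEF" is a hypothetical filename.
import numpy as np
import rawpy

with rawpy.imread("DSC_0001.NEF") as raw:
    rgb = raw.postprocess(use_camera_wb=True, output_bps=16, no_auto_bright=True)

img = rgb.astype(np.float32) / 65535.0            # work in 0..1 floats

# Simple tone tweaks: brighten the darkest pixels, compress the brightest.
img = img + 0.15 * (1.0 - img) * (img < 0.25)
img = img - 0.10 * img * (img > 0.85)

# Crude saturation boost: push channels away from the per-pixel luminance.
luma = img.mean(axis=2, keepdims=True)
img = np.clip(luma + 1.2 * (img - luma), 0.0, 1.0)

out8 = (img * 255).astype(np.uint8)               # ready to hand off for denoise/sharpen
```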
Topaz Labs is then used to de-noise further if needed, and to sharpen. My last post here was my 55-year-old slides for the museum display, which I had processed probably 10 years ago. I went back to my original .tif scans, ran them through this new workflow, and was blown away at the detail and color I could pull out of those scans with the newest software. Something I had never seen before was the diesel exhaust from those Army trucks leaving the DMZ. Topaz is nearing the holy grail of photo magic, IMO.
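And here is a similarly loose stand-in for that Topaz pass on a .tif scan, using classical OpenCV denoising plus an unsharp mask. Topaz's results come from learned models, so this only shows the shape of the step; the filename and strength values are invented for illustration.

```python
# Denoise then sharpen a scanned slide, OpenCV style. Not how Topaz works
# internally; just the same denoise -> sharpen order on a hypothetical scan.
import cv2

img = cv2.imread("dmz_scan_001.tif", cv2.IMREAD_COLOR)

# Non-local-means denoising; h / hColor control luminance and chroma strength.
den = cv2.fastNlMeansDenoisingColored(img, None, h=6, hColor=6,
                                      templateWindowSize=7, searchWindowSize=21)

# Unsharp mask: subtract a blurred copy to exaggerate edges.
blur = cv2.GaussianBlur(den, (0, 0), sigmaX=2.0)
sharp = cv2.addWeighted(den, 1.5, blur, -0.5, 0)

cv2.imwrite("dmz_scan_001_sharpened.tif", sharp)
```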
Check out the exhaust of that deuce and a half: https://democraticunderground.com/1036151761

progressoid
(51,971 posts)Unfortunately, this AI argument is going to continue for a while.
In the meantime, keep on keeping on!
CaliforniaPeggy
(155,166 posts)If I'm understanding what you're doing: You're not putting anything in the photo that wasn't already there. The pixels contain a lot of information and what you're doing is revealing or concealing that info.
I have zero problem with this.
The problem occurs when new data are inserted into the photo. That is something I disapprove of.
Am I getting what's going on?
HAB911
(9,846 posts)Nothing added, just push and pull of information that is there.
Way back when I was playing with AI, I did some fun stuff like this, where I took an original photo and asked generative AI to change it to a penguin, just for fun. Pretty obvious.
CaliforniaPeggy
(155,166 posts)
hunter
(39,955 posts)I have fewer objections to AI when it is used as a filter rather than a creative tool.
In most scientific uses AI is being used as a filter, not as a creative tool, distilling down more data than any human could process. Presumably humans will be able to discern the frequent artifacts this produces.
Topaz does produce discernible artifacts even at low levels of enhancement.
Human vision itself is a similar process. The human eye is a really bad, bad camera, which is why I sometimes mock Creationists who use it as evidence of some kind of miracle. If some god designed those optics they must have been very drunk, severely hung over, or in a huge hurry, kludging something, anything, together from parts in the junk box.
The magic of human vision is in the post processing. Topaz mimics that post processing, artifacts and all.
Of course AI fails completely as a filter for the internet itself because more than 90% of the internet is crap, not just noise but deliberate misinformation too. That's why AI has ruined search engines.
I think I'd like to play with Topaz but I'm not quite sure what kind of machine to build, preferably a machine built from scrap. The minimum specifications I've seen on their web site seem too low and I don't want to end up with something that's frustratingly sluggish.
I've played a little with web based AI image processing sites, using images I've got no privacy concerns about, and decided I wasn't going to change anything about the way I process photos.
HAB911
(9,846 posts)yes, garbage in, garbage out. Topaz just introduced something with a horrible name, "Wonder". I don't know where it is going, but I have almost the best junk Dell produces, ultra9 285k + an Invidia5080 overclocked GPU and I can't use Wonder. They are allowing free unlimited cloud processing but that, I stay away from. We are all being pushed, Nvidia is all in, Microsoft, Google, I can't seem to get away from it. Adobe swears on its mother's grave they don't scrape the internet or customers for images.
I've never used any of that external AI, even for "chatting."
HAB911
(9,846 posts)This Dell, in addition to the dedicated GPU, has an NPU Neural Processing Unit. PS integrates the NPU in some functions such as generative fill, Lightroom uses it for noise reduction in RAW files. Other apps like Capture One, Affinity Photo 2, Luminar Neo, and Adobe Premiere Pro also use the NPU.
Grumpy Old Guy
(4,081 posts)I think their earlier versions worked better. I've had to revert to a version of PhotoAI that is about two years old. The latest version requires 6gb on the graphics processor, and the onboard GPU in my laptop tops out at 2gb and can't be expanded. I have similar issues with Adobe Camera Raw, so I've been applying a lot of sharpening and denoising with Lightroom.
My workflow is similar to yours, and I usually apply a little dehazing as well. To me it seems to take the place of a polarizing filter.
My wife wants to get me a new computer, but there are other toys that I would rather have first.
FYI, lately I've been doing a lot of my processing on my 4k Samsung tablet. I love the display, the LR app integrates perfectly with my phone and desktop, and it's a hell of a lot easier to travel with. I use OTG (on-the-go) connectors for the file transfers which are incredibly fast. I should start a separate thread about this.
HAB911
(9,846 posts)Photography and processing power is now so intertwined it is unavoidable to run up against hardware limitations. I've always been an early adopter and just upgraded specifically for the reasons you mention. PCIe 5.0 with Nvidia Studio Driver and 16G is head and shoulders above the 3.0 I was using. But even so, like I referenced above, this new Topaz thing called Wonder, supposedly something they have been working on for "professionals" is so far unusable, like 30mins processing a photo. I don't have the patience, and can't afford that big boy GPU 5090 at $2500, lol.