
A lot of tech people online also don't know how to examine their own feelings, and so think they are mysterious and undefined.

When really they're an actual feedback mechanism that can be quantified just like any control loop. This whole 'unknowable qualia' argument is bunk.
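To make the control-loop comparison concrete, here's a minimal sketch of a quantified feedback loop (the thermostat framing, gain, and numbers are my own illustrative assumptions):

  # Illustrative feedback loop: an error signal drives a
  # proportional correction, and every step is measurable.
  def control_step(setpoint, measured, gain=0.5):
      error = setpoint - measured   # the "feeling": how far off we are
      return gain * error           # the response, proportional to it

  temp = 15.0
  for _ in range(10):
      temp += control_step(setpoint=21.0, measured=temp)
  print(round(temp, 2))  # converges toward the 21.0 setpoint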

If they're unknowable, are they not metaphysical, and thus something we should discard when reasoning about this?

What's the difference between qualia and a soul?


Qualia are phenomenal properties of experience. A soul is something some religions claim exists outside of measurable physical reality and represents the "essence" of an organism, implying that consciousness is some divine process and conveniently letting us draw lines around whom and what we can and can't morally kill.

Qualia can be an entirely physical phenomenon and are not loaded with theological baggage.


If they're entirely physical, what's the argument that multimodal models don't have them? Is it continuity of experience? Do they not encode their input into something that has a latent space? How does that differ from experience?

They can be physical, but I'm not claiming to know definitively. The lines are extremely blurry, and I'll agree that current models have at least some of the necessary components for qualia, but again lack a sensory feedback loop. In another comment [0] I quote myself as saying:

  As an independent organism, my system is a culmination of a great deal many different kinds of kins, which can usually be broken down into simple rules, such as the activation potential of a neuron in my brain being a straight-forward non-linear response to the amount of voltage it is receiving from other neurons, as well as non-kins, such as a protein "walking" across a cell, a.k.a continuously "falling" into the lowest energy state. Thus I do not gain any conscious perception from such proteins, but I do gain it from the total network effect of all my brain's neuronal structures making simple calculations based on sensory input.
which attempts to address why physically-based qualia don't invoke panpsychism (a toy sketch of that "simple rules" idea follows below).

[0] https://news.ycombinator.com/item?id=46109999
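As a toy illustration of the "simple rules" claim in that quote, a neuron's non-linear response to incoming voltage can be sketched in a few lines (the weights, threshold, and sigmoid choice are assumptions for illustration, not a model of real neurons):

  import math

  # Toy neuron: sum the "voltage" arriving from upstream neurons,
  # then apply a straightforward non-linear (sigmoid) response.
  def activation(inputs, weights, threshold=1.0):
      voltage = sum(w * x for w, x in zip(weights, inputs))
      return 1 / (1 + math.exp(-(voltage - threshold)))

  print(activation([0.2, 0.9, 0.4], [1.5, 0.8, -0.3]))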


I do think AI will have them. Nothing says it can't. And we'll have just as hard a time defining them, measuring them, and arguing over whether they're real as we do with humans.

I don't know if LLMs will. But there are lots of AI models, and when someone puts one in a continuous learning loop with goals, it will be hard to argue it isn't experiencing something.


The color Red is often used as the example. A human can experience 'Red', but 'Red' does not exist out in the universe somewhere; 'Red' doesn't exist outside of someone experiencing 'Red'. I think philosophers are just using the word qualia to name this 'experiencing' of inputs.

But it is still just a way to try to describe this process of processing inputs from the world.

It isn't metaphysical, because it can be measured.

I might have said 'unknowable' a little flippantly.

I just meant that in these arguments, some people start using 'qualia' to actually mean extreme things, like our mind creating the universe or something.

It's one of those words that isn't defined well.


How is it measured?

Can someone who's never seen red hallucinate something and assume it to be red? What if that hallucinated red happens to be exactly the red they would see if they actually saw red?

Can you reproduce this feeling in someone by doing something to their physical body without showing them red?

If so, how does it differ from the latent encoding you'd get by uploading an all-red PDF to your favorite multimodal model?

Instead of doing that Socratic bs you see a lot here, I'll be more direct:

Until there are some useful lines that can be drawn to make predictions, I won't accept using a fuzzy concept to make statements about classification; it's an ever-shifting goalpost.

There are answers to my legitimate questions above that would make me consider qualia useful, but when I first learned about them, they seemed fuzzy to the point of being empirically useless. It seems like a secular attempt at a soul.

Now, obviously, if you're trying to describe something with experience, it needs some actual memory and needs to process sensory input. Current generative AI doesn't have a continuity of experience that would imply whatever qualia could mean, but I find it hard to definitively say that its encodings for image-related stuff aren't qualia if we don't have hard lines for what qualia are.
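For what it's worth, the "encoding of an all-red input" is at least a concrete, inspectable object. Here's a sketch using CLIP via the transformers library (the model choice, and a solid-red image standing in for the PDF, are my assumptions):

  from PIL import Image
  from transformers import CLIPModel, CLIPProcessor

  # Encode a solid-red image into a latent vector. Whether that
  # vector counts as "qualia" is the open question; the vector
  # itself is perfectly measurable.
  model = CLIPModel.from_pretrained("openai/clip-vit-base-patch32")
  processor = CLIPProcessor.from_pretrained("openai/clip-vit-base-patch32")

  red = Image.new("RGB", (224, 224), (255, 0, 0))
  inputs = processor(images=red, return_tensors="pt")
  embedding = model.get_image_features(**inputs)
  print(embedding.shape)  # torch.Size([1, 512])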


I can feel an object and say 'it's hot' on a scale of 1-10, with the actual temperature known. And I can do that multiple times to get a sample, then do it with multiple people.

You can then get a distribution of what people think is 'hot' versus 'cold', what is icy versus bearable.
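A sketch of what that measurement could look like (all data invented purely for illustration):

  import statistics

  # Invented sample: several people rate objects of known
  # temperature (C) on a 1-10 "hotness" scale.
  ratings = {40: [4, 5, 4, 6, 5],
             60: [7, 7, 8, 6, 7],
             80: [9, 10, 9, 9, 10]}

  for temp_c, scores in ratings.items():
      mean, spread = statistics.mean(scores), statistics.stdev(scores)
      print(f"{temp_c}C -> mean {mean:.1f}, stdev {spread:.2f}")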

When you go to a doctor's office and they ask you to rate your pain on a scale, do you think that is completely bogus?

It isn't exact, but you can correlate between people. Yes, redheads feel more pain; there are outliers.

But a far cry from metaphysical.

The problem here is the word 'qualia'. It's just too fuzzy a term.



