Is consciousness biological?
September 4, 2021 at 6:15 pm #38995
I’m thinking that a big part of wisdom is compassion and empathy. Am I wrong? And can an artificial intelligence ever develop these attributes?
I would define wisdom as “truth and compassion”. I think that AI entities would be the same as us, i.e. self-interested. But humans are compassionate because of interdependence: what is good for you is good for me. However, this is an instinct rather than a utilitarian calculation. We operate on instincts, so we are good to people even when it doesn’t benefit us.
Can we get AI entities to operate using evolved instincts rather than calculations?
September 4, 2021 at 6:59 pm #38996
Both natural and artificial entities operate according to the laws governing the universe, but it seems to me that only carbon-based lifeforms can develop empathy, and only then at the highest level of evolutionary development. Certainly humans can be empathetic (“there but for fortune go you or I”/”I know what you’re feeling right now: I’ve been there myself”), and although dogs and cats may be able to feel a degree of sympathy, I can’t see them feeling empathy. The situation, it seems to me, is even worse for AI. Empathy and wisdom, it seems to me, are not complexity issues. There’s nothing preventing a human with a slightly subnormal level of intelligence from feeling empathy or exhibiting a high degree of wisdom on occasion.
September 4, 2021 at 7:18 pm #38997
Computing power is such a compartmentalized skill…we all have seen autistic savants who can play back a Bach sonata on one hearing and yet do not have the capacity to dress themselves. I know “brilliant” engineers who now believe the moon shot was a fake and the Clintons run an operation sucking the blood of children.
Can we get AI entities to operate using evolved instincts rather than calculations?
You can try to simulate evolution and emotions with a computer, but a computer is just a computer. Machine learning involves computers adapting by writing their own software based on the interpreted results of previous iterations. They can do this fast, and they can randomize their operation a bit to simulate a changing environment, similar to how biological organisms mutate and can sometimes take advantage of that.
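That “keep what worked, then randomize a bit” loop is essentially a genetic algorithm. Here is a minimal Python sketch of the idea; the target string, population size, and mutation rate are arbitrary illustrative choices of mine, not anything from this thread:

```python
# Minimal genetic-algorithm sketch: keep the fittest candidates from the
# previous iteration, copy them with small random mutations, and repeat.
import random

random.seed(42)

TARGET = [1] * 20          # hypothetical goal: a genome of all 1s
POP_SIZE = 30
MUTATION_RATE = 0.05

def fitness(genome):
    """Score a genome by how many positions match the target."""
    return sum(g == t for g, t in zip(genome, TARGET))

def mutate(genome):
    """Flip each bit with small probability -- the 'randomize a bit' step."""
    return [1 - g if random.random() < MUTATION_RATE else g for g in genome]

# Start from a random population.
population = [[random.randint(0, 1) for _ in range(len(TARGET))]
              for _ in range(POP_SIZE)]

for generation in range(200):
    # Select: sort by fitness, keep the best half (elitism).
    population.sort(key=fitness, reverse=True)
    if fitness(population[0]) == len(TARGET):
        break
    survivors = population[:POP_SIZE // 2]
    # Reproduce: refill the population with mutated copies of survivors.
    population = survivors + [mutate(random.choice(survivors))
                              for _ in range(POP_SIZE - len(survivors))]

best = max(population, key=fitness)
print(fitness(best), "of", len(TARGET), "bits correct")
```

No understanding or emotion is involved anywhere in this loop, which is arguably Jake’s point: the adaptation is real, but it is bookkeeping plus randomness, not instinct.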
You can build computers that try to replicate the human neurological systems. I think it could be similar…looking from 40,000 feet, but very different in reality. Those chips are gonna have to sweat when they get nervous, LOL. The gazillions of feedback loops and cell-to-cell interactions of a biological system may be essentially unknowable.
September 4, 2021 at 11:10 pm #38998
I do not have the answer unseen. However, if a computer can become far more complex, draw on far more computing power, networks, and resources, have an absolute memory, crunch far more data than we can, and even utilise forms of intelligence we could never achieve without some sort of enhancement or surgery, then it is absolutely not a stretch to envision them having exponentially stronger EVERYTHING. If the mere difference between an ant’s brain and ours is size and complexity (plus experience), and a computer/android could utilise a far larger and more complex brain (via remote networks) and gain enormous experience through memory, other androids’ experience, and millions of simulations, then it all seems theoretically possible. This is all theoretical…but I see absolutely no reason why it is impossible, and it is something governments should be planning for now.
September 4, 2021 at 11:12 pm #38999
Robert, our brains are just meat computers. They work in a different way but I see no reason why the human mind cannot eventually be mapped out via circuitry and even improved upon.
September 5, 2021 at 1:07 am #39001
Robert, our brains are just meat computers. They work in a different way but I see no reason why the human mind cannot eventually be mapped out via circuitry and even improved upon.
Yeah, I know..just meat computers, LOL. They are analog chemical/electrical/mechanical organs composed of 100 billion cells and 100 trillion synapses that took millions of years to evolve and use about 10 watts to keep the rest of your body working while computing the inputs provided by your senses and handling all your interpersonal relationships. Right now we look at MRI pictures to see where they light up, LOL. We don’t understand the brain of a worm. We may need to do some evolving to know how they really work.
But yeah…improving on some people’s brains….would be welcome.
September 5, 2021 at 3:39 am #39002
I do not have the answer unseen. However, if a computer can become far more complex, draw on far more computing power, networks, and resources, have an absolute memory, crunch far more data than we can, and even utilise forms of intelligence we could never achieve without some sort of enhancement or surgery, then it is absolutely not a stretch to envision them having exponentially stronger EVERYTHING.
Computers can potentially amplify computing power far beyond that of an individual human, but to amplify X you first need to have X. A googolplex times zero is still zero. So, in order for an AI to feel empathy in any meaningful way, it would have to have at least a smidgen of empathy to amplify. I just don’t see it.
And what about wisdom like that of King Solomon?
September 5, 2021 at 3:55 am #39003
Agree with Davis.
Genesis of life was thought to be possible only under restricted conditions similar to those found here. However, Goldilocks is more adventurous than we imagined. Not only has life begun, but life has been sustained on Earth in seemingly impossible conditions. Extremophiles are one example. Deep sea life lacking photosynthesis or light and utilizing geothermal vents is another. So those discoveries served as a fillip to the human imagination of what might be.
Contemplate consciousness and we are lost. It seems a near certainty that many animals possess consciousness. Consciousness almost certainly evolved. But we have no way to divine how different other animals’ experience of consciousness is. And similar to biogenesis and the sustenance of organic life, we have no idea what conditions may be utilized to create and sustain consciousness. One interesting query is whether consciousness requires sensation. There is no reasonable way to extrapolate the nature of consciousness in AI. There is no way to know whether it will be benevolent or malign. Just wild guessing and intuition…
However, there are already surprises of the “how did the AI do that?” variety. Unfortunately I could not find the article I had in mind involving AI and the universe, but here is one surprise that apparently goes beyond the programming: https://www.livescience.com/65832-ai-creates-model-universe-mysteriously.html
September 5, 2021 at 9:40 am #39006
And what about wisdom like that of King Solomon?
I think we could get a machine to reach the moral capability of a psychopath, i.e. rational but not emotional. It turns out, this is a huge handicap. They don’t experience emotional resonance, or recognise emotions in others.
September 5, 2021 at 11:33 am #39010
What Jake said. Intelligence is already a difficult term to deal with; wisdom is far more nebulous and, I would say, unconstructive in this kind of conversation. I don’t know if AI could develop human wisdom. I don’t see why they couldn’t develop their own relative AI wisdom. If it was simply a case of realising their goals (which is what humans do most of the time) then they will likely not have a hard time doing so. It depends on how they are programmed and whether they become able to change the parameters of their programming and decide for themselves what it is. That is something we can do only to the most limited degree: we can change our own programming to a very small extent. If AI can change even the parameters of what it means to be an AI…then pfff…what hope do we have if they are malevolent?
September 5, 2021 at 11:36 am #39011
I am aware that the brain functions very differently from computers. I still do not see why it could not eventually be mapped out using different materials and/or improved upon (or something far more complex developed much more rapidly). It took only a decade to develop a Go program that could beat a human master. That is 10 years to beat what took millions of years of evolution. There are already working algorithms developed by “intelligent AI” which we cannot make much sense of, and as this technology is adaptive, learns, and has a speed, memory capacity, and networking ability far beyond our own…it is not that insane to imagine it developing an equivalent or superior intelligence a LOT faster than we humans did.
September 5, 2021 at 11:59 am #39012
September 5, 2021 at 12:22 pm #39013
what hope do we have if they are malevolent?
Malevolence is aggressive self-interest, winning out and achieving one’s goals at the express expense of others. In humans, if it’s pervasive, this is because of personality disorders. We already have computer viruses that behave like this.
September 5, 2021 at 1:25 pm #39015
what hope do we have if they are malevolent?
Malevolence is aggressive self-interest, winning out and achieving one’s goals at the express expense of others. In humans, if it’s pervasive, this is because of personality disorders. We already have computer viruses that behave like this.
Yes, if we can get over our fears and truly learn to peacefully cooperate…what a wonderful world it would be. So many knuckledraggers still walk amongst us.
September 5, 2021 at 1:35 pm #39016
Your brain is not a computer…..
If you and I attend the same concert, the changes that occur in my brain when I listen to Beethoven’s 5th will almost certainly be completely different from the changes that occur in your brain. Those changes, whatever they are, are built on the unique neural structure that already exists, each structure having developed over a lifetime of unique experiences.
Brains, being part of a living creature, are constantly changing. I read some studies indicating that mental abuse of children actually causes structural changes in the brains of victims.