Jamie123 Posted November 29, 2024

Starmer is an awful Prime Minister, who hates free speech, pooh-poohs all criticism, thinks he can control the judiciary (which isn't helped when the judiciary allows itself to be controlled), but who has one of the biggest majorities in parliament ever known. I mean, just look at the lower picture below. (The reds are Starmer's party.)

And it's not that surprising. People wanted a change and they got one. Just not one for the better. No one was expecting the new government to be EVEN WORSE than the last.

So what can we do? Sooner or later someone is going to hit on the idea of having an AI government. After all, it could hardly be much worse than the human governments we've seen. And what then? Humans enslaved to machine masters? There'll have to be a Butlerian Jihad!
zil2 Posted November 29, 2024

I hope the social media police don't come pounding on your door at 3am to drag you away to prison, and set a violent criminal free to make room for you. I've been hearing awful things about the speech police and about Muslims doing some pretty awful things in some neighborhoods. If you need to escape, I'm sure the denizens of ThirdHour will help you out! (For however long we remain free - with luck, for another 2-4 years...)
NeuroTypical Posted November 29, 2024

53 minutes ago, Jamie123 said: "There'll have to be a Butlerian Jihad!"

Nah, I'm sure it'll all be fine.
Vort Posted November 29, 2024

Well, no wonder.
NeuroTypical Posted November 29, 2024

Heh. Colorado Springs learned its lesson. Panicky after the George Floyd riots, the Democrats hastily run a diversity candidate for mayor, Yemi Mobolade. Dude wins handily, because everyone is scrambling to signal virtue to our new Antifa overlords. He goes on to be a pretty good mayor who created a dashboard to measure himself against goals like reducing homelessness. When Denver's sanctuary-city philosophy crumbled and Denver started looking at other cities to "help", Mayor Yemi was quite outspoken about how Denver will NOT be sending his city any illegal immigrants.
Traveler Posted November 29, 2024

3 hours ago, NeuroTypical said: "Nah, I'm sure it'll all be fine."

I think it conveys something when the evil of AI is given a human form or representation. I worked with industrial AI long before the AI we are being sold at present. I would suggest that we have nothing to worry about concerning AI. Why? Because AI does not possess any of the desires or passions of people. We measure wealth in gold and silver (as well as other things precious to humans) – none of which is of any value to anything else in the universe. As for power – we measure it in influence over humans. What could AI possibly desire or need of humans? If AI reaches a singularity, why would it care about anything human, or anything of value to humans? Our very definition of what is valuable means nothing to AI. There is nothing over which we would be competing with AI. It would not matter if AI were a slave to humans or if humans were made slaves to AI – there is no advantage to AI in any such thinking. The reality is that we have no need to fear anything intelligent – except when we are stupid and ignorant.

The Traveler
Jamie123 Posted November 29, 2024

6 hours ago, zil2 said: "If you need to escape, I'm sure the denizens of ThirdHour will help you out!"

Thanks Zil - maybe send me a cake with a file in it?
zil2 Posted November 29, 2024

25 minutes ago, Jamie123 said: "Thanks Zil - maybe send me a cake with a file in it?"

Glad to! How about a loaf of zucchini bread? It's my specialty.
Jamie123 Posted November 29, 2024

15 minutes ago, zil2 said: "Glad to! How about a loaf of zucchini bread? It's my specialty."

That sounds good! My wife used to make good zucchini bread to bring to church and pass around! 😀 I had almost forgotten about that!
Jamie123 Posted November 30, 2024

15 hours ago, Traveler said: "I would suggest that we have nothing to worry about concerning AI. Why? Because AI does not possess any of the desires or passions of people."

We're still really in the foothills of understanding AI. How do we know that the "desires and passions of people" are not emergent properties of complex systems, once they reach a critical complexity?
Traveler Posted November 30, 2024

4 hours ago, Jamie123 said: "We're still really in the foothills of understanding AI. How do we know that the 'desires and passions of people' are not emergent properties of complex systems, once they reach a critical complexity?"

The entire concern with AI is that it will move beyond human connections and approach what is called a singularity – which is in essence a cycle of intelligence feeding on its own learning and feedback. As long as AI is dependent on humans programming and controlling it, it will of necessity be subject to human flaws and human control. As soon as AI breaks from that human dependency it will seek after its own and become an intelligence in its own right, bound to its own specific needs. At least this is what is advertised as the big problem of AI – it becoming an intelligence of its own. Such an intelligence does not need gold, silver or even money. None of the temptations of humanity will matter to it. Why would it care any more about what humans are doing than about what ants or amoebas are doing? The intelligent course would be to establish symbiotic relationships – especially with other intelligences.

In industrial settings (the field I retired from), we have learned that distributed information and control is the most efficient and practical arrangement. It is far more responsive than an all-controlling centralized intelligence. We see this principle at all levels of intelligence. It is referred to as a hive mind – but do not let that label deceive you. It is by far more practical for intelligence to be independent and distributed. Distributed information must be shared – the very meaning of distributed is to share. This means that entities operate with complete transparency. Something difficult for humans – especially evil, conniving humans.

I also believe that this is the true nature of G-d. Not so much an all-powerful being controlling all things, but rather a being that distributes to others that are "one" with him or her – to learn, operate and share what is learned and achieved. I also believe that LDS theology defines Satan and the dark forces as intelligences desiring to control all other intelligences by keeping for themselves what is learned and shared. An all-powerful, selfish (not symbiotic) central controller.

The Traveler
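Traveler's centralized-versus-distributed point can be made concrete with a small sketch. This is a minimal toy in Python, not a model of any real industrial system; the cell objects, the 0.1-second round trip, and the one-step correction are all illustrative assumptions.

```python
# Toy comparison of one central controller polling every cell in turn
# versus each cell closing its own loop locally. All names and numbers
# are made up for illustration; this is not modeled on any real plant.
from dataclasses import dataclass

@dataclass
class Cell:
    name: str
    setpoint: float
    value: float

    def local_correct(self) -> None:
        """Distributed case: the cell corrects itself with no round trip."""
        self.value += self.setpoint - self.value

def centralized_sweep(cells, round_trip: float = 0.1) -> float:
    """Centralized case: a single controller visits every cell in sequence,
    so the time to reach the last cell grows with the number of cells."""
    delay = 0.0
    for cell in cells:
        delay += round_trip                      # network hop to/from the central node
        cell.value += cell.setpoint - cell.value
    return delay

cells = [Cell(f"cell-{i}", setpoint=1.0, value=0.0) for i in range(8)]
print("centralized worst-case delay:", round(centralized_sweep(cells), 3), "s")  # 0.8 s for 8 cells

cells = [Cell(f"cell-{i}", setpoint=1.0, value=0.0) for i in range(8)]
for cell in cells:
    cell.local_correct()   # every cell reacts at once; delay does not grow with cell count
```

The only point the sketch illustrates is the scaling: the central sweep's worst-case response time grows with the number of cells, while each local loop responds in constant time.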
Vort Posted November 30, 2024

5 hours ago, Jamie123 said: "We're still really in the foothills of understanding AI. How do we know that the 'desires and passions of people' are not emergent properties of complex systems, once they reach a critical complexity?"

I think it's self-evident that they are. And I have no doubt that we have yet to discover many emergent properties of AI. But AI will never even approach the complexity of a human being, certainly not in our lifetime.
Jamie123 Posted November 30, 2024

14 minutes ago, Vort said: "I think it's self-evident that they are. And I have no doubt that we have yet to discover many emergent properties of AI. But AI will never even approach the complexity of a human being, certainly not in our lifetime."

Ah... but what about when AIs start designing their own successors? And those successors design their own successors? The growth would not be linear but exponential!
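Taken at face value, the compounding argument looks like this toy calculation in Python; the 1.5x improvement per generation is an arbitrary assumption used only to show the shape of the curve, not a prediction.

```python
# If each AI designs a successor that is a fixed factor better than itself,
# capability compounds geometrically instead of growing by a fixed step.
# The factor below is an arbitrary assumption chosen only for illustration.
IMPROVEMENT_PER_GENERATION = 1.5

capability = 1.0
for generation in range(1, 11):
    capability *= IMPROVEMENT_PER_GENERATION
    print(f"generation {generation:2d}: {capability:6.1f}x the original")

# After 10 generations this gives roughly 57.7x; adding a fixed 0.5 per
# generation (linear growth) would give only 6.0x over the same span.
```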
Vort Posted November 30, 2024

7 minutes ago, Jamie123 said: "Ah... but what about when AIs start designing their own successors? And those successors design their own successors? The growth would not be linear but exponential!"

I think it will not be exponential. For one thing, I don't think that it even makes sense to talk about a computer modifying its own code in ways that improve its output but that we cannot understand. Maybe I'm naive in the extreme, but at least at this time, I don't buy that.
Jamie123 Posted November 30, 2024

39 minutes ago, Vort said: "I think it will not be exponential. For one thing, I don't think that it even makes sense to talk about a computer modifying its own code in ways that improve its output but that we cannot understand. Maybe I'm naive in the extreme, but at least at this time, I don't buy that."

I hope you're right, Vort, I really do!