View Full Version : Singularity

November 24th, 2002, 06:00 AM
MM all,

This is my first post on this site as I am new here. Have been a member of the CoA for a while but stumbled upon this site. Oh wee, now after that short intro, my question to you is: what are all of your views on the matter of the Singularity?

I personally believe that this is going to be the destruction of the planet because of how out of hand we are getting. It is awful the way we keep trying to outdo ourselves in this field, but every year it happens more and more. I say at this pace, AI humans will be very real within 10 - 15 years, and we as a species may become extinct in the short years after...

November 24th, 2002, 07:36 PM
If we were to discuss this in a purely scientific manner, one would have to agree. I also wouldn't be able to say it's entirely a bad thing. Evolution, my friend. Bacteria, the dinosaurs, and the giant mammals that predated us had to face the fact: evolution happens. If those who are in agreement with evolution (myself included) are correct, we will be replaced someday as the dominant creatures. That is, if we don't destroy the planet first. Fact of life, really. Just as we as individuals will die, so will the human race with time. Whether it's by our own hands or the hands of nature is the only thing we can control, not whether or not it happens.

November 25th, 2002, 02:32 PM
Hello, and welcome to the scientific pagan forum!

We welcome all manner of crackpot theories here, but be warned, at least one actual scientist reads this forum (hallowed be his name), and His answers may surprise and frighten you.

That said, why don't you give me.. er.. I mean, Him.. a bit more information about what you're talking about. What kind of Singularity are you referring to? If you mean black holes, well yeah, they're out there, and we're probably getting sucked in sooner or later ;)

- Illuminatus!

November 27th, 2002, 10:32 AM
MM Illuminatus,

I am talking about computers being given enough power and authority by us as humans to take over and eliminate human existence. The making of AI and all that comes with it. We as humans are getting too curious about how far we can push the boundaries of computers and robotics, until artificial life is created, we are the weaker of the two species left, and we are forced out of existence. That is what I mean, Illuminatus...

Now what is your opinion, err, I mean, the opinion of that person that you know. lol...

November 27th, 2002, 01:46 PM
Well, we don't need to worry about that quite yet.

Have you talked with any of the AI bots on the web, like Smarterchild on AIM? They are so stupid, it's not even funny. They can categorize and prioritize words, and regurgitate them in a pre-programmed context, but I assure you, HAL 9000 is not even on the horizon.
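Bots of that era worked roughly along these lines. Here is a minimal keyword-lookup sketch (a hypothetical illustration, not Smarterchild's actual code): the bot has no understanding at all, just a table of canned replies keyed on words it spots in the input.

```python
# Hypothetical sketch of an ELIZA-style keyword bot: spot a known
# word in the message, regurgitate its pre-programmed response.
RESPONSES = {
    "hello": "Hi there! Ask me about the weather.",
    "weather": "I hear it's nice out. What else can I do for you?",
    "movie": "I don't get out to the movies much, being a bot.",
}
FALLBACK = "Sorry, I don't understand. Could you rephrase that?"

def reply(message: str) -> str:
    """Return the canned response for the first known keyword found."""
    for word in message.lower().split():
        word = word.strip(".,!?'\"")  # ignore surrounding punctuation
        if word in RESPONSES:
            return RESPONSES[word]
    return FALLBACK

print(reply("Hello!"))                  # matches "hello"
print(reply("How is the weather?"))     # matches "weather"
print(reply("Explain the singularity")) # no keyword: fallback
```

Anything outside the keyword table falls straight through to the fallback line, which is why such bots feel shallow after a few exchanges.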

I worked in controls automation for a year, where we programmed machinery and computers to automate the production of pharmaceuticals. Machines can only do what humans explicitly tell them to do. Even if AI reaches, and surpasses, HI (two extremely unlikely occurrences, at least for the next 20 years), it's a long stretch for a computer to "take over". Most computers, if you hadn't noticed, can't move. At all. Yeah, maybe they'll be able to "hack in" to global systems, but what's that going to do? Make a satellite fall down? Tell the robots in an auto plant to flail wildly?

The sky is NOT falling, Chicken Little!

- Illuminatus!

December 3rd, 2002, 03:32 PM
Even if AI does happen before you get too old to worry about it, that doesn't mean we are on the way out. Too much depends on what we do with the AI, how far we go, and what controls they will be able to assert back on us. Terminator-type worlds are fun to dream about but require some explanation. Most things do better by co-operating unless they gain some major benefit from fighting. There is no reason that AI and humans won't become closer and closer together; a possible merger of the two is just as likely (or unlikely) as AIs gathering and exterminating us.