Freethought & Rationalism Archive. The archives are read only.
#11 |
Beloved Deceased
Join Date: Oct 2002
Location: Raleigh, NC
Posts: 7,150
You know what's rapture for me, as a geek?
Spamming off two thousand lines of compartmentalized C++ code (yeah, that's right, nested loops, function calls calling other functions, file I/O, external libraries, the whole shebang) with ZERO COMPILE ERRORS THE FIRST TIME! SUCK IT!

(No runtime errors, either; damn thing actually worked.)
#12 |
Veteran Member
Join Date: Jun 2002
Location: Pittsburgh
Posts: 3,966
Congratulations, Stil-1... I think...
Not being a programmer, I comprehended little of the above post.

Note, however, that there are other varieties of geekiness.
#13 |
Beloved Deceased
Join Date: Oct 2002
Location: Raleigh, NC
Posts: 7,150
Yeah, I admit I laid it on kinda thick. But it was on purpose.
Basically, I wrote an ass-long block of code (for a n00b like me, anyway), and it worked right the first time around.
#14 |
Veteran Member
Join Date: Sep 2000
Location: Green Bay, Wisconsin
Posts: 6,367
You know, I don't think this thread is best suited for Sec Life. It could be a candidate for Science & Skepticism, but for now I will move it to the larger audience in MD.
Maverick - Sec Life Moderator |
#15 |
Veteran Member
Join Date: Feb 2003
Location: Outside of the asylum...
Posts: 2,049
Quote:
My fundie ex-sis-in-law says that "God won't let that happen! He'll return first, or stop it." (He also won't allow human cloning or A.I., according to her...) so...

Btw, speaking of Vinge... his sci-fi novel "A Fire Upon the Deep" is a must-read, and kinda deals with this same concept: whole civilizations suddenly "transcending" or something like that (forgot the exact word he used), yet then often falling apart soon after. (I love his concept of being "God-shattered" in the book.)

There is also the same, or a similar, idea of civilizations reaching a singularity and then suddenly totally collapsing in Ken MacLeod's "The Cassini Division" and "The Stone Canal" (part of his so-called "Fall Revolution" series).

- bryce
#16 |
Veteran Member
Join Date: Feb 2003
Location: Outside of the asylum...
Posts: 2,049
Quote:
See...

- bryce
#17 |
Veteran Member
Join Date: Feb 2003
Location: Outside of the asylum...
Posts: 2,049
I read the Vinge article...
I'm not sure that humans will ever be able to create an A.I. equal or superior to our own brains/intelligence, because we may be no more capable of fully comprehending/understanding how our own brains - and how "intelligence" and "self-awareness" - really function than a chimpanzee is of fully comprehending/understanding its own brain and how it works.

Maybe we aren't smart enough... or maybe we have reached some magic point in the evolution of intelligence where we are the first form of intelligence (on this planet, at least) capable of understanding its own _self_ fully.

But as for the idea of accidentally creating an A.I., or one evolving on its own from our own networks and such, that I can see more easily...

Then again, we could just _copy_ the human brain, but a perfect copy might _require_ the use of the exact same blueprints, structures, AND materials, and thus would simply be another human brain* - and hell, those are easy to make - my ex-wife and I made three - they're called "kids"...

But that would be saying nature will allow only one kind of intelligent brain (flesh-based), or that we aren't smart enough to do more than just make mere exact copies...

*(Which is why I never got the idea of exactly _what_ the "Replicants" were in the movie "Blade Runner" - I mean, if they had DNA and flesh and blood, then they were just gene-engineered _people_, not robots or androids...)

Btw, I liked David Brin's idea - in his short story "Lungfish" - of a humanity that created A.I., *knew* that A.I. would long outlast flesh-based life, and yet wanted "humanity" (as a concept, with its ideals and beliefs) to persist in whatever form; so (perhaps also to allay fear of a "Terminator"-like or "Matrix"-like takeover) it created A.I.s that grew and matured just like humans, and that were raised - at first - by human families as human (and given full human rights), so that when they matured they thought of themselves as part of humanity.
But I think that humanity is too immature - and perhaps too "speciesist" - to do this anytime soon. And who's to say that it would work with a super-human intelligence at *all*...

- bryce
#18 |
Beloved Deceased
Join Date: Oct 2002
Location: Raleigh, NC
Posts: 7,150
Quote:
Programming could set taboos and all, but I don't think that intelligence in a computer system would be a code-based limitation. ('Smarter' = faster computation.)
#19 |
Veteran Member
Join Date: Jun 2002
Location: Pittsburgh
Posts: 3,966
wonkothesane said:
Quote:
Despite these differences, I also concluded that the similarities between humans and Replicants allow both groups to be placed within the larger category of "persons".

I haven't read Philip K. Dick's novel "Do Androids Dream of Electric Sheep?", so I couldn't tell you if it elaborates more on the Replicants' nature and manufacture.

The role-playing game GURPS has a sourcebook, "Biotech", that includes similar beings, known as "bioroids" or "Biological Androids". Basically, they are vat-grown bodies (either induced with some type of rapid-growth factor or pieced together with some form of nanotechnology) that have either biological or computer-based brains. The malleability of the brains seems to be the main difference separating them from Homo sapiens sapiens, which means their personalities and memories can be altered at the will of the manufacturer.