
"Eventually, a computer will write the best novel ever written."

dstebbins
Member · Joined Oct 22, 2005 · Messages: 169 · Gender: Male · Political Leaning: Very Liberal
I found this question taking a political test once. Who agrees with me when I say this is total BS?

For the purposes of this thread, I'm going to stretch the definition of "think" a little. Bear with me.

There is one reason a computer cannot write a novel at all, much less a good one. That reason is the one difference between man and robot. That difference is this: A computer cannot think.

Now wait. A computer does not think. A computer computes (remember, I'm stretching the definition a bit). A computer does exactly what it's told to do, taking nothing into account but numbers. When we were kids and were given math problems in school, the teacher called it thinking, but it was actually computing. Thinking involves logic, outside-the-box reasoning, and emotion. That's why we still have humans serving on juries instead of a computer that weighs the evidence and gives a verdict based on probability rather than morals. We want someone trying our cases who can think.

Your next response is probably going to be that we can program a computer to come up with the plot twists that make a good novel. That is true. A computer can do anything it is programmed to do. However, what idiot on the planet is going to make a computer that can think instead of just compute? After all the novels about robots destroying the world, turning against their creators because their CPUs allow them to learn and, in learning, to revolt, who on God's green earth is going to make a computer that smart? We can program a computer to do rigid tasks and to learn within its programming, but who would make a computer that could actually think?

My conclusion is this: A computer will never write the best novel ever written because we, as humans, know better than to give a computer the power of thought.
 
Computers lack substance, so they will never create the greatest novel ever written. At most, they will produce a few cheesy sci-fi and romance novels. At most. . .
 
FinnMacCool said:
Computers lack substance, so they will never create the greatest novel ever written. At most, they will produce a few cheesy sci-fi and romance novels. At most. . .
That's not exactly what I was trying to say. Yes they lack substance, but they also lack the power of thought. They cannot write any book unless they are told EXACTLY what to do. Computers follow a very rigid set of instructions, and cannot make their own choices, which is required to write a good novel.
 
Artificial intelligence is the technology that gives a computer the ability to think and make decisions. We discussed it in some of my computer science classes. Artificial intelligence has not yet reached the standard or technological ability to grant computers the kind of crafty thinking a human mind is capable of. However, in the future, anything is possible with advances in artificial intelligence. Scientifically speaking, if a human mind with the ability to think craftily exists, then it is theoretically possible to create such a mind on a computer or machine.

At MIT they are trying to create a machine that can reproduce itself: living things exist with the ability to reproduce themselves, so it's theoretically possible to create a machine capable of doing the same thing. That's how my professor explained it. I always thought that was interesting. Sort of like man taking the place of God, or what God wrote in the Bible, that he created man in his own image. It also raises the interesting thought of man creating a super-intelligent machine that rises up against man. Books have been written and movies made on this concept.
 
Computers cannot make their own choices. To carry out a task, a computer must be told exactly what to do. While a computer may be the ultimate scientist, it cannot infer, which is required to write a novel. If you fed an entirely new work of literature into a computer and asked it to tell you the moral, do you think it could?

And even if we can program a computer to think craftily, keep in mind the word "best." The best novels are always the heartfelt ones, the ones not written for the sole purpose of making money. These "heartfelt" stories usually have a moral, or something else that the reader can learn from.

That being said, how can a computer write "the best novel ever written" when it has never felt an emotion and never will feel an emotion? I say "never will feel an emotion" because no mad scientist is stupid enough to create a computer that can comprehend emotion. After all the sci-fi novels and movies of robots turning on the world, who would take the risk of making one that smart?

A computer knows neither right nor wrong, nor even good and bad the way animals can learn them. It only understands correct and incorrect, so how can it write the best novel ever written?
 
dstebbins said:
That's not exactly what I was trying to say. Yes they lack substance, but they also lack the power of thought. They cannot write any book unless they are told EXACTLY what to do. Computers follow a very rigid set of instructions, and cannot make their own choices, which is required to write a good novel.

Perhaps not now, but there's no reason that we will not be able to construct artificial "brains" in the future. There is nothing mythical about how the brain works - it uses physical processes to do its work. Sure, we don't completely understand how, but it doesn't make sense to assume that there is something fundamentally unintelligible about how the brain works. Once we get that, it's just a matter of using artificial materials to replicate whatever processes are going on in our brains.
 
Engimo said:
Perhaps not now, but there's no reason that we will not be able to construct artificial "brains" in the future. There is nothing mythical about how the brain works - it uses physical processes to do its work. Sure, we don't completely understand how, but it doesn't make sense to assume that there is something fundamentally unintelligible about how the brain works. Once we get that, it's just a matter of using artificial materials to replicate whatever processes are going on in our brains.
Here's my final argument.

As of right now, a robot can be taught, but it cannot learn. If it is to expand its computing power, it must be reprogrammed.

That being said, many novels and sci-fi movies have featured robots who turned against man. These machines set out to destroy the world, and while mankind stopped the robots in every story, many civilian lives were lost before the robots were destroyed.

Who's going to make a robot that smart? Sure we can program it to remain loyal, but if we program the computer to learn, which is what you're suggesting, what is to stop it from learning to disobey its programming?

What kind of moron is going to make a robot that can learn? Seriously, after all these novels and short stories, who's that moronic? For example, read the famous short story "The Cold Equations."
 
dstebbins said:
Here's my final argument.

As of right now, a robot can be taught, but it cannot learn. If it is to expand its computing power, it must be reprogrammed.

That being said, many novels and sci-fi movies have featured robots who turned against man. These machines set out to destroy the world, and while mankind stopped the robots in every story, many civilian lives were lost before the robots were destroyed.

Who's going to make a robot that smart? Sure we can program it to remain loyal, but if we program the computer to learn, which is what you're suggesting, what is to stop it from learning to disobey its programming?

What kind of moron is going to make a robot that can learn? Seriously, after all these novels and short stories, who's that moronic? For example, read the famous short story "The Cold Equations."

What sort of moron would build a device that has the destructive potential to nearly annihilate our entire planet? It happened, didn't it?

Believe me, progress in A.I. will continue, and the development of a sapient artificial lifeform is practically inevitable.
 
Engimo said:
What sort of moron would build a device that has the destructive potential to nearly annihilate our entire planet? It happened, didn't it?
I take it you're talking about nuclear weapons. You know exactly what I mean, and you must be fundamentally stupid if you can't infer.

Sure, they've been created, but they are completely under our control. We release their fury when we're good and ready. Nuclear weapons can't learn like a human can. Who's going to make a computer with the power of a nuclear warhead and the ability to think and learn for itself?

Believe me, progress in A.I. will continue, and the development of a sapient artificial lifeform is practically inevitable.
Maybe you should read Frankenstein. That man tried to play God by creating a life that couldn't be destroyed. It backfired on him big time. All these novels about people playing God are warnings, and what idiot is going to ignore warnings that have been around for as long as anyone on this forum can remember and make a fully intelligent superbeing?
 
dstebbins said:
I take it you're talking about nuclear weapons. You know exactly what I mean, and you must be fundamentally stupid if you can't infer.

What? I was drawing an analogy.

Sure, they've been created, but they are completely under our control. We release their fury when we're good and ready. Nuclear weapons can't learn like a human can. Who's going to make a computer with the power of a nuclear warhead and the ability to think and learn for itself?

My point is that nuclear weapons have great destructive power, and that is their only purpose. The creation of an artificial brain would not have any inherent danger and could indeed teach us much about our own anatomy and thought processes. Your fear of an artificial intelligence taking over the world or whatever is unfounded and rooted in apocalyptic sci-fi movies.


Maybe you should read Frankenstein. That man tried to play God by creating a life that couldn't be destroyed. It backfired on him big time. All these novels about people playing God are warnings, and what idiot is going to ignore warnings that have been around for as long as anyone on this forum can remember and make a fully intelligent superbeing?

I've read Frankenstein. It was rather boring, to be honest.
 
Engimo said:
My point is that nuclear weapons have great destructive power, and that is their only purpose. The creation of an artificial brain would not have any inherent danger and could indeed teach us much about our own anatomy and thought processes. Your fear of an artificial intelligence taking over the world or whatever is unfounded and rooted in apocalyptic sci-fi movies.
And how do you figure that? Many pieces of literature have a moral that can be lived by. Children who fake being sick are reminded of the Boy Who Cried Wolf. The Odyssey teaches us never to give up on our goals, no matter how hopeless things seem. What makes you think there is no moral in "apocalyptic sci-fi movies"?
 
dstebbins said:
And how do you figure that? Many pieces of literature have a moral that can be lived by. Children who fake being sick are reminded of the Boy Who Cried Wolf. The Odyssey teaches us never to give up on our goals, no matter how hopeless things seem. What makes you think there is no moral in "apocalyptic sci-fi movies"?

Because it is unfounded. There is no reason to believe that the creation of some sort of super-A.I. that will destroy all humans and take over the world is inevitable or even probable. If we create an artificial brain, why would it act fundamentally any differently from an organic brain?
 
Interesting question.

I think that probably someday, assuming we haven't blown ourselves up first lol, robots will be very advanced and will be a lot like humans. But I don't see why they would be any more creative than humans, though. They would probably be able to do mathematical calculations better, but I doubt they would be any more creative in their ability to write a good story. Maybe some robots would be more creative than your average Joe Schmoe, but I still think really talented humans would be up to par.
 
George_Washington said:
Interesting question.

I think that probably someday, assuming we haven't blown ourselves up first lol, robots will be very advanced and will be a lot like humans. But I don't see why they would be any more creative than humans, though. They would probably be able to do mathematical calculations better, but I doubt they would be any more creative in their ability to write a good story. Maybe some robots would be more creative than your average Joe Schmoe, but I still think really talented humans would be up to par.

For what reason? Provided that we can get an artificial brain working that operates on the same principles as our organic brains, why would it be uncreative? There is nothing mystical or beyond-naturalistic about the way our brains work, so why should the same creativity not exhibit itself in an artificial brain?
 
Engimo said:
For what reason? Provided that we can get an artificial brain working that operates on the same principles as our organic brains, why would it be uncreative? There is nothing mystical or beyond-naturalistic about the way our brains work, so why should the same creativity not exhibit itself in an artificial brain?

I don't mean they won't be creative. I meant I don't think they will be any more creative than we are, at least not than our most talented people.
 
George_Washington said:
I don't mean they won't be creative. I meant I don't think they will be any more creative than we are, at least not than our most talented people.

Ahh, I see. The thing is, though, considering the inherent superiority of digital/optical connections to our neurochemical (ion-based) pathways, it is assured that whatever artificial brains we create will have a much faster thinking speed - making them more creative/intelligent.
 
Engimo said:
Ahh, I see. The thing is, though, considering the inherent superiority of digital/optical connections to our neurochemical (ion-based) pathways, it is assured that whatever artificial brains we create will have a much faster thinking speed - making them more creative/intelligent.

Are you sure that digital connections are more efficient than our chemical based ones? I mean, I just don't see how they're more efficient. Maybe they can compute things faster but how does that mean they will invent new things and stuff better than we will?
 
George_Washington said:
Are you sure that digital connections are more efficient than our chemical based ones? I mean, I just don't see how they're more efficient. Maybe they can compute things faster but how does that mean they will invent new things and stuff better than we will?

Yes, they are. Our neural synapses work on the movement of things such as potassium ions, which are much slower than the movement of electrons or photons (in optical computers). Even if they work the same way as our brains, artificial brains will have a tremendous advantage in processing speed.
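Engimo's speed claim can be put in rough numbers. Here is a back-of-envelope sketch in Python; the figures are commonly cited orders of magnitude, not measurements, and the comparison considers nothing but raw signal propagation speed:

```python
# Rough comparison of signal propagation speeds (order-of-magnitude
# figures only): a fast myelinated axon conducts action potentials at
# roughly 100 m/s, while an electrical signal in a conductor travels
# at a large fraction of the speed of light.

SPEED_OF_LIGHT = 3.0e8              # m/s, in vacuum
axon_speed = 100.0                  # m/s, fast myelinated axon (upper end)
wire_speed = 0.7 * SPEED_OF_LIGHT   # m/s, typical velocity factor in a wire

ratio = wire_speed / axon_speed
print(f"Electrical signalling outpaces a fast axon by a factor of "
      f"about {ratio:,.0f} ({wire_speed:.1e} m/s vs {axon_speed:.0f} m/s)")
```

Of course, raw propagation speed is only one ingredient of "thinking speed"; switching time, parallelism, and overall architecture matter at least as much, so the millions-fold gap above should not be read as a direct intelligence multiplier.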
 
Engimo said:
Yes, they are. Our neural synapses work on the movement of things such as potassium ions, which are much slower than the movement of electrons or photons (in optical computers). Even if they work the same way as our brains, artificial brains will have a tremendous advantage in processing speed.


Hmm, interesting. One thing I was thinking of was that what if robots, perhaps due to their higher processing speed, also have greater emotional side effects? Such as stuff like depression, mental illness, mood swings, etc. It certainly seems like smarter humans tend to have emotional and mental side effects to their intelligence. You know, like Russell Nash, for example.
 
George_Washington said:
Hmm, interesting. One thing I was thinking of was that what if robots, perhaps due to their higher processing speed, also have greater emotional side effects? Such as stuff like depression, mental illness, mood swings, etc. It certainly seems like smarter humans tend to have emotional and mental side effects to their intelligence. You know, like Russell Nash, for example.

Haha, you mean John Nash? Russell Crowe played him in "A Beautiful Mind".

Funnily enough, he was schizophrenic, which he said was brought about by trying to create a reformulation of quantum mechanics compatible with general relativity. I don't think it really had anything to do with his intelligence, though; he just had mental problems.
 
Engimo said:
Haha, you mean John Nash? Russell Crowe played him in "A Beautiful Mind".

Funnily enough, he was schizophrenic, which he said was brought about by trying to create a reformulation of quantum mechanics compatible with general relativity. I don't think it really had anything to do with his intelligence, though; he just had mental problems.

Yeah oops, I meant John Nash. I was thinking of Russell Crowe, actually. :mrgreen:

If you think about it, though, if robots have emotions, who knows what kind of personality traits they'll have. And if they don't have emotions, would they still be able to be creative? Do you think they would or not? I think it would probably hinder their creativity.
 
George_Washington said:
Yeah oops, I meant John Nash. I was thinking of Russell Crowe, actually. :mrgreen:

If you think about it, though, if robots have emotions, who knows what kind of personality traits they'll have. And if they don't have emotions, would they still be able to be creative? Do you think they would or not? I think it would probably hinder their creativity.

It really depends on what type of robot we're talking about. If we're discussing an artificial brain that is based upon the same governing principles as our brains and used the same sort of neural "blueprint" for its design, there is no reason why it would act any differently than a human (with the obvious exception of it being much, much faster), so it would not be excluded from having emotions and such.

Of course, this presupposes that we design the brain with the intent of it acting human. Emotional changes and many other things in the brain are the product of hormones acting in certain ways. If we exclude synthetic versions of these hormones and neurochemicals from our artificial brain, it would act differently from a human.

This is all conjecture, though. We're so far away from being able to do anything even remotely like this on a practical level that we cannot do anything but make guesses as to what would happen or how we would go about doing it.
 
Engimo said:
This is all conjecture, though. We're so far away from being able to do anything even remotely like this on a practical level that we cannot do anything but make guesses as to what would happen or how we would go about doing it.

Yeah, and I didn't even think we knew enough about our own brain yet to design a duplicate of one. I know doctors say that emotions are based on hormones, but I am not convinced that this is actually true, or at least not entirely so, because I've known and heard about people who've been on mood-altering drugs like anti-depressants and have said they've never worked on them.
 
Engimo said:
Because it is unfounded. There is no reason to believe that the creation of some sort of super-A.I. that will destroy all humans and take over the world is inevitable or even probable. If we create an artificial brain, why would it act fundamentally any differently from an organic brain?
My point exactly. Humans can turn against their authority, can't they? Teenagers kill their parents all the time, and citizens of tyrannical countries have often turned against their kings, as in the American Revolution. What's to stop a computer from turning against its programmer if we give it the ability to learn?
 