
Teaching Artificial Intelligence in Grades 7 to 12

paul342160

In my opinion, students should be slowly introduced to AI before they attend college.

Students who will not be attending college will also benefit.
 
They are, for the most part, teaching children Python in 4th grade. I believe schools need to do a better job of making sure all students are exposed to Python by the time they finish elementary school. AI is a bit more advanced, but yes, students should be able to do simple NumPy work in Python by high school.
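To make "simple NumPy" concrete, here is a minimal sketch of the sort of exercise a high-school class might attempt; the quiz scores are made-up numbers for illustration, not taken from any curriculum.

```python
# A small NumPy exercise of the kind a high-school class might try:
# load some quiz scores, then compute basic statistics.
import numpy as np

# Hypothetical quiz scores for one class (made-up numbers).
scores = np.array([72, 85, 90, 66, 78, 95, 88, 74])

print("Mean score:", scores.mean())               # average of all scores
print("Highest score:", scores.max())             # best result in the class
print("Scores above 80:", scores[scores > 80])    # boolean indexing
print("Curved by 5 points:", np.clip(scores + 5, 0, 100))  # element-wise math
```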
 
In my opinion, students should be slowly introduced to AI before they attend college.
There is no such thing as "Artificial Intelligence."

What people mistakenly call "AI" is actually an expert system and there is nothing intelligent about the program at all. AI is not possible with binary computers. It may become possible with quantum computers, but that remains to be seen.

Long gone are the days when a student could make sense of what is happening in the computer world. Back in the 1970s and 1980s, when computers were still relatively new and unsophisticated, they could be comprehended by a single individual, but not any longer. Today the computer industry is specialized and compartmentalized. You have network experts, database administrators, hardware technicians, and a wide variety of software developers. Nobody has a complete understanding of computers any longer. Hell, they are into fifth- and sixth-generation programming languages now, which means the language itself is five or six layers removed from the machine code it still ultimately runs on. Nobody is taught how basic functions actually work any longer; they are simply taught which functions to use and why.

Quantum computers have been around since the 1990s, but like the original Sperry-Univac machines, they are about the size of a small house, which is why for the last ~30 years they have been used primarily for research rather than in the commercial world. There is also the issue of developing a language for them. With binary computers it is pretty straightforward, but with qubits it is a whole other story. The one thing quantum computers will provide in ample amounts is speed. Quantum computers will make binary supercomputers look slower than a Commodore 64 (an 8-bit machine running at about 1 MHz). With that kind of speed they will be able to blow through any kind of modern-day encryption, or possibly be fast enough to simulate artificial intelligence. Considering the technology is still in its infancy, I would not rule out the possibility that AI may become possible with quantum computers.
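To illustrate why programming with qubits is "a whole other story," here is a hedged sketch in ordinary Python with NumPy (not any real quantum SDK) of how a single qubit's state is a pair of complex amplitudes rather than a plain 0 or 1. The gate and measurement math shown is standard; the example itself is only illustrative.

```python
# Minimal single-qubit simulation in plain NumPy (illustrative only).
# A qubit's state is a 2-component complex vector, not a single 0/1 bit.
import numpy as np

ket0 = np.array([1, 0], dtype=complex)   # |0>, the classical-looking "0" state

# Hadamard gate: puts |0> into an equal superposition of |0> and |1>.
H = np.array([[1,  1],
              [1, -1]], dtype=complex) / np.sqrt(2)

state = H @ ket0

# Measurement probabilities are the squared magnitudes of the amplitudes.
probs = np.abs(state) ** 2
print("Amplitudes:", state)       # two equal complex amplitudes (~0.707 each)
print("P(measure 0):", probs[0])  # 0.5
print("P(measure 1):", probs[1])  # 0.5
```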

For a computer to be truly artificially intelligent it must exceed its programming. If a computer, of any kind, is only able to perform what it was programmed to perform, then it is an expert system and not AI. Only by exceeding its own programming can a computer develop true AI.
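For readers unfamiliar with the term, here is a hedged sketch of an "expert system" in the classic sense used above: a fixed set of hand-written if/then rules, with nothing learned and nothing beyond what the programmer wrote down. The rules below are invented purely for illustration.

```python
# Toy rule-based "expert system": hand-written if/then rules only.
# It can never do anything its author did not explicitly write down.
# The rules below are invented purely for illustration.

RULES = [
    (lambda facts: facts.get("has_fever") and facts.get("has_cough"), "possible flu"),
    (lambda facts: facts.get("has_rash"), "possible allergy"),
]

def diagnose(facts):
    """Return every conclusion whose rule matches the given facts."""
    return [conclusion for condition, conclusion in RULES if condition(facts)]

print(diagnose({"has_fever": True, "has_cough": True}))  # ['possible flu']
print(diagnose({"has_rash": True}))                      # ['possible allergy']
print(diagnose({"has_headache": True}))                  # [] - no matching rule
```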
 
Eh, as long as they are still teaching students to do their own work. I learned because I had to find the sources and write things up myself instead of feeding a prompt into an AI generator.

We should never give up human creativity.
 
AI & Electricity
One of the hallmarks of artificial intelligence (AI) is an enormous demand for electricity to power the computers. Some projections say AI could use 85 terawatt-hours per year, more than many countries use. Currently data centers account for 1-1.5% of global electricity use; with AI that number could jump to 6% or more. Some projections even have AI eventually using as much electricity as Germany. Again, this will require more commodities, especially copper. We like the copper sector because of this and because of the slow pace at which new copper mines are coming online. Electricity stocks will likely be among the big winners over the next ten years.
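A rough back-of-the-envelope check on those percentages. The global and Germany totals used here are my own ballpark assumptions (roughly 27,000 TWh of electricity consumed worldwide per year, Germany at roughly 500 TWh), not figures from the post.

```python
# Back-of-the-envelope check on the electricity figures above.
# The global and Germany totals are ballpark assumptions, not sourced figures.
GLOBAL_TWH_PER_YEAR = 27_000   # assumed rough world electricity consumption
GERMANY_TWH_PER_YEAR = 500     # assumed rough German electricity consumption

ai_projection_twh = 85          # projection quoted in the post
datacenter_share = (0.01, 0.015)  # 1-1.5% of global use today

print(f"85 TWh is about {ai_projection_twh / GLOBAL_TWH_PER_YEAR:.2%} of global use")
print(f"Data centers today (1-1.5%): roughly {datacenter_share[0]*GLOBAL_TWH_PER_YEAR:.0f}"
      f"-{datacenter_share[1]*GLOBAL_TWH_PER_YEAR:.0f} TWh")
print(f"A 6% share would be about {0.06 * GLOBAL_TWH_PER_YEAR:.0f} TWh,"
      f" versus Germany at ~{GERMANY_TWH_PER_YEAR} TWh")
```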
 
AI results are improving because experts are being paid by Google and other companies to correct mistakes in AI output. One of my friends is being paid to correct errors in Anatomy and Physiology AI results. I'm sure that experts in Biology, Chemistry, Geography, Geology, History, Languages, Math, Physics, Psychology, Political Science, Religion, Sociology, and other fields are helping Google and other companies improve the truthfulness of AI.
 
There is no such thing as "Artificial Intelligence."

What people mistakenly call "AI" is actually an expert system and there is nothing intelligent about the program at all.
If your direct knowledge of the tech industry ended in the '80s, then this opinion makes sense.

Otherwise, it is just showing your lack of knowledge of the current tech industry.
 
Glitch, I'm a scientist. I'm not a computer expert like some of my friends.
 
It's probably important to start them early. Our employers are racing to replace us with AI, so students might as well catch the good part of the wave while the rest of us get wiped out by it. I think AI is potentially going to do to professional workers what automation couldn't.
 
Glitch, I'm a scientist. I'm not a computer expert like some of my friends.
I have a BSCS from the University of Minnesota and worked in the industry from 1975 until 2013. I specialized in data structures. I have been retired since 2013. While I would not claim to be an expert in the field of AI, I do know a great deal about the subject and its history. There is no such thing as "AI." It has been misnamed by those who do not have the first clue what AI truly is. If a computer is only following its programming, no matter how sophisticated that program may be, then it cannot be AI. It is an expert system, and there is a significant difference between the two.
 