- Many goods have become disposable as the cost of manual repair or cleaning has become greater than the cost of making new goods due to mass production. Examples of disposable goods include ballpoint pens, lighters, plastic bottles, and paper towels.
- The design of goods becomes outdated quickly (a second generation of computers, for example, appears before the end of the expected period of usability of the first generation). It is possible to rent almost everything (from a ladder to a wedding dress), thus eliminating the need for ownership.
- Whole branches of industry die off and new branches of industry arise. This impacts unskilled workers who are compelled to change their residence to find new jobs. The constant change in the market also poses a problem for advertisers who must deal with moving targets.
- People of post-industrial society change their profession and their workplace often. People have to change professions because professions quickly become outdated. People of post-industrial society thus have many careers in a lifetime. The knowledge of an engineer becomes outdated in ten years. People look more and more for temporary jobs.
- To follow transient jobs, people have become nomads. For example, immigrants from Algeria, Turkey and other countries go to Europe to find work. Transient people are forced to change residence, phone number, school, friends, car license, and contact with family often. As a result, relationships tend to be superficial with a large number of people, instead of being intimate or close relationships that are more stable. Evidence for this is tourist travel and holiday romances.
- The driver's license, received at age 16, has become the teenager's admission to the world of adults, because it symbolizes the ability to move independently.
https://en.wikipedia.org/wiki/Future_Shock
NASA just released 56 patented space and rocket technologies to the public
On one hand I think this is great in that yes government paid for research is returned to the public to innovate the next great new products with, and to spur forward the US economy. I see this as one of the reasons why the US economy, although battered, has sound fundamentals and foundation on which to continue to build (just as long as the population can maintain the needed education levels).
On the other hand, it feels like the US is giving away all its really good stuff to the entire world.
- A long time ago I read Alvin Toffler's Future Shock, and I recall his prediction of innovation or change centers (I think that's what he called them), whose sole purpose was to innovate solutions to problems that hadn't even been identified yet (pardon me, but this was some 30 years ago, and I'm recalling from memory).
- From Wikipedia:
If the US isn't at the point of being a post-industrial society, as he describes it, we are certainly very close to the cusp of it. Frankly, I think we've been here awhile now already.
Not really sure where to take this thread from here, have at it and take it where you will.
Greetings, Erik. :2wave:
Great post! :thumbs: My two cents... If "post-industrial" means you do not have a job with the same company for years, and retire from that job, then yes, that does seem to be disappearing, at least in the US. Back in the days after WW2, we were just about the only country that could manufacture anything.
Many people were hired off the streets to work on assembly lines, including our vets returning home, which didn't require a college education. And things were made to last for years back then, and were more costly than we expect to pay today.
Nowadays, things have generally gotten more technical - for lack of a better word - since most everyone is expected to know how to use a computer, as an example. Executives in business today still have secretaries, but they also have a computer in their office, and are expected to know how to use it. This would have been unthinkable to most men in the business world in the 70s. Although machines now do most of the assembly-line work that humans used to do - robots are being used now for welding, etc. - it still requires a technician to monitor on a computer that things are running on schedule, and that the specified number of "widgets" are being produced.
The latest seems to be having computers take food orders in fast food businesses, which is still new enough to be a novelty, but it shows a trend. The sad part is that there are not enough decent paying jobs today for those that want to work, so many people are working two or more part-time jobs, which cuts down on family time together, which is not good. It will be interesting to see what our lives will be like five years from now...
Wait a minute... NASA holds patents??? Unless it's a matter of nat'l security, any gov't funded research should be public domain right from the start.
I think you might want to do some deep thinking on the ramifications of making all patents held by something called the National AERONAUTICS and SPACE Administration go public domain immediately.
If it's a NASA patent, it's an AMERICAN patent. If it's public domain, it's open to EVERYONE on Earth.
Not always the smartest thing to do.
Good GOD...
:slapme:
Did you really miss that part or was this just a knee-jerk response?? WE paid for the research and as long as it doesn't compromise our nat'l security, we should have access to it. Now I'd be 100% on board with taking steps to slow down foreign entities from using it, but it's just a matter of time before any significant advance goes global and there's no stopping it (letting the genie out of the bottle).
This is exactly what happened with the microprocessor.
That too was developed with research and development assistance from Uncle Sam and was slated for military and space use and might have remained there had it not been for an agreement to release it to the marketplace.
This is exactly what a helpful government is SUPPOSED to do, help with the heavy lifting on ideas that benefit the largest part of the public and the marketplace, with the intent to stimulate advancement and innovation that provides opportunities to create wealth, jobs and a higher standard of living.
Thus when we "love" our government with the taxes we pay, they "love" us back, in a rather loose manner of speaking.
Of course it does not always work this well but it certainly did with the NASA Moon program.
Sure, the microprocessor would have eventually been invented and released to the marketplace even if Uncle Sam hadn't helped.
Everyone knows that. But not everyone wants to admit that our government made it happen faster, helped to make it cheaper to manufacture and helped to make it more readily available by providing both direct and indirect assistance to the private sector.
Back then, we were doing what Japan eventually started doing IN EARNEST.
We called it "Japan, Inc." but they were just echoing our furtive baby steps in that direction and decided to go the steroids route with the idea.
This type of benevolent public/private partnership, when done right, is an enormous stimulant to the economy.
Huh?? Uncle Sam invented the microprocessor? Try Ted Hoff, who worked for INTEL
Invention of the Microprocessor
Sorry attempt at revisionism there, buddy
Transistor? Not Uncle Sam, William Shockley at Bell Labs
You know it's time to ABANDON an ideology when you have to make **** up out of whole cloth to support that ideology
The first use of the term "microprocessor" is attributed to Viatron Computer Systems describing the custom integrated circuit used in their System 21 small computer system announced in 1968.
By the late 1960s, designers were striving to integrate the central processing unit (CPU) functions of a computer onto a handful of MOS LSI chips, called microprocessor unit (MPU) chip sets. Building on 8-bit arithmetic logic units (3800/3804) he designed earlier at Fairchild, in 1969 Lee Boysel created the Four-Phase Systems Inc. AL-1, an 8-bit CPU slice that was expandable to 32 bits. In 1970, Steve Geller and Ray Holt of Garrett AiResearch designed the MP944 chip set to implement the F-14A Central Air Data Computer on six metal-gate chips fabricated by AMI.
Intel introduced its first 4-bit microprocessor 4004 in 1971 and its 8-bit microprocessor 8008 in 1972. During the 1960s, computer processors were constructed out of small and medium-scale ICs—each containing from tens of transistors to a few hundred. These were placed and soldered onto printed circuit boards, and often multiple boards were interconnected in a chassis. The large number of discrete logic gates used more electrical power—and therefore produced more heat—than a more integrated design with fewer ICs. The distance that signals had to travel between ICs on the boards limited a computer's operating speed.
In the NASA Apollo space missions to the moon in the 1960s and 1970s, all onboard computations for primary guidance, navigation and control were provided by a small custom processor called "The Apollo Guidance Computer". It used wire wrap circuit boards whose only logic elements were three-input NOR gates.[11]
https://en.wikipedia.org/wiki/Microprocessor
Nice try, but your gambit didn't work.
"That too was developed with research and development assistance from Uncle Sam."
Umm. Right. I don't think that government had much to do with the microprocessor. That was all Intel, or what became Intel.
Now, the first instance of use that demanded digital circuitry was in fact military missile guidance systems, and that was caused by the tyranny of numbers (the number of connections in circuits).
Well, the revisionist would be you.
I don't tolerate revisionists as it is. When they attempt to twist what I say, I tolerate it even less.
You're on IGGY too now.
Everyone knows that I said that research and development ASSISTANCE was made available.
Nice try, but your gambit didn't work.
In 1968, Garrett AiResearch (which employed designers Ray Holt and Steve Geller) was invited to produce a digital computer to compete with electromechanical systems then under development for the main flight control computer in the US Navy's new F-14 Tomcat fighter.
The design was complete by 1970, and used a MOS-based chipset as the core CPU.
(MOS= Metal Oxide Semiconductor.)
It was already in use in the Tomcat when the Intel announcement about the 4004 was made.
PS: Regarding Shockley, it might interest you to know that when that announcement was made, the "science editor" at The Washington Post wrote that Bell Laboratories scientists had "invented a new type of transistor tube" (a reference to vacuum tubes in use at the time).
Anyway, I already had enough of your twaddle a long time ago, so you're on my ignore list.
Cheers
A NOR gate is the core of a microprocessor.
https://en.wikipedia.org/wiki/NOR_gate
Did you think microprocessors ran on magic smoke?
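Side note on the NOR gate point: NOR is what logicians call functionally complete, meaning every other Boolean gate can be built out of it alone, which is exactly how the Apollo Guidance Computer got by with nothing but three-input NOR gates. A minimal sketch (my own illustration, not from anyone's post) of deriving the common gates, and a half adder, purely from NOR:

```python
def nor(*inputs):
    """NOR: true only when every input is false."""
    return not any(inputs)

# Each derived gate is expressed purely in terms of nor().
def not_(a):
    return nor(a)

def or_(a, b):
    return nor(nor(a, b))          # NOT applied to NOR gives OR

def and_(a, b):
    return nor(nor(a), nor(b))     # De Morgan: NOT(NOT a OR NOT b)

def xor(a, b):
    return or_(and_(a, not_(b)), and_(not_(a), b))

def half_adder(a, b):
    """First building block of an ALU, from NOR alone: (sum, carry)."""
    return xor(a, b), and_(a, b)

for a in (False, True):
    for b in (False, True):
        s, c = half_adder(a, b)
        print(f"{int(a)} + {int(b)} = sum {int(s)}, carry {int(c)}")
```

Chain two half adders (plus an OR for the carries) and you have a full adder; chain full adders and you have the arithmetic unit the AGC built from those wire-wrapped NOR boards.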
My first real JOB was at Penril Data Systems in 1973. We made the very first MODEMS using a combination of semiconductor devices, including op amps, integrated circuit microchips and transistors. I started in circuit pick and pluck assembly, then operated the wave solder chain, then went to work doing burn-in and QC.
What were YOU doing in 1973?
True. But we still are the country that can manufacture anything. It's just that manufacturing some of it doesn't make economical sense anymore, and that's moved to places where it does make economical sense.
The very nature of work here in the US, and even in the rest of the world as we have exported it, has fundamentally changed away from the need for large blue collar, no skill or low skill jobs, hence the demand for such has waned.
Quite true. Those technicians monitoring the line are performing a higher skill and knowledge job than working the line, and are compensated as such.
Dunno. I've not read any futurists since Toffler's book.
But we still are the country that can manufacture anything.
The tyranny of numbers was a problem faced in the 1960s by computer engineers. Engineers were unable to increase the performance of their designs due to the huge number of components involved. In theory, every component needed to be wired to every other component (or at least many other components), and were typically strung and soldered by hand. In order to improve performance, more components would be needed, and it seemed that future designs would consist almost entirely of wiring.
The first known recorded use of the term in this context was made by the Vice President of Bell Labs in an article celebrating the 10th anniversary of the invention of the transistor, for the "Proceedings of the IRE" (Institute of Radio Engineers), June 1958 [1]. Referring to the problems many designers were having, he wrote:
For some time now, electronic man has known how 'in principle' to extend greatly his visual, tactile, and mental abilities through the digital transmission and processing of all kinds of information. However, all these functions suffer from what has been called 'the tyranny of numbers.' Such systems, because of their complex digital nature, require hundreds, thousands, and sometimes tens of thousands of electron devices.

At the time, computers were typically built up from a series of "modules", each module containing the electronics needed to perform a single function. A complex circuit like an adder would generally require several modules working in concert. The modules were typically built on printed circuit boards of a standardized size, with a connector on one edge that allowed them to be plugged into the power and signaling lines of the machine, and were then wired to other modules using twisted pair or coaxial cable.
— Jack Morton, The Tyranny of Numbers
https://en.wikipedia.org/wiki/Tyranny_of_numbers
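The quadratic blow-up behind the "tyranny of numbers" is easy to put in concrete terms. Assuming the worst case the quote gestures at, where every component may need a wire to every other, the connection count is n(n-1)/2 (a simplification for illustration; real machines were modular, not fully connected):

```python
def worst_case_wires(n_components):
    """Pairwise connections among n components: n * (n - 1) / 2."""
    return n_components * (n_components - 1) // 2

for n in (10, 100, 1_000, 10_000):
    print(f"{n:>6} components -> up to {worst_case_wires(n):,} hand-soldered connections")
```

Ten components is 45 joints you can string by hand; ten thousand components is on the order of fifty million, which is why integrating the wiring onto the chip itself was the only way out.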
We are also the country that can innovate anything, research anything, and even develop anything.
And when the government provides a financial boost to help spur any of the above, like it did with the semiconductor industry from the very beginning, and with the IT industry from the very beginning, the private sector gets the sense of security it needs to plow ahead with bold ideas.
Uncle Sam has been happy to open the checkbook and clear away debris and obstacles when ideas have that kind of potential.
The better example, of course, is "Japan, Inc." but they were just copying what we did, only they did it on a much larger scale, extending financial help to every manufacturing and technology market in the entire country.
But at its core, it's the same approach. NASA needed something, the private sector had the brains and the innovation, and the government offered to underwrite some of the financial risk. The military needed something, same thing. The DoD needed something, same idea again.
And it's also the same with medical research. Uncle Sam underwrites lots of this stuff, and the private sector benefits, people get wealthy, and the market is able to advance.
So the issue is really just about how many of us get to share some small slice of the benefits.
Ideally, at least in my humble opinion, it's better if all of the American people get a small dividend in some way.
Not free stuff...I'm talking affordable access to something useful that improves lives and opens new doors, creates jobs.
The heavy lifting that the government sometimes makes available can do an enormous amount of good.
See my post above.
See post #20 and #21, and for the last time (in large print)
I never said that the GOVERNMENT INVENTED THE MICROPROCESSOR.
Or any semiconductor.
If you continue with the Fenton style swiftboating bull**** (above) I will just put you on the ignore list, too.
Then the two of you can whine about me and act juvenile and I can ignore it.
Government heavy lifting is a fact in almost every significant area of high tech in this country.
Actually, the point was that the government didn't support any of those things: the transistor, semiconductors, or the microprocessor.
From the posted histories of each, they are inventions of the private sector (mostly Bell Labs and also Intel), so the claim that government is required for these types of developments kinda rings hollow, don't you think?