The Machines Are Coming

HardOCP News

I could swear this is how Skynet started out. Something about removing humans from the equation, computers learning at a geometric rate, becoming self-aware. :eek:

But computers do not just replace humans in the workplace. They shift the balance of power even more in favor of employers. Our normal response to technological innovation that threatens jobs is to encourage workers to acquire more skills, or to trust that the nuances of the human mind or human attention will always be superior in crucial ways.
 
Computers neither "learn" nor "think"--a common misconception. For some reason, humans enjoy anthropomorphizing machines. Makes for good sci-fi, though...;)
 
The problem with these types of arguments is they don't address the real problem that we are failing to create the jobs to replace those that are automated ... SciFi has painted a nice future where machines empower us to explore, expand into the solar system and beyond, and reach our full potential ... it is humanity that is failing to live up to the vision

when hazardous or menial jobs are automated we don't look to create the new job opportunities this could open up (exploring the sea and building sea cities, building space stations, building Moon and Mars bases, exploring the solar system, etc) ... we suffer from a lack of imagination and spirit of adventure ... creating more menial jobs for humanity doesn't help us grow and evolve, but using the freedom that increased automation gives us to become more than we are does ;)
 
Learning and thinking are subjective and just a software issue though. Once you have the software refined, certainly they will learn and think. The hardware is already way faster than it needs to be.

BTW, I don't think workers getting "more skills" is the answer, just very specialized skills. That has been the trend since the start of the industrial revolution, and it's going to get to a point where one person just knows one tiny piece of a tiny piece of a fraction of the puzzle... that's his domain and he'll be an expert, but know jack all about the rest.

Because as it is, we spend over half our lives just relearning what past generations already knew and getting up to speed.
 
But computers do not just replace humans in the workplace. They shift the balance of power even more in favor of employers.

Unless you are the person who repairs or makes the computers work.

Until they develop self-installing and self-repairing computers/robots, many of us don't have much to worry about :)
 
It's a weird cycle. The problem with the apocalyptic thinking that robots (machines) will take all the jobs is that companies have to sell their products / services to someone. If too many companies are automating and unemployment goes through the roof, who is going to be left to buy those services / products? The only profitable companies raking in the money will be the ones that actually make the robots.... and it's a good bet the lawyers will find a way to profit on them somehow and, like cockroaches, survive.

I do think that certain industries, especially those with high turnover or very repetitive tasks, are going to see a lot of job losses over the next 5 years or so. I have to figure the days of being a burger flipper will soon be over, for example. I can't imagine that all those Chinese workers assembling iPhones are still going to have jobs snapping screens and cases together either.
 
I work in a creative field (architecture). My most aspirational career goal is to create software to make my role in the design process obsolete. Better tools means more money for me. I just need to make sure I'm on the right side of the change when it happens.
 
I could swear this is how Skynet started out. Something about removing humans from the equation, computers learning at a geometric rate, becoming self-aware. :eek:

I am actually writing a program now that allows machines to design themselves based off of "learned rules"
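For what it's worth, "design based off of learned rules" is often implemented as some form of iterative search. Here is a minimal hill-climbing sketch of that idea - the fitness rule and every parameter below are invented for illustration, not the actual program:

```python
import random

# Minimal sketch of a machine iteratively redesigning itself:
# a "design" is just a parameter vector, mutated at random and
# kept whenever the (invented) fitness rule scores it better.

def fitness(design):
    # Hypothetical rule: the best design is (3, -1, 2).
    target = (3, -1, 2)
    return -sum((d - t) ** 2 for d, t in zip(design, target))

def evolve(steps=5000, seed=0):
    rng = random.Random(seed)          # fixed seed so the run is reproducible
    design = [0.0, 0.0, 0.0]
    for _ in range(steps):
        candidate = [d + rng.gauss(0, 0.1) for d in design]
        if fitness(candidate) > fitness(design):
            design = candidate         # keep the better design
    return design

best = evolve()
print([round(d) for d in best])        # → [3, -1, 2]
```

The "learned rule" here is trivial (keep whatever scores higher), but the same loop structure scales up to real evolutionary design systems.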
 
Steve said:
But computers do not just replace humans in the workplace. They shift the balance of power even more in favor of employers. Our normal response to technological innovation that threatens jobs is to encourage workers to acquire more skills, or to trust that the nuances of the human mind or human attention will always be superior in crucial ways.


Our "normal response" exists because that is how it has always been. Economies grow through productivity increases, meaning using automation and better tools to allow for more production from each worker. To say that this will cause mass unemployment would be without precedent in human history. You know what came before the horse and plow? Men with shovels.

kbrickley said:
The problem with these types of arguments is they don't address the real problem that we are failing to create the jobs to replace those that are automated ... SciFi has painted a nice future where machines empower us to explore, expand into the solar system and beyond, and reach our full potential ... it is humanity that is failing to live up to the vision

The real problem has nothing at all to do with "technological unemployment". In fact, the real problem of job growth being sluggish is completely unrelated to technology; it has everything to do with 20 years of bad central bank policies that created a hugely distorted economy.
 
Our "normal response" exists because that is how it has always been. Economies grow through productivity increases, meaning using automation and better tools to allow for more production from each worker. To say that this will cause mass unemployment would be without precedent in human history. You know what came before the horse and plow? Men with shovels.



The real problem has nothing at all to do with "technological unemployment". In fact, the real problem of job growth being sluggish is completely unrelated to technology; it has everything to do with 20 years of bad central bank policies that created a hugely distorted economy.

It's a weird cycle. The problem with the apocalyptic thinking that robots (machines) will take all the jobs is that companies have to sell their products / services to someone. If too many companies are automating and unemployment goes through the roof, who is going to be left to buy those services / products? The only profitable companies raking in the money will be the ones that actually make the robots.... and it's a good bet the lawyers will find a way to profit on them somehow and, like cockroaches, survive.

I do think that certain industries, especially those with high turnover or very repetitive tasks, are going to see a lot of job losses over the next 5 years or so. I have to figure the days of being a burger flipper will soon be over, for example. I can't imagine that all those Chinese workers assembling iPhones are still going to have jobs snapping screens and cases together either.

Henry Ford figured this out more than a century ago. There is a delicate balance between the amount of automation that streamlines your production capabilities and the ability of your workers to earn a wage that affords them the ability to purchase the goods or services that you create. If workers are not earning enough, demand for your goods or services will decline, potentially to the point that you cannot sell enough to be sustainable as a business anymore.

The modern flip-side to all of this is that in some developed countries, the death rates are approaching or exceeding the birth rates (negative population growth). When the population reaches an equilibrium, a minimum level of automation will be necessary for basic functions to continue.

Central bank meddling in the markets can have an impact on whether or not people are earning a living wage, but there is so much else going on that it is not as simple as adjusting the prime lending rate to reach a desired outcome. The best that we can hope for is wage appreciation - wages increasing annually at a faster rate than inflation, which is nominally targeted at 3% for developed nations, if I recall my macroeconomics correctly. In this regard, Federal Reserve monetary policy is important in relation to the working class, but it is far from the only significant factor in play.
 
I find it interesting that people assume "burger flipping" is going to disappear. If so, it will have nothing to do with automation. We have had that capability for years. The large fast food corporations have quietly been buying up every startup using automation for fast food that even looked like it might have a viable plan for national distribution. They have then sat on the technology. Why? The answer is very human. People WANT to be served by other people. If all the big names switched to automation tomorrow, the mom and pops would explode in market share.
 
Henry Ford figured this out more than a century ago. There is a delicate balance between the amount of automation that streamlines your production capabilities and the ability of your workers to earn a wage that affords them the ability to purchase the goods or services that you create. If workers are not earning enough, demand for your goods or services will decline, potentially to the point that you cannot sell enough to be sustainable as a business anymore.

Actually, Henry Ford's employees started quitting in droves because of the toll the assembly line takes on an employee's happiness. That, plus Henry Ford was trying to stave off unions gaining a foothold in his plants.
 
The problem with these types of arguments is they don't address the real problem that we are failing to create the jobs to replace those that are automated ... SciFi has painted a nice future where machines empower us to explore, expand into the solar system and beyond, and reach our full potential ... it is humanity that is failing to live up to the vision

when hazardous or menial jobs are automated we don't look to create the new job opportunities this could open up (exploring the sea and building sea cities, building space stations, building Moon and Mars bases, exploring the solar system, etc) ... we suffer from a lack of imagination and spirit of adventure ... creating more menial jobs for humanity doesn't help us grow and evolve, but using the freedom that increased automation gives us to become more than we are does ;)

You can thank governments, mega-corps, greed, and lust for power for all of that. ^
You definitely painted a nice future though, and it's one I really would have liked to have seen. :(
 
You can thank governments, mega-corps, greed, and lust for power for all of that. ^
You definitely painted a nice future though, and it's one I really would have liked to have seen. :(

Corporations succeeded in making 'vision' a dirty word. Didn't you get the memo?
 
Ducman69 said:
Past performance is not a guarantee of future results, however. How capable, in general, is the human mind, and at what point does the ability of automation break the equilibrium? It's easy to say to burger flippers, join the knowledge economy. But in our lifetime, we're likely to see computers move into law, medicine, etc.

And, at what point, is stuff "enough"? If machines are able to feed, clothe, and house the world's population - are jobs necessary? What is the morality of pushing people into near-slave labor to earn wages if/when the marginal cost to produce the aforementioned necessities falls to nothing?
 
I'm waiting for the H.R. robot. The one that only hires other robots.

F.U. human and don't come back.
 
You can thank governments, mega-corps, greed, and lust for power for all of that. ^
You definitely painted a nice future though, and it's one I really would have liked to have seen. :(

Well, I like to play Civ and it colors my view of the future ... also I like the Gene Roddenberry Sci Fi even though the Gene Roddenberry future required a world war to motivate people ... I was hoping we could take the Babylon 5 approach and do it without a global conflict

The biggest obstacle is our own lack of peace ... if we channeled the trillions spent on the various military forces around the world into space exploration or technology development we would have several space stations and bases already, and maybe even a few sea metropolises ... just call me a dreamer :cool:
 
The problem with the apocalyptic thinking that robots (machines) will take all the jobs is that companies have to sell their products / services to someone.
How is this a problem? Historically, the wealthy and powerful couldn't care less if the poor and weak died en masse in poverty. If you really want to be horrified, read up on Enclosure, which was straight up, "hrrrm, these common folk are doing too well, let's impoverish the lot of them and make ourselves richer in the process!"

George Orwell wrote in 1944

Stop to consider how the so-called owners of the land got hold of it. They simply seized it by force, afterwards hiring lawyers to provide them with title-deeds. In the case of the enclosure of the common lands, which was going on from about 1600 to 1850, the land-grabbers did not even have the excuse of being foreign conquerors; they were quite frankly taking the heritage of their own countrymen, upon no sort of pretext except that they had the power to do so.

I do think that certain industries, especially those with high turnover or very repetitive tasks, are going to see a lot of job losses over the next 5 years or so.
There was a study done back in 2013 that estimated 47% of jobs were at risk of being eliminated through automation over the next 20 yr. The economic effect of that without some sort of social support net like a Mincome or GMI would probably be worse than the Great Depression.

A Mincome or GMI would probably be considered Communist or Socialist which won't fly in our current political environment.
 
It is today almost possible to build a machine to do anything, provided it doesn't require abstract thinking, problem solving, or creativity.

But we don't, because human labor is cheaper than making a robot, then hiring someone to maintain that robot + its parts.
 
Learning and thinking are subjective and just a software issue though. Once you have the software refined, certainly they will learn and think. The hardware is already way faster than it needs to be.

Not really, no. Thinking like a true AI would think (emotions, will, etc.) is a lot harder than refining software and using current hardware. It will probably require a paradigm shift in both. That is, a total redefinition of the way both work. Current machine learning is mainly a statistics-based approach, iirc. While this is similar to the way our brains work on some levels, there is stuff that's missing. Yeah, we'll be able to get complex automations going that depend on machine learning. That part is actually already there. But true artificial learning and thought that is on the level of humans? No, that's a ways away. I'd say around a decade at the very least. Just my intuition, but I think building quantum computers will have some effect on the field.

... Well actually after googling it, it appears NASA has something going on, here:
http://www.nas.nasa.gov/quantum/

It might be less than a decade, then. Who knows?
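To illustrate the "statistics-based approach" point: at its core, most current machine learning is curve fitting. A minimal sketch (the data and learning rate are invented for illustration):

```python
# Minimal sketch: "machine learning" as statistics.
# A line y = w*x + b is fitted to data by gradient descent;
# the machine "learns" only in the sense of minimizing squared error.

def fit_line(xs, ys, lr=0.01, steps=2000):
    w, b = 0.0, 0.0
    n = len(xs)
    for _ in range(steps):
        # Gradients of mean squared error with respect to w and b.
        dw = sum(2 * (w * x + b - y) * x for x, y in zip(xs, ys)) / n
        db = sum(2 * (w * x + b - y) for x, y in zip(xs, ys)) / n
        w -= lr * dw
        b -= lr * db
    return w, b

xs = [0, 1, 2, 3, 4]
ys = [1, 3, 5, 7, 9]               # exactly y = 2x + 1
w, b = fit_line(xs, ys)
print(round(w, 2), round(b, 2))    # prints 2.0 1.0
```

Everything deep learning does is a vastly scaled-up version of this kind of error minimization, which is exactly why "statistics, not thought" is a fair description of the current state of the art.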




On the subject of AI and automations replacing humans, I think that humanity needs to evolve faster. That is, we need to force some evolution to happen. According to the age of the planet, it took us like what? Millions of years to get to the intelligence level we're at? That's too slow at this point. We need more sophisticated bioengineering. Plus, better ability to transfer information between humans. Right now, we need to learn and relearn every generation. It'd be nice to kind of just jack in and learn everything about a subject. There would still be some room for each human being unique because biologically our brains would still develop differently and everyone would still have their own expertise. Maybe some studies into eugenics to raise the general intelligence floor of the species as well.
 
How is this a problem? Historically, the wealthy and powerful couldn't care less if the poor and weak died en masse in poverty. If you really want to be horrified, read up on Enclosure, which was straight up, "hrrrm, these common folk are doing too well, let's impoverish the lot of them and make ourselves richer in the process!"




There was a study done back in 2013 that estimated 47% of jobs were at risk of being eliminated through automation over the next 20 yr. The economic effect of that without some sort of social support net like a Mincome or GMI would probably be worse than the Great Depression.

A Mincome or GMI would probably be considered Communist or Socialist which won't fly in our current political environment.

There is actually a Libertarian argument for a Mincome: http://www.libertarianism.org/columns/libertarian-case-basic-income#.wtt26u:3cP6
 
Actually, Henry Ford's employees started quitting in droves because of the toll the assembly line takes on an employee's happiness. That, plus Henry Ford was trying to stave off unions gaining a foothold in his plants.

I never said that he put the information to good use (which is a subjective term anyway), but he did think about the issue.

Well, I like to play Civ and it colors my view of the future ... also I like the Gene Roddenberry Sci Fi even though the Gene Roddenberry future required a world war to motivate people ... I was hoping we could take the Babylon 5 approach and do it without a global conflict

The biggest obstacle is our own lack of peace ... if we channeled the trillions spent on the various military forces around the world into space exploration or technology development we would have several space stations and bases already, and maybe even a few sea metropolises ... just call me a dreamer :cool:

I'm with you on that one. In fact, there are some very interesting ideas presented in The Economics of Star Trek. It somewhat parallels what I posted before with respect to certain libertarians arguing for a minimum income.
 
Computers neither "learn" nor "think"--a common misconception. For some reason, humans enjoy anthropomorphizing machines. Makes for good sci-fi, though...;)

Eh, but what are "learning" and "thinking"? From a biological perspective, humans are just advanced organic machines. We use energy to make connections in the brain, relying on a neural network to give us thought. Learning is just like programming: we input information to strengthen or create more neural activity so that accessing it becomes easier/quicker.

Simplified, but it's not too far off. If someone was able to develop an AI capable of developing its own opinions/beliefs, they wouldn't be far from developing a robot that could be its own person.
 
Eh, but what are "learning" and "thinking"? From a biological perspective, humans are just advanced organic machines. We use energy to make connections in the brain, relying on a neural network to give us thought. Learning is just like programming: we input information to strengthen or create more neural activity so that accessing it becomes easier/quicker.

Simplified, but it's not too far off. If someone was able to develop an AI capable of developing its own opinions/beliefs, they wouldn't be far from developing a robot that could be its own person.

Computers do "think"; they just aren't cognitive, which is one form of thinking we humans do. It's not cognitive because they aren't driven by needs or desires for survival, which require broad creative problem-solving skills.

However, computers do have the ability to "learn" and "process" information and grow from that learning, based on a set of static rules and statistical analysis of data set unions.
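As a toy illustration of that rule-plus-statistics kind of "learning" (all names and data below are invented for illustration): the decision rule is fixed, and only the counts come from the data:

```python
from collections import Counter, defaultdict

# Toy illustration: "learning" as counting. The rule is static
# (pick the label whose training words best match the input);
# only the statistics (word counts per label) come from data.

class CountingClassifier:
    def __init__(self):
        self.counts = defaultdict(Counter)

    def learn(self, text, label):
        self.counts[label].update(text.lower().split())

    def classify(self, text):
        words = text.lower().split()
        # Score each label by how often it has seen the input's words.
        return max(self.counts,
                   key=lambda lab: sum(self.counts[lab][w] for w in words))

clf = CountingClassifier()
clf.learn("cheap pills buy now", "spam")
clf.learn("buy cheap now", "spam")
clf.learn("meeting agenda attached", "ham")
clf.learn("project meeting notes", "ham")
print(clf.classify("cheap pills now"))   # spam
print(clf.classify("meeting notes"))     # ham
```

Nothing in this "grows" beyond its counts and its fixed rule, which is precisely the distinction the post is drawing between machine learning and cognition.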
 
Eh, but what are "learning" and "thinking"? From a biological perspective, humans are just advanced organic machines. We use energy to make connections in the brain, relying on a neural network to give us thought. Learning is just like programming: we input information to strengthen or create more neural activity so that accessing it becomes easier/quicker.

Simplified, but it's not too far off. If someone was able to develop an AI capable of developing its own opinions/beliefs, they wouldn't be far from developing a robot that could be its own person.

"Machine" is a term invented by us to describe something we invented. You say humans are "basically just advanced organic machines", but machines didn't come into existence until humans created them. I personally don't think we can generalize ourselves to be something that didn't even exist until we created it. Furthermore, it doesn't really fit with the formal definition of a machine anyway. It's not like inventing the word "animal", which described many things on the planet (including us).

The concept of artificial intelligence is also based entirely on using us as a model, because we don't know of any other intelligence in existence. We think that we're the only example of an intelligence at the moment.


How many of you people that think AI is simple and not far away actually program? I think if you did program, you would have a greater appreciation for how difficult what you're discussing is. The hardware imo is actually still insufficient. The way our minds work is simply entirely different from how code and modern computers (internally, on the circuit level) operate. It's not a matter of how many logic gates and transistors we have. It's a matter of them being incompatible on a fundamental level.
 
But we don't, because human labor is cheaper than making a robot, then hiring someone to maintain that robot + its parts.
The cost is dropping rapidly. The cutoff point now is around $15-20/hr in the US; after that, the automation is usually cheaper to do.

Even in China the automation is getting good and cheap enough that some jobs that were being done by people are now being done by robots.

Also google Dark Factories. And a chart of US manufacturing employment vs output while you're at it. Despite the persistent myth that 'nothing' is made in the US anymore we're only 2nd to China in terms of manufacturing. They passed us back in 2009-10 or so.

Yet manufacturing labor has dropped steadily for decades while output improved! Automation has already eliminated a huge number of factory jobs, but people ignore that fact just because we don't have Jetsons-grade robots or some such nonsense.
 
I thought they called it "synergy". :D

Only if you're talking about vertical integration. Remember, you have to "invest" your time and not spend it. My all time favorite, "good managers don't work, they manage." It was even said with a straight face. :D
 
The problem with these types of arguments is they don't address the real problem that we are failing to create the jobs to replace those that are automated ... SciFi has painted a nice future where machines empower us to explore, expand into the solar system and beyond, and reach our full potential ... it is humanity that is failing to live up to the vision

when hazardous or menial jobs are automated we don't look to create the new job opportunities this could open up (exploring the sea and building sea cities, building space stations, building Moon and Mars bases, exploring the solar system, etc) ... we suffer from a lack of imagination and spirit of adventure ... creating more menial jobs for humanity doesn't help us grow and evolve, but using the freedom that increased automation gives us to become more than we are does ;)

That's because most humans are not adventurous; they are risk-averse and basically lazy. They want an 8-to-5 job, which they arrive late to and leave early from, and then they want to be left alone to while away their off time however they want. And they will only work as hard as they must in order to reach a comfortable lifestyle.

A gamer will only work hard enough to keep his system up, his network on, and his favorite few games in play. A hard-core gamer wants all the same stuff, just better.

A hunter needs his truck and his guns & ammo.

A car guy needs twin turbos and Flowmaster pipes and must be able to cover the occasional speeding ticket and repairs for the parts he trashes.

The list is as long as the hobbies are diverse.

Women though, they are rarely satisfied with any level of affluence. For them, money means one of two things - safety or status - and sometimes both at once. At 55 I have come to the conclusion my woman wants me to work until I can't any longer. Then she's going to hope I move on to the afterlife quickly and without fuss.
 
There was a study done back in 2013 that estimated 47% of jobs were at risk of being eliminated through automation over the next 20 yr. The economic effect of that without some sort of social support net like a Mincome or GMI would probably be worse than the Great Depression.

That's a lot of income tax revenue gone if almost 50% of jobs disappeared ... and robots don't pay income tax. The government better start realizing that too much automation is not a good thing ... for them.
 
"Machine" is a term invented by us to describe something we invented. You say humans are "basically just advanced organic machines", but machines didn't come into existence until humans created them. I personally don't think we can generalize ourselves to be something that didn't even exist until we created it. Furthermore, it doesn't really fit with the formal definition of a machine anyway. It's not like inventing the word "animal", which described many things on the planet (including us).

The concept of artificial intelligence is also based entirely on using us as a model, because we don't know of any other intelligence in existence. We think that we're the only example of an intelligence at the moment.


How many of you people that think AI is simple and not far away actually program? I think if you did program, you would have a greater appreciation for how difficult what you're discussing is. The hardware imo is actually still insufficient. The way our minds work is simply entirely different from how code and modern computers (internally, on the circuit level) operate. It's not a matter of how many logic gates and transistors we have. It's a matter of them being incompatible on a fundamental level.

I have a conundrum.

I've built a machine; it's automated. It has many sensors that tell it when to return to its charging station, or, if it's not going to make it, to deploy its solar panel and hibernate until it has enough charge to resume its trek home. It knows day and night, hot and cold, and responds to the environment. It's self-righting, has two forms of locomotion, and selects the one appropriate to the terrain. And if it senses an RC-controlled aircraft or "drone", it can track and record not only GPS-synched full-motion video of the aircraft but also all electronic emanations coming from it. Every activity of my machine is self-contained; there is no computational "thinking" or AI going on. It's just a collection of parts, each performing its configured task as a part of the whole.

How far from being an animal is my machine?
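The behavior described is a classic sense-act loop. A minimal sketch of that kind of logic, with all sensor names and thresholds invented for illustration:

```python
# Minimal sketch of the sense-act loop described above.
# All sensor names and thresholds are invented for illustration;
# every rule is fixed - no "thinking", just configured reactions.

def choose_action(battery_pct, distance_home_m, range_per_pct_m, sunlight):
    """Pick an action from sensor readings, mirroring the charge/hibernate logic."""
    if battery_pct > 50:
        return "patrol"
    reachable = battery_pct * range_per_pct_m   # how far the remaining charge gets us
    if reachable >= distance_home_m:
        return "return_to_charger"
    # Can't make it home: deploy the solar panel and hibernate until recharged.
    return "deploy_solar_and_hibernate" if sunlight else "hibernate"

print(choose_action(80, 100, 2.0, sunlight=True))  # patrol
print(choose_action(40, 60, 2.0, sunlight=True))   # return_to_charger (40*2 = 80 >= 60)
print(choose_action(20, 60, 2.0, sunlight=True))   # deploy_solar_and_hibernate (20*2 = 40 < 60)
```

Each branch maps a sensor condition straight to an action, which is exactly why the machine responds to its environment without anything resembling cognition.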
 
That's because most humans are not adventurous; they are risk-averse and basically lazy. They want an 8-to-5 job, which they arrive late to and leave early from, and then they want to be left alone to while away their off time however they want. And they will only work as hard as they must in order to reach a comfortable lifestyle.

A gamer will only work hard enough to keep his system up, his network on, and his favorite few games in play. A hard-core gamer wants all the same stuff, just better.

A hunter needs his truck and his guns & ammo.

A car guy needs twin turbos and Flowmaster pipes and must be able to cover the occasional speeding ticket and repairs for the parts he trashes.

The list is as long as the hobbies are diverse.

Women though, they are rarely satisfied with any level of affluence. For them, money means one of two things - safety or status - and sometimes both at once. At 55 I have come to the conclusion my woman wants me to work until I can't any longer. Then she's going to hope I move on to the afterlife quickly and without fuss.

Well, to borrow the title of Arthur C. Clarke's famous book, at some point we must come to Childhood's End ... and staying prisoners of the Earth is our Childhood ... or as Babylon 5 put it:

Sinclair: No. We have to stay here [space]. And there's a simple reason why. Ask ten different scientists about the environment, population control, genetics, and you'll get ten different answers, but there's one thing every scientist on the planet agrees on. Whether it happens in a hundred years or a thousand years or a million years, eventually our Sun will grow cold and go out. When that happens, it won't just take us. It'll take Marilyn Monroe, and Lao-Tzu, and Einstein, and Morobuto, and Buddy Holly, and Aristophanes…[and] all of this…all of this…was for nothing. Unless we go to the stars.

There could be just as many simple and meaningless jobs in space as there are on Earth ... and leaving the timid, or the unimaginative, or the dogma driven people behind isn't a bad thing ... right now we have 7 billion people clinging to one small life raft in space ... our species' chances to grow and evolve all belong in the frontiers (space, the sea, and other planets) not in the comfortable parlor we are trying to cram more and more people into :cool:
 
Not arguing against your comment.

But you know, over the short haul, those less averse to alternative forms of population control will have an impact on Arthur's proposed timeline.
 
I have a conundrum.

I've built a machine; it's automated. It has many sensors that tell it when to return to its charging station, or, if it's not going to make it, to deploy its solar panel and hibernate until it has enough charge to resume its trek home. It knows day and night, hot and cold, and responds to the environment. It's self-righting, has two forms of locomotion, and selects the one appropriate to the terrain. And if it senses an RC-controlled aircraft or "drone", it can track and record not only GPS-synched full-motion video of the aircraft but also all electronic emanations coming from it. Every activity of my machine is self-contained; there is no computational "thinking" or AI going on. It's just a collection of parts, each performing its configured task as a part of the whole.

How far from being an animal is my machine?

It depends on the level of animal you're talking about. From an insect, not very far. From us, very far.

Well actually no. You're kind of far even from a fruit fly. From my dictionary, this is an animal:
"A living organism characterized by voluntary movement"
Key word on voluntary. Does the machine want to move for its own benefit? No, not really. It has no concept of self. It wouldn't care if you just took a hammer and beat the crap out of it and destroyed it. Even if you programmed it so that it would defend itself, you would have to program it to defend itself from any attack... but even then it would be far away from an animal because it's not protecting itself of its own will. It's simply doing as you programmed it to. Reason I say it's not too far from a fruit fly is because it kind of does feel like insects are practically like programs... but even then to an extent I think they learn and such. Now if you're asking how far you are functionally (end result wise) from a basic animal... I guess it depends on the complexity of your robot. But you still won't have an animal.

An animal has to perform actions and learn things entirely because it wants to do them. Or needs to, but need is just a very intense case of "want" anyway. Our motivations and desires are kind of a black box at the moment. The reason I say that quantum computing might bring us closer to AI is... well actually probably better covered by this article:
http://www.doc.ic.ac.uk/~nd/surprise_97/journal/vol1/spb3/#RelatedAnchor

But even in the article, all of this functions on the pretense that we actually know how the mind works in the first place. We don't.
 
Only if you're talking about vertical integration. Remember, you have to "invest" your time and not spend it. My all time favorite, "good managers don't work, they manage." It was even said with a straight face. :D

LOL
Oh yes, I know about all of this, all too well...

 
Well, to borrow the title of Arthur C. Clarke's famous book, at some point we must come to Childhood's End ... and staying prisoners of the Earth is our Childhood ... or as Babylon 5 put it:



There could be just as many simple and meaningless jobs in space as there are on Earth ... and leaving the timid, or the unimaginative, or the dogma driven people behind isn't a bad thing ... right now we have 7 billion people clinging to one small life raft in space ... our species' chances to grow and evolve all belong in the frontiers (space, the sea, and other planets) not in the comfortable parlor we are trying to cram more and more people into :cool:

well said.


There was an article awhile ago about the decline in fuel tax collected from hybrids and other very efficient vehicles. It's taken, what, a decade for that to trickle into the consciousness of government in any obvious way? It'll happen with automation/robots/AI/whatever, but it'll take awhile. The money must flow.
Personally, and this is just opinion, I don't think understanding will happen fast enough, and our technology will outpace our good sense and ruin a bunch, if not all, economies at a minimum. And in the case of actual AI, it'll end up even worse when the eminently logical AI says hey guys, you have GOT to both quit breeding and quit trying to live for 150 years if you want to be sustainable. Or gtf off this planet, a large pile of you.
I don't think I'll live to see it, but I strongly believe some variation of this will play out if we manage not to mire the world in a serious war beforehand.
 
Not arguing against your comment.

But you know, over the short haul, those less averse to alternative forms of population control will have an impact on Arthur's proposed timeline.

Near as I can figure, we're really pretty much supposed to have both war and giant disease every now and again. They both serve a function for our species, just like a wildfire does for plant life. I suspect we're overdue on both. It took me awhile to understand the difference between not wishing harm on individuals and understanding the need or good in suffering or death on a large scale, but things make more sense after getting to that point. Not happy, but it makes more sense.
 