I, Robot, I Soldier.

xylaria

Bushcrafter (boy, I've got a lot to say!)
What a thought-provoking thread.

Soldiers are trained not to dither with a gun in combat. When under attack, they are trained to do unto others before it is done to them. The problem with modern mechanised warfare is that it requires a button to be pressed, ending someone's life, without them having attacked in that manner first. The button is pressed because, on a screen, they look like targets to the operator. Pakistan has cut off supply routes to Afghanistan at the moment because a drone operator made what has become a common error.

Watch the WikiLeaks video "Collateral Murder" [Very graphic adult content]. This, ultimately, is the problem with modern warfare: there is a lack of thought or higher moral development before a one-sided engagement takes place against people who are not equipped to defend themselves in any way. If you have operators who are playing Call of Duty in their heads, there isn't a lot of difference between shooting legitimate insurgents, journalists, children, or a civilian ambulance. The WikiLeaks video is disturbing enough; the story behind who leaked it is even worse.
 

RonW

Native
Nov 29, 2010
1,594
153
Dalarna Sweden
I think I know what you mean and I'm playing Devil's Advocate in some way, but what would you state is the 'right' way for them to be used?

I guess I'm looking at this whole thing from a morality/humanity standpoint; is it OK to dispatch AI robots to fight a war for you?

No, it most certainly is not.
War in general is hideous, but soldiers are still human and can, to some extent, make decisions on their own, based on the actual situations they are confronted with. They control the trigger themselves, despite the fact that they are trained and drilled.

What I meant is that there is a very real possibility that some "leaders" might be tempted to turn those machines on their own people in order to suppress and control them.
 

AuldJum

Forager
Sep 18, 2011
109
0
Fife
Humans have always been good at finding ways to kill each other more effectively. There's an emotional disconnect when it comes to killing in a game; millions of people do it. It wouldn't be hard to have robots in war with a human controlling them; you could probably make the operators think it was a game. The truth is no one cares. Wars are fought for commodities, oil at present. We could stop the killing, but we are so pathetic that we won't. It's disgusting.
 

RonW

Native
Nov 29, 2010
1,594
153
Dalarna Sweden
The Pentagon doing research on lifesaving robots... Am I the only one being a bit (very) sceptical here?

Given the recent developments in US laws, I think a large number of those robots would suit them very well... probably slightly reprogrammed in order to "maintain peace and order" in the streets.
 

BOD

Bushcrafter (boy, I've got a lot to say!)
There will always be screw-ups whether it is droids in the battlespace or humans.

What is dangerous is that there will be no conscience on the battlefield, and the locus of responsibility and accountability will be even more fuzzy than it is already.

Take our children out of the risk of combat and it becomes even easier to obtain popular support for resource wars. Humans need to see death first hand to understand what war is like. If it is on a screen, you cannot really understand it.

Already people do not see death since we are "protected" from disturbing images, old people are relocated to homes to die out of the sight of others and often die without a family member around.

A skewed world like that with no moral compass deserves the retribution that the oppressed will visit upon them. A robotic military and a police force would be a symptom of a culture gone wrong which means that there will be oppression of the less fortunate within that society and of other peoples as well.

A system like that is complex and vulnerable to exogenous shocks. Low tech cultures are resilient and resourceful and eventually they will prevail.

Wherever I am I will cheer the underdogs.
 

TeeDee

Full Member
Nov 6, 2008
10,992
4,098
50
Exeter
What is dangerous is that there will be no conscience on the battlefield, and the locus of responsibility and accountability will be even more fuzzy than it is already.

Take out our children from the risk of combat and it is even easier to obtain popular support for resource wars. Humans need to see death first hand to understand what war is like. If it is on a screen you cannot really understand it.

I agree completely. I think there needs to be conscious accountability in the zone of operations; by having guys on the ground you heighten the risk but increase the presence of humanity.

Having said that, I see intelligent AI as a tool that we will continue to develop for a long time, probably with live 'field tests' once we are in the final stages.
I for one am not so worried about the 'crazed robot' scenario, subjugating the masses or being hacked to turn on its own population, or anything along the lines of the 'Skynet' theory.

I'm more concerned that we will find it an easier decision to go to war if we are only sending hardware; technology will have removed the conscious choice (or lessened the soul-searching thought process) not to go to war in the first place.

< Does that make sense? >
 

Tengu

Full Member
Jan 10, 2006
13,031
1,642
51
Wiltshire
If you're talking about AIs, then I'm sure any AI worth his salt will realise that humans hate robots.

Or he may take the view that he should be fighting on the side of the small guy... who sure as heck won't have built him.
 

TeeDee

Full Member
Nov 6, 2008
10,992
4,098
50
Exeter
Swarm Bot testing [video=youtube;d8TmI7UhGlM]https://www.youtube.com/watch?v=d8TmI7UhGlM[/video]
 

mrcharly

Bushcrafter (boy, I've got a lot to say!)
Jan 25, 2011
3,257
46
North Yorkshire, UK

Hammock_man

Full Member
May 15, 2008
1,501
575
kent
All this reminds me of the guy who only had 2 things to sell.
i) A spear which could go through anything.
ii) A shield which stopped everything.

They have been moving the goalposts ever since they put an edge on a bit of flint.
 

demographic

Bushcrafter (boy, I've got a lot to say!)
Apr 15, 2005
4,762
786
-------------
All this reminds me of the guy who only had 2 things to sell.
i) A spear which could go through anything.
ii) A shield which stopped everything.

They have been moving the goalposts ever since they put an edge on a bit of flint.

Think they made footballs before they bothered with the goalposts bit as well. ;)
 

TeeDee

Full Member
Nov 6, 2008
10,992
4,098
50
Exeter
All this reminds me of the guy who only had 2 things to sell.
i) A spear which could go through anything.
ii) A shield which stopped everything.

They have been moving the goalposts ever since they put an edge on a bit of flint.


I'd suggest the difference is that the Spear and Shield STILL required human decision making and interaction. It would appear we are on the cusp of that decision making process being handled by AI.
 

mrcharly

Bushcrafter (boy, I've got a lot to say!)
Jan 25, 2011
3,257
46
North Yorkshire, UK
I'd suggest the difference is that the Spear and Shield STILL required human decision making and interaction. It would appear we are on the cusp of that decision making process being handled by AI.
As I've said above, we've already gone past it, with fire-and-loiter missiles. The missiles 'loiter', looking for a target, and attack automatically.
 
