I, Robot, I Soldier.

TeeDee

Full Member
Nov 6, 2008
10,499
3,702
50
Exeter
It's more the ethical side of implementing AI robots on the battlefield that I have the biggest issues and concerns about.
 

demographic

Bushcrafter (boy, I've got a lot to say!)
Apr 15, 2005
4,694
711
-------------
Great, we can make them, flog them to Saudi Arabia, who can use them against civilians, and the survivors can then in turn become radicalised and bomb us.

Then we have a better excuse to make more (to stop terrorism, obviously) to sell to the Saudis who can...
 

Oliver G

Full Member
Sep 15, 2012
392
286
Ravenstone, Leicestershire
It's an interesting issue. I certainly don't think you could have any AI making the decision to pull a trigger; the legal and moral implications of harming the wrong person or asset would kick it well out of the park.

One place it certainly has a role is logistical automation: if you had a convoy you could very easily guide it with GPS (our university were outstanding in their field with automated tractors). An issue then arises: how do you prevent the convoy hitting a kid? If you tell it to stop when a kid walks in front, all I'd do is send a kid in front, and then you can loot the trucks to your heart's content.

I think no matter where you use robots/automation you will always have to have a human to interrupt and assess the situation, either in the vehicle or remotely, depending on the scale.
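
To make that human-in-the-loop point concrete, here is a minimal sketch in Python. Everything in it is invented for illustration (the Decision states, the 50 m envelope, the ask_operator callback); the shape of the logic is the point, not the details.

Code:
from dataclasses import dataclass
from enum import Enum, auto

class Decision(Enum):
    CONTINUE = auto()
    STOP = auto()

@dataclass
class Obstacle:
    kind: str          # e.g. "person", "vehicle", "debris"
    distance_m: float

STOP_DISTANCE_M = 50.0  # assumed safety envelope, illustrative only

def convoy_step(obstacle, ask_operator):
    """One control tick for a convoy vehicle.

    ask_operator stands in for a remote human reviewing the feed;
    the vehicle's only autonomous authority is to stop.
    """
    if obstacle is None:
        return Decision.CONTINUE
    if obstacle.distance_m < STOP_DISTANCE_M:
        # Halt unconditionally; what happens next (wait, reroute,
        # treat as ambush) is the human operator's call, not ours.
        ask_operator(obstacle)
        return Decision.STOP
    return Decision.CONTINUE

The looting trick above is exactly why ask_operator is a person rather than more code: the machine cannot tell a decoy from a hazard, so the fail-safe is "stop and ask", never "decide and drive on".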

There are little robots being developed for harvesting; they look like little scarab beetles and are rather sweet. These could certainly be used to search and flag old minefields for disposal.

Here's a quandary for you: if you had a robotic asset that could perform the same functions as a soldier, would you be legally/morally obliged to send the robot in rather than the soldier over whom you have a duty of care? Commanders in the field would see the robot as a disposable asset, and procurement would want to protect their expensive assets; where does the balance lie?
 
  • Like
Reactions: TeeDee

TeeDee

Full Member
Nov 6, 2008
10,499
3,702
50
Exeter
Oliver G said:
It's an interesting issue. I certainly don't think you could have any AI making the decision to pull a trigger; the legal and moral implications of harming the wrong person or asset would kick it well out of the park.

One place it certainly has a role is logistical automation: if you had a convoy you could very easily guide it with GPS (our university were outstanding in their field with automated tractors). An issue then arises: how do you prevent the convoy hitting a kid? If you tell it to stop when a kid walks in front, all I'd do is send a kid in front, and then you can loot the trucks to your heart's content.

I think no matter where you use robots/automation you will always have to have a human to interrupt and assess the situation, either in the vehicle or remotely, depending on the scale.

There are little robots being developed for harvesting; they look like little scarab beetles and are rather sweet. These could certainly be used to search and flag old minefields for disposal.

Here's a quandary for you: if you had a robotic asset that could perform the same functions as a soldier, would you be legally/morally obliged to send the robot in rather than the soldier over whom you have a duty of care? Commanders in the field would see the robot as a disposable asset, and procurement would want to protect their expensive assets; where does the balance lie?

I very much like this angle of thought, and it does mirror my own. If the underlying point of automated/semi-automated technology is to prevent the loss of (our) servicemen's and women's lives, would we be morally OK as a nation with sending a completely automated aircraft carrier with ground units to some far-off (technologically) third-world country to wage an intervention by force? I realise this is an extreme, polar example, but would we be accepting of where this could potentially lead? Part of me thinks the whole reason for advancing these technologies is to limit the loss of (again, our side's) human lives; another part of me is mildly fearful of the ethical implications.

Good point on the automated convoy. I guess the convoy would have to possess varying levels of aggressive/defensive 'thinking'; that way you could have it running on friendly roads and transport routes (UK), but then stepped up when known to be entering hostile territory, with varying levels of safety margin in between?
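
That "stepped up" idea could be as simple as a posture table. A toy sketch with made-up posture names and numbers, not anything from a real system:

Code:
from dataclasses import dataclass

@dataclass(frozen=True)
class Posture:
    stop_distance_m: float  # how early the convoy halts for obstacles
    spacing_m: float        # gap kept between vehicles
    human_approval: bool    # does moving off again need a remote operator?

# Illustrative numbers only. The pattern worth noting: as the threat
# level rises, the machine gets less autonomy, not more.
POSTURES = {
    "friendly_roads": Posture(stop_distance_m=30.0, spacing_m=20.0, human_approval=False),
    "contested": Posture(stop_distance_m=80.0, spacing_m=50.0, human_approval=True),
    "hostile": Posture(stop_distance_m=150.0, spacing_m=80.0, human_approval=True),
}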


I'd love to know more about the harvesting scarab beetles you mention - they sound interesting.

In answer to your quandary: yes, I think the country owning the asset is already, via its development, morally obligated to use the tech rather than risk the lives of its citizens. I anticipate more use of UAV drones and the growth and use of land AVs, something which concerns and intrigues me in equal measure.

Thank you for your thoughts and engaging.
 
  • Like
Reactions: Oliver G

TeeDee

Full Member
Nov 6, 2008
10,499
3,702
50
Exeter
Oliver G said:
It's an interesting issue. I certainly don't think you could have any AI making the decision to pull a trigger; the legal and moral implications of harming the wrong person or asset would kick it well out of the park.

How about the decision for a UAV (red team) to engage and destroy a UAV (blue team)? Could that be left to AI alone, without human intervention? How would that be? Technology waging independent war upon technology?

The limiting factor in many fighter jets is currently the fact that they have to have a human inside, plus the various systems surrounding that pilot to support life. The other limiting factor is the human itself: the amount of g they can withstand, their operational ability without requiring rest, sleep or food, and their ability to think objectively under long periods of stress.

If you remove the pilot conundrum, then jets/drones can be made to fly faster and longer, and to have far more effective operational ability. Apart from the fact that there is no human in the loop to add... what? Moral compass? Humanity? Conscience?

Just spitballing.
 
  • Like
Reactions: Oliver G

C_Claycomb

Moderator staff
Mod
Oct 6, 2003
7,391
2,406
Bedfordshire
In the UK, soldiers cost about £40,000 just to put through basic army training. They require maintenance training; you cannot just park them when there are no conflicts to send them to. If one dies in service, there are costs both in returning the remains and in compensating the family. Despite all that, I can see that a machine could still be more expensive.
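
For a rough feel of that comparison, here is the arithmetic with placeholder figures; only the £40,000 training cost comes from the paragraph above, everything else is assumed.

Code:
# All figures are illustrative except basic training, which is
# roughly £40,000 per the paragraph above.
basic_training = 40_000
annual_upkeep = 30_000   # assumed: pay plus ongoing "maintenance training"
years_served = 10        # assumed
death_costs = 200_000    # assumed: repatriation plus family compensation

soldier_lifetime = basic_training + annual_upkeep * years_served + death_costs
print(f"Soldier, worst case: £{soldier_lifetime:,}")  # £540,000

robot_unit_price = 1_000_000  # assumed
print(robot_unit_price > soldier_lifetime)  # True: the machine can still cost more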

Funny I should read this now. I watched Gemini Man with Will Smith just the other day, and the premise is very similar: the villain is the villain because he has engineered a perfect soldier, not only fast, strong and free of fear, but someone who has no family to mourn them.

It is hard to know where AI will go in the far future. We are still a long way from anything that could replace a foot soldier, both mechanically and computationally. I can see that using AI to assist in assessment could be good. Western forces get very bad press when they make mistakes, say with a drone strike that kills civilians, or bombs dropped on the wrong target. In much the same way that AI is being used to spot things on X-rays that are easily missed by human eyes, maybe it could be used to help prevent errors on the battlefield.

I worry about AI across the board, not just on the battlefield. It's like we are in the 1500s, have been granted a vision of what firearms will become in 500 years, and are trying to work out whether to keep working on our musket designs. Or in the 1700s, thinking about the advantages of coal and steam, and being granted a view of polluted oceans and a warming climate.

This makes interesting reading. It shows what is already happening with the use of remote tech on a battlefield, and it isn't really the US, the UK, or the West that is using and doing it.
 

Oliver G

Full Member
Sep 15, 2012
392
286
Ravenstone, Leicestershire
TeeDee said:
How about the decision for a UAV (red team) to engage and destroy a UAV (blue team)? Could that be left to AI alone, without human intervention? How would that be? Technology waging independent war upon technology?

The limiting factor in many fighter jets is currently the fact that they have to have a human inside, plus the various systems surrounding that pilot to support life. The other limiting factor is the human itself: the amount of g they can withstand, their operational ability without requiring rest, sleep or food, and their ability to think objectively under long periods of stress.

If you remove the pilot conundrum, then jets/drones can be made to fly faster and longer, and to have far more effective operational ability. Apart from the fact that there is no human in the loop to add... what? Moral compass? Humanity? Conscience?

Just spitballing.

I would argue that any UAV engagement would require intervention from a human element, primarily to reduce collateral damage: if you do down a drone, where would it land? Would a dogfight over a city pose too much risk to the population? (Under the law of armed conflict I would suggest that the risk of damage to civilians would be disproportionate to the benefit of downing an enemy drone there and then.)
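
That argument maps naturally onto a two-part gate. A sketch only, with an invented collateral_risk score and no claim about how a real proportionality assessment is made:

Code:
from dataclasses import dataclass

@dataclass
class EngagementRequest:
    target: str              # e.g. "hostile UAV over a city"
    military_benefit: float  # 0..1, an invented scoring, not real doctrine
    collateral_risk: float   # 0..1, e.g. chance of debris hitting civilians

def authorise(request, human_approves):
    """Both tests must pass; neither alone is enough.

    The first is a toy proportionality check in the spirit of the
    law-of-armed-conflict point above; the second keeps a human on
    the trigger regardless of what the numbers say.
    """
    proportionate = request.collateral_risk < request.military_benefit
    return proportionate and human_approves

Downing a drone over open farmland passes the first test; the same drone over a crowded city does not, which is the "where would it land?" point.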

While the tactical elements of a battle effectively boil down to how best to create holes in the bloke trying to kill you, the war itself relies mainly on emotional and moral impact: you need the enemy civilian population to accept that you've won and that they cannot take any more casualties (attrition), or that their lives would be better with you in charge than with the current government (hearts and minds). If it gets to the point of robot wars, so to speak, where the human element is removed, what would the civilian population accept as loss conditions? Most probably they would accept they've lost when the economy can no longer support a war of attrition.

Could you imagine the protests that would happen when two technologically balanced countries declare a drone war? The question I would be asking is: Why do I have to accept the risk as a civilian for the military to play wargames?

On the flip side, though, if you had one superpower that kept developing the robotics and automation without another superpower keeping them in check, then technological development would slow down and the benefit of that progress to the civilian population would be less enticing; you need competition to spur on progress.
 

C_Claycomb

Moderator staff
Mod
Oct 6, 2003
7,391
2,406
Bedfordshire
Thing is, you don't need a superpower now to develop this tech. Turkey is doing rather well, and isn't a superpower by any stretch. Israel is known for drones, military tech and hacking. The problem with a lot of this stuff is that you don't need big, costly armed forces to deploy it.
 
  • Like
Reactions: santaman2000

TeeDee

Full Member
Nov 6, 2008
10,499
3,702
50
Exeter
Excellent contributions, gentlemen, and food for further thought.

If a soldier or unit makes a decision that results in them ending up before a higher court to answer for their actions (or crimes), whom do we hold accountable/responsible when the same happens with AI?

The General for using them?
The Politician for sanctioning their use?
The Company who created them?
The Programming team who coded them?
 

Oliver G

Full Member
Sep 15, 2012
392
286
Ravenstone, Leicestershire
I would think that The Hague would already have a framework of inquiry for war crimes involving unmanned vehicles; whether they have published it is another matter. It gets into a sticky position with importing technologies from companies in countries that have not signed the Geneva Conventions. Would it come down to clauses in contracts? (You're not allowed to use Apple iTunes in the development of nuclear weapons.)

On C_Claycomb's point, it will be fascinating to see how less militarily developed countries take their technology, and if/what they specialise in. How would stateless nations develop their tech?
 

Tony

White bear (Admin)
Admin
Apr 16, 2003
24,169
1
1,921
53
Wales
www.bushcraftuk.com
Whoever has the least power or money to defend themselves... so, probably the temp programmer they use...

This is one of the problems with politicians being so involved in military decisions and private companies building secret things: there's never a clear path to responsibility...
 
  • Like
Reactions: TeeDee
