I, Robot, I Soldier.

SaraR

Full Member
Mar 25, 2017
1,651
1,209
Ceredigion
Wars tend to go on until the political cost gets too high, because too much money is being spent or too many lives are being lost. So if robot soldiers fight instead of people, wouldn't you risk wars just dragging on for longer?

Especially since per-unit production costs would probably fall as volumes increased and production methods developed.

And for wars fought away from home, would the electorate care enough about locals in some faraway place to put pressure on their government to resolve it if none of their fellow citizens were killed there in the process?
 
  • Like
Reactions: C_Claycomb

TeeDee

Full Member
Nov 6, 2008
10,992
4,098
50
Exeter
And for wars fought away from home, would the electorate care enough about locals in some faraway place to put pressure on their government to resolve it if none of their fellow citizens were killed there in the process?

I tend to agree - which is why, rather oddly, I think we need the risk of losing (our side's) lives on the table to understand what is being wagered.


Not sure if that makes sense. I've only got more questions at this stage.
 
  • Like
Reactions: SaraR

Paul_B

Bushcrafter through and through
Jul 14, 2008
6,413
1,702
Cumbria
I'm late to this thread, and an early strand about hacking robots interested me. If you follow the idea of hacking, what about the existing systems for calling in close artillery support? As I understand it, they call in artillery to hit close to front-line positions. If you can hack robots, surely you can hack those systems so that your side calls in artillery but gives the wrong coordinates, or the artillery gets it wrong and you get hit by your own side?

I don't really believe that would happen, but I reckon the mythical hacking tech or capability for robot weaponry could just as easily hit existing artillery systems - in fact, that's likely to be easier!

If this has been discussed already I'm sorry, there's simply too much to catch up on.
 

TeeDee

Full Member
Nov 6, 2008
10,992
4,098
50
Exeter
It's more the direction AI will take that concerns me. As long as we have a human in the chain of command, you at least have a human conscience at work - be that good or bad.

The point where, for example, aerial drones become the mainstay of a country's air element (no need for life support, longer flight times, able to pull higher Gs during manoeuvres) and then engage in aerial contacts will, I think, bring a "hands-off" catalyst moment: human-vs-human engagements end in broad stalemates, but AI will operate and act more quickly, decisively and terminally.
 

Ozmundo

Full Member
Jan 15, 2023
457
359
48
Sussex
I have no technical knowledge of it, but fortunately hacking doesn't work the way it does in Marvel fiction.

There are cases where electronic warfare techniques can be more impactful in specific situations, but there are also more tools to circumvent such countermeasures.

Ukraine is a bit of a test bed. :O_O:

On the question of computers making judgement calls: whatever makes the decisions, it's still the same chain of command. Humans have such a great history of doing the right thing anyway…..

Gort, Klaatu barada nikto.

I worked with a chap some years back who investigated war crimes. He was adamant that a lot more went on and a lot of people knew about it. Now stuff is starting to make it into the public domain. I’m not sure if AI will make this more or less transparent, every technology has potential for abuse or insight.

There are some autonomous platforms that have quite amazing capabilities in just supporting roles. I am sure they have the ability to save lives, maybe even preventing some stupidity. Bugger, I sound like BAE marketing.

If I was waiting for support or rescue, I'd rather a human turned up to help me. Whether some loitering bot would follow its code, I'm still yet to be convinced.

To paraphrase an old sweat “It takes human intervention to truly **** something up.”

What was Robocop’s fourth directive again? ;)
 

Kav

Nomad
Mar 28, 2021
452
360
71
California
Mother Nature, Gaia; a name chosen by a sentient species that has the hubris to think itself superior and destined to do some great thing, eventually, when we get around to it.
Climate change brought on by us is already screaming in a futile attempt to get our attention.
Our clever little toys for making war on ourselves are one more indulgence before it all sto......
 

Paul_B

Bushcrafter through and through
Jul 14, 2008
6,413
1,702
Cumbria
Isn't AI just machine learning - programming that approximates how humans learn, but more simply? For example, the various chatbots are "taught" human language by feeding them human books and writings. The quality of a chatbot depends not only on the quality of the programming, but also on the human-created material it learns from.

In that case, what would a military AI learn from?

You see, what AI becomes is really down to humans.
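That dependence on the training material can be shown with a toy sketch - a tiny first-order word model, nothing like a real chatbot, with the corpus and function names invented here purely for illustration. Feed it different text and you get a different "AI":

```python
import random
from collections import defaultdict

def train(text):
    """Build a first-order word model: each word -> words seen after it."""
    model = defaultdict(list)
    words = text.split()
    for a, b in zip(words, words[1:]):
        model[a].append(b)
    return model

def generate(model, start, length=8, seed=0):
    """Walk the model from a start word, picking learned successors."""
    rng = random.Random(seed)
    out = [start]
    for _ in range(length):
        successors = model.get(out[-1])
        if not successors:  # the model learned nothing after this word
            break
        out.append(rng.choice(successors))
    return " ".join(out)

corpus = "the robot obeys the human and the human guides the robot"
model = train(corpus)
print(generate(model, "the"))  # only ever recombines words from the corpus
```

The point of the sketch is simply that the model can only ever recombine what humans fed it, which is Paul_B's argument in miniature.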
 

TeeDee

Full Member
Nov 6, 2008
10,992
4,098
50
Exeter
Isn't AI just machine learning - programming that approximates how humans learn, but more simply? For example, the various chatbots are "taught" human language by feeding them human books and writings. The quality of a chatbot depends not only on the quality of the programming, but also on the human-created material it learns from.

In that case, what would a military AI learn from?

You see, what AI becomes is really down to humans.

AI would learn, and evolve its learning, at exponential rates, however, and would only be constrained by human thinking if that were programmed into its original parameters.

Asimov suggested a fictional Three Laws of Robotics, but robotics and AI are hugely different, and I think the laws pertain more to AI.

The laws are as follows: “(1) a robot may not injure a human being or, through inaction, allow a human being to come to harm;

(2) a robot must obey the orders given it by human beings except where such orders would conflict with the First Law;

(3) a robot must protect its own existence as long as such protection does not conflict with the First or Second Law.” Asimov later added another rule, known as the fourth or zeroth law, that superseded the others. It stated that “a robot may not harm humanity, or, by inaction, allow humanity to come to harm.”

These are just fictional precepts. Now imagine what a framework of action could look like without some core ethical structure.
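Purely as a thought experiment, the precedence structure of those fictional laws - Law 2 yields to Law 1, Law 3 yields to both - can be sketched in a few lines of Python. The predicates and action flags here are invented for illustration and bear no resemblance to real weapons software:

```python
# Purely illustrative: Asimov's fictional Three Laws as an ordered
# priority scheme over candidate actions.

def harms_human(action):       # placeholder predicates for the sketch
    return action.get("harms_human", False)

def disobeys_order(action):
    return action.get("disobeys_order", False)

def endangers_self(action):
    return action.get("endangers_self", False)

LAWS = [harms_human, disobeys_order, endangers_self]  # priority order

def severity(action):
    """0 = violates the First Law (worst) ... 3 = violates nothing."""
    for i, law in enumerate(LAWS):
        if law(action):
            return i
    return len(LAWS)

def choose(candidates):
    """Pick the candidate whose worst violation is least severe, so
    disobeying an order beats obeying one that would harm a human."""
    return max(candidates, key=severity)

obey = {"name": "obey", "harms_human": True}
refuse = {"name": "refuse", "disobeys_order": True}
print(choose([obey, refuse])["name"])  # refuse: Law 1 outranks Law 2
```

Even this toy shows the point of the thread: the ethical ordering only exists because someone wrote it in. Delete the `LAWS` list and the machine simply optimises for something else.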

The Campaign to Stop Killer Robots: https://en.wikipedia.org/wiki/Campaign_to_Stop_Killer_Robots

It already has the support of some key figures: https://www.bbc.co.uk/news/technology-33686581
 

Hammock_man

Full Member
May 15, 2008
1,501
575
kent
Look at the row over cluster bombs, land mines, and using civil infrastructure to cover military formations. How, oh how, can you trust the other side to comply with any rules on AI?
 
  • Like
Reactions: Scottieoutdoors

Paul_B

Bushcrafter through and through
Jul 14, 2008
6,413
1,702
Cumbria
Are you talking about USA-backed Ukraine or Russia? Of course, the reason they're giving them cluster bombs is that they're running low on ordnance they're free to give away without undermining their own defence capabilities. Both sides have already used mines, I read in a pro-Ukraine UK publication.

When the angels start to look a bit like the devils you know we're in a bad place!
 

Hammock_man

Full Member
May 15, 2008
1,501
575
kent
(Note: I am 100% behind Ukraine.)
I write not to justify the use of these weapons, but rather from the standpoint that if my government will not permit the use of these things, how do I know the other side won't use them? I know that if I were in a defensive position with a real probability of being overrun, a field full of land mines in front of me would be a grand thing. Tomorrow - that's another day.
 

Paul_B

Bushcrafter through and through
Jul 14, 2008
6,413
1,702
Cumbria
Cluster bombs used to have a success rate of only 10% actually working. It's a lot higher now, but a lot still get left in a dangerous state. Mines and cluster bombs kill more civilians after a war than combatants during it.

International conventions say that combatants should not target civilians - often ignored, I might add, by all sides - but cluster bombs and mines do effectively target civilians more than combatants.

Imho they should be banned. The fact that the USA and Ukraine stand with countries like Russia, North Korea and Iran in not banning cluster bombs says a lot. Now, Ukraine is desperately fighting a war for its existence, and I guess a lot of people see that as an excusable reason for using this weapon type. But tbh, that's the point: when you are at your most stressed as a nation, having and keeping your conscience is when it counts the most.

Sorry for the preach. I don't do that often.
 

TeeDee

Full Member
Nov 6, 2008
10,992
4,098
50
Exeter
Guys - can we keep current geopolitical events out of the thread before some mod sees a reason to close it down? I think AI is an important subject to discuss - more so when it's applied to the evolution of warfare between nations.

But can we please leave politics and opinions on countries' actions out of the thread - thanks!
 
  • Like
Reactions: Robbi

Broch

Life Member
Jan 18, 2009
8,490
8,368
Mid Wales
www.mont-hmg.co.uk
Could AI survive an EMP?

Yes; defence electronics is designed and tested to withstand an EMP. But AI doesn't necessarily exist at the point of application anyway - you take out one deployed 'machine' in a specific place, but its knowledge, data, self-adopted algorithms etc. already reside on countless other data stores throughout the world.

This, to me, is one of the scariest aspects of the technology: within fractions of a second of an AI algorithm self-adjusting to new data, or a new scenario, that new algorithm (and hence its decision-making) has potentially spread to all AI.
 
